
The Last-Stop-Speak-In-Hyperbole Official Revolution Specs Thread

Shogmaster said:
I bet you a hundred spot that he also has no idea what the exact specs are. :lol

I haven't been following the conversation but I'd like to add that Gaybrush (sp?) was the only person on these boards who indicated a nunchuck configuration before the revmote was unveiled. High props in my book.

Y2Kevbug11 said:
Kameo certainly looks miles better than anything on Xbox 1 even at 480p. Why are people saying that? Is it because it's subjective, so it can't really be argued?

Kameo looks FANTASTIC.

Kameo does NOT look miles better than anything on Xbox 1 in 480p. I tried the game on a standard TV and was not impressed. It LOOKS better than Xbox 1 for sure, but nothing mind-boggling. HOWEVER, as soon as I switched to 720p, the difference from Xbox 1 was HUGE.
 
heidern said:
No, actually your reading comprehension and logical reasoning is lacking. Although judging by your juvenile attitude that isn't too surprising.

Juvenile? I'm not the one engaging in bedtime fantasy stories.

But look, I'll help you out. I was referencing this so-called fact:



I assumed this is based on the IGN article regarding the Rev specs, which is based on spotty information from anonymous sources. Unless you can point to more specific information from a rock-solid source, any such information is not worthy of being classed as fact right now.

Let me run this by you slowly so you can understand:

T h e__f a c t__i s ,__t h e r e__a r e__n o__r e p o r t s / l e a k s___f r o m__d e v e l o p e r s__o f__N i n t e n d o__a d v i s i n g__a n y__o f__t h e m__t o__a i m__m u c h__h i g h e r__t h a n__t h e__g i v e n__k i t s . THIS IS FACT.


No, it is immature whining.

Whatever makes you sleep better. Do you want a flashlight under the blanky?


Well, seeing as the specs allegedly aren't finalised, I imagine he doesn't know the exact specs. But he will probably have a better idea of the specs than you; after all, he knew stuff about the Revolution controller before anyone else here. Whether his reports regarding the specs are factually correct or not, maybe, maybe not; but he hasn't, to my recollection, misled us in the past. His reports are no more or less reliable than any other information that has emerged regarding the specs, but they are a relevant piece of information. Again, if you can provide some links to reliable information regarding your facts, feel free.

He never led anyone astray because he never said anything of substance here! He is a PR monkey. As an artist monkey that worked in a big conglomerate like Nintendo, I can assure you that us monkeys don't get to know as much as you'd think. Only a lot of internal rumors until the official memos hit. And Nintendo, like any good technologically driven corporation, will not let that loose for fear of leaks.





nightez said:
Did you read the RV530 specs? It's more than capable enough; the only sacrifices to be made are in the CPU department.

I was one of the first ones to suggest RV530 (X1300) as a possible GPU base months back, if a faster CPU was to be used in the Rev, such as the LV PPC970FX that IBM is making. But since it looks like Broadway is PPC750 based, forget all that. It's a total mismatch with the CPU, and all code would be severely CPU starved if that were the case (which it won't be).
 
Shogmaster said:
No, it is immature whining.

Whatever makes you sleep better. Do you want a flashlight under the blanky?

Look, you may know some stuff and sure, you scored some points by calling the relative power of the Revolution based on its case size, but surely you're not denying that you act like a 14-year-old prick in the process? I mean that's pretty much your GAF gimmick or am I missing the joke? :laffo.jif:
 
INTERNET said:
Look, you may know some stuff and sure, you scored some points by calling the relative power of the Revolution based on its case size, but surely you're not denying that you act like a 14-year-old prick in the process? I mean that's pretty much your GAF gimmick or am I missing the joke? :laffo.jif:


You take that away from me, and Bish won't have anything to beat me over the head with. FOR GOD'S SAKE, THINK OF BISHOP TL!!! *






* 'Tis just a joke. Don't ban me Bish. ;)
 
What about this card...?




Radeon X1300 100141 Video Card - OEM

It only costs $95... so by the time the Revolution ships next year, it may be within Nintendo's range.
 
Gahiggidy said:
What about this card...?




Radeon X1300 100141 Video Card - OEM

It only costs $95... so by the time the Revolution ships next year, it may be within Nintendo's range.


Most agree it would be totally wasted being mated to a PPC750+VMX @ 1GHz and only having 88MB of total usable RAM. That damn card you got there comes with 256MB just for texture/frame buffer for cryin' out loud....
 
"Most agree" == Your opinion?


.edit -- It's a $95 card. That series only goes up to around $140. And you are saying it's too advanced for Revolution?!
 
Wow, they have a passively cooled X1300? Awesome.

I don't know, man, a 1GHz CPU could probably feed that thing pretty well. Less complex scenes, but detailed objects and shader effects. That said, I'm expecting something lower than that as well, even if only because of the RAM limitations. Hope it will have SM2.0 at least...
 
Gahiggidy said:
"Most agree" == Your opinion?

Read the damn thread. How many have chimed in about PPC750 being a total mismatch with RV530 already?


.edit -- It's a $95 card. That series only goes up to around $140. And you are saying it's too advanced for Revolution?!

It's too advanced for PPC750+VMX @ 1GHz.
 
I might add that if 24MB of 1T-SRAM could keep up with 64MB of xboxen RAM... then 88MB of the same stuff ought to equal 234MB of the conventional RAM.
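For what it's worth, the claim above is plain ratio scaling. A back-of-the-envelope sketch (the 64/24 "keeps up with" premise is the poster's assumption, not a measured fact):

```python
# Ratio check of the claim above: if 24MB of 1T-SRAM "keeps up with"
# 64MB of conventional RAM, the implied effectiveness multiplier is
# 64/24; scaling the rumored 88MB of Revolution RAM by it gives
# roughly the figure quoted in the post.
multiplier = 64 / 24              # the poster's premise, not a measurement
rev_ram_mb = 88                   # rumored total usable Revolution RAM
equivalent_mb = rev_ram_mb * multiplier
print(round(equivalent_mb))       # 235 (the post rounds down to 234)
```

Whether the premise itself holds is, of course, the contested part.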
 
Shogmaster said:
Read the damn thread. How many have chimed in about PPC750 being a total mismatch with RV530 already?




It's too advanced for PPC750+VMX @ 1GHz.
That's not an RV530... it's an X1300.
 
Gahiggidy said:
I might add that if 24MB of 1T-SRAM could keep up with 64MB of xboxen RAM... then 88MB of the same stuff ought to equal 234MB of the conventional RAM.

See, you Nintendo guys keep doing this BS to yourselves... And it's not doing you any favors. Look at multiplatform games like Splinter Cell and tell me with a straight face whether that additional RAM count makes a difference or not.

1T-SRAM is not some magical stuff that will give you bigger capacity over "regular" RAM. It provides lower manufacturing cost due to minimal transistor usage per MB. It's largely a cost benefit, not a performance one.
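The "minimal transistor usage per MB" point can be made concrete with rough cell counts: a classic SRAM cell uses six transistors per bit, while MoSys's 1T-SRAM uses a single transistor (plus a capacitor) per bit. This sketch ignores sense amps, refresh logic, and capacitor area, so treat it as an upper-bound illustration of the density/cost gap:

```python
# Rough transistors-per-MB comparison behind the cost argument:
# classic 6T SRAM vs. 1T-SRAM (one transistor + capacitor per bit).
BITS_PER_MB = 8 * 1024 * 1024

sram_6t_transistors = 6 * BITS_PER_MB   # ~50.3 million transistors per MB
one_t_transistors = 1 * BITS_PER_MB     # ~8.4 million transistors per MB

print(sram_6t_transistors // one_t_transistors)  # 6x fewer transistors per MB
```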


Gahiggidy said:
That's not an RV530... it's an X1300.

X1300 is part of the RV530 line.
 
Well, dude. Here is ATi's website...

http://www.ati.com



You tell me which card there is most likely going to be the basis for Hollywood.
 
Somewhat off-topic: The Next-Gen GameBoy

http://www.joystiq.com/entry/1234000753072868/

So whatever happened to the next Game Boy?

Posted Dec 16, 2005, 11:00 AM ET by Ben Striegel
Related entries: Nintendo DS, Nintendo Revolution, Portable, Sony PSP

I’ve been looking through Joystiq’s archives quite thoroughly lately, and no matter how many articles I see speculating on future hardware—including ones not even remotely announced—I can’t seem to find anything relating to the next iteration of the Game Boy, aside from one tiny piece from July ‘04 confirming that, yes, Nintendo is working on it, maybe possibly perhaps. This got me to thinking: how will Nintendo market a traditional handheld after the undeniable success of their nonconformist third pillar? Obviously differentiation is key, as high hardware sales translate into higher software sales, which are the crux of a publishing giant such as Nintendo. How will the Game Boy Evolution differentiate itself from the DS and all of its subsequent progeny? Nintendo isn’t talking—yet—so we’ll opt for the next best thing: rampant speculation!



The GBE will be released six months to a year after the launch of the Revolution. Considering a best-case scenario, Japanese gamers could be playing the Evolution as early as next December. Why, you ask? For much the same reason as the near-coinciding launch of the GBA and GCN…
The Revolution will connect with the Game Boy Evolution, but not with the Nintendo DS. Aside from the usual benefits of a controller with a built-in screen, classic Nintendo titles can be downloaded onto the GBE’s internal memory for nostalgia on the go. And speaking of nostalgia…
Unlike the DS and the GBM, which are only equipped to play Game Boy Advance cartridges, the GBE will allow you to play your entire back catalog of Game Boy titles, thus preserving the longevity of the Game Boy brand. Obviously it won’t accept DS cartridges, despite the fact that…
The GBE will store game data on DS-esque flash memory cards, rather than an optical storage medium such as the PSP’s UMDs, thus eliminating the need for separate memory cards and battery-intensive lasers. And while we’re talking comparisons to the PSP…
The Game Boy Evolution will feature full 3D capabilities, though the hardware will only exhibit a marginal increase over the PSP’s graphics. Nintendo’s never felt the need to sell their consoles as the most graphically superior (see Game Boy vs. Game Gear), relying instead on the overall experience as the system’s unique selling point. But before we get away from the graphical aspects there’s one last thing worth mentioning…
The GBE’s screen resolution will be drastically increased over the GBA’s, though all the while retaining a similar aspect ratio. Despite consumer demand, the screen will not be backlit.
...until the GBE SP.

Not only will the GBE allow you to play all of your classic Game Boy games, it will allow you to play all of your multiplayer Game Boy games wirelessly, including games not built especially with the GBA wireless adaptor in mind. By the way, how’s this for a seamless segue into the next bullet point…
The Game Boy Evolution will be the hottest gift of 1991 with the addition of X and Y face buttons, as well as a quasi-analog d-pad which will register variable pressure in up to eight directions. Oh, but that’s not all…
The GBE will include gyroscopic technology which will allow games such as Wario Ware: Twisted to ship without external gyro packs, as well as augmenting some of the motion-sensitive capabilities of the Revolution’s controller when used in its stead. The unit will also feature a built-in rumble, which leads us to our next point…
Without sufficient progress in battery storage technology, the battery life for the Game Boy Evolution will be the lowest of any Nintendo handheld to date, though to make up for this fact the GBE’s rechargeable battery will be detachable. Thanks to a mandatory sleep mode requirement akin to that of the DS, as well as thanks to a small rechargeable internal battery, battery packs will be hot-swappable with minimal interruption of gameplay. And finally…
The unit will sport a clamshell design reminiscent of the GBA SP, though slightly larger and with more ergonomically designed shoulder buttons. It will also be available in a stunning variety of colors and themes, unless you live in Germany, France, Indonesia, Brazil, Thailand, any country both beginning and ending in a vowel, every third nation whose flag incorporates a cross, or pretty much anywhere where the native language is not Japanese.
There, happy? That should be more than enough to tide the Game Boy faithful over until E3 2006. Anything we overlooked? Let us know! Until then, happy speculation.
 
Chrono said:
Speaking of the next Gameboy, Shogmaster: do you think a $99 Gameboy as powerful as GCN is possible in 2008?

2008 is so far into the future that I couldn't say without reaching even farther than I usually do. But I do think it's in the realm of possibility as long as you don't attach the usual Nintendo profit margin to it.

With PSP profit margin (or shall we say loss margin?), I would say definitely! :lol
 
Shogmaster, how close do you think the Rev can get to Kameo's graphics? That's pretty much all I would want on it, for a Zelda game with close to that visual look.
 
Oblivion said:
Shogmaster, how close do you think the Rev can get to Kameo's graphics? That's pretty much all I would want on it, for a Zelda game with close to that visual look.

Faffy or other coders would do a better job answering this stuff, but..... Besides the obvious resolution difference, if you forget all the fancy shader FX like that ridiculously show-offy parallax mapping, oodles of normal maps, and other bling like motion blur, DoF, HDR lighting, and gobs of enemies from Kameo, I'd think it'd be all good on the Rev. ;)
 
Shogmaster said:
Faffy or other coders would do a better job answering this stuff, but..... Besides the obvious resolution difference, if you forget all the fancy shader FX like that ridiculously show-offy parallax mapping, oodles of normal maps, and other bling like motion blur, DoF, HDR lighting, and gobs of enemies from Kameo, I'd think it'd be all good on the Rev. ;)

Now you're just messing with them :P
 
Shogmaster said:
I guess waiting for that information to solidify will give plenty of mental masturbation time for you guys to "keep the hope alive" on the forums. *rolleyes*
Shogmaster said:
You can't fucking read is now also a fact.
Shogmaster said:
What you call "whining" is me kicking some much needed sense into your brick heads. JEBUS!
Shogmaster said:
Who the fuck is this "GT" and where is this fucking quote? Or is it just another one of your wishful conjurings?
Shogmaster said:
I'm not the one engaging in bedtime fantasy stories.
Shogmaster said:
Whatever makes you sleep better. Do you want a flashlight under the blanky?
(image: computer rage photo)


Juvenile? Seems so.

Shogmaster said:
He never led astray because he never said anything of substance here! He is a PR monkey.
Careful with your facts. GT doesn't work in PR. Although it doesn't really matter; either way you can't exactly say he's less reliable than sources who haven't even identified themselves.

Shogmaster said:
Let me run this by you slowly so you can understand:

T h e__f a c t__i s ,__t h e r e__a r e__n o__r e p o r t s / l e a k s___f r o m__d e v e l o p e r s__o f__N i n t e n d o__a d v i s i n g__a n y__o f__t h e m__t o__a i m__m u c h__h i g h e r__t h a n__t h e__g i v e n__k i t s . THIS IS FACT.

That's nice. It's also different from what you said originally, which was about Nintendo's intentions as a whole, not the anonymous and spotty leaks of those intentions. I even had the decency to point it out again specifically for you, but in your petulant bravado you seem to have missed it. But here it is again for you.
Shogmaster said:
No such indications from Nintendo to developers. MS made it very clear to developers to aim much higher than Radeon 9800 and X800 from the beginning. Nintendo has not said anything of the sort to developers.
 
Well, excuse me if not spending every waking hour on the forum offends your sensibilities.

The GBE’s screen resolution will be drastically increased over the GBA’s, though all the while retaining a similar aspect ratio. Despite consumer demand, the screen will not be backlit.
...until the GBE SP.
I'm all for speculation, but that's something that would spark riots.

Liquid said:
late or not... at least he showed up, which is more than we can say for you. :lol
Oh, that's a nice thread. :lol I guess I should stop wasting my time in this one.
 
Shogmaster said:
None of the above, but the closest I can think of among the ATi line would be the Mobility Radeon 9200, which had 3MB of 1T-SRAM embedded into the die.

I know you like to keep pointing out the 3MB of on-chip RAM, but, supposing that that info is wrong, what would you suppose/suggest the Hollywood chip to be? Keep in mind that (as you said) 3MB wouldn't make much sense (I mean, isn't that what Flipper has basically?), IGN's sources aren't the end-all gospel, and ultimately that "spec" could be wrong and/or upped by Nintendo. AGAIN: Hollywood seems to be a big mystery even to developers working on alpha (souped-up GCN) kits.

Not suggesting Hollywood will be magic...I know that it would be stupid/inefficient to put a magical hot/powerful GPU together with a mere 1GHz CPU, but you have to agree that a measly 3MB of on-chip 1T-SRAM is an easy problem to fix!
 
Liquid said:
late or not... at least he showed up, which is more than we can say for you. :lol

Dude, I KNOW my limitations. I'm not fucking getting near that guy again, especially after catching up on all his past tyrannical hijinx over at B3D! :lol

But the thing with Vince is, he will keep arguing points he is wrong on, with the superior firepower of his tech knowledge, just to do it, even if he knows that the side he is representing is flat out wrong. Witness in that thread how he never lets up on calling the notion of different variations of Cell that might be offered plain old wrong, just because I'm saying it. The funny thing is, a few days later there was news from Sony that they will do just what I suggested and release mini Cells. :D

But just because I have that fact on my side, do I want to get back on that battlefield with him? NO WAY, JOSE! :lol



capslock said:
Shogmaster, I have a massive boil on my ass, should I use a boric compress on it or lance it?

When in doubt, squeeze it out!
 
DrGAKMAN said:
Shogmaster...
Read my last post please!

When in doubt, squeeze it... oh wait.

DrGAKMAN said:
I know you like to keep pointing out the 3MB of on-chip RAM, but, supposing that that info is wrong, what would you suppose/suggest the Hollywood chip to be? Keep in mind that (as you said) 3MB wouldn't make much sense (I mean, isn't that what Flipper has basically?), IGN's sources aren't the end-all gospel, and ultimately that "spec" could be wrong and/or upped by Nintendo. AGAIN: Hollywood seems to be a big mystery even to developers working on alpha (souped-up GCN) kits.

Not suggesting Hollywood will be magic...I know that it would be stupid/inefficient to put a magical hot/powerful GPU together with a mere 1GHz CPU, but you have to agree that a measly 3MB of on-chip 1T-SRAM is an easy problem to fix!

Whether it's an easy problem to fix or not, it seems that they decided to stick with it, so all this speculation about a more powerful GPU seems quite moot to me...

Sure, the idea of a mini X360 tailored for SD must be very attractive to many of you right now, but since all the reports are pointing the other direction, why don't you guys just temper your expectations now and be pleasantly surprised if they turn out to be false later? Seems the smarter thing to do to me. *shrugs*
 
Shogmaster said:
Most agree it would be totally wasted being mated to a PPC750+VMX @ 1GHz and only having 88MB of total usable RAM. That damn card you got there comes with 256MB just for texture/frame buffer for cryin' out loud....

Uh, the 256MB is not for textures or framebuffer. It is like saying that the Xbox has up to 64MB for the framebuffer and textures.
 
Monk said:
Uh, the 256MB is not for textures or framebuffer. It is like saying that the Xbox has up to 64MB for the framebuffer and textures.

Then what the hell is the local memory on video cards for? Game code? 3D models? XviDs of pr0n? PCs don't use a unified RAM structure like the Xbox, so I don't know why you are bringing it up, BTW.
 
Shogmaster said:
Then what the hell is the local memory on video cards for? Game code? 3D models? XviDs of pr0n? PCs don't use a unified RAM structure like the Xbox, so I don't know why you are bringing it up, BTW.

It's like an L2 cache. Like the GC's 24MB of 1T-SRAM and the Xbox's 64MB of RAM.
 
I have to say one thing about the specs: you are probably looking at a beefed-up version of the GC. There is likely no way that the thing will allow for SM 3.0. Much like with the GC, Nintendo would prefer a cheaper method of getting the same effects (like, for example, bump mapping last gen), IF they are going to get them at all.

I also really doubt that they will go with 3MB of cache this time again; it doesn't make sense for the rest of the system to be upgraded a little and the cache not to be as well. At the very least, the frame buffer and texture cache could be upgraded a MB each, so you have 3MB for the frame buffer and 2MB for the texture cache.

I mean, if you've got 2x the polygon and texturing power, the caches need to be bigger.
 
i think in terms of in game polygon performance revolution will offer ~2.5x what can be achieved on gamecube. any more than this and revolution would lose its advantage over ps3 and xbox 360 as cost effective for developers. keeping this kind of polygon count would limit time and effort needed to create 3d models.

however, as the name suggests, hollywood's talent will be in effects.

i expect revolution to have beautifully optimised hardware that produces impressive visuals and offers great value in a 'bang for buck' sense.
 
Why THE FUCK doesn't someone just ask Nintendo if it will support Shader 3.0?

IGN's got their "Nintendo Minute" thing... why don't they ask that next time?!
 
Gahiggidy said:
Why THE FUCK doesn't someone just ask Nintendo if it will support Shader 3.0?

IGN's got their "Nintendo Minute" thing... why don't they ask that next time?!

Because the answer would be cryptic and unimportant... probably along the lines of "The shaders in the Revolution are not the important part, but when you see the shaders, you will say wow."
 
What about the people from ATi who are actually engineering this thing? Is there not a way to get hold of their names?
 
Shogmaster said:
Whether it's an easy problem to fix or not, it seems that they decided to stick with it, so all this speculation about a more powerful GPU seems quite moot to me...

Sure, the idea of a mini X360 tailored for SD must be very attractive to many of you right now, but since all the reports are pointing the other direction, why don't you guys just temper your expectations now and be pleasantly surprised if they turn out to be false later? Seems the smarter thing to do to me. *shrugs*

I just find it hard to believe (as Monk said) that Nintendo (who's known for making efficient, clean architectures) is basically sticking with (Flipper's) 3MB of on-chip RAM when everything else on the system is upgraded... seems like an overlooked bottleneck.

And why do you think that's the end-all truth when it doesn't make sense AND even the IGN article says that the GPU is the biggest mystery right now? Not suggesting that Hollywood will somehow be an SDTV version of X360's GPU, nor do I really care about that... it's just that I'm asking you: *if the 3MB of on-chip RAM is false*, what would you suggest they do to get the best performance out of such a small box? Your insight would be helpful to my curiosity.
 
Gahiggidy said:
What about the people from ATi who are actually engineering this thing? Is there not a way to get hold of their names?

don't want to interrupt their important work :D :) probably couldn't get ahold of them even if you went all out -
 
Monk said:
It's like an L2 cache. Like the GC's 24MB of 1T-SRAM and the Xbox's 64MB of RAM.

Are you freakin' high? The 24MB of 1T-SRAM in the GC and the 64MB in the Xbox are the entire usable RAM. It's no fucking L2 cache. Do you even know what an L2 cache is?

Do not compare the PC's RAM structure with the GC's and Xbox's. They are entirely different. The GC stores everything except for sound and animation in that 24MB of 1T-SRAM. The 3MB of eDRAM is used for streaming in textures and other data to create frame buffers. The Xbox leaves how you divide up the RAM entirely up to the dev, and thus is a completely unified RAM setup.

On today's video cards, data is streamed from main RAM into the card's local memory for vertex shader operations, but most of the room in that local memory is there to store all them textures along with the frame buffer.
 
Who at ATi is working on the Rev chipset is pretty irrelevant.

The system will be exactly how Nintendo wants it, and more importantly will cost exactly what Nintendo wants it to cost.

I have little doubt that ATi offered/showed Iwata and company several different architectures, with tech demos to illustrate what the systems would be capable of on-screen (HDTV vs. SD, etc.), and Nintendo chose the basic visual level that would be good enough for them at the lowest price possible.
 
soundwave05 said:
Who at ATi is working on the Rev chipset is pretty irrelevant.

The system will be exactly how Nintendo wants it, and more importantly will cost exactly what Nintendo wants it to cost.

I have little doubt that ATi offered/showed Iwata and company several different architectures, with tech demos to illustrate what the systems would be capable of on-screen (HDTV vs. SD, etc.), and Nintendo chose the basic visual level that would be good enough for them at the lowest price possible.
I bet ATI is going like "Stupid move, Nintendo" :lol


Unless they have something in mind that they don't know, and we don't know.
 
Nintendo X said:
I bet ATI is going like "Stupid move, Nintendo" :lol


Unless they have something in mind that they don't know, and we don't know.
ATi probably doesn't care about how much power Nintendo is going to get out of the setup they chose; they just care about how much they're getting paid to design the chips (as I assume both Nintendo and MS will provide their own manufacturing solutions).
 
Well, I thought encouraging Nintendo to go with a powerful graphics chip would ensure ATi a whole shitload of cash.... Nintendo is just being cheap.... and I think ATi is a tad bit on the greedy side.
 