
The Last-Stop-Speak-In-Hyperbole Official Revolution Specs Thread

I have no doubt they probably suggested many things, but when it comes down to it Nintendo is going to get exactly what they want. It's possible ATi convinced them to use something more powerful, or is pulling something highly specialized out of their bag o' tricks, but I'm not counting on it. If Nintendo wants cheap, that's what they'll be buying.
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
AndoCalrissian said:
I have no doubt they probably suggested many things, but when it comes down to it Nintendo is going to get exactly what they want. It's possible ATi convinced them to use something more powerful, or is pulling something highly specialized out of their bag o' tricks, but I'm not counting on it. If Nintendo wants cheap, that's what they'll be buying.
Is there such a thing as "diminishing returns" when it comes to going cheap on chips?
 

GDGF

Soothsayer
Layman's question, but I was thinking about the 60/30 capped framerate of the Nintendo DS and wondering if anybody else would accept a poly cap for the Revolution if they were promised 60fps on all games? (and nice texture effects to offset the loss)

(And in case you have no idea what I'm talking about: apparently the reason NDS games have such a fluid framerate is that there's a hardwired poly cap preventing the system from generating more polys than it can display at 60fps, or 30 when both screens are used for 3D graphics... something like that.)
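The cap-to-framerate relationship described above is just division of a throughput budget. A minimal sketch, with the caveat that the per-second figure here is a rough ballpark assumed for illustration, not an official DS spec:

```python
# Illustrative arithmetic for a hardwired polygon cap: a fixed per-second
# throughput budget divided across the target framerate.
def polys_per_frame(polys_per_second: int, fps: int) -> int:
    """Polygons the hardware can draw in a single frame at the given rate."""
    return polys_per_second // fps

cap = 120_000  # hypothetical throughput cap, assumed for the example
print(polys_per_frame(cap, 60))  # 2000 polys per frame at 60fps
print(polys_per_frame(cap, 30))  # 4000 polys per displayed frame at 30fps
```

The point is that a hard cap guarantees the renderer never takes on more work than one frame interval allows, which is why the framerate never dips.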
 

Monk

Banned
Shogmaster said:
Are you freakin high? 24MB of 1T-SRAM in GC and 64MB in Xbox are the entire usable RAM. It's no fucking L2 cache. Do you even know what an L2 cache is?

Do not compare a PC's RAM structure with the GC's and Xbox's. They are entirely different. GC stores everything except sound and animation in that 24MB of 1T-SRAM. The 3MB of eDRAM is used for streaming in textures and other data to create frame buffers. Xbox leaves how you divide up the RAM entirely to the dev, and is thus a completely unified RAM setup.

Today's video cards stream data from main RAM into the card's local memory for vertex shader operations, but most of that local memory is there to store textures along with the frame buffer.

Forgive me, I don't know what I was thinking when I wrote that. It doesn't really make sense; after all, RAM is technically just memory used for caching, so arguing that RAM = cache was pretty stupid of me. And you're right: in today's video cards the RAM is used to hold textures and the frame buffer.
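A back-of-the-envelope calculation shows why a small embedded buffer like the one described above works at SD resolution. The sizes here are the commonly cited approximations from this thread, not official specs:

```python
# Rough check: does a double-component SD framebuffer fit in ~2MB of
# embedded memory? (GC's 3MB of eDRAM is usually described as ~2MB
# framebuffer plus ~1MB texture cache.)
def framebuffer_bytes(width: int, height: int,
                      color_bytes: int, depth_bytes: int) -> int:
    """Bytes needed for one color buffer plus one Z-buffer."""
    return width * height * (color_bytes + depth_bytes)

# 640x480, 24-bit color, 24-bit Z
size = framebuffer_bytes(640, 480, 3, 3)
print(size / 2**20)  # ~1.76 MiB -- fits inside a ~2MB embedded buffer
```

This is why the main 24MB pool can be devoted to game data while the tiny on-chip pool handles the actual pixel traffic.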
 

Chittagong

Gold Member
ziran said:
there's an interesting article at n-sider on 'the nintendo strategy':
http://www.n-sider.com/articleview.php?articleid=495

That's quite a good read, and what I've been arguing for the better part of the year - although I must say that Porter's theories are such extreme over-simplifications that meaningful conclusions about Nintendo's future success can't be drawn from how well they map onto his thoughts.

Nintendo enthusiasm and strategy literacy seem to go hand in hand, with N-Sider talking about Porter and Reggie discussing Kim & Mauborgne's Blue Ocean Strategy and Clayton M. Christensen's Innovator's Dilemma. I'm looking forward to GDC, with Iwata explaining puppies via Malcolm Gladwell's Tipping Point theory.
 

Any1

Member
ziran said:
there's an interesting article at n-sider on 'the nintendo strategy':
http://www.n-sider.com/articleview.php?articleid=495

God, I love articles like these. They make it seem like Nintendo's decision to pursue a different market and go a different route was actually a choice they were smart enough to make, unlike dumb ol' Sony and MS. The reason they're leaving the "real" console race isn't that it's no longer profitable, or any of the other bullshit excuses they've given. It's that they could no longer compete, plain and simple. The people who play console games were more interested in what the competition had to offer. Old gamers grew tired of what Nintendo offered, and new gamers weren't buying either. They weren't getting tired of the current form of gaming, as PS2 sales will attest. They were tired of Nintendo.

And you've gotta love the last sentence as well: "As we'll soon see in 2006, Nintendo's strategic position will be hard, perhaps even impossible, to match. Best of luck to the imitators."

Possibly the most hilariously delusional sentence I've read on a Nintendo site yet.
 

Xrenity

Member
Any1 said:
They make it seem like Nintendo's decision to pursue a different market and go a different route was actually a choice they were smart enough to make, unlike dumb ol' Sony and MS.
READ the article.
Sony and Microsoft are pursuing the current video game industry. And there's absolutely nothing wrong with that.
The nothing part is even in bold.
 

Thraktor

Member
I know this was a few pages back, but I just wanted to clear something up,

Panajev2001a said:
Sigh... the # of pixels at NTSC resolution * 60 fps idea will never die :(.
:p.

Just look outside your window ;). (hint :))


If you read my post (although I'll admit I probably wasn't clear enough), I certainly wasn't implying that 18 million was the optimal number of polys per second for an SD console to push. Rather, once you get past a certain level, so long as the devs aren't completely ignorant of the whole concept of LoD, an SD console can be redundantly powerful in pure poly-pushing performance. I don't think you'd see actual redundancy until about the 50 million mark, but, as I said in my post, there's no reason why the best games on an SD console capable of pushing 35Mp/s couldn't look negligibly different from a console pushing many more polys at the same resolution. The reason I brought this up is that, even if a console can be redundantly powerful in poly-pushing performance, the same can't be said of lighting and effects, and I hope Nintendo realises this if they want to fulfill their promise of a console that "won't look significantly different to PS3 or Xbox 360 at SD resolution".
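The "pixels at NTSC resolution * 60 fps" figure being joked about, and the 18 million and 35Mp/s numbers above, come straight from this arithmetic. A quick sketch, assuming a simplified 640x480 SD frame (actual NTSC output differs slightly):

```python
# Where the ~18 million figure comes from, and what a 35Mp/s console
# implies in polygons per pixel. 640x480 is an assumed simplification.
WIDTH, HEIGHT, FPS = 640, 480, 60

pixels_per_second = WIDTH * HEIGHT * FPS
print(pixels_per_second)  # 18,432,000 -- roughly "18 million polys/sec"

# At 35 million polys/sec, the average polygon covers under a pixel:
print(35_000_000 / pixels_per_second)  # ~1.9 polys per pixel per frame
```

Once polygons average less than a pixel each, extra geometry throughput stops being visible at that resolution, which is the "redundantly powerful" point being made.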
 

Nightbringer

Don't hit me for my bad English please
Sorry, but I'm one of the ones who don't believe the IGN specs, and Shogmaster is riding his ego because he said on Beyond3D some months ago that Revolution would use a PowerPC 750 + VMX, and now his ego is in the clouds.

The Gamecube is dead in the market; it cannot compete against the others, and this is a fact.

If Revolution is a revamped Gamecube, Nintendo isn't going to pay ATI more for a new GPU; overclocking everything they have would be enough. And why would Nintendo wait until the second half of 2006 for a revamped Gamecube with a strange controller?

But the most hilarious part of the IGN article is the 88MB of main RAM. This is a joke: they're talking about using one 8MB chip, one 16MB chip, and one 64MB chip for the total memory, and using 8MB and 16MB chips in 2006 is something only an idiot hardware designer would do.

The last problem with the argument is that the PowerPC 750VX doesn't exist and never existed. The last PowerPC 750 is the GX; the VX was never confirmed and was only a stupid rumour from Apple Macintosh rumour sites.

PS: I don't want to attack you, Shog, but I believe your arguments are wrong.
 
Any1 said:
God, I love articles like these. They make it seem like Nintendo's decision to pursue a different market and go a different route was actually a choice they were smart enough to make, unlike dumb ol' Sony and MS. The reason they're leaving the "real" console race isn't that it's no longer profitable, or any of the other bullshit excuses they've given. It's that they could no longer compete, plain and simple. The people who play console games were more interested in what the competition had to offer. Old gamers grew tired of what Nintendo offered, and new gamers weren't buying either. They weren't getting tired of the current form of gaming, as PS2 sales will attest. They were tired of Nintendo.

And you've gotta love the last sentence as well: "As we'll soon see in 2006, Nintendo's strategic position will be hard, perhaps even impossible, to match. Best of luck to the imitators."

Possibly the most hilariously delusional sentence I've read on a Nintendo site yet.

Are you kidding? It was a choice Nintendo made. They could have continued battling it out in the current market and continued to lose market share, but they've CHOSEN not to. Regardless, it's clear you didn't bother to read the article.
 
I don't agree with Shog's posts on Revolution either. I expect something in between the doom-and-gloom "Gamecube Turbo" and a killer next-gen box that rivals PS3. I believe Rev will be around half the performance of an Xbox 360, but more efficient and easier to code for.
 

DrGAKMAN

Banned
Nightbringer said:
Sorry, but I'm one of the ones who don't believe the IGN specs, and Shogmaster is riding his ego because he said on Beyond3D some months ago that Revolution would use a PowerPC 750 + VMX, and now his ego is in the clouds.

The Gamecube is dead in the market; it cannot compete against the others, and this is a fact.

If Revolution is a revamped Gamecube, Nintendo isn't going to pay ATI more for a new GPU; overclocking everything they have would be enough. And why would Nintendo wait until the second half of 2006 for a revamped Gamecube with a strange controller?

But the most hilarious part of the IGN article is the 88MB of main RAM. This is a joke: they're talking about using one 8MB chip, one 16MB chip, and one 64MB chip for the total memory, and using 8MB and 16MB chips in 2006 is something only an idiot hardware designer would do.

The last problem with the argument is that the PowerPC 750VX doesn't exist and never existed. The last PowerPC 750 is the GX; the VX was never confirmed and was only a stupid rumour from Apple Macintosh rumour sites.

PS: I don't want to attack you, Shog, but I believe your arguments are wrong.

Not that I'm trying to defend Shogmaster, but he knows more about this kind of stuff than we do, so we can't really say he's wrong. Also, IGN has been right about a lot of this stuff, so calling their reliability into question makes no sense.

You're saying "if Revolution is just a GCN Turbo, why should Nintendo pay for it?", right? Well, making it easier and cheaper to manufacture, making it fit into the smaller Revolution casing, and giving it the oomph it needs to be considered a "GCN Turbo" is reason enough to pay ATi and IBM to work on Hollywood and Broadway. It may be based on existing Flipper and Gekko technology, but it's still a major upgrade to overclock them and make them run cooler in a smaller space.

As for the memory configuration, IGN just said GCN's memory (24MB 1T-SRAM + 16MB A-RAM) plus an additional 64MB of 1T-SRAM; nowhere do they suggest the 8 + 16 + 64 split you're describing. I'm thinking they'll have a 32 + 32 + 24 setup of 1T-SRAM plus the 16MB of A-Memory (with the 24 + 16 in use for GCN backwards compatibility, and all the memory running in full-on Revolution mode). The reason for splitting the memory into chunks like this is to dissipate heat in the small Revolution casing, since memory runs hot too (not just processors). It may even be 16 + 16 + 16 + 16 (64MB of extra memory) + 12 + 12 (GCN-configuration memory) 1T-SRAM chips.
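All of the chip splits being debated land on the same rumoured total, which is easy to miss in the back-and-forth. A quick sketch of the arithmetic (all figures are this thread's speculation, none confirmed):

```python
# The rumoured Revolution 1T-SRAM totals under each proposed chip split.
ign_rumour   = 24 + 64                      # GCN pool + extra 64MB
criticised   = 8 + 16 + 64                  # the 8/16/64 split under fire
split_a      = 32 + 32 + 24                 # suggested three-chunk layout
split_b      = 16 + 16 + 16 + 16 + 12 + 12  # six smaller chips

for name, total in [("IGN", ign_rumour), ("8/16/64", criticised),
                    ("32/32/24", split_a), ("16x4 + 12x2", split_b)]:
    print(f"{name}: {total} MB")  # every layout totals 88 MB
```

So the disagreement is purely about chip granularity (and its cost and heat implications), not about the 88MB total itself.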

I can't question what Shogmaster says, 'cos again, he knows what he's talking about. The only thing I argue with is the memory capacities, 'cos those are things that can easily and quickly be changed. The main memory is believable, but what I really question is only 3MB of on-chip memory for Hollywood; that's basically the same as the on-chip memory in GCN's Flipper, which makes no sense if Hollywood is a "turbo" upgrade. For Nintendo, wanting an efficient bang-for-their-buck architecture, it would make no sense to create such a bottleneck by improving everything except on-chip memory! I think either the GPU is such a mystery that IGN's sources are wrong about its on-chip memory, or (working with what are basically modified GCN devkits for Revolution) they're assuming Hollywood will just be Flipper, and that's where the 3MB spec comes from.
 

Thraktor

Member
DrGAKMAN said:
I think either the GPU is such a mystery that IGN's sources are wrong about its on-chip memory, or (working with what are basically modified GCN devkits for Revolution) they're assuming Hollywood will just be Flipper, and that's where the 3MB spec comes from.

Reading over all the rumours and speculation, it seems quite clear that, while IBM's "Broadway" CPU has recently been finalised and is being sent off to developers (hence the info coming out now), none of the developers spoken to seem to have anything even resembling a finalised "Hollywood" GPU chipset. The current dev kits most probably do just have a standard Flipper, and in that case, the speculation that the on-chip eDRAM has remained at 3MB isn't really anything worth listening to.
 

Eric_S

Member
@Nightbringer

From what IBM guys have alluded to, there was a 750VX in the planning stages, but it was scrapped once it became clear that there wouldn't be any viable financing for it.

Oh, and the latest 750 CPU is the GL, and not the GX ;)

But the most hilarious part of the IGN article is the 88MB of main RAM. This is a joke: they're talking about using one 8MB chip, one 16MB chip, and one 64MB chip for the total memory, and using 8MB and 16MB chips in 2006 is something only an idiot hardware designer would do.

Depends on what you intend to do with the hardware in question. There's no point in spending silicon on something that is perceived not to render any benefit to the bottom line.

That said, I've got no idea on what Nintendo feels is going to boost their bottom line or not.
 

DrGAKMAN

Banned
Thraktor said:
Reading over all the rumours and speculation, it seems quite clear that, while IBM's "Broadway" CPU had recently been finalised and is being sent off to developers (hence the info coming out now), none of the developers spoken to seem to have anything even resembling a finalised "Hollywood" GPU chipset. The current dev kits most probably do just have a standard Flipper, and in that case, the speculation that the on-chip eRam has remained at 3MB isn't really anything worth listening to.

Thank you!
 