
Wii U Speculation Thread The Third: Casting Dreams in The Castle of Miyamoto

DrWong

Member
Yep, I'm on fire after spending my morning with an American friend, helping him get his "récépissé" (application receipt) at the sous-préfecture of Nogent-le-Perreux.

I'm in Paris, come by whenever you want for a pain au chocolat; I think generation-wise we'd get along. I've also worked in the specialist games press ;]
 

IdeaMan

My source is my ass!
*Sees incredible post by Thraktor*

*Sees everyone's comments about Nintendo's Kimoto*

On topic, what Thraktor says makes sense; it would also explain Epic's excitement for the system, and Epic hasn't been excited for a Nintendo console since the N64.

Thraktor's latest posts, with consolidated information & guesses regarding the Wii U CPU & GPU, were interesting, yes, but how could it be any different from a GAFer with a pigeon as an avatar?

Regarding Epic's excitement, I wouldn't treat their statements as a reliable metric for estimating the Wii U's power; they are mainly PR. But it's still better than the Wii situation, of course.
 
Maxrunner said:
Yes, it says the console will output 4K resolutions, which will not happen either....
Haven't we heard for some time that Sony wants to support 4K devices? It doesn't mean many games would actually take advantage of it.
 

Azure J

Member
Thraktor, your thought process is something I was working through myself, and getting out in bits and pieces, back when we were still in Speculation Thread 2, especially regarding Nintendo and the curiously well designed/performing HD7770. While I'm still a bit uncertain about a part that would reach the exact same performance and possess the same feature set, I do feel that there's something to everything you posted on the GPU there.

TL;DR you more eloquently posted what I had been discussing with people in portions for a while now. Good job.
 

z0m3le

Banned
*Sees incredible post by Thraktor*

*Sees everyone's comments about Nintendo's Kimoto*

.....
okwiththis.png


On topic, what Thraktor says makes sense; it would also explain Epic's excitement for the system, and Epic hasn't been excited for a Nintendo console since the N64.

The performance of something in the 7700 range makes sense, so I do think he is on to something with the dev kits; let's hope those tessellation units made it over.

Switching gears: seeing as the PS4 will use an AMD CPU, and both the Xbox3 and the Wii U will use an IBM CPU, won't that make developers target those two platforms and leave the PS4 for ports?

Also, if Sony is really using a Southern Islands chip, I can't imagine them using anything bigger than an HD7870. If our speculation of the 7700 range is right, that means the GPUs are comparable; the 7870 is twice what the 7770 is.

These will likely be customized cards, and in the Wii U's case it likely won't resemble the HD7770, but it should have performance somewhere around it.

TL;DR: PS4 at best about twice the graphical power of where we think the Wii U's GPU will be. Also, the Wii U's and Xbox3's architectures will have much more in common than the PS4's (confirmed if the AMD CPU in the PS4 is true), which leaves less room for easy ports for the PS4.
 

Thraktor

Member
Thraktor, your thought process is something I was working through myself, and getting out in bits and pieces, back when we were still in Speculation Thread 2, especially regarding Nintendo and the curiously well designed/performing HD7770. While I'm still a bit uncertain about a part that would reach the exact same performance and possess the same feature set, I do feel that there's something to everything you posted on the GPU there.

TL;DR you more eloquently posted what I had been discussing with people in portions for a while now. Good job.

Thanks, I'm not surprised people came to the same conclusion, but I must have missed it during my lurking.

I would clarify, though, that while I do think it's possible that they're using a GPU based on the HD7770, I would expect the Wii U GPU to be clocked lower (perhaps 600MHz - 700MHz), so it wouldn't reach the same raw performance as the HD7770.

The performance of something in the 7700 range makes sense, so I do think he is on to something with the dev kits; let's hope those tessellation units made it over.

Switching gears: seeing as the PS4 will use an AMD CPU, and both the Xbox3 and the Wii U will use an IBM CPU, won't that make developers target those two platforms and leave the PS4 for ports?

Also, if Sony is really using a Southern Islands chip, I can't imagine them using anything bigger than an HD7870. If our speculation of the 7700 range is right, that means the GPUs are comparable; the 7870 is twice what the 7770 is.

These will likely be customized cards, and in the Wii U's case it likely won't resemble the HD7770, but it should have performance somewhere around it.

TL;DR: PS4 at best about twice the graphical power of where we think the Wii U's GPU will be. Also, the Wii U's and Xbox3's architectures will have much more in common than the PS4's (confirmed if the AMD CPU in the PS4 is true), which leaves less room for easy ports for the PS4.

I think the important thing about going with a GPU based on the HD7770 would be that it'd have an almost identical feature set to the GPUs in the next XBox and PS4. If Nintendo put me in charge of designing the Wii U's internals, one thing I'd focus on would be making it as easy as possible to downport XBox3/PS4 games to the machine. That being the case, it would be smarter to go with a more modern feature set than to try to chase those machines in terms of raw power. A modestly powerful but modern chip like the HD7770 would work perfectly as the base of such a GPU, especially as both competing machines will likely be using more powerful versions of the exact same architecture.
 

ElFly

Member
Well that's a nice post.

I had thought before about what the rumor of an R700-based console meant, and unless Nintendo went with the absolute lowest denominator, it'd crush a 360 in terms of shader/texture/render units.

Now, the first rumors mentioned an R700, but later ones mentioned a 770 more specifically, which would make Thraktor's post more accurate.

Edit: although, checking, the R700 rumors come from 01net, while the RV770 comes from a super shady document leaking the whole specs. Grain of salt.
 

HylianTom

Banned
The performance of something in the 7700 range makes sense, so I do think he is on to something with the dev kits; let's hope those tessellation units made it over.

Switching gears: seeing as the PS4 will use an AMD CPU, and both the Xbox3 and the Wii U will use an IBM CPU, won't that make developers target those two platforms and leave the PS4 for ports?

Also, if Sony is really using a Southern Islands chip, I can't imagine them using anything bigger than an HD7870. If our speculation of the 7700 range is right, that means the GPUs are comparable; the 7870 is twice what the 7770 is.

These will likely be customized cards, and in the Wii U's case it likely won't resemble the HD7770, but it should have performance somewhere around it.

TL;DR: PS4 at best about twice the graphical power of where we think the Wii U's GPU will be. Also, the Wii U's and Xbox3's architectures will have much more in common than the PS4's (confirmed if the AMD CPU in the PS4 is true), which leaves less room for easy ports for the PS4.

Holy crap if indeed true.

I mean, I'd completely believe it. Sony can't lose any more money, so a loss-leader model when the business is already bleeding several billion per year would've been dumb.

I can easily see developers in Japan targeting those two consoles. Fascinating.

There are gonna be some upset folks.. yikes..
 

z0m3le

Banned
Thanks, I'm not surprised people came to the same conclusion, but I must have missed it during my lurking.

I would clarify, though, that while I do think it's possible that they're using a GPU based on the HD7770, I would expect the Wii U GPU to be clocked lower (perhaps 600MHz - 700MHz), so it wouldn't reach the same raw performance as the HD7770.



I think the important thing about going with a GPU based on the HD7770 would be that it'd have an almost identical feature set to the GPUs in the next XBox and PS4. If Nintendo put me in charge of designing the Wii U's internals, one thing I'd focus on would be making it as easy as possible to downport XBox3/PS4 games to the machine. That being the case, it would be smarter to go with a more modern feature set than to try to chase those machines in terms of raw power. A modestly powerful but modern chip like the HD7770 would work perfectly as the base of such a GPU, especially as both competing machines will likely be using more powerful versions of the exact same architecture.

The exact same GPU architecture, but the CPUs would be very different between the PS4 and WiiU/xbox3, which could lead to development problems as the CPUs would have completely different APIs.

I actually think that the Wii U up-porting 360 games in just a few weeks is a strong indicator that the Xbox3 will have to match up well with the Wii U's architecture, if Microsoft is looking at backwards compatibility anyway... This is going to be a very interesting generation...

Also just one last side note, what if the PS4 is the steam box and that is why it is using a more standard desktop CPU? (complete speculation there, but could save Sony somewhat)
 

MDX

Member
My guess is that the CPU was probably ready before the GPU, but I'd say that the difference was mainly due to the corporate culture of the two companies. AMD tends to reveal very little about their products until they're very close to being on shelves. IBM, by contrast, publishes very detailed information on most of their hardware before it's even released,

Good point. Didn't consider that.
 

Thraktor

Member
The exact same GPU architecture, but the CPUs would be very different between the PS4 and WiiU/xbox3, which could lead to development problems as the CPUs would have completely different APIs.

I actually think that the Wii U up-porting 360 games in just a few weeks is a strong indicator that the Xbox3 will have to match up well with the Wii U's architecture, if Microsoft is looking at backwards compatibility anyway... This is going to be a very interesting generation...

Also just one last side note, what if the PS4 is the steam box and that is why it is using a more standard desktop CPU? (complete speculation there, but could save Sony somewhat)

The Xbox 3's CPU should have a very similar architecture to the Wii U CPU's, actually. The PS4's looks to be quite different, but that's Sony's problem rather than Nintendo's. Actually, by my reckoning the next Xbox could well be pretty much two Wii Us duct-taped together in terms of architecture.

There'll be different APIs for all three consoles in any case, but that's fairly trivial if they all have similar feature sets.
 

z0m3le

Banned
The Xbox 3's CPU should have a very similar architecture to the Wii U CPU's, actually. The PS4's looks to be quite different, but that's Sony's problem rather than Nintendo's. Actually, by my reckoning the next Xbox could well be pretty much two Wii Us duct-taped together in terms of architecture.

There'll be different APIs for all three consoles in any case, but that's fairly trivial if they all have similar feature sets.

Sorry, my post must have looked confusing; I was in fact saying exactly that.
 

Nibel

Member
*Reads z0m3le's posts and Thraktor's posts*

*Old grumpy man voice* Well well, seems like not all Nintendo fanboys are stupid
 

guek

Banned
A 7770 equivalent would be absolutely phenomenal and well past my personal expectations.

Sadly, I don't feel it lines up with the majority of rumors/dev impressions, so I remain skeptical. Would be one hell of an upset, though.
 

Thraktor

Member
A 7770 equivalent would be absolutely phenomenal and well past my personal expectations.

Sadly, I don't feel it lines up with the majority of rumors/dev impressions, so I remain skeptical. Would be one hell of an upset, though.

Keep in mind that the HD7770 runs at 1GHz. If you were to clock a similar chip between 600MHz and 700MHz, that would put the raw performance in the 768GFLOPS to 896GFLOPS range, which is about what I've been inferring from rumours/leaks/dev comments for a while now.
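The scaling above is easy to sanity-check. A rough sketch of the arithmetic, assuming the HD7770's 640 stream processors at 2 FLOPs per shader per clock (standard for peak single-precision FMA throughput) and the commonly cited 240 GFLOPS figure for the Xbox 360's Xenos:

```python
# Back-of-the-envelope peak-GFLOPS estimate for a downclocked HD7770-class GPU.
# Assumptions: 640 stream processors, 2 FLOPs per shader per clock.

def gflops(shaders, mhz, flops_per_clock=2):
    """Peak single-precision GFLOPS: shaders * FLOPs/clock * clock in GHz."""
    return shaders * flops_per_clock * mhz / 1000.0

XENOS_GFLOPS = 240  # commonly cited figure for the Xbox 360's Xenos

for mhz in (600, 700, 1000):
    g = gflops(640, mhz)
    print(f"{mhz} MHz: {g:.0f} GFLOPS ({g / XENOS_GFLOPS:.1f}x Xenos)")
```

At 600-700MHz this lands exactly on the 768-896 GFLOPS range quoted above, roughly 3.2-3.7x Xenos, versus 1280 GFLOPS for the stock 1GHz card.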
 
What were people predicting the Wii U's GPU was before the HD7770 speculation began?

Again, I'm only using that tired FLOPS measurement, which is only one aspect of the GPU, but a 7770 clocked down to 600…700MHz (as suggested by Thraktor) from the stock 1000MHz would push 768…896 GFLOPS, or 3.2…3.7x that of the Xbox 360's Xenos.

The prevailing rumour/speculation before this was that the Wii U would have a graphics chip which could do 1000 GFLOPS or greater.

So the new speculation is substantially slower than the old speculation (though the 7770 has additional features compared to the previously rumoured chips, on account of supporting DX11), unless my relative lack of sleep (see my earlier "it should have >1 GFLOPS" comment) made me totally miss what Thraktor was saying.


Edit: ...hell and damnation, beaten like the $POLITICAL_PARTY_I_DONT_LIKE in the next election cycle. Feh!
 

Linkhero1

Member
Keep in mind that the HD7770 runs at 1GHz. If you were to clock a similar chip between 600MHz and 700MHz, that would put the raw performance in the 768GFLOPS to 896GFLOPS range, which is about what I've been inferring from rumours/leaks/dev comments for a while now.

Again, I'm only using that tired FLOPS measurement, which is only one aspect of the GPU, but a 7770 clocked down to 600…700MHz (as suggested by Thraktor) from the stock 1000MHz would push 768…896 GFLOPS, or 3.2…3.7x that of the Xbox 360's Xenos.

The prevailing rumour/speculation before this was that the Wii U would have a graphics chip which could do 1000 GFLOPS or greater.

So the new speculation is substantially slower than the old speculation (though the 7770 has additional features compared to the previously rumoured chips, on account of supporting DX11), unless my relative lack of sleep (see my earlier "it should have >1 GFLOPS" comment) made me totally miss what Thraktor was saying.


Edit: ...hell and damnation, beaten like the $POLITICAL_PARTY_I_DONT_LIKE in the next election cycle. Feh!

A FLOP is basically how fast the GPU can perform an instruction, correct? I'm not up to date with all these terms, so please bear with me.
 