
Wii U Speculation Thread The Third: Casting Dreams in The Castle of Miyamoto

Thraktor

Member
Again, I'm only using that tired FLOPS measurement, which is only one aspect of the GPU, but a 7770 clocked down to 600-700MHz (as suggested by Thraktor) from the stock 1000MHz would push 768-896 GFLOPS, or 3.2-3.7x that of the Xbox 360's Xenos.

The prevailing rumour/speculation before this was that the Wii U would have a graphics chip which could do 1000 GFLOPS or greater.

So the new speculation is substantially slower than the old speculation (though the 7770 offers features the previously rumoured chips lacked, on account of supporting DX11), unless my relative lack of sleep (see my earlier "it should have <1 GFLOPS" comment) made me totally miss what Thraktor was saying.


Edit: ...hell and damnation, beaten like the $POLITICAL_PARTY_I_DONT_LIKE in the next election cycle. Feh!

I'll clarify that I suggested a clock rate of 600MHz to 700MHz because I personally don't think the 1Tflops+ rumours are reliable enough to deviate from my previous expectation of a 700Gflops to 900Gflops chip. Of course if they are targeting 1Tflops then clocking the same chip to 800MHz would achieve pretty much exactly that.
 
A FLOP is basically how fast it can perform an instruction in the GPU, correct? I'm not up to date with all these terms, so please bear with me.

FLOPS means "Floating Point Operations Per Second". It measures how many basic arithmetic operations ("2.1 x 4.5", "5.0 + 12.8", and so forth) you can do on decimal numbers in a given amount of time.
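To make that concrete, here's the back-of-the-envelope arithmetic this thread keeps using, as a rough sketch: it assumes each shader unit does one multiply-add (2 floating point ops) per clock, and plugs in the public specs and the rumours above, none of which are confirmed.

```python
# Theoretical GFLOPS, assuming each shader unit (SPU) issues one
# multiply-add (2 floating point ops) per clock cycle:
#   GFLOPS = SPUs * 2 * clock_in_GHz
# Figures are public specs / thread rumours, not confirmed.

def gflops(spus, clock_mhz):
    return spus * 2 * clock_mhz / 1000.0

xenos = gflops(240, 500)   # XBox 360 Xenos: 48 shaders x 5 ALUs @ 500MHz
low   = gflops(640, 600)   # rumoured 7770-like part at 600MHz -> 768
high  = gflops(640, 700)   # ... at 700MHz -> 896

print(f"Xenos: {xenos:.0f} GFLOPS")
print(f"Rumoured Wii U GPU: {low:.0f}-{high:.0f} GFLOPS "
      f"({low / xenos:.1f}x-{high / xenos:.1f}x Xenos)")
```

That's where the 768-896 GFLOPS and 3.2-3.7x figures earlier in the thread come from.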
 
I love that clip. Heck, I can quote the whole movie. It's my favorite movie of all time.

And yeah.. Jake passed two months ago today. He was always by my side, and would spend hours sleeping on the couch next to me while I'd be playing. He loved Wolf Link's singing, and his ears would always perk up when he'd hear Epona. I'm going to be heartbroken for a long, long time, and I can't think about him for too long without losing it.

When I made the oath on Jake's honor last night, I was quite serious. For anyone who has inside info on a game in development, or info on the console itself, or on related accessories or the network, or maybe even Nintendo's efforts to woo third parties, please consider participating, my offer still stands. Here's the link:
http://www.neogaf.com/forum/showpost.php?p=36395255&postcount=10452

If somehow this helps NintendoGAF going into the future as far as leaks and information-gathering is concerned, I'd be giddy. Nayru knows we'll take all the help we can get.

Really sorry to hear about Jake mate, I know it can be real heartbreaking to lose a pet. I've got 2 cats, one I've had since a kitten who's nearly 3 now and the other who I adopted when one of my best friends passed away 7 months ago. Rufus, my late friend's cat, has been diagnosed with chronic renal disease so he'll see his previous owner in the next year or two unfortunately.

Very good idea regarding the info, will be interesting to see and discuss what happens.
 
The performance of something in the 7700 range makes sense, so I do think he is on to something with the dev kits. Let's hope that those tessellation units made it over.

Switching gears, seeing as how PS4 will use an AMD CPU, and both Xbox3 and Wii U will use an IBM CPU, won't that make developers target these two platforms and leave PS4 for ports?

Also, if Sony is really using a Southern Islands chip, I can't imagine them using a chip bigger than an HD7870. If our speculation of the 7700 range is right, that means the GPUs are comparable; the 7870 is twice what the 7770 is.

These will likely be customized chips, and in Wii U's case the final part probably won't resemble the HD7770, but should have performance somewhere around it.

TL;DR: PS4, at best, about twice the graphical power of where we think Wii U's GPU will be. Also, Wii U's and Xbox3's architectures will have much more in common than PS4's (confirmed if the AMD CPU in PS4 is true), which leaves less room for easy ports to PS4.
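For what it's worth, the "twice" claim checks out on paper if you plug in the public desktop specs. A sketch only; it says nothing about whatever custom parts actually ship:

```python
# Public desktop specs; theoretical GFLOPS = SPUs * 2 * clock_in_GHz.
hd7770 = 640 * 2 * 1.0    # Cape Verde XT: 640 SPUs @ 1000MHz -> 1280 GFLOPS
hd7870 = 1280 * 2 * 1.0   # Pitcairn XT: 1280 SPUs @ 1000MHz -> 2560 GFLOPS
print(hd7870 / hd7770)    # 2.0 -- "twice what the 7770 is"
```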

Nearly everything besides the name for that rumor was laughable.

It mentioned an 8-core "64-bit" CPU in the XB3 with 4GB of RAM
And a 4-core "32-bit" CPU in the PS4... with 2GB of RAM.

Pretty fucking laughable.
 
I'll clarify that I suggested a clock rate of 600MHz to 700MHz because I personally don't think the 1Tflops+ rumours are reliable enough to deviate from my previous expectation of a 700Gflops to 900Gflops chip. Of course if they are targeting 1Tflops then clocking the same chip to 800MHz would achieve pretty much exactly that.

Either way, the stated TDP (80W at the specified frequency, obviously less when clocked down) is in a comfortable zone. (note for others: that means it doesn't run incredibly hot)

Edit: Can't find numbers, but this seems to be a lower TDP than the Xbox 360's Xenos GPU probably had. The 360 as a whole initially drew ~200W peak, but speculation out there apparently put the Xenos itself at "<100W". Anybody know for sure?
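A rough sketch of why the downclock matters for TDP: dynamic power scales roughly linearly with clock and quadratically with voltage, so a 600-700MHz part should land well under the stock 80W. All numbers below are illustrative guesses, not official figures:

```python
# Why a downclock cuts power: dynamic power is roughly P = C * V^2 * f,
# i.e. linear in frequency and quadratic in voltage. Illustrative only --
# none of these numbers are official.

def scaled_tdp(tdp_watts, freq_ratio, volt_ratio=1.0):
    return tdp_watts * freq_ratio * volt_ratio ** 2

stock = 80.0  # stated HD7770 board power at 1000MHz
print(scaled_tdp(stock, 650 / 1000))       # ~52W from the downclock alone
print(scaled_tdp(stock, 650 / 1000, 0.9))  # ~42W if voltage also drops ~10%
```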
 

StevieP

Banned
UE4 support will tell us a lot more about Epic than it will about Nintendo. With UE3 and the Wii there was a good technical reason that it couldn't support the console: the Wii simply didn't have the necessary feature-set (i.e. programmable shaders) that the engine was built around. The raw power of the console was of far less importance; Unreal Engine would be scalable enough to run on something with the Wii's level of power if it had a modern shader architecture.

This time round, no matter how big the power gap between Nintendo and the competition (a lot smaller than last time, anyway), there won't be any features the next XBox and PS4's GPUs will have that the Wii U's won't. They'll surely be faster, have more SPUs, etc. and will probably have more efficient tessellators, and so forth, but there won't be anything they can do that the Wii U can't do, just slower. Any engine designed for those consoles will be able to run on Wii U, so long as the resolution/texture detail/effects, etc. are reduced appropriately. If UE4 isn't available on Wii U, it won't be because it's not technically capable of it, it'll be because Epic have made the decision that they don't want UE4 on Wii U.

The difficulty for Epic would be when someone like Crytek comes along with an engine that not only produces crazy-good graphics on the next XBox and Playstation, but can also scale down to Wii U as well, giving developers a lot more leeway to release their games on three platforms instead of UE4's two.

My personal guess is that Epic are playing the "we won't say what UE4 will run on" game to try to push Nintendo, MS and Sony into making their machines as powerful as possible. Being able to say to Nintendo "well, if you increased X spec by Y amount, we might be able to get UE4 working on it" is a good position for them to be in at the moment. Similarly, telling MS and Sony "If you make the console X times more powerful than Wii U then we'll be able to make UE4 XBox720/PS4 exclusive" can really prey on their desire to shut Nintendo out from the "mature" market again. Once the consoles are out, though, there's little reason for them to hold out, and they'll eventually announce Wii U support for the engine, although later than for the other two.

Great post. Just needed to be requoted.

Switching gears, seeing as how PS4 will use an AMD CPU, and both Xbox3 and Wii U will use an IBM CPU, won't that make developers target these two platforms and leave PS4 for ports?

Don't be so sure.

Also if Sony is really using a Southern Islands chip, I can't imagine them using a chip bigger than an HD7850

Fixed.

The Xbox 3's CPU should have a very similar architecture to the Wii U CPU's, actually. The PS4's looks to be quite different, but that's Sony's problem rather than Nintendo's. Actually, by my reckoning the next Xbox could well be pretty much two Wii Us duct-taped together in terms of architecture.

AMD has their fingers in a lot of pies right now.

Nearly everything besides the name for that rumor was laughable.

It mentioned an 8-core "64-bit" CPU in the XB3 with 4GB of RAM
And a 4-core "32-bit" CPU in the PS4... with 2GB of RAM.

Pretty fucking laughable.

The bits/cores portion of that rumour was BS. The RAM is also BS. The architecture, however, may not be.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I'll clarify that I suggested a clock rate of 600MHz to 700MHz because I personally don't think the 1Tflops+ rumours are reliable enough to deviate from my previous expectation of a 700Gflops to 900Gflops chip. Of course if they are targeting 1Tflops then clocking the same chip to 800MHz would achieve pretty much exactly that.

Isn't the Xenos GPU in the 360 around 240GFLOPS? That would make the GPU in the WiiU around 3 to 4 times more powerful.

I'd like it to be true but I really doubt it.
 

Thraktor

Member
Isn't the Xenos GPU in the 360 around 240GFLOPS? That would make the GPU in the WiiU around 3 to 4 times more powerful.

I'd like it to be true but I really doubt it.

Well, so far we're hearing that the console will have 1.5GB of RAM (3x XBox360), 32MB eDRAM on the GPU (~3x XBox360) and 3MB L2 cache on the CPU (again 3x XBox360), so I don't see 3x the raw GPU performance being outside the realm of possibility.
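Laying that "everything is ~3x the 360" pattern out as numbers (the 360 figures are public; the Wii U figures are the rumours in this thread, not confirmed):

```python
# The "everything is ~3x the 360" pattern. XBox 360 figures are public;
# Wii U figures are the rumours discussed in this thread.
specs = {
    "main RAM (MB)": (512, 1536),
    "GPU eDRAM (MB)": (10, 32),
    "CPU L2 cache (MB)": (1, 3),
}
for name, (x360, wiiu) in specs.items():
    print(f"{name}: {x360} -> {wiiu} ({wiiu / x360:.1f}x)")
```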
 

Azure J

Member
A 7770 equivalent would be absolutely phenomenal and well past my personal expectations.

Sadly, I don't feel like it lines up with the majority of rumors/dev impressions, so I remain skeptical. Would be one hell of an upset though.

It's funny, I'm in no way expecting it, but from what I understand, it could actually still line up with what's being said about the console currently. The 7770 is more capable without a doubt, but taking it at face value, nothing about it screams that it would Broly the competition. With its more modern parts and OpenGL support, though, and a drive for efficiency over sheer grunt, it'd be quite a competitor, and future-proof too. (say 720p/30 to the 720/PS4's 1080p/30 in the most demanding games)

Edit: I just realized, this could also explain the "GPU with a lot of features but no teeth" line Arkham used at one point.

What were people predicting the Wii U's GPU was before the HD7770 speculation began?

My thought train went something like this:

HD4850/70 (Ye Olden Days of pre-wsippel), which was readjusted to 4830 after said poster did the early detective work and found out that's what devs were using to develop games for the system some time in late June/early July (systems that were "locking up" due to the heat of both the GPU and CPU running full blast; no actual Nintendo silicon inside). Figuring from the 4830, a 640 SPU part, a few posters had me wonder why they went with that instead of the already-40nm, 640 SPU 4770 for ease of development, until we got hints from a Brain_Stew/wsippel exchange that the GPU would have a nice chunk (32MB at the least) of eDRAM. That made me believe the 4830 was being used to "pseudo-emulate" the high bandwidth of the eDRAM at the same time it was guesstimating Nintendo's desired performance for the console, barring aspects of the baseline that could be further modernized (tessellators and OpenGL support, really).

Going forward, I speculated that final silicon would look/perform like the HD 5770/6770 cards, thinking that they were different (but they're actually the same card rebranded). Debates went back and forth, and I remember sticking by an 800 SPU part with the understanding that Nintendo would opt for something on the older architecture AMD used for the 4830 in the kits, until actual specs of the HD7770 card came out. It's still a very curious card: massively low wattage and a lower number of SPUs (my understanding at the time was high SPUs = better overall), but it outperforms everything in the 4XXX line including the 4870, thanks to radical arch shifts and, from my understanding, a crazy high clock speed which it can approach without imploding thanks to being 28nm.

Of course, keep in mind everything in this post is about what I thought the performance/design could be likened to and not Nintendo hopping on newegg for a couple million of the cards labeled/described in the post. :p

I've been asleep. Where did this HD7770 rumor come from? Is that what Wii U's GPU is now meant to be based on, or is this just a guess?

Just speculation on what the system's GPU could look like when all is said and done.
 
Isn't the Xenos GPU in the 360 around 240GFLOPS? That would make the GPU in the WiiU around 3 to 4 times more powerful.

I'd like it to be true but I really doubt it.

The original 1 TFLOPS report was recently (last night, in fact) put into doubt once we actually got somebody to re-translate the Game Watch articles; this is true.

Now the question is: How reliable is that rumour Engadget reported many months back, about the Wii U demo systems at E3 '11 having RV770-based chips inside them?

It could be completely false. However, if it is true, then it lends additional credence to the range Thraktor is looking at (the RV770 line generally runs between 800 and 1200 GFLOPS -- edit: the higher end of that range runs too hot to be useful).

But, of course, there is much straw-grasping here. :D
 

Bisnic

Really Really Exciting Member!
It took around 2 weeks to get this thread halfway to the 20k post limit. I predict that by the time we reach E3, we'll be at our 4th thread! I would say 5, but around that time the higher chance of actual news may be enough to slow things down.

GAF sure loves speculation.
 

Bagu

Member
We're already over 10k?

Mother of God...

 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Well, so far we're hearing that the console will have 1.5GB of RAM (3x XBox360), 32MB eDRAM on the GPU (~3x XBox360) and 3MB L2 cache on the CPU (again 3x XBox360), so I don't see 3x the raw GPU performance being outside the realm of possibility.

I did write up a post that attempts to discredit the idea that WiiU will have 3x the performance of the 360, but deleted it.

I'm trying to be positive.

I'm trying to imagine what a machine like that will bring to the table. Obviously, we can expect the very best versions of current gen multi platform games. Games that should easily run at full 720p @ 60fps or 1080p @ a solid 30fps with good quality AA and much improved texture resolution whilst still having enough power left over to power the padlet screen with visually rich information.

If the GPU is feature-complete relative to PS4/720 then scaled-back games should be possible. I can see realtime physics/fluid/cloth/fog modelling taking a big hit, as well as lighting, and there would be a reduction in polygon count and more aggressive LOD, but I would think it's possible to do a reasonable job of ports. By then, WiiU devs will have a good handle on the system and know how hard they can push it.
 
Similarly, telling MS and Sony "If you make the console X times more powerful than Wii U then we'll be able to make UE4 XBox720/PS4 exclusive" can really prey on their desire to shut Nintendo out from the "mature" market again.

I like your theorizing, but this right here isn't how the industry works. Epic can certainly say something along the lines of "your console is X, you should really consider making it Y to get UE4 running at its most optimal", but they can't go "Hey, guess what! The Wii U is X, you should make yours Y so UE4 will be exclusive to your systems"
 
The GPU is not going to be a 7770 or a 4850 or any other number.
It'll be built for Nintendo around what they want and what third parties say they need.
It could have a feature set of a 7xxx model with a much lower speed and without the dedicated RAM. We really can't say for sure. But it most definitely isn't just a repackaged Cenozoic.
 

AzaK

Member
Don't get your hopes up guys. After being up for hours last night with a screaming baby, I finally managed to get some sleep and in short dreamtime I met Reggie. We were just chilling and while I didn't want to be too pushy on the Wii U I asked him how he thinks it will do with MS and Sony coming out soon after. His reply contained the words

"..now gamers can play 360 level games..."

We're doomed, and I need a GAF holiday it seems.
 

Rösti

Unconfirmed Member
I figured out how to access those hidden images on http://www.warioworld.com. Nothing fancy, as "newwii" turned out to be just an image of Wii. However, there are two images of the Wii U logo on there, which I found via HTTPS anyway:

https://www.warioworld.com/images/splash/splash_wiiu_high.jpg
https://www.warioworld.com/images/splash/splash_wiiu_low.jpg

Metadata has it changed at 2011:07:14 16:15:25. Two folders I've seen so far, "Splash" and "Generic". Generic features these two images for example:

https://www.warioworld.com/images/generic/download_left.gif
https://www.warioworld.com/images/generic/download_right.gif

If we keep digging we might unravel more.
 
Don't get your hopes up guys. After being up for hours last night with a screaming baby, I finally managed to get some sleep and in short dreamtime I met Reggie. We were just chilling and while I didn't want to be too pushy on the Wii U I asked him how he thinks it will do with MS and Sony coming out soon after. His reply contained the words

"..now gamers can play 360 level games..."

We're doomed, and I need a GAF holiday it seems.

Well, if his quote ended with "...on the Display Remote Controller while playing separate PS3 level games on the TV screen with the new tether-free Classic Controller Unlimited", it wouldn't be too bad a dream. ;)
 

Redford

aka Cabbie
You...guys...move... so fast...

Super good posts Thraktor, hope you don't go away again too soon!

---
Random pondering:

Every day I'm beginning to understand more and more why Nintendo handled the presentation of the Wii U's box at E3 the way they did last year. I bet if they could have a handheld home console that did all the processing in your hands and a simple receiver to display the result on the TV, they would.

I hope someday we can see something like that, it would be a logical successor to the U. (and if we truly are headed to a digital-media-only future)
 

Bagu

Member
Rösti;36443888 said:
I figured out how to access those hidden images on http://www.warioworld.com. Nothing fancy, as "newwii" turned out to be just an image of Wii. However, there are two images of the Wii U logo on there, which I found via HTTPS anyway:

https://www.warioworld.com/images/splash/splash_wiiu_high.jpg
https://www.warioworld.com/images/splash/splash_wiiu_low.jpg

Metadata has it changed at 2011:07:14 16:15:25. Two folders I've seen so far, "Splash" and "Generic". Generic features these two images for example:

https://www.warioworld.com/images/generic/download_left.gif
https://www.warioworld.com/images/generic/download_right.gif

If we keep digging we might unravel more.

Are you the hero of which legends speak?
The man to learn the truth we desperately seek
 


MDX

Member
SONY

The assumption is that Sony will come out with a machine that will outclass the WiiU.
I'm not sure. Sony is in a pickle.

1. If they stay in the console business, their next console will come out two years after the WiiU. This will give Nintendo a user base lead, developers will be accustomed to the hardware, and a fair share of developers might keep the WiiU as the lead platform. An expensive console, plus competition with Microsoft, might cause them to release a cheaper console a la PS1 or PS2.

2. Tablet controller. Sony has a history of adopting Nintendo's ideas. The tablet controller, if successful in the first year, might spur Sony to make their own. This will drive up the cost of the console.

3. Vitality controller. I expect that Nintendo's next move, a year or two after the WiiU's release, is to bring out the vitality sensor. This is an area Sony has been looking into as well. So they might want to have it available in their arsenal. More costs.

4. Kinect. I got a feeling Sony is looking to bring out something similar, especially since Apple has their Siri. At any rate, if they do... more costs.

5. Move controller. What's Sony going to do with this? Drop it next gen? Keep it as an accessory? Or bundle it with the console? Who knows.

Sony will want to launch with some kind of gimmick. They have a lot to choose from. But whatever they do, it will drive up the cost of the unit. And during that time, Nintendo will simply line up their titles to steal Sony's thunder when and if they release their console.
 
Rösti;36443888 said:
I figured out how to access those hidden images on http://www.warioworld.com. Nothing fancy, as "newwii" turned out to be just an image of Wii. However, there are two images of the Wii U logo on there, which I found via HTTPS anyway:

https://www.warioworld.com/images/splash/splash_wiiu_high.jpg
https://www.warioworld.com/images/splash/splash_wiiu_low.jpg

Metadata has it changed at 2011:07:14 16:15:25. Two folders I've seen so far, "Splash" and "Generic". Generic features these two images for example:

https://www.warioworld.com/images/generic/download_left.gif
https://www.warioworld.com/images/generic/download_right.gif

If we keep digging we might unravel more.

I'm trying random image names after 'generic/' in the hope of finding something... have you got a better idea Rösti? Otherwise it could take ages to find something!
 

Rösti

Unconfirmed Member
I'm trying random image names after 'generic/' in the hope of finding something... have you got a better idea Rösti? Otherwise it could take ages to find something!
I'm doing the same thing. The source code and the CSS document provided very little; so far I've only found https://www.warioworld.com/images/generic/background.gif.

Directory listing is unfortunately prohibited, so we won't get anything from that unless we can bypass the 1024/2048-bit encryption on that site.
 

ElFly

Member
Warioworld.com runs a Microsoft-IIS/6.0 server. Last I checked there was an easy vulnerability for this via WebDAV, but WarioWorld has WebDAV disabled, so the option of an easy hack is gone.

Maybe we can get Anonymous very angry with Nintendo, and they can put their non-lame hackers (like me) on it.
 
I'm finding it difficult getting through... My skills in this area are limited, to be honest; maybe you guys could have better luck.

And yes, yes we are.
 

Fjolle

Member
Either way, the stated TDP (80W at the specified frequency, obviously less when clocked down) is in a comfortable zone. (note for others: that means it doesn't run incredibly hot)

Edit: Can't find numbers, but this seems to be a lower TDP than the Xbox 360's Xenos GPU probably had. The 360 as a whole initially drew ~200W peak, but speculation out there apparently put the Xenos itself at "<100W". Anybody know for sure?

How did you figure that out?

For comparison's sake, the volume of the x360 is ~400 cubic inches, and the WiiU is ~130 cubic inches. Obviously you can't get rid of the same heat when you have a smaller box.
 
Warioworld.com runs a Microsoft-IIS/6.0 server. Last I checked there was an easy vulnerability for this via WebDAV, but WarioWorld has WebDAV disabled, so the option of an easy hack is gone.

Maybe we can get Anonymous very angry with Nintendo, and they can put their non-lame hackers (like me) on it.

Damn. Is there any feasible way of bypassing their security?
 

nordique

Member
Hey guys, I'm back!

***tumbleweed blows by***

Alright, I doubt any of you remember me, but I used to post a bit in the old Wii U speculation threads. I've been in self-imposed exile from GAF for the past couple of months as I'm in my final year of college, but I've decided to make a brief return to cover a few things that I think need to be cleared up in this thread.

There was a bit of discussion a few pages back on the console's CPU, and I thought it'd be worth spelling out what we know about the CPU, and what it means as far as the console's performance is concerned.

Wii U's CPU

The Wii U, by all accounts we've had thus far, has a tri-core IBM PowerPC CPU, with 2-way SMT (simultaneous multi-threading), OoOE (out-of-order execution) and 3MB of L2 eDRAM cache. (The L1 cache, by the way, will be 32kB instruction cache + 32kB data cache, which is standard for all of IBM's PPC CPUs.) The CPU is expected to run somewhere north of 3GHz (and I'd put 3.5GHz as the absolute upper bound on that).

Given all this, the CPU would be expected to perform about the same in terms of raw computational power (flops) as the XBox360's CPU. The Wii U's CPU does have a couple of advantages, though, in terms of how efficiently that power is used.

A lot of modern CPU design isn't focussed so much on getting as many cores to run at as high clock-rates as possible (even in energy-guzzling server chips), but rather on getting the CPU to run as efficiently as possible, in particular minimizing the amount of wasted cycles. In old microprocessors, if the processor doesn't have the data it needs to perform an instruction, it has to wait until the data arrives from memory. Every clock cycle the data takes to arrive from memory is one in which the processor is doing nothing; it's a wasted cycle. If you're reading from memory a lot, these wasted cycles can add up to a considerable portion of the processor's time, and the actual performance you get will be a lot lower than the processor is in theory capable of.

There are two main technologies that modern processors use to minimize the proportion of cycles that are wasted. The first is the cache. This is a small pool of memory on the CPU that automatically pre-loads data as it's needed, and can then be accessed by the CPU at a much lower latency than main memory. The Wii U's CPU has a cache three times larger than the XBox360's CPU's, which means more data in low-latency range and hence fewer wasted cycles. It should also be expected that the cache in the Wii U is probably more advanced in its implementation: i.e. better at figuring out what data will be needed and when.

The second relevant technology is out-of-order execution. In a processor which is capable of out-of-order execution, instead of simply waiting when a piece of data isn't available for an instruction, the processor will go ahead and execute the next instruction along (if it can), or indeed the next instruction after that, or the next instruction after that, etc. This can, in some cases, substantially reduce the amount of wasted cycles, and it's a capability which the Wii U's CPU does have, and the XBox360's (and PS3's for that matter) doesn't have.
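As a toy illustration of the wasted-cycles argument above (the miss rates and latencies are made up for illustration, not measurements of either console's CPU):

```python
# Toy model of wasted cycles: effective cycles-per-instruction (CPI) when a
# fraction of instructions miss the cache and stall waiting on main memory.
# An out-of-order core can hide part of that latency by running other
# instructions in the meantime. Numbers are made up for illustration.

def effective_cpi(miss_rate, mem_latency_cycles, hidden_fraction=0.0):
    stall = miss_rate * mem_latency_cycles * (1.0 - hidden_fraction)
    return 1.0 + stall  # base of 1 cycle/instruction plus average stall

print(effective_cpi(0.02, 400))       # in-order core, small cache: ~9.0
print(effective_cpi(0.01, 400))       # bigger cache halves the misses: ~5.0
print(effective_cpi(0.01, 400, 0.5))  # plus OoOE hiding half the stall: ~3.0
```

Same clock speed in all three cases, yet the effective throughput triples; that's the sense in which a bigger cache plus OoOE can beat raw GHz.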

So what does this mean for games? Well, for some in-game CPU tasks, such as physics, the benefit will be pretty small. Physics is generally pretty linear code, so won't gain much from out-of-order execution or the bigger cache. By contrast, non-linear code like AI will benefit hugely from out-of-order execution and a large cache, so we'll see very significant improvements in AI performance on the Wii U's CPU compared to the XBox360's.

So, physics-heavy shooters like Red Faction wouldn't see much improvement CPU-wise on the Wii U, whereas AI-heavy games like RTSs should run significantly better. Most games will be somewhere in the middle, with a decent, if not amazing boost in performance over the XBox360's CPU. Keep in mind, though, that even if the CPU isn't any more efficient at running physics code, the fact that it's more efficient at other things means that developers have more power left to dedicate to physics, so all else being equal, we should still see better physics in Wii U games than XBox360 games in many cases.

It is worth noting, though, that even if the Wii U's CPU may be a bit disappointing to some, the HD consoles this generation (especially the PS3) went overboard with regard to their CPUs to the point where a lot of graphics work is offloaded to the CPU on many titles. A console with a modest boost in CPU power and a larger jump in GPU capabilities would make for a much more balanced machine.

Alright, the second thing I wanted to put up here before I head back into exile is about the GPU. It's just a bit of deduction/speculation on my part based on what we know.

Wii U's GPU

I've been thinking about the GPU a bit recently, and in particular what we can infer from the decision to use an R700 series (Radeon HD4xxx) chipset in the development kits. Firstly, we know that the Wii U's GPU is, to some extent, a custom chip. It may well be based around an existing chip, but at the very least it has 32MB of eDRAM onboard, and quite possibly some other extra stuff we don't know about. We also know that it began development in 2009. We can expect that in 2009 and early 2010 Nintendo and AMD settled down on the basic specifications for the chip, i.e. the number of SPUs, TMUs and ROPs, the use of VLIW5, VLIW4 or GCN architectures, and the intended manufacturing node. Now, sometime in late 2010 or early 2011, when Nintendo were putting together the first dev kits to be sent out to third parties, the GPU quite obviously wasn't ready, so they had to go with one of AMD's off-the-shelf cards as a stand-in, and they chose one from the R700 line (I've heard the HD4830, but I don't know if we've got confirmation of this).

Why did they do this?

We can pretty safely say that whatever GPU ends up in the Wii U, it will be manufactured at a 40nm or smaller process. Why then go with an older 55nm card when there were plenty of 40nm HD5xxx and HD6xxx cards available which could provide pretty much identical performance with a lower power draw? What characteristic does the Wii U's GPU share with the HD4xxx series that it doesn't with any card in the HD5xxx or HD6xxx lines? There's only one aspect that I can think of:

The HD4xxx series were the only 640 SPU cards available at the time the dev-kits were being put together.

This is actually a fairly sensible reason for putting a R700 series card in the dev kit; Nintendo had settled on a core configuration with 640 SPUs (and perhaps 32 TMUs and 16 ROPs), so a HD4830 would naturally have been the best fit for a development kit. I don't think it would be a stretch to say that this is good evidence for the final GPU being a 640 SPU part.

Now comes the real speculation. Early this year, we started to get reports that developers were getting new development kits with a performance boost over previous kits. That's the sort of thing you'd expect to hear if Nintendo replaced the R700 stand-in card with an early production version of the actual Wii U GPU. This lines up exactly with AMD's new 28nm HD7xxx series coming off the production line, and in particular the HD7770 (Cape Verde), their first 640 SPU part since the HD4xxx series. The HD7770, clocked down to about 600MHz-700Mhz, would fit pretty much perfectly into Nintendo's requirements as far as performance, size and heat are concerned.
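Putting that 640 SPU through-line on paper, using the desktop cards as stand-ins (the actual custom chip is unknown, so this is just the speculation above made explicit):

```python
# The 640 SPU through-line, on paper (desktop specs as stand-ins; the real
# custom chip is unknown). Theoretical GFLOPS = SPUs * 2 * clock_in_GHz.
cards = {
    "HD4830 (55nm, rumoured dev kit)": (640, 575),   # 736 GFLOPS
    "HD7770 (28nm, stock)":            (640, 1000),  # 1280 GFLOPS
    "HD7770 @ 600MHz (speculated)":    (640, 600),   # 768 GFLOPS
    "HD7770 @ 700MHz (speculated)":    (640, 700),   # 896 GFLOPS
}
for name, (spus, mhz) in cards.items():
    print(f"{name}: {spus * 2 * mhz / 1000:.0f} GFLOPS")
```

Note how a downclocked HD7770 lands right in the 700-900 GFLOPS window, close to what the HD4830 stand-in would have approximated in the early kits.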

Nintendo approached AMD in 2009 looking for a reasonably powerful, but low-wattage GPU to put in their mid-2012 console. It's not unreasonable to speculate that AMD said "we've got a 640 SPU part on a 28nm process planned for late 2011, how about we customise something around that?". It explains why they went with a HD4xxx card in the dev kits, it explains why the dev kit power boost came when it did, and it fits very neatly to what we've heard about performance and power consumption.

And to the inevitable "Nintendo would never do 28nm" responses, keep in mind that Nintendo have always used the smallest available node in manufacturing their hardware, right back to the 350nm chips in the N64. Also this would have been decided back in 2009/2010, when it would have been reasonable to expect the 28nm node to be ready for a 2012 reasonably-priced console. In fact, the push back of the release date from the summer could well be due in part to a desire to wait until the yields on 28nm chips increase.

We also have to consider whether NEC (now Renesas), who manufactured the Gamecube and Wii GPUs, and we can expect are first in line to manufacture the Wii U GPU, are capable of manufacturing at 28nm. As it happens, NEC announced a deal back in 2009 (when Nintendo would have been making the decision) with none other than IBM, to manufacture 28nm chips at East Fishkill, New York, in the very same facility which the Wii U's CPU is being manufactured. How's that for a coincidence?

Great posts, very informative, and some nice guess-work

Re-quoting in case anyone missed them. We'll see how accurate they end up being, but at least there's some educated guesswork behind it.

Merci :)
 
Dat tessellation.

Early on, some sources (probably just guesses) described a DX10-equivalent chip with tessellation tacked on.


How did you figure that out?

For comparison's sake, the volume of the x360 is ~400 cubic inches, and the WiiU is ~130 cubic inches. Obviously you can't get rid of the same heat when you have a smaller box.

Well, crap, you totally got me there. I did not at all have that particular variable in mind. Even the lack of an internal hard drive does not make up very much of that difference! I completely admit that by "comfortable zone" I was referring to a completely subjective measure based on what range of total system power I thought Nintendo might try to get away with.

So basically a WAG. Wasn't trying to be deceptive there, sorry. And I'll keep relative console sizes in mind when making statements like that in the future. Thank you. :)
 

ElFly

Member
I remember that Nibris (remember Nibris, Nintendo fans) got a Gamebryo license at one point.


If they can do it, maybe becoming a Nintendo licensee is not that hard.
 

Shurayuki

Member
Xbox 360 has a tessellation unit, and I believe it was the same one used all the way up to the Radeon 4000 series of GPUs, if that's what it's referring to.

Meh, catching up takes so long in this thread it's impossible. I'm only this far atm, still a long way to go.

Yes, they said it wasn't really used on the Xbox. Not that it wasn't present, just that the games didn't use it.

Honestly, I hope E3 never comes. I will miss this thread so much :eek:
 

nordique

Member
Early on, some sources (probably just guesses) described a DX10-equivalent chip with tessellation tacked on.

The assumption being based on the devkit GPUs


I'm pretty excited to see if Nintendo goes forward with a 28nm card.

This from Thraktor (and others, as I recall seeing it discussed in the past) is certainly intriguing

"We also have to consider whether NEC (now Renesas), who manufactured the Gamecube and Wii GPUs, and we can expect are first in line to manufacture the Wii U GPU, are capable of manufacturing at 28nm. As it happens, NEC announced a deal back in 2009 (when Nintendo would have been making the decision) with none other than IBM, to manufacture 28nm chips at East Fishkill, New York, in the very same facility which the Wii U's CPU is being manufactured. How's that for a coincidence?"
 
You guys should really not talk about vulnerabilities and hacking; I'm sure this is against some form of the GAF TOS. I know you guys mean well, because we are crazed for info, but come on lol.
 

Azure J

Member
The GPU is not going to be a 7770 or a 4850 or any other number.
It'll be built for Nintendo around what they want and what third parties say they need.
It could have a feature set of a 7xxx model with a much lower speed and without the dedicated RAM. We really can't say for sure. But it most definitely isn't just a repackaged Cenozoic.

No offense, but well, duh. :p

It's just nice to see where things fall, and with our only metrics being consumer cards, you'll see a lot of speculation drawing parallels to them.
 