
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


Schnozberry

Member
I'm puzzled by what kind of dedicated silicon they'd be using. I'm going to try and dig through some more Nintendo patents for some clues.
 
Nice to see you back bgassassin.

If I may ask, do you regret mentioning a 600+ GFLOPS GPU?

Nope. I stand by my opinions although that was based on a more traditional design. The problem is silicon dedicated to other tasks makes it harder to classify from a FLOPs angle, though I believe the performance would be similar. So no regrets at all as I was trying to make what wasn't tangible as tangible as possible.
 
Nice to see you back bgassassin.

If I may ask, do you regret mentioning a 600+ GFLOPS GPU?

"And if you think I deserve flak for what I’ve said in the past then I’m here, but you’re wasting your time trying because my view hasn’t changed yet"

Glad to see BG back. Now let's burn him at the stake!

*kidding*
 

Mr_B_Fett

Member
Potentially dumb question, but I was wondering and saw it posed on B3D as well: Is it possible to "mix" nodes for a component?

You could via stacking, but this should be pretty obvious (at least to the Chipworks folks), as aside from the node difference there would be relatively large TSVs. Moreover, I can't see any reason to do this given:

1) The GPU layout is custom so no savings in using an older node. It would simply be more expensive to stack and bring no benefits.
2) 55nm would seem to defeat the power consumption goals of the device.

In other words, if you are heavily customising the GPU anyway then there is nothing to gain from using an older node and to do so will only increase costs while reducing efficiency.
 

tkscz

Member
Nope. I stand by my opinions although that was based on a more traditional design. The problem is silicon dedicated to other tasks makes it harder to classify from a FLOPs angle, though I believe the performance would be similar. So no regrets at all as I was trying to make what wasn't tangible as tangible as possible.

Man, it's great to see the reason I joined GAF (and pretty much abandoned the GT forums) back. Also, this has been said before: we can't fully judge FLOPs due to parts of it (for lack of a better term) being fixed. In fact, didn't we guess this in the old WUST? That the lighting would be fixed? I remember commenting on that during E3 2011.
 

guek

Banned
Nope. I stand by my opinions although that was based on a more traditional design. The problem is silicon dedicated to other tasks makes it harder to classify from a FLOPs angle, though I believe the performance would be similar. So no regrets at all as I was trying to make what wasn't tangible as tangible as possible.

That's all well and good but aren't you putting a little too much stock in that case in currently unquantifiable metrics?
 
The Zelda demo in particular had a large emphasis on lighting. The 'day/night' switch seemed designed entirely to highlight the lighting change, and that was a big part of the demo. I can't recall the bird demo right now. From what I've noticed lately, lighting is one of the easiest-to-utilize effects that truly makes a game feel modern. I can see them pushing good lighting in just about every mainline Nintendo game.

If you get a chance to watch it, have a look at the extended 'Bird' tech demo again (it's filmed off-screen but the quality is decent in HD); the lighting is perhaps its most impressive feature imo -

http://www.youtube.com/watch?v=i2Nsa06KRLo

Even more impressive is that the console is rendering the game twice, once on the TV and once on the Gamepad.

Because of those two demos (which were created in a short time on the very first version of the devkits), I have never really been worried about whether Nintendo could create fantastic-looking games on WiiU hardware.

What they are going to create on final devkits with a decent amount of time and budget is going to be incredible imo.
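To illustrate the "rendering the game twice" point above, here is a rough conceptual sketch in generic desktop OpenGL. The Wii U's actual API (GX2) isn't public in this thread, so treat the function names, resolutions and the draw_scene helper as illustrative assumptions only: the idea is simply that the same scene is submitted to two render targets each frame, one for the TV and one for the GamePad.

/* Conceptual sketch only - generic desktop OpenGL, not the Wii U's real API.
 * Assumes a modern GL context plus a function loader (e.g. GLEW);
 * draw_scene() is a hypothetical engine helper for a given viewpoint. */
#include <GL/gl.h>

enum viewpoint { VIEW_TV, VIEW_GAMEPAD };
void draw_scene(enum viewpoint vp);   /* provided elsewhere by the engine */

void render_frame(GLuint tv_fbo, GLuint pad_fbo)
{
    /* Pass 1: TV output (e.g. 1280x720). */
    glBindFramebuffer(GL_FRAMEBUFFER, tv_fbo);
    glViewport(0, 0, 1280, 720);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    draw_scene(VIEW_TV);

    /* Pass 2: GamePad output (854x480), possibly from a different camera. */
    glBindFramebuffer(GL_FRAMEBUFFER, pad_fbo);
    glViewport(0, 0, 854, 480);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    draw_scene(VIEW_GAMEPAD);
}

The cost being roughly double the geometry and draw-call work per frame (though the GamePad pass runs at a lower resolution) is consistent with the frame rate struggles noted on the GamePad side later in the thread.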
 

Datschge

Member
I wrote Shin'en a mail since they, being familiar with the Wii, 3DS and Wii U, would be in a great position to drop some kind of hint regarding the "secret sauce" on the GPU. Predictably I only got a reference to the obligatory NDA and, regarding my guess that Nintendo might have taken the modern fixed functions of the 3DS into the Wii U in a compatible/comparable way, a note that the 3DS GPU and Wii U GPU are completely different and (as announced) designed by different companies. News to nobody really. So I'll just defer to bg's "dedicated silicon" from now on. ^^
 
Man, it's great to see the reason I joined GAF (and pretty much abandoned the GT forums) back. Also, this has been said before: we can't fully judge FLOPs due to parts of it (for lack of a better term) being fixed. In fact, didn't we guess this in the old WUST? That the lighting would be fixed? I remember commenting on that during E3 2011.

So basically similar to how the Gamecube could do certain things for "free" without a performance hit?


Wii U being deceptively powerful confirmed?

 
If you get a chance to watch it, have a look at the extended 'Bird' tech demo again (it's filmed off-screen but the quality is decent in HD); the lighting is perhaps its most impressive feature imo -

http://www.youtube.com/watch?v=i2Nsa06KRLo

Even more impressive is that the console is rendering the game twice, once on the TV and once on the Gamepad.

Because of those two demos (which were created in a short time on the very first version of the devkits), I have never really been worried about whether Nintendo could create fantastic-looking games on WiiU hardware.

What they are going to create on final devkits with a decent amount of time and budget is going to be incredible imo.

Although I remember pointing it out at the time, there are very clear frame rate struggles on the GamePad in that video. Otherwise, it does look very pretty.
 
Potentially dumb question, but I was wondering and saw it posed on B3D as well: Is it possible to "mix" nodes for a component?

This made me think of the old rumors that the WU's CPU and GPU were stacked, and I looked up the company rumored to have been involved in the WU fab

http://www.tezzaron.com/

They were brought up after the CTO made a few comments about Nintendo's next system using their technology

This is where things get tricky from the foundry perspective. A foundry might be willing to give a Tezzaron the information. But consider this: “Nintendo’s going to build their next-generation box,” said Patti. “They get their graphics processor from TSMC and their game processor from IBM. They are going to stack them together. Do you think IBM’s going to be real excited about sending their process manifest—their backup and materials—to TSMC? Or maybe TSMC will send it to IBM. Neither of those is ever going to happen. Historically they do share this with OSATs, or at least some material information. And they’ve shared it typically with us because I’m only a back-end fab. I can keep a secret. I’m not going to tell their competition. There’s at least a level of comfort in dealing with a third party that isn’t a competitor.”

It was discussed briefly here

http://www.neogaf.com/forum/showpost.php?p=41951939&postcount=7076

and on B3D here

http://beyond3d.com/showpost.php?p=1669229&postcount=2690

And it makes you wonder if they indeed had any technology end up in the WU at all. Their memory stacking technology in particular

http://www.tezzaron.com/memory/FaStack_memory.html

http://www.tezzaron.com/memory/Octopus.html

http://www.tezzaron.com/memory/memory.html

And the listing of their partners, including IBM, Renesas, TSMC, etc

http://www.tezzaron.com/technology/3D_IC_Summary.html
 
That's all well and good but aren't you putting a little too much stock in that case in currently unquantifiable metrics?

No because I've never said that view was guaranteed nor said it in a manner that it should be taken that way. Plus considering it was almost a year ago when I made that post about a reduced amount of shaders and dedicated silicon, I'm not all of a sudden trying to make some kind of excuse. I said it then and I'll say now that I don't know what the silicon could be for. My personal interpretation is that it's for performance-based tasks. People are free to believe that or not as I can't control that, but I make no claims that is what the logic is for.
 

ozfunghi

Member
No because I've never said that view was guaranteed nor said it in a manner that it should be taken that way. Plus considering it was almost a year ago when I made that post about a reduced amount of shaders and dedicated silicon, I'm not all of a sudden trying to make some kind of excuse. I said it then and I'll say now that I don't know what the silicon could be for. My personal interpretation is that it's for performance-based tasks. People are free to believe that or not as I can't control that, but I make no claims that is what the logic is for.

I just have to ask: what are you basing your opinion on? Educated guesses, or developer/insider feedback you have been privy to but can't share? To be clear, what you are stating is that, performance-wise, what the WiiU is able to put out graphically is 600 GFLOPS "worth" of processing power and effects, right?
 
I forgot to add that this old site has a picture of GC's GPU die though small.

http://www.segatech.com/gamecube/overview/



I just have to ask: what are you basing your opinion on? Educated guesses, or developer/insider feedback you have been privy to but can't share? To be clear, what you are stating is that, performance-wise, what the WiiU is able to put out graphically is 600 GFLOPS "worth" of processing power and effects, right?

See my long post above. If it's missing something you're wanting to understand, then let me know.
 

Popstar

Member
According to the dubiously reliable Wikipedia, ASTC is an official extension for OpenGL now. That's where the thought crossed my mind. A tile mode would be interesting. Are we talking PowerVR type tile rendering, or something entirely different?
ASTC is an official extension to OpenGL. But there is no hardware supporting it yet.

We're not talking PowerVR type tile rendering. I mean tilemaps such as you'd see in a SNES game.
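For what it's worth, here's a minimal sketch (plain desktop OpenGL, nothing Wii U specific) of how an application would check for that ASTC extension at runtime; GL_KHR_texture_compression_astc_ldr is the ratified Khronos extension string, and on hardware of this era the check would come back false:

/* Minimal sketch: detect ASTC (LDR profile) support in a generic GL context.
 * Assumes a valid OpenGL context is already current; not a Wii U API. */
#include <string.h>
#include <GL/gl.h>

int has_astc_ldr(void)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts && strstr(exts, "GL_KHR_texture_compression_astc_ldr") != NULL;
}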
 
No because I've never said that view was guaranteed nor said it in a manner that it should be taken that way. Plus considering it was almost a year ago when I made that post about a reduced amount of shaders and dedicated silicon, I'm not all of a sudden trying to make some kind of excuse. I said it then and I'll say now that I don't know what the silicon could be for. My personal interpretation is that it's for performance-based tasks. People are free to believe that or not as I can't control that, but I make no claims that is what the logic is for.
Heh, I was going to contact you to see what you thought about the latest info, but I was waiting to see if we were able to get more info about the customizations first. I didn't expect you to pop back up on your own. :)
 

ozfunghi

Member
Medal plz.

If anyone deserves a medal, it's Wsippel:

The Wii U GPU won't be built around or based on Southern Islands. From everything we know, it's based on R700 but customized beyond recognition. It's unlike any off-the-shelf AMD GPU.

I talked to bgassassin a few days ago, and I believe we concluded that the chip is probably pretty slow on paper, maybe 300-400 GFLOPS or something, but extended with a couple of shortcuts to accelerate certain common, taxing operations.
 

ozfunghi

Member
bg too since wsippel said "we" in your quote; meaning both him and bg reached the same conclusion. :)

No, because BG said his conclusion was not the exact same in response to Wsippel's quote, and BG overshot, lol. Or at least Wsippel gets a bigger medal :)
 
bg too since wsippel said "we" in your quote; meaning both him and bg reached the same conclusion. :)

No, because BG said his conclusion was not the exact same in response to Wsippel's quote, and BG overshot, lol. Or at least Wsippel gets a bigger medal :)

Haha. I would give wsippel the bigger medal because he came from the lower FLOPs angle while I focused on lower ALUs and my take wasn't low enough based on what we know so far. The question now is if Nintendo took an approach similar to our ideas to address performance. Did they use those shortcuts? Dedicate silicon? Both? Neither? It will be interesting to learn.
 
Now it makes sense what BG was telling me over 6 months ago: that the GPU's performance would be similar to an AMD E6760's, though it isn't based on or built just like that part, aside from being very close in power consumption. In that sense the Wii U GPU does share some things in common with that GPU, but it's still completely custom and not like any GPU for a PC or the coming systems from MS and Sony.

However, Nintendo hopefully made the GPU "easy enough" to develop for that porting down-scaled modern engines would not be too much of a hassle, so even its custom, "not ordinary" parts can still be used.

Bravo BG
 

ozfunghi

Member
Man, reading that old thread has boggled my mind in two ways:

1/ It's been nearly a YEAR!
2/ And I still remember most of it, lol
 
Now it makes sense what BG was telling me over 6 months ago: that the GPU's performance would be similar to an AMD E6760's, though it isn't based on or built just like that part, aside from being very close in power consumption. In that sense the Wii U GPU does share some things in common with that GPU, but it's still completely custom and not like any GPU for a PC or the coming systems from MS and Sony.

However, Nintendo hopefully made the GPU "easy enough" to develop for that porting down-scaled modern engines would not be too much of a hassle, so even its custom, "not ordinary" parts can still be used.

Bravo BG

LOL. I don't deserve a bravo, that's for sure. But I do agree with the part in bold, and if this is the case, hopefully Nintendo does what is necessary to make it easily accessible for others.


Also I just remembered this comment.

“Today we discovered a new hardware feature of the Wii U that shaves off 100 megabytes of texture memory usage in Toki Tori 2! [This] means we spend less time loading and have more memory available when the game is running.”

So it could be said we have some proof of dedicated silicon assisting performance. The problem is they most likely had to discover it on their own.
 

NBtoaster

Member
If Nintendo isn't even fully documenting the hardware or what the API supports the chances of [good] next gen engine downports are slim.
 
The problem is they most likely had to discover it on their own.
That's actually pretty ridiculous if true.

In fact, the whole situation, wherein GAF has had to crowdsource images and ChipWorks has been exceedingly generous in providing die shots, presumably means the documentation doesn't detail any GPU specifications?

VGLeaks' MO has essentially been copying and pasting documentation - which is how we have so many details of the PS4 and XBOX 3. Yet, nothing for the Wii U.

I wouldn't be surprised if third parties can't be assed if that's the case.
 
If Nintendo isn't even fully documenting the hardware or what the API supports the chances of [good] next gen engine downports are slim.

That's actually pretty ridiculous if true.

In fact, the whole situation, wherein GAF has had to crowdsource images and ChipWorks has been exceedingly generous in providing die shots, presumably means the documentation doesn't detail any GPU specifications?

VGLeaks' MO has essentially been copying and pasting documentation - which is how we have so many details of the PS4 and XBOX 3. Yet, nothing for the Wii U.

I wouldn't be surprised if third parties can't be assed if that's the case.

I agree on both counts.

And yes to your question, shinro. Well, "any" might be a little much, though.
 

ozfunghi

Member
Could anybody provide a short summary of what's been found? I'm not techy in the least, so I have trouble following even the OP.

More than likely 320 SPUs => 352 GFLOPS
an additional 4 MB of eDRAM (new info)
an additional 1 MB of SRAM (new info)
lots of customized and thus unknown parts
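For anyone wondering where the 352 GFLOPS figure comes from, it's the usual peak-throughput arithmetic; a quick sketch, assuming 320 stream processors, one multiply-add (2 FLOPs) per SP per cycle, and the ~550 MHz GPU clock implied by the thread's figure:

#include <stdio.h>

int main(void)
{
    const double sps       = 320.0;   /* stream processors counted on the die */
    const double flops_fma = 2.0;     /* one multiply-add = 2 FLOPs per cycle */
    const double clock_ghz = 0.550;   /* assumed ~550 MHz GPU clock */

    /* 320 * 2 * 0.55 GHz = 352 GFLOPS peak */
    printf("%.0f GFLOPS\n", sps * flops_fma * clock_ghz);
    return 0;
}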
 