
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


Pie and Beans

Look for me on the local news, I'll be the guy arrested for trying to burn down a Nintendo exec's house.
So the old faithful are still hoping the "mysteries" of the GPU chip are going to yield some incredible magic and new form of graphics processing technology, when it's more than likely just all the gumpf needed for 1:1 Wii b/c. This is the end of the road, duders. It's been spec'd.

But this:
crucial information simply wasn't available in Nintendo's papers, with developers essentially left to their own devices to figure out the performance level of the hardware.
This is totally unacceptable, and people still wonder why third party developers just don't want to deal with Nintendo. "Figure it out!" Yeah, the response from western devs far and wide seems to be "Swivel on it".
 

tipoo

Banned
So I've been out for the last few pages and I'll read back later, but by now, can anyone tell me where all this fixed function hardware is? Outside the DSP (or whatever that is), eDRAM, shaders, TMUs etc, all the uncore parts of the GPU look exactly the same as any other unified shader GPU to me. Where are these supposed fixed function shaders that make it faster than it seems?
 

javac

Member
Blimey charlie, they didn't hold back, did they?

As in they are trashing Nintendo... because I don't know. Seems like a genuine article with the information given to the reader as is. It isn't really skewed or biased or making Nintendo look bad. If it means anything, I'm happy with the tech. If X is anything to go by, I'm very happy.

If you mean 'they didn't hold back' as in they went all out with the tech and graphics gobbledygook then, yeah, that's Eurogamer for ya :p

So the old faithful are still hoping the "mysteries" of the GPU chip are going to yield some incredible magic and new form of graphics processing technology, when it's more than likely just all the gumpf needed for 1:1 Wii b/c. This is the end of the road, duders. It's been spec'd.

I can't speak on behalf of everyone, but I'm genuinely not fazed by the specs. Again, if I was happy with the DS in the face of the PSP, the 3DS in the face of the Vita and the Wii in the face of the 360/PS3/PC, then I think I and many others will still be happy with the Wii U. People expecting the Wii U specs to contain magic are fighting for a lost cause, but that doesn't mean we won't get some stonkingly good looking games on the system. I'm 100% sure of it. Genuinely excited!
 
I look forward to no longer having my head taken off in threads where I correct people's inflated GPU performance figures for the WiiU with a more realistic "~350GFlops".

So I've been out for the last few pages and I'll read back later, but by now, can anyone tell me where all this fixed function hardware is? Outside the DSP (or whatever that is), eDRAM, shaders, TMUs etc, all the uncore parts of the GPU look exactly the same as any other unified shader GPU to me. Where are these supposed fixed function shaders that make it faster than it seems?

Same place as the 150 phantom GFlops.
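
(For anyone wondering where that "~350" comes from, it's just the standard shader-throughput formula. A minimal sketch, assuming the 320 stream processors and the 550MHz clock being floated in this thread, neither of which is confirmed:)

```python
# Rough shader throughput estimate: FLOPS = ALUs * FLOPs-per-ALU-per-clock * clock.
# Assumes 320 stream processors (unconfirmed) each doing one multiply-add (2 FLOPs) per clock.
stream_processors = 320
flops_per_clock = 2          # a fused multiply-add counts as two operations
clock_hz = 550e6             # 550 MHz, per the clocks Marcan published

gflops = stream_processors * flops_per_clock * clock_hz / 1e9
print(f"{gflops:.0f} GFLOPS")  # -> 352 GFLOPS, i.e. the "~350" figure
```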
 

SmokyDave

Member
As in they are trashing Nintendo... because I don't know. Seems like a genuine article with the information given to the reader as is. It isn't really skewed or biased or making Nintendo look bad. If it means anything, I'm happy with the tech. If X is anything to go by, I'm very happy.

If you mean 'they didn't hold back' as in they went all out with the tech and graphics gobbledygook then, yeah, that's Eurogamer for ya :p

I meant this line: "we can now, categorically, finally rule out any next-gen pretensions for the Wii U - the GCN hardware in Durango and Orbis is in a completely different league". I decided not to quote it in this thread though because of the high potential for derailing and now the remainder of my post is so ambiguous as to not make sense :(
 
So I've been out for the last few pages and I'll read back later, but by now, can anyone tell me where all this fixed function hardware is? Outside the DSP (or whatever that is), eDRAM, shaders, TMUs etc, all the uncore parts of the GPU look exactly the same as any other unified shader GPU to me. Where are these supposed fixed function shaders that make it faster than it seems?
I already answered that: the area devoted to those kinds of functions on normal GPUs isn't remotely comparable in relative size to the area found on the Wii U.

Texture compression algorithms, tessellation... those are the kinds of blocks that can be found on the HD 4870 die shot posted before. Even considering that Nintendo wouldn't customise it (since all games will be made for the hardware, it doesn't make any sense to support lots of texture formats like a PC GPU has to), there's still A LOT of space in that die that could allow for those fixed functions or whatever is placed on there.
 
So the old faithful are still hoping the "mysteries" of the GPU chip are going to yield some incredible magic and new form of graphics processing technology, when it's more than likely just all the gumpf needed for 1:1 Wii b/c. This is the end of the road, duders. It's been spec'd.


Don't get me wrong, but posts like this aren't helpful at all. First, even if the Wii U GPU had a full Wii GPU on it (obviously shrunk to 40nm), that still wouldn't explain everything that's still unknown on the GPU (as is stated in the OP, it would only take up something like 10-20% of the die space). Second, afaik nobody is claiming that Nintendo built a GPU that can do wonders. We are simply curious about all the stuff that we don't know anything/enough about yet. If you aren't curious what that stuff is, then you may be in the wrong thread.
 

tipoo

Banned
However, there is almost certainly at minimum a Hollywood inside this die as well, considering how the Wii U handles backwards compatibility. At 550MHz that would give Hollywood 24GFLOPS. Fixed functions are far better at doing their job than programmable shaders, but can do little else. It is more capable than the 360, but it is impossible to really compare beyond that.



Hollywood could be on a separate clock/power plane as well, it doesn't have to be clocked at the same speed of the rest of the Wii U GPU. For perfect compatibility it would be at 243 MHz. We don't know which clock it goes to in Wii U mode or if it even stays on, it's also possible that it's on only for Wii mode. And even if it was, would it be able to help at higher resolutions than it was designed for? The thing pushes 480p max on the Wii.
 

Kenka

Member
The DF article really sounds sensationalist. Why do they come in with their own private, loose definition of "next-gen"? Is there a clear barrier between what is next-gen and what isn't? They avoided talking about the components in the bottom right of the chip that we still don't know about; don't they think those could (greatly) influence the overall output of the console?


That's really amateurish.
 

User Tron

Member
So the old faithful are still hoping the "mysteries" of the GPU chip are going to yield some incredible magic and new form of graphics processing technology, when it's more than likely just all the gumpf needed for 1:1 Wii b/c. This is the end of the road, duders. It's been spec'd.

But this:

This is totally unacceptable, and people still wonder why third party developers just don't want to deal with Nintendo. "Figure it out!" Yeah, the response from western devs far and wide seems to be "Swivel on it".

Regardless if you know the specs or not, you'll always have to "figure it out". Knowing the specs doesn't help you that much, because you cannot see the limiting factors.
 

kinggroin

Banned
I meant this line: "we can now, categorically, finally rule out any next-gen pretensions for the Wii U - the GCN hardware in Durango and Orbis is in a completely different league". I decided not to quote it in this thread though because of the high potential for derailing and now the remainder of my post is so ambiguous as to not make sense :(

While there is still much to discover on this scan, I think it goes without saying that DF is correct, with maybe their insinuated definition of "next-generation" being the only potential point of contention amongst those who want something to argue about.

This system will live and die by its exclusive software. Let's hope for their sake, artistic talent will be enough to overcome the gulf in grunt power, and we see visually competitive games.
 

Pie and Beans

Look for me on the local news, I'll be the guy arrested for trying to burn down a Nintendo exec's house.
Don't get me wrong, but posts like this aren't helpful at all. First, even if the Wii U GPU had a full Wii GPU on it (obviously shrunk to 40nm), that still wouldn't explain everything that's still unknown on the GPU (as is stated in the OP, it would only take up something like 10-20% of the die space). Second, afaik nobody is claiming that Nintendo built a GPU that can do wonders. We are simply curious about all the stuff that we don't know anything/enough about yet. If you aren't curious what that stuff is, then you may be in the wrong thread.

The double standard of WUST and other threads, where discussion elicited by the brutal Digital Foundry conclusion is hush-hushed in this same manner, is what has resulted in the echo chamber. The disillusioned posts that can be found in this very thread are symptoms of that, because things aren't matching up to the fictional version of reality forged by that tight-knit little community. We can all read between the lines of posts replying to "oh well, that rules out..." with "50% MYSTERY CHIP INNARDS" and what they're trying to attest. That ain't simple curiosity; that's defence, which at this point is misplaced.
 

wsippel

Banned
So I've been out for the last few pages and I'll read back later, but by now, can anyone tell me where all this fixed function hardware is? Outside the DSP (or whatever that is), eDRAM, shaders, TMUs etc, all the uncore parts of the GPU look exactly the same as any other unified shader GPU to me. Where are these supposed fixed function shaders that make it faster than it seems?
So you managed to count and identify every single block and compared it to an off-the-shelf R700? Mind posting your findings?
 

Mastperf

Member
A good chunk of the silicon is still of unknown nature.
Seems more of a case of some still holding out hope while others have already made up their mind. DF knows it's highly unlikely the unknown silicon will make a significant difference. They mention that there's still plenty unknown about the gpu and form an educated guess on what they do know. People have been doing that on both sides for months, and that was long before we had any info whatsoever.
 

Here are some quotes from DF that strike me as biased.

While the general capabilities of the Wii U hardware are now beyond doubt,

Really, if anything, they are more in doubt. Especially considering this quote from them

However, while we now have our most important answers, the die-shot also throws up a few more mysteries too - specifically, what is the nature of the second and third banks of RAM up on the top-left, and bearing in mind how little of the chip is taken up by the ALUs and TMUs, what else is taking up the rest of the space? Here we can only speculate, but away from other essential GPU elements such as the ROPs and the command processor, we'd put good money on the Wii U equivalent to the Wii's ARM 'Starlet' security core being a part of this hardware, along with an audio DSP. We wouldn't be surprised at all if there's a hardware video encoder in there too for compressing the framebuffer for transmission to the GamePad LCD display. The additional banks of memory could well be there for Wii compatibility, and could account for the 1MB texture and 2MB framebuffer. Indeed, the entire Wii GPU could be on there, to ensure full backwards compatibility.

So they side with 40-50% of the chip being used up by a security core, backwards compatibility, and an audio DSP? Seriously, these would account for much less space. Perhaps we are dealing with asymmetric shaders?

They also neglect to state that the portion for Wii BC can be used for Wii U games. Rather, they state that only 15% of the GPU is actually used for GPU tasks.

So with so much unknown about the chip, how can they claim this?

AMD's RV770 hardware is well documented so with these numbers we can now, categorically, finally rule out any next-gen pretensions for the Wii U

Seriously? If anything, we have to reconsider our assessment of the Wii U. RV770 hardware performance is no longer a ballpark, given how much was customised. This thing has a lot more grunt than previously thought.
 

Mastperf

Member
Here are some quotes from DF that strike me as biased.



Really, if anything, they are more in doubt. Especially considering this quote from them



So they side with 40-50% of the chip being used up by a security core, backwards compatibility, and an audio DSP? Seriously, these would account for much less space. Perhaps we are dealing with asymmetric shaders?

They also neglect to state that the portion for Wii BC can be used for Wii U games. Rather, they state that only 15% of the GPU is actually used for GPU tasks.

So with so much unknown about the chip, how can they claim this?



Seriously? If anything, we have to reconsider our assessment of the Wii U. RV770 hardware performance is no longer a ballpark, given how much was customised. This thing has a lot more grunt than previously thought.
You're doing the same thing you accuse them of. We have no idea how much grunt this thing has.
 
You're doing the same thing you accuse them of. We have no idea how much grunt this thing has.

Sorry, I made the assumption that space on the GPU would be used for graphics-related things. Silly me.

I retract my statement about it having more grunt. It could in fact have less grunt. But RV770 chips can't be used as a ballpark.
 

tipoo

Banned
Wii U!

352 Gflops GPU (???)
2GB T1 SRAM @ 12.8 GB/s (???)
32MB eDRAM @ 140 GB/s (???)
1MB sRAM/eDRAM (???)
???? CPU (???)

That would be massive and hugely expensive :p
2GB DDR3, not T1 SRAM.
Orbis is 176GB/s, not 192. And once again, Wii U memory isn't DDR3. It's T1 SRAM or something, but you got the bandwidth right this time. :)


What now? The main memory IS DDR3, we know that from the markings; it was cross-checked with listings in Hynix's (I think) memory database. It's DDR3 for sure.
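
(And the 12.8GB/s figure follows directly from that, assuming the commonly reported configuration of DDR3-1600 on a 64-bit bus, i.e. four 16-bit chips; a quick sketch:)

```python
# Peak main-memory bandwidth = transfer rate * bus width.
# Assumes DDR3-1600 (1600 MT/s) across a 64-bit bus, the commonly reported Wii U config.
transfers_per_sec = 1600e6   # DDR3-1600 effective transfer rate
bus_width_bytes = 64 // 8    # 64-bit bus -> 8 bytes per transfer

bandwidth_gb_s = transfers_per_sec * bus_width_bytes / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # -> 12.8 GB/s
```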


So you managed to count and identify every single block and compared it to an off-the-shelf R700? Mind posting your findings?

I'm just asking if it was pointed out while I was gone. Turn down the snark, please.
 

The Hermit

Member
Is the gap in power between the Wii U and next-gen the same as it was between the Wii and PS360, or even bigger?

I didn't expect such a huge gap... gg Nintendo.
 

Cronq

Banned
The double standard of WUST and other threads, where discussion elicited by the brutal Digital Foundry conclusion is hush-hushed in this same manner, is what has resulted in the echo chamber. The disillusioned posts that can be found in this very thread are symptoms of that, because things aren't matching up to the fictional version of reality forged by that tight-knit little community. We can all read between the lines of posts replying to "oh well, that rules out..." with "50% MYSTERY CHIP INNARDS" and what they're trying to attest. That ain't simple curiosity; that's defence, which at this point is misplaced.

My thoughts exactly; we still don't understand everything about the hardware and it's already been proclaimed "weak". I don't believe it's a powerhouse, but I do think it's a capable machine and will be more than sufficient for Nintendo and competent developers to make some awesome stuff. It's a balanced system, and I think this die photo shows just how complex and sophisticated Nintendo's design is.
 

Mastperf

Member
Sorry, I made the assumption that space on the GPU would be used for graphics-related things. Silly me.

I retract my statement about it having more grunt. It could in fact have less grunt. But RV770 chips can't be used as a ballpark.
"Yeah, sorry. I won't stoop to their level." would have been a better reply.
Don't let me stop you from fighting your war.
 

kinggroin

Banned
Is the gap in power between the Wii U and next-gen the same as it was between the Wii and PS360, or even bigger?

I didn't expect such a huge gap... gg Nintendo.

No. Not at all.

Still plenty sizeable it seems, but having a more modern (if extensively customized) architecture goes a long way to helping games incorporate the kind of effects we see in 360/PS3/PC games.
 

tipoo

Banned
Now why would Nintendo go and do such a thing? EDRAM is expensive. You have it, you use it.

I would hope, but is it possible that it only has the wiring to connect to Hollywood for BC, and that's why we've never heard of it? And this is 100% bum pulled theory, but could Hollywood be being used for realtime video decode/encode while it's in Wii U mode, making it unavailable to devs?

EDIT: Possibly I'm wrong and it's available to devs:


The NeoGAF folks could've just asked me and I would've told them about the 32MB MEM1 and 2MB MEM0/EFB without die shots :p

32MB 1T-SRAM MEM1, 2MB 1T-SRAM MEM0/EFB, 1MB SRAM ETB. I bet the I/O block with the tank is SATA and the 7x next to it USB.

https://twitter.com/marcan42
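
(For reference, the 2MB MEM0/EFB block lines up with the GameCube/Wii embedded framebuffer, which topped out around 640x528 with 24-bit colour plus 24-bit Z. A back-of-the-envelope check; the exact figure is approximate:)

```python
# Approximate size of the Wii/GameCube embedded framebuffer (EFB).
# Assumes the maximum 640x528 EFB resolution with 24-bit colour + 24-bit depth.
width, height = 640, 528
bytes_per_pixel = 3 + 3      # 24-bit colour + 24-bit Z

efb_bytes = width * height * bytes_per_pixel
print(f"{efb_bytes / 2**20:.2f} MiB")  # -> ~1.93 MiB, i.e. roughly that 2MB block
```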
 

VariantX

Member
Is the gap in power between the Wii U and next-gen the same as it was between the Wii and PS360, or even bigger?

I didn't expect such a huge gap... gg Nintendo.

I don't think it's as huge as the gap between the previous generation and the current one. Last I heard, the gap between the Wii and the 360/PS3 was around 10x.
 

tkscz

Member
Really?! That's what I expected since the beginning, not something that could run next-gen games...

Anyway, I don't even know why I am talking about this, I just want some games...

You sir are a saint on gaf for that one sentence.
 
Comments from Marcan (who provided the clock speeds).

@MehNitesh 32MB 1T-SRAM MEM1, 2MB 1T-SRAM MEM0/EFB, 1MB SRAM ETB. I bet the I/O block with the tank is SATA and the 7x next to it USB.
The NeoGAF folks could've just asked me and I would've told them about the 32MB MEM1 and 2MB MEM0/EFB without die shots :p
I still think the entire Wii GX is in there too. It's not a ridiculous amount of die area - about 2x the size of the EFB 1T block.
https://twitter.com/marcan42
 

tipoo

Banned

That's me on Twitter (MehNitesh) who sent him the die shot we got :p

Man, that much of a difference?

3.5x (to 5x on the high end for the PS4) doesn't seem too bad to me anymore, after the Wii. It's the CPU I'm looking at more. 1.3x the clock speed and 2.6x the raw core count, plus Jaguar has some interesting FPU enhancements in areas the PPC 750 was traditionally weak (Jaguar doubles its FPU pipeline to 128-bit, supports 256-bit instructions, etc.), and Marcan said the FPU was unchanged from the 750 on the U. It appears Jaguar would have a triple win in instructions per clock, clock speed, and core count. Unless there are some crazy hidden enhancements in Espresso, but again he's saying it appears identical.
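
(Putting rough numbers on that: a naive sketch using Espresso's published clocks from Marcan and the rumoured 8-core 1.6GHz Jaguar config for the other consoles, which is an assumption. It ignores IPC entirely, which is exactly the part that's still unknown:)

```python
# Naive throughput ratio: clock ratio * core-count ratio, ignoring per-clock differences.
# Espresso figures come from Marcan's published clocks; the Jaguar config is rumoured.
espresso_clock, espresso_cores = 1.24e9, 3
jaguar_clock, jaguar_cores = 1.6e9, 8      # assumption: 8 Jaguar cores at 1.6 GHz

clock_ratio = jaguar_clock / espresso_clock    # ~1.3x
core_ratio = jaguar_cores / espresso_cores     # ~2.7x
print(f"clock: {clock_ratio:.1f}x, cores: {core_ratio:.1f}x, combined: {clock_ratio * core_ratio:.1f}x")
```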
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
So everyone falling back to "fixed function logic makes it incomparable", it seems to me that the uncore (not sure if that term applies in GPUs, but everything that isn't a shader, eDRAM, TMU etc) parts are about as large as other GPUs. Where is all this fixed logic that will save it?
Who says it needs saving?
 
So I've been out for the last few pages and I'll read back later, but by now, can anyone tell me where all this fixed function hardware is? Outside the DSP (or whatever that is), eDRAM, shaders, TMUs etc, all the uncore parts of the GPU look exactly the same as any other unified shader GPU to me. Where are these supposed fixed function shaders that make it faster than it seems?
Try to keep up tipoo, it's been already answered:

http://www.youtube.com/watch?v=1Hc7gVmoMoA
 
The thing about the Digital Foundry article is, they say 1.5x the power of the 360, with no complex 3D games at 1080p.

First of all, maybe they're new to Nintendo's art styles of choice.

And two, I thought people were saying that with the 32MB of EDRAM, the Wii U got either 720p with 4x AA or 1080p for free?

IDK, as long as I get 1080p out of the second half of the Wii U's life cycle, I'll be happy with its power. 720p needs to die.
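
(The arithmetic on the "for free" claim is simple enough, assuming a 32-bit colour target plus a 32-bit depth/stencil buffer; actual formats are up to the developer, so this is just one plausible case:)

```python
# Does a given render target fit in 32 MB of eDRAM?
# Assumes 4 bytes colour + 4 bytes depth/stencil per sample; real formats vary.
def framebuffer_mib(width, height, msaa_samples=1, bytes_per_sample=4 + 4):
    return width * height * msaa_samples * bytes_per_sample / 2**20

EDRAM_MIB = 32
for label, size in [("1080p, no AA", framebuffer_mib(1920, 1080)),
                    ("720p, 4x MSAA", framebuffer_mib(1280, 720, msaa_samples=4))]:
    print(f"{label}: {size:.1f} MiB -> {'fits' if size <= EDRAM_MIB else 'does not fit'}")
# -> ~15.8 MiB and ~28.1 MiB respectively, both inside 32 MB
```

For contrast, the 360's 10MB of eDRAM couldn't hold a full 720p target with 4xAA, which is why it needed tiling.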
 

Mlatador

Banned
I don't think it's as huge as the gap between the previous generation and the current one. Last I heard, the gap between the Wii and the 360/PS3 was around 10x.

In raw GFLOPS the 360 GPU was 20x the Wii; this time round the 720 is looking to be 3.5x the Wii U.
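
(Quick sanity check on those multipliers, using the raw GFLOPS figures floating around this thread; all of them are approximate and the Durango number is still a rumour:)

```python
# Raw-GFLOPS ratios behind the "20x" and "3.5x" claims above (all figures approximate).
wii, xenos = 12, 240            # commonly cited Wii Hollywood vs 360 Xenos numbers
wii_u, durango = 352, 1200      # Wii U estimate from this thread; Durango figure is a rumour

print(f"360 vs Wii:       ~{xenos / wii:.0f}x")       # -> ~20x
print(f"Durango vs Wii U: ~{durango / wii_u:.1f}x")   # -> ~3.4x
```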

All this talk about how many "x" times a console is more powerful than another. Sorry to be that guy again, but even if console A is 10x more powerful than console B, what does it matter in the end? Are/were the games 10x "better" or 10x more fun?

Now WAIT: all the talk about how many times one console is more powerful than another was much more relevant in the early years of gaming, and especially in the early years of 3D graphics. Visions couldn't be realized because of that. 3D objects didn't match the ideas they were built from, simply because there weren't enough polygons and graphical effects to make objects look like devs wanted them to, and yet the games were fun (and IMHO still are to this date).

So there was indeed a correlation between hardware power and "better" games, it seems, but that was mostly true for games that strove for some sort of realism or huge worlds.

I'd say that since the PS360, the difference between what devs try to do and what hardware lets them do has become MUCH SMALLER. I'd even say that from that point on, any improvements in hardware will only help them a little, since they are almost "free" now. They are free to do what they want (almost). Since the PS360 I feel they are not "restrained" by hardware anymore. If anything, they are restrained by their own ideas, talent and artistic ability. Oh, and by MONEY (huge development costs).

In addition to that, publishers, and the market with all its streamlined taste, are holding them back more than anything ever has!

There are so many people here (or in general) whose hopes of a better gaming future with better games rely almost solely on "better hardware", when that actually should be their least concern.

And yes, Wii U games will still look "very good". Some of them will even look "awesome". Why? Because there is only so much your eyes can see. And with all the tech inside of the Wii U, devs will be more than able to meet their visions if they want and work hard.
 

sniperpon

Member
All this talk about how many "x" times a console is more powerful than another.

I agree with this sentiment. To me, there are two ways to compare disparate pieces of hardware: the cold-blooded empirical approach (comparing gigaflops and what have you), and the more (in my opinion) real-world approach of what the user's experience is.

For instance, transitioning from the Genesis to the 3DO in 1993 was an Earth-shattering switch for me: FMV, CD-quality sound, S-Video, 32-bit color, and even 3D in some of the games? It was mind-blowing (I remember people lining up in the mall to get a look at Road Rash when it came out; it'd be like seeing a working holodeck today).

In terms of raw computing power it might not have been a huge jump, I don't know the answer to that. But when you factor in paradigm shifts like the switch to CDs as part of the equation, the real-world effect lit your hair on fire.

To stick with my analogy, in the real world (versus on paper) will the 720/PS4 "light our hair on fire" compared to the Wii U? I doubt it-- the games will look marginally better I'm sure, but I don't think it's going to be any sort of revolutionary difference (who knows, maybe I'll be wrong, but I'm just throwing it out there).
 

MDX

Member
This is totally unacceptable, and people still wonder why third party developers just don't want to deal with Nintendo. "Figure it out!" Yeah, the response from western devs far and wide seems to be "Swivel on it".


Iwata had addressed/explained that particular issue.
 

z0m3le

Banned
Hollywood could be on a separate clock/power plane as well, it doesn't have to be clocked at the same speed of the rest of the Wii U GPU. For perfect compatibility it would be at 243 MHz. We don't know which clock it goes to in Wii U mode or if it even stays on, it's also possible that it's on only for Wii mode. And even if it was, would it be able to help at higher resolutions than it was designed for? The thing pushes 480p max on the Wii.

It is possible that it's that way, but I find it unlikely, since Nintendo has stated that some of Hollywood's functionality is simply done inside the newer GPU architecture. Since the Wii U in Wii mode would have to use the newer GPU for BC, it likely all shares a clock/power plane.

Also, I had the clocks wrong: if it was in fact an entire Hollywood on this die, it would be 27GFLOPS of fixed-function shaders. But since some of Hollywood's work will be done on the new architecture directly, it is likely that you'd only get part of that 27GFLOPS as fixed function; I would guess around 16GFLOPS, which would mean free lighting at the very least.
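
(That 27GFLOPS is just the fixed-function throughput scaling linearly with clock; working backwards, it implies roughly 12 GFLOPS at the Wii's native 243MHz. A sketch with those estimated, unconfirmed numbers:)

```python
# Fixed-function throughput scales linearly with clock (same hardware, higher clock).
wii_clock_mhz = 243            # Hollywood's clock in Wii mode, per Marcan
wii_u_clock_mhz = 550          # the Wii U GPU clock discussed in this thread
hollywood_gflops_at_243 = 12   # commonly cited estimate for the Wii GPU

scaled = hollywood_gflops_at_243 * wii_u_clock_mhz / wii_clock_mhz
print(f"~{scaled:.0f} GFLOPS at {wii_u_clock_mhz} MHz")  # -> ~27 GFLOPS
```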
 

Orionas

Banned
The issue isn't the PSU, that's rated at 75W. I.e. there's tons of headroom left, even if we dedicated something like 20W to the 4 USB ports.

It's not 20W... they're low-power USB ports, and that's why people are having issues with non-self-powered HDDs losing their connection: when the HDD is under load it tries to draw more power, can't, so it conflicts, stops and restarts. And that's why Nintendo suggests self-powered external HDDs...

I wonder why they chose to do it that way; they made some mistakes. Even so, let's give this amazing 20W to the USB ports, 55W left. Let's deduct 24W for the CPU, disc drive, etc., and roughly 26W is left for the GPU.
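
(For what it's worth, the budget arithmetic from those figures; the 75W rating comes from the post above, and the 20W/24W allocations are guesses:)

```python
# Rough power budget from the figures quoted above (all allocations are guesses).
psu_rating_w = 75          # rated PSU output mentioned above
usb_budget_w = 20          # generous allowance for the four USB ports (guess)
cpu_drive_misc_w = 24      # CPU, disc drive, misc (guess)

gpu_headroom_w = psu_rating_w - usb_budget_w - cpu_drive_misc_w
print(f"~{gpu_headroom_w} W left for the GPU")  # -> ~31 W from these exact numbers
```

That comes out a touch higher than the ~26W estimate, so there's a little slack in those guesses either way.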
 
My thoughts exactly; we still don't understand everything about the hardware and it's already been proclaimed "weak". I don't believe it's a powerhouse, but I do think it's a capable machine and will be more than sufficient for Nintendo and competent developers to make some awesome stuff. It's a balanced system, and I think this die photo shows just how complex and sophisticated Nintendo's design is.
This line of thought is the problem. As has been said, even before the official reveal, when listening to sources: it's not a substantial leap in performance, not what we traditionally witness with cycle transitions at least. Launch games, both 1st party and 3rd party, prove that.

Will things improve? Yes. Will we see great-looking games? Yes. But again, that's not the point, since we still see pretty impressive-looking games coming out on the 360 and PS3. The point is that Nintendo underdelivered again in processing capability, but this time they did so with a product that is not as engaging as the original one. That's where the disappointment comes from.
 