
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


z0m3le

Banned
At 352 GFLOPS you probably won't notice much difference. Though it'll still be nice to see Nintendo IPs in HD.

The 360 never hit its theoretical FLOPS, IIRC; there were bottlenecks that couldn't be overcome. Beyond that, a lot of FLOPS were wasted on effects that are cheaper to do in DX10.1/11... I'm not even going to get into the efficiency differences between those architectures, but Espresso should comfortably outperform Xenos by more than 50% doing the same things... It's a matter of devs focusing on the platform enough to pull the extra performance out of it.
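(For a rough sense of where numbers like these come from, here's a back-of-envelope sketch. The 320 stream processors at 550 MHz for Latte is just the speculation floating around this thread, and the Xenos figure is the usual paper spec, so treat it as illustrative only.)

```python
# Back-of-envelope theoretical throughput -- speculative/paper figures, not confirmed specs.
latte_gflops = 320 * 2 * 0.550      # 320 SPs x 2 FLOPs (MADD) x 550 MHz = 352 GFLOPS (speculated)
xenos_gflops = 48 * 5 * 2 * 0.500   # 48 pipes x 5-wide ALUs x 2 FLOPs x 500 MHz = 240 GFLOPS (paper)
print(latte_gflops / xenos_gflops)  # ~1.47x on paper, in the ballpark of the "more than 50%" claim
```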

Matt, the only thing I can think of is Shader Model 4.1; R800 used SM5, IIRC, so that would be a good reason to keep calling it R700, I guess.
 
The 360 never hit its theoretical FLOPS, IIRC; there were bottlenecks that couldn't be overcome. Beyond that, a lot of FLOPS were wasted on effects that are cheaper to do in DX10.1/11... I'm not even going to get into the efficiency differences between those architectures, but Espresso should comfortably outperform Xenos by more than 50% doing the same things... It's a matter of devs focusing on the platform enough to pull the extra performance out of it.

Matt, the only thing I can think of is Shader Model 4.1; R800 used SM5, IIRC, so that would be a good reason to keep calling it R700, I guess.

Which basically means Wii U exclusives. SMT x FE should look mighty impressive, as should Zelda U.
 
That's a silly thing to say. We didn't know it was there to ask anyone. And he's obviously not forthcoming with info of his own accord.
I'm starting to doubt this guy. I mean, I can understand that with reverse engineering he could get the clock speeds, or even the amount of eDRAM/SRAM, but... also the type of eDRAM? How can he know that those 2MB are 1T-SRAM and not another kind of DRAM, since we know that 1T-SRAM is NOT used in the Wii U any more? This contradicts the little confirmed info we had from MoSys!
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Ok, to put things in perspective... if the chip is that expensive, why did Nintendo go that route? Wouldn't it have been cheaper to take a stock 4770 (RV740?) and pair it with a Wii GPU?
Also, the guy from Chipworks says it's an impressive piece of kit... in what sense? Low TDP? Performance per watt? The eDRAM? And is this still cheaper or more performant (or both) than going with stock chips + GDDR5?

And does his comment mean we can expect some "bonuses"?
The chip is that expensive because it meets certain non-trivial goals. Which goals, apparently, are not met by an RV740.

The fact those goals don't align with Joe Gaffer's world views does not make the chip cheap. I know it's a bit hard for a certain GAF audience to comprehend that.
 
Now we know why the Wii U isn't cheap.

But there's absolutely nothing there that shows it's significantly more powerful (than the PS3/360), at least, not so far.
 

nikatapi

Member
The chip is that expensive because it meets certain non-trivial goals. Which goals, apparently, are not met by an RV740.

The fact those goals don't align with Joe Gaffer's world views does not make the chip cheap. I know it's a bit hard for a certain GAF audience to comprehend that.

It just sounds strange that Nintendo invested a lot of money in customizations in order to get a chip with questionable performance benefits compared to some "stock" chips that might actually cost less and lead to a better-performing, cheaper-to-produce system.

This is what baffles me personally, but I guess many questions will be answered once games built from the ground up to take advantage of the system are revealed. For now, we can only (mis)judge based on the game footage we have seen.
 
Such an expensive chip for so little. Is all that eDRAM really necessary? Why didn't they just go with faster RAM? It just makes no sense why they went this route... If they needed eDRAM/eSRAM, they could have just used it for the CPU, since those are usually held back by latency.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
GAF trolls screaming "7 year old hardware" = 0; Reality/Chipwork's profesisonal = 1

Folks aren't saying it's 7 year old hardware. They are saying it performs like 7 year old hardware.

I also like the way you misspelt professional.
 

Thraktor

Member
I'm starting to doubt this guy. I mean, I can understand that with reverse engineering he could get the clock speeds, or even the amount of eDRAM/SRAM, but... also the type of eDRAM? How can he know that those 2MB are 1T-SRAM and not another kind of DRAM, since we know that 1T-SRAM is NOT used in the Wii U any more? This contradicts the little confirmed info we had from MoSys!

I assume that the 1T-SRAM was a mistaken assumption on his part (as it matches the 1T-SRAM on Hollywood). No need to be hard on the guy, he's given us some very useful info.

Now we know why the Wii U isn't cheap.

But there's absolutely nothing there that shows it's significantly more powerful (than the PS3/360), at least, not so far.

In raw computational performance, perhaps, but there's a lot more to GPU design than raw computational performance.
 
The chip is that expensive because it meets certain non-trivial goals. Which goals, apparently, are not met by an RV740.

The fact those goals don't align with Joe Gaffer's world views does not make the chip cheap. I know it's a bit hard for a certain GAF audience to comprehend that.

Retailers supposedly still make a profit when the Wii U is sold for $250. That is what makes the entire system cheap.
 

IdeaMan

My source is my ass!
For the French folks following this story: please, it's not coming from DF but from GAF tech enthusiasts and Chipworks! :p

I'm saying this because Gamekyo quoted only the DF article without reporting the crowdfunding part.
(and hello sebthemajin btw :) )

If someone wants to write about this never-seen-before (I think) community project of funding a private lab to study the guts of a console, don't hesitate to submit it to NotEnoughShaders; it's worth its own article somewhere!
 

ozfunghi

Member
The chip is that expensive because it meets certain non-trivial goals. Which goals, apparently, are not met by an RV740.

The fact those goals don't align with Joe Gaffer's world views does not make the chip cheap. I know it's a bit hard for a certain GAF audience to comprehend that.

I'm just trying to understand, not knock it. What goals exactly would make a stock chip (maybe slightly customized) not an option, which are present in the U-GPU? Obviously the eDRAM is something a stock chip doesn't have, but could that not have been avoided by going for a PS4-like setup (albeit less performant)? Or how/why does the Wii U need this, unlike, for instance, Sony's next? Is that solely due to the GamePad/BC?

And if the chip is indeed a marvel, is it due to functionality needed for dual screens and other console-specific features, or can we expect something beyond "352 GFLOPS" worth of performance as well?

And I'll ask my other question again, about how much the Wii GPU parts could actually help out, seeing as that GPU was heavily outperformed by the Wii U GPU... would it even matter? We're talking about roughly 1/30th of the Wii U GPU's performance, which doesn't look like much of a gain.
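(For a quick sanity check on that 1/30th figure: this takes Hollywood at the commonly quoted ~12 GFLOPS ballpark against the 352 GFLOPS speculated for Latte. Both numbers are rough, so this is illustrative only.)

```python
# Rough ratio behind the "1/30th" remark -- both figures are ballpark/speculative.
hollywood_gflops = 12.0                 # commonly quoted ballpark for the Wii GPU
latte_gflops = 352.0                    # figure speculated for Latte in this thread
print(hollywood_gflops / latte_gflops)  # ~0.034, i.e. roughly 1/30th
```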
 
For the French folks following this story: please, it's not coming from DF but from GAF tech enthusiasts and Chipworks! :p

I'm saying this because Gamekyo quoted only the DF article without mentioning the crowdfunding part.

(and hello sebthemajin btw :) )

If someone wants to write about this never-seen-before (I think) community project of funding a private lab to study the guts of a console, don't hesitate to submit it to NotEnoughShaders; it's worth its own article somewhere!




Haha :p
btw, did you hear about fixed function after you showed the die to someone? Or is it from before?
Also, don't forget to come to the IRC chat to talk about food :D
 

OryoN

Member
Great job on the labeling, Thraktor! You almost used up the entire alphabet. Had a couple more pieces of silicon been on that die, you'd have had to scrap the whole project, lol. J/k


This GPU just got a whole lot more mysterious. I'm anxious to know more about all those other units. They all seem to be heavily focused on local caches. The paired ones pique my curiosity.

Also, one of those SIMD cores (upper right) is a bit larger than the rest. Slightly more specialized?
 

Saxrebel

Neo Member
So I'm pretty sure the Wii U GPU is so custom due to the HUGE need for lower latency and lower power consumption while getting as much raw power as possible, as obviously outlined in "Iwata Asks" and this thread. Those have been themes throughout the Wii U's development.

I just want to know what the GPU is made of, so even if we don't know what it's BASED on, we could still have an idea of what it's capable of.
 

IdeaMan

My source is my ass!
Haha :p
btw, did you hear about fixed function after you showed the die to someone? Or is it from before?

No, it is from before.

The GPU was always described to me as a programmable type, not a "50% programmable + 50% fixed-function-topping thingy".

Now, maybe there is indeed a fair amount of fixed-function hardware, but I really doubt it's to a great (or, I would say, fantasized :p) extent.
 

tipoo

Banned
Yeah, that one gets me too. Actually, it's right across from the CPU on the MCM...



Nope. Give it some time. We're not even close to finished here yet!


Possibly a scratchpad for GPGPU work? Traditionally the biggest drawback there has been passing what the GPU worked on back to the CPU's memory, as that could often take longer than the calculation itself; perhaps a small memory that both can access quickly reduces that problem. Spitballing here.
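(To illustrate the round-trip problem being spitballed here, a toy comparison of copy time versus compute time for a small GPGPU job. Every number below is made up for the sketch; none of them are Wii U specs.)

```python
# Illustrative only: why a CPU<->GPU round trip can swamp a small GPGPU job.
buffer_mb = 8                 # hypothetical working set
bus_gb_per_s = 2.0            # hypothetical copy path between CPU and GPU memory
gpu_gflops = 352.0            # the throughput speculated earlier in the thread
flops_per_byte = 4            # assumed arithmetic intensity of the job

copy_s = 2 * (buffer_mb / 1024) / bus_gb_per_s                       # copy in and copy back
compute_s = (buffer_mb * 1e6 * flops_per_byte) / (gpu_gflops * 1e9)  # time spent on the math
print(copy_s, compute_s)      # the copies cost far more than the arithmetic in this example
```

With made-up numbers like these the two copies take milliseconds while the arithmetic takes microseconds, which is exactly the case a small shared low-latency pool would help with.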
 
It just sounds strange that Nintendo invested a lot of money in customizations in order to get a chip with questionable performance benefits compared to some "stock" chips that might actually cost less and lead to a better-performing, cheaper-to-produce system.

This is what baffles me personally, but I guess many questions will be answered once games built from the ground up to take advantage of the system are revealed. For now, we can only (mis)judge based on the game footage we have seen.

I've wondered the same thing.


Source? And Nintendo is losing money on the thing.

They're losing a very small amount. They'll probably be profitable on each console sold by the fall.
 

DrWong

Member
For the French folks following this story: please, it's not coming from DF but from GAF tech enthusiasts and Chipworks! :p

I'm saying this because Gamekyo quoted only the DF article without reporting the crowdfunding part.
(and hello sebthemajin btw :) )

If someone wants to write about this never-seen-before (I think) community project of funding a private lab to study the guts of a console, don't hesitate to submit it to NotEnoughShaders; it's worth its own article somewhere!

Yep, just corrected this on a FR forum too ;]

But that's the joke; the point is it's 2012 tech that is comparable to '06/'07 levels of performance.

Yeah, sure.
 

LCGeek

formerly sane
But that's the joke; the point is it's 2012 tech that is comparable to '06/'07 levels of performance.

Sans the fact that it's DX9-level performance vs. a machine with a higher feature set. It's okay, your side wants to knock it any way it can, but at this point it's just pathetic, clueless trolling.
 
Going by recent Nintendo statements, the amount apparently isn't all that small.

Reggie has said in interviews that they are profitable as soon as a customer buys a single retail game. That means the console probably costs them about $10-20 more than they are selling it for.
 

schuelma

Wastes hours checking old Famitsu software data, but that's why we love him.
Reggie has said in interviews that they are profitable as soon as a customer buys a single retail game. That means the console probably costs them about $10-20 more than they are selling it for.

That was a misquote.
 

Schnozberry

Member
I know it's been said all over GAF multiple times, but the real-world performance of the PS3 and 360 was far from their theoretical maximums. Hopefully whatever customizations Nintendo made to the GPU and memory subsystem were aimed at narrowing the gap between what is claimed and what is seen. The early ports on the system were pretty much a worst-case scenario: Nintendo handed developers unfinished hardware and software, and didn't get the SDK or full-power dev kits into their hands until a few months before the software would have gone gold.

To be clear, I'm not trying to say the Wii U will be able to come anywhere close to the performance of Orbis and Durango, just that the gap between the Wii U and PS360 will likely be larger than what might be perceptible from just looking at the raw computational numbers. Remember when Sony advertised that the PS3 could push 1.8 TFLOPS in some preposterous hypothetical situation?
 
Such an expensive chip for so little. Is all that eDRAM really necessary? Why didn't they just go with faster RAM? It just makes no sense why they went this route... If they needed eDRAM/eSRAM, they could have just used it for the CPU, since those are usually held back by latency.
GPUs have also been sensitive to latency since they became programmable, not to mention once we enter the realm of GPGPU. If you have to run multiple shaders, then the more the GPU waits for data, the less effective it is.
 
GPUs have also been sensitive to latency since they became programmable, not to mention once we enter the realm of GPGPU. If you have to run multiple shaders, then the more the GPU waits for data, the less effective it is.

I honestly can't see a GPU that doesn't even need a fan being limited by latency.
 

Earendil

Member
I honestly can't see a GPU that doesn't even need a fan being limited by latency.

Not sure what you're saying here. Are you saying that since it doesn't have a fan, it has other shortcomings that are more important than latency? Because if so, that's not an accurate way of looking at it. Every chip can benefit from low latency; that's what makes it more efficient. If bandwidth is an issue, that can be offset by low latency, because the data gets to the processor sooner, so overall it may actually perform better than a memory system with higher bandwidth but higher latency. Without actual latency/bandwidth numbers we can't say, but low latency is ALWAYS a good thing.
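(A quick toy comparison of what that trade-off looks like for a small fetch, using time = latency + size / bandwidth. The numbers are invented for illustration; they are not Wii U figures.)

```python
# Toy model: time to service a fetch = latency + size / bandwidth.
def fetch_time_us(size_kb, latency_us, bandwidth_gb_s):
    transfer_us = (size_kb / 1024 / 1024) / bandwidth_gb_s * 1e6
    return latency_us + transfer_us

# Hypothetical low-latency pool vs. a higher-bandwidth, higher-latency one, for a 4 KB fetch.
print(fetch_time_us(4, latency_us=0.1, bandwidth_gb_s=35))   # ~0.21 us
print(fetch_time_us(4, latency_us=0.5, bandwidth_gb_s=70))   # ~0.55 us
```

For small, frequent accesses the lower-latency pool wins even with half the bandwidth; for large streaming transfers the bandwidth term dominates instead.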
 

majik13

Member
Not sure what you're saying here. Are you saying that since it doesn't have a fan, it has other shortcomings that are more important than latency? Because if so, that's not an accurate way of looking at it. Every chip can benefit from low latency; that's what makes it more efficient. If bandwidth is an issue, that can be offset by low latency, because the data gets to the processor sooner, so overall it may actually perform better than a memory system with higher bandwidth but higher latency. Without actual latency/bandwidth numbers we can't say, but low latency is ALWAYS a good thing.

Maybe I misunderstood the original comment, but doesn't the Wii U have a fan?
 