
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

So they want to get the most use possible out of a weak GPU. I kind of understand that, but part of me thinks they could have had significantly more power without the eDRAM, accepted slightly less efficiency, and still had much better performance for the same price.

Nintendo do like their eDRAM though. Maybe it's just a philosophy thing.
The point of having a less powerful chip that can work much closer to its maximum performance is that development costs are much lower.
If you have predictable hardware, and you know exactly what it is and isn't capable of, you cut A LOT of the development time that would otherwise be spent on figuring out how to achieve things or on cutting content.

On the other hand, eDRAM is more expensive than normal RAM, but its transistor density is much higher than any other part of the chip, so you can pack more transistors per mm².
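To put that density point in rough numbers, here's a quick back-of-the-envelope sketch. The cell sizes are ballpark published figures for ~40-45nm processes, not confirmed values for Latte's eDRAM, so treat the output as purely illustrative:

```python
# Raw cell area needed for a 32 MB on-die pool, SRAM vs eDRAM.
# Cell sizes are ballpark figures for ~40-45nm processes, NOT
# confirmed numbers for Latte's eDRAM macros.
SRAM_CELL_UM2 = 0.37    # typical 6T SRAM cell
EDRAM_CELL_UM2 = 0.13   # typical 1T1C eDRAM cell

bits = 32 * 1024 * 1024 * 8

sram_mm2 = bits * SRAM_CELL_UM2 / 1e6    # um^2 -> mm^2
edram_mm2 = bits * EDRAM_CELL_UM2 / 1e6

print(f"32 MB as 6T SRAM: ~{sram_mm2:.0f} mm^2 of raw cells")   # ~99 mm^2
print(f"32 MB as eDRAM:   ~{edram_mm2:.0f} mm^2 of raw cells")  # ~35 mm^2
```

Roughly a 3x difference, which is why a 32MB pool is plausible on a die this size as eDRAM but not as SRAM.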
 

joesiv

Member
FLOPS have a direct impact on the visual spectacle a console will be able to display. A lot of people care.

A low power draw on the other hand, well, it might save you 70 cents a month! Score!

Granted, that 70 cents will easily pay for 2 WiiU VC games during this promotion ;)
 

deviljho

Member
"Impressive" is subjective, after 7 years I would just say that's expected.

Sure. I'm just saying chip architecture performance per watt has risen a lot since 2005.

Is the goal to be "impressive" rather than "expected"? I'm not sure how these labels are relevant at all. I don't see anything "impressive" with the hardware and expected cost of the MS and Sony consoles. Because I'll put the 8 jaguar cores into the "expected column" just as easily as Wii U's low power draw.

The Wii U currently is the only console to render to two different screens simultaneously, one of which is wireless, with a more or less lag-free experience. That is impressive. Perhaps Kinect 2 will be impressive as well.

My assumption was the two arms (HW and SW engineers) would be collaborating throughout the whole design process, but perhaps the SW side is so behind the curve, they weren't able to make any sensible suggestions.

Yeah, this is off-base.
 

Schnozberry

Member
FLOPS have a direct impact on the visual spectacle a console will be able to display. A lot of people care.

A low power draw on the other hand, well, it might save you 70 cents a month! Score!

John Q Gaffer cares an awful lot. Huge difference between that and the rest of the people who purchase consoles. The idea that the general gaming populace is obsessed with graphical performance isn't borne out by any metric we can measure.
 

Thraktor

Member
FLOPS have a direct impact on the visual spectacle a console will be able to display. A lot of people care.

No, sir, they have an indirect impact. The ability of the computational components of a GPU to produce "visual spectacle" is entirely dependent on other aspects of the GPU's architecture, and in particular the manner in which data is fed to and from those computational components.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
It is in relation to the power draw. Whether you agree with me or not... I really couldn't care less, sorry.

Who cares about power? Really. If you're saying that's one of the benefits of this design, then it's a poor design choice. Folks would take more performance and power over a cheaper-running console any day.
 

mantidor

Member
Sure. I'm just saying chip architecture performance per watt has risen a lot since 2005.

Besides, that's with an expensive SSD, battery, monitor, keyboard & trackpad, etc. The SSDs in most ultrabooks alone probably cost at least $100.

And the WiiU comes with a controller and lag free streaming.

The point really is price: consoles can't be $700; we saw how that ended up. People thinking Nintendo artificially gimped the WiiU just because are funny. It's absolutely not surprising the console packs impressive modern hardware, even if it doesn't give you all the flops you want.
 

Earendil

Member
FLOPS have a direct impact on the visual spectacle a console will be able to display. A lot of people care.

A low power draw on the other hand, well, it might save you 70 cents a month! Score!

I don't know where you live, but not every country pays the same amount for electricity. So it may not save you a lot of money, but it might make a difference for someone else.

We all need to remember this simple formula:

GAF's priorities != general public's priorities
 

The_Lump

Banned
Well another wrench is Iwata's comments about having to outsource and collaborate with other developers because they don't understand HD development. What are the odds they actually had some brilliant master plan of super efficient secret sauce when they're basically on record stating they don't know what the fuck they're doing?


I don't think they "didnt understand" HD development, they just had to basically re-jig the whole company to cope with the larger teams, new development process and higher costs. It's perfectly normal for a company to 'outsource'for help/ideas.
 

jay

Member
I wouldn't characterise the WiiU's performance as in any way impressive. In fact, for a new console that launched at the tail end of 2012, it's really unimpressive. They must have worked really hard to produce a console that was that gimped.

I just realized you and Baconsammy are two people.
 

Durante

Member
We all need to remember this simple formula:

GAF's feelings != general public's feelings
Looking at Wii U sales, the general public does not seem entirely enthused by its efficiency either.

Anyway, why are we talking about feelings or what each of us considers "impressive" in this thread?
 
My assumption was the two arms (HW and SW engineers) would be collaborating throughout the whole design process (i.e. we need this to do this, can you make it happen), but perhaps the SW side is so behind the curve, they weren't able to make any sensible suggestions.
Eh, what could the SW guys legitimately suggest that isn't solved by this design? Low latency? The Wii U GPU has it. Programmable shaders? Wii U has it. High-speed eDRAM? Wii U has it. SRAM? Wii U has it. Having your whole team up to speed on how to utilize programmable shaders? No GPU can solve that problem. It comes with experience and learning.
 

Diffense

Member
Well, it seems Nintendo's hardware design philosophy since the GCN has not changed. Their experience with the N64's hardware might have been the turning point. Clearly they now prefer hardware for which the peak theoretical performance is close to achievable in practice without excessively arcane software engineering. It makes sense when you consider that Nintendo's hardware is primarily a vehicle for delivering their own software products and the success of their platform relies on consistent delivery of hit games. They WILL have to eat their own dog food...a lot of it.

So once again Nintendo has gone its own way and the proof will be in what they deliver over the lifespan of the Wii U. The Wii U GPU (unlike the Wii's) seems to be modern in terms of its capabilities if lacking some of the raw power of its competitors. It remains to be seen the extent to which the difference in computing power between the machines will show up in the games made for them. Can careful tuning and efficiency go toe-to-toe with raw power? Next E3 will be an interesting one.
 

majik13

Member
I don't know where you live, but not every country pays the same amount for electricity. So it may not save you a lot of money, but it might make a difference for someone else.

We all need to remember this simple formula:

GAF's priorities != general public's priorities

This is true, and as far as I remember power draw is very important in Japanese culture. And guess where the Wii U was made?
 

Earendil

Member
Looking at Wii U sales, the general public does not seem entirely enthused by its efficiency either.

Anyway, why are we talking about feelings or what each of us considers "impressive" in this thread?

I should have said 'priorities', rather than feelings. But the point still stands. We do not speak for consumers as a whole. And the sales of any console are dependent on software, not hardware. I thought we all knew that.

This is true, and as far as I remember power draw is very important in Japanese culture. And guess where the Wii U was made?

Electricity costs in Japan are at least twice what they are in the US. Just an FYI to people.
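To put a rough figure on what that rate difference means, here's a quick sketch. The ~33W load is in line with published Wii U measurements, but the 150W comparison console, the per-kWh rates and the 3 hours/day of play are all assumptions for illustration:

```python
# Ballpark yearly running-cost gap between a ~33W console and a
# hypothetical ~150W one. All inputs are illustrative assumptions.
HOURS_PER_DAY = 3
DAYS_PER_YEAR = 365

def yearly_cost_usd(watts, usd_per_kwh):
    kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh * usd_per_kwh

for region, rate in [("US", 0.12), ("Japan", 0.26)]:
    diff = yearly_cost_usd(150, rate) - yearly_cost_usd(33, rate)
    print(f"{region}: ~${diff:.0f}/year difference at ${rate:.2f}/kWh")
```

So still pocket change in the US, but noticeably more in Japan.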
 

Durante

Member
So once again Nintendo has gone its own way and the proof will be in what they deliver over the lifespan of the Wii U. The Wii U GPU (unlike the Wii's) seems to be modern in terms of its capabilities if lacking some of the raw power of its competitors. It remains to be seen the extent to which the difference in computing power between the machines will show up in the games made for them. Can careful tuning and efficiency go toe-to-toe with raw power? Next E3 will be an interesting one.
I hate this kind of argument, because it seems to imply -- without convincing evidence -- that the competing designs are less efficient in accomplishing their own goals.
 

jaz013

Banned
FLOPS have a direct impact on the visual spectacle a console will be able to display. A lot of people care.

A low power draw on the other hand, well, it might save you 70 cents a month! Score!

I could ask virtually all the people I know that play videogames on a regular basis, and NOBODY would care.

What do they care about? Is the game fun? Great! Does it look good? Even better! Is there some blood/action-packed sequences/gritty theme/sexy characters? Perfect!

"But, hey, it's on a console with few FLOPs!", *blank stares*
 
So do we know for sure yet that it has 40 shaders per block, and therefore a total of 320? Which would then equal 352 GFLOPS? Has any of this been confirmed? I realize this thing is highly customized.
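For what it's worth, the 352 GFLOPS figure is just arithmetic on those assumptions. A sketch, assuming the contested 40-ALUs-per-block count, 8 blocks, the widely reported ~550 MHz clock, and two FLOPs per ALU per cycle from a multiply-add:

```python
# Where the 352 GFLOPS figure comes from. The 40-per-block count is
# the contested estimate; ~550 MHz is the widely reported clock.
SIMD_BLOCKS = 8
ALUS_PER_BLOCK = 40
FLOPS_PER_ALU_CYCLE = 2   # one multiply-add = 2 FLOPs
CLOCK_HZ = 550e6

alus = SIMD_BLOCKS * ALUS_PER_BLOCK                    # 320
gflops = alus * FLOPS_PER_ALU_CYCLE * CLOCK_HZ / 1e9   # 352.0
print(f"{alus} ALUs -> {gflops:.1f} GFLOPS peak")
```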
 

lherre

Accurate
I hate this kind of argument, because it seems to imply -- without convincing evidence -- that the competing designs are less efficient in accomplishing their own goals.

In fact, I think the other machines will be very efficient measured in FLOPS per watt.
 

Earendil

Member
I hate this kind of argument, because it seems to imply -- without convincing evidence -- that the competing designs are less efficient in accomplishing their own goals.

What were the hardware goals of the 360 and PS3? I would think it would be to get as close to their theoretical maximums as possible. Did this happen? A strong argument could be made that they did not. Same with the PS2 and Xbox. This is why people compare the Wii U to the Gamecube. Both systems appear to be designed to get as much performance out of a little bit of power as possible. And in the GC's case it did a pretty good job. Hopefully the Wii U will as well.

EDIT:

If we are talking about the next consoles, then I have no idea how efficient they will be. But they will certainly be expensive, and use a lot of power. It all comes down to priorities. Nintendo obviously had different ones than Sony and Microsoft do.
 

Thraktor

Member
I hate this kind of argument, because it seems to imply -- without convincing evidence -- that the competing designs are less efficient in accomplishing their own goals.

To be honest I think that all that we can really say is that Nintendo seem to be more specific in the goals they want to achieve (hence the heavily customised chip), whereas Sony and MS seem to be more content with a relatively standard modern GPU. Latte isn't necessarily going to be more efficient than Orbis and Durango's GPUs, just more tailored to the particular things Nintendo has prioritised.

In fact, I think the other machines will be very efficient measured in FLOPS per watt.

The smaller manufacturing process will be a large part of this, though.
 

StevieP

Banned
What were the hardware goals of the 360 and PS3? I would think it would be to get as close to their theoretical maximums as possible. Did this happen? A strong argument could be made that they did not. Same with the PS2 and Xbox. This is why people compare the Wii U to the Gamecube. Both systems appear to be designed to get as much performance out of a little bit of power as possible. And in the GC's case it did a pretty good job. Hopefully the Wii U will as well.

There is still going to be a large power gulf between the Wii U and the other 8th gen systems. It's just not going to be the same kind of gap Nintendo faced with the Wii and the other 7th gen consoles.

However, there is still a gulf (in CPU, available memory, GPU power, etc). Physics never lie.

With that said, this GPU design is extremely custom and impressive for what it is. That gulf doesn't change that.
Edit: and by that, I mean the engineering effort - despite the "power" talk above - is impressive for what it is capable of at the power draw it has.
 

Earendil

Member
There is still going to be a large power gulf between the Wii U and the other 8th gen systems. It's just not going to be the same kind of gap Nintendo faced with the Wii and the other 7th gen consoles.

However, there is still a gulf (in CPU, available memory, GPU power, etc). Physics never lie.

With that said, this GPU design is extremely custom and impressive for what it is. That gulf doesn't change that.

I didn't mean to imply that it would be close to the other two, but rather to simply say that it is more than the sum of its parts.
 

Durante

Member
To be honest I think that all that we can really say is that Nintendo seem to be more specific in the goals they want to achieve (hence the heavily customised chip), whereas Sony and MS seem to be more content with a relatively standard modern GPU. Latte isn't necessarily going to be more efficient than Orbis and Durango's GPUs, just more tailored to the particular things Nintendo has prioritised.
I can agree with that.

I just see no reason for the implication that modern GPU design (the result of at least 2 decades of research and development with the goal of rendering real-time graphics) is not efficient.
 
So do we know for sure yet that it has 40 shaders per block, and therefore a total of 320? Which would then equal 352 GFLOPS? Has any of this been confirmed? I realize this thing is highly customized.
Well, the 40 shaders thing came from when we still thought that this was a modified HD4000 part, but now it seems it has been even more customized than we thought.
The basis for calculating those 40 shaders was the process node of the GPU (40nm) and the area they occupy in comparison to other HD4000 cards.
What this means is that ALUs are ALUs in the end, and fitting more than 40 per block in here seems hard to believe no matter how modified this part is.

But as Tracktor said before, it could perfectly well be that some of those unidentified blocks of hardware are a different kind of ALU, intended to work in a different way than the normal ones, so in the end it could be anything.
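A minimal sketch of that area-based reasoning, with placeholder numbers (neither area figure below is a measured value; they're invented to show the method):

```python
# Estimating ALUs per block from die area, as described above.
# BOTH area figures are hypothetical placeholders, not measurements.
REF_ALUS_PER_SIMD = 80     # an 80-ALU RV770-class SIMD as the reference
REF_SIMD_AREA_MM2 = 4.0    # assumed area of that SIMD, scaled to 40nm

latte_block_mm2 = 2.0      # assumed area of one Latte shader block

alus = latte_block_mm2 / REF_SIMD_AREA_MM2 * REF_ALUS_PER_SIMD
print(f"~{alus:.0f} ALUs per block, if ALU density matches the reference")
```

The weakness of the method is exactly what's being said here: if the custom blocks aren't packed like a stock HD4000 SIMD, the area ratio tells you little.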
 

StevieP

Banned
Well, the 40 shaders thing came from when we still thought that this was a modified HD4000 part, but now it seems it has been even more customized than we thought.
The basis for calculating those 40 shaders was the process node of the GPU (40nm) and the area they occupy in comparison to other HD4000 cards.
What this means is that ALUs are ALUs in the end, and fitting more than 40 per block in here seems hard to believe no matter how modified this part is.

But as Tracktor said before, it could perfectly well be that some of those unidentified blocks of hardware are a different kind of ALU, intended to work in a different way than the normal ones, so in the end it could be anything.

[image: tracktor.jpg]
 

mrklaw

MrArseFace
The point of having a less powerful chip that can work much closer to its maximum performance is that development costs are much lower.
If you have predictable hardware, and you know exactly what it is and isn't capable of, you cut A LOT of the development time that would otherwise be spent on figuring out how to achieve things or on cutting content.

On the other hand, eDRAM is more expensive than normal RAM, but its transistor density is much higher than any other part of the chip, so you can pack more transistors per mm².

I guess. But given Orbis and Durango's relatively small jump over PS360, Nintendo could have had a reasonably cost-effective machine that was close enough to those two to have a good chance of getting multiplatform ports.
 
Well, the 40 shaders thing came from when we still thought that this was a modified HD4000 part, but now it seems it has been even more customized than we thought.
The basis for calculating those 40 shaders was the process node of the GPU (40nm) and the area they occupy in comparison to other HD4000 cards.
What this means is that ALUs are ALUs in the end, and fitting more than 40 per block in here seems hard to believe no matter how modified this part is.

But as Tracktor said before, it could perfectly well be that some of those unidentified blocks of hardware are a different kind of ALU, intended to work in a different way than the normal ones, so in the end it could be anything.

So in other words, we may never know how many shaders/stream processors this thing has, and therefore will never know its FLOPS?
 

bachikarn

Member
Who cares about power? Really. If you're saying that's one of the benefits of this design, then it's a poor design choice. Folks would take more performance and power over a cheaper-running console any day.

Is this an issue with the Japanese audience? That's the only thing I can think of.
 

prag16

Banned
I wouldn't characterise the WiiU's performance as in any way impressive. In fact, for a new console that launched at the tail end of 2012, it's really unimpressive. They must have worked really hard to produce a console that was that gimped.

Says the guy with xbox xbox xbox xbox ps3 xbox xbox xbox xbox right under his username.

The guy you're responding to is way too enthusiastic on the praise side. You're way too enthusiastic on the bashing side.

The extreme zealots on BOTH sides are detrimental to the dialogue.
 

StevieP

Banned
I guess. But given Orbis and Durango's relatively small jump over PS360, Nintendo could have had a reasonably cost-effective machine that was close enough to those two to have a good chance of getting multiplatform ports.

Unfortunately, as you see with some other games being released for PS360PC, that's a question of the publishers/bean counters/marketers more than it is the console's ability to receive them.
 

Diffense

Member
The question is not so much about the efficiency of GPU design but the hardware as a whole and how all the components work together. PS3 and 360 took rather different approaches and arrived at roughly the same real-world performance. As far as all the consoles are concerned, I'm primarily interested in the final product.
 
I can agree with that.

I just see no reason for the implication that modern GPU design (the result of at least 2 decades of research and development with the goal of rendering real-time graphics) is not efficient.
Well, off-the-shelf GPUs are designed with the understanding that they'll be an external device, which leads to designs that are very efficient as external devices. However, the 360 and original Xbox had some very well-noted bottlenecks, and so does the PS3.

I have no doubt Durango and Orbis will be much more efficient in achieving their goals than their predecessors, as I don't think MS and Sony want to lose money again.
 
Says the guy with xbox xbox xbox xbox ps3 xbox xbox xbox xbox right under his username.

The guy you're responding to is way too enthusiastic on the praise side. You're way too enthusiastic on the bashing side.

The extreme zealots on BOTH sides are detrimental to the dialogue.

Are you talking about me?

What's so zealous about saying "It's impressive... for its power draw"?
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Says the guy with xbox xbox xbox xbox ps3 xbox xbox xbox xbox right under his username.

The guy you're responding to is way too enthusiastic on the praise side. You're way too enthusiastic on the bashing side.

The extreme zealots on BOTH sides are detrimental to the dialogue.

I've got a WiiU. I'm just trying to be realistic.
 
Unfortunately, as you see with some other games being released for PS360PC, that's a question of the publishers/bean counters/marketers more than it is the console's ability to receive them.

Outside of a few select titles, it seems like the higher development costs driven by "power" are really hurting the 3rd party scene. It will be worse this gen. I don't think all of the AAA 3rd party publishers will be able to afford to ignore the Wii U userbase for that reason alone.
 

ozfunghi

Member
Between the 1GB DDR3, 32MB eDRAM, 4MB eDRAM and 1MB SRAM... what about the Beyond3D speculation that the WiiU is memory/bandwidth starved and therefore has trouble with alpha textures, as seen in a couple of launch games and the "X" trailer? How likely is this still, or is it rather the result of developers still needing to get the hang of it?
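For context on why transparency in particular would expose a bandwidth ceiling: alpha blending is a read-modify-write of the framebuffer for every overlapping layer, and most effects also fetch a texture per layer. A sketch with assumed numbers (12.8 GB/s is the commonly cited main-memory figure; the overdraw depth, fps and per-layer texture read are illustrative):

```python
# Bandwidth cost of alpha-blended overdraw at 720p60.
# Overdraw depth and the per-layer texture read are assumptions;
# ~12.8 GB/s is the commonly cited DDR3 main-memory bandwidth.
W, H, BPP = 1280, 720, 4
OVERDRAW = 8    # assumed layers per pixel in a particle-heavy scene
ACCESSES = 3    # dest read + dest write + one texture read, per layer
FPS = 60

gbps = W * H * BPP * OVERDRAW * ACCESSES * FPS / 1e9
print(f"~{gbps:.1f} GB/s for blending alone vs ~12.8 GB/s DDR3")  # ~5.3
```

If the render target sits in the 32MB eDRAM instead, that traffic never touches the main bus, which is presumably the intended design, and would make this more a tools/maturity problem than a hard wall.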
 

prag16

Banned
I hate this kind of argument, because it seems to imply -- without convincing evidence -- that the competing designs are less efficient in accomplishing their own goals.

Well PS3 and 360 definitely had inefficiencies which have been well documented. But that doesn't necessarily speak to the PS4/720.
 
In terms of power draw, it's only impressive to me if it's from a mobile device, since using less energy means less battery drain. Less wattage does mean less heat, resulting in a cooler device, but I'd rather have a system that uses up to 200 watts.
 
Ok, it's Traktor, not Tracktor XD

JohnnySasaki86 said:
So in other words, we may never know how many shaders/stream processors this thing has, and therefore will never know its FLOPS?
Well, I think the ALU count is the safest bet of them all, so unless there are some other, different ALUs thrown in here, we can say that those ~350 GFLOPS are the most reasonable bet.

But because the system has been customized so much, even the safest bets are not completely safe at all XD

It would be great if someone knowledgeable enough could tell us a bit more about some of those mysterious blocks.
I think the easiest task now would be to find which blocks are the DSP and the ARM CPU, so we can rule them out.
 

QaaQer

Member
...crucial information simply wasn't available in Nintendo's papers, with developers essentially left to their own devices to figure out the performance level of the hardware.

Yup, this is the most important fact to have emerged from this whole endeavour. WTF Nintendo?

[reaction GIF: tumblr_mgpw0dQcIG1s3zr3ro1_500_zps6dd746cb.gif]
 

guek

Banned
On one hand, it is a bit much to pretend the performance of this box is impressive outside of size and efficiency. In real-world performance, there's nothing under the hood that stands up to anything close to high-end or even mid-range in the PC market. That's just the cold, hard, unadulterated truth, and trying to use efficiency to paint these numbers as anything other than conservative in comparison to what else is on the market is just a thinly veiled attempt to compensate for what is an incredibly modest machine.

On the other hand, I think trying to tell people not to be impressed, or to imply that they shouldn't be satisfied with these specs, is a bit much. In fact, it's asinine. Personal impressions depend solely on one's own expectations and what one personally wants out of the machine. Stating that no one cares how efficient Latte is because it doesn't stand up to what's out there is just flat-out self-absorbed and immature. Being satisfied with lower tech doesn't immediately make someone a luddite. It's also incredibly depressing to see people latch onto the idea that graphical performance is all that matters to the general public after the monstrous HD twins struggled so much earlier this gen. Of course graphical performance matters, but it's not the end-all, be-all determining factor for public appeal, even if that's how you feel personally. There is a general sense of condescension and pity among people who prefer graphical showcases towards people who don't, and frankly it's pathetic and insulting.

Of course, there's nothing wrong whatsoever with being disappointed with the path Nintendo has chosen to take. From a performance perspective in 2013, it's severely disappointing when compared to other modern products. But for some, graphics are just a means to an end. I just wish people would get off their high horses and accept that some people have different tastes, without all the crassness.
 

Earendil

Member
OK, so here's a noob question. In this image below, I'm assuming the darker parts are the ALUs, and the lighter brown parts are the SRAM for each ALU. Is that correct?

[die shot crop: CCUBRmZ.png]


Can we use this to count the ALUs in each SIMD? Surely that's possible, right? Is each block a pair of ALUs? If so, I count 32, which doesn't jibe with the 40 ALUs per SIMD we've been thinking.
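Whichever count turns out to be right, the implied peaks are easy to compare. A sketch, assuming 8 SIMD blocks and the reported ~550 MHz clock:

```python
# Peak throughput implied by 32 vs 40 ALUs per SIMD
# (assumes 8 SIMD blocks and the reported ~550 MHz clock).
CLOCK_HZ = 550e6
SIMDS = 8

for per_simd in (32, 40):
    total = SIMDS * per_simd
    gflops = total * 2 * CLOCK_HZ / 1e9   # 2 FLOPs/ALU/cycle (multiply-add)
    print(f"{per_simd}/SIMD -> {total} ALUs -> {gflops:.1f} GFLOPS")
```

That's 281.6 vs 352.0 GFLOPS, so the block-counting question matters quite a bit.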




Yup, this is the most important fact to have emerged from this whole endeavour. WTF Nintendo?

[reaction GIF: tumblr_mgpw0dQcIG1s3zr3ro1_500_zps6dd746cb.gif]

What the hell is that????????
 