
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


blu

Wants the largest console games publisher to avoid Nintendo's platforms.
In fact I think the other machines will be very efficient measured in flops per watt
Flops, watts, price - you can pick any two of the three, but not all three at once.
 

Schnozberry

Member
There is a general sense of condescension and pity among people who prefer graphical showcases towards people who don't and frankly it's pathetic and insulting.

It's not that I disagree, but this stuff is getting way off topic. Do we have any theories about where the tessellation unit is on Latte? Do we have any idea if Thraktor's theory about some ALUs existing outside the shader blocks has any merit?
 

Clefargle

Member
The Wii U currently is the only console to render to two different screens simultaneously, one of which is wireless, with a more or less lag-free experience. That is impressive.

Don't forget, it was also confirmed by Nintendo as being capable of running two GamePads simultaneously. None of this is impressive unless you look at the obvious priorities Ninty had in mind: wireless streaming and backwards compatibility.
 

mantidor

Member
In terms of power draw it's only impressive to me if it's from a mobile device, since it'll use less energy and therefore less battery. Less wattage does mean less heat, resulting in a cooler device, but I'd rather have a system that uses up to 200 watts.

As always happens in these threads, the interesting posts get lost, but someone a few pages ago mentioned that it almost seems like Nintendo wants to eventually cram the hardware into the controller, which is a pretty interesting proposition.


The only sure thing is that Nintendo is more concerned about other things than raw performance, but in no way does this mean they are technologically impaired. If the chip was custom designed it was for a good reason, not just because Nintendo hates flops and wants to piss off the graphics whores.
 

prag16

Banned
Are you talking about me?

What's so zealous about saying "It's impressive... for its power draw"?

Overall, not just that post. But I hear ya, I'm also trying to look at things in as positive a light as possible. And you're not coming off as a douche bag at all, so you've got that going for you. :)

I know there will be some great games for me to play in the years to come so I'm good. But the whole technical deal is interesting to me too, as a former compsci major, even though I'm years removed from my real hardware/architecture experience.
 
Overall, not just that post. But I hear ya, I'm also trying to look at things in as positive a light as possible. And you're not coming off as a douche bag at all, so you've got that going for you. :)

I know there will be some great games for me to play in the years to come so I'm good. But the whole technical deal is interesting to me too, as a former compsci major, even though I'm years removed from my real hardware/architecture experience.

On Wii U I'm on the optimistic side in terms of visuals for one reason.

Nintendo's games in HD WILL look great.

I'm not a "graphics whore" but I do enjoy nice graphics, I just don't prioritize them over gameplay. And nice graphics + awesome gameplay is exactly what Nintendo is going to deliver. And some 3rd party exclusives: W101 and Lego City look really good -> IMO <-

I would like to know if there's any truth to the custom shaders as well. And, if there are any, how that even works...
 

prag16

Banned
Flops, watts, price - you can pick any two of the three, but not all three at once.

We hope they can pick two. At PS360 launch they had one and a half... maybe.

@Cold Blooder: As has been said over and over, I can't help but shake my head at all the self-proclaimed graphics whores who play on PS360 (rather than PC). Just like I could never help but shake my head at all the self-proclaimed super hardcore gamers who play Call of Duty on PS360 and never played a PC shooter. :D
 
Is Thraktor.
XD Ok, I will never ever again talk about graphics, I must be blind...

OK, so here's a noob question. In this image below, I'm assuming the darker parts are the ALUs, and the lighter brown parts are the SRAM for each ALU. Is that correct?

[image: CCUBRmZ.png]


Can we use this to count the ALUs in each SIMD? Surely that's possible, right? Is each block a pair of ALUs? If so, I count 32, which doesn't jibe with the 40 ALUs per SIMD we've been thinking.
No, if you look closely, those have to be the memory connectors (part of the SRAM, because they are on all of the SRAM blocks you can see). The ALUs are in between the blocks, on the orange square in the middle.
You really can't see the ALUs or the other circuitry at all, but based on the SRAM and components visible, and comparing to what can be seen on other die shots, you have to try to guess what each part of the GPU does...
This is why it's so hard to guess what this GPU is capable of, since it has been severely customized, making it much more difficult to compare it to other known GPUs on the market.
 

Earendil

Member
XD Ok, I will never ever again talk about graphics, I must be blind...


No, if you look closely, those have to be the memory connectors (part of the SRAM, because they are on all of the SRAM blocks you can see). The ALUs are in between the blocks, on the orange square in the middle.
You really can't see the ALUs or the other circuitry at all, but based on the SRAM and components visible, and comparing to what can be seen on other die shots, you have to try to guess what each part of the GPU does...
This is why it's so hard to guess what this GPU is capable of, since it has been severely customized, making it much more difficult to compare it to other known GPUs on the market.

Thanks. Is there a known ratio of SRAM blocks per ALU? That's probably a stupid question, but surely there has to be a way to figure this out.
 
Thanks. Is there a known ratio of SRAM blocks per ALU? That's probably a stupid question, but surely there has to be a way to figure this out.
Well, this is how we got to the 350 GFLOPS number, through the number of SRAM blocks and comparison to "similar" AMD designs...

But since it's a totally custom design you may never know for sure!
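For anyone curious how that number falls out of the die shot, here's a rough back-of-the-envelope sketch. Everything in it is an assumption pulled from this thread and the usual speculation (8 SIMD blocks, 40 ALUs per SIMD, the widely reported 550 MHz clock, and AMD-style ALUs issuing one multiply-add, i.e. two FLOPs, per cycle), so treat it as a guess, not a spec:

Code:
# Back-of-the-envelope GFLOPS estimate. All inputs are assumptions
# from the thread, not confirmed specs.
simd_blocks = 8            # shader blocks assumed from the die shot
alus_per_simd = 40         # the "40 ALUs per SIMD" figure discussed above
clock_hz = 550e6           # widely reported GPU clock (assumed)
flops_per_alu = 2          # one multiply-add per cycle, as on AMD R600/R700

total_alus = simd_blocks * alus_per_simd              # 320
gflops = total_alus * flops_per_alu * clock_hz / 1e9
print(f"{total_alus} ALUs -> {gflops:.0f} GFLOPS")    # 320 ALUs -> 352 GFLOPS

Swap in 160 ALUs and the same math gives roughly 176 GFLOPS, which is why the ALU count is the whole argument.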
 

Apath

Member
As always happens in these threads, the interesting posts get lost, but someone a few pages ago mentioned that it almost seems like Nintendo wants to eventually cram the hardware into the controller, which is a pretty interesting proposition.


The only sure thing is that Nintendo is more concerned about other things than raw performance, but in no way does this mean they are technologically impaired. If the chip was custom designed it was for a good reason, not just because Nintendo hates flops and wants to piss off the graphics whores.
Having the entire system in the controller would be very interesting. How would that work in relation to the TV? Have a shell that's purely for receiving a signal from the tablet and displaying it on the screen? Would you be able to play the system anywhere you want with just the tablet controller?
 
Question about tessellation: is it a fixed-function feature, and if so, is it also a big memory or bandwidth hog?

The reason I ask is Shin'en's comment about using tessellation for their next game. With Nintendo's focus on memory latency, and the high bandwidth of the eDRAM, could that also make tessellation even more efficient?
 

Mondriaan

Member
Having the entire system in the controller would be very interesting. How would that work in relation to the TV? Have a shell that's purely for receiving a signal from the tablet and displaying it on the screen? Would you be able to play the system anywhere you want with just the tablet controller?
I noticed I wasn't the first person to suggest that Nintendo might have a goal of putting the whole thing in the controller.

I would speculate that if they did do that, they could remove the wireless chip that broadcasts to the tablet screen. The TV screen connection would probably be wired. Maybe they could do without a TV screen connection if asymmetric gameplay doesn't take off. It would be a pretty big handheld if they left the disc drive in it, but a digital-content-only Nintendo console/handheld seems like batshit crazy territory to me.
 

v1oz

Member
LOL at Nintendo being secretive about the raw specs even to developers. It's like they were trying to pull a "fast one" not only on consumers but also their development partners. Coming out clean would have ended all speculation in an instant.
 

v1oz

Member
The low power draw is technically impressive, but to act like gamers should be patting Nintendo on the back for that achievement is just silly. Who's realistically going to care?
What's the power draw compared to a laptop with mid level GPU?
 

deviljho

Member
LOL at Nintendo being secretive about the raw specs even to developers. It's like they were trying to pull a "fast one" not only on consumers but also their development partners. Coming out clean would have ended all speculation in an instant.

You assume that they omitted information relevant to developers as opposed to developers being bound by NDAs. Or are you simply parroting an off-the-cuff remark from DF?
 

donny2112

Member
Coming out clean would have ended all speculation in an instant.

With how non-standard the GPU apparently is, there probably wasn't an easy way to "come clean." With PS4/720, it seems to be mostly standard PC parts with a little extra here or there, so there's a built-in familiarity with how to interpret the power. Not so, apparently, with Wii U.

There's also the current philosophical stance of Nintendo completely against going on about specs. That position can be harangued, but then it still goes back to the earlier point. :/
 

Schnozberry

Member
You assume that they omitted information relevant to developers as opposed to developers being bound by NDAs. Or are you parroting an off-the-cuff remark from DF?

Yeah, that part of the DF article was total nonsense. Like Nintendo Ninjas broke into developer offices around the world, set up devkits without anyone's knowledge, and just left them there without any documentation whatsoever.
 

v1oz

Member
You assume that they omitted information relevant to developers as opposed to developers being bound by NDAs. Or are you simply parroting an off-the-cuff remark from DF?
It's information from the DF article. They obviously have close sources within the dev community.
 

v1oz

Member
Yeah, that part of the DF article was total nonsense. Like Nintendo Ninjas broke into developer offices around the world, set up devkits without anyone's knowledge, and just left them there without any documentation whatsoever.

Nope. DF have been very reliable with all their insider info thus far.
 

deviljho

Member
With how non-standard the GPU apparently is, there probably wasn't an easy way to "come clean." With PS4/720, it seems to be mostly standard PC parts with a little extra here or there, so there's a built-in familiarity with how to interpret the power. Not so, apparently, with Wii U.

There's also the current philosophical stance of Nintendo completely against going on about specs. That position can be harangued, but then it still goes back to the earlier point. :/

Pretty much. Nintendo has very little to gain and a whole lot to lose by releasing technical specifications that aren't readily understood by the media and consumers. Also, that information is easily manipulated by their competitors, which has already happened once.

It's information from the DF article. They obviously have close sources within the dev community.

Nope. DF have been very reliable with all their insider info thus far.

With this particular piece of information, there isn't a whole lot to substantiate their claim. It's not even really a useful claim in the vague and editorial manner in which they wrote it.
 
What's the power draw compared to a laptop with mid level GPU?

Laptops in that bracket usually come with a 90 W external PSU (power brick). DTRs (bleeding-edge desktop replacements) come with 120-150 W external PSUs.

Wii U has a much lower power draw than either of those two brackets (33 W as measured by Eurogamer and AnandTech), which, again, is pretty insane now that I stop to think about it.

A Wii U (powered by a mains socket and meant to be stationary) is drawing less juice than my portable low-to-mid-end battery-operated laptop (which has a 3-4 hour battery life on a standard six-cell battery).

Damn son.
 

Schnozberry

Member
Nope. DF have been very reliable with all their insider info thus far.

So you accept as fact the contention that Nintendo purposely denied developers access to information about the hardware so they could "discover it themselves"? On what do you base this other than the conjecture in the poorly written DF article?
 
It's information from the DF article. They obviously have close sources within the dev community.
Come on!!! If you're basing what you said on that article, it was clearly an assumption they made up to explain why no info has been leaked to the internet! It's not even presented as a fact!!!
 
Question about tessellation: is it a fixed-function feature, and if so, is it also a big memory or bandwidth hog?

The reason I ask is Shin'en's comment about using tessellation for their next game. With Nintendo's focus on memory latency, and the high bandwidth of the eDRAM, could that also make tessellation even more efficient?

I would like to know where the tessellation unit is in that die shot. It's supposed to be there according to the leaked document.
 

v1oz

Member
With how non-standard the GPU apparently is, there probably wasn't an easy way to "come clean." With PS4/720, it seems to be mostly standard PC parts with a little extra here or there, so there's a built-in familiarity with how to interpret the power. Not so, apparently, with Wii U.

There's also the current philosophical stance of Nintendo completely against going on about specs. That position can be harangued, but then it still goes back to the earlier point. :/
Traditionally, consoles have always used non-standard parts, particularly GPUs. And it's never really caused too many problems when specs were made public. Nintendo, for example, published a good amount of the GCN specs.
 

deviljho

Member
It's information from the DF article. They obviously have close sources within the dev community.

Come on!!! If you're basing what you said on that article, it was clearly an assumption they made up to explain why no info has been leaked to the internet! It's not even presented as a fact!!!

Yup. The lack of a possible counter-argument presented by them as an alternative to their own claim (like NDAs) is a classic case of amateur editorializing.

Traditionally, consoles have always used non-standard parts, particularly GPUs. And it's never really caused too many problems when specs were made public. Nintendo, for example, published a good amount of the GCN specs.

Yeah, and that didn't work out so great for them when MS used Nintendo's own numbers against them. Releasing numbers to the public doesn't help them at all.
 

MDX

Member
Is the following patent:

Graphics Processing System With Enhanced Memory Controller - Patent 8098255

related to the WiiU design?

A memory controller performs a wide range of memory control related functions including arbitrating between various competing resources seeking access to main memory, handling memory latency and bandwidth requirements of the resources requesting memory access, buffering writes to reduce bus turn around, refreshing main memory, and protecting main memory using programmable registers. The memory controller minimizes memory read/write switching using a "global" write queue which queues write requests from various diverse competing resources. In this fashion, multiple competing resources for memory writes are combined into one resource from which write requests are obtained. Memory coherency issues are addressed both within a single resource that has both read and write capabilities and among different resources by efficiently flushing write buffers associated with a resource.

In the drawings, the GPU has a:

Command processor
Transform unit
Setup/rasterizer
Texture unit
Texture environment unit
Pixel engine




http://www.docstoc.com/docs/1186623...h-Enhanced-Memory-Controller---Patent-8098255

http://www.google.com/patents?id=MS...=gbs_selected_pages&cad=3#v=onepage&q&f=false
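Purely as an illustration of what the abstract is describing (none of this is Nintendo's actual implementation, and every name below is made up), the "global write queue" idea boils down to this: many competing writers post into one shared queue, and the controller drains it in bursts so the bus isn't constantly flipping between read and write modes.

Code:
from collections import deque

# Toy sketch of a global write queue: multiple resources post writes
# into one queue, and the controller flushes them in a single burst
# to minimize read/write turnaround on the memory bus.
class GlobalWriteQueue:
    def __init__(self):
        self.pending = deque()

    def post_write(self, resource, address, data):
        # Any resource (CPU, pixel engine, texture unit, ...) posts here,
        # so all writers look like a single resource to the controller.
        self.pending.append((resource, address, data))

    def flush(self, memory):
        # Drain every pending write before switching back to reads.
        while self.pending:
            _resource, address, data = self.pending.popleft()
            memory[address] = data

memory = {}
gwq = GlobalWriteQueue()
gwq.post_write("pixel_engine", 0x1000, 0xFF)
gwq.post_write("cpu", 0x2000, 0x01)
gwq.flush(memory)   # one combined write burst, then reads can proceed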
 

joesiv

Member
Yup. The lack of a possible counter-argument presented by them as an alternative to their own claim (like NDAs) is a classic case of amateur editorializing.

Well, I doubt it's as bad as DF is reporting, but having read Nintendo's previous developer docs (available in the SDKs), I'd say it's possible that Nintendo doesn't mention some things that other makers do. For example, with the Wii, Nintendo did specify the clock speeds, but for performance it literally told developers in the docs that they could expect 2x the performance of the GameCube, and that's about it.

If the "developer" DF is referencing has a Wii U developer kit, then they'd also have the SDK, which would have the documentation that is available to all developers. Having said that, it's often the case that early developer SDKs won't have most of the information (especially specifics) that is in the finalized 1.0 SDK documentation. (Perhaps these "developers" told DF what they said prior to the finalized 1.0 SDK documentation?)
 

NBtoaster

Member
So you accept as fact the contention that Nintendo purposely denied developers access to information about the hardware so they could "discover it themselves"? On what do you base this other than the conjecture in the poorly written DF article?

bgassassin was saying the same thing: clock speed and other performance metrics were not provided to devs.
 

deviljho

Member
Well, I doubt it's as bad as DF is reporting, but having read Nintendo's previous developer docs (available in the SDKs), I'd say it's possible that Nintendo doesn't mention some things that other makers do. For example, with the Wii, Nintendo did specify the clock speeds, but for performance it literally told developers in the docs that they could expect 2x the performance of the GameCube, and that's about it.

If the "developer" DF is referencing has a Wii U developer kit, then they'd also have the SDK, which would have the documentation that is available to all developers. Having said that, it's often the case that early developer SDKs won't have most of the information (especially specifics) that is in the finalized 1.0 SDK documentation. (Perhaps these "developers" told DF what they said prior to the finalized 1.0 SDK documentation?)

Sure, I am not doubting that at all. My gripe is mainly with the way DF made their broad, hyperbolic, and maybe even dismissive assessment of Nintendo's support for developers. That's not really a principle I'd expect from a team that does their kind of technical analysis. The reality is much more likely to be somewhere between nonexistent support and perfect support. I don't doubt that Nintendo is lacking in some areas or needs improvement in others.

bgassassin was saying the same thing: clock speed and other performance metrics were not provided to devs.

The devkits were not finalized until the second half of 2012. There is a poster here who is a developer and has a devkit, and this person previously knew the GPU clock speed.
 

AzaK

Member
You guys realise this is a thread about the DIE SHOT, not about whether Nintendo are lame or not. I've fallen victim to replying to some of the gunk in here, but can it stop so we can keep this thread for the tech stuff?

Make another thread about how lame Nintendo is.
 

Schnozberry

Member
bgassassin was saying the same thing: clock speed and other performance metrics were not provided to devs.

We don't know what was given and what wasn't. With NDAs we may never know. We do know that final hardware wasn't given to devs until mid-2012, straight from the source, so it's possible that Nintendo hadn't finalized it when BG got his info.
 

KingSnake

The Birthday Skeleton
And this:

Héctor Martín (@marcan42) tweeted at 11:35 PM on Tue, Feb 05, 2013:
The 2MB MEM0/EFB (framebuffer in Wii mode) is used as fast general purpose RAM in Wii U mode. Dunno about the 1MB SRAM (Wii texture cache)
(https://twitter.com/marcan42/status/298922907420200961)

Edit:

Héctor Martín (@marcan42) tweeted at 0:02 AM on Wed, Feb 06, 2013:
Hm, actually, it seems to use at least 2.75MB of MEM0. They might be throwing in the texture cache SRAM as MEM0 too.
(https://twitter.com/marcan42/status/298929740063051776)

Later edit:

Héctor Martín (@marcan42) tweeted at 0:17 AM on Wed, Feb 06, 2013:
And no, devs don't get to touch MEM0. It's Nintendo territory (kernel code), that's why you haven't heard of it before.
(https://twitter.com/marcan42/status/298933496569810944)
 

Earendil

Member
So the die shots/released info are looking to confirm R6xx 160ALU then.

Let me look back through my old posts......

Where are you getting 160 ALUs?

what the fuck are you talking about?

He's going to go back through his old posts and find something that can prove he was right all along. The Wii U is two pocket calculators double-sided taped together and the rest of us are a bunch of delusional fanboys for not seeing the truth when he warned us.
 

AzaK

Member
Marcan has another series of tweets just now. I find this interesting:

Héctor Martín (@marcan42) tweeted at 11:33 PM on Tue, Feb 05, 2013:
Oh, and for those who claim it's not a Radeon-like design: http://t.co/69ErDYjB . R6xx. Register names match AMD ones too.
(https://twitter.com/marcan42/status/298922364740190208)

Has anyone claimed it's not Radeon-based? That's news to me if so. It might not be a licensed AMD design as the Chipworks guy said (there would be AMD markings, apparently), but that doesn't mean it's not based on one. AMD could supply the "bits" to make it but Nintendo could have customised it so much that it was no longer "licensing" an AMD architecture.

NOTE: I'm just pulling shit out of my arse here, to try and make sense of it.
 
Much thanks to everyone involved for the information.

Like everyone else, I find it odd that Nintendo would exert so much effort to eke out gains in wattage efficiency for a home console. I wonder if this design was the opening move of the recently announced convergence of handheld and console hardware development; I hope there is some movement in that direction, otherwise it would have been far preferable to see these engineers do their work with a ~200 watt budget. Either way, interesting stuff and somewhat encouraging.
 