
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

Those leaks from Marcan sure are helpful. So in Wii U mode there are 34MB of eDRAM instead of the 32MB we presumed.

The Abominable Snowman said:
So the die shots/released info are looking to confirm R6xx 160ALU then.
Let me look back through my old posts......
More like 320 ALUs. Comparisons with the R700 series and the die size brought us those numbers, along with the amount of SRAM in each block.
 

AzaK

Member
Those leaks from Marcan sure are helpful. So in Wii U mode there are 34MB of eDRAM instead of the 32MB we presumed.


More like 320 ALUs. Comparisons with the R700 series and the die size brought us those numbers, along with the amount of SRAM in each block.

+1MB of SRAM. It'd be interesting to know what that is used for.


So what are the numbers we're looking at in terms of FLOPs now?
From what we know it hasn't changed: 352 GFLOPS.
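
For anyone wondering where 352 comes from, here's a quick back-of-the-envelope check. The 320 ALU count and the ~550 MHz clock are this thread's working assumptions, not confirmed specs, so treat this as a sketch:

Code:
# Rough sanity check of the 352 GFLOPS figure.
# Assumptions (thread estimates, not confirmed specs):
# 320 ALUs from the die-shot analysis, ~550 MHz GPU clock,
# and one multiply-add (2 flops) per ALU per cycle.
alus = 320
clock_hz = 550e6
flops_per_alu_per_cycle = 2  # multiply-add

gflops = alus * flops_per_alu_per_cycle * clock_hz / 1e9
print(gflops)  # 352.0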
 

v1oz

Member
Come on!!! If you're basing what you said on that article, it was clearly an assumption they made up to explain why no info has been leaked to the internet! It's not even presented as a fact!!!
That's not quite what they said. You are making the assumption that they stirred up this notion simply because no info was leaked on the internet. But you ignore the fact that Eurogamer have first-hand information of their own from developers.

This is exactly what the article states: "as we understand it, this crucial information simply wasn't available in Nintendo's papers". In other words, this information was not in the SDK. And if you go back to previous DF articles, they mention frank discussions with developers working on Wii U devkits, both anonymous and identified. So no, they didn't just make up this notion out of an information vacuum; it's based on what knowledge they have of the SDK papers.

So you accept as fact the contention that Nintendo purposely denied developers access to information about the hardware so they could "discover it themselves"? On what do you base this other than the conjecture in the poorly written DF article?
It's not a fact but a notion. And I'm accepting it based on the fact Digital Foundry have been accurate with all their previous Wii U information. They are an established source and have earned a good reputation because of their generally good journalistic standards.
 

v1oz

Member
Yeah, and that didn't work out so great for them when MS used Nintendo's own numbers against them. Releasing numbers to the public doesn't help them at all.
Microsoft releasing Nintendo's own numbers had no effect at all. People already knew that the Xbox was the most powerful console and that the PS2 was the weakest. It was generally accepted that the GCN was in the middle power-wise, which had no effect on sales because the PS2 still outsold the competition by huge amounts despite being a weaker machine than the Xbox.
 
This is exactly what the article states, "as we understand it, this crucial information simply wasn't available in Nintendo's papers".
This is the reason I believe that they are just guessing. I mean, what are they supposed to "understand" here? If it's something they were told, it would have been enough to say "we were told by a developer that"...
I mean, Nintendo's SDK was incomplete until late, but not sending this crucial info with the kit doesn't make any sense, and I wouldn't believe it unless we had some sort of reliable confirmation.

No, according to Marcan, MEM0 (the 2MB of eDRAM and 1MB of SRAM) is off limits to devs.
Yes, it's usable, but only by Nintendo. It must be some sort of shader cache or something like that.
 
I'm reading that the Wii had probably 61 GFLOPS, and the Xbox something like 8 GFLOPS. So did flops not count in the old generation, or do they not count at all?
 

v1oz

Member
This is the reason I believe that they are just guessing. I mean, what are they supposed to "understand" here? If it's something they were told, it would have been enough to say "we were told by a developer that"...
I mean, Nintendo's SDK was incomplete until late, but not sending this crucial info with the kit doesn't make any sense, and I wouldn't believe it unless we had some sort of reliable confirmation.
As "we understand it" means "to the best of our knowledge". It's not term generally used to suggest guess work.
 

wsippel

Banned
Has anyone claimed it's not Radeon based? That's news to me if so. It might not be a licensed AMD design, as the Chipworks guy said (there would be AMD markings, apparently), but that doesn't mean it's not based on one. AMD could supply the "bits" to make it, but Nintendo could have customised it so much that it was no longer "licensing" an AMD architecture.

NOTE: I'm just pulling shit out of my arse here, to try and make sense of it.
It's apparently a bit more complicated, and I'm not sure Marcan is entirely on the right track.
 

KingSnake

The Birthday Skeleton
You guys realise this is a thread about the DIE SHOT, not about whether Nintendo are lame or not. I've fallen victim to replying to some of the gunk in here, but can it stop so we can keep this thread for the tech stuff?

Make another thread about how lame Nintendo is.

This.

Please, make another thread about how Nintendo/DF suck. And really, that article from DF is not such a great piece of journalism that it needs to be mentioned on every page of this thread.
 

atbigelow

Member
"@atbigelow in Wii mode you get 24MB of MEM1 and 2MB of framebuffer and non-direct access to 1MB of texture cache."

So yeah it looks like that extra bit of RAM and such is primarily for Wii BC and then gets reused for other unknown things in Wuu mode. Cool.
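
Piecing Marcan's tweets together, the pools would roughly break down like this. This is just my reading of his numbers (names and roles as he describes them, nothing official), in a quick Python sketch:

Code:
# Rough summary of the on-die memory pools per Marcan's tweets
# (my reading, not official documentation). Sizes in MB.
pools = {
    "MEM1":       {"size_mb": 32, "type": "eDRAM",
                   "wiiu_mode": "available to games",
                   "wii_mode": "24MB exposed as Wii main memory"},
    "MEM0 eDRAM": {"size_mb": 2,  "type": "eDRAM",
                   "wiiu_mode": "reserved by Nintendo",
                   "wii_mode": "2MB framebuffer"},
    "MEM0 SRAM":  {"size_mb": 1,  "type": "SRAM",
                   "wiiu_mode": "reserved by Nintendo",
                   "wii_mode": "1MB texture cache (no direct access)"},
}

total_edram = sum(p["size_mb"] for p in pools.values() if p["type"] == "eDRAM")
print(total_edram)  # 34, matching the "34MB of eDRAM" mentioned earlier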
 

prag16

Banned
You assume that they omitted information relevant to developers as opposed to developers being bound by NDAs. Or are you simply parroting an off-the-cuff remark from DF?

Oddly, MOST people seem to be taking that DF comment as absolute gospel truth, including everybody over at beyond3D...

EDIT: Alright, caught up on last few posts; I'll stop with the DF bashing too. But I contest the notion that they've proven themselves to be the pinnacle of journalistic integrity over time, and I'll leave it at that.
 

The_Lump

Banned
Is the following patent:

Graphics Processing System With Enhanced Memory Controller - Patent 8098255

related to the WiiU design?



In the drawings, the GPU has a:

Command processor
Transform unit
Setup/rasterizer
Texture unit
Texture environment unit
Pixel engine




http://www.docstoc.com/docs/1186623...h-Enhanced-Memory-Controller---Patent-8098255

http://www.google.com/patents?id=MS...=gbs_selected_pages&cad=3#v=onepage&q&f=false


Nice post. I've no idea if it's correct but good detective skills nonetheless. Certainly chimes with a lot of what's being said by the smart people?



As mentioned by others, I wish the random Nintendo philosophy debates would stay in another thread. This one is too valuable to be derailed...

So far this thread's timeline seems to be:

-Awesome work by GAF/Chipworks
-Smart people dissecting the evidence
-Surprising numbers
-"is it better or worse than x/y"
-Overreactions
-Doom
-Smart people still dissecting the evidence
-"I was right all along"
-Doom
-DBZ debate
-Smart people still dissecting the evidence
-Nintendo smh
-Rubbish DF article
-Influx of trolls
-Corrections to surprising numbers, positive?
-Less Doom
-"is it better or worse than x/y"
-Sensible reactions
-Overreactions
-Nintendo suck at teh hardware still
-More surprising numbers, this time good?
-Forget the OP then, Nintendo suck at being a gaming company instead
-Overreactions
-Developers must hate Nintendo
-Doom
-Smart people still dissecting the evidence.


I'm gonna wait for more evidence to be dissected by smart people.


Edit: I realise I'm derailing the thread by talking about people derailing the thread. Sorry. Back to my cave I shall go.
 
It's apparently a bit more complicated, and I'm not sure Marcan is entirely on the right track.

It does seem a bit strange. And if the smaller pool of eDRAM is 1T-SRAM... why? If anything, the texture cache was more sensitive to latency. If they were going to buy 1T-SRAM at all, they could have bought it for that. It would have saved them money and transistors by not having all that SRAM.
 

DynamicG

Member
Would the Wii hardware being directly integrated into the WiiU be a benefit to Nintendo's internal devs more than others? Would it make porting Wii code to WiiU easier?
 

Mildudon

Member
So... is it eDRAM or 1T-SRAM, or is that still up in the air for now? And are we talking about the big chunk of memory or the smaller one above it?
 
I'm not sure where you're getting 61. Maybe more like 16, if you count both the CPU and GPU.

So, they've effectively jumped ~22x in GFLOPS and ~10x in RAM?

After 10 years of working with more or less the same architecture, that must feel like an infinite amount of power to Nintendo's internal devs. I imagine it's also quite intimidating - presumably, after being so efficient for so long, it would take a little while to fully comprehend how far you can push the machine.
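
Quick arithmetic check on that ~22x figure, using the rough numbers from this page (~16 GFLOPS for the Wii with CPU and GPU combined vs 352 GFLOPS for Latte; both are thread estimates rather than official specs):

Code:
# ~22x GFLOPS jump, using this page's rough figures
# (thread estimates, not official specs).
wii_combined_gflops = 16   # CPU + GPU, per the post above
latte_gflops = 352

print(latte_gflops / wii_combined_gflops)  # 22.0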
 

The_Lump

Banned
So... is it eDRAM or 1T-SRAM, or is that still up in the air for now? And are we talking about the big chunk of memory or the smaller one above it?

One of the smaller pools above, I believe. That Marcan chap on Twitter seems adamant it's 1T-SRAM (the one with all the tiny banks, above left?).
 
Without going too far off topic, I just want to thank everyone for putting their time and effort into this. It's been a great read.

Also, have there been any theories regarding 'O'? That looks like a lot of silicon with a tiny bit of SRAM compared to the other sectors.
 
352 GFLOPS isn't too bad.
It's not massively outperformed by the Durango, and the Orbis isn't that far away either.

At least there's not a full gen jump between the Wii U and Durbis.
The jump from the Wii to the Wii U is also pretty massive, so as a Nintendo fan it'll certainly be refreshing.
 

RurouniZel

Asks questions so Ezalc doesn't have to
On one hand, it is a bit much to pretend that the performance of this box is impressive outside of size and efficiency. In real-world performance, there's nothing under the hood that stands up to anything close to high-end or even mid-range in the PC market. That's just the cold, hard, unadulterated truth, and trying to use efficiency to paint these numbers as anything other than conservative in comparison to what else is on the market is just a thinly veiled attempt to compensate for what is an incredibly modest machine.

On the other hand, I think trying to tell people not to be impressed or to imply that they shouldn't be satisfied with these specs is a bit much. In fact, it's asinine. Personal impressions depend solely on their own expectations and what they personally want out of the machine. Stating that no one cares how efficient Latte is because it doesn't stand up to what's out there is just self-absorbed and immature on its face. Being satisfied with lower tech doesn't immediately make someone a luddite. It's also incredibly depressing to see people latch onto the idea that graphical performance is all that matters to the general public after how the monstrous HD twins struggled so much earlier this gen. Of course graphical performance matters, but it's not the be-all and end-all determining factor for public appeal, even if that's how you feel personally. There is a general sense of condescension and pity among people who prefer graphical showcases towards people who don't, and frankly it's pathetic and insulting.

Of course, there's nothing wrong whatsoever with being disappointed with the path Nintendo has chosen to take. From a performance perspective in 2013, it's severely disappointing when compared to other modern products. But for some, graphics are just a means to an end. I just wish people would get off their high horses and accept that some people have different tastes, without all the crassness.

Quoting this because thank you. I hate how it's practically unacceptable to people here that people like me can like games on high-end, mid-end, and low-end devices alike, because it's not the graphics tech that determines whether I'm enjoying a game or not.
 

BeauRoger

Unconfirmed Member
I can't believe that they have allowed themselves to get stuck in the same situation yet again. The Wii was missing out on all the big third-party games because of the hardware difference between it and the HD consoles. Now here we are, with the Wii U being extremely underpowered compared to the upcoming competing consoles. If the Wii sold extremely well and even then didn't get the big titles, then what hope does the Wii U have? Will this be yet another generation where Nintendo is the only company developing worthwhile games for their console?
 

schuelma

Wastes hours checking old Famitsu software data, but that's why we love him.
I can't believe that they have allowed themselves to get stuck in the same situation yet again. The Wii was missing out on all the big third-party games because of the hardware difference between it and the HD consoles. Now here we are, with the Wii U being extremely underpowered compared to the upcoming competing consoles. If the Wii sold extremely well and even then didn't get the big titles, then what hope does the Wii U have? Will this be yet another generation where Nintendo is the only company developing worthwhile games for their console?

On the other hand, frankly, I'm not at all convinced 3rd parties were going to jump in en masse no matter how easy porting is.
 
I can't believe that they have allowed themselves to get stuck in the same situation yet again. The Wii was missing out on all the big third-party games because of the hardware difference between it and the HD consoles. Now here we are, with the Wii U being extremely underpowered compared to the upcoming competing consoles. If the Wii sold extremely well and even then didn't get the big titles, then what hope does the Wii U have? Will this be yet another generation where Nintendo is the only company developing worthwhile games for their console?

I'm no expert on technical aspects, but I think I understand enough to state that the gap between the Wii and PS360 was in a whole different league than the gap between the Wii U and Durangorbis.

EDIT: And what schuelma said.
 

SmokeMaxX

Member
I can't believe that they have allowed themselves to get stuck in the same situation yet again. The Wii was missing out on all the big third-party games because of the hardware difference between it and the HD consoles. Now here we are, with the Wii U being extremely underpowered compared to the upcoming competing consoles. If the Wii sold extremely well and even then didn't get the big titles, then what hope does the Wii U have? Will this be yet another generation where Nintendo is the only company developing worthwhile games for their console?

Well, if Nintendo had a repeat of the Wii all over again, I don't think they'd be too upset.
 
Hmmm, taking Marcan's info into account, I believe there is some erroneous info in the OP. I checked it against the motherboard, and the following schematic works out.

-The interface on the left and top left of the photo appears to run to the DDR3.
-The interface on the lower right-hand corner runs to the CPU.
-The 7x on the lower right-hand corner does indeed seem to go to the USB ports.
-Not sure about the tank oscillator, but SATA (to the disc drive?) seems likely.
-The two rectangles directly to the right of N8 run to the HDMI controller and to the module that presumably houses the dual ARM core for Gamepad encode/decode, so they must be video adapters of some sort.

I know this is not the info many of you are interested in, but the proximity of the interfaces to the unknown blocks of RAM/logic may give us some more clues.
 

AniHawk

Member
i have no idea what is going on, but it's been fascinating watching people decipher this picture. it's like someone translating hieroglyphics.
 
So, they've effectively jumped ~22x in GFLOPS and ~10x in RAM?

After 10 years of working with more or less the same architecture, that must feel like an infinite amount of power to Nintendo's internal devs. I imagine it's also quite intimidating - presumably, after being so efficient for so long, it would take a little while to fully comprehend how far you can push the machine.

The Wii GPU was 12 GFLOPS and the Wii U GPU is apparently 352, so it's a 29x power leap. When you factor in the updated architecture and the fixed-function stuff, it will probably end up around a 40x power leap. That is an unreal leap (the highest ever in a single generation?), and I imagine the first-party developers are in heaven.

The Wii had 88MB of RAM and the Wii U has 1GB for games, so an 11x leap there. Not sure about the CPU.

Overall, if you are comparing the Wii U to the Wii, it's an incredible power leap.
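
Putting that post's ratios in numbers, using its own figures (again thread estimates, not confirmed specs):

Code:
# Ratios from the post above (thread estimates, not confirmed specs).
wii_gpu_gflops, latte_gflops = 12, 352
wii_ram_mb, wiiu_game_ram_mb = 88, 1024  # 1GB available to games

print(latte_gflops / wii_gpu_gflops)   # ~29.3x raw GPU GFLOPS
print(wiiu_game_ram_mb / wii_ram_mb)   # ~11.6x game-accessible RAM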
 

BeauRoger

Unconfirmed Member
I'm no expert on technical aspects, but I think I understand enough to state that the gap between the Wii and PS360 was in a whole different league than the gap between the Wii U and Durangorbis.

EDIT: And what schuelma said.

That seems to be the case, yes. The gap certainly seems smaller than it was last gen. That's far from the only factor, though. Is it enough, given their status and rep among 3rd parties, and considering how last gen looked for Nintendo, to simply say "look guys, it's not quite as weak as it was last gen!"?
 

majik13

Member
The Wii GPU was 12 GFLOPS and the Wii U GPU is apparently 352, so it's a 29x power leap. When you factor in the updated architecture and the fixed-function stuff, it will probably end up around a 40x power leap. That is an unreal leap (the highest ever in a single generation?), and I imagine the first-party developers are in heaven.

The Wii had 88MB of RAM and the Wii U has 1GB for games, so an 11x leap there. Not sure about the CPU.

Overall, if you are comparing the Wii U to the Wii, it's an incredible power leap.

how many giggleflips was the GC?
 
The Wii GPU was 12 GFLOPS and the Wii U GPU is apparently 352, so it's a 29x power leap. When you factor in the updated architecture and the fixed-function stuff, it will probably end up around a 40x power leap. That is an unreal leap (the highest ever in a single generation?), and I imagine the first-party developers are in heaven.

The Wii had 88MB of RAM and the Wii U has 1GB for games, so an 11x leap there. Not sure about the CPU.

Overall, if you are comparing the Wii U to the Wii, it's an incredible power leap.

Well, that's even more holy crap.

(I don't think I've said it yet, but thank you to Chipworks and all the GAF members who've brought this together, it's a fantastic job).
 
Hmmm, taking Marcan's info into account, I believe there is some erroneous info in the OP. I checked it against the motherboard, and the following schematic works out.

-The interface on the left and top left of the photo appears to run to the DDR3.
-The interface on the lower right-hand corner runs to the CPU.
-The 7x on the lower right-hand corner does indeed seem to go to the USB ports.
-Not sure about the tank oscillator, but SATA (to the disc drive?) seems likely.
-The two rectangles directly to the right of N8 run to the HDMI controller and to the module that presumably houses the dual ARM core for Gamepad encode/decode, so they must be video adapters of some sort.

I know this is not the info many of you are interested in, but the proximity of the interfaces to the unknown blocks of RAM/logic may give us some more clues.

In reference to what, if I may ask?
 

USC-fan

Banned
Hmmm, taking Marcan's info into account, I believe there is some erroneous info in the OP. I checked it against the motherboard, and the following schematic works out.

-The interface on the left and top left of the photo appears to run to the DDR3.
-The interface on the lower right-hand corner runs to the CPU.
-The 7x on the lower right-hand corner does indeed seem to go to the USB ports.
-Not sure about the tank oscillator, but SATA (to the disc drive?) seems likely.
-The two rectangles directly to the right of N8 run to the HDMI controller and to the module that presumably houses the dual ARM core for Gamepad encode/decode, so they must be video adapters of some sort.

I know this is not the info many of you are interested in, but the proximity of the interfaces to the unknown blocks of RAM/logic may give us some more clues.

Wow, that guy is going off on Twitter. Also, I see a lot of talk that the 320 figure is wrong and the real number is lower. Hmmmm
 