
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis

I have a question for you tech guys. I have a Wii U with some games (ZombiU, Nintendo Land, Pikmin 3), and I also played the Bayonetta 2, Mario 3D World, and Mario Kart 8 demos when I went to an event (Paris Games Week), and none of these games seems to have proper anti-aliasing. I'm a bit biased because I also play on my PC, always with max AA on. Is there a reason for that? I will buy those games no matter what, but I was wondering about the lack of AA, taking into account the big fat eDRAM (32 MB), the resolution (720p), and the newer GPU architecture.

AA takes processing power and bandwidth. It takes time for devs to figure out ways to squeeze in a decent AA without taking away from their graphics pipeline for polygons and shaders.
 

Jburton

Banned
I don't know, but it's being tossed around like it's a fact:

[Image: compairison-chart1.jpg]

That chart is wrong; the stream processor values for PS4 and X1 are the wrong way around.
 

The Boat

Member
I don't know if it's my eyes or if I'm just not sitting that close to the TV, but I never notice the aliasing in these Wii U games that I know don't have AA. If I look for it I can see it, or if I move closer to the screen, but from where I'm playing? It doesn't bother me. I'm thankful I'm not eagle-eyed like that.
 
AA takes processing power and bandwidth. It takes time for devs to figure out ways to squeeze in a decent AA without taking away from their graphics pipeline for polygons and shaders.

Bandwidth might be the issue after all, if what I have gathered is accurate. It certainly has enough eDRAM for 2xMSAA, but AFAIK the only game to use it is BLOPS2. Of course, consoles seem to be making do with shader-based techniques like FXAA since the move to deferred rendering, so it could just reflect that general trend. The eDRAM is surely being used for V-sync via triple buffering in many titles, which is why you don't see tearing in any games (besides Darksiders). I wonder what 35.2 GB/s could achieve...
 

Powerwing

Member
Thanks for the answers, but I don't think the lack of at least 2xMSAA is time-related.
I've read some tech articles on PS3 and Xbox 360 AA techniques, and what I understand is that going from Quincunx AA to MLAA took time on PS3 because you had to figure out how to do AA using the SPUs and it was a new AA technique, whereas MSAA is about the simplest AA technique; it just needs a certain amount of bandwidth and framebuffer space. The Xbox 360 has 10 MB and could not do 1280x720 with 2xMSAA (sub-HD with 2xMSAA, yes); devs needed to use tiling for that. The Wii U has something like 32 or 38 MB (not sure), so as Fourth Storm says, it's perhaps a bandwidth issue. Again, I'm not an expert, just discussing this particular tech point.
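To put very rough numbers on the framebuffer argument (a back-of-the-envelope sketch assuming 32-bit color plus 32-bit depth/stencil per sample; real layouts vary with formats and compression):

```python
# Rough render-target sizes at 1280x720, assuming 4 bytes of color and
# 4 bytes of depth/stencil per sample (actual layouts depend on formats).
width, height = 1280, 720
bytes_per_sample = 4 + 4

for samples in (1, 2, 4):  # no AA, 2xMSAA, 4xMSAA
    size_mb = width * height * bytes_per_sample * samples / (1024 ** 2)
    print(f"{samples}x: {size_mb:.1f} MB")

# 1x: ~7.0 MB, 2x: ~14.1 MB, 4x: ~28.1 MB
# So 2xMSAA at 720p overflows the 360's 10 MB of eDRAM (hence tiling),
# but fits comfortably in 32 MB.
```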
 
Bandwidth might be the issue after all, if what I have gathered is accurate. It certainly has enough eDRAM for 2xMSAA, but AFAIK the only game to use it is BLOPS2. Of course, consoles seem to be making do with shader-based techniques like FXAA since the move to deferred rendering, so it could just reflect that general trend. The eDRAM is surely being used for V-sync via triple buffering in many titles, which is why you don't see tearing in any games (besides Darksiders). I wonder what 35.2 GB/s could achieve...
Wouldn't that mean a 512-bit bus between the GPU and eDRAM? I thought some people here had already counted 1024 pins between the memory and the rest of the chip...
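For reference, the arithmetic behind that question (this assumes one transfer per 550 MHz GPU clock, which is itself just a guess; the real eDRAM interface width and transfer rate are unconfirmed):

```python
# Peak bandwidth = bus width in bytes x clock x transfers per clock.
# Assumes one transfer per 550 MHz GPU clock; the actual interface is unknown.
clock_hz = 550e6

def bandwidth_gb_s(bus_bits, transfers_per_clock=1):
    return bus_bits / 8 * clock_hz * transfers_per_clock / 1e9

print(bandwidth_gb_s(512))    # 35.2 GB/s -> matches the quoted figure
print(bandwidth_gb_s(1024))   # 70.4 GB/s -> if all ~1024 counted pins carried data
```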
 
Here's a question I have asked in other threads with less than satisfying answers-- meaning I got nothing but joke posts.

What are the differences, good and bad, between eDRAM in the WiiU and eSRAM in the X1?
 
If they're saying that, then we have to take it as a strong possibility. They're not your average developer who never tinkers with the hardware; quite the contrary. Chances are they either know via documentation or got to that number through other means.
 
To throw a bone here: there are only a few designs using 192 shader units, and coincidentally they were introduced the same year the Wii U launched.

They are:

Radeon HD 7540D/7520G/7400G (Trinity IGP's)
Released: October, May, September 2012 (APU: A6-5400K/A6-4400M/A6-4355M)
Codename: Scrapper
Based on: HD 69xx's VLIW4 ISA
192 Shader Units
12 Texture Mapping Units
4 Raster Output Processors

Memory Bus: DDR3 Memory Controller (DDR3-1866/1600/1333)

292 GFlops @ 760 MHz (HD 7540D)
190 GFlops @ 496 MHz (HD 7520G)
125 GFlops @ 327 MHz (HD 7400G)

211 GFlops @ 550 MHz (Wii U GPU clock)


Take it as you will, but seeing how the timeline fits, perhaps it's not so far-fetched to believe they have something in common beyond the already obvious DDR3 bus (VLIW4, or some structural/architectural decisions, perhaps). If only we could see a GPU die shot from those. Still, we've looked into IGP similarities here before, and we found the blocks to be more akin to those than to R600/R700 parts... I think?
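For reference, those GFLOPS figures are just the usual shader count x 2 FLOPs per cycle x clock calculation:

```python
# GFLOPS = shader units x 2 FLOPs/cycle (multiply-add) x clock in GHz.
def gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000

print(gflops(192, 760))   # ~292  -> HD 7540D
print(gflops(192, 550))   # 211.2 -> 192 shaders at the Wii U's 550 MHz
print(gflops(160, 550))   # 176.0 -> the often-quoted 160-shader figure
```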
 

FLAguy954

Junior Member
I have a question:

Why is it so hard for some here to believe Nintendo customized the GPU to have an extra cluster of shaders? Because that is what it looks like to me.
 
Wouldn't that mean a 512-bit bus between the GPU and eDRAM? I thought some people here had already counted 1024 pins between the memory and the rest of the chip...

I have the highest res pic available of the chip. The pins to the eDRAM are an odd number and probably not indicative of bandwidth.
 
I have a question:

Why is it so hard for some here to believe Nintendo customized the GPU to have an extra cluster of shaders? Because that is what it looks like to me.
In this case, it is partially because it would be a very unusual customization. BG is now implying that Latte has 160 ALUs but 192 threads. So... one ALU can handle 6 threads? I will have to read up on how that could work.
Because it goes against the lower number. I guarantee you, if this dev had come out and said the Wii U had any number under 160, making it weaker, the reaction would be that he has to be right: he's a dev working on the console, on an upcoming visual standout that's releasing on Wii U but was cancelled for PS3/360 because they didn't have the specs needed to run the game the way the devs wanted.
Personally, I would be asking how we are getting what we are seeing in some of these Wii U games with such a low shader unit count.
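As a purely arithmetic aside (not a claim about Latte's actual layout), here is how the two rumored shader counts would decompose into VLIW units:

```python
# How the rumored totals split into VLIW4 (4-wide) or VLIW5 (5-wide) units.
# Purely arithmetic; not a statement about Latte's real organization.
for total in (160, 192):
    for width in (4, 5):
        if total % width == 0:
            print(f"{total} ALUs = {total // width} x VLIW{width} units")
        else:
            print(f"{total} ALUs does not divide evenly into VLIW{width} units")
```

Notably, 192 only divides cleanly for VLIW4, which is part of why the Trinity IGP comparison above is interesting.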
 
So the Wii U has secret sauce after all? I think the GPU is just different enough to make the "next gen" or modern GPU features not cost effective to use in 360/PS3 ports. As we are seeing in Super Mario 3D World, more modern effects are being used and I'm sure the Wii U can do much more than that.
 

AzaK

Member
So the Wii U has secret sauce after all? I think the GPU is just different enough to make the "next gen" or modern GPU features not cost effective to use in 360/PS3 ports. As we are seeing in Super Mario 3D World, more modern effects are being used and I'm sure the Wii U can do much more than that.

The Wii U just has a much more recent feature set compared to those old, shitty machines. What was it, DX9 equivalent on the 360 and something like DX10.x on the Wii U? Yes, I know it's not actually DX, but you get my drift.

It also has more than 3x the eDRAM of the 360, so it's gotta be faster/better/more efficient in some areas.
 

AzaK

Member
You would, but you would have people I won't mention come in and say that it should be right, no need to question it, the Wii U is a last-gen system tech-wise... when in reality it isn't; it's just not as next-gen as a PS4. It's in the middle, just like we have heard from the beginning...

Saying it's in the middle is a bit of a stretch. 1.8 TFLOPS on the PS4 vs 176 GFLOPS of raw GPU. 12.8 GB/s of main memory bandwidth vs 176 GB/s.

I know not all GFLOPS are created equal, but man, no way is that in the middle. The Wii U is an enhanced 360/PS3, that's all. That may be fine for some (it would be for me if the rest of the platform as a whole weren't bad), but people need to just admit where it sits.
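For context, here is where those main-memory bandwidth figures come from (the Wii U number assumes the commonly reported 64-bit DDR3-1600 configuration):

```python
# Peak main-memory bandwidth = bus width in bytes x effective transfer rate.
# Wii U assumes the commonly reported 64-bit DDR3-1600 setup; PS4 is 256-bit
# GDDR5 at 5500 MT/s.
def peak_bw_gb_s(bus_bits, mt_per_s):
    return bus_bits / 8 * mt_per_s * 1e6 / 1e9

print(peak_bw_gb_s(64, 1600))    # 12.8 GB/s  (Wii U DDR3, as reported)
print(peak_bw_gb_s(256, 5500))   # 176.0 GB/s (PS4 GDDR5)
```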
 

efyu_lemonardo

May I have a cookie?
To throw a bone here: there are only a few designs using 192 shader units, and coincidentally they were introduced the same year the Wii U launched.

They are:

Radeon HD 7540D/7520G/7400G (Trinity IGP's)
Released: October, May, September 2012 (APU: A6-5400K/A6-4400M/A6-4355M)
Codename: Scrapper
Based on: HD 69xx's VLIW4 ISA
192 Shader Units
12 Texture Mapping Units
4 Raster Output Processors

Memory Bus: DDR3 Memory Controller (DDR3-1866/1600/1333)

292 GFlops @ 760 MHz (HD 7540D)
190 GFlops @ 496 MHz (HD 7520G)
125 GFlops @ 327 MHz (HD 7400G)

211 GFlops @ 550 MHz (Wii U GPU clock)


Take it as you will, but seeing how the timeline fits, perhaps it's not so far-fetched to believe they have something in common beyond the already obvious DDR3 bus (VLIW4, or some structural/architectural decisions, perhaps). If only we could see a GPU die shot from those. Still, we've looked into IGP similarities here before, and we found the blocks to be more akin to those than to R600/R700 parts... I think?

I thought we knew for a fact that Latte was VLIW5..
edit: well, not for a fact, but it seemed to be supported by other evidence as well..
 
I thought we knew for a fact that Latte was VLIW5..
edit: well, not for a fact, but it seemed to be supported by other evidence as well..
Well, I don't know, but I think going back over things is sometimes essential to getting it right.

We've hit lots of walls throughout this thread (and even before) because of this proprietary design, so we might as well take into account what a dev has to say.

I don't really know how that fits with the core die shot we have, though.
 
I wish we still had some bigger developers outside of Nintendo pushing the hardware. I'd love to see what a modern day Factor 5 could have done with it, and what they'd have to say about its performance.
Factor 5 is a kind of phantom menace (Star Wars puns, oh yeah). They had expertise and history with Nintendo, but they only favored them for as long as their platforms were powerful, or at least competitive, which was the case with the N64 and GameCube. That's why they jumped ship on the Wii, and sure, they were trying to go back, but only because they realized there was a niche there and they had no other options.

"If" Factor 5 were around today, and Lair had paid off, they'd be on PS4, not on XBone and not on Wii U; those guys craved processing power.

EDIT: oops double post.
 

prag16

Banned
That's why I will be a Nintendo fan for life. The quality of their work is amazing... how often do they have to use patches just to make a game run properly? But I guess it is what it is when you have a 38-man team trying to make a game in under a year.

Is it even that many people? Is there a source for that?
 

krizzx

Junior Member
I wonder if DF will rerun the tests and update. They should if it's a big fix.

I believe that SC Blacklist also got a performance patch, but no one re-reviewed it afterward.

Still, this is interesting info.

One unofficial dev states the Wii U got faster with the final devkits. Another states it has 192 shaders (which is more in line with what I was suggesting long ago).

This is certainly interesting.
 

Narroo

Member
Actual performance, that's what Nintendo built this around. Yes, they customized it and went their own route; it should still be able to run all current engines and get decent ports. We live in an age of customization and optimization: look at these big engines being ported down to freaking cell phones. EA said it when they were buddy-buddy with Nintendo: it's a stop-gap console, better than what the last ones offered but not on the same level as the PS4 and XB1. They even said they thought the Wii U would be the PS2 of this generation, and the PS2 was the weakest of the 6th gen. Not everything is measured in numbers: the GameCube was 3x less powerful than the original Xbox, but on a technical level no game that gen outdid what Rogue Squadron did with 9 months of development as a launch game, and Resident Evil 4 was my best-looking game of that gen, with the original version built from the ground up for the GameCube.


It's sounding like a Dreamcast right now. The DC came out at an odd time and was kinda in-between the PS1 and PS2 eras. Hopefully it doesn't suffer the same fate.
 

efyu_lemonardo

May I have a cookie?
"If" Factor 5 was around today, and Lair had paid off, they'd be on PS4, not on XBone and not on Wii U, dudes craved for processing power.

That's kind of a sad thought, though. I mean, is it impossible for a big developer to want to push a piece of hardware, even if it isn't cutting edge (or at least cutting edge at the time of release)?

Are future Nintendo consoles doomed to be fully exploited only by first-party teams and demosceners?
 

Log4Girlz

Member
Seems like COD Ghosts isn't any better on the Wii U than BO2 was:

http://www.eurogamer.net/articles/digitalfoundry-call-of-duty-ghosts-face-off

Very slightly higher resolution than 360 but significantly worse framerate.

"Besides the variance in overall image quality, the main difference between the current-gen consoles lies with how the artwork is presented on each platform. While the core assets are fundamentally identical across all consoles, both the PS3 and Wii U versions of Call of Duty: Ghosts feature a few problems where lower-resolution textures are prominently displayed across characters and the environments. In many instances the impact of this is relatively minor and hard to notice beyond the occasional blurry poster or door texture, but at worst we see large walls and entire characters appearing on screen with what looks like lower-quality level-of-detail settings applied, sometimes on a permanent basis throughout a particular scene and even a whole level. This is particularly prevalent on the Wii U game, where the drop in artwork quality is much bigger and occurs far more frequently."

Dafuq
 
^ Probably so, unless there was HUGE financial gain to be had (see the PSone and PS2 cases).

But look at the Wii: theoretically it should have had that, and they still managed to turn a blind eye to it.
 
The most common thing in these ports is that the 1 GB of available memory is only used for game code and for holding textures sized for the 360/PS3. Porting over to Wii U means you have to go through each line of code, making sure that textures aren't being accessed directly by the GPU from the slow DDR3, which leads to bottlenecks.
 

prag16

Banned
When you beat the game, the credits list the Treyarch Wii U dev team by name, and there are 38 of them.

I'll look for myself at some point, but I'd have to wonder how many of them are "full time" on the Wii U version, how many are divided between projects, and how many are credited just to be credited with minimal direct involvement.
 
Personally, I would be asking how we are getting what we are seeing in some of these Wii U games with such a low shader unit count.

I appreciate everyone trying to provide information and theories about Latte, but I have not heard anything to justify:

- Iwata calling it GPGPU (160 ALUs for that without any customization is laughable)
- The DOF and lighting; especially the DOF, since I have heard of PC games dropping to half the FPS when DOF effects are enabled.
 

tipoo

Banned
I appreciate everyone trying to provide information and theories about Latte, but I have not heard anything to justify:

- Iwata calling it GPGPU (160 ALUs for that without any customization is laughable)

What? That doesn't take much to justify. Any GPU from the unified generation can be called a GPGPU. A GPGPU isn't a special "thing", it just means any GPU that can perform general processing operations. Which is anything from the Nvidia 8000 series and AMD 2000 series and onwards. ALUs have nothing to do with it, it doesn't imply more or less.
 
What? That doesn't take much to justify. Any GPU from the unified generation can be called a GPGPU. A GPGPU isn't a special "thing", it just means any GPU that can perform general processing operations. Which is anything from the Nvidia 8000 series and AMD 2000 series and onwards. ALUs have nothing to do with it, it doesn't imply more or less.

I know what you mean, but if you have less power it means you will have your GPU 100% on graphics tasks, making anything more out of it virtually impossible. I mean PS4 has 4 CUs dedicated to that.
 

krizzx

Junior Member
The most common thing in these ports is that the 1 GB of available memory is only used for game code and for holding textures sized for the 360/PS3. Porting over to Wii U means you have to go through each line of code, making sure that textures aren't being accessed directly by the GPU from the slow DDR3, which leads to bottlenecks.

How do you know this? I've never heard of such a thing.
 

AzaK

Member
I know what you mean, but if you have less power it means you will have your GPU 100% on graphics tasks, making anything more out of it virtually impossible. I mean PS4 has 4 CUs dedicated to that.

Exactly and this is how the Wii U works (And every other GPU capable of GP). Some portion of graphics fidelity will have to be sacrificed to go GP on the GPU. How much will depend on what you want to do of course.

It's just that Wii U is anaemic to start with.
 

tipoo

Banned
I know what you mean, but if you have less power it means you will have your GPU 100% on graphics tasks, making anything more out of it virtually impossible. I mean PS4 has 4 CUs dedicated to that.

Ah, I see what you're saying now, but even a weak GPU could still be used for things like texture decompression, and maybe even a few in-game things like fluid simulation. The option is there at least for developers to see what they can do with it, so it still qualifies for being called a GPGPU.
 
Ah, I see what you're saying now, but even a weak GPU could still be used for things like texture decompression, and maybe even a few in-game things like fluid simulation. The option is there at least for developers to see what they can do with it, so it still qualifies for being called a GPGPU.
There are also some features in Latte that are helpful/needed for proper GPGPU support, but for some reason they were not enabled in the version of the documentation that BG saw.

It should also be noted that these shader units seem to punch above their weight, and the dev from Project CARS specifically attributed that to the eDRAM. Unfortunately, we still don't know too much about that.

Did we come to an agreement on what is going on with the "ROPs"?
 
There are also some features in Latte that are helpful/needed for proper GPGPU support, but for some reason they were not enabled in the version of the documentation that BG saw.

It should also be noted that these shader units seem to punch above their weight, and the dev from Project CARS specifically attributed that to the eDRAM. Unfortunately, we still don't know too much about that.

Did we come to an agreement on what is going on with the "ROPs"?
Has this been confirmed? I've been reading this thread and the other one, and I didn't see BG saying that his documentation was old.
If those changes are effectively there, could you at least point out some of them, even if only in a vague form (like saying "it has more memory here than it should" or "it can do X that allows Y")?

Thanks!
 
Has this been confirmed? I've been reading this thread and the other one, and I didn't see BG saying that his documentation was old.
If those changes are effectively there, could you at least point out some of them, even if only in a vague form (like saying "it has more memory here than it should" or "it can do X that allows Y")?

Thanks!
He did mention it early on in either this thread or the other one. As I understand it, the documentation said that the feature is in the hardware but is not enabled for use at this time. We don't know how old the docs are, AFAIK.
 

tipoo

Banned
And the 360 version has screen tearing all over.

True. Do we still think Nintendo is enforcing Vsync, and DS2 was just some crazy bastard child? And I'm not saying one was better than the other, just bringing up a point.

AA is bandwidth and compute intensive, Vsync isn't as much.
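Very rough numbers on why enforced V-sync is comparatively cheap (a sketch assuming 32-bit color at 720p, not a statement about how Nintendo actually allocates the eDRAM):

```python
# Memory cost of double/triple buffering at 1280x720 with 32-bit color.
buffer_mb = 1280 * 720 * 4 / (1024 ** 2)
for buffers in (2, 3):
    print(f"{buffers} buffers: {buffers * buffer_mb:.1f} MB")
# 2 buffers: ~7.0 MB, 3 buffers: ~10.5 MB, well within 32 MB of eDRAM,
# whereas MSAA multiplies the whole render target per sample.
```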
 
What's interesting to me about the lack of AA in Wii U games is that when a game is in motion, it looks fine; you can't even tell. It's only when nothing is moving that it stands out. It's the same with Arkham Origins: terrible jaggies when you just sit there and look at a motionless screen, but when you are actually playing, or watching someone play, there are some jaggies, yet all in all it's night and day.

I'm not sure what they are doing for this to be the case, but in all honesty, if someone thinks less of a game because of how it looks while not playing it, then I don't know what to tell you. The idea of screenshots telling the whole story is outdated.
 
True. Do we still think Nintendo is enforcing Vsync, and DS2 was just some crazy bastard child? And I'm not saying one was better than the other, just bringing up a point.

AA is bandwidth and compute intensive, Vsync isn't as much.

It should be noted that the game was outsourced to Human Head Studios and was not made by the same team that ported Arkham City last year (which is likely why it doesn't even have the touch features from the previous game). They are a relatively small company and this was their first Wii U game, so unsurprisingly there were some issues porting over the current-gen code, like there were at launch. I wonder if the game could be patched to improve performance, like what may happen with CoD: Ghosts.
 