
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


krizzx

Junior Member
I don't think we have enough information to dismiss any theory. Fourth Storm put a lot of hard work into his analysis and no one has refuted it yet. We have alternate theories, but I don't think we'll know whose theory pans out until we see what Nintendo has to offer at E3. It should tell us immediately whether we can expect something beyond what the PS3 and 360 have already shown us. More importantly, the gameplay needs to make the gamepad look unique and novel. Not every game necessarily, but at least a couple need to be appealing enough that it makes you wonder why no one thought of that before.

What Nintendo has to offer has no bearing on the physical components of the hardware, though I fear people will try to draw such conclusions. As I have said many times, Nintendo never tries to max out their hardware. Nothing they release will show the full potential of the console.
 

Hermii

Member
What Nintendo has to offer has no bearing on the physical components of the hardware, though I fear people will try to draw such conclusions. As I have said many times, Nintendo never tries to max out their hardware. Nothing they release will show the full potential of the console.

Are you saying the Galaxy games, DKCR or Skyward Sword do not max out the Wii?
 

krizzx

Junior Member
Are you saying the Galaxy games, DKCR or Skyward Sword do not max out the Wii?

I doubt they came close, and how did DKCR and Skyward Sword get on that list? They no more maxed out the Wii than Mario Sunshine maxed out the GameCube or Mario 64 maxed out the N64. Power and performance have never been Nintendo's focus, even in games. They are all about art and creativity.

Mario Galaxy ran at 60 FPS with no load times and no performance hiccups. That alone lets you know that it's not even remotely pushing the system. Heck, a person hacked the Wii version and got it to run split screen with Luigi as a second character, and there was still no sign of frame drop. Nintendo always plays it safe.

http://www.youtube.com/watch?v=Pqef07U0LNs

If you can name a single Wii game that pushed more than 20 million polygons at 60 FPS with bump mapping, normal mapping, dynamic lighting, bloom and self shadowing (Rebel Strike did this on the GC), then you may be looking at a game that came close to maxing the Wii. Just finding a Wii game that did normal mapping is a rarity. Few devs attempted to get all the console had to offer, and none of them really succeeded. Factor 5 probably would have, but they went bankrupt mid-production of Lair for the Wii, which I think they said was going to run at 720p on the console (the Wii is capable of outputting in HD, in case you weren't aware).
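For a sense of scale, here's that benchmark as per-frame arithmetic (a quick sketch; the 20 million figure is the claim above for Rebel Strike, not a verified spec):

```python
# Per-frame polygon count implied by the claimed throughput.
polys_per_sec = 20_000_000
fps = 60
print(f"{polys_per_sec / fps:,.0f} polygons per frame")  # ~333,333
```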
 

Schnozberry

Member
What Nintendo has to offer has no bearing on the physical components of the hardware, though I fear people will try to draw such conclusions. As I have said many times. Nintendo never tries to max their hardware. Nothing they release will show the full potential of the console.

Who else will? Retro is probably your best shot, well them or Monolithsoft.
 
probably around 8-10x in real terms (not paper numbers)

microsoft themselves quantified the overall durango as around 8x as powerful as 360 in that wired article. since wii u is roughly equal to a 360 so far imo...
Yes, because marketing departments rarely embellish with useless numeric multipliers.
 

disap.ed

Member
Probably needs a lot of power to handle 1080p. So I would put it near 4-5x better when all is said and done.

That's still a huge leap over Wii U.

That's what I would guess as well. Could mean 720p30 for WiiU and 1080p60 for XB1.
Wouldn't be too bad, I guess; I can live with that, since I also have a capable PC in case of bad ports.
 

AzaK

Member
Probably needs a lot of power to handle 1080p. So I would put it near 4-5x better when all is said and done.

That's still a huge leap over Wii U.

It may even be more favourable to Wii U depending on targets. If Wii U hits that 352 GFLOPs and XBO is 1.2 TFLOPs, that's 4ish. If XBO devs target 1080p or 60fps, then that closes the raw GPU grunt gap a lot. XBO has 6 faster gaming cores compared to 3, which is a real plus and there's not much Wii U can do about that, but in non-CPU-bound games it might not matter.
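As a quick sanity check on that reasoning (a sketch using the speculated figures from this thread; none of these numbers are confirmed specs):

```python
# Raw GPU gap vs. the gap per pixel once resolution targets differ.
wiiu_gflops = 352.0    # speculated high-end Wii U figure
xbo_gflops = 1200.0    # the ~1.2 TFLOPs XBO estimate

raw_gap = xbo_gflops / wiiu_gflops
print(f"raw gap: {raw_gap:.1f}x")                      # ~3.4x

# If XBO targets 1080p while Wii U targets 720p, XBO pushes
# 2.25x the pixels, which eats into that raw advantage:
pixel_ratio = (1920 * 1080) / (1280 * 720)
print(f"per-pixel gap: {raw_gap / pixel_ratio:.1f}x")  # ~1.5x
```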

Of course it's all speculation and guesstimates but Wii U might get lucky.
 

z0m3le

Banned
I don't think we have enough information to dismiss any theory. Fourth Storm put a lot of hard work into his analysis and no one has refuted it yet. We have alternate theories, but I don't think we'll know whose theory pans out until we see what Nintendo has to offer at E3. It should tell us immediately whether we can expect something beyond what the PS3 and 360 have already shown us. More importantly, the gameplay needs to make the gamepad look unique and novel. Not every game necessarily, but at least a couple need to be appealing enough that it makes you wonder why no one thought of that before.

Fourth Storm's theory is that TEV logic in the ALUs is why they are so much bigger (91% of the size of two 20-ALU Brazos blocks), which really doesn't make sense when you think about how big Flipper actually is: ~26 million transistors, not including the eDRAM (which would be replaced by Wii U's eDRAM). That is small enough to fit in a tiny corner of the die. It would certainly not double the size of the ALUs, especially when you consider these ALUs lack DX11 and SM5.

I'm not sure why people even hold his theory in higher regard than others. It might have more effort put into it than most, but it lacks concrete data and, like all the others, is heavily based on assumptions, some of which lack logic; for instance, why would the ALUs be twice as big without the features posted above, when the TEV logic itself would be very small on this chip?
 

z0m3le

Banned
Probably needs a lot of power to handle 1080p. So I would put it near 4-5x better when all is said and done.

That's still a huge leap over Wii U.

Depends. At the bottom it is 176 vs. 1288 GFLOPs, which is a little over a 7x difference. At the top, however, 352 vs. 1288 GFLOPs is still likely, seeing as each SPU could fit 40 DX10/SM4 ALUs without issue at ~92% of the size of Brazos's two SPUs with 20 ALUs inside them.

That makes the difference only 3.6x for that particular comparison. The PS4, by the way, has a 43% performance advantage over XBone if the architectures are indeed very similar. It might not sound like a lot, but that is an extra 555 GFLOPs that can be used for GPGPU functions like realistic clothing, paper and hair. The extra RAM of the PS4 over XBone can also be used for added tessellation, which could really make the difference over XBone stand out.
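For reference, here's where figures like these typically come from (a sketch; GPU FLOPs for this era of AMD hardware are usually counted as ALUs × 2 ops per cycle for a multiply-add × clock, and the Latte ALU counts are this thread's speculation, not confirmed specs):

```python
def gflops(alus: int, clock_ghz: float) -> float:
    # 2 ops/cycle assumes one fused multiply-add per ALU per clock.
    return alus * 2 * clock_ghz

print(gflops(160, 0.55))  # 176.0 -- the 160-ALU Wii U estimate
print(gflops(320, 0.55))  # 352.0 -- the 320-ALU Wii U estimate
print(1288 / 352)         # ~3.66 -- the "only 3.6x" comparison
print(1843 / 1288)        # ~1.43 -- PS4's ~43% edge over XBone
print(1843 - 1288)        # 555   -- the extra GFLOPs mentioned above
```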

XBone is more or less what we originally thought Nintendo would create: a low-end performer. Though it has more RAM than we ever expected Nintendo to use, it fits in with the original ~1 TFLOPs of GPU performance we thought Wii U would have. (This was before E3 2011, IIRC.)
 

czk

Typical COD gamer
Mario Galaxy ran at 60 FPS with no load times and no performance hiccups. That alone lets you know that it's not even remotely pushing the system. Heck, a person hacked the Wii version and got it to run split screen with Luigi as a second character, and there was still no sign of frame drop. Nintendo always plays it safe.

http://www.youtube.com/watch?v=Pqef07U0LNs
 
I doubt they came close, and how did DKCR and Skyward Sword get on that list? They no more maxed out the Wii than Mario Sunshine maxed out the GameCube or Mario 64 maxed out the N64. Power and performance have never been Nintendo's focus, even in games. They are all about art and creativity.
Those are very different scenarios.

Mario 64 was a launch title, on a generation of hardware that was hard to develop for, with lots of trial and error involved too, seeing as it was the first acclaimed "full" 3D game.

Mario Sunshine was not a launch title; it already built on Nintendo's experience with, say... Pikmin.

And I agree Skyward Sword doesn't push the system, but TP did, for the GC. It's a matter of ambition: SS is more satisfied in being confined, so it does less; it's not trying to prove anything, so it's not pushing the envelope in that sense.
Mario Galaxy ran at 60 FPS with no load times and no performance hiccups. That alone lets you know that it's not even remotely pushing the system. Heck, a person hacked the Wii version and got it to run split screen with Luigi as a second character, and there was still no sign of frame drop. Nintendo always plays it safe.

http://www.youtube.com/watch?v=Pqef07U0LNs
That doesn't mean it doesn't push the system.

Sure, they had overhead left (and optimizations possible), but they're using the hardware in a very thoughtful manner.
If you can name a single Wii game that pushed more than 20 million polygons at 60 FPS with bump mapping,normal mapping, dynamic lighting, bloom and self shadowing(Rebel Strike did this on the GC), then you may be looking at a game that came close to maxing the Wii. Just finding a Wii game that did normal mapping is a rarity. Few devs attempted to get all the console had to offer and none of them really succeeded. Factor 5 probably would have but they went bankrupt midproduction of Lair for the Wii which I think they said was going to run at 720p on the console(the Wii capable of outputting in HD in case you weren't aware).
Normal mapping is overrated on a system like the Wii; you have loads of theoretical polygons for an SD console, so you might as well use them instead of faking them (Metroid Prime 3, for instance, as opposed to Conduit doing normal maps in those enclosed rooms). Talk about diminishing gains.

And AFAIK Factor 5 didn't use it.


"Lair" would never run at 720p on the Wii, lol; for starters framebuffer memory is too small (2 MB) it's small for 640x480+z-buffer hence the chronic dithering and almost no one using AA settings (720p would place that under an even bigger strain/require extra RAM from the main 24 MB 1T-SRAM bank); nor would it be called Lair, because that's a Sony trademark; just like Shadows of the Eternals isn't Eternal Darkness 2 because Nintendo owns that. Also bare in mind Factor 5 were trying to attract investment, they'd say anything at that point. Yes, they were capable of pulling something standing appart graphically, but that's not the only way to use your resources.

That's like saying Majora's Mask on the N64 didn't properly use the extra RAM just because, instead of extra resolution and ultra-detailed textures, it used it to track characters and the like. Rogue Squadron games had very little game logic going on, and no complex physics either, but lots of TEV manipulation; that's fine, but most can think of better-rounded methods of using said hardware in a more balanced manner.
 

NBtoaster

Member
I doubt they came close, and how did DKCR and Skyward Sword get on that list? They no more maxed out the Wii than Mario Sunshine maxed out the GameCube or Mario 64 maxed out the N64. Power and performance have never been Nintendo's focus, even in games. They are all about art and creativity.

The people that work on their engines are focused on looks and performance.
 

disap.ed

Member
"Lair" would never run at 720p on the Wii, lol; for starters framebuffer memory is too small (2 MB) it's small for 640x480+z-buffer hence the chronic dithering and almost no one using AA settings (720p would place that under an even bigger strain/require extra RAM from the main 24 MB 1T-SRAM bank); nor would it be called Lair, because that's a Sony trademark; just like Shadows of the Eternals isn't Eternal Darkness 2 because Nintendo owns that. Also bare in mind Factor 5 were trying to attract investment, they'd say anything at that point. Yes, they were capable of pulling something standing appart graphically, but that's not the only way to use your resources.

IIRC they wanted to "bypass" the 2MB framebuffer and use the 24MB main RAM instead to enable 720p.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
IIRC they wanted to "bypass" the 2MB framebuffer and use the 24MB main RAM instead to enable 720p.
The problem with Wii's inability to go above 480p had nothing to do with the size of the embedded framebuffer.

The embedded framebuffer (eFB) is not what the Cube/Wii show on your TV - the eFB is a back buffer. What you see on the TV is the external framebuffer (xFB), which, lo and behold, resides in MEM1 (24MB). You can use the eFB as a tile buffer and build up an xFB of much larger resolution, not unlike how the 360 would build up its primary buffer while using its scarce 10MB of back buffer. The problem on Wii was in the VI (video interface) - the TV signal encoder, which simply could not do > 480p.
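A minimal sketch of the tiling scheme blu describes. The GX calls are stubbed out with hypothetical names; the real API differs, and this only illustrates the control flow:

```python
# Build a large external framebuffer (xFB) in main RAM by rendering
# the scene in eFB-sized tiles and copying each one out.
TILE_W, TILE_H = 640, 360   # one tile fits the ~2 MB eFB at 6 B/pixel
XFB_W, XFB_H = 1280, 720    # target xFB resolution, living in MEM1

def set_viewport_and_scissor(x, y, w, h):  # stand-in for GX viewport setup
    print(f"viewport/scissor at ({x},{y}) {w}x{h}")

def draw_scene(scene):                     # stand-in: re-submit the geometry
    pass

def copy_efb_tile(xfb, x, y, w, h):        # stand-in for the eFB -> MEM1 copy
    pass

def render_frame(scene, xfb):
    # Render the scene once per tile, clipped to that tile, then copy
    # the finished eFB contents out to the right spot in the xFB.
    for ty in range(0, XFB_H, TILE_H):
        for tx in range(0, XFB_W, TILE_W):
            set_viewport_and_scissor(tx, ty, TILE_W, TILE_H)
            draw_scene(scene)
            copy_efb_tile(xfb, tx, ty, TILE_W, TILE_H)
    # Per blu's point, the bottleneck is scanout: the VI encoder
    # cannot output the finished xFB above 480p, whatever its size.

render_frame(scene=None, xfb=bytearray(XFB_W * XFB_H * 2))  # YUV 4:2:2 xFB
```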
 
IIRC they wanted to "bypass" the 2MB framebuffer and use the 24MB main RAM instead to enable 720p.
Even if they could, what's the point?

That's like pushing current-gen consoles to 1080p and feeling obliged to go through a major downgrade in order to pull it off. 720p (or under) is often a better payoff.
The problem with Wii's inability to go above 480p had nothing to do with the size of the embedded framebuffer.

The embedded framebuffer (eFB) is not what the Cube/Wii show on your TV - the eFB is a back buffer. What you see on the TV is the external framebuffer (xFB), which, lo and behold, resides in MEM1 (24MB). You can use the eFB as a tile buffer and build up an xFB of much larger resolution, not unlike how the 360 would build up its primary buffer while using its scarce 10MB of back buffer. The problem on Wii was in the VI (video interface) - the TV signal encoder, which simply could not do > 480p.
That's interesting, thanks for the insight.

The dudes/reports saying they somehow tested 720p were talking out of their asses?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
The dudes/reports saying they somehow tested 720p were talking out of their asses?
Maybe they tested render-target resolutions of 720p, which I've never tried, but again, that should not be impossible from what I recall. How they'd manage to take that to the TV is entirely beyond me, though; I've actually experimented with that, unsuccessfully.
 

wsippel

Banned
Didn't Marcan state a while ago that the eDRAM on Latte is actually not eDRAM at all, but still 1T-SRAM? 1T-SRAM is often marketed as eDRAM, because that's technically what it is, but it has slightly lower density yet is quite a bit faster (lower, SRAM-like latency) at the same clock speed and bus width.
 
Didn't Marcan state a while ago that the eDRAM on Latte is actually not eDRAM at all, but still 1T-SRAM? 1T-SRAM is often marketed as eDRAM, because that's technically what it is, but it has slightly lower density yet is quite a bit faster (lower, SRAM-like latency) at the same clock speed and bus width.

i'm pretty sure he's just mistaken
 

wsippel

Banned
i'm pretty sure he's mistaken
I don't know. He wrote that he's seen Renesas 1T-SRAM, and what's on Latte looks just like it. Also, I'm not sure it would be possible to run Wii software if the memory timing isn't exactly like it should be. And it's impossible to get the exact same timing using regular eDRAM. It needs to be either real SRAM or some form of pseudostatic RAM.

It doesn't really have to be MoSys 1T-SRAM. Fujitsu FCRAM or Numonyx PSRAM are pretty much exactly the same thing as far as I know.
 
I don't know. He wrote that he's seen Renesas 1T-SRAM, and what's on Latte looks just like it. Also, I'm not sure it would be possible to run Wii software if the memory timing isn't exactly like it should be. And it's impossible to get the exact same timing using regular eDRAM. It needs to be either real SRAM or some form of pseudostatic RAM with no or completely hidden refresh cycles.

It doesn't really have to be MoSys 1T-SRAM. Fujitsu FCRAM or Numonyx PSRAM are pretty much exactly the same thing as far as I know.

oh ok i thought mosys were the only makers of 1tsram, maybe then
 

KingSnake

The Birthday Skeleton
Based on the information available now, XB1 can draw up to 100W. If I remember correctly, Wii U can draw up to 75W (this is not the real usage). So I think it is quite safe to say that 8x-10x more power on XB1 is a little bit out of the question. Even compared to the real wattage: 33W, and assuming XB1 will come close to 100W (in an ideal world). Isn't it?
 

z0m3le

Banned
Based on the information available now, XB1 can draw up to 100W. If I remember correctly, Wii U can draw up to 75W (this is not the real usage). So I think it is quite safe to say that 8x-10x more power on XB1 is a little bit out of the question. Even compared to the real wattage: 33W, and assuming XB1 will come close to 100W (in an ideal world). Isn't it?

I think the SoC is 40nm too, which definitely puts a stop to some ideas about what 33 watts can and can't do, as we know that XBone is 1.2 TFLOPs.
http://www.neogaf.com/forum/showthread.php?t=560464 according to Wired it's 40nm.

Edit: 28nm, corrected by phosphor112.
 

wsippel

Banned
so it wont make any difference? we still looking at 70 or 140 then?
how much better for latency is it?
I know there is a difference, and it's big enough for pseudostatic RAM to be a thing, but I don't know how big the difference actually is. Nintendo apparently loves the stuff, though; they've used it in every console since the GameCube and every handheld since at least the DSi (the DS supposedly used real SRAM).
 

krizzx

Junior Member
Those are very different scenarios.

Mario 64 was a launch title, on a generation of hardware that was hard to develop for, with lots of trial and error involved too, seeing as it was the first acclaimed "full" 3D game.

Mario Sunshine was not a launch title; it already built on Nintendo's experience with, say... Pikmin.

And I agree Skyward Sword doesn't push the system, but TP did, for the GC. It's a matter of ambition: SS is more satisfied in being confined, so it does less; it's not trying to prove anything, so it's not pushing the envelope in that sense. That doesn't mean it doesn't push the system.

Sure, they had overhead left (and optimizations possible), but they're using the hardware in a very thoughtful manner. Normal mapping is overrated on a system like the Wii; you have loads of theoretical polygons for an SD console, so you might as well use them instead of faking them (Metroid Prime 3, for instance, as opposed to Conduit doing normal maps in those enclosed rooms). Talk about diminishing gains.

And AFAIK Factor 5 didn't use it.


"Lair" would never run at 720p on the Wii, lol; for starters, framebuffer memory is too small (2 MB). It's small even for 640x480 plus a z-buffer, hence the chronic dithering and almost no one using AA settings (720p would place it under an even bigger strain/require extra RAM from the main 24 MB 1T-SRAM bank). Nor would it be called Lair, because that's a Sony trademark; just like Shadows of the Eternals isn't Eternal Darkness 2, because Nintendo owns that. Also bear in mind Factor 5 were trying to attract investment; they'd say anything at that point. Yes, they were capable of pulling off something that stands apart graphically, but that's not the only way to use your resources.

That's like saying Majora's Mask on the N64 didn't properly use the extra RAM just because, instead of extra resolution and ultra-detailed textures, it used it to track characters and the like. Rogue Squadron games had very little game logic going on, and no complex physics either, but lots of TEV manipulation; that's fine, but most can think of better-rounded methods of using said hardware in a more balanced manner.

Your entire scenario is different from mine.

It seems you and I have very different concepts of what "maxing out" a system is. By your logic, when Insomniac said Resistance maxed out the capabilities of the PS3, it was true. Then they come back and say they got 40% more performance with Ratchet and Clank, and something like 80% more performance with Resistance 2. So all of those games maxed out the system by your methods. By my concept of maxing out, only the one that did the most did.



Any game can reach "its limit" on a console. I can write 10 lines of code and use all the resources of the system hardware. That is not maxing out the system, though. That is just using up all of the resources. Maxing out means optimizing to the point where there is nothing more that you can get out of the hardware. It means using all of the hardware's most advanced features to their highest potential.

Nothing Nintendo releases ever does this. Any game that runs at 60 FPS could get double the performance at 30. If you never hit an instance of slowdown, then that means you were nowhere near peaking the limits of the hardware. Mario Galaxy left so much leeway that it could still maintain a solid 60 FPS with the screen split. There is a difference between a game being/looking good and a game pushing a system. I like MG 2, but I know the hardware can do far more than that.
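The frame-time budget behind that 60-vs-30 argument, as simple arithmetic:

```python
# A frame must finish within its budget to hold the target framerate.
for fps in (60, 30):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")
# 60 FPS -> 16.7 ms per frame
# 30 FPS -> 33.3 ms per frame
# A rock-solid 60 means every frame fits in 16.7 ms, so dropping the
# target to 30 roughly doubles the per-frame work available.
```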

Also, ATI said the chip was HD capable, but Nintendo had it locked in the firmware. Even the PS2 and Xbox 1 could output in HD. The PS2 output at 1080i in actual games, and it only had 4 MB of eDRAM with lower texture bandwidth than the GC's embedded 1T-SRAM. Nintendo could have unlocked the higher resolutions if they wanted to. They just didn't.


One of the homebrew apps for the Wii allowed you to play "some" videos at 720p: "Files with MPEG-2 TS video streams (typically HDTV recorded .ts files) with resolutions up to 1280x720 play correctly."
 
does this. Any game that runs at 60 FPS could get double the performance at 30. If you never hit an instance of slowdown, then that means you were nowhere near peaking the limits of the hardware.

That's a completely nonsensical metric. Completely maxing out a console would be stupid. No one wants to play a game at some ridiculously low framerate. Maxing should be looked at in terms of how optimized a game is for a particular framerate (30 or 60), not at any arbitrary rate, because devs specifically target those two.
 

krizzx

Junior Member
That's a completely nonsensical metric. Completely maxing out a console would be stupid. No one wants to play a game at some ridiculously low framerate. Maxing should be looked at in terms of how optimized a game is for a particular framerate (30 or 60), not at any arbitrary rate, because devs specifically target those two.
When did 30 FPS become "ridiculously low"? I guess that means Mario Sunshine was unplayable and Killzone 4 will be.

Also, you are missing the entirety of what I'm saying. The main point was that when Mario Galaxy achieved 60 FPS, it did it with a ton of leeway. Nintendo didn't even try to "push" the system. People are letting their opinion of the game itself prevent them from seeing it technically. You are viewing it as fans rather than analysts. Also, that was not "all" that I looked at or pointed out.

Like lostinblue, your concept of maxing out is completely different from mine. Your view seems to be emotional, based on the game's appeal to you. What devs target is their business and completely irrelevant to what I'm talking about. How I feel about the game/hardware is irrelevant.

Speaking purely in terms of what the hardware can possibly achieve, no game pushed the Wii anywhere near its optimal limits, i.e. maxing it out by my concept. There were no games that used every stage of the TEV to its optimized limit or pushed out the highest-level texture effects it could achieve. There were no games that pushed the polygon potential of the Wii to the highest achievable in a real-world scenario like Rebel Strike (which did it at 60 FPS, by the way). There were no games that optimized the Wii CPU to do physics/AI to the "best" of its capability. There were games that had these things, but they were far from optimal. By that metric, no game maxed out the Wii's capabilities.
 
Also, you are missing the entirety of what I'm saying. The main point was that when Mario Galaxy achieved 60 FPS, it did it with a ton of leeway. It didn't even try to "push" the system. People are letting their opinion of the game itself prevent them from seeing it on a technical level.

Nah. If you ask me, it was a great decision to go for a rock-stable 60 fps. Of course, that also means there will be many scenes in the game which could hit 120 fps, but that's just how it is. In any case, this has nothing to do with technical achievement.
 

AzaK

Member
Based on the information available now, XB1 can draw up to 100W. If I remember correctly, Wii U can draw up to 75W (this is not the real usage). So I think it is quite safe to say that 8x-10x more power on XB1 is a little bit out of the question. Even compared to the real wattage: 33W, and assuming XB1 will come close to 100W (in an ideal world). Isn't it?
Isn't the 100W just for the SoC, though? I'm on my phone so can't easily find where I read it yesterday.
 

tipoo

Banned
Based on the information available now, XB1 can draw up to 100W. If I remember correctly, Wii U can draw up to 75W (this is not the real usage). So I think it is quite safe to say that 8x-10x more power on XB1 is a little bit out of the question. Even compared to the real wattage: 33W, and assuming XB1 will come close to 100W (in an ideal world). Isn't it?

Ach, no, we dismissed this months ago. The Wii U power supply is rated for a max of 75 watts, and it's typical of consoles, and Nintendo consoles in particular, to overprovision their PSUs by up to double (for capacitor aging, efficiency, etc.). And the One has an *APU* that draws 100 watts, not the whole system. In the Wii U, the entire *system* draws 33W at the wall, including PSU inefficiency, the disc drive, etc., so less than 33W is going to the CPU and GPU together.

The One is also on 28nm vs. 45nm on the Wii U (or 45nm CPU and 40nm GPU, I forget). Even at the exact same wattage, the 28nm part would be able to do a lot more per watt and per mm².

Almost nothing ever uses 100% of its power supply rating, as that's just not smart. If a capacitor aged just 0.5% and you tried a 100% load, the system would crash.
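Rough numbers behind that point (a sketch; the 90% PSU efficiency figure is an assumption, not a measured spec):

```python
psu_rating_w = 75       # Wii U PSU maximum rating
wall_draw_w = 33        # measured whole-system draw at the wall
psu_efficiency = 0.90   # assumed conversion efficiency

print(f"rating is {psu_rating_w / wall_draw_w:.1f}x the wall draw")  # ~2.3x
print(f"~{wall_draw_w * psu_efficiency:.0f}W DC reaches the board")  # ~30W
# The CPU+GPU get only part of that ~30W (disc drive, RAM and I/O
# share it), so comparing the 75W PSU rating against the XB1 APU's
# ~100W alone wildly overstates the Wii U's power budget.
```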

Isn't the 100W just for the SoC, though? I'm on my phone so can't easily find where I read it yesterday.



That is right. Just the APU, not the total system.
 

tipoo

Banned
They're not both using eDRAM. The One uses eSRAM, the U uses eDRAM. eDRAM is 3x as dense; however, it must refresh its cells, and under 4MB it has higher latency.
 

The_Lump

Banned
OK, noob question: from a developer's perspective, what's the practical difference between a system with an x86 CPU and what the Wii U is toting (PowerPC)?
 