
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


The_Lump

Banned
That's just silly. It's impossible...



Because after you take out the PSU, it completely rules out 320 ALUs. After the PSU you are looking at 28.8W @ 80% for the whole console.


It doesn't, as blu has correctly pointed out.

We're still probably looking at 160 based on other observations (see Fourth Storm), but this new wattage info doesn't back that up at all.

Hopefully USC comes back and discusses this with blu a bit more?
 
Facts!! I have stated the facts and I try to be as objective as possible.
Sorry, but you're not trying hard enough if you don't understand that 60fps is as much a design decision as 1080p is.
It's great that Nintendo and Platinum are aiming for 60fps, but you make it sound like 60fps is some kind of next-gen characteristic when it's something that Nintendo already aimed for on the Wii. Insomniac as well with the Ratchet & Clank series.

And I'm not just referring to the launch ports like you say, but to multiplatform games in general (like Splinter Cell). Not a single 30fps PS360 game is running at 60fps on the Wii U, and I don't think you will ever see that. Not because of unfinished dev tools, time, budgets or unoptimised code, but because the Wii U hardware doesn't allow it without sacrificing some of the existing graphical detail.

The existence of 720p/30fps games on the Wii U shouldn't be treated as aberrations or proof of sloppy ports. They are a design decision made by developers who aim to maximise graphical detail and preserve as much of the detail the artists have created as possible. Don't expect X and many other games built from the ground up for the Wii U to run at 60fps either.
 

69wpm

Member
M°°nblade;78512061 said:
And I'm not just referring to the launch ports like you say, but to multiplatform games in general (like Splinter Cell). Not a single 30fps PS360 game is running at 60fps on the Wii U, and I don't think you will ever see that. Not because of unfinished dev tools, time, budgets or unoptimised code, but because the Wii U hardware doesn't allow it without sacrificing some of the existing graphical detail.

You do know that vsync cuts the fps in half, right? Devs could've gone for 60fps without vsync.
 

The_Lump

Banned
M°°nblade;78512061 said:
Sorry, but you're not trying hard enough if you don't understand that 60fps is as much a design decision as 1080p is.
It's great that Nintendo and Platinum are aiming for 60fps, but you make it sound like 60fps is some kind of next-gen characteristic when it's something that Nintendo already aimed for on the Wii. Insomniac as well with the Ratchet & Clank series.

And I'm not just referring to the launch ports like you say, but to multiplatform games in general (like Splinter Cell). Not a single 30fps PS360 game is running at 60fps on the Wii U, and I don't think you will ever see that. Not because of unfinished dev tools, time, budgets or unoptimised code, but because the Wii U hardware doesn't allow it without sacrificing some of the existing graphical detail.

The existence of 720p/30fps games on the Wii U shouldn't be treated as aberrations or proof of sloppy ports. They are a design decision made by developers who aim to maximise graphical detail and preserve as much of the detail the artists have created as possible. Don't expect X and many other games built from the ground up for the Wii U to run at 60fps either.

You do know that vsync cuts the fps in half, right? Devs could've gone for 60fps without vsync.


I'm beginning to think v-sync is mandatory (locked on) for Wii U. Probably to do with GamePad latency? Which is fine by me, as I'll take 30fps clean over 60fps with tearing :D
 
I'm beginning to think v-sync is mandatory (locked on) for Wii U. Probably to do with GamePad latency? Which is fine by me, as I'll take 30fps clean over 60fps with tearing :D

I think Darksiders 2 didn't have vsync. Also, if it were mandatory, it would be unlikely that certain titles advertised as 60fps could actually be locked at 60fps. Just my opinion.
 

69wpm

Member
I'm beginning to think v-sync is mandatory (locked on) for Wii U. Probably to do with GamePad latency? Which is fine by me, as I'll take 30fps clean over 60fps with tearing :D

Yeah, I said the same thing a few pages back and the Darksiders 2 argument came up. But that still leads to the conclusion that the Wii U has enough power to enable v-sync and that every dev prefers it over 60fps with screen tearing, and I agree. The last generation was terrible when it came to performance; I don't care about all the bells and whistles when a game can't sustain a solid 30fps framerate with v-sync. Cut the crap out and give me something to look at which doesn't make me want to puke.
 
You do know that vsync cuts the fps in half, right? Devs could've gone for 60fps without vsync.
From my understanding, vsync locks the framerate at 30fps when the hardware is unable to run the game at 60fps on a 60Hz display. It means that the Wii U version would run the game somewhere between 30-59fps without vsync.
 

NBtoaster

Member
You do know that vsync cuts the fps in half, right? Devs could've gone for 60fps without vsync.

That's not true. Triple buffering lets you run at any framerate without tearing (and is what almost all Wii U games use). Without vsync they would certainly not hit 60fps, as many already struggle with 30.
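To make the double- vs triple-buffering difference concrete, here's a tiny simulation. It's a simplified model, not how any actual engine schedules frames, and the 20 ms render time and 60 Hz display are assumed figures for illustration only:

```python
# Simplified model: a hypothetical 20 ms render time on a 60 Hz display.
# With double-buffered v-sync the GPU stalls until the next v-blank after every
# frame; with a third buffer it keeps rendering and each finished frame is shown
# at the next v-blank, so the result sits between 30 and 60 fps with no tearing.
REFRESH = 1 / 60      # v-blank interval (~16.7 ms)
RENDER = 0.020        # assumed render time per frame (20 ms)
FRAMES = 600

def double_buffered_fps():
    t = 0.0
    for _ in range(FRAMES):
        t += RENDER
        t = (t // REFRESH + 1) * REFRESH   # stall until the next v-blank
    return FRAMES / t

def triple_buffered_fps():
    # each finished frame is picked up at whichever v-blank comes next
    flips = {(i * RENDER // REFRESH + 1) * REFRESH for i in range(1, FRAMES + 1)}
    return len(flips) / max(flips)

print(round(double_buffered_fps()))   # ~30
print(round(triple_buffered_fps()))   # ~50
```

Same render cost in both cases; the extra buffer is what keeps the effective rate near 50 instead of snapping down to 30.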
 

MDX

Member
That's not true. Triple buffering lets you run at any framerate without tearing (and is what almost all Wii U games use). Without vsync they would certainly not hit 60fps, as many already struggle with 30.

Please tell us which games, and which developers were behind those games.
 

NBtoaster

Member
Please tell us which games, and which developers were behind those games.

As I said, almost all Wii U games use triple buffering. Potentially every game except Darksiders 2. As for those that struggle at 30fps... AC3, ME3, Batman AC, RE: Rev, Sonic Racing, Black Ops 2.
 
Oh those lazy un-optimized launch titles. I see.

From the digitalfoundry Splinter cell face-off thread:
http://neogaf.com/forum/showthread.php?t=662477


Save for drops to the 25fps mark during some cut-scenes, the good news is that both PS3 and 360 versions hit this midway 30fps target consistently enough to make gameplay feel fluid. However, this is only made possible by means of some very aggressive full-screen tearing. Adaptive v-sync is in place for each platform, which causes frames to chop in half when the hardware is being pushed beyond its limits. Engaging...


V-sync is permanently engaged for Nintendo's platform, adding to its credentials as one of the better-presented versions of the game overall - and it even outperforms the 360 during earlier in-engine cinematics. However, from the moment Sam Fisher enters the Paladin aircraft, it's clear this isn't to last. We suffer from lengthy bouts of 20fps performance during most briefing scenes here, plus similarly sluggish levels of refresh while at the heart of larger shoot-outs.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
M°°nblade;78516621 said:
From my understanding, vsync locks the framerate at 30fps when the hardware is unable to run the game at 60fps on a 60Hz display. It means that the Wii U version would run the game somewhere between 30-59fps without vsync.
That's not what v-sync does. V-sync just blocks the individual page flip from occurring outside of the v-retrace. You can have vsync on and get an average fps which is not a multiple of 30, e.g. if 90% of the time you're at 60fps but 10% of the time you're at 30, your average fps is 57. Most games that strive for a fixed fps (60, 30) can still get a dropped frame now and then, which degrades their average fps.
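Putting that example in numbers, just as a trivial check of the time-weighted average:

```python
# Time-weighted average frame rate: 90% of play time at 60 fps, 10% at 30 fps.
shares = {60: 0.9, 30: 0.1}   # fps -> share of total play time
print(sum(fps * share for fps, share in shares.items()))   # 57.0
```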
 
That's not what v-sync does. V-sync just blocks the page flip from occurring outside of the v-retrace. You can have vsync on, and get an average fps which is not a multiple of 30. That's how most games that strive for a fixed fps (60, 30) can still get a dropped frame now and then, which degrades their average fps.
Oh thanks for the info.
 

mf luder

Member
Load times are really starting to spoil my enjoyment. Even without the install option, how does it take four times as long to load as on PS3?
 
M°°nblade;78521029 said:
From the digitalfoundry Splinter cell face-off thread:
http://neogaf.com/forum/showthread.php?t=662477


Save for drops to the 25fps mark during some cut-scenes, the good news is that both PS3 and 360 versions hit this midway 30fps target consistently enough to make gameplay feel fluid. However, this is only made possible by means of some very aggressive full-screen tearing. Adaptive v-sync is in place for each platform, which causes frames to chop in half when the hardware is being pushed beyond its limits. Engaging...


V-sync is permanently engaged for Nintendo's platform, adding to its credentials as one of the better-presented versions of the game overall - and it even outperforms the 360 during earlier in-engine cinematics. However, from the moment Sam Fisher enters the Paladin aircraft, it's clear this isn't to last. We suffer from lengthy bouts of 20fps performance during most briefing scenes here, plus similarly sluggish levels of refresh while at the heart of larger shoot-outs.

Wii U gameplay
http://www.youtube.com/watch?feature=player_embedded&v=Fg33bG2LgkY

PS3 vs 360 gameplay
http://www.youtube.com/watch?feature=player_embedded&v=u-o7o_21gmU

Wii U vs 360 gameplay
http://www.youtube.com/watch?v=uApPoVzMxBE

Wii U clearly looks better, and during gameplay it runs just as well. 20 FPS is during cut scenes and not gameplay. So meh. Which was the same attitude Ubisoft had about this when they shipped the game.
 
Why do these guys bitch about the missing option to install the game to flash storage, but don't test the eShop version? Would have been nice to know if loading times and overall performance are better.

Because a lot of people still prefer physical discs. I do, and Nintendo really needs to add this option to the console. Load times are terrible. Good thing that once you start playing you don't deal with any until your next mission.
 

NBtoaster

Member
So if Wii U has 8 TMUs, does that leave it with 4.4Gigatexels per second? That's a third of what PS3 has and half of 360, and could explain Splinter Cell's AF and texture resolution problems.
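For reference, the arithmetic behind those figures, with the commonly cited clocks taken as assumptions (Latte ~550MHz, Xenos 500MHz, RSX 550MHz; one bilinear texel per TMU per clock):

```python
# Peak texel fill rate = TMUs * core clock (one bilinear fetch per TMU per clock).
def gtexels(tmus, clock_mhz):
    return tmus * clock_mhz / 1000.0

print(gtexels(8, 550))    # Wii U, if Latte has 8 TMUs:  4.4 GT/s
print(gtexels(16, 500))   # Xbox 360 (Xenos):            8.0 GT/s
print(gtexels(24, 550))   # PS3 (RSX):                  13.2 GT/s
```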
 
I don't understand why the game doesn't feature any AF on Wii U. If it were as powerful as claimed, surely even that would not be a problem? Is it another case of lazy devs? If so, Nintendo need to reclaim their dev kits and give them to a more deserving developer.
 

Schnozberry

Member
I don't understand why the game doesn't feature any AF on Wii U. If it were as powerful as claimed, surely even that would not be a problem? Is it another case of lazy devs? If so, Nintendo need to reclaim their dev kits and give them to a more deserving developer.

Clearly time-constrained devs. The game runs like ass on all three systems, and looks nowhere near as good as the best efforts on any platform. It was also completely broken for a lot of PC users on day one. The game got pushed out the door on fire.
 
http://www.youtube.com/watch?feature=player_embedded&v=Fg33bG2LgkY

Wii U clearly looks better, and during gameplay it runs just as well. 20 FPS is during cut scenes and not gameplay.
The LoT vid clearly shows it drops to the mid or low 20s during combat.
And this is what digital foundry says as well:
The Wii U version is completely v-synced, incidentally, but the trade-off is that it holds at the 20fps line during direct combat, while the PS3 version is the most stable performer.
 
M°°nblade;78525825 said:
The LoT vid clearly shows it drops to the mid or low 20s during combat.
And this is what digital foundry says as well:
The Wii U version is completely v-synced, incidentally, but the trade-off is that it holds at the 20fps line during direct combat, while the PS3 version is the most stable performer.

All 3 suffer from FPS drops but the Wii U version is the only one without screen tearing.

Add the gamepad features as well and the Wii U version is the one for me.
 

wsippel

Banned
Random thought time: What if the GPU is not based on R700, but on an early GCN prototype based on R700? In which case, I'd guess we might be looking at 256 shader units. Could be a possible explanation for the large shader blocks I guess.
 

krizzx

Junior Member
Interestingly, the Wii U version puts its best foot first by producing a full 1280x720 internal framebuffer, backed up by 2x multi-sample anti-aliasing (MSAA).

The Xbox 360 only marginally misses the mark by peeling back its horizontal resolution by 80 pixels, providing us with a 1200x720 internal resolution backed up by a similar 2x MSAA.

To our pixel count, the PS3 weighs in at 1152x648 with a similar 2x pass of MSAA approach as the other platforms.

Weren't there people insistent on the Wii U version having the worst graphics earlier in the thread?

Also, in the Eye of Truth analysis videos, the Wii U version generally got a better frame rate in combat than the PS3/360 versions. Only the last combat scene saw a really huge impact on frame rate. The first fight after the cutscene maintained a solid 30 FPS up until a small part where Sam left cover, which is odd. Then it goes back to 30 until the last fight of the video. The low point was 22fps for a second or two in the last fight, but it generally remains stable. It never dropped to 20fps in combat once.

http://www.youtube.com/watch?feature=player_embedded&v=Fg33bG2LgkY

Seems that ducking behind cover kicks the Wii U version's frame rate in the balls, which is odd.


http://www.youtube.com/watch?v=L5L5UjU5x5Y
Strangely, the 360 version has framerate problems in combat at the beginning, but out in the open after the cutscene it has little tearing and next to no frame rate drops, while the PS3 has both constantly. Neither the Wii U nor the 360 version had frame drops in that first fight.

Also, one thing that is conveniently being factored out is that the Wii U version is also outputting to the GamePad's screen. Also, where did Eurogamer get 20fps from? It never hit that outside of a cutscene.

Random thought time: What if the GPU is not based on R700, but on an early GCN prototype based on R700? In which case, I'd guess we might be looking at 256 shader units. Could be a possible explanation for the large shader blocks I guess.

This is along the lines of what I have been suggesting for months now. We know that the GPU in the console now isn't the one that was there prior to launch. That one was an actual stock R700.
 

z0m3le

Banned
Random thought time: What if the GPU is not based on R700, but on an early GCN prototype based on R700? In which case, I'd guess we might be looking at 256 shader units. Could be a possible explanation for the large shader blocks I guess.

It could also be VLIW4 and have 32 ALUs per SPU for 256 shaders. Considering size and power consumption it does fit best, but the only hesitation I have about this is Fourth Storm's discovery about the registers not matching 32 ALUs and relating more directly to 20 ALUs.

Then again, didn't GCN require less register memory because of its layout? I might be remembering it wrong, but I seem to recall something along those lines, which might explain the difference we see here.
 
Why do these guys bitch about the missing option to install the game to flash storage, but don't test the eShop version? Would have been nice to know if loading times and overall performance are better.

People in the other thread are saying that both loading times and performance don't change significantly with the DD version.
 
Where did you read that?

Sometimes I wonder why people keep ignoring all the data in plain sight:
http://www.amd.com/us/Documents/47285D_E4690_Discreet_GPU.pdf

55nm, 320 ALUs (VLIW5), 512MB GDDR3:
  • engine @ 150MHz, mem @ 250MHz = 8W
  • engine @ 300MHz, mem @ 400MHz = 12W
  • engine @ 450MHz, mem @ 600MHz = 17W
  • engine @ 600MHz, mem @ 700MHz = 25W
I said "I think" for a reason. >_<.

So, what would we realistically be looking at? Around 20? Those numbers include RAM, but doesn't the Latte have 3 mem pools?
Not only that, but the Wii U is at 40nm.

EDIT: Already addressed above. So it looks like 15-16W.
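For what it's worth, here's one way to get to that 15-16W ballpark from the E4690 figures quoted above. The linear scaling with engine clock and the ~30% saving for the 55nm→40nm shrink are my own assumptions, not anything from AMD:

```python
# Interpolate the quoted E4690 points (450MHz/17W, 600MHz/25W) to 550MHz,
# then knock off an assumed ~30% for the 55nm -> 40nm process shrink.
def e4690_watts(clock_mhz):
    c0, w0, c1, w1 = 450, 17.0, 600, 25.0
    return w0 + (w1 - w0) * (clock_mhz - c0) / (c1 - c0)

w_55nm = e4690_watts(550)        # ~22.3 W for 320 ALUs at 550 MHz on 55 nm
w_40nm = w_55nm * 0.7            # ~15.6 W with the assumed shrink saving
print(round(w_55nm, 1), round(w_40nm, 1))
```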

I'd imagine it's because it doesn't fit their agenda.

You want to keep slinging shit?

Even if it WAS lower than the 360, it's still considerably more powerful.

I don't understand why the game doesn't feature any AF on Wii U. If it were as powerful as claimed, surely even that would not be a problem? Is it another case of lazy devs? If so, Nintendo need to reclaim their dev kits and give them to a more deserving developer.

I don't understand either. It has the ram to compensate, but maybe pulling all the mipmaps from the disc takes too long? No clue. The game in general seems poorly coded on all 3 platforms. So I wouldn't be too worried.

People in the other thread are saying that both loading times and performance don't change significantly with the DD version.

Seems like shitty programming then.
 

USC-fan

Banned
It doesn't.

A good quality PSU would have higher efficiency than 80% at the middle of its rated output. So it's not 80% to begin with. I bet that in reality it's much closer to 90%, or 32.4W of draw for the console (let's round it down to 32W).

A 40nm R800 160ALU core @ 600MHz (+ 512MB of GDDR5 @ 800MHz) is 16W. So in the case of WiiU that leaves 16W+ for CPU, disc and gamepad radio. That'd be quite wasteful of whoever designed the thing.
33 watts max from the wall @ 90% PSU efficiency ≈ 30 watts

Disc drive: ~4 watts
CPU: ~8 watts

What's left: ~18 watts

My guesses; does anyone have hard numbers?
2GB DDR3 RAM: ~2 watts
Wi-Fi: ~0.5 watts
Flash storage: ~0.5 watts

That leaves about 15 watts for the whole GPU chip.

This is really the best case; 90%-efficient PSUs are very expensive.
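Putting those guesses in one place so they're easy to re-run with different figures (every number below is a guess from this post, not a measurement):

```python
# Best-case budget: wall draw * PSU efficiency, minus everything that isn't the GPU.
wall_watts = 33.0
psu_eff = 0.90                      # assumed best-case PSU efficiency
board_watts = wall_watts * psu_eff  # ~29.7 W delivered to the console

others = {
    "disc drive": 4.0,
    "CPU": 8.0,
    "2GB DDR3": 2.0,
    "Wi-Fi": 0.5,
    "flash storage": 0.5,
}
gpu_watts = board_watts - sum(others.values())
print(round(gpu_watts, 1))          # ~14.7 W left for the whole GPU chip
```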
 

z0m3le

Banned
What are the differences between the r700 and r770?

R700 is the series; the architecture should be similar across all chips in the series. It's also the codename for the HD 4870 X2, but of course we were never talking about that chip being in Latte. RV770 is the HD 4850/4870; I point it out because, of all the chips, this is the one the analysts really did research on. Also, if I'm reading all of this right, it had 16KB of register cache per SPU.
 
Do we have any comparison out there between Xbox 360, PS3, Wii U, PS4 and Xbox One disc reading speeds?

If disc reading speed isn't the problem for Splinter Cell: Blacklist (as several people reported similar loading times with the digital version), then is there another technical bottleneck (RAM problem?), or is it just Ubisoft being total incompetents?
 

z0m3le

Banned
r700 is the product line, rv770 is a specific chip - the one used in 48xx cards. Latte is definitely no rv770.
Right, what I'm saying is the architecture should be the same across all R700 chips, so looking at how RV770 is laid out should give us a good idea of where Latte came from.
 

The_Lump

Banned
This is really best case, 90% psu are very expensive.


90% is the best case (as in when the PSU is in its optimum efficiency range), not the 33W-from-the-wall part - as demonstrated by Schnoz, who was getting 47W. That'd most likely be about 40W heading to the system, which changes up that GPU number somewhat. ~20-25W for the GPU is entirely plausible in a best-case scenario based on those figures.

Even at 20W this is not necessarily anything to worry about in terms of what the Wii U is aiming for. As I pointed out earlier, the entire e6760 MCM (which is a different beast to Latte power-wise: 480sp, 24 TMUs, 600MHz etc.) has a TDP of 35W, and that INCLUDES GDDR5 @ 800MHz. We're talking upwards of 20W solely for the GPU chip in Latte's case here.


Edit: and I'm not pointing to 20-25w as a ray of light for wiiU fans, I'm simply saying it's also not a rallying point for those looking for the negatives.
 

Van Owen

Banned
Do we have any comparison out there between Xbox 360, PS3, Wii U, PS4 and Xbox One disc reading speeds?

If disc reading speed isn't the problem for Splinter Cell: Blacklist (as several people reported similar loading times with the digital version), then is there another technical bottleneck (RAM problem?), or is it just Ubisoft being total incompetents?

LAZY DEVS
 

QaaQer

Member
After doing a wee bit of research/forum-diving, it turns out that getting decent measurements of efficiency requires spendy gear. Also, at-the-wall measurements are a very crude method of estimating system power.
 
Do we have any comparison out there between Xbox 360, PS3, Wii U, PS4 and Xbox One disc reading speeds?

If disc reading speed isn't the problem for Splinter Cell: Blacklist (as several people reported similar loading times with the digital version), then is there another technical bottleneck (RAM problem?), or is it just Ubisoft being total incompetents?

PS3: 9 MB/s
360: 15.85 MB/s
Wii U: 22.5 MB/s
PS4: 27 MB/s
X1: TBA
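For a rough sense of scale, here's how long a hypothetical 500MB of level data (a made-up figure, purely for illustration) would take to stream at those rated speeds:

```python
# Time to stream a given amount of data at each drive's rated sequential speed.
speeds_mb_s = {"PS3": 9.0, "Xbox 360": 15.85, "Wii U": 22.5, "PS4": 27.0}
asset_mb = 500   # illustrative asset size, not from any real game
for console, speed in speeds_mb_s.items():
    print(f"{console}: {asset_mb / speed:.1f} s")
# PS3 ~55.6 s, 360 ~31.5 s, Wii U ~22.2 s, PS4 ~18.5 s
```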

After doing a wee bit of research/forum-diving, it turns out that getting decent measurements of efficiency requires spendy gear. Also, at-the-wall measurements are a very crude method of estimating system power.

...how much?
 

joesiv

Member
So the earlier statements that some made were right. Certain games will pull more power from the GPU than others, which means that it has not peaked its power. We can get near 45W with no auxiliary attachments.
How do we come to this conclusion? From what I gather, there is a 4 watt difference between the home menu and playing Splinter Cell; couldn't that 4 watt difference just be the spinning drive?

I'm actually shocked that the difference is so small, given the home menu shouldn't be very taxing on the CPU/GPU, though it's also likely running through the web API interpreter, which could be eating cycles.

I think we'd have to test some other games with the same calibrated meter to come to any conclusion about some games pulling more from the GPU.
 

The_Lump

Banned
Basically the problem is WiiU doesn't universally have enough HDD space for an install, so developers can't rely on that when developing their games. The lowest common denominator (Johnny 8GB Basic Bundle) must be able to play the game.

Would be nice to have an optional install though, which I'm sure would solve the loading problems.

The texture problems seem to be six of one, half a dozen of the other. It's exceeding the 360's high texture pack most of the time but randomly bombing on some of the scenery textures at other times. Odd. Perhaps that too could be fixed with an install?

All in all, I'm actually pretty impressed with this port. Frame rates are stable (outside of a few cutscenes), lighting and textures generally good and no tearing whatsoever. From what we know is in the console, that's not too shabby. If people were expecting more then they were ignoring a lot of facts.
 

AlStrong

Member
I don't understand either. It has the ram to compensate, but maybe pulling all the mipmaps from the disc takes too long?

AF = more texture sampling -> texture accesses & texture filtering ops = external RAM bandwidth since textures don't fit into the on-die texture caches (clearly :p).

I'd be hesitant about using part of the 32MB eDRAM as a stream/texture cache considering how much still needs to be swapped in and out for a game such as this.
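As a very rough illustration of why AF leans on bandwidth, here are worst-case fetch counts at 720p. These are upper bounds only: in practice the texture cache absorbs the large majority of these reads, and compressed texture formats shrink the per-texel cost further, so treat the numbers as illustrative, not measured.

```python
# Worst-case texel fetches per screen pixel at 720p, assuming an uncompressed
# 32-bit texture and trilinear taps (4 texels bilinear, 8 trilinear, up to
# 16 trilinear probes for 16x AF). Cache hits are ignored entirely.
pixels = 1280 * 720
bytes_per_texel = 4
modes = {"bilinear": 4, "trilinear": 8, "16x AF (worst case)": 16 * 8}
for mode, fetches in modes.items():
    mb = pixels * fetches * bytes_per_texel / 2**20
    print(f"{mode}: ~{mb:.0f} MB of texel reads per frame before caching")
# bilinear ~14 MB, trilinear ~28 MB, 16x AF worst case ~450 MB
```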

edit:

If disc reading speed isn't the problem for Splinter Cell: Blacklist (as several people reported similar loading times with the digital version), then is there another technical bottleneck (RAM problem?), or is it just Ubisoft being total incompetents?

FWIW, during loading, there can be a fair bit of time spent decompressing assets, and that can be a heavy CPU process (dedicated threads on PS360 even).
 