
WiiU "Latte" GPU Die Photo - GPU Feature Set And Power Analysis


USC-fan

Banned
90% is the best case (as in, when the PSU is in its optimum efficiency range), not the 33w-from-the-wall part - as demonstrated by Shnoz, who was getting 47w. That'd most likely be about 40w heading to the system, which changes that GPU number somewhat. ~20-25w for the GPU is entirely plausible in a best-case scenario based on those figures.

Even at 20w, this is not necessarily anything to worry about in terms of what WiiU is aiming for. As I pointed out earlier, the entire e6760 MCM (which is a different beast to Latte power-wise: 480sp, 24 TMU, 600mhz etc.) has a TDP of 35w, and that INCLUDES GDDR5 @ 800mhz. We're talking upwards of 20w solely for the GPU chip for Latte here.


Edit: and I'm not pointing to 20-25w as a ray of light for WiiU fans; I'm simply saying it's also not a rallying point for those looking for the negatives.
What are you talking about? The reading where he has a bunch of USB stuff hooked up?

And the guy still hasn't posted a single bit of proof. Take everything with a grain of salt.
 
AF = more texture sampling -> more texture accesses & texture filtering ops -> more external RAM bandwidth, since textures don't fit into the on-die texture caches (clearly :p).

I'd be hesitant about using part of the 32MB eDRAM as a stream/texture cache considering how much still needs to be swapped in and out for a game such as this.

edit:



FWIW, during loading, there can be a fair bit of time spent decompressing assets, and that can be a heavy CPU process (dedicated threads on PS360 even).
The BW should be much higher on the Wii U though, because of the several pools of eSRAM.

hundreds to thousands. I wish I knew this guy: http://www.youtube.com/watch?v=96ChbxuvNys

Nvm! Lol!
 

The_Lump

Banned
What are you talking about? The reading where he has a bunch of USB stuff hooked up?

And the guy still hasn't posted a single bit of proof. Take everything with a grain of salt.

A bunch of USB stuff hooked up? You mean one USB drive hooked up. How much power do you think that takes? Even if it takes 5w, then that's still 42w from the wall (~37w to the console). 20w to the GPU is now implausible from that?

I'm taking everything with a grain of salt. You should take those component power draw figures you quoted with a grain of salt too, as there was no proof to back them up, of course. It's all estimation; I'm merely pointing out that the ~30w getting to the console is not the "best case scenario" you stated.
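
Edit: for anyone who wants to check the arithmetic, here it is in code form. A minimal sketch - the 5w USB allowance and the ~88% PSU efficiency are illustrative assumptions (90% being the best case discussed above), not measured values:

Code:
# Back-of-envelope wall-to-console power estimate.
wall_draw_w = 47.0      # Shnoz's reading at the outlet
usb_drive_w = 5.0       # generous allowance for the attached USB drive
psu_efficiency = 0.88   # assumed; 90% would be the best case

console_dc_w = (wall_draw_w - usb_drive_w) * psu_efficiency
print(f"~{console_dc_w:.0f}w reaching the console itself")  # ~37w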
 
Which pools? Are they part of the texture unit cache hierarchy?

Anyways, assuming those caches are in use, then the only thing left really is raw texture filtering rate.

Not sure; I'm not 100% sure how exactly that's done in this case. Is the texture filtering limited by the main ram BW? Or would it be the eSRAM?
 
FWIW, during loading, there can be a fair bit of time spent decompressing assets, and that can be a heavy CPU process (dedicated threads on PS360 even).

Does this decompression step usually make use of SIMD operations? Generally I'd assume that this is relatively straightforward code (without many conditional branches), making it a good fit for PS360.
 

krizzx

Junior Member
Do we have any comparison out there between Xbox 360, PS3, Wii U, PS4 and Xbox One disc reading speeds?

If disc reading speed isn't the problem for Splinter Cell: Blacklist (as several people reported similar loading times with the digital version), then is there another technical bottleneck (RAM problem?), or is it just Ubisoft being total incompetents?

It's more than likely just a poor port. Generally the only things better in Wii U ports are the texture resolution and load times. This is the first time I've seen it happen the other way around.

As has been stated even by Sony themselves, an eDRAM pool pushes real-world bandwidth performance far beyond what the raw system memory clocks suggest - even compared to the 360/PS3's RAM, and that's without accounting for their memory bottlenecks. The Wii U has no memory bottleneck when programmed properly. It's probably just a case of poor optimization.

Basically the problem is WiiU doesn't universally have enough HDD space for an install, so developers can't rely on that when developing their games. The lowest common denominator (Johnny 8GB Basic Bundle) must be able to play the game.

Would be nice to have an optional install though, which I'm sure would solve the loading problems.

The texture problems seem to be six of one, half a dozen of the other. It's exceeding the 360 high texture pack most of the time but randomly bombing on some of the scenery textures at other times. Odd. Perhaps that too could be fixed with an install?

All in all, I'm actually pretty impressed with this port. Frame rates are stable (outside of a few cutscenes), lighting and textures generally good and no tearing whatsoever. From what we know is in the console, that's not too shabby. If people were expecting more then they were ignoring a lot of facts.

Ever heard of the Xbox 360 Arcade?

Nintendo said clearly that the Wii U was built with the intent of you using external hardware for game storage.
 

joesiv

Member
Basically the problem is WiiU doesn't universally have enough HDD space for an install, so developers can't rely on that when developing their games. The lowest common denominator (Johnny 8GB Basic Bundle) must be able to play the game.
And the sad part is, even if they let users install to the USB HDD, USB2.0 is also slow, so the difference would be almost negligible.

It's possible that in the future they could dedicate some local flash storage for an optional install/cache option. But at the same time, we don't even know how much faster the internal flash is compared to disc/USB2.0. From some tests that were done comparing eShop to disc, I think the difference wasn't great...
 

prag16

Banned
And the sad part is, even if they let users install to the USB HDD, USB2.0 is also slow, so the difference would be almost negligible.

It's possible that in the future they could dedicate some local flash storage for an optional install/cache option. But at the same time, we don't even know how much faster the internal flash is compared to disc/USB2.0. From some tests that were done comparing eShop to disc, I think the difference wasn't great...

Again, plenty of other games that seem to be in the same ballpark in terms of pushing the hardware DO NOT have loading issues nearly this bad (AC3, Batman, ME3, BO2, etc). There's probably more to it.
 

z0m3le

Banned
So if I'm reading this right, the R700 series had 16KB of register cache in each SPU:

http://www.anandtech.com/show/2841/5
anandtech said:
Also improving the computational performance of the SIMDs is the doubling of the local data share attached to each SIMD, which is now 32KB.

How much register cache does Latte have in each SPU? I count 16 blocks of either 2KB or 4KB, right? So 32KB or 64KB, meaning you'd have plenty of register memory for more than 20 ALUs per block in R700-based GPUs, if that is what Latte is.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
So if I'm reading this right, the R700 series had 16KB of register cache in each SPU:

http://www.anandtech.com/show/2841/5


How much register cache does Latte have in each SPU? I count 16 blocks of either 2KB or 4KB, right? So 32KB or 64KB, meaning you'd have plenty of register memory for more than 20 ALUs per block in R700-based GPUs, if that is what Latte is.
That's the LDS (local data share - memory accessible to all threads in a SIMD). We don't know how much Latte has. Heck, we don't know what the memory hierarchy visible to Latte threads is, to boot.
 

z0m3le

Banned
That's the LDS (local data share - memory accessible to all threads in a SIMD). We don't know how much Latte has. Heck, we don't know what the memory hierarchy visible to Latte threads is, to boot.

http://www.anandtech.com/show/2556/5 I got the R700 register numbers from this page and thought that was corroborating it. "16kb Registers per SM/SIMD Core" for RV770. Unless that is somehow referring to something else as well?
 

AlStrong

Member
Does this decompression step usually make use of SIMD operations? Generally I'd assume that this is relatively straightforward code (without many conditional branches), making it a good fit for PS360.

Depends on the formats.

mm.. brainfart (yay, being in a hurry!). Was thinking general I/O & multithreading, and on further thought... shouldn't be that big of a problem after all (derp), although the other two consoles do have spare threads. *shrug*

I'm not sure how threaded/CPU-heavy SC:B is otherwise.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
http://www.anandtech.com/show/2556/5 I got the R700 register numbers from this page and thought that was corroborating it. "16kb Registers per SM/SIMD Core" for RV770. Unless that is somehow referring to something else as well?
If you're referring to the bit from the anand quote you originally included in your post - that is the LDS (32KB on R800, 16KB on R700). That's pretty much the fastest bit of directly addressable memory in the entire system, save for the register files.
 

z0m3le

Banned
If you're referring to the bit from the anand quote you originally included in your post - that is the LDS (32KB on R800, 16KB on R700). That's pretty much the fastest bit of directly addressable memory in the entire system, save for the register files.

No, I was trying to find the size of the registers on each SPU of the R800; it seems the R700 is 16KB. The idea that Radeon cards always used 64KB for the registers inside the SPU is one reason Latte is probably 160 ALUs, since it wouldn't have the register space to be more than that (64KB is what's found in GCN's register files). The registers being 16KB on R700 would show that there is no such limit on their ALU count per SPU (as it has changed over the years).
 
Basically the problem is WiiU doesn't universally have enough HDD space for an install, so developers can't rely on that when developing their games. The lowest common denominator (Johnny 8GB Basic Bundle) must be able to play the game.
Neither does the X360, with the Arcade 4 GB models lasting until today (and earlier they'd ship with as little as a 256 MB memory card).
And the sad part is, even if they let users install to the USB HDD, USB2.0 is also slow, so the difference would be almost negligible.
Difference from 22 MB/s to 30 MB/s is not huge by any means, no; but remember how PS3 loading times were a world of difference from the X360?

Well, you were going from 9 MB/s to 11.2 MB/s; 22 MB/s to 30 MB/s has the capacity to make a difference.

If you are transferring 512 MB of data, it means going from 23 seconds to 17 seconds - 6 seconds less.

Not just that, though; latency is an issue with optical drives, so you have to deal with considerable seek times, which further increase the wait for data that isn't contiguous. On an external HDD that tends to be lessened, and it's even completely gone if you use an SSD.


Amazing USB 2.0 is not, but it has the capacity to bring palpable improvements to the table.
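
In code form, since the working is easy to check (512 MB is just the example payload from above; the rates are the rough disc and USB 2.0 figures quoted):

Code:
# Loading-time difference for a fixed payload at disc vs USB 2.0 rates.
payload_mb = 512.0

for label, rate_mb_s in [("disc (~22 MB/s)", 22.0),
                         ("USB 2.0 HDD (~30 MB/s)", 30.0)]:
    print(f"{label}: {payload_mb / rate_mb_s:.0f} s")
# ~23 s vs ~17 s: roughly 6 seconds saved, before counting seek latency.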
 

AlStrong

Member
Is the texture filtering limited by the main ram BW? Or would it be the eSRAM?

It's a combination of the # of texture units, texture cache size, and external bandwidth.

Higher filtering modes mean going through the bilinear filtering units multiple times (more cycles), so you're fetching more, spending more time. More units, more filtering ops performed. The size of the HW texture cache will determine how many more times you need to fetch from main memory (external bandwidth usage) - larger caches mitigate external bandwidth requirements to an extent since you're fetching more data per access.

I'm not clear on what extra cache you're referring to (eSRAM?) or how that fits into the texture unit cache hierarchy. *shrug*

At the end of the day, you still need texture filtering throughput if you have enough bandwidth.
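
To illustrate the relationship, a toy model - to be clear, the TMU count, clock, texel size and hit rate below are placeholder assumptions, not Latte figures:

Code:
# Rough texture-cost model: higher AF means more bilinear taps per filtered
# sample (more passes through the filtering units, more fetches), while a
# higher texture-cache hit rate trims what must come from external RAM.
def texture_cost(pixels, taps_per_sample, tmus, clock_ghz,
                 bytes_per_texel=4.0, cache_hit_rate=0.9):
    fetches = pixels * taps_per_sample   # AF multiplies bilinear passes
    cycles = fetches / tmus              # one bilinear tap per TMU per clock
    time_ms = cycles / (clock_ghz * 1e9) * 1e3
    external_mb = fetches * bytes_per_texel * (1.0 - cache_hit_rate) / 1e6
    return time_ms, external_mb

# One 720p frame's worth of samples: trilinear (2 taps) vs 8x AF (~16 taps).
for taps in (2, 16):
    t, mb = texture_cost(1280 * 720, taps, tmus=8, clock_ghz=0.55)
    print(f"{taps:>2} taps: {t:.2f} ms of TMU time, ~{mb:.1f} MB from external RAM")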
 
No, I was trying to find the size of the registers on each SPU of the R800; it seems the R700 is 16KB. The idea that Radeon cards always used 64KB for the registers inside the SPU is one reason Latte is probably 160 ALUs, since it wouldn't have the register space to be more than that (64KB is what's found in GCN's register files). The registers being 16KB on R700 would show that there is no such limit on their ALU count per SPU (as it has changed over the years).

Here you go. Page 10 should help you out.

http://gpgpu.org/wp/wp-content/uploads/2009/09/E1-OpenCL-Architecture.pdf

2560 kB of total register space/10 SIMDs/4 blocks per SIMD = 64 kB per block (of 20 ALUs)
 
Basically the problem is WiiU doesn't universally have enough HDD space for an install, so developers can't rely on that when developing their games. The lowest common denominator (Johnny 8GB Basic Bundle) must be able to play the game.
It's really easy to forget this but the 360 also has models without HDDs. Only the PS3 guarantees that every model has a hard drive (until the $199 model is released everywhere) so that can't be an excuse.

I read a comment that the Wii U doesn't allow for disc installs but that doesn't make much sense since it allows (copious) patching. I don't understand what the distinction would be between a patch and installed assets from the disc. It may be that the developers didn't have time to figure out how to implement it in the Wii U version.
 
It's really easy to forget this but the 360 also has models without HDDs. Only the PS3 guarantees that every model has a hard drive (until the $199 model is released everywhere) so that can't be an excuse.
One thing is true though, and it's the fact that some third parties like EA and Activision might be rooting against the Wii U also due to that: we live in a world where selling DLC is their modus operandi, and Nintendo completely overlooked those needs by going with Microsoft's 2005 mindset. Not that they ever tried, of course, with the recurring vendetta of "DLC exclusively not for the Wii U".

Yeah, you can use external drives, but that's different from built-in, standard storage with a bit of capacity, which is what both the XBone and PS4 are offering. The entry model should be 32 GB, and even then it would be at a disadvantage in third parties' minds; Sony and Microsoft are still going with mechanical drives for a reason.
 

joesiv

Member
Neither does the X360, with the Arcade 4 GB models lasting until today (and earlier they'd ship with as little as a 256 MB memory card).

Difference from 22 MB/s to 30 MB/s is not huge by any means, no; but remember how PS3 loading times were a world of difference from the X360?

Well, you were going from 9 MB/s to 11.2 MB/s; 22 MB/s to 30 MB/s has the capacity to make a difference.

If you are transferring 512 MB of data, it means going from 23 seconds to 17 seconds - 6 seconds less.

Not just that, though; latency is an issue with optical drives, so you have to deal with considerable seek times, which further increase the wait for data that isn't contiguous. On an external HDD that tends to be lessened, and it's even completely gone if you use an SSD.


Amazing USB 2.0 is not, but it has the capacity to bring palpable improvements to the table.

True about latency and such, it's just too bad they didn't throw eSATA or USB3.0 in there... but I guess I'm beating a dead horse :)
 

krizzx

Junior Member
One thing is true though, and it's the fact that some third parties like EA and Activision might be rooting against the Wii U also due to that: we live in a world where selling DLC is their modus operandi, and Nintendo completely overlooked those needs by going with Microsoft's 2005 mindset. Not that they ever tried, of course, with the recurring vendetta of "DLC exclusively not for the Wii U".

Yeah, you can use external drives, but that's different from built-in, standard storage with a bit of capacity, which is what both the XBone and PS4 are offering. The entry model should be 32 GB, and even then it would be at a disadvantage in third parties' minds; Sony and Microsoft are still going with mechanical drives for a reason.

One thing of note is that all arguments about Nintendo not matching the PS4/Xbone completely ignore the fact that those consoles did not exist when the Wii U launched; there wasn't a single spec for them, nor a release date of any kind. For all people knew at that point, they could have been slated for 2015 using pure online storage. There was no way of knowing what the PS4 or next Xbox would be at that point, and Nintendo couldn't try to match something that didn't exist.

You can't compete with air.
 
One thing is true though, and it's the fact that some third parties like EA and Activision might be rooting against the Wii U also due to that: we live in a world where selling DLC is their modus operandi, and Nintendo completely overlooked those needs by going with Microsoft's 2005 mindset.

Yeah, you can use external drives, but that's different from built-in, standard storage with a bit of capacity, which is what both the XBone and PS4 are offering. The entry model should be 32 GB, and even then it would be at a disadvantage in third parties' minds; Sony and Microsoft are still going with mechanical drives for a reason.
Hopefully Nintendo will officially drop the Basic soon and offer a model with higher capacity in the future. I've actually been surprised at how well games stream over USB2 into memory on the Wii U, so perhaps Nintendo should consider bundling newer models with official external HDDs if they can't find room in the case to add a 2.5 inch HDD.

And I don't believe that any 3rd parties are actively rooting for the Wii U to fail. I think that EA specifically tried a lot of different things last gen to bring their core franchises to the Wii and failed on most counts and has decided to not invest that kind of effort again until they see where the dust settles with the system. A lot of people seem to forget that EA was essentially providing the Wii the kind of effort that Ubisoft is putting into Wii U so far.

I'd be really interested to see what some of EA's studios could get out of the Wii U if they pick up development again, based on what Criterion did with NFS:U.
 

krizzx

Junior Member
Hopefully Nintendo will officially drop the Basic soon and offer a model with higher capacity in the future. I've actually been surprised at how well games stream over USB2 into memory on the Wii U, so perhaps Nintendo should consider bundling newer models with official external HDDs if they can't find room in the case to add a 2.5 inch HDD.

And I don't believe that any 3rd parties are actively rooting for the Wii U to fail. I think that EA specifically tried a lot of different things last gen to bring their core franchises to the Wii and failed on most counts and has decided to not invest that kind of effort again until they see where the dust settles with the system. A lot of people seem to forget that EA was essentially providing the Wii the kind of effort that Ubisoft is putting into Wii U so far.

I'd be really interested to see what some of EA's studios could get out of the Wii U if they pick up development again, based on what Criterion did with NFS:U.

EA has intentionally sabotaged games on the consoles and has hammered the platform with insults and actions to make sure it is viewed poorly. EA was definitely rooting for them to fail.
 
One thing of note is that all arguments about Nintendo not matching the PS4/Xbone completely ignore the fact that those consoles did not exist when the Wii U launched; there wasn't a single spec for them, nor a release date of any kind. For all people knew at that point, they could have been slated for 2015 using pure online storage. There was no way of knowing what the PS4 or next Xbox would be at that point, and Nintendo couldn't try to match something that didn't exist.

You can't compete with air.
They were well underway, and developers sensed Nintendo was aiming a tad low. Remember, this happened when Epic was trying to convince Sony and Microsoft to go higher; they wanted their 3 teraflops as the minimum common denominator. So there were specs and decisions already, and Nintendo was most likely well aware of them. Sure, there are NDAs, but it would be easy to know via IBM that the others hadn't procured IBM parts for their next-gen tech; they wouldn't be doing their "clients" a disservice at this point; information leaks within companies a lot more than it leaks outside. Nintendo ought to have known something.

So "next gen" got less powerful than expected by most and it went further than expected on RAM; that left some developers quite pissed, but these two platforms represent the majority of the market (or so it is expected) so ignoring them is not viable (same as saying PS3 and X360 are not getting ignored as of now even though they're quite old tech now), ignoring the Wii U though, right now is the cool thing to do, as always with Nintendo platforms.

But I feel the problem lies exactly there: Nintendo wasn't competing with air, so they should have gone further on quite a few of the decisions they took regarding their machine.


This said, developers didn't have an excuse to ignore the Wii U as they're doing throughout this gap year, and that shows they're ill-intentioned, seeing as Nintendo's plan was all about getting a foot on the ground this year; but that doesn't rid Nintendo of blame. The storage was clearly too small for some developers to see the platform as preferential even against the PS3 and X360, because their business model is intertwined with DLC; that and a few other mistakes surely made it easier for third parties to do what they did.

But it's true that most just wanted an excuse.
 

z0m3le

Banned
Here you go. Page 10 should help you out.

http://gpgpu.org/wp/wp-content/uploads/2009/09/E1-OpenCL-Architecture.pdf

2560 kB of total register space/10 SIMDs/4 blocks per SIMD = 64 kB per block (of 20 ALUs)
Yeah, I think you are probably right about 160 ALUs; I didn't think they had used the same size registers in their SPUs throughout the last 4 gens. I guess it makes sense with the 64-thread addressing for 80 ALUs in VLIW5. Considering VLIW5's setup, 160 ALUs is even a bit misleading, since it is really 128 general ALUs and 32 scalars.
 

SmokyDave

Member
And this adds to the conversation how?
Ah, you're right. Rather than pointing out that some of the blame lies with the choices Nintendo have made for the hardware, I should've been on track like this guy:
EA has intentionally sabotaged games on the consoles and has hammered the platform with insults and actions to make sure it is viewed poorly. EA was definitely rooting for them to fail.

I notice you didn't call out the dude that replied to my post with some odd shit about Sony & MS either. Huh.
 

krizzx

Junior Member
They were well underway, and developers sensed Nintendo was aiming a tad low. Remember, this happened when Epic was trying to convince Sony and Microsoft to go higher; they wanted their 3 teraflops as the minimum common denominator. So there were specs and decisions already, and Nintendo was most likely well aware of them. Sure, there are NDAs, but it would be easy to know via IBM that the others hadn't procured IBM parts for their next-gen tech; they wouldn't be doing their "clients" a disservice at this point; information leaks within companies a lot more than it leaks outside. Nintendo ought to have known something.

So "next gen" got less powerful than expected by most and it went further than expected on RAM.

I feel the problem lies exactly there: Nintendo wasn't competing with air, so they should have gone further on quite a few of the decisions they took regarding their machine.


This said, developers didn't have an excuse to ignore the Wii U as they're doing throughout this gap year, and that shows they're ill-intentioned; but that doesn't rid Nintendo of blame. The storage was clearly too small for some developers to see the platform as preferential even against the PS3 and X360, because their business model is intertwined with DLC.

I don't know if I can agree with that. I've heard devs voice quite a few problems with the Wii U, but memory and storage were neither of them.

Also, off-topic: I just saw a link to an interesting thread a moment ago. It kind of parodies how people are slamming Nintendo and the Wii U right now.

http://www.neogaf.com/forum/showthread.php?t=47031

The resemblance is so surreal. Apparently the 360 was maxed at launch as well.
 
I don't know if I can agree with that. I've heard devs voice quite a few problems with the Wii U, but memory and storage were neither of them.
Might not be a problem for the developers themselves, but it is for the PR portion of the company, seeing as they sell that kind of content and Wii U users lack the storage as standard. Of course, they didn't even try, so it's not like the average Wii U consumer is finding the need to procure an HDD attachment for his numerous DLC; but it's a problem that was easy to foresee had things gone as they should.

It certainly helps in going along with the "jeez, this is hard, I can't be arsed to do it" attitude EA PR went along with regarding the Battlefield 4 team and the Frostbite engine.


The Wii U problem at this point can't be summed up in a few remarks; it's a little bit of everything. But I'm quite sure the 8 GB of storage in the barebones version is seen as laughable by most, and that's a problem. It's a perception problem, like most things attached to this brand at this point.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
No, I was trying to find the size of the registers on each SPU of the R800; it seems the R700 is 16KB. The idea that Radeon cards always used 64KB for the registers inside the SPU is one reason Latte is probably 160 ALUs, since it wouldn't have the register space to be more than that (64KB is what's found in GCN's register files). The registers being 16KB on R700 would show that there is no such limit on their ALU count per SPU (as it has changed over the years).
The size of the register file per SIMD would depend on the number of threads, or put in hw terminology, PEs (processing elements) per SIMD, which generally varies across models and generations. Now, since across the entire AMD VLIW family SIMD sizes are normally quoted in ALUs and not in PEs, for a VLIW5 design the formula for the width of the SIMD register file (aka number of PEs * 4) is 4/5 * num_ALUs. For example, for an 80-ALU R700 SIMD unit, you have 80 / 5 * 4 = a 64-wide register file. From there on it gets simple - it's 256 registers per thread, one float4 per register, so 64 * 256 * sizeof(float4) = 256KB worth of registers for the given SIMD.
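
Same thing in code, for convenience - this just restates the formula above, and the total agrees with the 2560 kB / 64 kB-per-block figures from the OpenCL slides posted earlier:

Code:
# Register-file size for one VLIW5 SIMD, per the formula above.
num_alus = 80                 # one R700 SIMD: 16 PEs x 5 ALUs
rf_width = num_alus // 5 * 4  # 4/5 * num_ALUs = 64 (i.e. number of PEs * 4)
regs_per_thread = 256         # 256 registers per thread
sizeof_float4 = 16            # bytes: four 32-bit floats per register

per_simd_kb = rf_width * regs_per_thread * sizeof_float4 / 1024
print(per_simd_kb)        # 256.0 KB per SIMD
print(per_simd_kb * 10)   # 2560 KB across RV770's 10 SIMDs,
                          # i.e. 64 KB per block of 20 ALUs (4 blocks per SIMD)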
 

lherre

Accurate
One thing of note is that all arguments about Nintendo not matching the PS4/Xbone completely ignore the fact that those consoles did not exist when the Wii U launched; there wasn't a single spec for them, nor a release date of any kind. For all people knew at that point, they could have been slated for 2015 using pure online storage. There was no way of knowing what the PS4 or next Xbox would be at that point, and Nintendo couldn't try to match something that didn't exist.

You can't compete with air.

You think Nintendo didn't know Sony's or MS's plans in 2010, when Epic and others knew all about them? (At least the very first specs, which were far superior to the Wii U's.)

EA has intentionally sabotaged games on the consoles and has hammered the platform with insults and actions to make sure it is viewed poorly. EA was definitely rooting for them to fail.

Could you back up these claims?
 
EA has intentionally sabotaged games on the consoles and has hammered the platform with insults and actions to make sure it is viewed poorly. EA was definitely rooting for them to fail.
I don't really see anything that supports that. I think we had an insider report that NFS:U was done in time for launch and then held back to make people double-dip, but that was directly contradicted by Criterion, who said in an interview that they didn't begin work on the Wii U version until they were done with current gen. NFS:U, while late, was an excellent port that was sent to die, but not intentionally.

A big problem that I've tried to highlight several times in other threads is that most 3rd party publishers don't have dedicated Nintendo teams. The few that did have them earlier this gen transitioned them into mobile development teams. That means that publishers either have to dedicate separate resources to create teams to learn how to develop for the Wii U or let their teams finish current gen games and then work on the Wii U later.

This situation has created a terrible mess for Wii U development and it's shown in a lot of the software that we've seen so far. It may get better as the gen goes on and the PS360 eventually gets dropped from support but that would probably be two years from now. In the meantime developers should get a better understanding of the Wii U's hardware which in turn should give us better software to use for these GPU discussions.

The latest videos for Sonic that I took a look at yesterday showed off some great development work, for example. I hadn't noticed that the stylized blades of grass in the main level they usually show off all move independently, like they're blowing in the wind. I'm really eager to see some direct feed captures of more of next year's game to see what else is being done.
 
Maybe not sabotage, but it was pretty fucking stupid to release the 3rd entry in a franchise that never released 1 or 2 on any Nintendo platform for the same price at which you were releasing a collection of 1-3 of the series everywhere else.
 

OryoN

Member
How do we come to this conclusion? From what I gather, there is a 4 watt difference between the home menu and playing Splinter Cell; couldn't the 4 watt difference just be the spinning drive?
...

I think we'd have to test some other games with the same calibrated meter to come to any conclusion about some games pulling more from the GPU.

I was also thinking the difference could be attributed to the spinning drive.

But IF we took that as an absolute fact, then we'd have to conclude:

1) The menu's GUI (and the tested game) already uses the GPU's max resources: DSP, tessellator, entire shader core, mystery blocks, everything. (Thus, no change in wattage, except for the drive.)

Or

2) The tested game uses as few GPU resources as the GUI. (Thus, no change in wattage, except for the drive.)

The first case seems very unlikely, and I'm not sure if that has ever been the case in any console before. The second case would suggest that there are a lot of underutilized - or even completely untouched - resources in the GPU, but it doesn't necessarily mean that we should expect a significant wattage difference in future games.

Anyway, I'm guessing that some of that 4W difference is from the GPU working a bit harder, but more tests with other games are required - as you suggested - to narrow down just how much of that is from the drive.
 

krizzx

Junior Member
I don't really see anything that supports that. I think we had an insider report that NFS:U was done in time for launch and then held back to make people double-dip, but that was directly contradicted by Criterion, who said in an interview that they didn't begin work on the Wii U version until they were done with current gen. NFS:U, while late, was an excellent port that was sent to die, but not intentionally.

A big problem that I've tried to highlight several times in other threads is that most 3rd party publishers don't have dedicated Nintendo teams. The few that did have them earlier this gen transitioned them into mobile development teams. That means that publishers either have to dedicate separate resources to create teams to learn how to develop for the Wii U or let their teams finish current gen games and then work on the Wii U later.

This situation has created a terrible mess for Wii U development and it's shown in a lot of the software that we've seen so far. It may get better as the gen goes on and the PS360 eventually gets dropped from support but that would probably be two years from now. In the meantime developers should get a better understanding of the Wii U's hardware which in turn should give us better software to use for these GPU discussions.

The latest videos for Sonic that I took a look at yesterday showed off some great development work, for example. I hadn't noticed that the stylized blades of grass in the main level they usually show off all move independently, like they're blowing in the wind. I'm really eager to see some direct feed captures of more of next year's game to see what else is being done.

You must not have been around during the Wii U's launch.

EA was all but announcing their hate for Nintendo when the Origin deal fell through. Just do any search for EA's comments about Nintendo and their game releases on the Wii U from late 2012 and early 2013.

They announced they were gimping the games they were releasing on the Wii U and cutting content before the console even came out. They gave the Wii U FIFA 12 rebranded as FIFA 13 and intentionally developed the Wii U version of Madden with the last-gen engine, as opposed to the one they were using for the 360/PS3 with better physics, on top of removing online support. They also made a comment about how they were just going to use the Wii U GamePad to demo games, then bring them to SmartGlass. Note that this was before launch, when sales were looking strong. Poor sales weren't the reason for those actions.

I still remember how they forced a $60 price tag on NFS Most Wanted U and then gave everyone who bought it on Origin a $30 discount on launch day, undercutting everyone. That was the last major infraction I can remember. Since then, they have done a little bit of backpedaling to try to douse the fire on the bridges they were burning, like how they went back and changed their stance on game releases and the Frostbite engine.

Just do a Google search.
 
Yeah, I think you are probably right about 160 ALUs; I didn't think they had used the same size registers in their SPUs throughout the last 4 gens. I guess it makes sense with the 64-thread addressing for 80 ALUs in VLIW5. Considering VLIW5's setup, 160 ALUs is even a bit misleading, since it is really 128 general ALUs and 32 scalars.

I believe VLIW5's 5th SPU isn't scalar, but can do everything the others do plus transcendental ops such as sin and cos. It's been a while since I looked at it, though...

Edit: yup, that first link you shared states 1 32-bit FP MAD per clock.
 
You must not have been around during the Wii U's launch.

EA was all but announcing their hate for Nintendo when the Origin deal fell through. Just do any search for EA's comments about Nintendo and their game releases on the Wii U from late 2012 and early 2013.

They announced they were gimping the games they were releasing on the Wii U and cutting content before the console even came out. They gave the Wii U FIFA 12 rebranded as FIFA 13 and intentionally developed the Wii U version of Madden with the last-gen engine, as opposed to the one they were using for the 360/PS3 with better physics, on top of removing online support. Note that this was before launch, when sales were looking strong. Poor sales weren't the reason for those actions.

I still remember how they forced a $60 price tag on NFS Most Wanted U and then gave everyone who bought it on Origin a $30 discount on launch day, undercutting everyone. That was the last major infraction I can remember. Since then, they have done a little bit of backpedaling to try to douse the fire on the bridges they were burning, like how they went back and changed their stance on game releases and the Frostbite engine.

Just do a Google search.
No, I was here, and if you search back through my post history you'll see that I did a fair share of digging into what did or didn't go down behind the scenes between EA and Nintendo. Something happened, and it's clear that feelings were hurt on both sides, but when you say a publisher is sabotaging their own games, that doesn't make any sense.

I do think that there was serious bad blood between the two camps a year or so ago but a lot of that has probably been resolved by now which is why you hear EA softening their stance. In regards to the launch games and their release schedules see my previous post.

It may not seem to make sense, but I would imagine that most publishers didn't want to start out on the Wii U with discount games, as that would set the precedent going forward to expect games at that price. Wii U development is probably not as cheap right now as PS360 development due to the man-hours necessary to figure out the system. It's obvious as a consumer to look and see the same game cheaper on the other systems, but as far as the publishers are concerned it's just another new SKU on a different platform. I think the best course of action would have been for Nintendo to take a hit on licensing and set game prices at $49.99 for the Wii U at launch. That might have alleviated a lot of these criticisms.

And I apologize guys for getting in the way of the real tech talk here.
 

roddur

Member
Ah, you're right. Rather than pointing out that some of the blame lies with the choices Nintendo have made for the hardware, I should've been on track like this guy:


I notice you didn't call out the dude that replied to my post with some odd shit about Sony & MS either. Huh.

you're still not adding anything to the discussion :)
nor am I
 
I was also thinking the difference could be attributed to the spinning drive.

But IF we took that as an absolute fact, then we'd have to conclude:

1) The menu's GUI (and the tested game) already uses the GPU's max resources: DSP, tessellator, entire shader core, mystery blocks, everything. (Thus, no change in wattage, except for the drive.)

Or

2) The tested game uses as few GPU resources as the GUI. (Thus, no change in wattage, except for the drive.)

The first case seems very unlikely, and I'm not sure if that has ever been the case in any console before. The second case would suggest that there are a lot of underutilized - or even completely untouched - resources in the GPU, but it doesn't necessarily mean that we should expect a significant wattage difference in future games.

Anyway, I'm guessing that some of that 4W difference is from the GPU working a bit harder, but more tests with other games are required - as you suggested - to narrow down just how much of that is from the drive.

DSP, tessellator, etc. are all trivial contributors to the wattage, and the drive is probably contributing about 1.6w. The draw of the console in the system menu is probably more a product of Nintendo's lack of optimization in that software (as is plainly obvious in the user experience) and possibly a lack of power gating on the GPU die, so that it's working nearly the same amount as when under full load. I'd expect things like power draw in the menu to go down more than draw during gameplay to go up.
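
The subtraction being argued, spelled out (the 4w delta is from the measurements discussed above; the 1.6w drive figure is my own estimate, so treat the result accordingly):

Code:
# Splitting the menu-vs-gameplay wattage delta.
menu_vs_game_delta_w = 4.0  # reported: system menu vs Splinter Cell
drive_w = 1.6               # my estimate for the spinning disc drive

gpu_cpu_delta_w = menu_vs_game_delta_w - drive_w
print(f"~{gpu_cpu_delta_w:.1f}w left for extra GPU/CPU load")  # ~2.4w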
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I believe VLIW5's 5th SPU isn't scalar, but can do everything the others do plus transcendental ops such as sin and cos. It's been a while since I looked at it, though...

Edit: yup, that first link you shared states 1 32-bit FP MAD per clock.
Technically, all ALUs of the VLIW family are scalar (not so in the Xenos or GCN families). In the VLIW5 case one of them is transcendental, though.
 
Technically, all ALUs of the VLIW family are scalar (not so in the Xenos or GCN families). In the VLIW5 case one of them is transcendental, though.

Thanks for the correction. I was wondering about that before. Xenos' Vec4 essentially has to be used as one unit (or is essentially one unit I should say)?
 

USC-fan

Banned
A bunch of USB stuff hooked up? You mean one USB drive hooked up. How much power do you think that takes? Even if it takes 5w, then that's still 42w from the wall (~37w to the console). 20w to the GPU is now implausible from that?

I'm taking everything with a grain of salt. You should take those component power draw figures you quoted with a grain of salt too, as there was no proof to back them up, of course. It's all estimation; I'm merely pointing out that the ~30w getting to the console is not the "best case scenario" you stated.
Why not take the measurement with nothing hooked up? Wow, the logic in this thread sometimes.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Thanks for the correction. I was wondering about that before. Xenos' Vec4 essentially has to be used as one unit?
Yes. Xenos is a weird beast in the sense that it's also essentially a VLIW design, but the ALU instruction word is very narrow - 2 ops, where one of the ops is actually a 4-way SIMD. Or you can think of it as a 2-way co-issue instruction word, comprising a vec4+scalar pair. The register file, though, is again made of float4 registers. The problem with the popular flops mis-estimation of Xenos comes from the fact that Xenos cannot co-issue a 4-arg vec4 op (i.e. a MAD dst, src0, src1, src2) and a scalar op. So it cannot do a MAD + scalar in a clock.
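
Edit: a quick sketch of where the popular figure comes from and what the issue limit implies. The 48 ALUs @ 500MHz are the commonly quoted public Xenos specs, not something established in this thread:

Code:
# Popular Xenos flops estimate vs what the issue limit actually allows.
alus = 48          # unified shader ALUs
clock_hz = 500e6   # 500 MHz
mad_flops = 2      # a MAD counts as two flops

# Naive estimate: all 5 lanes (vec4 + scalar) doing MADs every clock.
naive = alus * (4 + 1) * mad_flops * clock_hz
# But a 4-arg vec4 MAD cannot co-issue with a scalar op,
# so a MAD-heavy clock only gets the vec4 lanes.
vec4_only = alus * 4 * mad_flops * clock_hz

print(f"naive:     {naive / 1e9:.0f} GFLOPS")      # 240
print(f"vec4 MADs: {vec4_only / 1e9:.0f} GFLOPS")  # 192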
 

fred

Member
Maybe not sabotage, but it was pretty fucking stupid to release the 3rd entry in a franchise that never released 1 or 2 on any Nintendo platform for the same price at which you were releasing a collection of 1-3 of the series everywhere else.

This ^^^

Not to mention FIFA 13 being released with half the game (Ultimate Team) missing.

And then you've got to take into account various interviews where they've slagged the console off. EA certainly wildly changed their stance on the platform after the E3 2011 console reveal.

People are saying it's the Origin business that's done this but I'd say that Nintendo patenting the use of the GamePad as the view of the ground in a golf game has something to do with it too.

Perhaps 'sabotage' is too strong a word, but they haven't made a great deal of effort to help the Wii U be a success either.
 
A big problem that I've tried to highlight several times in other threads is that most 3rd party publishers don't have dedicated Nintendo teams. The few that did have them earlier this gen transitioned them into mobile development teams. That means that publishers either have to dedicate separate resources to create teams to learn how to develop for the Wii U or let their teams finish current gen games and then work on the Wii U later.

This situation has created a terrible mess for Wii U development and it's shown in a lot of the software that we've seen so far. It may get better as the gen goes on and the PS360 eventually gets dropped from support but that would probably be two years from now. In the meantime developers should get a better understanding of the Wii U's hardware which in turn should give us better software to use for these GPU discussions.

The latest videos for Sonic that I took a look at yesterday showed off some great development work, for example. I hadn't noticed that the stylized blades of grass in the main level they usually show off all move independently, like they're blowing in the wind. I'm really eager to see some direct feed captures of more of next year's game to see what else is being done.

I still have some faith in Treyarch with the CoD: Ghosts port, and maybe AC4.

But the way to go for Nintendo seems to be to gather the most exclusives and special collaborations it can, because 3rd parties are not going to dedicate enough time and resources; plus, per what Saint Gregory says above, developing for the Wii U seems like a huge mess for 3rd parties.

If I had not seen Bayo 2, Mario Kart 8, X and Sonic Lost World, and the current and past ports, I would have lost all hope whatsoever!!

And about EA, it is very clear they are in bed with MS now, and that they don't feel the Wii U is a good investment ATM (thus it does not bode well for them if the Wii U does well, as they have put their eggs in the next-gen twins' basket). Also, the odds that something went wrong over Origin are pretty high.
 

The_Lump

Banned
Neither does the X360, with the Arcade 4 GB models lasting until today (and earlier they'd ship with as little as a 256 MB memory card).

Difference from 22 MB/s to 30 MB/s is not huge by any means, no; but remember how PS3 loading times were a world of difference from the X360?

I suppose. But then why aren't we seeing installs on WiiU instead of 85 second loading times?
 