
WiiU technical discussion (serious discussions welcome)

That still leaves 114mm2. You also probably have a tiny DSP on there and a very small ARM CPU, but those are unlikely to take up more than a few mm2.
If you subtract a few mm2 from 114, it still gets close to my number.
Looking at some Anandtech VLIW5 diagrams, the shader core counts must be a multiple of 80
dppa201krf.png

So the closest other option would be 400 (440 GFLOPS @550 MHz)

I guess we will know soon anyway.
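For reference, here's a quick back-of-the-envelope sketch of where those GFLOPS numbers come from, assuming the usual VLIW5 peak-throughput formula (2 FLOPs per shader ALU per clock); the 320 and 480 configs are just other VLIW5-legal counts thrown in for comparison, not anything confirmed:

Code:
# Peak single-precision throughput for an AMD VLIW5 part,
# counting 2 FLOPs (one multiply-add) per shader ALU per clock.
def peak_gflops(shader_alus, clock_mhz):
    return shader_alus * 2 * clock_mhz / 1000.0

print(peak_gflops(320, 550))  # 352.0 GFLOPS for a 320-SP config
print(peak_gflops(400, 550))  # 440.0 GFLOPS, the figure above
print(peak_gflops(480, 550))  # 528.0 GFLOPS for a 480-SP config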
Here is the RV770 (HD 4870). You can see the 10 SIMD core rows in the middle right, with the 10x4 TMUs (4 per SIMD core) to the middle left.
(think of the above diagram rotated 90 degrees)
die-shot5pj23.jpg
 

ozfunghi

Member
4.3 W @ 1.35V @ 46 nm according to this Samsung PDF I found via Google
could be lower with newer tech

Power consumption scales linearly with clock, and with the square of voltage if they manage to reduce it. So this GPU should be within the WiiU power envelope, I think.

That would bring it down to about 28W @ 550MHz minus the GDDR5? What else could be cut that Nintendo might not need? Is 28W realistic when your console - so far - has only used about 30W (33W from the wall)? That would mean current games have only stressed the GPU about halfway... even though there doesn't seem to be much difference in consumption between games. So that doesn't seem likely.
 

japtor

Member
I think they appreciate our craziness, though. It's good marketing.
I'd most definitely be interested in anything they have to say about it, yes.

And they seem to be cool dudes, I mean it's a business, but they're apparently going the extra mile for us. Gotta appreciate that.

I'm very eager for the results.
Yeah, I have to imagine a group of people on a forum just doing this for fun isn't exactly their usual customer; I'm interested in seeing what they come up with.
You guys seemed rather excited. Are there any indications we might get a few nice surprises concerning WiiU GPU performance or features, or is it because Chipworks is giving you/us a "better deal" of sorts, which will not relate to performance of the console?
The latter for me...although it might relate depending on what they're doing. Ultimately they know a lot more about chip layouts and stuff, so if they can help out in sorting out some details it could help in figuring out performance metrics. Plus they have images of other chips that aren't being bought (yet?) so anything on those could help too.
 
That would bring it down to about 28W @ 550MHz minus the GDDR5? What else could be cut that Nintendo might not need? Is 28W realistic when your console - so far - has only used about 30W (33W from the wall)? That would mean current games have only stressed the GPU about halfway... even though there doesn't seem to be much difference in consumption between games. So that doesn't seem likely.
23W with a 10% core voltage reduction.
Also, the 35W in the PDF is the "thermal design power", which may not yet have been reached by the WiiU
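For anyone following the arithmetic, here's a minimal sketch of the scaling being assumed (linear in clock, quadratic in voltage), starting from the 35W @ 600MHz figure in the PDF and subtracting the ~4.3W Samsung GDDR5 figure quoted above; treat it as a rough estimate, not a datasheet number:

Code:
# Dynamic power scales roughly linearly with clock and with the square of voltage:
#   P_new = P_old * (f_new / f_old) * (V_new / V_old)**2
def scale_power(p_old, f_old_mhz, f_new_mhz, v_ratio=1.0):
    return p_old * (f_new_mhz / f_old_mhz) * v_ratio ** 2

gpu_tdp = 35.0    # W @ 600 MHz, board figure from the PDF
gddr5 = 4.3       # W, the Samsung GDDR5 figure quoted above (the WiiU wouldn't carry it)

print(scale_power(gpu_tdp - gddr5, 600, 550))       # ~28.1 W @ 550 MHz
print(scale_power(gpu_tdp - gddr5, 600, 550, 0.9))  # ~22.8 W with a 10% voltage cut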
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I'm not one either, but whenever I've built PCs and checked power supply calculators for what I should get, there was always over-provisioning for capacitor aging. If it didn't affect output, why would that be there? One example:

http://www.extreme.outervision.com/psucalculatorlite.jsp
Well, I just talked to an actual EE, and that calculator is extreme, to put it mildly.

Aging of electrolytic capacitors is mostly a function of how much load you subject them to, by the simple causality: load -> heat -> evaporation of electrolyte. A quality power supply rated at N watts will use reservoir capacitors (responsible for smoothing the voltage into 'proper' DC) of both high-quality electrolyte and of sufficient capacity that they will not degrade to unacceptable levels below N over the projected lifespan of the device (which can be several decades).

The advice on that site you quoted (that you should go for a larger power reserve over a longer projected lifespan) does help for no other reason than the fact that a higher-power PSU, regardless of its quality, would still use larger capacitors, so when used at lower loads those capacitors will function longer within the expected margins for the PSU. So basically if you have doubts about the quality of the PSU, going with a larger one will buy you some extra lifespan.

But at the same time the site sets the issue of aging PSUs onto the wrong premise. A badly-aged PSU does not produce less DC power per se - in 3 years it will not produce perfectly good DC at 30% less (arbitrary numbers) - no, it will produce power of lesser DC quality. Whether for the device using that power that translates into the dropping of some power lines and shutdowns, or the death of some components - that's entirely up to how that device was designed to withstand bad DC. And that is what causes devices to fail when used with aged (low-quality) PSUs. Bottom line being, an aged (low-quality) PSU is not 'just as good as new but for lower loads' - it's just bad!
 

tipoo

Banned
Hmm, interesting. But there's still the fact brought up earlier in the thread that Nintendo system power supplies have almost always been about double the rating of what the system actually drew - there must be a reason for that, even if it's not electrolyte aging. Efficiency maybe, I think PSUs are more efficient at 50-80% of their max load.

My point being that this talk of a future overclock that makes the system draw near double the power to operate faster seems like a pretty huge stretch to me, since it's typical of Nintendo to have a power supply rated for so much more than the system draws. It's also typical of Microsoft and Sony, although perhaps to a lesser discrepancy.
 
There has to be some over-provisioning. Capacitors age, similar to batteries, and in two years that 75w power supply may only be able to hit, say, 70 watts (not an actual calculation, just for example). So anything that tries to draw the original 75 watts from it is going to crash. If they want a console to have a healthy lifespan, as Nintendo always does, they have to over-provision quite a bit.

I've always been under the impression that PSU capacitor aging leads to lowered efficiency, and not severe drops in power output.

*edit*

Should have read the rest of the thread before replying.
 

Schnozberry

Member
Hmm, interesting. But there's still the fact brought up earlier in the thread that Nintendo system power supplies have almost always been about double the rating of what the system actually drew - there must be a reason for that, even if it's not electrolyte aging. Efficiency maybe, I think PSUs are more efficient at 50-80% of their max load.

My point being that this talk of a future overclock that makes the system draw near double the power to operate faster seems like a pretty huge stretch to me, since it's typical of Nintendo to have a power supply rated for so much more than the system draws. It's also typical of Microsoft and Sony, although perhaps to a lesser discrepancy.

I don't think anybody was banking on future overclocks. Mario was the game said to draw 33w, so I thought it might have been possible that it wasn't pushing the hardware to full load, because Iwata had said in a ND that the system would normally draw 45w in operation. What we don't know, is if the 45w he spoke of includes headroom for attached devices via USB and using all the devices various radio antennas. It seems logical that it would, so it's probably a moot point.
 

tipoo

Banned
I don't think anybody was banking on future overclocks. Mario was the game said to draw 33w, so I thought it might have been possible that it wasn't pushing the hardware to full load, because Iwata had said in a ND that the system would normally draw 45w in operation. What we don't know, is if the 45w he spoke of includes headroom for attached devices via USB and using all the devices various radio antennas. It seems logical that it would, so it's probably a moot point.

It was 33w during Mass Effect 3 as well. Eurogamer wasn't able to make it budge an inch past 33w with any game. USB peripherals are probably a different story, but in terms of the internals being underutilized right now, I don't think that's the case. Like most consoles it seems to only have a few power modes to pick from: "off", "gaming", and "netflix", no matter how light or heavy the game is. They don't bother with SpeedStep- and PowerPlay-like technologies in consoles.

http://www.eurogamer.net/articles/digitalfoundry-wii-u-is-the-green-console
 

Donnie

Member
Of course they don't change clock speeds. But there are still Xbox 360/PS3 games that use around 10% more power than other games on the same system. So I expect something similar with WiiU.
 
I don't think anybody was banking on future overclocks. Mario was the game said to draw 33w, so I thought it might have been possible that it wasn't pushing the hardware to full load, because Iwata had said in a ND that the system would normally draw 45w in operation. What we don't know, is if the 45w he spoke of includes headroom for attached devices via USB and using all the devices various radio antennas. It seems logical that it would, so it's probably a moot point.

If Cheesemeister's translation is accurate...

The Wii U is rated at 75 watts of electrical consumption.
Please understand that this electrical consumption rating is measured at the maximum utilization of all functionality, not just of the Wii U console itself, but also the power provided to accessories connected via USB ports.
However, during normal gameplay, that electrical consumption rating won't be reached.
Depending on the game being played and the accessories connected, roughly 40 watts of electrical consumption could be considered realistic.

This makes it sound as though the 40w does not include USB, but it could have been misstated.
 

Kenka

Member
Now that the money has been gathered, how long should we expect to wait for the die pictures?
Thanks to everyone who contributed. I came too late to throw in my bucks :-(
 

Kenka

Member
See this post. Chances are only Fourth Storm will be getting the actual details/picture; he'll just pass along info to everyone else.

http://www.neogaf.com/forum/showpost.php?p=47094751&postcount=2475
Thanks! I am looking forward to the actual news blowout.

This has been posted up the wazoo in the WUST threads, but how much power would 1GB of GDDR5 consume?

http://www.amd.com/la/Documents/AMD-Radeon-E6760-Discrete-GPU-product-brief.pdf

That chip is on a 40nm process, has 480 SPUs, runs at 600MHz and consumes 35W.
This was long considered on GAF to be the chip on which the GPU is based. I hope we get something comparable to the E6760 in terms of power.
 

Schnozberry

Member
It was 33w during Mass Effect 3 as well. Eurogamer wasn't able to make it budge an inch past 33w with any game. USB peripherals are probably a different story, but in terms of the internals being underutilized right now, I don't think that's the case. Like most consoles it seems to only have a few power modes to pick from: "off", "gaming", and "netflix", no matter how light or heavy the game is. They don't bother with SpeedStep- and PowerPlay-like technologies in consoles.

http://www.eurogamer.net/articles/digitalfoundry-wii-u-is-the-green-console

Ok, I missed that for some reason. Thanks for pointing it out. How many watts are we assuming the CPU consumes? 5-10?
 
I think somebody was estimating under 2 per core at one point recently; forget who/where. So ~5.

If my math is right (and there might be a few errors): if Gekko ran at 486MHz and had a TDP of 4.9W at 180nm, then a 729MHz Broadway core @ 90nm draws 3.675 watts, and a 1.25GHz Espresso core (1.4x the clock speed of Broadway) at 45nm draws 2.5725 watts/core. It's actually closer to 8 watts for the CPU (7.7175, to be exact).
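For anyone who wants to check it, here's that chain written out under the same assumptions as above (power halves per full process shrink, scales linearly with clock, and the 1.4x clock factor is taken as stated, even though 1.25GHz / 729MHz is actually closer to 1.7x):

Code:
# Back-of-the-envelope CPU power, using the assumptions from the post above.
gekko_tdp = 4.9                            # W @ 486 MHz on 180 nm (assumed figure)

broadway = gekko_tdp * (729 / 486) / 2     # clock up, one full shrink -> 3.675 W
espresso_core = broadway * 1.4 / 2         # "1.4x" clock, another shrink -> 2.5725 W
espresso_total = espresso_core * 3         # three cores -> 7.7175 W
print(broadway, espresso_core, espresso_total)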
 

Chronos24

Member
If my math is right (and there might be a few errors): if Gekko ran at 486MHz and had a TDP of 4.9W at 180nm, then a 729MHz Broadway core @ 90nm draws 3.675 watts, and a 1.25GHz Espresso core (1.4x the clock speed of Broadway) at 45nm draws 2.5725 watts/core. It's actually closer to 8 watts for the CPU (7.7175, to be exact).

So that leaves 25 more watts for everything else. I wonder how much the GPU is using in that regard.
 
If Cheesemeister's translation is accurate...



This makes it sound as though the 40w does not include USB, but it could have been misstated.

I'm guessing Iwata threw in the load for one USB peripheral to reach that ballpark. It doesn't seem that games have gone past 33 watts standalone.

I'd still be surprised if there wasn't at least some minimal fluctuation. I remember when my Wii GPU was in the process of frying (as apparently the WiiConnect24 setting should have been labeled "Low and Slow"), it was only specific and seemingly graphically demanding games that would set it off. I could play Wii Sports and NSMBWii fine, but pop in SMG2 or Silent Hill: SM and the image would quickly start showing telltale signs of GPU damage. I would imagine that this would indicate that not all loads were necessarily equal. Perhaps somebody else has a better answer, though...
 

OryoN

Member
Hey guys, I suppose I should give the contributors somewhat of an update. No, I don't have the image yet. However, I have been in contact with some friendly people at Chipworks and I assure you that you will be satisfied with the information to come in a few more days. They are actually going to special lengths for us. I humbly ask for a bit more patience and to not bombard them with emails.

Interesting...didn't see that bold part before. Is it possible they can confirm the GPU's manufacturing process for us too? They seem to have all the tools for this kind of stuff.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I just noticed some wrong numbers posted earlier that slipped unnoticed.

True enough.

I did some more research and I found:

The Wii had a power supply of 52 watts, but at load only drew 18 watts in games.
The original Xbox 360 had a power supply of 203 watts and a load of 186 watts in games.
The Gamecube had a power supply of 48 watts, and a load of ~23 watts in games.

Given Nintendo's history, using < 50% of the rated power supply seems normal. Boo...
Gamecube's power supply is DC 12V, 3.5A = 42W.
Wii's power supply is DC 12V, 3.7A = 44.4W.
 

McHuj

Member
If my math is right (and there might be a few errors): if Gekko ran at 486MHz and had a TDP of 4.9W at 180nm, then a 729MHz Broadway core @ 90nm draws 3.675 watts, and a 1.25GHz Espresso core (1.4x the clock speed of Broadway) at 45nm draws 2.5725 watts/core. It's actually closer to 8 watts for the CPU (7.7175, to be exact).

Your math is off, but the actual power numbers may be close to your estimates.

180nm => 90nm is a 4X reduction in area; there's a 130nm node in between. Same for 90nm => 45nm; there's a 65nm node in between. Unfortunately, power hasn't scaled as well as density has. If it did, you could divide your numbers by another half.
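In other words, if power really did track density, each full shrink would be worth roughly a 4x cut rather than the 2x used above. A quick illustration of the gap, using the same assumed starting numbers (purely illustrative, not a claim about the real chips):

Code:
# Same chain as before, but cutting power 4x per full shrink (the idealized
# "power scales with area" case) instead of 2x.
gekko_tdp = 4.9
broadway_ideal = gekko_tdp * (729 / 486) / 4    # ~1.84 W instead of 3.675 W
espresso_ideal = broadway_ideal * 1.4 / 4 * 3   # ~1.93 W for all three cores
print(broadway_ideal, espresso_ideal)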
 

joesiv

Member
I just noticed some wrong numbers posted earlier that slipped unnoticed.


Gamecube's power supply is DC 12V, 3.5A = 42W.
Wii's power supply is DC 12V, 3.7A = 44.4W.

My apologies if they were incorrect. I went with images of the power supplies off Google and just took the watt value that was printed on the label. It's possible that the images I used were third party, or something...

*edit* That's interesting - isn't this the official NGC power supply?
nintendo_dol-002.jpg


What's interesting is it shows 48W for the AC side, and 12V @ 3.25A on the output side (39 watts). Conveniently that's a ratio of around 81%, so perhaps it's taking the conversion efficiency into account? Anyways, my numbers for the GC and Wii were on the AC side. Can someone check what DC rating (specifically amps) the WiiU's power supply label lists?
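A quick check of those label numbers (the AC side is the input draw, volts x amps on the DC side is the rated output, and the ratio between them is effectively the conversion efficiency):

Code:
# GameCube DOL-002 label, per the image above
ac_in  = 48.0            # W, AC input side
dc_out = 12.0 * 3.25     # V x A = 39.0 W, rated DC output
print(dc_out, dc_out / ac_in)   # 39.0 W out, ratio ~0.81 -> roughly 81%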
 

tipoo

Banned
Around 8 watts max sounds right for the CPU; I would bet dollars to donuts that the larger TDP budget was given to the GPU (pretty safe bet based on size alone). Anyone know how much the optical drive uses? I'm guessing the GPU would be left with under 20, all things accounted for. Maybe 15-ish. There's still NAND, controllers, RAM, etc. to account for.
 

z0m3le

Banned
Wouldn't they use less voltage for the smaller CPU? Or is that already taken into account? Also, while it is based on the Wii CPU, it isn't exactly 3 Wii CPUs shrunk to 45nm and stitched together - that is technically not how it was done. Was Gekko to Broadway the same efficiency gain that people are suggesting here? Or was more done to Broadway to make it more efficient?

Also, the same can be said about the GPU - it's hard to guess what the components are using. However, I believe the disc drive should be easy to pin down: take a measurement of a DD game and of that same game playing from the disc, and the difference would give you that figure (at least probably).

ARM, DSP... these should be very small in wattage use, and the RAM is the low-powered stuff, right? 4 chips - I'm sure it can be googled.
 

Lonely1

Unconfirmed Member
Wouldn't they use less voltage for the smaller CPU? Or is that already taken into account? Also, while it is based on the Wii CPU, it isn't exactly 3 Wii CPUs shrunk to 45nm and stitched together - that is technically not how it was done. Was Gekko to Broadway the same efficiency gain that people are suggesting here? Or was more done to Broadway to make it more efficient?

Also, the same can be said about the GPU - it's hard to guess what the components are using. However, I believe the disc drive should be easy to pin down: take a measurement of a DD game and of that same game playing from the disc, and the difference would give you that figure (at least probably).

ARM, DSP... these should be very small in wattage use, and the RAM is the low-powered stuff, right? 4 chips - I'm sure it can be googled.

Bigger caches at the least.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
My apologies if they were incorrect. I went with images of the power supplies off Google and just took the watt value that was printed on the label. It's possible that the images I used were third party, or something...

*edit* That's interesting - isn't this the official NGC power supply?
nintendo_dol-002.jpg


What's interesting is it shows 48W for the AC side, and 12V @ 3.25A on the output side (39 watts). Conveniently that's a ratio of around 81%, so perhaps it's taking the conversion efficiency into account? Anyways, my numbers for the GC and Wii were on the AC side. Can someone check what DC rating (specifically amps) the WiiU's power supply label lists?
Da heck? I had it wrong all this time?.. Serves me right for not actually checking the brick before posting.
 
What's interesting is it shows 48W for the AC side, and 12V @ 3.25A on the output side (39 watts). Conveniently that's a ratio of around 81%, so perhaps it's taking the conversion efficiency into account?
Yes, roughly 80% efficiency is about what you would expect from power supply conversion.
 
Around 8 watts max sounds right for the CPU, I would bet dollars to donuts that the larger TDP budget was given to the GPU (pretty safe bet based on size alone). Anyone know how much the optical drive uses? I'm guessing the GPU would be left with under 20, all things accounted for. Maybe 15 ish. There's still NAND, controllers, RAM, etc to account for.
The Blu-ray drive should be 12 watts maximum, not more, seeing as that's the max TDP listed for such parts on desktop PCs.

Probably closer to 4-5 watts (it's possible; my only doubt actually stems from the fact that the drive is rather clunky-sized).
 

tipoo

Banned
No clue if this is useful:
uGvd4aN.jpg

Is that from the Wii U drive? I'm not sure which of those numbers to go with; moving parts like optical drives and hard drives usually use the 12V rail, but that calculation leads to 22 watts, which seems too high. The next bump down gets us 5 watts, and the 3.35V one would be 1.6 watts, which then seems too low.

So I'm not sure which of the three numbers is right, or what it means that there are three of them. It wouldn't make sense for it to have three different power draws.


Watts = Amps x Volts btw
 

japtor

Member
Is that from the Wii U drive? I'm not sure which of those numbers to go with; moving parts like optical drives and hard drives usually use the 12V rail, but that calculation leads to 22 watts, which seems too high. The next bump down gets us 5 watts, and the 3.35V one would be 1.6 watts, which then seems too low.

So I'm not sure which of the three numbers is right, or what it means that there are three of them. It wouldn't make sense for it to have three different power draws.


Watts = Amps x Volts btw
Yeah, it's from iFixit's teardown; I wish they had opened it up, considering how non-standard it looks. I looked up Panasonic's other drives and could only find laptop ones.
 