
Rumor: Wii U final specs

ikioi

Banned
Again, like I said, show me where you can get 75-80%+ efficiency from a passively cooled, wall-wart form factor that costs a typical BOM for a console PSU

There are plenty of laptops on the market, even cheap ones, that have similar PSUs to that of the Wii U, with efficiency far greater than 65%.

Yes, advances have been made, in PSUs that are actively cooled, larger than the Wii U itself, and cost $200-$300+

Can you provide an example of such a PSU? Link?

As far as I'm concerned that's just outright BS (sorry to offend, but it's 11pm here and I can't think of a better word)

You're trying to make the evidence fit the theory, not the theory fit the evidence. If your aim is just to "spec up" the Wii U to fit preconceived notions then fine, and facts really have no place here, but I would have thought making the theory fit the evidence, and speculating about reality, would be of more interest. If not, fair enough, I'm out.

Not trying to do that at all. All I did was propose another possible alternative.

Also, given that the form factor of the Wii U's PSU appears to be quite small, it would also support the theory that its PSU is of a higher efficiency than 65%. A 75w PSU at 65% efficiency would suggest that under max load, the PSU generates 26.25w of heat. I doubt very much a PSU as small as the Wii U's can dissipate that kind of heat via surface dissipation. Typically, enclosed PSUs that have to deal with that kind of TDP are quite large, as the extra surface area helps dissipate the heat. So what are your thoughts on how such a small PSU can dissipate that kind of TDP?
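
To put rough numbers on that (a quick Python sketch of the arithmetic; the 65% figure is the one under debate, and I've shown both readings of the 75w rating, since the label alone doesn't say whether it's wall draw or DC output):

# Sketch of the heat math above. 65% efficiency is the figure under debate;
# whether 75w means wall draw or DC output is an open question in this thread.
def psu_heat(rated_watts, efficiency, rating_is_wall_draw=True):
    """Return (watts delivered to the console, watts lost as heat)."""
    if rating_is_wall_draw:
        delivered = rated_watts * efficiency
        return delivered, rated_watts - delivered
    drawn = rated_watts / efficiency
    return rated_watts, drawn - rated_watts

print(psu_heat(75, 0.65))         # (48.75, 26.25) -> the 26.25w of heat above
print(psu_heat(75, 0.65, False))  # (75, ~40.4)    -> if 75w is the DC output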

We're not talking about transformers, which cost tens of thousands of pounds. I can find you a 1200w PC PSU with 90% efficiency (i.e. 80 PLUS Platinum certified) for about £120 if you like :)

I'm running a Silverstone Strider Gold in my PC atm; that's rated at 87-90% efficient.

And no doubt a transformer of that efficiency would cost tens of thousands, but that's a TRANSFORMER, NOT A POWER SUPPLY.

I work in the IT industry, and PSUs, UPSs, transformers, even backup generators are a part of my day-to-day life.
 
Who said that? Read his quote again. He says 75w was the measured maximum using all functionality. That's it. It can't be any more, and no one said that it can.


Well I was being somewhat hyperbolic, obviously.

BTW he didn't say 75w "was the measured maximum using all functionality". That's the absolute theoretical max of the PSU, as confirmed by a gaffer who read the PSU specs back in June. It literally MUST use less than this.
 
The WiiU draws more than 75W because the PSU has to guarantee the 75W for the whole system...

It would be impossible to even port to WiiU with only 40W for the whole system...
 

Donnie

Member
It's terrible how everything that comes out about Wii U gets spun.

Iwata "it uses 40 watts"

GAF within a couple days: "as you see by these ironclad calculations, Wii U can be drawing no less than 70 watts confirmed, possibly more. That is if we assume they haven't already re-engineered the power supply to go much higher, which they obviously probably have already"

You just spun Iwata's words in a post about how terrible it is that people are spinning his words
 

Donnie

Member
Well I was being somewhat hyperbolic, obviously.

BTW he didn't say 75w "was the measured maximum using all functionality". That's the absolute theoretical max of the PSU, as confirmed by a gaffer who read the PSU specs back in June. It literally MUST use less than this.

Hmm

"The Wii U is rated at 75w power consumption. Please note that this consumption rating is measured at maximum utilisation of all functionality."
 
Look, we have the official spec sheets. We've seen pictures of the PSU rating. The PSU is rated at 75 watts and we know electronics never, ever use 100% of the PSU rating. We've also been told EXPLICITLY BY IWATA that in game situations to expect power usage of 40-45 watts. Those are the fucking facts.

Yeah, and they have been working just as long on the WiiU GPU. So, if it's not possible to get either a lower TDP or a higher flop-per-watt than a cheapish embedded GPU, why not simply go for the cheap embedded GPU and save the R&D?

Don't confuse working "as long" with working as much. The majority of AMD's GPU engineers are quite assuredly engaged in either advancing core technologies or maintaining existing product lines. They didn't dedicate a couple hundred guys for three years just to see how far DX10.1 can be taken 'cause Nintendo asked. No, they spun off some people to adapt an existing architecture to a client's needs, not rebuild a GPU from the ground up.
 

THE:MILKMAN

Member
Wii U = 75W rated PSU and 40-45W typical in-game at-the-wall consumption. That is how I understand it.

The PS3 super slim will be similar: 190W rated PSU, ±70W in-game at-the-wall consumption.
 
The WiiU draws more than 75W because the PSU has to guarantee the 75W for the whole system...

It would be impossible to even port to WiiU with only 40W for the whole system...

Do you even understand what you're saying? You're saying the Wii U is going to draw more than 75 watts even though the PSU's rated max is 75 watts? That is some excellent deduction there LOL.

Honestly though, how do you think it works?? How would that even make sense?
Ubisoft: "Your console must draw 90 watts to receive our ports"
Nintendo: "Don't you mean we must have a certain computing, or graphical power to receive ports?"
Ubisoft: "No we literally mean power"
 

USC-fan

Banned
Yeah, "typical", like when you run a game. It's not like anyone is booting up furmark to try and set their GPU on fire.



The e6760 is two generations removed from the R700, from AMD's second generation DX11 architecture.



The 7690 is THREE generations removed from the R700, on a cutting-edge 28nm process, and is a high-margin, cherry-picked part for the mobile market. They are not going to throw away 9 out of every 10 WiiU GPUs they make just to hit a 27 GFlop per watt efficiency. That number is completely useless. But even if we use your most generous 17.5 GFlop per watt figure, when you multiply that by a realistic (and yet generous) power usage number like 25 watts, you're still nowhere near 600 GFlops. 600 is a pipe dream.
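
Put as plain arithmetic (a quick sketch using the thread's own numbers, nothing measured):

# Sanity check: flops-per-watt times watts, with the figures quoted above.
def gflops_budget(gflops_per_watt, gpu_watts):
    return gflops_per_watt * gpu_watts

print(gflops_budget(17.5, 25))  # 437.5 -> generous, and still short of 600
print(gflops_budget(27.0, 25))  # 675.0 -> needs the cherry-picked 7690 figure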

I mean, sure, AMD spent a few years working on this thing, but they had a lot of stuff to do that wasn't fundamentally reworking how the shaders execute code to drastically improve performance. Like laying out the new configuration, modifying and replacing interfaces, adapting to IBM's manufacturing techniques, solving issues with test production runs, working out how to improve yields, etc, etc.
Just have to say it's nice having someone on here who understands this stuff. Agree 100%.
 

The_Lump

Banned
Well I was being somewhat hyperbolic, obviously.

BTW he didn't say 75w "was the measured maximum using all functionality". That's the absolute theoretical max of the PSU, as confirmed by a gaffer who read the PSU specs back in June. It literally MUST use less than this.


Yes he did. Read cheesemeister's translation of the Nintendo Direct. That's it, word for word. The Wii U draws 75w of power, max. Not the PSU draws 75w of power max.

Edit: oz and others corrected me on this. Hadn't realised we had PSU info beforehand. Still, my point about 80% efficiency being possible stands! The below may now be incorrect, but the maths still stands!

And I'll point this out again (someone is welcome to correct me if I'm mistaken), but if the PSU has a maximum of 75w output, then that's the maximum it can put out. A power supply's efficiency rating is not how much of its max output it can actually use.

If it's outputting 75w (Iwata has confirmed that's its max draw by the console when everything is utilised, incl USB, WiFi etc) and it's coming from a 60% efficient PSU, then the PSU draws 125w from the power outlet. 50w is wasted, 75w is used.

You don't take the 60% off the maximum output.

Now, if Iwata had said the PSU draws a max of 75w from your power outlet, then we would need to think about it some more.
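
In code form (a minimal sketch, assuming 75w really is the DC output and 60% the efficiency):

# Efficiency relates wall draw to DC output: input = output / efficiency.
def wall_draw(dc_output_watts, efficiency):
    return dc_output_watts / efficiency

drawn = wall_draw(75, 0.60)
print(drawn, drawn - 75)  # 125.0w from the outlet, 50.0w wasted as heat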
 

ikioi

Banned
We're not talking about transformers, which cost tens of thousands of pounds. I can find you a 1200w PC PSU with 90% efficiency (i.e. 80 PLUS Platinum certified) for about £120 if you like :)

Wii U = 75W rated PSU and 40-45W typical in-game at-the-wall consumption. That is how I understand it.

The PS3 super slim will be similar: 190W rated PSU, ±70W in-game at-the-wall consumption.

I doubt very much the PS3 SS will have a 190w PSU, as the thing would need its own damn cooling system if it's going to be dissipating 120w of wasted heat.

So no, you're very wrong.
 

The_Lump

Banned
The WiiU draws more than 75W because the PSU has to guarantee the 75W for the whole system...

It would be impossible to even port to WiiU with only 40W for the whole system...


Not quite.

All we know is the console draws 75w at maximum load (using all connections etc). So the PSU is outputting that amount at minimum (and probably maximum too, as there's no need for it to be more).

So, if the PSU needs to output a max of 75w, then it's rated above that, as it's not 100% efficient. If it's 65% efficient, then it's drawing about 115w of power to produce that 75w.


And it's nothing to do with ports/flops etc. Let's not drag anything else into this!
 

Kenka

Member
If a PSU is 600w and is 60% efficient, it's not outputting only 360w (60% of 600). It's outputting 600w (because that's its advertised output). It's drawing 1000w of power from the mains, 40% is lost as heat and 60% is used (hence 60% efficiency).
Yes, I remember that the efficiency is given by [Output in DC/Input in AC].
Anyway, 40 W in power needed for a normal gaming session (?), 75 W for an online game with 2 Pads and 4 Wiimotes in action and 2 hard drives plugged in the butt.



I guess in both cases the GPU power consumption is similar, correct? Then we can stick to this 40 W figure for our calculations.
 

Donnie

Member
Yes, advances have been made, in PSUs that are actively cooled, larger than the Wii U itself, and cost $200-$300+

I think it's been a long time since you looked into consumer PSUs if that's what you think. A small, actively cooled 350w PSU with over 80% efficiency can be had for about $35.

Even a passively cooled 500w PSU (which is smaller than WiiU) with 92% efficiency can be had for well under the money you just mentioned (this one is $135):

http://www.newegg.com/Product/Product.aspx?Item=N82E16817104166&Tpk=AU-500FL
 

Donnie

Member
If Iwata says the average for gameplay is 40w, why do you think it's going to be more than that to support more flops?

Because the average is not the maximum and far from the theoretical maximum that people are using for the GPU's being mentioned.

As in, you can't say "WiiU's average draw for its internal hardware is 40w; this GPU with 500Gflops has a max draw of 30w, therefore it can't fit that kind of GPU inside".
 

USC-fan

Banned
Not quite.

All we know is the console draws 75w at maximum load (using all connections etc). So the PSU is outputting that amount at minimum (and probably maximum too, as there's no need for it to be more).

So, if the PSU needs to output a max of 75w, then it's rated above that, as it's not 100% efficient. If it's 65% efficient, then it's drawing about 115w of power to produce that 75w.


And it's nothing to do with ports/flops etc. Let's not drag anything else into this!
Doesn't work like that in real life.

Facts: the Wii U has a 75W PSU, and it uses 40-45W running games. This is how it looks in the real world. You do not use more than 65% of the PSU's rated output.

The crazy thing is we have known about the 75W PSU since June. Yet even then people had sky-high specs for the Wii U.
 

THE:MILKMAN

Member
I doubt very much the PS3 SS will have a 190w PSU, as the thing would need its own damn cooling system if it's going to be dissipating 120w of wasted heat.

So no, you're very wrong.

On Sony's spec page they list it as approx. 190W power consumption. http://uk.playstation.com/ps3system/#select-tab-specifications

I just assume, being non-techy, that they are really talking about the PSU rating? To me "power consumption" means at the wall, and in the case of the super slim that will be around 70W. Well short of 190W, hence why I think it is the PSU's rating.

Is this making any sense?
 

The_Lump

Banned
Doesn't work like that in real life.

Facts: the Wii U has a 75W PSU, and it uses 40-45W running games. This is how it looks in the real world. You do not use more than 65% of the PSU's rated output.

The crazy thing is we have known about the 75W PSU since June. Yet even then people had sky-high specs for the Wii U.


Do we know the psu is rated at 75w? Or is it that it can output 75w? Two different things, as I've explained.

You're also ignoring the rest of my post, which is based on what Iwata said in the Japan Nintendo direct last week. (WiiU consumes up to 75w, max)

Everything I said still stands. And you seem to think I'm trying to use this to defend the Wii U. I'm not, I'm just correcting some maths.
 
Do we know the psu is rated at 75w? Or is it that it can output 75w? Two different things, as I've explained.

You're also ignoring the rest of my post, which is based on what Iwata said in the Japan Nintendo direct last week. (WiiU consumes up to 75w, max)

Everything I said still stands. And you seem to think I'm trying to use this to defend the Wii U. I'm not, I'm just correcting some maths.

No, you're ignoring what Iwata actually said, what the spec sheet says and what we've had pictures of since E3. 75 watts is the PSU's rating. It can't draw more than that from the wall, and it can only deliver a fraction of that in actual usage: about 45 watts.
 

wsippel

Banned
If Iwata says the average for gameplay is 40w, why do you think it's going to be more than that to support more flops?
It's not an off-the-shelf part, so discussing the power consumption of PC parts won't give us more than a basic idea. AMD most likely ripped out several PC specific things to make it more efficient. Not to mention we have no idea what process it uses. We don't even know who manufactures the GPU.

There have been rumors lately that Nintendo has yield issues. They wouldn't have yield issues with a 40nm AMD GPU. Those were manufactured in gargantuan volume for years, and the top-end 40nm chips should have a lot more transistors. If that recent article is anything to go by, the GPU will be manufactured by TSMC and the system will use an SoC (which was hinted at in several LinkedIn profiles) with 3D stacking, increasing the efficiency further. It could even be a 28nm part, as that's TSMC's "default" process these days. All in all, the GPU could reach a much better performance-per-watt ratio than any known AMD GPU, mobile parts included.
 

The_Lump

Banned
Hmmm. I'm liking the e4690 as a comparable gpu. Starting to think the e6760 is dream land :(


Edit. *reads above post* Now entering dream land.

Thanks wsippel! Awesome infos as usual :)
 

Donnie

Member
Doesn't work like that in real life.

Facts: the Wii U has a 75W PSU, and it uses 40-45W running games. This is how it looks in the real world. You do not use more than 65% of the PSU's rated output.

The crazy thing is we have known about the 75W PSU since June. Yet even then people had sky-high specs for the Wii U.

Yes you can; PSUs have been using more than 65% of their max output for years. The 360's PSU used 88%*

*Yes, I know it was claimed that the 360's PSU was 245w rather than 203w. But that does not explain why everyone, including Microsoft, lists it as 203w. So 178w in game out of a 203w PSU is 88% (and that might not even include all USB ports, so the number could be higher).
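
The division, for anyone checking (figures as quoted above; the 178w in-game draw is the commonly cited measurement, not an official spec):

# 360's measured max draw versus its PSU rating, per the numbers above.
def utilisation_pct(max_draw_watts, psu_rating_watts):
    return 100.0 * max_draw_watts / psu_rating_watts

print(round(utilisation_pct(178, 203)))  # ~88% of the 203w rating
print(round(utilisation_pct(178, 245)))  # ~73% even against the 245w claim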
 
It's not an off-the-shelf part, so discussing the power consumption of PC parts won't give us more than a basic idea. AMD most likely ripped out several PC specific things to make it more efficient. Not to mention we have no idea what process it uses. We don't even know who manufactures the GPU.

There have been rumors lately that Nintendo has yield issues. They wouldn't have yield issues with a 40nm AMD GPU. Those were manufactured in gargantuan volume for years, and the top-end 40nm chips should have a lot more transistors. If that recent article is anything to go by, the GPU will be manufactured by TSMC and the system will use an SoC (which was hinted at in several LinkedIn profiles) with 3D stacking, increasing the efficiency further. It could even be a 28nm part, as that's TSMC's "default" process these days. All in all, the GPU could reach a much better performance-per-watt ratio than any known AMD GPU, mobile parts included.

Thanks for the info wsippel.
 

ozfunghi

Member
I could be mistaken, but I think you're not working it out correctly.

If a PSU is 600w and is 60% efficient, it's not outputting only 360w (60% of 600). It's outputting 600w (because that's its advertised output). It's drawing 1000w of power from the mains, 40% is lost as heat and 60% is used (hence 60% efficiency).

Well... I don't know about that, and I doubt it, but should you be right, that's a totally different thing then. But I doubt this is the case. If you buy an appliance, you want to know what it will consume, also for your electric bill. You generally don't care about how much is actually used, but about how much is actually consumed.
 

USC-fan

Banned
In a console, where the CPU can freely access GPU data without going through a PCI-Express bus and buffers can be cast into different types without going through conversion, a DX10.1/OpenGL 3.3/SM4.1 GPU can be very useful at non-graphics tasks. I believe Havok's GPU-accelerated features (which include cloth, hair and fluids) would work fine on it.

This is one of the reasons I laughed when people talked about the Wii U being designed to do this. Havok is running on the CPU on the Wii U. If it was designed to offload tasks to the GPU, this would be the perfect thing to do. As I have said, I think it just comes down to the R700 not being good enough, or they feel people want to use 100% of GPU power for gfx.

We know the R700 is not well designed for these tasks.
 

The_Lump

Banned
Well... I don't know about that, and I doubt it, but should you be right, that's a totally different thing then. But I doubt this is the case. If you buy an appliance, you want to know what it will consume, also for your electric bill. You generally don't care about how much is actually used, but about how much is actually consumed.


That's a good point. You may well be right then. I'll stick to correcting people's maths instead :)

Could still easily output 60w to the Wii U. (80% efficiency)
 

ozfunghi

Member
Hmmm. I'm liking the e4690 as a comparable gpu. Starting to think the e6760 is dream land :(


Edit. *reads above post* Now entering dream land.

Thanks wsippel! Awesome infos as usual :)

The more I think about it, the more I believe my prediction might hold true: ±27W @ 17Gflops/W = 459 Gflops >>> 480 SPUs × 480MHz × 2 ops = 460 Gflops.
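
For clarity, that assumes the usual 2 ops per ALU per clock; the 480 SPUs and 480MHz are my guesses, not confirmed specs:

# Speculated config, not a confirmed spec: 480 SPUs at 480MHz.
def gflops_from_shaders(spus, clock_mhz, ops_per_clock=2):
    # each ALU does a multiply-add, counted as 2 flops per cycle
    return spus * clock_mhz * ops_per_clock / 1000.0

print(gflops_from_shaders(480, 480))  # 460.8 GFLOPS
print(27 * 17)                        # 459 -> the power-budget estimate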
 
This is one of the reasons I laughed when people talked about the Wii U being designed to do this. Havok is running on the CPU on the Wii U. If it was designed to offload tasks to the GPU, this would be the perfect thing to do. As I have said, I think it just comes down to the R700 not being good enough, or they feel people want to use 100% of GPU power for gfx.

We know the R700 is not well designed for these tasks.




We also know that the Wii U GPU is highly customised. Maybe it uses the R700 as a basis, but remember it's been customised with more modern technologies.
 

The_Lump

Banned
The more I think about it, the more I think my prediction might hold true: ±27W @ 17Gflops/W = 459 Gflops >>> 480 SPUs × 480MHz × 2 ops = 460 Gflops.

So you're an e6760 fan too? ;)
And given wsippel's hot-off-the-press info, a downclocked, die-shrunk version of that is doable at pretty low wattage, I believe...
 

wsippel

Banned
This is one of the reasons I laughed when people talked about the Wii U being designed to do this. Havok is running on the CPU on the Wii U. If it was designed to offload tasks to the GPU, this would be the perfect thing to do. As I have said, I think it just comes down to the R700 not being good enough, or they feel people want to use 100% of GPU power for gfx.

We know the R700 is not well designed for these tasks.
Except, you know, the R700 is exactly the line of GPUs Havok used to demonstrate GPGPU physics on (Havok Cloth). And AMD used an R700 to demonstrate GPGPU AI (March of the Froblins).
 

The_Lump

Banned
No, you're ignoring what Iwata actually said, what the spec sheet says and what we've had pictures of since E3. 75 watts is the PSU's rating. It can't draw more than that from the wall, and it can only deliver a fraction of that in actual usage: about 45 watts.


I was genuinely asking. I had no idea we knew the PSU was rated at 75w.

If that's the case, 60w is still easily doable.
 

USC-fan

Banned
It's not an off-the-shelf part, so discussing the power consumption of PC parts won't give us more than a basic idea. AMD most likely ripped out several PC specific things to make it more efficient. Not to mention we have no idea what process it uses. We don't even know who manufactures the GPU.

There have been rumors lately that Nintendo has yield issues. They wouldn't have yield issues with a 40nm AMD GPU. Those were manufactured in gargantuan volume for years, and the top-end 40nm chips should have a lot more transistors. If that recent article is anything to go by, the GPU will be manufactured by TSMC and the system will use an SoC (which was hinted at in several LinkedIn profiles) with 3D stacking, increasing the efficiency further. It could even be a 28nm part, as that's TSMC's "default" process these days. All in all, the GPU could reach a much better performance-per-watt ratio than any known AMD GPU, mobile parts included.
Yield issue with the GPU or CPU? I would say 28nm is not possible right now. The shortages and problems could not support a console launching this year on 28nm.

Mobile parts are binned. You will not get better performance than that.

Yes you can; PSUs have been using more than 65% of their max output for years. The 360's PSU used 88%*

*Yes, I know it was claimed that the 360's PSU was 245w rather than 203w. But that does not explain why everyone, including Microsoft, lists it as 203w. So 178w in game out of a 203w PSU is 88% (and that might not even include all USB ports, so the number could be higher).
It doesn't matter; we know the exact power usage and PSU rating.
 

ozfunghi

Member
So you're an e6760 fan too? ;)
And given wsippel's hot-off-the-press info, a downclocked, die-shrunk version of that is doable at pretty low wattage, I believe...

Actually... what are the options to start from the e4690?

http://www.anandtech.com/show/4307/amd-launches-radeon-e6760

Architecture = RV730 (RV7xx was it, right?)
TDP = 25 W
Process = TSMC 55nm

Could easily be scaled down to lower the wattage... could ALUs be added? Clock speed dropped?

wsippel, we expect to reach efficiencies of 1 TFLOPS/W in 2016, partly thanks to photonics (!) technologies. Does any gaffer know if a 28nm SoC with 3D stacking can help us break 0.04 TFLOPS/W in 2012 already? That would give a 0.8 TFLOP GPU at the wattage the GPU should draw overall.

I think Nintendo wants to keep it affordable?
 

Donnie

Member
No, you're ignoring what Iwata actually said, what the spec sheet says and what we've had pictures of since E3. 75 watts is the PSU's rating. It can't draw more than that from the wall, and it can only deliver a fraction of that in actual usage: about 45 watts.

Your maths don't add up. WiiU uses 40w without USB in an average situation (not under full load).

Add all USB ports and that's 50w. So you believe that WiiU's PSU can't output enough power to run the WiiU in an average situation with all USB ports in use?
 
But I doubt this is the case. If you buy an appliance, you want to know what it will consume, also for your electric bill. You generally don't care about how much is actually used, but about how much is actually consumed.

I see how you would come to that conclusion. It's not true for PSUs though. When you buy a PSU, you want to know whether it can deliver sufficient power to the hardware. The_Lump's explanation was correct.
I will search the net for a source confirming this, should anyone not believe it yet.
 

Donnie

Member
It doesn't matter; we know the exact power usage and PSU rating.

Excuse me? You wrongly claimed that 65% efficiency is the most a PSU can supply; I corrected you, and your response is "it doesn't matter"? How does it not matter?

We know that the average power usage of WiiU without USB is 40w, and we know its PSU is apparently 75w (though I don't recall any proof). But the fact is 75w could allow for as much as 67w for the console; remove USB and that's still 57w.
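
Spelling out where the 67w and 57w come from (a sketch; the ~90% efficiency and 2.5w per USB port are my working assumptions for illustration, not confirmed figures):

# Working assumptions for illustration: ~90% efficient PSU, four USB 2.0
# ports at their full 2.5w spec load each.
psu_rating = 75                # rated wall draw
efficiency = 0.90              # assumed, not confirmed
usb_budget = 4 * 2.5           # 10w if every port is maxed

available = psu_rating * efficiency
print(available)               # 67.5w for the console as a whole
print(available - usb_budget)  # 57.5w with all USB ports in use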
 

wsippel

Banned
yield issue with gpu or CPU? Would say 28nm is not possible now. The shortages and problems could not support a console launching this year on 28nm.
We don't know if it's the CPU or the GPU. But we know the CPU is 45nm SOI, manufactured in East Fishkill. IBM has produced those in large volumes for years; there shouldn't be any yield issues there. It's not like the chip will have a billion transistors. So that pretty much only leaves either the GPU or the stacking process. And it's a well-known fact that TSMC has massive issues with 28nm, but we also know they expected 28nm to be smooth sailing by now at least two years ago, which was most likely around the time the contracts were signed.

Mobile parts are binned. You will not get better performance than that.
Wrong.
 
This is one of the reasons I laughed when people talked about the Wii U being designed to do this. Havok is running on the CPU on the Wii U. If it was designed to offload tasks to the GPU, this would be the perfect thing to do. As I have said, I think it just comes down to the R700 not being good enough, or they feel people want to use 100% of GPU power for gfx.

We know the R700 is not well designed for these tasks.

So spend some R&D and make it so, right? Maybe an extra few years to do it as well?

From the way you post, you would think it's just an off-the-shelf R700, untouched. In the 4 years of R&D at both Nintendo and AMD, they would not be able to come up with something better in efficiency than the 17.5 GFLOPS per watt they already came up with in 2011? It is within the realms of possibility that AMD and Nintendo could come up with something around 21 GFLOPS per watt on a ~29 watt part, which would give around 608 GFLOPS on a 40nm process; on a 32nm process it would be even more probable. You could be right though, it could just be 12 GFLOPS per watt, if all the R&D was spent on hookers and blow, oppa gangnam style.




Also can this be of help to anyone discussing power supplies?

Efficient power supply requirements

Internal power supplies: 85% minimum efficiency at 50% of rated output and 82% minimum efficiency at 20% and 100% of rated output, with Power Factor > 0.9 at 100% of rated output for power supplies with >= 75W output power
OR
External power supplies: either ENERGY STAR qualified or meet the no-load and active mode efficiency levels

http://www.energystar.gov/index.cfm?c=computers.pr_crit_computers
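
Those thresholds as a toy check (my own sketch of the internal-PSU criteria quoted above, not the official test procedure):

# Toy reading of the internal-PSU criteria above (not the official test).
def meets_energy_star(eff_20, eff_50, eff_100, power_factor):
    return (eff_50 >= 0.85 and eff_20 >= 0.82 and
            eff_100 >= 0.82 and power_factor > 0.9)

print(meets_energy_star(0.82, 0.85, 0.82, 0.95))  # True, right at the limits
print(meets_energy_star(0.65, 0.65, 0.65, 0.95))  # False: 65% is nowhere close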


The below is from Wikipedia, since I am lazy, but the sources are there:

A test in 2005 revealed computer power supplies are generally about 70–80% efficient.[13] For a 75% efficient power supply to produce 75 W of DC output it would require 100 W of AC input and dissipate the remaining 25 W in heat. Higher-quality power supplies can be over 80% efficient; energy-efficient PSUs waste less energy in heat, require less airflow to cool, and as a result will be quieter.

As of 2012 some high-end consumer PSUs can exceed 90% efficiency at optimal load levels, though they will fall to 87-89% efficiency during heavy or light loads. Google's server power supplies are more than 90% efficient.[14] HP's server power supplies have reached 94% efficiency.[15] Standard PSUs sold for server workstations have around 90% efficiency, as of 2010.

The energy efficiency of a power supply drops significantly at low loads. Therefore it is important to match the capacity of a power supply to the power needs of the computer. Efficiency generally peaks at about 50–75% load. The curve varies from model to model (examples of how this curve looks can be seen on test reports of energy efficient models found on the 80 PLUS website).

Various initiatives are underway to improve the efficiency of computer power supplies. The Climate Savers Computing Initiative promotes energy saving and reduction of greenhouse gas emissions by encouraging development and use of more efficient power supplies. 80 PLUS certifies power supplies that meet certain efficiency criteria, and encourages their use via financial incentives. Efficient power supplies also save money by wasting less power; as a result they use less electricity to power the same computer, and they emit less waste heat, which results in significant energy savings on central air conditioning in the summer. The gains of using an efficient power supply are more substantial in computers that use a lot of power.
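
Applying that last point to this thread's numbers (my own sketch, treating 75w as the rated output and Iwata's 40-45w as the typical load):

# Where the Wii U's typical draw would sit on its PSU's load curve.
rated_output = 75
for typical_load in (40, 45):      # Iwata's in-game range
    pct = 100.0 * typical_load / rated_output
    print(round(pct))              # 53 and 60 -> inside the 50-75% sweet spot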
 

USC-fan

Banned
We don't know if it's the CPU or the GPU. But we know the CPU is 45nm SOI, manufactured in East Fishkill. IBM has produced those in large volumes for years; there shouldn't be any yield issues there. It's not like the chip will have a billion transistors. So that pretty much only leaves either the GPU or the stacking process. And it's a well-known fact that TSMC has massive issues with 28nm, but we also know they expected 28nm to be smooth sailing by now at least two years ago, which was most likely around the time the contracts were signed.


Wrong.

So we have the SoC leak. Then we know the CPU is on 45nm? So how do you have an SoC design using both 45nm and 28nm? Or are they saying the GPU is not in the SoC, which doesn't make much sense.

Right
 

THE:MILKMAN

Member
wiki said:
The energy efficiency of a power supply drops significantly at low loads. Therefore it is important to match the capacity of a power supply to the power needs of the computer. Efficiency generally peaks at about 50–75% load. The curve varies from model to model (examples of how this curve looks can be seen on test reports of energy efficient models found on the 80 PLUS website).

And the above is the reason why I think the Wii U is 75W rated / 40-45W in use: PSU efficiency. Sony is the same. Xbox 360 and laptop PSUs, not so much.
 

Donnie

Member
This is one of the reasons I laughed when people talked about the Wii U being designed to do this. Havok is running on the CPU on the Wii U. If it was designed to offload tasks to the GPU, this would be the perfect thing to do. As I have said, I think it just comes down to the R700 not being good enough, or they feel people want to use 100% of GPU power for gfx.

We know the R700 is not well designed for these tasks.

This is such flawed logic. It's like looking at multi-platform audio engines and saying "Well, they're currently using WiiU's CPU; that must mean WiiU's sound hardware isn't good enough to process audio".

Havok, like any other multi-platform software, has to support both a software and a hardware path. Initially they'll get it working on the CPU, because that's the priority due to the need for developers to port 360/PS3 games over. But that does not mean they won't provide a path to make it work on the GPU later.
 

wsippel

Banned
So we have the SoC leak. Then we know the CPU is on 45nm? So how do you have an SoC design using both 45nm and 28nm? Or are they saying the GPU is not in the SoC, which doesn't make much sense.
With a stacked chip, individual components don't need to use the same process.
 

Kenka

Member
(Very lazy) question, but is there anything that would let me know how power consumption behaves when you vary the die shrink? Is the variation linear?
 

The_Lump

Banned
One thing I'm confused by now: if the max PSU draw is 75w, and it's, say, 60% efficient as some of you think, then when the PSU is only drawing 40w, as Iwata said for 'normal gameplay', are you suggesting the Wii U will only be using 24w of power?? Seriously guys?

Are we certain he wasn't talking about the system drawing 75w max, not the psu?
 