
Rumor: Wii U final specs

ozfunghi

Member
I wouldn't say that's 100% accurate...

From Cheesemeister's Japanese translation:

Iwata: "Depending on the game being played and the accessories connected, roughly 40 watts of electrical consumption could be considered realistic."

And remember, Nintendo are trying to market this as energy efficient. They could have taken this figure whilst playing a Virtual Console NES game for all we know!

That's exactly the point I was trying to make a few posts back. Iwata is more likely to downplay the energy consumption in light of environment/cost and energy saving than he is to boast about how much horsepowarz his console has. If he is saying "could be realistic depending on accessories and game"... to me that reads, "while playing Tetris with nothing attached to the USB ports".

According to a chart our beloved Specialguy whipped out, the 360 consumes 88 W while playing Gears of War.
 
The 40W figure appears to refer to consumption during gaming, and depends on what peripherals are attached.

That would probably be the typical power draw when playing NSMB-U, for example. I doubt they are pushing the hardware yet with their current games. 40 watts would be the minimum HD gaming condition they can safely say the Wii U is able to get down to. "Typical", you might say.

In regards to the GPU

You would think that modifying the R700 series for the last three years would have yielded some performance improvements, correct? Is the HD7XXX series also modified from the R700?

Obviously they are not going to call it an HD7XXX series card; since it would have been developed in parallel to the HD7XXX series, it would have its own moniker, so let's call it Cappuccino. The full design wasn't even completed until this year. It might not be at 28nm, but it could be at 40nm or even 32nm, and the possibility of it being designed more effectively for lower-resolution console gaming might have it gain more performance per watt compared to the original 8 GFLOPS per watt of the 55nm part.

The HD4770, which benefitted from a die shrink to 40nm, already performed at 12 GFLOPS per watt in April 2009 with roughly a year's extra worth of tweaking. The Radeon e6760 already performed at 16.5 GFLOPS per watt in May 2011, but that had an extra two years of design.

The designers on Project Cafe, having another year on top of that in parallel with the HD series, would have been able to come up with a chip that at 40nm can output 17.5 GFLOPS per watt or better. For reference, the Radeon HD 7690M XT released in January 2012 draws about 25 watts and can produce 696 GFLOPS at 40nm, a rating of 27.84 GFLOPS per watt.
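To lay that arithmetic out plainly, here's a quick Python sketch using only the figures quoted above; the 30 W GPU budget in the loop is just an assumption for illustration, not a known Wii U spec:

# Rough GFLOPS-per-watt arithmetic using only the numbers quoted above.
def gflops(gflops_per_watt, watts):
    return gflops_per_watt * watts

print(696 / 25)                      # ~27.84 GFLOPS per watt (the 7690M XT figures above)
for eff in (8.0, 12.0, 16.5, 17.5):  # 55nm R700, HD4770, e6760, projected Cafe part
    print(gflops(eff, 30))           # 240, 360, 495, 525 GFLOPS at a hypothetical 30 W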

I would be confident in saying that the advancements made over the original R700 series would share features in common with the HD7XXX series, or be just as modern as the HD7XXX series, which taped out more than six months before Project Cafe.

So where do we go from here?

You can think that the R700 was not modified by much, so that at 30 watts the GPU would only be around what the April 2009 HD4770 managed at 12 GFLOPS per watt, which would be 360 GFLOPS.

OR

You can think that the modified R700 advanced even further than that, with three extra years under its belt.

Could it even be a better advancement than, say, the May 2011 e6760?

Could it even be a better advancement than, say, the HD7000 series, which launched in January 2012?

Looking at Iwata saying the TDP with all USB ports being used (10 watts) is 75 watts: let's say that without the system using any of the USB ports, the maximum load is 65 watts. We can work out the max TDP of the GPU based on this number. Or maybe not?

However, some may argue that the maximum TDP is 40 watts, as Iwata says that you can realistically get the Wii U to 40 watts in a gaming condition. Others would argue that Iwata saying the maximum draw including all USB ports is 75 watts means that he is talking about the PSU rating being a maximum of 75 watts. Is this what Iwata is really saying?
 

Donnie

Member
Alright, explanations that indicate the situation is more obscure and complex than imagined. I am out of the thread. Thank you guys for helping me understand the topic a bit better.

Edit: I edited your edit, Donnie. Thanks for clarifying things a bit more in detail.

Yeah, I was going to use an overclocked number for the i7 just to emphasise it, but decided it didn't really need to be emphasised.
 

Vic

Please help me with my bad english
Ha!

Nintendo's licensing deal has meant a technical collaboration between the two parties. Nintendo will want Wii U to shine, so Unity optimisation has happened on a "deep level". Actually getting Unity working on Wii U was a cinch, Helgason said, because the system's architecture is "similar" to Wii, as Wii was to GameCube. "Nintendo's smart like that, they don't force you to start anew," he praised.

Article: Nintendo licenses Unity engine for Wii U, both in-house and out

I knew Nintendo would make the architecture very similar to the Wii's.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
That's exactly the point I was trying to make a few posts back. Iwata is more likely to downplay the energy consumption in light of environment/cost and energy saving than he is to boast about how much horsepowarz his console has. If he is saying "could be realistic depending on accessories and game"... to me that reads, "while playing Tetris with nothing attached to the USB ports".

According to a chart our beloved Specialguy whipped out, the 360 consumes 88 W while playing Gears of War.

PSUs generally have an efficiency rating of about 60% to 65%. The 360 (Slim) PSU was rated at a maximum of 135 watts.

65% of 135 = 87.75 watts

Taking Iwata's description of the Wii U's PSU being rated at a maximum of 75 watts:

65% of 75 = 48.75 watts.

I think, realistically, you are going to see the total draw of the Wii U averaging out at about 47 watts under load, so the 45 watts figure quoted by Iwata is a pretty close match.
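For anyone who wants to check the numbers, here is that calculation written out, using the same reading as above (65% of the label rating reaching the console); whether that's the right way to read a PSU rating is argued further down the thread:

# 65%-of-rating reading used in this post (an assumption, not a measured figure).
def available_watts(psu_rating_w, efficiency=0.65):
    return psu_rating_w * efficiency

print(round(available_watts(135), 2))  # 87.75 W -- 360 Slim PSU
print(round(available_watts(75), 2))   # 48.75 W -- Wii U PSU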
 

ozfunghi

Member
PSUs generally have an efficiency rating of about 60% to 65%. The 360 (Slim) PSU was rated at a maximum of 135 watts.

65% of 135 = 87.75 watts

Taking Iwata's description of the Wii U's PSU being rated at a maximum of 75 watts:

65% of 75 = 48.75 watts.

I think, realistically, you are going to see the total draw of the Wii U averaging out at about 47 watts under load, so the 45 watts figure quoted by Iwata is a pretty close match.

He said 40 W, so 47 is already an increase of nearly 20%. That's what I was talking about when I said it could account for 5-10W. And 5-10W for a GPU that had to cope with less than 30W could make a big difference.

Edit: sorry for the DP

Edit 2: I know everybody likes to state that PSUs are 65% efficient... but what does that really mean? That they cannot provide more than that? That they will consume more power than ideally required?
 

Vic

Please help me with my bad english
Yeah, that's not really saying much. The Wii U needed to be compatible with the Wii without adding Wii-specific chips, so obviously the architecture will be similar.
That was obvious. I'm just reiterating what I said in an old post of mine. Anyone that worked on the Wii will probably feel at home on the Wii U.
 
That would probably be the typical power draw when playing NSMB-U, for example. I doubt they are pushing the hardware yet with their current games. 40 watts would be the minimum HD gaming condition they can safely say the Wii U is able to get down to. "Typical", you might say.

In regards to the GPU

You would think that modifying the R700 series for the last three years would have yielded some performance improvements, correct? Is the HD7XXX series also modified from the R700?

Obviously they are not going to call it an HD7XXX series card; since it would have been developed in parallel to the HD7XXX series, it would have its own moniker, so let's call it Cappuccino. The full design wasn't even completed until this year. It might not be at 28nm, but it could be at 40nm or even 32nm, and the possibility of it being designed more effectively for lower-resolution console gaming might have it gain more performance per watt compared to the original 8 GFLOPS per watt of the 55nm part.

The HD4770, which benefitted from a die shrink to 40nm, already performed at 12 GFLOPS per watt in April 2009 with roughly a year's extra worth of tweaking. The Radeon e6760 already performed at 16.5 GFLOPS per watt in May 2011, but that had an extra two years of design.

The designers on Project Cafe, having another year on top of that in parallel with the HD series, would have been able to come up with a chip that at 40nm can output 17.5 GFLOPS per watt or better. For reference, the Radeon HD 7690M XT released in January 2012 draws about 25 watts and can produce 696 GFLOPS at 40nm, a rating of 27.84 GFLOPS per watt.

I would be confident in saying that the advancements made over the original R700 series would share features in common with the HD7XXX series, or be just as modern as the HD7XXX series, which taped out more than six months before Project Cafe.

So where do we go from here?

You can think that the R700 was not modified by much, so that at 30 watts the GPU would only be around what the April 2009 HD4770 managed at 12 GFLOPS per watt, which would be 360 GFLOPS.

OR

You can think that the modified R700 advanced even further than that, with three extra years under its belt.

Could it even be a better advancement than, say, the May 2011 e6760?

Could it even be a better advancement than, say, the HD7000 series, which launched in January 2012?

Looking at Iwata saying the TDP with all USB ports being used (10 watts) is 75 watts: let's say that without the system using any of the USB ports, the maximum load is 65 watts. We can work out the max TDP of the GPU based on this number. Or maybe not?

However, some may argue that the maximum TDP is 40 watts, as Iwata says that you can realistically get the Wii U to 40 watts in a gaming condition. Others would argue that Iwata saying the maximum draw including all USB ports is 75 watts means that he is talking about the PSU rating being a maximum of 75 watts. Is this what Iwata is really saying?

I tend to agree with this.
 
That was obvious. I'm just reiterating what I said in an old post of mine. Anyone that worked on the Wii will probably feel at home on the Wii U.

Anyone that worked on the control aspects of Nintendo DS or 3DS games will also be at home with the Wii U. I would hope that those developers who hedged their bets and made money on the Wii and DS would now be rewarded with a development ecosystem that they are accustomed to and comfortable with. It would be like Nintendo giving these smaller developers the chance to show what they can do with the new hardware, as they would have an advantage in tapping the power (in processing and control setups) of the Wii U due to previous experience with similar concepts. We are already seeing this with the developers of Scribblenauts releasing on Wii U.
 
OK, tx. If that is a full-load figure then 600 GFlops looks a bit unlikely indeed. Unless it's a mobile GPU.

Yeah, 40 watts is a joke. That confirms it's quite underpowered, more than I suspected. What is their obsession with a small form factor? It's a home console, not a laptop. And Matt on the last page confirmed the three enhanced Broadway CPUs? So what was IBM talking about, then?
 
PSUs generally have an efficiency rating of about 60% to 65%. The 360 (Slim) PSU was rated at a maximum of 135 watts.

65% of 135 = 87.75 watts

Taking Iwata's description of the Wii U's PSU being rated at a maximum of 75 watts:

65% of 75 = 48.75 watts.

I think, realistically, you are going to see the total draw of the Wii U averaging out at about 47 watts under load, so the 45 watts figure quoted by Iwata is a pretty close match.

So how powerful can the hardware be at 45 watts?

I honestly don't mind, if it means it'll cost me less in electricity bills.
 

ozfunghi

Member
40 watts would be the minimum HD gaming condition they can safely say the Wii U is able to get down to. "Typical", you might say.

Indeed. Iwata is going for the lowest possible number to share.

The Radeon e6760 already performed at 16.5 GFLOPS per watt in May 2011, but that had an extra two years of design.

The designers on Project Cafe, having another year on top of that in parallel with the HD series, would have been able to come up with a chip that at 40nm can output 17.5 GFLOPS per watt or better.

Well, my current guesstimate would be 460 GFLOPS... (480 SPUs & 480 MHz). Blu expected the GPU to consume in the "high 20s". Let's say 27 W @ 17 GFLOPS per watt = 459 GFLOPS.

That's double the 360's GPU, give or take a little, with new features and a more modern architecture...
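For what it's worth, here is the arithmetic behind that guesstimate (the 480 SPUs & 480 MHz configuration is only the guess above, not a confirmed spec):

# Peak throughput for an AMD VLIW-style part: shaders x 2 FLOPs per clock
# (multiply-add) x clock speed.
def peak_gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000

print(peak_gflops(480, 480))  # 460.8 GFLOPS
print(17 * 27)                # 459 -- the GFLOPS-per-watt cross-check above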
 

The_Lump

Banned
That is how I see it. And I also want to place a big question mark next to Iwata's 40 W statement. How I see it, and how I think it is most sensible for him as head of Nintendo to frame it, is in the sense that less = better. He already knows 40 or 50 or even 75 isn't going to persuade the tech analysts that it is super-powerful hardware. What he can do, though, is come across as environmentally friendly, cost efficient, energy saving, etc. So IMO he is likely to give a low estimate rather than a high one. Also, in the same line of thought, if he is speaking of average consumption, I would guess he includes the amount of time you are in the Wii U menu, on MiiVerse, in the game menu, etc. Maybe for each hour on the Wii U you are only stressing the GPU/CPU for 20 to 30 minutes.

So I think we are comparing low estimates (power consumption of the Wii U) to high estimates (power consumption of the GPU/CPU). This is of course pure speculation on my part, but it could account for 5 to 10 W.

Yep. We concur on this.

I have an idea... let's stop talking about specs and wait till the homebrew scene finds out something more than guesswork.


*looks at thread title*

;)
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Edit 2: I know everybody likes to state that PSUs are 65% efficient... but what does that really mean? That they cannot provide more than that? That they will consume more power than ideally required?

The efficiency rating is the total effective energy that is NOT lost or converted to heat by the transformer. The conversion in a PSU from AC to DC generally delivers around 60-65% as electricity, 30-35%+ as heat, and a very, very minute amount as sound and other forms of energy.

So to answer your question, that means that 60-65% is the total amount of the rated maximum that is available to the equipment being powered.

This is why the launch 360 power brick was so hot and large: being rated for a continuous 245 watts meant that nearly 150 watts was being converted entirely to heat, so it required a very large heatsink and needed to be external to the console.
 
Right, this FLOP talk has gone round in circles for the 100,000th time now, and as people have pointed out, it's not really a reliable way to determine performance. Take a day off, everyone.

USC-Fan will have finished his 12th bong of the day soon and be here to wipe the floor with his shamanistic insights into the Wii U, so let's just wait for him for now.
 

Donnie

Member
PSUs generally have an efficiency rating of about 60% to 65%. The 360 (Slim) PSU was rated at a maximum of 135 watts.

65% of 135 = 87.75 watts

Taking Iwata's description of the Wii U's PSU being rated at a maximum of 75 watts:

65% of 75 = 48.75 watts.

I think, realistically, you are going to see the total draw of the Wii U averaging out at about 47 watts under load, so the 45 watts figure quoted by Iwata is a pretty close match.

Plenty of PSUs are more than 65% efficient, though. Even the original 360 hit 88% of its PSU's max wattage (the PSU was 203w and the most demanding games were driving the console itself to 178w).

So I don't think we can assume 65% efficiency.
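That 88% is just the measured draw divided by the rating quoted in this post:

print(round(178 / 203, 3))  # 0.877, i.e. roughly 88% of the PSU's stated wattage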
 
Right, this FLOP talk has gone round in circles for the 100,000th time now, and as people have pointed out, it's not really a reliable way to determine performance. Take a day off, everyone.

USC-Fan will have finished his 12th bong of the day soon and be here to wipe the floor with his shamanistic insights into the Wii U, so let's just wait for him for now.


lol
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Iwata said that 40w is the normal power consumption (not full load) of the Wii U, without considering external sources like USB. Max load could be 50w (60w including USB).

You are not getting 60w out of a max rated PSU of 75w, not physically possible.

Plenty of PSUs are more than 65% efficient, though. Even the original 360 hit 80% of its PSU's max wattage (the PSU was 203w and the most demanding games were driving the console itself to 177w).

So I don't think we can assume 65% efficiency.

The original Xbox 360 PSU was rated at 245-280w, not 203w

65% of 280w = 182w, only 5w out from the 177w

Microsoft said:
http://support.microsoft.com/kb/907635
The Xbox 360 power supply contains an internal fuse that helps protect your console from too-high voltage and from power surges. However, the PSU is a sealed unit. Therefore, you cannot replace the internal fuse. The power supply is rated for 245 watts of continuous power and 280 watts of maximum power.
 

ozfunghi

Member
The efficiency rating is the total effective energy that is NOT lost or converted to heat by the transformer. The conversion in a PSU from AC to DC generally delivers around 60-65% as electricity, 30-35%+ as heat, and a very, very minute amount as sound and other forms of energy.

So to answer your question, that means that 60-65% is the total amount of the rated maximum that is available to the equipment being powered.

This is why the launch 360 power brick was so hot and large: being rated for a continuous 245 watts meant that nearly 150 watts was being converted entirely to heat, so it required a very large heatsink and needed to be external to the console.

Well, something doesn't add up. If you are saying 30-35%, I can't see Nintendo taking any risks, so that would mean they have to take 35% into account. That means only 48.75 W would be available, MAX. Take into account 4 USB ports and subtract that... you're not even left with 40 W! And that is supposedly under maximum load?

Are there any numbers on the Wii: how much it was rated for and how much it actually consumed while under load?
 

Donnie

Member
You are not getting 60w out of a max rated PSU of 75w, not physically possible.



The original Xbox 360 PSU was rated at 245-280w, not 203w

65% of 280w = 182w, only 5w out from the 177w

So then can you explain why it says on the very same page that the power supply is 203w? There seem to be two different ratings going on there, which could easily confuse any comparison to the Wii U's power supply.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Why not? If the efficiency is 88% that's 66w.

Because an efficiency of 88% is far outside the realm of possibility for a cheap, small consumer PSU. I work in an industry that has, at its heart, a reliance on transformers that cost tens of thousands of pounds to achieve over 80% efficiency and are the size of a desktop tower.
 

Donnie

Member
Well, something doesn't add up. If you are saying 30-35%, I can't see Nintendo taking any risks, so that would mean they have to take 35% into account. That means only 48.75 W would be available, MAX. Take into account 4 USB ports and subtract that... you're not even left with 40 W! And that is supposedly under maximum load?

Also what on earth are the 80 Plus PSU ratings about if the max possible is 65%?
 

The_Lump

Banned
Why not? If the efficiency is 88% that's 66w.


Well, "80+" PSUs have been the norm in PC land for around 6 years. Id say that's at least possible on a brand new console. Unless we're once again assuming console hardware is inexplicably like-for-like accross the last 6 years again, and can't possibly have advanced?
 

ozfunghi

Member
So, my Wii PSU says 52 W.
My GCN PSU doesn't state a wattage.

Does anybody know how much the Wii was actually using when playing... say, Xenoblade? We could likely take that as a point of reference, no? Do we have to take into account the GCN controller/memory card ports? I take it they won't consume power in Wii mode? The Wii has 2 USB ports.

Edit: so according to the 65% figure, minus 5 W for USB... that means the Wii's max power draw should be 28.8W??

Edit 2: http://kotaku.com/216051/wiis-uses-about-onetenth-power-of-360?tag=gamingwii
According to that article, the Wii needs 17W to show the Wii menu... and also 17W to play Twilight Princess... does that mean TP was as demanding as the home menu??

Edit 3: according to a guy on a forum in 2006, "According to GC's PSU, GC consumes 39 Watts"
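The 28.8W figure in the first edit is the same 65%-of-rating reading used earlier in the thread, with 5 W set aside for the Wii's two USB ports; whether that's the right way to read the rating is still the open question here:

print(round(52 * 0.65 - 5, 1))  # 28.8 W under the 65%-of-rating assumption, minus 5 W for USB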
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Tell me when you can get one of those PSUs in a typical "wall wart" passively cooled package.

*shrugs*

You can believe whatever you want.
 
Plenty of PSUs are more than 65% efficient, though. Even the original 360 hit 88% of its PSU's max wattage (the PSU was 203w and the most demanding games were driving the console itself to 178w).

So I don't think we can assume 65% efficiency.

You also have to add in all the USB ports running, so that's another 10 watts.
 
That would probably be the typical power draw when playing NSMB-U, for example. I doubt they are pushing the hardware yet with their current games. 40 watts would be the minimum HD gaming condition they can safely say the Wii U is able to get down to. "Typical", you might say.

Yeah, "typical", like when you run a game. It's not like anyone is booting up FurMark to try and set their GPU on fire.

The HD4770, which benefitted from a die shrink to 40nm, already performed at 12 GFLOPS per watt in April 2009 with roughly a year's extra worth of tweaking. The Radeon e6760 already performed at 16.5 GFLOPS per watt in May 2011, but that had an extra two years of design.

The e6760 is two generations removed from the R700, from AMD's second generation DX11 architecture.

The designers on Project Cafe, having another year on top of that in parallel with the HD series, would have been able to come up with a chip that at 40nm can output 17.5 GFLOPS per watt or better. For reference, the Radeon HD 7690M XT released in January 2012 draws about 25 watts and can produce 696 GFLOPS at 40nm, a rating of 27.84 GFLOPS per watt.

The 7690 is THREE generations removed from the R700, on a cutting-edge 28nm process, and a high-margin, cherry-picked part for the mobile market. They are not going to throw away 9 out of every 10 Wii U GPUs they make just to hit 27 GFLOPS per watt efficiency. That number is completely useless. But even if we use your most generous 17.5 GFLOPS per watt figure, when you multiply that by a realistic (and yet generous) power usage number like 25 watts, you're still nowhere near 600 GFLOPS. 600 is a pipe dream.
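The multiplication being referred to, using only the figures already in the thread:

print(17.5 * 25)   # 437.5 GFLOPS -- well short of 600 even with generous figures
print(27.84 * 25)  # 696.0 GFLOPS -- only the cherry-picked mobile figure gets there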

I mean, sure, AMD spent a few years working on this thing, but they had a lot of stuff to do that wasn't fundamentally reworking how the shaders execute code to drastically improve performance. Like laying out the new configuration, modifying and replacing interfaces, adapting to IBM's manufacturing techniques, solving issues with test production runs, working out how to improve yields, etc., etc.
 

ikioi

Banned
PSUs generally have an efficiency rating of about 60% to 65%. The 360 (Slim) PSU was rated at a maximum of 135 watts.

65% of 135 = 87.75 watts

Taking Iwata's description of the Wii U's PSU being rated at a maximum of 75 watts:

65% of 75 = 48.75 watts.

I think, realistically, you are going to see the total draw of the Wii U averaging out at about 47 watts under load, so the 45 watts figure quoted by Iwata is a pretty close match.

It's also equally realistic that Nintendo are using a more efficient power supply.

PSU efficiency has increased significantly in the past few years; there's been a real push within the electronics industry to get devices up to the 80%+ efficiency mark. Improved green credentials, reduced power consumption, reduced heat generation, and cheaper operational costs have been the primary motivating factors.

On top of this, more efficient PSUs output less heat, which also means they can be made smaller or housed in sealed casings without issue. Those two things seem to be something Nintendo loves.

I would not be surprised at all if Nintendo were using a 75-85% efficiency PSU for the Wii U. In fact I'd be quite surprised if the Wii U's PSU really was only 65% efficient, as for a modern electronic device that is terrible, especially for what is already a low-draw power supply; it's far easier and cheaper to build efficient power supplies when they're not drawing much to begin with.

75% - 56 watts

85% - 64 watts

I doubt very much that Iwata's comments about 40w consumption refer to 'max draw'. Nintendo would need to be using the cheapest, bottom-of-the-barrel power supply components to get only 65% efficiency. I appreciate it won't be a gold-class computer-grade PSU in the Wii U, but I very much doubt it's as bad as 65% efficient.
 

The_Lump

Banned
You are not getting 60w out of a max rated PSU of 75w, not physically possible.



The original Xbox 360 PSU was rated at 245-280w, not 203w

65% of 280w = 182w, only 5w out from the 177w


I could be mistaken, but I think you're not working it out correctly.

If a PSU is 600w and is 60% efficient, it's not outputting only 360w (60% of 600). It's outputting 600w (because that's its advertised output). It's drawing 1000w of power from the mains; 40% is lost as heat and 60% is used (hence 60% efficiency).


Likewise, a 75w PSU which is, say, 80% efficient is drawing around 94w from the mains but only delivering 75w.

It's not some theoretical maximum, and when you buy a PSU thinking "I'm good for 600w" you don't have to get a calculator and work out how much power you can actually get from it - that would be silly. It says 600w output, so that's what you're getting. You add onto it to work out the actual power consumption, based on its efficiency rating.
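To put that in calculator form (draw from the wall = rated output divided by efficiency; the 80% figure is only an example, not a known Wii U spec):

# The label rating is the output; the efficiency tells you the mains draw.
def wall_draw(rated_output_w, efficiency):
    return rated_output_w / efficiency

print(wall_draw(600, 0.60))  # 1000.0 W drawn from the mains for a 600 W PSU at 60%
print(wall_draw(75, 0.80))   # 93.75 W drawn for a 75 W PSU at 80%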

And to add to this, Iwata's words were "The Wii U is rated at 75w power consumption. Please note that this consumption rating is measured at maximum utilisation of all functionality." So there you go. In plain English.
 

Kenka

Member
Because an efficiency of 88% is far outside the realm of possibility for a cheap, small consumer PSU. I work in an industry that has, at its heart, a reliance on transformers that cost tens of thousands of pounds to achieve over 80% efficiency and are the size of a desktop tower.
Hey, high-five!
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
It's also equally realistic that Nintendo are using a more efficient power supply.

PSU efficiency has increased significantly in the past few years; there's been a real push within the electronics industry to get devices up to the 80%+ efficiency mark. Improved green credentials, reduced power consumption, reduced heat generation, and cheaper operational costs have been the primary motivating factors.

On top of this, more efficient PSUs output less heat, which also means they can be made smaller or housed in sealed casings without issue. Those two things seem to be something Nintendo loves.

I would not be surprised at all if Nintendo were using a 75-85% efficiency PSU for the Wii U. In fact I'd be quite surprised if the Wii U's PSU really was only 65% efficient, as for a modern electronic device that is terrible.

75% - 56 watts

85% - 64 watts

I doubt very much that Iwata's comments about 40w consumption refer to 'max draw'.

Again, like I said, show me where you can get 75-80%+ efficiency from a passively cooled, wall-wart form factor that costs a typical BOM for a console PSU. Yes, advances have been made, but in PSUs that are actively cooled, larger than the Wii U itself, and cost $200-$300+.

You're trying to make the evidence fit the theory, not the theory fit the evidence. If your aim is just to "spec up" the Wii U to fit preconceived notions, then fine, and facts really have no place here. But I would have thought that making the theory fit the evidence, and speculation grounded in reality, would be of more interest. If not, fair enough, I'm out.
 

Donnie

Member
Tell me when you can get one of those PSUs in a typical "wall wart" passively cooled package.

*shrugs*

You can believe whatever you want.

Surely getting a relatively cheap, passively cooled PSU to reach 80% efficiency is quite a bit easier if it's a lower wattage to begin with. I'd imagine the PSUs you deal with are a couple of orders of magnitude above 75w.

By the way, could you answer my question above please?

http://www.neogaf.com/forum/showpost.php?p=42348232&postcount=4231
 
And to add to this, Iwata's words were "The Wii U is rated at 75w power consumption. Please note that this consumption rating is measured at maximum utilisation of all functionality." So there you go. In plain English.

Plain English? You know what we have here... A WITCH! BURN HIM!
 

ozfunghi

Member
The e6760 is two generations removed from the R700, from AMD's second generation DX11 architecture.

Yeah, and they have been working just as long on the Wii U GPU. So, if it's not possible to get either a lower TDP or a higher FLOPS-per-watt figure than a cheapish embedded GPU, why not simply go for the cheap embedded GPU and save on R&D?

But even if we use your most generous 17.5 GFLOPS per watt figure, when you multiply that by a realistic (and yet generous) power usage number like 25 watts, you're still nowhere near 600 GFLOPS. 600 is a pipe dream.
Well, my current guesstimate would be 460 GFLOPS... (480 SPUs & 480 MHz). Blu expected the GPU to consume in the "high 20s". Let's say 27 W @ 17 GFLOPS per watt = 459 GFLOPS.
 

Jarsonot

Member
Hey guys, quick question:

Earlier you were discussing backwards compatibility with the Wii, whether or not it would require a "Wii Mode", and whether it would have access to the Wii U home menu.

Someone mentioned the 3DS and DS mode: how the 3DS intercepted the call and put the game in sleep mode.

I'm wondering why the 3DS won't allow the DS game to be suspended to allow access to web browsing, notes, the menu.

Is it because of memory limitations?

Is it plausible that the Wii U will be able to intercept a call from the HOME button on the remotes (while playing a Wii game), suspend the game and allow one access to the Wii U menu?

I'm an avid reader of this thread, despite no tech knowledge. Most of this speculation goes over my head, though I am becoming more familiar with terms and limits of hardware. Maybe this'll be a new avenue of speculation for me to lurk on? =)
 

The_Lump

Banned
Because an efficiency of 88% is far outside the realm of possibility for a cheap, small consumer PSU. I work in an industry that has, at its heart, a reliance on transformers that cost tens of thousands of pounds to achieve over 80% efficiency and are the size of a desktop tower.

We're not talking about transformers which cost tens of thousands of pounds. I can find you a 1200w PC PSU with 90% efficiency (i.e. '80 PLUS Platinum' certified) for about £120 if you like :)
 
It's terrible how everything that comes out about Wii U gets spun.

Iwata "it uses 40 watts"

GAF within a couple days: "as you see by these ironclad calculations, Wii U can be drawing no less than 70 watts confirmed, possibly more. That is if we assume they havent already re engineered the power supply to go much higher which they obviously probably have already"
 

The_Lump

Banned
It's terrible how everything that comes out about Wii U gets spun.

Iwata "it uses 40 watts"

GAF within a couple days: "as you see by three ironclad calculations, Wii U can be drawing no less than 70 watts confirmed, possibly more. That is if we assume they havent already re engineered the power supply to go much higher which they obviously probably have already"


Who said that? Read his quote again. He says 75w was the measured maximum using all functionality. That's it. It can't be any more, and no one said that it can.
 