
WiiU technical discussion (serious discussions welcome)

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Is that from the Wii U drive? I'm not sure which of those numbers to go with. Moving parts like optical drives and hard drives usually use the 12V rail, but that calculation leads to 22 watts, which seems too high. The next bump down gets us 5 watts, and the 3.35V one would be 1.6 watts, which then seems too low.

So I'm not sure which of the three numbers is right, and what it means that there are three of them. It wouldn't make sense for it to have three different power draws.


Watts = Amps x Volts btw
The reason a disk drive would have multiple supply powers is because it needs those, and apparently those rails are already present in the host.

Now, of those lines, the 12V is most likely the one for the spindle motor, but that one can have considerable ranges of consumption - spinning up the disk would be the most power-hungry task, while sustaining a constant speed would be much less demanding. Chances are the rest of the rails go to the head mover and the actual reader (laser) circuitry.

Unfortunately, we can draw next to no conclusions what the nominal power draw of the drive is - those inputs are max ratings (just as the console PSU has a max rating of 75W) and we don't know what combinations of loads can occur on those rails - if all can be up at once, or sequentially, or in any other pattern and loads.
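
(For anyone who wants to sanity-check the arithmetic, here's a minimal sketch of the W = A x V calculation. The amp figures are back-solved from the wattages discussed above, so treat them as assumptions rather than values read off the drive label.)

```python
# Per-rail maximum power for the drive, using W = A x V.
# The amp ratings are assumptions back-solved from the wattages
# discussed above (~22 W, ~5 W, ~1.6 W), not label values.
rails = {
    "12V (likely spindle motor)":   (12.0, 1.83),
    "5V (possibly head mover)":     (5.0, 1.0),
    "3.35V (possibly laser/logic)": (3.35, 0.48),
}

total_max = 0.0
for name, (volts, amps) in rails.items():
    watts = volts * amps
    total_max += watts
    print(f"{name}: {volts} V x {amps} A = {watts:.1f} W max")

# These are max ratings only; we don't know which rails can be fully
# loaded at the same time, so the nominal draw remains unknown.
print(f"Ceiling if every rail peaked simultaneously: {total_max:.1f} W")
```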
 
The reason a disk drive would have multiple supply powers is because it needs those, and apparently those rails are already present in the host.

Now, of those lines, the 12V is most likely the one for the spindle motor, but that one can have considerable ranges of consumption - spinning up the disk would be the most power-hungry task, while sustaining a constant speed would be much less demanding. Chances are the rest of the rails go to the head mover and the actual reader (laser) circuitry.

Unfortunately, we can draw next to no conclusions what the nominal power draw of the drive is - those inputs are max ratings (just as the console PSU has a max rating of 75W) and we don't know what combinations of loads can occur on those rails - if all can be up at once, or sequentially, or in any other pattern and loads.
Why is it so high? Has anyone even been able to pull more than 33W out of the machine?
 

tipoo

Banned
Why is it so high? Has anyone even been able to pull more than 33W out of the machine?

We've discussed this a bunch in the last few pages, but the important thing is most consoles have power supplies rated for far higher than what they actually draw, and Nintendo in particular often has a rating near double what the system actually draws. There's a variety of factors for this discussed previously.
 

USC-fan

Banned
Power usage: this is what we've got?

33 watts max from the wall @ 90% PSU efficiency = ~30 watts

Disk drive ~4 watts
CPU ~8 watts

What's left: ~18 watts

My guesses; anyone have hard numbers?
2GB DDR3 RAM ~2 watts
WiFi ~0.5 watts
Flash storage ~0.5 watts

Leaves about 15 watts for the whole GPU chip

It's very impressive that this little bit of power can keep up with the PS360.
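
(A quick sketch of that budget, just summing the guesses above; every number here is an estimate from this thread, not a measurement.)

```python
# Back-of-envelope Wii U power budget using the guesses above.
wall_draw_w = 33.0                        # max observed at the wall
psu_efficiency = 0.90                     # assumed
budget_w = wall_draw_w * psu_efficiency   # ~29.7 W inside the box

estimates_w = {
    "disk drive":    4.0,
    "CPU":           8.0,
    "2GB DDR3 RAM":  2.0,
    "WiFi":          0.5,
    "flash storage": 0.5,
}

gpu_budget_w = budget_w - sum(estimates_w.values())
print(f"Post-PSU budget:        {budget_w:.1f} W")
print(f"Left for the GPU chip:  {gpu_budget_w:.1f} W")   # ~14.7 W
```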
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Why is it so high? Has anyone even been able to pull more than 33W out of the machine?
I don't think anybody has gone further than hooking the console to a wall watt meter and playing a couple of games. That is not exactly an exhaustive stress test, and won't give you very precise information.
 
Power usage: this is what we've got?

33 watts max from the wall @ 90% PSU efficiency = ~30 watts

Disk drive ~4 watts
CPU ~8 watts

What's left: ~18 watts

My guesses; anyone have hard numbers?
2GB DDR3 RAM ~2 watts
WiFi ~0.5 watts
Flash storage ~0.5 watts

Leaves about 15 watts for the whole GPU chip

It's very impressive that this little bit of power can keep up with the PS360.

Nice positive spin you put on it there. haha. Those figures really do make you wonder...
 

tipoo

Banned
It's very impressive that this little bit of power can keep up with the PS360.


The PS3 and 360 in their latest 45nm shrunk-down forms are down to 80-ish watts, right? And their architectures are 7 years old. The Wii U only cuts that down by 2.6x with a brand-new architecture. In the computer world, that kind of efficiency gain isn't really "impressive" after 7 years, it's just expected. Anand from AnandTech said on his latest podcast that the Intel HD 2500, the cut-down version of Intel's integrated graphics and the baseline for the lowest GPU power you can get in a standard PC these days, is about half as powerful as the 360 GPU. And that draws single-digit watts by itself, not including the processor. The non-cut-down HD 4000 is past that level already, and that's still bottom-barrel integrated graphics drawing little power.


[image: PS3power.jpg (PS3 power consumption chart)]


That's not even the super slim, just the old slim.
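
(Rough math behind that ratio, as a sketch; the 80W and 33W figures are the ballpark numbers used in this thread, and the exact multiple shifts a bit depending on whether you count at the wall or after the PSU.)

```python
# Efficiency-gain ratio between the 45nm slim consoles and the Wii U.
# Both draw figures are the ballpark numbers from this thread.
slim_console_w = 80.0           # ~45nm PS3/360 slim, at the wall
wiiu_wall_w = 33.0              # Wii U max observed at the wall
wiiu_post_psu_w = 33.0 * 0.90   # ~29.7 W, assuming ~90% PSU efficiency

print(f"Measured at the wall: {slim_console_w / wiiu_wall_w:.1f}x")      # ~2.4x
print(f"After PSU losses:     {slim_console_w / wiiu_post_psu_w:.1f}x")  # ~2.7x
```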
 

USC-fan

Banned
The PS3 and 360 in their latest 45nm shrunk-down forms are down to 80-ish watts, right? And their architectures are 7 years old. In the computer world, that kind of efficiency gain isn't really "impressive" after 7 years, it's just expected. Anand from AnandTech said on his latest podcast that the Intel HD 2500, the cut-down version of Intel's integrated graphics and the baseline for the lowest GPU power you can get in a standard PC these days, is about half as powerful as the 360 GPU. And that draws single-digit watts by itself, not including the processor. The non-cut-down HD 4000 is past that level already, and that's still bottom-barrel integrated graphics drawing little power.


That's not even the super slim, just the old slim.

The PS3 and X360 have not stopped at 90nm. They have dropped down to 45nm like you said. That drops the power.

The Wii U is at 40nm, which is not much different from 45nm. The Wii U is in no way using cutting-edge chips. Its design is years old at this point.

You need to look at what you are saying.
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
Then why does it cost so much to produce?
 

tipoo

Banned
The PS3 and X360 have not stopped at 90nm. They have dropped down to 45nm like you said. That drops the power.

The Wii U is at 40nm, which is not much different from 45nm. The Wii U is in no way using cutting-edge chips. Its design is years old at this point.

You need to look at what you are saying.



What?

My point was
1) Since the PS360 have dropped to 45nm, the fabrication process is comparable
2) That accounted for, the Wii U drawing 33W while the others draw 80W to produce similar visuals isn't that impressive for a newer design; a 2.6x efficiency gain after 7 years is nothing mind-blowing.

Not sure what you're trying to say I said wrong.
 
The PS3 and X360 have not stopped at 90nm. They have dropped down to 45nm like you said. That drops the power.

The Wii U is at 40nm, which is not much different from 45nm. The Wii U is in no way using cutting-edge chips. Its design is years old at this point.

You need to look at what you are saying.

Then why does it cost so much to produce?

At this point, I'm thinking that, yes, the design is probably from 2010 (Nintendo love mature tech), but they have spent the last couple of years optimizing the MCM so that the different chips get along, playing with clocks, and ensuring that yields would be high.
 

USC-fan

Banned
What?

My point was
1) Since the PS360 have dropped to 45nm, the fabrication process is comparable
2) That accounted for, the Wii U drawing 33W while the others draw 80W to produce similar visuals isn't that impressive for a newer design; a 2.6x efficiency gain after 7 years is nothing mind-blowing.

Not sure what you're trying to say I said wrong.

The design isn't that much newer; in fact the CPU is older.

The GPU design is from 2008 and the PS3 GPU is from 2005.
 

THE:MILKMAN

Member
The reason a disk drive would have multiple supply powers is because it needs those, and apparently those rails are already present in the host.

Now, of those lines, the 12V is most likely the one for the spindle motor, but that one can have considerable ranges of consumption - spinning up the disk would be the most power-hungry task, while sustaining a constant speed would be much less demanding. Chances are the rest of the rails go to the head mover and the actual reader (laser) circuitry.

Unfortunately, we can draw next to no conclusions what the nominal power draw of the drive is - those inputs are max ratings (just as the console PSU has a max rating of 75W) and we don't know what combinations of loads can occur on those rails - if all can be up at once, or sequentially, or in any other pattern and loads.

This is interesting. I have a theory that Nintendo are artificially capping the WiiU at ~33W. If you are right about the BD drive, how come it doesn't show up, say, when loading a game?

I just don't get how the power range seems to be only about ~3W from idling on the front end to playing Assassin's Creed 3.
 

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
A thought... could Nintendo have left plenty of PSU headroom in order to make room for a future version of the Wii U with a built-in SSD? They like to have their power bricks interchangeable between hardware.
 

tipoo

Banned
This is interesting. I have a theory that Nintendo are artificially capping the WiiU at ~33W. If you are right about the BD drive, how come it doesn't show up, say, when loading a game?

I just don't get how the power range seems to be only about ~3W from idling on the front end to playing Assassin's Creed 3.

Then again, see the picture of the PS3 power consumption I posted above; consoles notoriously do not bother with power scaling. They're either off, gaming, or playing video, with no variable power consumption by load. Even at idle they are near max.
 

tipoo

Banned
Quick game of extrapolation anyone? POV ray results.

Core i5 2400S (2.5 GHz): 235.18 pps ; 94.07 pps/GHz
Celeron 220 (1.2 GHz): 81.15 pps ; 67.62 pps/GHz
Athlon II x4 (2.8 GHz): 179.82 pps ; 64.22 pps/GHz
PowerPC 750 (700 MHz): 20.47 pps ; 29.25 pps/GHz
Pentium !!! (450 MHz): 12.43 pps ; 27.62 pps/GHz
Exynos 4210 (1.2 GHz): 29.90 pps ; 24.91 pps/GHz (-mfloat-abi=hard)
Pentium 4m (1.5 GHz): 36.24 pps ; 24.16 pps/GHz
Exynos 4210 (1.2 GHz): 21.99 pps ; 18.32 pps/GHz (-mfloat-abi=softfp)
Atom N270 (1.6 GHz): 28.96 pps ; 18.10 pps/GHz
OMAP 3621 (1.2 GHz): 6.76 pps ; 5.63 pps/GHz
MSM 7227(0.6 GHz): 0.70 pps ; 1.17 pps/GHz

The smartphone SoCs are getting near the 750's pps/GHz. Better than the old Atom, though, and even better than the Pentium 4, but the higher clock on the P4 lets it win in total even with lower IPC. And no chance they could even touch an i5, as expected.
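
(The pps/GHz column is just the listed score divided by the clock; here's a minimal sketch of that normalization on a subset of the results above.)

```python
# Normalizing the POV-Ray scores above: pps/GHz = raw pps / clock.
# A subset of the results listed above (pps, clock in GHz).
results = {
    "Core i5 2400S": (235.18, 2.5),
    "Celeron 220":   (81.15, 1.2),
    "Athlon II X4":  (179.82, 2.8),
    "PowerPC 750":   (20.47, 0.7),
    "Exynos 4210":   (29.90, 1.2),
    "Atom N270":     (28.96, 1.6),
}

# Sort by per-clock score, highest first, matching the list above.
for chip, (pps, ghz) in sorted(results.items(),
                               key=lambda kv: kv[1][0] / kv[1][1],
                               reverse=True):
    print(f"{chip:15s} {pps / ghz:6.2f} pps/GHz")
```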
 

THE:MILKMAN

Member
A thought... could Nintendo have left plenty of PSU headroom in order to make room for a future version of the Wii U with a built-in SSD? They like to have their power bricks interchangeable between hardware.

I doubt it. Nintendo (and Sony) have always, to my knowledge, had PSUs that are rated ~double what they pull from the wall.
 
Sure, it's using an R700-based design from 2008.

The main takeaway there is that we're looking at a VLIW5 part. They then had at least 2 years to play around with different configurations and tweak the design for their own "Nintendo" needs.

We also don't know how Renesas' (presumably) 40nm process has affected the GPU. Yields could be somewhat better or worse than TSMC's and GF's. I would imagine with the maturity of the process, however, that yields are pretty good.

But I do see your argument that we're not actually looking at an architecture 7 years removed from Xenos.
 

Durante

Member
Quick game of extrapolation anyone? POV ray results.



The smartphone SoCs are getting near the 750's pps/GHz. Better than the old Atom, though, and even better than the Pentium 4, but the higher clock on the P4 lets it win in total even with lower IPC. And no chance they could even touch an i5, as expected.
Jaguar should be slightly below Athlon II in per-clock performance.
 

tipoo

Banned
Jaguar should be slightly below Athlon II in per-clock performance.

Which would still have it well above the 750 in instructions per clock per core regardless of clock rate, and with the core count and clock rate advantages of the other two next gen consoles, the power difference should be remarkable.
 

THE:MILKMAN

Member
Then again, see the picture of the PS3 power consumption I posted above; consoles notoriously do not bother with power scaling. They're either off, gaming, or playing video, with no variable power consumption by load. Even at idle they are near max.

I guess you're right. I still think something funky is going on though.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Which would still have it well above the 750 in instructions per clock per core regardless of clock rate, and with the core count and clock rate advantages of the other two next gen consoles, the power difference should be remarkable.
From what I can see this test is largely scalar-fp bound.
 

tipoo

Banned
From what I can see this test is largely scalar-fp bound.

You're probably right, but would other aspects of the chip be so out of whack in proportion?

The chip is so old that this was one of the few tests I could find to compare it to modern chips. Anyone find any others?
 
Quick game of extrapolation anyone? POV ray results.

Those Celeron 220 results look suspiciously high for an '07-era, very low-clocked, single-core Core 2 architecture based chip. The gap between it and the i5 should be much higher. Are the results based on a multi-core implementation of the POV-Ray benchmark?
 
This just occurred to me... How likely is it to find a Hollywood graphics core in the Wii U GPU? Or some remains of the TEV Pipelines apparatus I guess.

Just seemed somewhat likely, since Nintendo is prone to leave parts intact just to ensure perfect compatibility. If that's the case then the biggest changes should be no embedded 1T-SRAM and instead access to 3MB of the eDRAM.

The prospect of whether a Wii U developer could use such a part, if it's there, should be interesting. Could be more than enough to render to the controller if needed.


It has been done before; I reckon some GBA games used the Game Boy's Z80 for sound.
Wouldn't SIMD FP be a "very unfavorable" scenario for Espresso? Scalar seems more average.
Certainly so.

Best benchmark for it is probably Dhrystones.
 

tipoo

Banned
Those Celeron 220 results look suspiciously high for an '07-era, very low-clocked, single-core Core 2 architecture based chip. The gap between it and the i5 should be much higher. Are the results based on a multi-core implementation of the POV-Ray benchmark?

81-235 for the total chip is a pretty big gap. The second measurement is per core scaled to 1GHz. The first is the total chip performance. Doesn't seem off to me. 67-94 per core per GHz is still a pretty large jump with no core count or clock speed advantage.
 
81-235 is a pretty big gap. The second measurement is per core scaled to 1GHz. The first is the total chip performance. Doesn't seem off to me. 67-94 per core per GHz is still a pretty large jump with no core count or clock speed advantage.

Ahh... reading comprehension failure on my part. Thanks for the clarification. But still, given that the i5 has 4x the cores of the single-core Celeron 220 and twice the clock speed, one would expect a minimum of a 4x increase in the score, also factoring in the architectural improvements in the Sandy Bridge family. That is, if the benchmark fully utilized all 4 cores at 100% without other bottlenecks.
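
(For reference, the arithmetic behind that expectation, as a rough sketch using only the figures quoted above and ignoring per-core IPC differences.)

```python
# Expected vs. measured speedup of the i5 2400S over the Celeron 220,
# using only the scores and clocks quoted above.
i5_pps, i5_cores, i5_ghz = 235.18, 4, 2.5
celeron_pps, celeron_cores, celeron_ghz = 81.15, 1, 1.2

# Naive expectation: scale by core count and clock, assuming equal IPC.
naive_expected = (i5_cores / celeron_cores) * (i5_ghz / celeron_ghz)  # ~8.3x
measured = i5_pps / celeron_pps                                       # ~2.9x

print(f"Naive expectation (cores x clock): {naive_expected:.1f}x")
print(f"Measured from the scores above:    {measured:.1f}x")
# The gap suggests the run wasn't scaling across all four i5 cores,
# which is exactly the open question in the posts above.
```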
 

prag16

Banned
Wouldn't SIMD FP be a "very unfavorable" scenario for Espresso? Scalar seems more average.

Crap, just looked it up and you're right. I got my wires crossed. I'm several years removed from what hardware knowledge/experience I have/had. *goes to sit in the corner*
 

tipoo

Banned
Crap, just looked it up and you're right. I got my wires crossed. I'm several years removed from what hardware knowledge/experience I have/had. *goes to sit in the corner*

So, this test is actually using one of the more favorable aspects of the 750?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
So, this test is actually using one of the more favorable aspects of the 750?
This test is mainly using one aspect of those chips, and in a certain way. Trying to generalize IPC et al from it can be a bit dubious.
 

Donnie

Member
Then again, see the picture of the PS3 power consumption I posted above; consoles notoriously do not bother with power scaling. They're either off, gaming, or playing video, with no variable power consumption by load. Even at idle they are near max.

There are 360 games that use at least 10% more power than others. It's certainly not normal to expect all games to use the exact same power. It's also quite unusual to see games only using 10% more power than idle, and almost the same power as the likes of Netflix.

I've said it a few times before, but I expect WiiU games to eventually draw more like 36W+.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Also, which PowerPC 750 is being tested there? It's not like they're all identical in performance.
Indeed they aren't, and neither are compilers. Generally, taking a test at face value without even being able to look at the code you're testing is prone to all kinds of errors.
 

Meelow

Banned
Is this worth a thread? I can't make one.
http://www.ign.com/boards/threads/rumor-new-fps-ip-coming-to-wii-u.452863971/

New IP by Gearbox running "Unreal 4" on Wii U. I guess, if true, that puts to rest the idea that the Wii U can't run it.

Wouldn't shock me, it was rumored a few months ago that UE4 is supported on Wii U.

For people who don't like clicking links:

So there we have it, the title says it all. I've heard that Gearbox is working on something new for "next-gen" consoles, and that this specifically includes the Wii U.

What I "know:"

- Being developed by Gearbox Software
- It's super duper early in development, it's very "bare bones" at the moment.
- It's being developed using the Unreal 4 engine.
- It's a new IP (currently), so it's something we haven't seen yet from them.
- It's an FPS
- It's multiplatform (duh) but the only version I specifically heard about was the Wii U version.

Other interesting tidbits:

- Supposedly their PS4 devkits recently had a meltdown and they're awaiting new ones. (I got a chuckle out of that)

And of course, all things are subject to change, again this project is really very early in development, so who knows what will come of it.

Personally I'm most excited that they are using Unreal 4. I don't really care much for FPS games. Not sure if this excites anyone else, though; clearly the fact that they are using Unreal 4 is a good thing and perhaps answers some questions.
 