
WiiU technical discussion (serious discussions welcome)

Thraktor

Member
I'm ultra skeptical about the register file using anything but (multiported) SRAM. But I already disclaimed any proficiency in the subject, so I'll shut up : )

Yeah, registers strike me as the kind of thing that you want to keep as SRAM, but it's the only thing that we've heard any indication of changes on. The L2 texture cache seems to me to be something where the increased density of eDRAM would be worthwhile, given the limited DDR3 bandwidth, but I don't know if there's some massive reworking of logic required to switch a cache over to eDRAM like that.

And a lack of proficiency is hardly stopping me, now is it?
 

japtor

Member
It's from a doctoral thesis. Data and Type Optimizations in Virtual Machine Interpreters. One section describes optimizing a Java interpreter for the Cell BE. It doesn't seem to be online anymore, but the paper Optimization Strategies for a Java Virtual Machine Interpreter on the Cell Broadband Engine is available and seems to form the core of the SPE optimization section.

More relevant to the Wii U discussion is probably the comparison of the PPE to a 2.4GHz Core 2 processor. Despite running at a lower clock speed, the Core 2 shows itself to be 2x-6x faster. This is a good demonstration of a common task in many game engines – a scripting language – which should profile considerably differently on the Wii U than on the previous HD consoles.
Isn't something like a VM pretty much a worst case scenario for something like the SPEs (or PPE for that matter)?
 
Yes, and many games have a large amount of code written in them, going all the way back to QuakeC (and probably further).

If you're putting something on SPU in the first place it's probably well-contained and parallelized. Branch-heavy code like a VM which is jumping all over the place in both memory and instructions is a really poor choice.

Code reasonably well-suited for SPU smokes Xenon/PPU, but even if it doesn't your priority is often getting it off PPU to start with. Because you will always have more free SPU time than free PPU time... even code that runs significantly slower on SPU because of retarded DMA or whatever can be worth moving to SPU.
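To make that concrete, here's a bare-bones sketch of the kind of dispatch loop a bytecode VM spends its time in (purely illustrative; the opcode set is invented and this is not taken from the thesis above or any shipped engine):

```cpp
// Illustrative bytecode interpreter loop only; opcodes are made up.
#include <cstddef>
#include <cstdint>
#include <vector>

enum Op : std::uint8_t { OP_PUSH, OP_ADD, OP_JMP, OP_HALT };

int Run(const std::vector<std::uint8_t>& code)
{
    std::vector<int> stack;
    std::size_t pc = 0;
    for (;;) {
        // This switch becomes an indirect branch whose target changes almost
        // every iteration. The SPE has no hardware branch prediction (only
        // static software hints), so nearly every dispatch costs a pipeline
        // flush; the in-order PPE/Xenon also suffer here, which is part of
        // why an out-of-order Core 2 pulls so far ahead on interpreters.
        switch (code[pc++]) {
        case OP_PUSH: stack.push_back(code[pc++]); break;
        case OP_ADD:  { int b = stack.back(); stack.pop_back();
                        stack.back() += b; } break;
        case OP_JMP:  pc = code[pc]; break;
        case OP_HALT: return stack.empty() ? 0 : stack.back();
        default:      return -1;   // unknown opcode
        }
    }
}
```

On top of the branches, the bytecode and whatever heap objects the script touches live in main memory, so on an SPE they would have to be pulled into the 256KB local store by DMA on demand – pretty much the definition of "jumping all over the place in both memory and instructions".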
 

Argyle

Member
If you're putting something on SPU in the first place it's probably well-contained and parallelized. Branch-heavy code like a VM which is jumping all over the place in both memory and instructions is a really poor choice.

Code reasonably well-suited for SPU smokes Xenon/PPU, but even if it doesn't your priority is often getting it off PPU to start with. Because you will always have more free SPU time than free PPU time... even code that runs significantly slower on SPU because of retarded DMA or whatever can be worth moving to SPU.

I really only posted that chart because of Argyle's exaggerated claim. In the context of the Wii U, the comparison of the Core 2 to the PPE is more relevant, as I mentioned above. I don't want to derail the thread with too much SPE talk that isn't relevant to Wii U discussion.

I agree completely with wonderdung. Perhaps it was a bit hyperbolic, but I just wanted to share my experience with the SPUs; I've seen people say things in this thread like "they don't count as processor cores" or "they are only good for math", which is frankly ridiculous. But yes, because I don't like swimming upstream: people tend to move tasks that are parallelizable and self-contained to SPU, the PPU is almost always the bottleneck if you are CPU bound, and if you insist on moving your game's scripting engine to SPU you will get what you deserve.

IMHO the Wii U CPU probably falls architecturally in between the Core 2, which I suspect has a few generations of enhancements (and thus greater performance per clock) over what is in the Wii U, and the PPU. But since the clock is so low on the Wii U, I would guess that net performance per core is similar to the PPU's (or slightly worse, if developer comments are to be believed).

My comments earlier in the thread about what I would do if faced with porting an existing game to Wii U are based on the idea that it wouldn't hurt to break up the work in a similar way to the PS3: self-contained tasks whose data you can load into eDRAM to avoid stalling on main memory; don't allow the tasks to be preempted; and make the chunks of work small enough that if something becomes high priority you don't wait too long for a spare worker thread, but not so small that the overhead of setting up each job becomes significant. Perhaps the reason for the lackluster ports and the dev comments about performance is that if you don't spend the time to make sure your data is in eDRAM, you get sub-PPU/Xenon performance because of the slow system RAM they are using.
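To make that concrete, a rough sketch of that kind of job setup, using an ordinary worker-thread pool and a fixed per-worker scratch buffer standing in for data staged into eDRAM (names and sizes are made up; this is not Wii U SDK code):

```cpp
// Illustrative job-queue sketch only: 'Job', the 64KB scratch size, etc. are
// invented for the example, and no real console SDK APIs are used.
#include <algorithm>
#include <condition_variable>
#include <cstddef>
#include <cstring>
#include <deque>
#include <functional>
#include <mutex>
#include <thread>
#include <vector>

struct Job {
    const void* input = nullptr;                  // source data in slow main RAM
    std::size_t size  = 0;                        // small enough that a worker frees up
                                                  // quickly, big enough to amortize setup
    std::function<void(void*, std::size_t)> work; // runs to completion, never preempted
};

class JobQueue {
public:
    explicit JobQueue(unsigned workers) {
        for (unsigned i = 0; i < workers; ++i)
            threads_.emplace_back([this] { RunWorker(); });
    }
    ~JobQueue() {
        { std::lock_guard<std::mutex> lock(m_); done_ = true; }
        cv_.notify_all();
        for (auto& t : threads_) t.join();
    }
    void Push(Job j) {
        { std::lock_guard<std::mutex> lock(m_); jobs_.push_back(std::move(j)); }
        cv_.notify_one();
    }

private:
    void RunWorker() {
        // Fixed scratch buffer per worker: stands in for a slice of eDRAM, so
        // the job chews on local data instead of stalling on main memory.
        std::vector<unsigned char> scratch(64 * 1024);
        for (;;) {
            Job j;
            {
                std::unique_lock<std::mutex> lock(m_);
                cv_.wait(lock, [this] { return done_ || !jobs_.empty(); });
                if (jobs_.empty()) return;        // done_ set and nothing left
                j = std::move(jobs_.front());
                jobs_.pop_front();
            }
            const std::size_t n = std::min(j.size, scratch.size());
            std::memcpy(scratch.data(), j.input, n);  // "DMA" the inputs in
            j.work(scratch.data(), n);                // self-contained, no shared state
        }
    }

    std::deque<Job> jobs_;
    std::vector<std::thread> threads_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};
```

A real version would also worry about core affinity and where the scratch memory actually lives, but the shape of the thing – small, self-contained, run-to-completion jobs – is the point.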

I would also guess that the core with the extra cache is intended to generate the command list for the GPU. It's the only work I can think of that you could reasonably expect to have to perform in every game.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I would also guess that the core with the extra cache is intended to generate the command list for the GPU. It's the only work I can think of that you could reasonably expect to have to perform in every game.
Why would you need an extra 1.5MB of cache for command lists? Particularly when you have an entire 32MB of CPU/GPU-addressable eDRAM pool?
 

Argyle

Member
Why would you need an extra 1.5MB of cache for command lists? Particularly when you have an entire 32MB of CPU/GPU-addressable eDRAM pool?

No real idea, other than maybe it was something to do with the typical memory access pattern for this work. Just trying to think about why they'd bother to even make it asymmetrical. Do you have any ideas on why they would even do this?

Another possibility is that it's the core that the OS will preempt, and games don't have access to all of its cache (to keep the OS thread from screwing with the cache for the game thread).
 
Thanks. I assume that if Wii U was being pushed to the max it'd go higher?

Read the article
One thing that did stand out from our Wii U power consumption testing - the uniformity of the results. No matter which retail games we tried, we still saw the same 32w result and only some occasional jumps higher to 33w. Those hoping for developers to "unlock" more Wii U processing power resulting in a bump higher are most likely going to be disappointed, as there's only a certain amount of variance in a console's "under load" power consumption.
 

beje

Banned
The power supply is rated at 75W and the console runs incredibly cool, so I'd be surprised if they don't pull a PSP/3DS and unlock an extra overclock (along with extra game RAM due to a lower OS footprint) in future revisions of the dev kits when they're comfortable with the machine being able to stand it.
 
http://www.neogaf.com/forum/showpost.php?p=44966628&postcount=646 Just to bring up this post from a few pages back, I had a question...

If one Xenon core only outperformed Broadway by 20%, then could this be why the Wii U's CPU cores were originally clocked at 1GHz (a clock speed increase of 37% over Broadway)? Wouldn't that then put the Wii U's three cores at 1.24GHz with clock-for-clock performance as good as or better than Broadway's, quite a bit beyond Xenon (Broadway would only need to run at about 875MHz to match a Xenon core)? So more like 1.5x (I know we love those multiples), or 50% faster than Xenon, according to the emulator developer from the quoted post above?

Obviously certain tasks that are SIMD-heavy would still likely fall short on the Wii U, but for general computation of game logic and whatnot, wouldn't the Wii U's CPU cores be reasonably faster? Or was the developer missing some other key component of Xenon's power that was unlocked later on?

I was wondering the same thing. Since somebody (don't remember who, sorry) mentioned that a single Wii core was almost as fast as a single Xenon core, wouldn't the increased clock and multicore architecture make it surpass the Xenon quite easily?
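For what it's worth, the rough arithmetic behind that line of thinking (taking Broadway at 729MHz, the Wii U cores at the reported 1.24GHz, and assuming clock-for-clock parity with Broadway, which is the big assumption):

729MHz × 1.2 ≈ 875MHz (the clock a Broadway-class core would need to match one Xenon core, per the quoted 20% figure)
1000MHz ÷ 729MHz ≈ 1.37 (the 37% bump of the early 1GHz devkit clock)
1243MHz ÷ 875MHz ≈ 1.42 (each Wii U core roughly 1.4x a Xenon core for this kind of general-purpose code)

None of that accounts for SIMD throughput, where Xenon's 128-bit VMX128 units are much wider than Broadway-style paired singles, so it only applies to the scalar, branchy work discussed above.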
 

dock

Member
I wonder how much stuff is in the Gamepad.

It obviously runs the TV remote stuff independently of the Wii U itself, so how much could this firmware be expanded upon to offer more functionality?

I hope the TV remote becomes more customisable in the future to control receivers, or toggle various remote configs.
 

SmokyDave

Member
I wonder how much stuff is in the Gamepad.

It obviously runs the TV remote stuff independently of the Wii U itself, so how much could this firmware be expanded upon to offer more functionality?

I hope the TV remote becomes more customisable in the future to control receivers, or toggle various remote configs.

Doesn't it do that already?
 

Donnie

Member
Read the article

Right, there's a certain variance in a console's power consumption under load, so expect higher power usage in future WiiU games, just don't expect 40W or higher. For instance, the 360's variance between launch games and more recent games was something like 15%. If we used that as a guideline we'd expect WiiU to max out at around 37W, but of course it could end up a bit lower or higher than that.
 

mrklaw

MrArseFace
No real idea, other than maybe it was something to do with the typical memory access pattern for this work. Just trying to think about why they'd bother to even make it asymmetrical. Do you have any ideas on why they would even do this?

Another possibility is that it's the core that the OS will preempt, and games don't have access to all of its cache (to keep the OS thread from screwing with the cache for the game thread).

Maybe an OS-only cache and a game-only cache for that core?
 

wsippel

Banned
The measured power consumption seems weird, looking at the hardware presentation back in September:

Depending on the game being played and the accessories connected, roughly 40 watts of electrical consumption could be considered realistic.
http://www.neogaf.com/forum/showpost.php?p=42312104&postcount=1

28 - 33W isn't "roughly 40W" - it's roughly 30W. What's with the 30% difference? What if it's related to Arkam's statements that initial devkits were running at 1GHz/ 400MHz? Maybe games designed for firmware 1.x run at lower clock speeds and no currently released game or application has the chips running at the 1.25/ 550 clocks Marcan discovered? That difference in clock speed could potentially explain the 10W difference.
 
I wonder how much stuff is in the Gamepad.

It obviously runs the TV remote stuff independently of the Wii U itself, so how much could this firmware be expanded upon to offer more functionality?

I hope the TV remote becomes more customisable in the future to control receivers, or toggle various remote configs.

Honestly, I'm so used to activating my devices the way I do that I forget to use the Wii U GamePad.

So, in the end, it's kinda useless for me. Maybe when it becomes easy to use with all my devices it would be useful.
 

Log4Girlz

Member
The measured power consumption seems weird, looking at the hardware presentation back in September:


http://www.neogaf.com/forum/showpost.php?p=42312104&postcount=1

28 - 33W isn't "roughly 40W" - it's roughly 30W. What's with the 30% difference? What if it's related to Arkam's statements that initial devkits were running at 1GHz/ 400MHz? Maybe games designed for firmware 1.x run at lower clock speeds and no currently released game or application has the chips running at the 1.25/ 550 clocks Marcan discovered? That difference in clock speed could potentially explain the 10W difference.

Those 10 watts should make a monstrous difference.
 
28 - 33W isn't "roughly 40W" - it's roughly 30W. What's with the 30% difference? What if it's related to Arkam's statements that initial devkits were running at 1GHz/ 400MHz? Maybe games designed for firmware 1.x run at lower clock speeds and no currently released game or application has the chips running at the 1.25/ 550 clocks Marcan discovered? That difference in clock speed could potentially explain the 10W difference.

Maybe Digital Foundry measured the power consumption after the power supply, but then again the PS3 Slim has an internal power supply, so leaving that out is more difficult.
 

Thraktor

Member
Read the article

Their logic here is somewhat flawed. If every game draws 32W, and even the Internet browser draws 32W, then the logical deduction would be that there is very little in the way of power gating, or other mechanisms to reduce power draw when at less than full load. You can't really make any deductions from it as to whether there's more performance to be squeezed from the console.

I also would find it incredibly unlikely that current games are running on anything less than the full 1.25GHz/550MHz clocks.
 

Log4Girlz

Member
Those 10W are one third of the overall power consumption. That is quite a difference.

Still a rather small amount of power. I mean, if you had a 200 watt power supply and were not utilizing 30% of it, the performance lost is far more than, say, 10 watts, even if the percentage is the same.
 

Donnie

Member
Still a rather small amount of power. I mean, if you had a 200 watt power supply and were not utilizing 30% of it, the performance lost is far more than, say, 10 watts, even if the percentage is the same.

Whether you're dealing with a 40W console only using 75% of its power or a 1000W PC only using 75% of its power, the result of using that extra percentage is the same no matter what the actual number of watts. You're still going to increase performance by 30%+, which is a pretty big increase.

Though my own view is simply that this is a case of variance in power usage rather than clock speeds. In early power usage tests, 165W was reported as the Xbox 360's power usage under load. Later tests saw games using more like 190W, a difference of 15%.
 

lherre

Accurate
The measured power consumption seems weird, looking at the hardware presentation back in September:


http://www.neogaf.com/forum/showpost.php?p=42312104&postcount=1

28 - 33W isn't "roughly 40W" - it's roughly 30W. What's with the 30% difference? What if it's related to Arkam's statements that initial devkits were running at 1GHz/ 400MHz? Maybe games designed for firmware 1.x run at lower clock speeds and no currently released game or application has the chips running at the 1.25/ 550 clocks Marcan discovered? That difference in clock speed could potentially explain the 10W difference.

The kits with final speeds (at least the GPU at 550MHz) are 1.5 years old (more or less summer 2011), so it doesn't make sense to use kits older than these, with "gimped" performance.
 

Donnie

Member
What is 30% of 30, what is 30% of 200? You are losing 2 Wii U's worth of performance in the latter scenario.

OK, first of all, you can't measure performance between two different devices in watts. Secondly, no matter what you compare it to, it's always going to be a 30% increase, which is large no matter what the increase in wattage.

The idea that increasing a 30W console's power consumption to 40W wouldn't be a big deal, because other devices increasing by the same percentage would see a larger increase in wattage, is crazy.
 

wsippel

Banned
Still a rather small amount of power. I mean, if you had a 200 watt power supply and were not utilizing 30% of it, the performance lost is far more than, say, 10 watts, even if the percentage is the same.
Yes, 30% of 200 is more than 30% of 30. No shit Sherlock. It's still 30%. Same relative difference. In fact, the relative difference in this latter case would be higher, as it's all for the chipset. Periphery stays the same, obviously. If we assume 10W for the periphery, that leaves 20W/ 190W for the actual chipset (I'm not using your exact example as it's backwards: x - y% + y% != x). One third more power overall means 30W just for the chipset in the former case (+50%), 256W in the latter (+35%).
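To spell out that arithmetic (using the assumed 10W periphery figure from the post above, which is an assumption, not a measurement):

30W console: a one-third increase gives 40W total; 40W - 10W = 30W for the chipset vs. 20W before → +50%
200W console: a one-third increase gives ≈266.7W total; 266.7W - 10W ≈ 256.7W for the chipset vs. 190W before → +35%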
 
Performance doesn't seem to be choked by GPU-intensive tasks, does it? I thought I read that slowdowns occur in areas with lots of alpha, or areas with lots of normal maps or high-res textures, in game, which would be a RAM bandwidth issue.

Yes, 30% of 200 is more than 30% of 30. No shit Sherlock.

Christ dude, calm down.
 

Durante

Member
I think it's great that they did that. As I said earlier in the thread, I was very unsure of whether the system did more aggressive power gating or not, so it was hard to draw any conclusions from just 3 measurements. Now we see that there does seem to be a pretty hard cap on the power draw.

Their logic here is somewhat flawed. If every game draws 32W, and even the Internet browser draws 32W, then the logical deduction would be that there is very little in the way of power gating, or other mechanisms to reduce power draw when at less than full load. You can't really make any deductions from it as to whether there's more performance to be squeezed from the console.

I also would find it incredibly unlikely that current games are running on anything less than the full 1.25GHz/550MHz clocks.
Yeah, I agree with all of this.

Performance doesn't seem to be choked by GPU-intensive tasks, does it? I thought I read that slowdowns occur in areas with lots of alpha, or areas with lots of normal maps or high-res textures, in game, which would be a RAM bandwidth issue.
The alpha stuff is the most puzzling remaining aspect of Wii U performance to me at this point. One potential explanation that was put forth is that doing correct alpha blending using traditional methods would require polygon sorting on the CPU, which could be the reason for the slowdown. I guess this could be tested by checking whether the slowdown is dependent on the amount of screen area covered by blended effects or not.
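For reference, the "traditional method" in question is just sorting transparent geometry back to front on the CPU before submitting it; a minimal sketch of where that cost lands (purely illustrative, not anyone's actual engine code):

```cpp
// Illustrative only: back-to-front sort of transparent draws each frame.
// With lots of alpha-heavy particles or foliage, this sort (plus the
// cache-hostile walk over scattered draw data) hits the CPU every frame,
// which is one way alpha-heavy scenes could bottleneck on the CPU side
// rather than on fill rate.
#include <algorithm>
#include <vector>

struct TransparentDraw {
    float viewDepth;   // distance from the camera along the view axis
    int   meshId;      // stand-in for whatever the engine actually submits
};

void SubmitTransparents(std::vector<TransparentDraw>& draws)
{
    // Correct classic alpha blending requires drawing far-to-near.
    std::sort(draws.begin(), draws.end(),
              [](const TransparentDraw& a, const TransparentDraw& b) {
                  return a.viewDepth > b.viewDepth;   // farthest first
              });
    for (const TransparentDraw& d : draws) {
        // issue the draw call for d.meshId here
        (void)d;
    }
}
```

The test proposed above would still discriminate: if the frame time tracks the number of transparent draws rather than the blended screen area, a CPU-side cost like this sort is the more likely culprit; if it tracks covered area, it points back at fill rate or bandwidth.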
 

Log4Girlz

Member
OK, first of all, you can't measure performance between two different devices in watts. Secondly, no matter what you compare it to, it's always going to be a 30% increase, which is large no matter what the increase in wattage.

Inefficiencies are large problems on larger scales. An iPad can use up to what, 4 watts at full load? We're talking about like two iPads and some change worth of wattage. The readily observable graphics increase that would bring to the Wii U would be far less noticeable than a 30% power bump in a 200 watt system.
 

Donnie

Member
Inefficiencies are large problems on larger scales. An iPad can use up to what, 4 watts at full load? We're talking about like two iPads and some change worth of wattage. The readily observable graphics increase that would bring to the Wii U would be far less noticeable than a 30% power bump in a 200 watt system.

That's like saying that if you increase the quality of a character model from 1000 polys to 1300 polys, it'll be a less noticeable difference than if you increase a 6700-poly model to 8700 polys. Not only is it very questionable, but it's also quite a pointless discussion. You're increasing the system's power usage by 30%; all we can say about that is that it should allow for a large increase in performance for that system. That won't change no matter how many iPads you compare the power increase to.

I don't even believe the power usage will increase to 40W, and I don't believe that kind of power usage increase is required for better-looking games either, so I probably shouldn't bother with this discussion. It's just that the idea of a 30% increase being seen as trivial, because the wattage increase doesn't sound big in comparison to other devices, is ludicrous. I mean, the original Xbox 360 uses about 190W under load. If we increase that by 30%, the difference is 57W, not that far off two WiiUs going on current power draw. Does that mean that if the 360 were overclocked by 30% it would suddenly be over twice as powerful as the WiiU? Of course not, because no matter what the wattage increase, it's still just a 30% increase.
 
Do you think I would have bothered writing a lengthy explanation if I were upset?

Length of post is the greatest measurement of MAD.

You seem to have a 45 MOPS (Mad operations per second). At 3 lines of anger, you roughly are at a meager 135 MOPS.

Your MOP per line efficiency can probably increase even more so, and with the amount of space you have on a page, you can increase the MOPS per White Space. You're currently at a rough 32ws (white space), if you can bump that up to 40, that would be a 30% increase.
 

Log4Girlz

Member
That's like saying that if you increase the quality of a character model from 1000 polys to 1300 polys, it'll be a less noticeable difference than if you increase a 6700-poly model to 8700 polys. Not only is it very questionable, but it's also quite a pointless discussion. You're increasing the system's power usage by 30%; all we can say about that is that it should allow for a large increase in performance for that system. That won't change no matter how many iPads you compare the power increase to.

What is more noticeable, a 30% increase in fuel efficiency in a Prius or a 30% increase in fuel efficiency in a full-size pickup? You will be saving hundreds if not thousands of dollars more per year on the pickup in fuel costs.

Increasing the Wii U's performance by 30% will make a negligible impact on the graphics. Increasing some monster 200 watt next-gen platform's performance by 30% could see dramatic increases in scene complexity which the Wii U could only dream of.
 

wsippel

Banned
Length of post is the greatest measurement of MAD.

You seem to have a 45 MOPS (Mad operations per second). At 3 lines of anger, you roughly are at a meager 135 MOPS.

Your MOP per line efficiency can probably increase even more so, and with the amount of space you have on a page, you can increase the MOPS per White Space. You're currently at a rough 32ws (white space), if you can bump that up to 40, that would be a 30% increase.
Nah, that doesn't work for me. My posts get shorter the more angry I am. Different architecture, my MPU (MAD processing unit) has a built-in compression stage. :)
 

AmFreak

Member
The measured power consumption seems weird, looking at the hardware presentation back in September:

Depending on the game being played and the accessories connected, roughly 40 watts of electrical consumption could be considered realistic.

http://www.neogaf.com/forum/showpost.php?p=42312104&postcount=1

28 - 33W isn't "roughly 40W" - it's roughly 30W. What's with the 30% difference? What if it's related to Arkam's statements that initial devkits were running at 1GHz/ 400MHz? Maybe games designed for firmware 1.x run at lower clock speeds and no currently released game or application has the chips running at the 1.25/ 550 clocks Marcan discovered? That difference in clock speed could potentially explain the 10W difference.

The WiiU has 4 USB 2.0 ports, each of which can consume up to 2.5W (5V x max. 500mA).
Four USB ports make that up to 4 x 2.5W = 10W.
 

chaosblade

Unconfirmed Member
What is more noticeable, a 30% increase in fuel efficiency in a Prius or a 30% increase in fuel efficiency in a full-size pickup? You will be saving hundreds if not thousands of dollars more per year on the pickup in fuel costs.

Increasing the Wii U's performance by 30% will make a negligible impact on the graphics. Increasing some monster 200 watt next-gen platform's performance by 30% could see dramatic increases in scene complexity which the Wii U could only dream of.

But... aren't you doing this comparison backwards?

I mean, let's look at it the other way. Remove 30% of the WiiU's performance; that would be a pretty significant step back, would it not? Far from negligible, just like adding 30% is not negligible.
 