
Rumor: Wii U final specs

Mastperf

Member
That statement is simplified to the point of being wrong. OoOE still requires well-optimized code. OoOE CPUs don't stall as easily when waiting for operands; that's pretty much it.
So what exactly is "unoptimized" about this 360/ps3 code that wouldn't take advantage of the WiiU's cpu?
 

Alexios

Cores, shaders and BIOS oh my!
So what exactly is "unoptimized" about this 360/ps3 code that wouldn't take advantage of the WiiU's cpu?
It's not unoptimized, it's optimized for different CPUs (and for whole systems: not written to utilize the different sound processors and instead offloading all that work to the main CPU, not utilizing the eDRAM and instead expecting faster main RAM alone, etc.). OoOE is one difference that means it doesn't need to be clocked as high in GHz to be as capable, and there could be other differences requiring special optimization (which could be why some devs have commented on it being slow, perhaps before realizing the possibilities for optimization on top of the better performance per Hz).

Like the optimizations they didn't do for Skyrim, which meant the PC version performed sub-par until a modder partially fixed it, giving it a 40% boost depending on setup, with the modder claiming that had Bethesda done it properly it could have yielded a 100% boost (Bethesda then incorporated the fixes in a patch, but no better than the modder had done; in fact they probably just reproduced his work). Or maybe the Wii U CPU is just that weak, and no amount of optimization or use of the other available processors can save it. I'm not a hardcore tech head who fully understands this, nor have I seen exact specs and their differences from the PS360 hardware posted; I'm just saying these things are plausible based on real-world examples in different cases (if you look up SkyBoost or whatever, you can read in detail exactly what the original Skyrim code was lacking that the modder added to make it perform that much better; I don't understand it well enough to put it in any more detail without just parroting his words). And of course I'm not saying this is for sure the case (although, regardless of this, we'll surely see far better games on Wii U eventually, possibly games the PS360 couldn't 100% reproduce, even if they could improve in some areas).
 

AndTAR

Member
Any idea approximately when to expect GamePad spare-part availability? I saw in the iFixit teardown that the analog stick units are separate from the main board, so that's good news for replacing worn-out sticks - if the parts are anywhere to be found...

any good screen protector for my WiiU GamePad?
Most stuff, even from cheaper brands, is usually good enough as long as a few basic criteria are met; static adhesive is probably the most important one.

If you want something certain to be good, the Hori one is probably the way to go. They have a long history of making protectors for Nintendo's resistive touch screens.
 

BlankaBR

Banned
any good screen protector for my WiiU GamePad?

[image: imag0359.jpg]
 
It's not unoptimized, it's optimized for different CPUs (and whole systems, like not made to utilize different sound processors, instead offloading all that work to the main CPU). OoOE is one difference that means it doesn't need to be as high in GHz to be as capable; there could be other differences requiring optimization not done for the other consoles. Like the optimizations they didn't, for example, do for Skyrim, which meant the PC version performed sub-par until a modder partially fixed it, giving it a 40% boost depending on setup and claiming that had it been done properly by Bethesda it could yield a 100% boost (Bethesda then incorporated the fixes in a patch but not any better than the modder had done; in fact they probably just reproduced his work). Or the Wii U CPU is just that weak and no optimization and use of the other available processors can save it. I'm neither a hardcore tech head able to fully understand this, nor have I seen exact specs and their differences from the PS360 hardware posted; I'm just saying these things are plausible based on real-world examples in different cases (if you look up SkyBoost or whatever you can read in detail exactly what the original Skyrim code was lacking that the modder added to make it perform that much better; I don't understand it enough to put it in any more detail than I have without just parroting his words).

There is a stark difference in raw horsepower, apparently. Any optimization would only bring down the amount of things being processed, like that 'optimization' you posted. That optimization only helps slower CPUs not process so much.
 
Bail out to blu's thread. Bomb dropped there that Nintendo apparently haven't synced the clocks this time around. My mind is blown. All clock speed estimates out the window.

Holy fuck, finally sifted through this thread and got to this post.

Holy fucking fuck.

EDIT: Found link. BAIL OUT!
 

SS4Gogita

Henshin!
There's one thing that keeps getting touted that I don't understand. Iwata said the output under full load would be 75 watts and that stuff like the menu/apps/less demanding games would be around 40. So where is the 30-35 coming from, and why are those figures being used to derive the possible specs of the various components instead of this max figure?
 

FLAguy954

Junior Member
There's one thing that keeps getting touted that I don't understand. Iwata said the output under full load would be 75 watts and that stuff like the menu/apps/less demanding games would be around 40. So where is the 30-35 coming from, and why are those figures being used to derive the possible specs of the various components instead of this max figure?

I remember this being said as well; some speculators here are being too selective with their information, I'm afraid.
 
3DS afaik is synced too.

I find it strange, after so many generations, that Nintendo is not using some kind of multiplier to balance their system. Maybe due to the multi-core architecture it's not so apparent where the balance was made.

As far as I understand it, at a certain point the benefit gained from the lowered latency of running different chips "in sync" is less than what you would get from just running each chip at its maximum speed. Maybe multiple cores and long pipelines in modern processors are also getting less benefit from that...
 

Alexios

Cores, shaders and BIOS oh my!
There is a stark difference in raw horsepower, apparently.
So now it's not only of similar power to the current systems but "starkly" less powerful? News to me.
That optimization only helps slower CPUs not process so much.
No, optimization helps any CPU, like the powerful PCs in my example, which went from chugging in certain Skyrim areas to running them comfortably, or less powerful PCs which went from chugging almost everywhere regardless of settings to playing comfortably almost everywhere on high settings (as in my particular case). And it was a PC-specific issue despite the game being multiplatform (of course the consoles didn't run it as well, but not because of such a blatant error/omission; even after catering to them they still tried to push too much in certain areas - had the issue been present across all versions, they would also have released a PS360 patch improving performance as dramatically, or more likely suffered even worse performance on both systems, requiring further downgrades), showing you can optimize for different components separately rather than optimize once and have it apply to everything.

I don't know if that means you then process "less", but if you do so by getting rid of useless work that wastes power, not by actually dumbing down any game logic, which the mod (and then the official patch) didn't seem to do, then it's something that should be done regardless, rather than expecting the hardware to be powerful enough to run it well whether or not it's made correctly. Lower-end games should optimize too rather than brute-force it, to save energy and lower hardware heat at the very least. And if, adding everything together (from optimization for the given CPU/GPU to taking advantage of the components the other systems either don't have or don't utilize in the same manner), that could have made certain Wii U launch ports perform, say, 40% better, then they'd be at least on par with the other systems (which already have those engines' code optimized for their quirks over the past gen), and there'd be no reason for comments about how the Wii U isn't just not more powerful but "starkly" less powerful.

I've no idea where this is going. It's jumping all over the place now.
No, it's not jumping anywhere. If you think it jumps now, then you didn't read the last comment, since this basically just explains that a little further, given your (to me) odd answer about how optimization only helps low-end components. If anything, you were the one who jumped to Wii U hardware descriptions I didn't get into at all in that post while replying to me.
 

Diablos54

Member
There's one thing that keeps getting touted that I don't understand. Iwata said the output under full load would be 75 watts and that stuff like the menu/apps/less demanding games would be around 40. So where is the 30-35 coming from, and why are those figures being used to derive the possible specs of the various components instead of this max figure?
The 30-35 number is coming from NSMBU IIRC. Naturally, that's the most intensive Wii U game and thus the best one to base this number on.
 
So now it's not only of similar power to the current systems but "starkly" less powerful? News to me.

No, optimization helps any CPUs, like powerful PCs in my example, which go from chugging in certain Skyrim areas to comfortably running them, or less powerful PCs which go from chugging almost everywhere regardless of settings to playing comfortably almost everywhere with high settings (as in my particular case). While it was a PC specific issue despite the game being multiplatform (of course the consoles didn't run it as good, but not because of such a blatant error/omission, but because even after catering to them they still tried to push too much in areas) showing you can optimize for different components separately rather than optimize once and have it apply to everything. I don't know if it means you then process "less" but if you do so by getting rid of useless shit wasting power, not actually dumbing down any game logic functions, which the mod (and then official patch) didn't seem to do, then it's something that should be done regardless, rather than expect the hardware to be powerful enough to run it well regardless of if it's made correctly or not. Lower end games should optimize too rather than brute force, to save on energy and lower hardware heat at the very least. And if it's something that could have made certain WiiU launch ports perform say 40% better too when adding everything together (from optimization for the given CPU/GPU to taking advantage of the components the other systems either don't have or don't utilize in the same manner), then that would mean they'd be at least on par with the other systems (that already have those engines' code optimized for their quirks over the past gen) and so not cause people to make comments like how WiiU isn't only not more powerful but "starkly" less powerful as there'd be no reason to think so.

I've no idea where this is going. It's jumping all over the place now.

Some of the slowdown in the Wii U games is apparently not all CPU-triggered, and this Skyrim optimization is specific to PC CPUs (x87/SSE opcodes) and lowers the CPU load.
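To make the x87/SSE point concrete: the same scalar float math can be compiled to the old x87 FPU instructions or to SSE instructions, and on modern x86 CPUs the SSE path is generally faster. A minimal sketch, assuming a 32-bit x86 build with GCC; the function and flags shown are purely illustrative, not Skyrim's actual code:

    /* The kind of hot scalar float math a game engine runs thousands of
       times per frame. */
    static float distance_sq(float ax, float ay, float az,
                             float bx, float by, float bz)
    {
        float dx = ax - bx, dy = ay - by, dz = az - bz;
        return dx * dx + dy * dy + dz * dz;
    }

    /* On 32-bit x86, GCC historically defaults to x87 code for this:
           gcc -m32 -O2 -mfpmath=387 engine.c
       Asking it to use SSE scalar math instead:
           gcc -m32 -O2 -msse2 -mfpmath=sse engine.c
       produces the same results from the same source, just with a faster
       instruction sequence - no game logic gets "dumbed down". */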
 

ikioi

Banned
So how do you explain the Wii U version of Mass Effect 3 running at equal graphical quality to the Xbox 360 and PS3, yet maintaining a higher frame rate than the PS3?

If the Wii U was inferior technically, that wouldn't be the case.
 

Rolf NB

Member
Even OoOE CPUs still do what they're told. They're certainly more efficient, but Rolf's post made it sound like optimized code isn't important for OoOE CPUs - and that's wrong.
But it's true. You don't optimize for OoOE.

There's a ton of things compilers can do to remove redundancies from generated code. Constant folding and propagation, branch elimination, small-body inlining, aligning code and data, value lifetime analysis, register allocation, yadda yadda yadda. Yes, you want to enable them. No, none of them are specific to OOOE or not OOOE. They are universally beneficial, and don't even depend on the underlying instruction set at all.

It's only for in-order archs that you have to go further and manually move instructions around to maximize issue rates. This is not necessary for OOOE archs. They do it themselves. You just run your compiler with -O2 and things will be fine.
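For what it's worth, a minimal sketch (assuming GCC and plain C; the snippet is made up) of the kind of generic, ISA-agnostic optimizations meant here:

    #include <stdio.h>

    /* Small helper the compiler will happily inline at -O2 (small-body inlining). */
    static int scale(int x) { return x * 4; }

    int main(void)
    {
        /* Constant folding/propagation: 60 * 60 * 24 is computed at compile
           time, and the always-false branch below is eliminated entirely. */
        int seconds_per_day = 60 * 60 * 24;
        if (seconds_per_day < 0)
            printf("never printed - branch removed\n");

        printf("%d\n", scale(seconds_per_day));
        return 0;
    }

    /* Build with: gcc -O2 example.c
       None of these transformations care whether the target CPU executes
       in order or out of order. */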

Show me one documented compiler switch in GCC or whatever that specifically optimizes for OoOE and I'll rest my case. I'm pretty sure no such thing exists.

And while we're at it, modern C/C++ compilers are frighteningly mature. There are no innovations on the horizon that promise more than maybe a couple percent more performance on any significant codebase. There haven't been for over five years.

And just so we understand each other, SIMD units, sure, they change more frequently, and need new compiler support or dedicated assembly fiddling. Which has nothing to do with OOOE either.
 
So how do you explain the Wii U version of Mass Effect 3 running at equal graphical quality to the Xbox 360 and PS3, yet maintaining a higher frame rate than the PS3?

If the Wii U was inferior technically, that wouldn't be the case.

The PS3 has the weakest GPU of all 3 by far. Games HAVE to make heavy yet efficient use of the CELL to keep up with the 360. Comparatively, the PS3 GPU is archaic.
 

MDX

Member
The PS3 has the weakest GPU of all 3 by far. Games HAVE to make heavy yet efficient use of the CELL to keep up with the 360. Comparatively, the PS3 GPU is archaic.


But that was the whole point wasn't it?
The CELL being used primarily to develop games on.
So why would the PS3 need a strong GPU?
 

pottuvoi

Banned
But that was the whole point wasn't it?
The CELL being used primarily to develop games on.
So why would the PS3 need a strong GPU?
If the PS3 had had a GeForce 8xxx-series GPU, the games would have looked a lot better and more of the Cell could have been used for other things that affect the actual gameplay.
As it is, a lot of it is used to prepare data so that the GPU can do its job properly.
 

Mithos

Member
But it's true. You don't optimize for OoOE.

But you do optimize when taking code running on a triple-core 3.2 GHz in-order-execution CPU and moving it to a triple-core 1.5 GHz (?) out-of-order-execution CPU, right?
Am I wrong to think you're going to run into issues unless you do?
 
But you do optimize when taking code running on a triple-core 3.2 GHz in-order-execution CPU and moving it to a triple-core 1.5 GHz (?) out-of-order-execution CPU, right?
Am I wrong to think you're going to run into issues unless you do?

Nope, you are right. It looks like if the graphics engine is CPU-heavy on PS3 and X360, some stuff has to be moved to the GPU side when doing a Wii U port. Looking at the launch titles, that may require some actual work...
 

Durante

Member
I think a big problem that people have with evaluating specs and technical information is that they don't take into account 2 things:
(1) the relative performance of the individual components of the system. With the complaints about the Wii U CPU this issue is getting less common though.
(2) the relative load individual games place on the various components, sometimes even switching between radically different load profiles during a frame.

That's why you can't just say that one system is "better" than another based on the results of a single game, or even a group of games.

At best, you can take the results from a large number of games, speculate as to (or obtain information about) their load profiles, look at their performance in different scenarios and then use this as the basis for estimations about individual hardware components.

I realize that's a lot to ask for from the average forum-goer.

But it's true. You don't optimize for OoOE.

There's a ton of things compilers can do to remove redundancies from generated code. Constant folding and propagation, branch elimination, small-body inlining, aligning code and data, value lifetime analysis, register allocation, yadda yadda yadda. Yes, you want to enable them. No, none of them are specific to OOOE or not OOOE. They are universally beneficial, and don't even depend on the underlying instruction set at all.

It's only for in-order archs that you have to go further and manually move instructions around to maximize issue rates. This is not necessary for OOOE archs. They do it themselves. You just run your compiler with -O2 and things will be fine.

Show me one documented compiler switch in GCC or whatever that specifically optimizes for OoOE and I'll rest my case. I'm pretty sure no such thing exists.

And while we're at it, modern C/C++ compilers are frighteningly mature. There are no innovations on the horizon that promise more than maybe a couple percent more performance on any significant codebase. There haven't been for over five years.

And just so we understand each other, SIMD units, sure, they change more frequently, and need new compiler support or dedicated assembly fiddling. Which has nothing to do with OOOE either.
Good post. Though I would argue that there are architecture-specific optimizations for different OoOE architectures as well (that's why gcc has the mtune switch) -- but you are right that their effect is usually minuscule.
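A minimal sketch of what that switch does, assuming GCC's PowerPC backend; the CPU names are just examples:

    /* The same source, built with different tuning targets: */
    float dot3(const float *a, const float *b)
    {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    /*     gcc -O2 -mcpu=750 -c dot.c             (schedule for a 750-class core)
           gcc -O2 -mcpu=750 -mtune=970 -c dot.c  (same instruction set, 970-style scheduling)
       -mtune only changes scheduling and selection heuristics, not which
       instructions may be used, which is why the measured difference is
       usually tiny. */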
 

User Tron

Member
But it's true. You don't optimize for OoOE.

..

Show me one documented compiler switch in GCC or whatever that specifically optimizes for OoOE and I'll rest my case. I'm pretty sure no such thing exists.

Yes, you can optimize code for OoOE (e.g. loop unrolling, helping branch prediction, etc.), which helps the compiler generate better code. But the real question is whether this kind of optimization is worth the effort. Only in rare cases do you get significant improvements greater than a few percent. I don't know about game developers, but programmers of normal apps don't do this kind of stuff because it is not worth it. So while you can optimize for OoOE, it is definitely not the magic fairy dust some here try to make it out to be. GPGPU, on the other hand, might be :)
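A minimal sketch of those two source-level tricks, assuming GCC (the __builtin_expect hint is a GCC extension; the function itself is made up):

    #define LIKELY(x)   __builtin_expect(!!(x), 1)
    #define UNLIKELY(x) __builtin_expect(!!(x), 0)

    float sum_floats(const float *v, int n)
    {
        float s0 = 0.0f, s1 = 0.0f, s2 = 0.0f, s3 = 0.0f;
        int i = 0;

        /* Branch hint: tell the compiler the bad-input case is rare. */
        if (UNLIKELY(v == NULL || n <= 0))
            return 0.0f;

        /* Manual 4x unroll with independent accumulators: more work in
           flight per iteration, fewer loop branches. */
        for (; i + 4 <= n; i += 4) {
            s0 += v[i];
            s1 += v[i + 1];
            s2 += v[i + 2];
            s3 += v[i + 3];
        }
        for (; i < n; i++)  /* leftover elements */
            s0 += v[i];

        return (s0 + s1) + (s2 + s3);
    }

On an OoOE core this sort of thing is usually exactly the few-percent territory described above; on an in-order core the same changes tend to matter much more.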
 

wsippel

Banned
But it's true. You don't optimize for OoOE.
True, you don't optimize for OoOE. You can and should still optimize code for certain chips/architectures, and that holds true for the Wii U as well. We don't even know if GHS 5 can auto-vectorize code for paired singles, for example. GCC supports it, at least for the 750CL.
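For illustration, the kind of loop an auto-vectorizer could map onto paired singles (two floats per instruction); the GCC flags in the comment are from memory of its PowerPC options and not verified against any Wii U toolchain, and the function is made up:

    /* Interleaved stereo mix: left/right pairs are a natural fit for
       paired-single math. */
    void mix_stereo(float *out, const float *a, const float *b,
                    int frames, float gain)
    {
        for (int i = 0; i < frames; i++) {
            out[2 * i]     = a[2 * i]     + gain * b[2 * i];     /* left  */
            out[2 * i + 1] = a[2 * i + 1] + gain * b[2 * i + 1]; /* right */
        }
    }

    /* With GCC this would be something along the lines of:
           gcc -O2 -ftree-vectorize -mpaired mix.c
       Whether GHS 5 can do the equivalent automatically is exactly the
       open question. */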
 

z0m3le

Banned
We got the exact die sizes of the components, and some power measurements. That's all so far.

The power measurements are interesting; combined with the GPU die size, you are most likely looking at a GPU as large as 140mm^2, or as small as 120mm^2.

Considering the chip is custom, the only thing we can gather is that it isn't likely to have fewer than 400 shaders, and at 137mm^2 it fits 640 shaders @ 40nm.

Thanks to the power measurements, though, we don't have to worry about the shaders; we can focus on the GPU's efficiency. R700's desktop part @ 40nm is 12 GFLOPs per watt, but that is with a GDDR5 controller, a bunch of extra stuff and 1GB of RAM, which the Wii U doesn't use. So that number would grow, and if it's using a mobile R700 part, it's more likely somewhere between 500 GFLOPs on the low end and as much as 768 GFLOPs on the high end (just based on what R700 can do with 25-30 watts on its mobile 40nm parts).
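The arithmetic behind that range is just an assumed GPU power budget times an assumed GFLOPs-per-watt figure; a tiny sketch, where the efficiency numbers are simply reverse-engineered to reproduce the 500-768 range above (they are not confirmed specs):

    #include <stdio.h>

    int main(void)
    {
        /* Assumed GPU share of the measured power draw (watts). */
        const double gpu_watts_low  = 25.0;
        const double gpu_watts_high = 30.0;

        /* Assumed mobile-RV7xx-class efficiency (GFLOPs per watt),
           chosen so the endpoints match the post's range. */
        const double gflops_per_watt_low  = 20.0;
        const double gflops_per_watt_high = 25.6;

        printf("low estimate:  %.0f GFLOPs\n", gpu_watts_low  * gflops_per_watt_low);   /* 500 */
        printf("high estimate: %.0f GFLOPs\n", gpu_watts_high * gflops_per_watt_high);  /* 768 */
        return 0;
    }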
 

Alexios

Cores, shaders and BIOS oh my!
I dunno why you'd continue to be this optimistic about the specs every time you get some room to speculate.
 

z0m3le

Banned
I dunno why you'd continue to be this optimistic about the specs every time you get some room to speculate.

I'm an optimistic person? I don't think it matters either way; it's not like it's really going to affect 3rd parties if the GPU is 300 GFLOPs (nearly impossible) or 800 GFLOPs.

Besides, what else am I supposed to do in a speculation thread? Pull numbers out of the air? We have been told that it's a custom part that doesn't relate to R700s already produced, and that the clocks are not multiplied, so almost all the work that has been done over the last year has been for nothing, really.
 
I'm an optimistic person? I don't think it matters either way; it's not like it's really going to affect 3rd parties if the GPU is 300 GFLOPs (nearly impossible) or 800 GFLOPs.

Besides, what else am I supposed to do in a speculation thread? Pull numbers out of the air? We have been told that it's a custom part that doesn't relate to R700s already produced, and that the clocks are not multiplied, so almost all the work that has been done over the last year has been for nothing, really.

uh... why wouldn't a beefier GPU affect 3rd parties?
 

z0m3le

Banned
They'd have to develop on the Wii U first for it to affect them.

This, mostly. Also, it's not like you couldn't produce your game with engines designed to scale down to mobile, so 300 GFLOPs wouldn't even stop devs, as last-gen consoles have shown.

Heck, when you can get Doom 3 to run on a Voodoo 2 (that is a game that came out in 2004 running on a GPU from the 90s), you have to assume developers who aren't making games for a console mainly believe the audience isn't there. Wii, however, was a different case: its architecture didn't have programmable shaders, and without those, Wii could not be ported to. The good news is Wii U will get more ports than Wii; that is nearly a guarantee.
 
They'd have to develop on the Wii U first for it to affect them.

Not when it comes to image quality (resolution, AA, AF, etc.). With enough raw power, this comes for free without extra development time (look at PCs). The low memory bandwidth can get in the way though.

Wii, however, was a different case: its architecture didn't have programmable shaders, and without those, Wii could not be ported to.

It could be ported to, CoD maybe being the most prominent example. It's just more difficult.
And as you said, Doom 3 shows something similar when it could be run on a Voodoo 2.
 

z0m3le

Banned
Not when it comes to image quality (resolution, AA, AF, etc.). With enough raw power, this comes for free without extra development time (look at PCs). The low memory bandwidth can get in the way though.



It could be ported to, CoD maybe being the most prominent example. It's just more difficult.
And as you said, Doom 3 shows something similar when it could be run on a Voodoo 2.
The CoD games were up-ports from their PS2 engines, not down-ports from 360.
 

Alexios

Cores, shaders and BIOS oh my!
The CoD games were up-ports from their PS2 engines, not down-ports from 360.
There was no MW/WaW/Blops/MW3 on PS2. Maybe they mentioned the engine is of that lineage, but the only PS2 mention I remember is for CoD 3, which was a different game on PS2/Wii than on PS360, so I'd need to see a source for that.

Not when it comes to image quality (resolution, AA, AF, etc.).
Nobody was talking about that alone. If we get PS2 graphics with good IQ, that's not exactly going to mean PS4/Nextbox ports are viable. We need effect, geometry and texture complexity capabilities on top of IQ. If anything, IQ would be one of the things that suffer when porting games from more powerful hardware...
 

z0m3le

Banned
So you think they took the work they did with PS2 and Wii up to the 5th game and threw it away so they could down-port from the 360, with the same level of quality as the PS2/Wii titles? How does that even make sense? They already had a working Wii engine that they used for 3 prior games, so let's throw that away and try to down-port from the 360 with all the hurdles of fixed-function shaders.
 

Alexios

Cores, shaders and BIOS oh my!
So you think they took the work they did with PS2 and Wii up to the 5th game and threw it away so they could down-port from the 360, with the same level of quality as the PS2/Wii titles? How does that even make sense? They already had a working Wii engine that they used for 3 prior games, so let's throw that away and try to down-port from the 360 with all the hurdles of fixed-function shaders.
But... I didn't ask for what you think/assume which was already obvious...

And yes, I think it could perhaps be easier (still, not easy) to down-port newer games from scratch than to retrofit an engine that never ran those games with all the functions needed to reproduce them without remaking everything (not asset-wise) around its quirks, which could be tougher with little staff and resources. Dead Rising did the latter. As did CoD 3. Neither approached the games they were supposed to port. The later CoD ports, by contrast, were as close to 1:1 as you could get on Wii (for reasons like you stated). Shader effects were mostly missing completely rather than using TEV to reproduce them as closely as possible (not very close) anyway.

But I didn't assume it's true, I just asked for a source indicating otherwise, to inform myself, and got pretty much nothing in return. Sadface.
 

z0m3le

Banned
But... I didn't ask for what you think/assume which was already obvious...

And yes, I think it could perhaps be easier to down-port new games from scratch than to retrofit an engine that never ran those games with all the functions needed to reproduce them without remaking everything (not asset-wise) around its quirks, which could be tough with that little staff and resources. Dead Rising did the latter. As did CoD 3. Neither approached the games they were supposed to port. The later CoD ports, by contrast, were as close to 1:1 as you could get on Wii (for reasons like you stated). Shader effects were mostly missing completely rather than using TEV to reproduce them as closely as possible (not very close, likely) anyway.

But I didn't assume it's true, I just asked for a source indicating otherwise, to inform myself, and got pretty much nothing in return. Sadface.

Sorry, I was on my phone. If I find a source for it, I'll give you one. Having said that, you were the one who stated it as fact, and I corrected you from my phone.
 
The power measurements are interesting; combined with the GPU die size, you are most likely looking at a GPU as large as 140mm^2, or as small as 120mm^2.

Considering the chip is custom, the only thing we can gather is that it isn't likely to have fewer than 400 shaders, and at 137mm^2 it fits 640 shaders @ 40nm.

Thanks to the power measurements, though, we don't have to worry about the shaders; we can focus on the GPU's efficiency. R700's desktop part @ 40nm is 12 GFLOPs per watt, but that is with a GDDR5 controller, a bunch of extra stuff and 1GB of RAM, which the Wii U doesn't use. So that number would grow, and if it's using a mobile R700 part, it's more likely somewhere between 500 GFLOPs on the low end and as much as 768 GFLOPs on the high end (just based on what R700 can do with 25-30 watts on its mobile 40nm parts).

I like your enthusiasm Z0m3le :).

If the GPU gets anywhere close to 700 GFLOPs then first party games will look absolutely phenomenal and next gen third party ports should be possible (esp if Sony, MS or both are going with a much more GPU centric console for PS4 / 720).

I'm going to stick with a high-end estimate of 500 though, still really good imo and 2x Xenos on paper, which should reach 3x or 4x with its 2011 architecture and effects.
 

Alexios

Cores, shaders and BIOS oh my!
Sorry, I was on my phone. If I find a source for it, I'll give you one, having said that. You were the one who stated it as fact and I corrected you from my phone.
Huh? What did I state as fact? Only that I need to see a source for your claim, because the only Wii-to-PS2 source I remember was for CoD 3, a different game to the other versions. I didn't lie about what I remember, no, and I acknowledged there may be things I haven't seen... Maybe you're confusing me with the other guy. But I think porting the CoDs in a different engine to put them on Wii would qualify as what he said, i.e. that porting them was hard, anyway. As long as the game is roughly the same, unlike 3 and DR, of course.
 

z0m3le

Banned
Huh? What did I state as fact? Only that I need to see a source for your claim, because the only Wii-to-PS2 source I remember was for CoD 3, a different game to the other versions. I didn't lie about what I remember, no, and I acknowledged there may be things I haven't seen... Maybe you're confusing me with the other guy. But I think porting the CoDs in a different engine to put them on Wii would qualify as what he said, i.e. that porting them was hard, anyway. As long as the game is roughly the same, unlike 3 and DR, of course.

You are right, I did mistake you for the other guy; I was correcting his post. Wii never got the later Call of Duty games until Modern Warfare 3. I thought that one was made with the PS2 engine they had used for World at War, but considering performance was much worse, I think I am possibly wrong.

I do think that if they did that, they are a bit insane, as they threw away a ton of work they already had. I couldn't find a source that says it's running on the PS2 engine, though there were a few reviews that said it was slightly upgraded from that engine; they could just be talking about the IW engine itself and not the version they used for Wii in the earlier titles.
 