
Rumor: Wii U final specs

Donnie

Member
Where did this come from? I haven't seen one statement that backs this up. Nowhere has Havok said they put this on the CPU because it's needed to port PS360 games.

You come here saying "well, the reason it's on the CPU was xxx", but you base that on nothing released by Havok.

Dear god, I didn't say anything of the sort...

I said that just because, last we heard, Havok was only running on the CPU, that doesn't mean you can assume they won't also put in a hardware path for the GPU, or that the GPU isn't capable. Do you understand me?

Honestly, it's amazing how difficult you are to talk to. I mean, I've explained this 3 or 4 times now and you're still not getting it.
 

USC-fan

Banned
Is there a compute shader version of Havok yet? I honestly don't know the answer to this, but I do know for a fact that Havok has been running on PowerPC processors for quite a while in software.

There is probably hardly any work required to port Havok to the Wii U's CPU, if there is any work at all. Porting Havok to use compute shaders is going to be a lot of work. It hasn't even happened on PCs yet, to my knowledge.

My graphics card could eat Havok alive, yet all those game developers choose to run it on my processor. It must be because my graphics card is crap, right?

Or, you know, the CPU runs it without any trouble, so why completely rewrite the thing to run using compute shaders?

Yeah, it's been out. They've been talking about this since 2006.

http://gpgpu.org/2006/03/17/havok-and-nvidia-present-havok-fx-at-gdc-2006

Here it is running on an AMD GPU in 2009:
http://www.youtube.com/watch?v=MCaGb40Bz58


Do you see the problem with this? You're basing this on a 55nm part. We've known for a while now that it was underclocked. We have an idea of what it was downclocked to. Why do you think I've been talking about 600 GFLOPS for the final? No one said it would be 1TF if they just used a smaller process. You keep harping on the R700 and treating it like none of its limitations will be addressed. Like I said back in that post, you treat things in absolutes. And it's hypocritical to sit there and say "people just make things up to fit whatever their point is..." when you've been one of the worst, if not the worst, at doing this.

I put the wrong part. The 12 GFLOPS per watt is based on the 40nm version of the R700, the Radeon HD 4770. The 55nm part is 8.15 GFLOPS per watt. The math is correct. I fixed the post.

I base the R700 on everything we know: the leaked specs. They fit the R700 core and nothing else made by AMD. Now you can come up with whatever crazy idea to fit the specs you make up, but again, that is not based on info that has leaked. You can say 600 GFLOPS, but I don't see the power there to get anywhere close to that. The only way you've been getting close to that number is by jumping generations of AMD GPU cores.
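For anyone following the numbers in this back-and-forth: theoretical GFLOPS for these AMD parts is just ALU count × 2 FLOPs per cycle × clock. A minimal sketch (the RV770 figures are the public specs; the underclocked case is purely hypothetical, not a leaked Wii U clock):

```python
# Theoretical GFLOPS for an AMD VLIW-era GPU: each ALU can issue one
# multiply-add (2 FLOPs) per cycle.

def theoretical_gflops(shader_alus: int, clock_mhz: float) -> float:
    return shader_alus * 2 * clock_mhz / 1000.0

# Stock RV770 (HD 4870): 800 ALUs at 750 MHz
print(theoretical_gflops(800, 750))  # 1200.0
# The same 800 ALUs at a hypothetical 375 MHz underclock
print(theoretical_gflops(800, 375))  # 600.0
```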
 

Kenka

Member
The problem with PSUs being used close to their max ratings is that, as their efficiency drops, the amount of heat generated as a byproduct increases. Heat shortens the lifespan of electronic/electrical parts, so you can safely claim that a PSU kept close to its max rating will have a shorter lifespan than one kept at 50% of its max rating. But how long a PSU can last at its max rating depends entirely on a bunch of factors: what currents? What voltages? What components? Theoretically, nothing stops me from manufacturing a PSU that is physically capable of 100W but artificially limited to 50W (by capping its max current). Hey look, I just made an ultra-durable PSU that can be pushed to its max for as long as I please!
I get your point, but that's a bit...

 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Let me correct you: Iwata said that the WiiU can draw up to 75 W. A PSU rated at 75 W must be 100% efficient at peak power usage, and this is not possible.
The WiiU is not meant to run at 75W for prolonged periods of time. It can most likely hit that at bootup, with USB bus-powered devices attached, with radios emitting, etc. After bootup, though, the draw will drop from 75W, even if all the original loads (USB, etc.) are still there. And once you remove the 'non-typical' loads (USB, for instance), its typical draw drops to the 40s. A 75W-rated PSU is perfectly fine for the purpose - it will last you long enough.
 

pottuvoi

Banned
So with 57W, what kind of horsepower can we expect? Are there any comparable chips, or is it hard to compare because these chips are customized?
If we just take something like the high-end AMD Radeon HD 7970, which at maximum uses something like ~200W for its 3.8 TFLOPS, that works out to 19 GFLOPS/watt, which is very good.
57 watts would then make 1083 GFLOPS.

So 500-700 GFlops would be my guess for the GPU.
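For reference, the arithmetic behind that ceiling (a minimal sketch; the ~200W and 3.8 TFLOPS figures are the assumptions stated above, and the 57W budget comes from the PSU discussion earlier in the thread):

```python
# Back-of-envelope GFLOPS-per-watt scaling, using the figures quoted above.
hd7970_gflops = 3800.0
hd7970_watts = 200.0

gflops_per_watt = hd7970_gflops / hd7970_watts  # 19.0
budget_watts = 57.0

print(gflops_per_watt * budget_watts)  # 1083.0 -- the theoretical ceiling
```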
 

Kenka

Member
The WiiU is not meant to run at 75W for prolonged periods of time. It can most likely hit that at bootup, with USB bus-powered devices attached, with radios emitting, etc. After bootup, though, the draw will drop from 75W, even if all the original loads (USB, etc.) are still there. And once you remove the 'non-typical' loads (USB, for instance), its typical draw drops to the 40s. A 75W-rated PSU is perfectly fine for the purpose.
OK, then. I didn't know that. Alright, that makes your PSU a valid one.
 
Wow, this is a USB cable that goes in? The box looks tiny already. Form factor and power were a non-issue back then; the only problem remaining was/is manufacturing costs. But I wonder, why is Nintendo so anal about form factor? Why do they want to make their home consoles so small? It impacts the entire design.

They want the Wii U to be in the living room. Small, minimalistic devices are less likely to be vetoed by someone in the family for a spot in the living room.
 

Kenka

Member
57 watts would then make 1083 GFLOPS.

So 500-700 GFlops would be my guess for the GPU.
Iwata said that the WiiU draws 40 W in general, GPU draw naturally included. 57 is more than that. I'd love to hear someone tell me that the WiiU's GPGPU has an efficiency of 30 GFLOPS/W :p
 

THE:MILKMAN

Member
Let me correct you: Iwata said that the WiiU can draw up to 75 W. A PSU rated at 75 W must be 100% efficient at peak power usage, and this is not possible.

Thus, the PSU must be rated higher in any case. But let's say that for the Wii U, 60 W is the normal power draw. In agreement with your argument, the PSU should then be rated at 120 W (60 W being 50% of 120 W).

Well, until proven otherwise, or until someone tells me Iwata is an electrician, I think I'll stick to the theory that the 75W figure is the PSU rating.

I earlier provided a link to a Sony spec sheet that states the PS3 consumes 190W, and that is not true, is it? I don't think the CEOs/PR understand this stuff any more than we do, TBH.
 
Well, until proven otherwise, or until someone tells me Iwata is an electrician, I think I'll stick to the theory that the 75W figure is the PSU rating.

I earlier provided a link to a Sony spec sheet that states the PS3 consumes 190W, and that is not true, is it? I don't think the CEOs/PR understand this stuff any more than we do, TBH.

Normally I would agree, but Iwata was a developer before becoming President. Although I do think his comment was a translation error/difference, so it's hard to say what he meant exactly.
 
I base the R700 on everything we know: the leaked specs. They fit the R700 core and nothing else made by AMD. Now you can come up with whatever crazy idea to fit the specs you make up, but again, that is not based on info that has leaked. You can say 600 GFLOPS, but I don't see the power there to get anywhere close to that. The only way you've been getting close to that number is by jumping generations of AMD GPU cores.


Or designing your own CUSTOM GPU based on an original design. I thought the official statement was that it's a custom-designed core by AMD. How custom? Well, they have been working on it since 2008 and finished it this year. Why don't they just die-shrink the 4770, underclock it, and call it a day? Maybe they can do better than that. The keyword here is CUSTOM.
 

Donnie

Member
Iwata said that the WiiU draws 40 W in general, GPU draw naturally included. 57 is more than that. I'd love to hear someone tell me that the WiiU's GPGPU has an efficiency of 30 GFLOPS/W :p

He said it draws 40W on average; however, that doesn't equal peak load, and peak load is what we're referencing when talking about GPU power usage numbers. I was merely looking at the maximum I thought possible from a 75W PSU based on other consoles, and came up with an absolute max of 57W for the internals (excluding USB). Heavy was then asking what might be possible with that theoretical power consumption.
 

The_Lump

Banned
So, reading what Iwata said in that quote, why is everyone talking about a 75W PSU (aside from the pic/witness report from a while back)?

He seems to be clearly stating that the Wii U will draw 75W max - under load, with all USB slots in use - but that 40W would be average, depending on the game being played and the accessories used.

So, unless I'm misreading, we're not talking about it running off a 75W PSU, but rather a larger PSU and drawing up to 75W max (with that being in the 60-ish% range "safe" draw people have been talking about).

Or am I completely misreading?

EDIT:

So, if 75W is the max draw from the Wii U, the PSU would be rated at something more like 110W, surely? That gives you a safety margin and supplies the machine with its max draw under load and with all USB ports powered.

Yep. If that's the console drawing that power, not the PSU rating, then the PSU will be around 110W (assuming the console's max draw sits at roughly the 60-70% "safe" loading mentioned above).

However, 60% is a very low estimate; over 80% is entirely possible (I couldn't comment on whether it's plausible, but it's easily possible). And given that Nintendo are pushing efficiency as a selling point, it wouldn't be surprising if it were more efficient than that.

For comparison, PC PSUs have generally been at least 80% efficient for some time now. Not sure how that translates to consoles, but they should really be wasting less than 40% of the energy in this day and age.
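To make the headroom math above explicit (a sketch; the ~68% load factor is an assumption chosen to match the "60-ish% safe draw" figure people keep citing, not an official number):

```python
# PSU sizing from a maximum draw plus headroom. Note this is about how
# heavily the PSU is loaded, which is a separate question from its
# conversion efficiency.
max_draw_watts = 75.0
target_load_fraction = 0.68  # assumed "safe" loading, not a spec

print(round(max_draw_watts / target_load_fraction))  # ~110 W rating
```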
 

Kenka

Member
He said it draws 40W on average; that does not equal peak load, and peak load is what we're referencing when talking about GPU power usage numbers.
I don't know. It doesn't matter to the GPU whether you play Mario Kart online with four dudes on the couch or alone offline. To me, both scenarios imply the same amount of power drawn by the GPU. If I am wrong, please correct me, thanks.
 
I put the wrong part. The 12 GFLOPS per watt is based on the 40nm version of the R700, the Radeon HD 4770. The 55nm part is 8.15 GFLOPS per watt. The math is correct. I fixed the post.

Cool, but that's not what was in the dev kit, so the R700 40nm information isn't necessary. You'll see what I'm saying shortly.

I base the R700 on everything we know: the leaked specs. They fit the R700 core and nothing else made by AMD. Now you can come up with whatever crazy idea to fit the specs you make up, but again, that is not based on info that has leaked. You can say 600 GFLOPS, but I don't see the power there to get anywhere close to that. The only way you've been getting close to that number is by jumping generations of AMD GPU cores.

picardfacepalm.jpg

That's the point that I've been trying to get across to you. It's not a stock R700. I don't know how many times I or anyone else have to keep saying that. It's a customized GPU. It was completed in late Dec./early Jan. You can't keep harping on the limitations the R700 has if it's not a stock R700. It comes off like you can't accept console customization and think everything is locked to how the PC counterpart would be.
 

Earendil

Member
Cool, but that's not what was in the dev kit, so the R700 40nm information isn't necessary. You'll see what I'm saying shortly.



picardfacepalm.jpg

That's the point that I've been trying to get across to you. It's not a stock R700. I don't know how many times I or anyone else have to keep saying that. It's a customized GPU. It was completed in late Dec./early Jan. You can't keep harping on the limitations the R700 has if it's not a stock R700. It comes off like you can't accept console customization and think everything is locked to how the PC counterpart would be.

Here, why don't you yell at this for a while. It will make you feel better.

 

beril

Member
Don't confuse working "as long" with working "as much". The majority of AMD's GPU engineers are quite assuredly engaged in either advancing core technologies or maintaining existing product lines. They didn't dedicate a couple hundred guys for three years just to see how far DX10.1 can be taken 'cause Nintendo asked. No, they spun off some people to adapt an existing architecture to a client's needs, not rebuild a GPU from the ground up.

Isn't Hollywood pretty much their most-sold GPU of all time? I think AMD will spend whatever resources are necessary to secure a potential multi-billion-dollar contract. Obviously it wouldn't have made much sense to start out with an R700 if they were going to change it completely, but there's just no way to know the extent of the modifications yet.
 

ozfunghi

Member
Like I said in the last WUST, don't reply to the trolls.

Anyway, for now I'll stick to my guesstimate (460 GFLOPS); at least that way I can't be too disappointed. If it turns out to be more powerful, even better.

On the other hand, the RV770 was a 1TF chip; if that was their target, even when downclocked... they could easily have been targeting 600+ GFLOPS.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her

big_erk

Member
For the one thousandth time: a PSU is rated by its capability to provide power, i.e. a fixed voltage at up to a certain current (normally, unless it's multiple voltages and multiple currents, but that's not the case here). That's what its 'power' is (in watts). How efficient it is, i.e. how much it draws from the grid while providing its output, can only be measured by putting a watt-meter between the PSU and the grid socket. But if a PSU is rated at 75W, that's what it can deliver (unless it's a scam). Yes, normally a PSU's efficiency will drop as it approaches its max rated output, but that's an entirely different subject from 'how many watts a PSU can deliver'.

GAF can be a really funny place sometimes.
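A quick illustration of the distinction being made above (the numbers are assumptions for the example, not Wii U specs): the rating is what the PSU can deliver; efficiency only determines what it pulls from the wall.

```python
# A PSU's rating describes its output capability. Efficiency relates
# that output to what is drawn from the grid.
rated_output_watts = 75.0
efficiency = 0.85  # assumed conversion efficiency, purely illustrative

wall_draw = rated_output_watts / efficiency
print(round(wall_draw, 1))  # ~88.2 W from the wall at full rated output
```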


I don't remember seeing a pic, but we got an eyewitness testimony, and it said 75W.

Agreed
 

nordique

Member
I forgot to respond to this, but if anything this confirms what we knew a year ago: that an underclocked RV770 was the placeholder GPU. And considering that GPU is 55nm, I don't get how some can continually harp on the performance deficiencies of that line when we know Nintendo is not going to use a stock 55nm GPU. And they aren't just going to die-shrink it and be done. With what was in that dev kit being ~576 GFLOPS on a "large for today's time" process, it seems rather likely that the final, on a smaller process in a similarly-sized retail case, can be slightly over 600 GFLOPS and fit in the power envelope given by Iwata. Some of you are seriously underestimating the power of Nintendium.

Could shrinking it down simply have been to keep that performance target but use less power?

Or will there be a power boost with die shrinkage in every case?
 

dumbo

Member
Well until proven otherwise or someone tell's me Iwata is a electrician then I think I'll stick to the theory that the 75W figure is the PSU rating.

My understanding is that you would both be correct.

Iwata's engineers have likely told him that the WiiU will typically draw 40W; they've guaranteed that it will never draw over 75W, and they've tested the design with a 75W PSU. Based on that, he placed an order for 75W PSUs.

When asked by the press, he (quite reasonably) said it will draw a maximum of 75W, as his engineers have guaranteed it won't draw over that.

So, everyone would be kind of correct.
 

Earendil

Member
Could shrinking it down simply have been to keep that performance target but use less power?

Or will there be a power boost with die shrinkage in every case?

You could keep the same performance at a lower wattage, or you could increase the clock speed, giving you more performance while keeping the overall wattage the same. It's really hard to say for sure what they did, but if I had to guess, I would say they did the former.
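A rough way to picture those two options, using the standard dynamic-power relation P ∝ C·f·V² (every number below is an illustrative assumption, not a Wii U figure):

```python
# Dynamic power scales with switched capacitance, clock, and voltage squared.
def dynamic_power(cap_rel: float, freq_mhz: float, volts: float) -> float:
    return cap_rel * freq_mhz * volts ** 2

baseline = dynamic_power(1.0, 600, 1.20)     # older, larger process
same_clock = dynamic_power(0.7, 600, 1.05)   # shrink: ~46% less power
same_power = dynamic_power(0.7, 1120, 1.05)  # or spend the savings on clock

print(round(baseline), round(same_clock), round(same_power))  # 864 463 864
```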

Put enhanced Broadway skeptics on suicide watch.

As many of us have said before, saying "enhanced" doesn't tell us much without saying what was modified. I have serious doubts that it's 3 Broadways glued together and upclocked.
 
That's quite a "clarification", more like a retraction/correction, given the tweet it's responding to. o_O

But it's absolutely not surprising; it looks like a few people from earlier in the thread were right. Power-based, not POWER7, so we're back to "Enhanced Broadway".

Back to the Espresso tri-core PowerPC 476FP-based "enhanced Broadway" clocked at 2.187GHz running at only 8 watts, hehe. It is powerful because it is shaped in a Triforce configuration, making it 3x faster than Xenon.

To be honest, it is not much of a clarification.

Put enhanced Broadway skeptics on suicide watch.

You do understand that "enhanced Broadway" could mean good things, right?
 

nordique

Member
You could keep the same performance at a lower wattage, or you could increase the clockspeed giving you more power while keeping the overall wattage the same. It's really hard to say for sure what they did, but if I had to guess, I would say that they did the former.

Thanks, Earendil
 

Van Owen

Banned
As many of us have said before, saying "enhanced" doesn't tell us much without saying what was modified. I have serious doubts that it's 3 Broadways glued together and upclocked.

Given that the biggest complaint from devs seems to be the CPU, whatever it is, it doesn't sound that great.

I'm sure it was done to ensure Wii BC.
 

USC-fan

Banned
Cool, but that's not what was in the dev kit, so the R700 40nm information isn't necessary. You'll see what I'm saying shortly.



picardfacepalm.jpg

That's the point that I've been trying to get across to you. It's not a stock R700. I don't know how many times I or anyone else have to keep saying that. It's a customized GPU. It was completed in late Dec./early Jan. You can't keep harping on the limitations the R700 has if it's not a stock R700. It comes off like you can't accept console customization and think everything is locked to how the PC counterpart would be.

Do you understand what GFLOPS are and how they're rated? There seems to be a misunderstanding of what they can change to boost this number. What are you saying they are changing to boost the GFLOPS so high?
 

nordique

Member
Put enhanced Broadway skeptics on suicide watch.

From the majority of accounts by those who have worked with the Wii CPU, that would be good news.

My understanding is that Broadway was an excellent CPU, and in some ways it could perform tasks as well as the 360 CPU (even though it was not as powerful and did not have 6 threads or 3 cores).

Considering the power discrepancy between the 360 and Wii, that to me is very impressive.

I personally feel that as time goes on with Wii U development, and developers actually get used to the Wii U CPU, we will see reports coming out saying "Wii U CPU more powerful than previously believed!"

The talk of a slow CPU has been bastardized to no end on public forums, and while I don't deny it might be slow clock-wise, I don't agree with the train of thought that says this means it is "weak" or a poor CPU.

But that won't be talked about until developers actually become used to the system.

For example... IdeaMan has already given hints that framerates have been increased, potentially two-fold. That is not a minor thing, and it shows how much devs don't yet understand about the Wii U.
 

nordique

Member
Do you understand what GFLOPS are and how they're rated? There seems to be a misunderstanding of what they can change to boost this number. What are you saying they are changing to boost the GFLOPS so high?

Do you? GFLOPS aren't everything, and a newer GPU could be a "better" GPU, or even more powerful in many respects, even if its GFLOPS count is lower or even half that of an older GPU.

Consider that what bg is suggesting is possible.
 

Earendil

Member
Given that the biggest complaint from devs seems to be the CPU, whatever it is, it doesn't sound that great.

I'm sure it was done to ensure Wii BC.

Sure, running unoptimized X360 code on a CPU that was not meant for that style of code. It's probably not nearly as bad as you would like to believe.
 

USC-fan

Banned
Do you? GFLOPS aren't everything, and a newer GPU could be a "better" GPU, or even more powerful in many respects, even if its GFLOPS count is lower or even half that of an older GPU.

Consider that what bg is suggesting is possible.
I have said that myself many, many times on here. But people keep saying 600 GFLOPS when there just isn't the power to get there, and then saying, well, they changed it to reach that; I would like to know what he thinks they are changing.
 
Do you? GFLOPS aren't everything, and a newer GPU could be a "better" GPU, or even more powerful in many respects, even if its GFLOPS count is lower or even half that of an older GPU.

Consider that what bg is suggesting is possible.

Not half, but:


June 25, 2008: HD 4870 (RV770 XT), 1200 GFLOPS, 150 watts
February 15, 2012: HD 7750 (Cape Verde Pro), 819.2 GFLOPS, 55 watts

Using current drivers, the HD 7750 is faster than the HD 4870.
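The efficiency gap between those two cards, worked out (figures as cited above):

```python
# GFLOPS per watt for the two cards named above.
hd4870 = 1200.0 / 150.0  # June 2008 -> 8.0
hd7750 = 819.2 / 55.0    # Feb 2012  -> ~14.9

print(hd4870, round(hd7750, 1))  # 8.0 14.9
```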
 

IdeaMan

My source is my ass!
Knock knock knock, I've heard it's here we talk about fromage?
...
...
...
...
Oh ok, wattage !

badum tsss :trollface:
 

The_Lump

Banned
From the majority of accounts by those who have worked with the Wii CPU, that would be good news.

My understanding is that Broadway was an excellent CPU, and in some ways it could perform tasks as well as the 360 CPU (even though it was not as powerful and did not have 6 threads or 3 cores).

Considering the power discrepancy between the 360 and Wii, that to me is very impressive.

I personally feel that as time goes on with Wii U development, and developers actually get used to the Wii U CPU, we will see reports coming out saying "Wii U CPU more powerful than previously believed!"

The talk of a slow CPU has been bastardized to no end on public forums, and while I don't deny it might be slow clock-wise, I don't agree with the train of thought that says this means it is "weak" or a poor CPU.

But that won't be talked about until developers actually become used to the system.

For example... IdeaMan has already given hints that framerates have been increased, potentially two-fold. That is not a minor thing, and it shows how much devs don't yet understand about the Wii U.


Yeah, and sensible people have said all along that a 3-core, upclocked Broadway would be no bad thing at all.

I still don't think that'll turn out to be a very accurate description, though. There's only so far they could push Broadway while still referring to it as such. You can't just glue 3 cores together, push the clocks way up (didn't Broadway peak at 1GHz?), shrink the die, and embed some DRAM without altering a few other things.
 

stupidvillager

Neo Member
posted?
https://twitter.com/IBMWatson/status/248820933618442240

"@Strider_BZ @BoostFire @Xbicio WiiU chip clarification: It's a "Power-based microprocessor" http://ibm.co/UhGspo "

That's still a vague answer, and it's not new; that was in the original press release. Saying it's Power-based is all-encompassing: Power architecture covers everything. They didn't say it's a PowerPC-based microprocessor. While I don't really believe it's POWER7, I also don't believe it's a 3-core enhanced Broadway.
 
That's still a vague answer. Saying it's Power-based is all-encompassing: Power architecture covers everything. They didn't say it's a PowerPC-based microprocessor. While I don't really believe it's POWER7, I also don't believe it's a 3-core enhanced Broadway.

Believe in the moniker, but not for what you think it means. You think it means three slightly overclocked Wiis duct-taped together.
 
Can't we infer more from the performance of the games being initially ported to or developed for the system than from whether it is an overclocked Wii CPU or a PowerPC?
 

tkscz

Member
Can't we infer more from the performance of the games being initially ported to or developed for the system than from whether it is an overclocked Wii CPU or a PowerPC?

Not really, not from ports or games made at the moment. Time is needed to make something good and judge its performance.
 