
Rumor: Wii U final specs

Van Owen

Banned
So I've been reading some B3D stuff...

I'm presumably LTTP on this so it may have already been discussed, but regarding the GPU: isn't the mooted ~600 GFLOP number now impossible, given the known 45W average power draw?


It's less than 45W. 45W is the maximum when all USB ports are in use.
 

USC-fan

Banned
So I've been reading some B3D stuff...

I'm presumably LTTP on this so it may have already been discussed, but regarding the GPU: isn't the mooted ~600 GFLOP number now impossible, given the known 45W average power draw?

EDIT: Actually different sites seem to have two different numbers for the "typical" power usage: 40W and 45W.

I thought 75W was the max power rating?

The 75W is what the PSU is rated for. Most PSUs are run at 60% of their max rating.

The Wii U uses 45W.

Like I have said many times, given these ratings it's impossible to have a 600 GFLOP GPU.
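The 60% rule of thumb can be sanity-checked with simple arithmetic. Note that the 60% sustained-load fraction is the poster's assumption, not an official Nintendo figure:

```python
# Rule-of-thumb check: typical draw as a fraction of the PSU's max rating.
# The 0.6 fraction is the poster's assumption, not an official spec.
PSU_MAX_W = 75          # Wii U power supply rating
TYPICAL_FRACTION = 0.6  # assumed sustained-load fraction

typical_draw_w = PSU_MAX_W * TYPICAL_FRACTION
print(typical_draw_w)   # 45.0, matching the quoted 45W figure
```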
 

Sheroking

Member
The 75W is what the PSU is rated for. Most PSUs are run at 60% of their max rating.

The Wii U uses 45W.

Like I have said many times, given these ratings it's impossible to have a 600 GFLOP GPU.

Really, what's the math on that?

I'm not an expert by any metric, but the 600GFLOP number comes from the E6760 @ 35W, does it not? Could it not be die-shrunk from 40nm to get the consumption lower?
 

Lonely1

Unconfirmed Member

Like I have said many times, given these ratings it's impossible to have a 600 GFLOP GPU.

No, it is not impossible.

Really, what's the math on that?

I'm not an expert by any metric, but the 600GFLOP number comes from the E6760 @ 35W, does it not? Could it not be die-shrunk from 40nm to get the consumption lower?

Yes, it can. Also, next-gen mobile devices are expected to reach 100+ GFLOPS on sub-10W power budgets, including the screen.
 
The 75W is what the PSU is rated for. Most PSUs are run at 60% of their max rating.

The Wii U uses 45W.

Like I have said many times, given these ratings it's impossible to have a 600 GFLOP GPU.

Just like how it wouldn't be a GPGPU, amirite? ;)

Really, what's the math on that?

I'm not an expert by any metric, but the 600GFLOP number comes from the E6760 @ 35W, does it not? Could it not be die-shrunk from 40nm to get the consumption lower?

No. But I would have expected it to be on a smaller process than 40nm.
 
Where's the notion of a process shrink coming from?

EDIT: Actually where did the original claim of 40nm come from to begin with?

Really, what's the math on that?

I'm not an expert by any metric, but the 600GFLOP number comes from the E6760 @ 35W, does it not? Could it not be die-shrunk from 40nm to get the consumption lower?
From what I can find, the E6760 is already 40nm.

Also, there seems to be confusion around the idea that the Wii U is actually using an E6760, rather than what was actually floated: that it may be in a similar performance range.
 
Just like how it wouldn't be a GPGPU, amirite? ;)

No one disputed that DX10.1 based hardware could technically do "GPGPU" work. Some of us just haven't deluded ourselves into thinking that's some kind of magical difference maker.

No. But I would have expected it to be on a smaller process than 40nm.

We'll see. Nintendo has a habit of staying as far from the leading edge of process advances as they can manage.
 

artist

Banned
[attachment: capturekzfnv.png]

[attachment: capturee6shr.png]
 

USC-fan

Banned
Just like how it wouldn't be a GPGPU, amirite? ;)



No. But I would have expected it to be on a smaller process than 40nm.

I've been right every time... loving it. Amirite?

R700 is still terrible at GPGPU, which is what I said. I never said it does not support compute shaders, which is what makes it a GPGPU.

As I said, no way in the world this thing is close to 600 GFLOPS. You have 40-45W to power the console.

Funny, I've been saying the same thing for 6 months...
 

Instro

Member
I've been right every time... loving it. Amirite?

R700 is still terrible at GPGPU, which is what I said. I never said it does not support compute shaders, which is what makes it a GPGPU.

As I said, no way in the world this thing is close to 600 GFLOPS. You have 40-45W to power the console, and that is on the low end.

Funny, I've been saying the same thing for 6 months...

About...?
 

Kenka

Member
Regular power consumption is around 45W; that at least is true.

And you want us to think that the power budget for the GPU is...? Something that cannot develop 600 GFLOPS?
 
Hey, did you revise your NextBox GPU estimate to ~2TFlops?

Also, the Wii U GPU is going to ship at 55nm since it has an unmodified HD 4550 inside. I read it on B3D. :p

I was only told 1+TF so I wasn't going to lock myself into anything without knowing more. Even when people were saying 1.2-1.5TF, I didn't understand where that came from.

No one disputed that DX10.1 based hardware could technically do "GPGPU" work. Some of us just haven't deluded ourselves into thinking that's some kind of magical difference maker.

You didn't read his posts then. Because of him the Community thread title got changed.

I've been right every time... loving it. Amirite?

R700 is still terrible at GPGPU, which is what I said. I never said it does not support compute shaders, which is what makes it a GPGPU.

As I said, no way in the world this thing is close to 600 GFLOPS. You have 40-45W to power the console.

Funny, I've been saying the same thing for 6 months...

XD
 

USC-fan

Banned
I was only told 1+TF so I wasn't going to lock myself into anything without knowing more. Even when people were saying 1.2-1.5TF, I didn't understand where that came from.



You didn't read his posts then. Because of him the Community thread title got changed.



XD

You can read them, you just don't understand them. What he said is what I have said from day one.
 

Sheroking

Member
Regular power consumption is around 45 W, that at least is true.

And you want us to think that the power budget for the GPU is... ? Something that cannot develop 600 GFLOPS ?

I suppose that's what he's saying.

I'll ask again as I got no response from him: what is the math on that, USC-fan? Surely if a 45W box cannot support a 600GFLOP GPU, there is some kind of clear cut-off point; I'd like to see how you worked that out.

EDIT: Now I see this post:

R770 at 40nm is 12 GFLOPS per watt. Do the math...

Have we not long ruled out the R770?
 
So, since no one's actually answering the question... maybe it will help if broken into smaller questions.

The E6760 is apparently 16.5 GFLOPS/W at 40nm (?)

The HD7970M is apparently 21.8 GFLOPS/W at 28nm (?)

Are both of these considered efficient?

The figure known for the Wii U's typical power draw is apparently 40W, with a max of 75W. We don't know whether that means when gaming though.

A) How likely is it for the Wii U to have similar performance/power usage?

B) How likely is it that the GPU will be using 30-35W of power?

EDIT: So we do know it's referring to when gaming?
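The two efficiency figures above can be turned into rough bounds, assuming (speculatively) that the GPU gets a 30-35W slice of the budget and that GFLOPS scale linearly with power. Both the per-watt figures and the GPU power slice are as floated in the thread, not confirmed numbers:

```python
# Rough GFLOPS bounds from the perf-per-watt figures quoted above.
# Assumes linear scaling and a speculative 30-35 W GPU power slice.
perf_per_watt = {
    "E6760 (40nm)": 16.5,    # GFLOPS/W, as quoted in the thread
    "HD 7970M (28nm)": 21.8,
}

GPU_W_LOW, GPU_W_HIGH = 30, 35
for name, gflops_w in perf_per_watt.items():
    low = GPU_W_LOW * gflops_w
    high = GPU_W_HIGH * gflops_w
    print(f"{name}: {low:.0f}-{high:.0f} GFLOPS")
```

Under these assumptions even the 40nm part lands near the disputed ~600 GFLOP mark, which is why the GPU's actual power slice is the crux of the argument.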
 

Lonely1

Unconfirmed Member
R770 at 40nm is 12 GFLOPS per watt. Do the math...

How do you know it will be a R770? How do you know it will be at 40nm? How do you know that GPU power and power consumption scales linearly and at the same rate in the particular architecture Wii U is going to use?
 

tenchir

Member
It's less than 45W. 45W is the maximum when all USB ports are in use.

According to Iwata: http://www.neogaf.com/forum/showthread.php?t=492101&highlight=iwata+direct

The Wii U is rated at 75 watts of electrical consumption.
Please understand that this electrical consumption rating is measured at the maximum utilization of all functionality, not just of the Wii U console itself, but also the power provided to accessories connected via USB ports.
However, during normal gameplay, that electrical consumption rating won't be reached.
Depending on the game being played and the accessories connected, roughly 40 watts of electrical consumption could be considered realistic.
 

USC-fan

Banned
So, since no one's actually answering the question... maybe it will help if broken into smaller questions.

The E6760 is apparently 16.5 GFLOPS/W at 40nm (?)

The HD7970M is apparently 21.8 GFLOPS/W at 28nm (?)

Are both of these considered efficient?

The figure known for the Wii U's typical power draw is apparently 40W, with a max of 75W. We don't know whether that means when gaming though.

A) How likely is it for the Wii U to have similar performance/power usage?

B) How likely is it that the GPU will be using 30-35W of power?
You cannot use high-bin GPUs. The E6760 is not happening either, so just forget about it.

We have the R700 at 12 GFLOPS per watt at 40nm. The GPU uses around 25-30 watts, giving a range of 300-360 GFLOPS. This is also backed up by games that run a little better on Wii U but really not a large jump from PS360.
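The arithmetic behind this estimate, for what it's worth; both the 12 GFLOPS/W efficiency figure and the 25-30W GPU slice are the poster's guesses:

```python
# USC-fan's estimate: claimed R700-class efficiency times a guessed
# GPU power slice. Both inputs are the poster's assumptions.
GFLOPS_PER_WATT = 12             # claimed R700 efficiency at 40nm
GPU_W_LOW, GPU_W_HIGH = 25, 30   # guessed GPU share of the ~45 W budget

print(GPU_W_LOW * GFLOPS_PER_WATT, GPU_W_HIGH * GFLOPS_PER_WATT)  # 300 360
```

The disagreement in the thread is over both inputs: whether the architecture really sits at ~12 GFLOPS/W, and how many watts the GPU actually gets.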
 

Sheroking

Member
You cannot use high-bin GPUs. The E6760 is not happening either, so just forget about it.

We have the R700 at 12 GFLOPS per watt at 40nm. The GPU uses around 25-30 watts, giving a range of 300-360 GFLOPS. This is also backed up by games that run a little better on Wii U but really not a large jump from PS360.

That's a hugely specious argument, USC-fan.

Because first-gen launch titles, mostly direct ports, don't look considerably better, the Wii U can't have a performance gain?

Wow, do you even read these posts? Smh, crazy people on here really believe the stuff you post.

I'm not attacking you, but is English your first language?

Because your posts do come off as... contradictory, and I'm wondering if you think you're communicating something that you aren't.
 
I think he was hoping you forgot bg.....



OK, let's forget about the GPU watts for a second, since we know the performance level will be similar to the E6760 or HD 4850... just be happy with that.

How many watts do the CPU and USB ports use?

I think the CPU total is probably 5-6W max, and I guess 2.5W for each USB port.
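Those guesses imply a GPU power slice along these lines. Every number here is speculative: the four-port count is the Wii U's actual USB count, but the per-port draw, CPU draw, and the "other components" bucket are assumptions:

```python
# Speculative power budget under the ~45 W ceiling, using the guesses above.
TOTAL_W = 45
CPU_W = 6          # poster's guess: 5-6 W max
USB_W = 2.5 * 4    # guessed 2.5 W per port; the Wii U has four USB ports
OTHER_W = 5        # RAM, Wi-Fi, optical drive, etc. (pure assumption)

gpu_budget_w = TOTAL_W - CPU_W - USB_W - OTHER_W
print(gpu_budget_w)  # 24.0 W left for the GPU under these assumptions
```

Note that the USB draw only applies when accessories are plugged in, so during ordinary gameplay the GPU's slice could be correspondingly larger.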

Wow, do you even read these posts? Smh, crazy people on here really believe the stuff you post.

ROFL! No, do you read those posts? It's there for everyone to see and something you can't deny.
 

pottuvoi

Banned
It will be definitely interesting to see what the reverse-engineers make of the Wii U GPU after cracking the box...



... But this guy makes me sad.

You want to crack the box to see a GPU that's definitely not there?
Actually, you might be right about that.
Even if the GPU is the RV770 or based on it, it might reside on the CPU die and thus not be visible when opening the box... ;)
 
So does that make 30+W for the GPU feasible then?

What's the power draw of a 5× BluRay drive?

Unfortunately I'm not familiar enough with optical drive wattage to comment. And with the GPU it's tough to say in that regard, because we had that person from Tezzaron talking about TSMC and stacking, so there is a possibility that, if he wasn't speaking hypothetically, the GPU may not reach 30W to begin with.
 

USC-fan

Banned
Haha. I know that post. It's not a GPGPU, but can run GPGPU code. Think about that for a moment.

You are taking a post out of the thread. I was saying to the other poster that there is no GPGPU in the Wii U; there is a GPU. He thought a GPGPU and a GPU were two different things. He was asking what if the GPU in the next-gen console is not a GPGPU.

What I was saying, and have always said: there is a GPU in the Wii U that supports compute shaders, giving it GPGPU support. Every DX10.1+ card supports GPGPU. We don't call them GPGPUs; we call them GPUs.

People were saying the GPGPU replaces the GPU, which is not true.
 
You are taking a post out of the thread. I was saying to the other poster that there is no GPGPU in the Wii U; there is a GPU. He thought a GPGPU and a GPU were two different things. He was asking what if the GPU in the next-gen console is not a GPGPU.

What I was saying, and have always said: there is a GPU in the Wii U that supports compute shaders, giving it GPGPU support. Every DX10.1+ card supports GPGPU. We don't call them GPGPUs; we call them GPUs.

People were saying the GPGPU replaces the GPU, which is not true.

Then don't you think that's a bad choice of words on your part then? Wouldn't it have been better to say (as you did later) that it is a GPGPU and then clarified that they aren't two separate things?
 
So does that make 30+W for the GPU feasible then?

What's the power draw of a 5× BluRay drive?

What's the power draw of 2GB of RAM + WiFi + wireless controller video feed transmission?

Haha. I know that post. It's not a GPGPU, but can run GPGPU code. Think about that for a moment.

Fermi was designed as a GPGPU. AMD's GCN architecture was designed as a GPGPU. R700 was not, and it's pretty inefficient at anything but straightforward graphics work as a result. Nintendo describing R700-era, DX10.1-era hardware as a "GPGPU" is disingenuous at best, and duplicitous at worst.
 

Sheroking

Member
Fermi was designed as a GPGPU. AMD's GCN architecture was designed as a GPGPU. R700 was not, and it's pretty inefficient at anything but straightforward graphics work as a result. Nintendo describing R700-era, DX10.1-era hardware as a "GPGPU" is disingenuous at best, and duplicitous at worst.

Assuming they're talking about an R700 "era", which there's no real evidence for.

Nintendo doesn't give enough fucks to be duplicitous to the fourteen people who both know and care what GPGPU is or means.
 

EloquentM

aka Mannny
You guys are only just now noticing Matt? He's been dropping tidbits of info in Wii U threads here and there. I'd watch out for him in case he says something else interesting.
 
Fermi was designed as a GPGPU. AMD's GCN architecture was designed as a GPGPU. R700 was not, and it's pretty inefficient at anything but straightforward graphics work as a result. Nintendo describing R700-era, DX10.1-era hardware as a "GPGPU" is disingenuous at best, and duplicitous at worst.

That's just spin. Those were designed to be better at being GPGPUs. The latter claim amounts to saying they did no customizing at all (which I know they did) and just stuck a 55nm GPU in the console.
 

Instro

Member
You guys are only just now noticing Matt? He's been dropping tidbits of info in Wii U threads here and there. I'd watch out for him in case he says something else interesting.

Yeah he's been doing it for a long time, although much more often recently.
 

USC-fan

Banned
Assuming they're talking about an R700 "era", which there's no real evidence for.

Nintendo doesn't give enough fucks to be duplicitous to the fourteen people who both know and care what GPGPU is or means.

Every bit of detail about the GPU points to R700. Every leaked spec we have matches the R700 exactly; no other GPU core made by AMD does.
 

Sheroking

Member
Every bit of detail about the GPU points to R700. Every leaked spec we have matches the R700 exactly; no other GPU core made by AMD does.

What leaked specs? Show your work.

Tell me why this is an R700. Tell me why the customization of the GPU can't account for the difference you're suggesting is impossible.
 