
Rumor: Wii U final specs

Yes. Or at least put it into a deep-sleep state where it consumes very little power. Most modern CPUs are capable of that.

However, I don't think that Wii U's CPU cores will ever be idle for long in most (retail) games.

Roger, gotcha, I was thinking in terms of power draw while in "WiiUconnect24" mode. (Whatever they end up calling it.)
 
But the thing is, we're not talking about a 200 watt monster. Granted, there will be variation in GPU power consumption, but we're talking about a swing between 15 and 30 watts. That is a realistic range for the WiiU. And that doesn't actually change the math in terms of peak FLOPS.
 
But the thing is, we're not talking about a 200 watt monster. Granted, there will be variation in GPU power consumption, but we're talking about a swing between 15 and 30 watts. That is a realistic range for the WiiU. And that doesn't actually change the math in terms of peak FLOPS.

It would be nice to know whether, at 30W, the GPU is being utilized at 100%.

Maybe Nintendo put a power limit on the GPU and it will never get to its theoretical max? Probably the same holds true for the CPU.
 
It would be nice to know whether, at 30W, the GPU is being utilized at 100%.

Maybe Nintendo put a power limit on the GPU and it will never get to its theoretical max? Probably the same holds true for the CPU.

If memory serves, Nintendo likes to underclock their processors to improve reliability.
 

Kenka

Member
But the thing is, we're not talking about a 200 watt monster. Granted, there will be variation in GPU power consumption, but we're talking about a swing between 15 and 30 watts. That is a realistic range for the WiiU. And that doesn't actually change the math in terms of peak FLOPS.
You mean the 300-600 GFLOPS figures are referring to peak performance?
 

Kenka

Member
No. That's not how it works. Max flops are max flops; a chip just runs hotter or cooler, drawing more or less power, depending on the specific workload.
I don't understand, sorry.

To me it sounds like you are contradicting what the others have said about the relationship between power draw and performance.
 

beril

Member
No. That's not how it works. Max flops are max flops; a chip just runs hotter or cooler, drawing more or less power, depending on the specific workload.

When you're using a crappy metric like GFLOPS/watt to estimate the flops based on an estimated wattage, it does work like that. Maybe in Iwata's 40W scenario the GPU only uses 80% of its maximum power consumption. In that case it could have up to 37W for the max power draw instead of 30W. If you want 600 GFLOPS, then it's just 16 GFLOPS/W.
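A minimal sketch of that back-of-envelope arithmetic, assuming the thread's numbers only (the ~40W Iwata figure, the 30W GPU share, the 80% utilisation guess and the 600 GFLOPS target are all speculation, not confirmed specs):

```python
# Back-of-envelope GFLOPS-per-watt estimate; every number is an assumption
# from the thread, not a confirmed Wii U spec.

gpu_share_w = 30.0      # assumed GPU share of the ~40W total system draw
utilisation = 0.80      # suppose that 30W is only 80% of the GPU's max draw

gpu_max_draw_w = gpu_share_w / utilisation          # ~37.5 W
target_gflops = 600.0                               # speculative peak figure
required_efficiency = target_gflops / gpu_max_draw_w

print(f"Implied GPU max draw: {gpu_max_draw_w:.1f} W")
print(f"Efficiency needed for {target_gflops:.0f} GFLOPS: "
      f"{required_efficiency:.1f} GFLOPS/W")
```

That comes out to roughly 37.5W and 16 GFLOPS/W, which is the figure quoted above.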
 

ozfunghi

Member
No. That's not how it works. Max flops are max flops; a chip just runs hotter or cooler, drawing more or less power, depending on the specific workload.

Max flops are max flops, and max watt is max watt. What he means is, when the GPU is not being pushed to its limits, it will not be consuming the maximum amount of wattage. Whether you choose to describe it being pushed to the limit in flops or fairy dust doesn't matter.
 

Donnie

Member
But the thing is, we're not talking about a 200 watt monster. Granted, there will be variation in GPU power consumption, but we're talking about a swing between 15 and 30 watts. That is a realistic range for the WiiU. And that doesn't actually change the math in terms of peak FLOPS.

But if you're trying to figure out what GPU can fit inside a console with a 40-50W power draw, then a 15W swing is pretty massive. Especially when the figure being used for the prospective GPUs is max power draw.
 

The_Lump

Banned
Just whilst we're on GPU wattage: people should note the 35W for the e6760 and the 25W for the e4690 are not only max under load, but they are for the whole MCM package (GPU + RAM).
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
1) it's a GPGPU
2) it's made by AMD
3) its consumption likely doesn't exceed 25-30 W

That's all we know for sure. Then you have guesses, speculations, prophecies.

What does that mean? GPGPU is not a thing, it just means the GPU has stream processors that can be used by non-rendering software for calculations. I've seen many posts saying the GPGPU in the Wii U is some magic bullet to make up for a weak CPU.

IT WON'T.

Know why? The GPU is busy rendering the games; it can't do both general-purpose processing and rendering at the same time (at least not well).

The same stream processors cannot be used for two things at once. The PS4 leaked info had two GPUs, one for the rendering and one with the CPU for GPGPU. The Wii U has one GPU. It is also not like the PS3 where developers use a few SPUs to help the GPU, there are six (available) independent SPUs in the PS3.

Take it from a guy who runs GPGPU calculations via CUDA daily while ray tracing volumes at the same time on one GPU: the performance of both is hindered. My rendering frame rate drops to about 1/3 when the CUDA code is running.

So unless Nintendo has some sort of magic, the GPU is not going to help much while the game is actually rendering. Now if the CPU is so weak that they would rather scale back the rendering performance to help it, fine, but that means things are pretty dire.
 

Donnie

Member
Just whilst we're on GPU wattage: people should note the 35W for the e6760 and the 25W for the e4690 are not only max under load, but they are for the whole MCM package (GPU + RAM).

GDDR5 RAM as well, as far as I know, which uses quite a bit more power than slower memory types.
 

Earendil

Member
I never understood the argument that the PS4 and Nextbox would put WiiU to shame because they would be 3x more powerful...when it seems that the WiiU is around 2-3x more powerful than the PS360 and people are complaining...I also don't understand the argument that the next boxes will show major improvements over year 2 WiiU software when devs will be getting used to new architectures all over again...This is mostly toward Brad and Special...I would understand it better if the argument was that by the second wave of PS4 and Nextbox games they will start to show improvements and after that more than likely surpass WiiU graphics...but to say that year 2 software on WiiU won't even matter with launch games from the other systems is just silly in my opinion.

Some developers are flat out saying that they need more time to figure out the CPU and how the whole system works best. Some of you don't quote those lines though. The architecture is different. The CPU is definitely different than what the devs are used to, but that doesn't mean that the total package is crap and worse off. You use a system to its strengths...you don't use a system in a way that it's not really built to be used, no matter how powerful it may be. That is the key.

The launch games look fine. The games will get better. We all know Sony and Microsoft are going to come out with awesome tech demos and trailers to wow us. Nintendo hopefully is preparing for this with some awesome games. My bet is that they are, but that's just my opinion. When those systems launch we will see. A LOT of people are going to look like fools. Which side those fools are on, we will see. I'm more in the middle ground and happy about it.

My opinion is that it won't be as bad as the PS2 to Xbox Splinter Cell differences. I just don't see it. I don't see how someone can look at a game like Assassin's Creed or Uncharted, expect some improvements with the WiiU, but when the PS4 and Nextbox come out say the WiiU is rubbish. That just doesn't make sense to me. I don't understand that line of thinking. Sorry so long, Friday night here in Korea and I'm drinking hahaha :p


The key assumption behind this argument is that the WiiU is only on par with the PS360, and not more powerful.

GDDR5 RAM as well, as far as I know, which uses quite a bit more power than slower memory types.

Not to mention that the e6760 is a 40nm part. If they have shrunk the die, it will use even less power.
 

Donnie

Member
What does that mean? GPGPU is not a thing, it just means the GPU has stream processors that can be used by non-rendering software for calculations. I've seen many posts saying the GPGPU in the Wii U is some magic bullet to make up for a weak CPU.

IT WON'T.

Know why? The GPU is busy rendering the games; it can't do both general-purpose processing and rendering at the same time (at least not well).

The same stream processors cannot be used for two things at once. The PS4 leaked info had two GPUs, one for the rendering and one with the CPU for GPGPU. The Wii U has one GPU. It is also not like the PS3 where developers use a few SPUs to help the GPU, there are six (available) independent SPUs in the PS3.

Take it from a guy who runs GPGPU calculations via CUDA daily while ray tracing volumes at the same time on one GPU: the performance of both is hindered. My rendering frame rate drops to about 1/3 when the CUDA code is running.

So unless Nintendo has some sort of magic, the GPU is not going to help much while the game is actually rendering. Now if the CPU is so weak that they would rather scale back the rendering performance to help it, fine, but that means things are pretty dire.

There are plenty of examples of PC games that have an option to shift physics to the GPU, and they provide the same graphics with better physics than when using the CPU. Of course using flops for physics will leave less for rendering; that has to be balanced by the developer, and nobody's claiming it's a magic bullet. But neither is the idea of physics on a GPU as pointless as you're making it sound. GPGPU functionality can absolutely take significant burden away from a CPU during rendering, and as you know there is no fixed amount it would slow rendering down by. Obviously, just because you're experiencing that level of slowdown with your code doesn't make it a guideline for any GPGPU work done on any GPU.
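A minimal sketch of the budgeting trade-off being described here, with made-up millisecond figures (nothing below is a Wii U measurement; it only shows that GPU time spent on physics comes out of the same per-frame budget as rendering):

```python
# Illustrative frame-budget arithmetic: GPGPU physics and rendering share the
# same GPU time per frame. All numbers are placeholders for illustration.

FRAME_BUDGET_MS = 1000.0 / 60.0   # a 60 fps target leaves ~16.7 ms per frame

def render_budget(gpgpu_physics_ms: float) -> float:
    """GPU time left for rendering after the physics work is scheduled."""
    return FRAME_BUDGET_MS - gpgpu_physics_ms

for physics_ms in (0.0, 2.0, 4.0):
    print(f"physics on GPU: {physics_ms:.1f} ms -> "
          f"rendering budget: {render_budget(physics_ms):.1f} ms")
```

How much of the budget a developer hands to physics is exactly the balancing act described above; there is no fixed slowdown.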
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
What does that mean? GPGPU is not a thing, it just means the GPU has stream processors that can be used by non-rendering software for calculations. I've seen many posts saying the GPGPU in the Wii U is some magic bullet to make up for a weak CPU.

IT WON'T.

Know why? The GPU is busy rendering the games; it can't do both general-purpose processing and rendering at the same time (at least not well).

The same stream processors cannot be used for two things at once. The PS4 leaked info had two GPUs, one for the rendering and one with the CPU for GPGPU. The Wii U has one GPU. It is also not like the PS3 where developers use a few SPUs to help the GPU, there are six (available) independent SPUs in the PS3.

Take it from a guy who runs GPGPU calculations via CUDA daily while ray tracing volumes at the same time on one GPU: the performance of both is hindered. My rendering frame rate drops to about 1/3 when the CUDA code is running.

So unless Nintendo has some sort of magic, the GPU is not going to help much while the game is actually rendering. Now if the CPU is so weak that they would rather scale back the rendering performance to help it, fine, but that means things are pretty dire.
I don't recall anybody claiming that the GPGPU comes for free.

It's a simple fact of the matter, though, that the GPU will handle some GP tasks better than the CPU. All next-gen GPUs will be doing that. Will their framerates 'universally suffer' from that? It really depends on what the games are trying to do - what their GP tasks and the rendering tasks involve.

Last but not least, games don't necessarily shoot for infinite framerates, whereas CUDA tasks do - they try to get max performance, sustained. Console games don't do that - they aim for a fixed performance target (whether that target is close to the GPU's max is another matter).
 

The_Lump

Banned
GDDR5 RAM as well, as far as I know, which uses quite a bit more power than slower memory types.


Yup, and as Earendil mentioned, if the die has been shrunk it will be even less. Add the (almost certainly) lower clock and it's not out of the realm of possibility by any means.

Just been reading that GPR count plays a role in whether a GPU is DX certified. Now, if Nintendo were to take a DX10.1 part and add GPRs....maybe they were trying to achieve closer parity with DX11-equivalent parts?

And if Matt's info is taken from a dev kit with this 'placeholder' R700-plus-extra-GPRs GPU (not saying it is, just a guess), then maybe Nintendo were trying to replicate what their final GPU might look like?


No idea if that makes sense. It's 100% speculation!
 
The thing with power consumption is that energy use is fluid and fluctuates from component to component based on need. I also doubt any game right now on WiiU is pushing it to its max.

One way to pinpoint a performance threshold using wattage is to take a multiplatform game and see how much of the total power it consumes. Use the power left over to come up with a reasonable estimate of how much better WiiU can do.

Even then, the multiplatform game will not be optimized and a lot of performance can be gained from optimization. :/

It just shows how power usage isn't a very good performance measurement either. :/

What power consumption can tell us is what parts can actually fit in the system. And then from those parts we can understand the potential performance.
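A minimal sketch of that "measure a port, look at the headroom" idea, with placeholder wattages (nobody in the thread has measured these; they are stand-ins chosen only to show the arithmetic):

```python
# Placeholder-number sketch of estimating headroom from measured power draw.
# Both wattages below are hypothetical, not Wii U measurements.

system_max_w = 40.0      # assumed worst-case draw the PSU/cooling allows
port_draw_w = 32.0       # hypothetical wall draw while running a 360/PS3 port

headroom_w = system_max_w - port_draw_w
headroom_pct = 100.0 * headroom_w / port_draw_w
print(f"Unused headroom: {headroom_w:.0f} W (~{headroom_pct:.0f}% above the port)")

# As noted above, this is a loose proxy at best: an unoptimised port can keep
# the chips busy doing inefficient work while still drawing close to max power.
```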
 
I don't think the odds of a Nintendo system surviving in the core gaming space without a Wii-esque hook are very good.

Gamecube didn't even do well, and it was at least technically competent for its generation. That was proof Nintendo's belated IPs don't cut it by themselves (in the core space).

So it requires the Gamepad 2nd display to catch on, which nobody seems to think it will.

I actually think the Gamecube is why Nintendo has never been the same since. In their eyes, going down the strong hardware traditional path is a dead end for them.

My opinion is, if you are a Nintendo fan, hope the Wii U fails fast and hard, so that it's clearly dead by say 2014, and then hope Nintendo wakes up and comes back strong. I think the first part is covered. The 2nd? More difficult.

If I was a Nintendo fan, that's the silver lining of Wii U. It won't meander along and do "ok" like the PSP did for all those years; it will flame out and crash horribly. At that point Nintendo will have to do some soul searching.

All imo etc etc disclaimer etc.

Blimey, I couldn't disagree with you more mate. It's a Nintendo console launching with a 2D Mario as well as several other system sellers either on launch day or during the launch window, including Monster Hunter U not only as a launch title in Japan but also with a bundle.

It's going to fly off shelves worldwide imo.
 
Take it from a guy who runs GPGPU calculations via CUDA daily while ray tracing volumes at the same time on one GPU: the performance of both is hindered. My rendering frame rate drops to about 1/3 when the CUDA code is running.

What rendering engine are you using and what video card? The latest Nvidia ones have been slower than previous gens, since Nvidia is trying to make everyone spend the money on Quadros.
 

Mr Swine

Banned
Can't we just be happy that Wii U kicks Xbox360 and PS3 ass while consuming very little electricity? That makes it run less hot, and Nintendo can put in a very quiet CPU/GPU fan.
 

AzaK

Member
What does that mean? GPGPU is not a thing, it just means the GPU has stream processors that can be used by non-rendering software for calculations. I've seen many posts saying the GPGPU in the Wii U is some magic bullet to make up for a weak CPU.

IT WON'T.

Know why? The GPU is busy rendering the games; it can't do both general-purpose processing and rendering at the same time (at least not well).

The same stream processors cannot be used for two things at once. The PS4 leaked info had two GPUs, one for the rendering and one with the CPU for GPGPU. The Wii U has one GPU. It is also not like the PS3 where developers use a few SPUs to help the GPU, there are six (available) independent SPUs in the PS3.

Take it from a guy who runs GPGPU calculations via CUDA daily while ray tracing volumes at the same time on one GPU: the performance of both is hindered. My rendering frame rate drops to about 1/3 when the CUDA code is running.

So unless Nintendo has some sort of magic, the GPU is not going to help much while the game is actually rendering. Now if the CPU is so weak that they would rather scale back the rendering performance to help it, fine, but that means things are pretty dire.
If the Wii U CPU was deliberately gimped because Nintendo thought "just do it on GPGPU", I think that was short-sighted.

Can't we just be happy that Wii U kicks Xbox360 and PS3 ass while consuming very little electricity? That makes it run less hot, and Nintendo can put in a very quiet CPU/GPU fan.
Well, I can try and ignore it and just play games, but I had no problem with a bigger console and bigger fan (within reason). Nintendo's attitude towards this just diminishes what's possible.
 
There are plenty of examples of PC games that have an option to shift physics to the GPU, and they provide the same graphics with better physics than when using the CPU. Of course using flops for physics will leave less for rendering; that has to be balanced by the developer, and nobody's claiming it's a magic bullet. But neither is the idea of physics on a GPU as pointless as you're making it sound. GPGPU functionality can absolutely take significant burden away from a CPU during rendering, and as you know there is no fixed amount it would slow rendering down by. Obviously, just because you're experiencing that level of slowdown with your code doesn't make it a guideline for any GPGPU work done on any GPU.
Well, that's not really the same thing. PCs generally have a lot more spare resources than a console. When you flip the switch to turn on your PhysX or whatever, depending on the card you have, you will notice a fair amount of framerate drop. More or less.

On consoles, developers have to budget the resources, and generally pretty graphics win out. Other than this generation's relative lack of logical hardware power, I don't know that CPUs were too burdened anyway. Most were trying not to wreck their brains figuring out what else they could move to the CPU.

If you want to get deeper into it we can, but GPUs aren't really the godsend for physics that some make them out to be. There are some aspects where a good CPU would best them.


I don't recall anybody claiming that the GPGPU comes for free.

It's a simple fact of the matter, though, that the GPU will handle some GP tasks better than the CPU. All next-gen GPUs will be doing that. Will their framerates 'universally suffer' from that? It really depends on what the games are trying to do - what their GP tasks and the rendering tasks involve.

Last but not least, games don't necessarily shoot for infinite framerates, whereas CUDA tasks do - they try to get max performance, sustained. Console games don't do that - they aim for a fixed performance target (whether that target is close to the GPU's max is another matter).
I really don't agree with this. But I've seen "GP" cover so many different aspects that I really don't have a clear idea what you may be alluding to.

GPGPUs are pretty much a huge collection of SIMD units. A lot like the Cell SPEs, except with exponentially more horsepower. They share the same drawbacks though.

1. Memory Latency. The hardest one to shake because GPUs are sorta built to tolerate latency rather than combat it.

2. Wide SIMD lanes, so it will generally struggle with anything logic-heavy.

3. A lack of branch hardware. This was the case this gen, so it won't be that big of a change next generation, but the few developers I know aren't fond of this at all (see the sketch at the end of this post).

Personally, I would rather just integrate a few decent SIMD units into the CPU and let my GPU do its thing, but perspective is always interesting.

What would you rather see?
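A tiny, purely illustrative analogy for points 2 and 3 above, using NumPy's masked selection as a stand-in for wide SIMD lanes. On SIMD-style hardware a data-dependent branch is typically handled by evaluating both sides for every lane and masking the result, so divergent branches waste work. This is an analogy, not Wii U-specific code:

```python
# NumPy as a stand-in for wide SIMD execution: both branches are computed
# for every element, and a mask picks which result survives per lane.
import numpy as np

x = np.random.rand(8)

# A CPU with real branch hardware runs only one side per element.
cpu_style = np.array([np.sin(v) if v > 0.5 else np.cos(v) for v in x])

# Wide-SIMD style: sin AND cos are computed over all elements, then selected.
simd_style = np.where(x > 0.5, np.sin(x), np.cos(x))

assert np.allclose(cpu_style, simd_style)   # same answer, different cost model
```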
 

Absinthe

Member
I can't believe I'm posting in here again (ok, yes I can), but about the GPU...it is modeled on the R700 series, but it has significantly more GPRs. However, it seems to have fewer than the E6760, so...draw your own conclusions.

OK, NOW I'm done.

It seems like the press releases below keep getting ignored, no?

In my view, the Wii U GPU has been right in front of our faces, and has been for a while.

First,

Green Hills Software's MULTI Integrated Development Environment Selected by Nintendo for Wii U Development
http://www.ghs.com/news/20120327_ESC_Nintendo_WiiU.html

Second,

In May of this year, it was announced that the AMD Radeon E6760 would use Green Hills Software.
http://www.altsoftware.com/press-ne...gl-graphics-driver-architecture-embedded-syst

  1. Older dev kits for the Wii U were basically using the AMD Radeon 4850.
  2. Nintendo told devs that the 4850 would be roughly equal to the Wii U's final GPU.

But the 4850 pulls around 130-240 watts and is a 55nm part. The 4850's feature set is also dated, and newer standards like DirectX 11 are in use today.

Back to the E6760,

  1. The E6760 runs at 35 watts.
  2. The E6760 is a 40nm part, exactly the process size Nintendo announced for the GPU.
  3. A stock E6760 scores around 5870 in 3DMark Vantage, which is higher than the HD 4850's score.

All that said, basic reason should tell us (especially from those top two press releases) that the GPU will end up being based on the E6760, although tweaked for the better, like most stock GPUs are when put in a console.

AMD Radeon HD E6760 Specs
http://www.em.avnet.com/en-us/desig...60-Embedded-Discrete-Graphics-Processors.aspx
 
Guys, that email was REAL. I just decided to email AMD on a whim and they sent me the EXACT same thing.

[image: p2NpW.png]
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Why are you still on about the e6760 when it was pretty much confirmed earlier in the thread that it isn't an e6760 in the Wii-U?

I can't believe I'm posting in here again (ok, yes I can), but about the GPU...it is modeled on the R700 series, but it has significantly more GPRs. However, it seems to have fewer than the E6760, so...draw your own conclusions.

OK, NOW I'm done.
 

jerd

Member
Sorry if this is old, but I'm not sure what it means. Comment from Unity CEO

Gaming Blend: While the Wii U doesn't specifically use DirectX functionality, will the Unity Engine for the Wii U allow for DirectX 11 equivalent functionality in regards to shaders, soft and self shadowing as well as potential scalability for shader 5.0 (or higher)?

Helgason: Yeah. We'll do a – we'll make it potentially possible to do.

http://www.cinemablend.com/games/Interview-Why-Unity-Engine-Good-Fit-Wii-U-47173.html

"Make it potentially a possibility"? That seems like a yes or no question. What's up with that? Sorry if this has already been answered.

Edit: Wha? AMD dropping bombs or something? This doesn't seem right.
 

Earendil

Member
OK, either someone at AMD is seriously trolling everyone or they are breaking NDAs. I find it hard to believe that, as secretive as Nintendo has been about the system, AMD support personnel would let the cat out of the bag like this.
 

Absinthe

Member
Why are you still on about the e6760 when it was pretty much confirmed earlier in the thread that it isn't an e6760 in the Wii-U?

Why do you ignore official press releases and look to posts from supposed insiders using phrases like "it seems" instead?

Besides, it looks like we are getting multiple legit response emails from AMD when asked about the GPU. Maybe you should try and email them as well?
 
OK, either someone at AMD is seriously trolling everyone or they are breaking NDAs. I find it hard to believe that, as secretive as Nintendo has been about the system, AMD support personnel would let the cat out of the bag like this.

Unless, since the price and launch date have been out, some NDAs have been relaxed a bit to provide support or info to the media.

Still questioning this, because I find it hard to believe the mystery of the ages has been solved like this.
 

Ryoku

Member
Why are you still on about the e6760 when it was pretty much confirmed earlier in the thread that it isn't an e6760 in the Wii-U?

Customization would be your answer.

I was one of the people who figured that the e6760 would be the closest to Wii U's GPU in terms of performance/features, but that the Wii U didn't actually base its GPU off of the e6760 - making it, effectively, a custom e6760. I don't know how to take the emails. I want to believe, but I don't want to be too optimistic, either.

It seems all we needed to do was ask them. One and a half years of speculation..... ONE AND A HALF YEARS!
 
Is E6760 good?

Good compared to what?

Compared to Xenos and RSX it should be stronger by a decent amount. Roughly, at least twice as fast and with a more modern feature set.

Compared to my two-year-old mid-range graphics card on PC (GTX 460), it is weaker by at least the same amount.
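For reference, a rough peak-FLOPS comparison using commonly published specs (none of these are Wii U numbers, and paper FLOPS only loosely track real-world game performance, so treat it as a sanity check on the "roughly twice Xenos" claim rather than a verdict):

```python
# Peak GFLOPS approximated as ALUs x FLOPs-per-ALU-per-clock x clock (GHz),
# using commonly published specs for each part.

def peak_gflops(alus: int, flops_per_alu_per_clock: int, clock_ghz: float) -> float:
    return alus * flops_per_alu_per_clock * clock_ghz

gpus = {
    "Radeon E6760 (embedded)": peak_gflops(480, 2, 0.60),   # ~576 GFLOPS
    "Xenos (Xbox 360)":        peak_gflops(48, 10, 0.50),   # ~240 GFLOPS
    "GeForce GTX 460":         peak_gflops(336, 2, 1.35),   # ~907 GFLOPS
}

for name, gflops in gpus.items():
    print(f"{name:25s} ~{gflops:4.0f} GFLOPS")
```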
 

Meelow

Banned
Is E6760 good?

And does anyone have the AMD technical support email? The email above wouldn't work and I can't find it.

I want to see what they say to me.

Good compared to what?

Compared to Xenos and RSX it should be stronger by a decent amount. Roughly, at least twice as fast and with a more modern feature set.

Compared to my two-year-old mid-range graphics card on PC (GTX 460), it is weaker by at least the same amount.

Interesting.
 

Ryoku

Member
Good compared to what?

Compared to Xenos and RSX it should be stronger by a decent amount. Roughly, at least twice as fast and with a more modern feature set.

Compared to my two-year-old mid-range graphics card on PC (GTX 460), it is weaker by at least the same amount.

e6760 rapes Xenos and RSX, but yes, it is noticeably weaker than a GTX460.
 

nordique

Member
It seems like the press releases below keep getting ignored, no?

In my view, the Wii U GPU has been right in front of our faces, and has been for a while.

First,

Green Hills Software's MULTI Integrated Development Environment Selected by Nintendo for Wii U Development
http://www.ghs.com/news/20120327_ESC_Nintendo_WiiU.html

Second,

In May of this year, it was announced that the AMD Radeon E6760 would use Green Hills Software.
http://www.altsoftware.com/press-ne...gl-graphics-driver-architecture-embedded-syst

  1. Older dev kits for the Wii U were basically using the AMD Radeon 4850.
  2. Nintendo told devs that the 4850 would be roughly equal to the Wii U's final GPU.

But the 4850 pulls around 130-240 watts and is a 55nm part. The 4850's feature set is also dated, and newer standards like DirectX 11 are in use today.

Back to the E6760,

  1. The E6760 runs at 35 watts.
  2. The E6760 is a 40nm part, exactly the process size Nintendo announced for the GPU.
  3. A stock E6760 scores around 5870 in 3DMark Vantage, which is higher than the HD 4850's score.

All that said, basic reason should tell us (especially from those top two press releases) that the GPU will end up being based on the E6760, although tweaked for the better, like most stock GPUs are when put in a console.

AMD Radeon HD E6760 Specs
http://www.em.avnet.com/en-us/desig...60-Embedded-Discrete-Graphics-Processors.aspx

Interesting post, though is it absolutely confirmed that the GPU is on a 40nm process? There were some postulations that it may be a 32nm process.

unless I missed something
 
Sorry if this is old, but I'm not sure what it means. Comment from Unity CEO



http://www.cinemablend.com/games/Interview-Why-Unity-Engine-Good-Fit-Wii-U-47173.html

"Make it potentially a possibility"? That seems like a yes or no question. What's up with that? Sorry if this has already been answered.

Edit: Wha? AMD dropping bombs or something? This doesn't seem right.

Your question about DirectX 11? I've read around here that DirectX 11 = Microsoft, so yeah, you can't just go and use it on a non-Microsoft product, but its key features are what the guy seems to be saying you can do.
 