
Rumor: Wii U final specs

Donnie

Member
Wait, this makes no sense.

Forgive me if I may be wrong (and let me preface by once again pointing out that I don't know dick about them fancy computer doodads), but considering the Wii-U has double the RAM of the PS360, shouldn't the CPU also be more powerful in order to take advantage of all that extra RAM?

Did you read the last two paragraphs in those articles? He says right at the end that the reason Wii U's CPU seems slow is because it's new and they're not getting the most out of it yet. I'm sure there's a thread on this forum totally ignoring those paragraphs though...

But to answer your question, no, you don't need more CPU power to use more RAM.
 
Did you read the last two paragraphs in those articles? He says right at the end that the reason Wii U's CPU seems slow is because it's new and they're not getting the most out of it yet. I'm sure there's a thread on this forum totally ignoring those paragraphs though...

But to answer your question, no, you don't need more CPU power to use more RAM.


Of course not, Eurogamer has had a bias against the Wii U for a while now. They buried any silver lining deep in the article and made all the negative quotes prominent.
 

AzaK

Member
If the Wii-U really was substantially more powerful, they wouldn't need to do that in order to produce more graphically impressive games than we have seen on PS360.

You do realise they'd have to put extra effort into that, right? If a machine has tonnes of raw grunt and a game is a port, a dev is really only going to be able to bump the res or framerate (assuming it was coded with that in mind) without having to do more work to take advantage of the extra power.

Why put the extra effort in if you'd sell just as many copies without the effort?
 
You do realise they'd have to put extra effort into that, right? If a machine has tonnes of raw grunt and a game is a port, a dev is really only going to be able to bump the res or framerate (assuming it was coded with that in mind) without having to do more work to take advantage of the extra power.

Why put the extra effort in if you'd sell just as many copies without the effort?

AA, resolution and framerate would take minimal effort but we haven't seen that.
 

alcabcucu

Member
"The Wii U utilizes an AMD E6760 GPU, which is a specially-designed, embedded GPU inside the Wii U specifically. This is based around the 6xxx series of GPUs, but has obviously been modified for the Wii U and its specific needs and configuration. "

Or so it seems. This is the AMD statement:

[image: AMD's e-mailed statement]


I don't think this is a hoax.

BR,
 

alcabcucu

Member
You're free to think it's not a hoax. We all know it is.

Well, the story is as follows. There is another forum in Spain, and there is a well-known user there who sent an e-mail asking AMD for the Wii U's GPU specs.

This same user is the one who recently asked @IBMWatson if the Wii U had a Power7 CPU, as previously reported. This was first confirmed by that source, but then corrected a day later in a new tweet. Now it's clear, as we all knew, that the Wii U uses a custom Power-based processor, but not a Power7-based CPU.

I know this guy, and I know he has received this mail from AMD. He is not lying here. The question is: is the person from AMD, the person who answered the e-mail, telling the truth?

BR.
 
Well, the story is as follows. There is another forum in Spain, and there is a well-known user there who sent an e-mail asking AMD for the Wii U's GPU specs.

This same user is the one who recently asked @IBMWatson if the Wii U had a Power7 CPU, as previously reported. This was first confirmed by that source, but then corrected a day later in a new tweet. Now it's clear, as we all knew, that the Wii U uses a custom Power-based processor, but not a Power7-based CPU.

I know this guy, and I know he has received this mail from AMD. He is not lying here. The question is: is the person from AMD, the person who answered the e-mail, telling the truth?

BR.

Maybe it's not a hoax then?
 

alcabcucu

Member
I'm completely sure the e-mail is real. Just type it into Google and you'll find the document.

What I cannot be sure of is: did the person who wrote it know what he was doing?

BR
 

wsippel

Banned
Well, the story is as follows. There is another forum in Spain, and there is a well-known user there who sent an e-mail asking AMD for the Wii U's GPU specs.
AMD's customer support wouldn't know shit about the GPU, though. The only official statement I've seen so far claims that the GPU is unique and unlike any off-the-shelf part.
 

gogogow

Member
AMD's customer support wouldn't know shit about the GPU, though. The only official statement I've seen so far claims that the GPU is unique and unlike any off-the-shelf part.

Even if they knew, they wouldn't just tell you because you asked via an email... pretty sure they have some sort of NDA.
 

Donnie

Member
The guy might be telling the truth about the email. But if he is, then the AMD person isn't, because we know it's not an E6760, so...
 

Bauhaus

Banned
"The Wii U utilizes an AMD E6760 GPU, which is a specially-designed, embedded GPU inside the Wii U specifically. This is based around the 6xxx series of GPUs, but has obviously been modified for the Wii U and its specific needs and configuration. "

Or so it seems. This is the AMD statement:

[image: AMD's e-mailed statement]


I don't think this is a hoax.

BR,


It's a hoax.
 

The_Lump

Banned
1) it's a GPGPU
2) it's made by AMD
3) its consumption likely doesn't exceed 25-30 W

That's all we know for sure. Then you have guesses, speculations, prophecies.


Plus time travel. I came back from December and can confirm it has a CPU, a GPGPU and a PSU. You can take that as fact. Once we get (back) to December, you'll see I was telling the truth.
 
So, are these the best guesses as to what's inside the Wii U?

It has a CPU with three cores, each an enhanced Broadway core with a larger cache.
It has dedicated sound processing transistors.
It has 32MB eDRAM.
It has 2GB of system RAM, of which 1GB is currently reserved for developers to make use of.
It has a GPU with a modern feature set, in the range of 300-600 GFLOPS.

If so, how does it compare to the Wii, Xbox360, the PS3, and the current top/low predictions of PS4/Xbox8?
 

Donnie

Member
It's a customised (underclocked) e6760, on a smaller die.

It might be similar in its configuration (number of ROPs, shader units etc.) but no, I don't think it's a custom E6760. If it was, its base specification would be DX11. It seems from the info we've seen that its base spec is DX10.1 and they've then added whatever features they want on top of that without necessarily sticking to the DX11 spec. Obviously, because they use their own API (not DirectX), there's no need to improve the GPU along exactly the same lines as DirectX. But its base spec of DX10.1 fits with the idea that it started development from an R700 and then went in a different direction (customised to Nintendo's requirements, or NintendoX 11 if you will :D).
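For anyone wondering where ballpark GFLOPS figures like the 300-600 quoted above come from, it's just ALU count x 2 ops per clock (multiply-add) x clock speed. Quick sketch below; the configurations are purely illustrative guesses, not confirmed Wii U specs (the 480 @ 600MHz line is simply the stock E6760 figure):

```python
# Theoretical shader throughput: GFLOPS = ALUs x 2 ops (multiply-add) x clock in GHz.
# These configurations are illustrative guesses, NOT confirmed Wii U specs.
def gflops(alus: int, clock_ghz: float) -> float:
    return alus * 2 * clock_ghz

for alus, clock_ghz in [(320, 0.50), (400, 0.55), (480, 0.60)]:
    print(f"{alus} ALUs @ {int(clock_ghz * 1000)} MHz -> {gflops(alus, clock_ghz):.0f} GFLOPS")
# 320 ALUs @ 500 MHz -> 320 GFLOPS
# 480 ALUs @ 600 MHz -> 576 GFLOPS (the stock E6760 number)
```

Point being, the whole range people have been throwing around falls straight out of plausible shader counts and clocks.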
 
It's 40 W minus roughly Blu-Ray, CPU, communications and the rest. Could be less.

GPU power consumption changes based on what it is doing. Giving it a fixed range is not meaningful unless you say what task it is doing.

I'm sure 40W is the typical average demand for current games. As the generation goes on, the GPU will be pushed more.
 
Plus time travel. I came back from December and can confirm it has a CPU, a GPGPU and a PSU. You can take that as fact. Once we get (back) to December, you'll see I was telling the truth.

We need an image of Lump's avi peeking out of a DeLorean...

I have a question, and I apologize if it's a dumb one.

Would the power needs of a GPU change depending on the game being played? Would ZombiU use more energy to play than NSMBU? And remember, I'm speaking strictly about the GPU, so streaming assets from the disc drive does not count as far as this question is concerned.
 

Kenka

Member
GPU power consumption changes based on what it is doing. Giving it a fixed range is not meaningful unless you say what task it is doing.

I'm sure 40W is the typical average demand for current games. As the generation goes on, the GPU will be pushed more.
I need to ask you, DS, if you are correct and how you back up your assertion. What if the GPU were loaded with the same current regardless of the workload? Does the power consumption of a GPU really vary with the game played?
 
walking fiend said:
Can't we infer more from the performance of the games that are being initially ported to/developed for the system than whether it is an overclocked Wii CPU or a Power PC?
LiveFromKyoto said:
Launch titles where the developers have no middleware, little to no documentation and don't even know the final specs until halfway through development? Of course not.
Even later on it's not great. Silent Hill HD collection would make HD twins look pretty unimpressive.
 
I need to ask you, DS, if you are correct and how you back up your assertion. What if the GPU were loaded with the same current regardless of the workload? Does the power consumption of a GPU really vary with the game played?

Sure thing

http://www.guru3d.com/articles_pages/asus_geforce_gtx_660_directcu_ii_top_review,7.html

That's an article where they stress different GPUs to 100% utilization to get the max power draw reading.

The GPU's max power draw will only happen if the GPU is utilized 100%.
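In other words, a GPU only hits its rated maximum when the workload keeps it fully busy; lighter scenes sit somewhere between idle and that ceiling. A crude linear sketch of the idea (the idle/max wattages are made up for the example, not measured Wii U numbers):

```python
# Crude linear model: board power scales between idle and max with utilization.
# IDLE_W and MAX_W are placeholder values, not measurements.
IDLE_W = 5.0    # assumed draw with the GPU doing almost nothing
MAX_W = 30.0    # assumed draw at 100% utilization

def gpu_power_watts(utilization: float) -> float:
    """Estimate draw for a utilization between 0.0 and 1.0."""
    return IDLE_W + utilization * (MAX_W - IDLE_W)

for u in (0.3, 0.6, 1.0):  # e.g. a menu, a typical port, a GPU-limited scene
    print(f"{u:.0%} busy -> ~{gpu_power_watts(u):.0f} W")
```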
 

djyella

Member
I never understood the argument that the PS4 and Nextbox would put the WiiU to shame because they would be 3x more powerful... when it seems that the WiiU is around 2-3x more powerful than the PS360 and people are complaining... I also don't understand the argument that the next boxes will show major improvements over year 2 WiiU software when devs will be getting used to new architectures all over again. This is mostly toward Brad and Special. I would understand it better if the argument was that by the second wave of PS4 and Nextbox games they will start to show improvements and after that more than likely surpass WiiU graphics... but to say that year 2 software on WiiU won't even matter against launch games from the other systems is just silly in my opinion.

Some developers are flat out saying that they need more time to figure out the CPU and how the whole system works best. Some of you don't quote those lines though. The architecture is different. The CPU is def different from what the devs are used to, but that doesn't mean that the total package is crap and worse off. You use a system to its strengths... you don't use a system in a way that it's not really built to be used, no matter how powerful it may be. That is the key.

The launch games look fine. The games will get better. We all know Sony and Microsoft are going to come out with awesome tech demos and trailers to wow us. Nintendo hopefully is preparing for this with some awesome games. My bet is that they are, but that's just my opinion. When those systems launch we will see. A LOT of people are going to look like fools. Which side those fools are on, we will see. I'm more in the middle ground and happy about it.

My opinion is that it won't be as bad as the PS2-to-Xbox Splinter Cell differences. I just don't see it. I don't see how someone can look at a game like Assassin's Creed or Uncharted, expect some improvements with the WiiU, but when the PS4 and Nextbox come out say the WiiU is rubbish. That just doesn't make sense to me. I don't understand that line of thinking. Sorry so long, Friday night here in Korea and I'm drinking hahaha :p
 

Kenka

Member
Really? So why does this review indicate only "idle" and "load" consumptions? To me, it indicates quite the contrary, and we are talking about a modern card.

www.guru3d.com/articles_pages/asus_geforce_gtx_660_directcu_ii_top_review,27.html

guru3d said:
Power consumption then, it's low if you place it into context with the game performance. Roughly 115 Watt is what we measure during gaming. The card is allowed to peak to 140 Watts. That does pose a problem though, these card will not be grand overclockers as they quickly run into the power design limitations.
It never says that the GPU draws less or more depending on the game played. Damn, I can't find a chart with "game title" on the x axis and "power consumption" on the y axis for a given GPU.
 
Even given variable power usage depending upon the game - we have an approximate limit to how much total power the console should be expected to use, with everything going, at somewhere around 55-60W.

Take away 4 USB ports, and we're at 45-50W anyway? WiFi? 5x BluRay? CPU? Based on the discussion, expecting the GPU to have a max power draw of 25-30W doesn't sound unreasonable.

Is the BluRay mostly idle while playing a game - if so, then is Iwata's "40W while gaming" that far off from this figure?
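Putting that arithmetic in one place; every figure below is a guess for illustration, not a measurement:

```python
# Back-of-the-envelope budget: start from the quoted ~40W "while gaming" figure
# and subtract guesses for everything that isn't the GPU. All numbers are assumptions.
total_gaming_draw_w = 40
other_components_w = {
    "CPU": 5,
    "RAM / eDRAM": 2,
    "optical drive": 3,
    "WiFi + GamePad streaming": 2,
    "USB, misc, PSU losses": 5,
}
gpu_budget_w = total_gaming_draw_w - sum(other_components_w.values())
print(f"Left over for the GPU: ~{gpu_budget_w} W")  # ~23 W with these guesses
```

Which lands in the same 25-30W ballpark people have been throwing around, give or take the guesses.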
 
Absolutely. A modern high-end PC GPU can have a range of 100 W depending on which game you are running.

Wow, I didn't know that. I thought it was either on or off...

A couple of questions...

Is there a possibility that the CPU could disable cores that are not in use?

Is there a separate processor for I/O? (This was speculated in WUST, but I haven't heard anything lately.)
 

Kenka

Member
Is the BluRay mostly idle while playing a game - if so, then is Iwata's "40W while gaming" that far off from this figure?

Interestingly, the Blu-Ray "should" be idle when downloaded games are played. That theoretically leaves more power for the GPGPU :eek:
Wow, I didn't know that. I thought it was either on or off...
I am still not sold.

edit:
Power Consumption Results: When it comes to power consumption the ATI cards use less idle and load. Power consumption varies from game title to game title, but this is a good idea of how much power will be consumed at load.
http://www.legitreviews.com/article/681/9/

OK then.
 
Really? So why does this review indicate only "idle" and "load" consumptions? To me, it indicates quite the contrary, and we are talking about a modern card.

www.guru3d.com/articles_pages/asus_geforce_gtx_660_directcu_ii_top_review,27.html


It never says that the GPU draws less or more depending on the game played. Damn, I can't find a chart with "game title" on the x axis and "power consumption" on the y axis for a given GPU.

Trust me. Power consumption is dependent on software.

FurMark has often been used to test the power consumption of a GPU. Nvidia's newer cards have been optimized for FurMark to make their power consumption results look better.

There are articles littered all over the internet.

Here is an article that tests both power consumption and heat back when the GTX 460 came out. It uses various software and gets different wattage and heat readings.

http://www.anandtech.com/show/2977/...x-470-6-months-late-was-it-worth-the-wait-/19


Here is a good post on GPU power consumption variations.

http://forums.atomicmpc.com.au/index.php?showtopic=264



As more demanding games come out for the Wii U, that base 40W will move up. Developers are going to have to find ways to keep the power consumption of the other parts in the Wii U low if they want to increase GPU utilization.
 

Durante

Member
Wow, I didn't know that. I thought it was either on or off...

A couple of questions...

Is there a possibility that the CPU could disable cores that are not in use?
Yes. Or at least put them into a deep-sleep state where they consume very little power. Most modern CPUs are capable of that.

However, I don't think that Wii U's CPU cores will ever be idle for long in most (retail) games.
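For what it's worth, a toy sketch of why parking unused cores helps at all; the per-core wattages are invented for the example, not real Wii U CPU figures:

```python
# Toy model: active cores draw full power, parked cores draw almost nothing.
# Per-core wattages below are invented for illustration.
ACTIVE_CORE_W = 2.0      # assumed draw of a busy core
PARKED_CORE_W = 0.1      # assumed draw of a core in a deep-sleep state
TOTAL_CORES = 3

def cpu_power_watts(active_cores: int) -> float:
    parked = TOTAL_CORES - active_cores
    return active_cores * ACTIVE_CORE_W + parked * PARKED_CORE_W

for n in range(1, TOTAL_CORES + 1):
    print(f"{n} active core(s) -> ~{cpu_power_watts(n):.1f} W")
```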
 