
Wii U GPU base specs: 160 ALUs, 8 TMUs, 8 ROPs; Rumor: Wii U hardware was downgraded


LoveCake

Member
As someone who pays a power bill, the lower power consumption is nice. But I'd willingly pay the extra $5 a year, which is moot anyway since I have to buy a PS4 to compensate for the lack of third-party support.

I have to agree.

I do not understand Nintendo's insistence on low power consumption. How much more does it cost to run an Xbone or PS4 over a year than it does to run a Wii U?

We have some good tech people here on GAF!

Could someone work out the power consumption figures over a year for each console (X360, Xbone, PS3, PS4, Wii, Wii U), with an average play time of, I don't know, four hours a day every day for a year, just to have an average?

Really, like the member I quoted, I'm not bothered about how much power my console needs to run; it's never going to be anything crazy, so I don't see what the real issue Nintendo has with it.
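Here's a rough sketch of that calculation, since it's easy to run yourself. The wattages are approximate under-load figures of the sort reported in period teardowns, not official specs (treat them as assumptions), priced at $0.12/kWh:

```python
# Back-of-the-envelope annual running cost per console.
# Wattages are rough under-load assumptions, not official figures.
WATTS_UNDER_LOAD = {
    "Wii": 17,
    "Wii U": 33,
    "X360 (launch)": 170,
    "Xbone": 110,
    "PS3 (launch)": 190,
    "PS4": 140,
}

HOURS_PER_DAY = 4
PRICE_PER_KWH = 0.12  # USD; adjust for your local tariff

for console, watts in WATTS_UNDER_LOAD.items():
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    cost_per_year = kwh_per_year * PRICE_PER_KWH
    print(f"{console:14s} {kwh_per_year:6.1f} kWh/yr  ~${cost_per_year:.2f}/yr")
```

At those assumed figures, a Wii U comes to roughly $6 a year and a PS4 to roughly $25, so the savings are real but modest: on the order of $20 a year.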
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
Yeah, and it contradicts what seemed like Nintendo's MO at the start of the Wii U's development: making it easy for 3rd parties to jump in and avoiding the Wii situation all over again. Bizarre, really. I guess somewhere along the line they just ditched that philosophy and went back to looking after their own development first and foremost.

Either that, or they were genuinely second-guessing the other next-gen specs and got it wrong / the goalposts moved? Durango was originally rumoured to be much less powerful than it ended up being, after all...

Nintendo being out of touch with the evolution of the tech industry and what developers expect for game development has been a recurring issue for a significant portion of the company's history, going all the way back to the SNES. Often they learn and make amendments, just usually far later than they should have. Sometimes they make a projection/gamble and get it super right (see: Wii market success). Sometimes they get it completely wrong in the face of stubbornness (see: HD television penetration rate). Nintendo's policy towards indie devs on their eShop, for example, is apparently really fucking fantastic now. But it took years of utter bullshit to get to this point, and it's not really working in their favour when others are doing the same/similar things with a better framework to support them.

Nintendo might project that they made a system easy to work with. Just because they believe that doesn't make it true. They're as fallible as anyone else, and just as prone to making catastrophically bad decisions backed by poor reasoning.

I'm failing to see how this matters at this point. Mainly for the reasons EatChildren stated on the last page.

What it matters to depends on what you wish to compare or discuss. Maybe it does matter. Maybe it doesn't. Or it does for some things and not others and subjectivity and yadda yadda. Either way, it's a thing, and this is the discussion.
 

LCGeek

formerly sane
Seems like there could be two decisions here working in parallel, not necessarily one as a result of the other. Killing two birds with one stone might have made sense.

What do you mean here specifically? They worked with IBM and AMD on components in the past. And agreed regarding the overheating.

Sorry, but mentioning AMD means nothing to me considering how bad their cooling solutions are compared to Nvidia's. There are plenty of legit companies they could've gone to to get cooling in order, or better yet, they could have ditched a small form factor that fewer consumers want versus power.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
So am I right in saying that the CPU in the Xbox 360 and PS3 is more powerful than the CPU in the Wii U?
 

A More Normal Bird

Unconfirmed Member
Sorry, but mentioning AMD means nothing to me considering how bad their cooling solutions are compared to Nvidia's. There are plenty of legit companies they could've gone to to get cooling in order, or better yet, they could have ditched a small form factor that fewer consumers want versus power.
I don't think the quality of AMD's reference coolers for PC GPUs has any relevance here.
 
That's it, I'm buying a PS4 and building a good gaming PC. I'm baffled by their design of the Wii U: the GamePad is way too expensive, and the console has Wii written all over it compared to the 360/PS3, albeit with better/newer instruction sets.

I honestly think that gimping the hardware this much is a huge mistake, because with hardware this weak they essentially force me to buy a good PC/PS4 for all the multiplatform titles/3rd-party support, and isn't that what they wanted to fix? The difference between Wii U and PS4/PC multiplats will be HUGE in 2-3 years. And I don't fucking care about the GamePad; I'd rather have an evolved GameCube controller, like MS did with the Xbox (rumble in the triggers? That's just awesome).

I really hate Nintendo's stance on competitive graphics, and I feel like this will hurt them a lot with the Wii U in the long run, especially regarding 3rd-party support; I just won't buy a multiplat title for the Wii U... ever.

This kind of response really is a wonder to me. Hysteria over a bunch of silly numbers on the internet. I can tell you something that is more concrete and relevant than anything you've read here: The Wii U is fun. I discovered it with the help of a mysterious, rare and magical element known as "First Hand Experience". That's when you try something with your own two hands and decide if you like it or not.

If you're really so worried about your fancy computer graphics then go watch a Super Mario 3D World or X trailer and tell me how important these so-called "specs" are. You might even realize that at the end of the day, they're just numbers. They don't determine the quality of a videogame experience. They don't limit your capacity for enjoyment. They don't control you. Don't be a slave to them.

Go now, run free, and embrace this newfound mental liberation. Have a cold shower while you're at it.
 

Toski

Member
I have to agree.

I do not understand Nintendo's insistence on low power consumption. How much more does it cost to run an Xbone or PS4 over a year than it does to run a Wii U?

We have some good tech people here on GAF!

Could someone work out the power consumption figures over a year for each console (X360, Xbone, PS3, PS4, Wii, Wii U), with an average play time of, I don't know, four hours a day every day for a year, just to have an average?

Really, like the member I quoted, I'm not bothered about how much power my console needs to run; it's never going to be anything crazy, so I don't see what the real issue Nintendo has with it.

I'm not a power consumption expert, but I know my PC probably eats more power at load than the PS4, X1, and Wii U combined. I can only think Nintendo planned to use it as a marketing ploy, except the power savings the Wii U has over the other platforms are negligible; it was probably done more for heat and reliability than anything else.
 

terrier

Member
I don't agree with Nintendo's view on how they design hardware, but at the least they should've gone for an 'easy' CPU+GPU combination that, even if only slightly better than the PS360 six years later, allowed for easy/cheap porting of multiplatform engines/games, and also for a release at a lower price point. I guess they're confident because they still have Mario and such to guarantee some degree of success, even in the worst scenarios, but...
 

Oblivion

Fetishing muscular manly men in skintight hosery
Nintendo's obsession with tiny-ass consoles has been annoying as fuck. Is it worth having an admittedly quiet, low-power console if it means you won't be able to get multiplat games?
 

SmokyDave

Member
If you're really so worried about your fancy computer graphics then go watch a Super Mario 3D World or X trailer and tell me how important these so-called "specs" are.
I don't get it. If they hadn't upgraded the Wii at all, you wouldn't be wowed by those trailers. Specs are obviously important, just look at the feverish excitement in this recent Shin'en thread for an example. Even 3DS gamers got excited when they saw 'Ironfall'. People do care about this stuff.

You could've had graphics like that years ago if Nintendo weren't so far behind. The shit that makes you go 'wow!' next generation will be on a level that PS4 / XBone players will be enjoying this generation.
 

Seik

Banned
Gotta appreciate the Wii U getting the best ports of Batman: Arkham Origins and Deus Ex, because it's something that won't last for long. :/
 

Log4Girlz

Member
Fire Iwata, develop for other platforms, and (probably) alienate, if not offload in the long run, their current staff. And thus end up becoming yet another entertainment software company competing for mediocre margins.

Or they can break new ground.

In any case, the time to decide is now.

There are precious few "new" avenues to stroll down. See, if they had played their cards right, they could be providing a system that didn't break the bank, still offered good specs, and had not only third-party titles galore but Nintendo games as icing on the cake. Who wouldn't buy that kind of console?

But when I say "played their cards right", well, that would have meant not fucking over third parties with the N64 (which retained carts), the GameCube (mini-discs and an abhorrent exterior design which could very well have been made to fend off teens and young adults), and the Wii (an overclocked GameCube: a successful machine which finally gave Nintendo some breathing room after the poor decisions of their previous two consoles, but which alienated third parties and the engines they were pouring money into creating). One wonders if such a machine would ever have been necessary if they hadn't fucked up so badly previously.

So I think it's too late to turn around and deliver a "hardcore" machine... because the Nintendo name is too tarnished to just turn their fortunes around in one generation. They'll have to do something unconventional, and I think they will.
 

Jaagen

Member
I don't get it. If they hadn't upgraded the Wii at all, you wouldn't be wowed by those trailers. Specs are obviously important, just look at the feverish excitement in this recent Shin'en thread for an example. Even 3DS gamers got excited when they saw 'Ironfall'. People do care about this stuff.

You could've had graphics like that years ago if Nintendo weren't so far behind. The shit that makes you go 'wow!' next generation will be on a level that PS4 / XBone players will be enjoying this generation.

I agree.

And would it be possible to put a smaller (28nm) but faster chipset in the same case? If so, why didn't Nintendo spend the extra R&D on making a smaller chip?
 

Oblivion

Fetishing muscular manly men in skintight hosery
Wait, this news is very confusing. Does this mean that the GPU is weaker than the ones in the PS360?
 

Nikodemos

Member
Wait, this news is very confusing. Does this mean that the GPU is weaker than the ones in the PS360?
No, it isn't. If it were, there would either be no multiplats on it, or they would look noticeably worse (which they don't). OTOH, it's very convoluted to use properly, hence the very few multiplats (also because of the low installed base).
 

LCGeek

formerly sane
I don't think the quality of AMD's reference coolers for PC GPUs has any relevance here.

Someone mentions their expertise, but they have very little compared to companies that do cooling as their main specialty, so pointing to AMD as being able to help out in that area is pointless. Nintendo flat out fucked up this system and should've shown some humility on the form factor, because it's costing them big time. When Nintendo can't realize their vision because of heat issues, which is one reason the GC ended up being downgraded and why both Wiis are weak, I stop caring. It's a prime example of not having good priorities.
 

EdLin

Neo Member
I don't get it. If they hadn't upgraded the Wii at all, you wouldn't be wowed by those trailers. Specs are obviously important, just look at the feverish excitement in this recent Shin'en thread for an example. Even 3DS gamers got excited when they saw 'Ironfall'. People do care about this stuff.

You could've had graphics like that years ago if Nintendo weren't so far behind. The shit that makes you go 'wow!' next generation will be on a level that PS4 / XBone players will be enjoying this generation.

Yes, but those graphics that make you go "wow" on a PS4 and Xbox One won't be in a Mario game, or an RPG from Monolith. That said, the audience for Nintendo exclusives is not big enough to carry their platforms. The Wii was a console built with a different philosophy than the Wii U, beyond the "overclocked GameCube with a gimmick and a better GPU" formula, which the Wii U does repeat: the Wii was a family-and-friends box, while the Wii U is more of a single-player experience. That means the network effect that made the Wii a huge success isn't quite there with the Wii U. The fact that the average consumer thinks it's a tablet for the Wii doesn't help matters.

I say this not to flame Nintendo, but out of concern for the company. I happen to have a Wii U and love it (and a PS3), but people not into Nintendo console exclusives aren't going to buy one. It's another GameCube, really, though considering the quality of the games Nintendo produced for the GameCube, that doesn't make it not worth owning if quality over quantity is your thing. :)

I predict Nintendo might decide to become a handheld company, or do what GAF has been predicting and become a hybrid handheld/conventional-console company. If they do the latter and do it right, I think they'll have a hit on their hands; look at how much excitement there is over the Vita TV. (A 3DS TV wouldn't be as good, though; obviously they'd need to base it on more up-to-date mobile technology, which they no doubt will in the handheld generation after this one, or at least on something suitable for an HD TV in the mobile SoC department, which would be doable with 2014 technology already road-mapped.)
 

nikatapi

Member
I think the goal of perfect backwards compatibility was one of the main reasons that led Nintendo to use this type of architecture for the system. Maybe they wanted to avoid making completely new graphics engines for the console in order to get games out faster (which ultimately didn't happen).

Also, the low consumption / small case is a continuation of the Wii philosophy, but they wanted to appeal to hardcore gamers this time, so I think they could have made a machine a little more power-hungry but much more capable.

Ultimately, I personally think they failed from every angle with this hardware: it is barely more powerful than 7-year-old machines, it has an awful design which led to more brand confusion, and it alienated some 3rd parties.

Still, the first-party games look good, and I think we will be more impressed in the future.
 

wsippel

Banned
So am I right in saying that the CPU in the Xbox 360 and PS3 is more powerful than the CPU in the Wii U?
Depends on the workload. When it comes to floating-point code? Definitely; we've known that for quite some time. Then again, in that regard, CELL beats Jaguar as well.
 
Not in any way surprising. Actually, just how nice the best games look is surprising. But that's not saying much, really. This news is no surprise. This is the same philosophy as the Wii. The Wii U is in every way the Wii's sequel.
That's plainly false and you know it.
The Wii had a 19 mm^2 CPU and a 79 mm^2 GPU/DSP/ARM/Northbridge die.
The Wii U has a 27.73 mm^2 CPU and a 146.48 mm^2 GPU/DSP/ARM/Northbridge die.
The GC had a 43 mm^2 CPU and a 110 mm^2 GPU/DSP/ARM/Northbridge die.

In terms of die area, the chips are clearly much bigger than they were on the Wii; in fact, their combined area is bigger on the Wii U than on the GC.

Regarding the specs listed in this thread, I have two questions:
- Are those numbers known for sure, or are they deduced from the "downgrade" information you (bgassassin) had?

- Even excluding the 40 mm^2 of the 32MB of eDRAM, we are still left with 106 mm^2 of GPU area (the 2+1 MB of eDRAM/eSRAM occupy between 10 and 12 mm^2, so let's be "generous" and say it goes down to 90 mm^2 of pure GPU area without any memory on it).
If we compare with the HD 4770 (RV740) released in 2009 on a 40nm fabrication process: it had a 640:32:16 core with double precision (I doubt the Wii U has that, which would mean smaller shader processors for the same single-precision FLOP performance), a GDDR5 controller (which, as far as I know, is bigger than a standard DDR3 controller like the one used on the Wii U), and a total area of 137 mm^2.

There's only one explanation to me for why Nintendo couldn't even reach a 320:16:8 core with (more than) 90 mm^2 of pure GPU area, and that explanation is that the GPU is fabricated at 55nm or larger, not the 40nm we suspected.
Now, is it possible to put 32MB of eDRAM in 40mm^2 with a transistor size of 55nm?

To compare apples to apples: the GC's 2+1 MB of eDRAM/eSRAM occupied a bit less than 1/3 of the total area of the chip (110 mm^2). 110/3 = 36.6 mm^2.
Since it's "a bit less" until someone measures it on the die shot, let's suppose a 30 mm^2 figure at 180nm.
Now, even going by the lowest figure (10 mm^2) for the smaller banks of memory, it seems pretty obvious that this console isn't fabricated at 40nm (30 mm^2 at 180nm would occupy less than 1.875 mm^2 at 40nm).

In fact, going by the figures, I would say a 90nm fabrication process is the most probable scenario. Is that even possible?
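To make the node-scaling arithmetic above explicit, here's a minimal sketch assuming ideal area scaling, where area shrinks with the square of the feature-size ratio (real eDRAM scales considerably worse than ideal, which only strengthens the point):

```python
# Ideal (best-case) die-area scaling between process nodes.
def scale_area(area_mm2, node_from_nm, node_to_nm):
    # Area goes with the square of the linear feature-size ratio.
    return area_mm2 * (node_to_nm / node_from_nm) ** 2

GC_EDRAM_MM2 = 30.0  # assumed area of the GC's 2+1 MB at 180nm, per above

for node_nm in (90, 55, 40):
    shrunk = scale_area(GC_EDRAM_MM2, 180, node_nm)
    print(f"{node_nm}nm: {shrunk:.2f} mm^2")
# 90nm: 7.50 mm^2 | 55nm: 2.80 mm^2 | 40nm: 1.48 mm^2
```

Set against the 10-12 mm^2 estimated above for the Wii U's 2+1 MB banks, only the coarser nodes land anywhere near, which is the gist of the 55-90nm suspicion.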
 

Oblivion

Fetishing muscular manly men in skintight hosery
I think it means that it was planned to be a bigger leap, but ultimately got cut down to around the same level.

Flops != Flops

No, it isn't. If it were, there would either be no multiplats on it, or they would look noticeably worse (which they don't). OTOH, it's very convoluted to use properly, hence the very few multiplats (also because of the low installed base).

It's a Lotus Elise on a twisty track. It'll eat a Mustang's lap time for breakfast.

Okay, because I think the one positive thing devs had to say about the Wii U was about its GPU. Pretty sure everyone praised it.
 

Chindogg

Member
So the CPU's the bottleneck but the GPU can do some neat tricks?

Isn't that exactly what the consensus was almost a year ago when this debate started?
 

wsippel

Banned
I doubt the downgrade rumors, by the way. They removed transistors because the system was overheating? Makes no sense considering they've apparently upped the clocks as well. The clock increase would produce much more additional heat than the lower transistor count could compensate for, so the revised design would run hotter overall.
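For what it's worth, here's a toy model of that tradeoff, assuming dynamic power scales as active transistors × V² × f; the ALU counts follow the rumour, while the clocks and voltages are purely hypothetical, just to show which way the numbers pull:

```python
# Toy dynamic-power model: P ~ active_transistors * V^2 * f.
# ALU counts follow the rumour; clocks and voltages are hypothetical.
def rel_power(alus, volts, mhz):
    return alus * volts ** 2 * mhz

original = rel_power(320, 1.00, 500)  # rumoured pre-downgrade config
cut      = rel_power(160, 1.00, 550)  # halved ALUs, modest upclock
cut_vup  = rel_power(160, 1.15, 550)  # same, if the upclock needed +15% voltage

print(f"{cut / original:.2f}")      # ~0.55
print(f"{cut_vup / original:.2f}")  # ~0.73
```

Under this naive model, whether the revised design runs hotter hinges mostly on how much extra voltage the higher clock demanded, since voltage enters squared.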
 

Oblivion

Fetishing muscular manly men in skintight hosery
So the CPU's the bottleneck but the GPU can do some neat tricks?

Isn't that exactly what the consensus was almost a year ago when this debate started?

Yeah, I'm starting to think that this news isn't exactly news.
 

Chindogg

Member
Do we have any real numbers behind this? Cut all the bullshit and just establish the pure facts here.

This has got to be the most confusing argument for any console ever. Even more confusing than the Saturn.
 

Nikodemos

Member
Okay, because I think the one positive thing devs had to say about the Wii U was about its GPU. Pretty sure everyone praised it.
The GPU is impressively efficient. It produces PS360-level graphics on a 20-watt budget. The joint (AMD+Nintendo) design team performed a great feat of technomancy.

Unfortunately, it still is greatly removed from what the new wave of consoles coming this November can do.
 

Goodlife

Member
I doubt the downgrade rumors, by the way. They removed transistors because the system was overheating? Makes no sense considering they've apparently upped the clocks as well. The clock increase would produce much more additional heat than the lower transistor count could compensate for, so the revised design would run hotter overall.

Interesting....
 
It's a Lotus Elise on a twisty track. It'll eat a Mustang's lap time for breakfast.

That's really an overstatement. In reality it's very close to the X360 GPU and noticeably more powerful than the RSX found in the PS3 (though the Cell processor crushes both combined in terms of CPU power).
 

Zinthar

Member
The GPU is impressively efficient. It produces PS360-level graphics on a 20-watt budget. The joint (AMD+Nintendo) design team performed a great feat of technomancy.

Unfortunately, it still is greatly removed from what the new wave of consoles coming this November can do.

It's actually more like 32-33 watts while playing games. And that's not too impressive from an efficiency perspective versus, say, Apple's A7 SoC. The iPad Air draws a mere 5 watts while playing Infinity Blade 3, and the A7 is approaching the power of the PS360 in many ways (its GPU is around 115 GFLOPS).

I also suspect that the PS4 (and possibly Xbox One) will be more efficient than the Wii U relative to its performance. We'll see soon enough.
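As a ballpark check, divide commonly cited peak GPU GFLOPS by measured wattage. Note this pits GPU-only FLOPS against whole-system power draw, and the PS4's in-game wattage is an assumption, so treat the ratios as rough at best:

```python
# Rough performance-per-watt comparison. GFLOPS are peak GPU figures;
# watts are whole-system draw while gaming, so ratios are ballpark only.
systems = {
    "Wii U":    (176.0,  33.0),   # 160 ALUs * 2 ops * 550MHz; ~33W per this thread
    "iPad Air": (115.0,   5.0),   # A7 figures quoted above
    "PS4":      (1843.0, 140.0),  # 1.84 TFLOPS; ~140W in-game is an assumption
}

for name, (gflops, watts) in systems.items():
    print(f"{name:9s} {gflops / watts:5.1f} GFLOPS/W")
# Wii U ~5.3 | iPad Air ~23.0 | PS4 ~13.2
```

Which would indeed put the PS4 well ahead of the Wii U on performance per watt, and the A7 ahead of both.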
 

CTLance

Member
Very interesting. Thanks for the confirmation, bgassassin (/EatChildren).

Obligatory "LOL, Nintendo's gonna Nintendo", and holy crap, that is a low-specced GPU given contemporary designs. :lol

Then again, as a Nintendo-leaning gamer, this is nothing new. I loved my Wii last gen despite the horrifying resolution, assy sound output options, and generally weak hardware; and while region codes and silly account shenanigans still keep me away from both the Wii U and the 3DS, both are very likely to end up in my collection after I've upgraded the flip out of my PC. There's just something different about Nintendo games that can't be found anywhere else.
Or so I tell myself.
 

tesla246

Member
This kind of response really is a wonder to me. Hysteria over a bunch of silly numbers on the internet. I can tell you something that is more concrete and relevant than anything you've read here: The Wii U is fun. I discovered it with the help of a mysterious, rare and magical element known as "First Hand Experience". That's when you try something with your own two hands and decide if you like it or not.

If you're really so worried about your fancy computer graphics then go watch a Super Mario 3D World or X trailer and tell me how important these so-called "specs" are. You might even realize that at the end of the day, they're just numbers. They don't determine the quality of a videogame experience. They don't limit your capacity for enjoyment. They don't control you. Don't be a slave to them.

Go now, run free, and embrace this newfound mental liberation. Have a cold shower while you're at it.

Are you implying that I do not own a Wii U? Because it sure seems like it. I bought one on launch day, and the only game I really enjoyed was Pikmin 3, in a whole year! Sure, I am looking forward to 3D World, X, Kart, DK, Smash etc., and will buy them day one.

That's not the point, though. The point is that Nintendo promised 3rd-party support when, in reality, it is practically non-existent as of now.
When there are multiplats in the next 5 years that I am ALSO interested in (besides Nintendo-only titles), the difference will be enormous compared to the other platforms, because I feel like the console's GamePad feature costs way too much in comparison with the hardware. And that bothers me, because Nintendo is actually forcing me to buy a PS4/PC, since the differences in multiplatform titles will be huge in the next few years, which is just sad.

And you could say I should've known this before buying a Wii U, but I honestly thought the gap wouldn't be this big in pure processing power (I can live with a ''low-quality texture here and there'').
 
Eh, I would be more open to leaving Nintendo if they didn't have, like, the only shooter I ever enjoyed and the only puzzle/adventure series I really enjoy.
 

The_Lump

Banned
As I look at my Wii U collecting dust I often think to myself "at least the system is pretty small".

Not really adding anything to the discussion, but how would playing your Wii U more stop it gathering dust? What do you do with your console whilst you're playing games on it that prevents dust gathering on it?

How was it possible to release a next-gen games machine in 2012 that had fewer FLOPS than the GPU from a machine released in 2006?

Nintendo. You are a joke.

It's pretty easy when that's what you're trying to do. :) Don't make the mistake of thinking Nintendo were trying to create something more powerful and failed. They created exactly what they intended; it just seems it was entirely the wrong choice this time.
 

artist

Banned
Please educate us as to why these metrics are particularly relevant.
I am literally inches away from not sending my application to Nintendo today.
Those "metrics" are the basic building blocks of the GPU, so they're completely relevant. I don't know what that has to do with your application...

I think it's the CPU power.
They have the same CPU; the Xbox One's is just slightly overclocked.

The double pixel fill rate is because the PS4 has twice the ROPs.
Wrong, the right answer was already given earlier in the thread.

I doubt the downgrade rumors, by the way. They removed transistors because the system was overheating? Makes no sense considering they've apparently upped the clocks as well. The clock increase would produce much more additional heat than the lower transistor count could compensate for, so the revised design would run hotter overall.
This line of thinking doesn't account for the big delta in TDP between a 320-ALU GPU and a 160-ALU GPU.

For example:
Redwood LE - 320 ALUs @ 550MHz = 39W
Caicos - 160 ALUs @ 625MHz = 18W

If you normalize the clocks, the difference may end up around 25W. That difference looks small, but in a small box it's significant.
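Making that normalization explicit, under the first-order assumption that dynamic power scales roughly linearly with clock at a fixed voltage:

```python
# Scale Caicos's TDP down to Redwood LE's clock, assuming power
# scales ~linearly with frequency at a fixed voltage.
REDWOOD_LE = {"alus": 320, "mhz": 550, "tdp_w": 39.0}
CAICOS     = {"alus": 160, "mhz": 625, "tdp_w": 18.0}

caicos_at_550 = CAICOS["tdp_w"] * REDWOOD_LE["mhz"] / CAICOS["mhz"]
delta = REDWOOD_LE["tdp_w"] - caicos_at_550

print(f"Caicos @ 550MHz: {caicos_at_550:.1f}W")  # ~15.8W
print(f"Delta vs Redwood LE: {delta:.1f}W")      # ~23W
```

So roughly 23-25W between the two configurations, which in an enclosure the size of the Wii U's is a meaningful thermal difference.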
 