
Rumor: Wii U final specs

Thank you sir, so that more or less confirms the three-letter code, but it still doesn't tell us what the P stands for... I'm probably far more interested in this than I should be... but I've always been fond of these codes ever since I read an article on what the DOL stood for on the back of GCN game cases (and later confirmed for myself that every other system Nintendo made did the same).

Sounds like a job for Iwata Asks (laughs). I think CTR is still a mystery too.
I don't know if the Dolphin codename was officially announced but I remember it being public knowledge beforehand.
 
A lot of people on GAF really need to grow up.

If all you can say is "LOLOLOL looks like crap" and can't appreciate good graphics in relation to the hardware they run on, then you might as well switch to PC gaming only, to get the best graphics possible.

Mario Galaxy 1/2 look gorgeous no matter how much stronger the Xbox 360 is, because you have to consider that they run on what is essentially an overclocked GameCube...

Assassin's Creed (especially Brotherhood and Revelations) looks freaking awesome on the 360 even though the PC versions are far superior. Why? Because for 2005 tech, 241 GFLOPS and a DX9-class GPU, it looks freaking awesome.

Comparing every console's visuals to the best visuals available (PC) without taking the hardware into account, and calling them crap because they don't match or surpass it, is pretty childish behaviour.

Yes, PS4/720 will be more powerful. There's no doubt about it. But why are you so afraid of the fact that Nintendo will have the strongest hardware for about a year that you jump at every opportunity to defend the Xbox 360/PS3?
 

ozfunghi

Member
One might even say a marginal margin.

Look, there will be some things WiiU undoubtedly does better. Texture res can be higher, world size can be larger. But we are still talking last gen + visuals.

I happen to think that if you have trouble seeing your vision through to fruition on hardware that capable... you're doing something really wrong.


I've seen you make statements like these quite often as of late. Looks like you have some info the rest of us don't. So far we are looking at at least twice the amount of RAM, 3 times the eDRAM, and, if bgassassin's guesstimates are somewhere in the same ballpark, a 2-3 times more powerful GPU with modern features. Now, I understand this is far from a generational leap, but I for one am starting to get annoyed by comments such as this one. Marginal, it is not.

So please, share what extra information you might be holding on to.
 

Eteric Rice

Member
My only concern is whether it can receive ports of the big next-gen games. I would be amazed if Nintendo wasn't forward-thinking enough to have their system at least capable of getting those games, even if they don't look as pretty.

That will be a pretty big factor in determining Wii U's future.
 

Theonik

Member
I wonder if Nintendo has some sort of system in place to collect usage data from Internet-connected consoles, like Steam's hardware survey. This would be useful in determining how much of the multi-task memory users actually use and how much RAM they can free up for developers in the future.
Edit:
My only concern is whether it can receive ports of the big next-gen games. I would be amazed if Nintendo wasn't forward-thinking enough to have their system at least capable of getting those games, even if they don't look as pretty.

That will be a pretty big factor in determining Wii U's future.
I don't think this should be a huge problem. Sure, they won't look as good, presuming PS4/Nextbox are more powerful of course, but this time round we shouldn't have the huge technological gap that prohibited easy down-porting to the Wii.
 

Crazyorloco

Member
I like Nintendo damage controllers.

"OMG look how beautiful Mario galaxy looks. Looks better then some 360 games."

Guy: "So Wii U is gonna be stomped by PS4/720."

"no one cares about graphics!"

We can all quote imaginary people. Although Mario Galaxy is gorgeous, that's not really one of its main high points. As numerous reviews note, it has really great gameplay and it's fun. That's what the Wii focuses on.

Eteric Rice, I'm with you on that. I hope the Wii U gets all the major multiplatform games that are released in the future. If Grand Theft Auto V comes out for the 360 I'll be happy, but if it comes out for the Wii U I'll be really happy.
 

Van Owen

Banned
A lot of people on GAF really need to grow up.

If all you can say is "LOLOLOL looks like crap" and can't appreciate good graphics in relation to the hardware they run on, then you might as well switch to PC gaming only, to get the best graphics possible.

Mario Galaxy 1/2 look gorgeous no matter how much stronger the Xbox 360 is, because you have to consider that they run on what is essentially an overclocked GameCube...

Assassin's Creed (especially Brotherhood and Revelations) looks freaking awesome on the 360 even though the PC versions are far superior. Why? Because for 2005 tech, 241 GFLOPS and a DX9-class GPU, it looks freaking awesome.

Comparing every console's visuals to the best visuals available (PC) without taking the hardware into account, and calling them crap because they don't match or surpass it, is pretty childish behaviour.

Yes, PS4/720 will be more powerful. There's no doubt about it. But why are you so afraid of the fact that Nintendo will have the strongest hardware for about a year that you jump at every opportunity to defend the Xbox 360/PS3?

No one is denying that, but it only has the most powerful hardware by a small margin, and the visuals you see will differ little from current gen.

The laughable thing is seeing people praise Darksiders screens as superior when they're old, or being blown away by Trine's lighting when it looks the same.
 

kinggroin

Banned
Well I'm glad they're taking Wii U's launch seriously with cheaply made games? lol

When Retro's game looks like a good PS3/360 title it will be "IT'S THEIR FIRST GAME ON NEW HARDWARE".


...curious, do you actually have a point to make in this thread?

I don't myself, but thought I'd ask
 
No one is denying that, but it only has the most powerful hardware by a small margin, and the visuals you see will differ little from current gen.

The laughable thing is seeing people praise Darksiders screens as superior when they're old, or being blown away by Trine's lighting when it looks the same.

Darksiders 1/2 in MY opinion is ugly on every platform. Cartoon style + 4 horsemen? NO thanks.

And time will tell if it's really only "marginal", because IMO it's quite a bit more beefy.
 

Thraktor

Member
Thank you sir, so that more or less confirms the three-letter code, but it still doesn't tell us what the P stands for... I'm probably far more interested in this than I should be... but I've always been fond of these codes ever since I read an article on what the DOL stood for on the back of GCN game cases (and later confirmed for myself that every other system Nintendo made did the same).

Given they're product codes, my guess is it simply stands for Wii U Product.
 

AlStrong

Member
Do we have an idea what realistically the chip could do in terms of GFLOPS per Watt? I have no idea what is reasonable nowadays.

Anywhere between 10 and 20. ;)

It's not a particularly great metric because there are a number of factors that go into the power consumption besides shader throughput (TMU/ROP/eDRAM amounts, process node, ALU design, clock speeds etc). And if you look at TDP for various PC cards, you're also considering the RAM type, speed, and number of chips.

So it really depends on what you think all of those factors really are, at which point, GFLOPs/W is meaningless. :p
 

Mr Swine

Banned
A GPU in the 20-ish watt range is probably going to be closer to 400 gigaflops than 600.

Doesn't that depend on what process node the GPU is built on? If it's 40nm then it would be closer to 300 gigaflops, and 600 if at 28nm? Would be smart to produce the GPU at 28nm now that 32nm is more or less dead.
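
If anyone wants to turn those ranges into numbers, here's the back-of-envelope in Python. The efficiency figures, their rough mapping to process nodes, and the 20W budget are just the guesses thrown around above, not confirmed specs:

Code:
# Throughput estimate = assumed efficiency (GFLOPS per watt) x assumed power budget (watts).
# Every number here is an illustrative guess, not a confirmed Wii U spec.

def estimated_gflops(power_watts, gflops_per_watt):
    return power_watts * gflops_per_watt

gpu_budget_watts = 20.0  # hypothetical share of the console's power draw for the GPU

for label, efficiency in [("~10 GFLOPS/W (older node)", 10.0), ("~20 GFLOPS/W (newer node)", 20.0)]:
    print(label, "->", estimated_gflops(gpu_budget_watts, efficiency), "GFLOPS")
# ~10 GFLOPS/W (older node) -> 200.0 GFLOPS
# ~20 GFLOPS/W (newer node) -> 400.0 GFLOPS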
 

axisofweevils

Holy crap! Today's real megaton is that more than two people can have the same first name.
Given they're product codes, my guess is it simply stands for Wii U Product.

I think the P could stand for Premium.
That's what the black Deluxe model is called here in Europe.

Alternatively, it could be P for Pad.
 
Thank you sir, so that more or less confirms the three-letter code, but it still doesn't tell us what the P stands for... I'm probably far more interested in this than I should be... but I've always been fond of these codes ever since I read an article on what the DOL stood for on the back of GCN game cases (and later confirmed for myself that every other system Nintendo made did the same).

Wii U Product
 

Earendil

Member
Really? What games have you seen that look better?

That's not what I meant; I guess I wasn't very clear on that. What I was trying to say is that no matter how good Wii U games look, many people will allow their cognitive bias to get the better of them and claim they look worse than PS360 games. Case in point: E3 2011, when many here stated that games from the sizzle reel looked worse than their PS360 counterparts, only to find out that the supplied footage was from those PS360 games and not Wii U versions.

Sorry, I meant 2.8 teraflops.

That's better... :p

Not really. For starters, we have no idea what process it's built on. And then there's the whole width vs. height thing.

And FLOPS is kind of misleading anyway because a lower FLOP'd chip can sometimes perform as well as a higher one (see 6750 vs 4850).
 

Koren

Member
Alternatively, it could be P for Pad.
They never changed the three letters for different things (pad, power block, sensor bar, stand, etc.) related to the same console, so I don't see them changing that. The number is used to distinguish the different parts. I also don't think that the color would warrant a change of letters.

Probably Project, indeed. But color me disappointed, I wanted CAF or something like that. Kinda boring, like AGB or DMG (except the first one is at least a nice bit of trivia).


I'll thus go with a P for Pikmin...
 

wsippel

Banned
I think discussing width and height in a Wii thread is dirty. Go on...
lol

Well, what I mean is this (heavily simplified): Let's assume you have a GPU that does 100GFLOPS with 100 shader units at 100MHz, and consumes 5W. And you have a 20W budget. You could either quadruple the number of shader units and leave the clock as is, or you could leave the number of shader units as is and double the clock. Power consumption increases linearly with the number of transistors and exponentially with the clock, so in both cases, the resulting chip would consume 20W - but the former approach would lead to a chip twice as fast. The latter approach is less effective, but it's also a lot cheaper.
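
To put that in concrete numbers, here's a quick Python sketch of the same simplified model, using the figures from the example (Durante refines the actual power scaling a few posts down):

Code:
# Simplified model: power scales linearly with shader count and roughly with the
# square of the clock. Baseline: 100 units @ 100MHz = 100 GFLOPS at 5W.

BASE_GFLOPS, BASE_WATTS = 100.0, 5.0

def scaled(unit_scale, clock_scale):
    gflops = BASE_GFLOPS * unit_scale * clock_scale
    watts = BASE_WATTS * unit_scale * clock_scale ** 2  # simplified power model
    return gflops, watts

print(scaled(4, 1.0))  # quadruple the units, same clock -> (400.0, 20.0)
print(scaled(1, 2.0))  # same units, double the clock   -> (200.0, 20.0)
# Both land on the 20W budget, but the wide/slow chip is twice as fast.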
 

Log4Girlz

Member
lol

Well, what I mean is this (heavily simplified): Let's assume you have a GPU that does 100GFLOPS with 100 shader units at 100MHz, and consumes 5W. And you have a 20W budget. You could either quadruple the number of shader units and leave the clock as is, or you could leave the number of shader units as is and double the clock. Power consumption increases linearly with the number of transistors and exponentially with the clock, so in both cases, the resulting chip would consume 20W - but the former approach would lead to a chip twice as fast. The latter approach is less effective, but it's also a lot cheaper.

Ahh, fascinating.
 

mrklaw

MrArseFace
5GHz is smart, it'll be less prone to interference from wifi, cordless phones, etc.

I hope it doesn't adversely affect 5GHz wifi though, I picked that specifically to avoid congestion with my neighbours
 
I've seen you make statements like these quite often as of late. Looks like you have some info the rest of us don't. So far we are looking at at least twice the amount of RAM, 3 times the eDRAM, and, if bgassassin's guesstimates are somewhere in the same ballpark, a 2-3 times more powerful GPU with modern features. Now, I understand this is far from a generational leap, but I for one am starting to get annoyed by comments such as this one. Marginal, it is not.

So please, share what extra information you might be holding on to.
Marginal it is, at least so far. You can't blame Thunder Monkey, because that is reality. The ports that we have seen running on the Wii U don't feature significant graphical enhancements.

Also, what an incredible orchestrated assault on objectivity was executed by a band of loyalists claiming Nintendo Land is some sort of incredible graphical showcase. It was like a trip into the mind of a schizophrenic. :)
 

mrklaw

MrArseFace
More memory will not give you more frames per second anyway, unless we're talking scenarios where your scene assets do not fit in your RAM, so you'd have to do mid-frame fetches of assets from storage, in which case storage speed becomes the bottleneck. But how is 2GB not better than 1GB at the same BW? Yes, there might be a diminishing return with increasing the amount to, say, 32GB from 16GB as you have to fill up that memory with something. At 25GB storage you can try to preload most of your content off the disk, but at 22.5MB/s that's 18.5 minutes to read the entire disk, so you need to do that in the same streaming fashion anyway. But we're not talking such amounts here - we're talking 1GB vs 2GB. Trust me, the latter is definitely better than the former.

You'd hit diminishing returns way before that though.

E.g. if you calculate the available memory bandwidth and the capacity of the GPU to process 'x' number of polys, then there is a finite amount of memory that can be used to draw the immediate scene. Then, if you have a landscape to walk around, you'll need additional memory to hold objects you can't yet see. But only nearby objects - things further away can be streamed in off disc.

It's possible that 1GB is enough for that on Wii U.

There was an article linked a while ago that looked into this in more detail, estimating how much memory you'd need for an open-world streaming engine at different RAM speeds.
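
For reference, the arithmetic behind the quoted drive numbers, plus a made-up streaming budget just to illustrate the point (the disc size and drive speed are the figures from the quoted post; the 50% budget is purely an example):

Code:
# How long a full disc read takes at the quoted drive speed.
DISC_SIZE_GB = 25.0
DRIVE_MB_PER_S = 22.5

full_read_minutes = DISC_SIZE_GB * 1000 / DRIVE_MB_PER_S / 60
print(round(full_read_minutes, 1), "minutes to read the whole disc")  # ~18.5

# So an open-world game can't just preload everything; it has to stream. If you
# (hypothetically) reserve half the drive's bandwidth for background streaming:
streaming_budget = DRIVE_MB_PER_S / 2
print(streaming_budget, "MB/s of new assets while playing")  # 11.25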
 
I've seen you make statements like these quite often as of late. Looks like you have some info the rest of us don't. So far we are looking at at least twice the amount of RAM, 3 times the eDRAM, and, if bgassassin's guesstimates are somewhere in the same ballpark, a 2-3 times more powerful GPU with modern features. Now, I understand this is far from a generational leap, but I for one am starting to get annoyed by comments such as this one. Marginal, it is not.

So please, share what extra information you might be holding on to.
Actually, as funny as it is, I've been hearing a few things that lead me to believe porting will be much easier than I initially thought. But I don't think we're getting that powerful of a system with the Wii U. More than enough power to see anything in my head on the screen, though.

But I usually preface those statements by saying "Power multipliers mean vastly different things now than they once did." Now it's the difference between Beyond Two Souls, and that snazzy tessellated frog CryEngine demo. Appreciable, but not gigantic.

For a long time I've thought the system would be an "M2" style upgrade. M2 versus Xbox using vague generational definitions. WiiU is the M2, N64 is PS3/360, and Xbox is Orbis/Durango.

In that era that meant huge visual differences. In this era it's the difference between Beyond Two Souls, and that tessellated frog. Noticeable under bad conditions, but not the same kind of difference.
 

Log4Girlz

Member
Marginal it is, at least so far. You can't blame Thunder Monkey, because that is reality. The ports so far running on the Wii U don't feature significant graphical enhancements.

Also, what an incredible orchestrated assault on objectivity was executed by a band of loyalists claiming Nintendo Land is some sort of incredible graphical showcase. It was like a trip into the mind of a schizophrenic. :)

Well at least it is pretty. So is NSMBW2. Though I'm not a big fan of either.
 

wsippel

Banned
Ahh, fascinating.
It certainly is. In the end, it's all about effectiveness versus efficiency. And as always, it comes down to money. Nintendo most likely knew they wanted a small case and low power consumption, so they looked at what was possible and set a budget. If you look at my example, you'll probably come to an obvious conclusion: why not clock the GPU down to 50MHz and go with 1600 shader units? Yep, still within the energy budget, and it would be blazing fast, but the chip would be huge and a lot more expensive. So, as I wrote, it's about balance, and making the most of what you have or are willing to spend.
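
Running the same simplified numbers for that "even wider" case (again, just a sketch under the earlier assumptions, not real hardware figures):

Code:
# 16x the shader units at half the clock, same simplified model as before:
# baseline 100 units @ 100MHz = 100 GFLOPS at 5W.
BASE_GFLOPS, BASE_WATTS = 100.0, 5.0
unit_scale, clock_scale = 16, 0.5  # 1600 units at 50MHz

gflops = BASE_GFLOPS * unit_scale * clock_scale      # 800.0
watts = BASE_WATTS * unit_scale * clock_scale ** 2   # 20.0
print(gflops, "GFLOPS at", watts, "W")  # fast and within budget, but a huge, expensive die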
 
Actually, as funny as it is, I've been hearing a few things that lead me to believe porting will be much easier than I initially thought. But I don't think we're getting that powerful of a system with the Wii U. More than enough power to see anything in my head on the screen, though.

But I usually preface those statements by saying "Power multipliers mean vastly different things now than they once did." Now it's the difference between Beyond Two Souls, and that snazzy tessellated frog CryEngine demo. Appreciable, but not gigantic.

For a long time I've thought the system would be an "M2" style upgrade. M2 versus Xbox using vague generational definitions. WiiU is the M2, N64 is PS3/360, and Xbox is Orbis/Durango.

In that era that meant huge visual differences. In this era it's the difference between Beyond Two Souls, and that tessellated frog. Noticeable under bad conditions, but not the same kind of difference.

From what I've heard and read on this forum, I have a hard time believing that the games releasing on launch day are as good as it gets for the Wii U. It's almost like there's a sector of posters who refuse to believe or entertain the thought that it will do anything more than what we've seen. And that's without taking into consideration that the launch-window games are not even optimized. Retro's new game, Zelda U, and Mario U need to at least be released before the curtain is drawn. I will let time be the deciding factor for what's under the hood, since the ninjas will get anyone who "officially" blurbs.

I'm not expecting HUGE graphical jumps, but I'm expecting noticeable differences in Wii U capabilities versus 360/PS3.
 
Marginal it is, at least so far. You can't blame Thunder Monkey, because that is reality. The ports so far running on the Wii U don't feature significant graphical enhancements.

Also, what an incredible orchestrated assault on objectivity was executed by a band of loyalists claiming Nintendo Land is some sort of incredible graphical showcase. It was like a trip into the mind of a schizophrenic. :)
Honestly, I didn't really expect ports to show much if any advantage.

You're not talking a massive jump with the system anyway, and then you get devs strapped for cash and time porting a game that may not translate 100% to the system, requiring a rethink because of core design differences between the Wii U and PS360. I have no doubt you'll see better-looking games on the Wii U, though the system can hit a wall by being the "cheap" system to develop for.

If no one puts in the money, we won't see much of any improvements.
 
None of what I've said is a knock against the system.

There's more power there. But it means nothing if no one uses it. Which should be a big "No shit." to everyone involved.
 

Van Owen

Banned
5GHz is smart, it'll be less prone to interference from wifi, cordless phones, etc.

I hope it doesn't adversely affect 5GHz wifi though, I picked that specifically to avoid congestion with my neighbours

Anyone know if the Wii U supports 5GHz 802.11n? Does the 3DS?
 
Well at least it is pretty. So is NSMBW2. Though I'm not a big fan of either.
Hold on a second, I'm not saying otherwise. I have always given credit to Nintendo because they have amazing graphical designers. But this is a spec thread and we are talking hardware here. Nintendo Land is not some amazing graphical showcase that far surpasses the limits of the current generation. Writing those hyperbolic statements was rather shameful.
 

Log4Girlz

Member
Hold on a second, I'm not saying otherwise. I have always given credit to Nintendo because they have amazing graphical designers. But this is a spec thread and we are talking hardware here. Nintendo Land is not some amazing graphical showcase that far surpasses the limits of the current generation. Writing those hyperbolic statements was rather shameful.

I wasn't disagreeing with you :p
 
Honestly, I didn't really expect ports to show much if any advantage.

You're not talking a massive jump with the system anyway, and then you get devs strapped for cash and time porting a game that may not translate 100% to the system, requiring a rethink because of core design differences between the Wii U and PS360. I have no doubt you'll see better-looking games on the Wii U, though the system can hit a wall by being the "cheap" system to develop for.

If no one puts in the money, we won't see much of any improvements.
Oh, for god's sake! Let's not kill common sense here. Nobody in his right mind would suggest the Wii U is hitting a wall with those ports; of course the system is more capable than that.

But it is not a substantial jump. Even the cheap, quick-and-dirty ports in other console launches featured more marked graphical improvements than we are seeing here.

For me it's kind of funny, because if there's a console concept well suited to packing relatively strong specs for its time, it's this one, what with the multiple-screen rendering. Too bad Nintendo didn't go for one SKU at 350 with a beefed-up part. Instead we got two SKUs, with the more expensive one featuring some extra pieces of plastic and 24 extra GB of cheap flash memory. Too bad they couldn't, or didn't want to, hit the 250 price.
I wasn't disagreeing with you :p
My apologies :) Better safe than sorry then.
 

Durante

Member
lol

Well, what I mean is this (heavily simplified): Let's assume you have a GPU that does 100GFLOPS with 100 shader units at 100MHz, and consumes 5W. And you have a 20W budget. You could either quadruple the number of shader units and leave the clock as is, or you could leave the number of shader units as is and double the clock. Power consumption increases linearly with the number of transistors and exponentially with the clock, so in both cases, the resulting chip would consume 20W - but the former approach would lead to a chip twice as fast. The latter approach is less effective, but it's also a lot cheaper.
Power consumption does not increase exponentially with clock speed. It increases quadratically in voltage and linearly in frequency and capacitance.

P = C * V² * f

Of course, often you need higher voltages to reach higher frequencies.
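
A worked instance of that relation, with placeholder numbers purely to show the scaling (the capacitance and voltage values are made up):

Code:
# P = C * V^2 * f: linear in capacitance and frequency, quadratic in voltage.
def dynamic_power(c_farads, v_volts, f_hz):
    return c_farads * v_volts ** 2 * f_hz

C = 1e-9  # hypothetical switched capacitance
print(dynamic_power(C, 1.0, 500e6))   # 0.5 W at 1.0V, 500MHz
print(dynamic_power(C, 1.0, 1000e6))  # 1.0 W: double the clock at the same voltage -> 2x
print(dynamic_power(C, 1.2, 1000e6))  # ~1.44 W: double the clock AND raise the voltage -> ~2.9x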
 

wsippel

Banned
Power consumption does not increase exponentially with clock speed. It increases quadratically in voltage and linearly in frequency and capacitance.

P = C * V² * f

Of course, often you need higher voltages to reach higher frequencies.
That's why I said it was "simplified". I was basically just quoting an IBM whitepaper.
I'm also really bad with math terminology. And drunk.
 