
Rumor: Wii U final specs

Log4Girlz

Member
Power consumption does not increase exponentially with clock speed. It increases quadratically in voltage and linearly in frequency and capacitance.

P = C * V² * f

Of course, often you need higher voltages to reach higher frequencies.

WTF did I just read.
 

ozfunghi

Member
Marginal it is, at least so far. You can't blame Thunder Monkey, because that is reality. The ports that we have seen running on the Wii U don't feature significant graphical enhancements.

No, marginally it isn't. Marginal would be 640MB RAM, 12MB eDRAM, a couple of stream processors extra and call it a day. When you have a console that performs 200% better or more, that isn't marginal in my book. Marginal means you could just as well have left any improvements out because the result isn't worth the trouble.

The ports we have seen so far are no point of reference. Even Wii saw games that weren't possible on GCN, while launch games were straight ports (on basically identical yet more powerful architecture). RE4, Twilight Princess...

Also, what an incredibly orchestrated assault on objectivity was executed by a band of loyalists claiming Nintendo Land is some sort of incredible graphical showcase. It was like a trip into the mind of a schizophrenic. :)

It has some nice touches... but I wouldn't say I'm really impressed by it.

Oh, for god's sake! Let's not kill common sense here. Nobody in his right mind would suggest the Wii U is hitting a wall with those ports; of course the system is more capable than that.

But it is not a substantial jump. Even the cheap, quick and dirty ports in other console launches featured more marked graphical improvements than we are seeing here.

Yet nobody is debating the jump would be comparable with the jump between, say PS2 and PS3. There is a middle ground and it's not marginal.
 
A GPU in the 20-ish watt range is probably going to be closer to 400 gigaflops than 600.

Consider that the e6760 draws 35 watts at 600 MHz with 480 SPUs, with 1 GB of 800 MHz GDDR5 included in the TDP. And that's 40nm. The picture of Wii on GlobalFoundries' page promoting their 32nm/28nm High-K Metal Gate SOI tech gives me confidence the GPU will be a 32nm product. I'm sticking to 614.4 GFLOPS with lower clocks (480 MHz) and the slightly higher SPU count (640) which bgassassin told us of in the dev kits long ago.
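A quick sanity check on that figure (a sketch only; the SPU count and clock are rumors, not confirmed specs), using the convention that each AMD stream processor does one multiply-add, i.e. two floating-point ops, per cycle:

```python
# Theoretical single-precision throughput, assuming 2 FLOPs
# (one fused multiply-add) per stream processor per cycle.
def gflops(stream_processors, clock_mhz, ops_per_cycle=2):
    return stream_processors * clock_mhz * ops_per_cycle / 1000.0

print(gflops(640, 480))  # rumored Wii U dev-kit config -> 614.4
print(gflops(480, 600))  # e6760 reference config -> 576.0
```

So 640 SPUs at 480 MHz does land exactly on 614.4 GFLOPS, by the same formula that rates the e6760 at 576.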
 
Consider that the e6760 draws 35 watts at 600 MHz with 480 SPUs, with 1 GB of 800 MHz GDDR5 included in the TDP. And that's 40nm. The picture of Wii on GlobalFoundries' page promoting their 32nm/28nm High-K Metal Gate SOI tech gives me confidence the GPU will be a 32nm product. I'm sticking to 614.4 GFLOPS.

Isn't this thing a mobile part though?

Mobile parts are cherry picked and can't really be used for consoles (where volume/cost is king).

Edit: It's just called embedded, not mobile, but considering the TDP is much lower than their comparable desktop parts, I'm pretty sure it's binned.
 
Isn't this thing a mobile part though?

Mobile parts are cherry picked and can't really be used for consoles (where volume/cost is king).

Edit: It's just called embedded, not mobile, but considering the TDP is much lower than their comparable desktop parts, I'm pretty sure it's binned.

Perhaps, but take into consideration the lower clock speed that's being predicted. And the fact of the matter is this: If Nintendo is promoting it as a GPGPU but it barely has enough shaders to match 360 with pad usage, there wouldn't be much point...
 
No, marginally it isn't. Marginal would be 640MB RAM, 12MB eDRAM, a couple of stream processors extra and call it a day. When you have a console that performs 200% better or more, that isn't marginal in my book. Marginal means you could just as well have left any improvements out because the result isn't worth the trouble.

Yet nobody is debating the jump would be comparable with the jump between, say PS2 and PS3. There is a middle ground and it's not marginal.
Except you can't reach that conclusion, because there are no actual specs released for the CPU and GPU parts. It's as simple as that.

And it is marginal, because marginal is the improvement (or lack thereof) that we are seeing in the games being showcased for the system. That's what's real, and that is what shines in the absence of any relevant hardware figures: how those games look right now.

In any case the system runs the risk of being outclassed way before it has the time to stretch its muscles.
 

Eric_S

Member
WTF did I just read.

Something scales linearly ==> Y = a + bX, or Y = ABC where the capital letters are independent variables. So if all variables are > 0, then if (say) A is multiplied by two, so is Y (in the multiplicative form).

A quadratic increase, Y = a + b(X²), means that multiplying X by two will multiply the b(X²) term by four, as 2 * 2 = 4.

The frequency is voltage dependent in CPUs/GPUs, as you need a certain potential in order to effectively shuffle your electrons back and forth. (sorta kinda)

The equation P = C * V² * f may be rewritten using SI dimensions as W = (W·s/V²) · V² · s⁻¹. The frequency part may be intuitively understood as something raising and lowering the electrons' potential (and by doing so performing work). The capacitance may be understood as an inner "resistance" to potential motion of sorts.
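A minimal numeric illustration of that equation, with made-up component values (the capacitance and voltages below are hypothetical, chosen only to show the scaling):

```python
# Dynamic switching power: P = C * V^2 * f (watts).
def dynamic_power(capacitance_f, voltage_v, freq_hz):
    return capacitance_f * voltage_v**2 * freq_hz

print(dynamic_power(1e-9, 1.0, 500e6))   # 0.5 W baseline
print(dynamic_power(1e-9, 1.0, 1000e6))  # 1.0 W: doubling f doubles P (linear)
print(dynamic_power(1e-9, 2.0, 500e6))   # 2.0 W: doubling V quadruples P (quadratic)
```

And since reaching a higher f usually requires a higher V, the two effects compound in practice.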
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Isn't this thing a mobile part though?

Mobile parts are cherry picked and cant really be used for consoles (where volume/cost is king).

Edit: It's just called embedded not mobile, but considering the TDP much lower than their comparable desktop parts I'm pretty sure it's binned.
It's not 'just called' embedded, it is embedded. Which is not mobile.

Embedded parts are parts with prolonged production life, guaranteed by the manufacturer. So if you are an embedded IHV, you don't have to worry that if you designed your product using such an embedded part, that part would be phased out in a year or two.

Has nothing to do with mobile.
 

ozfunghi

Member
Except you can't reach that conclusion, because there are no actual specs released for the CPU and GPU parts. It's as simple as that.

Then you can't come to the conclusion that it's only marginally more powerful either. Can't have it both ways. It is indeed as simple as that.

And it is marginal, because marginal is the improvement (or lack thereof) that we are seeing in the games being showcased for the system. That's what's real, and that is what shines in the absence of any relevant hardware figures: how those games look right now.

Improvements in current games are marginal, to non-existent even. I agree. That doesn't mean the hardware has only been marginally improved. The first batch of 360 games didn't show the same improvements over late last gen relative to how much more the hardware was capable of, either.

In any case the system runs the risk of being outclassed way before it has the time to stretch its muscles.

There is a risk, agreed. It depends on how future-proof Nintendo made it, even if it's underpowered.
 

AlStrong

Member
The picture of Wii on GlobalFoundries' page promoting their 32nm/28nm High-K Metal Gate SOI tech gives me confidence the GPU will be a 32nm product.

Just out of curiosity, has anyone opened up a recent Wii production unit?

edit: I'm more so curious about whether they bothered to go beyond 90nm over the 6 years.
 

Cuth

Member
Perhaps, but take into consideration the lower clock speed that's being predicted. And the fact of the matter is this: If Nintendo is promoting it as a GPGPU but it barely has enough shaders to match 360 with pad usage, there wouldn't be much point...
I'm not sure what to think of the 600 GFLOPS rumor, but I suppose "there would be some point" in Nintendo creating hype for their product. :)
 

SapientWolf

Trucker Sexologist
Improvements in current games are marginal, to non-existent even. I agree. That doesn't mean the hardware has only been marginally improved. The first batch of 360 games didn't show the same improvements over late last gen relative to how much more the hardware was capable of, either.
I disagree. First party 360 titles showed marked improvements over last gen consoles and they tripled the resolution on top of that.
 

Stewox

Banned
Well, they could always unload some things that are less demanding in terms of RAM speed onto the slower chip, if it's designed that way.

Hasn't Nintendo gone with unified memory since the N64? I don't expect a split; it didn't do any benefit in the PS3.


This makes sense, as someone else on GAF crunched the numbers and figured out that the bandwidth needed to push the images out to the gamepads was about the same as Wireless-N's upper limit, so 5GHz networking makes sense.

Also means GamePads shouldn't be murdered by microwaves being run.
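The back-of-the-envelope math behind that bandwidth claim can be sketched as follows, assuming an uncompressed 854x480 stream at 60 fps and 24 bits per pixel (all assumptions for illustration; the real stream is compressed):

```python
# Raw video bandwidth for one GamePad stream, in megabits per second.
width, height, bpp, fps = 854, 480, 24, 60
raw_mbps = width * height * bpp * fps / 1e6
print(raw_mbps)  # ~590 Mbps, close to 802.11n's 600 Mbps theoretical maximum
```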

Where?
Can you please point me to it.


Yes, but that's the A/V signal. What is sending the actual input data to and from the Gamepad? I always assumed Bluetooth for that.

Nope, never, or at least that's what all the information we gathered showed.

That IGN Bluetooth article is invalid; they mix stuff up with backward compatibility, because the Wii U is going to have it for Wiimotes. Completely separate internally, separate antenna of course.

Wikipedia took it to another level; they listed Bluetooth as a (user) connectivity method. Wrong.
 
Then you can't come to the conclusion that it's only marginally more powerful either. Can't have it both ways. It is indeed as simple as that.
What game is not a marginal improvement then? Name it, post a video, give an example. Show me the specs. What?
Improvements in current games are marginal, to non-existent even. I agree. That doesn't mean the hardware has only been marginally improved. The first batch of 360 games didn't show the same improvements over late last gen relative to how much more the hardware was capable of, either.
Intentional or not, this is a lie. Sorry, but it is. Nothing touched Project Gotham in the previous generation. No ifs, buts or arguments, no BS. It was a game whose graphics were impossible to achieve in the previous gen. It was a traditional generational leap in graphics fidelity. Other games in the 360 launch window did it too, but people like to get very subjective when they are mentioned.
There is a risk, agreed. Depends on how future proof Nintendo made it, even if it's underpowered.
Yes, we can only aspire to ports of important projects in the first year, by which time another system with better specs will land.
 

Ryoku

Member
I'm not sure what to think of the 600 GFLOPS rumor, but I suppose "there would be some point" in Nintendo creating hype for their product. :)

Don't go on FLOP rating alone--especially when comparing two different GPU architectures.
The e6760 @ 576 GFLOPS is slightly more powerful than the 1 TFLOP 4850.
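Worth noting that both numbers come out of the same naive shaders-times-clock formula (commonly cited specs below, used purely for illustration), which is exactly why the raw figure says nothing about how efficiently the shader array is actually fed:

```python
# Peak GFLOPS = shader count * clock (MHz) * 2 FLOPs per cycle / 1000.
def peak_gflops(shaders, clock_mhz):
    return shaders * clock_mhz * 2 / 1000.0

print(peak_gflops(480, 600))  # e6760 -> 576.0
print(peak_gflops(800, 625))  # HD 4850 -> 1000.0
```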
 

ozfunghi

Member
What game is not a marginal improvement then? Name it, post a video, give an example. Show me the specs. What?

You lost me. You're trying to bend your own argument here. You were talking about HARDWARE, not SOFTWARE. You claimed I could not say it is more than a marginal improvement because the specs are unknown. But when I use the same logic and state that for the same reason (unknown specs) you can't claim it is only marginally better, you start going cuckoo. I already agreed the games are not showing those improvements.

Intentional or not, this is a lie. Sorry, but it is. Nothing touched Project Gotham in the previous generation. No ifs, buts or arguments, no BS. It was a game whose graphics were impossible to achieve in the previous gen. It was a traditional generational leap in graphics fidelity. Other games in the 360 launch window did it too, but people like to get very subjective when they are mentioned.

I disagree. First party 360 titles showed marked improvements over last gen consoles and they tripled the resolution on top of that.

Please. Take your time to read what I wrote: RELATIVE to how much more powerful it was. Obviously the improvements would be larger than those seen with Wii U games, because the hardware leap was also larger. Again, not something I was contesting.
 
Nope, never, or at least that's what all the information we gathered showed.

That IGN Bluetooth article is invalid; they mix stuff up with backward compatibility, because the Wii U is going to have it for Wiimotes. Completely separate internally, separate antenna of course.

Wikipedia took it to another level; they listed Bluetooth as a (user) connectivity method. Wrong.

Thanks for the correction. I'd never analyzed that diagram of the patent closely enough to see the single wireless transmitter. Very interesting indeed. Makes me a bit more hopeful for bathroom play! lol
 

Cuth

Member
Don't go on FLOP rating alone--especially when comparing two different GPUs.
e6760 @ 576GFLOPs is slightly more powerful than 1TFLOP 4850.
Well, I'd certainly like to have more information on the WiiU GPU, other than the rumors about the GFLOPS :D
 
You lost me. You're trying to bend your own argument here. You were talking about HARDWARE, not SOFTWARE. You claimed I could not say it is more than a marginal improvement because the specs are unknown. But when I use the same logic and state that for the same reason (unknown specs) you can't claim it is only marginally better, you start going cuckoo. I already agreed the games are not showing those improvements.
You are the one at fault here; it's very clear, really. You don't have either the hardware specs or the software showcase to back up your claims. Lose/lose situation.

All I'm saying (I don't know why you insist on replying) is that SO FAR the improvement is marginal. And that by the time we start to see software that could exploit the system's capabilities, we might still be unimpressed, because by then a competitor's offering may overshadow Nintendo with more visually impressive software.

So what's so outrageous about my claims? They are just reasonable; I'm not pulling "2x" or "300% more" figures out of thin air, I'm using some common sense. It's a better tool for people like you and me who don't quite grasp the tech side of things.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I disagree. First party 360 titles showed marked improvements over last gen consoles and they tripled the resolution on top of that.
PGR3 runs at 1024x600, while there were 720p Xbox games. Just saying.
 
Don't go on FLOP rating alone--especially when comparing two different GPUs.
e6760 @ 576GFLOPs is slightly more powerful than 1TFLOP 4850.

Is this true? GAF worships the FLOP like an angry, vengeful god. GAF being full of shit, though, isn't surprising in the slightest.
 

ozfunghi

Member
You are the one at fault here; it's very clear, really. You don't have either the hardware specs or the software showcase to back up your claims. Lose/lose situation.

All I'm saying (I don't know why you insist on replying) is that SO FAR the improvement is marginal. And that by the time we start to see software that could exploit the system's capabilities, we might still be unimpressed, because by then a competitor's offering may overshadow Nintendo with more visually impressive software.

So what's so outrageous about my claims? They are just reasonable; I'm not pulling "2x" or "300% more" figures out of thin air, I'm using some common sense. It's a better tool for people like you and me who don't quite grasp the tech side of things.


Ok... what? lol. You were the one claiming it was only marginally more powerful. When I disagreed, you played the "you don't know, because you don't have the specs" card. But you don't know the specs either, so you can't make the claim that it's only marginally more powerful either.

So if ALL your "marginally more powerful" comments are SOLELY about CURRENT games, then sure, they are not showing much improvement, if any. And I have agreed on that more than once now. But if you are also talking about hardware, then I do not agree, because the only specs that actually ARE known are not marginally more powerful.

So no, I am most definitely not at fault. You are the one mixing comments about hardware with those about current software.
 

Ryoku

Member
Is this true? GAF worships the FLOP like an angry, vengeful god. GAF being full of shit, though, isn't surprising in the slightest.

You're welcome to Google search the FLOP ratings of each. FLOP ratings are most reliable when comparing GPUs of the same architecture. And even then, it doesn't translate to the exact difference in performance. FLOP rating is absolutely worthless when comparing Nvidia and AMD cards.

I edited my last comment. Wrong choice of word.
 

Stewox

Banned
I wonder if Nintendo has some sort of system installed to collect usage data from Internet connected consoles like Steam's hardware survey. This would be useful in determining how much of the multi-task memory users actually use and how much RAM they can free for developers in the future.

OMG! You are reading my mind.
That's exactly what I've had in the back of my mind for the last 3 days... I was going to post that replying to ideaman from previous pages.


5GHz is smart; it'll be less prone to interference from wifi, cordless phones etc.

I hope it doesn't adversely affect 5GHz wifi though, I picked that specifically to avoid congestion with my neighbours

That's a plus, I think the bigger reason is bandwidth.
 
Ok... what? lol. You were the one claiming it was only marginally more powerful. When I disagreed, you played the "you don't know, because you don't have the specs" card. But you don't know the specs either, so you can't make the claim that it's only marginally more powerful either.

So if ALL your "marginally more powerful" comments are SOLELY about CURRENT games, then sure, they are not showing much improvement, if any. And I have agreed on that more than once now. But if you are also talking about hardware, then I do not agree, because the only specs that actually ARE known are not marginally more powerful.

So no, I am most definitely not at fault. You are the one mixing comments about hardware with those about current software.
I'm glad the discussion is over then; good to see you understand my point of view now. :)
 

ozfunghi

Member
I'm glad the discussion is over then; good to see you understand my point of view now. :)

No problem. I guess we were both arguing from different angles.

PS: but still Thunder Monkey deserves a beating for not having played Xenoblade. Which was really the most important point to get across.
 

Cuth

Member
PGR3 runs at 1024x600, while there were 720p Xbox games. Just saying.
What's the point in this? You're trying to say PGR3 didn't show obvious improvements over previous generation's games?

I don't get why some people keep trying to downplay the improvement from the Xbox to the Xbox 360, it doesn't make the WiiU look better, it's just annoying.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
What's the point in this? You're trying to say PGR3 didn't show obvious improvements over previous generation's games?

I don't get why some people keep trying to downplay the improvement from the Xbox to the Xbox 360, it doesn't make the WiiU look better, it's just annoying.
Did you bother to read the post I was replying to?
 
You're welcome to Google search the FLOP ratings of each. FLOP ratings are most reliable when comparing GPUs of the same architecture. And even then, it doesn't translate to the exact difference in performance. FLOP rating is absolutely worthless when comparing Nvidia and AMD cards.

I edited my last comment. Wrong choice of word.

I see. I hope my collection of useless information is called on in a pub quiz one day. Thanks.
 

The_Lump

Banned
What's the point in this? You're trying to say PGR3 didn't show obvious improvements over previous generation's games?

I don't get why some people keep trying to downplay the improvement from the Xbox to the Xbox 360, it doesn't make the WiiU look better, it's just annoying.

I think he was referring to the "3x resolution" comment.
 
Just out of curiosity, has anyone opened up a recent Wii production unit?

edit: I'm moreso curious about whether they bothered to go beyond 90nm over the 6 years.

I don't recall hearing of any die shrinks beyond 90 nm, and that's something we usually hear about. I always just assumed the latest Wii lacked the Cube controller port guts and called it a wrap, but you raise a good point.
 

ozfunghi

Member
What's the point in this? You're trying to say PGR3 didn't show obvious improvements over previous generation's games?

I don't get why some people keep trying to downplay the improvement from the Xbox to the Xbox 360, it doesn't make the WiiU look better, it's just annoying.

If you want to jump into a discussion, make sure to know what was said prior to the comment you're trying to attack. The point was made that 360 games tripled the resolution on top of other visual feats, and PGR was specifically mentioned.
 
I don't recall hearing of any die shrinks beyond 90 nm, and that's something we usually hear about. I always just assumed the latest Wii lacked the Cube controller port guts and called it a wrap, but you raise a good point.

I don't believe they ever did die-shrink the chips. I wonder if they ever will with Wii U; at least the CPU certainly has room to, down the line.
 

Bear

Member
Anyone know if Wii U supports 5ghz 802.11n? Does 3DS?

I don't think the 3DS does, but the Wii U was confirmed to on the official spec sheet. I bought a dual band router a little while back so I'm glad to hear they are supporting it.

It's here under "Networking".

Wii U can be connected to the Internet via a wireless (IEEE 802.11b/g/n) connection. The console features four USB 2.0 connectors – two in the front and two in the rear – that support Wii LAN Adapters.
 