
WiiU technical discussion (serious discussions welcome)

Gahiggidy

My aunt & uncle run a Mom & Pop store, "The Gamecube Hut", and sold 80k WiiU within minutes of opening.
I don't know if we can trust the numbers done by the couple of people that are looking at the image. Also, just because it's on the chip doesn't mean they are active.

Unless we get official specs, it's just a guess. If they can even read the photos correctly...

I can see this ending very badly!
Can you not make the same argument for Microsoft and Sony's new machines?
 

jerd

Member
I don't know if we can trust the numbers done by the couple of people that are looking at the image. Also, just because it's on the chip doesn't mean they are active.

Unless we get official specs, it's just a guess. If they can even read the photos correctly...

I can see this ending very badly!

Ha
 

USC-fan

Banned
Because he's feeling the heat of everything he has said!? :p

Really? I have been proved 100% correct. This was back when people were claiming 650 GFLOPS as "fact", and that was the LOW end of the scale. lol But most of those people do not post anymore...



That range is wrong. In fact, it's impossible. The best case was 550 or so, but even that power consumption is too high, and that assumed the latest and greatest from AMD. 350-500 GFLOPS would be the correct range, likely around 400-450. And the gap from GPU design improvements will be even greater with the next-gen systems, seeing as the Wii U uses a 2008 design compared to a 2013 design in the next-gen systems, going by the leaks.
http://www.neogaf.com/forum/showpost.php?p=42128123&postcount=2747

I'd say 300 or lower is extremely unlikely. IMHO, the range of likely values goes from ~350 at the low end to ~600 at the high end.
Strange; when I posted this back in Sept of 2012, I was attacked.
 

Durante

Member
Sony had an SPU disabled to improve yields. Nintendo could do something similar here, but I don't think they would.
Yeah, it's not completely out of the question, but the Wii U GPU is a moderately sized die on a mature process, so I wouldn't expect any of its large functional units to be disabled.
 

joesiv

Member
The Joe and Jane consumers of the world care nothing for the details we're looking for, and they'll be the largest portion of buyers. And enthusiasts who really care about the technical nitty-gritty would probably want Durango or Orbis instead, no?

I want specs for the sake of knowing the specs; I don't think it will hinder or help it in any major way beyond maybe a few thousand sales.
Perhaps, though Joe and Jane probably read Joystiq or IGN or Kotaku, and when any or all of them write sensational articles detailing our findings, I could see it affecting the overall tone of feelings towards the system. It's impossible to know how much it will or won't affect things, given the way news travels and how Joe and Jane have probably heard grumblings of PS4/NeXtbox leaks. Even if they don't understand the specs, the surrounding text, including comments from anyone, can invite summations of "really powerful" or "really weak", or "Super Saiyan this" or "last gen tech that"... heck, a tweet or Facebook post could spread negativity pretty easily these days.

I'm interested in the specs, because I'm curious, and maybe knowing it will have a spike in interest/debate, but hopefully it will peter out and we can all get on with our lives :)
 

joesiv

Member
Wait, aren't you the same USC-fan that was claiming we'd be lucky to get 300 GFLOPS not too long ago?

Probably... but before we burn him at the stake... maybe we should wait till we get the actual numbers... just sayin'... Nintendo of late have been pretty good at making lesser "specs" work for them.
 

USC-fan

Banned
Wait, aren't you the same USC-fan that was claiming we'd be lucky to get 300 GFLOPS not too long ago?

I said 350 and I still believe that is true. This is based on the power the console is using, the 40nm process, the die size, the R700 design, and the games released.

I believe the card will be a lot like the Radeon HD 5550.

320:16:8 at 550 MHz = 352 GFLOPS. I believe this is the best case.

The die may show 400:20:8, but I do think there will be parts disabled to improve yields and for power savings.

You have 33 watts to work with; something has got to give. These numbers are all based on the facts we have about the system at this point.
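
For reference, here's a rough sketch of how that 352 figure falls out of the shader count and clock (assuming the usual R700 VLIW5 math of 2 FLOPs per ALU per cycle for a multiply-add):

```python
# Peak single-precision throughput for an R700-style GPU:
# stream processors x 2 FLOPs (multiply-add) x clock speed.
def peak_gflops(stream_processors: int, clock_mhz: float) -> float:
    return stream_processors * 2 * clock_mhz / 1000.0

print(peak_gflops(320, 550))  # 352.0 -> the 320:16:8 "best case" above
print(peak_gflops(400, 550))  # 440.0 -> if a full 400:20:8 die were active
```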
 

FLAguy954

Junior Member
I said 350 and I still believe that is true. This is based on the power the console is using, the 40nm process, the die size, the R700 design, and the games released.

I believe the card will be a lot like the Radeon HD 5550.

320:16:8 at 550 MHz = 352 GFLOPS. I believe this is the best case.

The die may show 400:20:8, but I do think there will be parts disabled to improve yields and for power savings.

You have 33 watts to work with; something has got to give. These numbers are all based on the facts we have about the system at this point.

You also have years and tons of money spent on R&D. Surely Nintendo had the foresight to put the most money into dramatically increasing the GPU's performance per watt, given the system's power consumption?
 

ozfunghi

Member
Probably... but before we burn him on the fire... maybe we should wait till we get the actual numbers... just sayin... Nintendo of late have been pretty good at making less "specs" work for them.

That's not really the issue. The thing is he now claims that he "has been saying" what Durante just posted. Just a couple of posts later, he is eating his own words, since now he is claiming he said 352 GFLOPS would be the best we can expect, while Durante specifically said that would be the worst we can expect.

It has nothing to do with what number eventually turns out to be correct. Just that he is weaseling out of any and every discussion. Much like his "WiiU has no GPGPU function - lol" argument... followed by Iwata's "WiiU features GPGPU functions"... followed by USC-fan's "Well, du-uh, obviously it has GPGPU, I never said the opposite".
 

tipoo

Banned
You also have years and tons of money spent on R&D. Surely Nintendo had the foresight to put the most money into dramatically increasing the GPU's performance per watt, given the system's power consumption?

We all know consoles get a boost over PCs due to less API overhead, and weaker hardware in them can do more than it could in a PC. But one of the console makers making a GPU with far higher performance per watt than what the GPU makers themselves could make, even in collaboration with each other? If GPU efficiency could be increased by that much, it would be increased in AMD's PC architectures as well.

Sure there might be some interesting tweaks to the hardware, but I would assume nothing that dramatically raises the performance the architecture can produce at such low wattage.
 
Really? I have been proved 100% correct. This was back when people were claiming 650 GFLOPS as "fact", and that was the LOW end of the scale. lol But most of those people do not post anymore...

http://www.neogaf.com/forum/showpost.php?p=42128123&postcount=2747

Strange; when I posted this back in Sept of 2012, I was attacked.
That may depend on how you phrase your statements, and how you can give supporting evidence for your logical conclusions. Durante may sometimes be a bit more pessimistic about the Wii U than some other posters, but he states his reasons, he knows what he is talking about, he is civil to people, and he has earned high respect on the boards. That is a bit different from starting a heated argument on whether or not the Wii U has a GPGPU, then quoting a statement from another user from another board (ahem) without even giving credit to whoever originally made the post ;-)
 
I'm just thinking about a comparison between VLIW in a PC where you're dealing with more software layers and not targeting specific hardware vs VLIW in a console. Apparently most optimizations would deal with scheduling and I don't know if that's even feasible in most situations.
Yeah, as far as I know the biggest difference is that VLIW doesn't have run-time scheduling, so the compiler needs to do all of it. If, and by how much, that can be improved by coding to the metal, I don't know.
 

USC-fan

Banned
That's not really the issue. The thing is he now claims that he "has been saying" what Durante just posted. Just a couple of posts later, he is eating his own words, since now he is claiming he said 352 GFLOPS would be the best we can expect, while Durante specifically said that would be the worst we can expect.

Back then we didn't have the power usage of the console. Those numbers were based on using 45w-50w. I just don't see any way of getting to those numbers at 33w.

It has nothing to do with what number eventually turns out to be correct. Just that he is weaseling out of any and every discussion. Much like his "WiiU has no GPGPU function - lol" argument... followed by Iwata's "WiiU features GPGPU functions"... followed by USC-fan's "Well, du-uh, obviously it has GPGPU, I never said the opposite".
lol just too funny.

Look at this post from before Iwata's announcement. Someone even posted this and I replied...

Part of the problem is there really isn't that much to discuss until the fall conference. Heavy has also been pretty abrasive in this thread, so there's a good bit of built-up hostility from many of the regulars. It's cathartic to crack on him, kind of like when Amirox was de-modded. I imagine the same would happen to USC-fan if Iwata announced that the WiiU did use a GPGPU.
The Wii U does have a GPGPU since it's R700-based. That was never the debate. SMH...

http://www.neogaf.com/forum/showpost.php?p=39803015&postcount=8029

still SMH....

That may depend on how you phrase your statements, and how you can give supporting evidence for your logical conclusions. Durante may sometimes be a bit more pessimistic about the Wii U than some other posters, but he states his reasons, he knows what he is talking about, he is civil to people, and he has earned high respect on the boards. That is a bit different from starting a heated argument on whether or not the Wii U has a GPGPU, then quoting a statement from another user from another board (ahem) without even giving credit to whoever originally made the post ;-)
Again that was never the debate... SMH...

I know why I am attacked by the fanboys: it's because I don't tell them what they want to hear. Same people that couldn't understand the GPGPU debate...
 

Earendil

Member
That may depend on how you phrase your statements, and how you can give supporting evidence for your logical conclusions. Durante may sometimes be a bit more pessimistic about the Wii U than some other posters, but he states his reasons, he knows what he is talking about, he is civil to people, and he has earned high respect on the boards. That is a bit different from starting a heated argument on whether or not the Wii U has a GPGPU, then quoting a statement from another user from another board (ahem) without even giving credit to whoever originally made the post ;-)

Exactly. I may not always agree with Durante's pessimism, but I respect his logic. He's not out to prove an agenda and ignore facts that conflict with his predetermined beliefs.

Could we please not let this descend into another bout of he said/she said?

You're right. My apologies for participating in it.
 

Schnozberry

Member
This may be a silly question, and could have already been answered, but do we know that the Wii U is locked at 33w no matter what game it's playing? During one of the Nintendo Directs prior to launch around September, Iwata said the Wii U is rated for a maximum of 75w, but would likely not consume more than 45w in operation.

I don't know what kind of power gating is in place, if any, but an extra 12w would change some of the assumptions in this thread.
 

guek

Banned
This may be a silly question, and could have already been answered, but do we know that the Wii U is locked at 33w no matter what game it's playing? During one of the Nintendo Directs prior to launch around September, Iwata said the Wii U is rated for a maximum of 75w, but would likely not consume more than 45w in operation.

I don't know what kind of power gating is in place, if any, but an extra 12w would change some of the assumptions in this thread.

I was thinking about this too. It's very odd that games appear to be capped at 33w according to the test Eurogamer ran. Even Netflix ran at 29w. What reason would Iwata have to lie? It's not likely that he'd do so in order to put up a veil of more power, since the Nintendo Direct audience generally wouldn't have a clue about wattage/power ratios. If anything, you'd think they'd boast a bit more about the lower power consumption, since they view that as a major priority.
 

Schnozberry

Member
I was thinking about this too. It's very odd that games appear to be capped at 33w according to the test Eurogamer ran. Even Netflix ran at 29w. What reason would Iwata have to lie? It's not likely that he'd do so in order to put up a veil of more power, since the Nintendo Direct audience generally wouldn't have a clue about wattage/power ratios. If anything, you'd think they'd boast a bit more about the lower power consumption, since they view that as a major priority.

For all I know he could have misspoken and meant to say 35w, but it would be interesting to see if the wattage draw increases in the future, especially if it's a ~36% increase over our assumptions.
 

USC-fan

Banned
This may be a silly question, and could have already been answered, but do we know that the Wii U is locked at 33w no matter what game it's playing? During one of the Nintendo Directs prior to launch around September, Iwata said the Wii U is rated for a maximum of 75w, but would likely not consume more than 45w in operation.

I don't know what kind of power gating is in place, if any, but an extra 12w would change some of the assumptions in this thread.

Iwata most likely didn't have the exact power usage at that time. The Wii U is not locked, as you can see some games use less. I was just using the highest number we have...

Going by the X360, all games are within 10% of each other.
 

guek

Banned
For all I know he could have misspoken and meant to say 35w, but it would be interesting to see if the wattage draw increases in the future, especially if it's a ~36% increase over our assumptions.

It still doesn't make sense, since Iwata said 75w under "full load." From the looks of things, the Wii U will never hit that number unless it's temporarily locked due to OS problems. It's possible that the firmware is just too unstable and software development is presently being capped at a lower level of performance, but considering no devs have hinted that that's the case, that rationale heads straight into wishful thinking territory.
 

USC-fan

Banned
It still doesn't make sense, since Iwata said 75w under "full load." From the looks of things, the Wii U will never hit that number unless it's temporarily locked due to OS problems. It's possible that the firmware is just too unstable and software development is presently being capped at a lower level of performance, but considering no devs have hinted that that's the case, that rationale heads straight into wishful thinking territory.

People misunderstood. The 75w is the rated power of the PSU.
 

guek

Banned
Iwata most likely didn't have the exact power usage at that time. The Wii U is not locked, as you can see some games use less. I was just using the highest number we have...

Going by the X360, all games are within 10% of each other.

wrong.

eurogamer said:
One thing that did stand out from our Wii U power consumption testing - the uniformity of the results. No matter which retail games we tried, we still saw the same 32w result and only some occasional jumps higher to 33w.

People misunderstood. The 75w is the rated power of the PSU.

This is not how Iwata presented the information. He could have misspoken, but he didn't give any kind of indication that he was talking about the power rating.
 

chaosblade

Unconfirmed Member
I'd guess he was talking about the PSU. Later games using more power wouldn't be out of the norm, I don't think, but doubling the power consumption of launch games? Not feeling that at all.

And saying that the system has a 75w PSU and would not draw more than 45w makes sense; IIRC, console makers tend to stick with PSUs rated at about twice what they actually plan to draw.
 

USC-fan

Banned
wrong.

This is not how Iwata presented the information. He could have misspoken, but he didn't give any kind of indication that he was talking about the power rating.

Yeah, you are right. I was looking at the Netflix playback compared to Super Mario U.

Looking back, it looks like it was reported as 40w as the average, not 45w.
 

Schnozberry

Member
It still doesn't make sense, since Iwata said 75w under "full load." From the looks of things, the Wii U will never hit that number unless it's temporarily locked due to OS problems. It's possible that the firmware is just too unstable and software development is presently being capped at a lower level of performance, but considering no devs have hinted that that's the case, that rationale heads straight into wishful thinking territory.

Yeah, I don't want to drift off into more unfounded speculation. It just seems odd that he would state hard numbers if he wasn't certain of their accuracy, rather than just not commenting on them at all. It certainly wasn't information that most people would have given a crap about had it not been mentioned at all.
 

joesiv

Member
This may be a silly question, and could have already been answered, but do we know that the Wii U is locked at 33w no matter what game it's playing? During one of the Nintendo Directs prior to launch around September, Iwata said the Wii U is rated for a maximum of 75w, but would likely not consume more than 45w in operation.

I don't know what kind of power gating is in place, if any, but an extra 12w would change some of the assumptions in this thread.
It's not really a silly question, it's valid.

Yeah, the WiiU has a 75w power supply, and yet tests have indicated that it "peaks" out at 33w usage.

http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

Why would they only be using less than half the wattage of the power supply?
Some are saying that it's efficiency that brings the "usable" wattage down to 70-80% (which would mean usable wattage would be around 52 watts). However, I believe these people are incorrect, because the rating on a power supply is what it can actually deliver, and the inefficiencies are actually on the wall side (which would mean an actual draw from the wall of around 100 watts at 70% efficiency).

So we *should* have 75 watts for use by the console.
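
To make the two readings of "efficiency" concrete, here's a quick back-of-the-envelope sketch (the 70% figure is the assumed efficiency from above):

```python
PSU_RATING_W = 75
EFFICIENCY = 0.70  # assumed, low end of the 70-80% range above

# Reading 1 (which I think is wrong): the 75w rating is measured at the
# wall, so conversion losses eat into what the console can actually use.
usable_dc = PSU_RATING_W * EFFICIENCY   # ~52.5 W usable

# Reading 2 (which I think is right): the 75w rating is the DC output the
# PSU can deliver; the losses show up as extra draw at the wall instead.
wall_draw = PSU_RATING_W / EFFICIENCY   # ~107 W at the outlet, full load

print(f"usable if rated at the wall:  {usable_dc:.1f} W")
print(f"wall draw if rated at output: {wall_draw:.1f} W")
```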

Some other places the power consumption could go:
USB ports usually supply up to 500mA, which is what the Wii U supports (which is why you need 2 ports for some external USB drives); at 5V that's 2.5W per port. There are 4 of these ports, which would be 10W.

I don't believe AnandTech was maxing out the USB ports' power draw, so now we're at 43W.

I don't know what else would use power: during a game the disc drive is already spinning in the case of the WiiU, obviously all the chips and components are working, the wireless antennas and such.

Maybe the SD card slot? That would be another 1W or so.

So we're at 44W?
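
Tallying that up (a rough sketch; the per-port USB figure and the SD guess are the assumptions above):

```python
measured_game_draw_w = 33   # peak reading from the AnandTech/Eurogamer tests
usb_w = 4 * 0.5 * 5.0       # 4 ports x 500 mA x 5 V = 10 W, worst case
sd_slot_w = 1               # rough guess for the SD card slot

total_w = measured_game_draw_w + usb_w + sd_slot_w
print(total_w)  # 44.0 -> still well short of the 75 W rating
```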

It's possible that Nintendo went with a cheaper power supply that doesn't have over-draw protection, and thus over-spec'd it just to be careful. Even so, it's a closed system; I don't think you'd need that much leeway.

*puts on fanboy hat*
There is still a chance that via a future firmware upgrade, Nintendo could up the CPU/GPU clocks somewhat, similar to what Sony did with the PSP (didn't Nintendo also overclock the 3DS post-launch as well?). Or maybe there is some silicon that hasn't been exercised in some games as of yet. Or maybe the disc drive isn't spinning at full speed yet (or ever will, due to noise/durability?). Who knows... just believe!

*takes off fanboy hat*

*edit* Wait, do we even know the process by which AnandTech took the wattage readings? In other words, they likely took the measurements "at the wall"... so if you reverse the efficiency of the power supply, the console itself would only use 23-27W in their worst case? Super Saiyan firmware upgrade just prior to E3 confirmed?
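
(A sketch of that last bit, assuming the 33w reading really was taken at the wall:)

```python
wall_reading_w = 33                 # measured at the outlet
for efficiency in (0.70, 0.80):     # assumed PSU efficiency range
    print(f"{efficiency:.0%}: {wall_reading_w * efficiency:.1f} W at the console")
# 70%: 23.1 W, 80%: 26.4 W -> the 23-27W range above
```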
 

Schnozberry

Member
I'd guess he was talking about the PSU. Later games using more power wouldn't be out of the norm, I don't think, but doubling the power consumption of launch games? Not feeling that at all.

And saying that the system has a 75w PSU and would not draw more than 45w makes sense; IIRC, console makers tend to stick with PSUs rated at about twice what they actually plan to draw.

Yeah, I guess the wiggle room is that Mario uses 33w, when he stated that 45w would be possible in normal operation. It's a pretty big increase when you're dealing in small numbers.
 

guek

Banned
I'd guess he was talking about the PSU. Later games using more power wouldn't be out of the norm, I don't think, but doubling the power consumption of launch games? Not feeling that at all.

And saying that the system has a 75w PSU and would not draw more than 45w makes sense; IIRC, console makers tend to stick with PSUs rated at about twice what they actually plan to draw.

I'm inclined to agree, though in retrospect it was worded incorrectly. Why would he say its power consumption was 75w at "full load"? Why would he cite 45w if 33w was the actual cap? That's a ~36% difference in power consumption (or, conversely, a ~27% drop). I agree that it was most likely just him misspeaking, but looking back, a lot of what he said during that ND simply doesn't make much sense. It's just weird.
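
(The arithmetic behind those two percentages, for reference:)

```python
stated_w, measured_w = 45, 33
print((stated_w - measured_w) / measured_w)  # ~0.364 -> ~36% above the 33w cap
print((stated_w - measured_w) / stated_w)    # ~0.267 -> ~27% below the 45w claim
```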
 

Schnozberry

Member
It's not really a silly question, it's valid.

Yeah, the WiiU has a 75w power supply, and yet tests have indicated that it "peaks" out at 33w usage.

http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

Why would they only be using less than half the wattage of the power supply?
Some are saying that it's efficiency that brings the "usable" wattage down to 70-80% (which would mean usable wattage would be around 52 watts). However, I believe these people are incorrect, because the rating on a power supply is what it can actually deliver, and the inefficiencies are actually on the wall side (which would mean an actual draw from the wall of around 100 watts at 70% efficiency).

So we *should* have 75 watts for use by the console.

Some other places the power consumption could go:
USB ports usually supply up to 500mA, which is what the Wii U supports (which is why you need 2 ports for some external USB drives); at 5V that's 2.5W per port. There are 4 of these ports, which would be 10W.

I don't believe AnandTech was maxing out the USB ports' power draw, so now we're at 43W.

I don't know what else would use power: during a game the disc drive is already spinning in the case of the WiiU, obviously all the chips and components are working, the wireless antennas and such.

Maybe the SD card slot? That would be another 1W or so.

So we're at 44W?

It's possible that Nintendo went with a cheaper power supply that doesn't have over-draw protection, and thus over-spec'd it just to be careful. Even so, it's a closed system; I don't think you'd need that much leeway.

*puts on fanboy hat*
There is still a chance that via a future firmware upgrade, Nintendo could up the CPU/GPU clocks somewhat, similar to what Sony did with the PSP (didn't Nintendo also overclock the 3DS post-launch as well?). Or maybe there is some silicon that hasn't been exercised in some games as of yet. Or maybe the disc drive isn't spinning at full speed yet (or ever will, due to noise/durability?). Who knows... just believe!

*takes off fanboy hat*

Good points on the peripheral use.

EDIT: I wasn't thinking of future firmware updates bumping the clocks, just that a game like Mario may not push the hardware to full load.
 

z0m3le

Banned
I'd guess he was talking about the PSU. Later games using more power wouldn't be out of the norm, I don't think, but doubling the power consumption of launch games? Not feeling that at all.

And saying that the system has a 75w PSU and would not draw more than 45w makes sense; IIRC, console makers tend to stick with PSUs rated at about twice what they actually plan to draw.
beaten.
You are forgetting USB power draw: 45 watts plus the SD card reader, 4x USB 2.0, WiFi, Bluetooth, and the disc drive, and the system could get somewhat close to 75 watts. So saying 75 watts is the max the Wii U will draw, but typical use being 45 watts, seems to make sense. Also, ports are not a good indicator of power usage; games on the 360, for instance, vary: Red Dead Redemption draws much more power than Kameo.
 

AzaK

Member
Should be a couple of days. Folks should be happy about how much detail we're able to provide.

Great stuff. I assume you dudes are going to do some nice diagrams and shit for us too? If you need a hand with any menial stuff, let me know.
 

ozfunghi

Member
Sooooo... Fourth Storm... if I am to understand correctly, you already have the pictures (going by blu's teaser). Can you at least give us some first impressions? Is it a good-looking, potently handsome and manly chip? Seriously though, is there a lot of "open space" on the die? Unused surface? Is it laid out by hand?

It sounds like they're learning at least something from the diagrams. I'm very happy to have contributed $10, then.

I'm happy you did too. :)
 

prag16

Banned
It sounds like they're learning at least something from the image. I'm very happy to have contributed $10, then.

Who is on this "review committee"? You and Thraktor? You already have access to the files? I didn't take blu's comment to mean they already actually have something. (He said "the planets are in motion" or something?) But I'm not sure.

If people already have their hands on it, some first impressions would be very welcome. I'm not a hardware guru, but I know just enough to get myself into trouble and/or somewhat understand the eventual findings here.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Sooooo... Fourth Storm... if I am to understand correctly, you already have the pictures (going by blu's teaser). Can you at least give us some first impressions? Is it a good-looking, potently handsome and manly chip? Seriously though, is there a lot of "open space" on the die? Unused surface? Is it laid out by hand?
I didn't say that. Actually, we don't have the pictures yet - we would have informed everybody otherwise. What I said was that you guys need to have some patience - what was needed to be done from our end to request the pictures was done. The process has been set in motion. Now we wait.
 
There is still a chance that via a future firmware upgrade, Nintendo could up the CPU/GPU clocks somewhat, similar to what Sony did with the PSP (didn't Nintendo also overclock the 3DS post-launch as well?).

The PSP didn't have higher clocks from the start because of the initially low battery life.
I don't see any reason why Nintendo should do the same for a home console. It'd be counterproductive to purposely hold back the power at release (first impressions matter a lot).
 

prag16

Banned
I didn't say that. Actually, we don't have the pictures yet - we would have informed everybody otherwise. What I said was that you guys need to have some patience - what was needed to be done from our end to request the pictures was done. The process has been set in motion. Now we wait.

Cool. Their website makes it sound like you can download the assets directly from the site once you've paid. So what's the "wait" for now? Maybe the process on their end isn't automated and they need a human to actually approve every sale.
 

ASIS

Member
I didn't say that. Actually, we don't have the pictures yet - we would have informed everybody otherwise. What I said was that you guys need to have some patience - what was needed to be done from our end to request the pictures was done. The process has been set in motion. Now we wait.
I haven't been checking up on this thread for quite some time, but... what is this about?
 

Schnozberry

Member
I didn't say that. Actually, we don't have the pictures yet - we would have informed everybody otherwise. What I said was that you guys need to have some patience - what was needed to be done from our end to request the pictures was done. The process has been set in motion. Now we wait.

Is Chipworks bound by any contract agreements as to who they can and can't sell the photos to?
 