
WiiU technical discussion (serious discussions welcome)

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Cool. Their website makes it sound like you can download the assets directly from the site once paid. So what's the "wait" now for? Maybe their process on their end isn't automated and they need a human to actually approve every sale.
You will understand what the waiting is all about in due time - my guess is next week. And if you recall Durante's hint, the wait is worth it. Otherwise your guess is right - in our particular case, the acquisition of the pictures did not go as simply as an Amazon checkout, but that has nothing to do with Chipworks' online store. Patience, people!
 

tipoo

Banned
This is not how Iwata presented the information. He could have misspoken, but he didn't give any kind of indication that he was talking about the power rating.



The power rating on the power supply is 75 watts, is it not? You never make electronics that put a 100% load on the PSU; that's a terrible idea, so it's always over-provisioned. It's pretty common for electronics to draw about half what the PSU is rated for. This talk of a future overclock of the chips is a very big reach, imho.
 

prag16

Banned
You will understand what the waiting is all about in due time - my guess is next week. And if you recall Durante's hint, the wait is worth it. Otherwise your guess is right - in our particular case, the acquisition of the pictures did not go as simply as an Amazon checkout, but that has nothing to do with Chipworks' online store. Patience, people!

Not to be a ballbreaker, but I don't understand the secrecy about the process/status. I understand there may be legal restrictions on who the actual info can be shared with based on what you agree to on the point of sale, but what's the need to be so nebulous about it?

EDIT: Just saw FS's post. Thanks. I got to the party late this time, but if we end up doing the same thing for the CPU, etc, I'll try to throw in a contribution.
 
Hey guys, I suppose I should give the contributors somewhat of an update. No, I don't have the image yet. However, I have been in contact with some friendly people at Chipworks and I assure you that you will be satisfied with the information to come in a few more days. They are actually going to special lengths for us. I humbly ask for a bit more patience and to not bombard them with emails.

I can share this in the meantime. They provided some initial measurements that seem in fact a bit smaller than previous numbers:

CPU: 4.74 x 5.85 mm
GPU: 11.88 x 12.33 mm
NOR Flash: 1.24 x 1.46 mm (this is the small chip on the MCM and seemingly similar to Wii's EEPROM in function)

~146mm2 for the GPU is a bit less than thought, but I do have a few crackpot theories as to what will be shown.
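
For anyone who wants to sanity-check the area figure, here's a quick back-of-the-envelope in Python - the dimensions are the Chipworks numbers above, the rest is just multiplication:

# Die areas from the Chipworks measurements above.
dims_mm = {
    "CPU": (4.74, 5.85),
    "GPU": (11.88, 12.33),
    "NOR Flash": (1.24, 1.46),
}
for chip, (w, h) in dims_mm.items():
    print(f"{chip}: {w * h:.1f} mm^2")
# CPU: 27.7 mm^2, GPU: 146.5 mm^2 (the ~146mm2 above), NOR Flash: 1.8 mm^2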
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
The power rating on the power supply is 75 watts, is it not? You never make electronics that put a 100% load on the PSU; that's a terrible idea, so it's always over-provisioned. It's pretty common for electronics to draw about half what the PSU is rated for.
You never make electronics that put 100% load on the PSU 100% of the time. But if a PSU for a device is rated at N watts, there's a pretty good reason for that. Normally the reason stems from the fact that the PSU has to meet all possible loads that can come from the device, including power viruses.
 

QaaQer

Member
I haven't been checking up on this thread for quite some time but... what is this about?

Some gaffers chipped in to get some high-resolution shots of the GPU for $200 from Chipworks. They will be looked at by some of the more technical types on GAF, and then a new thread will be made with what can be discerned. The picture will not be released, however.

It is actually worth $200, as they have to acid-etch the chip and use expensive imaging devices to capture the images.
 

Earendil

Member
You never make electronics that put 100% load on the PSU 100% of the time. But if a PSU for a device is rated at N watts, there's a pretty good reason for that. Normally the reason stems from the fact that the PSU has to meet all possible loads that can come from the device, including power viruses.

I got a power virus once... my toilet was never the same after.
 

tipoo

Banned
You never make electronics that put 100% load on the PSU 100% of the time. But if a PSU for a device is rated at N watts, there's a pretty good reason for that. Normally the reason stems from the fact that the PSU has to meet all possible loads that can come from the device, including power viruses.

There has to be some over-provisioning. Capacitors age, similar to batteries, and in two years that 75W power supply may only be able to hit, say, 70 watts (not an actual calculation, just an example). So anything that tries to draw the original 75 watts from it is going to crash. If they want a console to have a healthy lifespan, as Nintendo always does, they have to over-provision quite a bit.
 

AzaK

Member
You will understand what the waiting is all about in its due time - my guess is next week. And if you recall Durante's hint - the wait is worth it. Otherwise your guess is right - in our particular case, the acquisition of the pictures did not go as simple as amazon checkout, but that has nothing to do with Chipworks' online store. Patience, people!

So Nintendo saw this thread and is in the process of collecting up all specs for us. Nice.

Not to be a ballbreaker, but I don't understand the secrecy about the process/status. I understand there may be legal restrictions on who the actual info can be shared with based on what you agree to on the point of sale, but what's the need to be so nebulous about it?

EDIT: Just saw FS's post. Thanks. I got to the party late this time, but if we end up doing the same thing for the CPU, etc, I'll try to throw in a contribution.

I was going to say the same. I understand we need patience when even Fourth Storm is waiting, but I don't think it needs to be kept too secret if there's something to share. Especially for those of us who ponied up some dosh; just an update would be good.

NOTE: OK, I've seen the update :)

Hey guys, I suppose I should give the contributors somewhat of an update. No, I don't have the image yet. However, I have been in contact with some friendly people at Chipworks and I assure you that you will be satisfied with the information to come in a few more days. They are actually going to special lengths for us. I humbly ask for a bit more patience and to not bombard them with emails.

I can share this in the meantime. They provided some initial measurements that seem in fact a bit smaller than previous numbers:

CPU: 4.74 x 5.85 mm
GPU: 11.88 x 12.33 mm
NOR Flash: 1.24 x 1.46 mm (this is the small chip on the MCM and seemingly similar to Wii's EEPROM in function)

~146mm2 for the GPU is a bit less than thought, but I do have a few crackpot theories as to what will be shown.

Cool. Are they going to do a deal like give the other photos too or something? That'd be nice.

BTW: I can't remember what the initial measurements were.


I have more hype for this incoming info than the last 2 Nintendo E3s.
 
*puts on fanboy hat*
There is still a chance that via a future firmware upgrade, Nintendo could up the CPU/GPU clocks somewhat, similar to what Sony did with the PSP (didn't Nintendo also overclock the 3DS post launch as well?). Or maybe there is some silicon that hasn't been exercised in some games as of yet. Or maybe the disc drive isn't spinning at full speed yet (or ever will, due to noise/durability?). Who knows... just believe!
No, no, no. Supposedly they've freed up more RAM from the OS, plus access to as much as 25% of the second core, previously reserved for OS and networking tasks.

It's unlikely that Nintendo would overclock the console (they could overclock the 3DS quite a bit, and they certainly won't, for battery reasons), but the more fully a game uses the hardware, the bigger the power draw will be, so the 33W measured with New Super Mario Bros. U can probably be topped.

Case in point: past Nvidia Quadro cards that were virtually identical to GeForce cards often had lower clock rates, because they use more of the transistor array, under full load, for long stretches of time (and a regular GeForce gets humiliated by them in rendering tasks, precisely because those extra transistors are put to use). The MHz ceiling had to be lower. In fact, looking at current Nvidia Quadros that's still true: the GK107GLM, a Quadro part, tops out at 706 MHz, while the same GK107 as a GeForce GT 640 is sold in 797 MHz (DDR3 RAM), 900 and 950 MHz (GDDR5 RAM) configurations; the lowest is due to RAM bandwidth constraints (and punishment for going cheaper, I'm sure) and the others are attainable clocks for gaming. The Quadro version, though...

It's just clocked low, but low for reliability reasons.

Also, there's the old story about Carmack advising against factory-overclocked cards for Doom 3: since the game was cutting edge and exercised silicon no one else did (more transistors, as he said), the card would run hotter, and if it was overclocked too far it could fry or kick off an artifact fest.

So, if this is a really custom part and, say, it has stuff like Pikmin 3's HLSL blur hardwired in along with other effects, plus lots of stuff to exploit at a low level, then power consumption is bound to be affected by whether those paths are being exercised or not, and how many of them at once. The VLIW5 example applies too: AMD dropped it for VLIW4 not because it wasn't efficient, but because the 5-way design wasn't really being taken advantage of, so they were wasting transistors better spent on more stream processors with only 4-way capability.

Going by that alone, if the very same VLIW5 GPU sat in both a console and a PC, then with optimization and the closed nature of the console there could be a very palpable difference in power draw and heat.

Don't get me wrong, I'm sure it can be overclocked (perhaps quite a bit), and full draw of the GPU might not amount to much difference (pretty sure it won't go past 40W, excluding power fed to peripherals), but it's not a universal conclusion to draw from measuring a first-generation, technologically low-end game; and that's not just the GPU - I doubt it had to use more than one CPU core.
 

joesiv

Member
You never make electronics that put 100% load on the PSU 100% of the time. But if a PSU for a device is rated at N watts, there's a pretty good reason for that. Normally the reason stems from the fact that the PSU has to meet all possible loads that can come from the device, including power viruses.

True enough.

I did some more research and found:

The Wii had a power supply of 52 watts, but at load only drew 18 watts in games.
The original Xbox 360 had a power supply of 203 watts and a load of 186 watts in games.
The GameCube had a power supply of 48 watts and a load of ~23 watts in games.

Given Nintendo's history, using < 50% of the rated power supply seems normal. Boo...
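
Putting percentages on those numbers (a quick sketch; wattages as posted above, not independently verified):

# PSU rating vs. measured in-game draw, from the figures above.
consoles = {
    "Wii": (52, 18),
    "Xbox 360": (203, 186),
    "GameCube": (48, 23),
}
for name, (rated_w, load_w) in consoles.items():
    print(f"{name}: {load_w}W / {rated_w}W = {load_w / rated_w:.0%} of rating")
# Wii: 35%, Xbox 360: 92%, GameCube: 48% -- Nintendo stays under ~50%.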
 

chaosblade

Unconfirmed Member
Better than expected indeed. Looking forward to the info we're getting out of this. I kind of feel bad for not contributing.
 

joesiv

Member
No, no, no. Supposedly they've freed up more RAM from the OS, plus access to as much as 25% of the second core, previously reserved for OS and networking tasks.
Thanks for the clarification.

I'm almost tempted to run a power monitor 24/7 and consider wattage more like a high score lol... Won't we be excited when we start hitting 40 watts!
 

Schnozberry

Member
True enough.

I did some more research and found:

The Wii had a power supply of 52 watts, but at load only drew 18 watts in games.
The original Xbox had a power supply of 203 watts and a load of 186 watts in games.
The GameCube had a power supply of 48 watts and a load of ~23 watts in games.

Given Nintendo's history, using < 50% of the rated power supply seems normal. Boo...

75 watts is the PSU rating. Iwata did say specifically that the unit would be closer to 45W in operation, but with the inclusion of bus powered peripherals that makes sense. It's possible we could see some variation in power usage based on the game in question, but probably not much.

And holy shit, that OG Xbox power supply was squealing.
 

pestul

Member
Thanks for the clarification.

I'm almost tempted to run a power monitor 24/7 and consider wattage more like a high score lol... Won't we be excited when we start hitting 40 watts!

It will certainly be interesting to see what future firmware updates do and to monitor the power usage for newer titles post-March. Yeah, I'm a little too obsessed as well.
 

Earendil

Member
I think it's interesting that the Xbox drew 8 times the power on average that the GameCube did, but the games were only marginally better.

Goes to show that you cannot write off a system solely based on its power draw.

EDIT:
According to this site (http://games.gearlive.com/playfeed/article/how-much-power-does-your-console-use-06182330), the Xbox pulled 70W under load. So that's only 3 times the juice the GameCube pulled. But it's still impressive that the visual difference wasn't anywhere near 3x.
 

Schnozberry

Member
I think it's interesting that the Xbox drew 8 times the power on average that the GameCube did, but the games were only marginally better.

Goes to show that you cannot write off a system solely based on its power draw.

It's also a testament to how pitiful Nvidia and Intel's power efficiency was at the time compared to ArtX and IBM.

Edit: It was the 360 and not the OG Xbox PSU that was referenced, so I'm completely wrong in my original statement.
 

USC-fan

Banned
True enough.

I did some more research and found:

The Wii had a power supply of 52 watts, but at load only drew 18 watts in games.
The original Xbox had a power supply of 203 watts and a load of 186 watts in games.
The GameCube had a power supply of 48 watts and a load of ~23 watts in games.

Given Nintendo's history, using < 50% of the rated power supply seems normal. Boo...

Think you are talking about the Xbox 360 here....
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
There has to be some over-provisioning. Capacitors age, similar to batteries, and in two years that 75W power supply may only be able to hit, say, 70 watts (not an actual calculation, just an example). So anything that tries to draw the original 75 watts from it is going to crash. If they want a console to have a healthy lifespan, as Nintendo always does, they have to over-provision quite a bit.
I'm not an EE, but nevertheless here are my 2 cents. /disclaimer

It is true that electrolytic capacitors age similarly to (some types of) batteries, but that does not affect the power rating of the AC/DC circuit (which may not even use capacitors for the rectification process). What is most often affected by capacitors in a PSU is the quality of the DC output - i.e. the absence of voltage ripple. IOW, a PSU with damaged/aged capacitors no longer produces 'proper' DC, to the point that it can kill the device it's supplying with power. But PSU output power per se is not dictated by capacitors.
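
A minimal sketch of that distinction, using the textbook full-wave rectifier approximation V_ripple ~= I / (2 * f * C); all component values below are illustrative, not Wii U measurements:

# Aged capacitors raise output ripple; they don't lower the power rating.
def ripple_volts(i_load_a, f_mains_hz, c_farads):
    # full-wave rectifier approximation: V_ripple ~= I / (2 * f * C)
    return i_load_a / (2 * f_mains_hz * c_farads)

i, f = 2.0, 60.0        # 2 A load, 60 Hz mains (illustrative)
c_new = 4700e-6         # fresh smoothing capacitor, farads
c_aged = c_new * 0.6    # same cap after losing ~40% of its capacitance

print(f"fresh cap: {ripple_volts(i, f, c_new):.1f} V ripple")
print(f"aged cap:  {ripple_volts(i, f, c_aged):.1f} V ripple")
# The same 2 A flows either way -- the aged cap just delivers dirtier DC.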
 

joesiv

Member
Think you are talking about the Xbox 360 here....

Whoops, yep. I only included it because it was on the opposite end of the scale in terms of percentage of power supply usage. Incredible if you consider the benchmarks likely were not pulling full draw from the USB ports.
 

japtor

Member
Cool. Their website makes it sound like you can download the assets directly from the site once paid. So what's the "wait" now for? Maybe their process on their end isn't automated and they need a human to actually approve every sale.
This is when Fourth Storm reveals his internet connection is tethered to a sketchy GPRS phone and that it'll take a few days to download 50MB, and only if Chipworks' server supports resuming downloads.
Hey guys, I suppose I should give the contributors somewhat of an update. No, I don't have the image yet. However, I have been in contact with some friendly people at Chipworks and I assure you that you will be satisfied with the information to come in a few more days. They are actually going to special lengths for us. I humbly ask for a bit more patience and to not bombard them with emails.
I'm wondering if they're thinking "Nintendo fanboys are even crazier than Apple ones!"

It'd be pretty cool if they did some chip analysis (cause they kind of have experience in this) instead of just the die photos alone...
 

wsippel

Banned
I'm wondering if they're thinking "Nintendo fanboys are even crazier than Apple ones!"

It'd be pretty cool if they did some chip analysis (cause they kind of have experience in this) instead of just the die photos alone...
I think they appreciate our craziness, though. It's good marketing.
 

ASIS

Member
Some gaffers chipped in to get some high-resolution shots of the GPU for $200 from Chipworks. They will be looked at by some of the more technical types on GAF, and then a new thread will be made with what can be discerned. The picture will not be released, however.

It is actually worth $200, as they have to acid-etch the chip and use expensive imaging devices to capture the images.

oh wow, that sounds exciting! So does this mean we will get full info on the thing? Also, I'd like to chip in with you guys on this project. If there is any extra cost I'd like to handle it as well.

BTW why just the GPU? IIRC both the CPU and RAM are also still kind of vague no?
 

Schnozberry

Member
oh wow, that sounds exciting! So does this mean we will get full info on the thing? Also, I'd like to chip in with you guys on this project. If there is any extra cost I'd like to handle it as well.

BTW why just the GPU? IIRC both the CPU and RAM are also still kind of vague no?

I think the GPU was the most pressing. We can chip in again for the other hardware if people want to. I'll throw down $20 again.
 

Durante

Member
BTW why just the GPU? IIRC both the CPU and RAM are also still kind of vague no?
The GPU is by far the most interesting, since it's the component we know least about and the one where we stand to learn most from a die shot (since e.g. the number of shader groups or ROPs should be recognizable and countable). For the CPU it's much harder to learn anything meaningful -- we already know that it has 3 cores. And the RAM is pretty clear at this point, at least the external one, since it's just standard chips for which you can look up the exact specs. The more unclear part, e-DRAM, is also on the GPU.
 

ASIS

Member
I think the GPU was the most pressing. We can chip in again for the other hardware if people want to. I'll throw down $20 again.

Let me know whichever you decide, I'll be happy to help, though I still don't know how I'm gonna pay :p
The GPU is by far the most interesting, since it's the component we know least about and the one where we stand to learn most from a die shot (since e.g. the number of shader groups or ROPs should be recognizable and countable). For the CPU it's much harder to learn anything meaningful -- we already know that it has 3 cores. And the RAM is pretty clear at this point, at least the external one, since it's just standard chips for which you can look up the exact specs. The more unclear part, e-DRAM, is also on the GPU.

ah I see, thanks for clearing this up.
 
I'm wondering if they're thinking "Nintendo fanboys are even crazier than Apple ones!"

It'd be pretty cool if they did some chip analysis (cause they kind of have experience in this) instead of just the die photos alone...
I'd most definitely be interested in anything they have to say about it, yes.

And they seem to be cool dudes, I mean it's a business, but they're apparently going the extra mile for us. Gotta appreciate that.

I'm very eager for the results.
 

joesiv

Member
The GPU is by far the most interesting, since it's the component we know least about and the one where we stand to learn most from a die shot (since e.g. the number of shader groups or ROPs should be recognizable and countable). For the CPU it's much harder to learn anything meaningful -- we already know that it has 3 cores. And the RAM is pretty clear at this point, at least the external one, since it's just standard chips for which you can look up the exact specs. The more unclear part, e-DRAM, is also on the GPU.

I think it'd be interesting to see a scan of the MCM too, to see how/where the traces run between the cores; it might give us some indication of the general architecture. Not sure if they do that kind of scan, though... It seems easier to do, at least.
 

tipoo

Banned
I'm not an EE, but nevertheless here are my 2 cents. /disclaimer

It is true that electrolytic capacitors age similarly to (some types of) batteries, but that does not affect the power rating of the AC/DC circuit (which may not even use capacitors for the rectification process). What is most often affected by capacitors in a PSU is the quality of the DC output - i.e. the absence of voltage ripple. IOW, a PSU with damaged/aged capacitors no longer produces 'proper' DC, to the point that it can kill the device it's supplying with power. But PSU output power per se is not dictated by capacitors.



I'm not one either, but whenever I've built PCs and checked power supply calculators for what I should get, there was always over-provisioning for capacitor aging. If it didn't affect output, why would that be there? One example:

http://www.extreme.outervision.com/psucalculatorlite.jsp

And it says
Electrolytic capacitor aging. When used heavily or over an extended period of time (1+ years) a power supply will slowly lose some of its initial wattage capacity. We recommend you add 10-20% if you plan to keep your PSU for more than 1 year, or 20-30% for 24/7 usage and 1+ years.

And from experience, sometimes a few years down the line a power supply will still be working, but providing less load power than it originally could, leading to system crashes when the system starts drawing more power. Nothing about the hardware would change; the power supply just got weaker over time. So for them to have a 75 watt power supply in there with an actual peak system load of 75 watts seems highly unlikely to me.
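
Following that calculator's guidance, the headroom math is trivial (a sketch; 75W is the Wii U brick's rating, the percentages are the calculator's recommendation):

# What a 75 W rating looks like after the recommended aging allowance.
rated_w = 75
for aging in (0.10, 0.20, 0.30):
    usable = rated_w * (1 - aging)
    print(f"{aging:.0%} allowance -> plan around {usable:.1f} W usable")
# 67.5 W, 60.0 W and 52.5 W -- a console peaking at the full 75 W on
# day one would be living on borrowed time.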


So who wants to start taking bets on the numbers?


100 bucks on three CPU cores :p
 
*edit* Wait, do we even know the process by which Anandtech took the wattage readings? In other words, they likely took the measurements "at the wall"... so if you reverse the efficiency of the power supply, the console itself would only use 23-27W in their worst case? Super Saiyan firmware upgrade just prior to E3 confirmed?

Has anyone got the answer to this question? I posed the same thing a while ago, but don't know if anyone answered it.
 
OK another go.

looking at this gives 1.92 mm^2 per MB @45 nm.
~1.5 mm^2 per MB @40nm
~100mm^2 left for the GPU.
RV740 is 137mm^2 for 640:32:16.

So I'd guess around 480:32:16 or 528 GFLOPs if they cut some logic not needed for consoles.
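
Spelling that estimate out (a sketch: assumes 32MB of eDRAM at ~1.5 mm^2/MB on 40nm, and a 550 MHz clock, which is the figure that makes 480 ALUs come out to 528 GFLOPs):

# The die budget above, reproduced step by step.
gpu_die_mm2 = 11.88 * 12.33      # ~146.5 mm^2, from the Chipworks numbers
edram_mm2 = 32 * 1.5             # 32 MB at ~1.5 mm^2/MB -> 48 mm^2
logic_mm2 = gpu_die_mm2 - edram_mm2

alus, clock_ghz = 480, 0.550
gflops = alus * 2 * clock_ghz    # 2 FLOPs per ALU per cycle (MADD)

print(f"left for GPU logic: ~{logic_mm2:.0f} mm^2")    # ~98 mm^2
print(f"480:32:16 @ 550 MHz = {gflops:.0f} GFLOPs")    # 528 GFLOPs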
 

joesiv

Member
OK another go.

looking at this gives 1.92 mm^2 per MB @45 nm.
~1.5 mm^2 per MB @40nm
~100mm^2 left for the GPU.
RV740 is 137mm^2 for 640:32:16.

So I'd guess around 480:32:16 or 528 GFLOPs if they cut some logic not needed for consoles.
What about the extra logic *needed* for [Nintendo] consoles? Typically Nintendo puts things like the northbridge (CPU interface, video interface, memory controller, I/O interface) on the GPU. It won't take up *too* much, but it'll take up some (on Flipper the north bridge was around 1/25th of the die), and the sound DSP took up slightly more space... Not sure if that's also on the Wii U GPU or not.
 
OK another go.

looking at this gives 1.92 mm^2 per MB @45 nm.
~1.5 mm^2 per MB @40nm
~100mm^2 left for the GPU.
RV740 is 137mm^2 for 640:32:16.

So I'd guess around 480:32:16 or 528 GFLOPs if they cut some logic not needed for consoles.

It seems much more likely we will find Renesas eDRAM on the GPU. Also, the R700 series has been on 40nm for years now and Nintendo would likely want to take advantage of the maturity of that process. They've had plenty of time to optimize and tweak. It's hard to say how much room the eDRAM will take up. The overhead required might be affected by internal bus width from the eDRAM to the rest of the GPU.
 

Donnie

Member
OK another go.

looking at this gives 1.92 mm^2 per MB @45 nm.
~1.5 mm^2 per MB @40nm
~100mm^2 left for the GPU.
RV740 is 137mm^2 for 640:32:16.

So I'd guess around 480:32:16 or 528 GFLOPs if they cut some logic not needed for consoles.

Think Renesas eDRAM (Renesas are producing the GPU) is about 16mm2 for 32MB. There will be extra space required for wiring it all together etc., but even if we double it that still leaves 114mm2. You also probably have a tiny DSP on there and a very small ARM CPU, but those are unlikely to take up more than a few mm2.
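
Same budget with those Renesas numbers (a sketch; the 16mm2-per-32MB figure is the estimate above, doubled as the wiring allowance):

# Die budget with the Renesas eDRAM estimate above.
gpu_die_mm2 = 146.5
edram_mm2 = 16 * 2    # 32 MB block, doubled for wiring/overhead
print(f"left for logic: ~{gpu_die_mm2 - edram_mm2:.0f} mm^2")  # ~114 mm^2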
 
Yes, it's taken from the wall. So you will have to take some off. It's only a couple of watts at most.

It's the percentage that matters, isn't it? The most expensive PSUs are 80 Plus Platinum, where you get 92% efficiency at 50% load, so you lose 8% of the power from the wall to heat. If the PSU is less efficient to save money but still meets plain 80 Plus, you lose up to 20% of the power to heat. It may only be a few watts, but the Wii U is only 33W, so a few watts off that is a hefty chunk.
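
To put numbers on it (a sketch; the Wii U brick's actual efficiency curve isn't public, so the efficiencies below are illustrative):

# DC power actually reaching the console for a 33 W wall reading.
wall_w = 33.0
for eff in (0.80, 0.85, 0.92):
    print(f"{eff:.0%} efficient PSU -> {wall_w * eff:.1f} W at the console")
# Roughly 26-30 W at the console, so "take some off" is a few watts indeed.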
 

ozfunghi

Member
It seems much more likely we will find Renesas eDRAM on the GPU. Also, the R700 series has been on 40nm for years now and Nintendo would likely want to take advantage of the maturity of that process. They've had plenty of time to optimize and tweak. It's hard to say how much room the eDRAM will take up. The overhead required might be affected by internal bus width from the eDRAM to the rest of the GPU.

You guys seemed rather excited. Are there any indications we might get a few nice surprises concerning WiiU GPU performance or features, or is it because Chipworks is giving you/us a "better deal" of sorts, which will not relate to performance of the console?
 