Earendil
Member
(01-30-2013, 10:57 PM)
Earendil's Avatar
Will we be able to get an idea of the eDRAM bandwidth from these shots?
oversitting
Banned
(01-30-2013, 10:59 PM)
So who wants to start taking bets on the numbers?
tipoo
Banned
(01-30-2013, 11:06 PM)

Originally Posted by blu

I'm not an EE, but nevertheless here are my 2 cents. /disclaimer

It is true that electrolyte capacitors age similarly to (some types of) batteries, but that does not affect the power rating of the AC/DC circuit (which may not even use capacitors for the rectification process). What is most often affected by capacitors in a PSU is the quality of the DC output - ie. the absence of voltage ripples. IOW, a PSU with damaged/aged capacitors does not produce 'proper' DC anymore, to the point it can kill the device it's supplying with power. But PSU output power per se is not dictated by capacitors.



I'm not one either, but whenever I've built PCs and checked power supply calculators for what I should get, there was always over provisioning for capacitor aging. If it didn't affect output, why would that be there? One example:

http://www.extreme.outervision.com/p...ulatorlite.jsp

And it says

Electrolytic capacitor aging. When used heavily or over an extended period of time (1+ years) a power supply will slowly lose some of its initial wattage capacity. We recommend you add 10-20% if you plan to keep your PSU for more than 1 year, or 20-30% for 24/7 usage and 1+ years.

And from experience, sometimes a few years down the line a power supply will still be working, but providing less load power than it originally could, leading to system crashes when the system starts drawing more power. Nothing about the hardware would change, the power supply just got weaker over time. So for them to have a 75 watt power supply in there with an actual peak system load of 75 watts seems highly unlikely to me.


Originally Posted by oversitting

So who wants to start taking bets on the numbers?


100 bucks on three CPU cores :P
Last edited by tipoo; 01-30-2013 at 11:09 PM.
Earendil
Member
(01-30-2013, 11:07 PM)
Earendil's Avatar

Originally Posted by oversitting

So who wants to start taking bets on the numbers?

I'm in. I say it will be somewhere between 1 and 20,000,000.
Grampa Simpson
(01-30-2013, 11:09 PM)
Grampa Simpson's Avatar

Originally Posted by oversitting

So who wants to start taking bets on the numbers?

Over 9000.
ScepticMatt
Member
(01-30-2013, 11:11 PM)
ScepticMatt's Avatar

Originally Posted by oversitting

So who wants to start taking bets on the numbers?

Originally Posted by Fourth Storm

11.88 x 12.33mm ~146mm2 for the GPU is a bit less than thought.

The WiiU is an R700 series GPU, right?
AMD's ~146mm^2 parts have a 320:32:8 core (shader:TMU:ROP), and assuming 550 MHz that would give us 352 GFLOPS (ADD+MUL),
which would be lower than I initially thought.
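
As a rough sketch of that arithmetic (the 2 FLOPs per shader per clock for ADD+MUL and the 550 MHz clock are the post's assumptions, not confirmed specs):

```python
# Back-of-envelope GFLOPS estimate for a 320:32:8 R700-class part.
shaders = 320
flops_per_shader_per_clock = 2   # ADD + MUL (VLIW5 peak)
clock_ghz = 0.550                # assumed clock from the post above

gflops = shaders * flops_per_shader_per_clock * clock_ghz
print(f"{gflops:.0f} GFLOPS")    # ~352 GFLOPS
```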
Last edited by ScepticMatt; 01-30-2013 at 11:16 PM.
Earendil
Member
(01-30-2013, 11:17 PM)
Earendil's Avatar

Originally Posted by ScepticMatt

The WiiU is an R700 series GPU, right?
AMD's ~146mm^2 parts have a 320:32:8 core (shader:TMU:ROP), and assuming 550 MHz that would give us 352 GFLOPS (ADD+MUL),
which would be lower than I initially thought.

Is that at 55nm? Or 40nm?
ScepticMatt
Member
(01-30-2013, 11:20 PM)
ScepticMatt's Avatar

Originally Posted by Earendil

Is that at 55nm? Or 40nm?

55 nm. Is the WiiU 40nm?
And oh wait, is the eDRAM included in the 146mm^2?
Last edited by ScepticMatt; 01-30-2013 at 11:26 PM.
Moral Panic
Member
(01-30-2013, 11:24 PM)
Moral Panic's Avatar

Originally Posted by joesiv


*edit* Wait, do we even know how Anandtech took the wattage readings? In other words, they likely took the measurements "at the wall"... so if you reverse the efficiency of the power supply, the console itself would only use 23-27W in their worst case? Super Saiyan firmware upgrade just prior to E3 confirmed?

Has anyone got the answer to this question? I posed the same thing a while ago, but don't know if anyone answered it.
LeleSocho
Member
(01-30-2013, 11:26 PM)
LeleSocho's Avatar
10 bucks that it is more similar to the RV740 than an RV770
AzaK
Member
(01-30-2013, 11:27 PM)
AzaK's Avatar

Originally Posted by ScepticMatt

55 nm. Is the WiiU 40nm?
And oh wait, is the eDRAM included in the 146mm^2?

That's where speculation currently sits.
And yes.
oversitting
Banned
(01-30-2013, 11:28 PM)

Originally Posted by ScepticMatt

55 nm. Is the WiiU 40nm?
And oh wait, is the eDRAM included in the 146mm^2?

Likely 40nm. But who knows. I think the eDRAM is included in that area.
ScepticMatt
Member
(01-30-2013, 11:40 PM)
ScepticMatt's Avatar
OK another go.

looking at this gives 1.92 mm^2 per MB @45 nm.
~1.5 mm^2 per MB @40nm
~100mm^2 left for the GPU.
RV740 is 137mm^2 for 640:32:16.

So I'd guess around 480:32:16 or 528 GFLOPs if they cut some logic not needed for consoles.
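
A minimal sketch of that budget, treating the ~1.5 mm^2/MB-at-40nm density and the 550 MHz clock as the assumptions they are:

```python
# Rough die-area budget: subtract an assumed 32 MB of eDRAM from the
# measured ~146 mm^2 die, then see what's left for GPU logic.
die_mm2 = 146
edram_mm2_per_mb = 1.5             # assumed eDRAM density at 40 nm
edram_mm2 = 32 * edram_mm2_per_mb  # ~48 mm^2

gpu_logic_mm2 = die_mm2 - edram_mm2   # ~98 mm^2 left for the GPU
gflops_480 = 480 * 2 * 0.550          # 480:32:16 at 550 MHz -> 528 GFLOPS
print(gpu_logic_mm2, gflops_480)
```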
Last edited by ScepticMatt; 01-30-2013 at 11:53 PM.
USC-fan
aka Kbsmoker
(01-30-2013, 11:53 PM)

Originally Posted by Moral Panic

Has anyone got the answer to this question? I posed the same thing a while a go, but don't know if anyone answered it.

Yes, it's taken from the wall. So you will have to take some off. It's only a couple of watts at most.
joesiv
Member
(01-30-2013, 11:56 PM)

Originally Posted by ScepticMatt

OK another go.

looking at this gives 1.92 mm^2 per MB @45 nm.
~1.5 mm^2 per MB @40nm
~100mm^2 left for the GPU.
RV740 is 137mm^2 for 640:32:16.

So I'd guess around 480:32:16 or 528 GFLOPs if they cut some logic not needed for consoles.

What about the extra logic *needed* for [Nintendo] consoles? Typically Nintendo puts things like the northbridge (CPU interface, video interface, memory controller, I/O interface) on the GPU. It won't take up *too* much, but it'll take up some (on Flipper it was around 1/25th of the die for the northbridge), and the sound DSP took up slightly more space... Not sure if that's also in the WiiU GPU or not.
Last edited by joesiv; 01-30-2013 at 11:59 PM.
Fourth Storm
Member
(01-31-2013, 12:17 AM)
Fourth Storm's Avatar

Originally Posted by ScepticMatt

OK another go.

looking at this gives 1.92 mm^2 per MB @45 nm.
~1.5 mm^2 per MB @40nm
~100mm^2 left for the GPU.
RV740 is 137mm^2 for 640:32:16.

So I'd guess around 480:32:16 or 528 GFLOPs if they cut some logic not needed for consoles.

It seems much more likely we will find Renesas eDRAM on the GPU. Also, the R700 series has been on 40nm for years now and Nintendo would likely want to take advantage of the maturity of that process. They've had plenty of time to optimize and tweak. It's hard to say how much room the eDRAM will take up. The overhead required might be affected by internal bus width from the eDRAM to the rest of the GPU.
Donnie
Member
(01-31-2013, 12:26 AM)

Originally Posted by ScepticMatt

OK another go.

looking at this gives 1.92 mm^2 per MB @45 nm.
~1.5 mm^2 per MB @40nm
~100mm^2 left for the GPU.
RV740 is 137mm^2 for 640:32:16.

So I'd guess around 480:32:16 or 528 GFLOPs if they cut some logic not needed for consoles.

Think Renesas eDRAM (Renesas are producing the GPU) is about 16mm2 for 32MB. There will be extra space required for wiring it all together etc., but even if we double it that still leaves 114mm2. You also probably have a tiny DSP on there and a very small ARM CPU, but that's unlikely to take up more than a few mm2.
Moral Panic
Member
(01-31-2013, 12:29 AM)
Moral Panic's Avatar

Originally Posted by USC-fan

Yes, it's taken from the wall. So you will have to take some off. It's only a couple of watts at most.

It's the percentage that matters, isn't it? The most expensive PSUs are 80 Plus Platinum, where you have 92% efficiency at 50% load, so you lose 8% of the power from the wall to heat. If the PSU is less efficient to save money but still 80 Plus rated, then you lose 20% of the power to heat. It may only be a few points, but the WiiU is only 33W, so a few points off that is a hefty chunk.
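
As a rough illustration with the wall reading and the two efficiency levels mentioned above:

```python
# DC power actually delivered to the console for a ~33 W wall reading,
# at the two 80 Plus efficiency levels discussed above.
wall_watts = 33
for label, efficiency in [("80 Plus (~80%)", 0.80), ("80 Plus Platinum (~92%)", 0.92)]:
    print(f"{label}: ~{wall_watts * efficiency:.1f} W at the console")
# ~26.4 W vs ~30.4 W -- a few efficiency points are a big chunk of a 33 W budget.
```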
ozfunghi
Member
(01-31-2013, 12:35 AM)
ozfunghi's Avatar

Originally Posted by Fourth Storm

It seems much more likely we will find Renesas eDRAM on the GPU. Also, the R700 series has been on 40nm for years now and Nintendo would likely want to take advantage of the maturity of that process. They've had plenty of time to optimize and tweak. It's hard to say how much room the eDRAM will take up. The overhead required might be affected by internal bus width from the eDRAM to the rest of the GPU.

You guys seemed rather excited. Are there any indications we might get a few nice surprises concerning WiiU GPU performance or features, or is it because Chipworks is giving you/us a "better deal" of sorts, which will not relate to performance of the console?
Wishmaster92
Member
(01-31-2013, 12:39 AM)
Wishmaster92's Avatar

Originally Posted by Grampa Simpson

Over 9000.

Ya, has to be in the fousands at least.
ScepticMatt
Member
(01-31-2013, 01:11 AM)
ScepticMatt's Avatar

Originally Posted by Donnie

...that still leaves 114mm2. You also probably have a tiny DSP on there and a very small ARM CPU, but that's unlikely to take up more than a few mm2.

If you subtract a few mm^2 from 114 it would still be close to my number.
Looking at some Anandtech VLIW5 diagrams, the core numbers must be a multiple of 80

So the closest other option would be 400 (440 GFLOPS @550 MHz)

I guess we will know soon anyway.
Here is the RV770 (HD 4870). You can see the 10 SIMD core rows in the middle right, with 10x4 TMUs per SIMD core to the middle left.
(Think of the above diagram rotated 90 degrees.)
Last edited by ScepticMatt; 01-31-2013 at 02:33 AM.
ozfunghi
Member
(01-31-2013, 01:56 AM)
ozfunghi's Avatar
This has been posted up the wazoo in the WUST threads, but how much power would 1GB of GDDR5 consume?

http://www.amd.com/la/Documents/AMD-...duct-brief.pdf

That chip is on a 40nm process, has 480 SPUs, runs at 600MHz and consumes 35 W.
ScepticMatt
Member
(01-31-2013, 02:12 AM)
ScepticMatt's Avatar

Originally Posted by ozfunghi

but how much power would 1GB of GDDR5 consume?

4.3 W @ 1.35V @ 46 nm according to this Samsung PDF I found via Google
could be lower with newer tech

That chip is on a 40nm process, has 480 SPUs, runs at 600MHz and consumes 35 W.

Power consumption scales linearly with clock (8% lower) and with the square of voltage (unknown). So this GPU should be within the WiiU power envelope, I think.
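
A small sketch of that scaling rule applied to the e6760's 35 W figure (the 550 MHz target and the 10% voltage reduction are thread assumptions, not datasheet numbers):

```python
# Dynamic power scales roughly linearly with clock and with voltage squared:
#   P2 = P1 * (f2 / f1) * (V2 / V1)**2
def scale_power(p1_watts, f1_mhz, f2_mhz, v_ratio=1.0):
    return p1_watts * (f2_mhz / f1_mhz) * v_ratio ** 2

e6760_tdp = 35.0  # W at 600 MHz, from the AMD product brief linked above
print(scale_power(e6760_tdp, 600, 550))               # ~32 W at 550 MHz, same voltage
print(scale_power(e6760_tdp, 600, 550, v_ratio=0.9))  # ~26 W with a 10% voltage drop
```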
Last edited by ScepticMatt; 01-31-2013 at 02:26 AM.
ozfunghi
Member
(01-31-2013, 02:22 AM)
ozfunghi's Avatar

Originally Posted by ScepticMatt

4.3 W @ 1.35V @ 46 nm according to this Samsung PDF I found via Google
could be lower with newer tech

Power consumption scales linearly with clock, and with the square of voltage if they manage to reduce it. So this GPU should be within the WiiU power envelope, I think.

That would bring it down to about 28W @ 550MHz minus the GDDR5? What else could be cut that Nintendo might not need? Is 28W realistic when your console - so far - has only used about 30W (33W from the wall)? That would mean current games have only stressed the GPU about halfway... even though there doesn't seem to be much difference in consumption between games. So that doesn't seem likely.
japtor
Member
(01-31-2013, 02:23 AM)
japtor's Avatar

Originally Posted by wsippel

I think they appreciate our craziness, though. It's good marketing.

Originally Posted by lostinblue

I'd most definitely be interested in anything they have to say about it, yes.

And they seem to be cool dudes, I mean it's a business, but they're apparently going the extra mile for us. Gotta appreciate that.

I'm very eager for the results.

Yeah I have to imagine a group of people on a forum just doing this for fun isn't exactly their usual customer, I'm interested in seeing what they're coming up with.

Originally Posted by ozfunghi

You guys seemed rather excited. Are there any indications we might get a few nice surprises concerning WiiU GPU performance or features, or is it because Chipworks is giving you/us a "better deal" of sorts, which will not relate to performance of the console?

The latter for me...although it might relate depending on what they're doing. Ultimately they know a lot more about chip layouts and stuff, so if they can help out in sorting out some details it could help in figuring out performance metrics. Plus they have images of other chips that aren't being bought (yet?) so anything on those could help too.
Smurfman256
Member
(01-31-2013, 02:27 AM)
Smurfman256's Avatar

Originally Posted by oversitting

So who wants to start taking bets on the numbers?

I'm not gonna bet, but my guess is somewhere between 400 and 480 shaders, 32 TEVs and 16 ROPs.
ScepticMatt
Member
(01-31-2013, 02:33 AM)
ScepticMatt's Avatar

Originally Posted by ozfunghi

That would bring it down to about 28W @ 550MHz minus the GDDR5? What else could be cut that Nintendo might not need? Is 28W realistic when your console - so far - has only used about 30W (33W from the wall)? That would mean current games have only stressed the GPU about halfway... even though there doesn't seem to be much difference in consumption between games. So that doesn't seem likely.

23W with a 10% core voltage reduction.
Also, the 35W in the PDF is the "thermal design power", which may not yet have been reached by the WiiU.
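
Roughly how that 23 W figure falls out of the estimates quoted above (all thread estimates, not measurements):

```python
# Chain the thread's numbers: e6760 TDP, reclocked to 550 MHz, minus the
# GDDR5's share, then an assumed 10% core-voltage reduction (squared).
tdp_600mhz = 35.0     # W, AMD e6760 product brief
gddr5_watts = 4.3     # W, Samsung figure quoted earlier

at_550mhz = tdp_600mhz * 550 / 600           # ~32 W
without_gddr5 = at_550mhz - gddr5_watts      # ~28 W
with_voltage_drop = without_gddr5 * 0.9**2   # ~23 W
print(round(with_voltage_drop))
```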
blu
Member
(01-31-2013, 10:33 AM)
blu's Avatar

Originally Posted by tipoo

I'm not one either, but whenever I've built PCs and checked power supply calculators for what I should get, there was always over provisioning for capacitor aging. If it didn't affect output, why would that be there? One example:

http://www.extreme.outervision.com/p...ulatorlite.jsp

Well, I just talked to an actual EE, and that calculator is extreme, to put it mildly.

Aging of electrolyte capacitors is mostly a function of how much load you submit them to, by the simple causality: load -> heat -> evaporation of electrolyte. A quality power supply rated at N watts will use reservoir capacitors (responsible for the smoothing of the voltage into 'proper' DC) of both high-quality electrolyte and of sufficient capacity so that they would not degrade to unacceptable levels below N over the projected lifespan of the device (which can be several decades).

The advice on that site you quoted (that you should go for a larger power reserve over a longer projected lifespan) does help for no other reason than the fact that a higher-power PSU, regardless of its quality, would still use larger capacitors, so when used at lower loads those capacitors will function longer within the expected margins for the PSU. So basically if you have doubts about the quality of the PSU, going with a larger one will buy you some extra lifespan.

But at the same time the site sets the issue of aging PSUs on the wrong premise. A badly-aged PSU does not produce less DC power per se - in 3 years it will not produce perfectly good DC at 30% less (arbitrary numbers) - no, it will produce power of lesser DC quality. Whether for the device using that power that translates to dropping of some power lines and shutdowns, or the death of some components - that's entirely up to how that device was designed to withstand bad DC. And that is what causes devices to fail when used with aged (low-quality) PSUs. Bottom line being, an aged (low quality) PSU is not 'just as good as new but for lower loads' - it's just bad!
Last edited by blu; 01-31-2013 at 02:19 PM.
tipoo
Banned
(01-31-2013, 12:43 PM)
Hmm, interesting. But there's still the fact brought up earlier in the thread that Nintendo system power supplies have almost always been about double the rating of what the system actually drew - there must be a reason for that, even if it's not electrolyte aging. Efficiency maybe, I think PSUs are more efficient at 50-80% of their max load.

My point being that this talk of a future overclock that makes the system draw near double the power to operate faster seems like a pretty huge stretch to me, since it's typical of Nintendo to have a power supply rated for so much more than the system draws. It's also typical of Microsoft and Sony, although perhaps with a smaller discrepancy.
Last edited by tipoo; 01-31-2013 at 12:46 PM.
Shin Johnpv
Ninty Ninty Ninty
(01-31-2013, 01:19 PM)
Shin Johnpv's Avatar

Originally Posted by tipoo

There has to be some over-provisioning. Capacitors age, similar to batteries, and in two years that 75w power supply may only be able to hit say 70 watts (not an actual calculation, just for example). So anything that tries to draw the original 75 watts from it is going to crash. If they want a console to have a healthy lifespan, as Nintendo always does, they have to over provision quite a bit.

I've always been under the impression that PSU capacitor aging leads to lowered efficiency, and not severe drops in power output.

*edit*

Should have read the rest of the thread before replying.
Last edited by Shin Johnpv; 01-31-2013 at 01:33 PM.
Schnozberry
Member
(01-31-2013, 01:46 PM)
Schnozberry's Avatar

Originally Posted by tipoo

Hmm, interesting. But there's still the fact brought up earlier in the thread that Nintendo system power supplies have almost always been about double the rating of what the system actually drew - there must be a reason for that, even if it's not electrolyte aging. Efficiency maybe, I think PSUs are more efficient at 50-80% of their max load.

My point being that this talk of a future overclock that makes the system draw near double the power to operate faster seems like a pretty huge stretch to me, since it's typical of Nintendo to have a power supply rated for so much more than the system draws. It's also typical of Microsoft and Sony, although perhaps with a smaller discrepancy.

I don't think anybody was banking on future overclocks. Mario was the game said to draw 33W, so I thought it might have been possible that it wasn't pushing the hardware to full load, because Iwata had said in a Nintendo Direct that the system would normally draw 45W in operation. What we don't know is whether the 45W he spoke of includes headroom for attached devices via USB and using all the device's various radio antennas. It seems logical that it would, so it's probably a moot point.
tipoo
Banned
(01-31-2013, 02:05 PM)

Originally Posted by Schnozberry

I don't think anybody was banking on future overclocks. Mario was the game said to draw 33W, so I thought it might have been possible that it wasn't pushing the hardware to full load, because Iwata had said in a Nintendo Direct that the system would normally draw 45W in operation. What we don't know is whether the 45W he spoke of includes headroom for attached devices via USB and using all the device's various radio antennas. It seems logical that it would, so it's probably a moot point.

It was 33W during Mass Effect 3 as well. Eurogamer wasn't able to make it budge an inch past 33W with any game. USB peripherals are probably a different story, but in terms of the internals being underutilized right now I don't think that's the case. Like most consoles it seems to only have a few power modes to pick from, "off", "gaming", and "netflix", no matter how light or heavy the game is. They don't bother with SpeedStep- and PowerPlay-like technologies in consoles.

http://www.eurogamer.net/articles/di...-green-console
Last edited by tipoo; 01-31-2013 at 02:08 PM.
Donnie
Member
(01-31-2013, 02:24 PM)
Of course they don't change clock speeds. But still, there are Xbox 360/PS3 games that use around 10% more power than other games on the same system. So I expect something similar with WiiU.
Last edited by Donnie; 01-31-2013 at 02:31 PM.
Nostremitus
Member
(01-31-2013, 02:31 PM)
Nostremitus's Avatar

Originally Posted by Schnozberry

I don't think anybody was banking on future overclocks. Mario was the game said to draw 33W, so I thought it might have been possible that it wasn't pushing the hardware to full load, because Iwata had said in a Nintendo Direct that the system would normally draw 45W in operation. What we don't know is whether the 45W he spoke of includes headroom for attached devices via USB and using all the device's various radio antennas. It seems logical that it would, so it's probably a moot point.

If Cheesemeister's translation is accurate...

Originally Posted by Cheesemeister

The Wii U is rated at 75 watts of electrical consumption.
Please understand that this electrical consumption rating is measured at the maximum utilization of all functionality, not just of the Wii U console itself, but also the power provided to accessories connected via USB ports.
However, during normal gameplay, that electrical consumption rating won't be reached.
Depending on the game being played and the accessories connected, roughly 40 watts of electrical consumption could be considered realistic.

This makes it sound as though the 40W does not include USB, but it could have been misstated.
Kenka
Member
(01-31-2013, 02:37 PM)
Kenka's Avatar
Since the money was gathered, how long should we wait to get the die pictures?
Thanks to everyone who contributed. I came too late to throw in my bucks :-(
chaosblade
Member
(01-31-2013, 02:43 PM)
chaosblade's Avatar

Originally Posted by Kenka

Since the money was gathered, how long should we wait to get the die pictures?
Thanks to everyone who contributed. I came too late to throw in my bucks :-(

See this post. Chances are only Fourth Storm will be getting the actual details/picture, he'll just pass along info to everyone else.

http://www.neogaf.com/forum/showpost...postcount=2475
Kenka
Member
(01-31-2013, 02:51 PM)
Kenka's Avatar

Originally Posted by chaosblade

See this post. Chances are only Fourth Storm will be getting the actual details/picture, he'll just pass along info to everyone else.

http://www.neogaf.com/forum/showpost...postcount=2475

Thanks! I am looking forward to the actual news blowout.

Originally Posted by ozfunghi

This has been posted up the wazoo in the WUST threads, but how much would 1GB GDDR5 consume in power?

http://www.amd.com/la/Documents/AMD-...duct-brief.pdf

That chip is on a 40nm process, has 480 SPUs, runs at 600MHz and consumes 35 W.

This was long considered on GAF to be the chip the GPU is based on. I hope we get something comparable to the e6760 in terms of power.
Schnozberry
Member
(01-31-2013, 03:06 PM)
Schnozberry's Avatar

Originally Posted by tipoo

It was 33W during Mass Effect 3 as well. Eurogamer wasn't able to make it budge an inch past 33W with any game. USB peripherals are probably a different story, but in terms of the internals being underutilized right now I don't think that's the case. Like most consoles it seems to only have a few power modes to pick from, "off", "gaming", and "netflix", no matter how light or heavy the game is. They don't bother with SpeedStep- and PowerPlay-like technologies in consoles.

http://www.eurogamer.net/articles/di...-green-console

Ok, I missed that for some reason. Thanks for pointing it out. How many watts are we assuming the CPU consumes? 5-10?
prag16
Member
(01-31-2013, 03:10 PM)
prag16's Avatar

Originally Posted by Schnozberry

Ok, I missed that for some reason. Thanks for pointing it out. How many watts are we assuming the CPU consumes? 5-10?

I think somebody was estimating under 2W per core at one point recently; forget who/where. So ~5.
Smurfman256
Member
(01-31-2013, 03:20 PM)
Smurfman256's Avatar

Originally Posted by prag16

I think somebody was estimating under 2W per core at one point recently; forget who/where. So ~5.

If my math is right (and there might be a few errors): if Gekko ran at 486MHz and had a TDP of 4.9W at 180nm, then that means that a 729MHz Broadway core @ 90nm draws 3.675 watts per core, and a 1.25GHz Espresso (1.4x the clock speed of Broadway) at 45nm draws 2.5725 watts/core. It's actually closer to 8 watts for the CPU (7.7175, to be exact).
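
The post's scaling, reproduced as stated (halving power per listed process shrink and scaling linearly with clock - an assumption the reply below pushes back on):

```python
# Back-of-envelope per-core power, using the post's own assumptions:
# power scales linearly with clock and halves per process shrink listed.
gekko_tdp = 4.9                          # W at 486 MHz, 180 nm

broadway = gekko_tdp * (729 / 486) / 2   # ~3.675 W per core at 90 nm
espresso_core = broadway * 1.4 / 2       # ~2.57 W per core at 45 nm (post's 1.4x clock ratio)
espresso_total = espresso_core * 3       # ~7.7 W for three cores
print(broadway, espresso_core, espresso_total)
```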
Last edited by Smurfman256; 01-31-2013 at 03:39 PM.
Chronos24
Junior Member
(01-31-2013, 04:20 PM)

Originally Posted by Smurfman256

If my math is right (and there might be a few errors): if Gekko ran at 486MHz and had a TDP of 4.9W at 180nm, then that means that a 729MHz Broadway core @ 90nm draws 3.675 watts per core, and a 1.25GHz Espresso (1.4x the clock speed of Broadway) at 45nm draws 2.5725 watts/core. It's actually closer to 8 watts for the CPU (7.7175, to be exact).

So that leaves 25 more watts for everything else. I wonder how much the GPU is using in that regard.
Donnie
Member
(01-31-2013, 04:29 PM)
I doubt 33w will end up being the max the system uses. It'll probably push 36-37w max IMO.
Last edited by Donnie; 01-31-2013 at 04:32 PM.
Fourth Storm
Member
(01-31-2013, 04:38 PM)
Fourth Storm's Avatar

Originally Posted by Nostremitus

If Cheesemeister's translation is accurate...



This makes it sound as though the 40w does not include USB, but it could have been mistated.

I'm guessing Iwata threw in the load for one USB peripheral to reach that ballpark. It doesn't seem that games have gone past 33 watts standalone.

I'd still be surprised if there wasn't at least some minimal fluctuation. I remember when my Wii GPU was in the process of frying (as apparently the WiiConnect24 setting should have been labeled "Low and Slow"), it was only specific and seemingly graphically demanding games that would set it off. I could play Wii Sports and NSMBWii fine, but pop in SMG2 or Silent Hill: SM and the image would quickly start showing telltale signs of GPU damage. I would imagine that this would indicate that not all loads were necessarily equal. Perhaps somebody else has a better answer, though...
OryoN
Member
(01-31-2013, 04:49 PM)
OryoN's Avatar

Originally Posted by Fourth Storm

Hey guys, I suppose I should give the contributors somewhat of an update. No, I don't have the image yet. However, I have been in contact with some friendly people at Chipworks and I assure you that you will be satisfied with the information to come in a few more days. They are actually going to special lengths for us. I humbly ask for a bit more patience and to not bombard them with emails.

Interesting...didn't see that bold part before. Is it possible they can confirm the GPU's manufacturing process for us too? They seem to have all the tools for this kind of stuff.
blu
Member
(01-31-2013, 04:51 PM)
blu's Avatar
I just noticed some wrong numbers posted earlier that slipped by unnoticed.

Originally Posted by joesiv

True enough.

I did some more research and I found:

The Wii had a power supply of 52 watts, but at load only drew 18 watts in games
The original Xbox 360 had a power supply of 203 watts and a load of 186 watts in games
The GameCube had a power supply of 48 watts, and a load of ~23 watts in games

Given Nintendo's history, using < 50% of the rated power supply seems normal. Boo...

Gamecube's power supply is DC 12V, 3.5A = 42W.
Wii's power supply is DC 12V, 3.7A = 44.4W.
Fourth Storm
Member
(01-31-2013, 04:57 PM)
Fourth Storm's Avatar

Originally Posted by OryoN

Interesting...didn't see that bold part before. Is it possible they can confirm the GPU's manufacturing process for us too? They seem to have all the tools for this kind of stuff.

I don't have the answer to that yet, but it certainly seems possible.
McHuj
Member
(01-31-2013, 05:05 PM)
McHuj's Avatar

Originally Posted by Smurfman256

If my math is right (and there might be a few errors): if Gekko ran at 486MHz and had a TDP of 4.9W at 180nm, then that means that a 729MHz Broadway core @ 90nm draws 3.675 watts per core, and a 1.25GHz Espresso (1.4x the clock speed of Broadway) at 45nm draws 2.5725 watts/core. It's actually closer to 8 watts for the CPU (7.7175, to be exact).

Your math is off, but the actual power numbers may be close to your estimates.

180nm => 90nm is a 4X reduction in area; there's a 130nm node in between. Same for 90nm => 45nm; there's a 65nm node in between. Unfortunately, power hasn't scaled as well as density. If it did, you could divide your numbers in half again.
joesiv
Member
(01-31-2013, 05:16 PM)

Originally Posted by blu

I just noticed some wrong numbers posted earlier that slipped unnoticed.


Gamecube's power supply is DC 12V, 3.5A = 42W.
Wii's power supply is DC 12V, 3.7A = 44.4W.

My apologies if they were incorrect, I went with images of the power supplies off Google and just took the watt value that was printed on the label. It's possible that the images I used were third party, or something...

*edit* That's interesting, isn't this the official NGC power supply?

What's interesting is it shows 48W on the AC side, and 12V @ 3.25A on the output side (39 watts). Conveniently that's around 81%, perhaps it's taking the efficiency into account? Anyways, my numbers for the GC and Wii were on the AC side. Can someone check what DC rating (specifically amps) the WiiU's power supply label says?
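
A quick check of that ratio (using just the label numbers quoted above):

```python
# GameCube supply label: 48 W on the AC (input) side, 12 V x 3.25 A on the DC side.
ac_watts = 48
dc_watts = 12 * 3.25                  # 39 W
print(dc_watts, dc_watts / ac_watts)  # 39.0, ~0.81 -> the ~81% figure above
```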
Last edited by joesiv; 01-31-2013 at 05:32 PM.
chaosblade
Member
(01-31-2013, 05:38 PM)
chaosblade's Avatar
15 volts, 5 amps, which would be 75W.
joesiv
Member
(01-31-2013, 06:04 PM)

Originally Posted by chaosblade

15 volts, 5 amps, which would be 75w.

Well, that's interesting, thanks!
