
ScepticMatt
Member
(01-31-2013, 02:11 AM)
ScepticMatt's Avatar

Originally Posted by Donnie

That still leaves 114mm2. You also probably have a tiny DSP on there and a very small ARM CPU, but that's unlikely to take up more than a few mm2.

If you subtract a few mm2 from 114 it would still come close to my number.
Looking at some Anandtech VLIW5 diagrams, the core count must be a multiple of 80

So the closest other option would be 400 (440 GFLOPS @550 MHz)

I guess we will know soon anyway.
Here is the RV770 (HD 4870). You can see the 10 SIMD core rows in the middle right, with the TMUs (10x4, four per SIMD core) to the middle left.
(think of the above diagram rotated 90 degrees)
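For reference, the GFLOPS figures being thrown around follow from a simple formula (shader count x 2 FLOPs per multiply-add per cycle x clock); a quick sketch with the numbers from this thread:

```python
def peak_gflops(shaders, clock_mhz):
    """Peak single-precision throughput for an AMD VLIW-style GPU:
    each stream processor can issue one multiply-add (2 FLOPs) per cycle."""
    return shaders * 2 * clock_mhz / 1000.0

print(peak_gflops(400, 550))  # 440.0 -> the "400 (440 GFLOPS @ 550 MHz)" option
print(peak_gflops(480, 600))  # 576.0 -> an e6760-class part (480 SPUs @ 600 MHz)
```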
ozfunghi
Member
(01-31-2013, 02:56 AM)
ozfunghi's Avatar
This has been posted up the wazoo in the WUST threads, but how much would 1GB GDDR5 consume in power?

http://www.amd.com/la/Documents/AMD-...duct-brief.pdf

That chip is on a 40nm process, has 480 SPU's, runs at 600MHz and consumes 35 W.
ScepticMatt
Member
(01-31-2013, 03:12 AM)
ScepticMatt's Avatar

Originally Posted by ozfunghi

but how much would 1GB GDDR5 consume in power?

4.3 W @ 1.35V @ 46 nm according to this Samsung PDF I found via Google
could be lower with newer tech

That chip is on a 40nm process, has 480 SPU's, runs at 600MHz and consumes 35 W.

Power consumption scales linearly with clock (8% lower) and with the square of voltage (unknown). So this GPU should be within the WiiU power envelope, I think.
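That scaling rule can be sketched out quickly; the 10% voltage reduction below is just an illustrative assumption, not a known WiiU figure:

```python
def scale_power(p_watts, f_old_mhz, f_new_mhz, v_ratio=1.0):
    """Dynamic power scales roughly linearly with clock frequency and
    with the square of the supply voltage: P ~ f * V^2."""
    return p_watts * (f_new_mhz / f_old_mhz) * v_ratio ** 2

print(round(scale_power(35.0, 600, 550), 1))       # 32.1 W from the clock drop alone
print(round(scale_power(35.0, 600, 550, 0.9), 1))  # 26.0 W with an assumed 10% lower voltage
```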
ozfunghi
Member
(01-31-2013, 03:22 AM)
ozfunghi's Avatar

Originally Posted by ScepticMatt

4.3 W @ 1.35V @ 46 nm according to this Samsung PDF I found via Google
could be lower with newer tech

Power consumption scales linearly with clock, and with the square of voltage if they manage to reduce it. So this GPU should be within the WiiU power envelope, I think.

That would bring it down to about 28W @ 550MHz minus the GDDR5? What else could be cut that Nintendo might not need? Is 28W realistic when your console - so far - has only used about 30W (33W from the wall)? That would mean current games have only stressed the GPU about halfway... even though there doesn't seem to be much difference in consumption between games. So that doesn't seem likely.
japtor
Member
(01-31-2013, 03:23 AM)
japtor's Avatar

Originally Posted by wsippel

I think they appreciate our craziness, though. It's good marketing.

Originally Posted by lostinblue

I'd most definitely be interested in anything they have to say about it, yes.

And they seem to be cool dudes, I mean it's a business, but they're apparently going the extra mile for us. Gotta appreciate that.

I'm very eager for the results.

Yeah, I have to imagine a group of people on a forum just doing this for fun isn't exactly their usual customer; I'm interested in seeing what they come up with.

Originally Posted by ozfunghi

You guys seemed rather excited. Are there any indications we might get a few nice surprises concerning WiiU GPU performance or features, or is it because Chipworks is giving you/us a "better deal" of sorts, which will not relate to performance of the console?

The latter for me...although it might relate depending on what they're doing. Ultimately they know a lot more about chip layouts and stuff, so if they can help out in sorting out some details it could help in figuring out performance metrics. Plus they have images of other chips that aren't being bought (yet?) so anything on those could help too.
Smurfman256
Member
(01-31-2013, 03:27 AM)
Smurfman256's Avatar

Originally Posted by oversitting

So who wants to start taking bets on the numbers?

I'm not gonna bet, but my guess is somewhere between 400 and 480 shaders, 32 TEVs and 16 ROPs.
ScepticMatt
Member
(01-31-2013, 03:33 AM)
ScepticMatt's Avatar

Originally Posted by ozfunghi

That would bring it down to about 28W @ 550MHz minus the GDDR5? What else could be cut that Nintendo might not need? Is 28W realistic when your console - so far - has only used about 30W (33W from the wall)? That would mean current games have only stressed the GPU about halfway... even though there doesn't seem to be much difference in consumption between games. So that doesn't seem likely.

23W with a 10% core voltage reduction.
Also, the 35W in the PDF is the "thermal design power", which the WiiU may not have reached yet.
blu
Wants the largest console games publisher to avoid Nintendo's platforms.
(01-31-2013, 11:33 AM)
blu's Avatar

Originally Posted by tipoo

I'm not one either, but whenever I've built PCs and checked power supply calculators for what I should get, there was always over provisioning for capacitor aging. If it didn't affect output, why would that be there? One example:

http://www.extreme.outervision.com/p...ulatorlite.jsp

Well, I just talked to an actual EE, and that calculator is extreme, to put it mildly.

Aging of electrolytic capacitors is mostly a function of how much load you subject them to, by the simple causality: load -> heat -> evaporation of electrolyte. A quality power supply rated at N watts will use reservoir capacitors (responsible for smoothing the voltage into 'proper' DC) of both high-quality electrolyte and sufficient capacity, so that they will not degrade to unacceptable levels below N over the projected lifespan of the device (which can be several decades).

The advice on that site you quoted (that you should go for a larger power reserve over a longer projected lifespan) does help for no other reason than the fact that a higher-power PSU, regardless of its quality, would still use larger capacitors, so when used at lower loads those capacitors will function longer within the expected margins for the PSU. So basically if you have doubts about the quality of the PSU, going with a larger one will buy you some extra lifespan.

But at the same time the site sets the issue of aging PSUs on the wrong premise. A badly-aged PSU does not produce less DC power per se - in 3 years it will not produce perfectly good DC at 30% less (arbitrary numbers) - no, it will produce power of lesser DC quality. Whether that translates, for the device using that power, into dropped power lines and shutdowns, or the death of some components, is entirely up to how the device was designed to withstand bad DC. And that is what causes devices to fail when used with aged (low-quality) PSUs. Bottom line: an aged (low-quality) PSU is not 'just as good as new but for lower loads' - it's just bad!
tipoo
Banned
(01-31-2013, 01:43 PM)
Hmm, interesting. But there's still the fact brought up earlier in the thread that Nintendo system power supplies have almost always been rated for about double what the system actually drew - there must be a reason for that, even if it's not electrolyte aging. Efficiency, maybe; I think PSUs are most efficient at 50-80% of their max load.

My point being that this talk of a future overclock that makes the system draw near double the power to operate faster seems like a pretty huge stretch to me, since it's typical of Nintendo to have a power supply rated for so much more than the system draws. It's also typical of Microsoft and Sony, although perhaps to a lesser discrepancy.
Shin Johnpv
Ninty Ninty Ninty
(01-31-2013, 02:19 PM)
Shin Johnpv's Avatar

Originally Posted by tipoo

There has to be some over-provisioning. Capacitors age, similar to batteries, and in two years that 75w power supply may only be able to hit say 70 watts (not an actual calculation, just for example). So anything that tries to draw the original 75 watts from it is going to crash. If they want a console to have a healthy lifespan, as Nintendo always does, they have to over provision quite a bit.

I've always been under the impression that PSU capacitor aging leads to lowered efficiency, not severe drops in power output.

*edit*

Should have read the rest of the thread before replying.
Schnozberry
Member
(01-31-2013, 02:46 PM)
Schnozberry's Avatar

Originally Posted by tipoo

Hmm, interesting. But there's still the fact brought up earlier in the thread that Nintendo system power supplies have almost always been rated for about double what the system actually drew - there must be a reason for that, even if it's not electrolyte aging. Efficiency, maybe; I think PSUs are most efficient at 50-80% of their max load.

My point being that this talk of a future overclock that makes the system draw near double the power to operate faster seems like a pretty huge stretch to me, since it's typical of Nintendo to have a power supply rated for so much more than the system draws. It's also typical of Microsoft and Sony, although perhaps to a lesser discrepancy.

I don't think anybody was banking on future overclocks. Mario was the game said to draw 33w, so I thought it might have been possible that it wasn't pushing the hardware to full load, because Iwata had said in a ND that the system would normally draw 45w in operation. What we don't know is whether the 45w he spoke of includes headroom for attached devices via USB and use of the various radio antennas. It seems logical that it would, so it's probably a moot point.
tipoo
Banned
(01-31-2013, 03:05 PM)

Originally Posted by Schnozberry

I don't think anybody was banking on future overclocks. Mario was the game said to draw 33w, so I thought it might have been possible that it wasn't pushing the hardware to full load, because Iwata had said in a ND that the system would normally draw 45w in operation. What we don't know, is if the 45w he spoke of includes headroom for attached devices via USB and using all the devices various radio antennas. It seems logical that it would, so it's probably a moot point.

It was 33w during Mass Effect 3 as well. Eurogamer wasn't able to make it budge an inch past 33w with any game. USB peripherals are probably a different story, but in terms of the internals being underutilized right now, I don't think that's the case. Like most consoles it seems to only have a few power modes to pick from: "off", "gaming", and "netflix", no matter how light or heavy the game is. They don't bother with SpeedStep- and PowerPlay-like technologies in consoles.

http://www.eurogamer.net/articles/di...-green-console
Donnie
Member
(01-31-2013, 03:24 PM)
Of course they don't change clock speeds. But there are still Xbox 360/PS3 games that use around 10% more power than other games on the same system, so I expect something similar with the WiiU.
Nostremitus
Member
(01-31-2013, 03:31 PM)
Nostremitus's Avatar

Originally Posted by Schnozberry

I don't think anybody was banking on future overclocks. Mario was the game said to draw 33w, so I thought it might have been possible that it wasn't pushing the hardware to full load, because Iwata had said in a ND that the system would normally draw 45w in operation. What we don't know, is if the 45w he spoke of includes headroom for attached devices via USB and using all the devices various radio antennas. It seems logical that it would, so it's probably a moot point.

If Cheesemeister's translation is accurate...

Originally Posted by Cheesemeister

The Wii U is rated at 75 watts of electrical consumption.
Please understand that this electrical consumption rating is measured at the maximum utilization of all functionality, not just of the Wii U console itself, but also the power provided to accessories connected via USB ports.
However, during normal gameplay, that electrical consumption rating won't be reached.
Depending on the game being played and the accessories connected, roughly 40 watts of electrical consumption could be considered realistic.

This makes it sound as though the 40w does not include USB, but it could have been misstated.
Kenka
Member
(01-31-2013, 03:37 PM)
Kenka's Avatar
Since the money has been gathered, how long should we wait for the die pictures?
Thanks to everyone who contributed. I came too late to throw in my bucks :-(
chaosblade
Member
(01-31-2013, 03:43 PM)
chaosblade's Avatar

Originally Posted by Kenka

Since the money has been gathered, how long should we wait for the die pictures?
Thanks to everyone who contributed. I came too late to throw in my bucks :-(

See this post. Chances are only Fourth Storm will be getting the actual details/picture, he'll just pass along info to everyone else.

http://www.neogaf.com/forum/showpost...postcount=2475
Kenka
Member
(01-31-2013, 03:51 PM)
Kenka's Avatar

Originally Posted by chaosblade

See this post. Chances are only Fourth Storm will be getting the actual details/picture, he'll just pass along info to everyone else.

http://www.neogaf.com/forum/showpost...postcount=2475

Thanks! I'm looking forward to the actual news blowout.

Originally Posted by ozfunghi

This has been posted up the wazoo in the WUST threads, but how much would 1GB GDDR5 consume in power?

http://www.amd.com/la/Documents/AMD-...duct-brief.pdf

That chip is on a 40nm process, has 480 SPU's, runs at 600MHz and consumes 35 W.

This was long considered on GAF as the culprit on which the GPU is based. I hope we get something comparable to the e6760 in terms of power.
Schnozberry
Member
(01-31-2013, 04:06 PM)
Schnozberry's Avatar

Originally Posted by tipoo

It was 33w during Mass Effect 3 as well. Eurogamer wasn't able to make it budge an inch past 33w with any game. USB peripherals are probably a different story, but in terms of the internals being underutilized right now, I don't think that's the case. Like most consoles it seems to only have a few power modes to pick from: "off", "gaming", and "netflix", no matter how light or heavy the game is. They don't bother with SpeedStep- and PowerPlay-like technologies in consoles.

http://www.eurogamer.net/articles/di...-green-console

Ok, I missed that for some reason. Thanks for pointing it out. How many watts are we assuming the CPU consumes? 5-10?
prag16
Member
(01-31-2013, 04:10 PM)
prag16's Avatar

Originally Posted by Schnozberry

Ok, I missed that for some reason. Thanks for pointing it out. How many watts are we assuming the CPU consumes? 5-10?

I think somebody was estimating under 2W per core at one point recently; I forget who/where. So ~5W.
Smurfman256
Member
(01-31-2013, 04:20 PM)
Smurfman256's Avatar

Originally Posted by prag16

I think somebody was estimating under 2 per core at one point recently; forget who/where. So ~5.

If my math is right (and there might be a few errors): if Gekko ran at 486MHz with a TDP of 4.9W at 180nm, then a 729MHz Broadway core @ 90nm draws 3.675 watts, and a 1.25GHz Espresso core (1.4x the clock speed of Broadway) at 45nm draws 2.5725 watts. So it's actually closer to 8 watts for the CPU (7.7175, to be exact).
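The post's arithmetic, sketched out. Both rules of thumb (power halves per full process shrink and scales linearly with clock) and the 1.4x clock ratio are taken from the post as stated (1250/729 is actually closer to 1.71x), so this reproduces the estimate rather than endorsing it:

```python
def naive_scale(p_watts, clock_ratio):
    """Post's rule of thumb: power scales linearly with clock and
    halves with each full process shrink (180 -> 90 -> 45 nm)."""
    return p_watts * clock_ratio / 2

gekko = 4.9                                 # W, 486 MHz @ 180 nm
broadway = naive_scale(gekko, 729 / 486)    # 3.675 W, 729 MHz @ 90 nm
espresso_core = naive_scale(broadway, 1.4)  # 2.5725 W per core (post's 1.4x ratio)
print(3 * espresso_core)                    # ~7.7175 W for the three cores
```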
Chronos24
Member
(01-31-2013, 05:20 PM)

Originally Posted by Smurfman256

If my math is right (and there might be a few errors): if Gekko ran at 486MHz with a TDP of 4.9W at 180nm, then a 729MHz Broadway core @ 90nm draws 3.675 watts, and a 1.25GHz Espresso core (1.4x the clock speed of Broadway) at 45nm draws 2.5725 watts. So it's actually closer to 8 watts for the CPU (7.7175, to be exact).

So that leaves 25 more watts for everything else. I wonder how much the GPU is using in that regard.
Donnie
Member
(01-31-2013, 05:29 PM)
I doubt 33w will end up being the max the system uses. It'll probably push 36-37w max IMO.
Fourth Storm
Member
(01-31-2013, 05:38 PM)
Fourth Storm's Avatar

Originally Posted by Nostremitus

If Cheesemeister's translation is accurate...



This makes it sound as though the 40w does not include USB, but it could have been misstated.

I'm guessing Iwata threw in the load for one USB peripheral to reach that ballpark. It doesn't seem that games have gone past 33 watts standalone.

I'd still be surprised if there wasn't at least some minimal fluctuation. I remember when my Wii GPU was in the process of frying (as apparently the WiiConnect24 setting should have been labeled "Low and Slow"), it was only specific and seemingly graphically demanding games that would set it off. I could play Wii Sports and NSMBWii fine, but pop in SMG2 or Silent Hill: SM and the image would quickly start showing telltale signs of GPU damage. I would imagine that this would indicate that not all loads were necessarily equal. Perhaps somebody else has a better answer, though...
OryoN
Member
(01-31-2013, 05:49 PM)
OryoN's Avatar

Originally Posted by Fourth Storm

Hey guys, I suppose I should give the contributors somewhat of an update. No, I don't have the image yet. However, I have been in contact with some friendly people at Chipworks and I assure you that you will be satisfied with the information to come in a few more days. They are actually going to special lengths for us. I humbly ask for a bit more patience and to not bombard them with emails.

Interesting...didn't see that bold part before. Is it possible they can confirm the GPU's manufacturing process for us too? They seem to have all the tools for this kind of stuff.
blu
Wants the largest console games publisher to avoid Nintendo's platforms.
(01-31-2013, 05:51 PM)
blu's Avatar
I just noticed some wrong numbers posted earlier that slipped by unnoticed.

Originally Posted by joesiv

True enough.

I did some more research and i Found:

The Wii had a 52W power supply, but only drew 18W under load in games.
The original Xbox 360 had a 203W power supply and a 186W load in games.
The Gamecube had a 48W power supply and a ~23W load in games.

Given Nintendo's history, using < 50% of the rated power supply seems normal. Boo...

Gamecube's power supply is DC 12V, 3.5A = 42W.
Wii's power supply is DC 12V, 3.7A = 44.4W.
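Those corrected figures come straight off the DC rating on each brick's label (P = V x I); the WiiU's 15V/5A rating is confirmed a few posts down:

```python
def dc_watts(volts, amps):
    """DC output rating on a power brick's label: P = V * I."""
    return round(volts * amps, 1)

print(dc_watts(12, 3.5))   # 42.0 W (Gamecube)
print(dc_watts(12, 3.7))   # 44.4 W (Wii)
print(dc_watts(15, 5.0))   # 75.0 W (WiiU)
```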
Fourth Storm
Member
(01-31-2013, 05:57 PM)
Fourth Storm's Avatar

Originally Posted by OryoN

Interesting...didn't see that bold part before. Is it possible they can confirm the GPU's manufacturing process for us too? They seem to have all the tools for this kind of stuff.

I don't have the answer to that yet, but it certainly seems possible.
McHuj
Member
(01-31-2013, 06:05 PM)
McHuj's Avatar

Originally Posted by Smurfman256

If my math is right (and there might be a few errors): if Gekko ran at 486MHz with a TDP of 4.9W at 180nm, then a 729MHz Broadway core @ 90nm draws 3.675 watts, and a 1.25GHz Espresso core (1.4x the clock speed of Broadway) at 45nm draws 2.5725 watts. So it's actually closer to 8 watts for the CPU (7.7175, to be exact).

Your math is off, but the actual power numbers maybe close to your estimates.

180nm => 90nm is a 4x reduction in area; there's a 130nm node in between. Same for 90nm => 45nm; there's a 65nm node in between. Unfortunately, power hasn't scaled as well as density. If it had, you could halve your numbers again.
joesiv
Member
(01-31-2013, 06:16 PM)

Originally Posted by blu

I just noticed some wrong numbers posted earlier that slipped unnoticed.


Gamecube's power supply is DC 12V, 3.5A = 42W.
Wii's power supply is DC 12V, 3.7A = 44.4W.

My apologies if they were incorrect. I went with images of the power supplies off Google and just took the watt value printed on the label. It's possible the images I used were third party, or something...

*edit*, that's interesting, isn't this the official NGC power supply?


What's interesting is that it shows 48W on the AC side and 12V @ 3.25A on the output side (39 watts). Conveniently that's about 81%; perhaps it's taking the efficiency into account? Anyway, my numbers for the GC and Wii were on the AC side. Can someone check what DC rating (specifically amps) the WiiU's power supply label gives?
chaosblade
Member
(01-31-2013, 06:38 PM)
chaosblade's Avatar
15 volts, 5 amps, which would be 75w.
joesiv
Member
(01-31-2013, 07:04 PM)

Originally Posted by chaosblade

15 volts, 5 amps, which would be 75w.

Well, that's interesting, thanks!
tipoo
Banned
(01-31-2013, 07:23 PM)
Around 8 watts max sounds right for the CPU, I would bet dollars to donuts that the larger TDP budget was given to the GPU (pretty safe bet based on size alone). Anyone know how much the optical drive uses? I'm guessing the GPU would be left with under 20, all things accounted for. Maybe 15 ish. There's still NAND, controllers, RAM, etc to account for.
Lonely1
Member
(01-31-2013, 07:52 PM)
Lonely1's Avatar

Originally Posted by OryoN

Interesting...didn't see that bold part before. Is it possible they can confirm the GPU's manufacturing process for us too? They seem to have all the tools for this kind of stuff.

Do we know the 3DS manufacturing process?
GhostTrick
Member
(01-31-2013, 07:53 PM)
GhostTrick's Avatar

Originally Posted by Lonely1

Do we know the 3DS manufacturing process?

I thought it was 65nm or 45nm.
Lonely1
Member
(01-31-2013, 07:55 PM)
Lonely1's Avatar

Originally Posted by GhostTrick

I thought it was 65nm or 45nm.

Well, yes. It must be one of those two, but I have yet to read anywhere stating which.
z0m3le
Banned
(01-31-2013, 07:56 PM)
Wouldn't they use less voltage for the smaller CPU? Or is that already taken into account? Also, while it's based on the Wii CPU, it isn't exactly three Wii CPUs shrunk to 45nm and stitched together; that's not technically how it was done. Was Gekko to Broadway the same efficiency gain people are suggesting here, or was more done to Broadway to make it more efficient?

The same can be said about the GPU; it's hard to guess what the components are using. The disc drive should be easy to find, though: measure a digital download game and then the same game playing from disc, and the difference would give you that measurement. (At least probably.)

ARM, DSP... these should use very little wattage. The RAM is the low-powered stuff too, right? 4 chips; I'm sure it can be googled.
Lonely1
Member
(01-31-2013, 07:57 PM)
Lonely1's Avatar

Originally Posted by z0m3le

Wouldn't they use less voltage for the smaller CPU? Or is that already taken into account? Also, while it's based on the Wii CPU, it isn't exactly three Wii CPUs shrunk to 45nm and stitched together; that's not technically how it was done. Was Gekko to Broadway the same efficiency gain people are suggesting here, or was more done to Broadway to make it more efficient?

The same can be said about the GPU; it's hard to guess what the components are using. The disc drive should be easy to find, though: measure a digital download game and then the same game playing from disc, and the difference would give you that measurement. (At least probably.)

ARM, DSP... these should use very little wattage. The RAM is the low-powered stuff too, right? 4 chips; I'm sure it can be googled.

Bigger caches at the least.
z0m3le
Banned
(01-31-2013, 08:00 PM)

Originally Posted by Lonely1

Bigger caches at the least.

I mostly just meant from an energy standpoint; IIRC performance was slightly improved thanks to, as you say, the increase in cache size.
blu
Wants the largest console games publisher to avoid Nintendo's platforms.
(01-31-2013, 11:58 PM)
blu's Avatar

Originally Posted by joesiv

My apologies if they were incorrect. I went with images of the power supplies off Google and just took the watt value printed on the label. It's possible the images I used were third party, or something...

*edit*, that's interesting, isn't this the official NGC power supply?


What's interesting is that it shows 48W on the AC side and 12V @ 3.25A on the output side (39 watts). Conveniently that's about 81%; perhaps it's taking the efficiency into account? Anyway, my numbers for the GC and Wii were on the AC side. Can someone check what DC rating (specifically amps) the WiiU's power supply label gives?

Da heck? I had it wrong all this time? Serves me right for not actually checking the brick before posting.
Metal Gear?!
Member
(02-01-2013, 05:27 AM)
Metal Gear?!'s Avatar

Originally Posted by joesiv

What's interesting is that it shows 48W on the AC side and 12V @ 3.25A on the output side (39 watts). Conveniently that's about 81%; perhaps it's taking the efficiency into account?

Yes, roughly 80% efficiency is about what you would expect from power supply conversion.
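A one-line check of that figure using the GameCube label values quoted above (DC out divided by AC in):

```python
def psu_efficiency(dc_out_watts, ac_in_watts):
    """Conversion efficiency: DC power delivered / AC power drawn."""
    return dc_out_watts / ac_in_watts

print(psu_efficiency(12 * 3.25, 48))  # 0.8125 -> ~81%, as observed
```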
ozfunghi
Member
(02-01-2013, 12:51 PM)
ozfunghi's Avatar
Any new updates on the Chipworks story?
Captain Smoker
"Hey! What's your name
  then?"
"Mancomb Seepgood."
(02-01-2013, 12:55 PM)
Captain Smoker's Avatar

Originally Posted by ozfunghi

Any new updates on the Chipworks story?




:P ;)
lostinblue
Member
(02-01-2013, 04:32 PM)
lostinblue's Avatar

Originally Posted by tipoo

Around 8 watts max sounds right for the CPU, I would bet dollars to donuts that the larger TDP budget was given to the GPU (pretty safe bet based on size alone). Anyone know how much the optical drive uses? I'm guessing the GPU would be left with under 20, all things accounted for. Maybe 15 ish. There's still NAND, controllers, RAM, etc to account for.

The Blu-ray drive should be 12 watts maximum, seeing as that's the max TDP listed for such parts on desktop PCs.

Probably closer to 4-5 watts (it's possible; my only doubt stems from the fact that the drive is actually clunky sized).
Chronos24
Member
(02-01-2013, 05:10 PM)
Really hoping all this anticipation heightens the pleasure! Wanna solve this mystery!!
japtor
Member
(02-01-2013, 05:30 PM)
japtor's Avatar

Originally Posted by lostinblue

The Blu-ray drive should be 12 watts maximum, seeing as that's the max TDP listed for such parts on desktop PCs.

Probably closer to 4-5 watts (it's possible; my only doubt stems from the fact that the drive is actually clunky sized).

No clue if this is useful:
tipoo
Banned
(02-01-2013, 06:14 PM)

Originally Posted by japtor

No clue if this is useful:

Is that from the Wii U drive? I'm not sure which of those numbers to go with. Moving parts like optical drives and hard drives usually use the 12V rail, but that calculation leads to 22 watts, which seems too high. The next bump down gets us 5 watts, and the 3.35 one would be 1.6 watts, which seems too low.

So I'm not sure which of the three numbers is right, or what it means that there are three of them. It wouldn't make sense for it to have three different power draws.


Watts = Amps x Volts btw
tkscz
Member
(02-01-2013, 06:15 PM)
tkscz's Avatar

Originally Posted by ozfunghi

Any new updates on the Chipworks story?

This, seriously, where is this?
Fourth Storm
Member
(02-01-2013, 06:17 PM)
Fourth Storm's Avatar

Originally Posted by ozfunghi

Any new updates on the Chipworks story?

We're looking at probably Monday or Tuesday.
joesiv
Member
(02-01-2013, 06:57 PM)

Originally Posted by Fourth Storm

We're looking at probably Monday or Tuesday.

We should start a count down timer.

Fourth Storm Direct confirmed!
AzaK
Member
(02-01-2013, 07:24 PM)

Originally Posted by joesiv

We should start a count down timer.

Fourth Storm Direct confirmed!

Yeah FS, this better be a video and you better have hair as good as Iwata's.
japtor
Member
(02-01-2013, 07:30 PM)
japtor's Avatar

Originally Posted by tipoo

Is that from the Wii U drive? I'm not sure which of those numbers to go with. Moving parts like optical drives and hard drives usually use the 12V rail, but that calculation leads to 22 watts, which seems too high. The next bump down gets us 5 watts, and the 3.35 one would be 1.6 watts, which seems too low.

So I'm not sure which of the three numbers is right, or what it means that there are three of them. It wouldn't make sense for it to have three different power draws.


Watts = Amps x Volts btw

Yeah, it's from iFixit's teardown; I wish they'd opened it up, considering how non-standard it looks. I looked up Panasonic's other drives and could only find laptop ones.
