
PS4 power draw - 80W idle, 120W-140W gaming, max 45 Degrees Celsius exhaust

Summoner

Member
There's a lot more than just a CPU in play here (a dedicated graphics card, for example), and it's likely running at a much higher clock speed than netbooks do, because it has the space and fans to dissipate the heat. Completely different situation.
Speaking of "clock speeds".....do we know what the official clock speed of the PS4's CPU is yet?
 
Read the article?! Are you fucking crazy?

Actually, I'm boycotting Eurolamer. With good reason once more, I may add: the noise measurements are bullshit. A monkey on crack could do a better job.


Why is it bad?

What were you expecting? A randomised control trial with 5000 units for a decent sample size?

Also concur with the post above. Does anyone know?
 

vazel

Banned
80W idle seems a bit much. My 3570k idles at 9W, my 660ti idles at 16W. Even with my motherboard and RAM and two hard drives I can't imagine my PC reaches 80W idling.
 

Chumpion

Member
What were you expecting? A randomised control trial with 5000 units for a decent sample size?

I was expecting them to choose some other place than the goddamn Piccadilly Circus for their noise measurements. Then they proudly report them rounded to the nearest decibel.

Assuming the readings were rounded, a low estimate for the console alone would be 41.6 dB − 40.4 dB ≈ 35.4 dB, and a high estimate would be 42.4 dB − 39.6 dB ≈ 39.2 dB (subtracting the sound powers, not the raw dB values).
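Chumpion's interval arithmetic is decibel subtraction: levels live on a logarithmic power scale, so you convert to powers, subtract, and convert back. A quick Python sketch reusing the bounds from the post (the bounds themselves are Chumpion's, not mine):

```python
import math

def db_subtract(total_db, background_db):
    # Decibels are logarithmic, so the level of the console alone comes
    # from subtracting sound powers, not the dB figures themselves.
    return 10 * math.log10(10 ** (total_db / 10) - 10 ** (background_db / 10))

# Chumpion's bounds for a reported 42 dB total over a 40 dB background:
print(round(db_subtract(41.6, 40.4), 1))  # low estimate  -> 35.4
print(round(db_subtract(42.4, 39.6), 1))  # high estimate -> 39.2
```

Note how wide the interval is: a ±0.4 dB reading uncertainty on both numbers turns into an almost 4 dB spread for the console alone, which is exactly the complaint about rounded measurements.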
 

TheExodu5

Banned
45C is pretty good, especially for AMD. The last few AMD processors I owned would idle at 50C and top out near 70C.

45C is the exhaust temp, not the core temp. The core temp will be quite a bit higher.

Keep in mind that these units are brand new. You're comparing to your launch 60GB which is probably loud as fuck now compared to what it was at launch, because of dust build up and fans kicking in at high speed early. I'd bet your unit is a lot louder than the values given for slim PS3s by that german website.

DF's article (and that German chart) also have a problem: they don't describe the test conditions very thoroughly. How long had they been playing for? Was it a disc-based game or a digital download? Etc.

Nevertheless, you have to keep in mind that the values DF give are not "corrected" for ambient noise, which they measured at 40dB. Therefore, when they say 40dB for a PS4 in X mode, it's the "total" noise, if you wish, and this means there was no added, perceived noise on top of the background noise.

Consequently, when they say 42dB it's not for the PS4 only (so comparisons with charts giving absolute values are misleading), but you can roughly work out the noise produced by the console itself. It's around 38dB (40dB background + 38dB PS4 ≈ 42.1dB). So yes, sitting at 1m from the console, you will notice that your PS4 is making noise (40 → 42dB), but it's rather quiet. Essentially, under these conditions (sitting at 1m, gaming), your environment will sound ~1.1-1.2 times louder with your PS4 on than off.

Oh wow...I didn't realize their ambient noise level was 40dB. That's a bit ridiculous.
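The decibel addition described above (40dB background + 38dB console ≈ 42.1dB) can be checked the same way: incoherent sources combine by summing their sound powers. A quick sketch:

```python
import math

def db_add(*levels_db):
    # Incoherent sound sources combine by summing their sound powers.
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels_db))

print(round(db_add(40, 38), 1))   # 40 dB room + 38 dB console -> 42.1
# Perceived loudness roughly doubles every 10 dB, so a 2 dB rise
# sounds about 2 ** (2 / 10) ~ 1.15x louder, matching the ~1.1-1.2x figure.
print(round(2 ** (2 / 10), 2))    # -> 1.15
```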
 

nico1982

Member
Speaking of "clock speeds".....do we know what the official clock speed of the PS4's CPU is yet?
Since Sony is quiet, it is safe to assume it is lower than the Xbone's. To be honest, they didn't talk about TrueAudio even when MS was playing the SHAPE card. The point is that Sony is now in a position where, even if they tout unquestionable advantages, they will likely draw unnecessary fire from the press. It is very smart of them to just shut up, really.
 

AmyS

Member
Sorry for going OT.

You can't find netbooks anywhere anymore; they stopped existing. People should probably stop with those comparisons anyway.

This is just not true.

Acer still puts out some great 11.6 inch LED netbooks.

The Acer Aspire One AO756 series that launched in 2012 comes with a dual-core Celeron 877 (or a Pentium). The Celeron versions are actually a bit faster than the Pentium models. Yet both are "real" laptop-class CPUs (that is, laptop CPUs from several years ago) compared to netbooks that use Intel Atom CPUs or AMD Bobcat-based APUs.

I own the Acer Aspire One AO756-2808, which came with 4 GB DDR3 RAM and a 500 GB HDD. I also just got a hardly-ever-used AO756-2623 that was for sale on Amazon -- I believe this unit may have started with 2 GB RAM, though it was 4 GB when I ordered it. Because it had a smaller 320 GB HDD, I had the seller upgrade it to 8 GB RAM.

The Celeron 877 units that I have are lightning fast with either 4 or 8 GB RAM. It's easy to upgrade the RAM unlike older Aspire One netbooks, and it seems relatively easy (from the videos I've seen) to upgrade to a solid-state drive/SSD. You just have to check to make sure you get an SSD with the right dimensions.

One can even upgrade to 1333 MHz DDR3 provided both RAM sticks are the same speed. I haven't tried this yet.


I've had 2 older netbooks in the past; both had Intel Atom CPUs, one with 1 GB of RAM and the other with 2 GB. I'd say the Acer AO756 series is worlds ahead of old netbooks with Atom and still significantly faster than newer netbooks with an AMD APU.


Here's a review: http://www.youtube.com/watch?v=_FNsFbaGyjY

P.S. While she is right about the speakers not being good, you CAN adjust the sound/speaker settings to improve it somewhat. However, I just use HDMI output to my 1080p LED TV and switch to 1920 x 1080 at 60 Hz in the Intel HD Graphics settings. In fact, I'm using the 8 GB netbook right now.
 

lyrick

Member
only 131W while gaming. I guess it's time people stop pretending that there's a HD7870 or a HD7850 in there and start looking again at the mobile Pitcairn chip.
 

Ryoohki360

Neo Member
only 131W while gaming. I guess it's time people stop pretending that there's a HD7870 or a HD7850 in there and start looking again at the mobile Pitcairn chip.

It was pretty clear to me from the start that this was it. I mean, it's struggling to keep 1080p 30fps at launch, and most launch titles don't even use AA or AF (I mean come on, AF eats like 2-3 FPS..). Xbox 360 games were 720p at the start, then developers started adding effects and lowered the resolution as time went on. Pretty sure 900p and lower will be the standard in 2 years..
 
only 131W while gaming. I guess it's time people stop pretending that there's a HD7870 or a HD7850 in there and start looking again at the mobile Pitcairn chip.

Before posting something like this you should check your facts: the HD7850 uses less than 100W at full load.
 

Summoner

Member
Since Sony is quiet, it is safe to assume it is lower than the Xbone's. To be honest, they didn't talk about TrueAudio even when MS was playing the SHAPE card. The point is that Sony is now in a position where, even if they tout unquestionable advantages, they will likely draw unnecessary fire from the press. It is very smart of them to just shut up, really.
I wasn't expecting an announcement from Sony, but.... I thought with some of the PS4s already out there, some of the experts would have dissected one by now and thrown out a number for us.
 

diffusionx

Gold Member
That is not very good, considering the entire focus of the computing industry the past several years has been on power consumption and, subsequently, noise/heat.

My guess is that the PS4 OS is not optimized for standby and low power states at all.
 
Amazing, Sony is getting more performance per watt than the Wii U. Nintendo really got screwed with their hardware.


Impressive hardware Sony.
 

Skeff

Member
That is not very good, considering the entire focus of the computing industry the past several years has been on power consumption and, subsequently, noise/heat.

My guess is that the PS4 OS is not optimized for standby and low power states at all.

It is optimized very well for standby, rumors are that the power consumption is between 0.5-2W during standby.

Idle on the Menu is completely different to standby.
 

Paganmoon

Member
Keep in mind that these units are brand new. You're comparing to your launch 60GB which is probably loud as fuck now compared to what it was at launch, because of dust build up and fans kicking in at high speed early. I'd bet your unit is a lot louder than the values given for slim PS3s by that german website.

DF's article (and that German chart) also have a problem: they don't describe the test conditions very thoroughly. How long had they been playing for? Was it a disc-based game or a digital download? Etc.

Nevertheless, you have to keep in mind that the values DF give are not "corrected" for ambient noise, which they measured at 40dB. Therefore, when they say 40dB for a PS4 in X mode, it's the "total" noise, if you wish, and this means there was no added, perceived noise on top of the background noise.

Consequently, when they say 42dB it's not for the PS4 only (so comparisons with charts giving absolute values are misleading), but you can roughly work out the noise produced by the console itself. It's around 38dB (40dB background + 38dB PS4 ≈ 42.1dB). So yes, sitting at 1m from the console, you will notice that your PS4 is making noise (40 → 42dB), but it's rather quiet. Essentially, under these conditions (sitting at 1m, gaming), your environment will sound ~1.1-1.2 times louder with your PS4 on than off.

Very nice explanation, thanks.
 

tipoo

Banned
Amazing, Sony is getting more performance per watt than the Wii U. Nintendo really got screwed with their hardware.


Impressive hardware Sony.

28nm vs 45nm and 40nm: is that surprising? Anyone expecting magical efficiency out of the Wii U was setting themselves up for a letdown.
 
Wow, those heat numbers with an internal PSU? WOW.

I expected at least PC GPU numbers of around 60C during gameplay bouts.

But at 80W, if you ran the PS4 idle 24 hours a day all year long, you would pay about $91 a year at $0.13 per kWh.
 

tipoo

Banned
But at 80W, if you ran the PS4 idle 24 hours a day all year long, you would pay about $91 a year at $0.13 per kWh.

Idle =/= standby, in case you confused them. Standby will still consume almost no power; idle is the console sitting powered on, at the menu, without a game running.

And yeah, the exhaust goes over the PSU too which explains some of the warmness.
 
The fastest 32nm AMD Richland APU needs 30W at idle. A Core i7 4770K + GeForce Titan combo needs 60W at the desktop. 80W is too much for a games console. GCN has great power-saving modes and the Jaguar cores don't need much power anyway. Sounds like a software problem to me; I think they will fix it with a firmware update.
My guess also.

Started a thread on this at SemiAccurate. There are a couple of modes where the GPU should be turned off, which would drop power consumption by 40-60 watts.
 
The OS uses WebGL for rendering; that is probably the cause of the high power drain in the menus.
BUT when the OS UI overlay is enabled, the power goes up by about 20 watts and stays there even if there is no update to the UI. This has to mean some hardware is turned on: either a power-gated GPU block or a second GPU outside the APU.

In my own measurements, I noticed the power went up when a UI overlay is called from within a game, but mine showed more than 10 watts and less than 20 watts of increase.
 

le.phat

Member
One hour of PlayStation 4 idling uses 80 Wh?
1 kilowatt-hour (kWh) costs €0.14 over here, give or take (Europe).

24 h × 80 W = 1920 Wh ≈ 1.9 kWh per day

1.9 kWh × €0.14 = €0.266 per day

€0.266 × 365 ≈ €97 of electricity for one year of idling.
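That per-year arithmetic generalises to a small helper; the rates below (€0.14 and $0.13 per kWh) are just the figures quoted in this thread, not universal tariffs:

```python
def yearly_idle_cost(watts, price_per_kwh, hours_per_day=24):
    # Energy consumed per year in kWh, multiplied by the unit price.
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

print(round(yearly_idle_cost(80, 0.14), 2))  # EUR rate from this post -> 98.11
print(round(yearly_idle_cost(80, 0.13), 2))  # USD rate quoted earlier -> 91.1
```

The small discrepancy with the €97 figure above comes from rounding 1.92 kWh/day down to 1.9 before multiplying.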
 
jeff_rigby said:
BUT when the OS UI overlay is enabled, the power goes up by about 20 watts and stays there even if there is no update to the UI. This has to mean some hardware is turned on: either a power-gated GPU block or a second GPU outside the APU.

Nonsense.
Either one would be a surprise, wouldn't it? If AMD is to have larger GPUs in their APUs, they need a smaller GPU block with a UI overlay for the browser and XTV, and a larger GPU for games and compute. Right now they do this with a small GPU in the APU and a larger dGPU, as AMD GPUs do not individually power-gate the CUs; they are all on or off. In contrast, ARM GPUs are individually gated and multiple GPUs are designed to work together. UI/WebGL makes a good point at which to separate functionality.
 

spwolf

Member
One hour of PlayStation 4 idling uses 80 Wh?
1 kilowatt-hour (kWh) costs €0.14 over here, give or take (Europe).

24 h × 80 W = 1920 Wh ≈ 1.9 kWh per day

1.9 kWh × €0.14 = €0.266 per day

€0.266 × 365 ≈ €97 of electricity for one year of idling.

It's 10, not 80.
 

Durante

Member
Idle is higher than I expected, gaming exactly in the expected range.

Remember all the people panicking before launch because of the enormous power consumption/heat of GDDR5? Lol.
 
The only people who did are fools who don't know anything about technology. It's fantastic, as it was a good barometer to see who knew anything about technology and who didn't (guess what, the bad games industry folk argued that GDDR5 would be HOT HOT HOT).

One second of thought would tell you "yes, it consumes more energy and puts out more heat, but not enough to matter". GPUs generally don't have hugeass heatsinks on their GDDR5 modules, and a lot of cards have people overclocking the memory by 1000MHz and more.
 

spwolf

Member
Idle is higher than I expected, gaming exactly in the expected range.

Remember all the people panicking before launch because of the enormous power consumption/heat of GDDR5? Lol.

It is actually 10W... 80W is when the console is still doing stuff in the background despite looking like it is in standby. Confirmed by Ars on a second try.
 