
PC gamers use a ridiculous amount of energy

Ted

Member
Except that's not really the case. Hardware development is already focused almost exclusively on efficiency rather than pushing high-end performance. If there's any area where you really don't need a further push towards efficiency, it's computing.

Given the differences in consumption even between the two main GPU brands (before considering any Intel/AMD gap in CPUs), we're not there yet. That much is clear.

That's not really the point of my original reply, though...

The user here decided that they didn't need the be-all and end-all of GPUs to run games at a level of quality acceptable to them [and presumably higher in most, if not every, case than the consoles can provide], instead opting for a lower-consumption device.

That's not silly, as you called it. That's an entirely rational decision for some people. Just as running a quad Fury X with some bad mutha 1500W PSU might be an entirely rational decision for someone else.
 
My PC is on somewhere around 15 hours a day. Looking at my last bill, my electric usage averaged 15kWh per day for my entire home, or $2.08 per day (the last billing cycle was 29 days), which has been the norm for summer usage for a long time; winter averages around twice that.

Meh, there are other things in my old dump that need efficiency improvements more than my computer, which would need to be on around 10-12 hours a day at or near full load to reach that 1,400kWh a year. I really need to get a new wattage monitor, but I know I'm nowhere near that. I'd wager the people using that much power are a relatively small minority.
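Back-of-the-envelope on those figures (the ~350 W full-load draw below is an assumption, not a measurement):

```python
# Sanity check of the bill figures and the 1,400 kWh/year claim.
daily_kwh = 15.0      # whole-home usage per day, from the bill
daily_cost = 2.08     # dollars per day, from the bill
print(f"Implied rate: ${daily_cost / daily_kwh:.3f}/kWh")   # ~$0.139/kWh

yearly_target_kwh = 1400.0   # figure cited for a gaming PC
full_load_watts = 350.0      # assumed full-load draw
hours_per_day = yearly_target_kwh / 365 * 1000 / full_load_watts
print(f"Hours/day at {full_load_watts:.0f} W to hit {yearly_target_kwh:.0f} kWh/yr: {hours_per_day:.1f}")
# ~11 hours/day at full load, in line with the 10-12 hour estimate above
```

At the implied rate, that 1,400 kWh would work out to roughly $195 a year.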
 
Gaming energy consumption is not a concern once you start paying for electricity for your own house.

There are other appliances that require far more energy. And once you get there, you don't have much time to game anyway.
 

kswiston

Member
I would imagine that my system draws somewhere between 400-500 watts at peak usage based on my processor (i5 4690k) and GPU (R9 280x) which run at around 300 watts when in use.

CPUs are definitely more energy efficient than they were 5-10 years ago (I remember having 125+ Watt CPUs in the past), but the GPU ratings keep creeping up.
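A rough sum of published TDPs backs up that guess (the rest-of-system and efficiency figures below are assumptions):

```python
# Crude peak-draw estimate for an i5-4690K + R9 280X system.
cpu_tdp = 88            # i5-4690K rated TDP, watts
gpu_board_power = 250   # R9 280X typical board power, watts
rest = 60               # motherboard, RAM, drives, fans (assumed)
psu_efficiency = 0.88   # assumed for a decent 80 Plus unit under load

dc_load = cpu_tdp + gpu_board_power + rest
print(f"Estimated DC load: {dc_load} W, wall draw: {dc_load / psu_efficiency:.0f} W")
# ~400 W DC / ~450 W at the wall, consistent with the 400-500 W guess above
```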

Oh OK, that's good to know. The last time I tried to look into this, every place I looked was recommending some ridiculously expensive gadget and I lost interest.

Kill-a-watt monitors are great. I bought one about 5 years ago for $20-25, and use it for various monitoring purposes.
 
I have a 1000W PSU and a Sapphire 280X.


I may or may not be part of the problem.



Just out of curiosity do you plan to get two additional 280X? Because otherwise I really don't see why you would buy a PSU that's much more expensive than what you actually need (~450-500W) while also being less efficient.
 

Red Hood

Banned
Just one?
Two, for the biggest part, until one stopped working. So it's just one now.

Just out of curiosity do you plan to get two additional 280X? Because otherwise I really don't see why you would buy a PSU that's much more expensive than what you actually need (~450-500W) while also being less efficient.
I suppose I just wanted to have it ready in case I needed the extra juice (especially for crossfire), because the difference in prices between 1000W and ~600W PSU were rather small when I bought it.
 

kharma45

Member
Two, for the biggest part, until one stopped working. So it's just one now.


I suppose I just wanted to have it ready in case I needed the extra juice (especially for crossfire), because the difference in prices between 1000W and ~600W PSU were rather small when I bought it.

That sounds odd unless it's a bargain basement 1KW PSU that probably can't actually deliver that power. The price difference should have been significant.
 

Eiji

Member
My Pioneer Kuro 9G plasma TV heats my room up if I don't have it on energy save "mode 2" and consumes on average around 300W. With mode 2 it consumes 100W less on average.

I use it for gaming with my PC too.

I still won't be replacing it with an OLED TV yet as the picture quality is more than adequate and doesn't suffer from the OLED disadvantages such as poor picture uniformity, vignetting and poor motion resolution.
 

QaaQer

Member
article touched a nerve apparently.

Me? I don't care, as a) no kids and b) our coming AI robot overlords will solve all this stuff.
 

n0razi

Member
I'm kind of OCD about power usage, so my setup uses about the same watts as my PS4:

- i5 4690K *stock volts*
- msata SSD
- GTX 970
- single 92mm HSF

No other drives or accessories. It uses about 30 watts idle and slightly over 200 watts while gaming.
 

n0razi

Member
I agree with you, but one shouldn't underestimate the number of amateur PC builders who put an 800W or 1000W PSU in low- and mid-tier machines that would be fine with a 500W PSU. Thankfully, the problem isn't as big as in the 2000s, when notebooks were expensive and niche and pre-built PCs weren't as popular as custom-made ones - now most amateurs have moved on to pre-built solutions.

It's not really wasted, as the PSU (no matter its rating) only pulls what it needs from the wall. You ideally want ~50% saturation for max efficiency (over 90% with a good PSU), but many people also specifically want around 20% saturation (i.e. using a 1000W PSU for a 200W system) because they're going for silence, and most PSU fans don't kick in until past 20%.
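A minimal sketch of that load-point effect, using illustrative 80 Plus Gold-style efficiency points (the exact numbers vary by unit):

```python
# Wall-side draw for the same DC load on differently sized PSUs.
# Efficiency points are illustrative 80 Plus Gold values at 20/50/100% load.
efficiency_at_load = {0.20: 0.87, 0.50: 0.90, 1.00: 0.87}

def wall_draw(dc_load_watts, psu_rating_watts):
    """Estimate wall draw by picking the nearest rated load point."""
    load_fraction = dc_load_watts / psu_rating_watts
    nearest = min(efficiency_at_load, key=lambda p: abs(p - load_fraction))
    return dc_load_watts / efficiency_at_load[nearest]

# A 200 W system on a 1000 W unit (~20% load) vs a 400 W unit (~50% load):
print(f"1000 W PSU: {wall_draw(200, 1000):.0f} W from the wall")  # ~230 W
print(f" 400 W PSU: {wall_draw(200, 400):.0f} W from the wall")   # ~222 W
```

Only a handful of watts of difference, which is why oversizing mostly costs money up front rather than on the electric bill.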
 

Sanctuary

Member
Yes, this. People use their PCs for stuff that isn't just videogames.

I hear the consoles have Blu-ray players and can also play other forms of media. Hmm...

My Pioneer Kuro 9G plasma TV heats my room up if I don't have it on energy save "mode 2" and consumes on average around 300W. With mode 2 it consumes 100W less on average.

I use it for gaming with my PC too.

I still won't be replacing it with an OLED TV yet as the picture quality is more than adequate and doesn't suffer from the OLED disadvantages such as poor picture uniformity, vignetting and poor motion resolution.

Don't forget the unknowns too: life expectancy (especially of the blues), image retention and input lag. Hopefully by the time I need to replace my Panasonic (high five!) in another ten years, OLED tech will finally have caught up, and will also be offered in more sizes than just 55 inches.
 
Sure, but it shouldn't prevent us from trying to reduce our energy consumption in the meantime.

Currently I:

- use public transport, and occasionally carshare, services instead of owning a car
- turn off all my lights and electronics when they're not in use
- pay additional fees per watt to ensure that my energy provider puts renewable energy back into the grid on my behalf
- only buy appliances with a 4 star energy rating or above

And any number of other power-saving practices. I'm constantly trying to reduce my carbon footprint. So frankly I don't need some sanctimonious poster on GAF (who, judging by their username "biglittleps", is only here to take a dump on people for not choosing PlayStation for all their gaming needs) clutching their pearls in my direction because I'm destroying the planet by playing PC games.
 

Pagusas

Elden Member
How much energy does our AC use... Yeah, not worried about the pennies it costs to run a gaming PC.
 

le-seb

Member
And any number of other power-saving practices. I'm constantly trying to reduce my carbon-footprint.
More power to you, but I was only reacting to the point you made, not the way you live.

Mine being that the whole world cannot run exclusively on renewable energy for the time being or in the foreseeable future, and that we consequently need to conserve these finite resources until it can.
 
Because of those fucking LED lights we now have global warming.

Thanks, PC gamers.
LED lights?

Any self respecting pc gamer (aka a person who hates the environment) will light their case with these:
[image]
If it's not a fire hazard then it's not worthy of being in my pc.

This is what I use to walk my dog. Two laps around the block in first gear.

Did you know this guy is a fervent pc gamer?

How to recognise if your global warming denying congressman is a nature hating PC gamer?
Shake his hand and look for the mark:
That's how you recognise us.

Now I must hide as I have said too much.
 

The Technomancer

card-carrying scientician
The article doesn't mention whether this conclusion (or rather, the magnitude of it) is a function of the fact that PCs are left on most or all of the time even when not gaming, while consoles are generally turned off.
I'd be curious about this as well. I try to be super mindful of keeping my PC off when I'm not using it.
 
I never realized how many power-conscious people there were until I read this thread. I can't say it's something I ever think about. My PC idles at >300w (almost 500w with Chrome open) and regularly goes over 1000w when gaming. Once in a while when a game comes close to maxing out my 3 GTX680s my UPS alarm will go off letting me know I'm approaching my PSU's max of 1200w.

That said I doubt my PC comes close to matching my cooling costs. I live in Texas but keep my apartment at 70F year-round. I guess I'm a terrible person.
 
I never realized how many power-conscious people there were until I read this thread. I can't say it's something I ever think about. My PC idles at >300w (almost 500w with Chrome open) and regularly goes over 1000w when gaming. Once in a while when a game comes close to maxing out my 3 GTX680s my UPS alarm will go off letting me know I'm approaching my PSU's max of 1200w.

That said I doubt my PC comes close to matching my cooling costs. I live in Texas but keep my apartment at 70F year-round. I guess I'm a terrible person.

That doesn't seem right. If you overclocked your CPU right it should still downclock when idle, and your GPUs should each downclock heavily when idle.
Google says idle power consumption for a single 680 is 14 watts... your idle power consumption should be under 100 watts.
I'm also struggling to understand how 3 680s (160W TDP) while gaming can fully load a 1200W PSU...

You should top out at like 600-700 watts for the entire system at most (and only in FurMark), less if you kept your GPUs at stock voltage.
If you had 3 Titans I could see it, but not with 680s.

You should really check your BIOS and GPU settings.
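A rough sum along the lines of that estimate, using the per-card figure quoted above (the CPU and rest-of-system numbers are assumptions):

```python
# Back-of-envelope DC load for a triple GTX 680 system at stock settings.
gpu_each = 160   # per-card figure cited above, watts
num_gpus = 3
cpu = 130        # high-end desktop CPU rated TDP, watts (stock; a heavy OC draws more)
rest = 70        # motherboard, RAM, drives, fans, pumps (assumed)

dc_load = gpu_each * num_gpus + cpu + rest
print(f"Estimated DC load: {dc_load} W")                 # ~680 W
print(f"Headroom on a 1200 W PSU: {1200 - dc_load} W")   # ~520 W to spare
# Hard to saturate a 1200 W unit unless the cards and CPU are pushed well past stock.
```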

Assuming the power supply has decent efficiency, your computer probably draws like 350 watts at full load and like 80-90 in idle.

Edit:

I don't have that mark.
I guess you use a controller on your pc then or don't play much :p Never seen anyone who doesn't have it.
 
That doesn't seem right. If you overclocked your CPU right it should still downclock when idle, and your GPUs should each downclock heavily when idle.
Google says idle power consumption for a single 680 is 14 watts... your idle power consumption should be under 100 watts.
I'm also struggling to understand how 3 680s (160W TDP) while gaming can fully load a 1200W PSU...

You should top out at like 600-700 watts for the entire system at most (and only in FurMark), less if you kept your GPUs at stock voltage.
If you had 3 Titans I could see it, but not with 680s.

You should really check your BIOS and GPU settings.


I guess you use a controller on your pc then or don't play much :p Never seen anyone who doesn't have it.

I'm just going by what my UPS readout says. I just ran my configuration through a PSU calculator and it came out to almost 1200W at 90% load: 3930K @ 4.8GHz, 4 DDR3 modules, 2 SSDs, 2 5400rpm HDDs, 14 fans on a front bay controller, and 2 pumps for watercooling.

The readout is counting my monitor, external DAC/headphone amp, and speakers as well, though. I know the monitor is 90W, not sure about the DAC/speakers.
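One thing to keep in mind with a wall-side UPS readout: it includes everything plugged into it, plus the PSU's conversion losses. A hedged sketch of backing out the PC's actual DC load (the example reading and the DAC/speaker and efficiency figures are guesses):

```python
# Translate a UPS wall-side reading into the PC's approximate DC load.
ups_reading = 1100        # example wall-side reading while gaming, watts (assumed)
monitor = 90              # stated above
dac_and_speakers = 50     # assumed
psu_efficiency = 0.88     # assumed

pc_wall_draw = ups_reading - monitor - dac_and_speakers
pc_dc_load = pc_wall_draw * psu_efficiency
print(f"PC alone at the wall: {pc_wall_draw} W, approx. DC load: {pc_dc_load:.0f} W")
# ~960 W at the wall / ~845 W DC on this example reading, so the PSU itself
# would still sit some way below its 1200 W rating.
```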
 
Roll a small towel up and place it under your wrist. I never found a comfortable gel wrist support or anything, but a towel works well.
 

Miracle

Member
The article doesn't mention whether this conclusion (or rather, the magnitude of it) is a function of the fact that PCs are left on most or all of the time even when not gaming, while consoles are generally turned off.

I always turn off my PC when I'm not using it.

Leaving the PC on for long stretches seems like it would make it overheat a lot quicker, no? Regardless of the energy bill.
 

Crzy1

Member
I ran 2 GTX 480s for years. Definitely not energy efficient by any means (or cool and quiet, for that matter). After that I got a couple of 780 Tis that were close to the same power draw but not quite; one of those died and they were replaced with a couple of GTX 980s. I have a 1200W PSU that isn't even really being taxed now. These 980s can draw almost 200W each, but they don't seem to: total system draw is just a little above 450W when benchmarking and well below 400W in most gaming scenarios (I always enable vsync, so the cards barely get pushed). With the GTX 480s, that was closer to 850W while gaming and a little over 900W when benchmarking (as soon as they started getting power, they seemed to drink as much as they could). My system has gotten much more efficient since 2012, at least. I haven't upgraded my CPU in years, though, so the draw could potentially be much lower.
 

Devildoll

Member
I guess you use a controller on your pc then or don't play much :p Never seen anyone who doesn't have it.

Sitting at the computer waving a mouse is probably the activity I've done most in my life, apart from perhaps sleeping.

I really don't use a gamepad to play PC games.
I don't think I've seen that mark on any of my friends either, come to think of it.
But I'll check next time I meet 'em.
 

Red Hood

Banned
That sounds odd unless it's a bargain basement 1KW PSU that probably can't actually deliver that power. The price difference should have been significant.
It's a Corsair RM1000, which I paid around €150 for. I forget how much the lower-wattage PSUs cost at the time, but looking at a Corsair CX750M now, they're priced at around $90, and I'm assuming they cost more when I bought mine (late 2013). The difference at the time didn't seem too big, and I'd rather have overkill than risk running short on juice.

?
You only lose a few points of efficiency, the 1000W is just what it is rated up to. It only uses what it needs.
Assuming the power supply has decent efficiency, your computer probably draws like 350 watts at full load and like 80-90 in idle.

If I remember correctly, when I bought my Sapphire 280X (and later another one for CrossFire), a minimum of ~700W for the PSU was highly recommended, since one 280X already needs about 300W.
 

derExperte

Member
If I remember correctly, when I bought my Sapphire 280X (and later another one for CrossFire), a minimum of ~700W for the PSU was highly recommended, since one 280X already needs about 300W.

It doesn't; that's close to what a whole PC with one of those needs under load. Official PSU recommendations are completely useless because they're exaggerated beyond all reason.
 