Taming The Energy Use Of Gaming Computers

HardOCP News

How much energy can you save by using more efficient components in your computer? The results are pretty surprising. Do you guys care about conserving energy? Probably not as much as 3-way SLI, 1500W power supplies, and stacks of hard drives. :D

We found enormous performance-normalized variations in power ratings among the gaming computer components available on today’s market. For example, central processing units vary by 4.3-fold, graphics processing units 5.8-fold, power supply units 1.3-fold, motherboards 5.0-fold, RAM 139.2-fold, and displays 11.5-fold. Similarly performing complete systems with low, typical, and high efficiencies correspond to approximately 900, 600, and 300 watts of nameplate power, respectively.
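
For a rough sense of scale, here's a back-of-envelope sketch of what those three nameplate figures could translate to on a power bill. The duty cycle and electricity rate are assumptions, not the article's, and nameplate wattage overstates what a system actually draws:

```python
# Back-of-envelope annual cost for the article's low/typical/high-efficiency builds.
# Assumed (not from the article): 4 hours/day at full draw, $0.12/kWh.
# Nameplate power is used as a stand-in for measured draw, so treat this as an upper bound.
RATE_USD_PER_KWH = 0.12
HOURS_PER_DAY = 4

for label, watts in [("low efficiency", 900), ("typical", 600), ("high efficiency", 300)]:
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    print(f"{label:15s}: {kwh_per_year:5.0f} kWh/yr, ~${kwh_per_year * RATE_USD_PER_KWH:.0f}/yr")
```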
 
How does it compare to my fridge? Then I might care.

Haha, not sure if that was a poke at the article, but it does say so... "The energy use of a single typical gaming PC is equivalent to the energy use of 10 game consoles, 6 conventional desktop computers, or 3 refrigerators."

Not saying I agree with the numbers, but there's your fridge comparison.
 
How does it compare to my fridge? Then I might care.

Compared to mine, the savings in kWh from their optimized system versus the base system is about the cost of running my fridge for 1.4 years, based on the Energy Guide I got with my fridge when I purchased it last year. About $100 of electricity where I live. About one year and $70 if you leave the display alone (because $$, or because it is already efficient).
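
As a sanity check on that comparison (using an assumed electricity rate, since the poster's actual Energy Guide figures aren't given), the implied numbers do hang together:

```python
# Sanity check of the fridge comparison above. The $0.12/kWh rate is an
# assumption; the $100 savings and 1.4-year figure are the poster's claims.
RATE_USD_PER_KWH = 0.12

savings_usd = 100
fridge_years = 1.4

fridge_usd_per_year = savings_usd / fridge_years              # ~$71/yr to run the fridge
fridge_kwh_per_year = fridge_usd_per_year / RATE_USD_PER_KWH  # ~595 kWh/yr, a plausible fridge
savings_kwh_per_year = savings_usd / RATE_USD_PER_KWH         # ~833 kWh/yr saved by the PC

print(f"Fridge: ~{fridge_kwh_per_year:.0f} kWh/yr (~${fridge_usd_per_year:.0f}/yr)")
print(f"PC savings: ~{savings_kwh_per_year:.0f} kWh/yr")
```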
 
Well, I shut the power off on the back when it is not in active use, mainly because it is haunted and turns on all by itself and apparently nothing I can do will fix it... oh, and to save the planet too.
 
In most of the USA, no one cares; the price premium for, say, ultra-efficient PC parts would take longer than their useful life to show any ROI. Power is just so cheap here it doesn't matter. Now, if we are talking about ultra-efficient hot water heaters or freezers, yeah, you might see an impact from them, but it can still take a while to see any ROI, and most of those also have a much longer useful life than a gaming computer.
 
This significant energy footprint can be reduced by more than 75% with premium efficiency components and operations, while improving reliability and performance. This corresponds to a potential savings of approximately 120 billion kilowatt-hours or $18 Billion per year globally by 2020.

Just imagine how much we will have to spend on gear to save $18 Billion in electricity.
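
The article's two headline savings figures can at least be cross-checked against each other; dividing dollars by kilowatt-hours gives the average electricity rate the authors appear to be assuming:

```python
# Consistency check on the quoted global savings figures.
savings_kwh = 120e9   # 120 billion kWh per year by 2020 (claimed)
savings_usd = 18e9    # $18 billion per year (claimed)

print(f"Implied average electricity rate: ${savings_usd / savings_kwh:.2f}/kWh")  # $0.15/kWh
```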
 
Where the hell are they pulling these numbers? A gaming PC pulls as many watts as 10 game consoles? That might be true if we were comparing a triple graphics card setup to an original 1980s Nintendo.

Compare an average single-GPU gaming system to an Xbox One or PS4 and you might find it consumes closer to 1.5-2x as much power as a console.
 
Stupid article. They downgraded the CPU from a 4820K to a budget Pentium G3258 to get most of their power savings in an article about gaming PCs. Whatever...
 
I was wondering what "improved" means in regard to the GPU, CPU, and display. Do they all mean less powerful?

And also: "A gaming PC pulls as many watts as 10 game consoles." To me, that line alone makes the credibility of the rest of that article nil.
 
I was wondering what "improved" means in regard to the GPU, CPU, and display. Do they all mean less powerful?

And also: "A gaming PC pulls as many watts as 10 game consoles." To me, that line alone makes the credibility of the rest of that article nil.

Well, it's not too unreasonable of an estimate if your PC uses more than 1000 watts.

http://www.extremetech.com/gaming/1...to-sony-advantage-and-future-efficiency-gains

Though most people don't draw that much power, even if they purchased a PSU rated for that output. You'd really need something like 3-way SLI and an over-the-top CPU to get there. Gaming computers that use around 400 watts under full load are still perfectly capable of playing games at pretty high settings.
 
I was wondering what "improved" means in regard to the GPU, CPU, and display. Do they all mean less powerful?

And also: "A gaming PC pulls as many watts as 10 game consoles." To me, that line alone makes the credibility of the rest of that article nil.

If you open the actual report linked in the article, it gives you the specs of the test system vs. the "upgraded" system.

The article is pointless, as all it shows is that old, outdated stuff isn't as efficient as new stuff, and less powerful parts use less power.

Their original display is an Apple Cinema HD 23" circa 2008. The upgraded one is a 24" Asus G-Sync monitor.

Video card is a reference 780 vs. a Zotac 970 AMP.

The original power supply is a 550W unit, which is probably running near its max on their test system, vs. a new modular 760W Corsair.

Then the Ivy Bridge-E i7 processor vs. the budget dual-core Pentium, which is really laughable.
 
I currently have approx. 625W to spare on my 1300W PSU. The way I see it, I'm still saving 675W when I use my rig. :p
 
I don't have the time to game 24/7. I also need to sleep, work, eat, etc. So, I really doubt my gaming PC uses as much energy as my fridge, water heater, or furnace (during winter).
 
I want to get a new gaming computer at the end of the year since my current one is 7 years old ... depending on my budget (depends on the company's year-end bonus) I will be looking at performance and not necessarily energy efficiency ... my fridge runs 24 hours a day so it needs to be efficient; my computer is only on for gaming a few hours a day except weekends, so I think the grid and my electric bill can handle a good gaming performance configuration, not just an energy saver ... if I was doing a home server or some other always-on device then energy efficiency would be a higher priority ;)
 
Article doesn't discuss how much I save on my heating bill since I'm using a Radeon.
 
I'm not happy unless when I press the power button it makes Tim "The Toolman" Taylor nod in approval and makes some hippy weep.
 
If you open the actual report linked in the article, it gives you the specs of the test system vs. the "upgraded" system.

The article is pointless, as all it shows is that old, outdated stuff isn't as efficient as new stuff, and less powerful parts use less power.

Their original display is an Apple Cinema HD 23" circa 2008. The upgraded one is a 24" Asus G-Sync monitor.

Video card is a reference 780 vs. a Zotac 970 AMP.

The original power supply is a 550W unit, which is probably running near its max on their test system, vs. a new modular 760W Corsair.

Then the Ivy Bridge-E i7 processor vs. the budget dual-core Pentium, which is really laughable.

Probably not maxing it out. IME, single-GPU systems don't use 500W of power. Every time I build a system, if it's got a gaming card in it, people start talking about ridiculous power supplies. Originally it was 500 watts. Then it became 750, and then people started talking about 1kW PSUs. I haven't tested it in a long time, but as I recall, it was no more than 300-350 watts at full load.

That said, most PSUs are most efficient at roughly 80% load, so a 550W PSU would probably be fine for most systems. As for the CPU, isn't it a given that games are rarely CPU-bound? Why would you buy an extreme CPU for gaming? I understand if you're running other apps that are CPU/core-bound, but it seems like a waste for games.

That said, I'm getting a 6700K and I'm sure more than 90% of the time I could live with any recent CPU.
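
For anyone curious how that 550W unit would actually be loaded, here's a quick sketch. The 88% efficiency figure is an assumption (roughly 80 Plus Gold territory), not a measurement of that particular PSU:

```python
# Rough wall-draw and PSU-loading math for the 550W supply discussed above.
# Efficiency is assumed; real curves vary by unit and load point.
def wall_draw(dc_load_w: float, efficiency: float = 0.88) -> float:
    """AC power pulled from the outlet for a given DC load."""
    return dc_load_w / efficiency

PSU_RATING_W = 550
for dc_load in (300, 350, 450):
    print(f"{dc_load}W DC load -> ~{wall_draw(dc_load):.0f}W at the wall, "
          f"{dc_load / PSU_RATING_W:.0%} of the PSU's rating")
```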
 
My UPS shows my i7 4930K/R9 290X/8x hard drives + 50" TV pull 300-700W.

I built a small 40W Athlon 5350 machine to use when I don't need all that power. It worked out nicely. I just wish the iGPU were strong enough to handle a denoise shader for DVD upscaling. It plays 1080p video and a surprising number of games just fine.
 
...
I built a small 40W Athlon 5350 machine to use when I don't need all that power. It worked out nicely....


Are you sure of that? Did you actually log the watts used versus the watts saved, tally up the dollar figure, and then compare that to what your "more efficient" system cost to build?

I bet you didn't save a thing, and you're still way down in the negatives with the cost of the whole new system.

Power savings on devices are so overstated in the media it's just not funny anymore.

"Save this, save that, huge cost benefits." It's all just BS. If you run actual tests and see what power devices draw, tally up how much it costs you to buy a more efficient device (not just its power cost but its purchase price as well... we want the WHOLE PICTURE). Most of the time it's just not worth it. All just marketing buzz.


NOTE: I'm not being nasty to you, good Sir, just making this point as it's often ignored and people like to kid themselves instead of doing the work and getting the real numbers.
 
Oh look, another article decrying modern technology and civilization (in a roundabout sort of way).

The only reason anyone is concerned about their power use is that we're too big of pussies to build bigger and more efficient (read: cleaner) power plants, because supposedly CO2 warms the earth (not), and politicians and their cronies want to line their pockets with your cash.

I want my 2000W Quad-SLI, Dual-CPU Gaming Rig with 3 giant monitors, and I want it now!
 
Well, it's not too unreasonable of an estimate if your PC uses more than 1000 watts.

http://www.extremetech.com/gaming/1...to-sony-advantage-and-future-efficiency-gains

Though most people don't draw that much power, even if they purchased a PSU rated for that output. You'd really need something like 3-way SLI and an over-the-top CPU to get there. Gaming computers that use around 400 watts under full load are still perfectly capable of playing games at pretty high settings.

Well, ExtremeTech did not factor in the TV's wattage...
 
Well, ExtremeTech did not factor in the TV's wattage...

True, but I dunno if they factored in screen wattage either. Still, if you skip the screen completely and just look at the PC/console/whatever, the power demand for a single-GPU gaming computer is maybe 3x higher, which isn't anywhere near the 10x estimate that only applies to a very fringe, small number of computers. I've personally NEVER met anyone who has more than one video card in their computer, and I've had to work around ultra-nerdy computer guys for a while now. Even most of them are smart enough to build a moderately powerful computer and upgrade it once every few years, and I'd knee-jerk guess that the power needs of their computers don't usually go over 300 watts.
 
I've personally NEVER met anyone who has more than one video card in their computer, and I've had to work around ultra-nerdy computer guys for a while now. Even most of them are smart enough to build a moderately powerful computer and upgrade it once every few years, and I'd knee-jerk guess that the power needs of their computers don't usually go over 300 watts.

Well, I am not as smart, it seems... with my SLI setup and all... and actually, I know a few people with SLI setups.

And sometimes I really don't wanna be reasonable. I want stupid power; where other people might be satisfied with less, I am craving more.
 
Well, I am not as smart, it seems... with my SLI setup and all... and actually, I know a few people with SLI setups.

And sometimes I really don't wanna be reasonable. I want stupid power; where other people might be satisfied with less, I am craving more.

There's nothing wrong with that. It's just extremely rare, I think, for people to use more than one graphics card, since desktops aren't that popular and most people are playing games on smaller-than-laptop stuff like tablets and phones.

Personally, I've been pretty seriously thinking about keeping a nice laptop with a docking station for big chores like writing where a keyboard matters, and then just using a tablet. Though I really wish there were good options for MP3 players with like a 4-inch screen and a fairly modern tablet OS. I could almost get away with a Bluetooth keyboard and mouse connected to one if I didn't mind sitting kind of close to it when I'm writing.
 
Probably not maxing it out. IME, single-GPU systems don't use 500W of power. Every time I build a system, if it's got a gaming card in it, people start talking about ridiculous power supplies. Originally it was 500 watts. Then it became 750, and then people started talking about 1kW PSUs. I haven't tested it in a long time, but as I recall, it was no more than 300-350 watts at full load.

My system, while not the newest or most powerful, has an OK setup, along with two AIOs pulling power and a lot more HDDs now. However it, along with my monitors, pulls 367W max at full synthetic load, that being IBT and Kombustor running together, which cause the system to pull FAR more power than it does in games. I see at most 280W spikes in games, as most games tend to max either the CPU or the GPU, not both at the same time, and at idle I see around 130W. Most people way overbuy PSUs; very few have the hardware to draw what those 1k+ PSUs are for.
 
Reminds me... it is cooling off.

I need to find something to mine for the winter.
 
Mine draws in the high 600-watt range. It's a small portion of my electricity bill.
 
Where the hell are they pulling these numbers? A gaming PC pulls as many watts as 10 game consoles? That might be true if we were comparing a triple graphics card setup to an original 1980s Nintendo.

Compare an average single-GPU gaming system to an Xbox One or PS4 and you might find it consumes closer to 1.5-2x as much power as a console.
That's what I was thinking. An 8-core Intel overclocked with triple/quad top-end cards all going 100% in a test might yield 10x more than a console. Maybe [H] can test! ;)
 
The results are pretty surprising.

No, the results don't make any sense.

Do you guys care about conserving energy?

Sure. System in my sig idles at under 20W. Since I'm already about 3x better than what these goobers managed with their "improved system", I'm gonna call it good.
 
Are you sure of that? Did you actually log the watts used versus the watts saved, tally up the dollar figure, and then compare that to what your "more efficient" system cost to build?

I bet you didn't save a thing, and you're still way down in the negatives with the cost of the whole new system.

Power savings on devices are so overstated in the media it's just not funny anymore.

"Save this, save that, huge cost benefits." It's all just BS. If you run actual tests and see what power devices draw, tally up how much it costs you to buy a more efficient device (not just its power cost but its purchase price as well... we want the WHOLE PICTURE). Most of the time it's just not worth it. All just marketing buzz.


NOTE: I'm not being nasty to you, good Sir, just making this point as it's often ignored and people like to kid themselves instead of doing the work and getting the real numbers.

That was a ridiculously hostile response. I mentioned my idle/load wattage from my UPS in my post and I didn't say anything about building it to save money.

My main desktop is housed in a Mountain Mods Ascension plus a pedestal. It's an i7 4930K, R9 290X, 8x 7200 RPM hard drives in RAID 10, 2x 10,000 RPM drives, 6x SSDs, a Swiftech pump for the water cooling loop, 18x 120mm fans, and various peripherals. With the CPU and GPU at stock frequencies, my UPS shows that the idle desktop power consumption (with the 50" screen off) is ~180W. It goes over 200W just doing things like web browsing.

The Kabini CPU and mainboard cost about $80 together. I recycled an ITX case/power supply and DDR3-1333 memory. It pulls 19W idle and 40W under load. Add a few watts for the external hard drive if I'm watching a movie. It doesn't heat up the room and it's practically silent, since there's only one 120mm case fan that always runs at minimum RPM and no separate CPU fan (I just took it off the CPU heatsink; the 120mm fan blows down into the case anyway).

My point was simply that for those of us with epic builds (and/or who live in hot climates), it can actually be nice to have a secondary silent, low-power PC for the 90% of the time we aren't gaming. These low-power CPUs are actually pretty snappy. If you have a typical desktop then sure, there's no reason to run out and build something new or downgrade.
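
As a rough illustration of why the second box can be worth it for a heavy build (hours per day and electricity rate are assumptions; the idle wattages are the ones quoted above):

```python
# Estimated savings from idling on the 40W Kabini box instead of the big rig.
# Assumed: 6 hours/day of non-gaming use, $0.12/kWh. Idle figures are from the post.
RATE_USD_PER_KWH = 0.12
NON_GAMING_HOURS_PER_DAY = 6

big_rig_idle_w = 180
kabini_idle_w = 19

saved_kwh = (big_rig_idle_w - kabini_idle_w) / 1000 * NON_GAMING_HOURS_PER_DAY * 365
print(f"~{saved_kwh:.0f} kWh/yr saved, ~${saved_kwh * RATE_USD_PER_KWH:.0f}/yr")
```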
 
My main desktop is housed in a Mountain Mods Ascension plus a pedestal. It's an i7 4930K, R9 290X, 8x 7200 RPM hard drives in RAID 10, 2x 10,000 RPM drives, 6x SSDs, a Swiftech pump for the water cooling loop, 18x 120mm fans, and various peripherals.

Holy shit dude! 18 x 120mm fans?! You got any pictures of this beast?
 
I bet you didn't save a thing, and you're still way down in the negatives with the cost of the whole new system.
Power savings on devices are so overstated in the media it's just not funny anymore.
"Save this, save that, huge cost benefits." It's all just BS.

I went for efficiency when I built my HTPC. It draws less than 40 watts, even when recording 4 channels.
When it's not being used, it goes to sleep, saving even more power. It automatically wakes up 5 minutes before it needs to record something. I saw my electric bill go down over $5/month after I made the switch. The main reason for the upgrade was the ability to record HD channels, record 4 shows at once, etc., but the power savings were a nice bonus.

Before this I was using a couple of ReplayTV DVRs.
Each one could only record a single SD channel at a time, and they drew a little over 40 watts each.
Plus there was no sleep mode. When you turned them off, they were still running at full power internally: 24/7 x 40 watts x 2 DVRs.
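
The claimed ~$5/month is in the right ballpark; here's the arithmetic with an assumed electricity rate and an assumed duty cycle for the sleeping HTPC:

```python
# Sanity check of the ~$5/month savings. Rate and HTPC awake time are assumptions;
# the old setup was two DVRs at ~40W each, running 24/7.
RATE_USD_PER_KWH = 0.12

old_kwh_per_month = 2 * 40 / 1000 * 24 * 30   # ~57.6 kWh
new_kwh_per_month = 40 / 1000 * 6 * 30        # HTPC awake ~6 h/day: ~7.2 kWh

print(f"~${(old_kwh_per_month - new_kwh_per_month) * RATE_USD_PER_KWH:.2f}/month saved")
```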
 
Holy shit dude! 18 x 120mm fans?! You got any pictures of this beast?

Sure, here's my desk. The 40W Kabini PC is the case on top. They look blue-ish in my phone camera, but the power/HDD/fan/keyboard/tube-amp LEDs are all white.

[Image: Desktop.jpg]
 
Does that setup get very loud? I would guess maybe not, seeing as they don't really have to work hard, lol.
 