Looks like PC power efficiency does mean something

Are you going to link us the computer configurations from the original article, or just show a graph with no base information with which to interpret it?

No, that IS the original article. They make you download the paper and read it to find out anything remotely useful, like any shitty abstract should :D
 
No, that IS the original article. They make you download the paper and read it to find out anything remotely useful, like any shitty abstract should :D

They make you *pay* for it. I'm not going to shell out $40 for it, but taking pictures out of context isn't good. It says it's better than Figure 12, but here they use a "single GPU". Well, of course a single faster GPU will use less power than two, but what is the original system they are comparing against?
 
I wonder why we don't have like 30 different people claiming this is obviously an Nvidia shill promoting their company, etc., like we do in any thread dealing with an AMD advantage.

As for this study?
Great job, I guess?

Maybe because it doesn't conflict with common sense.
 
I have the full paper (perks of the job :D). It's quite massive at 18 pages, but to answer the burning question about the base system used:

PSU (Seasonic G Series, 550 W)
CPU (Intel Core i7-4820K, quad-core, 3.7 GHz base)
GPU (NVIDIA reference GeForce GTX 780, 900 MHz boost)
Motherboard (ASUS P9X79-E WS)
RAM (32 GB (8×4 GB) Kingston HyperX Beast 1866 MHz, 1.65 V)
Display (Apple HD Cinema, 23 in.)
Operating system: Windows 7 Professional 64-bit; "Power saver" energy management settings in the Windows 7 OS.
Operating hours: active gaming (Open Gaming Alliance 2015), web browsing and video streaming (Short 2013), idle from Urban et al. (2014), and off/sleep is the residual, divided equally.
Assumes one display.

Or if you prefer an infographic:



So the biggest efficiency improvement they saw from moving to a more efficient motherboard+CPU came from swapping the 4820K for a Pentium G3258. I cannot facepalm at this hard enough.

I'm also not sure I agree with assigning 4.4 hours of gaming time per day over a 24-hour duty cycle for the entire year. That translates to almost 31 hours of gaming per week, and certainly nobody I know with a proper full-time job and a social life games anywhere near that amount. But in any case, this is how they arrived at the number:

The results indicate unit energy consumption of 1394 kWh/year (based on an average of 4.4 h/day in gaming mode), including the display. The "Avid" user sub-segment (29.5 million people, USA) spends 3.6 h/day gaming and uses 1300 kWh/year, while the "Extreme" user segment (8.1 million people) spends 7.2 h/day and uses 1890 kWh/year (36% more; utilization rates from Short 2013). For the typical gamer (4.4 h/day, weighted average of Avid and Extreme), we found that a much larger proportion of total energy (80%) occurs in modes above idle than is the case for traditional personal computers, which have low computing loads (Beck et al. 2012).

Beck, N., May-Ostendorp, P., Calwell, C., Vairamohan, B., Geist, T. (2012). How low can you go? A white paper on cutting edge efficiency in commercial desktop computers. Prepared for the California Energy Commission, Public Interest Energy Research Program. CEC-500-2012-065, 29 pp.

Short, J.E. (2013). How much media: report on American consumers, Institute for Communications Technology Management, Marshall School of Business, University of Southern California, 52 pp. http://www.marshall.usc.edu/faculty/centers/ctm/research/how-much-media Accessed 15 Oct 2014.
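As a quick sanity check (my own back-of-the-envelope arithmetic, not something taken from the paper), the 4.4 h/day figure does fall out as a population-weighted average of the two sub-segments quoted above:

```python
# Rough sanity check of the 4.4 h/day figure (my own arithmetic, not from the paper):
# population-weighted average of the Avid and Extreme sub-segments quoted above.
avid_users, avid_hours = 29.5e6, 3.6        # 29.5 million people at 3.6 h/day
extreme_users, extreme_hours = 8.1e6, 7.2   # 8.1 million people at 7.2 h/day

weighted_hours = (avid_users * avid_hours + extreme_users * extreme_hours) / (avid_users + extreme_users)

print(f"Weighted gaming time: {weighted_hours:.1f} h/day")  # ~4.4 h/day
print(f"Per week: {weighted_hours * 7:.0f} h")               # ~31 h/week
```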

Some other bits you might find interesting:





 
Thanks. What is the Figure 9 that Figure 10 refers to? It sounds like those are the listed (nameplate) wattage numbers, not actual measured ones like in Table 2, which they found are often much, much lower (49% for gaming and 64% for CPU benchmarks).

I think it's funny that one of the most efficient computers is their craziest one, since it's doing 300+ FPS vs. the crap from most systems.
 
4.4 hours is actually on the mark for some of the gamers I know, and they do have jobs, girlfriends, etc. Excluding the stupid CPU change, the actual vs. "nameplate" power consumption comparison is quite interesting. I'd have liked to see them go from a 290X to Fiji and see whether the observed efficiency gains would have been similar to Kepler --> Maxwell.

+1 for honesty. Can't say I agree with most of your points, but if you're at least up front about your lack of impartiality, that puts you miles ahead of most people in my book.

Nothing in my original post suggests bias anyhow, so I don't know how my "up-front lack of impartiality" has anything to do with this thread. The components used were NVIDIA Kepler and Maxwell ones, and the efficiency gains were quite obvious, with the stated impact in this report. If people are reading into this as AMD vs. NVIDIA, that's their own bias seeping through.
 
4.4 hours is actually on the mark for some of the gamers I know, and they do have jobs, girlfriends, etc. Excluding the stupid CPU change, the actual vs. "nameplate" power consumption comparison is quite interesting. I'd have liked to see them go from a 290X to Fiji and see whether the observed efficiency gains would have been similar to Kepler --> Maxwell.

You can see those in the article I linked before: https://www.techpowerup.com/reviews/Sapphire/R9_Fury_Tri-X_OC/31.html

perfwatt_3840.gif


perfwatt_2560.gif


The Fury line is much more efficient
 
Only the plain Fury does somewhat well in those graphs you linked, and its nearest competitor (GTX 980) is up to 21% more efficient. Of course that's stock vs. stock, so an OC'd 980 would probably decrease that gap quite a bit. But yes, the Fury line (and Nano) is the direction AMD should be taking with efficiency, like NVIDIA did with Maxwell. What remains to be seen is whether the gap will get smaller or bigger with the next-gen 16 nm products, both being on HBM2.
 
How much thought is given to the electrical use that my PC requires? NONE. Primary and only concern is bang for the $.
 
Turn off a few lights in your house.
Run the AC a few degrees warmer.
Get solar panels.
I'm sorry but gaming computers are not the problem.
I run a 4770K at 4.5 GHz with SLI 980 Tis and kept my AC at 74 degrees all last month. The power company owed me 9 bucks this month.
 
people caring for environment = vaginas?

That sounds accurate.

My pet peeve is more that people do a piss poor job of actually realizing where waste is. It's also none of their business how others spend their money.

For instance, I use around 15 gallons of gas per day for work. That's enough energy to sustain 60 watts for a year. Ever fly in a jet or go on a cruise? Anyone who has, has no business criticizing anyone.
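Quick sanity check on that comparison (my own assumed numbers, using roughly 33.7 kWh of energy per US gallon of gasoline):

```python
# Rough check of the "15 gallons of gas vs. 60 W for a year" comparison
# (assumed value: ~33.7 kWh of energy per US gallon of gasoline).
GALLON_KWH = 33.7

daily_gas_kwh = 15 * GALLON_KWH          # one day's commute: ~505 kWh
sixty_watt_year_kwh = 0.060 * 24 * 365   # 60 W running all year: ~526 kWh

print(f"15 gal of gas:   {daily_gas_kwh:.0f} kWh")
print(f"60 W for a year: {sixty_watt_year_kwh:.0f} kWh")
```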

Half these people that say this shit go off and overclock their rigs, doubling the power for 10-30% gains.
 
Unless you only turn your PC on to game, I think the most important thing is how efficient your system is at the desktop. I went from a GTX 260 to a Strix 960 4GB OC, and at idle the system went from 100-120 watts (at the plug) to 50-70 watts. The inefficient cards also heat the place up. Not horrible in the winter, but in the summer you're getting hit 2x... and my computer room can easily run 10 degrees hotter than the rest of the house.
 
So, you all drive a fucking Prius...

Most of you should be using a 4790S and waiting for the 6700S then...

I'd rather drive my 16 mpg GTR and use my 6700Ks along with my Nano and Fury X, and pay that extra $16.00 a year...
 
I think it's honestly very yawn-inducing that we have a thread like this every two months or so.
We all get it; no one is even saying that AMD has better efficiency.
The question is: did this matter to you when the GTX 580 was out?
If it didn't, then please go back into your hole and stay there.
 
Maybe because it doesn't conflict with common sense.

Common sense would tell you that someone who enters every single AMD thread with a negative bias is most likely a shill, or a troll at the very least.
I guess that doesn't conflict with your common sense, but for most others with a functioning brain it does.
 
I just upgraded my motherboard, RAM and CPU from an i7-970 to a 6700K and now save about 100 W, at a cost of $600. It will only take me about 19.5 years to get my money back on the energy savings, taking into account my current usage and cost of electricity. So why did I upgrade? My motherboard was failing and I wanted something a bit faster (turns out it is about 46% faster at 4.4 GHz vs. the old chip at 3.2 GHz).
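Roughly how that payback math works out (the daily hours and electricity rate below are assumptions of mine, not the actual figures; plug in your own):

```python
# Rough payback estimate for the upgrade (assumed usage and rate, not actual figures):
upgrade_cost = 600.0      # USD spent on motherboard + RAM + CPU
power_saved_kw = 0.100    # ~100 W lower draw after the upgrade
hours_per_day = 7         # assumed daily on-time
rate_per_kwh = 0.12       # assumed electricity price in $/kWh

annual_savings = power_saved_kw * hours_per_day * 365 * rate_per_kwh
print(f"Annual savings: ${annual_savings:.2f}")                      # ~$31/year
print(f"Payback period: {upgrade_cost / annual_savings:.1f} years")  # ~19.5 years
```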
 