GPU power draw vs normal usage

allen200

Weaksauce
Joined
Apr 1, 2011
Messages
120
Do high-power GPUs generally consume more power in non-gaming situations than, say, an APU or a 1650 (no external power connector)? Realistically I'll only game about 5% of the time the computer is on, so I want to know what kind of wattage they pull while not being stressed. For the top cards (the ones with two power cables and all), is there a power tax just for being in use? Think leaving the lights on in a room you never go into.

If it's all the same then I could just pick up a monster card and be done with it. But if they are different, then by how much?
 
It's around ten watts per card; integrated graphics idle at around 1W.

More powerful cards are going to have a higher idle draw, but it's not more than double.

This is the raw idle power draw of GPUs:

power-idle.png
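If you want to sanity-check your own card rather than take a chart's word for it, NVML exposes the live reading. A minimal sketch in Python, assuming an Nvidia card, a recent driver, and the nvidia-ml-py (pynvml) bindings installed:

Code:
# Minimal sketch: print the live power draw of GPU 0 via NVML.
# Assumes an Nvidia card and `pip install nvidia-ml-py`.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)            # first GPU
name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):                              # older bindings return bytes
    name = name.decode()
power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000  # NVML reports milliwatts
print(f"{name}: {power_w:.1f} W")
pynvml.nvmlShutdown()

Run it with the desktop idle and again mid-game and you'll see the same gap the chart shows.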
 

Thanks for that graph, useful. What about a bump up to web browsing/YouTube? Would they all increase by about the same amount, or is there real variance?
 
Why do you care about power so much? With your monitor you’re probably looking at something like 200W vs 210W.

Based on the US average rate, 10W works out to around $0.03 a day. Generally, better insulating your house, etc., would save way more.
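Spelling that out (the $0.13/kWh figure is just an assumed rough US-average residential rate):

Code:
# Back-of-the-envelope cost of a constant 10 W draw.
# Assumes roughly $0.13/kWh as a US-average residential rate.
watts = 10
rate_usd_per_kwh = 0.13
kwh_per_day = watts * 24 / 1000              # 0.24 kWh/day
cost_per_day = kwh_per_day * rate_usd_per_kwh
print(f"${cost_per_day:.2f}/day, ${cost_per_day * 365:.2f}/year")  # ~$0.03/day, ~$11/year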
 
That's not exactly the point. My PC doesn't draw a lot of power (well below 200W even with dual monitors), and I wouldn't actually mind if it needed 400 or 500W; the issue is that I hate being wasteful when everything else is equal. In your example, 10W is fine. But if it's 30 or more, that's info I'd like to weigh, since I game less and less. So my question is just about the ongoing difference (if any) of running the faster card.
 
If you're not using the GPU for some sort of compute task, then the difference is just a handful of watts. Don't worry about it. You wouldn't ever notice the difference in non-gaming tasks.

If you do plan on playing games, get a gaming GPU. If not, something like a 1650 should be fine.
 
Thanks for that graph, useful. What about a bump up to web browsing/YouTube? Would they all increase by about the same amount, or is there real variance?

It's more noticeable when you're running a notebook, because you always have to deal with the higher power draw. That means higher idle temperatures and lower web-browsing battery life. Nvidia created Optimus to counter this: it switches the discrete GPU on and off in real time, giving you ALMOST the same battery life as the integrated graphics alone!

https://en.wikipedia.org/wiki/Nvidia_Optimus

On the desktop, it's lost in the noise with your average build, which is why Optimus is only offered on notebooks.

Optimus is also only viable on notebooks because it can cause switching issues, and most notebooks offering the feature don't expose a dedicated output from the discrete card (making it all work seamlessly means routing the discrete card's output through the integrated graphics).
 