[PCPER] Nvidia cards' power usage goes up by 100 watts at 144Hz+!!

Huh. That's really interesting. Just tested it on my monitor.

From 60 to 120Hz my GPU sits at 135MHz

At 144Hz it jumps to 1000MHz

Power % goes from 12 up to 27

The temp even rises a couple degrees.
 
I thought this was common knowledge? It happens(happened?) when running monitors in eyefinity/surround too.
 
They just need to add another low-power state for people with high refresh rate monitors--they probably only have two or three low-power states (any more would probably be inefficient in workloads where the state changes frequently), and none are enough to drive a display at that refresh rate.

Adding one shouldn't be too difficult...so long as it doesn't require a hardware change. :/
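To put rough numbers on why a fixed low-power state can run out of headroom as refresh rate climbs, here's a back-of-envelope pixel-clock sketch (the 20% blanking overhead is a made-up ballpark, not a real CVT timing calculation):

```python
# Rough pixel-clock math: the display engine must push every pixel of
# every frame, plus blanking overhead. The 20% overhead is an assumed
# ballpark, not derived from actual CVT/CVT-RB timings.
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.20):
    """Approximate pixel clock (MHz) needed to drive the display."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

for hz in (60, 120, 144):
    print(f"1920x1080 @ {hz} Hz ~ {pixel_clock_mhz(1920, 1080, hz):.0f} MHz pixel clock")
```

If the idle state's memory/display clocks were sized for the 60-120Hz range, the jump at 144Hz would push past what that state can feed.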
 
I like how the author of the article sounds like a child discovering Disneyland for the first time.. :D But he's also suggesting crazy ancient things like separate engine and shader clocks... =D So are we going back to a pixel clock and a shader clock, no more unified shaders? What's next, separate pixel, vertex, and geometry shaders? What a stupid suggestion IMO.

That "issue" has been around since the introduction of high refresh panels (1920x1080 @ 144Hz), and with multi-monitor setups as NickJames said, not only in Eyefinity/Surround. AMD somewhat fixed it with Hawaii: the 290 and 290X were able to stay at 144Hz on a single monitor without triggering 3D clocks, but you can't run more than a single 1080p panel or it will sit at full 3D clocks. With NVIDIA you can run 3x 1080p monitors without triggering 3D clocks, but that's the limit; once you start mixing 1440p, 1080p, or 4K it will always be at 3D clocks..



I thought this was common knowledge? It happens(happened?) when running monitors in eyefinity/surround too.

^This..
 
That's a good jump in power consumption at 144Hz, and again at this new 165Hz (this one is surprising). If power consumption bothers you, I guess this is an interesting issue, especially considering this is at idle, where most of the time is spent. Also interesting: there doesn't seem to be any fix coming from NVIDIA for existing cards.
 
I thought this was common knowledge? It happens(happened?) when running monitors in eyefinity/surround too.
I thought so, too :confused:. NVIDIA has stated that they put the video card into a high power state to prevent issues when running at high refresh rates or with multiple monitors, including lockups, flashing, and loss of signal. I heard that AMD tried to introduce a low power state for these configurations, but they experienced the very issues NVIDIA describes.

I just run my desktop at 120 Hz and fullscreen games at 144 Hz using the "Highest available" option for refresh rate in the NVCP.
Problem solved.

/thread
 
I thought so, too :confused:. NVIDIA has stated that they put the video card into a high power state to prevent issues when running at high refresh rates or with multiple monitors, including lockups, flashing, and loss of signal. I heard that AMD tried to introduce a low power state for these configurations, but they experienced the very issues NVIDIA describes.

I just run my desktop at 120 Hz and fullscreen games at 144 Hz using the "Highest available" option for refresh rate in the NVCP.
Problem solved.

/thread

Yep, that's the same trick I use too. My PC stays on nearly 75% of the day, so power saved is money saved.
 
If I can save money doing something as simple as downclocking my GPUs when idle, then why not? Also, I turn off my lights when I leave the house and make my own lunches for work! :eek:
 
That's two take-out meals. Hardly adds up to much of anything.

You could say the same thing about AMD cards. People bitch about power usage. Now people are saying the added power from this isn't a big deal. But it is if it's AMD's power we're talking about.
 
You could say the same thing about AMD cards. People bitch about power usage. Now people are saying the added power from this isn't a big deal. But it is if it's AMD's power we're talking about.

It's an unwinnable argument.

Fermi: Guys it's not about the temperature.
Kepler: Haha look how cool our GPUs run.
Pascal: The temps aren't important! Look at the VRAM!
 
You could say the same thing about AMD cards. People bitch about power usage. Now people are saying the added power from this isn't a big deal. But it is if it's AMD's power we're talking about.

We're talking peak power versus higher idle power. Peak means you MAY need a better PSU / case / watercooler to handle the increased load during a game, and that's not so cheap.

The difference is even more pronounced if you're running two cards, or if you're a fan of silence. That "great deal" you got could cost you more when you look at the supporting cast you need to remove that extra heat.
 
We're talking peak power versus higher idle power. Peak means you MAY need better PSU / case / watercooler to handle the increased load during a game, and that's not so cheap.

The difference is even more pronounced if you're running two cards, or if you're a fan of silence. That "great deal" you got could cost you more when you look at the supporting cast you need to remove that extra heat.

Don't you think you would use more power every month if your PC is idle and using 200W than from using 100W more while gaming?

Your PC idles way more than it sits at 100% for gaming. So in a sense you would pay more for electricity if you idle at 200W instead of 41W, ESPECIALLY since you idle your PC more than you game.
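Rough numbers to back that up (the $0.12/kWh rate and the daily hours are made up for illustration):

```python
# Back-of-envelope monthly electricity cost of extra draw.
# The $0.12/kWh rate and the hours per day are assumed example figures.
def monthly_cost_usd(extra_watts, hours_per_day, rate_per_kwh=0.12, days=30):
    """Cost of drawing `extra_watts` more for `hours_per_day` each day."""
    return extra_watts / 1000 * hours_per_day * days * rate_per_kwh

idle_penalty   = monthly_cost_usd(160, 18)  # idling at 200 W instead of ~40 W
gaming_penalty = monthly_cost_usd(100, 2)   # 100 W extra during 2 h of gaming
print(f"idle: ${idle_penalty:.2f}/mo vs gaming: ${gaming_penalty:.2f}/mo")
```

Under those assumptions, the idle penalty is an order of magnitude bigger than the gaming one.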
 
Don't you think you would use more power every month if your PC is idle and using 200W than from using 100W more while gaming?

Your PC idles way more than it sits at 100% for gaming. So in a sense you would pay more for electricity if you idle at 200W instead of 41W, ESPECIALLY since you idle your PC more than you game.

Did you even read my post? I was ignoring the electricity cost and specifically targeting the BUILD cost. Higher power draw means you have to handle it, and that increases the cost of the REST of your build.

Read my post one more time. Then read it again just to make sure, because I don't know what you read.

As to why this is a non-issue: if you set it to 120 Hz, it goes back down to idle. Soooo hard!
 
120 vs 144Hz, can you even tell the difference? I'm asking people who've actually owned both.
 
Meh... non-issue. If they fix it, awesome, bonus points. For the time being, just run 120 on the desktop and set your games to 144. To everyone comparing this to load power use: apples and oranges. This has a band-aid fix you can apply until (maybe) they release a real fix; the only band-aid you can put on your gaming load power draw is down-clocking.
 
Well, when your load power becomes your idle power, it's a pretty big deal (especially if you cheaped out on your PSU...don't cheap out on your PSU!). That said, yeah, at least there's a work-around for now.

I'm sure there are plenty of [H]ard-goers who turn off low-power states in their CPUs already--for them this is probably no biggie. I leave it on, myself--stable enough for me.
 
It's not a case of your load power becoming your idle power, it's a case of your idle power increasing 60w. If my load power was 60w over my idle power, I would jump up and down with joy (as I'm sure many other [H]'ers would).
 
If I can save money doing something as simple as downclocking my GPUs when idle, then why not? Also, I turn off my lights when I leave the house and make my own lunches for work! :eek:

Not to derail, buuuutt... taking my lunch to work saved me a ton of money in fuel alone. I always went home for lunch before, so my cost for the food didn't really change.
 
It's not a case of your load power becoming your idle power, it's a case of your idle power increasing 60w. If my load power was 60w over my idle power, I would jump up and down with joy (as I'm sure many other [H]'ers would).

I'm not saying full/heavy load, if that's what you think I'm saying. You would still (at least generally/usually) need a load to generate 60W more power than idle if everything is working as it should. It's just now you're constantly at a high enough load that the GPU has to throttle up. In effect, your (light, if you prefer) load power becomes your idle power.

I'm with you on jumping for joy at 60W full load, though. :eek:
 
I was able to get the low power state working when I ran Surround and SLI. It was pretty buggy; you would have to change to Adaptive in NVCP, then power cycle the PC until it worked. Sometimes it would randomly stop working, but I figured it out for the most part.
 
I like how the author of the article sounds like a child discovering Disneyland for the first time.. :D But he's also suggesting crazy ancient things like separate engine and shader clocks... =D So are we going back to a pixel clock and a shader clock, no more unified shaders? What's next, separate pixel, vertex, and geometry shaders? What a stupid suggestion IMO.

That "issue" has been around since the introduction of high refresh panels (1920x1080 @ 144Hz), and with multi-monitor setups as NickJames said, not only in Eyefinity/Surround. AMD somewhat fixed it with Hawaii: the 290 and 290X were able to stay at 144Hz on a single monitor without triggering 3D clocks, but you can't run more than a single 1080p panel or it will sit at full 3D clocks. With NVIDIA you can run 3x 1080p monitors without triggering 3D clocks, but that's the limit; once you start mixing 1440p, 1080p, or 4K it will always be at 3D clocks..

^This..

I don't care about the issue, but I want to add that my R9 290 goes into a low power state with my (2) 1080p monitors hooked up. I disabled it because the DP connection on my 120Hz Samsung monitor would sometimes drop when it kicked in. My second monitor isn't 120Hz and I just run extended displays; maybe that's why. Not a fan of games stretched across 2 displays anyway.
 
Weird, I have a 144Hz monitor with that set as my refresh rate, and my card down-clocks to 135MHz on the desktop just fine. I monitored it with GPU-Z and Afterburner.

Using the latest G-Sync "hot fix" beta drivers.

http://i.picpar.com/XR8b.png
 
Like others have said, this is nothing new. I've always noticed that my temps go up when the desktop is set to 144Hz. It did it on both my 680 and now my 980 Ti. However, lately I've been noticing my temps go up even when set to 120Hz or even 60Hz. I started seeing this after getting into overclocking my card, but even at stock clocks it does it. Not sure if it's a Windows 10 driver thing or my ROG Swift.
 
Dumb question from a 60Hz display owner: can you set separate refresh rates on your monitor for gaming and desktop use? I would not be happy with my card at full clocks at desktop, not from the power usage as much as the heat and related fan noise.
 
Dumb question from a 60Hz display owner: can you set separate refresh rates on your monitor for gaming and desktop use? I would not be happy with my card at full clocks at desktop, not from the power usage as much as the heat and related fan noise.

Some games allow setting the refresh rate in-game, but most do not. Also, this approach would not work with windowed games.

A workaround could be to change your desktop refresh rate manually before playing a game.
 
All that's really happening is the driver forcing "Prefer Maximum Performance" power state at all times, and ignoring any profiles set to "Adaptive" which normally allows low-power downclocking at idle.

If you do something like set the Global Profile as well as Explorer.exe to "Prefer Maximum Performance" under Power Management, you'd achieve this effect at any refresh rate. The increased power draw is just from using the low-power 3D core clock and voltages at all times, rather than the low-power idle clocks (~10x lower) and voltages (~25% lower). I know this, since in the past I've run some of my NVIDIA cards this way for improved performance consistency in lightweight 3D applications where having the lowest possible render times is still important.

There is still the question of why NVIDIA never added an intermediate power state to handle high refresh rates on their modern GPUs, but as others have mentioned, this isn't really new information. It should have no impact during gaming, since you'd already be at high-power 3D clocks and voltages the whole time.
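For a rough sense of scale: dynamic power in CMOS goes roughly with f·V², so plugging in the ~10x clock and ~25% voltage ratios above gives a ballpark for how much more the low-power 3D state draws than true idle (rule-of-thumb math, not measured numbers):

```python
# Dynamic power scales roughly with f * V^2 (classic CMOS rule of thumb).
# Ratios below come from the ~10x clock and ~25% voltage figures quoted
# above; this is an estimate, not a measurement.
def dynamic_power_ratio(freq_ratio, voltage_ratio):
    """P2/P1 given f2/f1 and V2/V1, from P ~ C * V^2 * f."""
    return freq_ratio * voltage_ratio ** 2

# 3D state: 10x the idle clock, and idle voltage is ~25% lower (so 1/0.75)
ratio = dynamic_power_ratio(10, 1 / 0.75)
print(f"low-power 3D state ~ {ratio:.0f}x the dynamic power of true idle")
```

Static/leakage power doesn't scale the same way, so the real-world gap is smaller, but it shows why skipping the idle state costs so much.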
 
Weird, I have a 144Hz monitor with that set as my refresh rate, and my card down-clocks to 135MHz on the desktop just fine. I monitored it with GPU-Z and Afterburner.

Using the latest G-Sync "hot fix" beta drivers.

http://i.picpar.com/XR8b.png

Resolution matters. Is yours 1440p?
 
All you guys should be running F@H for Team 33 when your computers are idling anyway. Shame on all of you!
 
There are not only power differences at 144Hz, but software-related issues as well. I've heard reports, for example, that if one were to set their display to 144Hz, they'd have trouble shutting down (black screen/hang), plus other issues with games, etc. Dropping the refresh down to 120Hz would alleviate the issues.
 