Does vsync result in energy savings?

Staples

I recently turned vsync on in the Nvidia control panel. This forces vsync on in all games regardless of the game's settings. I also noticed that FRAPS now reports games running at 60 fps max. Many of these games ran at a higher frame rate before. I don't mind the 60 fps cap, since I would never be able to see beyond that on a 60 Hz display, but another question came to mind: since fewer frames are being rendered, will the video card use any less energy?
 
Yeah, like was mentioned, it depends on GPU usage. If your GPU is powerful enough to run the game at 60 fps at 50% usage, then there should be some energy savings, not to mention less heat. To find out how much, you would need to measure it with some kind of electricity usage monitor that your PC plugs into while you test, like this:
http://www.amazon.com/P3-P4400-Electricity-Usage-Monitor/dp/B00009MDBU
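For a rough back-of-the-envelope estimate before you break out the meter (all numbers here are assumptions for illustration, not measurements), you can guess at the savings by assuming GPU power scales roughly linearly with utilization between idle and TDP:

```python
# Rough estimate of energy saved by capping frame rate with vsync.
# All numbers are illustrative assumptions; measure with a Kill A Watt
# (or similar meter) for real figures.

def estimated_savings_watts(card_tdp_w, util_uncapped, util_capped, idle_w=30):
    """Assume GPU power scales linearly with utilization between idle and TDP."""
    dynamic = card_tdp_w - idle_w
    uncapped = idle_w + dynamic * util_uncapped
    capped = idle_w + dynamic * util_capped
    return uncapped - capped

# e.g. a 250 W card at 95% usage uncapped vs 50% usage with vsync on
saved = estimated_savings_watts(250, 0.95, 0.50)
print(f"~{saved:.0f} W saved while gaming")  # ~99 W
```

Real cards don't scale perfectly linearly (clocks and voltage step in discrete bins), so treat this as an upper-bound sketch.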
 
Thanks. I have a Kill A Watt.

Just wasn't sure what to try with and without vsync to get definitive results. Power use fluctuates like crazy in most games.
 
Doesn't it increase input lag quite a bit?

If you are getting a lot more frames in a game, what about setting a tab in Afterburner with a lower voltage and clock rate? Cutting voltage is going to save more power.
 
Doesn't it increase input lag quite a bit?

If you are getting a lot more frames in a game, what about setting a tab in Afterburner with a lower voltage and clock rate? Cutting voltage is going to save more power.

Triple-buffered vsync shouldn't, though there are some shoddy implementations that somehow do (Source engine).
 
I don't mind the 60 fps cap since I would never be able to see beyond that with a 60hz display
This has been brought up before. You can actually make use of framerates in excess of 60 FPS on a 60Hz monitor if you don't mind tearing.

If your graphics card finishes a new frame before it has finished drawing the previous frame, it forgets about drawing the rest of the previous frame and finishes filling the screen with data from the more-current frame.

This results in a tear, but it also results in everything after the tear being closer to real-time. It's also worth noting that this can happen multiple times within a single screen refresh.
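To put numbers on "multiple times within a single screen refresh" (my own back-of-the-envelope arithmetic, not a claim from the thread): with vsync off, each frame that completes mid-scan starts a new tear, so you see roughly fps / refresh rate tears per refresh, and the freshest slice of the screen is at most one frame time old:

```python
# With vsync off, the swap happens mid-scan, so on average about
# fps / refresh_hz tears appear in each refresh, and the newest region
# of the screen is at most 1/fps seconds old instead of 1/refresh_hz.

def tearing_stats(fps, refresh_hz):
    tears = fps / refresh_hz     # average tears per screen refresh
    max_age_ms = 1000.0 / fps    # age of the freshest on-screen region
    return tears, max_age_ms

tears, age = tearing_stats(180, 60)
print(f"{tears:.1f} tears/refresh, newest region ~{age:.1f} ms old")
# 3.0 tears/refresh, newest region ~5.6 ms old
```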
 
in some games where you're getting a framerate way above refresh rate, or if you've got a 120/144 hz monitor and are getting near that, tearing isn't noticeable.
 
Not a big fan of tearing. To tell you the truth, every 120hz monitor I've gamed on has had nearly zero tearing, this is one of the huge reasons I recommend them. Even if you target 60fps, you'll get a tear-free experience.
 
Not a big fan of tearing. To tell you the truth, every 120hz monitor I've gamed on has had nearly zero tearing, this is one of the huge reasons I recommend them. Even if you target 60fps, you'll get a tear-free experience.

That's what I hear... I'm loath to get a 120-144 Hz monitor for that exact reason, actually. I don't want to become spoiled by it. For the time being, I prefer having 1440p for gaming at 60 Hz with v-sync on, since more resolution is awesome. I guess there are some panels in the pipeline with both high resolution and high refresh, though.
 
I really love having a 120hz Monitor and using adaptive VSync. There are plenty of times when it's pegged at 120 (and it does save power) but whenever I can't maintain 120fps, I want every last frame I can get, tearing or not.
 
Haha, buying a high-end GPU and then worrying about the energy bill seems to me like buying a Ferrari and worrying about the cost of gas!
 
Honest question: how much are you saving a month? All this talk about power savings, but no one brags about how much cheaper in dollars their bill is each month.

Plus, I can't stand the input lag of vsync.
 
Just to get it out there: you can get tearing at any frame rate or on any high refresh rate monitor; as soon as the GPU and monitor are out of sync, tearing will occur, regardless of your frame output or hardware. Period. This is the phenomenon G-Sync and FreeSync set out to remedy.

In regard to the OP's question, I might be mistaken, since it has been a long time since I read it and I don't have time to check right now, but didn't [H]'s article on Adaptive V-Sync cover this question?
 
Honest question: how much are you saving a month? All this talk about power savings, but no one brags about how much cheaper in dollars their bill is each month.
I doubt I will save much, maybe a few dollars a year at most. I just upgraded from a GTX 460 to a GTX 480 (yes, old, but I got it off eBay) and am amazed how much more energy it uses, even when idle.
 
Just to get it out there: you can get tearing at any frame rate or on any high refresh rate monitor; as soon as the GPU and monitor are out of sync, tearing will occur, regardless of your frame output or hardware. Period. This is the phenomenon G-Sync and FreeSync set out to remedy.

no one said you couldn't, it's just not as noticeable with high refresh rate monitors because frames are displayed for a very short amount of time.
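The "very short amount of time" part is easy to quantify (simple arithmetic on my part, not something the posters measured): a refresh, and therefore any tear line within it, persists for 1/refresh rate seconds:

```python
# How long a single refresh (and any tear line within it) stays on screen.

def refresh_period_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> {refresh_period_ms(hz):.1f} ms per refresh")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms
```

So at 144 Hz a tear is on screen for well under half as long as at 60 Hz, which is why many people stop noticing it.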
 
no one said you couldn't, it's just not as noticeable with high refresh rate monitors because frames are displayed for a very short amount of time.
KazeoHin is claiming a "tear-free experience" even at 60 FPS; even considering the first part of his sentence, that statement remains false. Case in point: the other day I was playing Guacamelee, and whether at 60 FPS or 120 FPS, the game was tearing all over the place.

While I do agree the phenomenon is less noticeable than on 60Hz displays, it is still very noticeable to me on high refresh rate monitors. And my eyes are nothing exceptional.
 
It is absolutely false. The only way to have a "tear-free experience" on a traditional isochronous display is to enable vertical sync.
 
Not a big fan of tearing. To tell you the truth, every 120hz monitor I've gamed on has had nearly zero tearing, this is one of the huge reasons I recommend them. Even if you target 60fps, you'll get a tear-free experience.
Lol, so you went from "nearly zero tearing" to "tear-free" in just two sentences? Yes, the tearing is much less noticeable, but it's still there. The tearing you get from things like flickering lights and explosions, though, will look almost as bad as on a 60 Hz screen. This is from Dead Space 3, back when I had the 144 Hz Asus monitor. What you see here is exactly what I saw while playing. I recorded it because some people were claiming there is no tearing at all on a 120/144 Hz screen. Of course, that is a worst-case scenario, but it still forced me to turn on vsync to keep from having a seizure. lol https://www.youtube.com/watch?v=Y3T6chyW2Vo
 
what was your framerate when you took that video?
Not sure, but it was well over 144 fps. The frame rate wouldn't really matter much in that scenario, as I am just standing still. It's the flickering lights that make the tearing seem insane, and that will happen at 30 fps or 300 fps. The flickering lights in the first FEAR game many years ago were what made me start using vsync, because the tearing drove me crazy when the flickering happened.
 