V-Sync and GPU Temps

HAL_404 · [H]ard|Gawd · Joined: Dec 16, 2018 · Messages: 1,240
Curious Q here ... I've been playing Metro Exodus with no V-Sync, because when I turn on in-game V-Sync it drops to 35 FPS or less. With no V-Sync or full V-Sync, my Asus Strix GTX 970 needs 80% fan to keep it under 70C; no other game in my arsenal comes near that kind of temp or fan speed. One day I got curious and tried the Half V-Sync setting, and lo and behold the GPU temp hit 48C max, but the video suffers.

Why the drop in temp with half V-Sync, and why so hot with no or full V-Sync? I know my rig is running well, so it has to be something in the game, I would think anyway. Bad scripting?
 
Of course your frame rate dropped with V-Sync on. With plain double-buffered V-Sync, any time the GPU can't hold the monitor's refresh rate, the output snaps down to the next whole divisor of the refresh rate: 60, 30, 20, 15 on a 60Hz monitor. For example, if you have a 60Hz monitor and your GPU can only manage 58FPS, the display rate drops to 30FPS; if it can only manage 29FPS, you get 20FPS. That's the way it works.
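
To put numbers on it, here's a quick sketch (Python, purely illustrative):

Code:
import math

def double_buffered_fps(refresh_hz, raw_fps):
    # Each frame has to wait for a vblank, so the effective rate
    # snaps to refresh / n for some whole number n.
    if raw_fps >= refresh_hz:
        return refresh_hz
    return refresh_hz / math.ceil(refresh_hz / raw_fps)

print(double_buffered_fps(60, 58))  # 30.0 -- misses one vblank per frame
print(double_buffered_fps(60, 29))  # 20.0 -- misses two vblanks per frame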

Half V-Sync is where your system uses V-Sync, but the target frame rate is half your monitor's refresh rate, so 30FPS on a 60Hz monitor. Otherwise it works the same way: fall below the target and you snap down to the next divisor. The reason you get a huge drop in temperature is that you are making your video card render frames at half the rate, which lessens its workload; that's also why, with full V-Sync or no V-Sync, your GPU runs hotter. You are also seemingly obsessed with temperatures. There is no need to keep the GPU under 70C. You aren't going to have a problem until the GPU throttles, which is around 90C, I believe.
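
To see why the workload drops, here's a back-of-the-envelope duty-cycle sketch (Python; the 14ms render time is a made-up number, not a measurement from your card):

Code:
def gpu_busy_fraction(render_ms, target_fps):
    # Fraction of each frame interval the GPU spends rendering
    # before it can sit idle waiting for the next frame.
    frame_budget_ms = 1000.0 / target_fps
    return min(1.0, render_ms / frame_budget_ms)

print(gpu_busy_fraction(14.0, 60))  # ~0.84 -- nearly flat out at 60FPS
print(gpu_busy_fraction(14.0, 30))  # ~0.42 -- idle over half the time at half V-Sync

Roughly half the duty cycle means far less heat to dump, hence the 48C.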

The reason one game is more demanding than another has nothing to do with scripting. It comes down to how demanding the game's engine is and which features it uses to create its visuals. Metro Exodus is simply more demanding than most. Mass Effect 1 isn't as demanding as Mass Effect 3, and Mass Effect 3 isn't anywhere near as demanding as Metro Exodus. And if you think Exodus is hard to render, try Cyberpunk 2077 with every visual setting your GPU can support cranked up. The connection? Those games ascend in visual quality and GPU demand. It's really that simple.
 
I'm sure I'm missing a common cause for this that maybe depends on certain GPUs or displays, but I've only run into this V-Sync behavior three or four times in my entire history of gaming. If I have V-Sync enabled and the frame rate drops below the refresh rate, it just... goes below the refresh rate. Granted, I've had G-Sync displays for a few years, so it's irrelevant for me now.
 
Please define "seemingly" ... I'll wait.

Well, because you're trying to keep it below 70C, apparently? 70C is not that hot these days compared to newer cards. You're in for a rude awakening if you upgrade to something like a 3070 or better. Most of these newer cards are meant to routinely run at 75-85C, depending on the specifics, and have hot spots (GDDR6 memory) that can run hotter.
 
If you enable triple buffering, you won't run into the half-step frame rate issue with V-Sync. For some reason, not all games support triple buffering, though. If you use V-Sync, it's best to just enable triple buffering in the NVIDIA Control Panel.
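
Here's an idealized sketch of why triple buffering avoids the half-step (Python; it ignores queue startup and frame-time jitter):

Code:
import math

def presented_fps(refresh_hz, raw_fps, triple_buffered):
    # Double buffering stalls the GPU until the next vblank, so the
    # rate snaps to refresh / n. Triple buffering gives it a spare
    # buffer to keep rendering into, so over many vblanks the
    # average tracks the raw rate instead.
    if raw_fps >= refresh_hz:
        return refresh_hz
    if triple_buffered:
        return raw_fps
    return refresh_hz / math.ceil(refresh_hz / raw_fps)

print(presented_fps(60, 45, triple_buffered=False))  # 30.0
print(presented_fps(60, 45, triple_buffered=True))   # 45.0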
 
Well, the NVIDIA Control Panel's triple buffering setting is OpenGL-only, so anything that uses DirectX, Vulkan, etc. won't see any impact.

Of course, triple buffering also means you are effectively letting the card queue frames ahead of the display, three buffers deep, so...

...it's not great if you are shooting for the lowest-latency response to user inputs (important basically only in twitch shooters, granted). Indeed, if you enable NVIDIA's "low latency mode" to chase that performance goal, it limits your queued frames to 1.
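
Back-of-the-envelope on what that queue costs you (Python; a simple worst-case model, real pipelines vary):

Code:
def queue_latency_ms(refresh_hz, queued_frames):
    # Worst case, each queued frame ages one full refresh interval
    # before it reaches the screen.
    return queued_frames * 1000.0 / refresh_hz

print(queue_latency_ms(60, 3))  # 50.0 ms with a three-deep queue
print(queue_latency_ms(60, 1))  # ~16.7 ms with the queue capped at 1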
 
This thread isn't about input lag. If you want minimal lag while still preventing tearing, the optimal choice would be scanline sync.

Fact is, you will always get some input lag with V-Sync, but if you are going to use it, you should enable triple buffering when possible. Otherwise, use adaptive V-Sync and live with the tearing when FPS falls below the refresh rate.
 
"Enabling triple buffering" makes it about input lag.

Sure, enabling V-Sync broadly creates input lag, and enabling triple buffering adds a bit more on top. The real question is whether that cure is worse than the disease, and it usually isn't. With full V-Sync and double buffering, dropping to 59FPS forces the GPU output down to 30FPS to stay in sync with the monitor, and that feels like more input lag than triple buffering does. Triple buffering smooths out those frame rate spikes and drops and keeps you from snapping to 30FPS in the first place; if it holds the output near 60FPS, the 60Hz sync works out fine and everything is cooking with gas, just with the triple-buffer-induced input lag, which is hopefully not as bad as the drop from 60 to 30.
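
To make that concrete, a crude model (Python; assumed numbers, not measurements):

Code:
def worst_case_lag_ms(fps, queued_frames):
    # One frame interval to render, plus time spent sitting in the queue.
    frame_ms = 1000.0 / fps
    return frame_ms * (1 + queued_frames)

# Double buffered, missing 60Hz and locked to 30FPS:
print(worst_case_lag_ms(30, queued_frames=1))  # ~66.7 ms
# Triple buffered, holding ~59FPS with two frames queued:
print(worst_case_lag_ms(59, queued_frames=2))  # ~50.8 ms

Even with the extra buffered frame, staying near 60FPS works out to less lag than snapping to 30.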

But there are things that change that:
  • As you note, adaptive V-Sync: output locked at 60FPS on a 60Hz monitor, but drop below 60FPS, even into the 50s, and you get screen tearing as if V-Sync were disabled (the upside being you are not forced all the way down to 30FPS when 60 isn't achievable)
  • G-Sync and FreeSync solve this in an even bigger way, avoiding the tearing altogether: the monitor refreshes when a complete frame is ready, so no partial frames get scanned out. It does require the monitor to support it, but most do these days. Or scanline sync, sure, but given how widespread at least AMD's FreeSync is...
 
"Enabling triple buffering" makes it about input lag.

Sure, enabling v-sync broadly creates input lag. Enabling triple buffering will make the input lag worse, though. Sort of. Well...it does, just the question is whether that cure is worse than the disease of input lag (it usually is - IE., triple buffering will definitely smooth out the framerate spikes/drops, so if you are using full-v-sync, where you drop to 59fps and instead the GPU output has to drop to 30fps to stay in sync with the monitor...that will feel like MORE input lag than enabling triple buffering, which hopefully prevents you from that 59fps drop in the first place. If triple buffering helps you stay above 60fps, then the monitor sync at 60hz works out fine and everything is cookin' with gas, just with the triple-buffer-induced-input-lag, but hopefully that's not as bad as the drop to 30fps from 60).

But there are things that change that:
  • As you note, adaptive sync. Output locked at 60fps for a 60hz monitor, unless you drop below 60fps to something even in the 50s, then you get screen tearing as if v-sync was disabled (but also you are not forced all the way down to 30fps if 60fps isn't achievable)
  • G-sync and Freesync solve this in an even bigger way, by avoiding the screen tearing when rendering partial frames - although this does require the monitor to support it, most do these days. Or scanline sync, sure, but given how widespread at least AMD's Freesync is...
The OP, HAL_404, has never brought up input lag in this discussion. They are asking why half V-Sync cuts their FPS yet drops their temperature, while full or no V-Sync runs the GPU so hot.
 
If you have an adaptive sync monitor, set it to G-Sync compatibility mode if that's available in your NVIDIA Control Panel. Once you're in your game, turn V-Sync off. Then use MSI Afterburner with RivaTuner and limit your frame rate to the desired number. That should lower your video card temps.
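
If you're wondering why a cap helps temps, this toy limiter (Python; render_frame is a stand-in workload, not a real API) shows the mechanism: everything the GPU doesn't spend rendering is idle time.

Code:
import time

TARGET_FPS = 60                  # hypothetical cap set in RivaTuner
FRAME_BUDGET = 1.0 / TARGET_FPS

def run_capped(render_frame, frames=600):
    # Render, then sleep off the rest of the frame budget. That sleep
    # is idle time for the GPU, which is where the cooling headroom
    # comes from.
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)

run_capped(lambda: None)  # stand-in for the real render work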
 