My Experience with FreeSync vs. G-Sync

Archaea

[H]F Junkie
Joined
Oct 19, 2004
Messages
11,821
My Experiences:
Fury X in Crossfire with FreeSync on an Eyefinity trio of HP Omen 32" 1440P 75Hz FreeSync monitors (7680x1440) (used for > 6 months)
1080TI in SLI (or lately just a single 1080TI) with a single Alienware 34" G-Sync monitor 3440x1440 at 120Hz (used for a couple months)

Perhaps surprisingly, I actually liked my FreeSync experience better than G-Sync.

Why?

G-Sync momentarily hitches when my frame rate goes above the monitor's 120Hz refresh ceiling. That happens constantly, even at my Alienware's 3440x1440 resolution on a single 1080TI. Take, for instance, Star Wars Battlefront 2, which I've been playing a lot lately. I might get 90 FPS on Ultra in a big open environment but 180 FPS in a corridor. Every time I cross 120 FPS I get a slight stutter. Cutscenes, gameplay, whatever. It's annoying.

FreeSync doesn't have that stutter when you exceed the FreeSync range; it stays smooth. Sure, you might see occasional tearing above the FreeSync range (though tearing is less of a problem at higher FPS than at lower FPS, so this isn't much of an issue), but between the two characteristics, the 120Hz G-Sync stutter on my monitor is absolutely more off-putting than the potential for tearing above the FreeSync range. The stutter at 120Hz is all the more noticeable because the experience with the 1080TI is otherwise so consistently silky smooth!

Any advice?

I need to try capping the FPS at 120 with something. I've read I can set an FPS cap with a tool like RivaTuner; is that the best way? It just seems I shouldn't have to mess with that type of thing on a premium matched G-Sync setup. I could also use V-sync, but that defeats the point of FreeSync or G-Sync, right? The whole idea is to get away from V-sync's mouse lag.

As another note (I don't know if it's just my specific Alienware 34" monitor or a G-Sync characteristic in general), the blinking screen with this setup is annoying me. FreeSync never blinked the screen. G-Sync seems to do it three times every time my monitor wakes up. I assume it's the G-Sync chip waking and syncing with the LCD display? Or maybe it's an Alienware monitor nuance? Either way, it's annoying too.


----

All in all, the horsepower in a 1080TI is off the charts amazing. I've never owned a graphics card this powerful. The 1080TI is a better card than the Vega 64 Liquid cooled card I owned briefly HANDS DOWN. EVERYTHING I throw at the 1080TI - the card just laughs and curb stomps it. NEVER an issue with low frame rate, like I'd see with my Vega and occasionally with my Fury X. A single 1080Ti is a force to be reckoned with.

I just need to address these little G-Sync or setup idiosyncrasies to bring the Nvidia card and G-Sync setup out of the shadow of my Fury X setup with FreeSync.

As it is, I wish I had kept the pair of Fury X cards and bought the 38" Acer with FreeSync for my gaming experience. I tried 3 different Vegas at their launch. Junk cards at the time; that's all I'll say here (Vega launch drivers were trash). The Fury X experience, by contrast, was incredibly polished by the June 2017 timeframe. Excellent! Alas, the Fury X was just on the border of running too few FPS to stay within the minimum 48Hz FreeSync range at 7680x1440 on newer titles at high settings or above. At 3840x1600 on a single 38" they would have been pretty solid, I think.

----


Any suggestions to get my G-Sync setup firing on all cylinders? On paper, you'd think the G-Sync setup would be superior.
 
I've noticed this too, but I simply cap games at 97 FPS (100Hz max on my PG348Q). I haven't really thought much about it since.
 
Nvidia Profile Inspector is the tool I use to cap my framerates. I always just cap it globally at right below my monitor's refresh rate. This way, I don't need to worry about whether a given game has the ability to cap it where I want it.

I even do this on machines that aren't hooked up to G-Sync monitors. There's no reason to drive your GPU harder than your display can refresh, unless you're stress testing.
 
Nvidia Profile Inspector is the tool I use to cap my framerates. I always just cap it globally at right below my monitor's refresh rate. This way, I don't need to worry about whether a given game has the ability to cap it where I want it.

I even do this on machines that aren't hooked up to G-Sync monitors. There's no reason to drive your GPU harder than your display can refresh, unless you're stress testing.

Take a look at the above link. RivaTuner creates less frame delay. Might want to give it a shot.
 
+1 for RTSS (RivaTuner). I've been limiting to under the refresh rate, like previously mentioned, since back when severe coil whine was a problem for power-hungry cards.
 
It is best to always keep v-sync enabled with g-sync, otherwise you will get occasional hitching or tearing as explained here.
Nvidia should really add a note to the v-sync option for g-sync users. They added the option to disable v-sync with g-sync not so long ago but the result is people complaining about tearing or stuttering with g-sync because nvidia never cared to explain how it all works. It is a pretty pointless option honestly, if you are some super competitive hardcore gamer you might as well disable g-sync entirely to shave off those extra 5ms of lag (being generous). Or leave it on, with v-sync on, and cap your frames like sane people. G-sync on + v-sync off just works poorly in practice.

I usually cap my frames a notch below the max refresh rate but that is purely to avoid the input lag. When I forget to (or don't care to) cap the framerate I frequently run into the limit and it doesn't cause any hitching for me. But yes, that's with v-sync enabled.
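The "cap a notch below max refresh" habit can be sketched as a quick calculation. The 3 FPS margin here is just an assumption; use whatever margin reliably keeps you inside your panel's VRR window:

```python
def safe_fps_cap(refresh_hz: int, margin: int = 3) -> int:
    """Return an FPS cap a few frames below the panel's max refresh,
    so variable refresh stays engaged instead of hitting the V-Sync
    ceiling. The 3 FPS margin is an assumed default, not gospel."""
    return refresh_hz - margin

# 120Hz Alienware from this thread, and a 100Hz PG348Q
print(safe_fps_cap(120), safe_fps_cap(100))  # prints: 117 97
```

Note the 100Hz case lands on 97, the same cap an earlier poster settled on by feel.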
 
It is best to always keep v-sync enabled with g-sync, otherwise you will get occasional hitching or tearing as explained here.
Nvidia should really add a note to the v-sync option for g-sync users. They added the option to disable v-sync with g-sync not so long ago but the result is people complaining about tearing or stuttering with g-sync because nvidia never cared to explain how it all works. It is a pretty pointless option honestly, if you are some super competitive hardcore gamer you might as well disable g-sync entirely to shave off those extra 5ms of lag (being generous). Or leave it on, with v-sync on, and cap your frames like sane people. G-sync on + v-sync off just works poorly in practice.

I usually cap my frames a notch below the max refresh rate but that is purely to avoid the input lag. When I forget to (or don't care to) cap the framerate I frequently run into the limit and it doesn't cause any hitching for me. But yes, that's with v-sync enabled.


After quite a bit of testing, I think the consensus has been that with high-refresh G-Sync monitors, to achieve the lowest input lag it's best to turn V-Sync off in demanding games in which you expect to have a frame rate in the 40-70 FPS range... and to use a frame limiter and turn V-Sync on in less demanding or first-person shooter games where you expect to see FPS in the 100+ range.


(I know it's Linus, but his process seemed legit.)
 
After quite a bit of testing, I think the consensus has been that with high-refresh G-Sync monitors, to achieve the lowest input lag it's best to turn V-Sync off in demanding games in which you expect to have a frame rate in the 40-70 FPS range... and to use a frame limiter and turn V-Sync on in less demanding or first-person shooter games where you expect to see FPS in the 100+ range.


(I know it's Linus, but his process seemed legit.)

That video's conclusion is jacked. It looks like G-Sync plus V-Sync is broken at lower FPS?
 
After quite a bit of testing, I think the consensus has been that with high-refresh G-Sync monitors, to achieve the lowest input lag it's best to turn V-Sync off in demanding games in which you expect to have a frame rate in the 40-70 FPS range... and to use a frame limiter and turn V-Sync on in less demanding or first-person shooter games where you expect to see FPS in the 100+ range.


(I know it's Linus, but his process seemed legit.)


Linus doesn't do very rigorous testing and uses a small sample size. I'm not attacking him, he is constantly making videos about a lot of things, it's hard to fault him for not being extremely thorough on one specific subject.

You should check out Battle(non)sense and blurbusters instead. For example here :
There was a third guy (a competitive Street Fighter player) doing good testing too, but I've lost his name.
 
Turning v-sync on, in my experience, is terrible. Everything feels so sluggish. There is no way I can play like that.
 
I've been running a Samsung 27" 1080p with FreeSync on an all-AMD Ryzen 5 1400 and 290X with the latest drivers, and it has been awesome playing games.
 
It doesn't work --- what am I doing wrong? I tried to limit it to 118FPS and it's just behaving like it did before I installed RivaTuner. This attempt with Star Wars Battlefront 2.

 
You don't need an external program for Battlefront 2 (or any other DICE game). Create a user.cfg file in the BF2 root folder and just put " GameTime.MaxVariableFps 118 " in it. You can also open the console in game and type it in there but it will reset when you close the game.
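As a sketch, the file can be created from a shell. The install path below is purely hypothetical; point BF2_ROOT at wherever your copy of the game actually lives:

```shell
# Hypothetical install location -- replace with your actual BF2 root folder.
BF2_ROOT="$HOME/Games/StarWarsBattlefrontII"
mkdir -p "$BF2_ROOT"  # only here so the sketch runs anywhere

# One line is all user.cfg needs for the frame cap.
printf 'GameTime.MaxVariableFps 118\n' > "$BF2_ROOT/user.cfg"
cat "$BF2_ROOT/user.cfg"
```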
 
You don't need an external program for Battlefront 2 (or any other DICE game). Create a user.cfg file in the BF2 root folder and just put " GameTime.MaxVariableFps 118 " in it. You can also open the console in game and type it in there but it will reset when you close the game.

doesn't work -- did I do it right? When I enter that line manually in the game it doesn't work either. It says result 32 something or other, but my FPS is still way higher than the limit I set.

 
Nvidia still doesn't have a frame limiter built into their driver software? That's bunk. Like selling chopsticks that leave splinters in people's fingers each time they're used.

Yes they do. It's just not shown in the GUI and requires a third party app (nvinspector) to access it. It's also not as good as Rivatuner's limiter and adds a touch of input lag.

doesn't work -- did I do it right? When I enter that line manually in the game it doesn't work either. It says result 32 something or other, but my FPS is still way higher than the limit I set.

I just had a look and you're right it doesn't work any more. It worked fine in BF1 and all other DICE games, no idea what happened there. I mean, the console commands are exactly the same and it even saves the value when you type in GameTime.MaxVariableFps something. What did they do? I am almost 100% positive it worked on the earlier builds of the game too...

Guess it's back to making Rivatuner work (or using nvinspector which gives about 2 frames of lag, not really perceptible at 100+ hz but still technically a bit worse).
 
It doesn't work --- what am I doing wrong? I tried to limit it to 118FPS and it's just behaving like it did before I installed RivaTuner. This attempt with Star Wars Battlefront 2.

Try creating an application profile specifically for SWBF2.

I've also found that frame limiters on my system only work in +/- 5 increments, so try setting it to 115.
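If your limiter really does snap to 5 FPS steps, the resulting cap is just the target rounded down to the nearest increment. A tiny sketch (the 5-step behavior is one poster's observation on their own system, not documented driver behavior):

```python
def snap_cap(target_fps: int, step: int = 5) -> int:
    """Round a desired FPS cap down to the nearest multiple of `step`,
    for limiters that only accept caps in fixed increments."""
    return (target_fps // step) * step

print(snap_cap(118))  # prints: 115
```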

You could also try setting V-Sync to Fast in the NVIDIA control panel. Despite the internet wisdom I have not noticed any detrimental effects of using Fast sync with G-Sync.

If you have to, just turn V-Sync on in the NVIDIA control panel instead (not the game). Worst case scenario is you'll get 1 frame of input lag when the framerate peaks for you in open areas.
 
You could also try setting V-Sync to Fast in the NVIDIA control panel. Despite the internet wisdom I have not noticed any detrimental effects of using Fast sync with G-Sync.

This one tip seems to have met my satisfaction with the games I've been playing lately. Thanks!
 
Yep.

Fast-Sync is great!

Welcome to the green side :D

ha.


Why isn't it enabled by default? What's the negative? I've spent time with a couple of games so far, Battlefront 2 and PUBG, and both titles seem to be perfectly fine with Fast-Sync, while both were broken with just G-Sync alone; hence this thread.
 
Really? There's no frame-rate target control built into Nvidia's drivers? Seems like a bit of an oversight. I'm still using my Omen 32 and FreeSync as well; I just have the frame target at 74 FPS and I almost never have any issues.
 
I had never heard of FastSync... holy crap, finally no tearing in Overwatch! It's literally the only game I've come across where, even when you limit the FPS to a few under my panel's 100Hz refresh, it still tears with G-Sync on. I just turned on FastSync for Overwatch specifically, disabled the in-game FPS cap, and NO TEARING. Finally!!!
 
ha.


Why isn't it enabled by default? What's the negative? I've spent time with a couple of games so far, Battlefront 2 and PUBG, and both titles seem to be perfectly fine with Fast-Sync, while both were broken with just G-Sync alone; hence this thread.

No idea.

I think partially because people are familiar with VSYNC and GSYNC is pretty cutting edge technology that's hard to explain to begin with.

I also believe, though I could be wrong, that Fast Sync is only available on the 9-series and up, whereas G-Sync is available from the 6-series and up.

It's confusing, but they absolutely should configure this stuff by default. I wish they'd combine GeForce Experience and the Nvidia Control Panel and just by default enable things like Fast Sync so that you have to muck things up on purpose to not get the best experience.
 
So yeah, playing around a bit more, G-Sync and FastSync enabled together is the best of both worlds, for when you're both rendering under and over your panel's refresh rate. With only a 100Hz panel (PG348Q), this is helpful, especially in something like Overwatch where over 100 FPS is pretty damn easy to achieve all the time.
 
Not even gonna read it. Overwatch feels great for the first time since getting my G-Sync panel; I'm not changing anything, regardless of what the numbers say.

It's all about feel for me.
 
I've been satisfied using my 32 inch Omen under Freesync @ 75hz. No tearing and no input lag that I've noticed.
 
I've been satisfied using my 32 inch Omen under Freesync @ 75hz. No tearing and no input lag that I've noticed.

Have you tried creating a custom resolution with the built-in AMD tool or even using CRU? I would imagine you can push your refresh rate to at least 80Hz, and might even get up to 85 or so depending on the scaler... I was able to bump my cheap Dell 27" from 48-75Hz to 45-80Hz, which is nice considering I am running wayyy too much GPU for 1080P lol... I set everything to 1800P/4K via VSR and enjoy my sharper images while still staying buttery smooth with FS. This $159 panel is the best LCD (dollar for dollar) I have ever owned, tied with my HP 27ZR something that is in my sig...
 
Have you tried creating a custom resolution with the built-in AMD tool or even using CRU? I would imagine you can push your refresh rate to at least 80Hz, and might even get up to 85 or so depending on the scaler... I was able to bump my cheap Dell 27" from 48-75Hz to 45-80Hz, which is nice considering I am running wayyy too much GPU for 1080P lol... I set everything to 1800P/4K via VSR and enjoy my sharper images while still staying buttery smooth with FS. This $159 panel is the best LCD (dollar for dollar) I have ever owned, tied with my HP 27ZR something that is in my sig...

Haven't tried OCing the refresh rate. I'm kind of hesitant because after I did it to my last monitor it started to exhibit signs of failing soon after. Hence why I have the Omen now.
 
Haven't tried OCing the refresh rate. I'm kind of hesitant because after I did it to my last monitor it started to exhibit signs of failing soon after. Hence why I have the Omen now.

I have never really found it to be a problem long term, but I guess I can understand the concern. It really depends on whether you run native resolution and how much GPU power you have to drive that resolution. If you are using a card that can't break 75 FPS on Ultra settings @ 1080P, then the issue is moot. If you have more GPU Powa than you can use @ 1080P, then cranking it up via VSR and upping the refresh will help improve your gameplay experience.

I have been eyeing a new LCD replacement myself, but I really want to go 32"-40" 1440P/4K @ 100Hz or higher. Trying to let the HDR shit show shake itself out for a while longer. Between that and the delay with HDMI 2.1 VRR rolling out on 4K sets, I have been delaying an LCD and TV replacement.
 
I have never really found it to be a problem long term, but I guess I can understand the concern. It really depends on whether you run native resolution and how much GPU power you have to drive that resolution. If you are using a card that can't break 75 FPS on Ultra settings @ 1080P, then the issue is moot. If you have more GPU Powa than you can use @ 1080P, then cranking it up via VSR and upping the refresh will help improve your gameplay experience.

I have been eyeing a new LCD replacement myself, but I really want to go 32"-40" 1440P/4K @ 100Hz or higher. Trying to let the HDR shit show shake itself out for a while longer. Between that and the delay with HDMI 2.1 VRR rolling out on 4K sets, I have been delaying an LCD and TV replacement.

I'm running an RX 580 8GB @ 2560x1440. For the games I play it maxes 75 FPS just fine at Ultra/High settings. I've been happy with it. Although if I had a choice I would have stuck with my GTX 980, but the 980 had weird stuttering issues on the Omen at anything above 60Hz.

Would love to upgrade to a Vega 56 but it's way overpriced.
 
I thought that G-Sync was pain-free tech to solve tearing and stuttering at reasonable framerates.

Why doesn't G-Sync simply cap the framerate to the monitor's refresh rate to solve all these problems?
Should I worry about capping the framerate when using G-Sync? If this is needed, why isn't it the default option in the drivers?
 
I thought that G-Sync was pain-free tech to solve tearing and stuttering at reasonable framerates.

Why doesn't G-Sync simply cap the framerate to the monitor's refresh rate to solve all these problems?
Should I worry about capping the framerate when using G-Sync? If this is needed, why isn't it the default option in the drivers?

Good questions!

It’s not taken care of automatically. You have to figure out a way. Adaptive sync seems a pretty easy way. Gsync hasn’t worked right for me on Path of Exile for months. Other than that I think all my games have been working good with gsync and it’s a very enjoyable tech.
 
Good questions!

It’s not taken care of automatically. You have to figure out a way. Adaptive sync seems a pretty easy way. Gsync hasn’t worked right for me on Path of Exile for months. Other than that I think all my games have been working good with gsync and it’s a very enjoyable tech.
Adaptive Sync should do the trick: cap it when it's above the monitor's refresh rate and turn it off when it's not. Adaptive Sync is working great on a 144Hz FreeSync 2 HDR monitor running at 100Hz, giving a very consistent 100 FPS level of play, a very buttery-smooth experience. G-Sync would just make that much easier.
 
I thought that G-Sync was pain-free tech to solve tearing and stuttering at reasonable framerates.

Why doesn't G-Sync simply cap the framerate to the monitor's refresh rate to solve all these problems?
Should I worry about capping the framerate when using G-Sync? If this is needed, why isn't it the default option in the drivers?
It did when G-Sync was first released to the masses, but for some reason people bitched about it so NVIDIA made it optional in the drivers. G-Sync On + V-Sync On (in the control panel, no buffering) = the original behavior and provides the best experience. Make sure you turn off V-Sync in your games as apparently there is a conflict that causes a big drop in framerate with G-Sync if you don't. The control panel setting should be overriding the game setting with no issues if the bug didn't exist.
Good questions!

It’s not taken care of automatically. You have to figure out a way. Adaptive sync seems a pretty easy way. Gsync hasn’t worked right for me on Path of Exile for months. Other than that I think all my games have been working good with gsync and it’s a very enjoyable tech.
That's Adaptive V-Sync, and I'm not sure of the behavior when the framerate approaches the max refresh rate using that approach.
 
That's Adaptive V-Sync, and I'm not sure of the behavior when the framerate approaches the max refresh rate using that approach.

Yes,

I don't know either. All I know is Adaptive V-Sync got rid of the hitch in smoothness as G-Sync went over the monitor's allowable range.

Before I turned on Adaptive V-Sync in the Nvidia control panel, every time my framerate transitioned over or under 120Hz I'd have a hitch in FPS, presumably related to the G-Sync module engaging? (120Hz is the refresh rate of my monitor.) It was terribly annoying.
 
I see many users complaining about flickering when GSYNC is enabled.
I have a G-Sync monitor but I can't try it, because right now I have a GTX 580 video card;
I sold my old GTX 980 Ti SLI to buy an RTX 2080 Ti, and I'm driving the PC with the good old 580 for now.

Is flickering something that I should worry about? If yes, I will return my Acer XB271HK right now.
 