Is G-Sync worth it?

Humpernator

Weaksauce
Joined
Oct 23, 2015
Messages
112
I'm interested in hearing from others about their opinions on G-Sync. After a long debate, I decided to pick one up for myself, and I must say my first impressions aren't that great. I know I have everything enabled in the control panel, but it just doesn't seem to be the big upgrade I was hoping for. I have the Dell S2716DG.
 
Well, whether G-Sync is worth its asking price depends heavily on the premium over a non-G-Sync variant. For the 1440p 144Hz monitors I would say yes, it's worth it, because the difference between non-G-Sync and G-Sync isn't too big, about $100 give or take if you compare monitors like the BenQ XL2730Z to the Dell S2716DG, both 1440p 144Hz. On the 4K side, however, a 27-inch IPS FreeSync monitor like the LG27MC67 costs $450, while G-Sync monitors like the Asus PG27AQ and Acer XB271HK cost $900! WTF! That's DOUBLE the price, a $450 premium just for G-Sync. In that case I would say hell no, it is not worth it.
 
I'm interested in this as well -- I'm currently running Xfire 290s and won't be going back to camp green for probably another year, when the next-gen Ti hits. I'll probably update my monitors before then and have been looking at G-Sync models.

Been eyeballing the Dell S2716DG - and it's a fair price at $580 - but the fact that it's a TN panel with Dell's shitty antiglare coating on it is a huge turn-off. (I'd love IPS, glossy, thin bezel, 144Hz, G-Sync.) Guess I need to be patient :)
 
I'm interested in hearing from others about their opinions on G-Sync. After a long debate, I decided to pick one up for myself, and I must say my first impressions aren't that great. I know I have everything enabled in the control panel, but it just doesn't seem to be the big upgrade I was hoping for. I have the Dell S2716DG.
It would be helpful if you actually told us what your expectations were, and why you're disappointed with it.
 
I'm interested in this as well -- I'm currently running Xfire 290s and won't be going back to camp green for probably another year, when the next-gen Ti hits. I'll probably update my monitors before then and have been looking at G-Sync models.

Been eyeballing the Dell S2716DG - and it's a fair price at $580 - but the fact that it's a TN panel with Dell's shitty antiglare coating on it is a huge turn-off. (I'd love IPS, glossy, thin bezel, 144Hz, G-Sync.) Guess I need to be patient :)

I bought mine for $510 from Newegg last week. My previous monitor was a Dell U2515H (IPS, 1440p), and the S2716DG isn't that bad for being a TN panel. I don't really notice the antiglare coating, but perhaps I'm just used to it.

It would be helpful if you actually told us what your expectations were, and why you're disappointed with it.

Well, I thought about it, and it's hard to put into words. I think it's just me. I can definitely tell that the tearing is gone, but I guess it's more of an issue with the 144Hz after playing a few hours last night. I can definitely tell the difference at around 90-100 fps, but anything above that isn't as noticeable. Basically, this post was a bit premature.
 
Well, I thought about it, and it's hard to put into words. I think it's just me. I can definitely tell that the tearing is gone, but I guess it's more of an issue with the 144Hz after playing a few hours last night. I can definitely tell the difference at around 90-100 fps, but anything above that isn't as noticeable. Basically, this post was a bit premature.
Make sure that you're running the games in full-screen exclusive mode, and not windowed.
Try running 144Hz without G-Sync and see how that compares. And try 144Hz with triple-buffering (or full-screen windowed mode) without G-Sync.
 
I bought mine for $510 from Newegg last week. My previous monitor was a Dell U2515H (IPS, 1440p), and the S2716DG isn't that bad for being a TN panel. I don't really notice the antiglare coating, but perhaps I'm just used to it.



Well, I thought about it, and it's hard to put into words. I think it's just me. I can definitely tell that the tearing is gone, but I guess it's more of an issue with the 144Hz after playing a few hours last night. I can definitely tell the difference at around 90-100 fps, but anything above that isn't as noticeable. Basically, this post was a bit premature.


I like my S2716DG, but I wish Dell and other companies would stop using those shit panels from Acer/AU Optronics. They're inferior to panels from other makers such as Samsung/LG, but they seem a lot cheaper at the same time. I would prefer to pay a bit more instead of buying all those overpriced, junky Asus and Acer monitors.
 
Make sure that you're running the games in full-screen exclusive mode, and not windowed.
Try running 144Hz without G-Sync and see how that compares. And try 144Hz with triple-buffering (or full-screen windowed mode) without G-Sync.

There was an option in the NVIDIA Control Panel to allow G-Sync in windowed mode. Thought I was covered. I'll try your suggestions and see if it makes a difference. Thanks.
 
I'm interested in this as well -- I'm currently running Xfire 290s and won't be going back to camp green for probably another year, when the next-gen Ti hits. I'll probably update my monitors before then and have been looking at G-Sync models.

Been eyeballing the Dell S2716DG - and it's a fair price at $580 - but the fact that it's a TN panel with Dell's shitty antiglare coating on it is a huge turn-off. (I'd love IPS, glossy, thin bezel, 144Hz, G-Sync.) Guess I need to be patient :)

G-Sync and FreeSync are basically the same thing. I've used both; if you ask me, they are absolutely game-changing. The resulting smoothness is ridiculous. What about something like this that has FreeSync?

eBay
 
As an eBay Associate, HardForum may earn from qualifying purchases.
G-Sync and FreeSync are basically the same thing. I've used both; if you ask me, they are absolutely game-changing. The resulting smoothness is ridiculous. What about something like this that has FreeSync?

eBay

I thought about it -- there are a few seemingly okay FreeSync panels out there -- if I bought it, I would be able to enjoy them for maybe a year before I plan to go back to a single card nvidia solution. Long term plans are to end up with a full next gen nvidia gsync solution.
 
If you can't immediately tell the difference, the game isn't being G-Synced. It is a night-and-day difference. Either you're hitting your refresh rate limit or it isn't set up correctly.
 
IMHO the biggest wow factor is not G-Sync itself, because that's kind of similar to what I already had (I refused to play anything that didn't keep 60 fps locked), but running stuff in strobing mode and the clarity of the screen with it. Although G-Sync mode is nice too, since I don't have to be a 60 fps zealot anymore ;)
 
ULMB does not have the smoothness of G-Sync, unless you have a 100% locked framerate. (And it must be 85/100/120 FPS with ULMB)
The problem with traditional displays is that we're reaching a point where it's not possible to guarantee a minimum of 60 FPS no matter what hardware you've got - and it's not just a minimum of 60 FPS, it has to be completely locked to 60. Anything above or below 60 will stutter.
 
Well, I thought about it, and it's hard to put into words. I think it's just me. I can definitely tell that the tearing is gone, but I guess it's more of an issue with the 144Hz after playing a few hours last night. I can definitely tell the difference at around 90-100 fps, but anything above that isn't as noticeable. Basically, this post was a bit premature.
The closer your framerate is to the max refresh rate, the more the panel will act like regular V-Sync. The point of G-Sync is that you can turn the details up without much worry about the resulting framerate, since the panel takes care of the variation. You need to get out of the mindset of trying to adjust your games to run at or close to your monitor's refresh rate.

First thing I would do is disable any overlays you're running in your games that tell you what the framerate is. Launch any game and turn all the details except AA to the max with the resolution at 2560x1440. Now play it for about 20-30 minutes with G-Sync off and then another 20-30 minutes with G-Sync on. If you don't feel the difference or get that "wow" factor, then it is my opinion that you're just a weirdo :cat:.
 
ULMB does not have the smoothness of G-Sync, unless you have a 100% locked framerate. (And it must be 85/100/120 FPS with ULMB)
The problem with traditional displays is that we're reaching a point where it's not possible to guarantee a minimum of 60 FPS no matter what hardware you've got - and it's not just a minimum of 60 FPS, it has to be completely locked to 60. Anything above or below 60 will stutter.

Yes, that's all true. But since I used to be rather poor until very recently, I developed a pattern of buying games mostly on Steam sales while using the best hardware I could afford, so by the time I get to games that have trouble hitting 60 fps today, I'll have big Pascal :)

It's probably one of the biggest wastes of GPU power ever, or close to it, but at the moment I'm running King's Bounty: Armored Princess at 120 fps@2880p and downsampling to my 1440p screen, since I was in the mood for something turn-based.
 
Yes, that's all true. But since I used to be rather poor until very recently, I developed a pattern of buying games mostly on Steam sales while using the best hardware I could afford, so by the time I get to games that have trouble hitting 60 fps today, I'll have big Pascal :)

This is just good money management, poor or not.
 
The closer your framerate is to the max refresh rate, the more the panel will act like regular V-Sync. The point of G-Sync is that you can turn the details up without much worry about the resulting framerate, since the panel takes care of the variation. You need to get out of the mindset of trying to adjust your games to run at or close to your monitor's refresh rate.

First thing I would do is disable any overlays you're running in your games that tell you what the framerate is. Launch any game and turn all the details except AA to the max with the resolution at 2560x1440. Now play it for about 20-30 minutes with G-Sync off and then another 20-30 minutes with G-Sync on. If you don't feel the difference or get that "wow" factor, then it is my opinion that you're just a weirdo :cat:.

I ended up doing this over the weekend. I cranked RotTR to max and definitely noticed G-Sync kicking in. I did have the Steam fps counter on, but those drops to 40 fps were unnoticeable. Looks like I'm not a weirdo after all :)
 
Hell yes it's worth it. I didn't realise how much of a difference it made until I turned it off one day to compare. The difference is night and day.
 
As a new convert, I picked up an Acer p27oh G-Sync and, well, wow...
 
Thinking about jumping on the bandwagon and buying a Dell S2716DG here in a few days. Probably selling my Xfire 290 setup and one of my current Korean 1440p monitors to finance it; I'll have to limp along on my 970M laptop for a few months till Pascal comes out :)
 
I've got a FreeSync monitor waiting for me when I get home: an Asus MG279Q for my R9 390. To me, the G-Sync add-on cost wasn't worth it, plus my GTX 970 would've been too little card for a 1440p display. (See sig for rigs.) G-Sync would've meant getting a 1080p monitor (silly), or buying a 980-class card.

I'm curious to find out if I see a difference. Fingers crossed...
 
I had it on the ROG Swift array I used to run. I liked G-Sync a lot, but ultimately other issues with those displays are why I went another direction. While it's nice and all, I prefer 4K resolution on a display larger than any equipped with G-Sync at present.
 
There's a 32" Samsung FreeSync 4K monitor. That's not big enough for you? Oh, you have Nvidia cards.

Yeah, we really need to standardize variable refresh. It shouldn't be tied to a specific vendor.
 
There's a 32" Samsung FreeSync 4K monitor. That's not big enough for you? Oh, you have Nvidia cards.

Yeah, we really need to standardize variable refresh. It shouldn't be tied to a specific vendor.

Nothing is stopping NVIDIA from using FreeSync other than its own interest in selling G-Sync hardware.
 
G-Sync is never worth the money, as it cannot be used with strobing, which provides much more benefit. It's way better to be able to clearly see moving objects than to eliminate tiny imperfections.

Some monitors support both strobing and G-Sync, but you can only use one at a time. If you buy one of those monitors, enable strobing, and keep G-Sync off, it is worth it.

Re low FPS: a high-end GPU (GTX 980) will handle anything at 1600x1200@60Hz (roughly equal to 1920x1080), or 120Hz if you compromise on detail. Higher resolutions need faster video cards; you can't cheap out if you do things right.
 
Personally, I find screen tearing so immersion-breaking that I'd rather have G-Sync than strobing (never used it, so I'm not sure exactly what it is; I just know I'd rather have G-Sync to eliminate screen tearing).
 
I bought one of the new 21:9 ultrawides with G-Sync and it's great all around, but it really shines when you have games with frame rates that frequently dip below 60 fps. It eliminates the hitching and skipping you get when the framerate plummets and keeps it smooth. If you have a monster GPU setup that consistently runs high frame rates, you won't benefit nearly as much from G-Sync as a user with a mid-range or budget GPU.
 
Personally, I find screen tearing so immersion-breaking that I'd rather have G-Sync than strobing (never used it, so I'm not sure exactly what it is; I just know I'd rather have G-Sync to eliminate screen tearing).

Does your GPU produce an adequate framerate (equal to or above the monitor's refresh rate)? You should try strobing. You will like it. It might be called "ULMB", "LightBoost" or "turbo240" depending on your monitor.
 
G-Sync is never worth the money, as it cannot be used with strobing, which provides much more benefit. It's way better to be able to clearly see moving objects than to eliminate tiny imperfections.
Depends entirely on what you're playing.
If you had a system with infinitely fast hardware, sure, strobing is always better.
When you're playing new games on hardware with finite performance, smooth gameplay is often more important than motion blur reduction.
Strobing looks worse than full-persistence motion blur if you cannot sustain FPS = Hz. Much worse.

Re low FPS: A high end GPU (GTX980) will handle anything 1600x1200@60 (roughly equal to 1920x1080). 120 if you compromise detail. Higher resolutions need faster video cards; you can't cheap out if you do things right
You obviously haven't tried playing any new game releases. Even a 980Ti will drop below 60 FPS in new games.
A lot of games are becoming CPU-limited rather than GPU-limited now, and CPU performance has been largely unchanged in the past five years - at least as far as gaming is concerned.

And don't forget that most new displays with strobe options do not support it at 60Hz. ULMB starts at 85Hz, many others operate at 120Hz.

There's a 32" Samsung FreeSync 4K monitor. That's not big enough for you?
32" is about the worst size possible for a 4K display.
At 140 pixels per inch, everything is too small at 1x scale, and far too big at 2x scale.
Non-integer scaling (ideally 1.4x at 140 PPI) is poorly supported and tends to look really bad.

Ideally a 4K display would be 22" (200 PPI) or 44" (100 PPI) in size so that 1x or 2x scaling looks perfect.
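The PPI arithmetic behind those figures can be sanity-checked with a quick sketch (the function name is mine; the ~100 and ~200 PPI targets are the 1x/2x "clean scaling" sizes mentioned above):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 32" 4K lands awkwardly between the two clean scaling sizes:
print(round(ppi(3840, 2160, 32)))  # ~138 PPI -> the "about 140" figure, needs ~1.4x scaling
print(round(ppi(3840, 2160, 22)))  # ~200 PPI -> clean 2x scaling
print(round(ppi(3840, 2160, 44)))  # ~100 PPI -> clean 1x scaling
```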

Variable refresh is mandatory for me. I won't buy another monitor that doesn't have it at this point.

Only applies to fighting games really, where latency is critical and motion blur isn't much of a concern.
For games which are constantly scrolling the screen (shmups, platformers etc.) the amount of motion blur on a flicker-free display is a far bigger problem than 2 frames of latency and minor speed inaccuracies. Stutter is never a problem with modern emulators when configured correctly - they will lock the gamespeed to your refresh rate.
 
Depends entirely on what you're playing.
If you had a system with infinitely fast hardware, sure, strobing is always better.
When you're playing new games on hardware with finite performance, smooth gameplay is often more important than motion blur reduction.
Strobing looks worse than full-persistence motion blur if you cannot sustain FPS = Hz. Much worse.


You obviously haven't tried playing any new game releases. Even a 980Ti will drop below 60 FPS in new games.
A lot of games are becoming CPU-limited rather than GPU-limited now, and CPU performance has been largely unchanged in the past five years - at least as far as gaming is concerned.

And don't forget that most new displays with strobe options do not support it at 60Hz. ULMB starts at 85Hz, many others operate at 120Hz.
I have a CRT monitor, so I just drop the resolution if my GPU chokes. I never have issues at 1600x1200 100Hz, and 2048x1536 85Hz is usually OK. Note that I do not use maximum settings. I would never buy an LCD above 1920x1200 for gaming, as GPUs are not fast enough and I hate scaling. 1920x1080 is about equal to 1600x1200. If you play with reasonable settings, a GTX 980 will handle it.
 
I have a CRT monitor, so I just drop the resolution if my GPU chokes. I never have issues at 1600x1200 100Hz, and 2048x1536 85Hz is usually OK. Note that I do not use maximum settings. I would never buy an LCD above 1920x1200 for gaming, as GPUs are not fast enough and I hate scaling. 1920x1080 is about equal to 1600x1200. If you play with reasonable settings, a GTX 980 will handle it.
There is no way that you're staying above 100 FPS with a GTX 980 in new AAA games at 1080p or 1600x1200. It isn't possible.
Your system wouldn't even hold 60 FPS, no matter how low you turn down the graphics settings - unless you're dropping resolution well below what you claim.
Not only is a single 980 not enough for many new games, your CPU is a big bottleneck for that card.
My 2500K at 4.5GHz is bottlenecking a 960.
 
Only applies to fighting games really, where latency is critical and motion blur isn't much of a concern.
For games which are constantly scrolling the screen (shmups, platformers etc.) the amount of motion blur on a flicker-free display is a far bigger problem than 2 frames of latency and minor speed inaccuracies. Stutter is never a problem with modern emulators when configured correctly - they will lock the gamespeed to your refresh rate.

Bullshit. MAME, basically the only emulator that really matters for playing games properly, runs games at their native refresh rates, making variable refresh mandatory. Most games, including shooters, really don't scroll that quickly. Motion blur is heavily overrated and less important than stuttering and input latency.

You can't force old games to run at 60Hz without changing the speed they run at (too fast or too slow).
 
Bullshit. MAME, basically the only emulator that really matters for playing games properly, runs games at their native refresh rates, making variable refresh mandatory. Most games, including shooters, really don't scroll that quickly. Motion blur is heavily overrated and less important than stuttering and input latency.
It seems pretty clear that fighting games are your priority, and I don't believe you'd make those comments if you spent much time with shmups, platformers, or other 2D games which are constantly scrolling the screen and have you looking ahead.
Even at very slow scrolling speeds there is significant blur on a full-persistence display, it's awful. Far worse than the latency of V-Sync on a CRT in my opinion.
Perhaps it's not going to get you killed, because you just need to see that something is coming towards you rather than identify an object in motion, but I end up with a headache and eyestrain in no time because my eyes are trying to focus on a screen that is just one big blur.

When you bought a G-Sync monitor, what were you upgrading from? A crappy 60Hz LCD?
When was the last time you looked at a decent CRT running these games? (multi-sync PC monitor or some other display hooked up via RGB)

You can't force old games to run at 60Hz without changing the speed they run at (too fast or too slow).
Obviously, but that speed difference is less than 1% for 99% of the games in existence.
For the other 1% of games which differ significantly from 60Hz, that's why multi-sync displays exist.
G-Sync is not a requirement for speed accuracy - at least within a fraction of a percent.
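To make the "less than 1%" claim concrete, here's a quick sketch of the speed error when a game is forced to a fixed refresh rate (the 59.58Hz value below is just an illustrative arcade-style rate, not from any specific board):

```python
def speed_error_pct(native_hz, display_hz=60.0):
    """Percent speed change when a game made for native_hz is forced to run at display_hz."""
    return (display_hz / native_hz - 1.0) * 100.0

# A hypothetical 59.58Hz game forced onto a fixed 60Hz display runs
# about 0.7% fast, the "less than 1%" case described above.
print(round(speed_error_pct(59.58), 2))  # 0.7
```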


It's too bad that NVIDIA don't have a third mode for their G-Sync displays which combines G-Sync with ULMB.
It would work poorly for games with variable framerates, but very well for anything that runs at a fixed speed.
The current ULMB implementation is nearly useless due to it being limited to 85/100/120Hz.
 
bigbluefe is spot on: most 2D arcade games don't move anywhere near as fast as mouse-controlled 3D games do, and a 60Hz panel with good average pixel response and no PWM backlighting is more than capable for those.
And there are way more than 1% of games that are too far off 60Hz for the speed increase to be acceptable.
G-Sync or a similar solution is clearly a very valuable thing for retrogamers.
Not saying a higher refresh rate or strobing aren't nice, of course they're desirable, but I'd kill for a 60Hz OLED w/ G-Sync, even just Full HD.
The only thing I agree with you on here, zone74, is that some kind of 'hybrid' mode would be awesome.
 
bigbluefe is spot on: most 2D arcade games don't move anywhere near as fast as mouse-controlled 3D games do, and a 60Hz panel with good average pixel response and no PWM backlighting is more than capable for those.
And there are way more than 1% of games that are too far off 60Hz for the speed increase to be acceptable.
G-Sync or a similar solution is clearly a very valuable thing for retrogamers.
Not saying a higher refresh rate or strobing aren't nice, of course they're desirable, but I'd kill for a 60Hz OLED w/ G-Sync, even just Full HD.
The only thing I agree with you on here, zone74, is that some kind of 'hybrid' mode would be awesome.
I was specifically referring to 2D games, not 3D ones.
Even really slow movement is a complete blur on a full persistence 60Hz display - including OLEDs.
TestUFO at 4px/frame (240px/s) already shows considerable blur, and most games move at least that fast.

I did a couple of quick tests, and in Super Mario World, you can get moving at least at 9px/frame. This is 8px/frame.
In Sonic 2 you can get moving up to about 30px/frame. This is 24px/frame.
24px/frame is as sharp in motion as it is when paused on a CRT or low-persistence LCD at 60Hz, and a total blur on a full persistence display like a G-Sync monitor or OLED TV.
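As a rough model (a simplification I'm adding here, not an exact optics calculation): on a sample-and-hold display, an eye tracking a moving object smears it across roughly its per-frame travel distance, and strobing shrinks that smear in proportion to the lit fraction of each refresh:

```python
def hold_blur_px(speed_px_per_frame, persistence_fraction=1.0):
    """Approximate eye-tracking smear width on a sample-and-hold display.

    persistence_fraction is the lit fraction of each refresh:
    1.0 for full persistence, ~0.1 for a typical strobed/ULMB mode.
    """
    return speed_px_per_frame * persistence_fraction

print(hold_blur_px(24))        # Sonic 2 at 24px/frame: ~24px of smear at full persistence
print(hold_blur_px(24, 0.1))   # same speed with a 10% strobe duty cycle: ~2.4px
```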

I'm not saying there aren't any benefits to G-Sync, but being able to see what's happening when things move on the screen is far more important than speed being <1% more accurate.
It really is a tiny fraction of all games which run at something considerably lower than 60Hz, and for those you can change the refresh rate.
 
In complete agreement with zone here. I will never use a display that doesn't strobe again, whether I have to deal with stutter/tearing or not; sample-and-hold is absolutely disgusting.
 
I'm just in disagreement with your 'considerable blur' judgement.

My point is that that level of blur is really acceptable; not many 2D games scroll as fast as the Sonic games do so often and consistently, and if occasional speedy sequences happen in many games, it's not a catastrophe as long as your monitor has decent pixel response and doesn't make things worse by adding lots of smearing.
Picture settings matter too. For instance, the latest HLSL defaults are an absurd horror, apparently deliberately adding tons of blur and color smearing; talk about misleading crap (not looking even close to any decent RGB low-res CRT monitor/TV).
Instead, use integer-scaled, sync-locked, lightly filtered settings to compare, and you'll see motion clarity is worlds apart. I'm always telling people: "You own a 60Hz monitor? Ask it to display content it's OK with." Since it will blur things to a degree anyway, why add heavy, blurry, color- and contrast-destroying filters/shaders? They look nice when still, but the moment things start moving they get in the way.

Still, the problem with how people present things now is that since strobing has come out of the niche/custom woods and become a commercial thing, and they've spent big bucks on those new monitors, they've labeled everything that came before an absurd load of useless, unwatchable junk, like it's burning their eyes.
You've become what I call cutting-edge-owner perfectionists (a common trait of any PC hardware addict since computers became a mass-consumer market).
I understand people should always demand better/best, but that doesn't mean it's the only equipment level that makes sense in any situation.

So, I'm not as radical as you guys are; IMO you're dramatically exaggerating things, again like most people do after they've got their hands on better hardware.
I'm just saying, sorry for the redundancy, that 60Hz displays are still more than acceptable for most people when it comes to arcade/console 2D games, and yes, even for the reasonably demanding crowd, as long as we're talking decent-performance monitors and game/emulator settings that make sense.
Frankly, even 1080p, which is considered low resolution today, is okay if you accept that it's not enough to use the fanciest shaders and effects to their full potential; there are decent alternatives anyway, and they make sense in the 60Hz world.

You might not see where I'm standing here, but I'm talking about 'product + use' logic vs. 'always more!' logic; I believe it's a thing, and kind of a sacrilege maybe, but heh...
Or shall we always recommend everyone buy a multi-thousand-buck setup to play everything, including retro games and emulators?
 