GSYNC worth it?

dogbyte_13

[H]ard|Gawd
Is the premium price tacked onto a monitor with G-Sync worth it, or should I wait for prices to come down? Does it make a real, noticeable difference compared to adaptive V-Sync, even on a low-input-lag monitor?
 
If you have a nice 144Hz monitor I would say no, unless the price doesn't bother you.
I get a reduction in stuttering and tearing in almost all my games with my BenQ XL2411Z.
I think if I had to do it over I would go for that 144Hz 24" LG.
 
At this point, you may as well wait for G-Sync monitors with the logic built into the display controller rather than the current FPGA implementation. That will drop the price down to the projected price of DP Adaptive-Sync monitors (as the hardware to drive the panel is identical for both methods).
 
If you have a 60Hz monitor I wouldn't wait; just buy any high-refresh monitor now, then get an expensive G-Sync one later.
 
Thanks for the input. I really can't justify spending $600 on a monitor when it costs almost half of what I paid for my system.
 
IMHO it is worth it if you spend most of your time gaming. 1440p @ 144Hz (27" Asus Swift) is awesome.

FPS, RTS, MOBA, you name it, I play them all and they all benefit from G-Sync.

My monitor has been another great buy that complements my setup and will shine on my desk for many years to come.

i7 920 @ 4.0GHz, 12GB DDR3-1600, Asus P6T Deluxe, 120GB SSD, 2x EVGA 680 SLI (reference cards), Asus ROG Swift PG278Q

I honestly mostly game and watch movies. As much as I want to upgrade my core components, they continue to serve me well.
 
Is the premium price tacked onto a monitor with G-Sync worth it, or should I wait for prices to come down? Does it make a real, noticeable difference compared to adaptive V-Sync, even on a low-input-lag monitor?
We don't know yet how adaptive sync will compare to G-sync. Adaptive sync monitors are just around the corner, so if you have any doubt you will know for sure in the coming months which way to go.

That said, G-sync is a thing of beauty. This technology is the biggest game-changer for gaming I have seen in a long time. Input lag is reduced to near-imperceptible levels and screen tearing is completely eliminated. It has reignited my spark for gaming on the PC platform. Because of this I hope that Adaptive sync will be just as good so there is market competition to drive the price of G-sync down. The current $200 price premium is too much for most people to justify. Personally, I think it was worth every penny as an "early-ish" adopter.
 
If you have a nice 144Hz monitor I would say no, unless the price doesn't bother you.
I get a reduction in stuttering and tearing in almost all my games with my BenQ XL2411Z.
I think if I had to do it over I would go for that 144Hz 24" LG.

G-Sync makes a huge difference compared to just 144/120hz. It's night and day.

If you don't mind the Nvidia-exclusive part, G-Sync is totally worth it. The only problem is that the current G-Sync monitors are pretty mediocre. You can get a good Swift like I did, but it's a high price to pay and it's still far from perfect (TN and all that) :/ So I don't disagree with the people who say "wait", but at the same time I've been waiting so long already for a half-decent LCD that I don't regret getting the Swift. And I have serious doubts that it will be beaten any time soon, though I REALLY hope to be proven wrong.
 
Yes, it's worth it and very noticeable (Asus ROG Swift). I was blown away and got rid of my QNIX 27" IPS and Eizo "240Hz" FG2421.

Whether or not the technology is worth the $300 will be different for everyone. Personally, I no longer care about my fps reaching triple digits, which is a very nice relief.
 
I'd argue G-Sync is one of, if not the, most important features of a gaming monitor at the moment. It's that much of a game changer for perceived smoothness (frame drops are not felt nearly as badly), image clarity (no screen tearing), and reduction of input lag (no need for V-Sync). If I were forced to choose for my gaming monitor, I'd personally rather have G-Sync than 144Hz.
 
G-Sync is awesome, and really makes high refresh rates worthwhile. Previously (with V-Sync) you needed to maintain over 120fps *minimum* or you got stutter (dropping to 60Hz periodically). So you needed a beefy machine or (in my case for a while) had to play games on the lowest possible settings to try to reach 120fps.

Now, with G-Sync, you can have a moderate machine, or a high-end machine at higher settings, get around 100fps, and it looks butter smooth. It really is a sight to behold. Unlike some other features that may (or may not) be gimmicks, I think everyone who has seen G-Sync can agree it is a real step forward.
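The stutter mechanism described above can be sketched with a few lines of arithmetic. This is an illustrative model, not a measurement: under double-buffered V-Sync a finished frame has to wait for the next fixed refresh tick, so a render time just over the tick interval doubles the displayed frame time, while a variable-refresh panel shows each frame as soon as it is ready.

```python
# Illustrative numbers only: a sketch of why V-Sync stutters once fps
# dips below the refresh rate, while a variable-refresh panel does not.
import math

REFRESH_HZ = 120
TICK = 1000 / REFRESH_HZ          # refresh interval in ms (~8.33)

def vsync_display_interval(render_ms):
    """Double-buffered V-Sync: a finished frame waits for the next tick."""
    return math.ceil(render_ms / TICK) * TICK

def gsync_display_interval(render_ms):
    """Variable refresh: the panel refreshes the moment the frame is ready."""
    return render_ms

for render_ms in (8.0, 9.0, 12.0):
    print(f"{1000 / render_ms:5.1f} fps render -> "
          f"V-Sync shows every {vsync_display_interval(render_ms):5.2f} ms, "
          f"G-Sync every {gsync_display_interval(render_ms):5.2f} ms")
```

Note how a 9 ms render (111 fps) falls all the way to a 16.67 ms displayed interval (60 Hz) under V-Sync, which is exactly the "dropping to 60Hz periodically" stutter described above.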

Whether that justifies a 2x increase in the sticker price is a conversation between you and your wallet.
 
I'd just wait till next year for prices to drop.

edit: erm, later this year
 
I like the concept behind G-Sync and FreeSync. There are just a couple of issues:
  • I just purchased a new Acer H6 H276HLbmid Black 27" 5ms IPS monitor in April 2014. It replaced an ASUS VK266H Black 25.5" 2ms monitor, which I had purchased in February 2009 and gave to my mother when she got a new computer. So, unless something breaks, it's going to be a while.
  • Because the Acer monitor lacks VESA holes, I ended up purchasing a BenQ GL2760H Black 27" 2ms monitor for work. So, I ended up cornering myself!
  • I prefer a 27" monitor, and the price premium for G-Sync results in monitors that cost $600-$800.
  • The current crop of products is essentially first generation. I'll let the early adopters take the hit and wait until third generation and the resulting technological improvements and bug fixes.
FWIW... I looked at the price history, and...
ASUS VK266H - $424 (February 2009)
Acer H6 H276HLbmid - $232 (April 2014, as a daily special)
BenQ GL2760H - $176 (November 2014, as a Black Friday special)
 
It's totally worth the cost if you do a lot of gaming on your PC. I never thought a monitor could make that much of a difference to your gaming experience, but it truly does. Not having any tearing or input lag at high refresh rates is unbelievable until you've experienced it. Games move so smoothly it's almost dizzying at first if all you're used to is 60Hz with V-Sync.

My advice is to wait for a refurb or another deal if you simply cannot justify the cost for a mediocre display with a killer feature, which is what all the current G-Sync displays really are at the moment. Personally, I was going to drop $800 on a ROG Swift until I managed to get a 27" Acer with G-Sync for $350 refurbished. The deals are there, you just have to be lucky. :)
 
It would not hurt to wait until FreeSync (AMD's version of G-Sync) comes out and see how it performs compared to G-Sync.

However, G-Sync works very well, so it's not a bad thing to have it right now.

But, with new G-Sync monitors (e.g. LG's 144Hz IPS 1440p) coming later this year, I would at least wait for that before jumping. I really like the Swift, and have no intention of going back, but at this moment I would recommend waiting just a little longer.
 
I think the answer will vary from person to person depending on what games you play. I've seen some say that they felt no difference with G-Sync, while others claim it's night and day. Here's the thing: what does G-Sync do? As far as I know, the point of it is to get rid of screen tearing like V-Sync, but without the drawbacks, so no stuttering. But what if you're someone who plays a bunch of games like League of Legends or CS:GO where your frame rates are sky high, in the hundreds? At that point G-Sync behaves like traditional V-Sync locked at 144Hz, so you wouldn't even notice a difference between G-Sync and V-Sync. And for those claiming that G-Sync is still better than V-Sync because it won't add input lag: sorry, but Blur Busters showed that there is still some added input lag from G-Sync once you've hit the refresh rate cap: http://www.blurbusters.com/gsync/preview2/

In short, G-Sync is worth it if you get fluctuating frame rates, but if you are playing games where your frame rate is so high that it always hits the refresh rate cap, then no, it is not worth it.
 
I think the answer will vary from person to person depending on what games you play. I've seen some say that they felt no difference with G-Sync, while others claim it's night and day. Here's the thing: what does G-Sync do? As far as I know, the point of it is to get rid of screen tearing like V-Sync, but without the drawbacks, so no stuttering. But what if you're someone who plays a bunch of games like League of Legends or CS:GO where your frame rates are sky high, in the hundreds? At that point G-Sync behaves like traditional V-Sync locked at 144Hz, so you wouldn't even notice a difference between G-Sync and V-Sync. And for those claiming that G-Sync is still better than V-Sync because it won't add input lag: sorry, but Blur Busters showed that there is still some added input lag from G-Sync once you've hit the refresh rate cap: http://www.blurbusters.com/gsync/preview2/

In short, G-Sync is worth it if you get fluctuating frame rates, but if you are playing games where your frame rate is so high that it always hits the refresh rate cap, then no, it is not worth it.

That's not quite correct. If you can cap your framerate slightly below 144 you don't get any input lag, and it's much better than V-Sync. So a game like Quake (best played at 125fps) or Jedi Academy (best played at 100fps) benefits immensely from G-Sync: a perfect picture with no input lag (too bad we can't combine ULMB with it, though). If the game has no built-in framerate limiter (and the framerate is always higher than 144) then yeah, G-Sync will do no good compared to V-Sync.

But that's the thing about G-Sync: the flexibility it offers is amazing. Whatever game you throw at it, you're guaranteed a perfect picture. Even if the game runs at some weird-ass locked framerate, or that framerate fluctuates a lot, you'll be fine. Even games hardlocked at 60fps (commonplace these days) feel really good on a G-Sync monitor.
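The capping trick described above can be sketched as a toy frame limiter. This is illustrative only: the `CAP_FPS` value and the sleep-based pacing are assumptions for the sketch, not how Quake's built-in limiter or any driver actually does it.

```python
# Illustrative sketch of capping slightly below the panel's maximum
# (141 on a 144 Hz G-Sync panel here), which keeps every frame inside the
# variable-refresh window so the monitor never falls back to V-Sync-style
# waiting at the cap.
import time

CAP_FPS = 141                      # just under the 144 Hz ceiling
FRAME_BUDGET = 1.0 / CAP_FPS       # seconds per frame

def run_capped(n_frames, render):
    """Render n_frames, sleeping off whatever is left of each frame budget."""
    intervals = []
    prev = time.perf_counter()
    for _ in range(n_frames):
        render()
        left = FRAME_BUDGET - (time.perf_counter() - prev)
        if left > 0:
            time.sleep(left)       # crude; real limiters spin-wait for accuracy
        now = time.perf_counter()
        intervals.append(now - prev)
        prev = now
    return intervals

# A trivially fast "render" ends up paced at ~141 fps instead of uncapped.
ivals = run_capped(30, render=lambda: None)
print(f"avg frame time {sum(ivals) / len(ivals) * 1000:.2f} ms "
      f"(budget {FRAME_BUDGET * 1000:.2f} ms)")
```

With every frame arriving at ~7.1 ms or slower, the panel can refresh on each frame's arrival instead of hitting its 144 Hz ceiling.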
 
Why did it take so long for this technology to arrive? Refreshing the screen only when the videocard has a new frame to deliver sounds very natural.
 
Why did it take so long for this technology to arrive? Refreshing the screen only when the videocard has a new frame to deliver sounds very natural.

Nvidia did not include DisplayPort as a standard output on their GPUs until Kepler with the 6xx generation, which coincidentally is the first generation of their GPUs to support this.

DisplayPort, unlike previous mainstream display connections, is packet-based, which allows the dynamic data output needed for something like this. DVI, for example, operates on a fixed timing cycle for updates that cannot be varied on the fly.

As such, I assume (from Nvidia's approach at least) that the market timing has only recently come together, with the requisite technology standards and adoption rates becoming suitable.
 
Why did it take so long for this technology to arrive? Refreshing the screen only when the videocard has a new frame to deliver sounds very natural.

If you trust Tom Petersen to tell the whole story, it's because:

monitors, for historic reasons, have fixed refresh rates at 60Hz. That’s because PC monitors initially used a lot of technology from TVs, and in the U.S. we standardized on a 60Hz refresh way back in the 1940s, around the time Ed Sullivan was still a fresh face. That occurred because the U.S. power grid is based on 60Hz AC power, and setting TV refresh rates to match that made early electronics easier to build for TVs. The PC industry just sort of inherited this behavior because TVs were the low-cost way to get a display.

So back at NVIDIA, we began to question whether we shouldn’t do it the other way. Instead of trying to get a GPU’s output to synchronize with a monitor refresh, what if we synchronized the monitor refresh to the GPU render rate?

...

Hundreds of engineer-years later, we’ve developed the G-SYNC module. It’s built to fit inside a display and work with the hardware and software in most of our GeForce GTX GPUs.

NVIDIA has not historically been a player in the display business (neither has any other GPU manufacturer, AFAIK) and has been basically dealing with the limitations inherited from displays. If your question is why they did not branch out beyond GPUs earlier to do this, I couldn't tell you. In my experience, however, companies usually spend most of their energies shoring up their 'turf' markets and excelling within those markets. They can and do branch out at times, but they usually need to be in a cash-rich state to do so. NVIDIA has been doing well in recent years, so that might be it.
 
Most likely the extra cash made it possible. Intel and AMD have had the underlying tech for this for years and were pushing for its implementation, but it seemed the only takers were in the laptop sector, mainly because it helped battery life. The rest of the monitor industry sat idle, with no desire to work on this for desktops or TVs until G-Sync. I guarantee it was the money, just as with laptops, that made adaptive sync come to fruition, rather than the desire to bring quality to frame rates.
 
G-Sync to me is similar to having too much VRAM. With too much VRAM I just never worry about it. G-Sync has me ignoring the frame rate counter now. It is absolutely worth it.
 
G-Sync to me is similar to having too much VRAM. With too much VRAM I just never worry about it. G-Sync has me ignoring the frame rate counter now. It is absolutely worth it.

I'm pretty sure if your frame rate drops below 30 you wouldn't be ignoring it, whether you had G-Sync or not. :p
 
No, but at least the tolerable frame rate buffer has increased, and thus the GPU power required to drive a game to acceptable frame rates has decreased.

That being said, I am not entirely sure G-Sync could save sub-30fps games even if it didn't have that lower limit.
 
G-Sync is the single greatest thing to happen to PC gaming since the demise of the CRT.

Absolutely, without hesitation, yes it is worth it.
 
If you haven't seen G-Sync in action then frankly it can't be explained or shown to you in a video. It really is the biggest upgrade to monitors in recent memory, finally addressing the age-old issue of V-Sync tearing. It also takes games with shitty engines that tend to stutter and provides almost a "smoothing" effect as a side benefit (the Metro games, for instance). If you can afford it then absolutely go for it. You'll likely have that monitor for a very long time anyway, and Nvidia won't stop supporting it. Just look at 3D Vision: it's basically dead, but they keep supporting it because of the community.

Adaptive Sync is really a big question mark. No one knows how well it'll work, given that it's entirely GPU-dependent versus a shared hardware structure like G-Sync's. I'm inclined to believe it'll be much more sensitive to driver updates, since all of the work is loaded onto the GPU and the monitor really doesn't do anything other than support a form of frame input not previously built in.

I think it'll end up like this.

Either:

1. G-Sync will thrive. Lots of monitors with G-Sync are getting announced, finally including 120Hz IPS G-Sync, which is a big deal. Adaptive Sync will have better support because of its lesser requirements, but it will be more touchy and dodgy because of the driver dependence, and it'll introduce input latency from the extra workload on the GPU, leaving one of the two V-Sync issues still a problem. To top that off, Nvidia will not support the AMD-level tech and will simply remain one of two options.

2. Adaptive Sync will get much broader support; it's cheaper by far and will remain so. It will stay strictly on AMD cards, since it doesn't benefit Nvidia to abandon G-Sync. It won't have many driver issues, because AMD will have made it a simple, low-priority process in the driver, making driver updates unlikely to break it. Adaptive Sync will eventually win out, and Nvidia will have to support it, but under some funky new name.

I think both standards will co-exist for a while. It might be years and years before one truly wins out. Nvidia can offer incentives to monitor manufacturers to keep stuffing it into monitors, and they can make the hardware more efficient and cheaper, making it a minor expense that will still be a "billed" feature for enthusiasts.
 
G-Sync is an absolute game changer for playing classic games on your PC.

Basically every video game console and arcade game ran at a non-standard refresh rate between 30Hz and 61Hz. Pac-Man runs at 61Hz. Mortal Kombat runs at 54Hz. Street Fighter 2 runs at 59.63Hz. DoDonPachi runs at 57Hz. R-Type runs at 55Hz.

That's why scrolling and movement are stuttery shit with V-Sync on traditional computer setups.

G-Sync just magically solves the problem universally. Every game is glass smooth because it can run at its native refresh rate. If you care at all about playing old console and arcade games on your PC, it's worth it for that alone, and that's ignoring all the benefits people have mentioned above.
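A quick sketch can put a number on that judder. The 57Hz figure is the DoDonPachi rate from the list above; the accumulator logic is illustrative, not an emulator's actual timing code.

```python
# Illustrative: show a 57 Hz game on a fixed 60 Hz panel and count how many
# source frames get held for an extra refresh (the visible judder). A VRR
# panel would instead run at 57 Hz and hold every frame exactly once.
SOURCE_HZ = 57
PANEL_HZ = 60

def doubled_frames_per_second(source_hz, panel_hz):
    """Count source frames shown for two panel refreshes in one second."""
    shown = []
    frame = 0
    for tick in range(panel_hz):              # one second of panel refreshes
        shown.append(frame)
        t = (tick + 1) / panel_hz             # time of the next refresh
        if t >= (frame + 1) / source_hz:      # next source frame ready by then
            frame += 1
    return sum(1 for a, b in zip(shown, shown[1:]) if a == b)

print(doubled_frames_per_second(SOURCE_HZ, PANEL_HZ), "doubled frames/sec")
# -> 3 doubled frames/sec: three times a second, scrolling visibly hitches.
```

Matching the source rate exactly (`doubled_frames_per_second(57, 57)` style, which is what VRR effectively does) gives zero doubled frames.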
 
Thanks for the input. I really can't justify spending $600 on a monitor when it costs almost half of what I paid for my system.

See, I can't understand this way of thinking. The display is the portal to everything you do on your computer. Ultimately, if I didn't have a choice, I'd rather run a slower GPU at lower settings plugged into a higher-quality monitor than run a next-level beast of a GPU through a small, poor-quality screen.

As for the thread question: my answer is to wait for the new IPS 1440p 144Hz monitors, or cross your fingers for a 120Hz 4K monitor at 32" or above (DP 1.3 can handle 4K at 120Hz). Just because G-Sync can flatter sub-45fps gaming doesn't mean you should be aiming for that as a target.
 
See, I can't understand this way of thinking. The display is the portal to everything you do on your computer. Ultimately, if I didn't have a choice, I'd rather run a slower GPU at lower settings plugged into a higher-quality monitor than run a next-level beast of a GPU through a small, poor-quality screen.

So true. I kid you not, my Swift makes me wanna play 24/7; seeing all my games run so perfectly smooth and with such sharpness (and very decent colours) is just unreal. And all that without the dreadful V-Sync input lag.
 
So true. I kid you not, my Swift makes me wanna play 24/7; seeing all my games run so perfectly smooth and with such sharpness (and very decent colours) is just unreal. And all that without the dreadful V-Sync input lag.

I'm in the same boat. It's just shocking how much better everything is. And my K/D in my FPS of choice has - no exaggeration - at minimum tripled versus my older 60hz display.
 
I'm in the same boat. It's just shocking how much better everything is. And my K/D in my FPS of choice has - no exaggeration - at minimum tripled versus my older 60hz display.

Yea I also find it easier to focus on aiming and such when I don't get any distracting tear lines or stutter (or mouse lag) etc. :)
 
shouldn't unless you're hitting >144fps

It does, but it's extremely minor. You wouldn't notice it; only a real input latency test would reveal a difference, and a very slight one at that.

This is one of the concerns about FreeSync. If G-Sync, with its added hardware taking the load off the GPU, still adds a slight amount, how much will the FreeSync approach add to the GPU's workload? If it's not also extremely low, that's going to be a real issue, and possibly a stay of execution for G-Sync long term.
 
G-Sync is a game changer. It's probably one of the biggest technological innovations since the introduction of the 3d rendering video card. Completely changes the experience for the better.
 
So basically G-Sync not only fixes age-old problems with displaying images, it has the added bonus of smoothing out lower fps counts.

It's starting to look like this is the start of the norm for monitors over the next few years.
 
So basically G-Sync not only fixes age-old problems with displaying images, it has the added bonus of smoothing out lower fps counts.

It's starting to look like this is the start of the norm for monitors over the next few years.
Low framerates are still going to look low, especially if you're used to triple digits. IIRC G-Sync doesn't work below 30 or 40 FPS. If you're on the PG278Q, I think it looks best when you're around 70-100 FPS. It does make lower framerates more bearable, though. Previous posters are right that they don't even pay attention to FPS numbers anymore, because it generally doesn't matter.

One thing to keep in mind is you may also have to invest in a better mouse. I would definitely recommend getting a mouse that offers a polling rate of at least 500 Hz, as movement can get jerky any lower than that.
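The jerkiness from a low polling rate comes down to simple arithmetic (illustrative only; real mice also differ in sensor behaviour, which this ignores):

```python
# Back-of-envelope on the polling-rate advice: at 144 Hz the panel draws a
# frame every ~6.9 ms, so a 125 Hz mouse (8 ms between reports) cannot
# deliver a fresh input sample for every frame, while 500/1000 Hz easily can.
FRAME_MS = 1000 / 144

for poll_hz in (125, 500, 1000):
    poll_ms = 1000 / poll_hz
    verdict = ("may repeat stale input" if poll_ms > FRAME_MS
               else "fresh input each frame")
    print(f"{poll_hz:4d} Hz mouse: {poll_ms:.1f} ms/report vs "
          f"{FRAME_MS:.1f} ms/frame -> {verdict}")
```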
 
i'm with a lot of other people in this thread, g-sync is a game changer. it's just phenomenal. i don't regret one cent of my Swift purchase.
 
Low framerates are still going to look low, especially if you're used to triple digits. IIRC G-Sync doesn't work below 30 or 40 FPS. If you're on the PG278Q, I think it looks best when you're around 70-100 FPS. It does make lower framerates more bearable, though. Previous posters are right that they don't even pay attention to FPS numbers anymore, because it generally doesn't matter.

One thing to keep in mind is you may also have to invest in a better mouse. I would definitely recommend getting a mouse that offers a polling rate of at least 500 Hz, as movement can get jerky any lower than that.


The mouse callout is a great one; I'd push it even further and insist on a 1000Hz polling rate to ensure everything stays smooth.

Re: framerate slowdowns, you can definitely see/feel it when frames drop below 70/80.
 