GSYNC worth it?

I have a G400s; from the specs I think it's up to a 1000 Hz polling rate?
 

Yes, it is, even without installing any drivers.

Low framerates are still going to look low, especially if you're used to triple digits. IIRC G-Sync doesn't work below 30 or 40 FPS. If you're on the PG278Q, I think it looks best when you're around 70-100 FPS. It does make lower framerates more bearable, though. It's true what previous posters say: they don't even pay attention to FPS numbers anymore because it generally doesn't matter.

One thing to keep in mind is you may also have to invest in a better mouse. I would definitely recommend getting a mouse that offers a polling rate of at least 500 Hz, as movement can get jerky at anything lower than that.
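Rough way to picture why, with made-up numbers (I'm assuming a 144 Hz panel here, nothing measured): just count how many mouse reports land inside each displayed frame at different polling rates.
[code]
# Counts how many mouse reports arrive during each displayed frame
# on an assumed 144 Hz panel, for a few common polling rates.
REFRESH_HZ = 144

def reports_per_frame(poll_hz, frames=10):
    """For each of the first `frames` frames, count mouse reports that land in it."""
    frame_ms = 1000.0 / REFRESH_HZ
    poll_ms = 1000.0 / poll_hz
    counts = []
    for f in range(frames):
        start, end = f * frame_ms, (f + 1) * frame_ms
        counts.append(sum(1 for k in range(poll_hz) if start <= k * poll_ms < end))
    return counts

for hz in (125, 500, 1000):
    print(f"{hz:>4} Hz polling: {reports_per_frame(hz)}")

# At 125 Hz some frames get zero new reports (the cursor data is stale for that
# frame), while 500 Hz and 1000 Hz give every frame at least 3 and 6 fresh
# reports respectively, which is why movement stops looking jerky.
[/code]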

I'm not convinced by the "G-Sync doesn't work below 30 fps" claim that I see thrown around. I can't see any tearing or stutter below 30 fps with G-Sync on; it looks like low FPS of course, and the LCD blur becomes atrocious, but it still looks clean to me. I'm not saying the screen drops below 30 Hz (which I assume would look horrible), but it could run at, say, 50 or 100 Hz for 25 fps, etc. But I don't know, maybe the flickering in some loading screens is actually related to the super low framerate & refresh rate.
 
Why wouldn't g-sync work below 30 fps?

G-Sync's lower cap is 30 fps. You won't get any tearing below 30 fps, but you can get stuttering, not that you would really be able to tell at that low a frame rate.

It's because liquid crystals will decay back into their natural state if they aren't given power. You'll start to notice this decay at about 30 Hz, so they put the lower cap at 30 fps.

If your LCD were only being updated at 15 Hz it would constantly be fading to black before each refresh. And each color has a different decay rate, so the image would look really garbled too.
 
I'm honestly not sure the monitor goes down to 30-40 Hz; wouldn't that look pretty bad? I can't say that playing at 35-45 fps looks tiring or flickery.

Why couldn't it just run at 60-80 Hz instead? V-synced 60 fps on 120 Hz works just fine, for example.
 
Mmm, I'm going to have to go visit my local computer store to see if they have any G-Sync monitors available to play with. Very much tossing up between a 27" G-Sync monitor and a 34" ultrawide Dell.
 
I've gone as low as 15-20 fps in super-modded Skyrim (more CPU limited than anything) as well as 5K DSR Alien Isolation (because SLI + G-Sync + DSR = only 2/3 work on the Swift right now). I didn't see any flickering. I have, however, noticed it only in loading screens where the frame rate may drop to almost nothing. So I can say that G-Sync does work below 30 fps. But the input delay feels atrocious when you're used to ~100 fps.
 

G-Sync does not work below 30 fps. Below 30 fps it will simply redisplay the same frame. You won't notice flickering because it's still outputting at 30 Hz, but G-Sync is effectively off, as it is not making the monitor wait for the next frame; it redraws the same one, which can cause stutter.
 
No one is talking about its response time. The wonks are saying that IPS has higher latency and therefore cannot be as fast as the Swift... but it should still be plenty fast.

I bought mine as a second, gaming-only monitor so speed is paramount. I don't really worry too much about perfect color in games.
 
Crappy color quality makes graphics look much worse, as does color shifting. I'd also add that the much worse dark contrast puts you at a disadvantage in any game with shadows/darker spots, honestly. For me TN, even the new 28 inch 4K ones that are improved, is still an absolute no-go. I'm currently running an Acer B326HK 32 inch IPS 4K 60 Hz DP 1.2 monitor. The only improvement I could even want now would be to add G-Sync to that mix. Before that I used an X-Star DP2710 Plus (2560x1440, 110 Hz) for a long time, and a Dell UP2414Q 4K 60 Hz 24 inch for several months. To me, even for gaming, using a TN is like slamming your graphics settings down sharply... it is just that much worse a picture compared to a good IPS.
 
I don't really get the G-Sync/FreeSync hype. It sounds like cool tech, but are that many people really suffering from tearing? I turn v-sync off in almost every game and haven't seen tearing in years. Is it only in FPS games (which I generally don't play)?
 

It isn't just tearing. When the framerate isn't equal to the refresh rate (or an even divisor of it), frames get dropped or repeated, which is ugly, and it has always bothered me way more than tearing, which is pretty minimal at high refresh rates and/or framerates.

And the beauty of G-Sync is how it's always smooth, even when your framerate is fluctuating a lot (which WILL happen in demanding games with maxed settings).
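If you want to see the drop/repeat thing concretely, here's a tiny sketch with made-up numbers (a 50 fps source on a fixed 60 Hz panel, nothing captured from a real game):
[code]
from collections import Counter

FPS = 50.0          # source framerate (illustrative)
REFRESH = 60.0      # fixed refresh rate of the panel (illustrative)

frame_ready = [i / FPS for i in range(20)]        # when each rendered frame is finished
refresh_start = [i / REFRESH for i in range(24)]  # when each panel refresh begins

# With v-sync on a fixed-rate panel, each refresh shows the newest frame that
# was finished before it started, so some frames get held for two refreshes.
shown = []
for t in refresh_start:
    ready = [f for f, ft in enumerate(frame_ready) if ft <= t]
    shown.append(ready[-1] if ready else None)

print("refreshes spent on each frame:", Counter(shown))
# Most frames are held for one refresh (16.7 ms) but every so often one is held
# for two (33.3 ms); that uneven cadence is the judder. A variable-refresh panel
# just refreshes when a frame arrives, so every frame is held for an even 20 ms.
[/code]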
 

That makes sense, though I've never noticed that. Now I probably will though. ;)

I think I'll have to see one in action to see the difference.
 

It really depends on the games you are playing, but in the vast majority of games you'll get a benefit from G-Sync. Only in games where you can get a constant 120/144 fps with no fluctuating frame rate will G-Sync be not so useful.
 
Crappy color quality makes graphics look much worse, as does color shifting. I'd also add that the much worse dark contrast puts you at a disadvantage in any game with shadows/darker spots, honestly. For me TN, even the new 28 inch 4K ones that are improved, is still an absolute no-go. I'm currently running an Acer B326HK 32 inch IPS 4K 60 Hz DP 1.2 monitor. The only improvement I could even want now would be to add G-Sync to that mix. Before that I used an X-Star DP2710 Plus (2560x1440, 110 Hz) for a long time, and a Dell UP2414Q 4K 60 Hz 24 inch for several months. To me, even for gaming, using a TN is like slamming your graphics settings down sharply... it is just that much worse a picture compared to a good IPS.

I'll agree to an extent, but much of what you're describing re: contrast is, in my experience, more related to the crap viewing angles. While the color isn't in the superb category, it certainly isn't awful either, and I see nothing that could be described as degraded graphic fidelity (graphics =/= color). I work with color for a living, FWIW.

My opinion, as an owner of the Swift, is that the tradeoff is worth it for fast-action FPSs. Other game types, maybe not so much. Anyone considering a Swift who has the means should at least give one a test drive, because G-Sync is a truly, no-joke amazing tech and will enhance your experience that much.
 

I agree. Most of my play time is spent on FPS games. I went from the PB278Q PLS to the Swift and I don't regret it one bit.
 
See, I can't understand this way of thinking. The display is the portal to everything you do on your computer. Ultimately, if I didn't have a choice, I'd rather run a slower GPU with lower settings plugged into a higher quality monitor than run a next-level beast GPU through a small, poor quality screen.

Because not everyone has the cash to throw at a 600-800 dollar monitor. It's really that simple.
And then there are all the notorious defects with the Swift. Some people have done 3-4 RMAs, and that's not an extreme case. Don't really understand why this is difficult to comprehend :)

To OP: I'd wait for a while. Adaptive Sync, the underlying tech behind both G/F-Sync, is a big deal. But you're right to question the sky-high prices and the (very) limited availability right now.

Personally, I'd prefer to buy a monitor that isn't bound to any specific GPU. Asus' 120 Hz IPS display has DP 1.2a, which has A-Sync as an option and AMD said that it would work with their GPUs.

I imagine Nvidia will be forced to support the open technology at some point. G-Sync has its own algorithms, but their margin for individuality, if we put it like that, is relatively slim. So you probably don't have to worry about locking yourself into either red or green (if you're like me, at least, and typically switch between both depending on where in the upgrade cycle I am and which has the best price/perf ratio).

P.S. It's interesting to read several commenters here who say that the technology works best in the 70-100 fps range. PCPer.com's Ryan Shrout has stated that 4K is really the more natural realm for G-Sync/FreeSync because it's in the lower frame rates that the effect is the biggest.
 
Because not everyone has the cash to throw at a 600-800 dollar monitor. It's really that simple.
And then there are all the notorious defects with the Swift. Some people have done 3-4 RMAs, and that's not an extreme case. Don't really understand why this is difficult to comprehend :)
How many people have not posted their good experience with the display compared to those who have had a poor one? Going by some of the posts here I would be led to believe that coil whine is a problem with all GTX 970s regardless of manufacturer, but I have had no such issues with mine. The number of complaints about monitors from batches that started shipping at the end of October has been far lower than from the earliest batches. One person's experience can't speak for everyone, just as the experience of those people posting on the internet can't speak for all purchasers. Regardless, LCD monitors are always a lottery. As the number of parts goes up, so does the number of defects that slip through manufacturing tolerances.

To OP: I'd wait for a while. Adaptive Sync, the underlying tech behind both G/F-Sync, is a big deal. But you're right to question the sky-high prices and the (very) limited availability right now.

Personally, I'd prefer to buy a monitor that isn't bound to any specific GPU. Asus' 120 Hz IPS display has DP 1.2a, which has A-Sync as an option and AMD said that it would work with their GPUs.

I imagine Nvidia will be forced to support the open technology at some point. G-Sync has its own algorithms, but their margin for individuality, if we put it like that, is relatively slim. So you probably don't have to worry about locking yourself into either red or green (if you're like me, at least, and typically switch between both depending on where in the upgrade cycle I am and which has the best price/perf ratio).
Adaptive-Sync support is technically bound to the hardware because the drivers have to be written to support it. Whether NVIDIA will add it to their drivers is yet to be seen, so for now it is effectively bound to a specific GPU, with AMD's FreeSync currently the only software offering that works with Adaptive-Sync. Open standards are good, but let's not jump the gun here. We don't even know yet how Adaptive-Sync's software reliance will compare to the hardware interface approach of G-Sync.

P.S. It's interesting to read several commenters here who say that the technology works best in the 70-100 fps range. PCPer.com's Ryan Shrout has stated that 4K is really the more natural realm for G-Sync/FreeSync because it's in the lower frame rates that the effect is the biggest.
I was the first to provide this comment, and you left out one important qualification:
If you're on the PG278Q, I think it looks best when you're around 70-100 FPS.
The Acer XB280HK is using a completely different panel at a different native resolution. But I'd still like to respectfully disagree. One of the complaints I sometimes hear from those advocating for 30 FPS caps is that the fluctuation in framerate is lessened, which makes the experience "appear" smoother to them. While technically true, this also means the assertion that G-Sync shines at lower framerates is untrue. Since framerates have the potential to vary more wildly in the upper reaches, G-Sync makes that experience better because it reduces the stutter and input lag that occur when those changes happen.
 
That's not quite correct. If you can cap your framerate slightly below 144 you don't get any input lag, and it's much better than v-sync. So a game like Quake (which is best played at 125 fps) or Jedi Academy (best played at 100 fps) benefits immensely from G-Sync: perfect picture with no input lag (too bad we can't combine ULMB with it, though).

Quake is best played at 250 fps, not 125 fps. They (somewhat) recently increased the frame cap from 125 fps to 250 fps (Quake Live).
 
I don't think they fixed the footstep bug at 250 fps, though I haven't been following Quake Live news too closely as of late. It's a bit more responsive, but 125 fps / 125 Hz + G-Sync looks a bit better. 250 fps + 120 Hz + ULMB isn't too bad either, but I really enjoy the smoothness of G-Sync more than anything.
 
I don't think they fixed the footstep bug at 250 fps, though I haven't been following Quake news too closely as of late.
I don't blame you. Quake Live has mutilated the Quake III formula. Basically they lowered the skill ceiling to appeal to casuals, while alienating the original fan base.
 
Quake is best played at 250 fps, not 125 fps. They (somewhat) recently increased the frame cap from 125 fps to 250 fps (Quake Live).

Are there any displays that can actually output 200+ Hz at respectable resolutions?
Even my top-end CRT from 15 years ago barely hit 200 Hz at 800x600 IIRC.
 
Nope. TN panels could technically get that high without ghosting problems. OLED supposedly has the potential for "unlimited" refresh rate. I think the last CRT monitor I had could do about 200 Hz at 800x600, too.
 
For a game like Quake, it's more about responsiveness than silky smoothness. I play at 160 Hz with 250 fps, and I'd take this any day over 160 Hz with G-Sync, as the higher framerate means there is less overall input lag.
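Back-of-envelope arithmetic on why (my own numbers, not measurements):
[code]
def frame_time_ms(fps):
    """Interval between rendered frames, in milliseconds."""
    return 1000.0 / fps

for fps in (160, 250):
    print(f"{fps:>3} fps -> a new frame every {frame_time_ms(fps):.2f} ms")

# 160 fps: 6.25 ms between frames; 250 fps: 4.00 ms. On average, the most
# recent frame the panel grabs is about half a frame interval old, so at
# 250 fps the image on screen reflects input that is roughly 1 ms fresher.
# Small, but that is the "less overall input lag" I mean, before counting
# any extra waiting a sync method adds.
[/code]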
 
OLED supposedly has the potential for "unlimited" refresh rate. I think the last CRT monitor I had could do about 200 Hz at 800x600, too.

OLED can theoretically deliver an infinite contrast ratio, not refresh rate. That said, the theoretically possible refresh rate of OLED monitors - WHEN someone actually chooses to go beyond 60-75 Hz - is supposed to be on the order of thousands of Hz.
 

That is far off in the distance. Maybe 5 years or more before we see anything beyond a 144 Hz refresh rate. Bandwidth is the issue, as well as display technology.

OLED does have the benefit of something like 0.001 ms pixel response, which is a huge advantage over 1 ms in the fastest gaming displays.

Maybe with the new MHL standard that can do 8K at 120 Hz we finally have the bandwidth we've needed, and manufacturers won't have a reason not to go higher.
 
Keep in mind that with higher refresh rates, you're gonna need more brightness. For example, the luminance of each refresh in a 1000 Hz display will need to be 10 times greater than that of a 100 Hz display, if each refresh is going to appear equally bright. Within a certain duration window, the human visual system integrates luminance over time, and you can exchange luminance and duration equally. So, a 1 ms pulse of light at 100 cd/m^2 will appear to be the same brightness as a 10 ms pulse of light at 10 cd/m^2.
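Same tradeoff as a tiny worked example, assuming a strobed backlight that fires one short pulse per refresh (illustrative numbers only, and whether the per-pulse or per-second-average framing is the right one gets argued a few posts down):
[code]
def perceived_brightness(luminance_cd_m2, pulse_ms):
    """Within the integration window, apparent brightness tracks luminance x duration."""
    return luminance_cd_m2 * pulse_ms

print(perceived_brightness(10.0, 10.0))   # 10 cd/m^2 for 10 ms -> 100
print(perceived_brightness(100.0, 1.0))   # 100 cd/m^2 for 1 ms -> 100 (looks the same)

# So if each refresh is a single short strobe pulse, cutting the pulse from
# 10 ms to 1 ms means the luminance during the pulse has to go up 10x for the
# pulse itself to look as bright.
[/code]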
 
For me, it'll be worth it when it's on an IPS panel that is 34" with a 21:9 aspect ratio (or larger, preferably). Until then, I'll just hold tight... my guess is ~18 months (pure guess).
 
Armenius, I largely agree with your points, except on the RMA issue. In Sweden at least, the quality control issues have been enormous. And I don't really think it's easy to dismiss them as a one-country issue; plenty of people in NA had the same problems.

The fact that this is seen as normal says a lot about the lowered expectations that people have internalised when it comes to LCD displays. That's not the fault of the Swift, but it doesn't fundamentally alter the initial point.


That is far off in the distance. Maybe 5 years or more before we see anything beyond a 144 Hz refresh rate. Bandwidth is the issue, as well as display technology.

OLED does have the benefit of something like 0.001 ms pixel response, which is a huge advantage over 1 ms in the fastest gaming displays.

Maybe with the new MHL standard that can do 8K at 120 Hz we finally have the bandwidth we've needed, and manufacturers won't have a reason not to go higher.

What are the limits of the human eye's ability to perceive such low input lag? I mean, 1 ms is already extremely fast.
Would there be any discernible difference between, say, 0.1 ms and 0.001 ms? I'm asking because you can calculate how easy it is for (most) people to make out pixels using arcminutes and distance to the screen, as well as screen ratio & size. Is there a theoretical framework surrounding input lag too, do we know?
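For reference, this is the pixel calculation I mean, sketched with example numbers (a 27" 2560x1440 panel viewed from 60 cm; the ~1 arcmin figure is the usual 20/20 rule of thumb):
[code]
import math

def pixel_arcminutes(diagonal_in, res_w, res_h, distance_cm):
    """Angular size of one pixel in arcminutes for a flat panel."""
    width_in = diagonal_in * res_w / math.hypot(res_w, res_h)  # physical width
    pixel_cm = (width_in / res_w) * 2.54                       # one pixel, in cm
    angle_rad = 2 * math.atan(pixel_cm / (2 * distance_cm))    # subtended angle
    return math.degrees(angle_rad) * 60

print(f"{pixel_arcminutes(27, 2560, 1440, 60):.2f} arcmin per pixel")
# ~1.3 arcmin; 20/20 vision resolves roughly 1 arcmin, which is why pixel
# visibility has such a tidy geometric rule of thumb. I'm not aware of an
# equally clean threshold for perceiving input lag.
[/code]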
 
As much as I hate to contradict quite a few people in this thread, I'd like to share my quick experience with G-Sync and the ROG Swift. I read through the thread and went to check out a ROG Swift at a local computer store. They put on BF4 for me and let me have a play around.

My initial impressions were good. Coming from a U2713HM, using the ROG was quite pleasant. It was responsive and quite snappy, with no lag, and animations were smooth and crisp. I did notice that it looked a fair bit more washed out compared to my IPS, but that's to be expected.

That being said, I couldn't help but feel slightly underwhelmed. Yes, it was great, everything ran as it should, but it's not really a game changer. It didn't introduce any new experience or really add anything too substantial to my gameplay experience, *for me*. As much as I appreciated the smoothness, I felt that G-Sync/144 Hz was not a game changer for my use case.

I enjoy and play a mixture of games, and have sunk a fair amount of time into BF4, COD, and Insurgency. I can appreciate how input lag and screen tearing can affect one's experience. With that being said, I don't think G-Sync/FreeSync will be a determining factor in my next purchase. It would be nice, but I'm not going to rule out a monitor because it doesn't include those features. I am the type of person that likes to become immersed in my games, to get sucked in.

Using G-Sync was nice; however, it did not add enough value to my gameplay experience for me to come away thinking, yup, that was awesome, I would like that in my next monitor. I didn't use it and automatically go "Holy balls, this is incredible"; it was more of a "OK, this is nice, I can see why people would appreciate this", whereas when I loaded up BF4 on an LG UC97 I thought "This is seriously cool".

My thoughts about the Swift are as follows: as much as I enjoyed the crispness and smooth animation it offered, I still couldn't help but feel underwhelmed. It didn't offer me anything substantially different from what I was used to. I'm tossing up between a 34" monitor and a ROG, and using the ROG felt like only a minor upgrade. My gameplay experience coming from my U2713HM would be only marginally different, as opposed to opting for a 34" monitor, which would change the way I experience games quite drastically. I personally enjoyed the scope of the 34" monitor much more than I enjoyed the increased smoothness of the ROG. That's not to say the ROG isn't a fantastic monitor; what it does, it does very well, but for my own use case it is not as well suited. I consider myself a gamer and spend a lot of my time indulging that hobby, and today I found out that I personally value the immersive experience of a larger screen much more than I do a super responsive one.

As with other things, your mileage may vary. If you absolutely can't stand screen tearing, or you take FPS games very seriously, the ROG will no doubt serve you very well. If you play a variety of game genres and aren't too concerned with more than 1 ms of input lag, I wholeheartedly encourage you to consider one of the 34" monitors.
 
I used the Asus PG278Q at 144 Hz with G-Sync, backed by a GTX 980. The smoothness of everything was freaking amazing. Fast games like BF4, Heroes and Generals, War Thunder, etc. were amazing. Thing is, I can't say if it was the G-Sync tech, or going from 60 Hz to 144 Hz, or a combination of the two. I never did test the monitor at 144 Hz with G-Sync turned off to see if there was a noticeable difference; I was too busy gaming. In the end I sold it on eBay because I really, really like the real estate and colors of my LG 34" 3440x1440. The 27" Asus just seemed too small, but man, was the speed amazing.
 
That is far off in the distance. Maybe 5 years or more before we see anything beyond a 144 Hz refresh rate. Bandwidth is the issue, as well as display technology.

I know, but my hope is that it's closer to only 1-3 years away.
Seen this, by the way? http://forums.blurbusters.com/viewtopic.php?f=7&t=48
 
Keep in mind that with higher refresh rates, you're gonna need more brightness. For example, the luminance of each refresh in a 1000 Hz display will need to be 10 times greater than that of a 100 Hz display, if each refresh is going to appear equally bright. Within a certain duration window, the human visual system integrates luminance over time, and you can exchange luminance and duration equally. So, a 1 ms pulse of light at 100 cd/m^2 will appear to be the same brightness as a 10 ms pulse of light at 10 cd/m^2.

Is that why the EIZO FG2421 is 400 cd/m^2? I've seen as high as 450 cd/m^2.
With higher brightness I suspect it would be harder to get good gamma and contrast with the LED backlights.
I suspect G-Sync isn't a bad idea if that is the case.
 
OLED's perceived brightness should be much better than LCD's, though I don't believe I've read anything with number comparisons between the maximum theoretical luminance levels.
 
Keep in mind that with higher refresh rates, you're gonna need more brightness. For example, the luminance of each refresh in a 1000 Hz display will need to be 10 times greater than that of a 100 Hz display, if each refresh is going to appear equally bright. Within a certain duration window, the human visual system integrates luminance over time, and you can exchange luminance and duration equally. So, a 1 ms pulse of light at 100 cd/m^2 will appear to be the same brightness as a 10 ms pulse of light at 10 cd/m^2.

Not close to correct, because each refresh doesn't need to be of comparable brightness.
Otherwise the same would apply to all types of lighting that use higher frequency switching.

If you said there were still only 100 refreshes per second, and the ones that take 10 ns to complete need to produce the same average light output as the ones that take 1 ms, then you would have a point.
But when the number of refreshes increases, well, there are more of them :)
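To put rough numbers on it (purely illustrative figures, same pulse length and per-pulse brightness at both rates):
[code]
PULSE_CD = 100.0     # luminance during each strobe pulse (made up)
WINDOW_MS = 10.0     # window the eye integrates over for this comparison

def light_in_window(refresh_hz, pulse_ms):
    """Luminance x time delivered inside the window (cd/m^2 * ms)."""
    pulses_in_window = WINDOW_MS * refresh_hz / 1000.0
    return pulses_in_window * PULSE_CD * pulse_ms

print(light_in_window(100, 1.0))    # 100 Hz, 1 ms pulses  ->  100
print(light_in_window(1000, 1.0))   # 1000 Hz, same pulses -> 1000

# Ten times as many identical pulses land in the same window, so the faster
# display delivers ten times the light (or can run each pulse dimmer); the
# per-pulse brightness doesn't have to scale with the refresh rate.
[/code]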
 
It will be worth it when it starts working in ULMB mode. Otherwise it's not that amazing, because at low framerates you will not get any tearing, but you will get rather crappy motion resolution. It also needs 120 Hz VA monitors to be truly worth it.

The Eizo Foris FG2421 is the only worthy monitor these days, but there are some bad units out there that have overly low gamma on the right and left edges.
 

:rolleyes:
 