GSYNC worth it?

Not close to correct, because each refresh doesn't need to be of comparable brightness.
Otherwise the same would apply to all types of lighting that use higher-frequency switching.

If you said there were 100 refreshes per second, and those that take 10 ns to complete need to produce the same average light output as those that take 1 ms, then you would have a point.
But when the number of refreshes increases, well, there are more of them :)

I'd definitely agree with you if the sole purpose of a display was to provide a source of illumination.

But what about when you're rapidly panning, and now each of those refreshes contains different pixel information? My brain's not awake enough right now to say for sure, but it seems as if in that situation the image would suffer a loss of quality.

I know that at any given refresh rate, if you implement a strobed backlight (e.g. lightboost), then the shorter the duty cycle the more "boost" you need for each light pulse, but this is a slightly different question from the refresh vs required luminance issue.
 
I'd definitely agree with you if the sole purpose of a display was to provide a source of illumination.
How is it different in this context?

But what about when you're rapidly panning, and now each of those refreshes contains different pixel information? My brain's not awake enough right now to say for sure, but it seems as if in that situation the image would suffer a loss of quality.
You still get the same amount of light to the eye in the same time period without any increase in brightness.
Scaling up to infinite refresh rate, by your reckoning, would need infinite brightness.

I know that at any given refresh rate, if you implement a strobed backlight (e.g. lightboost), then the shorter the duty cycle the more "boost" you need for each light pulse, but this is a slightly different question from the refresh vs required luminance issue.
Very different situation.
The frequency remains the same; the time period the light is on is reduced.
It's easy to see why a brighter light helps.
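To put some rough numbers on that (a quick Python sketch; the luminance figures are made up for illustration):

```python
# Strobed backlight: to keep the same *average* light output while the
# backlight is only on for a fraction of each refresh, the pulse has to be
# proportionally brighter.  Illustrative numbers only.

def required_pulse_luminance(target_average_cd_m2, duty_cycle):
    """Pulse luminance needed so the time-average equals the target."""
    return target_average_cd_m2 / duty_cycle

for duty in (1.0, 0.5, 0.25, 0.1):  # fraction of each refresh the strobe is on
    pulse = required_pulse_luminance(120.0, duty)
    print(f"duty cycle {duty:.0%}: pulse must be {pulse:.0f} cd/m^2")
```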
 
Yes, it's worth it and very noticeable (Asus ROG Swift). I was blown away and got rid of my QNIX 27" IPS and Eizo "240Hz" FG2421.

Whether or not the technology is worth the $300 will be different for everyone. Personally, I no longer care about my fps reaching triple digits, which is a very nice relief.

This. It seems that G-Sync lets you get away with less costly video cards.
 
You still get the same amount of light to the eye in the same time period without any increase in brightness.

I'm thinking more along the lines of the following:

With a 100 Hz (sample-and-hold) display, suppose you have a red screen on one refresh followed by a green screen on another refresh, and these flashes, each of which lasts 10 ms, are each 50 cd/m^2.

Now take a 200 Hz (sample-and-hold) display, and show the same two successive flashes, each of which lasts 5 ms. My contention is that on this second display, each flash would need to be 100 cd/m^2 to match the intensity of the presentation on the first display.

Basically what I'm getting at is that with higher refresh rates, you have the opportunity to cram the same amount of information (in the above scenario, the information is "red then green") into a smaller amount of time, and you therefore need more light energy per refresh to maintain the same intensity of the information.
 
You still get the same amount of time displaying each colour at the same brightness; the eye doesn't see a less bright image, so it doesn't need a change in brightness.

Observe how monitors work.
When you double the refresh, does the brightness double or does it stay the same?
What difference in brightness do you perceive if you keep the brightness level the same while changing the refresh?
You can test this going from 30Hz to 60Hz or 120Hz if you have a capable display.

Check the specs of 60Hz and 120Hz monitors; you won't find that the 120Hz ones need higher brightness unless they use lightboost.
 
You still get the same amount of time displaying each colour at the same brightness; the eye doesn't see a less bright image, so it doesn't need a change in brightness.

So you're saying that:

a 5 ms red field @ 100 cd/m^2 followed by a 5 ms green field @ 100 cd/m^2

will be as bright as

a 10 ms red field @ 100 cd/m^2 followed by a 10 ms green field @ 100 cd/m^2?

I get that if you're feeding the display the same time-matched signal, then they'll appear equal (the 200 Hz display will just repeat each colored field twice).
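For what it's worth, a quick back-of-the-envelope check of the time-integrated light in the two cases (a rough Python sketch using the numbers from the example above):

```python
# Time-integrated light reaching the eye for the red/green example above.
# Sample-and-hold: the panel holds each frame for the whole refresh period.

def integrated_light(luminance_cd_m2, frame_time_s, frames_per_colour):
    # cd/m^2 * seconds, summed over however many refreshes show that colour
    return luminance_cd_m2 * frame_time_s * frames_per_colour

# 100 Hz panel: each colour shown for one 10 ms refresh at 50 cd/m^2
per_colour_100hz = integrated_light(50.0, 0.010, 1)
# 200 Hz panel, same time-matched signal: each colour repeated over two 5 ms refreshes
per_colour_200hz = integrated_light(50.0, 0.005, 2)

print(per_colour_100hz, per_colour_200hz)  # 0.5 and 0.5 -> identical, no brightness boost needed
```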
 
Actually I see your point. If you scaled up the brightness with refresh rate, like I had originally suggested, then you'd also scale up the luminance of a static image, which wouldn't be good.

(this wouldn't be the case with a strobed situation, but we agree on that).
 
So with all that said, is G-SYNC compatible with all games, or does it run independently? Does it work above 60Hz, or are there issues with that, given that a lot of games are console ports? For instance, Skyrim is capped at 60Hz and will flip out if it goes above that. What would be the purpose of buying a monitor over 60Hz if there are issues, and it would be a pain to send a monitor to the manufacturer for any kind of ROM update.
 
G-Sync combats exactly that: it makes fps dips less noticeable overall, especially at lower fps, unless the frame rate drops off a cliff.

I am pretty sure G-Sync is a driver thing, so games do not need to actively support it to benefit, although I think there are games out there that don't play nice with it.
 
G-Sync combats exactly that: it makes fps dips less noticeable overall, especially at lower fps, unless the frame rate drops off a cliff.

I am pretty sure G-Sync is a driver thing, so games do not need to actively support it to benefit, although I think there are games out there that don't play nice with it.
Yes. This isn't MFAA. I have yet to run into a game that didn't play nicely with G-sync.
 
One thing I have noticed with G-Sync, at least in the few games I've tried (Battlefield 4, Bioshock Infinite), is that anything under 100fps looks extremely choppy. So when I am getting around 60fps I think it looks visibly worse than 60fps on a 60Hz monitor (either V-Sync off or a locked 60Hz V-Sync with no drops). I find this weird because everyone was saying it should be good for low fps, but I did not find this to be true.

With Bioshock Infinite I actually put the settings on low across the board so I could hit 144fps, and it looks awesome like this. Not saying G-Sync isn't great. It is. It's just not going to save you if your PC isn't up to snuff. In many ways, I found I wanted a higher frame rate (turning down settings) for the smoothest experience.
 
Maybe this explains why G-Sync displays are 144Hz.

I found this annoying and wasteful, because a major benefit of syncing to drawn frames is that it should look smoother at lower frame rates.
Although other reports are positive at lower frame rates, so perhaps you have an issue that isn't common?
Or those with issues aren't reporting them.
 
Maybe it's because one is trying to G-Sync multiple monitors? I don't have that issue with just 1 Swift
 
One thing I have noticed with G-Sync, at least in the few games I've tried (Battlefield 4, Bioshock Infinite), is that anything under 100fps looks extremely choppy. So when I am getting around 60fps I think it looks visibly worse than 60fps on a 60Hz monitor (either V-Sync off or a locked 60Hz V-Sync with no drops). I find this weird because everyone was saying it should be good for low fps, but I did not find this to be true.

With Bioshock Infinite I actually put the settings on low across the board so I could hit 144fps, and it looks awesome like this. Not saying G-Sync isn't great. It is. It's just not going to save you if your PC isn't up to snuff. In many ways, I found I wanted a higher frame rate (turning down settings) for the smoothest experience.

That's an issue with your setup, at least regarding BF4 as I don't play Bioshock. Multi-GPU &/or multi-monitor issue I would bet.
 
I wouldn't be surprised if it was my setup (multi-monitor, multi-GPU). Aside from that things appear working.
 
Also, my peripheral vision is not so great, so anything over 24" is really hard to see, and it won't fit on my tiny desk.
 
One thing I have noticed with G-Sync, at least in the few games I've tried (Battlefield 4, Bioshock Infinite), is that anything under 100fps looks extremely choppy. So when I am getting around 60fps I think it looks visibly worse than 60fps on a 60Hz monitor (either V-Sync off or a locked 60Hz V-Sync with no drops). I find this weird because everyone was saying it should be good for low fps, but I did not find this to be true.

With Bioshock Infinite I actually put the settings on low across the board so I could hit 144fps, and it looks awesome like this. Not saying G-Sync isn't great. It is. It's just not going to save you if your PC isn't up to snuff. In many ways, I found I wanted a higher frame rate (turning down settings) for the smoothest experience.

This is what I assumed. Going by the FreeSync/G-Sync demos that I saw on YouTube... I wasn't terribly impressed with the technology. Yeah, it removed tearing without using something like V-Sync, but it still looked really choppy to me when the refresh rate was forced to 40Hz on AMD (and 60Hz on NVIDIA).

I suspected that would be the case, but hoped that because it was recorded with a phone/camera @ 60Hz, it was just some sort of artifact from the display and camera refresh rates not being synchronized. I'm normally very sensitive to low FPS, so maybe that has something to do with it? I'm not discounting the tech at all... it would be nice for every monitor to come with this capability just to do away with V-Sync, but it certainly doesn't seem to be the holy grail that it was made out to be (30FPS butter smooth). At the very least, I hope to see a demo setup at a B&M store to see it in person.

I might get one in the future, but at the moment I think I'll stick with my 120Hz monitor.
 
This is what I assumed. Going by the FreeSync/G-Sync demos that I saw on YouTube... I wasn't terribly impressed with the technology. Yeah, it removed tearing without using something like V-Sync, but it still looked really choppy to me when the refresh rate was forced to 40Hz on AMD (and 60Hz on NVIDIA).

I suspected that would be the case, but hoped that because it was recorded with a phone/camera @ 60Hz, it was just some sort of artifact from the display and camera refresh rates not being synchronized. I'm normally very sensitive to low FPS, so maybe that has something to do with it? I'm not discounting the tech at all... it would be nice for every monitor to come with this capability just to do away with V-Sync, but it certainly doesn't seem to be the holy grail that it was made out to be (30FPS butter smooth). At the very least, I hope to see a demo setup at a B&M store to see it in person.

I might get one in the future, but at the moment I think I'll stick with my 120Hz monitor.

From the videos I watched on YouTube, specifically from Linus Tech Tips, the takeaway is that you need to see it in person: unless the camera records a lot of frames and you slow the footage down, you won't see the effect captured on video.
 
One thing I have noticed with G-Sync, at least in the few games I've tried (Battlefield 4, Bioshock Infinite), is that anything under 100fps looks extremely choppy. So when I am getting around 60fps I think it looks visibly worse than 60fps on a 60Hz monitor (either V-Sync off or a locked 60Hz V-Sync with no drops). I find this weird because everyone was saying it should be good for low fps, but I did not find this to be true.

With Bioshock Infinite I actually put the settings on low across the board so I could hit 144fps, and it looks awesome like this. Not saying G-Sync isn't great. It is. It's just not going to save you if your PC isn't up to snuff. In many ways, I found I wanted a higher frame rate (turning down settings) for the smoothest experience.

This doesn't make the slightest sense. The whole purpose of GSYNC/FreeSync is to improve the lower end of the playable frame rate spectrum (30-60, mostly). Higher frame rates supposedly yield smaller benefits.

G-Sync combats exactly that: it makes fps dips less noticeable overall, especially at lower fps, unless the frame rate drops off a cliff.

I am pretty sure G-Sync is a driver thing, so games do not need to actively support it to benefit, although I think there are games out there that don't play nice with it.

That's one thing I'd like to clarify, because many folks on the Blur Busters forums claim that some games are actually worse with GSYNC than they are with VSYNC. Even recent ones like MGSV Ground Zeroes, COD: Advanced Warfare, Assassin's Creed Unity and Dragon Age: Inquisition. Can anyone confirm or deny this?
 
When are the G-Sync IPS monitors coming to market? I just built a new PC and need a new monitor for it, and I'm debating what to get... I also want a second monitor, something cheaper and smaller, just for work, browsing, etc.
 
G-Sync is worth it, but don't fall for the misconception that it makes even a low frame rate smoother; it doesn't, so don't ditch your multi-GPU setup thinking G-Sync will be a godsend.
80fps will still feel like 80, and 60 will still feel like 60, with or without G-Sync.
The transitions are smoother, but you will always find higher frame rates (110-144fps) best, even with G-Sync.

People who don't use V-Sync will notice it feels smoother because tearing is removed (and there's no more V-Sync stutter), but it doesn't magically make a low frame rate smoother.

I ended up returning it though; I can't stand matte displays, and I don't really have a problem keeping 100-120fps, but it does get annoying when frames drop and you experience V-Sync stutter.
 
The first ASUS 24" monitor with 1440p and G-Sync at $450.00 or less, I'll buy one or two of those.
 
This doesn't make the slightest sense. The whole purpose of GSYNC/FreeSync is to improve the lower end of the playable frame rate spectrum (30-60, mostly). Higher frame rates supposedly yield smaller benefits.



That's one thing I'd like to clarify, because many folks on the Blur Busters forums claim that some games are actually worse with GSYNC than they are with VSYNC. Even recent ones like MGSV Ground Zeroes, COD: Advanced Warfare, Assassin's Creed Unity and Dragon Age: Inquisition. Can anyone confirm or deny this?
To both points, I can say that Ass Creed Unity is much better with G-sync. Unity, for one, does not support Triple Buffering, so V-sync is going to halve the framerate at anything under your refresh rate unless you use some form of Adaptive V-sync. For two, a game like Unity is already suffering from stuttering issues due to the number of draw calls. G-sync isn't supposed to be some miracle cure to technical issues that exist in a game engine. What G-sync does do in Unity, however, is alleviate some input lag from the slower response inherent in running at a lower framerate and keeps frame transitions as smooth as possible. Because of the lack of Triple Buffering, people may be experiencing what appears to be a better experience to them because the framerate is actually being capped lower than they would expect it to be. The same could be said of the other newer games you listed, like DA:I.
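To illustrate the halving point, here's a rough sketch of double-buffered V-sync behaviour (hypothetical render times; a simplified model, not how any particular engine does it):

```python
# Double-buffered V-sync (no triple buffering): a frame that misses a refresh
# waits for the next one, so the displayed rate snaps to refresh/n
# (60, 30, 20, 15 ... on a 60 Hz panel).  Rough model only.
import math

def vsync_fps(render_fps, refresh_hz=60):
    refresh_period = 1.0 / refresh_hz
    render_time = 1.0 / render_fps
    # the frame is held until the next refresh boundary after rendering finishes
    refreshes_waited = math.ceil(render_time / refresh_period)
    return refresh_hz / refreshes_waited

for fps in (75, 60, 55, 45, 31, 29):
    print(f"{fps:3d} fps rendered -> {vsync_fps(fps):5.1f} fps displayed with V-sync")
```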

In my experience G-sync does not make anything look worse regardless of the framerate (until you go under the lower threshold, of course, of 30 with G-sync). On the contrary, every game I have played since purchasing a G-sync display has been vastly improved over past experiences with "traditional" displays. Is it worth the price premium? Probably not, but if I had to make the same purchase again I would every time.

As has been said time and again, you really have to experience the tech in person to get what it's all about. I think most of what is going around is just FUD ever since the realization of moduleless G-sync in laptops and the recent release of Adaptive-sync monitors.
 
To both points, I can say that Ass Creed Unity is much better with G-sync. Unity, for one, does not support Triple Buffering, so V-sync is going to halve the framerate at anything under your refresh rate unless you use some form of Adaptive V-sync. For two, a game like Unity is already suffering from stuttering issues due to the number of draw calls. G-sync isn't supposed to be some miracle cure to technical issues that exist in a game engine. What G-sync does do in Unity, however, is alleviate some input lag from the slower response inherent in running at a lower framerate and keeps frame transitions as smooth as possible. Because of the lack of Triple Buffering, people may be experiencing what appears to be a better experience to them because the framerate is actually being capped lower than they would expect it to be. The same could be said of the other newer games you listed, like DA:I.

In my experience G-sync does not make anything look worse regardless of the framerate (until you go under the lower threshold, of course, of 30 with G-sync). On the contrary, every game I have played since purchasing a G-sync display has been vastly improved over past experiences with "traditional" displays. Is it worth the price premium? Probably not, but if I had to make the same purchase again I would every time.

As has been said time and again, you really have to experience the tech in person to get what it's all about. I think most of what is going around is just FUD ever since the realization of moduleless G-sync in laptops and the recent release of Adaptive-sync monitors.

Thank you. I'm still deciding if GSYNC/FreeSync is more or less important than 21:9.

The 29/34 LG UM67 seemed to have solved that, but for now the ghosting issues and limited range made me back away. Hopefully LG will make a revision of the model...
 
Is it worth the price premium? Probably not, but if I had to make the same purchase again I would every time.

Stop equivocating. :p It's totally worth it. If you're going to use a full-persistence display, i.e. an LCD, there's no reason it shouldn't be G-SYNC, and not even FreeSync until they can get their issues sorted out (which'll be another couple of years).
 
ULMB looks significantly better than GSYNC if you can maintain the constant fps requirement to avoid stuttering (on the ROG Swift that equates to a constant 120, or 85, depending on what you cap your refresh rate at). Of course at that point GSYNC isn't doing anything for you anyway.

There's a misconception floating around that GSYNC gets rid of input lag. This is not true, if anything it adds a little (barely perceptible, but it's there). If you previously had VSYNC turned on in a game you often play, then you turn it off and use GSYNC instead, there will be a noticeable reduction in input lag, which might cause you to think input lag has been completely eliminated. There are other factors at play too—monitors which feature GSYNC are generally high refresh rate, low input lag, which probably compare favorably in that regard to whatever monitor they're replacing. Also, the complete elimination of screen tearing means that the image appears to be more synchronized with your mouse movements which can create a perceived reduction of input lag.

I've got a ROG Swift and it's nice, but we're starting to see large 60Hz 4K monitors with reasonably low input lag available for around the same price. I use two cheap-ish 30Hz 4K TVs at work for productivity and while I wouldn't want those specific models for gaming due to their high input lag and low refresh rate, I can tell you that the picture quality, size and resolution make them much more enjoyable for general use than the Swift. So for a multi-purpose monitor, I'd consider holding off from buying any of the GSYNC options currently available...we might eventually get a large VA/IPS monitor with GSYNC, which would feel like much more of an upgrade than anything currently on the market. And if you've already got a fast 120Hz non-GSYNC monitor, the $200 premium is not worth it unless you're also getting higher size and/or resolution along with the upgrade.
 
There's a misconception floating around that GSYNC gets rid of input lag. This is not true, if anything it adds a little (barely perceptible, but it's there). If you previously had VSYNC turned on in a game you often play, then you turn it off and use GSYNC instead, there will be a noticeable reduction in input lag, which might cause you to think input lag has been completely eliminated. There are other factors at play too—monitors which feature GSYNC are generally high refresh rate, low input lag, which probably compare favorably in that regard to whatever monitor they're replacing. Also, the complete elimination of screen tearing means that the image appears to be more synchronized with your mouse movements which can create a perceived reduction of input lag.

I think most people just mean to say it gets rid of the V-sync input lag, and it really does do that if you stay below the max refresh rate of the monitor. 60Hz G-sync is smooth and responsive (at least on the G-sync monitors that do >60Hz, IDK about that weird 4K panel), while 60Hz V-sync is horribly laggy. Sure, G-sync adds its own lag, but Blur Busters has measured it to be virtually nothing, something so small no one should worry about it.
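As a rough illustration of where that V-sync lag comes from (a simplified sketch with hypothetical frame completion times; it ignores buffering depth and the small G-sync module overhead):

```python
# A finished frame waits for the next fixed refresh tick under V-sync, while
# with G-sync the panel refreshes as soon as the frame is ready.

def vsync_wait_ms(finish_time_ms, refresh_hz=60):
    period = 1000.0 / refresh_hz
    remainder = finish_time_ms % period
    return 0.0 if remainder == 0 else period - remainder

for t in (1.0, 8.0, 15.0, 17.0):  # hypothetical frame completion times
    print(f"frame done at {t:4.1f} ms: V-sync waits {vsync_wait_ms(t):5.1f} ms extra, G-sync ~0 ms")
```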
 
My problem is that I also need an HDMI port as I play on console as well, which is why I'm leaning towards the XL2420G...But 550€ basically just for the GSYNC seems a lot.

Another option could be the 21:9 29" 2560x1080 LG 29UM65 (not the Freesync one), which gives an entirely different experience at half the price (298€) with far wider FOV in games.
However, there are downsides to this as well, as the average frame rate will undoubtedly be lower due to the higher resolution, and the tearing/stuttering problem will still be there with no GSYNC or FreeSync to address it.

What do you think? Which one should I choose?
 
My problem is that I also need an HDMI port as I play on console as well, which is why I'm leaning towards the XL2420G...But 550€ basically just for the GSYNC seems a lot.

Another option could be the 21:9 29" 2560x1080 LG 29UM65 (not the Freesync one), which gives an entirely different experience at half the price (298€) with far wider FOV in games.
However, there are downsides to this as well, as the average frame rate will undoubtedly be lower due to the higher resolution, and the tearing/stuttering problem will still be there with no GSYNC or FreeSync to address it.

What do you think? Which one should I choose?
FWIW I thought the 29" LG looked too small when I saw it on display at a B&M store. It's about half an inch (15mm) shorter than a 24" 16:9 display.
 
FWIW I thought the 29" LG looked too small when I saw it on display at a B&M store. It's about half an inch (15mm) shorter than a 24" 16:9 display.

I'm not really concerned with height, in fact I prefer it to be shorter than my 16:10 display which currently, mounted to the wall, forces me to look up a bit.

However, there can be actual gameplay advantages with 21:9 as you see a lot more, as shown in this Bioshock Infinite gif (the narrowest is 4:3, 16:9 is in the middle):

[Bioshock Infinite aspect-ratio comparison gif]
 
29" ultra wide is 23" display with wider sides
34" is 27" with wider sides

Also 24" 16:10 will have height closer to 27" 16:9
 
Posted in the Acer 144Hz G-sync IPS thread:
The real benefits of G-Sync actually come when your frames are between 40-60fps. The perceived smoothness will be as good at 40fps as it is at 60, it's really the sweet spot.

If you can push over 100fps in 1440p you'd be better off using the ULMB mode instead.

I disagree. When people mention frame rate they really mean *average* frame rate, which fluctuates dynamically quite a bit in demanding games. G-sync is great for the frame rate roller coaster at average frame rates well above 60fps-Hz.

This graph is a good example
http://www.hardocp.com/image.html?image=MTQxNTI2MzIyN1ZrcExKTXo1Q2lfNF8xX2wuZ2lm

Current ULMB-mode monitors mute the screen too much, and it does nothing for judder during fps swings. The 100Hz limit for strobing on the Acer could be too slow as well, making it borderline flashy. I prefer G-sync mode. At high fps, the motion blur is more of a soft blur rather than a smear blur. You are also getting way more motion definition/articulation at higher Hz-fps, and conversely losing out on both the motion definition and blur reduction at 60Hz-fps and less.


Yes, what I've read is that G-sync multiplies/dupes frames at sub-30fps so that the refresh rate doesn't crap out or get starved for frames to display.
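Something like this, as I understand it (a rough sketch of the frame-repeat idea; the ~30Hz floor and 144Hz ceiling are assumptions for illustration, not a description of the actual module firmware):

```python
# Below the panel's minimum refresh, show each rendered frame multiple times
# so the panel stays inside its valid refresh range.
import math

def panel_refresh_hz(fps, min_hz=30, max_hz=144):
    if fps >= min_hz:
        return min(fps, max_hz)        # panel simply follows the frame rate
    repeats = math.ceil(min_hz / fps)  # show each rendered frame this many times
    return fps * repeats

for fps in (144, 60, 40, 25, 12):
    print(f"{fps:3d} fps -> panel scans at {panel_refresh_hz(fps)} Hz")
```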

Really, anything much under a 60fps average is a total freeze-frame zone though (frozen on the same world action state through multiple Hz of screen updates). Personally I have zero interest in my frame-rate range even dipping that far down, let alone playing at those numbers as the average frame rate, riding the +/- frame rate roller coaster, just for prettier still-shot/screenshot graphics settings.
You don't start to notice an increase in motion flow and motion articulation until around 75-85Hz/fps either, with a 100fps average being a more suitable lower target threshold versus graphics settings, giving a 5:3 increase in motion definition/articulation compared to 60Hz-fps. As discussed earlier, the higher frame rates are also required to get appreciable blur reduction.

IMO, if you are getting that low a frame rate with your chosen settings, bottoming out more or less, you are probably better off getting a 1080p G-sync monitor where you would get a much higher frame rate. Otherwise you would be losing out on the increased motion flow/articulation/animation (motion definition) and blur reduction (motion clarity) that these modern monitors provide, and would only be benefiting from the G-sync (no stutter/judder or V-sync tradeoffs).

Fewer dots per dotted line = less defined/articulated motion paths, less defined motion flow of individual objects and of the entire game world moving relative to your viewpoint when mouse-looking and movement-keying. Also potentially less animation cycle definition.

[Image: one second of frames compared at 15/30/60/120Hz]


[Simulated comparison image: 120Hz vs 60Hz gaming]
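To put the "dots per dotted line" idea in numbers (illustrative only, assuming a hypothetical pan of 1920 pixels per second):

```python
# How many distinct positions per second a moving object gets, and how big
# each jump between frames is, at various frame rates.
PAN_SPEED_PX_PER_S = 1920

for fps in (15, 30, 60, 120, 144):
    step_px = PAN_SPEED_PX_PER_S / fps
    print(f"{fps:3d} fps: {fps} positions per second, {step_px:5.1f} px jump per frame")
```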


Yes, I just wanted to be clear that in those scenarios people are essentially running a low-Hz monitor:
- Retarding motion definition/articulation.
- Losing any increase in motion clarity
(possibly making the blur even worse than a 60Hz monitor at lower fps-Hz linked by G-sync, and in the Acer's case with increased response times at lower Hz ranges).

The main draw of this kind of monitor tech is 1440p at high Hz with the lowest response time possible, with G-sync (and the ULMB option) as a great additional feature.

I can understand the flexibility comment for some games, like older titles and emulators that may be locked at low frame rates, but in cases where gamers trade down to a crippled-Hz-monitor scenario at low frame rates for higher static graphics quality, I wanted to be clear about what you would be losing - which ends up being most of the superior features of these modern gaming monitors. Also consider the cost of these monitors, and the fact that you could get a much-easier-to-render 1080p G-sync monitor for higher frame rates. Of course it's everyone's own money/monitors and they can do whatever they want with them :p
 