Variable Refresh Rate Monitors (G-Sync) --- Refresh Rate Varies While You Play!!!

So it does go lower despite having a higher maximum brightness? It's just that, overall, it can be brighter?

I just read reviews on Newegg and Amazon saying it's too bright, and some people have it turned down to zero and it's still too bright.

The problem is I'm sensitive to brightness, so I make a big stink about it. I would say something on the Asus forums about it =)
 
It's a fair question. Some monitors don't allow you to lower brightness below a certain threshold, or, more commonly, they can only lower the backlight within a narrow range and start crushing the brightness range beyond that. The VG248QE does not do this, and you can tell by looking at the measured contrast ratio over a range of brightness settings. Minimum brightness (full white) looks to be about 80 cd/m2, which is pretty dim. If that is too bright, you are probably playing in total darkness, and I highly recommend against this for eyestrain reasons. You should leave a small amount of bias lighting on so that your eyes can look away and focus on something other than the screen from time to time. Without this, you could easily develop vision issues quite quickly.

Short answer: mostly you don't have to worry about a monitor being too bright (unless it's poorly designed); it can be dimmed to your liking, down to a limit.
 
OK, thanks for that. I don't play in total darkness with my current monitor, a 21.5" Asus LED LCD, but it has an overall lower brightness. I do have an Antec bias light and two sets of backups.
I just don't want to adjust for web pages and then adjust again for games...
 
I guess this monitor can be upgraded with a G-Sync chip manually when it comes out!
 
So, this may be a stupid question, but wouldn't G-Sync cause the same problems that people experience with PWM? I mean, it's essentially going to flicker the screen at 60-120Hz, and people are complaining about 250+Hz flicker from PWM. It may "cure the blur", but wouldn't it come at the cost of the side effects of PWM?
 
G-Sync doesn't flicker the screen; you're thinking of strobed backlighting. At the moment, they are separate technologies, and you can't use one at the same time as the other.
 

It's not a stupid question. There are 2 different effects you are describing here: G-Sync and backlight strobing.

All G-Sync does is allow the display's pixel values to be updated at irregular intervals, regardless of strobing or other factors. This is completely compatible with a non-flickering display.

Strobing turns on the backlight for only part of each frame, giving a very strong PWM effect, but also eliminating much of the blur from sample-and-hold and pixel transitions. If you can stand to look at a display doing this (I can't), then it can be beneficial for motion effects.

AFAIK there is nothing stating that G-Sync must be combined with strobing, and there is no reason they cannot be combined either. That said, there are still challenges to be overcome when combining them (e.g. what to do about flicker if the framerate drops too low, how to adjust strobing brightness with varying framerate).
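To make that varying-brightness challenge concrete, here is a rough sketch (the pulse width and frame rates are assumptions for illustration, not anything NVIDIA has published): perceived brightness of a strobed backlight roughly tracks its duty cycle, so a fixed strobe pulse gets dimmer as the frame time grows.

```python
# Illustrative sketch only (assumed numbers, not NVIDIA's implementation): why
# strobing is awkward to combine with variable refresh. Perceived brightness of
# a strobed backlight roughly tracks its duty cycle (pulse width / frame time),
# so as frame time varies, a fixed pulse width makes the screen dim or brighten.

PULSE_MS = 1.4  # assumed strobe flash length, similar to a short LightBoost setting

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

def duty_cycle(pulse_ms: float, frame_ms: float) -> float:
    """Fraction of each refresh the backlight is lit."""
    return pulse_ms / frame_ms

reference_duty = duty_cycle(PULSE_MS, frame_time_ms(120))  # brightness target at 120 fps

for fps in (120, 90, 60, 40, 30):
    frame_ms = frame_time_ms(fps)
    duty = duty_cycle(PULSE_MS, frame_ms)
    # Pulse width needed to hold brightness at the 120 fps level:
    compensated_ms = reference_duty * frame_ms
    print(f"{fps:3d} fps: frame {frame_ms:5.2f} ms, duty {duty:6.2%}, "
          f"pulse needed for constant brightness {compensated_ms:5.2f} ms")
```

At 30fps the fixed pulse gives roughly a quarter of the 120fps brightness, and holding brightness constant would need a ~5.6ms pulse, which lengthens persistence (more blur) while the strobe itself drops to a very visible 30Hz flicker. That is the trade-off being referred to.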
 
That's what I get for not reading fully; I thought it was being combined. Thanks for setting me straight :)
 
They are separate features, like he said. G-Sync's backlight-strobing mode is supposed to be "superior to LightBoost". There is also the Eizo backlight-strobing VA panel, which works regardless of GPU camp. When you strobe at very high refresh rates like 120Hz, you get what is (as exemplified in pursuit-camera photos) essentially zero blur of the viewport during FoV movement in 1st/3rd-person-perspective games. As long as the refresh rate is very high (100Hz-120Hz), strobing shouldn't be an issue, provided the monitor can maintain decent brightness. 60-75Hz strobing would be bad, though.

edit: Backlight strobing essentially eliminates FoV-movement blur, and high-speed object blur in more static scenes. A good example of this is on the Blur Busters site, where Mark R. used a pursuit camera. Unlike the LightBoost photo on the linked page, the Eizo FG2421, and presumably the "better than LightBoost" backlight-strobing function of G-Sync (on 120Hz monitors!), shouldn't suffer extreme dimming and color washouts.

http://www.blurbusters.com/faq/60vs120vslb/

If I were to use the phrase "eliminates much of the blur" rather than, in essence, all of the blur (very nearly so), I would be talking about 144Hz without backlight strobing, which removes 60% of the blur. 120Hz without strobing eliminates 50% of the blur.

120Hz pursuit-cam shot of a LightBoost hack setup (it suffers dimming and muted colors, but shows the amount of blur elimination):
http://www.blurbusters.com/wp-content/uploads/2013/05/CROPPED_LightBoost50.jpg

baseline - 60 Hz mode (16.7ms continuously-shining frame)
- the worst blur "outside of the lines"/shadow masks of everything in the viewport
- the lowest-definition motion/animation, and the worst accuracy/timing/reaction time due to slower, less frequent action updates shown
50% less motion blur (2x clearer) - 120 Hz mode (8.33ms continuously-shining frame)
60% less motion blur (2.4x clearer) - 144 Hz mode (6.94ms continuously-shining frame)
85% less motion blur (7x clearer) - 120 Hz LightBoost, set to 100% (2.4ms frame strobe flashes)
92% less motion blur (12x clearer) - 120 Hz LightBoost, set to 10% (1.4ms frame strobe flashes)

Blur Busters quote:
There is extremely little leftover ghosting caused by pixel transitions (virtually invisible to the human eye), since nearly all (>99%+) pixel transitions, including overdrive artifacts, are now kept unseen by the human eye, while the backlight is turned off between refreshes.
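For what it's worth, those figures fall straight out of persistence arithmetic: if you take motion blur as proportional to how long each frame is visible (which is how Blur Busters presents it), then blur reduction vs. 60Hz is 1 - persistence/16.7ms, and "x clearer" is 16.7ms/persistence. A quick sketch reproducing the quoted numbers (the 144Hz figure comes out to about 58%, so the quoted 60% is rounded):

```python
# Reproducing the quoted blur-reduction figures from the visible-frame
# (persistence) times alone: motion blur scales with how long each frame is
# shown, so blur reduction vs. 60 Hz = 1 - persistence / 16.7 ms, and
# "x clearer" = 16.7 ms / persistence.

BASELINE_MS = 16.7  # 60 Hz, continuously-shining frame

modes = {
    "120 Hz sample-and-hold": 8.33,
    "144 Hz sample-and-hold": 6.94,
    "120 Hz LightBoost 100%": 2.4,
    "120 Hz LightBoost 10%": 1.4,
}

for name, persistence_ms in modes.items():
    reduction = 1 - persistence_ms / BASELINE_MS
    clearer = BASELINE_MS / persistence_ms
    print(f"{name:24s}: {reduction:4.0%} less motion blur ({clearer:.1f}x)")
```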
 
http://www.anandtech.com/show/7582/nvidia-gsync-review

http://www.tomshardware.com/reviews/g-sync-v-sync-monitor,3699.html

http://www.guru3d.com/articles_pages/nvidia_g_sync_review_guide,1.html


All of those articles focus on the variable-Hz function of G-Sync and not the supposed "superior to LightBoost" backlight-strobing option. The articles say "30 to 40 fps is 'fine'", with 40 being the sweet spot. I would disagree. These same people complain about marginal milliseconds of input lag, yet accept long "freeze-frame" milliseconds with open arms in order to get more eye candy. I think people will be cranking up their graphics settings and getting 30-40fps. At 30fps you are frozen on the same frame of world action for 33.3ms, while the 120Hz+120fps user sees 4 game-world action-update "slices" in that time. At 40fps you are seeing the same frozen slice of game-world action for 25ms, while the 120Hz+120fps user sees 3 action-slice updates. This makes you see new action later and gives you fewer opportunities to initiate action (fewer "dots per dotted-line length"), and then you add input lag on top of the already out-of-date game-world state you are acting on. Additionally, higher Hz plus higher frame rates provide aesthetically smoother control and higher motion and animation definition. Of course, 120Hz also cuts the continual FoV-movement blur of the entire viewport by 50%, and backlight strobing at high Hz essentially eliminates FoV blur.
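To put rough numbers on the "freeze-frame" point, here is a minimal sketch (illustrative only; it ignores render-time variance and input lag) of how long each frame is held at a given fps and how many newer updates a 120Hz+120fps player sees in that same window:

```python
# Rough arithmetic behind the "freeze-frame" argument above: how long each
# rendered frame stays on screen at a given fps, and how many newer updates a
# 120 fps player sees in that same window. Illustrative only; it ignores
# render-time variance and input latency.

REFERENCE_FPS = 120

def hold_time_ms(fps: float) -> float:
    """How long one frame of game-world state is displayed."""
    return 1000.0 / fps

for fps in (30, 40, 60, 120):
    hold = hold_time_ms(fps)
    slices_seen_by_120 = hold / hold_time_ms(REFERENCE_FPS)
    print(f"{fps:3d} fps: each frame held {hold:5.1f} ms "
          f"(a {REFERENCE_FPS} fps player sees ~{slices_seen_by_120:.0f} updates in that span)")
```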
 
Most people game at 30-60fps, and this isn't set to change anytime soon. It's a combination of poor monitor availability (not everyone wants to live with a crappy 1080p TN monitor; they really are borderline unusable for anything other than sitting in the middle of, specifically, a first- or maybe third-person action game), the high GPU requirements to consistently hit 120fps, and continually increasing graphics demands, especially now that we are seeing the console bottleneck finally widen with next-gen titles.

G-Sync is Nvidia's attempt to change the way graphics cards communicate with monitors, across the board. While it will begin as an expensive enthusiast feature on certain monitors, the GOAL is to improve the visual quality of gameplay for everyone who plays games as G-Sync hardware becomes cheaper and more ubiquitous.

This really has nothing to do with 120fps+120Hz gaming. You can argue that the whole industry should hold at 1080p and current graphical quality and focus purely on improving framerates, but good luck making that argument; it isn't going to happen. It makes perfect sense to focus on improving the quality of 30-60fps gaming.
 
I'm more interested in high fps and zero blur, obviously, even if I have to turn down the ever-higher, *arbitrarily set by devs* graphics-ceiling "carrot" that people keep chasing (that ceiling could be magnitudes higher if they wanted).
I still play some "dated" games too, where fps is high.

Yes, people will still do this and that, and most will be satisfied with low fps at 720p or 1600x900 in demanding console games, with unsophisticated controller capability, on ghosting, blurring, and often input-lag-ridden TVs. That doesn't mean it is the better route to take.

1080p is the exact same scene in HOR+ at 16:9, which covers almost every 1st/3rd-person-perspective game and every virtual-camera render; the difference is the number of pixels in the scene, obviously. That is a big difference, but a much bigger difference for desktop/app real estate than for games, where it has to be weighed against GPU budgets/fps.

You are seeing multiple frames skipped and are behind a 120Hz+120fps user, watching "freeze-frames" for 25ms to 33.3ms at 40fps and 30fps, and every time you move your FoV you smear the entire viewport into something that can't even be defined as a solid grid resolution to your eyes/brain. So much for high rez.
I think people are sacrificing a lot motion-, animation-, and control-wise, as well as sacrificing seeing action sooner and getting more (and earlier) opportunities to initiate actions, all to reach for higher still-detail eye candy.
You don't play a screenshot :b
 
To review,
- every time you move your FoV at greater than a snail's pace on a sub-100Hz, non-backlight-strobing monitor, you drop to such a low rez that it isn't even definably a solid grid to your eyes and brain. So you get continual bursts/path-flows of more or less the worst resolution possible, with the entire viewport dropping all high-detail geometry and textures (including depth via bump mapping) into a blur.

- at low Hz and low fps, you are at greatly reduced motion definition and control definition:
far fewer new action/animation/world-state slices shown, and longer "freeze-frame" periods during which a high-Hz + high-fps player is seeing up to several newer updates. Relative to 120Hz/120fps:
1/2 the motion + control definition, and opportunities to initiate actions in response, at 60Hz/60fps
1/3 the motion + control definition, and opportunities to initiate actions in response, at 40fps
1/4 the motion + control definition, and opportunities to initiate actions in response, at 30fps

- you need at least 100Hz to support backlight strobing for essentially zero blur (120Hz is better).
- you can upscale 1080p x4 fairly cleanly on higher-rez 3840x2160 ("quad Full HD"/4K UHD) monitors if you have to; it's not optimal, but it can work (see the sketch at the end of this post)
(so you can game at higher fps/lower rez in demanding games yet still use a high-rez monitor for desktop/apps, for example)

- the Eizo FG2421 is a high-Hz 1080p VA panel that uses backlight strobing; it isn't TN.
- we know that Nvidia is still supposed to support a backlight-strobing function as part of G-Sync monitors, just that it won't work with the dynamic-Hz function (at least not for now). So "the industry" is still addressing backlight strobing for zero blur, in both the Eizo and the G-Sync strobe option (which, again, requires higher Hz to make the strobing viable).
- we know there are higher-rez and likely IPS G-Sync monitors due out, but we do not know if they will have the max Hz bumped up, which is necessary to utilize the backlight-strobe function adequately.

There is more to a game than screenshot resolution/definition.
There is continual FoV-movement blur (an undefinable "non"-resolution, unless perhaps you equate it to an extremely bad visual-acuity number / "out of focus").
There is otherwise essentially zero blur using high Hz plus backlight strobing,
and there is high or low action and motion definition, animation definition, and control definition.
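On the x4 upscaling point above, a small sketch (the resolutions are the ones mentioned in the post; the helper function is hypothetical) showing why 1920x1080 maps cleanly onto 3840x2160 but not onto 2560x1440:

```python
# Sketch of the "1080p upscales cleanly to 3840x2160" point: the scale factor
# is an exact integer per axis, so each source pixel maps to a 2x2 block with
# no interpolation (plain integer / nearest-neighbour scaling). Hypothetical helper.

def integer_scale(src, dst):
    """Return the per-axis integer factor if dst is an exact multiple of src, else None."""
    sx, sy = dst[0] / src[0], dst[1] / src[1]
    return int(sx) if sx == sy and sx.is_integer() else None

print(integer_scale((1920, 1080), (3840, 2160)))  # 2 -> each 1080p pixel becomes a 2x2 block
print(integer_scale((1920, 1080), (2560, 1440)))  # None -> non-integer factor, needs filtering
```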
 
Regarding those "(2x clearer)", "(2.4x clearer)", etc. figures quoted earlier: figure out the difference between "as clear as" and "clearer", as there is a 100% difference. Not sure why nobody in the world seems to get this right (seriously, I almost never see anyone understand the distinction), but I seriously hope you're not an engineer. If you are, such a mistake could literally be fatal.
 
That is a direct quote from Mark R. of Blur Busters in his testing, taken from the original LightBoost "hack" threads. I pasted it as is. The point is, blur is reduced by 50 to 60 percent at higher Hz, and is essentially eliminated at high Hz plus aggressive backlight strobing.
 
Good stuff. I may have to log in to his site later to post a follow-up question in one of the replies there.

They are discussing the fact that Overlord is working on building a 120Hz 2560x1440 IPS with G-Sync. For some of us, that wouldn't be as big of a deal unless it actually supported the zero-blur mode. If they are just adding a G-Sync board for dynamic-refresh capability to a "standard" Korean overclockable/120Hz IPS, I doubt the backlight would be capable of it, since it wasn't designed for strobing the way a 3D-capable monitor or the Eizo FG2421 is. I fear that the upcoming higher-resolution G-Sync monitors will all be the same in this regard: supporting the dynamic-refresh function but having backlights incapable of the zero-blur, "superior to LightBoost" aka "low persistence" option.
 
So, it's not possible to just add it to a non-G-Sync monitor, even by some qualified engineer?
 
How long till we are able to buy a g-sync monitor? I need another monitor and I'm debating waiting till a g-sync monitor is available.

Also, I always see "lower" input lag. Does that mean G-Sync will still have a bit of lag compared to no V-Sync/G-Sync?
 

G-Sync will have lower input lag than any other synced mode, because it displays the frame immediately instead of waiting for the next fixed refresh interval.
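As a rough illustration of why waiting for a fixed interval costs latency, here is a simplified model (assumed numbers only; it ignores scanout time and driver/display overhead, and variable refresh is still bounded by the panel's maximum rate):

```python
# Simplified latency model (illustrative assumptions only): with fixed-refresh
# V-Sync, a finished frame waits for the next refresh boundary before it can be
# displayed; with variable refresh it can be sent to the panel right away.

import random

REFRESH_MS = 1000.0 / 60  # fixed 60 Hz refresh interval

def vsync_wait_ms(finish_ms: float) -> float:
    """Time a finished frame waits for the next fixed refresh tick."""
    return REFRESH_MS - (finish_ms % REFRESH_MS)

random.seed(1)
finish_times = [random.uniform(0, 1000) for _ in range(10000)]  # arbitrary frame-finish times
avg_wait = sum(vsync_wait_ms(t) for t in finish_times) / len(finish_times)
print(f"Average added wait with 60 Hz V-Sync: ~{avg_wait:.1f} ms (up to {REFRESH_MS:.1f} ms)")
print("With variable refresh the added wait is ~0 ms while fps stays within the panel's range.")
```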
 
Will the initial round of G-Sync monitors be TN? Any IPS or VA G-Sync monitors announced?
 
There's only one 144Hz TN panel announced so far.
We are probably looking at the end of 2014 for a semi-decent choice of G-Sync models.
 
May sound like a silly question, but does this restrict multiple-monitor usage at all? Assuming the variability would be uniform across monitors?

Just noticing that the places slated to sell the new G-Sync monitors generally only allow you one monitor.
 