Variable Refresh Rate Monitors (G-Sync) --- Refresh Rate Varies While You Play!!!

Mark Rejhon

This is an amazing MONITOR technology that nVidia just invented --
nVidia G-Sync is a variable-refresh-rate monitor technology that allows the monitor to display frames from the GPU immediately, "on the fly". Refreshing of the screen is no longer at discrete intervals! The refresh rate is no longer an exact metronome; the monitor refreshes in sync with the frame rate!

nVidia:
http://www.geforce.com/whats-new/ar...evolutionary-ultra-smooth-stutter-free-gaming

HardOCP:
http://www.hardocp.com/news/2013/10/18/nvidia_introduces_gsync_technology

Blur Busters:
http://www.blurbusters.com/nvidia-g-sync-variable-refresh-rate-monitors/

AnandTech Live Blog:
http://www.anandtech.com/show/7432/nvidia-montreal-event-live-blog

IGN Article:
http://www.ign.com/articles/2013/10/18/nvidias-g-sync-could-eliminate-pc-game-stuttering-forever

This is a MONITOR technology -- custom monitor modifications are needed, so new monitors will be required for nVidia G-Sync.
It is a variable-refresh-rate technology (asynchronous monitor refreshing!): with nVidia's G-Sync, refreshes no longer happen on a fixed, discrete schedule.

Blur Busters has commented on its pros:
http://www.blurbusters.com/nvidia-g-sync-variable-refresh-rate-monitors/

AnandTech has the best explanations (great screenshots of nVidia's PowerPoint presentation):
http://www.anandtech.com/show/7432/nvidia-montreal-event-live-blog

Pros:
* The pros of VSYNC ON combined with the pros of VSYNC OFF -- best of both worlds
* Lower input lag at all framerates
* Eliminates stutters during varying framerates
* Eliminates tearing
* Varying framerates now look much better

Interesting Behavior:
* Display motion blur now scales directly with frametime (in non-strobed mode): sample-and-hold motion blur gradually shrinks the higher the framerate goes, up to a certain limit (144Hz). Just as 120fps@120Hz has half the display motion blur of 60fps@60Hz, you now get continuously variable display motion blur, all the way up to the display's maximum framerate/refresh rate. It's as if displays finally got a CVT that runs at all times while you play a game -- a continuously variable transmission instead of "gears" (60Hz, 75Hz, 85Hz, 100Hz, 120Hz) that require you to pause the game to switch. (A quick blur calculation appears at the end of this post.)

Cons:
* Motion blur won't be better than LightBoost. At best, it's similar to 144Hz.
..... (until the G-Sync max framerate limit is raised, e.g. future 240Hz/480Hz monitors)
* nVidia Lock-in (which may not be a problem for some)

Wishlist:
* LightBoost combined with G-Sync. Variable-rate strobing is reasonably practical above a certain frame rate (it requires ultra-precise strobe modulation to prevent brightness undulations during variable frame rates).
---OR---
* Variable refresh rate monitor with a higher frame rate limit than 144Hz, for PWM-free flicker-free LightBoost-like motion clarity.
This is harder because flicker-free LightBoost-like clarity won't occur until approximately 400fps@400Hz (and up), and current LCD panels cannot yet be refreshed at that frequency.

In fact, John Carmack has mentioned combining strobing and G-Sync, so it may eventually be possible too. Speaking for Blur Busters, I understand the visual-science concept behind nVidia's G-Sync methodology. It's a good stepping stone toward tomorrow's "Holodeck" (unlimited-refresh-rate displays that no longer require the "CRT bandaid" of strobing to eliminate motion blur).
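
For the numerically inclined, here is a tiny back-of-the-envelope sketch (Python, purely illustrative) of the sample-and-hold blur relationship mentioned above -- blur trail length is roughly frame persistence multiplied by on-screen motion speed:

Code:
def sample_and_hold_blur_px(frame_rate_hz, speed_px_per_s):
    """Approximate blur trail length on a non-strobed (sample-and-hold) display."""
    persistence_s = 1.0 / frame_rate_hz      # each frame is held for the full frame time
    return persistence_s * speed_px_per_s

for fps in (60, 120, 144, 240, 480, 1000):
    blur = sample_and_hold_blur_px(fps, 1000)    # 1000 pixels/second test motion
    print(f"{fps:4d} fps -> ~{blur:.1f} px of motion blur")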
 
Photographs from the liveblog are very self-explanatory for the technologically minded (the Blur Busters Squad fully understands -- as do display engineers and anyone with a good understanding of displays).

(Slides from the AnandTech liveblog: NVMontreal-094 through NVMontreal-099.)

Variable refresh rate monitors (e.g. G-Sync) allow the game's variable framerate to be synchronized to the monitor, completely eliminating stutters, completely eliminating tearing, and keeping input lag low.

And technologically, it's theoretically strobe-compatible (with special LightBoost modifications) assuming a minimum framerate (to avoid repeat strobes). I'm not sure if variable-rate strobing will be included as a feature (yet), but the fact that a few big names such as John Carmack have already talked about it is damn exciting to me as Chief Blur Buster. Clever engineering of strobe length can make variable strobing unnoticeable (at least above the flicker fusion threshold), with no brightness or flicker undulations, but it's quite difficult monitor engineering (e.g. Electronics Hacking: Creating a Strobe Backlight shows the complex engineering that goes into high-efficiency strobe backlights).
 
I am incredibly excited for this technology; it fixes many of my issues with modern gaming. I plan to hold off on my next display purchase so I can get something with this, and my next video card purchase is gonna have to be nVidia.

The days of lowering settings just so you don't dip below the refresh rate and stutter are gone. I think we'll be getting a huge perceivable performance increase in games as well.
 
I'll be the first to admit that when I first read the terribly unfortunate branding "G-Sync", the first place my mind went was NOT computing or monitor related.
 
Pros:
* Lower input lag at all framerates

Correct me if I'm wrong here. Vsync off will send the output buffer as it's being updated, hence tearing. Gsync will only send the last completed frame, even if a new one has almost finished.

This can be seen in the following slide:

6p0x.jpg


While the 3rd frame is being sent, a 4th frame is being generated. It may finish just after initiating the 3rd frame output to monitor, but will sit buffered. The scan time will always match the default refresh rate, e.g. 8.33ms for 120Hz.
 
Gsync will only send the last completed frame, even if a new one has almost finished. ... The scan time will always match the default refresh rate, e.g. 8.33ms for 120Hz.
Correct, frame transmission time of G-Sync is limited by whatever the current dotclock is.

The dotclock used with G-Sync currently corresponds to 144Hz (the current G-Sync maximum framerate), so frame transmission times are about 6.9ms, but tomorrow's G-Sync monitors will probably have 240Hz-class (and faster) frame transmission times. G-Sync is a stepping stone to tomorrow's infinite-framerate Holodeck displays. Within a few years, vendors will be forced to come up with faster methods of transmitting frames to the display (e.g. parallel DisplayPort 2.0 channels) to raise maximum framerates and reduce input lag even further.
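
A quick sketch of what those frame-transmission numbers look like (just 1/refresh, ignoring blanking overhead):

Code:
# Frame transmission / scanout time at a fixed dotclock, ignoring blanking overhead.
for max_hz in (120, 144, 240, 480):
    print(f"{max_hz:3d} Hz scanout -> ~{1000.0 / max_hz:.2f} ms per frame")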

However, look closely: the slide you posted says "Less Lag", which is exactly true. It won't eliminate frame transmission time completely, but it eliminates the frame-waiting time at all framerates below the maximum refresh rate. 45fps won't get "rounded off" to the next refresh slot, so you get 45fps with lower input lag.

So both nVidia and I are correct: you get lower input lag. Mathematically speaking, there is a theoretical average of approximately 3.5ms more input lag than 1000fps@144Hz VSYNC OFF, since in that scenario every scanline comes from a frame generated 1ms or less ago (spliced into the current refresh); half of the 6.9ms scanout is about 3.5ms. However, the advantage of stutter-free / tearing-free operation will almost always outweigh that ~3 millisecond difference between theoretical ultra-elite VSYNC OFF (1000fps@144Hz) and 144fps@144Hz operation, since you play more accurately when you're not disrupted by stutters. G-Sync also has less input lag than LightBoost -- and LightBoost still improves Battlefield 3 scores, because the improved human performance outweighs LightBoost's very minor sub-frame added input lag (see www.blurbusters.com/lightboost/testimonials ). Even pro game players have commented that they generally like G-Sync, although over time we shall see whether there are ultra-elite players who perform better without it.
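
That ~3.5ms figure is just half of the 144Hz scanout time; a minimal sketch of the arithmetic (the 1000fps VSYNC OFF comparison is the idealized case described above):

Code:
scanout_ms = 1000.0 / 144            # ~6.94 ms to scan out one frame at 144 Hz
avg_added_lag_ms = scanout_ms / 2.0  # average scanline age vs. freshly-spliced tearing frames
print(f"~{avg_added_lag_ms:.1f} ms average extra lag vs. idealized 1000fps VSYNC OFF")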

Frame transmission times will become faster, and scan times will become faster (today, 240Hz+ LCD panels already exist for HDTVs and could theoretically migrate to the desktop eventually). Eventually it won't matter once frame transmission gets faster, as G-Sync opens the door to future unlimited-refresh-rate technologies built on faster panel technologies (e.g. future generations of OLED, blue-phase LCD, etc.), because the fast-motion benefits remain human-perceptible all the way up to 1000Hz and beyond due to the sample-and-hold effect.

Flicker-free (non-strobed) G-Sync won't match LightBoost until G-Sync can operate at 400fps@400Hz or more (to approximately match the motion blur reduction created by LightBoost's 1/400th-second strobe length). For CRT-quality motion at LightBoost=10%, you would need flicker-free 700fps@700Hz (1.4ms per frame) up to flicker-free 1000fps@1000Hz (1.0ms per frame) to match ideal LightBoost motion blur quality without strobing; LightBoost strobe length varies from 1.0ms to 1.4ms at LightBoost=10%, depending on the monitor (I've since discovered some LightBoost monitors have strobe lengths as short as 1.0ms). Strobe-based motion blur elimination is a stopgap on the way to the ideal, theoretical unlimited-framerate display; CRT flicker is a bandaid for the same reason. VSYNC ON lag won't matter at 1000fps@1000Hz because it would add only 1ms of lag, just as many people now agree that a 1000Hz computer mouse really does have less lag than a 125Hz mouse. The silly "120Hz is beyond human vision limits" argument doesn't apply here: motion blur keeps shrinking from 120Hz to 240Hz to 480Hz to 960Hz on sample-and-hold displays (mathematically, 1ms of static frametime translates to 1 pixel of motion blur during 1000 pixels/second motion).
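
The strobe-versus-framerate equivalence above boils down to "matching framerate is roughly 1 / strobe length"; a quick sketch using the strobe lengths quoted in this post:

Code:
# Flicker-free framerate needed to match the motion clarity of a given strobe length.
for strobe_ms in (2.5, 1.4, 1.0):    # ~1/400 s, plus the 1.4 ms / 1.0 ms LightBoost=10% range above
    equivalent_hz = 1000.0 / strobe_ms
    print(f"{strobe_ms} ms strobe ~= flicker-free {equivalent_hz:.0f} fps @ {equivalent_hz:.0f} Hz")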

Blur Busters agrees that variable refresh rate displays are an appropriate stepping stone to the Holodeck future. I did not anticipate it would happen this soon; although the 144Hz limit may not be raised high enough for quite a while.
 
I have quickly come up with a new idea: combining PWM-free operation with LightBoost-style strobing while running G-Sync:
New Section Added to "Electronics Hacking: Creating a Strobe Backlight"

To the best of my knowledge, no patents exist on this, and not even John Carmack appears to have mentioned this in his twitch.tv video when he mentioned combining LightBoost with G-Sync. So I'm declaring it as my idea of a further improvement to nVidia G-Sync:

From: New Section in "Electronics Hacking: Creating a Strobe Backlight"

With nVidia’s G-Sync announcement, variable refresh rate displays are now a reality today. Refresh rates can now dynamically vary with frame rates, and it is highly likely that nVidia has several patents on this already. If you are a monitor manufacturer, contact nVidia to license this technology, as they deserve kudos for this step towards tomorrow’s perfect Holodeck display.

However, one additional idea that Mark Rejhon of Blur Busters has come up with is a creative PWM-free-to-strobing dynamic backlight curve manipulation algorithm that allows variable-rate backlight strobing without creating flicker at lower frame rates.

It is obvious to a scientist/engineer/vision researcher that to maintain constant perceived brightness during variable-rate strobing, you must keep strobing duty cycle percentages constant when varying the strobe rate. This requires careful and precise strobe-length control during variable refresh rate, as the display now refreshes dynamically on demand rather than at discrete scheduled intervals. However, a problem occurs at lower framerates: Strobing will cause uncomfortable flicker at lower refresh rates.

Mark Rejhon has invented a solution: Dynamic shaping of the strobe curve from PWM-free mode at low framerates, all the way to square-wave strobing at high framerates. The monitor backlight runs in PWM-free mode during low refresh rates (e.g. 30fps@30Hz, 45fps@45Hz), and gradually become soft gaussian/sinewave undulations in backlight brightness (bright-dim-bright-dim) at 60fps@60Hz, with the curves becoming sharper (fullbright-off-fullbright-off) as you head higher in framerates, towards 120fps@120Hz. At the monitor’s maximum framerate, the strobing more resembles a square wave with large totally-black-gaps between strobes.

Example:
10fps@10Hz — PWM-free backlight
30fps@30Hz — PWM-free backlight
45fps@45Hz — PWM-free backlight
60fps@60Hz — Minor backlight brightness undulations (bright / dim / bright / dim)
80fps@80Hz — Sharper backlight brightness undulations (very bright / very dim)
100fps@100Hz — Starts to resemble rounded-square-wave (fullbright / fulloff)
120fps@120Hz and up — Nearly square-wave strobing like original LightBoost

This would be a dynamically variable continuum all the way in between too, much like an automobile CVT instead of discrete gears. You avoid flicker at lower frame rates, and you get full strobing benefits at higher frame rates.

Simpler algorithm variations are also possible (e.g. keeping a square wave, and using only pulsewidth / pulseheight manipulation to achieve the blending effect, but without curve-softening). This is included as part of my general idea of blending from PWM-free at lower refresh rates, to strobing at higher refresh rates. The trigger framerates may be different from the example above (or may even be adjustable via a user flicker-threshold setting), but the concept is the same.
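
To make the blending idea concrete, here is a minimal illustrative sketch (Python) of the simpler pulse-width/pulse-height variation just described. The 60Hz/120Hz thresholds, the linear blend, and the 15% strobe duty cycle are arbitrary example numbers, not measurements of any real backlight controller:

Code:
def backlight_waveform(refresh_hz,
                       pwm_free_below_hz=60.0,   # at or below this: steady PWM-free light
                       full_strobe_at_hz=120.0,  # at or above this: full square-wave strobing
                       strobe_duty=0.15):        # lit fraction of the refresh when fully strobed
    """Return (duty_cycle, relative_peak_drive) for one refresh period."""
    if refresh_hz <= pwm_free_below_hz:
        blend = 0.0
    elif refresh_hz >= full_strobe_at_hz:
        blend = 1.0
    else:
        blend = (refresh_hz - pwm_free_below_hz) / (full_strobe_at_hz - pwm_free_below_hz)
    duty = 1.0 - blend * (1.0 - strobe_duty)     # ramps from 1.0 (PWM-free) down to strobe_duty
    peak = 1.0 / duty                            # boost LED drive to keep average brightness constant
    return duty, peak                            # (a real controller would clamp this boost)

for hz in (30, 45, 60, 80, 100, 120, 144):
    duty, peak = backlight_waveform(hz)
    print(f"{hz:3d} Hz: lit {duty * 100:4.0f}% of refresh, peak drive x{peak:.2f}")

A real implementation would also shape the on/off transitions (the gaussian-to-square-wave softening described above), which this simple sketch omits.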

If nVidia or any monitor manufacturer uses this idea (provided no patent application dated before October 19, 2013 covers it), please give Mark Rejhon / Blur Busters appropriate due credit. It is understood that nVidia has several patents, but none appears to cover this additional improvement to the combination of strobing and variable refresh rates. As of this writing, research is being done into prior art, to determine whether anyone has previously considered dynamically blending from PWM-free operation to square-wave strobing. If anyone else already came up with this idea and documented it in a patent application prior to October 19, 2013, please let me know and due credit will be given here.
 
When would you strobe the backlight for lightboost using a variable refresh rate display? From BlurBusters: "Time your LED backlight strobe circuit to flash the backlight towards the end of the blanking interval, preferably partially overlapping the start of next refresh by a fraction of a millisecond". With a variable refresh rate it would not be possible to know how long until the next refresh starts. So would you strobe at the start of the next refresh knowing that the pixel transitions have started during the strobe, or delay the refresh by the strobe length and introduce a very small amount of additional lag? This matters less as strobe times decrease, but shortening them also has other side-effects.

Interesting that someone else had the same idea (and name) for variable refresh rate back in March (here). Not sure who actually came up with the idea/name first, but this potentially limits Nvidia's patents and trademarks.

I'm also curious about how RTC would be implemented with a variable refresh rate. Traditionally the overvoltage has been tuned to the refresh rate, and if the refresh rate is increased this can actually increase the pixel response times (link). One possible solution is to have the panel refresh its voltages internally at regular intervals (like the Eizo FDF-2405W), but then you have both new-frame refreshes and voltage-update refreshes that need to be handled simultaneously by the controller.

Hopefully variable refresh rates will make it obvious how much of a data bottleneck current display cables are. Without the refresh rate being artificially limited it should now be possible to update a display as fast as the link allows, meaning that with more exotic controllers using multiple cables we could get 120+Hz at 4k (or more) very quickly. If the display controller becomes the bottleneck it should even be possible to move some of the display update duties to the GPU, meaning that the display itself no longer handles refreshing, strobe timing, RTC, etc....
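
To put rough numbers on the cable-bottleneck point (back-of-the-envelope only; this ignores blanking and protocol overhead, and assumes the commonly quoted ~17.3 Gbit/s usable payload for 4-lane DisplayPort 1.2):

Code:
def pixel_data_gbps(width, height, bits_per_pixel, refresh_hz):
    """Raw pixel data rate, excluding blanking and protocol overhead."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

needed = pixel_data_gbps(3840, 2160, 24, 120)
print(f"4K @ 120 Hz, 24 bpp: ~{needed:.1f} Gbit/s of pixel data (vs ~17.3 Gbit/s DP 1.2 payload)")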
 
I can't give credit to companies like Nvidia for pushing proprietary stuff; it should be universal so it works with cards that are not Nvidia.
 
I did not anticipate it would happen this soon; although the 144Hz limit may not be raised high enough for quite a while.

Unfortunately, one large issue with going above 144 Hz from a whole-machine perspective is the CPU. GPUs may be fast enough to run at higher and higher refresh rates, but CPUs have pretty much stagnated, and none are fast enough to feed GPUs 200+ FPS in modern titles. CPUs from the last 3 years are within a 10% performance margin of each other, which is really sad. The demand on the CPU only increases, leaving the dream of super fast refresh rates above 144 Hz impractical at this point in time. Of course, you can then go into frame interpolation territory, but that is another thing altogether.
 
I'm extremely excited about the potential for this tech and I think it's a given that it won't stay "just with Nvidia." I could see this benefiting TVs, monitors, you name it.
 
We have a superior, low-persistence mode that should outperform that unofficial implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.
Wow, if that turns out to be true, that would be quite a reason to jump on the G-Sync bandwagon for sure :)
 
Anyone have any clue when we will see monitors that support this? Nvidia mentions an ASUS display will have G-Sync built in, but I can't find any info on when it will be for sale :confused:
 
We can also hope that future G-Sync monitors will be sold both with and without the kit installed, so you can transfer your own kit if you already own one.
 
Regarding light boost:
http://www.neogaf.com/forum/showpost.php?p=86572603&postcount=539

AndyBNV said:
We have a superior, low-persistence mode that should outperform that unofficial implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.

(When asked about lightboost)
That's a definite dead giveaway for variable-rate strobing. The question is: will it flicker during strobe rate changes? Or did nVidia already come up with an algorithm that prevents noticeable flicker during variable-rate strobing? If so, then I'd love to know -- and I can give them due credit for finally making strobe backlights an official feature!

Unfortunately one large issue with going above 144 Hz from a total machine concept is the CPU.
Fortunately, that won't be necessary if we can combine G-Sync + strobing!

Blur Busters is trying to confirm.
 
NV did some cheating on their side with these comparisons :mad:
They ran the left monitor at 60Hz instead of 144Hz, increasing tearing and choppiness by a factor of 2.4x.

It's easy to tell: if they had run the monitor at 144Hz with the demo/game running at less than 60fps, then at least one of every two frames would show no tearing; here almost all frames tear, so it must be running at 60Hz.
 
NV did some cheating on their side with these comparisons :mad:
They ran the left monitor at 60Hz instead of 144Hz, increasing tearing and choppiness by a factor of 2.4x.

I wouldn't say it's cheating; the vast majority of people have 60Hz monitors. You could say they picked an extreme example, but that's what they should do IMO, so that more than just a handful of geeks can tell the difference.
 
I don't like tying monitors to gaming like this.
It means we must change monitors more often, and we can't use professional monitors to play games properly.

I don't like this.
 
In any case, what's the point of a monitor that removes tearing if vsync exists?

Vsync works fine with nvidia, ati and any other possible combination, so why have a monitor that removes tearing on nvidia GTX cards only?
 
In any case, what's the point of a monitor that removes tearing if vsync exists? Why have a monitor that removes tearing on nvidia GTX cards only?

Money for Nvidia.
 
In any case, what's the point of a monitor that removes tearing if vsync exists? Why have a monitor that removes tearing on nvidia GTX cards only?
If you can push a constant 60fps and totally don't mind horrendous input lag, then yes, v-sync is the way to go... for you :D
 
Nvidia is pushing technologies that bring no real improvements to customers, only more money to shell out.
I don't like this nvidia behaviour.

Bad nvidia, they should make this technology and give it away for free...
...because you know, companies are founded to spend money, not to earn it...
 
If you can push a constant 60fps and totally don't mind horrendous input lag, then yes, v-sync is the way to go... for you :D

Gsync is an amazing technology for nvidia's pockets. There's no benefit in a hardware vsync while we have the software one, and no need for reduced lag while we have fast monitors that work with cards from every brand. This tech makes no sense to me apart from creating a stronger bond with nvidia customers. Suppose that today I buy an nvidia card with an nvidia-ready monitor; what happens if ATI makes a good card and I want to upgrade to it? Surely I'd think twice before switching to ATI if I have an nvidia monitor. I repeat: I see no real improvements for end users, only for nvidia's pockets.
 
In any case, what's the point of a monitor that removes tearing if vsync exists? Why have a monitor that removes tearing on nvidia GTX cards only?

You don't understand the tech at all then. V-sync has so many shortcomings and is such a poor solution that there is no comparison with something like this. It's been explained before: the idea is to make the monitor a slave of the GPU (it's no longer the GPU that waits for the monitor, as with v-sync) >>>>> perfect smoothness and no tearing no matter how much the framerate fluctuates -- and we know how much it can fluctuate in graphically intensive games.
 
I don't like tying monitors to gaming like this. It means we must change monitors more often, and we can't use professional monitors to play games properly.

It's not like the industry does this to be mean.
It wasn't like this during the CRT era; the only reason it is like this now is that professional displays are almost always IPS/VA, which are too slow to display motion properly.

Vsync has many flaws that can't be fixed, such as huge input lag, and a fixed framerate that you must greatly exceed on average, or else dips in framerate cause drops to 50% or 2/3 of the refresh rate, depending on buffering.

Gsync is like vsync, but done right, without those problems. Vsync is so problematic that most people don't use it.
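
To put numbers on that "drops to 50%" point, a small sketch of double-buffered vsync at 60Hz (illustrative only; the G-Sync column assumes the panel can refresh at the listed rate):

Code:
import math

REFRESH_HZ = 60.0
period_ms = 1000.0 / REFRESH_HZ        # ~16.67 ms per refresh slot

for render_ms in (17.0, 25.0, 34.0, 50.0):
    slots_waited = math.ceil(render_ms / period_ms)     # double-buffered vsync waits for the next slot
    vsync_fps = 1000.0 / (slots_waited * period_ms)
    gsync_fps = 1000.0 / render_ms                      # displayed as soon as it is ready
    print(f"render {render_ms:4.1f} ms -> vsync ~{vsync_fps:4.1f} fps, gsync ~{gsync_fps:4.1f} fps")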
 
V-sync has so many shortcomings and is such a poor solution that there is no comparison with something like this.

Vsync has many flaws that can't be fixed, such as huge input lag... Gsync is like vsync, but done right, without those problems.

Every game uses triple buffering today, so there's no tearing and no framerate cap from vsync, even on a rig that can't hold a stable 60fps.
 
This is a free market, not communism, and NV is actually doing something to help gamers, while AMD only helps miners (the only reason to get an AMD card ;) )

It is better to have an NV-only option than no option at all. If someone doesn't like it, they can wait for an AMD-based solution for the next few years... or decades :)

If my FW900 dies, then I will buy a G-Sync monitor and a GeForce to drive it, for sure.
(Image: the "shut up and take my money" meme.)
 
I can guarantee you that AMD will have an equivalent of this in short order. The Nvidia haters need to cool off. The promise behind this kind of tech is too big to ignore and it won't stay "proprietary" for long. Relax.

Now I guess I'm going to splash a little cold water around today just on this point:

My excitement for this is tempered simply by the fact that, at least for the next year if not longer, the only way I'm going to be able to get this is by buying a TN monitor. I know I'm hardly the only one who deflates at that prospect, so I can only hope this concept takes off quickly across the board.
 
The promise behind this kind of tech is too big to ignore and it won't stay "proprietary" for long. Relax.

Can you elaborate on which promises are too big?
The solution to a problem that isn't a problem, since it already has a software solution?
 
Can you elaborate on which promises are too big?
The solution to a problem that isn't a problem, since it already has a software solution?

You obviously don't have even a basic understanding of the topic to be making such silly comments.
 