Nvidia G-Sync - A module to alleviate screen tearing

Personally I wish Nvidia would release a video card that could do 4K @ 120Hz or more. This seems like more of a Moore's Law band-aid, or a remedy for the garbage Rage and id Tech 5 became. Maybe I'm being too cynical, but this doesn't excite me. I just don't see everyone running out to buy new monitors, especially if you already invested in a 120Hz monitor.

The next innovation in monitors will be getting rid of TN panels; not propping the manufacturers up for another decade.
 
Well, if it's doing 144Hz with reduced input lag and variable vsync, I'd say G-Sync is LightBoost 2.0.
 
Or a remedy for the garbage Rage and id Tech 5 became.


What does G-Sync have to do with Rage? LOL. Rage ran at a buttery-smooth 60 fps; it was just ugly and boring.
 
LightBoost is designed to address pixel persistence to reduce ghosting in 3D. This is more like Adaptive Vsync 2.0.

LightBoost (with the proper 1ms monitor) reduces motion blur, G-Sync reduces input lag... I think they will work best in tandem. That's what will be pretty much a perfect setup: LightBoost plus G-Sync and a couple of solid GPUs.
 
Yes, it appears a lot of people are getting G-Sync and Lightboost confused. They are two totally separate technologies, but both improve displays greatly.
 
If you aren't locked at 120 fps and vsync'ed, then yes, you have tearing and stuttering.

No you don't, maybe in some games that exceed 120fps, but if you're under it there is no tearing. Even in BFBC2, which is one of the worst games for screen tearing for me, I notice nothing. At 200fps on a 60Hz monitor with no vsync it was a disgusting amount of tearing, though, lol. I dunno, it's just my experience; I have yet to notice any screen tearing on my previous BenQ XL2420T or my current Asus 144Hz. Part of the reason I got rid of my 1440p after 2 days. 120Hz or none.
 
The myth of no tearing below the refresh rate has been debunked about seven million times. Nothing changes as a result of going to 120Hz: you simply get (roughly) twice the number of tear lines, each appearing for half as long.
 
G-Sync is practical; 3D Vision is a gimmick. G-Sync improves many negative aspects of display technology industry-wide. No comparison whatsoever.

Gimmick? As always, spoken by someone with absolutely no idea what he is talking about. A gimmick... something that actually lets you experience 3D worlds in 3D instead of freaking old 2D flatlands... it baffles me how people seem to have no common sense any more. Let's keep creating more awesome cards which allow much more realistic 3D worlds, but let's keep experiencing them in 2D... yeah dude, it makes tons of sense, let's do that! :rolleyes:

Both are good things to have and totally unrelated. One helps fix annoying issues with frames dropping; the other gets us closer to seeing 3D worlds the way they should be seen.
This is why we take forever to move forward....
 
Interesting technology out of left field. I'm not sure I agree with what seem like over-the-top assertions of how much of a problem screen refreshing is, but it's certainly interesting anyway. It's almost as if they're trying to do to frame rate what high resolutions did to jagged lines.

Unfortunately, this seems sort of like high sound quality—you don't know what you're missing until you try it out, but it costs money to try out, so no one tries it out. It's particularly tough to ask people to buy new displays. If there were an external module that we could plug in between the DisplayPort connector on the monitor and the DisplayPort connector on the GPU, then it might take off.

I think you're missing the bigger picture on why this excitement isn't "over the top".

On a traditional LCD display, drops below 60 frames per second (on a 60Hz monitor) will cause the overall scene to "stutter" or jump around. Try aiming accurately when you're used to a constant 60 frames per second and all of a sudden your frame rate drops to the 30-40 range. Mouse movement (or "aiming", I should say) starts "stuttering", which ruins the accuracy you're used to at 60 frames. G-Sync essentially makes it nearly imperceptible that your framerate is dropping, and thus motion stays smooth.

This is seriously huge, whether people realize it or not. It completely changes what "playable framerate" means.

You have two problems with LCDs:

if your fps goes above the refresh rate, you get "tearing"
if your fps goes below the refresh rate, you get "stuttering"

Neither of these should happen, but both do because of the way LCDs have been designed.

G-Sync makes it so neither happens, regardless of what your framerate is.
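
To make the stutter half of that concrete, here is a rough sketch (plain Python with made-up frame times; a simplified model, not real driver behaviour) of how a fixed 60Hz refresh quantizes how long each frame stays on screen, while a variable refresh simply shows each frame for as long as it took to render:

```python
import math

REFRESH_HZ = 60.0
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per scanout slot

# Hypothetical render times in milliseconds (a brief dip from ~60fps to ~40fps).
render_times_ms = [15.0, 16.0, 18.0, 25.0, 24.0, 17.0, 15.0]

def fixed_refresh_display_ms(render_ms):
    """With v-sync on a fixed refresh, a finished frame still waits for the next
    scanout, so its time on screen rounds up to a multiple of the refresh interval."""
    return [math.ceil(t / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS for t in render_ms]

def variable_refresh_display_ms(render_ms):
    """With a variable refresh, the monitor scans out when the frame is ready,
    so the time on screen simply tracks the render time."""
    return list(render_ms)

for render, fixed, variable in zip(render_times_ms,
                                   fixed_refresh_display_ms(render_times_ms),
                                   variable_refresh_display_ms(render_times_ms)):
    print(f"render {render:5.1f} ms -> fixed: {fixed:5.1f} ms on screen, variable: {variable:5.1f} ms")
```

The thing to notice is the jump straight from ~16.7 ms to ~33.3 ms the instant a frame misses a refresh; that discontinuity is the stutter, and it goes away once the display waits for the frame instead of the other way around.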
 
While many are excited about the tech, let's not forget a few things:
- you need a specific kind of monitor (from Philips, BenQ, Asus or ViewSonic)
- or you need to "mod" your monitor (no news on which might work, except for a single Asus model)
- you need a specific type of GPU (Nvidia GTX 660 or newer)
- if you go to an AMD card, your monitor will (probably) still work, but as a normal one
- nothing has been said about multi-monitor, I believe
- you don't get noticeable lag at good fps (>60fps), but you do when using V-sync
- some games have a frame limiter built in, which already reduces the need for V-sync

While I like the tech and what it achieves, I can't (personally) condone this type of vendor lock-in and the premium hardware requirements needed for a technology that will someday be replaced and improved upon. Maybe next year there will be G-Sync 2.0 and you'll have to get entirely new hardware to circumvent some of the downfalls that have come to light.
 
Months ago, I created a paper, Electronics Hacking: Creating a Strobe Backlight, from my old Arduino scanning-backlight experiments before LightBoost became popular. (I also used to work in home theater equipment manufacturing, am very familiar with how LCD displays work, and understand nVidia's G-Sync fully.)

I have quickly come up with a new idea for combining a PWM-free backlight with LightBoost-style strobing while using G-Sync:
New Section Added to "Electronics Hacking: Creating a Strobe Backlight"

To the best of my knowledge, no patents exist on this, and not even John Carmack appears to have mentioned it in his twitch.tv video when he talked about combining LightBoost with G-Sync. So I'm declaring it as my idea for a further improvement to nVidia G-Sync:

From: New Section in "Electronics Hacking: Creating a Strobe Backlight"

With nVidia’s G-Sync announcement, variable refresh rate displays are now a reality today. Refresh rates can now dynamically vary with frame rates, and it is highly likely that nVidia has several patents on this already. If you are a monitor manufacturer, contact nVidia to license this technology, as they deserve kudos for this step towards tomorrow’s perfect Holodeck display.

However, one additional idea that Mark Rejhon of Blur Busters has come up with is a new creative PWM-free-to-strobing dynamic backlight curve manipulation algorithm, that allows variable-rate backlight strobing, without creating flicker at lower frame rates.

It is obvious to a scientist/engineer/vision researcher that to maintain constant perceived brightness during variable-rate strobing, you must keep strobing duty cycle percentages constant when varying the strobe rate. This requires careful and precise strobe-length control during variable refresh rate, as the display now refreshes dynamically on demand rather than at discrete scheduled intervals. However, a problem occurs at lower framerates: Strobing will cause uncomfortable flicker at lower refresh rates.

Mark Rejhon has invented a solution: dynamic shaping of the strobe curve, from PWM-free mode at low framerates all the way to square-wave strobing at high framerates. The monitor backlight runs in PWM-free mode at low refresh rates (e.g. 30fps@30Hz, 45fps@45Hz), gradually transitions to soft Gaussian/sine-wave undulations in backlight brightness (bright-dim-bright-dim) at 60fps@60Hz, and the curves become sharper (fullbright-off-fullbright-off) as you head higher in framerates, towards 120fps@120Hz. At the monitor's maximum framerate, the strobing more closely resembles a square wave with large totally-black gaps between strobes.

Example:
10fps@10Hz — PWM-free backlight
30fps@30Hz — PWM-free backlight
45fps@45Hz — PWM-free backlight
60fps@60Hz — Minor backlight brightness undulations (bright / dim / bright / dim)
80fps@80Hz — Sharper backlight brightness undulations (very bright / very dim)
100fps@100Hz — Starts to resemble rounded-square-wave (fullbright / fulloff)
120fps@120Hz and up — Nearly square-wave strobing like original LightBoost

This would be a dynamically variable continuum all the way in between, too, much like an automobile CVT instead of the discrete gears of a conventional transmission. You avoid flicker at lower frame rates, and you get full strobing benefits at higher frame rates.

Simpler algorithm variations are also possible (e.g. keeping a square wave, and using only pulsewidth / pulseheight manipulation to achieve the blending effect, but without curve-softening). This is included as part of my general idea of blending from PWM-free at lower refresh rates, to strobing at higher refresh rates. The trigger framerates may be different from the example above (or may even be adjustable via a user flicker-threshold setting), but the concept is the same.
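
To make the blending idea concrete, here is a rough sketch of how such a controller could work (my own illustrative Python, not anything from Blur Busters or nVidia; the 45Hz/120Hz thresholds, the 25% duty target, and the raised-cosine pulse shape are all placeholder choices): it blends a steady PWM-free level with a centered pulse, keeping the average per refresh constant so perceived brightness should not change with refresh rate.

```python
import math

DUTY = 0.25               # target average backlight duty cycle (placeholder)
PWM_FREE_MAX_HZ = 45.0    # at or below this rate: steady, PWM-free backlight
FULL_STROBE_HZ = 120.0    # at or above this rate: full strobing

def blend_factor(refresh_hz):
    """0.0 = fully PWM-free, 1.0 = full strobing, linear ramp in between."""
    if refresh_hz <= PWM_FREE_MAX_HZ:
        return 0.0
    if refresh_hz >= FULL_STROBE_HZ:
        return 1.0
    return (refresh_hz - PWM_FREE_MAX_HZ) / (FULL_STROBE_HZ - PWM_FREE_MAX_HZ)

def backlight_level(refresh_hz, phase):
    """Backlight brightness (0..1) at a given phase (0..1) within one refresh.

    Blends a constant level with a raised-cosine pulse centered in the refresh.
    Both components average to DUTY over one refresh, so the blend does too."""
    k = blend_factor(refresh_hz)
    steady = DUTY                      # PWM-free component, average = DUTY
    half_width = DUTY                  # pulse spans 2*DUTY of the refresh; average = DUTY
    if abs(phase - 0.5) < half_width:
        pulse = 0.5 * (1.0 + math.cos(math.pi * (phase - 0.5) / half_width))
    else:
        pulse = 0.0
    return (1.0 - k) * steady + k * pulse

# Sanity check: average brightness stays put while the waveform sharpens.
for hz in (30, 60, 90, 120):
    samples = [backlight_level(hz, i / 1000.0) for i in range(1000)]
    print(f"{hz:3d} Hz: blend={blend_factor(hz):.2f}, avg brightness={sum(samples) / len(samples):.3f}")
```

The exact curve shape and threshold rates would obviously need tuning against real flicker-sensitivity data; the only hard constraint is that the area under the curve per refresh stays constant.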

If nVidia or any monitor manufacturer uses this idea (if no patent application dated before October 19, 2013 covers my invention), please give Mark Rejhon / Blur Busters appropriate due credit. It is realized nVidia has several patents, but none appears to cover this additional improvement suggested for combining strobing with variable refresh rates. As of this writing, research is being done on prior art, to determine whether anyone has previously considered dynamically blending from PWM-free to square-wave strobing. If anyone else already came up with this idea and documented it in a patent application prior to October 19, 2013, please let me know & due credit will be given here.
 
Wonder how Lightboost will fit into this equation...

Flickering, which is what LightBoost does, can work because you have a constant refresh rate, and because of that you get a constant time between refreshes. If you flash for a constant amount of time, a constant delay after each frame is delivered (to hide pixel transition times), you get constant brightness.

With G-Sync, refresh rates are not constant anymore, so the times between refreshes are unknown. This could be solved by displaying the image (backlight on) for longer if the current frame took longer to render, so that overall brightness stays the same. It is the only solution, as messing with the backlight voltage is out of the question.

If, e.g., 100Hz flickering for 1ms has identical brightness to 50Hz flickering for 2ms, this could actually work and give the best gaming experience, unmatched even by the fastest CRTs :)

But there would probably be a lot of drawbacks and things to consider. How would eyes react to such inconstant flickering? Would the brightness be perceived as the same? I highly doubt it would work without issues...

Now, I am not saying those issues would be deal-breakers. I am only afraid that NV won't invest time and money into creating such a problematic technology. Besides, LightBoost is not an official blur-reduction technology but a mere hack, so it is not like Nvidia is a pro-flickering company...
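
The brightness-equivalence part, at least, is easy to sanity-check on paper: time-averaged output is just pulse width divided by refresh period (times strobe intensity), so equal duty cycles mean equal average light output regardless of rate. Whether the eye perceives it identically is the open question, but the arithmetic for the example above works out (a trivial check, just for illustration):

```python
# Average light output ~ (pulse width / refresh period) * strobe intensity.
# 100Hz with a 1ms flash and 50Hz with a 2ms flash are both a 10% duty cycle.
for hz, pulse_ms in ((100, 1.0), (50, 2.0)):
    period_ms = 1000.0 / hz
    print(f"{hz} Hz, {pulse_ms} ms pulse -> duty cycle {pulse_ms / period_ms:.0%}")
```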
 
Love the 1-year G-Sync warranty. Installation likely voids the manufacturer's warranty, and the first monitor this will be available for, the Asus VG248QE, has abysmal LightBoost colors.

The 2nd-gen 120Hz TN panels (Asus VG236H, Planar SA2311W, Acer GN245, Samsung Series 7 & 9) all have much better colors than the current 144Hz options.

Wait until some decent 144Hz TN panels are released (it's been 2 years) with built-in G-Sync.
 
With G-Sync, refresh rates are not constant anymore, so the times between refreshes are unknown. This could be solved by displaying the image (backlight on) for longer if the current frame took longer to render, so that overall brightness stays the same.
I've solved the flicker problem of combining G-Sync with LightBoost.
See my previous post.

Either way, the duty cycle would be dynamically manipulated to maintain a constant average brightness over time. This requires precise control of the backlight. The strobe curve would be calculated based on the time between the current refresh and the immediately preceding refresh.
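
Since the length of the current refresh interval isn't known until the next frame actually arrives, the controller has to commit to a pulse using the previous interval as its estimate. A tiny sketch of that bookkeeping (again just my own illustration, with a placeholder 25% duty target):

```python
DUTY = 0.25  # placeholder target duty cycle

class StrobeController:
    """Chooses each refresh's strobe pulse from the previous frame interval,
    the best available estimate of how long the new frame will stay on screen."""

    def __init__(self):
        self.last_refresh_s = None
        self.last_interval_s = 1.0 / 60.0  # assume 60Hz until we know better

    def on_refresh(self, now_s):
        """Called when a new frame is scanned out; returns the pulse length in seconds."""
        if self.last_refresh_s is not None:
            self.last_interval_s = now_s - self.last_refresh_s
        self.last_refresh_s = now_s
        # Constant duty cycle: longer intervals get proportionally longer pulses.
        return DUTY * self.last_interval_s

# Example: refreshes arriving at roughly 60, 45 and 30 fps.
ctrl = StrobeController()
t = 0.0
for interval in (1 / 60, 1 / 60, 1 / 45, 1 / 30, 1 / 60):
    t += interval
    pulse_s = ctrl.on_refresh(t)
    print(f"interval {interval * 1000:5.1f} ms -> strobe pulse {pulse_s * 1000:5.2f} ms")
```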

See: http://www.blurbusters.com/faq/creating-strobe-backlight/#variablerefresh
 
The 2nd-gen 120Hz TN panels (Asus VG236H, Planar SA2311W, Acer GN245, Samsung Series 7 & 9) all have much better colors than the current 144Hz options.
A number of users of the XL2420TE have mentioned it has better colors than both the XL2420T and the VG248QE. How they compare to previous panels is an interesting question, but I thought I'd mention the relative differences between current 24" panels, too.

Options in the triple-digit-Hz world are expanding a little faster now, thanks to the increased publicity, so innovation is going to speed up.
 
Example:
10fps@10Hz — PWM-free backlight
30fps@30Hz — PWM-free backlight
45fps@45Hz — PWM-free backlight
60fps@60Hz — Minor backlight brightness undulations (bright / dim / bright / dim)
80fps@80Hz — Sharper backlight brightness undulations (very bright / very dim)
100fps@100Hz — Starts to resemble rounded-square-wave (fullbright / fulloff)
120fps@120Hz and up — Nearly square-wave strobing like original LightBoost
It would be nice, but I don't suppose it is doable without running into nonlinearity issues from the variable voltage applied to the LEDs. It could be countered with calibration, though, and because we know that adds to the cost, it makes it highly unlikely to happen.

Also, eyes are not exactly linear devices, so it all needs further testing to see how it will work in practice.
 
And while AMD is increasing performance with their proprietary stuff, Nvidia still worries about crap like this...
Damn, they're hitting rock bottom...
 
The next innovation in monitors will be getting rid of TN panels; not propping the manufacturers up for another decade.

The next innovation in monitors should be getting rid of LCD tech altogether, hopefully...
We've had OLED and SED for ages already...
 
And while AMD is increasing performance with their proprietary stuff, Nvidia still worries about crap like this...
Damn, they're hitting rock bottom...
You mean AMD is still worrying about crap like this and Nvidia is increasing costs with proprietary stuff?

Because AMD still hasn't solved frame pacing completely (multi-monitor, for example), and Nvidia seems to keep making proprietary stuff like G-Sync and Project Shield.
 
Damn, there are so many trolls and clueless people in this thread. At least watch the videos or read the articles in depth before posting random blabbering. No, having a 120Hz or 144Hz monitor with LightBoost is not "better" than this, and no, CRTs were not tearing-free either. If you think that, you don't understand G-Sync.

And no, this isn't just about gaming; this tech will also be EXTREMELY useful for video playback. Having to fiddle around with custom resolution timings and v-sync to get smooth video playback can be a right pain, and some monitors/TVs aren't even capable of doing anything other than a fixed 60 or 120Hz refresh rate, so they fail at smoothly rendering all content (a 50fps video on 120Hz = bad, for example, or 24fps on 60Hz, etc.).
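
A quick way to see that point: with a fixed refresh rate, each source frame has to be shown for a whole number of refreshes, and when the rates don't divide evenly the repeat pattern is uneven, which is the judder. A small sketch of the cadence (a simple "repeat the most recent frame" model, nothing vendor-specific):

```python
import math

def cadence(source_fps, refresh_hz, frames=10):
    """How many refreshes each source frame is shown for on a fixed-rate display
    that always scans out the most recently decoded frame."""
    ratio = refresh_hz / source_fps
    return [math.floor((f + 1) * ratio) - math.floor(f * ratio) for f in range(frames)]

print("24 fps on 60 Hz :", cadence(24, 60))    # 2,3,2,3,... -> classic 3:2 pulldown judder
print("50 fps on 120 Hz:", cadence(50, 120))   # uneven mix of 2s and 3s
print("24 fps on 120 Hz:", cadence(24, 120))   # even 5,5,5,... -> smooth
print("50 fps on 60 Hz :", cadence(50, 60))    # an extra repeat every 5th frame
```

With a variable refresh, the display could simply refresh whenever the next video frame is due, so every frame is held for the same length of time.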

This is absolutely groundbreaking if they make no mistakes in the implementation. Combining this tech with 120Hz+ would of course be great for those games you can play at 120fps+, but that's not the most important part here - what's important is that even when your rig can't handle a constant 60 or 120fps, you can still enjoy silky-smooth gameplay. For graphically intensive games this is absolutely phenomenal. Even if you have the most powerful rig you will still get framerate fluctuations, and v-sync is a poor solution with heavy drawbacks (mostly true in multiplayer games, but personally I also hate v-sync in single-player FPSes and other fast-action games).
 
You mean AMD is still worrying about crap like this and Nvidia is increasing costs with proprietary stuff?

Because AMD still hasn't solved frame pacing completely (multi-monitor, for example), and Nvidia seems to keep making proprietary stuff like G-Sync and Project Shield.
Increasing costs? As if anyone would bother paying more for an Nvidia card that performs worse than an AMD card.
Unless AMD is completely retarded and relies on Mantle to compete, it's going to completely obliterate Nvidia in the next few years...
Performance over gimmicks...
 
It's just sad to see how many people don't realize the benefits of this new technology.

It's either that they are not able to perceive the stutter/tearing, or they are so ...... that they will not embrace, or at least give the benefit of the doubt to, a great new concept.

So this raises the following question for me:

Are there really a lot of people who can't tell the difference in a video game when there is stutter/tearing?? :confused:
 
Something that will need this badly is the new consoles.
Too bad they aren't using Nvidia tech.
 
Something that will need this badly is the new consoles.
Too bad they aren't using Nvidia tech.

That's true; consoles run at low fps and suffer from high latency. This tech could greatly improve console gaming.
 
Most latency is due to televisions having motion compensation, dynamic contrast, and other image-analysis processing that introduces the delays.
 
And while AMD is increasing performance with their proprietary stuff, Nvidia still worries about crap like this...
Damn, they're hitting rock bottom...

Mate, I think you need to delete your "kache", 'cause it's full of nonsense, old, lame arguments. :rolleyes:
 
But there's already a fix for this: cap fps at 60~61 and use vsync; no input lag, and vsync is still on.
 
But there's already a fix for this: cap fps at 60~61 and use vsync; no input lag, and vsync is still on.
Setting a cap above the refresh frequency with v-sync on doesn't do anything at all.
Also, if a game is coded badly and uses double-buffering with v-sync, then such a cap won't help with sudden frame drops to 30fps, which is a general problem with most games and v-sync.

The best solution for using v-sync without too much lag is a program called D3DOverrider.
There are also other triple-buffering enablers, but somehow only D3DOverrider does it without excessive input lag.
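
For what it's worth, here is roughly what an in-game fps cap boils down to (a bare-bones sketch of the general idea, not how any particular game or D3DOverrider actually implements it): the loop sleeps so that no frame finishes ahead of its time slice, which keeps the renderer from racing ahead and queueing frames behind v-sync. Whether that actually removes the v-sync lag depends on how the game buffers frames, which is what the triple-buffering discussion above is about.

```python
import time

TARGET_FPS = 60.0
FRAME_INTERVAL = 1.0 / TARGET_FPS

def render_frame():
    """Stand-in for real game rendering; takes a variable amount of time."""
    time.sleep(0.005)

def run_capped(num_frames=10):
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        frame_start = time.perf_counter()
        render_frame()
        # Sleep off whatever is left of this frame's time slice.
        next_deadline += FRAME_INTERVAL
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        else:
            next_deadline = time.perf_counter()  # running behind; don't try to catch up
        print(f"frame time: {(time.perf_counter() - frame_start) * 1000:.1f} ms")

run_capped()
```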
 
I wonder if someone will find a way to get this to work with a Korean Monitor.
 
I see no problem whatsoever if you cap fps and use vsync, at least using two 7950 cards; not sure about Nvidia, though.

"Setting a cap above the refresh frequency with v-sync on doesn't do anything at all"? Err... of course it does. Try using vsync in a game that runs at very high fps; it will have massive input lag. The only problem I see nowadays is LCD tech; I still remember how much better CRTs were for FPS games...
 
I'm all for G-Sync, as long as it also comes with PWM-free backlighting.

No buddy, I do understand how they work. I'm a long-time CRT guy.
Then you're going to have to explain why my 19" Viewsonic CRT running at 120 Hz doesn't murder my eyes, but all three LightBoost monitors I've played with in person have had obvious strobe...

I already went over the differences between a CRT and a LightBoost monitor, and I think it's pretty obvious what's going on, but you must have some other idea...

Edit: Most PWM backlights bother my eyes too. I have three Dell U2412M monitors right now, and I have to run them at at least 85% brightness to be usable, and 90+% brightness before the flicker is no longer apparent. That's a 430 Hz backlight...
 
Increasing costs? As if anyone would bother paying more for an Nvidia card that performs worse than an AMD card.
Unless AMD is completely retarded and relies on Mantle to compete, it's going to completely obliterate Nvidia in the next few years...
Performance over gimmicks...
lol. It's no fun when you are that obvious.

And last I looked, Nvidia most certainly can sell equally performing cards at higher prices. But yeah, you are right: only AMD innovates, and Nvidia won't even exist in a few years. :rolleyes:
 
But there's already a fix for this: cap fps at 60~61 and use vsync; no input lag, and vsync is still on.
Why would you say such a thing? You clearly have no idea how things work. Read up a bit more and you will see why G-Sync should greatly improve the gaming experience. Of course, based on comments I see all the time, most people can't even figure out the simple concept of adaptive vsync, so there's probably no hope for understanding G-Sync.
 
It's just sad to see how many people don't realize the benefits of this new technology.

It's either that they are not able to perceive the stutter/tearing, or they are so ...... that they will not embrace, or at least give the benefit of the doubt to, a great new concept.

So this raises the following question for me:

Are there really a lot of people who can't tell the difference in a video game when there is stutter/tearing?? :confused:

I notice [micro] stuttering quite easily and it does irritate me. I notice tearing too but it doesn't bother me nearly as much. I think the tech is interesting, I just don't think it will really take off unless it sees mass adoption, and by mass adoption I mean essentially becoming a market standard for monitors. And because it's NVidia, that probably means they want to charge way too much for manufacturers to incorporate this technology into their displays, which means most won't.
 