Nvidia G-Sync - A module to alleviate screen tearing

The display updates asynchronously to any internally-fixed interval.
There is no internally-fixed interval when G-Sync is active, that's the entire point... the refresh rate is kept perfectly synchronized to the current frame-rate being produced by the graphics card. The graphics card is the clock source for the display.

I don't know what you're not getting here, or how many more ways there are to say it. The display is updated, synchronously, to the graphics card's output. End of story. There is NOTHING "asynchronous" going on...
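To make the clocking difference concrete, here's a minimal sketch in Python (my own toy model, not NVIDIA's implementation; the frame render times are made up) of when frames actually hit the screen on a fixed 60 Hz display versus a GPU-driven one:

Code:
# Toy model: hypothetical frame render times (ms) for a game running at a variable rate.
render_times_ms = [14.0, 22.0, 9.0, 31.0, 16.0]

# Fixed-refresh display: the monitor is the clock source and scans out every ~16.7 ms.
# A finished frame either waits for the next tick (v-sync) or tears mid-scan.
REFRESH_MS = 1000 / 60
t, tick = 0.0, 0.0
print("Fixed 60 Hz refresh (v-sync):")
for r in render_times_ms:
    t += r                       # frame finishes rendering at time t
    while tick < t:              # it is shown at the next refresh tick
        tick += REFRESH_MS
    print(f"  ready {t:6.1f} ms -> displayed {tick:6.1f} ms (waited {tick - t:4.1f} ms)")

# G-Sync-style variable refresh: the GPU is the clock source, so the panel refreshes
# the moment a frame arrives and the wait is (ideally) zero.
t = 0.0
print("Variable refresh (GPU-driven):")
for r in render_times_ms:
    t += r
    print(f"  ready {t:6.1f} ms -> displayed {t:6.1f} ms (waited  0.0 ms)")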
 
During the Q&A session:
02:05PM EDT - Sweeney: every display device, every platform (including mobile platforms), will need to adopt this technology over the next few years, it's really striking

02:04PM EDT - Sweeney: the rest of the industry needs to hardware up

02:04PM EDT - Johan: when people actually see/play it, there will be a lot of movement to transition to something like this (G-Sync) really quickly, then we can target games specifically for this

02:04PM EDT - Carmack: in the space of 5 years I'd hope that something like this is ubiquitous

02:03PM EDT - Carmack: I think this is going to be broadly picked up, it's not that expensive of a technology, I don't know NV's licensing costs, this is something that should be broadly adopted, this is just the right thing
 
I don't know what you're not getting here, or how many more ways there are to say it. The display is updated, synchronously, to the graphics card's output. End of story. There is NOTHING "asynchronous" going on...
Fair enough. I concede the point.
 
What about LightBoost? That's monitor-specific, and I have one. It's great. G-Sync could be great too; give it a chance.
 
It was such an obvious thing to do for so many people... it's pathetic that it took a company like NV this long to figure out it could be a good selling point :eek:

It should provide a v-sync-like feel independent of framerate and without the input lag. But unfortunately the devil is in the details... it cannot flicker. Or it can, but it would be next to impossible to implement in a way that provides constant, non-fluctuating brightness.

So all LightBoost fans out there: nothing to see, move along :eek:

BTW, I don't get this Carmack guy. Why does he build idiotic 60 fps limitations into his engines when he likes low input lag that much? :confused:
 
Well, for G-Sync you need 144 Hz. I have a BenQ XL2411T and I don't understand why I should use v-sync??? For what? I don't have any stuttering or tearing because I don't have to use v-sync. Maybe it will improve input lag, but is that going to be a deal breaker?!

If they could pull off motion clarity like LightBoost without the flickering shit I can't stand, then it would be great. And I read that it only works over DisplayPort at the moment :/
 
Cool little niche I guess, I'll never give up my $300 2560x1440 Korean Display though!
 
I don't understand how people can use lightboost. It turns the display into a flickery seizure-inducing mess...


It puts your LCD within a hair of a CRT, if not actually better, in refresh times; it just seems that way until you get readjusted. Some people are so used to the slow-refreshing LCDs of old that it takes a while.
 
So this thing is a $100+ module? (At best, I believe $100 was touted as a "target" price, meaning it'll be higher indefinitely)

Good luck with that. I have my doubts the average person will care enough to pay the premium.

Plus, there needs to be some vendor-agnostic standard for this, or it'll have a hard time coming down in cost. Too many people with Intel and AMD GPUs out there.
 
It puts your LCD within a hair of a CRT, if not actually better, in refresh times; it just seems that way until you get readjusted. Some people are so used to the slow-refreshing LCDs of old that it takes a while.
No, it doesn't... might want to check how a CRT works, because Lightboost is an entirely different experience.

LightBoost turns off the backlight during redraw. This means the entire display turns black 120 times per second.
Some (or many) pixels will potentially go from fully white to fully black 120 times per second, which is HIGHLY noticeable.

Compare this to a CRT, which redraws the screen top-to-bottom (and a good one will do it 120 times per second).
The image is scanned top-to-bottom (with the previous image slowly fading out ahead of it), rather than the entire screen being strobed all at once (which is what LightBoost does).
Phosphor does NOT turn black instantly like an LED backlight; it slowly fades out after being activated. Brighter colors / white activate the phosphor more intensely, leading to a longer fade-out for brighter pixels.

Now, you have to remember, human vision is contrast-based. Larger changes in contrast can be perceived at faster rates than small changes in contrast. A CRT automatically cushions harsh transitions (white-to-black) thanks to the inherent fade-out time of phosphor. An LCD with LightBoost cannot do this; the pixels immediately drop from white to black (and back again) all at once.

I don't think you've used a properly set up Lightboost display.
Sorry, yes, I have. Staring at an LED-backlit display flickering at 120 Hz was not doing my eyes any favors (the stroboscopic effect is simply too harsh, the rate isn't fast enough for persistence-of-vision to work given the stark changes in brightness). Far worse than any 120Hz CRT I've ever used.
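For anyone who wants to see the waveform difference being described, here's a rough single-pixel sketch (toy constants I picked for illustration, not measured values): the strobed backlight cuts from full brightness to black instantly, while the phosphor decays gradually after each scan.

Code:
import math

FRAME_MS = 1000 / 120         # 120 Hz refresh period (~8.3 ms)
STROBE_MS = 2.0               # assumed backlight strobe length per frame
PHOSPHOR_TAU_MS = 1.5         # assumed phosphor decay time constant

def strobed_lcd(t_ms):
    """Strobed backlight: full brightness during the pulse, hard zero for the rest of the frame."""
    return 1.0 if (t_ms % FRAME_MS) < STROBE_MS else 0.0

def crt_phosphor(t_ms):
    """CRT phosphor (one pixel): excited at the start of each frame, then fades out exponentially."""
    return math.exp(-(t_ms % FRAME_MS) / PHOSPHOR_TAU_MS)

for t in [0.0, 1.0, 2.0, 3.0, 5.0, 8.0]:
    print(f"t={t:4.1f} ms  strobed LCD: {strobed_lcd(t):.2f}   CRT phosphor: {crt_phosphor(t):.2f}")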
 
As a long time FW900 user at 1920x1200 @ 96Hz, I find 120 Hz Lightboost (10% brightness) less fatiguing on my eyes. There may be something to be said about everyone being different. Stating "flickery seizure-inducing mess" is a bit over dramatic.
 
As a long time FW900 user at 1920x1200 @ 96Hz, I find 120 Hz Lightboost (10% brightness) less fatiguing on my eyes. There may be something to be said about everyone being different. Stating "flickery seizure-inducing mess" is a bit over dramatic.

How do you like Lightboost? Do you still use your FW900? I've currently packed mine away and am saving up money to send it in to Unkle Vito for some magic. :D
 
I predict this will be as successful as Killer NIC

I predict very soon you will feel VERY dumb about this clueless comment, comparing something which made no difference to something CLEARLY superior to not using it. Unless you are blind, there is no way you cannot see the benefits.
I also predict you are going to want to remove this post ashamed of it so I am quoting it so people can enjoy your moment of brilliance! :rolleyes:
 
LightBoost is pretty sweet, but you are stuck with a TN panel and all of its flaws. I currently do not use a FW900, but it is a superb monitor. Really, the only things I don't like about it are the somewhat small 22.5" viewable area and the "soft" image natural to all CRTs.
 
No, it doesn't... might want to check how a CRT works, because Lightboost is an entirely different experience.

LightBoost turns off the backlight during redraw. This means the entire display turns black 120 times per second.
Some (or many) pixels will potentially go from fully white to fully black 120 times per second, which is HIGHLY noticeable.

Compare this to a CRT, which redraws the screen top-to-bottom (and a good one will do it 120 times per second).
The image is scanned top-to-bottom (with the previous image slowly fading out ahead of it), rather than the entire screen being strobed all at once (which is what LightBoost does).
Phosphor does NOT turn black instantly like an LED backlight; it slowly fades out after being activated. Brighter colors / white activate the phosphor more intensely, leading to a longer fade-out for brighter pixels.

Now, you have to remember, human vision is contrast-based. Larger changes in contrast can be perceived at faster rates than small changes in contrast. A CRT automatically cushions harsh transitions (white-to-black) thanks to the inherent fade-out time of phosphor. An LCD with LightBoost cannot do this; the pixels immediately drop from white to black (and back again) all at once.

Sorry, yes, I have. Staring at an LED-backlit display flickering at 120 Hz was not doing my eyes any favors (the stroboscopic effect is simply too harsh, the rate isn't fast enough for persistence-of-vision to work given the stark changes in brightness). Far worse than any 120Hz CRT I've ever used.

No buddy, I do understand how they work. I'm a long-time CRT guy. They're different technologies, but LightBoost pulls it off at the expense of some dithering on the desktop, depending on color profile. I should know, I've had one since they first came out.

VG248QE user. And remember, we're talking gaming here. Gaming on LightBoost versus a CRT looks almost identical to me, and I've used a shit-ton of CRTs. For 2D, CRTs have the edge, no doubt, if you get a really good one, but an IPS would be better.

http://www.blurbusters.com/zero-motion-blur/lightboost-faq/
 
Fast IPS (or pretty much anything but TN) with G-Sync. That would make me reeeeeeeally excited.
 
To be honest, I figured that Nvidia and AMD would focus on other things than this crap. I think they are both trying really hard with these new consoles coming out in a few weeks.
 
So this thing is a $100+ module? (At best, I believe $100 was touted as a "target" price, meaning it'll be higher indefinitely)

Yup, sounds like nvidia's pricing structure!
 
This has existed for years. It's called a 120 Hz monitor. Might be good for all you IPS people out there, though. I have no stutter on anything, or screen tearing. That is, like, half the point of 120 Hz+, lol.
 
I predict very soon you will feel VERY dumb about this clueless comment, comparing something which made no difference to something CLEARLY superior to not using it. Unless you are blind, there is no way you cannot see the benefits.
I also predict you are going to want to remove this post ashamed of it so I am quoting it so people can enjoy your moment of brilliance! :rolleyes:

Most people don't care about this feature and it will fall into a very small category of people who want and can afford it. 90% of the market will not buy into this.
 
LightBoost is pretty sweet, but you are stuck with a TN panel and all of its flaws. I currently do not use a FW900, but it is a superb monitor. Really, the only things I don't like about it are the somewhat small 22.5" viewable area and the "soft" image natural to all CRTs.

Interesting - I actually prefer the soft image that it displays.

On topic - hopefully this can evolve into being at the video card level.
 
"The huge gains in GPU rendering power over the past decade have enabled developers and artists to create increasingly complex 3D scenes and worlds. But even on the highest end PC, the illusion of reality is hampered by tearing and stutter. NVIDIA G-SYNC elegantly solves this longstanding problem. Images on a G-SYNC display are stunningly stable and lifelike. G-SYNC literally makes everything look better."
-- Tim Sweeney, founder, Epic Games

"NVIDIA's G-SYNC technology is a truly innovative solution to an ancient legacy restriction with computer graphics, and it enables one to finally see perfect tear-free pictures with the absolute lowest latency possible. The resulting output really allows your mind to interpret and see it as a true continuous moving picture which looks and feels fantastic. It's something that has to be seen to be believed!"
-- Johan Andersson, technical director, DICE

"With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter."
-- John Carmack, co-founder, id Software

Rollout Plans by Monitor Manufacturers
Many of the industry's leading monitor manufacturers have already included G-SYNC technology in their product roadmaps for 2014. Among the first planning to roll out the technology are ASUS, BenQ, Philips and ViewSonic.

"ASUS strives to provide the best gaming experience through leading innovations. We are excited about offering NVIDIA's new G-SYNC technology in a variety of ASUS gaming monitors. We are certain that it will impress gamers with its incredible step-up in smoothness and visual quality."
-- Vincent Chiou, associate vice president, Display Business Unit, ASUS

"We are extremely thrilled to build G-SYNC into our professional gaming monitors. The two together, offering gamers a significant competitive advantage, will certainly take PC gaming experience to a whole new level."
-- Peter Chen, general manager, BenQ Technology Product Center

"We can't wait to start offering Philips monitors with G-SYNC technology specifically for gamers. We believe that anyone who really cares about their gaming experience is going to want one."
-- Sean Shih, global product marketing director, TPV Technology (TPV sells Philips brand monitors)

"Everyone here at ViewSonic is pumped about the great visual experience that G-SYNC delivers. We look forward to making the most of this technology with our award-winning line of gaming and professional displays."
-- Jeff Volpe, president, ViewSonic

Enthusiasm by System Builders and Integrators
A variety of the industry's leading system builders and integrators are planning to make G-SYNC technology available in the months ahead. Among them are Digital Storm, EVGA, Falcon Northwest, Overlord and Scan Computers.

"A look at G-SYNC is a look at the future of displays. Having to go back to a standard monitor after seeing it will start to annoy you. It's that good."
-- Kelt Reeves, founder and CEO, Falcon Northwest.

"G-SYNC is a ground-breaking technology that will deliver a flawless gaming experience. We're hugely excited that our customers will be able to enjoy incredible gameplay at any frame rate without tearing or stutter."
-- Elan Raja III, co-founder, Scan Computers.
 
This has existed for years. It's called a 120 Hz monitor. Might be good for all you IPS people out there, though. I have no stutter on anything, or screen tearing.
If you aren't locked at 120 fps and vsync'ed, then yes, you have tearing and stuttering.
 


He doubts it will become a standard, but confirms that Nvidia says they will license G-SYNC. :)


Robert Menzel @renderpipeline
@ID_AA_Carmack Will there be a standard so other GPU vendors can also support G-SYNC displays?

John Carmack @ID_AA_Carmack
@renderpipeline I doubt it. Nvidia says they will license, but no idea on terms. We may see multiple implementations of similar ideas.

Digital Foundry @digitalfoundry
@renderpipeline More likely that Nvidia will license the tech was the impression I got at the event.

https://twitter.com/ID_AA_Carmack/status/391300867447853056
 
Probably a lot; sounds like Nvidia 3D Vision all over again.

G-Sync is practical; 3D Vision is a gimmick. G-Sync improves many negative aspects of display technology industry-wide. No comparison whatsoever.
 
Wonder how much those quotes cost Nvidia ;) jk

Here is Anand's comment on demo:
The G-Sync system, once again, handled the test case perfectly. It delivered the same smoothness and visual experience as if we were looking at a game rendering perfectly at a constant 60 fps. It's sort of ridiculous and completely changes the overall user experience. Drops in frame rate no longer have to be drops in smoothness. Game devs relying on the presence of G-Sync can throw higher quality effects at a scene since they don't need to be as afraid of drops in frame rate excursions below 60 fps.

Switching gears NVIDIA also ran a real world demonstration by spinning the camera around Lara Croft in Tomb Raider. The stutter/tearing effects weren't as pronounced as in NVIDIA's test case, but they were both definitely present on the traditional system and completely absent on the G-Sync machine. I can't stress enough just how smooth the G-Sync experience was, it's a game changer.
 
Interesting technology out of left field. I'm not sure I agree with what seem like over-the-top assertions of how much of a problem screen refreshing is, but it's certainly interesting anyway. It's almost as if they're trying to do to frame rate what high resolutions did to jagged lines.

Unfortunately, this seems sort of like high sound quality—you don't know what you're missing until you try it out, but it costs money to try out, so no one tries it out. It's particularly tough to ask people to buy new displays. If there were an external module that we could plug in between the Displayport connector on the monitor and the Displayport connector on the GPU, then it might take off.
 
I so hope we can add this to our existing displays!!!

Also, will this allow us to finally watch films and videos at their native frame rates?
 
I wonder how AMD, with its still somewhat broken frame-time issues, will compete with this...
 
I think an external module would be FAR less successful than an integrated solution. Not only that, but it seems to me, based on my understanding of how the technology is supposed to work, that an external module would introduce more latency, not less, all the while not actually doing anything more than applying its own v-sync clock to the monitor's native refresh rate. G-Sync, from what I'm gathering, dynamically adjusts the monitor's native refresh rate to whatever the FPS is.
 
I remember years ago when I got my first LCD, I noticed this screen tearing stuff immediately... I had trouble explaining even what this phenomenon was to my friends who didn't notice it or weren't bothered by it...

I pulled out my ancient 'Sony 17" Multisync' CRT monitor ($1000 back in the day, lol) to compare, and sure enough there was no tearing and it was nice and smooth.

That's when I realized, OK, something's up with LCDs... Then I learned all about v-sync and the benefits/negatives of that. Finally I got triple buffering to work and could somehow accept that this would be as good as it gets...

Anyway, I haven't gamed in a long time, but I read about the new adaptive v-sync, which I hear works well... But it's come to this: obviously someone at Nvidia is trying to solve the real issue here, which I applaud. I just hope this gets standardized among all monitors eventually, and this will be a thing you tell your kids about...
 
I remember years ago when I got my first LCD, I noticed this screen tearing stuff immediately... I had trouble explaining even what this phenomenon was to my friends who didn't notice it or weren't bothered by it...

I pulled out my ancient 'Sony 17" Multisync' CRT monitor ($1000 back in the day, lol) to compare, and sure enough there was no tearing and it was nice and smooth.

That's when I realized, OK, something's up with LCDs... Then I learned all about v-sync and the benefits/negatives of that. Finally I got triple buffering to work and could somehow accept that this would be as good as it gets...

Anyway, I haven't gamed in a long time, but I read about the new adaptive v-sync, which I hear works well... But it's come to this: obviously someone at Nvidia is trying to solve the real issue here, which I applaud. I just hope this gets standardized among all monitors eventually, and this will be a thing you tell your kids about...

It annoys me to no end that LCDs have been around for so long and are extremely popular, yet the advancement in LCD technology, particularly when it comes to monitors, has been excruciatingly slow.
 
G-Sync is more than just the elimination of screen tearing. It also lowers input lag and, most importantly, eliminates stuttering for a very smooth image.

Input lag is reduced because the monitor now draws the frame the instant it is received. With a set refresh rate, frames are only displayed at fixed intervals. The GPU kicks out frames as fast as possible; the monitor displays them as fast as possible. All with no screen tearing and great smoothness. Win/win/win.
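As a back-of-the-envelope illustration (my own numbers, not anything from NVIDIA), here's the extra wait a finished frame picks up on a fixed-refresh display versus a panel that scans out on arrival:

Code:
def added_display_wait_ms(frame_ready_ms, refresh_hz=None):
    """Extra time a finished frame sits before scanout starts.
    refresh_hz=None models a G-Sync-style panel that refreshes on arrival."""
    if refresh_hz is None:
        return 0.0                            # variable refresh: scan out immediately
    period = 1000 / refresh_hz
    return (period - frame_ready_ms % period) % period

# A frame that finishes 1 ms after a 60 Hz tick waits almost a full refresh period.
print(added_display_wait_ms(17.7, refresh_hz=60))   # ~15.6 ms
print(added_display_wait_ms(17.7))                  # 0.0 ms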
 
G-Sync is more than just the elimination of screen tearing. It also lowers input lag and, most importantly, eliminates stuttering for a very smooth image.

Input lag is reduced because the monitor now draws the frame the instant it is received. With a set refresh rate, frames are only displayed at fixed intervals. The GPU kicks out frames as fast as possible; the monitor displays them as fast as possible. All with no screen tearing and great smoothness. Win/win/win.

Sounds like Lightboost 2.0 then.
 