Nvidia G-Sync - A module to alleviate screen tearing

Anandtech also did a piece on this and there was one thing that really stood out:

Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.

So it might indicate that they aren't using specific hardware on the GPU-side, just software. On the screen though, you need a new controller because it must accept the variable v-blank signal.
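
Roughly, the timing difference looks something like this (just my own toy model in Python to illustrate the idea; it's not NVIDIA's implementation or any real driver/DisplayPort API):

Code:
# Toy model of the timing idea only -- not NVIDIA's implementation or any real API.
import math
import random
import time

PERIOD = 1 / 60  # a classic 60 Hz monitor refreshes every ~16.7 ms, ready or not

def render_frame():
    """Stand-in for the GPU; frame times vary between 10 and 30 ms (33-100 fps)."""
    time.sleep(random.uniform(0.010, 0.030))

start = time.perf_counter()
for _ in range(5):
    render_frame()
    done = time.perf_counter() - start
    # Fixed refresh: the frame can only appear at the next 16.7 ms v-blank boundary.
    fixed_visible = math.ceil(done / PERIOD) * PERIOD
    # Variable v-blank (the G-Sync idea): the source holds off v-blank until the
    # frame is ready, so the refresh happens immediately.
    print(f"frame done at {done * 1000:6.1f} ms | "
          f"fixed 60 Hz shows it at {fixed_visible * 1000:6.1f} ms | "
          f"variable refresh shows it at {done * 1000:6.1f} ms")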

Also this:

I can't stress enough just how smooth the G-Sync experience was, it's a game changer.

Too bad it requires a specific set of monitors.
 
So it might indicate that they aren't using specific hardware on the GPU-side, just software. On the screen though, you need a new controller because it must accept the variable v-blank signal.

Given that this can work with cards from the GTX 660 on up, it has to be software on the GPU side; Nvidia certainly didn't travel back in time and add a chip to the PCB of cards it released two years ago.

Too bad it requires a specific set of monitors.

Agreed. I bought my Dell U2413 less than a year ago. I'm very unlikely to buy a new monitor just for G-Sync. Might try to get one for Christmas, but certainly won't come out of my own money.
 
Anandtech also did a piece on this and there was one thing that really stood out:

Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.

So it might indicate that they aren't using specific hardware on the GPU-side, just software. On the screen though, you need a new controller because it must accept the variable v-blank signal.

Also this:

I can't stress enough just how smooth the G-Sync experience was, it's a game changer.

Too bad it requires a specific set of monitors.

At the recent NVidia event they were asked if they could open G-Sync up to work on all cards, including AMD's, but they were coy about whether they would do that. They probably won't.
 
Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.

That was kind of my first thought. This tech is a viable candidate to become an open-standard feature compatible with DP.

Considering how beneficial this tech is, display makers likely don't want it to be limited to one GPU vendor with additional licensing costs. Of course, this would require an open standard to be developed and adopted, and that usually happens a lot quicker if someone picks up the baton (as AMD did with regard to 4K and VESA Display ID v1.3).

AMD will want to utilize this kind of technology eventually. They could join the game with the same proprietary approach as nVidia, but then display makers would need to pay an additional licensing fee, and since nVidia's already in the game, they wouldn't gain much over simply making their version open source (which display makers would likely be more willing and quicker to adopt).

Either way, it will be interesting to see what happens with this technology in the future.
 
but there's already a fix for this.. cap fps at 60~61 and use vsync, no input lag and vsync is still on

edit: apparently you're insinuating that by capping your framerate at 60-61fps, you don't need to turn vsync on

this is 100% false. you will still get tearing if you cap your framerate (with Rivatuner for example) to 60-61fps, because the monitor will still be getting frames out of sync with the graphics card

try it if you don't believe me; I've done it with Deus Ex: HR as well as Dark Souls, both capped at 60fps, and still had tearing.
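
if it helps, here's a quick toy simulation (my own made-up numbers, nothing official) of why a ~61fps cap without vsync still tears: the flips just drift through the 60Hz scanout instead of locking to it

Code:
# Toy model of frame-buffer flips vs a 60 Hz scanout. Real scanout is line-by-line,
# but the phase-drift argument is the same.
REFRESH = 1 / 60.0   # monitor refresh period (60 Hz)
FRAME   = 1 / 61.0   # game capped at ~61 fps, so frame time is slightly shorter

tears = 0
t = 0.0
for _ in range(600):                  # roughly ten seconds of gameplay
    t += FRAME                        # without vsync the GPU flips as soon as a frame is done
    phase = (t % REFRESH) / REFRESH   # where in the scanout the flip lands (0.0-1.0)
    if 0.02 < phase < 0.98:           # a flip mid-scanout shows up as a tear line
        tears += 1

print(f"{tears} of 600 flips landed mid-scanout")   # nearly all of them: capping != syncing

it prints that nearly all of the 600 flips land mid-scanout, which is exactly the drifting tear line you see in-game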

your solution doesn't solve the inherent problem with current display tech, in that when your fps drops below the refresh rate of your monitor, your game will feel "choppy"

you do realize that gsync makes it nearly imperceptible that your game is dropping framerate
 
edit: apparently you're insinuating that by capping your framerate at 60-61fps, you don't need to turn vsync on

this is 100% false. you will still get tearing if you cap your framerate (with Rivatuner for example) to 60-61fps, because the monitor will still be getting frames out of sync with the graphics card

try it if you don't believe me; I've done it with Deus Ex: HR as well as Dark Souls, both capped at 60fps, and still had tearing.

your solution doesn't solve the inherent problem with current display tech, in that when your fps drops below the refresh rate of your monitor, your game will feel "choppy"

you do realize that gsync makes it nearly imperceptible that your game is dropping framerate


Well, everyone perceives things differently. Different eyes. Some people don't even care about 60 fps. I can notice all the way to 120 and beyond. With Lightboost 10/65 and a color profile at 120 Hz I notice no flicker. I'm a big-time CRT guy. If you notice flicker or problems, stick to CRT or IPS.
 
Given that this can work with cards from the GTX 660 on up, it has to be software on the GPU side; Nvidia certainly didn't travel back in time and add a chip to the PCB of cards it released two years ago.

Maybe it's because DisplayPort became standard on Kepler.

I know my Fermi card doesn't have DP (nor do my monitors) so I don't see how it could work regardless in my case.
 
Maybe it's because DisplayPort became standard on Kepler.

I know my Fermi card doesn't have DP (nor do my monitors) so I don't see how it could work regardless in my case.

Could be that as well.

G-Sync is slowly intriguing me more and more. If this becomes part of the Displayport standard…
 
edit: apparently you're insinuating that by capping your framerate at 60-61fps, you don't need to turn vsync on

this is 100% false. you will still get tearing if you cap your framerate (with Rivatuner for example) to 60-61fps, because the monitor will still be getting frames out of sync with the graphics card

try it if you don't believe me; I've done it with Deus Ex: HR as well as Dark Souls, both capped at 60fps, and still had tearing.

your solution doesn't solve the inherent problem with current display tech, in that when your fps drops below the refresh rate of your monitor, your game will feel "choppy"

you do realize that gsync makes it nearly imperceptible that your game is dropping framerate

reading comprehension fail?

I said cap fps at 60~ and vsync on..

gsync seems good but they are asking people to get new monitors when this could be solved with proper drivers or gpu output
 
I said cap fps at 60~ and vsync on..

That's not nearly as good as G-Sync; you still get input lag and a choppy feeling when the framerate drops below 60.

G-Sync is done on the monitor side; you clearly don't understand how it works if you think it could simply be a software solution. The Nvidia guy said it clearly in the vid: the idea is to make the monitor a slave of the GPU instead of the other way around, so the monitor needs to support this, which is why we need new monitors. It's not complicated, really. At least to me it was crystal clear after watching the vid (and listening); even though it was a crappy YouTube vid at 30fps, you could still clearly tell the difference.

http://hardforum.com/showthread.php?t=1786697 > check this thread maybe it will help you understand.
 
reading comprehension fail?

I said cap fps at 60~ and vsync on..

gsync seems good but they are asking people to get new monitors when this could be solved with proper drivers or gpu output

Why do you think this can be solved with software?

Nvidia is making a custom circuit board that needs to be added to a display because the display manufacturers (aside from BenQ and to a lesser extent Asus) have been stagnant on addressing gaming concerns.
 
I notice [micro] stuttering quite easily and it does irritate me. I notice tearing too but it doesn't bother me nearly as much. I think the tech is interesting, I just don't think it will really take off unless it sees mass adoption, and by mass adoption I mean essentially becoming a market standard for monitors. And because it's NVidia, that probably means they want to charge way too much for manufacturers to incorporate this technology into their displays, which means most won't.

I'm not sure about that.

The G-Sync module would be installed in place of the monitor's current DP/display interface. You're substituting one component in manufacturing for another. Will it cost more? Yes, but not as much as it otherwise could have.

The essential thing is to make this a standard for Displayport going forward. Give Nvidia their penny or two as inventors, use economies of scale to drive down the cost of the module.

This should definitely be included in 4K monitors going forward, I'm thinking.
 
Some people just don't get this technology and just how much it turns the gaming landscape on its head. It's like a complete rewrite. Nvidia is one hell of an innovative company: you've got G-Sync, Lightboost, 3D Vision, Project Shield; it was technically the first company to coin the term "GPU" with the GeForce 256 and the first with Transform & Lighting (T&L)... you gotta hand it to them, whether you're an AMD fan or an Nvidia fan, they have cemented their place in history and have changed the path of gaming in so many ways.
 
I have been wanting something like this since I first started gaming 10 years ago. It's amazing that it took this long to come up with this, but I am glad they finally did.
 
Um, it's recorded by a 120 Hz camera and slowed down to 1/4 speed. I take it you guys didn't even watch the video, as the effect is very apparent. ;)
I watched, but I also remember them saying we can't see the full benefits with it being recorded like that. To be clear, I can easily see it, but I am saying it will look better in person, of course.
 
That video is a pretty good example, but based on what people say we will have to see it in person to really tell the difference.
 
Um, it's recorded by a 120 Hz camera and slowed down to 1/4 speed. I take it you guys didn't even watch the video, as the effect is very apparent. ;)

YouTube re-encodes at 30fps.

Yes, the effect is visible in a decent-quality video, but it's not the same as seeing it in person.
 
Yes, but not frames, in this case.

It's still not an accurate representation of the original 30fps content. You're asserting we're seeing the same thing the uploaders did after dropping down from 120Hz before uploading. We're not.
 
It's still not an accurate representation of the original 30fps content. You're asserting we're seeing the same thing the uploaders did after dropping down from 120Hz before uploading. We're not.

why would we not be?

if the source is 30fps, and the target is 30fps...
 
why would we not be?

if the source is 30fps, and the target is 30fps...

because 30fps on a 60hz monitor is choppy
30fps on a G-Sync'd monitor is smooth

but a video recorded from a camera aimed at the gsync monitor is going to look like 30fps on a 60hz monitor (choppy)
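
frame rates that don't divide evenly into 60 are where it really shows; here's a rough illustration with a steady 40fps (my own toy numbers, nothing to do with how the demo was actually made):

Code:
import math

REFRESH = 1000 / 60            # ms per refresh on a fixed 60 Hz display
frame_times = [25.0] * 8       # a steady 40 fps game: one new frame every 25 ms

t = 0.0
shown_at = []
for ft in frame_times:
    t += ft
    # on a fixed display the frame only becomes visible at the next refresh boundary
    shown_at.append(math.ceil(t / REFRESH) * REFRESH)

gaps = [round(b - a, 1) for a, b in zip(shown_at, shown_at[1:])]
print(gaps)        # alternates 16.7 / 33.3 ms -> uneven pacing, perceived as stutter/judder
print([25.0] * 7)  # a G-Sync-style display shows each frame 25 ms apart -> even pacing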
 
Watching this video in full screen:

http://www.youtube.com/watch?v=NffTOnZFdVs&hd=1

I don't think there will be much of a motion clarity increase like in Lightboost, but the screen tearing and smoothness improvements look quite substantial.
OMG
NV is doing great stuff here, but they just couldn't resist cheating... :mad:

just look at it, it's freaking running at 60Hz, the left monitor I mean
a proper comparison would be 144Hz vs G-Sync, and in that case tearing would be reduced 2.4 times and hence the difference would be much harder to notice for inexperienced eyes, especially on some YT video
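
for what it's worth, the 2.4 is just the ratio of the refresh intervals, since a tear stays on screen for at most one refresh (quick sanity check, my arithmetic):

Code:
refresh_60hz  = 1000 / 60    # ~16.7 ms per refresh
refresh_144hz = 1000 / 144   # ~6.9 ms per refresh
print(round(refresh_60hz / refresh_144hz, 1))   # 2.4 -> a tear persists ~2.4x shorter at 144 Hz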
 
OMG
NV is doing great stuff here, but they just couldn't resist cheating... :mad:

just look at it, it's freaking running at 60Hz, the left monitor I mean
a proper comparison would be 144Hz vs G-Sync, and in that case tearing would be reduced 2.4 times and hence the difference would be much harder to notice for inexperienced eyes, especially on some YT video
so that is cheating? this tech is not just for 144Hz screens. and tearing is only part of it; it's the stuttering when you do have vsync on that is aggravating too.
 
OMG
NV is doing great stuff here, but they just couldn't resist cheating... :mad:

just look at it, it's freaking running at 60Hz, the left monitor I mean
a proper comparison would be 144Hz vs G-Sync, and in that case tearing would be reduced 2.4 times and hence the difference would be much harder to notice for inexperienced eyes, especially on some YT video

why would nvidia try to showcase this technology against a monitor most people have never even seen before (144hz)?

most people have 60hz monitors, it's a fair comparison
 
there will not be a G-Sync monitor with a panel slower than 6.9ms for quite some time, probably none in 2014
also, 60Hz monitor owners ARE NOT THE INTENDED TARGET GROUP for G-Sync
it is targeted at 120/144Hz freaks, not your ordinary "I don't see the point of 120Hz" folks

monitors with G-Sync will be fewer and more expensive than 120/144Hz ones and definitely aimed at hard-core gamers, especially online gamers

so yes, that is a cheat
If I was there I would kindly have asked them to run the left monitor at 144Hz and then never get an invitation to another NV event again :p
 
I see your point, but I don't think it's a big deal, as this is not just about tearing. plus many games will never maintain 80-90 fps, never mind 120 or 144. and you will have odd games like Skyrim that can't really be run over 60 fps without some issues.
 
why would nvidia try to showcase this technology against a monitor most people have never even seen before (144hz)?

most people have 60hz monitors, it's a fair comparison

Have to agree here, I think 120Hz gamers are a very small group compared to 60Hz gamers.

I mean, hell, you can get 1080p 60Hz monitors brand new for $100-120. I don't think you can even get a 1080p 120Hz one for under $300, right? (unless on sale)

Very fair comparison.
 
I see your point, but I don't think it's a big deal, as this is not just about tearing. plus many games will never maintain 80-90 fps, never mind 120 or 144. and you will have odd games like Skyrim that can't really be run over 60 fps without some issues.
120/144Hz is not about having to push 120/144fps but about:
- input lag reduction (v-sync both on and off)
- tearing reduction (if v-sync is off)
- stutter reduction
namely the same things that G-Sync does but to a lesser extent

the fact is that 100Hz is enough for most picky people like myself, and while motion is not perfectly smooth it is below the irritation threshold. So if they had shown 144Hz vs G-Sync, most people would have to look very hard to notice any difference imho. Tears in this video would disappear 2.4 times faster and would be hardly noticeable most of the time, especially without the 4x slowdown

I am not trying to say G-Sync is useless, just that if someone is not a 120/144Hz freak already then they should just not care about G-Sync at all. And the ignorance of most people in this thread suggests that they are just random people without any interest in this. Comments like 'nice but I'd have to buy a more expensive monitor....' suggest that those people don't even know what this technology is about. On the other hand, comments like 'I WILL buy a G-Sync monitor' show that those people DO KNOW what it is about ;)

---------------------------------
Have to agree here, I think 120Hz gamers are a very small group compared to 60Hz gamers.

I mean, hell, you can get 1080p 60Hz monitors brand new for $100-120. I don't think you can even get a 1080p 120Hz one for under $300, right? (unless on sale)

Very fair comparison.
not fair at all
when do you suppose we will have $100 G-Sync monitors? in 5 years? in 10 years?
I assume it will never go that low. It will always be expensive gamer-oriented equipment, and this comparison is bogus because it compares $100 monitor performance with a $300 monitor + $130 module :eek:
 
not fair at all
when do you suppose we will have $100 G-Sync monitors? in 5 years? in 10 years?
I assume it will never go that low. It will always be expensive gamer-oriented equipment, and this comparison is bogus because it compares $100 monitor performance with a $300 monitor + $130 module :eek:

If you are a gamer who is buying a 120/144Hz monitor for $300+, you know you're buying it at a premium price and you know what you are buying it for.

If you are willing to spend an extra $100-150 to get a 120Hz monitor over a normal 60Hz monitor, you are willing to think about spending another $130 for the module.

But again, only maybe 1% of PC gamers buy 120/144Hz.
 
And the ignorance of most people in this thread suggests that they are just random people without any interest in this. Comments like 'nice but I'd have to buy a more expensive monitor....' suggest that those people don't even know what this technology is about. On the other hand, comments like 'I WILL buy a G-Sync monitor' show that those people DO KNOW what it is about ;)
Talking about random people: who are you to tell us whether we "know" what G-Sync is about, purely based on our willingness to buy proprietary technology that is also a nice vendor lock-in to boot?
 