Nvidia G-Sync - A module to alleviate screen tearing

There will not be a G-Sync monitor with a panel slower than 6.9 ms for quite some time, probably none in 2014.
Also, 60 Hz monitor owners ARE NOT THE INTENDED TARGET GROUP for G-Sync.
It is targeted at 120/144 Hz freaks, not your ordinary "I don't see the point of 120 Hz" folks.

Monitors with G-Sync will be fewer in number and more expensive than plain 120/144 Hz monitors, and definitely aimed at hard-core gamers, especially online gamers.

So yes, that is a cheat.
If I had been there I would have kindly asked them to run the left monitor at 144 Hz, and then never gotten an invitation to another NV event again :p

Eh? Putting out some questionable info here.

1st: Those tears are pretty fast for a 1/4 speed video. What makes you think the left monitor is running at 60 Hz? You have any proof or just pulling that out of your ass? If it in fact was 60 Hz, there is nothing wrong with that. 99+% of monitors out there are 60 Hz displays. NVIDIA and AMD keep pushing for 4K, which are all 60 Hz displays. My main interest in G-Sync is pairing it with a 4K monitor. That would most likely get me to move away from Lightboost Surround, which has a ton of flaws.

2nd: G-Sync is targeted at ALL monitors. Just because they are putting it into a very popular gaming TN panel that is easy to retrofit first, doesn't mean that is all it's targeted at. As a matter of fact, 60 Hz panels like IPS and 4K panels would benefit even more from this technology than gaming TN panels. They mentioned this in their discussion video.

3rd. There is absolutely nothing negative about this technology. It fixes or reduces three main problems inherent to all displays: tearing, stuttering and input lag. Any and all displays could use this technology. Whether you are willing to pay the extra cost for it and are willing to stick with NVIDIA is completely personal.
 
Yeah, I have to back you up, Vega.
This is a low-cost addition to standard displays that have a DisplayPort connection (DisplayPort can self-clock; it doesn't have rigid timing).

If NVIDIA licenses the tech, it could be used on consoles, which wouldn't normally be paired with 120 Hz displays unless those were bought for another purpose.
It's very useful for non-120 Hz PC gamers too.
 
Really, if people had had the foresight decades ago, this is how displays would have always operated. The whole locked-at-60 Hz nonsense came about with the invention of CRT televisions, whose cheaper electronics matched the 60 Hz power grid.

It's actually sad that coming into 2014 we are just now fixing such an archaic design.
 
Really, if people had had the foresight decades ago, this is how displays would have always operated. The whole locked-at-60 Hz nonsense came about with the invention of CRT televisions, whose cheaper electronics matched the 60 Hz power grid.

It's actually sad that coming into 2014 we are just now fixing such an archaic design.

QFT, that is why some older CRT monitors are still sought after over LCDs today!
 
QFT, that is why some older CRT monitors are still sought after over LCDs today!

He didn't say that; he said that the TV timing standard has been in use since CRT days and is the cause of the problem.
Since then, we have had to lock all output devices to the display's refresh rate, which causes some jerkiness and delays whenever the refresh rate divided by the output framerate doesn't come out to an integer (or 1/2, 1/4, ... when the framerate exceeds the refresh rate); the sketch at the end of this post illustrates the arithmetic.
There is also an extra delay caused by the flyback period.

If you don't sync to the refresh rate you get tearing, and that only alleviates some, not all, of the delays.
This applies to both CRTs and current LCDs.

It isn't the reason why some people prefer older CRTs.
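
A minimal sketch of that arithmetic (my own illustration, not anyone's actual code, assuming an idealised 50 fps render rate on a vsynced 60 Hz display):

```python
# Toy model of vsync frame pacing: a finished frame must wait for the next
# vblank, so at 50 fps on a 60 Hz display every fifth frame stays on screen
# for two refresh periods instead of one -- the jerkiness described above.
import math

REFRESH_HZ = 60.0
RENDER_FPS = 50.0

refresh_ms = 1000.0 / REFRESH_HZ   # ~16.67 ms between vblanks
render_ms = 1000.0 / RENDER_FPS    # 20 ms between finished frames

shown_times = []
for frame in range(12):
    finished = frame * render_ms
    # Under vsync, the frame is displayed at the first vblank at or after completion.
    shown_times.append(math.ceil(finished / refresh_ms) * refresh_ms)

for frame, (a, b) in enumerate(zip(shown_times, shown_times[1:])):
    print(f"frame {frame} stayed on screen for {b - a:5.2f} ms")
```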
 
1st: Those tears are pretty fast for a 1/4 speed video. What makes you think the left monitor is running at 60 Hz? You have any proof or just pulling that out of your ass?
If you have ~50 fps on a 144 Hz monitor and a 120 Hz camera, then at least half of the frames of this video should be tear-free, yet almost all frames show tearing, so the refresh rate can't be high. The rapid changes in tearing behaviour when the framerate fluctuates around 60 fps also suggest 60 Hz. You wouldn't see a slowly drifting tear line at almost 60 fps; that would make no sense in a 144 Hz mode. (A rough sketch of that reasoning follows below.)
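
A rough back-of-the-envelope model of my own (it ignores the 120 Hz camera and assumes perfectly regular frame times, so it is only an upper bound) counting how many refreshes can contain a tear line at each refresh rate:

```python
# With vsync off, a refresh can only show a tear if at least one buffer flip
# lands inside its scanout window. Count the refreshes that contain a flip.
def tearing_fraction(fps, refresh_hz, seconds=10):
    """Fraction of refreshes during which at least one buffer flip occurs."""
    flip_times = [i / fps for i in range(int(fps * seconds))]
    refresh_period = 1.0 / refresh_hz
    torn = {int(t / refresh_period) for t in flip_times}
    return len(torn) / int(refresh_hz * seconds)

for hz in (60, 144):
    print(f"~50 fps on a {hz} Hz panel: "
          f"{tearing_fraction(50, hz):.0%} of refreshes can show a tear")
```

On those assumptions roughly 83% of refreshes can tear at 60 Hz versus about 35% at 144 Hz, which is why an almost-always-torn video points at a low refresh rate.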

If it in fact was 60 Hz, there is nothing wrong with that. 99+% of monitors out there are 60 Hz displays. NVIDIA and AMD keep pushing for 4K, which are all 60 Hz displays.
2nd: G-Sync is targeted at ALL monitors. Just because they are putting it into a very popular gaming TN panel that is easy to retrofit first, doesn't mean that is all it's targeted at. As a matter of fact, 60 Hz panels like IPS and 4K panels would benefit even more from this technology than gaming TN panels. They mentioned this in their discussion video.
OK, but this is a 144 Hz monitor, and all of the first G-Sync monitors will be 144 Hz monitors.
They showed a comparison against 60 Hz hoping to get a better response.
As it seems, their little white lie worked out fine for them :)

3rd. There is absolutely nothing negative about this technology. It fixes or reduces three main problems inherent to all displays: tearing, stuttering and input lag. Any and all displays could use this technology. Whether you are willing to pay the extra cost for it and are willing to stick with NVIDIA is completely personal.
I never said there is anything wrong with it, I just pointed out that they used obsolete 60 Hz when they could just as well have used 144 Hz to show the real difference to a potential buyer of the G-Sync module for the Asus VG248QE. Don't you think that would be the fairer comparison? :rolleyes:

If they had done that, tearing would have been 2.4 times less visible and most people watching this video wouldn't have noticed what the fuss was about. NV just didn't want that. Their reasoning is totally understandable and I don't blame them, but a fact is a fact.
 
NVIDIA only announced a single 144 Hz monitor by name, the Asus VG248QE, as being released with G-Sync. I am not sure where you are getting "all first monitors".

The only monitor that will have G-Sync this year, according to their information, is the VG248QE, and only IF you also install the module yourself. 2014 holds a wide range of possibilities; here is the exact quote:

Beginning later this year, NVIDIA G-SYNC will be available as monitor module you can install yourself, or buy pre-installed in one of the best monitors currently available. Next year, G-SYNC monitors will be available on the shelves of your favorite e-tailers and retailers, in a variety of screen sizes and resolutions, eventually scaling all the way up to 3840x2160 (“4K”).

Gaming TN panels come in only one resolution these days, 1080p, and are almost certainly never going to make it to the 4K market due to all of TN's drawbacks.
 
Yeah, and many people opt for the higher resolution of 2560x1440, which only runs at 60 Hz. Yes, you can overclock some monitors, but most people probably won't.
 
I never said there is anything wrong with it, I just pointed out that they used obsolete 60 Hz…

60Hz is not obsolete. It's the standard. The vast majority of displays in use, whether computer or television or otherwise, refresh at 60Hz. This is why 60Hz was used in the demonstration.

… when they could just as well have used 144 Hz to show the real difference to a potential buyer of the G-Sync module for the Asus VG248QE.

The purpose of the event wasn't to sell you an Asus display. That's Asus' job. Nvidia was demonstrating their technology, not staging an infomercial for a specific monitor. Besides, Nvidia had to show that G-Sync works at lower refresh rates. Demonstrating at 144Hz would have been a colossally silly idea.
 
60Hz is not obsolete. It's the standard. The vast majority of displays in use, whether computer or television or otherwise, refresh at 60Hz. This is why 60Hz was used in the demonstration.
It was used only to show a bigger difference than using 144 Hz would have.

I just wanted all of you to know it is 60 Hz, versus whatever you were thinking (or not thinking about at all :rolleyes: )
And now EOT; there's no point arguing over what NV had in mind...
 
Any likelihood other monitors will accept a DIY G-Sync module? What makes the Asus model special?

Also, is it any coincidence that the compatible monitor is also LightBoost/3D Vision 2 compatible?
 
Any likelihood other monitors will accept a DIY G-Sync module? What makes the Asus model special?

Also, is it any coincidence that the compatible monitor is also LightBoost/3D Vision 2 compatible?
 
The monitor needs a suitable connector to take the module, as it has to be integrated internally.

It was probably a tough decision to include the latest features, something along the lines of:
"Shall we use the latest versions on the new cutting edge display?"
"Yep."
 
I'm all for G-Sync, as long as it also comes with PWM-free backlighting.
G-Sync is unrelated to strobing. BenQ will come out with PWM-free G-Sync monitors (not sure regarding the ASUS VG248QE's G-SYNC upgrade).
There's a strobe feature, but it's optional.

Then you're going to have to explain why my 19" Viewsonic CRT running at 120 Hz doesn't murder my eyes, but all three LightBoost monitors I've played with in person have had obvious strobe...
-- The strobe on LightBoost is a square wave, while the strobe on a CRT is a softer decline (phosphor decay); see the sketch at the end of this post.
-- Also, the LED spectrum is quite harsh, leading to some noticeable issues for some people.
-- 24" LCD monitors take up more of your field of vision.
-- On some models, LightBoost is often brighter than a CRT.
-- The whole screen flashes at once with LightBoost, rather than each dot being flashed sequentially, one scanline at a time. Some humans react differently.

That said, to most human eyes, CRT and LightBoost flicker looks the same, but some people have a sensitivity to a specific type of flicker, for one reason or another. Usually, it flickers like a 120Hz CRT to most eyes (including mine). Not all humans react the same to the strobing. LightBoost can increase eyestrain for some (if more strained by flicker), while decreasing eyestrain for others (if more strained by motion blur). Every human is different. Fortunately, strobing can be turned ON/OFF.
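
For illustration, here is a minimal sketch of the waveform difference described in the list above; the 120 Hz refresh matches the discussion, but the pulse width and phosphor decay constant are purely illustrative assumptions, not measured values:

```python
# Square-wave backlight strobe (LightBoost-style) vs. gradual CRT phosphor decay.
import math

REFRESH_HZ = 120
PERIOD_MS = 1000 / REFRESH_HZ        # ~8.33 ms per refresh
STROBE_PULSE_MS = 1.5                # assumed backlight flash length
PHOSPHOR_DECAY_MS = 1.0              # assumed phosphor decay time constant

def strobe_brightness(t_ms):
    """Square wave: full brightness during the pulse, zero the rest of the time."""
    return 1.0 if (t_ms % PERIOD_MS) < STROBE_PULSE_MS else 0.0

def crt_brightness(t_ms):
    """Exponential decay after each refresh excites the phosphor."""
    return math.exp(-(t_ms % PERIOD_MS) / PHOSPHOR_DECAY_MS)

for step in range(17):               # sample one full refresh period in 0.5 ms steps
    t = step * 0.5
    print(f"t={t:4.1f} ms  strobe={strobe_brightness(t):.2f}  crt={crt_brightness(t):.2f}")
```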
 
There will not be a G-Sync monitor with a panel slower than 6.9 ms for quite some time, probably none in 2014.
Also, 60 Hz monitor owners ARE NOT THE INTENDED TARGET GROUP for G-Sync.
It is targeted at 120/144 Hz freaks, not your ordinary "I don't see the point of 120 Hz" folks.
Actually, G-Sync improves all variable-framerate situations, regardless of Hz. Marketing it is another question altogether, but the benefits are indisputable, even in the 30-60 fps @ 60 Hz range.

See How Does G-SYNC Fix Stutters? -- it applies to lower framerates and refresh rates as well.

Certainly, this is more marketable to people who use 120 Hz anyway, but the technical benefits are available at all framerates, even all the way up to 240 fps @ 240 Hz and beyond. As panels become capable of refreshing faster, there's no reason why the G-Sync framerate can't go up. The sketch below shows the frame-pacing difference at sub-60 fps framerates.
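
A rough sketch of my own (based only on the public description of variable refresh, with made-up frame times) comparing frame-to-frame intervals under fixed 60 Hz vsync versus a display that refreshes whenever a frame is ready:

```python
# Fixed 60 Hz + vsync holds a finished frame until the next vblank, so intervals
# jump between ~16.7 ms and ~33.3 ms (stutter). A variable-refresh display shows
# each frame as soon as it is rendered, so intervals simply track render times.
import random

random.seed(1)
REFRESH_MS = 1000 / 60                                   # fixed 60 Hz period
render_ms = [random.uniform(18, 28) for _ in range(8)]   # ~36-55 fps frame times

finish, vsync_flips, vrr_flips = 0.0, [], []
for r in render_ms:
    finish += r
    vsync_flips.append(REFRESH_MS * (int(finish / REFRESH_MS) + 1))  # next vblank
    vrr_flips.append(finish)                                         # shown immediately

def intervals(flips):
    return [f"{b - a:.1f}" for a, b in zip(flips, flips[1:])]

print("vsync intervals (ms):           ", intervals(vsync_flips))
print("variable-refresh intervals (ms):", intervals(vrr_flips))
```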
 
G-Sync can be implemented on 60 Hz monitors like VA and IPS panels and it would work, just with a maximum framerate of 60 fps and roughly 9.7 ms more input lag at the bottom of the screen (quick calculation below).
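
A quick check of where that ~9.7 ms figure comes from (my reading: it is the extra scanout time to reach the bottom of the screen on a 60 Hz panel compared with a 144 Hz panel, assuming the panel scans out over the full refresh period):

```python
scanout_60hz_ms = 1000 / 60    # ~16.67 ms to scan out a whole frame at 60 Hz
scanout_144hz_ms = 1000 / 144  # ~6.94 ms at 144 Hz

extra_lag_bottom_ms = scanout_60hz_ms - scanout_144hz_ms
print(f"extra lag at the bottom of the screen: ~{extra_lag_bottom_ms:.1f} ms")  # ~9.7 ms
```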
 
NVIDIA only announced a single 144 Hz monitor by name, the Asus VG248QE, as being released with G-Sync. I am not sure where you are getting "all first monitors".

The only monitor that will have G-Sync this year, according to their information, is the VG248QE, and only IF you also install the module yourself. 2014 holds a wide range of possibilities; here is the exact quote:

Beginning later this year, NVIDIA G-SYNC will be available as monitor module you can install yourself, or buy pre-installed in one of the best monitors currently available. Next year, G-SYNC monitors will be available on the shelves of your favorite e-tailers and retailers, in a variety of screen sizes and resolutions, eventually scaling all the way up to 3840x2160 (“4K”).

Gaming TN panels come in only one resolution these days, 1080p, and are almost certainly never going to make it to the 4K market due to all of TN's drawbacks.

The DIY kit is interesting... I wonder how compatible it will be. Are they presenting this option because the way it installs in the monitor is fairly standard across panels? The VG248QE is a TN panel, so I wonder if the DIY kit will be equally compatible with VA or IPS panels, or if the innards are too different.
 
Does the GTX 770 come with the G-SYNC feature, or will I also have to buy a new video card?
 
I'm really curious about the possibility of DIY functionality for those of us who will be playing on TVs.
Something worth mentioning (beyond jitter and tearing): not having to use vsync is an instant performance boost, too. In some games it can be pretty significant.
 
Was anything said about how G-Sync could, would, or will not work with LightBoost? I consider LightBoost to be one of the most important revolutions in the entire history of the LCD; it is what made it possible to use LCDs at all. If G-Sync cannot turn the LEDs off in between frames to reduce smearing, then it won't help many gamers.
 
Well, that is good to hear; it sounds like 2D LightBoost will soon be an official feature and not a hack.
 

That was an interesting article, and part of it makes me wonder if Nvidia might be in trouble:

More intriguing is another possibility Nalasco mentioned: a "smarter" version of vsync that presumably controls frame flips with an eye toward ensuring a user perception of fluid motion.

This was mentioned to TechReport by AMD's David Nalasco in an article on 2/12/12! :eek:

IF AMD has any sort of prior art/prototypes that predate Nvidia's work, this could get messy, a la Intel vs AMD... I am not trying to start a fanboy war, just stating the facts. It would be in EVERYONE's interest if both companies worked together to create an open standard for this.

Of course, that might backfire on them, since the reason many people upgrade is low FPS and the resulting stuttering, etc. If they are going to remove that stuttering, then all of a sudden Joe Blow might keep his current card for another 6 months to a year or more. Thoughts?
 
NVIDIA's very heavily invested in the PC at the moment, so whatever bolsters the PC is what benefits them. A G-Sync-like technology becoming a DisplayPort standard would probably not impact them too heavily, but it is in their best interest — assuming they can't just be a successful display ASIC company — to keep the technology exclusive to PCs.
 
IF AMD has any sort of prior art/prototypes that predate Nvidia's work, this could get messy, a la Intel vs AMD... I am not trying to start a fanboy war, just stating the facts. It would be in EVERYONE's interest if both companies worked together to create an open standard for this.

I doubt NVIDIA tried to grab a patent on this. It seems like something that AMD and Intel could easily do themselves, if they're willing to put the work into it. Then you just coordinate with the display manufacturers to get a generic chip on the board or something.

Seems like there's a decent amount of software behind this tech, in the drivers and such. Not sure if AMD / Intel would be interested in putting in the effort though.

Of course, that might backfire on them, since the reason many people upgrade is low FPS and the resulting stuttering, etc. If they are going to remove that stuttering, then all of a sudden Joe Blow might keep his current card for another 6 months to a year or more. Thoughts?

That is a concern, but looking at the next-gen engines coming out... I think the GPU makers are going to be selling a lot of cards in the future. Especially w/ higher-res panels becoming more desirable.
 
I think NVIDIA's logic in introducing this only on 144 Hz displays at this time is to motivate people to spend more on GPUs. Once a person's gotten a taste of high-refresh, it's extraordinarily difficult to go back, and you need a lot of muscle to do high-refresh in tandem with high quality.
 
I think NVIDIA's logic in introducing this only on 144 Hz displays at this time is to motivate people to spend more on GPUs. Once a person's gotten a taste of high-refresh, it's extraordinarily difficult to go back, and you need a lot of muscle to do high-refresh in tandem with high quality.
It's more about getting this out there for the people who are the pickiest about judder and motion blur. They will also be offering a new mode for LightBoost, which will of course require a 120/144 Hz screen. In other words, the 144 Hz Asus will be the best way to experience all that G-Sync has to offer.
 
IF AMD has any sort of prior art/prototypes that predate Nvidia's work, this could get messy, a la Intel vs AMD... I am not trying to start a fanboy war, just stating the facts. It would be in EVERYONE's interest if both companies worked together to create an open standard for this.

You can't be stating facts if you have no facts to state. Insinuating that AMD can launch legal action because Nvidia shipped a product and AMD hasn't is beyond ridiculous.
 
Considering that this is only targeting a 1080p display, with no 1080p+, SLI, or Surround support, G-SYNC is a bunch of pissing in the wind.
 
Removed. jwcalla posted the information.
 
In regard to post #197... well, you don't see that often.

I'm just hoping they can integrate GSYNC into a projector somehow. I don't really want to go back to bezels.
 
I think NVIDIA's logic in introducing this only on 144 Hz displays at this time is to motivate people to spend more on GPUs. Once a person's gotten a taste of high-refresh, it's extraordinarily difficult to go back, and you need a lot of muscle to do high-refresh in tandem with high quality.

I doubt it. The number of people who have 144 Hz displays is very small, the number of those who will upgrade to a G-Sync monitor is smaller still, and the number who will upgrade to G-Sync AND upgrade their GPUs is, quite literally, inconsequential.
 
This is why NV rules the roost. They know gaming is about more than raw framerates. I'm in for one of these monitors.
 
I doubt it. The number of people who have 144 Hz displays is very small, the number of those who will upgrade to a G-Sync monitor is smaller still, and the number who will upgrade to G-Sync AND upgrade their GPUs is, quite literally, inconsequential.

I have a 144 Hz monitor... I've also had Dell IPS Surround and I loved it, but I'd rather have my single ASUS TN monitor. Once you've seen the difference in input lag and ultra-smoothness... I wouldn't give it up. The only drawback is that some games can't be played at high refresh rates; it messes up the physics (Skyrim).

I think as gamers upgrade their hardware, if this tech is good tech, it will be very consequential.

*Oops, double post, my bad.*
 