Nvidia Explains Why Their G-Sync Display Tech Is Superior To AMD's FreeSync

Why does freesync cause ghosting?

I guess if the GPU or drivers are somehow slowing the pixel response times by holding them up, it could cause ghosting similar to a monitor with slow pixel response times to begin with. But if the GPU or drivers are the cause, then it can be fixed, since G-Sync does not have ghosting.
 
How can ghosting be caused by drivers? What influence do the drivers have on how long the previous picture remains visible on the panel after it was told to show another one?
 
Why does freesync cause ghosting?

I guess if the GPU or drivers are somehow slowing the pixel response times by holding them up, it could cause ghosting similar to a monitor with slow pixel response times to begin with. But if the GPU or drivers are the cause, then it can be fixed, since G-Sync does not have ghosting.

According to the article nVidia was quoting this from, FreeSync had ghosting when the frame rate went above or below the refresh rate range of the monitors being tested, which happened to be IPS panels with a max refresh of 75Hz. The windmill demo was running at 240FPS.

The cheaper TN panels are out with a 144Hz max, so unless you're playing Quake 3 at ultra-low details, it's a minimal issue.

G-sync also forced vsync with absolutely no ability to disable it.
 
Because it's just a scaler with no additional hardware for improving image quality.
The scaler is the only hardware needed. The entire reason the G-Sync module exists is that the original panels didn't have an adaptive scaler. It doesn't do anything else to the image itself.
 
Honestly, I hadn't even heard of ghosting until the PCPer review, lol. Even looking at side-by-side comparisons on YouTube, I still can't really tell whether a panel is ghosting or not.

I have a VG248QE 144Hz panel if that matters.
 
Honestly, I hadn't even heard of ghosting until the PCPer review, lol. Even looking at side-by-side comparisons on YouTube, I still can't really tell whether a panel is ghosting or not.

I have a VG248QE 144Hz panel if that matters.

It also depends on the individual. Some gamers may notice slight ghosting while others may not on the same display.
 
G-sync also forced vsync with absolutely no ability to disable it.

Except for the option in NV Control Panel to disable G-Sync. :rolleyes:

"However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver."
 
Except for the option in NV Control Panel to disable G-Sync. :rolleyes:

That post wasn't referring to the ability to toggle G-Sync. Of course you'd be able to turn that on and off. That post was referring to the fact that with G-Sync enabled, V-Sync is enabled, and you cannot turn the V-Sync off.

With FreeSync, you can.
 
That post wasn't referring to the ability to toggle G-Sync. Of course you'd be able to turn that on and off. That post was referring to the fact that with G-Sync enabled, V-Sync is enabled, and you cannot turn the V-Sync off.

With FreeSync, you can.
It seems like you're misunderstanding what the non-issue is. G-sync doesn't "enable" V-sync. When the game is constantly at the upper limit of the refresh rate range, the monitor is already displaying frames as fast as it can. In this case it acts like V-sync, because this is the ideal situation for traditional V-sync: the refresh rate of the panel does not need to change to keep the display in sync with the game.

If you're running a game with settings such that you're constantly meeting or exceeding the maximum refresh rate of the panel, then there is no reason to be running any kind of frame syncing at all. So I really don't understand why this is an issue. FreeSync is turning itself off above the maximum refresh rate, not preventing V-sync from turning on. G-sync is not turning V-sync on at the maximum refresh rate; it is behaving as a frame syncing method is supposed to.
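The clamping behavior described above can be sketched in a few lines. This is purely an illustrative model; the function name and the 40-144Hz window are assumptions for the example, not any vendor's actual firmware behavior:

```python
def effective_refresh_hz(frame_rate_hz, panel_min_hz=40, panel_max_hz=144):
    """Illustrative model of a variable-refresh panel's behavior."""
    if frame_rate_hz >= panel_max_hz:
        # GPU outruns the panel: frames are presented at max refresh,
        # which is exactly the situation traditional V-sync is built for.
        return panel_max_hz
    if frame_rate_hz <= panel_min_hz:
        # Below the VRR window the panel cannot slow down any further.
        return panel_min_hz
    # Inside the window the refresh rate tracks the frame rate 1:1.
    return frame_rate_hz

print(effective_refresh_hz(240))  # 144 -- the windmill-demo case
print(effective_refresh_hz(90))   # 90
print(effective_refresh_hz(25))   # 40
```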
 
It seems like you're misunderstanding what the non-issue is. G-sync doesn't "enable" V-sync. When the game is constantly at the upper limit of the refresh rate range, the monitor is already displaying frames as fast as it can. In this case it acts like V-sync, because this is the ideal situation for traditional V-sync: the refresh rate of the panel does not need to change to keep the display in sync with the game.

If you're running a game with settings such that you're constantly meeting or exceeding the maximum refresh rate of the panel, then there is no reason to be running any kind of frame syncing at all. So I really don't understand why this is an issue. FreeSync is turning itself off above the maximum refresh rate, not preventing V-sync from turning on. G-sync is not turning V-sync on at the maximum refresh rate; it is behaving as a frame syncing method is supposed to.

Because more FPS often means better input response and better physics in many games, due to the way those calculations are implemented. It might not be an issue with a 120 or 144Hz display, but with a 60Hz display it is.
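For context on why frame rate can affect physics at all: many engines step their simulation once per rendered frame, coupling simulation quality to FPS. The usual decoupling fix is a fixed-timestep accumulator; here is a minimal sketch of it (all names are illustrative, not from any particular engine):

```python
PHYSICS_DT = 1.0 / 120.0  # physics always steps at 120 Hz

def advance(accumulator, frame_time):
    """Consume one rendered frame's worth of time in fixed physics steps.

    Returns (steps_taken, leftover_accumulator). The leftover is carried
    into the next frame so simulation speed stays independent of FPS.
    """
    accumulator += frame_time
    steps = 0
    while accumulator >= PHYSICS_DT:
        accumulator -= PHYSICS_DT
        steps += 1  # one deterministic physics update per step
    return steps, accumulator

# A 60 fps frame triggers two 120 Hz physics steps:
print(advance(0.0, 1.0 / 60.0))
```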
 
Because more FPS often means better input response and better physics in many games, due to the way those calculations are implemented. It might not be an issue with a 120 or 144Hz display, but with a 60Hz display it is.

I don't even really think that 60Hz is the issue. I think it's that some monitor manufacturers chose cheaper scalers. Remember, FreeSync itself works from 9Hz to 240Hz, which is superior to Nvidia's tech. If I remember right, some of the scalers in the first batch of FreeSync monitors only worked down to 40Hz. I think that's because monitor manufacturers chose cheaper scalers to implement FreeSync as cheaply as possible.

In short, it's going to be up to consumers to let the monitor manufacturers know that they need to implement better scaler technology. I guess some are feeling out the market to see whether consumers will gravitate to the monitors with wider FreeSync frequency support or not care and just buy what's available.
 
Except for the option in NV Control Panel to disable G-Sync. :rolleyes:

"However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver."

From the article that nVidia is using to justify why G-Sync is better:

"With G-Sync today you do not get the option to enable or disable VSync when outside the variable refresh range. At the high end that means that NVIDIA today forces users into a VSync on state, something that many competitive gamers don’t approve of. AMD’s implementation allows those gamers to disable VSync, giving them the lowest possible input latency, highest possible frame rates though at the cost of re-introducing screen tearing. That just happens to be a choice that many gamers would make.

As it turns out, NVIDIA is starting to recognize this as a concern for some users and will hopefully consider enabling the option to set VSync off when gaming above the maximum panel refresh rate in a future driver."
 
Because more FPS often means better input response and better physics in many games, due to the way those calculations are implemented. It might not be an issue with a 120 or 144Hz display, but with a 60Hz display it is.
If input lag is a concern then you wouldn't be using any type of frame syncing to begin with. Adaptive-sync methods still introduce lag because the display is now waiting to be told when to present the next frame.
 
Couldn't games just set an FPS cap so that they don't exceed the display's maximum refresh rate? Note: this is not the same as V-Sync, as we are not waiting for a v-blank; we're just not rendering new frames as frequently.
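A frame cap of the kind suggested here is easy to sketch. This is a hypothetical, simplified limiter (sleep-based, so it is less precise than the busy-wait limiters real engines tend to use; the function name is mine):

```python
import time

def frame_cap_sleep(frame_start, cap_fps=144):
    """Sleep out whatever is left of this frame's time budget.

    Unlike V-Sync, nothing here waits on the display's v-blank; the game
    simply refrains from producing frames faster than cap_fps.
    """
    budget = 1.0 / cap_fps
    elapsed = time.perf_counter() - frame_start
    if elapsed < budget:
        time.sleep(budget - elapsed)

# Run a few "frames" and confirm we don't exceed the cap overall.
start = time.perf_counter()
for _ in range(10):
    t0 = time.perf_counter()
    # ... render work would go here ...
    frame_cap_sleep(t0, cap_fps=100)
total = time.perf_counter() - start
print(f"10 frames took {total:.3f}s (at least 0.1s at a 100 fps cap)")
```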
 
Couldn't games just set an FPS cap so that they don't exceed the display's maximum refresh rate? Note: this is not the same as V-Sync, as we are not waiting for a v-blank; we're just not rendering new frames as frequently.

Yes, but that is not what G-Sync does to make it butter smooth. It keeps the refresh rate of the monitor in sync with the graphics, so when the game bogs down because of a lot of action, say dropping from 120 to 60, it keeps them in sync and there's no tearing or lag.
 
Yeah, but if it has issues when you go above the monitor's max refresh (which current FreeSync implementations apparently have), then it'd make sense for games to set an FPS cap/limit, and that should be the end of it. I don't see the ghosting issue being a real issue unless you absolutely must have 200+ fps on your 60Hz display.
 
I saw BenQ had a TN panel (I know the stigma, but go with it) with a 144Hz max; I think the minimum was still 40Hz. Most users were loving it. It seemed rather expensive, which most attributed to Newegg jacking up the price for new tech: listed at $799, selling for $699. Those who had the driver said it worked wonderfully with one card, but with the lack of a CrossFire driver, a single card was the way to go for now.
 
To be honest, I have either never had stuttering or never been bothered by it; my issue without G-Sync is tearing. It's solved by V-sync, but V-sync is only okay if the GPU can maintain the FPS.

Having looked into it a bit more, FreeSync is useless below 30fps, and all G-Sync monitors seem to be limited to DisplayPort-only inputs, meaning I consider both solutions not good enough. I saw a £700 IPS-type G-Sync monitor with just one DisplayPort; that's almost as bonkers as spending £400 on a TN screen. I don't hate tearing that much.
 
I just came here to add: if you have never played on a G-Sync monitor, you have no idea what you are missing out on. I owned an XB280HK (returned for a refund for Oculus DK2 refresh rate reasons, plus the TN panel), and my GTX 970 SLI barely reached 40fps in some games, but at 40fps the games were butter smooth. You just need to try it. I have a Titan X on the way and will have an XB270HU soon, and I welcome 144Hz with G-Sync. I'm never going to play on a regular monitor again.
 
I just came here to add: if you have never played on a G-Sync monitor, you have no idea what you are missing out on. I owned an XB280HK (returned for a refund for Oculus DK2 refresh rate reasons, plus the TN panel), and my GTX 970 SLI barely reached 40fps in some games, but at 40fps the games were butter smooth. You just need to try it. I have a Titan X on the way and will have an XB270HU soon, and I welcome 144Hz with G-Sync. I'm never going to play on a regular monitor again.

Have you ever played on a 144Hz monitor before G-Sync? ...already buttery smooth. I'm sure it helps, but try the monitor with G-Sync turned off if you can. I think you might be thanking the wrong product.
 
Wow, what a long thread on G-Sync vs FreeSync.

Actually, I want to say that both G-Sync and FreeSync are niche products, because I believe many gamers do not care that much about monitors. I have been living without such tech for ages... still gaming today. I don't see how G-Sync or FreeSync will significantly improve my gaming experience (other than lightening my wallet).

To me, I don't care about all this ghosting, flickering, etc. As long as the image looks good and it does not lag, that's all that matters. When you game, do you even have time to notice ghosting in the image? E.g. playing BF4. I don't bother.
 
Couldn't games just set an FPS cap so that they don't exceed the display's maximum refresh rate? Note: this is not the same as V-Sync, as we are not waiting for a v-blank; we're just not rendering new frames as frequently.

Games can do this, and some have done so in the past; Unreal Tournament, for example, had console commands and .ini file settings for it. I am not sure this is the be-all and end-all of solutions, though. For one thing, not everyone liked the FPS caps, as they were usually artificially low (60FPS, for example), and they didn't help when the frame rate dropped too low.

Yes, but that is not what G-Sync does to make it butter smooth. It keeps the refresh rate of the monitor in sync with the graphics, so when the game bogs down because of a lot of action, say dropping from 120 to 60, it keeps them in sync and there's no tearing or lag.

Pretty much this. V-Sync solves the problem of screen tearing just fine, but it creates very perceptible input lag. Naturally, V-Sync impacts performance negatively when the frame rate drops below 60FPS. Let's face it, this is a potentially common occurrence if you run displays above 1920x1080 with the more demanding games out now, and with multiple monitors the problem is compounded. I can't always hit 60FPS+ as a maximum at 7680x1440 or 4320x2560, and you can forget hitting 144FPS+. V-Sync isn't a realistic option if you're playing Crysis 3 or BF4 with all the eye candy at those resolutions; at that point, playable is all you're looking for. You won't get that with V-Sync effectively halving your frame rate when you drop below 60FPS. G-Sync doesn't do that. It still doesn't change how 30FPS looks or feels, but it won't drop your frame rate even further when your system starts struggling to deliver frames.

I'm also sensitive to input lag. On my previous displays (3x 30" Dell 3007WFP-HC's), I had no real shot at hitting 100FPS+ in the latest games unless poor monitor support forced me to use a single display. At that point screen tearing was a huge issue, and V-Sync introduced horrible lag.

IPS displays are known for ghosting, and that never bothered me nearly as much as screen tearing does. Without G-Sync, I frequently ran into screen tearing whenever some game forced me to run a single display (or when I'd play older games).

Yeah, but if it has issues when you go above the monitor's max refresh (which current FreeSync implementations apparently have), then it'd make sense for games to set an FPS cap/limit, and that should be the end of it. I don't see the ghosting issue being a real issue unless you absolutely must have 200+ fps on your 60Hz display.

Some people are very sensitive to ghosting. I'm not. I played games on an IPS display for years without complaint.

Have you ever played on a 144Hz monitor before G-Sync? ...already buttery smooth. I'm sure it helps, but try the monitor with G-Sync turned off if you can. I think you might be thanking the wrong product.

True.

Wow, what a long thread on G-Sync vs FreeSync.

Actually, I want to say that both G-Sync and FreeSync are niche products, because I believe many gamers do not care that much about monitors. I have been living without such tech for ages... still gaming today. I don't see how G-Sync or FreeSync will significantly improve my gaming experience (other than lightening my wallet).

To me, I don't care about all this ghosting, flickering, etc. As long as the image looks good and it does not lag, that's all that matters. When you game, do you even have time to notice ghosting in the image? E.g. playing BF4. I don't bother.

I lived with 60Hz IPS displays for years. Ghosting didn't bother me too much, but flickering is annoying anytime it appears. Flickering and ghosting, however, by definition go against the image "looking good," as you put it. I'd agree that ghosting isn't always noticeable unless you're just sensitive to it, but flickering is distracting, annoying, and totally undesirable.

I wouldn't bother with FreeSync until it supports Crossfire and the ghosting issues are fixed, but that's just me.

What NVIDIA should do is open up G-Sync to be GPU / Card independent. Being an NVIDIA only option sucks. They'd sell more G-Sync modules if they worked for everyone.
 
I just came here to add: if you have never played on a G-Sync monitor, you have no idea what you are missing out on. I owned an XB280HK (returned for a refund for Oculus DK2 refresh rate reasons, plus the TN panel), and my GTX 970 SLI barely reached 40fps in some games, but at 40fps the games were butter smooth. You just need to try it. I have a Titan X on the way and will have an XB270HU soon, and I welcome 144Hz with G-Sync. I'm never going to play on a regular monitor again.
People really need to stop with this nonsense: G-Sync does not magically make a low frame rate smoother. It might seem smoother because it removes tearing, but 40fps is 40fps with or without G-Sync.
A low frame rate still looks like crap when you are used to playing at a high frame rate.

And yes, I had a G-Sync monitor but returned it because I can't stand matte displays.

I'll get the free One.
What, when you buy your new monitor?
 
People really need to stop with this nonsense: G-Sync does not magically make a low frame rate smoother. It might seem smoother because it removes tearing, but 40fps is 40fps with or without G-Sync.
A low frame rate still looks like crap when you are used to playing at a high frame rate.

And yes, I had a G-Sync monitor but returned it because I can't stand matte displays.

I think the thing people notice is that V-Sync will cut your frame rate to 30FPS when you drop below 60FPS, and G-Sync doesn't. Therefore G-Sync always feels smoother, and taking input lag out of the equation also helps deliver a better experience. 30FPS sucks, but it sucks worse with V-Sync.
 
I think the thing people notice is that V-Sync will cut your frame rate to 30FPS when you drop below 60FPS, and G-Sync doesn't. Therefore G-Sync always feels smoother, and taking input lag out of the equation also helps deliver a better experience. 30FPS sucks, but it sucks worse with V-Sync.

Standard V-sync doesn't work that way.
I don't know the exact name for it, but Nvidia has an option called something like "smooth V-sync", and Far Cry 4 has this option too; it does what you described, and it's stupid. I don't get why it's even an option; it's not smooth at all, dropping 30fps instantly when your frame rate drops even one frame under the refresh rate.
 
Standard V-sync doesn't work that way.
I don't know the exact name for it, but Nvidia has an option called something like "smooth V-sync", and Far Cry 4 has this option too; it does what you described, and it's stupid. I don't get why it's even an option; it sucks and is not smooth at all, dropping 30fps instantly.

OK, it is situational, but that's not entirely true.

"Essentially this means that with double-buffered VSync, the framerate can only be equal to a discrete set of values equal to Refresh / N where N is some positive integer. That means if you're talking about 60Hz refresh rate, the only framerates you can get are 60, 30, 20, 15, 12, 10, etc etc. You can see the big gap between 60 and 30 there. Any framerate between 60 and 30 your video card would normally put out would get dropped to 30.

Now maybe you can see why people loathe it. Let's go back to the original example. You're playing your favorite game at 75Hz refresh and 100FPS. You turn VSync on, and the game limits you to 75FPS. No problem, right? Fixed the tearing issue, it looks better. You get to an area that's particularly graphically intensive, an area that would drop your FPS down to about 60 without VSync. Now your card cannot do the 75FPS it was doing before, and since VSync is on, it has to do the next highest one on the list, which is 37.5FPS. So now your game which was running at 75FPS just halved its framerate to 37.5 instantly. Whether or not you find 37.5FPS smooth doesn't change the fact that the framerate just cut in half suddenly, which you would notice. This is what people hate about it."
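The Refresh / N arithmetic in the quote can be checked directly with a tiny sketch (the function name is mine, not from the quoted article):

```python
import math

def vsync_framerate(refresh_hz, render_fps):
    """Effective framerate under double-buffered V-sync.

    A frame must be held for a whole number of refresh intervals, so the
    only achievable rates are refresh_hz / N for integer N >= 1.
    """
    if render_fps >= refresh_hz:
        return refresh_hz
    # Frame time rounds up to the next whole number of refresh intervals.
    intervals = math.ceil(refresh_hz / render_fps)
    return refresh_hz / intervals

print(vsync_framerate(60, 100))  # 60   (capped at refresh)
print(vsync_framerate(60, 45))   # 30.0 (the big 60 -> 30 gap)
print(vsync_framerate(75, 60))   # 37.5 (the quote's example)
```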


It depends on whether you're using triple buffering or not. Regardless, if you're sensitive to input lag, V-Sync sucks. G-Sync doesn't create the input lag V-Sync does.
 
What you described does not happen when you use triple buffering either.

Edit: after reading your link, I understand that smooth V-sync option, but again, you have things confused.
 
What you described does not happen when you use triple buffering either.

Edit: after reading your link, I understand that smooth V-sync option, but again, you have things confused.

I understand that the frame rate drops do not occur with triple buffering; I said that in my post above. The input lag, I believe, is present regardless, so long as V-Sync is enabled. It always was in any game where I tried it. G-Sync doesn't create additional input lag.

I don't think I'm confused. :p
 
Have you ever played on a 144Hz monitor before G-Sync? ...already buttery smooth. I'm sure it helps, but try the monitor with G-Sync turned off if you can. I think you might be thanking the wrong product.
I have played on a 144Hz monitor (VG278HE) and a G-sync monitor (PG278Q). There is no comparison between the two. Of course V-sync on a 144Hz monitor will be smooth if you can maintain that framerate. But because that is actually pretty unrealistic for most people in modern games, not only do you get tearing, but you also get judder (an issue present whenever content runs at a framerate below the display's refresh rate, regardless of whether V-sync is used). G-sync at 60-100 FPS has none of these issues compared to a 144Hz display running at the same framerate with V-sync disabled.

With G-sync, I don't have to look at framerates anymore to dial in game settings, because it makes the experience that much better. Again, you (read: anyone) really can't speak intelligently about the subject without having experienced the tech yourself.
 
From the article that nVidia is using to justify why G-Sync is better:

I misread your other post.

Other forum posts are full of people stating that G-Sync is forced on with no way to turn it off. I still had that in the back of my head.

My apologies bud!
 
-snip-

With G-sync, I don't have to look at framerates anymore to dial in game settings, because it makes the experience that much better. Again, you (read: anyone) really can't speak intelligently about the subject without having experienced the tech yourself.

So you're telling me....this "titan" is actually worth its weight? ;):D
 
One advantage of these FreeSync displays is that there will finally be a semi-decently priced 144Hz 1440p panel.

Really, if you think about it, as of now there does not exist a single 144Hz 1440p panel that does not come bundled with either G-Sync or FreeSync, unless you wanted an "overclockable" IPS panel (nope). At least with FreeSync I can get a guaranteed stable 144Hz 1440p panel without paying over $700 for it.

I'm still hoping that sometime down the road somebody will have the sense to make a plain 144Hz 1440p panel without any bundled sync tech; then we'd be talking.
 
Wow, what a long thread on G-Sync vs FreeSync.

Actually, I want to say that both G-Sync and FreeSync are niche products, because I believe many gamers do not care that much about monitors. I have been living without such tech for ages... still gaming today. I don't see how G-Sync or FreeSync will significantly improve my gaming experience (other than lightening my wallet).

To me, I don't care about all this ghosting, flickering, etc. As long as the image looks good and it does not lag, that's all that matters. When you game, do you even have time to notice ghosting in the image? E.g. playing BF4. I don't bother.

It's a long thread because most of the people commenting don't have a definitive answer as to the root cause of the ghosting reported on one of the FreeSync monitors.

We know that within the VRR window, FreeSync performs virtually identically to G-Sync in terms of keeping frames looking smooth and removing tearing.

But PCPer threw a dark cloud over the entire FreeSync celebration train with assertions of ghosting (implying it was directly related to how FreeSync handles the display compared to how the G-Sync module does).

I have yet to hear anyone in the know who is not PCPer opine on the actual cause of the ghosting observed. I don't know if the ghosting appears throughout the entire refresh range or just beneath the VRR window. So many unknowns. What PCPer and their Nvidia contacts have succeeded in doing is throwing a cloud over the potential performance of FreeSync vs G-Sync in certain circumstances. I don't know if it's really the panels, as some suggest, or some special "magic" that only the G-Sync module can work, or something that could be handled with driver updates for specific displays.

People seem to be asserting the case one way or the other, but how do they actually KNOW? And how do WE, as mostly lay people, actually know what the causes are?

Someone with the right displays and deeper knowledge of the mechanics of all this needs to investigate, and NOT PCPer; they are too poisoned with pro-Nvidia bias for me and many others to take at face value. I want to know whether there are differences that cannot be easily overcome, or whether it really is just a panel-specific issue. If it's the latter, that still has consequences: it will lead many gamers to wait for the goldilocks panels (or other relevant display hardware) that specifically address the reported issues instead of jumping on the early displays.

The most interesting FreeSync-capable monitor I've heard about isn't even released, and it's not 4K: it's that Asus 1440p IPS display reported earlier. We know virtually nothing about its performance with VRR; we are so in the dark it's not even funny. This was supposed to be a time of clarity and celebration, and now there is only a cloud; until the fumes of FUD and faulty information clear, there can be no end to the speculation. I have to congratulate Nvidia's contacts and PR; their vassals at PCPer have done their job well.

 
Let's bring some more Sith lightning into this thread, then:

The Polish portal PCLab has published a FreeSync review, and they tested ghosting with synchronisation on and off (Swift vs 2730Z at 45 fps):

http://pclab.pl/art62755-4.html

synchronisation on:

http://www.youtube.com/watch?feature=player_embedded&v=84G9MD4ra8M

synchronisation off:

http://www.youtube.com/watch?feature=player_embedded&v=VJ-Pc0iQgfk


test of both methods at 27 fps

http://www.youtube.com/watch?feature=player_embedded&v=1jqimZLUk-c

at 35 fps

http://www.youtube.com/watch?feature=player_embedded&v=zzW6YIC0sks

at 45 fps

http://www.youtube.com/watch?feature=player_embedded&v=84G9MD4ra8M

at 75 fps

http://www.youtube.com/watch?feature=player_embedded&v=aTJ_6MFOEm4

and at 144

http://www.youtube.com/watch?feature=player_embedded&v=HZtUttA5Q_w

I've extracted the links from the article, since I guess you guys aren't too good with Polish ;)
 