cageymaru

TechSpot has performed testing on FreeSync monitors and NVIDIA GPUs to determine how well they work together. They were extremely skeptical of NVIDIA's claims that FreeSync monitors exhibit issues; they knew that cheap-brand monitors had quality-control problems, but otherwise everything worked fine on the AMD side. So their expectation was that everything should also work with NVIDIA GTX 10 series GPUs and newer.

After explaining the four tiers of adaptive sync support that NVIDIA has, TechSpot ended up testing seven different monitors that didn't have NVIDIA certification. All worked except for one monitor that required FreeSync over HDMI. NVIDIA currently doesn't support FreeSync over HDMI, as it has tied its G-SYNC technology to DisplayPort only. Owners of high-end FreeSync televisions are left out in the cold for now, as most TVs only have HDMI connections. Otherwise, everything worked perfectly fine and without issues, just as they predicted. NVIDIA even supports low framerate compensation (LFC) and HDR alongside adaptive sync on FreeSync monitors. They haven't tested the AMD-exclusive FreeSync 2 features on NVIDIA GPUs, but they expect basic FreeSync and HDR on those monitors to work just fine as well.

If you're wondering about input lag, we measured no appreciable difference in input lag between adaptive sync enabled and disabled on Nvidia GPUs. Enabling adaptive sync does not appear to increase GPU-side processing time, which is also the case for AMD GPUs. Bottom line, Nvidia supporting FreeSync is nothing but a good thing for the industry and consumers in general. When shopping for a new gaming monitor, you'll just have to make sure it's a solid, high-quality display first, and worry about variable refresh rate technology second.
 
Maybe the G-Sync "tax" can now die.

G-Sync wasn't more expensive exclusively due to the NVIDIA PCB. Anyone selling G-Sync monitors had to get NVIDIA's approval on panel selection. The manufacturer also had to pass some sort of quality certification.

That said, I now regret getting the Alienware AW3418DW G-Sync monitor. Samsung has indicated they're releasing a 49" 5120×1440, 120Hz, FreeSync 2 monitor.
 
...

If you're wondering about input lag, we measured no appreciable difference in input lag between adaptive sync enabled and disabled on Nvidia GPUs. Enabling adaptive sync does not appear to increase GPU-side processing time, which is also the case for AMD GPUs. ...

Correct me if I'm wrong, but wouldn't it be extremely weird if adaptive sync did negatively affect lag and/or GPU-side processing?
 
It won't; they are already trying to turn it into premium branding. Honestly, that may not be a bad spot for it.

Well, I'd agree. G-Sync monitors have offered the best variable refresh experience from day one, and along with the best GPUs, 'premium' fits pretty well.
 
Well, now I can use 61 to 75 Hz on my monitor without frame skipping.

I never understood why they sell FreeSync monitors that go up to a certain refresh rate, but make you use some form of Adaptive Sync to properly use it.
 
Makes me wonder where the premium is to be found after watching this... :D I own the Samsung 32" version of the monitor in the test and I could not be happier with it.


Honestly, if a monitor checks all the FreeSync 2 boxes, there really isn't going to be a heck of a lot of difference between it and a standard G-Sync display. The CHG70 is a damn good FreeSync 2 display. The biggest thing that makes G-Sync better is just the overall level of feature quality. You will never find a G-Sync panel that doesn't support the entire range of the monitor, from a single hertz all the way up to the max refresh rate of the panel. That means every display, no matter the G-Sync version or age, supports LFC. FreeSync's problem has always been the sheer number of displays with bad FreeSync implementations. Finding a good one generally required a ton of looking and research, especially as a lot of reviews do a piss-poor job of covering FreeSync support. However, if you found a FreeSync monitor that matched the feature set of G-Sync, then there should be no real difference between them.
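For anyone curious what LFC actually does, here's a minimal sketch in Python, assuming a hypothetical 48-144 Hz panel (the exact heuristics real drivers use aren't public): when the frame rate drops below the panel's minimum, each frame is shown an integer number of times so the effective refresh rate lands back inside the VRR window.

```python
import math

# Minimal LFC sketch. Assumption: a hypothetical 48-144 Hz VRR panel;
# real driver heuristics are not public. The idea: repeat each frame
# enough times that the effective scanout rate stays inside the range.

def lfc_multiplier(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0) -> int:
    """How many times each frame gets scanned out (1 = no LFC needed)."""
    if fps >= vrr_min:
        return 1
    n = math.ceil(vrr_min / fps)  # smallest repeat count that reaches vrr_min
    assert fps * n <= vrr_max, "VRR range too narrow for LFC (needs max >= ~2x min)"
    return n

for fps in (100, 40, 30, 20):
    n = lfc_multiplier(fps)
    print(f"{fps} fps -> each frame shown {n}x (panel refreshes at {fps * n} Hz)")
```

A narrow-range panel (say 48-75 Hz) can't do this: doubling a 40 fps frame would need 80 Hz, which is out of range. That's exactly why so many cheap FreeSync displays lack LFC, and why the rule of thumb is a max refresh at least 2x the minimum.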

For people with the money to burn I think G-Sync Ultimate will be the real premium G-Sync experience, with a real premium price to go along with it.
 
I was just asking this very question in the Displays thread. There's a really comprehensive Google doc at the nV subreddit with folks' experiences. As expected, most freesync displays are working just fine, a few with issues.

As expected, nVidia was vastly overstating the issues and obscuring the details in their announcement and CES demo. Hell, they even went out of their way to find displays that didn't work just to demo them. But I do suppose that if they had just stated "most FreeSync displays work," people would be up in arms over the few models that have issues.
 
Well, I was hoping to use VRR on my Samsung 82NU8000, but since it is not supported over HDMI 2.0b, that will not be the case.

But on the other hand, my 32" Korean 4K screen should benefit.
 
Well, now I can use 61 to 75 Hz on my monitor without frame skipping.

I never understood why they sell FreeSync monitors that go up to a certain refresh rate, but make you use some form of Adaptive Sync to properly use it.

Same here: 1080 Ti + LG 34UC88, VRR OK @ 75 Hz without frame skipping or flickering (tests carried out with Pendulum, Battlefield V, Wolfenstein: The New Order, Doom, and AC Odyssey).
 
Well, I was hoping to use VRR on my Samsung 82NU8000, but since it is not supported over HDMI 2.0b, that will not be the case.

I *expect* HDMI's VRR implementation to win the day when all is said and done, given it's in the baseline HDMI 2.1 specification (meaning *everyone* has to support it) and performance should be predictable for all displays. Also, HDMI is far more widely supported, given DisplayPort is not a factor in the TV market.

I'm expecting HDMI 2.1 is going to do to Displayport what USB 2.0 did to Firewire.
 
So you can't use adaptive sync with a G-Sync monitor over HDMI? With an NVIDIA card, I mean.
 
I'm expecting HDMI 2.1 is going to do to Displayport what USB 2.0 did to Firewire.

I've been saying the same for a little while now. When I first read the specs for 2.1 a year or two ago, my first thought was "it's about time, and truly impressive." From 1.4 to 2.0b it's been nickel-and-dime increases with more hype than advance. Poor DP; they've ruled in that time but don't seem to be maintaining momentum for the next wave of tech.

Odd thing is that AMD has hinted they might be able to support HDMI 2.1 over DP in the recent interview Kyle had with Scott H. If NV did the same, then this would continue to help DP, at least at the card level.
 
They were extremely skeptical of NVIDIA's claims that FreeSync monitors exhibit issues; they knew that cheap-brand monitors had quality-control problems, but otherwise everything worked fine on the AMD side.

They're full of shit; FreeSync is an incredibly mixed bag with a lot of problems. G-Sync is more expensive, but I've only had a problem with it in one single game.
 
I've been saying the same for a little while now. When I first read the specs for 2.1 a year or two ago, my first thought was "it's about time, and truly impressive." From 1.4 to 2.0b it's been nickel-and-dime increases with more hype than advance. Poor DP; they've ruled in that time but don't seem to be maintaining momentum for the next wave of tech.

Odd thing is that AMD has hinted they might be able to support HDMI 2.1 over DP in the recent interview Kyle had with Scott H. If NV did the same, then this would continue to help DP, at least at the card level.

DisplayPort might backport some of the HDMI 2.1 feature set, but at the end of the day, resolution/refresh options are bandwidth-limited. Not much DisplayPort can do about that, and given they just released their 1.4 specification, it will be at least two to three years before they can catch up.

HDMI really went overboard on bandwidth and features this revision, and it has really obsoleted the need for a second specification. DisplayPort got a leg up by adding *just* enough bandwidth to handle 4K HDR 4:4:4 @ 60 Hz when they released the 1.3 specification (this is a notable limitation of HDMI 2.0), but they didn't up the bandwidth in 1.4, and it is now hopelessly out of date compared to HDMI 2.1's feature set.
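To put rough numbers on that, here's a back-of-the-envelope sketch in Python. It ignores blanking overhead (real requirements run roughly 10-20% higher), and the link rates are approximate effective payloads after line coding, not the headline raw figures:

```python
# Raw uncompressed video bandwidth vs. approximate effective link payload.
# Blanking overhead is ignored, so treat these as optimistic lower bounds.

GBIT = 1e9

def video_gbps(width, height, hz, bits_per_channel=10, channels=3):
    """Raw pixel data rate in Gbit/s for uncompressed RGB / 4:4:4 video."""
    return width * height * hz * bits_per_channel * channels / GBIT

links = {  # effective payload after line coding, in Gbit/s
    "HDMI 2.0 (18G TMDS, 8b/10b)": 14.4,
    "DP 1.3/1.4 (HBR3, 8b/10b)": 25.92,
    "HDMI 2.1 (48G FRL, 16b/18b)": 42.67,
}

modes = {"4K60 10-bit HDR": (3840, 2160, 60), "4K120 10-bit HDR": (3840, 2160, 120)}

for label, (w, h, hz) in modes.items():
    need = video_gbps(w, h, hz)
    print(f"{label}: needs ~{need:.1f} Gbps")
    for link, cap in links.items():
        print(f"  {link}: {'fits' if cap >= need else 'does not fit'}")
```

Run it and 4K60 10-bit lands around 14.9 Gbps, just over HDMI 2.0's ~14.4 but comfortably inside HBR3, which matches the 1.3-era split described above, while 4K120 10-bit (~29.9 Gbps) is out of reach for DP without compression but fine for HDMI 2.1.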

Fact is, DisplayPort never really broke into the TV market, to the point where one of its unexpected major features (FreeSync) has been back-ported and fudged unofficially into HDMI in order to get some form of VRR into TVs. The fact that TV manufacturers chose that route instead of supporting DisplayPort says all you need to know about the spec's future.
 
They're full of shit, Freesync is an incredibly mixed bag with a lot of problems. Gsync is more expensive but I've only had a problem with it in one single game.

And what testing have you done, on what GPUs and with what monitors? I can link three separate articles that tested many monitors with GPUs from both sides and found no issues with FreeSync. I'd love to see your tech review if you're calling all of these other reviewers full of shit.
 
And what testing have you done, on what GPUs and with what monitors? I can link three separate articles that tested many monitors with GPUs from both sides and found no issues with FreeSync. I'd love to see your tech review if you're calling all of these other reviewers full of shit.

There is no need to get defensive; the technology is great and free. I'm just pointing out that it isn't flawless like the article suggests, and neither is the expensive option of G-Sync; AMD's own forums are filled with the problems people have over extended use. I've used it for actual gaming for many, many hours over the last few years, not just a quick test in some top AAA titles. I think I've had around 30 FreeSync monitors now, with cards spanning the 290/X to the V64 and V56 I have currently, and dozens of driver revisions. Flickering is the #1 problem I've observed, although I have also had some issues with black-screen flashing and some very strange tearing artifacts. Once you go outside current-release huge titles like Battlefield or whatever, it's a crapshoot.

When they work, they seem identical. However, G-Sync works flawlessly in a number of games I play where FreeSync does not. The only game I have ever seen completely fail on G-Sync was Skyrim, but that was more an engine problem than the technology.
 
Once you go outside current-release huge titles like Battlefield or whatever, it's a crapshoot.

I play less intense current titles (League of Legends), older titles (Mass Effect, Civ 5), and others that do not tax my system at all, and they run locked at the frame limiter with G-Sync enabled. Mass Effect was actually demanding when I played through it the first time; now I'm playing at 165 FPS with G-Sync and no lag.

I've been in heaven. And I'm glad I didn't settle.
 
G-Sync wasn't more expensive exclusively due to the NVIDIA PCB. Anyone selling G-Sync monitors had to get NVIDIA's approval on panel selection. The manufacturer also had to pass some sort of quality certification.

That said, I now regret getting the Alienware AW3418DW G-Sync monitor. Samsung has indicated they're releasing a 49" 5120×1440, 120Hz, FreeSync 2 monitor.

I wouldn't feel too bad. That Alienware and the Predator X34P I have are the only two IPS panels in that class available, as far as I know. Everything else is not just FreeSync but also MVA. I went through a couple of those before I gave up and splurged on the X34P. MVA ultrawide is trash, G-Sync or not.
 
Stutter-fest on my 24UD58-B + GTX 1060. However, I am using triple monitors, two with DSR@UHD (4x scaling). IME, Nvidia cards seem to get finicky with 3+ monitors. Oh well.
 
Tested with the nVidia Pendulum demo (otherwise I honestly wouldn't have noticed the difference); working fine on my Samsung C24FG70.
 
G-Sync wasn't more expensive exclusively due to the NVIDIA PCB. Anyone selling G-Sync monitors had to get NVIDIA's approval on panel selection. The manufacturer also had to pass some sort of quality certification.

That said, I now regret getting the Alienware AW3418DW G-Sync monitor. Samsung has indicated they're releasing a 49" 5120×1440, 120Hz, FreeSync 2 monitor.

Pretty sad resolution for the size, though. I saw that 49" super-ultrawide Sammy at Microcenter and the pixels were the size of a knuckle; the aspect ratio also makes it completely worthless for everything. Even most online games ban that aspect ratio and force black bars on you. Perhaps it would be a good pick for movie watching only.
 
Pretty sad resolution for the size, though. I saw that 49" super-ultrawide Sammy at Microcenter and the pixels were the size of a knuckle; the aspect ratio also makes it completely worthless for everything. Even most online games ban that aspect ratio and force black bars on you. Perhaps it would be a good pick for movie watching only.

The 1440p version isn't out yet, I don't think. The one at Microcenter is actually 1080p, and yeah, I saw it in person; it's awful. I don't think the 1440p version would change my mind.
 
My no-name Monoprice 1440p panel seems to tolerate it just fine in the couple of games I tried. No tearing to be seen, which is about all I expected out of it.
 
Pretty sad resolution for the size, though. I saw that 49" super-ultrawide Sammy at Microcenter and the pixels were the size of a knuckle; the aspect ratio also makes it completely worthless for everything. Even most online games ban that aspect ratio and force black bars on you. Perhaps it would be a good pick for movie watching only.
As stated earlier, the 1440p version isn't even out yet. The 49" 32:9 is the equivalent of two 27" 16:9 screens, so 5120x1440 is a good resolution for it.
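The arithmetic checks out, for what it's worth; here's a quick diagonal-PPI sketch in Python (nominal panel diagonals assumed):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

# The 49" 32:9 5120x1440 panel vs. the common 27" 16:9 1440p panel:
print(f'49" 5120x1440: {ppi(5120, 1440, 49):.0f} PPI')
print(f'27" 2560x1440: {ppi(2560, 1440, 27):.0f} PPI')
```

Both come out around 109 PPI, so the 1440p version really would look like two 27" 1440p monitors fused together; the 1080p (3840x1080) model at Microcenter sits around 81 PPI, hence the knuckle-sized pixels.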
 
Works with my AOC 2460, although I can't tell any difference because it's 1080p and my 1080 Ti always drives it to 144 Hz...
 
Honestly, if a monitor checks all the FreeSync 2 boxes, there really isn't going to be a heck of a lot of difference between it and a standard G-Sync display. The CHG70 is a damn good FreeSync 2 display. The biggest thing that makes G-Sync better is just the overall level of feature quality. You will never find a G-Sync panel that doesn't support the entire range of the monitor, from a single hertz all the way up to the max refresh rate of the panel. That means every display, no matter the G-Sync version or age, supports LFC. FreeSync's problem has always been the sheer number of displays with bad FreeSync implementations. Finding a good one generally required a ton of looking and research, especially as a lot of reviews do a piss-poor job of covering FreeSync support. However, if you found a FreeSync monitor that matched the feature set of G-Sync, then there should be no real difference between them.

For people with the money to burn I think G-Sync Ultimate will be the real premium G-Sync experience, with a real premium price to go along with it.

I have the CHG70. I was surprised that it wasn't on the qualified list, lol. The only thing I can think is that NVIDIA is butthurt about supporting FreeSync, so they did their best to make it look inferior by certifying only 12 monitors, lol. I honestly think they disqualified a lot of cheap monitors with a low range, but they still missed a lot of good ones, including the CHG70 I have. So clearly they did not test all the good monitors, because I don't believe for a moment they only found 12 worth certifying.
 
I have the CHG70. I was surprised that it wasn't on the qualified list, lol. The only thing I can think is that NVIDIA is butthurt about supporting FreeSync, so they did their best to make it look inferior by certifying only 12 monitors, lol. I honestly think they disqualified a lot of cheap monitors with a low range, but they still missed a lot of good ones, including the CHG70 I have. So clearly they did not test all the good monitors, because I don't believe for a moment they only found 12 worth certifying.

When you take into account that literally zero Freesync monitors are up to the G-Sync spec, it makes a bit of sense.

They're not just testing 'does it work'.
 