Nvidia's Freesync Monitor Support Tested

AlphaAtlas

Nvidia recently released a driver update that adds support for FreeSync monitors, but the GPU maker was quick to point out that "non-validated" displays could exhibit serious issues. However, Techspot put that claim to the test last January, and just followed up on it with a handful of new FreeSync monitors from LG. Out of the 12 monitors they've tested so far, 11 have worked flawlessly, and the one that didn't only supports FreeSync over HDMI, while Nvidia only supports the adaptive sync technology over DisplayPort.

Check out the video here.

As we said in the original article, we think it's safe to say that if you purchase a new FreeSync model today, it will work fine with Nvidia GPUs. You shouldn't expect to see any graphical issues, whether you buy an LG monitor or a display from a different maker. Only the very early, older FreeSync monitors may have some issues. Thus our recommendation continues to be to select your next monitor on its merits, features and your preferences. If it's a FreeSync gaming monitor and you can save some money, that's great. GeForce GPU owners no longer need to bother with G-Sync to get proper variable refresh rates, unless that's the monitor you want for other reasons, as mentioned above. These results from a selection of LG monitors just reinforce that FreeSync gaming displays can and will perform perfectly.
 
My 1440p 144 Hz FreeSync monitor arrives today. Can't wait to try it out! I already had a decent monitor, but it only had HDMI, no DisplayPort.
 
So is this the end of G-Sync monitors then? Is there any reason to stomach the mark-up now?
 
So is this the end of G-Sync monitors then? Is there any reason to stomach the mark-up now?

Yes and no. There is still something to be said for Nvidia's gauntlet of tests a monitor needs to pass in order to be G-Sync certified. Whether that's worth the $100-200+ difference compared to a roughly comparable FreeSync monitor is debatable. If Windows HDR support weren't dogshit, I'd say G-Sync Ultimate would be a compelling reason to keep G-Sync around, since those are the best HDR panels currently available, but HDR is pretty pointless on PC right now.
 
So is this the end of G-Sync monitors then? Is there any reason to stomach the mark-up now?

Not sure, especially when the monitor I'm eyeballing has a version for each. I'd say it's dumb to go G-Sync when there's a FreeSync 2 option available with similar specs. The 34" Ultragears from LG, for example: there's an "F" version and a "G" version.

I don't know if my next video card is going to be Nvidia or AMD. I'm on a GTX 1070 right now and will probably upgrade to whatever flagship comes next from AMD, assuming they hit the bang-for-the-buck mark like they're known for.
 
So is this the end of G-Sync monitors then? Is there any reason to stomach the mark-up now?

Sure there is, if certain conditions are met.

LCD monitors will not work much below 48 Hz. Yes, some will take 40 Hz, but they are not happy.

So we will say this is below the G-Sync/FreeSync range.
On a G-Sync monitor, the video card can send 2 fps if it wants; the frame buffer in the monitor (that $100-200 G-Sync tax) will handle it so the panel electronics never see anything below 48 Hz.
On a FreeSync monitor with an AMD card, if the game sends 2 fps, the AMD driver will use an algorithm to multiply the frames until they reach the bottom of the monitor's FreeSync range. So if the game is only putting out 32 fps, the AMD card will double that to 64 Hz.

As of now, Nvidia cards do not have this algorithm. So if they go below the FreeSync range, you either get tearing with V-Sync off, or V-Sync lag with V-Sync on. Currently it's a "bug": you should leave V-Sync ON for the least amount of lag inside the FreeSync range (yes, it's less lag than V-Sync off), and deal with the bug, which is that V-Sync does not engage correctly below the FreeSync range.
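
To make the frame-multiplication idea concrete, here is a minimal Python sketch of how an LFC-style driver could pick a repeat count. The 48 Hz floor and the search loop are illustrative assumptions only; this is not AMD's (or Nvidia's) actual driver code.

```python
# Minimal sketch of LFC-style frame multiplication, as described above.
# The 48 Hz floor and the search loop are illustrative assumptions only;
# this is not AMD's (or Nvidia's) actual driver code.

def lfc_refresh(frame_rate_fps, vrr_min_hz=48):
    """Return (repeats, refresh_hz): show each frame 'repeats' times so the
    resulting refresh rate lands back inside the monitor's VRR window."""
    if frame_rate_fps >= vrr_min_hz:
        return 1, frame_rate_fps          # already in range, no repeats needed
    repeats = 2
    while frame_rate_fps * repeats < vrr_min_hz:
        repeats += 1                      # double, triple, ... until back in range
    return repeats, frame_rate_fps * repeats

for fps in (2, 20, 32, 60):
    repeats, hz = lfc_refresh(fps)
    print(f"{fps:>3} fps -> show each frame {repeats}x -> panel refreshes at {hz} Hz")
# 32 fps -> show each frame 2x -> panel refreshes at 64 Hz, matching the example above.
```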



I will say from experience that "windowed" mode and certain apps do not like FreeSync via Nvidia on my monitor. paint.net does a mini disco fever in windowed mode, as does MPC-BE in full screen mode. Those are the only two apps so far that I've had to tweak.
 
Sure there is, if certain conditions are met.

LCD monitors will not work much below 48 Hz. Yes, some will take 40 Hz, but they are not happy.

So we will say this is below the G-Sync/FreeSync range.
On a G-Sync monitor, the video card can send 2 fps if it wants; the frame buffer in the monitor (that $100-200 G-Sync tax) will handle it so the panel electronics never see anything below 48 Hz.
On a FreeSync monitor with an AMD card, if the game sends 2 fps, the AMD driver will use an algorithm to multiply the frames until they reach the bottom of the monitor's FreeSync range. So if the game is only putting out 32 fps, the AMD card will double that to 64 Hz.

As of now, Nvidia cards do not have this algorithm. So if they go below the FreeSync range, you either get tearing with V-Sync off, or V-Sync lag with V-Sync on. Currently it's a "bug": you should leave V-Sync ON for the least amount of lag inside the FreeSync range (yes, it's less lag than V-Sync off), and deal with the bug, which is that V-Sync does not engage correctly below the FreeSync range.

LFC support gets around that issue. And, really, anyone buying a Freesync monitor right now should only buy one with the range to support LFC.
 
LFC support gets around that issue. And, really, anyone buying a Freesync monitor right now should only buy one with the range to support LFC.

To add to this, the linked video states that their testing shows LFC is working. There isn't any issue below 48 fps as long as you buy a monitor with enough range to support LFC (max FreeSync refresh roughly twice the minimum, i.e. 144 Hz monitors are usually fine).
 
To add to this, the linked video states that their testing shows LFC is working. There isn't any issue below 48 fps as long as you buy a monitor with enough range to support LFC (max FreeSync refresh roughly twice the minimum, i.e. 144 Hz monitors are usually fine).

2.4:1 is the ratio. So a 144 Hz monitor would need to support at least 60-144 Hz. Thankfully, AMD's database of FreeSync monitors makes it easy by saying whether a monitor supports LFC or not.
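
If you want to sanity-check a spec sheet yourself, the ratio rule above is trivial to compute. A rough Python sketch follows; the 2.4 threshold is taken from the post above, and AMD's exact cutoff may differ, so treat it as a rule of thumb.

```python
# Rough check of the range rule above: the FreeSync maximum should be at
# least ~2.4x the minimum for LFC. The 2.4 figure comes from the post above;
# AMD's exact cutoff may differ, so treat this as a rule of thumb.

def supports_lfc(min_hz, max_hz, ratio=2.4):
    return max_hz / min_hz >= ratio

print(supports_lfc(60, 144))   # True:  144 / 60 = 2.4, the 144 Hz example above
print(supports_lfc(48, 75))    # False: 75 / 48 is about 1.56, too narrow for LFC
```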
 
To add to this, the linked video states that their testing shows LFC is working. There isn't any issue below 48 fps as long as you buy a monitor with enough range to support LFC (max FreeSync refresh roughly twice the minimum, i.e. 144 Hz monitors are usually fine).

OK, I'll play dumb because I haven't watched the video. So Nvidia does have a working software LFC solution that is not bugged? As of the first January release, it was bugged, and I have not read anywhere that they have gotten it working. And monitors do not have LFC built in; if the monitor supports a 2:1 or 2.4:1 range or whatever, the AMD drivers (not sure about Nvidia) will engage software LFC automatically.
 
This would happen after I bought a G-Sync monitor, when I could have gotten a less expensive FreeSync monitor.

I know this feel, bro. My G-Sync monitor was on sale for a similar price to the FreeSync alternatives, though, so I don't feel so bad about it, aside from being locked into Nvidia for the foreseeable future.
 
OK, I'll play dumb because I haven't watched the video. So Nvidia does have a working software LFC solution that is not bugged? As of the first January release, it was bugged, and I have not read anywhere that they have gotten it working. And monitors do not have LFC built in; if the monitor supports a 2:1 or 2.4:1 range or whatever, the AMD drivers (not sure about Nvidia) will engage software LFC automatically.

Yes, the video states it's working now. Hardware Unboxed is pretty well known for its monitor review content and graphics card reviews, so I personally would trust that they know what they are talking about. It's worked with the monitors that they've tested that support the feature. There might be a few exceptions, I guess, but from what they said, LFC is working on Nvidia with FreeSync monitors now, by and large.
 
So is this the end of G-Sync monitors then? Is there any reason to stomach the mark-up now?

In the long run, probably, but it is going to be a while. The reason you still might get one now is if there's a product you like better that uses it, or you want the absolute lowest lag and best VRR support. Unsurprisingly, Nvidia cards like their own stuff the best, so you get everything working as well as possible if you pair an Nvidia card with a G-Sync monitor. Also, it seems that you get some of the very lowest latency with G-Sync monitors. Not that FreeSync is bad, but G-Sync seems to be able to push the latency down a bit more.

However, I expect that G-Sync will start dying. Monitor vendors are always competing on price, and a G-Sync module costs more. Plus, VRR is part of the HDMI 2.1 spec, so you'll see it in lots of TVs and that'll filter down to monitors. As such, G-Sync will fade away, I would imagine.
 
So is this the end of G-Sync monitors then? Is there any reason to stomach the mark-up now?

If you already have one and want more; I'm thinking mixing and matching may not work so well.

Someone should test that :)
 
In the long run, probably, but it is going to be a while. The reason you still might get one now is if there's a product you like better that uses it, or you want the absolute lowest lag and best VRR support. Unsurprisingly, Nvidia cards like their own stuff the best, so you get everything working as well as possible if you pair an Nvidia card with a G-Sync monitor. Also, it seems that you get some of the very lowest latency with G-Sync monitors. Not that FreeSync is bad, but G-Sync seems to be able to push the latency down a bit more.

However, I expect that G-Sync will start dying. Monitor vendors are always competing on price, and a G-Sync module costs more. Plus, VRR is part of the HDMI 2.1 spec, so you'll see it in lots of TVs and that'll filter down to monitors. As such, G-Sync will fade away, I would imagine.


Makes me wonder, though: how long do we have before gaming becomes a streaming experience where you log into a gaming server and establish a session to a VM, with your games streamed to your home computer, which doesn't have to be much more powerful than a thin client? You get a picture, and your mouse, keyboard, or controller actions are sent to the VM, which reacts to your input and updates the display.
 
Makes me wonder, though: how long do we have before gaming becomes a streaming experience where you log into a gaming server and establish a session to a VM, with your games streamed to your home computer, which doesn't have to be much more powerful than a thin client? You get a picture, and your mouse, keyboard, or controller actions are sent to the VM, which reacts to your input and updates the display.

The day that happens is the day I quit PC & console gaming.
 
Makes me wonder, though: how long do we have before gaming becomes a streaming experience where you log into a gaming server and establish a session to a VM, with your games streamed to your home computer, which doesn't have to be much more powerful than a thin client? You get a picture, and your mouse, keyboard, or controller actions are sent to the VM, which reacts to your input and updates the display.

Over a decade, at least. The US is the single largest game and tech purchasing country on the planet and our internet infrastructure is utter crap. Data caps, overloaded nodes, and high data costs will prevent game streaming from being the only option for a long time.
 
Makes me wonder, though: how long do we have before gaming becomes a streaming experience where you log into a gaming server and establish a session to a VM, with your games streamed to your home computer, which doesn't have to be much more powerful than a thin client? You get a picture, and your mouse, keyboard, or controller actions are sent to the VM, which reacts to your input and updates the display.

I kinda doubt it'll happen. Latency is always going to be an issue with something like that, the speed of light has no workarounds.
 
I kinda doubt it'll happen. Latency is always going to be an issue with something like that, the speed of light has no workarounds.

It depends on what you're playing. I was in the Google Chrome streaming test of Assassin's Creed and it wasn't bad at all.

The graphics quality wasn't that great (then again, I do have a 34-inch display; it would look fine on a 1080p monitor), but overall it wasn't a horrible experience.
 
There is a "tool" to manually program in the freesync ranges if the driver can not detect them. Its called CRU.
Thanks. I know about CRU and tried it on my TV without luck but I haven't tested on this monitor yet.
 
Over a decade, at least. The US is the single largest game and tech purchasing country on the planet and our internet infrastructure is utter crap. Data caps, overloaded nodes, and high data costs will prevent game streaming from being the only option for a long time.


It's my understanding that game streaming is not high bandwidth. Certainly no more than video streaming.

Anyone who can watch Netflix at under 4K resolutions can do game streaming just fine.
 
I kinda doubt it'll happen. Latency is always going to be an issue with something like that, the speed of light has no workarounds.

There is no increase in latency with game streaming. In either case, signals carry information both ways. The difference is where the greatest delta lies. With traditional online games, the latency lies between the client and the servers/peer clients. With game streaming the greatest delta is between the user and the client. But when all clients are essentially co-located with the servers, you are not waiting on updates from all players to be processed and sent to you. You are only waiting on your client to update your display and you are sending your clicks and key-presses. Your upstream is tiny, your downstream is not huge at all. Any latency is the same latency that you would experience in all online gaming based on your connection to any given server.
 
It's my understanding that game streaming is not high bandwidth. Certainly no more than video streaming.

Anyone who can watch Netflix at under 4K resolutions can do game streaming just fine.

Depends on the quality of the stream. If you want to match even current console quality, it's going to eat up data. Even with the pretty good Google AC: Odyssey test, you could tell they were making some pretty big visual quality sacrifices to make it work. For game streaming to ever become a dominant thing, they need to push 4K, HDR, and high-end console settings with decent audio and little to no visual artifacting. They also need to find ways to reduce input lag as much as possible. There is some input lag on Google's system, and it's fine for something like AC, but it would be impossible with a fast-paced shooter or a fighting game, especially if you add in the lag from wireless controllers and the natural input lag of TVs (where most people would be playing these games).

Netflix 4K requires a 25 Mbps connection. The average speed in the US is 18.75 Mbps.
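
For a rough sense of scale against data caps, here's a quick back-of-the-envelope in Python. The 25 Mbps figure is the Netflix 4K number above; the 1 TB monthly cap and 3 hours of play per day are purely illustrative assumptions.

```python
# Back-of-the-envelope data usage for a sustained game stream.
# 25 Mbps is the Netflix 4K requirement cited above; the 1 TB cap and
# 3 hours/day of play are illustrative assumptions, not measurements.

STREAM_MBPS = 25
HOURS_PER_DAY = 3
CAP_GB = 1000                                  # a typical 1 TB monthly cap

gb_per_hour = STREAM_MBPS / 8 * 3600 / 1000    # Mbps -> MB/s -> GB per hour
gb_per_month = gb_per_hour * HOURS_PER_DAY * 30

print(f"{gb_per_hour:.2f} GB per hour")                     # ~11.25 GB/hour
print(f"{gb_per_month:.0f} GB per month, "
      f"{gb_per_month / CAP_GB:.0%} of a {CAP_GB} GB cap")  # ~1013 GB, ~101% of the cap
```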
 
Yes and no. There is still something to be said for Nvidia's gauntlet of tests a monitor needs to pass in order to be G-Sync certified. Whether that's worth the $100-200+ difference compared to a roughly comparable FreeSync monitor is debatable. If Windows HDR support weren't dogshit, I'd say G-Sync Ultimate would be a compelling reason to keep G-Sync around, since those are the best HDR panels currently available, but HDR is pretty pointless on PC right now.
HDR support is not dogshit on PC right now. More and more games being released have proper HDR10 implemented on PC and it looks great. There just needs to be better communication on how to use it because right now there is still a conflict between how Windows and hardware vendors switch it on.
There is no increase in latency with game streaming. In either case, signals carry information both ways. The difference is where the greatest delta lies. With traditional online games, the latency lies between the client and the servers/peer clients. With game streaming the greatest delta is between the user and the client. But when all clients are essentially co-located with the servers, you are not waiting on updates from all players to be processed and sent to you. You are only waiting on your client to update your display and you are sending your clicks and key-presses. Your upstream is tiny, your downstream is not huge at all. Any latency is the same latency that you would experience in all online gaming based on your connection to any given server.
Except movement and everything else is still performed on the client for online games. With game streaming you have to wait for the data to travel to the server for your input to register and for it to come back to you to see the result of that input. On a good day that could be an extra 30-40 ms of input lag just from the network latency. That is no bueno for any kind of game.
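
To put some numbers on that, here is a quick sketch of where the extra delay stacks up. Every figure below is an illustrative assumption for the sake of the comparison, not a measurement.

```python
# Rough breakdown of input-to-photon latency, local vs streamed.
# Every number below is an illustrative assumption, not a measurement.

local_pipeline_ms = {                      # paid whether local or streamed
    "input sampling": 8,
    "simulate + render one 60 fps frame": 17,
    "display processing": 10,
}
streaming_extra_ms = {                     # only paid when the game runs remotely
    "network round trip": 30,              # the 30-40 ms figure mentioned above
    "video encode + decode": 10,
}

local = sum(local_pipeline_ms.values())
streamed = local + sum(streaming_extra_ms.values())
print(f"local:    ~{local} ms from click to screen")     # ~35 ms
print(f"streamed: ~{streamed} ms from click to screen")  # ~75 ms
```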
 
Except movement and everything else is still performed on the client for online games. With game streaming you have to wait for the data to travel to the server for your input to register and for it to come back to you to see the result of that input. On a good day that could be an extra 30-40 ms of input lag just from the network latency. That is no bueno for any kind of game.

Yup, it's like having your system RAM on a 10-gigabit connection on a local network. Yes, you get 1,000 MB a second, but the latency makes it worthless.
 
Depends on the quality of the stream. If you want to match even current console quality, it's going to eat up data. Even with the pretty good Google AC: Odyssey test, you could tell they were making some pretty big visual quality sacrifices to make it work. For game streaming to ever become a dominant thing, they need to push 4K, HDR, and high-end console settings with decent audio and little to no visual artifacting. They also need to find ways to reduce input lag as much as possible. There is some input lag on Google's system, and it's fine for something like AC, but it would be impossible with a fast-paced shooter or a fighting game, especially if you add in the lag from wireless controllers and the natural input lag of TVs (where most people would be playing these games).

Netflix 4K requires a 25 Mbps connection. The average speed in the US is 18.75 Mbps.


How is that average speed determined? People choosing lower speeds when higher tiers are available would impact that number.
 
How is that average speed determined? People choosing lower speeds when higher tiers are available would impact that number.

Netflix and YouTube give out stats for your area. So you might be paying for 100 Mbps, but you can only get 18.75 over a single stream. You might be able to get three streams, but no single one will hit 25 Mbps. I have 100 Mbps; I can run a speed test on Friday night and get 100 Mbps, but the odds of a 4K stream not throttling are low.
 