Nvidia's Freesync Monitor Support Tested

There is no increase in latency with game streaming. In either case, signals carry information both ways. The difference is where the greatest delta lies. With traditional online games, the latency lies between the client and the servers/peer clients. With game streaming, the greatest delta is between the user and the client. But when all clients are essentially co-located with the servers, you are not waiting on updates from all players to be processed and sent to you. You are only waiting on your client to update your display, and all you are sending is your clicks and key-presses. Your upstream is tiny, and your downstream isn't huge either. Any latency is the same latency that you would experience in all online gaming, based on your connection to any given server.

Thing is, input lag is a bitch and really noticeable; client-server lag, not so much. If you have a game with good net code, 100ms between you and a server is OK. If you get 100ms between your mouse and your display, it is hard to use. Try it some time: turn on the processing on a TV, which will put the lag in the 100-150ms range. It is a pain even using the desktop with that kind of lag.
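
To put rough numbers on what the two posts above are arguing about, here is a back-of-the-envelope sketch in Python. All figures are illustrative assumptions, not measurements; the point is just which hops sit in the mouse-to-display path.

[CODE]
# Rough latency-budget sketch (all numbers are assumed, not measured).

def budget(components):
    """Sum per-hop delays in milliseconds."""
    return sum(components.values())

# Traditional online game: the server round trip affects game-state updates,
# but it is NOT in the path between your mouse and your display.
local_input_to_photon = {
    "input device + OS": 10,
    "game update + render": 15,
    "display": 10,
}

# Game streaming: the network sits directly in the mouse-to-display path,
# so every click waits on the round trip plus encode/decode.
streamed_input_to_photon = {
    "input device + OS": 10,
    "uplink (clicks/keys)": 20,
    "remote update + render + encode": 25,
    "downlink (video) + decode": 30,
    "display": 10,
}

print(f"local feel:    {budget(local_input_to_photon)} ms")
print(f"streamed feel: {budget(streamed_input_to_photon)} ms")
[/CODE]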
 
HDR on PC is confusing and not perfect, but after you get it set up correctly it does look great.

If you look here, there is a growing list of games with support, including big ones like Call of Duty, Battlefield, Assassin's Creed, Resident Evil, etc.

https://www.rockpapershotgun.com/2019/01/31/what-graphics-card-hdr-pc-games/

So can you get HDR working with an nVidia video card but a FreeSync 2 monitor? I never thought of this till now... I was planning on a FreeSync 2 monitor upgrade, assuming Navi will be a good value and a worthy upgrade from a GTX 1070. G-Sync and RTX are just budget busters! Grr
 
I've noticed some people ream Nvidia for holding the position that monitors may not be supported even when they do actually work. I think it's smart that Nvidia holds that stance, because God forbid a monitor doesn't work after someone at Nvidia said "a majority will work!" After working in IT support for a while now, I've grown cynical about consumers. We demand 200% but expect to pay 50% :) People are so quick to be vocal when things are wrong, and even when things are right, like 11 out of 12 monitors that aren't supposed to work actually working, it's very much possible it will be framed in a way that makes the company look bad.
 
So can you get HDR working with an nVidia video card but a FreeSync 2 monitor? I never thought of this till now... I was planning on a FreeSync 2 monitor upgrade, assuming Navi will be a good value and a worthy upgrade from a GTX 1070. G-Sync and RTX are just budget busters! Grr
My HDR rig is with AMD (second rig in sig). The TV I have is regular FreeSync, not FreeSync 2, but HDR still works somehow (I think FreeSync 2 includes other specs that must be met, not just HDR).

I have not tested with Nvidia because G-Sync over FreeSync only works on DisplayPort (not HDMI) so I don't believe it will work with my Samsung TV. I could test it just to see, I guess you never know.
 
It's understandable the way Nvidia framed the situation. Probably a lot of monitors do work (from what I've seen online) but because it's not 100% they can't guarantee it.

Also, they probably still want to keep G-Sync as the premium brand (until the VRR standard takes off), so saying FreeSync doesn't fully work gives people a reason to spend extra on G-Sync.
 
It's understandable the way Nvidia framed the situation. Probably a lot of monitors do work (from what I've seen online) but because it's not 100% they can't guarantee it.

Also, they probably still want to keep G-Sync as the premium brand (until the VRR standard takes off), so saying FreeSync doesn't fully work gives people a reason to spend extra on G-Sync.

For the most part Nvidia's qualifications make sense. However, I feel like requiring FreeSync/adaptive sync to be enabled by default is a bit ridiculous.
 
For the most part Nvidia's qualifications make sense. However, I feel like requiring FreeSync/adaptive sync to be enabled by default is a bit ridiculous.

People are not good at figuring shit out, so Nvidia wants it to be easy for non-technical users. I mean, part of the reason HDMI came to be what it is is that companies discovered that in the early days of HDTV, most owners did not have HD hooked up properly, even when they had an HD cable box. The 5 wires shit was just too complex for them to figure out, so they had their expensive toys hooked up doing SD.
 
The 5 wires shit was just too complex for them to figure out, so they had their expensive toys hooked up doing SD.
It also didn't help that component and composite cables looked similar, sounded similar, and still worked if you plugged them in wrong (but as SD of course).
 
It also didn't help that component and composite cables looked similar, sounded similar, and still worked if you plugged them in wrong (but as SD of course).

For sure, but it really wasn't that complex. Still, it was too complex for the average user. That was when the industry realized it needed a single connection for everything, otherwise there were going to be issues. Same deal with VRR being default-on: no, it isn't complex to change, but there will be a large number of non-technical users who can't figure it out.
 
Well, VRR absolutely should be on by default if the hardware supports it.

While there are a few things you may need to do to get the best experience in PC games (for example, frame limiting or forcing v-sync), I can't think of anything negative that will happen from it being on.
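
As a concrete illustration of the frame-limiting point, here is a minimal frame-pacer sketch in Python. The 144 Hz panel and the cap a few fps below it are assumed example numbers; a real game would use an in-engine limiter or an external tool. This just shows the idea of keeping frame delivery inside the VRR window instead of hitting the v-sync ceiling.

[CODE]
import time

MAX_REFRESH_HZ = 144                  # assumed panel max refresh
FPS_CAP = MAX_REFRESH_HZ - 3          # cap a few fps below max (rule of thumb)
FRAME_BUDGET = 1.0 / FPS_CAP          # seconds allotted per frame

def run(n_frames, do_frame):
    """Render n_frames, sleeping off leftover budget so we never outrun the cap."""
    deadline = time.perf_counter()
    for _ in range(n_frames):
        do_frame()                    # stand-in for game update + render
        deadline += FRAME_BUDGET
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)     # pace the frame
        else:
            deadline = time.perf_counter()  # we were slow; resync, don't "catch up"

if __name__ == "__main__":
    run(300, lambda: time.sleep(0.002))   # fake ~2 ms of frame work
[/CODE]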
 
Why is it in a video? Can someone sum it up, please? What monitors did they test? I'm not watching hardware review videos because they're the least efficient way of conveying information.
 
Which monitor? Not every single FreeSync monitor will work perfectly.
I tried it at a friend's house: a 1080 Ti on some curved screen that isn't on the approved list. After switching FreeSync on, it glitched and the colors were messed up. Went back to DP 1.1 and it was fine again. I will post the model when I find it.
 
I just tried on my Samsung Q7F FreeSync TV. As expected, it did not work.

The TV uses HDMI only, and I know Nvidia says they don't support it, but some dude on Reddit claimed he got it working on his TV.

Once I connected the TV, the G-Sync tab in the Nvidia control panel disappeared, and when I tried the Pendulum demo, the G-Sync button wouldn't click and it was visibly not working.

Hey, worth a shot, right?
 
I just tried on my Samsung Q7F FreeSync TV. As expected, it did not work.

The TV uses HDMI only, and I know Nvidia says they don't support it, but some dude on Reddit claimed he got it working on his TV.

Once I connected the TV, the G-Sync tab in the Nvidia control panel disappeared, and when I tried the Pendulum demo, the G-Sync button wouldn't click and it was visibly not working.

Hey, worth a shot, right?
Do you have DisplayPort? It needs a DisplayPort connection with Adaptive-Sync (DP 1.2a or newer); even 1.1 won't work. I doubt an adapter would work, but you can try that too.
 
I just tried on my Samsung Q7F FreeSync TV. As expected, it did not work.

The TV uses HDMI only, and I know Nvidia says they don't support it, but some dude on Reddit claimed he got it working on his TV.

Once I connected the TV, the G-Sync tab in the Nvidia control panel disappeared, and when I tried the Pendulum demo, the G-Sync button wouldn't click and it was visibly not working.

Hey, worth a shot, right?

Won't work unless you're using DisplayPort.
 
So can you get HDR working with an nVidia video card but a FreeSync 2 monitor?

Understand that "FreeSync 2" is just FreeSync with the requirements raised from 'there might be a little VRR range' to 'almost like G-Sync but not quite', plus added HDR minimums.

The VRR and the HDR aspects aren't related except by the branding, and Nvidia GPUs now support both over DP.
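
To illustrate the point that the VRR and HDR requirements are independent checks that only share the branding, here is a toy sketch. The "max refresh at least 2x min refresh" rule is the commonly cited condition for low-framerate compensation; the HDR threshold below is a placeholder assumption, not AMD's actual certification spec.

[CODE]
from dataclasses import dataclass

@dataclass
class Display:
    vrr_min_hz: int
    vrr_max_hz: int
    peak_nits: int
    wide_gamut: bool

def vrr_is_gsync_like(d: Display) -> bool:
    # Wide VRR range with room for low-framerate compensation
    # (frames can be doubled when fps drops below the minimum).
    return d.vrr_max_hz >= 2 * d.vrr_min_hz

def meets_hdr_floor(d: Display, min_nits: int = 400) -> bool:
    # Placeholder HDR floor; the real FreeSync 2 criteria are AMD's own.
    return d.peak_nits >= min_nits and d.wide_gamut

basic_freesync = Display(vrr_min_hz=48, vrr_max_hz=75, peak_nits=300, wide_gamut=False)
freesync2_like = Display(vrr_min_hz=48, vrr_max_hz=144, peak_nits=600, wide_gamut=True)

for d in (basic_freesync, freesync2_like):
    print(d, "| VRR:", vrr_is_gsync_like(d), "| HDR:", meets_hdr_floor(d))
[/CODE]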
 
Do you have DisplayPort? It needs a DisplayPort connection with Adaptive-Sync (DP 1.2a or newer); even 1.1 won't work. I doubt an adapter would work, but you can try that too.
I tried directly on HDMI 2.0, and then with 2 different DisplayPort adapters connected to the Nvidia card. None of them worked.
 
To be clear, I did not think it would work. But that post on Reddit gave me some hope so I spent a few minutes testing just to be sure.
 
You know, if you actually clicked the link to the front-page news item, it would have given you a link to an article.

https://www.techspot.com/article/1810-lg-freesync-and-nvidia-geforce/
You know, I actually clicked the link, and the news item was showing a video with no mention of an article, so I didn't click on the video. I didn't know there was an actual article hidden behind links attached to random words. Oh wait, the article isn't there to complement the video; the video is a summary of multiple previous articles.
 