Nvidia Expands G-Sync Support to Approved Adaptive Sync Monitors

Sounds too good to be true.
I can see Nvidia using this to sell their branding back to manufacturers, so monitors carry the approved Nvidia badge. Why can't they support FreeSync as a whole instead of trying to create another form of segmentation? And if they're charging money for their "support", that just makes things more expensive for consumers.

That was never the aim for this platform.
 
It's great news all around, I think. It doesn't really hurt anybody who already has a G-Sync display, and going forward it should mean cheaper monitors that work with both Nvidia and AMD (and presumably Intel's efforts). And if Nvidia won't certify a monitor unless it has a high-quality VRR implementation, then hopefully that means we'll see higher-quality FreeSync implementations from everyone. But I guess we'll see... maybe to get G-Sync certified you have to pay a truckload to Nvidia... like all those motherboards that weren't SLI certified even though they had two or more PCIe x16 slots...
 
In a couple of years G-sync will be as relevant as PhysX.

PhysX is still the leading physics SDK and used in Cryengine, Unity, Unreal, and Dawn engines. It has hardly lost relevance and works on every gaming platform. Maybe you should have clarified GPU PhysX.

Game engines no longer need the extra compute power from the GPU for physics, so GPU PhysX has died off for lack of need.

G-Sync will remain relevant, even with NV's support of FreeSync, as long as the scaler and the manufacturers' quality criteria remain. NV is super picky about the panels that get G-Sync, something FreeSync doesn't require. People are obviously willing to pay more for better quality.
 
It doesn't really hurt anybody who already has a G-Sync display, and going forward it should mean cheaper monitors that work with both Nvidia and AMD (and presumably Intel's efforts).

Hard to hurt someone that has already taken the shaft.
 
I think it's appropriate to remember that what this actually amounts to is that things will finally be getting done the way they always should have been done. The concept that the card should sync to the monitor's refresh rate was flawed from the very beginning. It always should have been the other way around with monitors adjusting to what the display cards can manage.

And it's only taken us 30+ years to get there. That's progress (y)
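Not from the article, just a toy sketch of the point above, the difference between a fixed-refresh panel (the frame waits for the next scheduled refresh) and a VRR panel (the refresh waits for the frame). All the numbers here are hypothetical.

```python
# Toy comparison of fixed refresh vs. variable refresh rate (VRR).
# With fixed refresh, a finished frame is held until the next refresh tick;
# with VRR the panel refreshes when the frame is ready, clamped to the
# panel's supported refresh window. Times are in milliseconds.

def fixed_refresh_display_time(render_done_ms, refresh_hz=60):
    """Frame appears at the next fixed refresh tick after rendering finishes."""
    period = 1000.0 / refresh_hz
    ticks_elapsed = int(render_done_ms // period) + 1
    return ticks_elapsed * period

def vrr_display_time(render_done_ms, min_hz=48, max_hz=144):
    """Panel refreshes as soon as the frame is ready, within its VRR window."""
    min_period = 1000.0 / max_hz   # can't refresh faster than max_hz
    max_period = 1000.0 / min_hz   # can't hold a refresh longer than this
    return min(max(render_done_ms, min_period), max_period)

# A frame finishing at 20 ms waits until ~33.3 ms on a 60 Hz panel,
# but is shown at 20 ms on a 48-144 Hz VRR panel.
print(fixed_refresh_display_time(20.0))  # ~33.33
print(vrr_display_time(20.0))            # 20.0
```

That delay (or the tearing you accept to avoid it) is exactly what the "monitor follows the card" approach removes.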

Yeah, I remember when DVI was going to solve all our problems. Talk directly to the panel, bypass everything, and make "refresh rate" a thing of the past, because it was just going to update each pixel as needed.
 
I see no reason to drop my G-Sync IPS 144 Hz display until MicroLED takes over for gaming screens. Bought this screen two years ago. Expect to use it for another four years at least. If it breaks I'll just buy the same screen.
Yeah no reason to dump a good gsync for freesync
 
I see no reason to drop my G-Sync IPS 144 Hz display until MicroLED takes over for gaming screens. Bought this screen two years ago. Expect to use it for another four years at least. If it breaks I'll just buy the same screen.
Yeah no reason to dump a good gsync for freesync

True, if you have Gsync already, there is really no reason to dump it, but this does open more options for future monitor purchases.
 
Well this announcement just made my next monitor purchase much easier.

This could actually make monitor reviews interesting again. Does X FreeSync monitor work "well" with Nvidia cards?

Somebody could probably get some clicks/views doing that.....
 
I remember when it was said this wasn’t possible, because it was dependent on a hardware module.
 
I saw the beta driver floating around somewhere this morning, the one they're releasing in a week or so, but I didn't snag it. Can someone link me, please? Can't find it now. Thanks in advance.
 
A QC-approved list is a good idea, but I bet the certification process costs the manufacturer some kind of royalty fee. I think NVIDIA is caving because Intel jumping into the dGPU market in 2020 has them worried. If it were still just AMD, we would never see NVIDIA leaving their proprietary G-SYNC setup. For me this is great news, because I've been wanting those new LG/Samsung wide displays but couldn't justify them due to the lack of G-SYNC.
 
It's great news all around, I think. It doesn't really hurt anybody who already has a G-Sync display, and going forward it should mean cheaper monitors that work with both Nvidia and AMD (and presumably Intel's efforts). And if Nvidia won't certify a monitor unless it has a high-quality VRR implementation, then hopefully that means we'll see higher-quality FreeSync implementations from everyone. But I guess we'll see... maybe to get G-Sync certified you have to pay a truckload to Nvidia... like all those motherboards that weren't SLI certified even though they had two or more PCIe x16 slots...

I think the only people "hurt" by this are current g-sync monitor owners, as they will still be tied to Nvidia GPUs. Now FreeSync owners are not tied to AMD. This is a Win/Win for Nvidia as I see it.


BTW, I own an XG270HU as well and was looking to upgrade my GPU to the 7nm Vega 2, but now I will have to compare it to the RTX line and see which is fastest for the price. (Assuming the monitor works exactly the same as it does with the AMD card. If it works better/worse then the choice goes away. Can [H] test this for us?)
 
This will be a serious win for those of us who don't want to pay upwards of $750 for just a monitor.

I have an Acer XR382CQK v2. Already exceeded that number lol. However, at the time (and I still think is the case) there was no GSYNC monitor in these dimensions.

Super excited for this! I need to order a longer DP cable because the one that came with my monitor is just a hair too short.
 
Still a lot of speculation. I know it's an open standard and all, but I'm still waiting for a review to see if this works as expected, or if there is some freakiness that occurs that gets played off as "well it's not Gsync".

Appears it only works on 10 and 20 series GPUs, even though nVidia has supported it since the 600 series. Ok, they want to sell cards...
Does it require a "qualified" monitor, or can you actually enable it on any Freesync/VRR with YMMV?
 
Brass tacks.



The 12 tested are more or less "guaranteed" to work to NVIDIA's "standards".

For the rest...
"There are hundreds of monitor models available capable of variable refresh rates (VRR) using the VESA DisplayPort Adaptive-Sync protocol. However, the VRR gaming experience can vary widely."


If I understand the comments here right, then:

* G-Sync and FreeSync are marketing names for variable refresh rate.

* VRR is already supported in DisplayPort and will be supported in HDMI 2.1?

Which means even before this new driver switch we could have been trying to either add our monitors into Nvidia's .inf or possibly spoof the monitor to appear as a G-Sync supported model?
 
If I understand the comments here right, then:

* G-Sync and FreeSync are marketing names for variable refresh rate.

* VRR is already supported in DisplayPort and will be supported in HDMI 2.1?

Which means even before this new driver switch we could have been trying to either add our monitors into Nvidia's .inf or possibly spoof the monitor to appear as a G-Sync supported model?

My understanding is that while VRR and Freesync work very similarly, G-Sync works differently, and uses some sort of hardware acceleration.

Presumably with FreeSync and VRR you wind up with more GPU or CPU load, but that is unclear to me.

Either way, without an Nvidia blessing, I doubt it would have been as easy as editing an INF file.
 
Still a lot of speculation. I know it's an open standard and all, but I'm still waiting for a review to see if this works as expected, or if there is some freakiness that occurs that gets played off as "well it's not Gsync".

Appears it only works on 10 and 20 series GPUs, even though nVidia has supported it since the 600 series. Ok, they want to sell cards...
Does it require a "qualified" monitor, or can you actually enable it on any Freesync/VRR with YMMV?


Did you read ANY of the information or did you see thread title and go to post?
 
G-SYNC Ultimate HDR is just their new branding to delineate Adaptive-Sync support, with the former still including the FPGA in the display as with the PG27UQ and PG65. I think it was a smart move.

I am surprisingly OK with this decision. Their hand was forced in many ways, but that they relented at all is indicative of the need to move on collectively with display tech. Took them long enough.
 
HOLY SHIT my monitor is on the list! YES

My understanding is that being on the list is a formality. It just means that Nvidia has tested it and blesses that it will work properly. Sort of like a Microsoft WHQL driver.

The way I read it, you should be able to go into the driver settings and manually enable it for any VRR or FreeSync display as long as you have a 10xx or 20xx GPU.
 
My understanding is that being on the list is a formality. It just means that Nvidia has tested it and blesses that it will work properly. Sort of like a Microsoft WHQL driver.

The way I read it, you should be able to go into the driver settings and manually enable it for any VRR or FreeSync display as long as you have a 10xx or 20xx GPU.
Sounds like a win to me either way. :p
 
My thought process? Samsung has already added VRR (FreeSync) support for their higher-tiered TVs so that the Xbox and such can take advantage while in game mode, etc. Which means there's a really good chance a lot of the new TVs at CES this week will show off their VRR support too. That's too big of a market for Nvidia to just say fuck it to.
 
Like I stated before, guaranteed Nvidia is going to pull some shenanigans with this. Most likely with branding: if you want to put the G-Sync logo on your box, you can't have the FreeSync logo (or even mention FreeSync support at all), or Nvidia may even go so far as to force "qualified monitors" to hack out support for FreeSync altogether, making it difficult or impossible to enable on AMD cards.

Call me pessimistic, but I simply don't see Nvidia as the 'generous' type.
 
My thought process? Samsung has already added VRR (FreeSync) support for their higher-tiered TVs so that the Xbox and such can take advantage while in game mode, etc. Which means there's a really good chance a lot of the new TVs at CES this week will show off their VRR support too. That's too big of a market for Nvidia to just say fuck it to.

Yeah, since they have to support HDMI 2.1 fully, odds are they finally just threw their hands up in the air. Still, it was a good effort to try to make a walled garden, but I think they lost many more sales than they gained. I guess you could say each G-Sync monitor sold was a win for team green, but each FreeSync monitor sold was a loss.
 
I just think it'll be hilarious when they start marketing it "Now with FreeSync Support!"
 
I have to believe Nvidia was going to support VRR through HDMI 2.1 anyway; by releasing now they'll also be able to get a lot of bugs out of the way without much criticism (via the goodwill garnered by doing something consumer-friendly). A lot of end-user testing is needed now, since G-Sync testing and validation was probably easily automated with Nvidia's strict specifications.
 
Like I stated before, guaranteed Nvidia is going to pull some shenanigans with this. Most likely with branding: if you want to put the G-Sync logo on your box, you can't have the FreeSync logo (or even mention FreeSync support at all), or Nvidia may even go so far as to force "qualified monitors" to hack out support for FreeSync altogether, making it difficult or impossible to enable on AMD cards.

Call me pessimistic, but I simply don't see Nvidia as the 'generous' type.

Monitors are "dumb" devices. They have zero idea what video card is connected to them. They process the signals sent to them. FreeSync/Adaptive-Sync is all software controlled. If the monitor supports Adaptive-Sync, then it supports FreeSync.
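To put some code behind the "dumb device" point: the panel just advertises the refresh window it can handle in its EDID, and the driver stays inside it. Here's a rough sketch of pulling that window out of a base EDID block's display range limits descriptor (tag 0xFD); the sample bytes are made up, not dumped from a real monitor.

```python
# Sketch: read the supported vertical refresh range from a 128-byte base
# EDID block. The block holds four 18-byte descriptors at fixed offsets;
# a display descriptor starts 00 00 00 and carries its tag in byte 3,
# where 0xFD means "display range limits".

def refresh_range_from_edid(edid: bytes):
    """Return (min_hz, max_hz) from the range-limits descriptor, or None."""
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]   # min/max vertical rate in Hz
    return None

# Minimal fake EDID: 128 zero bytes with one range-limits descriptor
# at offset 54 advertising a 48-144 Hz window.
edid = bytearray(128)
edid[54:64] = bytes([0x00, 0x00, 0x00, 0xFD, 0x00, 48, 144, 30, 160, 0x1E])
print(refresh_range_from_edid(bytes(edid)))  # (48, 144)
```

Nothing in that data structure knows or cares whose GPU is on the other end of the cable, which is the poster's point.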
 
This is awesome news!

Wondering if my 32 inch HP Omen Freesync monitor will have support? If so, I guess I'll go back to Nvidia....
 
Did you read ANY of the information or did you see thread title and go to post?

I did. Sorry you can't answer the questions. The release from nVidia is vague at best - I may have missed something, as I didn't break it down and analyze it, but hey, that's why we are here discussing.
 
HDMI 2.1 supports VRR but
1) does that mean panel makers must support it?
2) Are there any standards? I appreciated that G-Sync mandated that G-Sync monitors were capable of 144 Hz and LFC. Sure, FreeSync seemed a lot cheaper, but when you are comparing a 60 Hz monitor to a G-Sync 144 Hz monitor, why wouldn't you expect the G-Sync one to be more expensive?

I think AMD noticed this too and attached standards to the Freesync 2 certification.
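For anyone unfamiliar with the LFC mentioned above, the idea is simple: when the game's frame rate drops below the panel's minimum VRR rate, the driver repeats each frame at an integer multiple so the panel stays inside its window. A rough, illustrative sketch (numbers are made up, and real drivers are obviously more sophisticated):

```python
# Rough sketch of low framerate compensation (LFC): pick the refresh rate
# actually sent to the panel for a given game frame rate. LFC only works
# when the panel's max refresh is at least 2x its min (e.g. 48-144 Hz).

def lfc_refresh_rate(frame_hz: float, panel_min: float, panel_max: float) -> float:
    if frame_hz >= panel_min:
        return min(frame_hz, panel_max)       # plain VRR, one refresh per frame
    # Below the window: repeat each frame enough times to land inside it.
    multiple = 2
    while frame_hz * multiple < panel_min:
        multiple += 1
    return frame_hz * multiple

print(lfc_refresh_rate(30.0, 48, 144))   # 60.0 -> each frame shown twice
print(lfc_refresh_rate(20.0, 48, 144))   # 60.0 -> each frame shown three times
```

A 60 Hz FreeSync panel with a narrow range (say 48-60 Hz) can't do this trick, which is part of why the cheap early FreeSync monitors compared so poorly to G-Sync ones.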
 
so...

doesn't this prove that nvidia was gouging their users for buying gsync monitors then?
 
General VRR question. My Samsung TV and Xbox One X support it, but my Marantz AVR is my hub. Is this a pass-through type of thing, or does the Xbox need to go directly to the TV?

Thanks.
 
I did. Sorry you can't answer the questions. The release from nVidia is vague at best - I may have missed something, as I didn't break it down and analyze it, but hey, that's why we are here discussing.


nVidia Press Release said:
For gamers who have monitors that we have not yet tested, or that have failed validation, we’ll give you an option to manually enable VRR, too.
 