Nvidia Confirms Adaptive Sync Only Works on Pascal and Turing GPUs

AlphaAtlas
[H]ard|Gawd, Staff member. Joined: Mar 3, 2018. Messages: 1,713
Nvidia has already stated that support for adaptive sync monitors is limited to Turing and Pascal GPUs, aka the GeForce 10 series and up. But the wording was a little ambiguous, and some hoped that Nvidia would eventually add support for the 900 series. Now an Nvidia representative on the GeForce forums has confirmed that the company has no plans to support adaptive sync on Maxwell, though it's not clear whether this is due to a hardware limitation or some other factor.

coth: Any word on when VRR will be available on GTX 900 series?
ManuelGuzmanNV: Sorry but we do not have plans to add support for Maxwell and below.
 
Adaptive-Sync requires DisplayPort 1.2a. Of the Maxwell series, only the 980 Ti has it. But if NVIDIA added G-SYNC Compatible support to the 980 Ti, people would be crying and complaining about why NVIDIA wouldn't add it to the rest of Maxwell.
 
FreeSync can be used over HDMI 2.0. AMD showed it working back in 2016, and they have monitors listed on their support page that say it's supported. I would think that if Nvidia really, really wanted to, they could update their drivers to support it too.
 

I'm no fan of Nvidia since the 8800 series, but I'd imagine the little asterisk next to FreeSync support on Maxwell* might cause some outcry. Better to put a hard line in the sand than allow customers to be misinformed, imho. The vocal PC crowd can often be ruthless, especially those who don't understand the technologies they are arguing about.

*(only available on the 980 Ti, and via HDMI 2.0)
 
Is this a big concern for anyone? I mean, only the 980 Ti and the Titans have enough power to warrant variable sync anyway, right?

Now, if I had dropped Honda Civic-level cash on a Titan, I might be a little miffed. Then again, these are two generations old (which is only 4 years... I had to look it up. Feels longer).
 

Powerful enough? Are you under the impression that you need a high end card to use VRR?
 

Isn't the whole point of variable refresh to find the sweet spot above vsync? So above 60? I don't have G-Sync/FreeSync, just plain ol' vsync, and it seems fine to me. But my monitors only go up to 60 Hz anyway (ok, 59 if we're nitpicking).
 
Worked perfectly for me on my secondary computer's screen, which has FreeSync at 144 Hz, with a GTX 1070. Glad I'm one of the lucky ones!
 
Ahh nGreedia never disappoints... I wonder how many other fake “limitations” they will bake in now, and in the future?

Yeah, we can’t uncomment those couple of lines of code, can we boys! Maybe add a few unfixable bugs along the way...
 
At work and typing on a phone, so I don't have time to go into detail, but no, that is not the point of it. It works much better than vsync.

Outside of the 980 Ti (apparently), no Maxwell card even has the right DisplayPort revision to support VRR. No lines of code can do anything about that.
 
I could have sworn that my old 970 got a DisplayPort firmware update to 1.4 not more than 6 months ago... Are you sure this is not a software limitation rather than a hardware one?
 
Pretty sure G-Sync is also a Pascal-and-higher-only feature as well, which makes sense if those cards do not have the right version of DisplayPort. See above: DisplayPort has to be of a certain revision to support any variable refresh rate or adaptive sync technology, be it AMD's FreeSync, Nvidia's G-Sync, or whatever else might come along.
 
There is a firmware update that upgrades the card's DisplayPort support for the purpose of maintaining compatibility with newer DisplayPort 1.3 and 1.4 monitors, but that doesn't mean it supports the parts of the DisplayPort spec that are considered optional, and adaptive sync is one of them.
 
Thanks for confirming my opinion.
 
If you read the thread, NV clearly explains it's a hardware limitation.

ManuelGuzmanNV said:
Maxwell and below do not support the Displayport 1.2a standard. Adaptive Sync requires DisplayPort 1.2a. There is some confusion over the graphics firmware update tool we provide:

https://www.nvidia.com/object/nv-uefi-update-x64.html


The tool enables new features on the graphics card so that it does not display a black screen on P.O.S.T. with DisplayPort 1.3 and DisplayPort 1.4 monitors but it does not enable DisplayPort 1.2a on the card.
 
There is nothing inherent in Adaptive-Sync that requires DisplayPort 1.2a; it simply isn't included in the DP 1.2 spec. You can still run Adaptive-Sync/FreeSync over DP 1.2 if both sides support it, and it is interoperable with DP 1.2a/1.3/1.4 devices.

And NVidia GPUs support Adaptive-Sync through eDP (G-Sync on Notebooks).
 
Off topic, is there an unspoken rule that YouTubers always have to make a dumb face in the preview image?

LOL, that's just Linus. He always looks like that. Dude probably makes serious bank off his video channel though. He's not my favorite but there are far worse techno idiots on YT.
 
Thanks for confirming my opinion.
Maintaining forward compatibility can be done with firmware updates to a point, but large parts of the spec are hardware-dependent and require additional traces, or additional power to display circuitry, that exist outside the actual GPU die. So you are half correct: there is a software component, but there are very real hardware components as well. The ability to add the feature via a firmware update would very much depend on the specific components between the GPU chip and the actual DisplayPort connector, and those could well vary between manufacturers, as they were not initially specified in the reference card. To my knowledge, only the 980 Ti had the necessary hardware components worked into the reference design, so its hardware across the board should be capable, but any non-reference cards, and any 900 series card other than the 980 Ti, would be a crap shoot.
 
No. The slower your GPU, the more important VRR becomes. The point is that you can have zero tearing at any framerate. Think vsync at 40 fps actually working well instead of being a stuttery disaster; vsync where you don't need a super high minimum fps.

It's why consoles and TVs have it now, too, even though they also top out at 60 Hz on a good day.
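The difference between fixed-rate vsync and VRR at low framerates can be sketched with a little arithmetic. This is an illustrative toy model, not driver code; the 60 Hz refresh and the 30–144 Hz VRR range are assumptions for the example:

```python
import math

# Toy model of frame pacing: with fixed-rate vsync, a finished frame waits
# for the next refresh tick, while a VRR panel refreshes when the frame is
# ready (within the monitor's supported range, assumed 30-144 Hz here).

def vsync_display_interval(render_ms, refresh_ms=1000 / 60):
    # The displayed interval snaps up to a whole multiple of the refresh period.
    return math.ceil(render_ms / refresh_ms) * refresh_ms

def vrr_display_interval(render_ms, min_ms=1000 / 144, max_ms=1000 / 30):
    # The panel follows the GPU, clamped to its refresh range.
    return min(max(render_ms, min_ms), max_ms)

# A GPU rendering at 40 fps (25 ms per frame):
print(round(vsync_display_interval(25.0), 1))  # 33.3 ms: every frame held to ~30 fps
print(round(vrr_display_interval(25.0), 1))    # 25.0 ms: a smooth 40 fps
```

The 40 fps case is exactly the "stuttery disaster" above: under 60 Hz vsync the GPU's 25 ms frames get held to 33 ms ticks, while VRR just shows them as they arrive.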
 
Um, G-Sync works on cards all the way back to the 650 Ti Boost. It doesn't rely on the VESA standard.
 
Even if there were not an actual hardware limitation, I don't find it unthinkable for a company to limit support for a new feature to the last couple of generations of products.

Lots of things make me dislike NV's practices, but this doesn't move the needle.

Edit: To clarify, those buyers bought what the card did and was sold to do at the time. They still got the capabilities listed on the box. It is not as though this was something promised but not delivered.
 
Can anyone tell me why VRR on GeForce cards requires Windows 10? Why can't it run on 7?
 
You are just doing the opposite of what I'm doing here...

I say that, based on experience and prior knowledge, nGreedia is lying in an effort to make people buy new graphics cards, and you are saying that nGreedia would never do that and is honestly describing a real hardware limitation...

One of us is right, and we will probably never find out who.
 
Where do you see the 980 Ti supporting 1.2a? Even if it did support 1.2a, there is no guarantee it would support adaptive sync, as it's an optional part of the standard.

Nvidia themselves have said that no Maxwell card has a DisplayPort 1.2a port.
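That distinction — DisplayPort version versus optional feature — can be modeled in a few lines. This is a toy illustration with made-up fields, not real driver or DPCD code, and "1.2a" is simplified to a plain version tuple:

```python
from dataclasses import dataclass

# Toy model (hypothetical fields, not a real capability query): adaptive sync
# needs a new-enough DisplayPort revision AND the optional capability flag,
# because VESA Adaptive-Sync is an optional part of the standard.

@dataclass
class DisplayPortSink:
    dp_version: tuple            # e.g. (1, 2) standing in for DP 1.2/1.2a
    adaptive_sync_capable: bool  # the optional feature flag

def supports_adaptive_sync(sink):
    # The version check alone is not enough; the optional flag must also be set.
    return sink.dp_version >= (1, 2) and sink.adaptive_sync_capable

print(supports_adaptive_sync(DisplayPortSink((1, 4), False)))  # False: right version, feature absent
print(supports_adaptive_sync(DisplayPortSink((1, 2), True)))   # True: version and feature present
```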
 
Had this discussion with you already. Both sides don't support it: Nvidia desktop GPUs older than Pascal don't support Adaptive-Sync. They don't have the hardware.
 
If this was really the case, why didn't they just restrict it to Turing cards and newer? Why let cards that are EOL have the technology? Why go to all the trouble of enabling it for Pascal if you just want to sell newer cards?

And it is a hardware limitation. AMD said this in a Q&A session when FreeSync was first announced, and Nvidia has just said it now. I think Nvidia is a horrible company, but they aren't always lying.
 
In the end, I'd say it's not a big deal. I loved my Maxwells back in the day (had a pair of G1 970s in SLI) and they rocked for the time. They were pretty solid for 1080p/1440p and a little 4K. Maxwell was pretty much the end of the era for great SLI setups. Outside of something similar or a 980 Ti, I have my doubts how well any of the other Maxwells would hold up in today's AAA games with max settings, even at 1080p. I do feel for the 980 Ti owners, because those are some amazing cards and still have some life to give. Hopefully someone can come up with a fix for 'em.
 
No, the point of VRR is to eliminate tearing by matching the monitor's refresh rate to the frame rate. Since a monitor cannot refresh faster than its maximum refresh rate, VRR does not operate above that maximum. VRR really shines by smoothing out the experience when you can't maintain a frame rate that matches the maximum refresh rate.
Yes, G-SYNC 1 & 2 are supported on Kepler forward. G-SYNC HDR is only supported on Pascal forward because HDR is a requirement, and HDR wasn't part of DisplayPort until version 1.4.
G-SYNC works on Windows 7. If I had to take a guess as to why G-SYNC Compatible only works on Windows 10 it's because they only made it work with the latest WDDM. NVIDIA probably doesn't want to develop and support a new feature in their driver for an OS that will be EOL in a year.
 
So I've got a 1080, which has HDMI 2.0b. If I used a FreeSync-capable monitor, everything would be fine. But I game on a TV, and TVs will really only start coming with VRR as part of HDMI 2.1. So... I guess that means my card will never get a chance to use that feature, even if I get an HDMI 2.1-capable TV. Right? Does 2.1 have to be the standard on both ends of the link?
 
G-SYNC Compatible only works over DisplayPort.

For HDMI VRR, you need 2.1 at both ends. It supposedly doesn't need any software intervention, so when NVIDIA releases a video card with HDMI 2.1 outputs VRR should work with a TV using HDMI 2.1 regardless of driver support.
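The "both ends" requirement boils down to a capability intersection: a link can only use features that both the source and the sink advertise. A toy sketch, with made-up feature names and capability sets (not a real HDMI stack):

```python
# Toy sketch (hypothetical capability sets, not a real HDMI stack): the
# features usable on a link are the intersection of what the source and
# the sink support, which is why VRR needs HDMI 2.1 at both ends.

def link_features(source_caps, sink_caps):
    return source_caps & sink_caps

hdmi20b_card = {"4k60", "hdr"}                  # assumed HDMI 2.0b source: no VRR
hdmi21_tv = {"4k60", "hdr", "vrr", "4k120"}     # assumed HDMI 2.1 sink

print(sorted(link_features(hdmi20b_card, hdmi21_tv)))  # ['4k60', 'hdr']: no 'vrr'
```

Even with an HDMI 2.1 TV on the other end, the 2.0b source never advertises VRR, so the feature drops out of the intersection.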
 
Thanks for the depressing clarification. So I'll be without VRR until the 1080 gets replaced in another ~5 years along with the TV.
 