Nvidia Confirms Adaptive Sync Only Works on Pascal and Turing GPUs

And your arguments have been thoroughly shot down.

This is not a question of hardware support; it is entirely in the drivers.

Not really, I stopped replying because you clearly weren't going to accept anything I said. It is hardware support, confirmed by both Nvidia and AMD, and nothing you have said changes that.
 
You seem to be borderline trolling now, but here is a clue to your stupid response... nGreedia STILL SELLS the 10 series...

I'm done with your baiting and your acting like you work as a GPU designer at nGreedia. You are not, and you don't. Stop pretending now; you only know as much as I do.

Borderline trolling? You are the one with the stupid "nGreedia", acting like the only reason they blocked the 9xx cards is greed. Both Nvidia and AMD say there are hardware limitations: AMD when they first released FreeSync, and Nvidia just recently.

They are still selling Pascal cards, but they are EOL; most of what's in stock in shops now won't be replaced once it's gone. The 1060 is one of the best-selling GPUs and has the biggest share among Steam users. Surely if this were about forcing people to upgrade to newer cards, that is the market they would be chasing. The 2060 has just been released and further cards are on the way.

Again, your theory makes no sense. Why enable Adaptive Sync for EOL cards if they are supposedly doing it to get customers to buy new cards? It doesn't make any sense if it's purely about greed. You have just given your customer another reason to stay with their current Pascal card instead of upgrading to Turing.

Occam's razor: the simplest explanation is usually the correct one. They aren't enabling it on Maxwell desktop GPUs because they can't.
 
There is a firmware update that will take a DisplayPort 1.3 output and upgrade it to 1.4 for the purposes of maintaining compatibility with newer monitors, but that doesn't mean it supports the parts of the DisplayPort spec that are considered optional, and Adaptive Sync is one of them.

Exactly. Just because your DisplayPort is 1.2, 1.2a, 1.3 or whatever doesn't mean it supports Adaptive Sync. It's an entirely optional part of the DisplayPort standard.
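
To put a concrete (and purely illustrative) check behind that: a DisplayPort sink has to explicitly advertise that it can run without fixed MSA timing before a source can drive it with a variable refresh rate; the DP version number on the box tells you nothing. Here is a minimal Python sketch, assuming you already have a dump of the first few DPCD receiver-capability bytes from whatever tool your platform provides. The byte/bit position below is the one I believe Adaptive-Sync implementations key off, so treat it as an assumption, not anything lifted from Nvidia's driver.

# Sketch: does this DisplayPort sink advertise the "MSA timing parameters
# ignored" capability? That bit (DPCD byte 0x0007, bit 6, as I understand it)
# is what tells the source the sink can cope with a variable refresh rate.
DP_DOWN_STREAM_PORT_COUNT = 0x0007   # capability byte that carries the bit
MSA_TIMING_PAR_IGNORED    = 1 << 6

def sink_supports_adaptive_sync(dpcd: bytes) -> bool:
    """True if the sink says it can render video without fixed MSA timing."""
    if len(dpcd) <= DP_DOWN_STREAM_PORT_COUNT:
        return False
    return bool(dpcd[DP_DOWN_STREAM_PORT_COUNT] & MSA_TIMING_PAR_IGNORED)

# Two fabricated example dumps: both report DPCD revision 0x12 ("DP 1.2"),
# but only the second sets the capability bit -- same DP version, different answer.
without_vrr = bytes([0x12, 0x0A, 0x84, 0x01, 0x01, 0x00, 0x01, 0x00])
with_vrr    = bytes([0x12, 0x0A, 0x84, 0x01, 0x01, 0x00, 0x01, 0x40])
print(sink_supports_adaptive_sync(without_vrr))  # False
print(sink_supports_adaptive_sync(with_vrr))     # True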
 
Maintaining forward compatibility can be done with firmware updates up to a point, but large parts of the spec are hardware dependent and require additional traces or additional power to display circuitry that exists outside the actual GPU die. So you are half correct: there is a software component, but there are very real hardware components to it as well. The ability to add the feature via a firmware update would very much depend on the specific components between the GPU chip and the actual DisplayPort connector itself, and those could very well vary between manufacturers, as they were not initially specified in the reference card. To my knowledge only the 980 Ti had the necessary hardware components worked into the reference design, so its hardware across the board should be capable, but any non-reference card, and any 900 series card other than the 980 Ti, would be a crapshoot.

Nice post :) I didn't know the 980 Ti had the hardware needed. Can you give a link to this info?
 
Nice post :) I didn't know the 980 Ti had the hardware needed. Can you give a link to this info?
Sadly I can’t; I just remember watching a teardown video on the card where they casually mentioned it when looking at some of the chips, something like “oh, they used a bleh chip here, it should support these features, but this is nVidia we’re talking about so it won’t”, and then they moved on to the rest of the card.
 
This is not a question of hardware support; it is entirely in the drivers.

Care to offer proof of this statement? And I don’t want someone’s opinion or a series of “maybe” and “possibly” terms offered as proof. If you are going to make a definitive statement like that, you need to back it up with hard evidence.
 
Didn't Nvidia pull this same type of BS back when the multi-monitor craze hit? They said they couldn't support more than 2 monitors on a video card, but we found out later it was just them being lazy and not wanting to update the drivers to support the feature. It was never a hardware limitation; it was genuinely a software issue.
Maybe I am recalling this incorrectly, but I thought that was the case.
 
So I've got a 1080, which has HDMI 2.0b. If I used a FreeSync-capable monitor everything would be fine. But I game on a TV, and TVs will really only start coming with VRR as part of HDMI 2.1. So... I guess that means my card will never get a chance to use that feature, even if I get an HDMI 2.1 capable TV. Right? Does 2.1 have to be the standard on both ends of the link?
nVidia only supports VRR over DisplayPort; it will not work over HDMI.
 
Not really, I stopped replying because you clearly weren't going to accept anything I said. It is hardware support, confirmed by both Nvidia and AMD, and nothing you have said changes that.
No, you stopped because you realized that you kept digging and making more indefensible claims like Kabini being a GCN 1.0 part.

Care to offer proof of this statement? And I don’t want someone’s opinion or a series of “maybe” and “possibly” terms offered as proof.
Well the proof is inside NVidia's driver, so only they can answer this question in a definitive manner. However, the known facts really leave no other plausible conclusion.
If you are going to make a definitive statement like that, you need to back it up with hard evidence.
As mentioned in the other thread, DisplayPort is a superset of eDP since version 1.2. NVidia GPUs before Pascal can output VRR via eDP (namely in G-Sync laptops). This means any NVidia GPU which can output VRR via eDP can also output VRR via DP, unless this function is somehow disabled via polyfuses, firmware, or in the driver.

And this wasn't only apparent to me, it was clear to anybody who did not deliberately ignore NVidia's existing support for Adaptive-Sync:
Ryan Smith (Anandtech) said:
Though they don’t discuss it, NVIDIA has internally supported VESA Adaptive Sync for a couple of years now; rather than putting G-Sync modules in laptops, they’ve used what’s essentially a form of Adaptive Sync to enable “G-Sync” on laptops. As a result we’ve known for some time now that NVIDIA could support VESA Adaptive Sync if they wanted to, however until now they haven’t done this.
https://www.anandtech.com/show/1379...-adaptive-sync-with-gsync-compatible-branding
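
For anyone who wants to poke at this themselves: the refresh range a VRR-capable display supports is advertised in its EDID range-limits descriptor, and it looks the same whether the link is eDP inside a laptop or external DisplayPort. What the driver then does with that range is a source-side software decision. A rough Python sketch, using the standard EDID descriptor offsets (ignoring the EDID 1.4 rate-offset flags for simplicity); the EDID bytes are assumed to come from wherever your OS exposes them, and none of this is taken from any vendor code.

# Sketch: pull the vertical refresh range out of an EDID base block's
# display range limits descriptor (tag 0xFD). This is what the display
# advertises; honouring it with a variable rate is a separate driver choice.
EDID_BASE_BLOCK_SIZE = 128
DESCRIPTOR_OFFSETS   = (54, 72, 90, 108)   # four 18-byte display descriptors
RANGE_LIMITS_TAG     = 0xFD

def vertical_refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the range-limits descriptor, or None."""
    if len(edid) < EDID_BASE_BLOCK_SIZE:
        return None
    for off in DESCRIPTOR_OFFSETS:
        desc = edid[off:off + 18]
        # Display descriptors start with 0x00 0x00 0x00 followed by a tag byte.
        if desc[0:3] == b"\x00\x00\x00" and desc[3] == RANGE_LIMITS_TAG:
            return (desc[5], desc[6])   # min/max vertical rate in Hz
    return None

# Usage (hypothetical file dumped from your OS):
# edid = open("edid.bin", "rb").read(); print(vertical_refresh_range(edid))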
 
So what are the chances laptop GPUs will be updated to allow VRR? I thought that because the GPU has a direct connection to the LCD panel it would be a given...

There were a lot of great 1050 Ti laptops that could benefit greatly from this.
 
No, you stopped because you realized that you kept digging and making more indefensible claims like Kabini being a GCN 1.0 part.

Well the proof is inside NVidia's driver, so only they can answer this question in a definitive manner. However, the known facts really leave no other plausible conclusion.
As mentioned in the other thread, DisplayPort is a superset of eDP since version 1.2. NVidia GPUs before Pascal can output VRR via eDP (namely in G-Sync laptops). This means any NVidia GPU which can output VRR via eDP can also output VRR via DP, unless this function is somehow disabled via polyfuses, firmware, or in the driver.

And this wasn't only apparent to me, it was clear to anybody who did not deliberately ignore NVidia's existing support for Adaptive-Sync:
https://www.anandtech.com/show/1379...-adaptive-sync-with-gsync-compatible-branding

Correct me if I’m wrong, but aren’t Maxwell and prior mobile GPUs slightly altered compared to their desktop counterparts?
 
Is this a big concern for anyone? I mean, only the 980 Ti and the Titans have enough power to warrant variable sync anyway, right?

Dude, I want VRR on my laptop with integrated Intel graphics that I game on daily. I live with the tearing because I'm even less fond of the input lag, and a good VRR implementation would fix both!
 
Correct me if I’m wrong, but aren’t Maxwell and prior mobile GPUs slightly altered compared to their desktop counterparts?

Yes, and the further issue is that the function has to be supported by the installed panel. People are being lazy with their trolling today...
 
I had to laugh. I find it hard to take a post that uses the word nGreedia seriously; I think it devalues the post in general.

I think it is great that nVidia is supporting VRR. Even if it doesn’t support Maxwell, it’s still a huge win for the consumer. I guess they have to draw the support line somewhere (assuming this is a software and not a hardware limitation). Maxwell is getting pretty long in the tooth.
 
If this was really the case, why didn't they just restrict it to Turing cards and newer? Why let cards that are EOL have the technology? Why go to all the trouble of enabling it for Pascal if you just want to sell newer cards?
Because they still have an abundance of Pascal cards to sell at retail... the same reason there was such a delay for Turing's release. There are no Maxwell cards to sell at retail, so there's no advantage for Nvidia in enabling it on old cards. G-Sync and FreeSync will let me keep my cards longer than ever before, because now occasional drops to 40 fps feel as smooth as a steady 60.
 
Because they still have an abundance of Pascal cards to sell at retail... the same reason there was such a delay for Turing's release. There are no Maxwell cards to sell at retail, so there's no advantage for Nvidia in enabling it on old cards. G-Sync and FreeSync will let me keep my cards longer than ever before, because now occasional drops to 40 fps feel as smooth as a steady 60.

This line of reasoning is a little off, I think. Nvidia just stated that Pascal cards are almost sold out. They have likely sold through most, if not all, of the higher-end Pascal chips they had on hand. That would mean they have little real reason to care about selling more Pascal cards and would rather people buy Turing.
 
Off topic, is there an unspoken rule that YouTubers always have to make a dumb face in the preview image?
No, just YouTubers who think their audience is dumb. Or want to attract a dumb audience.
 
Sadly I can’t; I just remember watching a teardown video on the card where they casually mentioned it when looking at some of the chips, something like “oh, they used a bleh chip here, it should support these features, but this is nVidia we’re talking about so it won’t”, and then they moved on to the rest of the card.

I will take your word for it. I have seen people who have done teardowns of the 980/970 say that there is nothing there to support Adaptive Sync; I just haven't seen any teardown of the 980 Ti. This was back when Nvidia put G-Sync in laptops using adaptive sync, which had just been added as an optional part of the eDP 1.4a standard. There were a lot of investigations done at the time.
 
Because they still have an abundance of Pascal cards to sell at retail... the same reason there was such a delay for Turing's release. There are no Maxwell cards to sell at retail, so there's no advantage for Nvidia in enabling it on old cards. G-Sync and FreeSync will let me keep my cards longer than ever before, because now occasional drops to 40 fps feel as smooth as a steady 60.

If it really was an attempt to force people to buy Pascal cards, then why wait until now to implement it? Their stock problems were last July. Why wait until they have almost sold out of Pascal cards? If the plan was to get people to buy newer cards, then enabling adaptive sync on Pascal cards makes no kind of business sense. Your very last sentence proves that. People are going to hold onto their GPUs longer now. Pascal has been their best-selling line of cards ever, so why would they want to give Pascal owners another reason to keep their cards? Enabling adaptive sync only on Turing cards would have given those Pascal users on the fence about upgrading a reason to jump. Now it just gives them a reason to stay.

If it was done just for greed, then it was the stupidest decision ever.
 
No, you stopped because you realized that you kept digging and making more indefensible claims like Kabini being a GCN 1.0 part.

Well the proof is inside NVidia's driver, so only they can answer this question in a definitive manner. However, the known facts really leave no other plausible conclusion.
As mentioned in the other thread, DisplayPort is a superset of eDP since version 1.2. NVidia GPUs before Pascal can output VRR via eDP (namely in G-Sync laptops). This means any NVidia GPU which can output VRR via eDP can also output VRR via DP, unless this function is somehow disabled via polyfuses, firmware, or in the driver.

And this wasn't only apparent to me, it was clear to anybody who did not deliberately ignore NVidia's existing support for Adaptive-Sync:
https://www.anandtech.com/show/1379...-adaptive-sync-with-gsync-compatible-branding

We have known since shortly after the 1080/1070 launched that Nvidia could support Adaptive Sync on Pascal cards. It was always a question of when it would happen.

And you are still wrong. Just because a laptop GPU supports adaptive sync doesn't necessarily mean that the desktop version will.
 
If it really was an attempt to force people to buy Pascal cards, then why wait until now to implement it? Their stock problems were last July. Why wait until they have almost sold out of Pascal cards? If the plan was to get people to buy newer cards, then enabling adaptive sync on Pascal cards makes no kind of business sense. Your very last sentence proves that. People are going to hold onto their GPUs longer now. Pascal has been their best-selling line of cards ever, so why would they want to give Pascal owners another reason to keep their cards? Enabling adaptive sync only on Turing cards would have given those Pascal users on the fence about upgrading a reason to jump. Now it just gives them a reason to stay.

If it was done just for greed, then it was the stupidest decision ever.

Look, do you have any doubt in your mind that Nvidia could have turned on FreeSync support 3+ years ago? There, you just answered your own question.
 
Look, do you have any doubt in your mind that Nvidia could have turned on FreeSync support 3+ years ago? There, you just answered your own question.

No, I have no doubt they could have enabled adaptive sync support when Pascal was launched, once people smarter than me found that the cards supported it. What I do know is that the Maxwell desktop cards don't have the hardware. Sorry, most Maxwell cards; I just learned that some 980 Tis did.
 
Off topic, is there an unspoken rule that YouTubers always have to make a dumb face in the preview image?

Apparently making an exaggerated reaction face and putting it in your preview thumbnail boosts the number of views the video gets. You'll see a lot of the popular YouTubers use that same method for their thumbnails.
 
Didn't Nvidia pull this same type of BS back when the multi-monitor craze hit? They said they couldn't support more than 2 monitors on a video card, but we found out later it was just them being lazy and not wanting to update the drivers to support the feature. It was never a hardware limitation; it was genuinely a software issue.
Maybe I am recalling this incorrectly, but I thought that was the case.

It was the maximum number of monitors. We could have 4 on Linux but only 3 on Windows. They restricted the Linux driver to 3 from then on for "feature parity."
 
This line of reasoning is a little off, I think. Nvidia just stated that Pascal cards are almost sold out. They have likely sold through most, if not all, of the higher-end Pascal chips they had on hand. That would mean they have little real reason to care about selling more Pascal cards and would rather people buy Turing.
https://www.pcgamesn.com/nvidia-mid-range-gpu-inventory-cryptocurrency

https://www.extremetech.com/gaming/280800-nvidia-stock-plummets-on-high-inventory-fears

https://www.pcmag.com/news/364990/post-crypto-hangover-leaves-nvidia-with-unsold-graphics-ca

https://www.fool.com/investing/2018/11/16/why-nvidia-corp-shares-fell-as-much-as-20-today.aspx

Plenty more where that came from.

In mid/late November 2018, there was a big inventory problem that dropped Nvidia's corporate stock price by as much as 20%. Nvidia admitted it directly. They were sitting on ~$1.5 billion in unsold inventory, which is about twice as much as the year before. That's not good.

They had to make a decision about how to move that old Pascal inventory in the face of newer cards coming out and fiercer price competition from AMD. One way to do it would obviously be to turn on FreeSync (vs. cutting prices). They made the decision corporately, and the driver team was given a couple of months to make it happen. The logic flow is as clear as day.
 
Apparently making an exaggerated reaction face and putting it in your preview thumbnail boosts the number of views the video gets. You'll see a lot of the popular YouTubers use that same method for their thumbnails.

Also red circles and arrows.
 
https://www.pcgamesn.com/nvidia-mid-range-gpu-inventory-cryptocurrency

https://www.extremetech.com/gaming/280800-nvidia-stock-plummets-on-high-inventory-fears

https://www.pcmag.com/news/364990/post-crypto-hangover-leaves-nvidia-with-unsold-graphics-ca

https://www.fool.com/investing/2018/11/16/why-nvidia-corp-shares-fell-as-much-as-20-today.aspx

Plenty more where that came from.

In mid/late November 2018, there was a big inventory problem that dropped Nvidia's corporate stock price by as much as 20%. Nvidia admitted it directly. They were sitting on ~$1.5 billion in unsold inventory, which is about twice as much as the year before. That's not good.

They had to make a decision about how to move that old Pascal inventory in the face of newer cards coming out and fiercer price competition from AMD. One way to do it would obviously be to turn on FreeSync (vs. cutting prices). They made the decision corporately, and the driver team was given a couple of months to make it happen. The logic flow is as clear as day.

But all those articles are out of date. The Turing 2060 has been released, and 1060s that are going out of stock at retailers aren't being replaced. It's a bit late to turn on adaptive sync for Pascal cards when the cards are nearly sold out.

Your theory makes no sense, and your timeline makes no sense. The big inventory problem happened last July after the crypto bubble burst; Nvidia knew then that they had a surplus of stock. Their CEO announced back then that Turing cards would be delayed. If adding adaptive sync support is really an attempt to sell Pascal cards, why not do it back then? Why wait to add it until after they announced the 2060 and after announcing that Pascal cards are nearly sold out?

Your logic flow doesn't pan out at all.
 
But all those articles are out of date. The Turing 2060 has been released, and 1060s that are going out of stock at retailers aren't being replaced. It's a bit late to turn on adaptive sync for Pascal cards when the cards are nearly sold out.

Your theory makes no sense, and your timeline makes no sense. The big inventory problem happened last July after the crypto bubble burst; Nvidia knew then that they had a surplus of stock. Their CEO announced back then that Turing cards would be delayed. If adding adaptive sync support is really an attempt to sell Pascal cards, why not do it back then? Why wait to add it until after they announced the 2060 and after announcing that Pascal cards are nearly sold out?

Your logic flow doesn't pan out at all.
Did you look at the dates of my articles? They were all from November 2018.

The $1.5 billion inventory problem was at the end of the third quarter of 2018.

There was a BIGGER problem in June/July, but it wasn’t solved by late last year.

Why would you think the Nvidia CEO would say ‘Pascal inventory is almost gone, if you want one you’d better buy now’ if not to spur more Pascal sales? Why do you think 20x0 series cards came out so overpriced that people who were holding out for next gen went and bought previous gen at full two-year-old retail prices? If they are truly almost gone, why does he care, and why does he care enough to tell consumers they need to move fast to buy Pascal now, under ANY scenario except to move more product?

You need to take off your rose-colored lenses.

The timeline makes perfect sense.
 
Did you look at the dates of my articles? They were all from November 2018.

The $1.5 billion inventory problem was at the end of the third quarter of 2018.

There was a BIGGER problem in June/July, but it wasn’t solved by late last year.

Why would you think the Nvidia CEO would say ‘Pascal inventory is almost gone, if you want one you’d better buy now’ if not to spur more Pascal sales? Why do you think 20x0 series cards came out so overpriced that people who were holding out for next gen went and bought previous gen at full two-year-old retail prices? If they are truly almost gone, why does he care, and why does he care enough to tell consumers they need to move fast to buy Pascal now, under ANY scenario except to move more product?

You need to take off your rose-colored lenses.

The timeline makes perfect sense.

Nope, sorry, your timeline still makes no sense. Instead of enabling adaptive sync back when they had the actual problem, they waited until the release of the 2060. It makes no sense from any kind of business point of view if the objective was to sell Pascal cards.

To quote Idiotincharge's post:

To mark the end of the issue of oversupply?

That makes perfect sense.
 
Correct me if I’m wrong but aren’t Maxwell and prior mobile GPUs slightly altered compared to their desktop counterparts?
At least the G-Sync capable GPUs (965M/970M/980M) use the same GM206/204 silicon as their desktop counterparts.
 