Nvidia Expands G-Sync Support to Approved Adaptive Sync Monitors

Since version 1.2, DisplayPort is a superset of eDP. And this is not just theory: people have actually run iPad eDP panels off their graphics cards' DP outputs. With DP 1.1 that didn't work, but with DP 1.2 it worked with minimal extra circuitry.
All I'm saying is that if your hardware already supports VRR via eDP, then it has everything it needs to support VRR via DP.

It just proves that FreeSync/Adaptive-Sync on DP and FreeSync/Adaptive-Sync on eDP are functionally the same.

You are wrong here too. Kaveri is second-generation GCN. There are no first-generation GCN APUs (Bobcat is VLIW5, Trinity/Richland are VLIW4, and Kaveri/Kabini are GCN 2).

Also, your claim that the 7970 doesn't support FreeSync is false. It supports FreeSync for video playback, just not for gaming.

First, my apologies: I meant Kabini, not Kaveri. I am always getting the two of them mixed up. Kabini is first-generation GCN. (If you check, it uses GCN 1.0, not 1.1.)

I never said FreeSync on eDP and DP were functionally different. They shouldn't be, as DisplayPort Adaptive-Sync is based entirely on the eDP spec. What I am saying is that just because a GPU has a DisplayPort 1.2a or 1.3 or 1.4 or whatever output doesn't mean it supports Adaptive-Sync. The GPU manufacturer still has to choose whether or not to implement the OPTIONAL Adaptive-Sync hardware. But any laptop GPU or APU that uses eDP 1.2 or higher has to support VRR, and because of that will work with Adaptive-Sync.
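To make that optionality concrete: on the wire, an Adaptive-Sync sink advertises that it can ignore the MSA timing parameters through a DPCD capability bit, and the source still has to opt in on its side. A rough sketch of the detection step (the read_dpcd helper is invented for illustration; the offset and bit names follow the Linux DRM DisplayPort headers):

```python
# Hypothetical sketch of the sink-side capability check for Adaptive-Sync.
# read_dpcd is an invented placeholder for the driver's AUX-channel read;
# the offset/bit names below follow the Linux DRM DisplayPort headers.

DP_DOWN_STREAM_PORT_COUNT = 0x007    # DPCD receiver capability byte
DP_MSA_TIMING_PAR_IGNORED = 1 << 6   # sink can ignore MSA timing -> VRR-capable

def sink_supports_adaptive_sync(read_dpcd):
    """read_dpcd(offset) -> int is a hypothetical AUX read helper."""
    return bool(read_dpcd(DP_DOWN_STREAM_PORT_COUNT) & DP_MSA_TIMING_PAR_IGNORED)

# Even when this returns True, actually driving variable timings is the GPU
# vendor's choice -- that opt-in is the optionality described above.
```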

Also, I don't think a GPU supporting VRR through eDP means it will support VRR via DP. Remember when Nvidia announced G-Sync for laptops? They made Maxwell cards that supported LVDS and eDP. The GPUs that went into the LVDS laptops did not support G-Sync; the ones that went into eDP laptops did. I remember reading on forums about people trying a 980M from an LVDS laptop in an eDP laptop, and it didn't work with G-Sync. They tried flashing the BIOS, spoofing the ID, everything; the card worked, but G-Sync didn't. While the silicon might be similar, there can be variations.

The 7970 doesn't support Adaptive-Sync. What happens is that when you play a video, the driver detects the frame rate the video is playing at and sets your monitor to a multiple of it. So if the video is 24 fps, it sets the monitor to 48 Hz. It's just a one-time change, made possible by the updated scaler in the monitor. The card doesn't support VRR at all; it doesn't have the hardware.
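For what it's worth, the driver-side logic being described amounts to a one-time pick like this (a minimal sketch with an invented helper name, not AMD's actual driver code):

```python
# Hypothetical sketch (not AMD's driver code) of the one-time refresh pick
# described above: find a supported refresh rate that is a whole multiple
# of the video frame rate.

def pick_refresh_rate(video_fps, supported_rates_hz):
    """Return the lowest supported rate that is an integer multiple of
    the video frame rate, or None if no such rate exists."""
    for rate in sorted(supported_rates_hz):
        multiple = rate / video_fps
        # Small tolerance so e.g. 23.976 fps still matches a 48 Hz mode.
        if round(multiple) >= 1 and abs(multiple - round(multiple)) < 0.01:
            return rate
    return None

print(pick_refresh_rate(24.0, [48, 60, 75, 144]))  # -> 48 (2 x 24 fps)
```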
 
I'm surprised this thread only has 3 pages, as this is a huge step in the right direction for Nvidia. The G-SYNC "fee" or "tax" or whatever you would like to call it was very unappealing, and being limited to FreeSync if you had AMD or G-SYNC if you had Nvidia was very anti-consumer. Supporting Adaptive-Sync monitors beyond the G-SYNC certified ones gives consumers more purchasing power and an option to avoid the premiums associated with G-SYNC certified monitors.

However, I am slightly upset to see that out of the 400 monitors they tested, only 12 passed. Makes me wonder if Nvidia is purposely making the validation process more strict than it needs to be for these monitors.
G-SYNC comes with a strict set of parameters that NVIDIA is testing the Adaptive-Sync monitors on the market against. If a monitor doesn't meet those parameters, it doesn't pass validation. Simple as that. That means all those monitors on the market with a narrow dynamic refresh rate range, such as 48-75 Hz, will not pass validation.
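As a concrete illustration: reports around the announcement put one of the criteria at a minimum VRR range ratio of roughly 2.4:1 (e.g. 60-144 Hz). Assuming that figure (it is not a published Nvidia formula), the range check alone already fails a 48-75 Hz panel:

```python
# Hypothetical validation sketch. The ~2.4:1 minimum VRR range ratio below is
# an assumption based on reports of Nvidia's criteria, not a published spec.

MIN_RANGE_RATIO = 2.4  # assumed minimum max:min refresh ratio

def passes_range_check(vrr_min_hz, vrr_max_hz):
    return vrr_max_hz / vrr_min_hz >= MIN_RANGE_RATIO

print(passes_range_check(48, 75))   # False -> 75/48 = 1.56, too narrow
print(passes_range_check(30, 144))  # True  -> 144/30 = 4.8
```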

The G-SYNC "tax" as you refer to is the cost of developing the displays with the FPGA. With G-SYNC NVIDIA are hand picking and developing panels with the panel manufacturers, and those panels go through a rigorous testing regimen to ensure display quality and proper operation with the FPGA while the FPGA itself is tweaked to work with each specific panel. All that adds to the premium pricing of what NVIDIA is now calling "G-SYNC Ultimate HDR." FreeSync 2 has a similar price premium, albeit cheaper due to not needing additional hardware. AMD saw the wildly differing experiences with the wild west of their open standard in the original implementation of Freesync, so they added their own certification process with FreeSync 2 to ensure a consistent experience.
 
Which is why it pays to pick a monitor not only by finding a good price, but also by reading multiple reviews.
 
First, my apologies: I meant Kabini, not Kaveri. I am always getting the two of them mixed up. Kabini is first-generation GCN. (If you check, it uses GCN 1.0, not 1.1.)
I checked. Kabini is 2nd-generation GCN (more specifically, GFX7), same as Kaveri. There was no GCN 1 APU. What came before Kabini/Kaveri was Trinity/Richland, based on VLIW4.

Remember when Nvidia announced G-Sync for laptops? They made Maxwell cards that supported LVDS and eDP. The GPUs that went into the LVDS laptops did not support G-Sync; the ones that went into eDP laptops did. I remember reading on forums about people trying a 980M from an LVDS laptop in an eDP laptop, and it didn't work with G-Sync. They tried flashing the BIOS, spoofing the ID, everything; the card worked, but G-Sync didn't. While the silicon might be similar, there can be variations.
This can only be the case if NVidia specifically implemented a block against it, e.g. through polyfuses.

You are correct that product decisions can sometimes hamper FreeSync support, such as in the PS4 (which does not connect the HDMI port directly to the Liverpool APU, but instead goes through a Panasonic MN8647xx HDMI encoder). But I am quite certain that such shenanigans are absent from Geforce MXMs.

The 7970 doesn't support Adaptive-Sync. What happens is that when you play a video, the driver detects the frame rate the video is playing at and sets your monitor to a multiple of it. So if the video is 24 fps, it sets the monitor to 48 Hz. It's just a one-time change, made possible by the updated scaler in the monitor. The card doesn't support VRR at all; it doesn't have the hardware.
That the 7970 supports FreeSync for video playback was stated directly by AMD:
AMD said:
Q. Which AMD Radeon™ GPUs are compatible with AMD FreeSync™ technology?
All AMD Radeon™ graphics cards in the AMD Radeon™ HD 7000, HD 8000, R7 or R9 Series will support AMD FreeSync™ technology for video playback and power-saving purposes. The AMD Radeon™ R9 295X2, 290X, R9 290, R9 285, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming.
 
I checked. Kabini is 2nd-generation GCN (more specifically, GFX7), same as Kaveri. There was no GCN 1 APU. What came before Kabini/Kaveri was Trinity/Richland, based on VLIW4.

This can only be the case if NVidia specifically implemented a block against it, e.g. through polyfuses.

You are correct that product decisions can sometimes hamper FreeSync support, such as in the PS4 (which does not connect the HDMI port directly to the Liverpool APU, but instead goes through a Panasonic MN8647xx HDMI encoder). But I am quite certain that such shenanigans are absent from Geforce MXMs.

That the 7970 supports FreeSync for video playback was stated directly by AMD:

I have come back to this, since you seem to think that you have shot me down for some reason.

Kabini is a GCN 1.0 APU, as stated in this review by Bjorn3D. I quoted the important line.

The GPU on the Kabini is based on GCN architecture like the rest of the 2014 AMD APU lineup and the desktop Radeon GPU. Though Kabini uses GCN 1.0 as opposed to the GCN 1.1 found on the Kaveri.

Second, what it proves is that there are variations between the GPUs used in laptops, and differences between desktop cards and laptop cards. Just because Adaptive-Sync works on a laptop GPU doesn't mean it will work on a desktop GPU.

And lastly: yeah, I know AMD stated that, and it's correct; I am just telling you how it works. There is no hardware requirement on the GPU side to use it for video playback. The hardware is in the monitor; all the GPU needs is DisplayPort 1.2 or higher, because it's only a one-time change of the refresh rate. But for dynamic refresh rates you need the hardware on the GPU side as well.

Both AMD and Nvidia have stated there is hardware needed. Nvidia have stated that Maxwell desktop GPUs don't have the hardware needed (or at least most of the cards don't), and AMD have stated the same: there is a hardware requirement to support Adaptive-Sync. The bit you quoted from AMD even says that.

The AMD Radeon™ R9 295X2, 290X, R9 290, R9 285, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming.
 
I have come back to this, since you seem to think that you have shot me down for some reason.

Kabini is a GCN 1.0 APU, as stated in this review by Bjorn3D. I quoted the important line.
So Bjorn3D is wrong on this one. You can look into AMD's open source Linux drivers for first-hand information. Kabini is a GFX7 / 2nd-gen GCN (sometimes called GCN 1.1) part, like the others in the Sea Islands GPU family.

Or look at this wiki page maintained by the AMD Linux driver team:
Radeon Graphics/Compute Hardware:
GFX7: BONAIRE, KABINI, MULLINS, KAVERI, HAWAII
https://www.x.org/wiki/RadeonFeature/#index6h2

There is no hardware requirement on the GPU side to use it for video playback.
I think there is. Older DP 1.2 cards from AMD like the 6000 / Northern Islands series do not even support FreeSync for video playback.

Both AMD and Nvidia have stated there is hardware needed. Nvidia have stated that Maxwell desktop GPUs don't have the hardware needed (or at least most of the cards don't), and AMD have stated the same: there is a hardware requirement to support Adaptive-Sync. The bit you quoted from AMD even says that.
What AMD said was that supporting VRR in gaming needs hardware support. So the graphics core needs support for VRR (present since 2nd-gen GCN), and the display engine needs support for Adaptive-Sync (present since 1st-gen GCN).
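Put another way (a rough model that only encodes the claims in this thread, not any official AMD support matrix):

```python
# Illustrative model of the two separate capabilities discussed above; the
# entries below encode this thread's claims, not an official AMD matrix.

from dataclasses import dataclass

@dataclass
class GpuCaps:
    adaptive_sync_display_engine: bool  # enough for video-playback FreeSync
    vrr_graphics_core: bool             # needed for dynamic refresh in games

CAPS = {
    "GCN 1 (e.g. Tahiti / HD 7970)": GpuCaps(True, False),
    "GCN 2 (e.g. Hawaii / R9 290)": GpuCaps(True, True),
    "VLIW4 (e.g. Trinity/Richland)": GpuCaps(False, False),
}

def freesync_modes(part):
    caps, modes = CAPS[part], []
    if caps.adaptive_sync_display_engine:
        modes.append("video playback")
    if caps.vrr_graphics_core:
        modes.append("gaming")
    return modes

print(freesync_modes("GCN 1 (e.g. Tahiti / HD 7970)"))  # ['video playback']
```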

I therefore still think the Maxwell silicon has the capability and everything that is required, and that there is nothing between the GPU and the display that would interfere (like that Panasonic HDMI encoder in the PS4).

However, after thinking a bit more about what you wrote (that there are MXMs which apparently will not enable G-Sync), maybe NVidia used special firmware, or possibly fused off this function, so that it works only in G-Sync certified laptops. In any case, the disabling was a deliberate act by NVidia after the hardware was produced.
 
Been having problems with this for a few weeks now. I had been on the original VRR driver, updated, and now my monitor is acting like it's getting out-of-range signals and blacking out when it gets below 45 fps.
 
There are so many monitor brands, lines, and specs out there.
G-Sync seemed like a quality-control gate, in that manufacturers wouldn't put G-Sync on anything less than their best panels, and those would have higher quality control.

Is this not true?
Is G-Sync now as untrustworthy and useless as the FreeSync branding?
 
There are so many monitor brands, lines, and specs out there.
G-Sync seemed like a quality-control gate, in that manufacturers wouldn't put G-Sync on anything less than their best panels, and those would have higher quality control.

Is this not true?
Is G-Sync now as untrustworthy and useless as the FreeSync branding?

Nah, Nvidia separates G-Sync and G-Sync Compatible. G-Sync Compatible monitors are FreeSync monitors that pass Nvidia QC. You can use any FreeSync monitor that doesn't have the G-Sync stamp of approval; Nvidia just can't guarantee it will work well.
 
There are so many monitor brands, lines, and specs out there.
G-Sync seemed like a quality-control gate, in that manufacturers wouldn't put G-Sync on anything less than their best panels, and those would have higher quality control.

Is this not true?
Is G-Sync now as untrustworthy and useless as the FreeSync branding?
This might answer your questions/unrest:

TL;DW:

G-Sync Compatible is a (simple?) four-point inspection. Getting real G-Sync branding requires surviving Nvidia QC's gauntlet of tests.
 
Nah, Nvidia separates G-Sync and G-Sync Compatible. G-Sync Compatible monitors are FreeSync monitors that pass Nvidia QC. You can use any FreeSync monitor that doesn't have the G-Sync stamp of approval; Nvidia just can't guarantee it will work well.

So now there will be real "G-Sync" and "G-Sync Compatible" (FreeSync)?

But Nvidia is actually qualifying the FreeSync "G-Sync Compatible" monitors?
 
So now there will be real "G-Sync" and "G-Sync Compatible" (FreeSync)?

But Nvidia is actually qualifying the FreeSync "G-Sync Compatible" monitors?

They make it sound complicated, don't they?

G-Sync is Nvidia's self-proclaimed (proprietary) standard for variable refresh rate. VRR has been around for years. So whenever you hear "G-Sync Compatible," it only means the monitor passes Nvidia's standard for variable refresh rate, in accordance with what they deem acceptable.

FreeSync, on the other hand, is open source, proprietary-free, and uses what was already in place in the realm of VRR.
 
They make it sound complicated, don't they?

G-Sync is Nvidia's self-proclaimed standard for variable refresh rate. VRR has been around for years. So whenever you hear "G-Sync Compatible," it only means the monitor passes Nvidia's standard for variable refresh rate, in accordance with what they deem acceptable.

I thought Nvidia launched the commercial movement towards "VRR," and that G-Sync is hardware in the monitor synchronizing with your Nvidia GPU, with strict standards.

FreeSync is AMD's open source, uncontrolled, no-standard, no-qualification implementation of VRR?
 
I thought Nvidia launched the commercial movement towards "VRR," and that G-Sync is hardware in the monitor synchronizing with your Nvidia GPU.

FreeSync is AMD's open source, uncontrolled, no-standard, no-qualification implementation of VRR?

Yes, Nvidia used their own proprietary hardware for G-Sync. VRR has been around for a while, even before Nvidia's G-Sync chip. But they do like to tout that they created things they didn't. They didn't create sync technology, just vastly improved it.

That's not taking anything away from FreeSync, though; I have used both and cannot tell the difference either way.

I'm using a 2070 on a FreeSync monitor now in G-Sync Compatible mode and it works flawlessly. I'm really glad Nvidia decided to start supporting FreeSync monitors. Good move.

Here's the monitor if you're interested:
https://www.amazon.com/LG-32GK650F-B-Monitor-FreeSync-Technology/dp/B07FLGR2PN
 
Yes, Nvidia used their own proprietary hardware for G-Sync. VRR has been around for a while, even before Nvidia's G-Sync chip. But they do like to tout that they created things they didn't.

Wasn't aware Nvidia ever claimed to invent VRR. They were first to market with the modern version of it, but I don't recall them saying they created the entire concept of VRR. It would have been pretty stupid of them to do so, as VESA announced Adaptive-Sync before G-Sync was out.
 
Wasn't aware Nvidia ever claimed to invent VRR. They were first to market with the modern version of it, but I don't recall them saying they created the entire concept of VRR. It would have been pretty stupid of them to do so, as VESA announced Adaptive-Sync before G-Sync was out.

You know as well as I that they go a little over the top on marketing.
 
Wasn't aware Nvidia ever claimed to invent VRR. They were first to market with the modern version of it, but I don't recall them saying they created the entire concept of VRR. It would have been pretty stupid of them to do so, as VESA announced Adaptive-Sync before G-Sync was out.

Do you have evidence to show that Adaptive Sync was used for syncing game output before Nvidia announced G-Sync?

I remember AMD hacking together a laptop halfassedly immediately after...
 
Yes, Nvidia used their own proprietary hardware for G-Sync. VRR has been around for a while, even before Nvidia's G-Sync chip. But they do like to tout that they created things they didn't.

I couldn't care less who gets the "win" for accomplishing something; it appears to me G-Sync is the better-implemented end-to-end solution. Not designed, but implemented.

Strange they'd devalue their standard... guess the OEMs want to sell us lower-quality, cheaper panels with their branding.
 
I couldn't care less who gets the "win" for accomplishing something; it appears to me G-Sync is the better-implemented end-to-end solution. Not designed, but implemented.

Strange they'd devalue their standard... guess the OEMs want to sell us lower-quality, cheaper panels with their branding.

Personally, I think it has to do with the HDMI 2.1 specification; any TV or monitor with HDMI 2.1 will support VRR by default.
 
Do you have evidence to show that Adaptive Sync was used for syncing game output before Nvidia announced G-Sync?

I remember AMD hacking together a laptop halfassedly immediately after...

VRR (for CRT displays of the time) was patented in 1985. So, no, Nvidia did not invent VRR. As I said, they were first to market with the modern version of the tech.
 
Personally, I think it has to do with the HDMI 2.1 specification; any TV or monitor with HDMI 2.1 will support VRR by default.

Interesting. Do you think the cheap consumer space is creating lesser standards, forcing Nvidia to accommodate?

I'm only interested in DisplayPort myself.
 
Strange they'd devalue their standard... guess the OEMs want to sell us lower-quality, cheaper panels with their branding.

Might as well get in front of it. Most FreeSync implementations, and I do mean the vast majority, have been sub-par.

But enough have been 'close enough', and whatever Nvidia intended at the outset of G-Sync, they never got the cost down. Depending on the level of monitor being compared it wasn't always that bad; however, there was also a definite price 'floor' to G-Sync monitors.
 
Interesting. Do you think the cheap consumer space is creating lesser standards, forcing Nvidia to accommodate?

I'm only interested in DisplayPort myself.

A-Sync isn't a lesser standard. Take two versions of the exact same monitor, give one G-Sync and the other A-Sync with the same sync ranges and no other differences, and it will be virtually impossible to tell them apart.
 
VRR (for CRT displays of the time) was patented in 1985. So, no, Nvidia did not invent VRR. As I said, they were first to market with the modern version of the tech.

If you're talking about CRTs, then we're not talking about the same thing. The 'idea' of VRR, even as implemented in CRTs, became an entirely new challenge for LCD panels. Apples and oranges, and Nvidia had the entire thing figured out and implemented in hardware and software when they announced.
 
A-Sync isn't a lesser standard. Take two versions of the exact same monitor, give one G-Sync and the other A-Sync with the same sync ranges and no other differences, and it will be virtually impossible to tell them apart.

No, but it's also not a standard for everything, just the protocol. And neither was FreeSync. FreeSync 2 did a lot to make up the difference, but I'd be extremely wary of a FreeSync monitor. That's one nice thing about G-Sync: you don't have to worry about the VRR implementation being halfassed.
 
If you're talking about CRTs, then we're not talking about the same thing. The 'idea' of VRR, even as implemented in CRTs, became an entirely new challenge for LCD panels. Apples and oranges, and Nvidia had the entire thing figured out and implemented in hardware and software when they announced.

I'm not entirely sure why you're going after my post when I didn't even say anything negative.
 
Interesting. Do you think the cheap consumer space is creating lesser standards, forcing Nvidia to accommodate?

I'm only interested in DisplayPort myself.
No, I don't believe that. If that were the case, I don't see the point of NVidia going through a validation process for FreeSync monitors, or still unveiling G-Sync monitors from its partners.
 
No, but it's also not a standard for everything, just the protocol. And neither was FreeSync. FreeSync 2 did a lot to make up the difference, but I'd be extremely wary of a FreeSync monitor. That's one nice thing about G-Sync: you don't have to worry about the VRR implementation being halfassed.

I'd only be wary of a FreeSync monitor if you don't do your research beforehand. And if you aren't doing research before buying something you could use for multiple years (since people tend to keep monitors for a long time), you deserve what you get.
 
I'd only be wary of a FreeSync monitor if you don't do your research beforehand. And if you aren't doing research before buying something you could use for multiple years (since people tend to keep monitors for a long time), you deserve what you get.

Yeah... no. We might do the research, but anyone can slap a 'FreeSync' sticker on a monitor because, you know, it's 'free'. And there are a lot of halfassed implementations.

And manufacturers do a pretty opaque job of detailing their products. Even today.
 
Because you went off topic?

'VRR was invented for use on CRTs'

I didn't say anything about it being invented, patented, or anything else.

Because I was trying to figure out what exactly you were after in your first reply, since it seemed to completely miss what the post was saying.

Yeah... no. We might do the research, but anyone can slap a 'FreeSync' sticker on a monitor because, you know, it's 'free'. And there are a lot of halfassed implementations.

That's why you research the monitors that look interesting. I wouldn't buy a G-Sync monitor without doing some serious research either. In fact, I spent months researching monitors before buying one.
 
I'd only be wary of a FreeSync monitor if you don't do your research beforehand. And if you aren't doing research before buying something you could use for multiple years (since people tend to keep monitors for a long time), you deserve what you get.

Where can you do research?
 
Reviews, forums, Google searches. The same way you research any product.

Muddy waters, everything is always great, 50/50 at worst.

Do you have a good tech site or are you really just pissing in the wind?
 
Muddy waters, everything is always great, 50/50 at worst.

Do you have a good tech site or are you really just pissing in the wind?

So you act like a dick and expect me to help you afterwards?
 