VESA announces updates to adaptive sync certification process

Lakados

[H]F Junkie
Joined
Feb 3, 2014
Messages
10,276
VESA has found that too many manufacturers are lax in their implementation of the adaptive sync standards (FreeSync), so it is clarifying the specification and expanding the certification process.

Unfortunately, VESA is not requiring vendors to publish specifics on grey-to-grey response or other supported ranges, or their actual test results. Vendors can get away with a simple pass/fail approach to certification, so it is going to be up to the public to find trusted reviews.

https://www.guru3d.com/news-story/vesa-launches-adaptivesync-mediasync-vrr-standards.html

https://www.pcworld.com/article/696...adaptive-sync-monitors-actually-work.html/amp
 
The only question that pops up in my head is whether it's still better to just look for a G-Sync Compatible certified monitor, since those have at least been tested and certified to provide a basic, trouble-free, tear-free and stutter-free gaming experience.

Is this bound to be any better than that, or will it not really change a damn thing and just be blowing hot air? I mean, it can't be going back to the mass of cheap FreeSync monitors that flooded the market at its inception and proved to be a minefield of flickering madness, right? :confused:
 
The only question that pops up in my head is whether it's still better to just look for a G-Sync Compatible certified monitor, since those have at least been tested and certified to provide a basic, trouble-free, tear-free and stutter-free gaming experience.

Is this bound to be any better than that, or will it not really change a damn thing and just be blowing hot air? I mean, it can't be going back to the mass of cheap FreeSync monitors that flooded the market at its inception and proved to be a minefield of flickering madness, right? :confused:
Yes it is. The G-Sync certification process is still more rigorous and will generally result in a more enjoyable experience. This appears to be a response to that fact and goes a long way towards matching up.
 
Wasn’t this already solved by Freesync 2 requiring certain metrics being obtained in order to have certification?
 
Wasn’t this already solved by Freesync 2 requiring certain metrics being obtained in order to have certification?
FreeSync and FreeSync 2 are different certification standards from the VESA ones, though they do piggyback on them. From what I understand, AMD is pretty lax in its enforcement of the term FreeSync outside of the FreeSync Premium and Premium Pro certifications. The VESA standards will require the certification tests to be performed in the monitor's default configuration, a rule otherwise only shared with Nvidia's G-Sync.
 
The only question that pops up in my head is whether it's still better to just look for a G-Sync Compatible certified monitor, since those have at least been tested and certified to provide a basic, trouble-free, tear-free and stutter-free gaming experience.

Is this bound to be any better than that, or will it not really change a damn thing and just be blowing hot air? I mean, it can't be going back to the mass of cheap FreeSync monitors that flooded the market at its inception and proved to be a minefield of flickering madness, right? :confused:

This is better than having "G-Sync Compatible" or "FreeSync" on the monitor. "G-Sync Compatible" is only a marketing blurb that Nvidia introduced to make it appear better than FreeSync. If you have a good adaptive sync monitor then you will have a good gaming experience whether the monitor is G-Sync Compatible or not. This new VESA certification goes some way towards making sure that monitors with the adaptive sync label are actually good and not just claiming to be good.
 
The only question that pops up in my head is whether it's still better to just look for a G-Sync Compatible certified monitor, since those have at least been tested and certified to provide a basic, trouble-free, tear-free and stutter-free gaming experience.

Is this bound to be any better than that, or will it not really change a damn thing and just be blowing hot air? I mean, it can't be going back to the mass of cheap FreeSync monitors that flooded the market at its inception and proved to be a minefield of flickering madness, right? :confused:

Well sure, if you have an Nvidia card, G-Sync would be a good idea. If you have an AMD card, FreeSync Premium or FreeSync 2 would be the best option. However, since Nvidia usurped the FreeSync moniker on a number of monitors and called them G-Sync Compatible, you can no longer just look for cheap FreeSync monitors.
 
Yes it is. The G-Sync certification process is still more rigorous and will generally result in a more enjoyable experience. This appears to be a response to that fact and goes a long way towards matching up.

Not when it comes to FreeSync Premium or FreeSync 2; those are most likely equal. However, you are going to have to pay to play in those cases.
 
Not when it comes to FreeSync Premium or FreeSync 2; those are most likely equal. However, you are going to have to pay to play in those cases.
Not quite equal, but comparable: manufacturers are allowed to colour-tune and overclock the panels to pass the FreeSync certifications, whereas G-Sync tests are required to be run at stock, out-of-the-box configurations.
But yes, they are much closer, and if you have the know-how to tune your display they can be equal.
G-Sync also runs more tests to ensure a better minimum-framerate experience.
 
Not quite equal, but comparable: manufacturers are allowed to colour-tune and overclock the panels to pass the FreeSync certifications, whereas G-Sync tests are required to be run at stock, out-of-the-box configurations.
But yes, they are much closer, and if you have the know-how to tune your display they can be equal.
G-Sync also runs more tests to ensure a better minimum-framerate experience.

I am hoping the VESA standards can account for all of this in time. I don't want to have to do a lot of research to see which monitors support both Nvidia and AMD properly. FreeSync has been a mess, although it seems to be clearing up, and some monitors support G-Sync well enough too. But it is still hard to figure out which those monitors are.

If VESA can bring the certification process up to par with G-Sync (not sure if that's entirely possible?), that would be great.
 
I am hoping the VESA standards can account for all of this in time. I don't want to have to do a lot of research to see which monitors support both Nvidia and AMD properly. FreeSync has been a mess, although it seems to be clearing up, and some monitors support G-Sync well enough too. But it is still hard to figure out which those monitors are.

If VESA can bring the certification process up to par with G-Sync (not sure if that's entirely possible?), that would be great.
Nvidia keeps a running list; the ones listed as Compatible are supposedly verified to work with both Nvidia G-Sync and AMD FreeSync capable cards.
https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/
 
I am hoping the VESA standards can account for all of this in time. I don't want to have to do a lot of research to see which monitors support both Nvidia and AMD properly. FreeSync has been a mess, although it seems to be clearing up, and some monitors support G-Sync well enough too. But it is still hard to figure out which those monitors are.

If VESA can bring the certification process up to par with G-Sync (not sure if that's entirely possible?), that would be great.
It's actually very easy to see which monitors have G-Sync certification, as Nvidia keeps a list of all G-Sync Ultimate, G-Sync and G-Sync Compatible monitors on their own website.

Edit: Lakados beat me to it.
 
Not quite equal, but comparable: manufacturers are allowed to colour-tune and overclock the panels to pass the FreeSync certifications, whereas G-Sync tests are required to be run at stock, out-of-the-box configurations.

What??
 
I am hoping the VESA standards can account for all of this in time. I don't want to have to do a lot of research to see which monitors support both Nvidia and AMD properly. FreeSync has been a mess, although it seems to be clearing up, and some monitors support G-Sync well enough too. But it is still hard to figure out which those monitors are.

If VESA can bring the certification process up to par with G-Sync (not sure if that's entirely possible?), that would be great.

If you want to find a monitor that works with AMD and Nvidia, just go here.

https://www.amd.com/en/products/freesync-monitors

Pick the panel type you want, and then under the range tab make sure the monitor you are thinking of buying supports the full range, i.e. that the high value isn't 60, 75 or 90 Hz. Any monitor that has the full range should work equally well with Nvidia or AMD.
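If you'd rather not eyeball the range column, that check is easy to script against an exported list. A rough sketch (the model names and ranges below are invented for illustration, not from AMD's actual list):

```python
# Hypothetical (model, vrr_min_hz, vrr_max_hz, native_hz) rows, as you
# might copy them out of AMD's FreeSync monitor list.
monitors = [
    ("ExampleCo A27", 48, 144, 144),
    ("ExampleCo B32", 48, 90, 144),   # VRR range capped below native refresh
    ("ExampleCo C24", 40, 75, 75),
]

# Keep only monitors whose VRR range runs all the way up to the
# panel's native refresh rate (i.e. the high value isn't capped).
full_range = [m for m in monitors if m[2] == m[3]]
for name, lo, hi, native in full_range:
    print(f"{name}: {lo}-{hi} Hz")  # A27 and C24 pass; B32 is filtered out
```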

However, you will still need to do research before buying a monitor. Despite what people will tell you, it doesn't matter whether the monitor says G-Sync Compatible/FreeSync 2 or not; the monitor could still have issues with flickering, colours, backlight bleed, stuck pixels or any of the numerous other problems that monitors have.

When you say on par with G-Sync, do you mean monitors with the G-Sync module?
 
If you want to find a monitor that works with AMD and Nvidia, just go here.

https://www.amd.com/en/products/freesync-monitors

Pick the panel type you want, and then under the range tab make sure the monitor you are thinking of buying supports the full range, i.e. that the high value isn't 60, 75 or 90 Hz. Any monitor that has the full range should work equally well with Nvidia or AMD.

However, you will still need to do research before buying a monitor. Despite what people will tell you, it doesn't matter whether the monitor says G-Sync Compatible/FreeSync 2 or not; the monitor could still have issues with flickering, colours, backlight bleed, stuck pixels or any of the numerous other problems that monitors have.

When you say on par with G-Sync, do you mean monitors with the G-Sync module?
I would think they mean equivalent in G2G, HDR, response times, colour range, accuracy, and all the other parts of the G-Sync and FreeSync certifications. Adaptive frame rates, while important, aren't the only parts of their branding certifications.

But the two standards will never be equal at the lower end of their refresh ranges. G-Sync can go as low as 1 fps, whereas depending on the panel FreeSync will only function above 48 or 60 fps.
 
I would think they mean equivalent in G2G, HDR, response times, colour range, accuracy, and all the other parts of the G-Sync and FreeSync certifications. Adaptive frame rates, while important, aren't the only parts of their branding certifications.

But the two standards will never be equal at the lower end of their refresh ranges. G-Sync can go as low as 1 fps, whereas depending on the panel FreeSync will only function above 48 or 60 fps.

Yes, that's G-Sync certification, not G-Sync Compatible certification.
 
Yes, that's G-Sync certification, not G-Sync Compatible certification.
And that's where shit gets weird, which is why VESA upping their testing requirements, and by extension the FreeSync requirements, is a good thing. G-Sync sets a high bar and tends to work extremely well out of the box. There isn't a lot you have to research there; it's very plug-and-play, and it takes a lot of the guesswork and forum hunting out of the process. But then it's basically a service you are paying for through the G-Sync monitors' obvious markup. It gets a lot clearer with the FreeSync Premium and Premium Pro certified monitors, whose baseline stats are close if not identical to the G-Sync ones, but then you are inching a lot closer to the G-Sync monitor's price. It does afford you the ability to seamlessly transition between AMD and Nvidia GPUs and have the technology work, though.
G-Sync Compatible is just "Yeah, we tested this in house and we found it worked to our satisfaction."
Between the lot, I am more inclined towards the FreeSync Premium Pro models; they are easier to find and offer out-of-the-box performance that is on par with G-Sync devices, with better flexibility. Very tempted by the Dell G3223D.
 
And that's where shit gets weird, which is why VESA upping their testing requirements, and by extension the FreeSync requirements, is a good thing. G-Sync sets a high bar and tends to work extremely well out of the box. There isn't a lot you have to research there; it's very plug-and-play, and it takes a lot of the guesswork and forum hunting out of the process. But then it's basically a service you are paying for through the G-Sync monitors' obvious markup.
Precisely. You can shit on Nvidia for a lot of things with G-Sync (and other issues), but one thing you can't fault is how easy it makes it for users to get things working well. If you bought a G-Sync monitor (with the G-Sync hardware) and an Nvidia card, it just works. From very low to very high FPS it works well, has nicely tuned response times, etc. You just use it and enjoy it.

This is what we should want for everything. There shouldn't be a "FreeSync range" on screens, and we shouldn't have to deal with smearing in VRR because the response times aren't optimized for anything but the max frame rate. We should be able to plug in a display in VRR mode and just use it without worrying about issues. The only way that is going to happen is either:

1) Letting Nvidia take over everything and having one proprietary standard where all GPUs are theirs and all monitors have their chips

or

2) Having better standards from standards bodies that require manufacturers to make their shit work well with all devices.
 
I would think they mean equivalent in G2G, HDR, response times, colour range, accuracy, and all the other parts of the G-Sync and FreeSync certifications. Adaptive frame rates, while important, aren't the only parts of their branding certifications.

But the two standards will never be equal at the lower end of their refresh ranges. G-Sync can go as low as 1 fps, whereas depending on the panel FreeSync will only function above 48 or 60 fps.

FreeSync is not tied solely to the FPS. It can operate below 48 (the typical low-end refresh rate of FreeSync monitors) if the monitor supports LFC, and most do. Its real low end is 19.2, to be exact.

G-Sync does operate at 1 fps theoretically, but that's a slide show. That's like being the master of shit. It's really just marketing, because no one is playing at those rates, nor would they.
 
Last edited:
FreeSync is not tied solely to the FPS. It can operate below 48 (the typical low-end refresh rate of FreeSync monitors) if the monitor supports LFC, and most do. Its real low end is 19.2, to be exact.

G-Sync does operate at 1 fps, but that's a slide show. That's like being the master of shit. It's really just marketing, because no one is playing at those rates, nor would they.
LFC doesn't look much better when it's just holding frames longer in the buffer so that your one frame is shown two or three times. But yes, it is there, and it is a decent workaround for that minimum-fps issue. It is one more feature you have to research to check whether it is supported on your desired monitor, though; it should be on all of them by default. FreeSync and FreeSync 2 as they currently stand have too many optional parts of the specification. You shouldn't need spreadsheets to determine whether a feature is actually going to work as desired on your machine.

As per AMD on this
“Low Framerate Compensation only works on FreeSync monitors in which the maximum refresh rate is at least 2.5 times greater than its minimum refresh rate.”

It's bullshit that you need to bust out a calculator to do the math on whether the feature is going to work or not.
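For what it's worth, AMD's stated 2.5x rule is simple enough to sanity-check in a couple of lines. A rough sketch (the example ranges are made up, not taken from AMD's monitor list):

```python
def supports_lfc(min_hz: float, max_hz: float) -> bool:
    """AMD's stated rule: LFC only works when the maximum refresh
    rate is at least 2.5x the minimum refresh rate."""
    return max_hz >= 2.5 * min_hz

# Made-up example ranges to illustrate the rule
print(supports_lfc(48, 144))  # 144 >= 120 -> True
print(supports_lfc(48, 75))   # 75 < 120  -> False
print(supports_lfc(30, 75))   # 75 >= 75  -> True (just barely)
```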
 
LFC doesn't look much better when it's just holding frames longer in the buffer so that your one frame is shown two or three times. But yes, it is there, and it is a decent workaround for that minimum-fps issue. It is one more feature you have to research to check whether it is supported on your desired monitor, though; it should be on all of them by default. FreeSync and FreeSync 2 as they currently stand have too many optional parts of the specification. You shouldn't need spreadsheets to determine whether a feature is actually going to work as desired on your machine.

As per AMD on this
“Low Framerate Compensation only works on FreeSync monitors in which the maximum refresh rate is at least 2.5 times greater than its minimum refresh rate.”

It's bullshit that you need to bust out a calculator to do the math on whether the feature is going to work or not.
Literally none of that refutes what I said. Where do you think I got the math from? Duh.

"It's bullshit that you need to bust out a calculator to do the math." I don't know what else to say to that. Welcome to IT, where you have to be somewhat accurate. Either way, it's not bullshit. There's also a list on AMD's site so you don't have to wonder whether it supports it (LFC). Yay Internet!

If you want to push Nvidia that's fine, be my guest; couldn't care less. But the 48 fps number wasn't correct, so I corrected you. Not a big deal.
 
Last edited:
Literally none of that refutes what I said. Where do you think I got the math from? Duh.

Listen to yourself: "It's bullshit that you need to bust out a calculator to do the math." I don't know what else to say to that. Welcome to IT, where you have to be somewhat accurate. Either way, it's not bullshit. There's also a list on AMD's site so you don't have to wonder whether it supports it. Yay Internet!

BTW, how in the world do you buy your IT products? Don't you check the specifications? Surely you don't just listen to some teenager on YouTube.

Consumer electronics != IT.

If your customers are doing long division in the aisles of the local Best Buy, your marketing team has failed. Engineering team might not be looking so hot either.
 
Consumer electronics != IT.

If your customers are doing long division in the aisles of the local Best Buy, your marketing team has failed. Engineering team might not be looking so hot either.
That's why people end up here: because they relied on the teenager at Best Buy. If you want to put premium gas in your Ford Focus, be my guest.
 
Literally none of that refutes what I said. Where do you think I got the math from? Duh.

"It's bullshit that you need to bust out a calculator to do the math." I don't know what else to say to that. Welcome to IT, where you have to be somewhat accurate. Either way, it's not bullshit. There's also a list on AMD's site so you don't have to wonder whether it supports it (LFC). Yay Internet!

If you want to push Nvidia that's fine, be my guest; couldn't care less. But the 48 fps number wasn't correct, so I corrected you. Not a big deal.
The 48/60 number is correct, and AMD had to create a software solution to work around it: if you only had 19.5 FPS it would hold each frame for 3 refreshes, inflating the effective refresh rate to 58.5 Hz, and possibly still tear.
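The frame-repetition arithmetic in that example works out like this (a rough illustration of the idea, not AMD's actual driver logic):

```python
import math

def lfc_multiplier(fps: float, min_hz: float) -> int:
    """Smallest whole number of repeats per frame that lifts the
    effective refresh rate to at least the panel's minimum."""
    return math.ceil(min_hz / fps)

fps = 19.5
mult = lfc_multiplier(fps, 48)  # 48 / 19.5 = 2.46... -> 3 repeats
print(mult, fps * mult)          # each frame shown 3 times -> 58.5 Hz effective
```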

Consumers aren’t IT professionals and should not be expected to be. You should be able to look at a box, see the branding or logos you are wanting and expect it to work.

I'm not pushing anything; I am pointing out a large flaw in the FreeSync certification process, one big enough that it forced VESA to create two new certification processes that supplant it. Those certifications are needed to provide the clarity and enforcement that AMD was not willing or able to bring.

Healthy criticism of something is not advocacy for its alternatives.
 
Last edited:
The 48/60 number is correct, and AMD had to create a software solution to work around it: if you only had 19.5 FPS it would hold each frame for 3 refreshes, inflating the effective refresh rate to 58.5 Hz, and possibly still tear.

Consumers aren’t IT professionals and should not be expected to be. You should be able to look at a box, see the branding or logos you are wanting and expect it to work.

I'm not pushing anything; I am pointing out a large flaw in the FreeSync certification process, one big enough that it forced VESA to create two new certification processes that supplant it. Those certifications are needed to provide the clarity and enforcement that AMD was not willing or able to bring.

Healthy criticism of something is not advocacy for its alternatives.
Nope. This is right up there with your B.S. accusation that AMD had no GPU patents, or the other one where you said it didn't help develop HBM.
 
Nope. This is right up there with your B.S. accusation that AMD had no GPU patents, or the other one where you said it didn't help develop HBM.
AMD had nothing to do with the development of the HBM chips themselves; they developed an interface layer between their packaging and the HBM chips. That is vastly different.


SK Hynix developed the DRAM die, TSVs and HBM controller; AMD was responsible for the silicon interposer, which is listed as a packaging technology and which AMD co-developed with TSMC.
Here is TSMC's IEEE paper for said silicon interposer, which was completed and published 3 years before SK Hynix would complete the HBM specification, which was created at AMD's behest.
https://ieeexplore.ieee.org/document/6651893

But saying AMD created HBM memory is like saying that the person who commissioned a painting created the painting.
 
Last edited:
AMD had nothing to do with the development of the HBM chips themselves; they developed an interface layer between their packaging and the HBM chips. That is vastly different.

View attachment 470660
SK Hynix developed the DRAM die, TSVs and HBM controller; AMD was responsible for the silicon interposer, which is listed as a packaging technology and which AMD co-developed with TSMC.
Here is TSMC's IEEE paper for said silicon interposer, which was completed and published 3 years before SK Hynix would complete the HBM specification, which was created at AMD's behest.
https://ieeexplore.ieee.org/document/6651893
You can't develop an interface without being a part of the development of the chips. You also can't get patents on it unless you helped develop it. Try harder.

Literally it was a partnership. You aren't dropping truth bombs here.
 
You can't develop an interface without being a part of the development of the chips. You also can't get patents on it unless you helped develop it. Try harder.

Literally it was a partnership. You aren't dropping truth bombs here.
AMD states their HBM patents are for the 2.5D packaging technology, which is covered in the JESD235 standard. But sure, they probably had some hand in the specifications and input; even if their engineers were not directly involved, they probably had some outside role that wouldn't be made clear in the patents or whitepapers.

But that is still very different from AMD's failure to clarify or enforce their FreeSync branding of the VESA standards, forcing VESA to create new branding to supplant the FreeSync marks because consumers at large deemed them unreliable. I want AMD to do better here: the hoops you have to jump through to actually enable FreeSync on most displays are too complicated for the majority of consumers. This leads to overall consumer dissatisfaction, because just plugging the monitor in and enabling FreeSync in its menu is usually not enough; there are often a number of Windows settings, in either display preferences or the drivers, that then have to be attended to, and that should be automatic.
If AMD wants the open standards to gain a significant foothold, they need to make them better than the proprietary ones in the eyes of consumers. That said, I am still a fan of their Premium Pro certified displays; the fact that they now also work with VRR on the PS5 is good, and I can use that. But AMD hasn't been doing itself any favors by being lax in verifying who puts the FreeSync logo on the side of their boxes. There are a lot more models out there with FreeSync proudly displayed on them than are listed in the spreadsheet AMD maintains on their site, and that is not a good thing for AMD or the consumer.
 
Last edited:
AMD states their HBM patents are for the 2.5D packaging technology, which is covered in the JESD235 standard. But sure, they probably had some hand in the specifications and input; even if their engineers were not directly involved, they probably had some outside role that wouldn't be made clear in the patents or whitepapers.

But that is still very different from AMD's failure to clarify or enforce their FreeSync branding of the VESA standards, forcing VESA to create new branding to supplant the FreeSync marks because consumers at large deemed them unreliable. I want AMD to do better here: the hoops you have to jump through to actually enable FreeSync on most displays are too complicated for the majority of consumers. This leads to overall consumer dissatisfaction, because just plugging the monitor in and enabling FreeSync in its menu is usually not enough; there are often a number of Windows settings, in either display preferences or the drivers, that then have to be attended to, and that should be automatic.
If AMD wants the open standards to gain a significant foothold, they need to make them better than the proprietary ones in the eyes of consumers. That said, I am still a fan of their Premium Pro certified displays; the fact that they now also work with VRR on the PS5 is good, and I can use that. But AMD hasn't been doing itself any favors by being lax in verifying who puts the FreeSync logo on the side of their boxes. There are a lot more models out there with FreeSync proudly displayed on them than are listed in the spreadsheet AMD maintains on their site, and that is not a good thing for AMD or the consumer.
If your argument is that you want them to be better with standards, then again, no problem with that. This doesn't change the fact that FreeSync works below 48 fps.
 
If your argument is that you want them to be better with standards, then again, no problem with that. This doesn't change the fact that FreeSync works below 48 fps.
FreeSync can work below 48, given the right set of conditions, hardware, and software. It will not always work, and AMD has done a very poor job of documenting what those conditions are, leaving it more often than not to forums like this one to figure that out for them. I mean, some of their newer 30-144 and 30-75 panels (I didn't know those were a thing until about 30 seconds ago) could work as low as 12 FPS, if the GPU has enough buffer room to facilitate it. But let's be real here, forget 12 FPS: if your machine is struggling to even reach 20 FPS, chances are the occasional screen tear is the least of your worries.
 
It's not software related. It will always work if your monitor is within a specific refresh range. As long as the math checks out on your monitor, it will be able to double the refresh rate to prevent tearing.

That's also true. The human eye starts to see the illusion of animation break down somewhere below 24 fps. If you're at 20, I would fix that.
 
It's not software related. It will always work if your monitor is within a specific refresh range. As long as the math checks out on your monitor, it will be able to double the refresh rate to prevent tearing.

That's also true. The human eye starts to see the illusion of animation break down somewhere below 24 fps. If you're at 20, I would fix that.
https://digitalmasta.com/amd-low-framerate-compensation-amd-lfc-explained/

According to AMD, it is based on monitoring repeated frames to evaluate the response rate above the monitor's minimum possible value.
Borrowing concepts from frame-recycling processes, the GPU has to "guess" when the next frame will be ready to determine the minimum suitable refresh rate, as well as repeating the same frame if it has to. From this it gets the adaptive algorithm that handles low-FPS situations.

Quite literally a driver function operating at a software level.

Nvidia's solution to this problem uses their hardware inside the monitor to essentially do the same thing, removing the burden from the PC in the process.

Not saying having it at a software level is bad; it's not an intensive process by any means. But AMD does not have access to the monitor's internals, so they had to duplicate the tech at a software level. That gives AMD far more flexibility and compatibility than Nvidia could ever achieve with their approach.
Sadly, though, one of the trade-offs is that Nvidia's approach does better at compensating for motion blur in the minimum/maximum FPS cases than AMD's can. But again, at such low FPS numbers that is a minor inconvenience.
 
Last edited:
https://digitalmasta.com/amd-low-framerate-compensation-amd-lfc-explained/

According to AMD, it is based on monitoring repeated frames to evaluate the response rate above the monitor's minimum possible value.
Borrowing concepts from frame-recycling processes, the GPU has to "guess" when the next frame will be ready to determine the minimum suitable refresh rate, as well as repeating the same frame if it has to. From this it gets the adaptive algorithm that handles low-FPS situations.

Quite literally a driver function operating at a software level.

Nvidia's solution to this problem uses their hardware inside the monitor to essentially do the same thing, removing the burden from the PC in the process.

Not saying having it at a software level is bad; it's not an intensive process by any means. But AMD does not have access to the monitor's internals, so they had to duplicate the tech at a software level. That gives AMD far more flexibility and compatibility than Nvidia could ever achieve with their approach.
Sadly, though, one of the trade-offs is that Nvidia's approach does better at compensating for motion blur in the minimum/maximum FPS cases than AMD's can. But again, at such low FPS numbers that is a minor inconvenience.
The GPU does this, not the CPU. If you're saying that the monitor doesn't have the FPGA, well, OK. But that's not exactly new, nor does it make it strictly software (meaning the CPU doing the work, which it isn't in this case). The "guess" point is also bizarre. Tons of things "guess" in gaming, like anti-aliasing, bilinear filtering, etc. Do you turn them off because they're driver-driven, aka software? I sure hope not. They're useful.

I haven't argued that G-Sync isn't superior. But I have argued against this bizarre tendency to pretend AMD has zero patents and FreeSync stops at 48 FPS. It's just nutzo and makes no sense.
 
Anandtech has a good article breaking down the tests of the new VESA certifications and what they do.

https://www.anandtech.com/show/1736...display-perf-standards-adaptivesync-mediasync
Nice, this alone should bring certified adaptive sync monitors close to or on par with G-Sync monitors:

For the G2G overshoot/undershoot tests, are you testing at one refresh rate or multiple refresh rates?

When running in Adaptive-Sync mode, the refresh rate (i.e., the speed at which the data is transferred, frame by frame, to the display) and the speed at which the display scan out is occurring is always at maximum refresh rate. When frames are being updated at less than the maximum refresh rate of the panel, this is not because the panel is running any slower, but because the vertical blanking interval (VBlank) timing between frame to frame has increased. Therefore, there is no reason to test G2G/Overshoot/Undershoot at anything other than maximum refresh rate as that’s the only rate the panel will be operating at when in Adaptive-Sync mode. If you were to exit from Adaptive-Sync mode and change the display timing to a fixed rate timing, then and only then does the display clock rate and scan-out time change, at which point different G2G performance may occur. However, this is outside of the Adaptive-Sync mode and not included within the VESA Adaptive-Sync Display test.
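The VBlank-stretching arithmetic VESA describes there is easy to illustrate with rough numbers (my own sketch, not part of VESA's test spec):

```python
# In Adaptive-Sync mode the panel always scans out at its max rate;
# lower frame rates just lengthen the vertical blanking interval.
def vblank_extension_ms(fps: float, max_hz: float) -> float:
    scanout_ms = 1000.0 / max_hz   # time the panel spends drawing each frame
    period_ms = 1000.0 / fps       # time between incoming frames
    return period_ms - scanout_ms  # extra blanking inserted between frames

# e.g. a 144 Hz panel fed 60 fps: the frame still scans out in ~6.9 ms,
# with roughly 9.7 ms of added VBlank filling the rest of the period.
print(round(vblank_extension_ms(60, 144), 2))
```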
 