General VRR question. My Samsung TV and Xbox One X support it, but my Marantz AVR is my hub. Is this a pass-through type of thing, or does the Xbox need to go directly to the TV?
Thanks.
HDMI 2.1 supports VRR, but:
1) does that mean panel makers must support it?
2) are there any standards? I appreciate that G-Sync mandated that G-Sync monitors be capable of 144Hz and LFC. Sure, FreeSync seemed a lot cheaper, but when you are comparing a 60Hz monitor to a G-Sync 144Hz monitor, why wouldn't you expect the G-Sync one to be more expensive?
I think AMD noticed this too and attached standards to the Freesync 2 certification.
My understanding is that being on the list is a formality. It just means that Nvidia has tested it and blessed it to work properly. Sort of like a Microsoft WHQL driver.
The way I read it, you should be able to go into the driver settings and manually enable it for any VRR or FreeSync display, as long as you have a 10xx or 20xx GPU.
Huh?
Nvidia has been annoying lately in too many ways.
But this 'looks' to be a great move, even if they were forced in this direction.
They don't need to provide an option that lets any VRR display try the tech, but they are.
Their G-Sync HDR series displays are missing a trick though.
They should push at least 2,000 nits to be the best of the best.
Like I stated before, guaranteed Nvidia is going to pull some shenanigans with this. Most likely with branding: if you want to put the G-Sync logo on your box, you can't have the FreeSync logo (or even mention FreeSync support at all), or Nvidia may even go so far as to force "qualified monitors" to strip out support for FreeSync altogether, making it difficult or impossible to enable on AMD cards.
Call me pessimistic, but I simply don't see Nvidia as the 'generous' type.
AMD Radeon FreeSync 2 support
The CHG90 supports AMD’s new Radeon FreeSync 2 technology
Reading the interwebs, there are some folks who are especially upset, namely those who recently bought a G-Sync monitor to replace a FreeSync one. Hard to hurt someone who has already taken the shaft.
Too bad my BenQ XL2540 isn't on the list, but if it can be turned on unofficially anyway, that shouldn't matter. Quite surprised by this move. For me personally it won't matter that much, as I simply prefer running 144Hz + BenQ Blur Reduction (strobing) on mine (I prefer it over 240Hz non-strobed for better contrast/image quality and smoother motion).
nVidia bought PhysX, and were hoping to capture the market with Physics offloading tech. But most of the industry didn't care.
Nvidia rolled out G-Sync to try and capture the dynamics sync refresh market, but it's been expensive, and most gamers don't really buy into it.
You have no idea what you are talking about with PhysX.
Most gamers don't really buy into higher-tier cards either, but Nvidia sells a bunch of them, don't they? G-Sync wasn't developed for all gamers. It was developed for higher-tier gamers running fast cards. Hence the custom scaler and higher-end panels.
Appears it only works on 10 and 20 series GPUs, even though nVidia has supported it since the 600 series. Ok, they want to sell cards...
Does it require a "qualified" monitor, or can you actually enable it on any Freesync/VRR with YMMV?
My understanding is that while VRR and Freesync work very similarly, G-Sync works differently, and uses some sort of hardware acceleration.
Presumably with FreeSync and VRR you wind up with more GPU or CPU load, but that is unclear to me.
Either way, without an Nvidia blessing, I doubt it would have been as easy as editing an INF file.
^ Damn, that has to hurt. That's why I went FreeSync; the price difference is so big you can basically buy two quality FS monitors for a single G-Sync monitor. I went with one 32" 4K for $350 (but 60Hz and VA) and a 144Hz 27" 1440p IPS for $400; it's really nice to be able to switch from 4K to 144Hz depending on what game I was playing. I had surround PG278Qs, which worked pretty well with SLI'd 980 Tis, but the single DP connection really hindered work.
I think the existence of G-Sync on notebooks proves you wrong. It uses Adaptive-Sync instead of the G-Sync module, and it worked before the GeForce 1000 series. It only works on 10-series and newer GPUs because the older cards don't have the hardware needed to connect to an Adaptive-Sync monitor.
I think the existence of G-Sync on notebooks proves you wrong. It uses Adaptive-Sync instead of the G-Sync module, and has worked before Geforce 1000 series.
DisplayPort is a superset of embedded DisplayPort since 1.2.

It's a power-saving feature that's part of the Embedded DisplayPort specification.
But the silicon is identical between desktop and mobile.

This hardware is not on desktop GPUs, as they don't need that kind of power savings. The desktop DisplayPort specification didn't have any such requirements until adaptive sync was made an optional part of the standard in 2014.
It sounds like he does know what he's talking about with regard to PhysX.
PhysX technology is used by game engines such as Unreal Engine (version 3 onwards), Unity, Gamebryo, Vision (version 6 onwards), Instinct Engine,[27] Panda3D, Diesel, Torque, HeroEngine and BigWorld.[19]
As one of the handful of major physics engines, it is used in many games, such as The Witcher 3: Wild Hunt, Warframe, Killing Floor 2, Fallout 4, Batman: Arkham Knight, Borderlands 2, etc. Most of these games use the CPU to process the physics simulations.
Nvidia rolled out G-Sync to try and capture the dynamics sync refresh market, but it's been expensive, and most gamers don't really buy into it.
So what does this mean for consumers? Are G-Sync monitors going to slowly go away in favor of Freesync 2? I don't think I'd buy another one now.
Interesting video I saw in the Display forum that shows what can happen when enabling G-Sync on a monitor that's not on the validated list...
It's funny how tech journalists fall for Nvidia marketing. I have the third monitor (if it is the LG 144Hz, as it seems to be); the blinking never happened on my previous 480 or on my current Vega 64. It's happening on the Nvidia card because it is locked under the FreeSync range, which starts at 50Hz on this monitor. Edit: To clarify, what I meant is that the Nvidia driver is forcing the monitor to go below its minimum refresh rate instead of turning off FreeSync below 50Hz.
AMD is aware of the supported FreeSync range of the screens and turns it off when outside of the limits. Nvidia, on the other hand, would have you believe it's a monitor issue, but they are driving the screen outside of its supported Hz range, and that will cause issues with any screen. Custom controls over the Adaptive-Sync range would be needed for the user to set up the correct range, or Nvidia could do some research and use the right ranges for the screens.
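For anyone unfamiliar with how a driver keeps a panel inside its range, the usual trick is Low Framerate Compensation: below the panel's minimum VRR rate, each frame is shown multiple times so the effective refresh stays in the supported window. A minimal sketch of the idea (the function name and structure are mine, not AMD's actual driver logic):

```python
# Minimal sketch of the idea behind Low Framerate Compensation (LFC):
# when the frame rate drops below the panel's minimum VRR rate, repeat
# each frame so the panel never runs outside its supported range.
import math

def effective_refresh(fps: float, v_min: float, v_max: float) -> float:
    """Pick a refresh rate inside [v_min, v_max] for a given frame rate."""
    if fps >= v_min:
        # Inside the range: match refresh to frame rate, cap at the max.
        return min(fps, v_max)
    # Below the range: show each frame n times so the effective
    # refresh lands back inside the supported window.
    n = math.ceil(v_min / fps)
    refresh = fps * n
    if refresh > v_max:
        # LFC generally needs v_max >= 2 * v_min to have room to work.
        raise ValueError("VRR range too narrow for frame multiplication")
    return refresh

# On a hypothetical 50-144Hz panel, 30fps content is shown at 60Hz
# (each frame twice) instead of forcing the panel below its 50Hz floor.
```

This is exactly the failure mode described above: without frame multiplication, a driver that simply tracks the frame rate will push a 50Hz-minimum panel out of spec whenever the game dips into the 40s.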
Not sure if I understood correctly, but that latter monitor that was blinking: did it do the same with Radeon cards? Since English is not my native language, it is sometimes hard to make full sense of sentences when someone speaks quickly. Anyway, if that is the case, then the screen is just plain faulty, or worse, badly made with FreeSync glued on top for marketing purposes. And the other one that was blurring badly, I think that was pretty much the hallmark of the poor first wave of FreeSync monitors, IIRC.
*edit* Watched that part again with auto subtitles, and yeah, the blinking monitor is just shitty. So I guess we can say it is official: Nvidia will now support FreeSync much like AMD. But out of the sea of crappy FreeSync screens, it is up to the customer to find a good one.
Are you sure, with statements like "nVidia bought PhysX, and were hoping to capture the market with Physics offloading tech. But most of the industry didn't care."
Let's dissect that statement first. Actually, it's easier to just flip it: NVidia bought PhysX because they don't want to do anything to try and succeed as a business? It's like, "D'oh!" Of course NVidia is working to increase market share. But he continues by saying that the industry didn't care...
That sounds like nobody adopted PhysX, but that's far from true;
That list of games leaves out a lot of other titles. What's more, now that NVidia has Open Sourced PhysX, it's even more common.
So the industry did care, PhysX has been widely adopted, it even crossed to the console makers. Of course NVidia did it for the purpose of gaining market share.
Now let's look at his statement about G-Sync;
The "dynamics sync refresh market" ?
WTF is this? Let me explain my problem with this comment this way: what discrete graphics card manufacturer represents the non-dynamic sync refresh market? It's not a market; that is just a non-existent thing you made up. There is no such thing. There is a discrete graphics card market; AMD and NVidia both have offerings that support Adaptive Sync, and each leverages what it can to try and secure as much of that market as they are able.
Did NVidia come out with G-Sync in order to gain market share? Of course it did. There was no other adaptive sync option available on the market in 2013/2014 when NVidia released G-Sync. FreeSync wasn't released until March 2015, so for over a year G-Sync was the only horse in the race for adaptive sync. FreeSync has not always been the cheaper option by default, because not everyone is in a blank-slate position when deciding to buy into the technology. FreeSync monitors are cheaper than G-Sync monitors, but the total solution has at times been more expensive, particularly if a customer already owned a good NVidia card. Everyone is in their own unique situation.
Overall, I'm not seeing any time prior to late 2015 when NVidia was struggling for market share in the dedicated graphics card market, except for that small dip in 2010. Back when NVidia picked up PhysX in 2008, it had the lead by a recognizable margin, and over the seven years after the purchase it extended that lead into a dominant position. That has changed some, but it remains a 70/30 split in NVidia's favor.
Now, I am not claiming that adding PhysX and G-Sync to their arsenal were the keys to NVidia's success. I'm simply pointing out that NVidia already had a superior market share position, and since PhysX and G-Sync they have strengthened that position substantially. I'm thinking any change in market share following late 2015 is at least in some part due to crypto mining, but by this time PhysX is not really part of the picture, as it is now open source, and G-Sync now supports Adaptive Sync over the new HDMI standard while maintaining its G-Sync capability through the DisplayPort standard.
And here is a very good article on why manufacturers have a harder and more expensive time with G-Sync monitors; it's not just the cost of the G-Sync module and licensing:
https://www.pcworld.com/article/312...ia-g-sync-on-monitor-selection-and-price.html
EDIT: And I must apologize if this rambles or at any time seems disjointed; I was distracted several times while writing it.
*edit2* A few quotes from the YouTube comments:
So let me get this straight, Nvidia really can't help themselves but to do SOMETHING to make an otherwise good thing appear worse. Nvidia being Nvidia again.
i bet there was a loop hole in the driver where someone knew about this and basically nvidia came clean
only 12 monitors i though 400 would work
something fishy
Lol, my reading must not have been good, because I totally missed the part where he stated nobody cares about PhysX.
I saw a couple comments saying they had flickering with that monitor on AMD cards. AMD does some driver level tweaks on Freesync monitors to make some of them work better I believe. Some of those flickering issues can be dealt with by changing monitor settings or doing other tweaks. Nvidia is likely running the monitors mostly stock in their testing, relying on monitor manufacturers to be telling the truth about the specs and capabilities of the panels. If Nvidia wanted to do driver level monitor-by-monitor tweaks they probably could ensure that all Freesync monitors worked as well on their cards as they do on AMD’s.
DisplayPort is a superset of embedded DisplayPort since 1.2.
But the silicon is identical between desktop and mobile.
Forgive me if I am wrong, but wasn't the reason you can enable G-Sync on laptop displays primarily that eDP already had VRR tech built in as a power-saving measure, so G-Sync piggybacked on that tech? No such thing existed for desktop monitors (power saving simply isn't a problem there), so nVidia was essentially forced to make their own modules in order to do what they wanted.
I'm surprised this thread only has 3 pages as this is a huge step in the right direction for Nvidia. The G-SYNC "fee" or "tax" or whatever you would like to call it was very unappealing and also being limited to either FreeSync if you had AMD or G-SYNC if you had Nvidia was very anti-consumer. Supporting adaptive sync monitors outside of G-SYNC certified ones gives consumers more purchasing power and an option to avoid premiums associated with G-SYNC certified monitors.
However, I am slightly upset to see that out of 400 monitors they tested only 12 passed. Makes me wonder if Nvidia is purposely making the validation process extremely strict (more than it needs to be) on these monitors.
I'm surprised this thread only has 3 pages as this is a huge step in the right direction for Nvidia.
That sounds all logical, until you realize that (1) desktop and mobile GPUs are the same silicon, and (2) DP and eDP are packet-based protocols, and all that information is encoded in the packets. If you can output VRR via eDP, then you can also output VRR via DP, it's that simple.

See my last post and try to understand it this time.
Laptops/APUs that use eDP have supported VRR since 2009 for power saving. There is hardware required to make VRR work, and as part of the eDP specification you have to install that hardware if you put a GPU into a laptop. This is why Nvidia G-Sync works on laptops without a module.
Desktop GPUs that use DisplayPort follow the DP specification. DP did not support VRR until 2014, and even then it was only made an optional part of the standard. The timing controller, frame buffer, etc. that you need to make VRR work isn't required on a desktop GPU. And before the 260X launched in September 2013, no desktop GPU had this hardware installed. Pascal is the first desktop GPU from Nvidia with the needed hardware.
That sounds all logical, until you realize that (1) desktop and mobile GPUs are the same silicon, and (2) DP and eDP are packet based protocols, and all that information is encoded in the packets. If you can output VRR via eDP, then you can also output VRR via DP, it's that simple.
You can also connect a VRR-capable eDP display to a Ryzen APU (not sure about dGPUs) and bam! You get FreeSync. (If the display's EDID does not advertise Adaptive-Sync, you may need to fake it through CRU or similar.)
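For the curious, "faking it through CRU" amounts to editing the monitor's EDID override so the vertical refresh range the driver reads is what you want. A rough sketch of the mechanics, assuming a standard 128-byte EDID base block (the function name is mine, and CRU's actual internals differ; this only shows where the range bytes and checksum live):

```python
# Hedged sketch: patch the vertical refresh range in a 128-byte EDID
# base block and fix the checksum, roughly what an EDID-override tool
# does. Offsets follow the EDID Display Range Limits descriptor (0xFD).

def patch_vrr_range(edid: bytes, v_min: int, v_max: int) -> bytearray:
    edid = bytearray(edid)
    # The four 18-byte descriptor slots start at byte 54.
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        # Display Range Limits descriptor header: 00 00 00 FD
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            edid[off + 5] = v_min   # min vertical rate, Hz
            edid[off + 6] = v_max   # max vertical rate, Hz
            break
    else:
        raise ValueError("no Display Range Limits descriptor found")
    # Checksum: all 128 bytes of the base block must sum to 0 mod 256.
    edid[127] = (-sum(edid[:127])) % 256
    return edid
```

The checksum step is the part people forget when hand-editing: a block whose bytes don't sum to zero mod 256 is rejected by most drivers.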
And since version 1.2, DisplayPort is a superset of eDP. And this is not just theory: people actually tried running iPad eDP screens off their graphics cards' DP outputs. And guess what? With DP 1.1 that didn't work, but with DP 1.2 it worked with minimal circuitry.

Some of the specifications of the eDP standard...

All I'm saying is that if your hardware already supports VRR via eDP, then your hardware has everything it needs to support VRR via DP.

...for some of those specifications, certain hardware is needed.

It just proves that FreeSync/Adaptive-Sync on DP and FreeSync/Adaptive-Sync on eDP are functionally the same.

Your last statement just proves my point. Of course you can use FreeSync with a Ryzen APU; all APUs use the eDP standard, and that's why they work with Adaptive-Sync.
You are wrong here too. Kaveri is second-generation GCN. There are no APUs of first-generation GCN (Bobcat is VLIW5, Trinity/Richland are VLIW4, and Kaveri/Kabini are GCN2).

That's why, when FreeSync launched, people wondered why the 7970 and 7950 didn't support it but cheap Kaveri APUs did. They are both first-generation GCN.