No, GeForce Cards Are Not Suddenly "Playing Nice" with FreeSync Monitors

Megalith

24-bit/48kHz
Staff member
Joined
Aug 20, 2006
Messages
13,000
The Tech Report posted a story yesterday suggesting NVIDIA might have grown a conscience and quietly added FreeSync support to their drivers. An exciting prospect, and the report quickly took off, but the editor later realized it was fake news: Windows 10 was merely applying V-Sync to the games being tested, which were running in borderless windowed mode.

After further research and the collection of more high-speed camera footage from our G-Sync displays, I believe the tear-free gameplay we're experiencing on our FreeSync displays in combination with GeForces is a consequence of Windows 10's Desktop Window Manager adding some form of V-Sync to the proceedings when games are in borderless windowed mode, rather than any form of VESA Adaptive-Sync being engaged with our GeForce cards.
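
For anyone wondering what "the DWM adding V-Sync" looks like from a game's point of view, here's a minimal sketch of the DXGI present call involved (my own illustration, not anything from the article). The SyncInterval argument is how a game normally requests V-Sync; the point of the quote above is that in borderless windowed mode, DWM composes the frame and syncs it to the display itself, regardless of what the game passes here:

```cpp
// Sketch only: how a D3D/DXGI game opts in or out of V-Sync at present time.
#include <dxgi1_2.h>

void presentFrame(IDXGISwapChain1* swapChain, bool vsync)
{
    // SyncInterval = 1 waits for one vertical blank (classic V-Sync);
    // SyncInterval = 0 presents immediately (can tear, but only in
    // exclusive fullscreen -- in borderless windowed mode the DWM
    // composites the frame on its own schedule either way).
    swapChain->Present(vsync ? 1 : 0, 0);
}
```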
 
So the weird finagling of using an APU or a low-end AMD GPU alongside your high-end nVidia GPU isn't legit??
 
So the weird finagling of using an APU or a low-end AMD GPU alongside your high-end nVidia GPU isn't legit??

If only there were an article to read. Oh wait, I guess clicking a couple of times is hard.
 
Don't worry, NV will fix that annoying variable sync oddity right away.

Not gonna let any inferior tech ruin their image.

-edit-

Makes me wonder if this is related to Blizzard choosing not to allow Full Screen Exclusive with the new WoW expac? If it is a Windows-related feature, they could be choosing to force it due to other benefits as well. Hmmm...
 
If only there were an article to read. Oh wait, I guess clicking a couple of times is hard.


If only some people understood that other people may not have time to read an article, had just a few minutes to read snippets at the time, and were looking for a quick bit of info.

 

I don't get it. Borderless windowed is always V-Synced; how is that news to a tech guy?!
Yesterday, in a sterling bit of investigative journalism, The Tech Report discovered that Windows applies V-Sync to games running in Windowed mode. Come back next week when they will reveal that the V-Sync in question is... triple buffered!

In other news, Nvidia are still dicks, and still want you to pay $200 extra for their G-Sync monitors.
 
Yesterday, in a sterling bit of investigative journalism, The Tech Report discovered that Windows applies V-Sync to games running in Windowed mode. Come back next week when they will reveal that the V-Sync in question is... triple buffered!

In other news, Nvidia are still dicks, and still want you to pay $200 extra for their G-Sync monitors.

How can they lock you into an ecosystem if you get the features for free?
 
IMO, if they allowed it, I don't think it would hurt Nvidia's sales. If anything, I think people would buy more Nvidia cards.
 
I bet you think FreeSync isn't proprietary.

It's not proprietary. It is AMD's brand-name for an industry standard called VESA Adaptive Sync.

http://www.vesa.org/wp-content/uploads/2014/07/VESA-Adaptive-Sync-Whitepaper-140620.pdf
Intel will also be supporting it. (https://www.extremetech.com/gaming/...-to-support-vesa-adaptive-sync-in-future-gpus)

Anyone can implement it. In fact, nvidious uses it for their laptop G-Sync displays since there are no laptop displays with the proprietary module. They artificially disable support for it on desktops even though the hardware supports it because they want to gouge you for their overpriced proprietary g-sync modules.
 
It's not proprietary. It is AMD's brand-name for an industry standard called VESA Adaptive Sync.

http://www.vesa.org/wp-content/uploads/2014/07/VESA-Adaptive-Sync-Whitepaper-140620.pdf
Intel will also be supporting it. (https://www.extremetech.com/gaming/...-to-support-vesa-adaptive-sync-in-future-gpus)

Anyone can implement it. In fact, nvidious uses it for their laptop G-Sync displays since there are no laptop displays with the proprietary module. They artificially disable support for it on desktops even though the hardware supports it because they want to gouge you for their overpriced proprietary g-sync modules.

Freesync is an optional part of the DisplayPort specification; it's not supported across the board, and it has issues at very low/high refresh rates. As a solution, G-Sync is better.

Moot anyways; HDMI 2.1 is making VRR a part of the base specification.
 
I don't get it. Borderless windowed is always V-Synced; how is that news to a tech guy?!
"tech guy"

I bet you think FreeSync isn't proprietary.
It's not proprietary. It was DEVELOPED by AMD, but they explicitly made it an open standard. Nvidia can adopt it for no fee anytime they want.

Freesync is an optional part of the Displayport specification; it's not supported across the board, and has issues at very low/high refresh rates. As a solution, Gsync is better.
Probably, but is it better enough to justify an extra $200+ on a new monitor? Considering how Nvidia won't adopt it, I'm guessing the answer is "no."
 
It's not proprietary. It is AMD's brand-name for an industry standard called VESA Adaptive Sync.

http://www.vesa.org/wp-content/uploads/2014/07/VESA-Adaptive-Sync-Whitepaper-140620.pdf
Intel will also be supporting it. (https://www.extremetech.com/gaming/...-to-support-vesa-adaptive-sync-in-future-gpus)

Anyone can implement it. In fact, nvidious uses it for their laptop G-Sync displays since there are no laptop displays with the proprietary module. They artificially disable support for it on desktops even though the hardware supports it because they want to gouge you for their overpriced proprietary g-sync modules.
NVIDIA mobile GPUs utilize Adaptive Sync, not Freesync.
 
NVIDIA mobile GPUs utilize Adaptive Sync, not Freesync.

Yes, it does, just like FreeSync utilises Adaptive Sync as well. The problem for Nvidia is that FreeSync and Adaptive Sync have become one and the same for many people. If they ever decide to use Adaptive Sync in the desktop space, it will look like they are using AMD technology.
 
They artificially disable support for it on desktops even though the hardware supports it because they want to gouge you for their overpriced proprietary g-sync modules.

Agree with the rest of your post but, as far as I am aware, no Nvidia desktop GPU has the hardware needed to connect to an Adaptive Sync monitor. It's not something you can enable with software. I don't know about the Turing GPUs, but the Pascal cards didn't have it.

The reason it works on laptops is that there is no need for a proprietary module: Adaptive Sync is a mandatory part of the Embedded DisplayPort specification. It's only an optional part of the desktop DisplayPort specification.
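
Side note, since monitor-to-monitor refresh ranges keep coming up in this thread: the vertical refresh range a display supports is advertised in its EDID, in the Display Range Limits descriptor. A rough sketch of reading it (my own simplified illustration; real EDID 1.4 parsing also has offset flags that can extend these values, which I'm ignoring here):

```cpp
// Sketch: pull min/max vertical refresh (Hz) out of an EDID base block.
#include <cstdint>
#include <initializer_list>
#include <optional>

struct RefreshRange { int minHz, maxHz; };

std::optional<RefreshRange> readRangeLimits(const uint8_t edid[128])
{
    // The EDID base block holds four 18-byte descriptors at these offsets.
    for (int off : {54, 72, 90, 108}) {
        const uint8_t* d = edid + off;
        // Display descriptors start with three zero bytes; tag 0xFD marks
        // the Display Range Limits descriptor.
        if (d[0] == 0 && d[1] == 0 && d[2] == 0 && d[3] == 0xFD)
            return RefreshRange{ d[5], d[6] };  // min/max vertical rate, Hz
    }
    return std::nullopt;  // no range limits descriptor present
}
```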
 
... In fact, nvidious uses it for their laptop G-Sync displays since there are no laptop displays with the proprietary module. They artificially disable support for it on desktops even though the hardware supports it because they want to gouge you for their overpriced proprietary g-sync modules.

It isn't artificially disabled; there are extra processing components required for a display to support G-Sync.

The price charged for that is another story; it seems overpriced, since it adds ~$200 to a display's cost for the end user.

The only reason nVidia doesn't also support the FreeSync option is that it would put a dent in the market for the added components/licensing of G-Sync. That's the suck part.

***edit with more thoughts***

What nVidia should realize is that adding support for both types (in the video cards) makes their expensive cards more attractive. And there still could be a market for the better sync technology costing more in the display market... but I think that cost could be cut in half.

One question for anyone who might know more about these technologies: why can't a display manufacturer who is paying to put G-Sync support in an LCD also support FreeSync tech? Or can they?
 
It isn't artificially disabled; there are extra processing components required for a display to support G-Sync.

I don't think you understand what is being said. He is talking about Nvidia artificially disabling support for Adaptive Sync, not G-Sync.
 
Ahh, ok that makes sense.

Question still remains: could a G-Sync monitor also support FreeSync? They probably control the tech, though...
 
Yesterday, in a sterling bit of investigative journalism, The Tech Report discovered that Windows applies V-Sync to games running in Windowed mode. Come back next week when they will reveal that the V-Sync in question is... triple buffered!

In other news, Nvidia are still dicks, and still want you to pay $200 extra for their G-Sync monitors.


Except that... NVidia doesn't sell monitors (y)
 
I bet you think FreeSync isn't proprietary.


FreeSync is open source. Hell, it's so open source that any other company making gaming video cards could use it... if there were another company making gaming cards :sneaky:
 
FreeSync is open source. Hell, it's so open source that any other company making gaming video cards could use it... if there were another company making gaming cards :sneaky:

It's so open source that it is a crapshoot... This is a fairly good article explaining the pros and cons: https://www.pcworld.com/article/297...rate-displays-make-pc-games-super-smooth.html

tldr:
FreeSync pros: cheap; some support over HDMI.
FreeSync cons: only works in certain FPS (refresh rate) ranges, which can vary wildly from monitor to monitor.
G-Sync pros: best experience available; nVidia works with display manufacturers in all aspects, including panel selection, optimizing refresh rates, flicker properties, display quality, and more.
G-Sync cons: more expensive.

Each is proprietary to one of the GPU manufacturers' cards. The FreeSync tech is open source, so nVidia GPUs could theoretically support monitors with either sync type, but both sync technologies require hardware support in the silicon, how much of which or at what cost I do not know. The AMD GPUs likely cannot be made to use the G-Sync tech in displays, as nVidia isn't likely to license that hardware support in the silicon (in the GPU)...
They could get their tech more mainstream if they allowed that, but it would sell competitors' products at the expense of their own sales, so it's an unlikely business decision.
 
They sell the G-Sync chip that goes inside them and get a licensing fee. They profit from any G-Sync monitor being sold.

And monitor manufacturers must choose: will they license the tech and support NVidia's adaptive sync solution or not? Take Samsung, for instance, and their new Quantum Dot displays. Samsung made a decision to only support FreeSync in the initial version of these new displays. I think people need to put the right spin on this and not get the cart before the horse.

Acer, ASUS, Dell, AOC, LG, ViewSonic, and HP make G-Sync monitors; Samsung, along with a few others, does not.

If no display makers supported G-Sync, G-Sync would probably die out. It wouldn't be the first time a technology died off from lack of vendor support. It would certainly lower the cost of these monitors, so why do they continue to support NVidia's tech?

There is only one right answer: it's profitable. They do it because it is making them money. That means there is enough volume in these displays to warrant the licensing fees to NVidia and enough sales to make them a viable business decision. Of course NVidia makes their money off the deal, but so do the display manufacturers, or it wouldn't be an option at all.

If you think blame should fall somewhere, then it's the people who buy them that deserve your attention and no others.
 
And monitor manufacturers must choose: will they license the tech and support NVidia's adaptive sync solution or not? Take Samsung, for instance, and their new Quantum Dot displays. Samsung made a decision to only support FreeSync in the initial version of these new displays. I think people need to put the right spin on this and not get the cart before the horse.

Acer, ASUS, Dell, AOC, LG, ViewSonic, and HP make G-Sync monitors; Samsung, along with a few others, does not.

If no display makers supported G-Sync, G-Sync would probably die out. It wouldn't be the first time a technology died off from lack of vendor support. It would certainly lower the cost of these monitors, so why do they continue to support NVidia's tech?

There is only one right answer: it's profitable. They do it because it is making them money. That means there is enough volume in these displays to warrant the licensing fees to NVidia and enough sales to make them a viable business decision. Of course NVidia makes their money off the deal, but so do the display manufacturers, or it wouldn't be an option at all.

If you think blame should fall somewhere, then it's the people who buy them that deserve your attention and no others.
You were saying they don't sell monitors, as though Nvidia doesn't have a horse in the race. As for its profitability, my point was that I suspect it's only profitable if they ensure Nvidia cards don't support FreeSync. If they did, I think the G-Sync market would start evaporating, since while it's arguably better, it's not worth the price premium to most buyers and would likely stop turning a profit.

As for blame, look at it however you want. Consumers are easily manipulated. Always have been, always will be. Individual ones can be smart; as a whole, they're sheep. Many companies take advantage of that. That's about all there is to it.
 
It's so open source that it is a crapshoot... This is a fairly good article explaining the pros and cons: https://www.pcworld.com/article/297...rate-displays-make-pc-games-super-smooth.html

tldr:
FreeSync pros: cheap; some support over HDMI.
FreeSync cons: only works in certain FPS (refresh rate) ranges, which can vary wildly from monitor to monitor.
G-Sync pros: best experience available; nVidia works with display manufacturers in all aspects, including panel selection, optimizing refresh rates, flicker properties, display quality, and more.
G-Sync cons: more expensive.

Each is proprietary to one of the GPU manufacturers' cards. The FreeSync tech is open source, so nVidia GPUs could theoretically support monitors with either sync type, but both sync technologies require hardware support in the silicon, how much of which or at what cost I do not know. The AMD GPUs likely cannot be made to use the G-Sync tech in displays, as nVidia isn't likely to license that hardware support in the silicon (in the GPU)...
They could get their tech more mainstream if they allowed that, but it would sell competitors' products at the expense of their own sales, so it's an unlikely business decision.

G-Sync only works in certain FPS ranges as well.

You are probably confusing the refresh rate range with low framerate compensation. There are FreeSync monitors that have this.

See here :
https://www.amd.com/Documents/freesync-lfc.pdf
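
The basic idea in that whitepaper is frame multiplication: when the game's frame rate drops below the panel's minimum refresh, the driver shows each frame an integer number of times so the effective refresh lands back inside the VRR window. A rough sketch of the idea (my own simplification, not AMD's actual driver logic):

```cpp
// Sketch: the frame-multiplication idea behind Low Framerate Compensation.
#include <algorithm>

struct VrrRange { double minHz, maxHz; };  // e.g. {48.0, 144.0}

// Assumes gameFps > 0. LFC generally needs maxHz to be a decent multiple
// of minHz (roughly 2.5x per AMD) so a multiplied rate always fits.
double pickRefreshHz(double gameFps, VrrRange r)
{
    if (gameFps >= r.minHz)
        return std::min(gameFps, r.maxHz);  // in range: refresh tracks the game
    int multiple = 2;
    while (gameFps * multiple < r.minHz)    // e.g. 20 fps on a 48-144 Hz panel:
        ++multiple;                         // x3 = 60 Hz, each frame shown 3 times
    return std::min(gameFps * multiple, r.maxHz);
}
```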
 
It's so open source that it is a crapshoot... This is a fairly good article explaining the pros and cons: https://www.pcworld.com/article/297...rate-displays-make-pc-games-super-smooth.html

tldr:
FreeSync pros: cheap; some support over HDMI.
FreeSync cons: only works in certain FPS (refresh rate) ranges, which can vary wildly from monitor to monitor.
G-Sync pros: best experience available; nVidia works with display manufacturers in all aspects, including panel selection, optimizing refresh rates, flicker properties, display quality, and more.
G-Sync cons: more expensive.

Each is proprietary to one of the GPU manufacturers' cards. The FreeSync tech is open source, so nVidia GPUs could theoretically support monitors with either sync type, but both sync technologies require hardware support in the silicon, how much of which or at what cost I do not know. The AMD GPUs likely cannot be made to use the G-Sync tech in displays, as nVidia isn't likely to license that hardware support in the silicon (in the GPU)...
They could get their tech more mainstream if they allowed that, but it would sell competitors' products at the expense of their own sales, so it's an unlikely business decision.

Most of the time the usable FreeSync range is restricted by the monitor manufacturer, which lets Ngreedia praise the 'benefits' of a $200 module/price hike that does practically the same thing. That's why many users unlock their monitors for a greater sync range.
 
It's so open source that it is a crapshoot... This is a fairly good article explaining the pros and cons: https://www.pcworld.com/article/297...rate-displays-make-pc-games-super-smooth.html

tldr:
FreeSync pros: cheap; some support over HDMI.
FreeSync cons: only works in certain FPS (refresh rate) ranges, which can vary wildly from monitor to monitor.
G-Sync pros: best experience available; nVidia works with display manufacturers in all aspects, including panel selection, optimizing refresh rates, flicker properties, display quality, and more.
G-Sync cons: more expensive.

Each is proprietary to one of the GPU manufacturers' cards. The FreeSync tech is open source, so nVidia GPUs could theoretically support monitors with either sync type, but both sync technologies require hardware support in the silicon, how much of which or at what cost I do not know. The AMD GPUs likely cannot be made to use the G-Sync tech in displays, as nVidia isn't likely to license that hardware support in the silicon (in the GPU)...
They could get their tech more mainstream if they allowed that, but it would sell competitors' products at the expense of their own sales, so it's an unlikely business decision.

please stop spreading fake news
 
Probably, but is it better enough to justify an extra $200+ on a new monitor? Considering how Nvidia won't adopt it, I'm guessing the answer is "no."

They have no choice but to; it's part of the mainline specification.

The way I see it, once HDMI 2.1 comes out, DisplayPort will likely turn into a dead standard (see: FireWire for reference). We don't need two competing digital video output formats, and DisplayPort is MIA in televisions.
 
Freesync is an optional part of the DisplayPort specification; it's not supported across the board, and it has issues at very low/high refresh rates. As a solution, G-Sync is better.

Moot anyways; HDMI 2.1 is making VRR a part of the base specification.
FreeSync does not have issues with very high or low FPS. Just as an example, some FreeSync monitors may be worse/better at anti-ghosting with dynamic refresh rates than competing G-Sync displays, but that has nothing to do with the FreeSync technology. It has to do with the quality of the monitor.

For the record, I own a 144Hz FreeSync display and have observed no issues at high or low FPS.
 
If triple buffering is applied by Windows 10, is there really much point in having the ability to use G-Sync/FreeSync in non-exclusive fullscreen? Do the drivers circumvent Windows when G-Sync/FreeSync is working correctly?

I guess I will have to check my display's FPS counter next time I'm playing a game in windowed mode...
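
For what it's worth, here's a minimal sketch of the triple-buffered, flip-model swap chain setup that Windows 10 composition is said to approximate for borderless windowed games (my own illustration of the structure; nothing here is confirmed about what DWM actually does internally):

```cpp
// Sketch: a triple-buffered, flip-model DXGI swap chain description.
#include <dxgi1_2.h>

DXGI_SWAP_CHAIN_DESC1 makeTripleBufferedDesc(UINT width, UINT height)
{
    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;                       // no MSAA
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount = 3;                            // triple buffering
    desc.SwapEffect = DXGI_SWAP_EFFECT_FLIP_DISCARD; // flip model (Windows 10)
    return desc;
}
```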
 