Is it time for Nvidia to support Freesync?

You don't need to use GeForce Experience if you don't want to; they never tied the drivers to it. They said they would, but I think the kickback from that made them stop.

I was under the impression that you needed GFE to auto-update the drivers?

Either way, the GFE shit is annoying. Perhaps I'll do a fresh install and get rid of that crap on my HTPC if I can still get automatic updates.
 
Except for Shadowplay, one must use GeForce Experience and all the crap, spam, tracking, etc. that goes along with it. With RTG, ReLive is transparently supported in the drivers. Game profiles with Nvidia drivers take forever to update every time they're opened, and they're limited, whereas with RTG I can adjust clock speeds, voltages, etc. as well. Interface-wise, Nvidia is clunky and somewhat ugly, amateurish in my view. I am beginning to hate GeForce Experience as much as Raptr (which is being discontinued, with the site being closed down) in the past. I want one function, Shadowplay, and with Nvidia it's tied to a whole bunch of crap.

Yes, for automatic updates you need the GFE crap loaded, with its spamming, advertising, etc.
 
I was under the impression that you needed GFE to auto-update the drivers?

Either way, the GFE shit is annoying. Perhaps I'll do a fresh install and get rid of that crap on my HTPC if I can still get automatic updates.


You won't get automatic updates without it, but you can set up your Nvidia account to email you about driver updates.
 
Count me as one that doesn't mind GFE. I'm just not that picky about individual game settings in most games, and GFE generally picks good settings and makes it easy to bump them in a different direction if needed.
 
Got your timelines wrong, man.

In 2013, I think it was Oct or Nov, nV announced Gsync and demoed it on desktop monitors. In January 2014 AMD announced Freesync and demoed it on a laptop. It wasn't till around January 2015 that AMD finally showed Freesync desktop monitors off at CES; it was more than a year after, about 1.25 years.

Coincidentally, it took nV about a year to get Gsync to a working product as well, if we are to believe Tom Petersen. Tom even stated that to get it to work on laptops they could have done it fairly easily, much sooner. AMD even mentioned this later on too.

So nV started on Gsync in early 2012 or the end of 2011, released the first working demos on desktop monitors at the end of 2013, and started selling DIY kits in early 2014. That's the same time frame it took AMD to get from the demo on a laptop, which came soon after Gsync was shown off on desktop, to showing and releasing desktop monitors with Freesync.

Yeah, for AMD it was a reactionary project. They knew they could do it on desktops, but they needed to get the scaler tech for desktop monitors in there, and this is where the extra cost comes in. It's not an extra cost to the end user from AMD; it's an extra cost to the display manufacturers, which in turn increases the price of the monitors, but not by much, talking like 50 bucks for the higher-end ones with a better feature set. AMD is not making money on this, and that is why it's "free"; display manufacturers, on the other hand, get a little bit extra.

Now back to timelines: in early to mid 2015, the first monitors with Gsync included were released; it wasn't till early Q1 2016 that the first AMD Freesync monitor got released.

So all the timelines for Freesync were one year behind Gsync for desktop parts. That shows the development timeline right there: a year or so for Freesync. So it all lines up.


Sorry man, my timeline is perfect. Here it is.

September 2013 - AMD announces new cards: R9 290, R9 290X, R7 260X and R7 260
Early October 2013 - R7 260X released.
Mid October 2013 - Nvidia's first Gsync demo. R9 290 cards released.
November 2013 - AMD sends final proposal to VESA.
January 2014 - CES - AMD does Freesync demo on a laptop, Nvidia shows off Gsync monitors, DIY Gsync kit goes on sale.
June 2014 - Computex - AMD demos Freesync on monitors.
August 2014 - First Gsync monitor available for sale: the Asus ROG Swift.
January 2015 - CES - 7 monitors with Freesync at the show.
March 2015 - BenQ releases the first Freesync monitor. It was released early; the launch was not supposed to be until April with the other Freesync monitors.
Late May 2015 - Ghosting issues fixed. (To me this is the actual proper release date of a working product.)
November 2015 - AMD releases LFC driver.

Freesync for AMD wasn't a reactionary product. They were definitely working on it before the Gsync demo. They were probably caught well off guard by Nvidia's Gsync demo. Maybe they didn't think Nvidia was working on a VRR tech, or maybe they thought they had more time.

Do you really think that AMD somehow managed to put together a complete VRR solution between October 2013 and November 2013? Not only come up with a solution, but also come up with a proposal to put to VESA with all the technical ins and outs. And remember, with VESA there are a couple of stages to go through before the final proposal. Is that possible for a company that moves as slowly as AMD?

But let's say that they managed that. How did they manage to go back in time and make the cards already released compatible with the DisplayPort 1.2a Adaptive-Sync standard (an optional standard not yet available, and it wouldn't be until May 2014)?

I know why it's called Free: no royalties have to be paid to AMD or VESA for monitor manufacturers to use it. Address this part of your post to those people (idiots) who keep rabbiting on about Freesync not really being free as you still have to buy the monitor.


I posted earlier that I found a specific post from 2008 on VRR which in almost every way mirrored how it works today, not specific to VBlank but to changing the monitor refresh rate. I am not sure we can be certain who began the tech, only who showed up first. But being first to market doesn't necessarily mean being first to have the idea.

Your take was quite interesting as well.


Well, actually, I am sort of cheating throughout this discussion, as I know I am right on this :) AMD did a question-and-answer session on the Overclockers UK forums shortly after the release of Freesync. It was one of the questions asked. The guy from AMD said that they were working on Freesync throughout the development of their Hawaii cards. Of course he could be lying, but I'm not sure why he would; Nvidia still beat them to market. Who came up with the idea first, who cares really? It's quite possible that both companies began working on a VRR solution independently. That's the point that I have been trying to get across here. There are just too many signs pointing to AMD working on a VRR solution before the Gsync demo.

At the end of the day, being first to market is usually the most important thing.
 
Sorry man, my timeline is perfect. Here it is.

September 2013 - AMD announces new cards: R9 290, R9 290X, R7 260X and R7 260
Early October 2013 - R7 260X released.
Mid October 2013 - Nvidia's first Gsync demo. R9 290 cards released.
November 2013 - AMD sends final proposal to VESA.
January 2014 - CES - AMD does Freesync demo on a laptop, Nvidia shows off Gsync monitors, DIY Gsync kit goes on sale.
June 2014 - Computex - AMD demos Freesync on monitors.
August 2014 - First Gsync monitor available for sale: the Asus ROG Swift.
January 2015 - CES - 7 monitors with Freesync at the show.
March 2015 - BenQ releases the first Freesync monitor. It was released early; the launch was not supposed to be until April with the other Freesync monitors.
Late May 2015 - Ghosting issues fixed. (To me this is the actual proper release date of a working product.)
November 2015 - AMD releases LFC driver.

Freesync for AMD wasn't a reactionary product. They were definitely working on it before the Gsync demo. They were probably caught well off guard by Nvidia's Gsync demo. Maybe they didn't think Nvidia was working on a VRR tech, or maybe they thought they had more time.

And why was nV able to get it out a full year before AMD if AMD was working on it before? Is AMD not as capable as nV of getting things done? You think they were waiting on the spec to be ratified? They didn't need to worry about that; that's why the Freesync monitors are all over the place. They could have started talking to the monitor manufacturers well before the ratification process even began. That is the way I would do it if I were running AMD; I have done it on numerous projects: just get it done first, and then push the committees to bend.

Do you really think that AMD somehow managed to put together a complete VRR solution between October 2013 and November 2013? Not only come up with a solution, but also come up with a proposal to put to VESA with all the technical ins and outs. And remember, with VESA there are a couple of stages to go through before the final proposal. Is that possible for a company that moves as slowly as AMD?

On a laptop, yeah. How long do you think drivers take to get up and going? This isn't as complex as a GPU; for a GPU it takes about 8 months to a year to get final release drivers, not more than that.

But let's say that they managed that. How did they manage to go back in time and make the cards already released compatible with the DisplayPort 1.2a Adaptive-Sync standard (an optional standard not yet available, and it wouldn't be until May 2014)?

You think the GPU needs modification for this tech to work? Err, no it doesn't; it's just the DisplayPort spec they used. It's all about the DisplayPort tech and version: as long as it has 1.2a it can do it. That is why the 280X and a few others can't do it. So yeah, they were working on it right around the same time Nvidia released their GPU. 1.2a was ratified sometime in 2012? 1.3 was in 2014, so it had to be done well before then; 1.2 was 2009.

I know why it's called Free: no royalties have to be paid to AMD or VESA for monitor manufacturers to use it. Address this part of your post to those people (idiots) who keep rabbiting on about Freesync not really being free as you still have to buy the monitor.

It's not free; it's got extra cost. Not much, but it's still there. And that is what everyone else has been saying.

Well, actually, I am sort of cheating throughout this discussion, as I know I am right on this :) AMD did a question-and-answer session on the Overclockers UK forums shortly after the release of Freesync. It was one of the questions asked. The guy from AMD said that they were working on Freesync throughout the development of their Hawaii cards. Of course he could be lying, but I'm not sure why he would; Nvidia still beat them to market. Who came up with the idea first, who cares really? It's quite possible that both companies began working on a VRR solution independently. That's the point that I have been trying to get across here. There are just too many signs pointing to AMD working on a VRR solution before the Gsync demo.

They were working on the specifications. The GPU itself has nothing to do with it; it just needs DP 1.2a, that's it.

It doesn't matter in the end; as you said, nV came out with it first and with an overall better solution with better specifications. AMD needs to do that with Freesync 2. If they can, that will help them gain market share; if they can't, their cards aren't going to do anything by themselves right now.
 
Guaranteed? No, but I'll bet money on it. 50 bucks says that if a monitor passes Nvidia's test to have the "G-Sync Compatible" branding, it won't be allowed to have the AMD Freesync branding. Essentially, Nvidia is going to try to get monitor manufacturers to drop any Freesync branding, because it was actually mildly successful for AMD.
 
I honestly think they only validated some monitors because those brands seem closely aligned with Gsync. Anyone else notice that? Those seem to be the primary Gsync brands. Maybe Nvidia just pulled some kind of internal deal to push more of those monitors to make the top Gsync panel makers happy. On top of that, they just left the force option open. I really don't believe this claim that they actually tested 400 monitors, lol, and only 12 passed. Seems like another marketing stunt to push certain monitors.
 
I wouldn't say BenQ is a major G-Sync brand though.

I actually noticed the number of TN monitors more than the manufacturers.
 
It's good news for gamers with those screens and an Nvidia card of whom I'm sure there are quite a few.
 
I honestly think they only validated some monitors because those brands seem closely aligned with Gsync. Anyone else notice that? Those seem to be the primary Gsync brands. Maybe Nvidia just pulled some kind of internal deal to push more of those monitors to make the top Gsync panel makers happy. On top of that, they just left the force option open. I really don't believe this claim that they actually tested 400 monitors, lol, and only 12 passed. Seems like another marketing stunt to push certain monitors.

You can manually activate it on all VRR monitors now, not just certified ones.
 
You can manually activate it on all VRR monitors now, not just certified ones.

I know that. I was just talking about their tested monitors specifically. It seems like they left the Samsung 32-inch HDR monitor out, likely because Samsung doesn't make any Gsync monitors.
 
I really don't believe this claim that they actually tested 400 monitors, lol, and only 12 passed. Seems like another marketing stunt to push certain monitors.

Why's that hard to believe? We know most Freesync monitors are straight trash already. They mostly don't support variable overdrive, for example. Most of the supported monitors are TN; it's entirely possible that without variable overdrive, Nvidia considers the response time out of spec for many panels at certain refresh rates. How many don't support low framerate compensation, or only support it over a poor range?

There are lots of legitimate reasons to reject Freesync monitors as not good enough, because most aren't.
 
I honestly think they only validated some monitors because those brands seem closely aligned with Gsync. Anyone else notice that? Those seem to be the primary Gsync brands. Maybe Nvidia just pulled some kind of internal deal to push more of those monitors to make the top Gsync panel makers happy. On top of that, they just left the force option open. I really don't believe this claim that they actually tested 400 monitors, lol, and only 12 passed. Seems like another marketing stunt to push certain monitors.
Most Adaptive Sync screens are trash. I'm not surprised if only those few offer a comparable experience.
 
I know that. I was just talking about their tested monitors specifically. It seems like they left the Samsung 32-inch HDR monitor out, likely because Samsung doesn't make any Gsync monitors.
You mean we will be getting more Nvidia propaganda about which brands are cool and which ones suck. Next thing you know, people will be paying extra because Nvidia-approved monitors are so cool ;)
Why's that hard to believe? We know most Freesync monitors are straight trash already. They mostly don't support variable overdrive, for example. Most of the supported monitors are TN; it's entirely possible that without variable overdrive, Nvidia considers the response time out of spec for many panels at certain refresh rates. How many don't support low framerate compensation, or only support it over a poor range?

There are lots of legitimate reasons to reject Freesync monitors as not good enough, because most aren't.
There is choice in that market, nothing that will burn a hole in your wallet; better stuff costs more. This is not a Freesync-only deal; this used to be the case before Freesync ever came to market.

Freesync is not something that defines how much money you have to spend, unlike Gsync, where you need to spend money in order to get anywhere. And guess what, there are terrible Gsync monitors too (blew your mind right there, didn't I?).

But at least Nvidia's branding is subjective, maybe even good enough for them to raise prices on approved Nvidia monitors; that is the thing we have all been waiting for....
 
I honestly think they only validated some monitors because those brands seem closely aligned with Gsync. Anyone else notice that? Those seem to be the primary Gsync brands. Maybe Nvidia just pulled some kind of internal deal to push more of those monitors to make the top Gsync panel makers happy. On top of that, they just left the force option open. I really don't believe this claim that they actually tested 400 monitors, lol, and only 12 passed. Seems like another marketing stunt to push certain monitors.

Saying most Freesync monitors are trash or substandard isn't exactly a hot take. Most of them have serious limitations in one way or another, especially in terms of their supported range. What the validated monitors have in common isn't the brands, it's the quality. They are all high-quality, highly rated, popular Freesync monitors with feature sets that match up to their G-Sync counterparts. Most likely, the only monitors that pass are going to be the best of the best Freesync panels that can stack up against G-Sync panels.

I doubt they spent hours and hours on each of 400 monitors, but I wouldn't be surprised if they looked at them all in some way. A lot could probably be tossed out early or right away just based on specs, while many more would fail super quick due to simply being garbage despite decent-sounding specs.
 
I'm going to speculate that the 400 tested monitors number is inflated and out of context. I'm assuming a minimum validation requirement is likely going to be enough of a range to properly support frame doubling (or LFC, in AMD terms). A huge chunk of existing VRR displays would not qualify for that (e.g. 48-60/75, 90-120/144, etc.).

If you use AMD's Freesync display page as a source, that immediately filters out, I think, 350+ displays. All those were likely "tested" in the bare-minimum sense and failed.

After that, I would guess the testing becomes more involved and likely more time- and resource-consuming depending on what they are looking for, as specific edge-case visual artifacts would be trickier to find than just verifying the range.
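The first-pass range filter speculated about above can be sketched in a few lines. The monitor names and ranges below are made up for illustration, and the 2.5x threshold is the ratio AMD's LFC whitepaper cites (other posts in this thread use 2x):

```python
# Sketch of a first-pass filter: keep only displays whose VRR range
# is wide enough for frame multiplication (LFC-style frame doubling).
# Monitor data here is hypothetical, for illustration only.

LFC_RATIO = 2.5  # ratio cited in AMD's LFC whitepaper; some posts say 2x

displays = [
    ("Office 75Hz",   48, 75),   # (name, min Hz, max Hz)
    ("Gaming 144Hz",  48, 144),
    ("TV VRR",        40, 60),
    ("Esports 240Hz", 48, 240),
]

# A display qualifies when its max refresh is at least 2.5x its min.
qualified = [name for name, lo, hi in displays if hi >= LFC_RATIO * lo]
print(qualified)  # ['Gaming 144Hz', 'Esports 240Hz']
```

Applied to AMD's real display list, a check like this would indeed knock out the 48-60/75 and 90-144 class of ranges mentioned above without any hands-on testing.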

Another aspect of this to consider is the branding battle for VRR and how that will play out. Technically speaking, Nvidia is not supporting Freesync as the thread title suggests; only AMD supports Freesync, and that's known to continue going forward.

Nvidia will likely want to push the association of G-Sync with VRR, or at least push the association toward neutral terms like Adaptive Sync and Variable Refresh Rate, whereas the AMD side will want to continue the existing association with Freesync. This will have implications for how displays market themselves and whether they get branding.
 
I think we all know why this happened. Freesync/Adaptive Sync is now part of the HDMI 2.1 standard, and more and more televisions are adding VRR support. Nvidia does not really have a choice but to support it at some point if they want to keep adding the latest HDMI ports to their cards. Nvidia is just trying to spin things in their favor to make it look like they came up with the idea. Thank you, console and TV manufacturers, for this positive development.
 
Saying most Freesync monitors are trash or substandard isn't exactly a hot take. Most of them have serious limitations in one way or another, especially in terms of their supported range. What the validated monitors have in common isn't the brands, it's the quality. They are all high-quality, highly rated, popular Freesync monitors with feature sets that match up to their G-Sync counterparts. Most likely, the only monitors that pass are going to be the best of the best Freesync panels that can stack up against G-Sync panels.

I doubt they spent hours and hours on each of 400 monitors, but I wouldn't be surprised if they looked at them all in some way. A lot could probably be tossed out early or right away just based on specs, while many more would fail super quick due to simply being garbage despite decent-sounding specs.

I'm going with "you need Freesync from zero to full speed", and most "non-gaming" monitors only do Freesync from 40Hz to 60Hz. I honestly can't think of any Freesync monitor that costs less than a Gsync one and can run at 20-30Hz so you don't tear in the dips.
 
Has anyone pulled together a speclist for the dozen passing monitors to attempt to determine what level of performance NVidia is requiring to pass its validation testing?
 
Right now it looks like they are literally going down a list of monitors alphabetically, and have just barely started.

Acer acer acer acer acer acer acer benq

LOL

No way they tested "400" recent monitors, because good ones like the Nixeus are shoo-ins for sure.
 
Has anyone pulled together a speclist for the dozen passing monitors to attempt to determine what level of performance NVidia is requiring to pass its validation testing?

The Acer XFA240's FreeSync range is 48-144Hz/FPS (frames per second) over DisplayPort and 48-120Hz over HDMI.

LFC (Low Framerate Compensation) is also supported, which means that even when your FPS drops below 48 and FreeSync stops working, the display's refresh rate will double or triple the frame rate for smoother performance.


That is the key to being Nvidia certified: how it handles LFC. Most have no LFC. To be Nvidia certified for Gsync, the monitor must "do something" from 1 FPS up to whatever the max frame rate is. Most Freesync monitors do not have the chips for converting very low FPS back to the minimum native refresh rate the panel can handle.

https://www.amd.com/en/products/freesync-monitors
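To make the "doubles or triples the frame rate" behavior concrete, here is a minimal sketch of the frame-multiplication idea behind LFC. This is a general illustration of the technique, not actual driver code from either vendor:

```python
def lfc_refresh_rate(fps, vrr_min, vrr_max):
    """Pick a panel refresh rate for a given frame rate on a VRR display.

    Inside the VRR range, the panel simply follows the frame rate.
    Below the range, each frame is repeated 2x, 3x, ... until the
    resulting scanout rate lands back inside the range (LFC).
    Returns (refresh_hz, multiplier), or None if no multiple fits.
    """
    if fps >= vrr_min:
        return min(fps, vrr_max), 1  # in range (or capped at max refresh)
    for mult in range(2, 10):
        if vrr_min <= fps * mult <= vrr_max:
            return fps * mult, mult
    return None  # range too narrow for frame multiplication

print(lfc_refresh_rate(30, 48, 144))  # (60, 2): frames doubled, 60 Hz scanout
print(lfc_refresh_rate(20, 48, 144))  # (60, 3): frames tripled
print(lfc_refresh_rate(35, 48, 60))   # None: a 48-60 Hz range can't recover 35 FPS
```

The last case shows why narrow-range panels get dismissed in this thread: with only 48-60 Hz to work with, there is no integer multiple that brings a sub-48 frame rate back inside the range.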
 
The Acer XFA240's FreeSync range is 48-144Hz/FPS (frames per second) over DisplayPort and 48-120Hz over HDMI.

LFC (Low Framerate Compensation) is also supported, which means that even when your FPS drops below 48 and FreeSync stops working, the display's refresh rate will double or triple the frame rate for smoother performance.


That is the key to being Nvidia certified: how it handles LFC. Most have no LFC. To be Nvidia certified for Gsync, the monitor must "do something" from 1 FPS up to whatever the max frame rate is. Most Freesync monitors do not have the chips for converting very low FPS back to the minimum native refresh rate the panel can handle.

https://www.amd.com/en/products/freesync-monitors

That also goes a long way toward showing why Nvidia had to do this. The XFA240 is only $200 on Amazon and offers a reasonable 24" TN 1080p/144Hz entry point to high-refresh/variable-refresh-rate gaming. The cheapest similar Gsync monitor I could find, the Acer Predator XB241H, is nearly twice that. (I'm not 100% sure it is the cheapest, because Amazon's and Google's search results were heavily polluted with non-Gsync monitors.)
 
Has anyone pulled together a speclist for the dozen passing monitors to attempt to determine what level of performance NVidia is requiring to pass its validation testing?

https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

You can see them listed at the bottom. A minimum spec requirement is likely going to be a >2.5x max-to-min refresh ratio, to be able to support frame multiplying. This already eliminates the majority of current non-G-Sync VRR displays, as they have ranges like 48-60Hz, etc.

I assume that elimination comprises the bulk of the 400+ tested. It's doubtful they've actually done extensive testing on that many monitors by now.

Their keynote also mentioned some visual artifacts with respect to VRR which would require more extensive testing (e.g. flickering edge cases).
 
https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

You can see them listed at the bottom. A minimum spec requirement is likely going to be a >2.5x max-to-min refresh ratio, to be able to support frame multiplying. This already eliminates the majority of current non-G-Sync VRR displays, as they have ranges like 48-60Hz, etc.

I assume that elimination comprises the bulk of the 400+ tested. It's doubtful they've actually done extensive testing on that many monitors by now.

Their keynote also mentioned some visual artifacts with respect to VRR which would require more extensive testing (e.g. flickering edge cases).
So is Nvidia more limited by what is on their cards compared to AMD? I have a 144Hz FreeSync 2 Samsung HDR monitor and the Vegas run it perfectly; smoothest gameplay I've ever experienced. So my 1080 Tis may have issues because the hardware is not as flexible or capable as AMD's?
 
So is Nvidia more limited by what is on their cards compared to AMD? I have a 144Hz FreeSync 2 Samsung HDR monitor and the Vegas run it perfectly; smoothest gameplay I've ever experienced. So my 1080 Tis may have issues because the hardware is not as flexible or capable as AMD's?


It's the exact same hardware. Nvidia calls the DP Adaptive Sync it uses on laptops G-Sync.

All because they use a local framebuffer to guarantee no hitching (something Freesync ditches).

It's the exact same hardware. It will be subject to the same driver bugs you have with *Sync displays.
 
It's the exact same hardware. Nvidia calls the DP Adaptive Sync it uses on laptops G-Sync.

All because they use a local framebuffer to guarantee no hitching (something Freesync ditches).

It's the exact same hardware. It will be subject to the same driver bugs you have with *Sync displays.
So it should be rather comparable in the end; that would be great news! Looking very much forward to this.
 
So is Nvidia more limited by what is on their cards compared to AMD? Have a 144hz FreeSync 2 Samsung HDR monitor and the Vega's run it perfect - smoothest game play I've ever experience. So my 1080Ti's may have issues because the hardware is not as flexible or capable as AMD?


No. AMD sucks at protecting "Freesync", which is a certification/marketing term for AMD that includes VRR. "Freesync" has a list of qualifications just like Gsync, but AMD let everyone use "Freesync" on the label regardless of performance.

Your Samsung has all the real requirements: min lag rating, low framerate compensation, etc.

Looking back, all the low-end "Freesync" monitors should have just been labeled VRR or Adaptive Sync compatible, not "Freesync certified".

There is a huge difference between a 48-75Hz (no LFC chip), 100ms-lag "Freesync" TV/monitor and your $$$ monitor that passed all of AMD's requirements.

I'm pretty sure "most" name-brand Freesync monitors that can do 144Hz will work just fine with Nvidia's new drivers.


The next fun part is getting the monitor to work with the Nvidia card. I had a fun time getting my ViewSonic to recognize my AMD APU as Freesync-enabled, even though it was enabled on the PC and in the monitor menu.
 
There are no Freesync monitors. There are Adaptive Sync monitors that support AMD's Freesync; people just keep calling them Freesync monitors.

There is no LFC chip either. LFC is done on the GPU, not the monitor. The requirement for LFC is that the maximum refresh rate is greater than or equal to 2 times the minimum refresh rate.

I honestly can't think of any Freesync monitor that costs less than a Gsync one and can run at 20-30Hz so you don't tear in the dips.

What are you saying? There are monitors that have Freesync, are full range, and are much cheaper than any Gsync monitor.
 
There are no Freesync monitors. There are Adaptive Sync monitors that support AMD's Freesync; people just keep calling them Freesync monitors.

There is no LFC chip either. LFC is done on the GPU, not the monitor. The requirement for LFC is that the maximum refresh rate is greater than or equal to 2 times the minimum refresh rate.



What are you saying? There are monitors that have Freesync, are full range, and are much cheaper than any Gsync monitor.


You are correct:
https://www.amd.com/Documents/freesync-lfc.pdf

So it's enabled on the GPU side once the GPU knows the monitor is capable of 2.5x its minimum refresh.

That makes me wonder if the same is true for Nvidia. Does the Gsync FPGA handle the LFC, or does it just report back to the GPU?

On that note, what will happen on a Freesync monitor when the FPS goes below 48 using an Nvidia GPU? Will the new drivers use the GPU to do LFC like AMD does?

As for my second comment, there is a lot of overlap in prices between Gsync and Freesync with 120-144Hz panels. Yes, you can get god-like IPS Gsync monitors, but they are of a much higher quality. Likewise, you can get 240Hz Gsync monitors, which have no Freesync equivalent.

But for $400-500 you can get yourself a nice 27-32" Freesync/Gsync 1080p/1440p 120/144Hz unit. I'm not comparing a $150 60Hz Freesync to a $400 120Hz Gsync.
 
You are correct:
https://www.amd.com/Documents/freesync-lfc.pdf

So it's enabled on the GPU side once the GPU knows the monitor is capable of 2.5x its minimum refresh.

That makes me wonder if the same is true for Nvidia. Does the Gsync FPGA handle the LFC, or does it just report back to the GPU?

On that note, what will happen on a Freesync monitor when the FPS goes below 48 using an Nvidia GPU? Will the new drivers use the GPU to do LFC like AMD does?

As for my second comment, there is a lot of overlap in prices between Gsync and Freesync with 120-144Hz panels. Yes, you can get god-like IPS Gsync monitors, but they are of a much higher quality. Likewise, you can get 240Hz Gsync monitors, which have no Freesync equivalent.

But for $400-500 you can get yourself a nice 27-32" Freesync/Gsync 1080p/1440p 120/144Hz unit. I'm not comparing a $150 60Hz Freesync to a $400 120Hz Gsync.


In a Gsync monitor the FPGA handles everything; it's the frame buffer, timing controller, and scaler all in one.

With Nvidia now supporting Adaptive Sync, they will have something similar to LFC on the GPU side to handle low framerates.

As for your last few statements, they don't match your statement in the post I quoted. You were trying to say that no cheap Adaptive Sync monitor could handle low frame rates like Gsync can, and that you would need an Adaptive Sync monitor at least as expensive as a Gsync monitor to do it. But that's not the case.
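One way to picture GPU-side frame repetition in the time domain (purely a sketch of the general approach, not Nvidia's or AMD's actual implementation): whenever the gap to the next rendered frame would exceed the panel's longest allowed refresh interval, the previous frame is scanned out again so the panel never drops below its minimum rate.

```python
import math

def scanout_times(frame_times, vrr_min_hz):
    """Given render-completion timestamps (seconds) and the panel's
    minimum VRR refresh rate, return a scanout schedule. When two
    frames are further apart than the longest refresh interval the
    panel allows (1 / vrr_min_hz), the previous frame is re-scanned
    at even spacing to keep the panel inside its range."""
    max_interval = 1.0 / vrr_min_hz
    out = []
    for prev, nxt in zip(frame_times, frame_times[1:]):
        gap = nxt - prev
        repeats = math.ceil(gap / max_interval)  # scanouts covering this gap
        step = gap / repeats
        out.extend(prev + i * step for i in range(repeats))
    out.append(frame_times[-1])
    return out

# 25 FPS content (40 ms gaps) on a 48-144 Hz panel: each frame is
# scanned out twice, 20 ms apart, i.e. an effective 50 Hz scanout.
print(scanout_times([0.0, 0.04, 0.08], 48))
```

Whether this bookkeeping lives in the driver or in a monitor-side module (the FPGA question above) changes where the repeated scanout is triggered, not the scheduling idea itself.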
 
In a Gsync monitor the FPGA handles everything; it's the frame buffer, timing controller, and scaler all in one.

With Nvidia now supporting Adaptive Sync, they will have something similar to LFC on the GPU side to handle low framerates.

As for your last few statements, they don't match your statement in the post I quoted. You were trying to say that no cheap Adaptive Sync monitor could handle low frame rates like Gsync can, and that you would need an Adaptive Sync monitor at least as expensive as a Gsync monitor to do it. But that's not the case.


Am I Newegging wrong?

27" 1080/1440p
144hz

Looks like all the name-brand stuff is around $500 for Gsync/Freesync that can do LFC. I cannot find a name-brand 27"-32" Freesync for $300-400 unless I go really off-brand or go for a 60Hz panel. Can you really find an Asus/Acer/BenQ? At best I can only find the really new ViewSonic units.
 
Wow, looks like my monitor will get some Freesync/Gsync love. It doesn't have LFC and its range is very limited (40-60Hz), but my system never goes below 40 FPS at the settings I use anyway, so it will be a nice free upgrade.
I had Gsync before and wasn't a huge fan or hater of it; I just didn't like the price I had to pay for the feature at the resolution I had bought at the time. I switched to a Samsung Freesync display because its other specs were what I wanted and could afford.
Nvidia finally realizing the smart move is rather surprising.
Maybe losing all that money shook the greed a bit.
 
Am I Newegging wrong?

27" 1080/1440p
144hz

Looks like all the name-brand stuff is around $500 for Gsync/Freesync that can do LFC. I cannot find a name-brand 27"-32" Freesync for $300-400 unless I go really off-brand or go for a 60Hz panel. Can you really find an Asus/Acer/BenQ? At best I can only find the really new ViewSonic units.

You have made several wrong statements in this thread. There are plenty of full-range Freesync monitors from Samsung, Acer, Asus, AOC, and MSI that run at 144Hz in various screen sizes, including 27 inch. There are also 240Hz Freesync monitors. A 27-inch 240Hz monitor from Acer costs about the same as the cheapest Gsync monitor.
 
You have made several wrong statements in this thread. There are plenty of full-range Freesync monitors from Samsung, Acer, Asus, AOC, and MSI that run at 144Hz in various screen sizes, including 27 inch. There are also 240Hz Freesync monitors. A 27-inch 240Hz monitor from Acer costs about the same as the cheapest Gsync monitor.

Maybe I was misunderstood...

You repeated everything I was agreeing with. I believe my original comment was that you are not going to save much money trying to find a Freesync monitor that supports LFC and has the same quality panel as a Gsync one. I guess I was trying to make an apples-to-apples argument and failed.


On a lighter note, I'm running a ViewSonic VX3258 via the iGPU trick with a Ryzen 2200G and a GTX 1070. It will be nice at the end of the month to not have to manually select each game exe and choose "performance" in Win10 in order to run the GTX and have Freesync at the same time.
 

This is an excellent summary of what "meets standard" means. Basic Freesync with no LFC and no variable overdrive is largely worthless IMO and should not even be considered a feature.
 