Is it time for Nvidia to support Freesync?

We do care because the same monitor with gsync costs at least ~$200 more than that same monitor with freesync. I gave specific examples fairly recently in another post. It isn't hard to figure out: go to Newegg, spend 5 minutes there comparing monitors and feature sets.

Maybe you don't care if your monitor costs you $200 more in the end. I do. I use three matching monitors - that's $600. And ultimately, right now, at this moment, freesync has better premium display options. There is no gsync monitor with a quantum dot panel released yet, nor is there any gsync 38" panel.

I bought my three freesync Omen 32" 1440p monitors for $300 each, new, direct from hp.com.

I'd gladly move back to Nvidia because I'm not pleased with the buggy Vega drivers right now, and I have a pair of 1080 Tis mining. But freesync with a single Vega at 75hz presents a smoother gaming experience than 60hz vsync on my Omens, and while I've been looking for some comparable gsync monitors, there are none to be had for anywhere near the price point at which I picked up the HP Omens.


I was calling out idiotincharge for making specific statements without specifics.

If freesync isn't free, where's the proof of what it costs?

If g-sync is better, where are the specific tests, objective and subjective, showing that to be the case? I've looked for them. I can't find them.

I'd gladly use my Nvidia cards with my freesync monitors if Nvidia supported freesync. Since that is unlikely, I'm either going to wait till the next iteration of gsync monitors releases or Vega's drivers improve. At this precise moment I'm pretty stuck. I don't see anything in gsync I think is a worthwhile improvement over my Omens, and Vega drivers are a buggy, underwhelming hot mess. PUBG performance is dismal, and I've had 5 lockups with Vega in about 6 hours of game time. I know PUBG isn't very AMD friendly, but that's what I'm playing right now, and it'd be great if it worked more reliably. I get massive slowdowns and crashes: 90 FPS then 8 FPS, stuttery jumping vehicles, red lockup screens, 30-40 FPS sections. Just nonsense in the current drivers.

But the other argument to be had is: is there proof that FreeSync is indeed free to implement? I am willing to wager that even if it is indeed "free" to consumers, it's almost certainly not free to the manufacturer, because they need to make controllers that conform to that standard. Hell, even if it is just a simple firmware update, you still need to divert people to write that firmware. It may not be visible to us consumers because manufacturers have more freedom to cut corners elsewhere.

If it doesn't cost 'us' (the consumers), it still costs someone something.

As for FreeSync vs G-Sync comparisons, I highly suspect they're rare because it's rare for people to have both; people usually go with one or the other.
 
You do realize that if AMD were in the dominant position (not that that's happened as long as they've had a graphics division) that they'd do the exact same thing, right?
No, I do not know or pretend to know what AMD would do if they had more discrete market share. Their actual GPU share across x86 devices is not much different in numbers from Nvidia's.

Now if Nvidia has such a larger proportion of gaming graphics cards, then why are there way more FreeSync monitors? Especially if they cost more to make than not?

What I see is that Nvidia does not support adaptive sync, an open standard which they could easily support, because it would kill off their version, which no one else can use and which makes them money both outright and through future graphics card sales. It also shows that GSync is not significantly better; maintaining a captive market is the probable reason Nvidia does not support adaptive sync. The driving force is the bottom line, not you or me getting the most bang for the buck. Over a year ago I bought a FreeSync 4K 10-bit IPS monitor on sale for $399 - I don't think I've even seen a 10-bit GSync monitor, let alone anything remotely at that cost. It would be interesting if Nvidia did start to support adaptive sync; the purchasers of Gsync monitors might wonder if they had been taken in the long run too. :ROFLMAO:
 
I think the main reason why Nvidia currently won't consider freesync is because they make money off of G-Sync. Unfortunately for AMD, they cannot say the same.
 
Considering how many people were waiting for Vega because of FreeSync monitors, I would hazard a guess that AMD/RTG is pretty happy that nVidia does not support FreeSync. :whistle:

I think most were waiting on hype, and to place their money loyally with a company.

If you are in doubt, look at the sales numbers for Vega and why AIBs won't invest in it. How many cards will Vega sell in its lifespan, 25,000?
 
I think the main reason why Nvidia currently won't consider freesync is because they make money off of G-Sync. Unfortunately for AMD, they cannot say the same.

No, it's because of:

1. Freesync isn't as good.
2. There's no quality standard for Freesync; implementations are often piss-poor. (AMD had/have plans to fix this with Freesync 2, dramatically increasing prices.)
 
Well, back in 2015, Intel did say they would support the VESA Adaptive-Sync spec, possibly beginning with the IGP as soon as Kaby Lake. Which, in my limited understanding, means it's not "AMD-branded Freesync", but does mean that it should work with Freesync-compatible displays?

Thing is, Kaby has come, and its successor was just announced today. If I'm not mistaken, Coffee Lake has the same IGP package as Kaby did, though.

I haven't heard anything from Intel since 2015 regarding support, and I would imagine it would have been yet another important marketing bullet point that would have been touted.

So I guess we'll see what Cannon Lake brings to the table.

I'm not holding my breath about it, and even if they do, I don't particularly care, as I don't game on Intel IGP. Intel is the leading graphics provider because they include a GPU inside nearly every CPU they sell. That doesn't necessarily mean it's the most commonly used graphics provider, particularly in gaming; the Steam Hardware Survey will show you how that works out in the gaming world.

In my experience, businesses and low end PCs lean on Intel IGP because you don't need anything that sophisticated to drive Windows and most office/web content - but those aren't really going to see any particular benefit from adaptive sync either. Maybe a bit better battery life in a laptop?

Intel adoption of the VESA standard would be a marketing bullet point, both for AMD and Intel, but won't do anything to actually move the needle one way or another on adaptive sync adoption.

As far as GSync dying - I don't know. PhysX is still around, although not exactly as prominent as it once was now that it's rolled into GameWorks. I agree that nVidia should support the VESA standard on all GPUs, and bill GSync as a premium experience with selected and tightly integrated monitors.

nVidia is already billing GSync as that premium experience, but they are missing out on the benefits of Adaptive Sync for their customers who will have monitors at price points that match their lower tier products.

Will GSync die like 3D or other fads? I don't know - I don't think adaptive sync is going away. I think GSync will continue to evolve, hopefully toward some standard that doesn't require special hardware. But if anything could kill GSync, it will be nVidia themselves: commanding a price premium on a proprietary technology only works as long as it presents a clearly superior experience. If Freesync continues to catch on (and a lot of that depends on FS2), with it being a VESA standard and coming at no extra cost on more and more monitors, that could push nVidia to evolve GSync - either lower the licensing price, and/or adopt the industry standard.

For Intel the desktop is dead, and the implementations on the mobile front are often about other factors. So the chance Intel will ever support Freesync is 0.00%.
Nvidia is booming like crazy in mobile gaming, and you see more and more Gsync implementations there.

In short, the train has left, and Freesync still stands at the station wondering what happened.
 
I think most were waiting on hype, and to place their money loyally with a company.

If you are in doubt, look at the sales numbers for Vega and why AIBs won't invest in it. How many cards will Vega sell in its lifespan, 25,000?

No, it's because of:

1. Freesync isn't as good.
2. There's no quality standard for Freesync; implementations are often piss-poor. (AMD had/have plans to fix this with Freesync 2, dramatically increasing prices.)

Keep telling yourself that.

For Intel the desktop is dead, and the implementations on the mobile front are often about other factors. So the chance Intel will ever support Freesync is 0.00%.
Nvidia is booming like crazy in mobile gaming, and you see more and more Gsync implementations there.

In short, the train has left, and Freesync still stands at the station wondering what happened.
Your turn... Link some proof there, buddy. I see you whining but no links to back up your FUD.
 
I've got both Nvidia and AMD running in different machines. I've got a Freesync monitor paired with an AMD card. I like it. Since Nvidia will not support Freesync, for THAT particular computer, I will stick with AMD.

No, I have no plans to pay extra for a Gsync monitor.
 
Xbox One using FreeSync doesn't actually do ANYTHING for AMD beyond name recognition.

AMD's presentation merely states that FreeSync can be used over HDMI, and the Xbox One's implementation is going to be over HDMI, but basically it does squat for AMD's GPU market share.

TVs using that standard is what's going to help AMD at least gain some traction.
No, I do not know or pretend to know what AMD would do if they had more discrete market share. Their actual GPU share across x86 devices is not much different in numbers from Nvidia's.

Now if Nvidia has such a larger proportion of gaming graphics cards, then why are there way more FreeSync monitors? Especially if they cost more to make than not?

What I see is that Nvidia does not support adaptive sync, an open standard which they could easily support, because it would kill off their version, which no one else can use and which makes them money both outright and through future graphics card sales. It also shows that GSync is not significantly better; maintaining a captive market is the probable reason Nvidia does not support adaptive sync. The driving force is the bottom line, not you or me getting the most bang for the buck. Over a year ago I bought a FreeSync 4K 10-bit IPS monitor on sale for $399 - I don't think I've even seen a 10-bit GSync monitor, let alone anything remotely at that cost. It would be interesting if Nvidia did start to support adaptive sync; the purchasers of Gsync monitors might wonder if they had been taken in the long run too. :ROFLMAO:

The way I see it, it's because nVidia has such a big say in the G-Sync tech itself that it takes time to negotiate with them just to get approval to make the monitors, not to mention the fact that manufacturers may have to pay royalties too.

FreeSync is part of the DP 1.2a standard, and AMD does not charge a licensing fee on a FreeSync monitor. To put it bluntly, there is a greater degree of freedom with the hardware, so manufacturers can use a lot of already existing technologies - either that, or they can cut corners better than they can with G-Sync modules.

I own a G-sync monitor, but I bought it when FreeSync wasn't even on the radar, and it would ironically have cost me more to go with a FreeSync monitor even if they had come out, because we (as in Taiwan) apparently didn't get the memo from AMD that they slashed prices on their 290 and 290X. So any cost saving I would have gotten from the FreeSync monitor went directly into the GPU, penny for penny, and I still came out on top.

Right now the only thing keeping me from converting to FreeSync wholesale is that AMD's Vega is simply too little too late, and nVidia hasn't supported FreeSync. Whichever changes first, I am biting.
 
Your turn... Link some proof there, buddy. I see you whining but no links to back up your FUD.


The rumor that Intel would support freesync was, what, 2 years ago? Yeah, a rumor that never materialized. You still think that is going to happen?

Freesync does have quality issues; not all freesync monitors are built the same. You know this, no need to ask for proof when it's common knowledge.

Vega is not selling well. Right now there is a huge supply limitation, and I highly doubt it's due to yields or HBM supply; that doesn't make sense when nV has the P100 and V100 for sale, does it? Even if manufacturers are having issues with multiple vendors for the interposer, that still wouldn't create supply constraints for months.

Keep this in mind: Vega was in AMD's hands for quite some time - more than 6 months! To say it's due to any factor other than their own hand is folly.
 
The rumor that Intel would support freesync was, what, 2 years ago? Yeah, a rumor that never materialized. You still think that is going to happen?

I wasn't expecting Intel to add support until they roll out a new GPU, which so far they haven't, so I'm withholding judgment. If it's a no-show when Icelake is out in a year or two, I'll probably consider it dropped.
 
I wasn't expecting Intel to add support until they roll out a new GPU, which so far they haven't, so I'm withholding judgment. If it's a no-show when Icelake is out in a year or two, I'll probably consider it dropped.


Freesync has nothing to do with the hardware; if it supports DP 1.2a it can do Freesync, period. So they have had 3 generations of chips where they could have added it by now, and they could have differentiated the last two gens by separating it in drivers.
 
The rumor that Intel would support freesync was, what, 2 years ago? Yeah, a rumor that never materialized. You still think that is going to happen?

Freesync does have quality issues; not all freesync monitors are built the same. You know this, no need to ask for proof when it's common knowledge.

Vega is not selling well. Right now there is a huge supply limitation, and I highly doubt it's due to yields or HBM supply; that doesn't make sense when nV has the P100 and V100 for sale, does it? Even if manufacturers are having issues with multiple vendors for the interposer, that still wouldn't create supply constraints for months.

Keep this in mind: Vega was in AMD's hands for quite some time - more than 6 months! To say it's due to any factor other than their own hand is folly.
Don't get me wrong, I wasn't making any assertions with that post (I will make them in a minute), but rather pointing out that he requested links/proof when I had stated and referenced my point with how to find it (time constraints), so in turn he should do the same, especially with claims about cost and other general claims of superiority.

Now my take on this is:

Intel did say they would support Adaptive Sync 2 years ago, and added that it would arrive on the next iGPU they used. But since they still use the one they had when they made the announcement, it is hard to tell whether they don't have the NEXT one yet or just aren't inclined to do it yet. And I agree with you that they could do it at any point even with the current one, although I am not certain about the 1.2a compliance - wasn't the adaptive sync portion optional?

Now as far as superiority: neither has the upper hand per se. In general G-sync will be better than some Freesync monitors, or a better way to put it, G-sync's worst will be better than Freesync's worst, and by a good margin. However, the best of both will offer nearly identical performance and results. It is disingenuous to claim that all G-sync monitors are better than freesync/adaptive sync monitors or that freesync "isn't as good" (that is in reference to Shintai shitposting as usual, not you). I said early on that G-sync has an advantage, and most of it was in initial quality on early models. And sure, there are some crap freesync monitors out there, whereas G-sync, because of the validation process, will not likely have as bad a showing - nowhere near as bad, to be expected.

Now Vega: well, I have made no claims to its success, nor do I pretend to have any idea of how it is selling or how many. I have already stated it is a mediocre showing in comparison to Nvidia, but it is a decent performer, especially for those like me coming from an R9 290.

And my Samsung C34F791 is absolutely amazing. Works great. Freesync works without flaw, and the picture quality is rather nice. I looked at a review the other day and it did very well in their tests.

  • 3440 x 1440 UWQHD Resolution
  • 1500R Curvature
  • 100 Hz Native Refresh Rate, 4ms Response Time
  • AMD Free Sync
  • Quantum Dot Backlight
  • sRGB 125%, Motion Blur Technology
  • HDMI x 2 /DisplayPort/ USB 3.0

This is the feature list for it, and as far as the above reference to G-sync quality, I doubt any would rival this by any measurable amount - measurable meaning a great distance, not an actual measurement like grayscale or RGB range or such.
 
Don't get me wrong, I wasn't making any assertions with that post (I will make them in a minute), but rather pointing out that he requested links/proof when I had stated and referenced my point with how to find it (time constraints), so in turn he should do the same, especially with claims about cost and other general claims of superiority.

G Sync is superior for now. Free Sync 2 should even that out, but we'll just have to wait and see. For the price, is it better? For now it is, because for cards that can drive G Sync with better visual fidelity, well, there's only one option. People buy monitors for a number of years, usually 5-7, unlike graphics cards, which could be as little as 1 gen, 1.5 years. So right now G Sync is a better buy too, even though it's more expensive.

Now my take on this is:

Intel did say they would support Adaptive Sync 2 years ago, and added that it would arrive on the next iGPU they used. But since they still use the one they had when they made the announcement, it is hard to tell whether they don't have the NEXT one yet or just aren't inclined to do it yet. And I agree with you that they could do it at any point even with the current one, although I am not certain about the 1.2a compliance - wasn't the adaptive sync portion optional?

To be 1.2a compliant it has to be capable of adaptive sync. That is the reason for standards ;)

Now as far as superiority: neither has the upper hand per se. In general G-sync will be better than some Freesync monitors, or a better way to put it, G-sync's worst will be better than Freesync's worst, and by a good margin. However, the best of both will offer nearly identical performance and results. It is disingenuous to claim that all G-sync monitors are better than freesync/adaptive sync monitors or that freesync "isn't as good" (that is in reference to Shintai shitposting as usual, not you). I said early on that G-sync has an advantage, and most of it was in initial quality on early models. And sure, there are some crap freesync monitors out there, whereas G-sync, because of the validation process, will not likely have as bad a showing - nowhere near as bad, to be expected.

The top ends are close, but if you want the most out of the tech, G-Sync is the way to go, along with the cards that can push top resolutions and frame rates too ;). Vega is not fully capable of 4k @ 60 fps in games that really push graphics hard. Granted, the 1080ti has difficulty in some too, but much less.
Now Vega: well, I have made no claims to its success, nor do I pretend to have any idea of how it is selling or how many. I have already stated it is a mediocre showing in comparison to Nvidia, but it is a decent performer, especially for those like me coming from an R9 290.

You asked for proof. Why ask for proof when there are numerous leaks that AIBs aren't getting enough chips for custom cards and supply is limited? Is everyone that gets these leaks off? Would these leaks look good for AMD? Why hasn't AMD responded to this? Cause it's real. We can see it, so you are arguing for the sake of arguing, not for anything else.

And my Samsung C34F791 is absolutely amazing. Works great. Freesync works without flaw, and the picture quality is rather nice. I looked at a review the other day and it did very well in their tests.

Good for you. Have you ever tried a G-sync monitor, or any nV products, in the past 2 years since G-sync was launched?

This is the feature list for it, and as far as the above reference to G-sync quality, I doubt any would rival this by any measurable amount - measurable meaning a great distance, not an actual measurement like grayscale or RGB range or such.

Well, not having cards that can push top resolutions @ 60 FPS with all features active is a big problem, so how does price push Freesync over Gsync even though the features are close? And what happens with Volta's launch - will Free Sync still be a good buy then? We know AMD won't have new cards till the end of next year, so it kinda leaves Free Sync buyers who saved that 200 bucks over a G Sync monitor with no upgrade path, unable to get the best tech possible, and unable to get the most out of their visuals.

To even look at Free Sync monitors right now makes no sense, cause if you're an enthusiast it locks ya into AMD cards, and AMD doesn't look to be having a great performance card against Volta, let alone an enthusiast card.

So spending 200 bucks extra on a G Sync monitor that will last for 2 or 3 gens or more is a better option than spending 1000 bucks on a Free Sync monitor and hoping AMD will come out with cards that will drive it to its fullest potential.

This all comes down to halo products, and why halo products are so important to the entire ecosystem. Without them, all these high-end components that drive sales just don't make sense.

If you really want to support AMD, tell them you are disappointed; no need to argue with others on any forum when AMD is the one at fault for this mess. What did I say at the launch of Vega? The only reason to get Vega is if ya already have a Free Sync monitor; otherwise it's not worth it. And it's not worth buying a Free Sync monitor at this point, even with the package discounts.
 
I am very much open to getting a Gsync monitor when I buy another high end monitor, but that will depend on what is available in monitors and graphics cards. JustReason's monitor pretty much beats out any top end GSync monitor in my books. My 4K freesync monitor is working well, and has a modified FreeSync range of 35 to 61.
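
As an aside, for anyone wondering what "modifying" a FreeSync range actually touches: the range is advertised in the monitor's EDID display range limits descriptor, which is what tools like CRU edit. Here is a minimal sketch of reading it, assuming the EDID 1.3 byte layout and ignoring the 1.4 offset flags; the descriptor location and tag come from the EDID spec, but the sample bytes below are fabricated for illustration:

```python
# Sketch: pull the vertical refresh range (what a FreeSync monitor
# advertises) out of a raw 128-byte EDID base block. The "display
# range limits" descriptor has tag 0xFD; bytes 5 and 6 of that
# 18-byte descriptor hold the min and max vertical rate in Hz.

def vrr_range_from_edid(edid: bytes):
    for start in (54, 72, 90, 108):          # the four 18-byte descriptor slots
        d = edid[start:start + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]                # (min Hz, max Hz)
    return None                              # no range-limits descriptor found

# Fabricated example descriptor advertising a 35-61 Hz range:
edid = bytearray(128)
edid[90:108] = bytes([0, 0, 0, 0xFD, 0, 35, 61, 30, 90, 17] + [0x20] * 8)
print(vrr_range_from_edid(bytes(edid)))      # -> (35, 61)
```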

As for 4K gaming and Vega 64 - alive and strong here. Just remember folks have different tolerances and ideas of what looks best to them. FreeSync at 40 fps to me looks better than an average 55 fps on a non-sync monitor (by far, at that). Also, I usually never use motion blur and rarely use depth of field, but some games like Doom do it right and there I do use it. There is also a lot of leeway in settings that degrade performance for little to no significant IQ advantage; some, while hitting performance hard, also degrade IQ in my books. In short, 4K gaming on the Vega 64 is working for the most part for me. In my opinion, at 4K I am getting a better gaming experience with the Vega 64 than I would with my 1080 Ti, because of the combination of the Vega 64 (just enough) and FreeSync, and of course what I like to use for settings. On the wide screen non-sync monitor the 1080 Ti torches the Vega 64, so each to me is optimized as best as possible for my needs or wants with what I have. One cup does not fit all is the gist here.

I also believe that Nvidia may eventually be forced to support adaptive sync, because as time goes on, the limited selection and higher prices of GSync monitors may hurt sales for Nvidia. Luckily for them, RTG is not competing effectively with them.
 
Not sure what it is you think I am arguing. I tried to be nice about it and give an explanation as to why without stepping on any toes, but...

First, enthusiast does not mean buying ONLY top tier products; it generally means enjoying a product, or set of products, to the point of spending more money and/or time than a common user. Generally enthusiasts build their own PCs, and that doesn't usually mean buying only top tier products, but rather the best that individual feels they require within a budget.

As far as my monitor: my WC Vega 64 is in fact pushing frame rates within the Freesync range, and most of my games are pegging it, i.e. 100fps. I am not stating it can push 100fps in every game with ALL options, but that it does with all the options I run, maxed; the others are always off (I never use radial blur or DoF, never). I don't always play the latest and greatest games; I play a plethora, ranging from remakes of DOS games to current ones that fit my tastes. But also note I haven't said anything that resembles a claim that anyone else SHOULD purchase a Vega, nor do I delude myself or others about its performance. If one doesn't have a brand preference, I have stated that the 1080 or greater would better serve them. However, as I stated with the FX series CPUs, Vega can provide great performance if one is so inclined to purchase one. It isn't for me or you or anyone else to demean or dictate another's purchase, only to inform of the alternatives, both positive and negative.

Now as far as Volta: just like you guys complaining about the "wait for" (insert whatever here) on AMD driver or GPU releases, the same can be said here. Sure, Volta will be better, but it ain't here yet, nor can one make it a deciding factor now in the absence of any hard numbers or facts. Besides, the biggest plus of G-sync or Freesync is the diminished need to upgrade frequently. As I and others have stated in the 1080Ti - Vega monitor blind test thread, this is a whole new game.

And to add to the original point: I posted that he needed to link proof for some of his claims, especially the all-inclusive ones.

My monitor scored very well in tests. It has LFC support, a wide Freesync range (48-100) that's easy to stay within, and quantum dot for more vibrant color. Not sure how G-sync is going to add to this at all - again, in a measurable way that would make it a clear advantage.
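
Since LFC keeps coming up: here is a rough sketch of how low-framerate-compensation-style frame repetition picks a multiplier. This is a simplified model of the technique, not AMD's actual driver logic; the commonly cited rule of thumb is that LFC needs the max refresh to be at least about 2x the min:

```python
# Simplified model of LFC-style frame repetition: when fps drops
# below the VRR floor, repeat each frame until the effective refresh
# lands back inside the monitor's range. With a 48-100 Hz range, a
# 40 fps game is shown at 80 Hz by displaying every frame twice.

def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Effective refresh rate after frame repetition."""
    if fps >= vrr_min:
        return min(fps, vrr_max)   # already in range: refresh tracks fps
    multiplier = 1
    while fps * (multiplier + 1) <= vrr_max and fps * multiplier < vrr_min:
        multiplier += 1            # repeat frames until we land in range
    # If the range is too narrow (max < ~2x min), no multiplier fits
    # and the result stays below vrr_min - i.e. LFC can't help.
    return fps * multiplier

for fps in (30, 40, 55, 75, 120):
    print(f"{fps} fps -> {lfc_refresh(fps, 48, 100):.0f} Hz effective")
```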
 
Not sure what it is you think I am arguing. I tried to be nice about it and give an explanation as to why without stepping on any toes, but...

Prove it isn't an argumentative point?

First, enthusiast does not mean buying ONLY top tier products; it generally means enjoying a product, or set of products, to the point of spending more money and/or time than a common user. Generally enthusiasts build their own PCs, and that doesn't usually mean buying only top tier products, but rather the best that individual feels they require within a budget.

Any enthusiast will buy the best product for their money. At this point, AMD has no products that lead any part of the product stack when all pertinent features outside of price are looked at.

As far as my monitor: my WC Vega 64 is in fact pushing frame rates within the Freesync range, and most of my games are pegging it, i.e. 100fps. I am not stating it can push 100fps in every game with ALL options, but that it does with all the options I run, maxed; the others are always off (I never use radial blur or DoF, never). I don't always play the latest and greatest games; I play a plethora, ranging from remakes of DOS games to current ones that fit my tastes. But also note I haven't said anything that resembles a claim that anyone else SHOULD purchase a Vega, nor do I delude myself or others about its performance. If one doesn't have a brand preference, I have stated that the 1080 or greater would better serve them. However, as I stated with the FX series CPUs, Vega can provide great performance if one is so inclined to purchase one. It isn't for me or you or anyone else to demean or dictate another's purchase, only to inform of the alternatives, both positive and negative.

So if you had a G Sync monitor and a 1080ti instead of Free Sync and your WC Vega, you would have a better gaming experience, no? You wouldn't need to turn down your features then. Yeah, you spend 300 bucks more total, but it's worth it, isn't it? Better yet, you can be damn sure Volta will give at least the same boost over Pascal that Pascal gave over Maxwell, looking at 2 factors: Volta's compute performance, and nV's scaling of units vs performance every generation (Volta looks higher than a 1 to 1 increase of units too, more like 1.5 to 1; how the clocks play into this is the only thing I don't know). We can also add a 3rd factor in there, as nV has always scaled well for current games when they release next gen products. AMD/ATi on the other hand had lopsided architectures which were bottlenecked either by the ratio of unit differences or by sheer engineering incompetence where bottlenecks weren't removed from previous generations.

Now as far as Volta: just like you guys complaining about the "wait for" (insert whatever here) on AMD driver or GPU releases, the same can be said here. Sure, Volta will be better, but it ain't here yet, nor can one make it a deciding factor now in the absence of any hard numbers or facts. Besides, the biggest plus of G-sync or Freesync is the diminished need to upgrade frequently. As I and others have stated in the 1080Ti - Vega monitor blind test thread, this is a whole new game.

As far as Volta is concerned, I can guarantee you it's going to give at least a 60% improvement over Pascal, tier to tier equivalent, based on what I stated above.

And to add to the original point: I posted that he needed to link proof for some of his claims, especially the all-inclusive ones.

If it's well known then there is no need. I could just put an emoji like this one :depressed: for all of AMD's graphics product line, and it's real. Anyone that is an enthusiast in graphics knows it.
My monitor scored very well in tests. It has LFC support, a wide Freesync range (48-100) that's easy to stay within, and quantum dot for more vibrant color. Not sure how G-sync is going to add to this at all - again, in a measurable way that would make it a clear advantage.


If your card is fast enough on the mins, ya don't need to worry about that as much, so... And that is a software feature anyway, as is Free Sync; you think nV can't add things to their software? Saying "I don't know" on that one, lol - yeah, right, you know they can, it's software. Measurable, btw.
 
I am very much open to getting a Gsync monitor when I buy another high end monitor, but that will depend on what is available in monitors and graphics cards. JustReason's monitor pretty much beats out any top end GSync monitor in my books. My 4K freesync monitor is working well, and has a modified FreeSync range of 35 to 61.

As for 4K gaming and Vega 64 - alive and strong here. Just remember folks have different tolerances and ideas of what looks best to them. FreeSync at 40 fps to me looks better than an average 55 fps on a non-sync monitor (by far, at that). Also, I usually never use motion blur and rarely use depth of field, but some games like Doom do it right and there I do use it. There is also a lot of leeway in settings that degrade performance for little to no significant IQ advantage; some, while hitting performance hard, also degrade IQ in my books. In short, 4K gaming on the Vega 64 is working for the most part for me. In my opinion, at 4K I am getting a better gaming experience with the Vega 64 than I would with my 1080 Ti, because of the combination of the Vega 64 (just enough) and FreeSync, and of course what I like to use for settings. On the wide screen non-sync monitor the 1080 Ti torches the Vega 64, so each to me is optimized as best as possible for my needs or wants with what I have. One cup does not fit all is the gist here.

I also believe that Nvidia may eventually be forced to support adaptive sync, because as time goes on, the limited selection and higher prices of GSync monitors may hurt sales for Nvidia. Luckily for them, RTG is not competing effectively with them.


How would Justreason's monitor beat out a G sync monitor with 144 Hz that can actually be driven at those Hz? We talked about this before. I can tell the difference, depending on the game, up to 100 Hz, maybe even 120 Hz; others can go even higher. Saw that in youtube videos where people can actually tell the difference between 144, 120, 100, 75, and 60.
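
For reference, the frame-time arithmetic behind those numbers shows why the steps get harder to tell apart at the high end - each jump saves fewer milliseconds than the one before:

```python
# Milliseconds per refresh at the rates mentioned above.
for hz in (60, 75, 100, 120, 144):
    print(f"{hz:3d} Hz -> {1000 / hz:5.1f} ms per frame")
# 60 -> 16.7, 75 -> 13.3, 100 -> 10.0, 120 -> 8.3, 144 -> 6.9
```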

Vega can't do that; the 1080ti on the other hand can. Big difference. Then add in all the visual glory that you can throw in there with game settings. Those have to be sacrificed on Vega.

I still haven't gotten on the Free Sync / G Sync bus yet, just waiting for now, cause I tend to keep my monitors for 5 to 7 years and would rather not switch them out, cause they will be 2k apiece, even though I have the money to do it. But at this point I don't see AMD getting anywhere near nV next gen, and even the gen after that is not going to be easy. So G-sync looks better to me; I would rather not sacrifice visual fidelity in anything if I don't have to.
 

HDR?

Well, yes, if 144 Hz is important to you and can be maintained - personally I couldn't care less. Now where is 144 fps being maintained at 3440x1440 with high settings? I can't even come close to that with my 1080Ti. I guess in some games yes, but for new AAA titles, no way. This makes 144 Hz pointless until maybe years later. Anyway, to each their own.
 


Well, 100 fps is not achievable on Vega unless ya drop a lot of features or drop resolution - can't do that at 4k on Vega, lol. So he can't be doing 4k ;)

Well, yeah, it is a subjective topic, but if you have a card capable of driving 144 Hz at the same res and with higher settings, which solution is the no-brainer? There is nothing to think about; the one that can deliver that is going to be better, even without a subjective test.
 
Well, like you said, monitors tend to be kept longer, so GSync at 100 Hz+ may become more useful (will grow) as faster cards come about. This can also be the same with FreeSync monitors, except Nvidia has been on a very steady advancement since Maxwell. In my case I would be looking at future monitors with HDR, 2019 time frame. Volta is looking to be a very strong part - at least V100 is very much so; how that will translate into the gaming cards is to be seen. I may end up with a GSync + 1180 Ti combination in 2019 if all goes well - if RTG can pull a rabbit out of the hat then that is also a possible option. Since we are talking a significant amount of time here, Nvidia may be supporting adaptive sync as well, which would save me some $ and get me to buy a Nvidia card faster.
 


I'm looking at HDR monitors for my upgrade as well. But to tell ya the truth, that too will be better on nV cards for now, cause of the performance disparity.

We can always extrapolate performance from P100 to Pascal to see how things end up. Outside of HPC, DL, and pro cards, the P100 and GP102 line up well. If anything, Volta with similar cuda cores will have an advantage based on clocks or IPC; can't tell which one right now because I don't have that data, but per core right now it has more performance, looking at V100 compute performance over the P100 - meaning it's either clocks or IPC. Add to that that Volta will have more cuda cores, TMUs, ROPs, and all the other stuff that nV usually scales up gen to gen, and it should have no problems.
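
For what it's worth, the back-of-envelope version of that extrapolation, using the public P100/V100 specs - a naive cores x clock estimate only, since real gaming performance depends on far more than these two numbers:

```python
# Naive cores x clock scaling from P100 to V100 (public specs).
p100 = {"cuda_cores": 3584, "boost_mhz": 1480}
v100 = {"cuda_cores": 5120, "boost_mhz": 1530}

scale = (v100["cuda_cores"] / p100["cuda_cores"]) \
      * (v100["boost_mhz"] / p100["boost_mhz"])
print(f"naive scaling estimate: {scale:.2f}x")  # ~1.48x, near the 1.5-to-1 figure above
```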

Nvidia already supports adaptive sync, lol; that is why I was surprised when JustReason mentioned it in the other thread. It's in the drivers - the pull-down menu for V-sync.

[attached screenshot: the V-sync drop-down in the Nvidia control panel]
 
lol, that's Adaptive V-Sync Nvidia supports, not Adaptive Sync.
 


Sorry, you are correct, but only up to a certain point: if the graphics card is always outputting over 60 fps or 120 fps or whatever your monitor is set at, it shouldn't matter anymore, right? Min frames on current cards hit this: a 1070 should not drop below 60 min frames at 1080p or widescreen, a gtx 1080 at QHD or widescreen, and a 1080ti at 4k. That is why adaptive v-sync is more than enough for nV cards. Adaptive V-sync is actually somewhere in between Freesync and Adaptive sync, as long as the card can deliver the min frames.

Adaptive sync doesn't change the monitor refresh rate as well - well, not finely enough to make it a viable solution.
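
To keep the terms straight, here is a toy model of the sync modes being argued about - my simplification, not vendor code. Plain V-sync always waits for the refresh, adaptive V-sync only waits while the GPU keeps up, and fast sync never blocks the renderer, it just scans out the newest completed frame:

```python
# Toy model of the sync modes discussed in this thread (simplified).

def frame_behavior(mode: str, fps: float, refresh_hz: float) -> str:
    if mode == "vsync":
        return "wait for refresh (stutter/added lag below refresh rate)"
    if mode == "adaptive_vsync":
        return ("wait for refresh" if fps >= refresh_hz
                else "tear instead of stuttering")
    if mode == "fast_sync":
        return "render unthrottled, scan out newest complete frame"
    raise ValueError(mode)

for mode in ("vsync", "adaptive_vsync", "fast_sync"):
    print(f"{mode:15s} @ 45 fps on 60 Hz: {frame_behavior(mode, 45, 60)}")
```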


Then with adaptive v-sync you also get the frame fillers JustReason was just talking about (LFC).

PS: anyone that thinks there isn't a difference between G Sync and Free Sync should look at the input lag. G Sync provides less input lag than Free Sync. I noticed this on my friends' systems, and it was measurable in reviews too.



Linus should have tested G sync with Fast sync, and most of those problems he saw would have disappeared.
 
Prove it isn't an argumentative point?



Any enthusiast will buy the best product for their money. At this point, AMD has no products that lead any part of the product stack when all pertinent features outside of price are looked at.



So if you had a G Sync monitor and a 1080ti instead of Free Sync and your WC Vega, you would have a better gaming experience, no? You wouldn't need to turn down your features then. Yeah, you spend 300 bucks more total, but it's worth it, isn't it? Better yet, you can be damn sure Volta will give at least the same boost over Pascal that Pascal gave over Maxwell, looking at 2 factors: Volta's compute performance, and nV's scaling of units vs performance every generation (Volta looks higher than a 1 to 1 increase of units too, more like 1.5 to 1; how the clocks play into this is the only thing I don't know). We can also add a 3rd factor in there, as nV has always scaled well for current games when they release next gen products. AMD/ATi on the other hand had lopsided architectures which were bottlenecked either by the ratio of unit differences or by sheer engineering incompetence where bottlenecks weren't removed from previous generations.



As far as Volta is concerned, I can guarantee you it's going to give at least a 60% improvement over Pascal, tier to tier equivalent, based on what I stated above.



If it's well known then there is no need. I could just put an emoji like this one :depressed: for all of AMD's graphics product line, and it's real. Anyone that is an enthusiast in graphics knows it.



If your card is fast enough on the mins, ya don't need to worry about that as much, so... And that is a software feature anyway, as is Free Sync; you think nV can't add things to their software? Saying "I don't know" on that one, lol - yeah, right, you know they can, it's software. Measurable, btw.
If you weren't trying so hard to be combative you would spend a lot less time even responding.

Not even interested in debating points I already made, as it seems you don't want to be reasonable at all, just superior. However, you are misinterpreting my point about G-sync and what it would add. I meant in terms of visuals/experience, not software capability or whatnot. Remember that blind test? It is a lot like that: personal preference and choice. You act as if I turn off settings to get frame rates. No, I turn off settings I will NEVER use, whether they impact performance or not, because I don't like them. The first time I play a game they get turned off before I even see them in the game. It wouldn't matter which camp I was in; I would do the same.

The problem here is that even when any of us comment that G-sync is a better regulated solution, and therefore the average quality of its monitors is higher, most of you will counter with the fact that SOME Freesync monitors are equal to G-sync's best. More than enough reviews (not youtubers, I refuse to watch a single one) show parity among the top performers of both camps, and the difference in cost between them. For those monitors, G-sync is a terrible deal as far as the extra cost. I may argue that if you are a Nvidia-only user it is worth the cost; as even you stated, monitors are generally 5+ year purchases, and Freesync, no matter how much cheaper, means jack since you can't use it anyway. And then I would argue still that, given a certain price range and the equalizing effect of either sync tech, the cheaper route will provide as much performance from a visual smoothness standpoint. Now that may in fact favor whichever card has the better price point at a given performance level, so say a 1080 or even a 1070 may still win out over the Vega plus Freesync simply because of price discrepancies, but again, only to the brand agnostic.

A good point I like to use: if I ran with the Vega and you ran with a 1080Ti, how would your 1080Ti affect my game play at home? It doesn't. Only in reviews, or for those that have both, does it matter. My gameplay is not diminished in the slightest, nor does yours get better simply because I am using a Vega. Yours could get worse if I lived in the same town, because I would likely kill the grid with my hefty power consumption. (That last part is a joke, for those lacking a sense of humor.)
 
If you weren't trying so hard to be combative you would spend a lot less time even responding.

I'm not being combative; you should use some "reasoning" in your posts instead of telling others to prove things that are well known. If you want to disprove something that a person says, show logic to disprove them.

Not even interested in debating points I already made, as it seems you don't want to be reasonable at all, just superior. However, you are misinterpreting my point about G-sync and what it would add. I meant in terms of visuals/experience, not software capability or whatnot. Remember that blind test? It is a lot like that: personal preference and choice. You act as if I turn off settings to get frame rates. No, I turn off settings I will NEVER use, whether they impact performance or not, because I don't like them. The first time I play a game they get turned off before I even see them in the game. It wouldn't matter which camp I was in; I would do the same.

You are just not interested in anything that remotely would increase your understanding of the topics at hand.

The problem here is that even when any of us comment that G-sync is a better regulated solution, and therefore the average quality of its monitors is higher, most of you will counter with the fact that SOME Freesync monitors are equal to G-sync's best. More than enough reviews (not youtubers, I refuse to watch a single one) show parity among the top performers of both camps, and the difference in cost between them. For those monitors, G-sync is a terrible deal as far as the extra cost. I may argue that if you are a Nvidia-only user it is worth the cost; as even you stated, monitors are generally 5+ year purchases, and Freesync, no matter how much cheaper, means jack since you can't use it anyway. And then I would argue still that, given a certain price range and the equalizing effect of either sync tech, the cheaper route will provide as much performance from a visual smoothness standpoint. Now that may in fact favor whichever card has the better price point at a given performance level, so say a 1080 or even a 1070 may still win out over the Vega plus Freesync simply because of price discrepancies, but again, only to the brand agnostic.

But it doesn't, and I've just shown you: input lag increases with Free Sync, and Gsync when using Fast sync no longer has the issues Linus had with Vsync. It is worth the extra cost for the best possible experience for almost everyone; it's not even subjective at this point. You want the best, you pay for the best; if you don't have the money to get the best, or you're a fan of AMD, you get the second best GPUs and monitors, that simple. That is why AMD's products are cheaper than the best nV products: that is where they stand, that is their place in the market, not at the top. That is why Vega 64 liquid is priced lower than a 1080ti - it only competes with a gtx 1080, yet it's still priced higher than that, which makes it just a ridiculous buy right now IMO, same with Vega 64 air cooled. The only Vega card that should even be remotely looked into is the Vega 56, which is going to get surpassed easily by the gtx 1070 ti, and the Free Sync monitor can't do everything the Gsync monitor can. Simple economics at play... Same thing with Ryzen: it has its faults, more so than its Intel counterparts. When an 8 core chip can't beat a 4 core chip in gaming, that's bad. When a 16 core TR goes up against a 16 core Skylake X, it gets creamed in all tests - hence why TR was priced cheaper; it has to be priced cheaper.

It's like a person that wants a Lexus vs a Toyota Avalon: both can do the same thing, get from point A to point B, but the Avalon can't get all the features of the Lexus; sorry, that is just the way it is. And they are priced accordingly. Another person and I discussed this before: if you want a Mazda Miata, it can do what a Lotus Elise does, but the Elise does everything better than the Miata, and of course the Elise demands the price it's at because of it.

You can't sit here and tell me price is a driving factor for AMD sales (not only that, we have never seen it work in the marketplace when it's not the only factor involved). In any market, price is never a driving factor for sales by itself; products are always placed based on what is in the market already and then priced accordingly based on supply and demand. There is no way you can say otherwise to any of these things.

A good point I like to use: if I ran with the Vega and you ran with a 1080Ti, how would your 1080Ti affect my game play at home? It doesn't. Only in reviews, or for those that have both, does it matter. My gameplay is not diminished in the slightest, nor does yours get better simply because I am using a Vega. Yours could get worse if I lived in the same town, because I would likely kill the grid with my hefty power consumption. (That last part is a joke, for those lacking a sense of humor.)

So let's see: I'm playing Titanfall 2 right now at 4k with everything turned up as high as it can go, getting over 60 fps, even the mins. Can Vega do that? You tell me. I can tell you it can't! It can't get to 60 avg with everything turned up to the highest possible, let alone min frames. I don't even need a Gsync or freesync monitor to have all the features of Gsync or Free Sync in that game. All I need is fast sync.

Dude, you can come over to my house; I have 1200 amps coming into this house now for my rigs, so no problem there ;)
 

This is spot on. If you want the best, you pay the premium for the best; really straightforward stuff. If it were in nVidia's best interest (making money) to abandon G-Sync and/or also embrace FreeSync, do people not think that they would? I've got a feeling their internal sales/marketing data says otherwise.

My other question to those posting in this thread is: how many bought a FreeSync product prior to Vega's release with the intention of purchasing Vega, and now, seeing the "lackluster" performance/power/heat numbers we are aware of, aren't happy with their display purchase? Is it buyer's remorse? How many wish they had just paid the tax and been able to sit back and enjoy? I own a 1080 + G-Sync display (see sig) and have been enjoying the combo since last May, worry free, without waiting on the unknown to maybe justify my monitor purchase. It fits my needs; it may not meet yours. All good.

Honest question, not trying to ruffle feathers.

I'm not understanding the spite in this thread towards nVidia for demanding a premium price tag on a better overall product (by this I mean the GPU + monitor combo). I'm sure there are exceptions on the monitor side of things, though; I've never taken the time to browse many FreeSync offerings.
 
I would love for FreeSync to work on my 1080 + 75 Hz FS monitor... not holding my breath, though.
 
I'd love to buy a G-Sync monitor; I just can't afford the extra Benjamins for the proprietary technology.
 
Freesync isn't free; it's just cheaper, because AMD half-assed the solution and then developed software to get it most of the way to G-Sync.

Freesync 2, which raises the standards, will be even less 'free'.

And I can think of one big reason for Nvidia to not support AMD's Freesync: they developed the technology, and their version is still superior in implementation. Maybe they'll even get the cost down to parity with Freesync.


[and HDMI VRR, while modeled after Freesync, is something else; I wonder if Nvidia will support this to get around supporting an AMD imitation of their technology directly]


So much wrong with this post.

AMD didn't half-ass the solution. Adaptive Sync is a VESA standard put forward by AMD, which Nvidia approved, or else it would never have become part of the standard. Nvidia is using the exact same solution in its G-Sync laptops.

Monitor manufacturers can use Adaptive Sync or not. While there are some monitors without a full range, there are loads of monitors with a full range.

Nvidia didn't develop the technology. AMD and Nvidia both got the idea from the power-saving features of the embedded DisplayPort spec. The reason Adaptive Sync took longer to come to market is that it had to go through the whole approval process for the DisplayPort standard. AMD was thinking about sync tech long before G-Sync was introduced; they put the hardware needed to work with Adaptive Sync monitors onto desktop cards before the release of G-Sync.


The person you quoted didn't say FreeSync was free; he said the standard was free, which it is. He was wondering whether the FreeSync 2 standard would cost money.

It must not cost that much, because AOC is releasing two 27-inch 1440p gaming monitors, one with G-Sync and one with FreeSync 2. The FreeSync 2 monitor has HDR; the G-Sync one does not. The FreeSync 2 monitor is also $120 cheaper.
 
If g-sync is better where are specific tests objective and subjective showing that to be the case. I've looked for them. I can't find them.

G-Sync isn't any better than FreeSync. The "problem" with FreeSync monitors is that some of the ones without a full range don't support LFC either. That could be a complaint against Adaptive Sync, but I look at it the other way: because adaptive sync is so cheap to implement, monitor manufacturers are putting it into more and more monitors. Even a 20 Hz sync range is better than none.
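To make the LFC point concrete, here is a rough sketch of the frame-multiplication idea in Python, with a made-up panel range; real drivers do this adaptively, so treat it as an illustration only:

# Rough sketch of Low Framerate Compensation (LFC).
# Panel range numbers are illustrative, not from any real monitor.
# LFC typically needs the max refresh to be at least ~2x the min.

PANEL_MIN_HZ = 40
PANEL_MAX_HZ = 144

def refresh_plan(game_fps):
    # Return (repeats, effective_hz): how many times to scan out each
    # frame so the panel stays inside its supported refresh window.
    repeats = 1
    while game_fps * repeats < PANEL_MIN_HZ:
        repeats += 1            # show each frame twice, three times, ...
    return repeats, game_fps * repeats

print(refresh_plan(25.0))   # 25 fps -> each frame shown twice at 50 Hz
print(refresh_plan(90.0))   # 90 fps -> already in range, shown once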
 
As far as I can see, Adaptive Sync and FreeSync are the same thing, or am I wrong in this regard? What makes FreeSync different from Adaptive Sync? And no, I am not talking about NVIDIA's Adaptive V-Sync, which is different. So really, both manufacturers should support Adaptive Sync, being that it is a VESA standard.
 
No, they aren't the same. Adaptive Sync doesn't have the refresh-rate granularity that FreeSync has. This is why nV didn't support Adaptive Sync: it doesn't work well enough in games, nor was it really taken up by end users.

G-Sync has only one area of superiority over FreeSync: input lag. And we are talking about 20-millisecond differences here, so it can affect some people, but I think most people won't notice it.
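For what the refresh "granularity" and sync range actually constrain: a variable-refresh driver can only hold or release a frame inside the panel's timing window. A toy Python sketch with assumed numbers (not any real panel's values):

# Toy sketch of per-frame variable-refresh timing; numbers are assumptions.
MIN_FRAME_MS = 1000 / 144   # fastest refresh the panel supports (~6.9 ms)
MAX_FRAME_MS = 1000 / 40    # slowest before the panel must redraw (25 ms)

def scanout_time(frame_time_ms):
    # Inside the window, the frame is shown the moment it is ready;
    # outside it, the driver clamps to an edge (and stutter, tearing,
    # or LFC takes over, depending on the implementation).
    return min(max(frame_time_ms, MIN_FRAME_MS), MAX_FRAME_MS)

print(scanout_time(12.0))   # in range: shown after 12 ms
print(scanout_time(3.0))    # frame came too fast: held until ~6.9 ms
print(scanout_time(40.0))   # too slow: clamped at 25 ms, LFC territory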
 
20 ms is pretty large, actually; that would affect most people running at only 50 FPS or above. I also thought G-Sync had a much larger variable refresh rate range.
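The arithmetic behind that, for scale: a frame at a given rate lasts 1000 / fps milliseconds, so a 20 ms difference really is a full frame at 50 FPS:

# Frame time in milliseconds at a given frame rate: 1000 / fps
for fps in (50, 60, 100, 144):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")
# 50 FPS -> 20.0 ms, so a 20 ms lag difference is one whole frame at 50 FPS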