Is it time for Nvidia to support Freesync?

Yes, G-Sync does have a larger variable refresh rate range, but as frame rates go higher the input latency drops; the 20ms difference only happens below 50fps, so it really depends. It's really hard to tell too, because the time between when you hit the key and when your eyes send the muzzle flash to your brain is in that range as well. Some people can see it, others can't.
 
Yes, G-Sync does have a larger variable refresh rate range, but as frame rates go higher the input latency drops; the 20ms difference only happens below 50fps, so it really depends. It's really hard to tell too, because the time between when you hit the key and when your eyes send the muzzle flash to your brain is in that range as well. Some people can see it, others can't.

The problem I see is that every device you interact with adds latency. So while 20ms may not be much on its own, the keyboard, drivers, monitor and network can all add additional latency, and it starts adding up.
 
Pretty sure that's all of the devices together ;), because I don't know of any other way to measure the latency. The only way right now is to use a slow-motion camera, record the gameplay, and count frames from there.
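To put rough numbers on that camera method (just an illustration with made-up figures, not anyone's actual measurement): record at a known capture rate, count the frames between the click and the muzzle flash, and convert to milliseconds.

# Hypothetical example: a 240 fps capture where the muzzle flash shows up
# 12 frames after the mouse click is visible on camera.
CAMERA_FPS = 240
frames_between_click_and_flash = 12

frame_duration_ms = 1000 / CAMERA_FPS                          # ~4.17 ms per captured frame
latency_ms = frames_between_click_and_flash * frame_duration_ms
print(f"End-to-end latency: ~{latency_ms:.1f} ms (+/- one camera frame)")

The resolution of the measurement is one camera frame, so the faster the camera, the tighter the estimate.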

Edit: This is also why it's not recommended to use V-Sync at all with G-Sync on nV cards (Fast Sync only); V-Sync really increases latency when you're above 60 fps.
 
I'm sticking with Gsync. If they supported adaptive sync I would be pretty surprised; they like their locked ecosystem. However, the fatal flaw of AMD's Freesync to me is that it doesn't work worth a shit on anything that isn't a top-of-the-line, brand-new AAA title (I've submitted bug reports on this for several years now and nothing has ever come of it, so AMD clearly doesn't care).

I've encountered exactly one game I can't use Gsync in, which is Skyrim. I have over a dozen games that I play regularly that absolutely freak out with Freesync and flicker like crazy to the point where I can't stand to look at it. As a matter of fact, the only thing I've played in the last 2 years that works with Freesync is Dragon Age; everything else I have to turn it off or it's spaz city.
 
As far as I can see, Adaptive Sync and Freesync are the same thing, or am I wrong in this regard? What makes Freesync different than Adaptive Sync? And no, I am not talking about NVIDIA's Adaptive V-Sync, which is different. So really both manufacturers should support Adaptive Sync, given that it is a VESA standard.

Adaptive Sync is the VESA DisplayPort standard. Because monitor manufacturers don't have to pay any royalties to AMD or VESA to use Adaptive Sync, it was called "Free" sync as a kind of joke by one of the tech websites and the name sort of stuck. So yes, they are the same thing.

Adaptive sync is the proper name for it.
 
Personally yes, but it would likely deliver a serious blow to AMD for reasons already supplied above and I do like competition as well.
 
From AMD's own FAQ on the subject:

Radeon FreeSync™ technology is a unique AMD hardware/software solution that utilizes DisplayPort Adaptive-Sync protocols to enable user-facing benefits: smooth, tearing-free and low-latency gameplay and video.

So, no, they're not identical; otherwise the word "unique" would be a false assertion.
 
I see a lot of crocodile tears. Nvidia can lower the price of G-sync at any time. That would absolutely kill FreeSync. You're free to buy an inferior FreeSync product. Please stop crying.
 
I see a lot of crocodile tears. Nvidia can lower the price of G-sync at any time. That would absolutely kill FreeSync. You're free to buy an inferior FreeSync product. Please stop crying.

The thing is, this issue has been around since CRTs went away way back when. We should have had a variable refresh standard long ago, but PC monitors aren't pulling in enough revenue for these companies. That's why TV technology is far outpacing monitors: the demand is there.
 
From AMD's own FAQ on the subject:

Radeon FreeSync™ technology is a unique AMD hardware/software solution that utilizes DisplayPort Adaptive-Sync protocols to enable user-facing benefits: smooth, tearing-free and low-latency gameplay and video.

So, no, they're not identical; otherwise the word "unique" would be a false assertion.

Radeon FreeSync technology applies to the hardware/software the graphics card uses to connect to an adaptive sync monitor.

But, that's not what's been discussed here. It's about monitors. And when people say Freesync monitors, they really mean adaptive sync monitors. In response to what Raidflex was asking, adaptive sync and freesync are the same thing.
 
I see a lot of crocodile tears. Nvidia can lower the price of G-sync at any time. That would absolutely kill FreeSync. You're free to buy an inferior FreeSync product. Please stop crying.

Nothing inferior about Freesync monitors. Some of the best monitors on the market are Freesync monitors.

Can Nvidia lower the price of Gsync? They have to buy the FPGA chip and program it for each monitor. How much can they reduce the price and still make money?

Not sure why I replied to your post as it is obviously just trolling.
 
Radeon FreeSync technology applies to the hardware/software the graphics card uses to connect to an adaptive sync monitor.

But, that's not what's been discussed here. It's about monitors. And when people say Freesync monitors, they really mean adaptive sync monitors. In response to what Raidflex was asking, adaptive sync and freesync are the same thing.


No they are not. Not all adaptive sync monitors are FreeSync monitors, and not all adaptive sync monitors have the refresh rate range needed to be used for FreeSync. This is why AMD does have some sort of certification process, just not a very extensive one, hence why different FreeSync monitors have different feature sets.

Nothing inferior about Freesync monitors. Some of the best monitors on the market are Freesync monitors.

Can Nvidia lower the price of Gsync? They have to buy the FPGA chip and program it for each monitor. How much can they reduce the price and still make money?

Not sure why I replied to your post as it is obviously just trolling.

Even the best FreeSync monitors don't have the same features as G-Sync. G-Sync has less input lag on any model, low end to high end, than even high-end FreeSync, and it has strict guidelines for a wider refresh rate range too.

https://www.techspot.com/article/1454-gsync-vs-freesync/

Ironically, JustReason must not have known LFC is part of ALL G-Sync monitors, so when buying a G-Sync monitor there's no need to worry about it not being there; it's part of the spec. With FreeSync it's not part of the specification, so buyer beware, though with high-end FreeSync monitors that feature will pretty much always be there.

The main takeaway from looking at a range of G-Sync and FreeSync displays is that G-Sync is a known quantity, whereas FreeSync monitors vary significantly in quality. Basically every G-Sync monitor is a high-end unit with gaming-suitable features, a large refresh window, support for LFC and ULMB – in other words, when purchasing a G-Sync monitor you can be sure you’re getting the best variable refresh experience and a great monitor in general.

With FreeSync, some monitors are gaming-focused with high-end features and support for LFC, but many aren’t and are more geared towards everyday office usage than gaming. Potential buyers will need to research FreeSync monitors more than with G-Sync equivalents to ensure they’re getting a good monitor with all the features necessary for the best variable refresh experience.

FreeSync 2 should fix, or at least close, the gap on input lag too.
FreeSync 2 is a much larger update, that not only includes support for HDR monitors, but also introduces a monitor validation program that will see only the best monitors receive a FreeSync 2 badge. FreeSync 2 monitors will have at least twice the maximum brightness and color volume over standard sRGB displays, and monitors will be validated to meet input lag standards (in the “few milliseconds” range). All FreeSync 2 monitors will support LFC.

Pretty much what nV did was make G-Sync a premium product; everyone else gets less, that's it. What AMD did was let everyone get FreeSync if they want to, with minimal cost over existing monitors, but it doesn't give all the features of G-Sync because they didn't have the time or resources, or whatever it was, to do a full-blown G-Sync competitor; most likely time, given how fast they rolled out FreeSync.

Look at it this way too: G-Sync is expensive not only for the consumer, it adds cost to the monitor, which means the manufacturer has to absorb cost on their end too. Hence why there are so few G-Sync monitors out there; manufacturers don't want to increase their own costs, let alone charge more for their products, right? Does AMD even have the market share to justify additional costs for end users for an ancillary product? Can they convince manufacturers to increase their costs as well? So maybe that's another reason they went with FreeSync out of the box instead of FreeSync 2.
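Since LFC keeps coming up, here is a minimal sketch of the frame-multiplication idea behind it (my own illustration with assumed panel numbers, not any vendor's actual algorithm): when the game's frame rate falls below the panel's minimum VRR refresh, each frame is simply repeated enough times to keep the panel inside its supported range.

def lfc_refresh(frame_rate_fps, vrr_min_hz, vrr_max_hz):
    # Inside the VRR window the refresh simply tracks the frame rate,
    # capped at the panel maximum.
    if frame_rate_fps >= vrr_min_hz:
        return min(frame_rate_fps, vrr_max_hz), 1
    # Below the window, find the smallest repeat count that lifts the
    # effective refresh back above the panel minimum.
    repeats = 2
    while frame_rate_fps * repeats < vrr_min_hz:
        repeats += 1
    return frame_rate_fps * repeats, repeats

# Assumed 40-144 Hz panel: 25 fps -> each frame shown twice at 50 Hz.
print(lfc_refresh(25, 40, 144))   # (50, 2)
print(lfc_refresh(90, 40, 144))   # (90, 1)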
 
Radeon FreeSync technology applies to the hardware/software the graphics card uses to connect to an adaptive sync monitor.

But, that's not what's been discussed here. It's about monitors. And when people say Freesync monitors, they really mean adaptive sync monitors. In response to what Raidflex was asking, adaptive sync and freesync are the same thing.

I'll take AMD's description of their own trademarked technology over an anonymous internet forum person. AMD doesn't agree with your description of FreeSync technology but you are free to tell them that you know more than they do.
 
That comparison video Linus created comparing G-Sync and FreeSync did nothing more than confuse me.
 
That comparison video Linus created comparing G-Sync and FreeSync did nothing more than confuse me.


That was why you shouldn't use V-Sync with G-Sync, lol, it doesn't work well ;) but even with V-Sync, when it matters most, when frame rates drop below 60 fps, it still helps. Fast Sync fixes all the problems Linus was having.

V-Sync increases input latency by itself. Fast Sync does too, but the difference is very large; we're looking at over 20ms between V-Sync and Fast Sync.

As frame rates increase, input latency should drop; with V-Sync that doesn't happen.
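A simplified back-of-the-envelope model of why that is (my own numbers, ignoring render queues, driver buffering and scanout time, so treat it as illustrative only):

REFRESH_HZ = 60
refresh_interval_ms = 1000 / REFRESH_HZ        # 16.7 ms between refreshes

def avg_wait_vsync_ms():
    # With V-Sync a finished frame waits for the next fixed refresh slot;
    # on average that's half a refresh interval, and it doesn't shrink no
    # matter how fast the GPU renders.
    return refresh_interval_ms / 2

def avg_wait_fast_sync_ms(render_fps):
    # With Fast Sync the newest completed frame is picked at each refresh,
    # so the average age of the displayed frame shrinks as fps climbs.
    return (1000 / render_fps) / 2

for fps in (60, 120, 240):
    print(f"{fps:>3} fps: V-Sync wait ~{avg_wait_vsync_ms():.1f} ms, "
          f"Fast Sync wait ~{avg_wait_fast_sync_ms(fps):.1f} ms")

The point is just that the V-Sync number stays pinned to the refresh rate while the Fast Sync number keeps dropping as the frame rate climbs.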
 
So much wrong with this post.

There's so much wrong with your rebuttal!

AMD didn't half ass the solution. Adaptive Sync is a VESA standard put forward by AMD, which Nvidia approved, or else it would never have become part of the Standard. Nvidia are using the exact same solution in their Gsync laptops.

Sure, because the basis of the technology, the signalling itself, is as you say below. However, G-Sync is a separate technology: Nvidia put hardware into monitors for the purpose of eliminating lag. They have been focused on smooth, lag-free gameplay for a very long time; during much of that time AMD had trouble just shipping hardware and occasionally shipping stable drivers to go with it. AMD was rightly called out on this, and as you can see further in the thread, there's a real, measurable lag penalty built into FreeSync.

FreeSync 2, which will be even more expensive, seeks to emulate the lag-reduction approach that Nvidia designed into G-Sync hardware by increasing the amount of hardware necessary to support FreeSync 2, as well as introducing a certification program. This is why I said AMD half-assed FreeSync, because they did, when Nvidia had everything implemented upon announcement. All AMD could show was a laptop that was modified to use the DisplayPort power-saving protocol along with a modified monitor.

Nvidia didn't develop the technology. AMD and Nvidia both got their idea from the power saving features of the embedded display port spec. The reason adaptive sync took longer to come to market is that it had to go through the whole approval process for the display port standard. AMD were thinking of sync tech long before Gsync was introduced. They put the hardware needed to work with adaptive sync monitors onto desktop cards before the release of Gsync.

So yes, Nvidia did develop the technology, above and beyond what was possible by just using the existing power-saving tech with modifications.

And now on to your assertion: prove that "AMD were thinking of sync tech long before Gsync was introduced. They put the hardware needed to work with adaptive sync monitors onto desktop cards before the release of Gsync". This will be fun.
 
There's so much wrong with your rebuttal!



Sure, because the basis of the technology, the signalling itself, is as you say below. However, G-Sync is a separate technology: Nvidia put hardware into monitors for the purpose of eliminating lag. They have been focused on smooth, lag-free gameplay for a very long time; during much of that time AMD had trouble just shipping hardware and occasionally shipping stable drivers to go with it. AMD was rightly called out on this, and as you can see further in the thread, there's a real, measurable lag penalty built into FreeSync.

FreeSync 2, which will be even more expensive, seeks to emulate the lag-reduction approach that Nvidia designed into G-Sync hardware by increasing the amount of hardware necessary to support FreeSync 2, as well as introducing a certification program. This is why I said AMD half-assed FreeSync, because they did, when Nvidia had everything implemented upon announcement. All AMD could show was a laptop that was modified to use the DisplayPort power-saving protocol along with a modified monitor.



So yes, Nvidia did develop the technology, above and beyond what was possible by just using the existing power-saving tech with modifications.

And now on to your assertion: prove that "AMD were thinking of sync tech long before Gsync was introduced. They put the hardware needed to work with adaptive sync monitors onto desktop cards before the release of Gsync". This will be fun.
No need to prove AMD did, only that others have. I found references as far back as the 90s, as in the 1990s. But a more in-depth look was posted in 2008 that specifically forecast what G-Sync and FreeSync do today. So NO, Nvidia didn't come up with this on their own; the theories and the technical capability have been there for a while, and, as you'd expect, the monitor manufacturers just weren't interested in bringing it forward. All you can attribute to Nvidia is bringing a form of it to market for gaming first, not being the creators.
 
No need to prove AMD did, only that others have. I found references as far back as the 90s, as in the 1990s. But a more in-depth look was posted in 2008 that specifically forecast what G-Sync and FreeSync do today. So NO, Nvidia didn't come up with this on their own; the theories and the technical capability have been there for a while, and, as you'd expect, the monitor manufacturers just weren't interested in bringing it forward. All you can attribute to Nvidia is bringing a form of it to market for gaming first, not being the creators.

Don't misquote me - I'm not saying that Nvidia created the technology for adaptive v-sync; I'm saying that they created the technology to go above and beyond what was available in the power-saving protocol to solve the whole problem.

They had it solved and marketable before AMD had anything to show.

And please understand- I'm not rooting for Nvidia or dissing AMD, I'm calling it like it is. Nvidia was on the ball here.
 
Don't misquote me - I'm not saying that Nvidia created the technology for adaptive v-sync; I'm saying that they created the technology to go above and beyond what was available in the power-saving protocol to solve the whole problem.

They had it solved and marketable before AMD had anything to show.

And please understand- I'm not rooting for Nvidia or dissing AMD, I'm calling it like it is. Nvidia was on the ball here.
Fair point and taken.
 
Nvidia are using the exact same solution in their Gsync laptops.

Monitor manufacturers can use adaptive sync or not. While there are some monitors without full range, there are loads of monitors with full range.

Nvidia didn't develop the technology. AMD and Nvidia both got their idea from the power saving features of the embedded display port spec. The reason adaptive sync took longer to come to market is that it had to go through the whole approval process for the display port standard. AMD were thinking of sync tech long before Gsync was introduced. They put the hardware needed to work with adaptive sync monitors onto desktop cards before the release of Gsync.


The person you quoted didn't say Freesync was free; he said the standard was free, which it is. He was wondering whether the Freesync 2 standard would cost money.

It must not cost that much money, because AOC are releasing two 27-inch 1440p gaming monitors: one with G-Sync and one with FreeSync 2. The FreeSync 2 monitor has HDR, the G-Sync one does not. The FreeSync 2 monitor is $120 cheaper.

While the eDP standard for variable v-blank already existed for power saving in laptops, in 2013 Nvidia was the first to ship a fully working application of variable v-blank that solved screen tearing in games without the inherent disadvantages of traditional v-sync, initially available as a G-Sync DIY kit for the Asus VG248QE gaming monitor. AMD was caught off guard by this, which is why it was obvious they rushed to offer their own competing VRR solution with FreeSync, kind of like how Nvidia reacted when ATi launched Eyefinity. I still remember AMD's FreeSync windmill tech demo on a laptop that followed months later, and how the early press releases implied it might be possible for end users to have FreeSync enabled for free via a monitor firmware update, making it the more compelling option over G-Sync.

We all know how that didn't materialize once AMD realized that getting variable v-blank from the eDP standard for laptop displays to work on a desktop monitor wasn't as simple as they first thought, which is why they had to go upstream in the monitor supply chain and forge partnerships with the major desktop monitor scaler manufacturers, MStar, Novatek and Realtek, to implement FreeSync on the desktop, since these monitors require scaler boards between the panel and the GPU. It has to be clarified that embedded DisplayPort in laptops lets the display panel be interfaced directly with the GPU without a scaler board; that is why AMD was able to whip up their initial FreeSync tech demo on a laptop in such a short amount of time, and also why G-Sync on laptops is done differently from the desktop implementation, which requires the scaler board replacement, so they cannot be considered the same thing.

Eventually, it took AMD about two years after the launch of the first G-Sync DIY board before the first FreeSync monitors started selling, so AMD was obviously playing catch-up in the VRR game. AMD eventually adjusted their PR to say that FreeSync is "royalty free" for monitor OEMs to adopt, so the savings can be passed on to consumers; but still, getting FreeSync required the purchase of monitors and GPUs that supported it, so in the end it wasn't actually free.

With much looser certification requirements and the full intent of getting the technology into as many monitors as possible, FreeSync was part of a strategy to help AMD sell more of their GPUs rather than to profit directly from sales of FreeSync monitors. Unfortunately, this has had the side effect of fragmentation and an inconsistent VRR experience across different FreeSync monitors, with varying VRR ranges, overdrive/ghosting compensation, LFC support, etc. It's really more about quantity over quality.

FreeSync 2 improves on this with mandatory LFC support but is still more accessible, with lower HDR certification requirements compared to G-Sync HDR. G-Sync HDR follows the HDR10 standard more closely, like requiring at least 1000 nits of peak brightness to be certified, and that adds to the cost.
 
While the eDP standard for variable v-blank already existed for power saving in laptops, in 2013 Nvidia was the first to ship a fully working application of variable v-blank that solved screen tearing in games without the inherent disadvantages of traditional v-sync, initially available as a G-Sync DIY kit for the Asus VG248QE gaming monitor. AMD was caught off guard by this, which is why it was obvious they rushed to offer their own competing VRR solution with FreeSync, kind of like how Nvidia reacted when ATi launched Eyefinity. I still remember AMD's FreeSync windmill tech demo on a laptop that followed months later, and how the early press releases implied it might be possible for end users to have FreeSync enabled for free via a monitor firmware update, making it the more compelling option over G-Sync.

We all know how that didn't materialize once AMD realized that getting variable v-blank from the eDP standard for laptop displays to work on a desktop monitor wasn't as simple as they first thought, which is why they had to go upstream in the monitor supply chain and forge partnerships with the major desktop monitor scaler manufacturers, MStar, Novatek and Realtek, to implement FreeSync on the desktop, since these monitors require scaler boards between the panel and the GPU. It has to be clarified that embedded DisplayPort in laptops lets the display panel be interfaced directly with the GPU without a scaler board; that is why AMD was able to whip up their initial FreeSync tech demo on a laptop in such a short amount of time, and also why G-Sync on laptops is done differently from the desktop implementation, which requires the scaler board replacement, so they cannot be considered the same thing.

Eventually, it took AMD about two years after the launch of the first G-Sync DIY board before the first FreeSync monitors started selling, so AMD was obviously playing catch-up in the VRR game. AMD eventually adjusted their PR to say that FreeSync is "royalty free" for monitor OEMs to adopt, so the savings can be passed on to consumers; but still, getting FreeSync required the purchase of monitors and GPUs that supported it, so in the end it wasn't actually free.

With much looser certification requirements and the full intent of getting the technology into as many monitors as possible, FreeSync was part of a strategy to help AMD sell more of their GPUs rather than to profit directly from sales of FreeSync monitors. Unfortunately, this has had the side effect of fragmentation and an inconsistent VRR experience across different FreeSync monitors, with varying VRR ranges, overdrive/ghosting compensation, LFC support, etc. It's really more about quantity over quality.

FreeSync 2 improves on this with mandatory LFC support but is still more accessible, with lower HDR certification requirements compared to G-Sync HDR. G-Sync HDR follows the HDR10 standard more closely, like requiring at least 1000 nits of peak brightness to be certified, and that adds to the cost.
Nice post!

Subjectively, 1000 nits of brightness is plain old dumb unless you are working on a park bench outside in full sun. I can't imagine what 1000 nits of brightness would be like in a game at night with the lights off in your den when it immediately transitions from a dark scene to a fully lit scene. Think Doom, or that kind of game. It'd make you see spots. HDR range issues like this have plagued early adopters of new HDR projectors on the home theater forums. There's dynamic...and then there is painful.
 
Nice post!

Subjectively, 1000 nits of brightness is plain old dumb unless you are working on a park bench outside in full sun. I can't imagine what 1000 nits of brightness would be like in a game at night with the lights off in your den when it immediately transitions from a dark scene to a fully lit scene. Think Doom, or that kind of game. It'd make you see spots. HDR range issues like this have plagued early adopters of new HDR projectors on the home theater forums. There's dynamic...and then there is painful.
hmmm, might have to wear sun glasses . . . Maybe rapid tinting transition glasses . . . ;)
 
Nice post!

Subjectively, 1000 nits of brightness is plain old dumb unless you are working on a park bench outside in full sun. I can't imagine what 1000 nits of brightness would be like in a game at night with the lights off in your den when it immediately transitions from a dark scene to a fully lit scene. Think Doom, or that kind of game. It'd make you see spots. HDR range issues like this have plagued early adopters of new HDR projectors on the home theater forums. There's dynamic...and then there is painful.

For HDR content that presents problems for viewers/users, I'd posit that the content creators are essentially doing it wrong.

The basic point of HDR (in and of itself, in any application) is to extend the range between the lowest signal and the highest signal that can be discretely captured or represented. For monitors/TVs, this means that you can have very dark stuff and very bright stuff on screen at the same time.

It does not mean that viewers/users should be blasted by 1000nits across the screen. It means that, for example, if the sun is in the frame, it will actually be really bright and not just a dim white, while everything else will be normally lit, while stuff in shadows will still be visible instead of black.

Dolby Cinema at AMC is a great example of HDR properly applied, for the movies I've seen presented in those theaters.
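To make the range point concrete: HDR10 stores luminance with the SMPTE ST 2084 (PQ) curve, which spreads code values across an absolute 0 to 10,000 nit scale, so a 1000-nit highlight and a dim shadow can sit in the same frame. A quick sketch of the PQ encoding (the constants are from the standard; the sample values are just mine):

# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> 10-bit code value.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    y = min(max(nits / 10000.0, 0.0), 1.0)            # normalize to the 10,000-nit ceiling
    n = ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2
    return round(n * 1023)                            # 10-bit code value

# Shadow detail, a midtone and a bright highlight all fit in one signal:
for nits in (0.05, 100, 1000):
    print(f"{nits:>7} nits -> PQ code {pq_encode(nits)}")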
 
Transitioning glasses usually do not react to bright light, they react to UV light in sunlight.

You can test this by being in a car, glass in the windscreen usually do a good job of absorbing UV light so your transition lenses won't turn as dark as you would standing outside.

Source: I own a pair of transition glasses.
hmmm, might have to wear sun glasses . . . Maybe rapid tinting transition glasses . . . ;)
 
For HDR content that presents problems for viewers/users, I'd posit that the content creators are essentially doing it wrong.

The basic point of HDR (in and of itself, in any application) is to extend the range between the lowest signal and the highest signal that can be discretely captured or represented. For monitors/TVs, this means that you can have very dark stuff and very bright stuff on screen at the same time.

It does not mean that viewers/users should be blasted by 1000nits across the screen. It means that, for example, if the sun is in the frame, it will actually be really bright and not just a dim white, while everything else will be normally lit, while stuff in shadows will still be visible instead of black.

Dolby Cinema at AMC is a great example of HDR properly applied, for the movies I've seen presented in those theaters.


Yeah, actually any bright light source in a scene can easily produce more than 1000 nits when you're looking straight at it; the FOV should be able to regulate the brightness of the light as well, based on distance and angle.
 
While the eDP standard for variable v-blank already existed for power saving in laptops, in 2013 Nvidia was the first to ship a fully working application of variable v-blank that solved screen tearing in games without the inherent disadvantages of traditional v-sync, initially available as a G-Sync DIY kit for the Asus VG248QE gaming monitor. AMD was caught off guard by this, which is why it was obvious they rushed to offer their own competing VRR solution with FreeSync, kind of like how Nvidia reacted when ATi launched Eyefinity. I still remember AMD's FreeSync windmill tech demo on a laptop that followed months later, and how the early press releases implied it might be possible for end users to have FreeSync enabled for free via a monitor firmware update, making it the more compelling option over G-Sync.

We all know how that didn't materialize once AMD realized that getting variable v-blank from the eDP standard for laptop displays to work on a desktop monitor wasn't as simple as they first thought, which is why they had to go upstream in the monitor supply chain and forge partnerships with the major desktop monitor scaler manufacturers, MStar, Novatek and Realtek, to implement FreeSync on the desktop, since these monitors require scaler boards between the panel and the GPU. It has to be clarified that embedded DisplayPort in laptops lets the display panel be interfaced directly with the GPU without a scaler board; that is why AMD was able to whip up their initial FreeSync tech demo on a laptop in such a short amount of time, and also why G-Sync on laptops is done differently from the desktop implementation, which requires the scaler board replacement, so they cannot be considered the same thing.

Eventually, it took AMD about two years after the launch of the first G-Sync DIY board before the first FreeSync monitors started selling, so AMD was obviously playing catch-up in the VRR game. AMD eventually adjusted their PR to say that FreeSync is "royalty free" for monitor OEMs to adopt, so the savings can be passed on to consumers; but still, getting FreeSync required the purchase of monitors and GPUs that supported it, so in the end it wasn't actually free.

With much looser certification requirements and the full intent of getting the technology into as many monitors as possible, FreeSync was part of a strategy to help AMD sell more of their GPUs rather than to profit directly from sales of FreeSync monitors. Unfortunately, this has had the side effect of fragmentation and an inconsistent VRR experience across different FreeSync monitors, with varying VRR ranges, overdrive/ghosting compensation, LFC support, etc. It's really more about quantity over quality.

FreeSync 2 improves on this with mandatory LFC support but is still more accessible, with lower HDR certification requirements compared to G-Sync HDR. G-Sync HDR follows the HDR10 standard more closely, like requiring at least 1000 nits of peak brightness to be certified, and that adds to the cost.


In 2011 the Embedded DisplayPort 1.3 spec was introduced. This brought with it the Panel Self Refresh feature. It was this feature that led to the sync tech we now have in desktop monitors, G-Sync and FreeSync. In October 2013 Nvidia had the press release and demo of G-Sync. The DIY kits for the VG248QE weren't available until 2014. The first G-Sync monitor wasn't available until the end of August 2014.

AMD were caught off guard by how quickly Nvidia got their demo working and out the door, and this resulted in their pretty bad laptop demo at CES in January 2014. I don't think they were at all surprised by Nvidia releasing Gsync though, as I think both companies had been working towards sync tech. Nvidia had the resources and the money to get their solution out first.

Why do I think both companies had been working towards their own solutions? Well, Nvidia and AMD were on the board of directors of VESA at that time. AMD put the final adaptive sync proposal to VESA in November 2013. This would have been in the works long before that, though, and the VESA board would have seen previous versions of the proposal. Isn't it funny how, in October, Nvidia had their G-Sync demo? Remember, it just came out of the blue. There are other reasons I think AMD were thinking of sync tech before Nvidia's demo. First of all, they made their new cards, the R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260, compatible with adaptive sync. The R9 290/290X and R7 260X are interesting because these cards were announced and available for sale before the Nvidia demo. These GPUs were all compatible with a DisplayPort spec that wouldn't exist until the middle of 2014 and monitors that wouldn't be released until 2015. Why put hardware not needed by discrete desktop GPUs into these cards if they hadn't been considering sync technology? Surely it's not beyond belief that a company specialising in APUs and working closely with the embedded DisplayPort spec would have come up with a sync tech idea on their own?

You got the details slightly wrong with the firmware as well. There were claims that some monitors could be upgraded by firmware to support adaptive sync, but only if the monitor already had the hardware (appropriate scaler etc.). There were a couple of companies that considered doing it but then changed their minds; Iiyama and Nixeus (it was a Nixeus monitor used in the Computex 2014 FreeSync demo) were two that I can remember. But the firmware upgrade involved sending your monitor back to the company to get it done, they were talking about a 28-day turnaround, and in the end both companies decided not to go ahead with it. AMD's stance on this was simple: some monitors could be upgraded, but it was up to the monitor manufacturer to offer that service.

Not sure what you are trying to say with the scalers. All monitors require scalers. In a laptop the GPU is designed to do that job. In G-Sync monitors the FPGA chip handles the scaling. For the adaptive sync DisplayPort certification you need a scaler that suits. You're trying to tell me that AMD didn't realise they needed a scaler? LOL, sorry, they wrote the proposal that VESA accepted, and Nvidia is part of VESA; Nvidia is on the board of directors. Of course they knew they would need appropriate scalers. The long delay was for other reasons and not totally down to AMD. Well, maybe it was, because, for whatever reason, AMD chose to go the open route. That involved going with open standards and getting certification and all the delays that entails. Remember, it was only in May 2014 that DisplayPort 1.2a was released, and it takes time for manufacturers to adopt new standards. It took longer for AMD to get the first FreeSync monitor on the market, but it wasn't as long as you are making it out to be. The first FreeSync monitor was released in March 2015. There were ghosting issues, but by May those issues were resolved, so say 16 months from demo to the first properly working FreeSync monitor, compared to 10 months for the first proper G-Sync monitor. Not too bad considering the much slower route AMD had to take to get there. And it wasn't 2 years after the Asus DIY kit either, maybe a year, because that was late too.

You are mistaken about where the "free" came from. It wasn't because the hardware was free but because monitor manufacturers didn't have to pay anything to use the DisplayPort 1.2a specification. Not sure why people keep repeating this rubbish: "Oh, it's not free, you still have to buy the monitor." Well, of course you do. It was only rampant Nvidia fanboys that thought otherwise. I don't know any sane person who thought the hardware was going to be free.

I agree with you, FreeSync was part of AMD's plan for selling more GPUs. Of course it was, just like Nvidia's G-Sync; they aren't charities. Monitors are pretty inconsistent across all price ranges. Yeah, sure, G-Sync is more consistent as it has the same range across all its monitors, but at a pretty hefty price tag. With a little research you can find full-range FreeSync monitors for a lot cheaper. Are there FreeSync monitors out there without LFC and with only 25 Hz ranges? Yes, there are. But I'm not sure what the fuss is about. Low frame rates are low frame rates; if you are getting under 40 fps then games still feel crap on G-Sync and FreeSync. Surely it's better for the guy with a 60 Hz monitor to have a 20 fps FreeSync range than none at all?

And, as you say, AMD are working towards making FreeSync 2 more high-end, and that's a good thing. I am not well up on either G-Sync HDR or FreeSync HDR, but my understanding is that they both use HDR10. And, feel free to correct me, I thought there were two versions: one with a higher peak luminance of 1000 nits but not so low on the blacks, and the other with a peak luminance of 540 nits but much lower blacks. It's simply which one you opt for, and Nvidia have chosen the higher one. It will be interesting to see both in action. I hope Windows sorts out its HDR support and games are widely available before the monitors launch next year.
 
In 2011 the Embedded DisplayPort 1.3 spec was introduced. This brought with it the Panel Self Refresh feature. It was this feature that led to the sync tech we now have in desktop monitors, G-Sync and FreeSync. In October 2013 Nvidia had the press release and demo of G-Sync. The DIY kits for the VG248QE weren't available until 2014. The first G-Sync monitor wasn't available until the end of August 2014.


AMD were caught off guard by how quickly Nvidia got their demo working and out the door, and this resulted in their pretty bad laptop demo at CES in January 2014. I don't think they were at all surprised by Nvidia releasing Gsync though, as I think both companies had been working towards sync tech. Nvidia had the resources and the money to get their solution out first.


Why do I think both companies had been working towards their own solutions? Well, Nvidia and AMD were on the board of directors of VESA at that time. AMD put the final adaptive sync proposal to VESA in November 2013. This would have been in the works long before that, though, and the VESA board would have seen previous versions of the proposal. Isn't it funny how, in October, Nvidia had their G-Sync demo? Remember, it just came out of the blue. There are other reasons I think AMD were thinking of sync tech before Nvidia's demo. First of all, they made their new cards, the R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260, compatible with adaptive sync. The R9 290/290X and R7 260X are interesting because these cards were announced and available for sale before the Nvidia demo. These GPUs were all compatible with a DisplayPort spec that wouldn't exist until the middle of 2014 and monitors that wouldn't be released until 2015. Why put hardware not needed by discrete desktop GPUs into these cards if they hadn't been considering sync technology? Surely it's not beyond belief that a company specialising in APUs and working closely with the embedded DisplayPort spec would have come up with a sync tech idea on their own?


You got the details slightly wrong with the firmware as well. There were claims that some monitors could be upgraded by firmware to support adaptive sync, but only if the monitor already had the hardware (appropriate scaler etc.). There were a couple of companies that considered doing it but then changed their minds; Iiyama and Nixeus (it was a Nixeus monitor used in the Computex 2014 FreeSync demo) were two that I can remember. But the firmware upgrade involved sending your monitor back to the company to get it done, they were talking about a 28-day turnaround, and in the end both companies decided not to go ahead with it. AMD's stance on this was simple: some monitors could be upgraded, but it was up to the monitor manufacturer to offer that service.


Not sure what you are trying to say with the scalers. All monitors require scalers. In a laptop the GPU is designed to do that job. In G-Sync monitors the FPGA chip handles the scaling. For the adaptive sync DisplayPort certification you need a scaler that suits. You're trying to tell me that AMD didn't realise they needed a scaler? LOL, sorry, they wrote the proposal that VESA accepted, and Nvidia is part of VESA; Nvidia is on the board of directors. Of course they knew they would need appropriate scalers. The long delay was for other reasons and not totally down to AMD. Well, maybe it was, because, for whatever reason, AMD chose to go the open route. That involved going with open standards and getting certification and all the delays that entails. Remember, it was only in May 2014 that DisplayPort 1.2a was released, and it takes time for manufacturers to adopt new standards. It took longer for AMD to get the first FreeSync monitor on the market, but it wasn't as long as you are making it out to be. The first FreeSync monitor was released in March 2015. There were ghosting issues, but by May those issues were resolved, so say 16 months from demo to the first properly working FreeSync monitor, compared to 10 months for the first proper G-Sync monitor. Not too bad considering the much slower route AMD had to take to get there. And it wasn't 2 years after the Asus DIY kit either, maybe a year, because that was late too.


You are mistaken about where the "free" came from. It wasn't because the hardware was free but because monitor manufacturers didn't have to pay anything to use the DisplayPort 1.2a specification. Not sure why people keep repeating this rubbish: "Oh, it's not free, you still have to buy the monitor." Well, of course you do. It was only rampant Nvidia fanboys that thought otherwise. I don't know any sane person who thought the hardware was going to be free.


I agree with you, FreeSync was part of AMD's plan for selling more GPUs. Of course it was, just like Nvidia's G-Sync; they aren't charities. Monitors are pretty inconsistent across all price ranges. Yeah, sure, G-Sync is more consistent as it has the same range across all its monitors, but at a pretty hefty price tag. With a little research you can find full-range FreeSync monitors for a lot cheaper. Are there FreeSync monitors out there without LFC and with only 25 Hz ranges? Yes, there are. But I'm not sure what the fuss is about. Low frame rates are low frame rates; if you are getting under 40 fps then games still feel crap on G-Sync and FreeSync. Surely it's better for the guy with a 60 Hz monitor to have a 20 fps FreeSync range than none at all?


And, as you say, AMD are working towards making FreeSync 2 more high-end, and that's a good thing. I am not well up on either G-Sync HDR or FreeSync HDR, but my understanding is that they both use HDR10. And, feel free to correct me, I thought there were two versions: one with a higher peak luminance of 1000 nits but not so low on the blacks, and the other with a peak luminance of 540 nits but much lower blacks. It's simply which one you opt for, and Nvidia have chosen the higher one. It will be interesting to see both in action. I hope Windows sorts out its HDR support and games are widely available before the monitors launch next year.


First of all, VRR works by manipulating the vBlank interval sent to the monitor so the refresh rate is synced with the fps, where vBlank is the interval between the time a monitor finishes drawing the current frame and the beginning of the next frame. Let me reiterate: it's a different application of variable refresh. That's where the innovation came from.
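Roughly, the idea looks like this (a simplified model of my own, with an assumed 40-144 Hz panel, not either vendor's implementation): the GPU holds the panel in vertical blanking until the next frame is ready, so the refresh interval tracks the render time, clamped to the panel's supported VRR window.

def vrr_refresh_interval_ms(frame_time_ms, vrr_min_hz=40, vrr_max_hz=144):
    # The vBlank period is stretched until the next frame arrives, but never
    # shorter than the panel's fastest refresh or longer than its slowest.
    shortest = 1000 / vrr_max_hz
    longest = 1000 / vrr_min_hz
    return min(max(frame_time_ms, shortest), longest)

# An 11 ms frame (~90 fps) is matched exactly; a 40 ms frame (25 fps) is capped
# at this assumed panel's 25 ms maximum interval (its 40 Hz minimum), which is
# where LFC-style frame repetition would kick in.
print(vrr_refresh_interval_ms(11.0))   # 11.0
print(vrr_refresh_interval_ms(40.0))   # 25.0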

Anyway, it was obvious AMD didn't even think about VRR, since they were more focused on delivering their "titan-killing" hardware, drivers, ISV relations, as well as their "Never Settle" and "Gaming Evolved" marketing campaigns at the time. When G-Sync was announced, AMD reacted. It was like how AMD only thought about fixing their frame pacing issues after frame time benchmarking and subjective smoothness testing revealed that Nvidia was already on top of the issue with both single GPU and, more importantly, SLI. They were already doing internal testing on frame times with FCAT, which was eventually shared with the press to use for reviews. In fact, the first indicator of Nvidia developing superior alternatives to V-Sync that address screen tearing, input lag and stutter was when they introduced "Adaptive V-Sync" in the NV control panel back in 2012. It wasn't a perfect solution, so it was natural for Nvidia not to stop there and to continue developing their sync technology. The salient point here is that they were paying attention to the problem and doing something about it before anyone else did. Nvidia, like all other tech companies, works on top-secret innovations, and when these features are released and well received, the competition tends to scramble to come up with their own version. Nothing unusual here.

It's not entirely unlikely that, despite AMD claiming in their marketing materials that their 200-series GPUs have updated display controllers to support adaptive sync, it may very well have meant that they had just developed a way to make FreeSync work on these cards through the drivers, only requiring display controllers that are DP 1.2a compliant, which these cards already had and the previous generations did not. You failed to mention that Adaptive Sync was an initiative solely proposed by AMD to VESA as an additional optional feature that updated the existing DP 1.2a standard as groundwork for their FreeSync implementation, and this happened after G-Sync was announced in 2013. Adaptive sync was not originally part of the DP 1.2a spec when it was ratified by the VESA group.

About the firmware, all I said was that there was an implication it could be done for free, and now I realize maybe at a nominal logistical cost. It was the early days, so the initial PR indicated it as I said, and the point here was the cost. The details you mentioned came later as the situation developed. About "free"... please re-read my post about that and royalties and such. As I said, their PR's definition of "free" changed as the situation developed for FreeSync.

You totally misunderstood my point about scalers. What I was saying is that laptops do not require discrete scaler boards thanks to eDP, and I know the GPU handles that job instead. It's different for desktop monitors, which do require these scaler boards. They are not directly comparable in how VRR is implemented on each of them. You claimed that "Nvidia are using the exact same solution in their Gsync laptops," so no, that is not the case at all.

AMD simply needed the time to develop FreeSync from the ground up. The groundwork for the adaptive sync VESA DP 1.2a optional feature and getting the scaler manufacturers on board was just part of it. You do remember the teething pains FreeSync went through early on. They didn't even have LFC until they found out the G-Sync module was responsible for frame multiplication below the panel's minimum supported VRR refresh rate. Early FreeSync models also had terrible ghosting issues, as seen in their own windmill demo, whereas from the beginning the G-Sync module already had overdrive and anti-ghosting algorithms running on it that minimized these VRR artifacts. Clearly, FreeSync was in the early days of its development. It can't be denied that G-Sync was already working from day one, even if wide availability came a bit later; it was brand new tech, so availability had to ramp up from there. The fact of the matter is, Nv was first out of the gate with this technology. Considering that G-Sync even uses a hardware module and how polished it was from the get-go, it's easy to tell that in this case G-Sync had been in development much earlier and longer.

The status quo right now is that FreeSync and G-Sync address different markets. The latter seems to have settled on being the premium product for those who are willing to pay for it. As always, the market will dictate how this all turns out in the end.

This time it was AMD who was first out of the gate with a FreeSync 2 HDR monitor, the Samsung CHG70, but unfortunately the HDR implementation and experience is half-baked at best according to the published reviews, particularly for having very few local dimming zones and less dynamic range capability relative to the HDR10 spec. Of course, we can't entirely blame AMD here, because there are many other factors hindering the proper adoption of HDR on the PC. Maybe that's why G-Sync HDR has been delayed.
 
First of all, VRR works by manipulating the vBlank interval sent to the monitor so the refresh rate is synced with the fps, where vBlank is the interval between the time a monitor finishes drawing the current frame and the beginning of the next frame. Let me reiterate: it's a different application of variable refresh. That's where the innovation came from.

Anyway, it was obvious AMD didn't even think about VRR, since they were more focused on delivering their "titan-killing" hardware, drivers, ISV relations, as well as their "Never Settle" and "Gaming Evolved" marketing campaigns at the time. When G-Sync was announced, AMD reacted. It was like how AMD only thought about fixing their frame pacing issues after frame time benchmarking and subjective smoothness testing revealed that Nvidia was already on top of the issue with both single GPU and, more importantly, SLI. They were already doing internal testing on frame times with FCAT, which was eventually shared with the press to use for reviews. In fact, the first indicator of Nvidia developing superior alternatives to V-Sync that address screen tearing, input lag and stutter was when they introduced "Adaptive V-Sync" in the NV control panel back in 2012. It wasn't a perfect solution, so it was natural for Nvidia not to stop there and to continue developing their sync technology. The salient point here is that they were paying attention to the problem and doing something about it before anyone else did. Nvidia, like all other tech companies, works on top-secret innovations, and when these features are released and well received, the competition tends to scramble to come up with their own version. Nothing unusual here.

It's not entirely unlikely that, despite AMD claiming in their marketing materials that their 200-series GPUs have updated display controllers to support adaptive sync, it may very well have meant that they had just developed a way to make FreeSync work on these cards through the drivers, only requiring display controllers that are DP 1.2a compliant, which these cards already had and the previous generations did not. You failed to mention that Adaptive Sync was an initiative solely proposed by AMD to VESA as an additional optional feature that updated the existing DP 1.2a standard as groundwork for their FreeSync implementation, and this happened after G-Sync was announced in 2013. Adaptive sync was not originally part of the DP 1.2a spec when it was ratified by the VESA group.

About the firmware, all I said was that there was an implication it could be done for free, and now I realize maybe at a nominal logistical cost. It was the early days, so the initial PR indicated it as I said, and the point here was the cost. The details you mentioned came later as the situation developed. About "free"... please re-read my post about that and royalties and such. As I said, their PR's definition of "free" changed as the situation developed for FreeSync.

You totally misunderstood my point about scalers. What I was saying is that laptops do not require discrete scaler boards thanks to eDP, and I know the GPU handles that job instead. It's different for desktop monitors, which do require these scaler boards. They are not directly comparable in how VRR is implemented on each of them. You claimed that "Nvidia are using the exact same solution in their Gsync laptops," so no, that is not the case at all.

AMD simply needed the time to develop FreeSync from the ground up. The groundwork for the adaptive sync VESA DP 1.2a optional feature and getting the scaler manufacturers on board was just part of it. You do remember the teething pains FreeSync went through early on. They didn't even have LFC until they found out the G-Sync module was responsible for frame multiplication below the panel's minimum supported VRR refresh rate. Early FreeSync models also had terrible ghosting issues, as seen in their own windmill demo, whereas from the beginning the G-Sync module already had overdrive and anti-ghosting algorithms running on it that minimized these VRR artifacts. Clearly, FreeSync was in the early days of its development. It can't be denied that G-Sync was already working from day one, even if wide availability came a bit later; it was brand new tech, so availability had to ramp up from there. The fact of the matter is, Nv was first out of the gate with this technology. Considering that G-Sync even uses a hardware module and how polished it was from the get-go, it's easy to tell that in this case G-Sync had been in development much earlier and longer.

The status quo right now is that FreeSync and G-Sync address different markets. The latter seems to have settled on being the premium product for those who are willing to pay for it. As always, the market will dictate how this all turns out in the end.

This time it was AMD who was first out of the gate with a FreeSync 2 HDR monitor, the Samsung CHG70, but unfortunately the HDR implementation and experience is half-baked at best according to the published reviews, particularly for having very few local dimming zones and less dynamic range capability relative to the HDR10 spec. Of course, we can't entirely blame AMD here, because there are many other factors hindering the proper adoption of HDR on the PC. Maybe that's why G-Sync HDR has been delayed.

Yeah, I know what vBlank is. But it was the Panel Self Refresh feature of the eDP spec that led to the sync technologies we now have. It introduced the TCON and framebuffers needed to manipulate the vBlank signal.

Wait, you are concluding that AMD couldn't have been thinking about VRR because Nvidia came out with Adaptive V-Sync first? And the other reason you give is that they were too busy at the time? What? Do you think VRR solutions are invented in a month? Do you really think that AMD, with its very limited resources and in the middle of a very busy month just after launching their new cards, was able to come up with their own solution to VRR in less than a month? Not a hope. Not only did they somehow manage to come up with a VRR solution, they also managed to submit a change to VESA and get to the final proposal stages of VESA's certification system in that same month. Which is kind of impossible.

And sorry, your next paragraph makes no sense whatsoever. We were talking about adaptive sync, which was only introduced to the DisplayPort spec in May 2014, and, as you say, only as an optional part of the spec; it's still an optional part. So that makes it even stranger. Why would AMD make their cards compatible with an optional part of the spec if they weren't considering a VRR solution? If a simple driver update was all that was needed, why didn't they make their 7-series cards compatible? Especially as several of the cards compatible with FreeSync were just straight-up rebrands. It would have required very little extra work for AMD and none at all for the rebrands. It makes no sense that they didn't, especially for the kudos it would have earned them.

It's a moot point anyway, because you need extra hardware between the DisplayPort and the GPU to be compatible with the 1.2a adaptive sync specification. That's why the R7 260 and R7 265 work with VRR monitors and the 7790 does not. It's also why all the GCN APUs work with VRR monitors: because of the eDP specification, they already have the needed hardware.

So, yeah, they were working on VRR before Nvidia announced G-Sync. But you carry on believing that somehow AMD, in less than a month, developed a VRR solution and put forward a proposal to VESA in the middle of November 2013 that was good enough to pass VESA's certification process. And all this while in the middle of one of their busiest months ever. It's AMD. They don't do anything quickly.

I didn't misunderstand your point about scalers at all. You did your best to make it sound like AMD didn't have a clue when they started and thought all monitors would work with VRR, which wasn't the case at all. Proper scalers were part of the specification. And I didn't say that G-Sync on desktops is the same as G-Sync on laptops. But G-Sync on laptops is the same as adaptive sync. Nvidia even said this was the case: that they didn't need to use the G-Sync module because the hardware and standards were already there in laptops. FreeSync and G-Sync on laptops will be utilising the same eDP standards, those same eDP standards which led to adaptive sync.

Their definition of "free" never changed. NEVER. The "Free" was entirely down to not having to pay VESA or AMD to use adaptive sync in your monitors. That's where it came from. The other crap you are talking about came from a statement that maybe some monitors could be upgraded for free if they had the right hardware. It was never the case that every monitor would be upgradable or that every GPU would support it. Why do you persist with this? The FreeSync name never meant that the monitors and GPU would be free.

Gsync had teething problems too. There was a flicker issue that took months to solve, and then there was the input lag problem, which again took a long time to resolve. And despite their huge clout and resources it still took them 10 months to get a monitor to market. It wasn't all plain sailing; there were issues to be resolved. And I did mention the ghosting problems that AMD had with Freesync. I even allowed time for that by saying it was 16 months from demo to launch, when in fact it was 13 months. But the ghosting was so bad on the first monitors that they weren't great for VRR. LFC (Low Framerate Compensation) came out in November, but there were several full-range monitors out before then, so it's not as if there weren't choices for people. AMD simply do not have the resources; it takes them time to implement everything and get all the bug fixes out there.
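Since LFC keeps coming up: here's a rough sketch in Python (my own toy model with made-up panel numbers, not AMD's actual algorithm) of the basic idea, namely that when the game drops below the panel's minimum refresh rate the driver shows each rendered frame more than once so the effective refresh stays inside the VRR window.

```python
# Hypothetical VRR window for illustration; LFC needs max to be roughly 2x min or more.
PANEL_MIN_HZ = 40
PANEL_MAX_HZ = 144

def lfc_refresh(game_fps):
    """Return (refresh_hz, repeats): how often to refresh the panel and how
    many times to show each rendered frame so the refresh rate stays inside
    the panel's VRR window even when the game drops below PANEL_MIN_HZ."""
    if game_fps >= PANEL_MIN_HZ:
        return min(game_fps, PANEL_MAX_HZ), 1       # already inside the window
    repeats = 2
    while game_fps * repeats < PANEL_MIN_HZ:        # multiply up into the window
        repeats += 1
    refresh = game_fps * repeats
    if refresh > PANEL_MAX_HZ:
        return PANEL_MIN_HZ, 1                      # window too narrow, give up
    return refresh, repeats

print(lfc_refresh(25))   # e.g. 25 fps -> (50, 2): each frame shown twice at 50 Hz
print(lfc_refresh(90))   # inside the window -> (90, 1)
```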

I always said Nvidia were first to market; where did I ever say otherwise?

The Samsung monitor you mention, well, there seem to be some doubts about it. The big one is that Samsung may be lying about it being a Freesync 2 monitor: there is no LFC and the HDR is poor. Never mind the problems with the half-pixel height making text blurry. It's probably why there are no other Freesync 2 HDR monitors available until next year. Samsung probably jumped the gun, like BenQ did when releasing the first Freesync monitor.
 
Nothing inferior about Freesync monitors. Some of the best monitors on the market are Freesync monitors.

Can Nvidia lower the price of Gsync? They have to buy the FPGA chip and program it for each monitor. How much can they reduce the price and still make money?

Not sure why I replied to your post as it is obviously just trolling.


Look at the resale value of Nvidia and Intel products versus AMD's. I'm sorry, but I'm not going to buy a potato AMD product.
 
I posted earlier that I found a specific post from 2008 on VRR which in almost every way mirrored how it works today: not the specifics of Vblank, but changing the monitor's refresh rate. I'm not sure we can be certain who began the tech, only who showed up first. But being first to market doesn't necessarily mean being first to have the idea.

Your take was quite interesting as well.
 
Look at the resale value of Nvidia and Intel products versus AMD's. I'm sorry, but I'm not going to buy a potato AMD product.
You mean the resale of AMD products, where two-year-old used cards sell for $200 over new MSRP due to mining? (RX470, RX480, RX570, RX580) ;)
 


Got your timelines wrong, man.

In 2013, I think it was Oct or Nov, nV announced Gsync and demoed it on desktop monitors. In January 2014 AMD announced Freesync and demoed it on a laptop. It wasn't till around Jan 2015 that AMD finally showed off Freesync desktop monitors at CES, more than a year later, about 1.25 years.

Coincidentally, it took nV about a year to get Gsync to a working product as well, if we are to believe Tom Patterson. Tom even stated that getting it to work on laptops could have been done fairly easily, much sooner. AMD even mentioned this later on too.

So nV started on Gsync around the end of 2011 or early 2012, released the first working demos on desktop monitors at the end of 2013, and started selling DIY kits in early 2014. That's about the same time frame it took AMD to get from the laptop demo, which came soon after Gsync was shown off on desktop, to showing and releasing desktop monitors with Freesync.

Yeah, for AMD it was a reactionary project. They knew they could do it on desktops, but they needed to get the scaler tech for desktop monitors in there, and that is where the extra cost comes in. It's not an extra cost to the end user from AMD; it's an extra cost to the display manufacturers, which in turn increases the price of the monitors, but not by much, talking like 50 bucks for the higher-end ones with a better feature set. AMD is not making money on this, and that is why it's "free". Display manufacturers, on the other hand, get a little bit extra.

Now back to timelines: in early-to-mid 2015 the first monitors with Gsync built in were released, and it wasn't till early Q1 2016 that the first AMD Freesync monitor was released.

So all the timelines for Freesync were about one year behind Gsync for desktop parts, which shows the development timeline right there: a year or so for Freesync. So it all lines up.
 
G-Sync monitors were out earlier than that; my PG278Q has a manufacturing date of October 2014, and I bought it in November 2014. TFTCentral's review is dated July 2014.
 
Transition glasses usually do not react to bright light; they react to the UV light in sunlight.

You can test this by being in a car: the glass in the windscreen usually does a good job of absorbing UV light, so your transition lenses won't turn as dark as they would if you were standing outside.

Source: I own a pair of transition glasses.
Don't buy cheap transition lenses; some will now also react to natural light.
 
Don't buy cheap transition lenses; some will now also react to natural light.
My transition lenses were not cheap.

Also, my transition lenses are 3 years old; when I bought them the XTRActive lenses were not on sale yet, so it's not like I had a choice in the matter.
 
I really wish they would. I'm sticking with AMD in one of my machines, if for no other reason than that I own a FreeSync display.

Although I'll be honest, nVidia's drivers are starting to annoy the shit out of me anyway. Bloated junk that needs an account tied to it just to auto-update drivers. Never thought I'd see the day when I praised AMD for their drivers.
 
I really wish they would. I'm sticking with AMD in one of my machines, if for no other reason than that I own a FreeSync display.

Although I'll be honest, nVidia's drivers are starting to annoy the shit out of me anyway. Bloated junk that needs an account tied to it just to auto-update drivers. Never thought I'd see the day when I praised AMD for their drivers.


You don't need to use GeForce Experience if you don't want to; they never tied it down. They said they would, but I think because of the pushback from that they stopped.
 
You don't need to use GeForce Experience if you don't want to; they never tied it down. They said they would, but I think because of the pushback from that they stopped.

I still install only the drivers; the installer does not require me to install GeForce Experience. I also have not had to use any kind of driver cleaner for a very long time, though to be safe I always choose the clean install option in the setup. It only takes 5 minutes to re-apply any settings that I have manually set in the drivers for specific games.
 