LG to offer G-Sync firmware updates for 2019 OLED TVs

That is true for standard Freesync monitors, but not Freesync 2. Anything with Freesync 2 support has to go through certification and pass various tests. As far as I can tell, FS2 and G-Sync Compatible (GSC) have pretty similar requirements, with one big difference being that FS2 doesn't require the monitor to default to "Freesync Enabled" in the OSD.
I believe that to be certified for Freesync 2, a monitor must also have HDR400 (used to be 600) capability and Low Framerate Compensation. So I wouldn't say GSC and FS2 have similar requirements unless it is G-Sync Ultimate (GSU).
 
I believe that to be certified for Freesync 2, a monitor must also have HDR400 (used to be 600) capability and Low Framerate Compensation. So I wouldn't say GSC and FS2 have similar requirements unless it is G-Sync Ultimate (GSU).

I thought AMD later split up HDR from Freesync 2... but I don't remember what the new one with HDR is.

Not that it matters much as we see almost no new Freesync 2 displays, and likely won't so long as AMD is content to max out in the mid-range GPU space.
 
I thought AMD later split up HDR from Freesync 2... but I don't remember what the new one with HDR is.

Not that it matters much as we see almost no new Freesync 2 displays, and likely won't so long as AMD is content to max out in the mid-range GPU space.
I thought it was just a rebranding from FS2 to FS2 HDR, since FS2 was released before VESA's HDR standardization.
 
There are plenty of happy Freesync users and it works just fine, FUD boy. So fine, Nvidia copied it. Imagine that.

Uhh, no. Before nVidia created G-Sync, VRR did not exist. They saw a problem (pointed out by the first frame-time analyses done at sites like [H]) and fixed it. It showed up in SLI and CrossFire, but even in single-GPU setups (IIRC, AMD's cards at the time had really bad frame-time plots).

No wait - buy our obsolete G-Sync chips and pony up more cash. LOL

Not quite. Home TV displays getting a G-Sync certification do not require G-Sync modules... duh, the article says it's a firmware update.

Keep on hatin' though, it's what you are best at.
 
Uhh, no. Before nVidia created G-Sync, VRR did not exist. They saw a problem (pointed out by the first frame-time analyses done at sites like [H]) and fixed it. It showed up in SLI and CrossFire, but even in single-GPU setups (IIRC, AMD's cards at the time had really bad frame-time plots).

Stop talking about "back in the day". No one gives a crap except you.

Not quite. Home TV displays getting a G-Sync certification do not require G-Sync modules... duh, the article says it's a firmware update.

The discussion migrated past TVs. Try to keep up.

Keep on hatin' though, it's what you are best at.

Hate is a strong word, it's just a discussion. Lighten up, Francis.

PS - as always, appreciate your informative feedback, keep it up!
 
"G-Sync Compatible" is another way of rebranding variable refresh rate technology. A firmware update will not add the G-Sync chips (now branded as G-Sync Ultimate), but it can unlock adaptive sync via the variable refresh rate part of the DisplayPort standard. NVIDIA locked VRR over DisplayPort away from their GPUs for the longest time to sell more of their G-Sync chips, but recently started allowing regular Adaptive Sync over DisplayPort to take the Freesync marketing point away from AMD.

Overall this is good for the market, as both AMD and NVIDIA now support the DisplayPort VRR standard rather than NVIDIA artificially locking it away. However, NVIDIA wants to take advantage of this by branding DisplayPort VRR as "G-Sync Compatible" rather than allowing AMD to put "Freesync Compatible" stickers on the monitors, as less savvy buyers might otherwise think a Freesync display won't work with an NVIDIA GPU, as was the case in the past.


This isn't right. NVidia didn't control the DisplayPort standard. NVidia controlled whether or not VRR was supported from their cards to non-G-Sync displays, and in turn whether non-G-Sync monitors would be able to benefit from NVidia cards regarding adaptive refresh tech.
 
I disagree entirely. As a rule, I also care about back in the day, because it has an undeniable bearing on where we find ourselves today.

As usual, taken completely out of context. Good job.
 
Freesync is an AMD owned trademark, they couldn’t use that name even if they wanted to. Also, they’re both just brand names for Adaptive Sync so the whole Freesync and G-Sync Compatible brandings are nonsense anyway.

This isn't correct. Adaptive Sync is a technical capability that is part of the VESA standard. G-Sync and Freesync are corporate brand names for NVidia's and AMD's implementations of that standard. It is not at all outside the realm of possibility that another company, such as Intel, may offer its own native adaptive sync solution and call it whatever it wishes. Let me make sure I am clear, although you may feel this is semantics: adaptive sync is a capability that may be leveraged within the VESA standard, not a vendor's implementation of that standard. Those solutions are not just a vendor's brand name for an element of the VESA standard.
 
As usual, taken completely out of context. Good job.

As usual, I don't mind going back over it to see if I have been ..... unduly critical.

EDITED: Oh, I see. I stepped into the middle of an epeen contest and my intrusion is not welcome.

Kindly disregard and continue pointlessly insulting each other. I'll bow out gracefully and leave you two men to your own fun.
 
This isn't right. NVidia didn't control the DisplayPort standard. NVidia controlled whether or not VRR was supported from their cards to non-G-Sync displays, and in turn whether non-G-Sync monitors would be able to benefit from NVidia cards regarding adaptive refresh tech.

Err, I don't think you are comprehending what was written. The whole point of the post was that NVIDIA was artificially locking the DisplayPort VRR standard away from working with their GPUs for the longest time. Then they stopped locking it away, so a software update could enable VRR compatibility on their GPUs. How you concluded that the post was saying NVIDIA should get credit for controlling or developing the DisplayPort standard makes no sense to me.

To repeat in different words: DisplayPort adaptive sync existed prior to both the Freesync and G-Sync brandings and was first added to the specification in 2009. AMD initially decided to use DisplayPort adaptive sync and rebranded it Freesync for marketing purposes. NVIDIA for the longest time used a proprietary solution with the previously mentioned chips and blocked VRR compatibility. I said it's a good thing both parties are now supporting VESA DisplayPort VRR. Currently, both companies are trying to get a marketing advantage by rebranding the same thing as either "AMD Freesync" or "NVIDIA G-Sync Compatible." Not sure what's wrong or controversial about any of these statements.
 
Freesync is an AMD owned trademark, they couldn’t use that name even if they wanted to. Also, they’re both just brand names for Adaptive Sync so the whole Freesync and G-Sync Compatible brandings are nonsense anyway.

Pretty sure even though AMD "owns" the name it's open for anyone to use royalty free.

NVIDIA is just using this to get the brand name "G-Sync" more exposure.
 
Pretty sure even though AMD "owns" the name it's open for anyone to use royalty free.

And, so long as they don't call it "Freesync 2" etc., they can implement the very bare minimum and use the Freesync trademark.

NVIDIA is just using this to get the brand name "G-Sync" more exposure.

What's important is that their certification process does actually provide value -- it's something that any company marketing a gaming monitor would want since it sets a very high standard that gamers associate with quality. This means that your average 'Freesync' display is improving in VRR implementation and that finding a Freesync display with a great implementation is now much easier than when AMD released Freesync.
 
This isn't correct. Adaptive Sync is a technical capability that is part of the VESA standard. G-Sync and Freesync are corporate brand names for NVidia's and AMD's implementations of that standard. It is not at all outside the realm of possibility that another company, such as Intel, may offer its own native adaptive sync solution and call it whatever it wishes.

You know, Nvidia really should have gone with another name besides G-Sync. They could have called it

[attached image]
 
Err, I don't think you are comprehending what was written. The whole point of the post was that NVIDIA was artificially locking the DisplayPort VRR standard away from working with their GPUs for the longest time. Then they stopped locking it away, so a software update could enable VRR compatibility on their GPUs. How you concluded that the post was saying NVIDIA should get credit for controlling or developing the DisplayPort standard makes no sense to me.

To repeat in different words: DisplayPort adaptive sync existed prior to both the Freesync and G-Sync brandings and was first added to the specification in 2009. AMD initially decided to use DisplayPort adaptive sync and rebranded it Freesync for marketing purposes. NVIDIA for the longest time used a proprietary solution with the previously mentioned chips and blocked VRR compatibility. I said it's a good thing both parties are now supporting VESA DisplayPort VRR. Currently, both companies are trying to get a marketing advantage by rebranding the same thing as either "AMD Freesync" or "NVIDIA G-Sync Compatible." Not sure what's wrong or controversial about any of these statements.

So let's address this in order. It's you who is misunderstanding me. Unless you have knowledge of the code, "locking it away" is not accurate. NVidia developed G-Sync, built G-Sync support into their cards, and set out to license the corresponding G-Sync modules to monitor manufacturers so that the first VRR solution could work. Later, AMD found another solution that they called Freesync, and for it to work, all they needed was a little additional code and a modification to the VESA standard. AMD didn't give their end of the Freesync solution to anybody that I am aware of, although the little bit of code they supplied for the monitor end was touted as being "free", as in Freesync. You can correct me if I am wrong, but has anyone seen other video card vendors selling cards with Freesync support, trademarked from AMD? Maybe a third-party reseller of AMD-made cards, but not something like an Intel-produced card that I know of.

I just shortened this considerably and I'll leave you with two posts which you are free to dispute if you wish, but I believe them to be accurate.

October 18, 2013
https://www.nvidia.com/en-us/geforc...volutionary-ultra-smooth-stutter-free-gaming/
May 12, 2014
https://www.anandtech.com/show/8008...andard-variable-refresh-monitors-move-forward

Now, the dates above are the dates of the articles' posting, not the true dates of the tech releases, but there is an approximately seven-month difference, and although the VESA DisplayPort standard was released prior, it did not support VRR until after NVidia had released G-Sync.

I have not claimed that NVidia is responsible for anything having to do with the DisplayPort standard other than their own support, or lack thereof, of elements of that standard.
 
There’s some serious revisionist history taking place in this thread.

Yes, VRR is part of the DisplayPort standard but nobody was doing VRR before GSync came along.

You can argue that Nvidia should’ve promoted adoption of the VESA standard but that would require monitor manufacturers to get on board with the required hardware support. It was easier for Nvidia to just swap in their own scaler and control the entire ecosystem. Who knows what would’ve happened if they left it up to the monitor guys.

What has happened since then thanks to GSync marketing is that the market has recognized the demand for VRR and monitor manufacturers are slowly adding decent support for the VESA standard. However the quality of those implementations is all over the place so having certification programs is still very important.

So thank you Nvidia for waking people up to the awesomeness of VRR. I for one don’t care that you made a few dollars in the process.
 
May 2014: Vesa Displayport Adaptive Sync announcement: https://vesa.org/featured-articles/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

Article says adaptive sync was 'part of the eDP spec' (whatever that means), but as far as I can determine, no one had adaptive sync of any kind until G-Sync came out, so we all have nVidia to thank for this as I understand it.

To bring in some history for further clarification: 'adaptive sync' in eDP, that is the Display Port interface designed to be used for built-in displays on laptops, was created to allow for laptops to stop refreshing their panels when there was nothing new to update as a power saving measure.

It had nothing to do with VRR as we know it for gaming.

However, once Nvidia pushed G-Sync hardware into the public, AMD scrambled and hacked eDP on a laptop to perform VRR to a very limited degree.

Now, their hack was pretty cheap for laptop hardware, and that hardware was already prolific, so getting 'Freesync' on desktop monitors meant monitor makers swapping out the LCD controllers they had been using for ones designed for laptop panels. There was most certainly a cost here over standard monitors, but the economy of scale already in place made that cost less than Nvidia's high-powered FPGA solution. The existing or slightly modified controllers were also pretty crappy when it came to actual VRR at least until a few more generations had been completed and a number of issues had been ironed out, such as sync ranges and input lag.
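As a side note on those sync ranges: a monitor advertises its supported refresh range in the EDID "Display Range Limits" descriptor, which is roughly where drivers and tools read a panel's VRR range from. Here is a minimal sketch in Python; the descriptor layout (tag 0xFD, four 18-byte slots) follows the EDID standard, but the EDID block and its 48-144 Hz range below are synthetic, made up purely for illustration:

```python
# Minimal sketch: read the vertical refresh range from a raw 128-byte EDID
# block via the standard "Display Range Limits" descriptor (tag 0xFD).

def refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the Display Range Limits descriptor, or None."""
    for off in (54, 72, 90, 108):          # the four 18-byte descriptor slots
        d = edid[off:off + 18]
        # Display descriptors start 00 00 00 <tag>; 0xFD = range limits
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]              # min/max vertical rate in Hz
    return None

# Synthetic EDID for illustration: a blank block with one range-limits
# descriptor advertising 48-144 Hz in the first descriptor slot.
edid = bytearray(128)
edid[54:59] = b"\x00\x00\x00\xfd\x00"
edid[59] = 48    # min vertical refresh (Hz)
edid[60] = 144   # max vertical refresh (Hz)
edid[61] = 30    # min horizontal rate (kHz)
edid[62] = 160   # max horizontal rate (kHz)

print(refresh_range(bytes(edid)))  # → (48, 144)
```

The lower bound here is exactly why Low Framerate Compensation mattered in the quality discussions above: early scalers often advertised narrow ranges, and anything dipping below the minimum fell out of VRR entirely.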
 