LG to offer G-Sync firmware updates for 2019 OLED TVs

Burticus

Supreme [H]ardness
Joined: Nov 7, 2005
Messages: 5,099
All you rich hosers with fancy new OLED TVs will soon have something else to be happy about other than your piles of cash.

I admit that the idea of a 4K 77" OLED G-Sync display does sound... nice? OTOH if you can afford the $5,500 C9 77", I think you can probably scrape up another grand for a 2080 Ti...
https://www.amazon.com/LG-OLED77C9PUB-Alexa-Built-Ultra/dp/B07PQ98L9D


Borrowed from THW

https://www.tomshardware.com/news/lg-oled-tvs-g-sync-compatibility

"The G-Sync update will be pushed to the TVs through a firmware update on the E9 (55-inch and 65-inch), C9 (55-inch, 65-inch and 77-inch) and B9 (55-inch and 65-inch) range."
 
I thought G-Sync required a proprietary Nvidia chip installed for it to work.
I didn't know that those chips could magically appear through a firmware update?
 
I thought G-Sync required a proprietary Nvidia chip installed for it to work.
I didn't know that those chips could magically appear through a firmware update?

"G-sync Compatible" is another way of rebranding variable refresh rate technology. A firmware update will not add the G-Sync chips (which is now branded as G-Sync Ultimate) but can unlock adaptive sync via the variable refresh rate part of the Displayport standard. NVIDIA locked away VRR through the Displayport standard for the longest time to make more money selling their G-sync chips but recently started allowing regular Adaptive sync over Displayport to take away the Freesync marketing point from AMD. Overall this is good for the market as both AMD and NVIDIA now support the Displayport VRR standard rather than NVIDIA artifically locking it away. However, NVIDIA wants to take advantage of this by branding Displayport VRR as "G-Sync Compatible" rather than allowing AMD to put "Freesync compatible" stickers on the monitors as less savvy buyers might then think they need an NVIDIA GPU to make it work like it did in the past.
 
I thought G-Sync required a proprietary Nvidia chip installed for it to work.
I didn't know that those chips could magically appear through a firmware update?

It's basically FreeSync, but Nvidia doesn't want to give AMD any credit, so they just call it G-Sync Compatible. The G-Sync offerings with an actual module still exist; those are branded G-Sync Ultimate now.
 
I thought G-Sync required a proprietary Nvidia chip installed for it to work.
I didn't know that those chips could magically appear through a firmware update?

Nah, nVidia finally bowed to the inevitable and supports adaptive sync, aka Freesync. There still are Gsync monitors with the modules, called Gsync Ultimate I think, but they will also certify a monitor as Gsync Compatible if it passes their tests and just uses regular adaptive sync.
 
It's the same thing as Gsync, but don't let Nvidia or their fanboys know.
 
It's basically FreeSync, but Nvidia doesn't want to give AMD any credit, so they just call it G-Sync Compatible. The G-Sync offerings with an actual module still exist; those are branded G-Sync Ultimate now.

Freesync is an AMD-owned trademark; Nvidia couldn't use that name even if they wanted to. Also, they're both just brand names for Adaptive Sync, so the whole Freesync and G-Sync Compatible branding is nonsense anyway.
 
Except it's not; they each have different specs and input ranges, meaning the chipped version has a lower bottom end and can tolerate lower framerates from less powerful GPUs.

Incorrect, mostly. The hardware version technically supports anywhere from 0 Hz to max Hz, but the G-Sync Compatible requirements state that a monitor needs to support LFC (Low Framerate Compensation), so those monitors effectively cover their entire range as well.
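Since LFC keeps coming up: the idea is simple enough to sketch in a few lines. This is a Python toy, not any vendor's actual algorithm, and the 48-120 Hz window is just an example:

```python
def lfc_panel_hz(fps, min_hz=48.0, max_hz=120.0):
    """Low Framerate Compensation: when the game's frame rate drops below
    the panel's minimum VRR rate, the driver shows each frame 2x, 3x, ...
    so the panel itself keeps refreshing inside its supported window."""
    if fps >= min_hz:
        return min(fps, max_hz)    # already in-window: refresh tracks fps
    mult = 1
    while fps * mult < min_hz and fps * (mult + 1) <= max_hz:
        mult += 1                  # repeat each frame until in-window
    return fps * mult

print(lfc_panel_hz(30))   # 60.0 -- each frame shown twice
print(lfc_panel_hz(20))   # 60.0 -- each frame shown three times
```

That's why a monitor with LFC effectively covers everything below its window too: the panel never has to drop out of VRR.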
 
Fundamental misunderstanding of the facts here. It's not basically Freesync, nor did AMD invent VRR. They were actually late to the party, if anything.

Never said they invented it, but without AMD it's unlikely an open standard would have caught on at all.
 
Like most things, it started as a proprietary technology, and after it was proven it became a standard.

I personally like how AMD in general tries to support open standards, but both Nvidia and AMD played a part in furthering VRR.

In a few years' time there will likely not be G-Sync or FreeSync, only the VRR standard (but maybe the companies will continue to brand it).
 
Like most things, it started as a proprietary technology, and after it was proven it became a standard.

I personally like how AMD in general tries to support open standards, but both Nvidia and AMD played a part in furthering VRR.

In a few years' time there will likely not be G-Sync or FreeSync, only the VRR standard (but maybe the companies will continue to brand it).
Definitely both companies had a hand in furthering it. Nvidia got the ball rolling in 2013 with G-Sync, AMD pushed Adaptive-Sync into the VESA DisplayPort standard and shipped its FreeSync branding of it in 2015, and both the VESA and HDMI generic A-Sync standards are further along because of those efforts.

The technical differences between software-based A-Sync and the technically superior G-Sync Ultimate remain, though, with the latter able to accept input down to 1 Hz, at true HDR, and without an extra 2-4 ms of input latency. Nvidia just hasn't done a good job of communicating and marketing the benefits of the chipped version, or of making the differences attractive enough price-wise given the added cost of the module.

There are really only three standards going forward: A-Sync over DisplayPort, A-Sync over HDMI, and G-Sync Ultimate (module).

"Freesync compatible" and "G-Sync compatible" at this point are just marketing smoke-and-mirrors to mostly capitalize on VRR confusion among consumers.
 
The technical differences between software-based A-Sync and the technically superior G-Sync Ultimate remain, though, with the latter able to accept input down to 1 Hz, at true HDR, and without an extra 2-4 ms of input latency. Nvidia just hasn't done a good job of communicating and marketing the benefits of the chipped version, or of making the differences attractive enough price-wise given the added cost of the module.

Also, they locked it to nVidia GPUs and to DisplayPort, which limits the market. While nVidia is the biggest dGPU company for computers, there's a substantial market outside of that: iGPUs of course, but the consoles as well, none of which can work with the Gsync module, and many of which don't do DP. That, on top of the cost, made it just not attractive to many companies. Sure, it's fine for a high-end computer monitor you are trying to market to nVidia owners (I have just such a monitor), but not very interesting for TVs. Had it been more compatible from the get-go, it might have been more popular with better marketing.

As it stands, I expect the modules to slowly die out.
 
As it stands, I expect the modules to slowly die out.
Yeah, I don't know what Nvidia's endgame is with rebranding the module-based version as G-Sync Ultimate. That would imply they believe it has some future. Maybe it'll stick around on certain high-end, high-price gaming monitors. Once VRR over HDMI is good enough and widespread enough for most gamers, even giving the modules away free to display manufacturers wouldn't matter. Maybe they should've done that from the beginning.
 
nVidia is usurping the FreeSync and Adaptive-Sync branding by pushing monitor manufacturers to label it GSync Compatible instead. Monitors on both Newegg and Amazon that were listed as "FreeSync" have had that badging and labeling removed from the listings and now show nVidia GSync branding instead.
 
nVidia is usurping the FreeSync and Adaptive-Sync branding by pushing monitor manufacturers to label it GSync Compatible instead. Monitors on both Newegg and Amazon that were listed as "FreeSync" have had that badging and labeling removed from the listings and now show nVidia GSync branding instead.

Do you have examples of this?

I just searched Amazon and found a ton of freesync monitors, many of which had freesync in the title.
 
nVidia is usurping the FreeSync and Adaptive-Sync branding by pushing monitor manufacturers to label it GSync Compatible instead.

Not really.

Nvidia is bothering to actually test and certify said monitors where AMD couldn't give two shits about the quality of whatever is labeled as 'Freesync'.

And that's really a good thing, as the market for DisplayPort VRR before Nvidia started setting real standards was a bit of a shitshow.
 
Huh. So only Nvidia knows how to certify and AMD doesn't know shit. Same unfounded BS song and dance spreading FUD from you.

No FUD: the vast majority of 'robustly certified' Freesync monitors have the absolute bare-minimum implementation. VRR works within a narrow, nearly useless range, there's input lag, and so on; better than having no VRR at all, but only just.

Whereas G-Sync was essentially perfect right out of the gate, and the monitors that Nvidia certifies represent the high-end of what's possible with DP Adaptive Sync and HDMI VRR.
 
Great sales pitch. Yes, the Freesync standard is looser; you just need to pick a higher-spec Freesync monitor if you want higher performance.
 
Yes, the Freesync standard is looser; you just need to pick a higher-spec Freesync monitor if you want higher performance.

Or, you know, useful VRR at all.

But at least now there's a standard for decent Freesync implementations ;)
 
Or, you know, useful VRR at all.

But at least now there's a standard for decent Freesync implementations ;)

There are plenty of happy Freesync users and it works just fine, FUD boy. So fine, Nvidia copied it. Imagine that. No wait: buy our obsolete G-Sync chips and pony up more cash. LOL
 
There are plenty of happy Freesync users

Plenty of users happy without VRR too.

and it works just fine, FUD boy

I didn't claim that it didn't, and you can keep the personal attacks to yourself.

So fine, Nvidia copied it. Imagine that.

Nvidia released G-Sync hardware before AMD even had a solution, and AMD's response was to hack together a laptop (lol) to show that they could halfass something together at some point.

No wait: buy our obsolete G-Sync chips and pony up more cash. LOL

More cash? Yes, buying the premier solution does cost a few dollars more. Are you trying to insinuate that better products don't generally cost more?

Or will you go on to insinuate that Freesync is actually free?
 
Yes, G-sync is more expensive than freesync for a reason. "Premier solution," LOL. Nvidia G-sync TAX. Nvidia copying freesync proves the chip is basically a worthless expense.
 
Yes, G-sync is more expensive than freesync for a reason.

It's better, and uses hardware to achieve that performance.

"premiere solution" LOL Nvidia G-sync TAX.

So you don't like better performance then?

Nvidia copying freesync proves the chip is basically a worthless expense.

AMD copied Nvidia, and continues to. Someday they might be less than two years behind...
 
It's better, and uses hardware to achieve that performance.

So you don't like better performance then?

AMD copied Nvidia, and continues to. Someday they might be less than two years behind...

Actually, the software freesync performs just fine. Nvidia copied AMD's freesync software solution. Go ahead and pay for the chip and Nvidia tax for nothing. The performance IS IN THE MONITOR SPECS. Again, just pick a higher-spec monitor with freesync and u r set. The chip does not provide better performance. Two years later, NV finally learned the right way from AMD.
 
Actually, the software freesync performs just fine.

Uh, what's that?

Nvidia copied AMD's freesync software solution

No, again?

The performance IS IN THE MONITOR SPECS.

Plenty of digging to get there, and then you'll need to wait for third-party reviews to make sure that the monitor actually performs to its 'specs'.

Or are you suggesting that the marketing from monitor manufacturers should be taken at face value? There aren't enough emoticons...

Two years later, NV finally learned the right way from AMD.

Which is why there are still G-Sync monitors being released, and they're still the best in their class, right?

...right?
 
Uh, what's that?

No, again?

Plenty of digging to get there, and then you'll need to wait for third-party reviews to make sure that the monitor actually performs to its 'specs'.

Or are you suggesting that the marketing from monitor manufacturers should be taken at face value? There aren't enough emoticons...

Which is why there are still G-Sync monitors being released, and they're still the best in their class, right?

...right?

So your skepticism is about everything but Nvidia and G-Sync. Got it.

Are you implying DisplayPort 1.2a is not a standard that manufacturers follow? Because that's all Freesync needs to trigger.
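For what it's worth, the range a Freesync panel advertises really is plain VESA plumbing: it sits in the EDID display range limits descriptor (tag 0xFD). Here's a rough Python sketch of pulling it out of a raw base EDID block, illustrative only (real EDIDs add extension blocks and offset flags this toy ignores):

```python
def vrr_range_from_edid(edid):
    """Return (min_hz, max_hz) from the EDID range-limits descriptor,
    or None if the base block doesn't carry one."""
    for off in (54, 72, 90, 108):          # the four 18-byte descriptor slots
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]              # min/max vertical rate, in Hz
    return None
```

No vendor chip is required to read or honor that; any GPU that speaks DisplayPort 1.2a can.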
 
Chip inside... Better? Naw.

They all have chips inside.

Betamax Baby!

...was the better technology.

And unlike Betamax, these two can coexist quite easily. After all, both technologies are extensions of existing standards, and the displays still work with any output that uses those standards; it's only the VRR functionality itself that needs matching support.
 
Not really.

Nvidia is bothering to actually test and certify said monitors where AMD couldn't give two shits about the quality of whatever is labeled as 'Freesync'.

And that's really a good thing, as the market for DisplayPort VRR before Nvidia started setting real standards was a bit of a shitshow.

That is true for standard Freesync monitors, but not Freesync 2. Anything with Freesync 2 support has to go through certification and pass various tests. As far as I can tell, FS2 and GSC have pretty similar requirements, with one big difference being that FS2 doesn't require the monitor to default to "Freesync Enabled" in the OSD.
 
That is true for standard Freesync monitors, but not Freesync 2. Anything with Freesync 2 support has to go through certification and pass various tests. As far as I can tell, FS2 and GSC have pretty similar requirements, with one big difference being that FS2 doesn't require the monitor to default to "Freesync Enabled" in the OSD.

That's my understanding too, though I see very few Freesync 2 monitors out there, while just about every new monitor with DP Adaptive Sync comes with "G-Sync Compatible" these days.

Honestly, if AMD had started with the requirements set forth in Freesync 2, there'd be little to criticize; G-Sync is still superior, but not by enough to matter for most.

But AMD didn't, and now their PR points have been deflated, and one can't really blame anyone but AMD themselves.
 
Capt'n, she's breakin' up, she can't take no more...

Hold it together, Scotty!
 
Yeah, I don't know what Nvidia's endgame is with rebranding the module-based version as G-Sync Ultimate. That would imply they believe it has some future. Maybe it'll stick around on certain high-end, high-price gaming monitors. Once VRR over HDMI is good enough and widespread enough for most gamers, even giving the modules away free to display manufacturers wouldn't matter. Maybe they should've done that from the beginning.

Well, they already have the modules. I mean, the really expensive part is the design, testing, tape-out, and mask; after that, making them is a fairly small incremental cost, so they might as well still try to sell them. Some people will buy them because "L33t r0xx0r gamer" and all that jazz. Also, they are superior; not by a ton, but they do offer a bit lower latency as well as really rock-solid low-FPS performance. My guess is that they'll keep trying to sell this 2nd generation of modules as long as they can, but won't design a 3rd generation.

Or who knows? Maybe they'll figure out how to make a module that works with general-purpose adaptive sync but still offers some advantages, and be able to sell that. I doubt it, but hey.
 
Or who knows? Maybe they'll figure out how to make a module that works with general-purpose adaptive sync but still offers some advantages, and be able to sell that. I doubt it, but hey.

I'm surprised that they made the second one the same way (an FPGA, really?), and given that it mostly just needs an update for the next round of DP and HDMI interfaces, we'll probably at least see another revision.

Further, I don't see them making a version that allows G-Sync to work with other graphics vendors. Letting Freesync etc. work with GeForce GPUs is one thing, but really, the lock-in on the monitor side is what matters, if it matters at all.

Though I could certainly be wrong about that too. I didn't think they'd let Freesync monitors work with GeForce GPUs, as they didn't have to, but I didn't see their certification program coming either, and that is quite the coup.
 