HDMI 2.1 VRR on LG's 2019 OLED models!

I have to figure out how I'm going to add this to my desk. Are you guys getting mounts, or just going to use the stand it comes with? I think my desk is big enough for that, but maybe one of those super-adjustable fireplace mounts would be nice.
 
Saying it again:

HDMI VRR is not Freesync.

What's the point of this statement without context?


AMD has already stated HDMI 2.1 VRR will be part of FreeSync:

"Radeon Software will add support for HDMI 2.1 Variable Refresh Rate (VRR) technology on Radeon RX products in an upcoming driver release. This support will come as an addition to the Radeon FreeSync technology umbrella, as displays with HDMI 2.1 VRR support reach market."


HDMI VRR itself is just the HDMI standard 'catching up' to DisplayPort Adaptive-Sync:

"Unlike FS-DP, which was just AMD’s implementation of DisplayPort adaptive sync on their GPUs and software stack, FS-HDMI is not an open standard, at least not at this time. HDMI does not have a variable refresh rate technology standard, and while RTG is pushing to have one included in a future version of HDMI, the HDMI consortium moves too slowly for RTG’s tastes."

And the Adaptive Sync standard was a proposal by AMD, accepted by VESA into DP 1.2a:

"after a proposal from AMD, VESA later adapted the Panel-Self-Refresh feature for use in standalone displays and added it as an optional feature of the main DisplayPort standard under the name "Adaptive-Sync" in version 1.2a."


So to make it clear: one is derived from the other.
 
Yeah, there is no way I can justify spending that much with this OLED on the way for half the price and a much better picture.
 
BFGDs are pretty much dead in the water at this point, or at the very best will have a year or less on the market, making them quite short-lived. Monitor and TV convergence will simply continue, with specialty gaming modes eventually supported on all TVs. There simply won't be a need for a purpose-built gaming "monitor" product for large-format displays. LG and Samsung will pretty much see to that this year with HDMI 2.1 adoption sporting 4K @ 120Hz and faster panels. Everyone else in the game will be jumping aboard and doing the same before the year's out.
 
At that price it better be the best damn display ever made. I'm sure it won't be.
It's an 8-bit+FRC VA panel.
Versus a true 10-bit OLED panel with instant response times and BFI.

And you could buy two or more (if smaller) for the price of a BFGD.

Nvidia made a booboo.
 
https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/

The question is, will they support it through HDMI? As long as they do, even if they don't support LG OLEDs specifically, it seems like it would be easy enough to hack around it.

A big difference between G-Sync and FreeSync is variable overdrive, but that isn't needed on OLED. So as long as Nvidia supports VRR through HDMI, a G-Sync OLED wouldn't even be necessary for the best possible VRR OLED experience.
 
That is amazing news!! Given the circumstances, I am pretty sure they will end up supporting FreeSync over HDMI.
 
As long as they do, even if they don't support LG OLEDs specifically, it seems like it would be easy enough to hack around it.

No hacking necessary; Nvidia has already said you can force it on for any display that supports VRR. But that is a good question about HDMI. I'm sure it's coming, if it's not already there on the 15th.
 
Vega, how stoked are you after Nvidia's announcement on "G-Sync Compatible" displays?

Ya baby! This was a huge move for NVIDIA. I kinda feel sorry for AMD. :(

The biggest takeaway though is that Jensen said anyone can flip the switch in the drivers on a VRR display, not just the NVIDIA "certified" non-G-Sync ones. This move just basically locked all high-end hardware gamers into NVIDIA (unless of course AMD can actually make a fast GPU). NVIDIA saw the writing on the wall with HDMI 2.1 coming out. This means BFGDs aren't going to sell very well.
 
Ya baby! This was a huge move for NVIDIA. I kinda feel sorry for AMD. :(

The biggest takeaway though is that Jensen said anyone can flip the switch in the drivers on a VRR display, not just the NVIDIA "certified" non-G-Sync ones. This move just basically locked all high-end hardware gamers into NVIDIA (unless of course AMD can actually make a fast GPU). NVIDIA saw the writing on the wall with HDMI 2.1 coming out. This means BFGDs aren't going to sell very well.

I'm interested in seeing what will happen to G-Sync displays from now on. Since AMD cards won't support them, will manufacturers go for compatibility/cheaper implementation and stick with FreeSync only, or will Nvidia give them incentives to use the G-Sync module in upcoming displays as well? Ideally, G-Sync will become just a brand name and all good displays will support variable refresh rate going forward.
 
Ideally, G-Sync will become just a brand name and all good displays will support variable refresh rate going forward.

Entirely possible this will happen to low- and mid-range monitors, but the G-Sync HDR module in the X27/PG27UQ (and presumably any actually decent upcoming LCD HDR monitors) controls the backlight in addition to variable overdrive, so you can't just do without it. You'd end up with an extremely slow backlight like the ones in LCD TVs, which aren't very good.

This may push manufacturers to build better Freesync monitors that actually meet Nvidia's standards also, which is good, since most Freesync monitors are garbage that don't even support variable overdrive.
 
In response to the points about 10-bit color, response times, and TV FALD speed.

FALD speed
--------------------
I'd be very interested to see measurements of the PG27UQ HDR FALD monitor's FALD speed in practice on real content compared to a Samsung Q9FN, using their respective FALD algorithms. I can't imagine that the Samsung would be demonstrably slower than the G-Sync FALD, considering that the Samsung can do 480Hz flicker.

VA response time, overdrive, tv enhancements
---------------------------------------------------------------
Instant response times are great, but OLEDs still have bad sample-and-hold blur like LCDs, so BFI and/or flicker could still be desirable vs. blur, if possible without bad artifacting.
OLEDs can also have stutter problems with lower-Hz sources, ironically complicated by their instant response times.

Modern VA panels' response times, with overdrive, are tight enough up to around 120fps/Hz or so; faster response only becomes crucial above 120Hz. In fact, some people cap their fps at 118-120 to stay in the sweet spot on VA gaming monitors.
In addition, modern high-end VA TVs can also do up to 480Hz flicker, interpolation if desired, and optional BFI, any of which can affect the end result a lot.

8bit+FRC
-------------
PG27UQ tftcentral: "The Asus ROG Swift PG27UQ features an AU Optronics M270QAN02.2 AHVA (IPS-type) technology panel which is capable of producing 1.07 billion colours. This is achieved through a 8-bit+FRC colour depth as detailed in the manufacturers specification. Some people may complain that the panel is not a native full 10-bit panel, but in reality you are going to be very hard pressed to see any real difference in practice between a good 8-bit+FRC panel and a true 10-bit panel. "
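
To illustrate what 8-bit+FRC actually does (a toy sketch of my own, not something from the tftcentral review): the panel temporally dithers between two adjacent 8-bit levels so the time-average approximates the requested 10-bit level.

```python
# Toy model of 8-bit + FRC (frame rate control / temporal dithering).
# A native 8-bit panel approximates a 10-bit level by alternating between
# two adjacent 8-bit levels over a short sequence of frames.

def frc_frames(level_10bit: int, cycle: int = 4) -> list:
    """Return the 8-bit levels shown over one FRC cycle for a 10-bit target."""
    base = level_10bit // 4          # nearest lower 8-bit level
    remainder = level_10bit % 4      # frames that need to show base + 1
    return [base + 1 if i < remainder else base for i in range(cycle)]

target = 401                          # a 10-bit level between 8-bit 100 and 101
frames = frc_frames(target)           # -> [101, 100, 100, 100]
average = sum(frames) / len(frames)   # 100.25, i.e. exactly 401 / 4

print(frames, average, target / 4)
```

Your eye integrates the alternation, which is why a good 8-bit+FRC panel is so hard to tell apart from true 10-bit, as the quote says.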

Also, see below that the color drops off.


Color volume, color reproduction
-----------------------------------------------
Both the 1000-nit Nvidia HDR monitors and the Samsung Q9FN's 1800+ nits can do much higher color volume in HDR than the OLEDs, which have a lower true color peak than their peak white-subpixel brightness. The LG OLEDs also roll off their peak brightness as a safety measure against burn-in: "Large bright scenes are very dim due to the Automatic Brightness Limiter (ABL)." Both the white-subpixel effect and the brightness roll-offs affect the OLED color reproduction so much that they can't even be properly color calibrated in HDR. If that matters to you. But either way, why a BFGD over a high-end TV?

color gamut
----------------
LG C8: (rtings)
"The HDR EOTF in the 'Technicolor Expert' picture mode follows our target PQ curve very well until it rolls off at the TV's peak brightness. " 10bit but only until low peak color brigthness.
"When displaying HDR content in 'PC Mode' colors appear washed, the C8PUA does not detect the wide color gamut and the setting cannot be changed.." ... (currently ?)

Samsung Q9FN:
"Samsung Q9F has an excellent color gamut, covering almost all of the SDR color space. HDR coverage is very good, the Q9FN displays a wide color gamut, but it is unable to reproduce some of the new green tones in the Rec.2020 color space, very few TVs can. "
"HDTVTest has shown that for lower brightness HDR infoframes (such as 1000 nits) the TV produces scenes which are brighter than intended (see his video here)"


.............

So what the BFGD has, and is charging thousands of dollars more for, is G-SYNC, which Nvidia is ("was" now???) shackling their high-end GPUs to, with weaker overall features and specs than high-end TVs.

BFGD
- probably not HDMI 2.1 bandwidth if it's like the other Nvidia HDR monitors, so no 4:4:4 at 4K 120Hz (rough bandwidth math below this list)
- potentially no QFT (quick frame transport, for low-input-lag gaming) or QMS (quick media switching, for no black/blank screens at the start of media), though some HDMI 2.1 features can be patched into HDMI 2.0b if they wanted to
- 100 fewer FALD zones than the Samsung's 480
- not per-pixel emissive like OLED, so it can't avoid the dim or bloom area offsets of FALD dynamics
- no incredible black depths like the LG OLEDs' emissive/off pixels
- 800 to 1000 nits less HDR color volume than the Q9FN (~1000 vs 1800-2000)
- no BFI (black frame insertion) afaik, no interpolation option for consoles and lower frame rates, no 480Hz flicker afaik
- no VRR for consoles and AMD GPUs (though displays can be patched to support this on HDMI 2.0b if they wanted to)
- no comparable price range
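
On the first point, the bandwidth gap is easy to ballpark. A rough back-of-the-envelope sketch of my own (assuming standard CTA 4K blanking of 4400x2250 and 10-bit 4:4:4; the exact figures for any given monitor may differ) shows why uncompressed 4K 120Hz full chroma needs HDMI 2.1-class bandwidth:

```python
# Rough uncompressed bandwidth check for 4K 120Hz 10-bit 4:4:4.
# Assumes standard CTA blanking (4400 x 2250 total) -- illustrative only.

h_total, v_total = 4400, 2250        # 3840 x 2160 active plus blanking
refresh_hz = 120
bits_per_pixel = 3 * 10              # 4:4:4, 10 bits per channel

pixel_clock_hz = h_total * v_total * refresh_hz          # ~1.188 GHz
data_rate_gbps = pixel_clock_hz * bits_per_pixel / 1e9   # ~35.6 Gbit/s

hdmi20_payload_gbps = 18.0 * 8 / 10    # 18 Gbit/s raw, 8b/10b coding -> ~14.4
hdmi21_payload_gbps = 48.0 * 16 / 18   # 48 Gbit/s raw, 16b/18b coding -> ~42.7

print(f"needed: {data_rate_gbps:.1f} Gbit/s")
print(f"HDMI 2.0 payload: {hdmi20_payload_gbps:.1f} Gbit/s -> too small")
print(f"HDMI 2.1 payload: {hdmi21_payload_gbps:.1f} Gbit/s -> fits")
```

So without HDMI 2.1 (or DSC / chroma subsampling), a display simply can't accept 4K 120Hz 4:4:4, which is the core of that first list item.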





 
Dude, no AMD card has HDMI 2.1... so it doesn't matter if the TV has it; the GPU will need to have it too. But yes, still EXTREMELY exciting news.


Well, right now nothing has HDMI 2.1, lol. Something has to have it first, and it's good for TVs to have it, because that means TVs and receivers will get it this year, and next year we will get consoles that have it too. So people with newer TVs will then be able to play consoles with variable refresh. Sure, GPUs will get it this year or next too, but the main thing right now is TVs and receivers for the next-gen consoles to make use of.
 
Some HDMI 2.1 features have already made it into current AVRs and TVs without needing full HDMI 2.1,
e.g. fast sync, VRR, and eARC.
Denon, for example, have been pretty decent about updating firmware on older models.

Sadly, TVs don't seem to have eARC yet, so you can't feed multichannel PCM to your TV to pass on to your AVR.
I would have liked this.
 
Sadly, TVs don't seem to have eARC yet, so you can't feed multichannel PCM to your TV to pass on to your AVR.
I would have liked this.

True, but most devices that output high-end video (like 4K Ultra HD Blu-ray players) already have dedicated HDMI audio output ports to feed your A/V receiver. It's ugly, but that's the state of things currently.

Ultimately, I can see all mainstream TVs adopting HDMI 2.1 and supporting a dedicated HDMI-eARC output to feed to a receiver for sound.
 
Is there a reason people are feeding their TV with video and then going out to a receiver? That is the opposite of how things are supposed to work. The receiver takes in all the HDMI sources and sends audio out to the speaker system, then uses its HDMI output to just pass video to your primary and sometimes secondary screens. If you do it the other way, you are wasting all the inputs on the receiver and having to switch inputs on the TV. Doesn't make sense.
 
Is there a reason people are feeding their TV with video and then going out to a receiver? That is the opposite of how things are supposed to work. The receiver takes in all the HDMI sources and sends audio out to the speaker system, then uses its HDMI output to just pass video to your primary and sometimes secondary screens. If you do it the other way, you are wasting all the inputs on the receiver and having to switch inputs on the TV. Doesn't make sense.
To keep display lag as low as possible when gaming, without feeding two HDMI outputs.
 
Is there a reason people are feeding their TV with video and then going out to a receiver? That is the opposite of how things are supposed to work. The receiver takes in all the HDMI sources and sends audio out to the speaker system, then uses its HDMI output to just pass video to your primary and sometimes secondary screens. If you do it the other way, you are wasting all the inputs on the receiver and having to switch inputs on the TV. Doesn't make sense.

If you watch things through the TV's apps you have to use ARC to get the sound to the receiver.

Then if you have something connected through the receiver, you have to mess with the settings to turn off ARC and switch inputs, and it can be a PITA.

Also, I like to use 1080p@120Hz, which the TV supports but the receiver does not.
 
Is there a reason people are feeding their TV with video and then going out to a receiver? That is the opposite of how things are supposed to work. The receiver takes in all the HDMI sources and sends audio out to the speaker system, then uses its HDMI output to just pass video to your primary and sometimes secondary screens. If you do it the other way, you are wasting all the inputs on the receiver and having to switch inputs on the TV. Doesn't make sense.

Depending on the receiver, it can add a lot of input lag. I was surprised to find this out: my Denon AVR-1610 adds a lot of input lag if I go through it vs. plugging into the TV and sending sound out from there.
 
Is there a reason people are feeding their TV with video and then going out to a receiver? That is the opposite of how things are supposed to work. The receiver takes in all the HDMI sources and sends audio out to the speaker system, then uses its HDMI output to just pass video to your primary and sometimes secondary screens. If you do it the other way, you are wasting all the inputs on the receiver and having to switch inputs on the TV. Doesn't make sense.

This was true perhaps 10 years ago. Times have changed. Try passing a 4K HDR video signal @ 60Hz 4:4:4 chroma through your receiver's HDMI. Things typically won't go all that well for you.

And as mentioned by others, many TVs now have their own built-in streaming apps. These TVs need to feed a receiver sound.
 
This was true perhaps 10 years ago. Times have changed. Try passing a 4K HDR video signal @ 60Hz 4:4:4 chroma through your receiver's HDMI. Things typically won't go all that well for you.

And as mentioned by others, many TVs now have their own built-in streaming apps. These TVs need to feed a receiver sound.
Oddly, my Denon X4400H has extremely low lag, such that I can feed my PC through it and, tbh, I can't tell the difference.
The only downside is it doesn't provide a 1440p 120Hz mode, but it does do 1440p 60Hz.
My TV can do 1440p 120Hz but doesn't have a 1440p 60Hz mode, yet it will do 1440p 60Hz when fed through the AVR.

However, the other day I tried creating a custom 1440p 120Hz mode for connecting through the receiver and it works!
(I took the custom timings etc from the same mode connected directly to the TV)
So the CAC-1080 DP-to-HDMI adapter I used to connect directly to the TV has become redundant, hmph!
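
For anyone wondering why a mode the AVR doesn't advertise can still pass through it, a rough pixel-clock sanity check explains it. The timing figures below are CVT reduced-blanking style estimates I'm assuming for illustration (and 8 bits per channel), not the exact custom timings used above:

```python
# Rough check: does a custom 2560x1440 @ 120Hz mode fit in an HDMI 2.0 link?
# Total (active + blanking) figures are reduced-blanking style estimates,
# assuming 8-bit 4:4:4 so the TMDS clock roughly equals the pixel clock.

h_total, v_total = 2720, 1525        # 2560x1440 active plus reduced blanking
refresh_hz = 120

pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6   # ~497.8 MHz
hdmi20_max_tmds_mhz = 600                                 # HDMI 2.0 TMDS clock limit

print(f"pixel clock: {pixel_clock_mhz:.1f} MHz")
print("fits HDMI 2.0" if pixel_clock_mhz <= hdmi20_max_tmds_mhz else "needs HDMI 2.1")
```

So a reduced-blanking 1440p 120Hz mode squeaks in under the 600 MHz limit, which would explain why the AVR passes it even though it never lists the mode itself.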
 
Yeah my Denon 6400 and 4400 have no issues being the traditional receivers at 4k.
 
This was true perhaps 10 years ago. Times have changed. Try passing a 4K HDR video signal @ 60Hz 4:4:4 chroma through your receiver's HDMI. Things typically won't go all that well for you.

Ya, I bought mine specifically for Dolby Vision and 4K HDR 4:4:4. Most don't support 4:4:4 mode unless you get high end, though on the other side of that same coin most TVs also don't support 4:4:4 until you buy high end.

The TV apps thing makes sense I guess. I suppose many consumers do want to use those. I refuse to connect my TV to the internet for both security and privacy reasons, so I forgot TVs even have those apps on them that people would use.
 
Ya, I bought mine specifically for Dolby Vision and 4K HDR 4:4:4. Most don't support 4:4:4 mode unless you get high end, though on the other side of that same coin most TVs also don't support 4:4:4 until you buy high end.

The TV apps thing makes sense I guess. I suppose many consumers do want to use those. I refuse to connect my TV to the internet for both security and privacy reasons, so I forgot TVs even have those apps on them that people would use.

Having your TV connected to the internet helps in keeping its firmware and apps updated though. You must be watching some pretty "special" things to want to avoid sharing anything with your corporate "friends" :D

Yep, unless you've invested in a high-end A/V receiver recently (within the past couple of years), odds are it won't be able to handle passing 4K HDR 4:4:4 and properly negotiating all the Hollywood HDCP crap being forced on everything digital these days.

My main A/V receiver is still a Yamaha A-3090, so I don't rely on it to do anything video related at all. :D
 
I can understand not connecting a TV with their snoop history, but then again I keep an Amazon Echo connected - it's just too damn convenient. I do agree about at least connecting a TV once in a while for firmware updates. That's a good point.

Besides the TV snoop angle, smart TV apps are usually slower and clunkier. You can add an Nvidia Shield TV or an HTPC instead. Shield and consoles have Netflix, Prime Video, Plex, YouTube, etc. Shield and consoles also have Twitch, which Roku doesn't, and Shield can run Kodi if you're into that as well. YouTube especially seems resource-hungry/unoptimized, so it performs worse and crashes more often on smart TVs and weaker Rokus, but it's fast on the Shield. The whole Shield interface is snappy and has the full Google Play (Android) store and Nvidia's game store available.
 
I can understand not connecting a TV with their snoop history, but then again I keep an Amazon Echo connected - it's just too damn convenient. I do agree about at least connecting a TV once in a while for firmware updates. That's a good point.

Besides the TV snoop angle, smart TV apps are usually slower and clunkier. You can add an Nvidia Shield TV or an HTPC instead. Shield and consoles have Netflix, Prime Video, Plex, YouTube, etc. Shield and consoles also have Twitch, which Roku doesn't, and Shield can run Kodi if you're into that as well. YouTube especially seems resource-hungry/unoptimized, so it performs worse and crashes more often on smart TVs and weaker Rokus, but it's fast on the Shield. The whole Shield interface is snappy and has the full Google Play (Android) store and Nvidia's game store available.

LG's OLED interface is pretty fast. A big advantage of the TV's built-in apps is that they can control the refresh rate. So, for example, if you're watching a 24fps or 30fps video, it can adjust and you don't get the nasty pulldown stuttering.

Also, with LG's internal apps you get 4K HDR and surround sound.
 
And your Nvidia Shield isn't snooping on you? Nvidia now snoops via drivers (which you can sort of disable).

The Netflix app on the 2017 models works really well.
 
Guess there were no video card maker announcements at CES regarding HDMI 2.1 bandwidth support? (FWIW... I didn't see any.)
 
This was true perhaps 10 years ago. Times have changed. Try passing a 4K HDR video signal @ 60Hz 4:4:4 chroma through your receiver's HDMI. Things typically won't go all that well for you.

And as mentioned by others, many TVs now have their own built-in streaming apps. These TVs need to feed a receiver sound.

Ya, I bought mine specifically for Dolby Vision and 4K HDR 4:4:4. Most don't support 4:4:4 mode unless you get high end, though on the other side of that same coin most TVs also don't support 4:4:4 until you buy high end.

I bought my Yamaha RX-V583 in the fall of 2017, I think for around $499, and the main reason I got it was that, unlike a lot of other AVRs (at the time, mind you), it has four HDMI 2.0 inputs, *all* with HDCP 2.2. It can pass 4K 4:4:4 60p just fine, and Yamaha put out a software update a few months after I bought it that made Dolby Vision passthrough possible. For what I paid for it and the features it gives, I can't complain. That's basically at the upper-low / lower-mid end of AVRs, tier-wise.

Also AFAIK, all broadcast content and even 4K UHD media are 4:2:0, so unless you’re connected to a PC it won’t make a difference if it’s 4:4:4. But at least your TV interface (or my Apple TV 4K menus) will look pretty.

I can understand not connecting a TV with their snoop history, but then again I keep an Amazon Echo connected - it's just too damn convenient. I do agree about at least connecting a TV once in a while for firmware updates. That's a good point.

Besides the TV snoop angle, smart TV apps are usually slower and clunkier. You can add an Nvidia Shield TV or an HTPC instead. Shield and consoles have Netflix, Prime Video, Plex, YouTube, etc. Shield and consoles also have Twitch, which Roku doesn't, and Shield can run Kodi if you're into that as well. YouTube especially seems resource-hungry/unoptimized, so it performs worse and crashes more often on smart TVs and weaker Rokus, but it's fast on the Shield. The whole Shield interface is snappy and has the full Google Play (Android) store and Nvidia's game store available.

Agreed. I never use the TV apps. I hear they can be quite good these days if you don't have a clunker of a TV. WebOS is quite snappy; I just enjoy my ATV 4K too much to use anything else. Likewise for people who use other such devices.
 
I'm interested in seeing what will happen to G-Sync displays from now on. Since AMD cards won't support them, will manufacturers go for compatibility/cheaper implementation and stick with FreeSync only, or will Nvidia give them incentives to use the G-Sync module in upcoming displays as well? Ideally, G-Sync will become just a brand name and all good displays will support variable refresh rate going forward.

G-Sync modules will slowly die out.
 