OLED HDMI 2.1 VRR LG 2019 models!!!!

sharknice



BFGD DEAD

True HDMI 2.1 with every 2.1 feature: 48 Gbps, variable refresh rate, low latency gaming mode, etc.

Time to get an AMD card with FreeSync and do the power saving trick with a 2080 Ti.
 
HDMI 2.1 - worth waiting for.
VRR is something every vendor should support, because it will move a wealth of products, not just TVs.
Now, why on earth shouldn't monitors also have HDMI 2.1 in 2019?
 


BFGD DEAD

True HDMI 2.1 with every 2.1 feature: 48 Gbps, variable refresh rate, low latency gaming mode, etc.

Time to get an AMD card with FreeSync and do the power saving trick with a 2080 Ti.


Dude, no AMD card has HDMI 2.1... so it doesn't matter if the TV will have it; the GPU will need to have it too. But yes, still EXTREMELY exciting news.
 
Can one of you pros explain why this is so amazing compared to the TVs currently out now?
 
Dude, no AMD card has HDMI 2.1... so it doesn't matter if the TV will have it; the GPU will need to have it too. But yes, still EXTREMELY exciting news.

Navi will almost certainly have HDMI 2.1; the chipset has been out for the better part of a year.

In 2019 I might go all AMD with Zen 2 and dump the 2080 Ti. Tired of no G-Sync on big 4K displays. I'll get a 4K TV with VRR and game on AMD Navi. Screw Nvidia and their proprietary crap. Who wants to spend $5k on a display just for 4K G-Sync?
 
I would say it's worth buying a FreeSync monitor over a G-Sync one now. If Intel and TVs all start using FreeSync, Nvidia will break one day and do it too.
 
Navi will almost certainly have HDMI 2.1; the chipset has been out for the better part of a year.

In 2019 I might go all AMD with Zen 2 and dump the 2080 Ti. Tired of no G-Sync on big 4K displays. I'll get a 4K TV with VRR and game on AMD Navi. Screw Nvidia and their proprietary crap. Who wants to spend $5k on a display just for 4K G-Sync?

Navi is rumored to only be around the performance of an RTX 2070, not really enough for some 4K games. But can't we still do the trick of using a low-end AMD GPU for FreeSync while keeping the 2080 Ti as the rendering GPU, as long as the AMD card has HDMI 2.1? Surely even the low-end Navi would have it.
 
Navi is rumored to only be around the performance of an RTX 2070, not really enough for some 4K games. But can't we still do the trick of using a low-end AMD GPU for FreeSync while keeping the 2080 Ti as the rendering GPU, as long as the AMD card has HDMI 2.1? Surely even the low-end Navi would have it.

I don't play many shooters. Mainly playing RTS or RPG games, I tested my 1440p G-Sync monitor's lower range, and I found I couldn't tell the difference between 42 fps and 60 fps. I had to run 200% resolution scaling with AA jacked up on my 2080 Ti to force the fps that low in a lot of the games I tested (like Witcher 3).

Also, I personally find that simply raising texture detail to max and reducing the lighting to medium with AA off lets even a GTX 1080 class card hit 60 fps in most games at 4K. If Navi is in that performance range it might be good enough for me; it would certainly allow me a lot more flexibility in displays than going Nvidia, and it shouldn't be too hard to target 40+ fps with lower settings.
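For context on why 200% scaling drags the framerate down that far, a quick pixel-count comparison (just illustrative arithmetic, assuming a 2560x1440 panel):

# 200% resolution scale on a 1440p panel renders 2x each axis, i.e. 5120x2880
native_1440p = 2560 * 1440                # ~3.7 million pixels
scaled_200 = (2560 * 2) * (1440 * 2)      # 5120 x 2880, ~14.7 million pixels
native_4k = 3840 * 2160                   # ~8.3 million pixels

print(scaled_200 / native_1440p)          # 4.0 -> four times the pixels of native 1440p
print(round(scaled_200 / native_4k, 2))   # ~1.78 -> noticeably more pixels than native 4K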
 
Navi is rumored to only be around the performance of an RTX 2070, not really enough for some 4K games. But can't we still do the trick of using a low-end AMD GPU for FreeSync while keeping the 2080 Ti as the rendering GPU, as long as the AMD card has HDMI 2.1? Surely even the low-end Navi would have it.

Good enough for emulation, which is all that really matters at this point anyway.
 
Waiting paid off...
HDMI 2.1 - worth waiting for.
VRR is something every vendor should support, because it will move a wealth of products, not just TVs.
Now, why on earth shouldn't monitors also have HDMI 2.1 in 2019?
Because the chipset to do it (just HDMI, none of the other functions) will probably be a million-dollar minimum order, and for niche gaming screens that isn't feasible.
 
Whatever. The next consoles will be 4K30 anyway, maybe some games at 60 fps.
And on PC, the best hardware now gives you 4K60... but I wouldn't want a 55" on my desk.
 
I am not convinced this is the real deal 48 Gbps chipset. You can do all sorts of stupid stuff to 4K/120 Hz, like reducing chroma to 4:2:0 8-bit, dropping HDR, up-sampling/down-sampling, or compression, to fit it into 18 Gbps HDMI 2.0 chips.

I smell something that stinks. I hope I am wrong.
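For reference, the back-of-the-envelope math on why 4:2:0 8-bit lets 4K/120 squeeze into an 18 Gbps chip (active-pixel data only, ignoring blanking and encoding overhead, so treat these as ballpark figures):

# Raw active-pixel data rates, ignoring blanking and link-level encoding overhead
def rate_gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

# 4K/120 at 4:4:4 10-bit (30 bits per pixel) - far beyond HDMI 2.0's ~14.4 Gbps of usable video data
print(round(rate_gbps(3840, 2160, 120, 30), 1))   # ~29.9 Gbps

# 4K/120 at 4:2:0 8-bit (averages out to 12 bits per pixel) - squeezes under the HDMI 2.0 limit
print(round(rate_gbps(3840, 2160, 120, 12), 1))   # ~11.9 Gbps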
 
Navi is rumored to only be around the performance of an RTX 2070, not really enough for some 4K games. But can't we still do the trick of using a low-end AMD GPU for FreeSync while keeping the 2080 Ti as the rendering GPU, as long as the AMD card has HDMI 2.1? Surely even the low-end Navi would have it.
HDMI VRR isn't FreeSync. The whole point of HDMI VRR is to eliminate the need for a software layer to make VRR work, meaning that as long as a video card has an HDMI 2.1 output, no matter who manufactured it, you will get VRR when it is connected to a display with an HDMI 2.1 input.
I am not convinced this is the real deal 48 Gbps chipset. You can do all sorts of stupid stuff to 4K/120 Hz, like reducing chroma to 4:2:0 8-bit, dropping HDR, up-sampling/down-sampling, or compression, to fit it into 18 Gbps HDMI 2.0 chips.

I smell something that stinks. I hope I am wrong.
I tend to agree. The Xbox One X is HDMI 2.1 in that it supports VRR and ALLM, but it does not support the full 48 Gbps bandwidth.
 
I'll be keeping my eye on these but I still plan to keep my OLED C6 for a bit longer. I'd love it if there was a breakthrough in blur reduction on these. The 2018 models had black frame insertion but I heard it could have been implemented better.
 
We really need to get to the point where we can just seamlessly use HDMI with a receiver with a computer monitor and not have to have phantom monitors and other horse shit. The whole situation we're in right now is a disgrace.
 
I can push 200 Gbps over a $20 passive QSFP DAC cable up to 3 m or so, but we had to wait how long to get shitty crippled HDMI above 18 Gbps? (Which many retail stores will try to trick you into a $100 branded cable for.)
And even then a lot of TVs do dumb chroma shit, and the default is compressed black levels, etc.

Endless bottom-dollar mediocrity and Hollywood ball-sucking crypto is annoying.
 
True HDMI 2.1 with every 2.1 feature: 48 Gbps, variable refresh rate, low latency gaming mode, etc.

I would temper our excitement until we get the actual details next week. It wouldn't surprise me to see limited implementation of HDMI 2.1 with the first round of products from this year's CES.

It doesn't help that the HDMI Org has allowed HDMI 2.1 features to be implemented piecemeal, with some able to be implemented on HDMI 2.0. We know that these 2019 models will come with VRR, ALLM, and eARC, but AFAIK all of those can be implemented on 18 Gbps chipsets.

In other words, we want the "full fat" 48 Gbps chipsets.

The gold standard we're aiming for (especially for PC use) continues to be 4K @ 120 Hz @ 4:4:4 10-bit HDR. That kind of bandwidth clocks in at around 45 Gbps. Moreover, these 2019 models are meaningless if we don't also have graphics cards that use HDMI 2.1 and push all these features too.
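For anyone chasing the numbers, the different figures people quote for 4K/120 4:4:4 10-bit mostly depend on what gets counted: active pixels only, blanking, and/or link-level encoding overhead. A rough illustration assuming standard CTA 4K timing (4400x2250 total), not an authoritative spec calculation:

# Three ways to count bandwidth for 4K @ 120 Hz @ 4:4:4 10-bit (30 bits per pixel)
bpp, hz = 30, 120
active = 3840 * 2160          # visible pixels only
total = 4400 * 2250           # typical CTA-861 4K timing including blanking

print(round(active * hz * bpp / 1e9, 1))            # ~29.9 Gbps - raw pixel data
print(round(total * hz * bpp / 1e9, 1))             # ~35.6 Gbps - with blanking intervals
print(round(total * hz * bpp / 1e9 * 10 / 8, 1))    # ~44.6 Gbps - as a TMDS-style link rate (8b/10b)

That last figure is presumably where the ~45 Gbps number comes from.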
 
This is good news.

I will upgrade my C7 as soon as I get some devices that can actually use 2.1 output.
 
I would temper our excitement until we get the actual details next week. It wouldn't surprise me to see limited implementation of HDMI 2.1 with the first round of products from this year's CES.

It doesn't help that the HDMI Org has allowed HDMI 2.1 features to be implemented piecemeal, with some able to be implemented on HDMI 2.0. We know that these 2019 models will come with VRR, ALLM, and eARC, but AFAIK all of those can be implemented on 18 Gbps chipsets.

In other words, we want the "full fat" 48 Gbps chipsets.

The gold standard we're aiming for (especially for PC use) continues to be 4K @ 120 Hz @ 4:4:4 10-bit HDR. That kind of bandwidth clocks in at around 45 Gbps. Moreover, these 2019 models are meaningless if we don't also have graphics cards that use HDMI 2.1 and push all these features too.

Seems like your math's off.
30 bpp * 4K * 120 Hz is 29.8 Gbps.
These chipsets are theoretically capable of ~4K@190(ish)Hz@10-bit or 5K (5120x2160) ultrawide@144(ish)Hz@10-bit. Both of those are ~48 Gbps.

Unless HDR is additional and not incorporated within the 4K 10 bpc, I don't see where the extra overhead fits?


Unfortunately for us, LG probably doesn't care enough to support custom display modes.
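Sanity-checking those ~48 Gbps figures with the same raw-pixel arithmetic (no blanking or encoding overhead, so real-world ceilings will sit a bit lower):

# Raw 10-bit 4:4:4 pixel rates for the hypothetical modes mentioned above
def rate_gbps(w, h, hz, bpp=30):
    return w * h * hz * bpp / 1e9

print(round(rate_gbps(3840, 2160, 190), 1))   # ~47.3 Gbps - 4K at ~190 Hz
print(round(rate_gbps(5120, 2160, 144), 1))   # ~47.8 Gbps - 5K2K ultrawide at ~144 Hz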
 
I would temper our excitement until we get the actual details next week. It wouldn't surprise me to see limited implementation of HDMI 2.1 with the first round of products from this year's CES.

It doesn't help that the HDMI Org has allowed HDMI 2.1 features to be implemented piecemeal, with some able to be implemented on HDMI 2.0. We know that these 2019 models will come with VRR, ALLM, and eARC, but AFAIK all of those can be implemented on 18 Gbps chipsets.

In other words, we want the "full fat" 48 Gbps chipsets.

The gold standard we're aiming for (especially for PC use) continues to be 4K @ 120 Hz @ 4:4:4 10-bit HDR. That kind of bandwidth clocks in at around 45 Gbps. Moreover, these 2019 models are meaningless if we don't also have graphics cards that use HDMI 2.1 and push all these features too.

They're specifically saying it supports everything in the press release, unlike other TVs and the Xbox's partial support.

But yes, now we need a graphics card that does too.
 
They're specifically saying it supports everything in the press release, unlike other TVs and the Xbox's partial support.

But yes, now we need a graphics card that does too.

All I'm saying is I didn't see 48 Gbps mentioned. It seems other sites are inferring that, but it makes me raise my eyebrow until we know for sure.
 
I'm excited to see this tech finally hit the streets. In another couple years, it might actually be affordable :D

I'm still going to enjoy my C7 for another 5 years, while you early adopters have fun!
 
Seems like your math's off.
30 bpp * 4K * 120 Hz is 29.8 Gbps.
These chipsets are theoretically capable of ~4K@190(ish)Hz@10-bit or 5K (5120x2160) ultrawide@144(ish)Hz@10-bit. Both of those are ~48 Gbps.

Unless HDR is additional and not incorporated within the 4K 10 bpc, I don't see where the extra overhead fits?


Unfortunately for us, LG probably doesn't care enough to support custom display modes.

Yes, HDR adds significantly more bandwidth to the requirement. The math is correct.

If you want to play around with the numbers, just go here: https://www.extron.com/product/videotools.aspx
 
I'm excited to see this tech finally hit the streets. In another couple years, it might actually be affordable :D

I'm still going to enjoy my C7 for another 5 years, while you early adopters have fun!

I think they'll keep the price the same as last year's models despite the upgrades. I'm glad I waited and didn't buy a 2018 model on Black Friday.
 
Dude, no AMD card has HDMI 2.1... so it doesn't matter if the TV will have it

I don't know if this is true. You could use Custom Resolution Utility to overclock your HDMI connection and set custom FreeSync ranges. So even if 120 Hz isn't possible, 90 Hz or 75 Hz might be. Still very cool.
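If anyone wants to sanity-check a CRU overclock before trying it, here's a rough data-rate comparison against HDMI 2.0's usable ~14.4 Gbps (active pixels only, so real requirements with blanking are somewhat higher; the refresh rates below are just hypothetical targets):

# Rough 4K data-rate check for custom refresh rates on an HDMI 2.0 link
HDMI20_USABLE_GBPS = 14.4    # 18 Gbps link minus 8b/10b encoding overhead

def gbps(w, h, hz, bits_per_pixel):
    return w * h * hz * bits_per_pixel / 1e9

for hz in (75, 90, 120):
    rgb_444 = gbps(3840, 2160, hz, 24)    # 4:4:4 / RGB 8-bit
    sub_420 = gbps(3840, 2160, hz, 12)    # 4:2:0 8-bit
    print(hz, round(rgb_444, 1), round(sub_420, 1), "vs", HDMI20_USABLE_GBPS)

Which suggests the higher custom refresh rates would realistically need 4:2:0 (or 4:2:2) to fit, much like the consoles do it.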
 
I think they'll keep the price the same as last year's models despite the upgrades. I'm glad I waited and didn't buy a 2018 model on Black Friday.
So that means €2500 for the 55" until the 2018 models have sold out. Gonna be a long wait.
 
Dude, no AMD card has HDMI 2.1... so it doesn't matter if the TV will have it; the GPU will need to have it too. But yes, still EXTREMELY exciting news.
Yes, but AMD also has an announcement in 5 days at CES. The hope for a match made in heaven in Q2 2019 still lives.
 
You can use ARC with an appropriate receiver to play smart TV sources or F-connector sources through your receiver's speakers instead of the TV's. Say your receiver does not have enough HDMI ports for all of your equipment: you can plug them into your TV and, with eARC, not pay the penalty you did with ARC, which only gave you 2-channel Toslink-class audio instead of eARC's 37-40 Mbps, plus Ethernet, in addition to the rest of the HDMI 2.1 specs.
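To put the ARC vs eARC gap in rough numbers (plain LPCM data rates; actual codecs and container overhead will differ):

# Uncompressed LPCM audio data rates: roughly why ARC's optical-class bandwidth
# limits you to stereo (or lossy multichannel) while eARC's ~37 Mbps does not
def lpcm_mbps(channels, sample_rate_hz, bits):
    return channels * sample_rate_hz * bits / 1e6

print(round(lpcm_mbps(2, 48_000, 16), 2))    # ~1.54 Mbps - plain stereo PCM, fine over ARC/Toslink
print(round(lpcm_mbps(8, 192_000, 24), 2))   # ~36.86 Mbps - 7.1 hi-res LPCM, needs eARC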
 
Don't use an appropriate receiver?

Depends on the setup. For the living room, less is more and if I can use the TV for source switching, it's better than having to use a receiver to do it. Like Wade88 said, if you're using the TV's app as a source, having eARC will get you full audio support as well.
 
I prefer to have the AVR do the source switching so there can be minimal wires in the wall to the TVs. I have big cats that like to leap around, so the TVs get mounted to studs in the wall and have polycarbonate anti-cat armor. One power cable and one HDMI run into the wall in a conduit next to the conduit for the satellite speakers that aren't L/C/R and the RCA to the sub, and come out behind the TV so the cats can't unplug anything. If you switch all your sources with the AVR, it's just the one ARC cable for TV sources and traditional AVR duty for your other sources. If you mount on a regular piece of furniture this might not be a consideration. It's also nice to keep my nieces from destroying them when they're not as closely attended.
 
Yes, HDR adds significantly more bandwidth to the requirement.

It doesn't. HDR10 metadata can be encoded in the video stream's SEI (which already exists) and adds negligible overhead to the file (since SEIs are only present once per GOP), and it doesn't translate to any additional bandwidth needed, since you've already established the display parameters when changing display modes.
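For a sense of scale on that point (the byte and bitrate figures below are ballpark assumptions, not measurements from any particular encoder):

# Ballpark overhead of HDR10 static metadata carried in an SEI message.
# Even if the SEI were repeated every frame (far more often than once per GOP),
# it is noise next to a typical UHD video bitrate.
sei_bytes = 40          # assumed rough size of a mastering-display SEI payload
fps = 60
video_mbps = 40         # assumed typical-ish high-bitrate UHD stream

overhead_mbps = sei_bytes * 8 * fps / 1e6
print(round(overhead_mbps, 4))                       # ~0.0192 Mbps
print(round(100 * overhead_mbps / video_mbps, 3))    # ~0.048 % of the stream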
 