OLED HDMI 2.1 VRR LG 2019 models!!!!

The main selling point: the 4K120 video mode.

Having to downscale to 1080p for full-chroma content, or to 1440p with reduced chroma, is a severe limitation that HDMI 2.1 alleviates.

If you wanted to buy something _RIGHT_NOW_, sure, but otherwise it's LG/Alienware all the way. Hopefully the 2019 Samsung flagship comes in with 4K120 + BFI as well.
I think you have been misinformed.
It does full chroma at 4K and 1440p; there is no need to downscale to 1080p.
The 1440p mode needs to use RGB full rather than YCbCr. That is the only limitation, and tbh it doesn't matter because full RGB is better.
Colours on this TV are simply fantastic no matter what you use it for.

4K/120Hz can't be pushed with current gfx cards.
By the time gfx cards can do it, they will have HDMI 2.1 and the new series of TVs will be out.
If you want a blazingly good TV/monitor now, the Q9 is a great choice.
 

4K/120Hz can absolutely be pushed with even a midrange card, at least for my usage.
I don't mean to argue here, but not everybody intends to play the latest shooter from EA/Activision/DICE on these things. There are plenty of games that would happily run off a 1070 or even an RX 580 if the port were present, and you need look no further than the Steam Most Played list to see why. The flexibility of resolutions is also important.

As for the Q9FN: it only does full chroma at 4K60 @ 8 bpc and 1440p60 @ 8 bpc, and at 1440p120 it reduces the chroma to 4:2:0. Both video modes only support full chroma at an 8-bit colour depth, even though the panel is 10-bit. So, purely in terms of bandwidth, it's just not up to what the 2019 models will be.
In terms of being a good display it is absolutely brilliant.
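As a rough sanity check on those bandwidth claims, here's a quick back-of-the-envelope script: uncompressed pixel data rate = width x height x refresh x bits per pixel, ignoring blanking/timing overhead. The link capacities are the commonly quoted approximate usable rates (my assumption, not exact spec figures), so treat the "fits" output as an estimate.

# Rough video bandwidth estimator. Ignores blanking/timing overhead,
# so real signals need somewhat more than these figures.

# Approximate usable link rates after encoding overhead (assumed values):
LINKS = {"HDMI 2.0b": 14.4e9, "DP 1.4": 25.92e9, "HDMI 2.1": 42.67e9}

def data_rate(width, height, refresh_hz, bits_per_channel, chroma="4:4:4"):
    """Raw pixel data rate in bits/s for a given chroma subsampling."""
    # 4:4:4 carries 3 full channels per pixel, 4:2:2 averages 2, 4:2:0 averages 1.5.
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * refresh_hz * bits_per_channel * channels

modes = [
    ("4K60 8-bit 4:4:4", data_rate(3840, 2160, 60, 8)),
    ("1440p120 8-bit 4:4:4", data_rate(2560, 1440, 120, 8)),
    ("4K120 8-bit 4:4:4", data_rate(3840, 2160, 120, 8)),
    ("4K120 10-bit 4:4:4", data_rate(3840, 2160, 120, 10)),
]

for name, rate in modes:
    fits = ", ".join(k for k, cap in LINKS.items() if rate <= cap) or "none"
    print(f"{name}: {rate / 1e9:5.1f} Gbps (fits: {fits})")

Which roughly lines up with the thread: 4K60 and 1440p120 full chroma squeeze into HDMI 2.0b, 4K120 8-bit needs DP 1.4 or HDMI 2.1, and 4K120 10-bit 4:4:4 is HDMI 2.1 territory only.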

And then there are minor inconveniences, like having to change your desktop resolution if you want to play something at 120Hz in borderless mode.
 
1440p is only exposed at 120Hz (a 60Hz mode isn't provided without making a custom mode) and is best used with RGB full; the Rtings report covers this.
It does not convert to YCbCr 4:2:0; it remains RGB full.
I use 1440p 120Hz RGB full 8-bit for gaming in SDR and HDR, and it looks great!
What makes you think it converts to 4:2:0 at 1440p?

YCbCr isn't necessary for HDR; this TV accepts it with RGB full 8-bit and still covers the full gamut.
I don't notice banding, so it is perfectly acceptable.
My gfx card output is set to RGB full 8-bit permanently; I have no need to change it.
 

Hey, I was under the impression that it was limited to 4:2:0, but that seems to only be the case for the Xbox One X, so my bad on that one.
However, after checking, Rtings specifically mentions issues with colors on the 1440p @ 120Hz full-chroma setting.
I suppose that could've been addressed in firmware, or it could've been a problem with their setup.

[attached screenshot of the Rtings note on 1440p @ 120Hz chroma]


Nothing is mentioned about what the issues are exactly, though. Have you experienced anything of the sort?
 
As I've explained a few times, 1440p 120Hz is best used with RGB full rather than YCbCr, and it does indeed look great.
The full-chroma issue is moot with RGB full; there is no chroma setting, and it is automatically better than YCbCr 4:4:4.
 

This is ambiguous because the AMD drivers call full RGB "Full Color RGB 4:4:4", in addition to offering a "YCbCr 4:4:4" mode, unlike Nvidia, which just calls it RGB, which is what I'm assuming you're going by.

This seems like poor naming on AMD's part, because it implies RGB can be used with modes that aren't 4:4:4, which simply isn't the case.

Rtings explicitly states that an RX 580 was used for testing as well, so we have no idea what exactly caused the issues on their end. Either way, this could have had a lot of different causes.
 
I agree they are not that clear.
4:4:4 in its strictest sense just means uncompressed colour (no chroma subsampling), but they may be using it in a narrower context, i.e. with YCbCr, as that is the only mode where it can be explicitly configured.
That was my assumption after trying it and having no colour issues with RGB full.
Who knows :)
 
I just realised they state the problem only occurs in PC mode.

I don't use PC mode because the lag is so low it isn't needed, and skipping it also means I can use and configure HDR+ for gaming.
(i.e. I tell the TV my PC is a Blu-ray player)
PC mode allows a bright HDR mode (Game mode), but its menu is in a different place and some configuration options are missing.

HDR+ is truly awesome for non-HDR material. It's like having a CRT's max brightness/contrast again. I turn the backlight right down to the normal SDR level, which lets it boost the image highlights, and it works great.
I leave it on permanently except when using true HDR material (which has the backlight maxed for best effect).
Everyone who has seen my TV thinks HDR+ looks amazing.
 
To be clear, if you followed my reply history: while I mention the features of the Q9FN, when I talk about "a 2019 Q9FN with HDMI 2.1" I'm talking about whatever Samsung model ends up in that top-tier slot with HDMI 2.1, not the 60Hz 4K HDMI 2.0b Q9FN itself (which is a great display, just limited by HDMI 2.0b and Nvidia's VRR support).

----------------------
https://hardforum.com/threads/oled-hdmi-2-1-vrr-lg-2019-models.1974798/page-5#post-1044041750
Any 2019 LG OLED TVs or high-end Samsung Q-series sets with HDMI 2.1 and 4K 120Hz 4:4:4 will lack variable refresh rate off Nvidia GPUs, because Nvidia doesn't support VRR over HDMI (and, I suppose, because the TVs lack a DisplayPort).

The forecast, to my sensibilities, seems to be looking at the Dell HDMI 2.1 OLED gaming display or a Samsung Q9FN 2019 HDMI 2.1 series TV... perhaps in a few years a dual-layer LCD with 3000-nit HDR color and 0.0003-nit black depth, assuming manufacturers roll them out, which could potentially hold ground for 2-3 more years until MicroLED becomes available at high prices rather than astronomical ones.
 

Right, reading comprehension escaped me, I suppose. Either way, some good news for the 2019 "Q9FN" (Q9FO? Q950R? Q9R?) is that, according to the HDTVTest guy, at CES Samsung showed off a prototype that was VA and had OLED-like viewing angles, which Samsung has said will be implemented in the 2019 flagship, though dimming zones will likely remain the same.
 
What's with all this crappy Q9F talk in an OLED/HDMI 2.1 thread lol.
 
Because people are afraid of the burn-in boogeyman and believe the Q9 series are better TVs.

It's amusing to have been using an OLED daily for the past year while I watch people on the internet go on about how unsuitable it is as a monitor and that burn-in is all but a certainty.

/looks at set

Nope. Still none.

Trust me, I put off buying an OLED for a while because I was somewhat concerned about it too. But the proof is in the pudding, and the many owners saying that it's a non-issue can't all be lying.
 
It's a comparison, since there's a lot of championing of some feature sets and dismissal of others in this thread, and in doing so a lot of good information and links have been posted, which can help in making more informed decisions and timetables.

People are talking about response times, what Hz and frame rate limits there are on other displays, the color volume limits of upcoming displays, sample-and-hold blur, dual-layer LCD tech development (an LCD with a monochrome backlight layer), which displays have HDMI 2.1 or comparable tech, which lack HDMI 2.1, which have DisplayPort for current Nvidia VRR support, and yes, burn-in: both the lack of recourse were you unlucky enough to have it happen to you, and the fact that it is mostly avoided by strict controls keeping color brightness low, utilizing a white subpixel on every pixel to read somewhat higher brightness measurements in a polluted color space, and ABL brightness roll-down safety reflexes out of fear of burn-in. High-end color-professional LCDs were brought up; even CRTs, FW900s, were brought up as comparatives.

The 55" Dell gaming oled seems like one of the best bets in the near term as long as you don't care about high HDR color volume for the time being. In addition to there being no mention of pricing - they haven't said if they are going to have a gsync option but in addition to having hdmi 2.1 which nvidia doesn't support VRR on as of now, they will have displayport so at least should be able to have freesync support by nvidia (without full 4:4:4 4k 120z chroma due to the bandwidth limitation on the dp) . They would have some future proofing for whenever hdmi 2.1 shows up on gpus down the line.

The LG OLED TVs and the Samsung Q series, both with HDMI 2.1 in 2019, will be limited by having no DisplayPort, so they only get HDMI 2.0b bandwidth and Hz off of Nvidia cards, with no VRR support on Nvidia's HDMI outputs. They should be able to do 1440p 120Hz.
The Q series has about five times higher HDR color than OLED's true (calibratable) color limit, and about three times higher HDR color than LG OLED's 600-nit, white-subpixel-polluted, roll-down brightness. We don't have data on the Dell gaming one yet.
The OLEDs dominate for side-by-side "SDR+ ranged" per-pixel contrast, with none of the surrounding dimming or bloom from offset FALD areas, and have ultra-deep black levels.
The forecast, to my sensibilities, seems to be looking at the Dell HDMI 2.1 OLED gaming display or a Samsung Q9FN 2019 HDMI 2.1 series TV... perhaps in a few years a dual-layer LCD with 3000-nit HDR color and 0.0003-nit black depth, assuming manufacturers roll them out, which could potentially hold ground for 2-3 more years until MicroLED becomes available at high prices rather than astronomical ones.

So through all the discussions I've leaned one way, then another, and somewhat back again, to be honest. There are still major tradeoffs either way. The amount of HDR gaming content available, and Nvidia GPUs lacking both HDMI 2.1 and support for VRR over HDMI 2.0b, are also big considerations if you're buying a very expensive gaming display in 2019.
 
Let's just be happy that some real progress is finally being made towards 4K 120Hz gaming, with HDMI 2.1 displays coming out this year and Nvidia at least supporting FreeSync over a DP connection. No matter what flavor of display you go for, this year is going to be pretty exciting for just about everyone who has been waiting for some solid 4K120 options.
 
Yes, it's been a long time coming. I enjoy high refresh rates, but I wasn't willing to trade the size or resolution for it.

The Dell looks awesome. I just wonder what they think they can get away with charging for it.
 
Am I crazy for considering 8K QLED, with the theoretical LCD wide viewing angle, *almost* as compelling as the OLEDs?
Like, the Samsung TVs have more software features and are more convenient in a number of ways. They're better products but worse TVs, if that makes sense.

Hang with me for a second: the Q9FN is already a bloody good display. Not OLED good, obviously, but still.

If the 2019 model matches or is even very slightly behind the LG in input lag, but is 8K, that gives you ~140 PPI on demand, and you can still run your games at 4K120 (obviously with BFI) and potentially get sharper everything via upscaling too.
It's too early to call, but if Samsung outdoes themselves with the processing it might edge it out for me, as I won't be using this exclusively for gaming.
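(For reference, the ~140 PPI figure roughly checks out for a 65" 8K panel; quick pixel-density math below, where the screen sizes are just illustrative guesses rather than confirmed 2019 models.)

import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Illustrative sizes only; the actual 2019 flagship sizes are assumptions here.
for size in (55, 65, 75):
    print(f'{size}" 8K: {ppi(7680, 4320, size):5.1f} PPI | '
          f'{size}" 4K: {ppi(3840, 2160, size):5.1f} PPI')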

And it will probably do the Samsung trick that can interpolate game content with a minimal input lag penalty, have the OneConnect box for a uniform, thick-ish profile (seriously, my sphincter won't be able to handle the amount of clench that 5mm OLED screen will bring), and have a longer warranty (dunno why Samsungs are 3 years and LGs are 2 in my area, but still).


Either way, there's a way to go until Computex, where we hopefully get GPUs that will be able to drive these properly via HDMI 2.1.
 

Which is why the Dell/Alienware 55" OLED is compelling: it will have DP 1.4, which means you can buy it and use it at 4K 120Hz 8-bit RGB right away, and in the future, when a card supports HDMI 2.1, you can upgrade and use that instead. Especially if Nvidia decides not to do any kind of refresh or re-release this year supporting HDMI 2.1, because Navi isn't going to be high performance.
 
BFGD DEAD

True HDMI 2.1 with every 2.1 feature: 48 Gbps, variable refresh rate, low-latency gaming mode, etc.

Time to get an AMD card for FreeSync and do the power-saving trick with a 2080 Ti.

Wait, how do you trick Windows into thinking an AMD card is a "low power GPU"? You can do this if you have a Ryzen APU, since the onboard GPU is considered a low-powered GPU. Unfortunately, AMD graphics cards are treated as a "discrete GPU", so you can't tell Windows to specifically select your Nvidia card as the renderer, and the Nvidia control panel won't let you set that feature if you are plugged into the AMD card, which is the only way to get FreeSync. So that means you are limited to the very few games that actually let you select which GPU renders in their menu.
 

There's some sort of Windows settings trick to do it. Linus Tech Tips has a YouTube video where they do it. They use G-Sync with an AMD card and FreeSync with an Nvidia card, and even used a mining card with no video outputs to render and output through another card.
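If I remember right, the toggle in that video is just Windows 10's per-app GPU preference (Settings > System > Display > Graphics settings). It's also stored in the registry, so assuming the usual HKCU\Software\Microsoft\DirectX\UserGpuPreferences key and its "GpuPreference=<n>;" string format (1 = power saving, 2 = high performance), you could set it with a few lines of Python. Whether Windows will ever classify a second discrete card as the "power saving" GPU is exactly the open question; this only sets the preference.

# Sketch: set Windows 10's per-app GPU preference via the registry (the same
# toggle as Settings > System > Display > Graphics settings). Assumes the
# HKCU\Software\Microsoft\DirectX\UserGpuPreferences key and the
# "GpuPreference=<n>;" value format: 1 = power saving, 2 = high performance.
import winreg

def set_gpu_preference(exe_path: str, preference: int) -> None:
    with winreg.CreateKey(
        winreg.HKEY_CURRENT_USER,
        r"Software\Microsoft\DirectX\UserGpuPreferences",
    ) as key:
        # The value name is the full path to the game's executable.
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ,
                          f"GpuPreference={preference};")

# Hypothetical example path; point it at the actual game exe.
set_gpu_preference(r"C:\Games\SomeGame\game.exe", 1)  # 1 = "power saving" GPU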
 
Can you link it? I actually have a Samsung TV that supports FreeSync at 48-120Hz with LFC at 1440p (Samsung Q6FN); I'm using a 1070 though. I would like to test this out with a lower-end AMD card that supports FreeSync.
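(Side note on that 48-120Hz-with-LFC range: as I understand it, LFC just repeats frames whenever the game's frame rate drops below the panel's minimum, so the effective refresh lands back inside the VRR window. A purely illustrative sketch of the idea, not how any particular driver actually implements it.)

def lfc_refresh(fps, vrr_min=48.0, vrr_max=120.0):
    """Illustrative low framerate compensation: repeat each frame enough
    times that the panel refresh lands back inside the VRR window."""
    if fps >= vrr_min:
        return min(fps, vrr_max)      # already inside the range
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier           # e.g. 30 fps shown as 60 Hz

for fps in (24, 30, 45, 60, 110):
    print(f"{fps:>3} fps -> panel runs at {lfc_refresh(fps):.0f} Hz")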
 
Do you remember if this will allow FreeSync over HDMI on an Nvidia card?
I have a 1080 Ti and a FreeSync TV without DisplayPort.
It would be nice to try FreeSync on this TV.
 
Nvidia doesn't support FreeSync. They support "VRR" over DisplayPort. They might end up supporting VRR over HDMI 2.1, but they certainly won't support FreeSync over HDMI 2.0, which AMD implemented itself and which isn't part of the HDMI 2.0 standard. I'm assuming you also have a Samsung TV.

I've researched this a ton and can't find a way to trick Windows into thinking your AMD card is low-powered. Hopefully Shark has it, but I'm doubtful. There was a project called setgpu which was developed to let you select your rendering GPU, but it was basically one guy, only worked with a few games, and has been discontinued.
 

I will see if I can find the video when I get home tonight.
 
Here is the video


He has an AMD APU and a discrete Nvidia GPU. Then he just switches the game to use power-saving mode or performance mode, and it will use G-Sync or FreeSync accordingly.

But I looked it up and people in the forums are saying you can't do it with 2 discrete GPUs.

Someone needs to figure out how to hack that.
 

Yeah, I saw that video. It was a bit disingenuous of Linus to suggest there are two valid ways of achieving this. As far as hacking it goes, I don't think there is a large enough market. You have people who own Samsung TVs from 2018 and 2019, and then maybe the LG OLED people from 2019 and up. I get the feeling that Nvidia will eventually support VRR over HDMI 2.1 and up, like they support VRR over DisplayPort since it's a VESA standard. So that leaves people like me screwed, who bought Samsung HDMI 2.0 sets with FreeSync (not VESA VRR).

I'm just going to go for a smaller build for my living room with AMD hardware anyway, so it won't be too big of a deal. I'm really hoping next-gen consoles support FreeSync, though, and not just VRR over HDMI 2.1. Sony has never bothered to support FreeSync even though they use AMD hardware, while Microsoft actually has with the Xbox One.
 
The monsters at LG have dropped the 3.5ms MPRT along with 120Hz BFI support :mad:

Other than that though, if 4K120 has ~7ms lag like 1440p120, then this is next level.
 