OLED HDMI 2.1 VRR LG 2019 models!!!!

No one has produced even a prototype of a display that can compete with G-Sync HDR yet, so this seems pretty unlikely.

Honestly, not many people even give a shit about HDR. Eliminating stuttering and running at 90 Hz+ are objective improvements, while HDR seems more like a matter of taste. A lot of people don't even like the way it looks.
 


Updated burn-in test from RTINGS. Only the sets that were running CNN for 20 hours a day showed burn-in. The sets used only for gaming were fine even after thousands of hours of use. Sorry, but this whole burn-in scare, while a real possibility, is also very unlikely for a lot of people.
 
Yeah, 6,000 hours of playing the SAME game with the HUD up and literally zero burn-in. OLED burn-in is totally overrated. I think anyone with a quarter of a brain wouldn't play nothing but CNN for 20 hours a day for thousands of hours.
 

Go figure.

But it won't stop the Burn-In Brigade around here from hating on OLED despite never having owned one. They see whitepaper specs and torture tests and are paranoid they'll have a burned-in screen after 5 minutes of use.
 

It's a psychological defense mechanism for sticking with crappy LCD. Their simple minds think gaming for 30 minutes a day and playing CNN for 20 hours a day for thousands of hours are the same thing.
 
Very interesting video, thanks. It would be nice if they'd back it up with a warranty of some sort. Samsung's LED FALD LCDs have a 10-year burn-in guarantee. With an OLED, if it happens (and there are plenty of claims of it in product reviews on major sites), you are screwed.
6:15 "This also indicates that there may be a 'luck component' that in some cases may be more significant than the theoretical lifespan"




It is encouraging, especially for OLED enthusiasts, considering HDMI 2.1 sets are coming out.

Regardless, OLED still can't do high-brightness HDR color, and part of the reason it can't (and why that gap will only grow) is the exact same reason these sets fared well in those tests: avoiding burn-in risk. It's a bit like reducing your color gamut out of fear that flying too high into it will wreck your display.

They do roughly 400-nit color, and the added white subpixel can push brighter whites, but not brighter HDR color volume. So they show much less color past 400 nits or so, even if the brightness spec says 800+ nits for brief periods. They also apply brightness roll-downs, dimming the whole screen on scenes with high overall *color* brightness as a safety trigger, so again, their color is held back in both of these dynamic scenarios to keep the screen from burning in.

If true HDR color volume past SDR-ish/quasi-HDR levels, showing the color gamut into HDR color in the 1,000-2,000-nit range (and further toward the 4,000 and 10,000 nits HDR is mastered for, on future TVs), doesn't matter to you, and you are willing to take the burn-in risk, then the black depth and per-pixel contrast of OLED is amazing.
If you want real HDR color volume of 1,000-2,000-nit color out of HDR1000 and HDR4000/10,000 content, without being held back by safety mechanisms and caps, and without playing the burn-in lottery, OLED might not be for you.

Of course, the current state of FALD and small dynamic backlight arrays has its own trade-offs, with dimming- or blooming-favoring algorithms across screen zones since they aren't per-pixel, so they aren't a perfect solution either. But personally, I'm more interested in achieving real HDR color volume going forward.
 

Yes, true. But let's look at MicroLED. In just one year Samsung has managed to shrink The Wall from 146 inches down to 75 inches. In 2-3 more years they may be able to shrink it down to 55 inches, and HDR content will be more prevalent by then. If you absolutely need that HDR performance, then MicroLED is the way to go, all without burn-in risk. If you are more interested in achieving real HDR performance, then that's what you want, not these crappy FALD LCDs.
 
I'll be in on those for sure. For now, as for what seems like forever... trade-offs.
 
Burn-in doesn't have to be permanent. I've had older LCDs that suffered from temporary image retention, and I can tell you even that was hugely annoying in desktop use. It was usually things like the Windows taskbar that got retained, since that rarely changes on screen, and it showed faintly on dark images. So even if you never get permanent burn-in on an OLED, temporary retention is still possible. I can't say whether that occurs with modern OLED TVs.
 
https://www.engadget.com/2019/01/08/alienware-4k-oled-55-inch-gaming-monitor/

It seems like this may be an alternative for those interested in jumping in early. It has DP 1.4 support, so it avoids the need for HDMI 2.1 on video cards to get 4K 120 Hz.

Yes, but if it doesn't have HDMI 2.1 then it's going to be somewhat obsolete. Having both connections ensures it's future-proof. If it only has DP 1.4, then once HDMI 2.1 GPUs are out we won't be able to take advantage of them on this display.
 
In 2014 a company made a prototype MicroLED smartwatch with something insane like 1,700 PPI, and that wasn't even the first MicroLED display there was. So the reason Samsung made "The Wall" so big was not because they were working on shrinking it, but to make an impressive display that would grab attention and headlines. They one-upped themselves this year by increasing the size of The Wall to 219 inches at CES.

Sony also showed a 55" 1080p MicroLED TV back in 2012.

The issue has never been shrinking it down, but rather making it larger. Small is easy. The problem is that MicroLED is made on silicon wafers like CPUs. A wafer is only so big, so you can't make a TV-sized wafer very easily; that's why the tech hasn't been around until now. Samsung figured out a way around this: make 4" wafers (which is still huge for a single piece of silicon), put interconnects at the edges of them all, and stitch them together into a single cohesive unit to make a large display.

Remember that Samsung's MicroLEDs are 10 μm wide. If my math is right (and it could very easily be wrong at these tiny sizes), that means in a 75" display Samsung could theoretically have a resolution of 50,933 x 28,800 at a PPI of 2,340 (those are the numbers I get using three subpixels per pixel at 10 μm per subpixel on a 75" screen). So like I said, going tiny is easy; you can already make the pixels insanely small. Going big is the really hard part.
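Just as a sanity check on that back-of-the-envelope math, here's the same calculation sketched in Python, assuming square pixels built from three 10 μm subpixels side by side (a 30 μm pitch) with no gaps. Those are my assumptions, not Samsung's actual layout, so treat the exact figures loosely; either way it lands far beyond 8K, which is the point:

```python
# Rough check: theoretical pixel count of a 75" 16:9 panel built from
# 10 um MicroLED subpixels. Assumes square pixels of three 10 um
# subpixels side by side (30 um pitch) and no gaps between pixels.
from math import hypot

diagonal_in = 75.0
aspect_w, aspect_h = 16, 9
pixel_pitch_um = 3 * 10.0          # 3 RGB subpixels, 10 um each
um_per_inch = 25_400

diag_units = hypot(aspect_w, aspect_h)
width_um = diagonal_in * (aspect_w / diag_units) * um_per_inch
height_um = diagonal_in * (aspect_h / diag_units) * um_per_inch

h_px = int(width_um // pixel_pitch_um)
v_px = int(height_um // pixel_pitch_um)
ppi = um_per_inch / pixel_pitch_um

print(f"{h_px} x {v_px} at ~{ppi:.0f} PPI")
# -> roughly 55,000 x 31,000 at ~850 PPI under these assumptions
```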
 
MicroLED has been at about that 75-inch size for about a year now.

You should use the commercial displays for comparison: 0.5 mm is the current smallest pixel pitch, with 0.7 mm being the previous smallest. The Samsung set works out to about 0.435 mm for 4K at 75 inches.

It's not really that much of a shrink, and they're under absolutely no pressure to go below 65 inches given how expensive these will be at the start.
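For what it's worth, that ~0.435 mm figure checks out roughly from the geometry alone (assuming the full 75" 16:9 diagonal is active area):

```python
# Pixel pitch of a 4K (3840-wide) panel at a 75" 16:9 diagonal,
# assuming the quoted diagonal is all active area.
from math import hypot

width_mm = 75 * 25.4 * 16 / hypot(16, 9)
print(f"{width_mm / 3840:.3f} mm pitch")   # ~0.432 mm, close to the ~0.435 mm quoted
```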
 
Hmm.

Looking at all the announced 2019 models, they are all 65+ inches, which means they are useless as monitors.

Hopefully they will come in smaller sizes later.

My ideal TV would be:
- 40"-43"
- 4K resolution at 4:4:4 chroma
- No smart features at all. Just a TV.
- 120+ Hz max available refresh
- Low response time
- VRR
- HDMI 2.1

Heck, I don't even need it to have speakers.

I'm a little leery of OLED though, because of the burn-in issues. I'm so tired of being paranoid with the Panasonic plasma I use in my home theater.


At some point I'll also be in the market for an upgraded, larger screen for the home theater, but first I want a small one with VRR to use as a monitor.
 
Nope. The C9, B9, and E9 are available in 55", as usual. We already knew there wouldn't be anything in the 40-50" range until next year at the earliest, if ever. It's not a popular size anymore, and LG's panel production is focused on grabbing as much TV market share as possible; it doesn't make sense for them to spend capacity on a specialized monitor size, for lots of reasons. The market for PC monitors that cost more than $500 is very small.

If your plasma is one of the last-gen ones, I doubt there's any reason to be paranoid. I've had a Samsung PN60F8500 since 2014; I've forgotten about it with the Windows desktop up for 12+ hours and it didn't even have image retention. I don't think the last-gen plasmas really suffered burn-in at all.

You have to pour thousands of hours of identical static content into an OLED to burn anything in.
 
The market may be what it is, but 43" is my max size. Above that size, no sale.

It's a shame that this size has been relegated to the realm of budget TVs.

I paid $2k for my 2015 48" Samsung JS9000, and would do it again for the right TV. I want it a little smaller and with VRR this time, though.

It has to be top end picture quality though. I don't want some cost reduced model just because it is smaller.

Throw the best picture quality you can muster into a nice little 43" package, omit any and all smart functions, and we're in business.
 
Then your best bet is the Asus XG438Q: a 43" VA panel with 120 Hz VRR.
 
Thank you for pointing out the XG438Q.

In years past I would have shunned VA, but my experience with the SVA panel in my Samsung JS9000 has been good. Not perfect, but very good, so I'd give this a chance.

Short of a MicroLED or OLED panel with the same specs, this looks like it could be the perfect monitor for me.

Once I confirm Nvidia support for its VRR, I'll probably buy one, provided it isn't insanely priced.
 
Ugh, too bad we PC gamers have to wait until HDMI 2.1 GPUs come out. NVIDIA's 7 nm parts may not arrive until 2020.

So the decision:

1. Buy a C9 this spring and wait until 2020 for an NVIDIA HDMI 2.1 GPU, and only then get 4K/120 VRR

or

2. Buy an Alienware 4K/120 55" OLED monitor this fall, use it with my current DisplayPort 1.4 RTX Titan, and get 4K/120 VRR sooner?
 
It's crazy that Nvidia didn't put HDMI 2.1 on the RTX cards. The 2080 Ti is already $1,200, lol, fn Nvidia.
Even if HDMI 2.1 is 6 months out, it's not like Nvidia has any competition in the high-end market.
 
There wasn't enough time between the HDMI 2.1 spec release and the design of the Turing architecture to include HDMI 2.1.
 
I can understand not connecting a TV, given their snooping history, but then again I keep an Amazon Echo connected; it's just too damn convenient. I do agree about at least connecting a TV once in a while for firmware updates. That's a good point.

Besides the TV snooping angle, the smart TV apps are usually slower and clunkier. You can add an Nvidia Shield TV or an HTPC instead. Shield and consoles also have Netflix, Prime Video, Plex, YouTube, etc. Shield and consoles also have Twitch, which Roku doesn't, and Shield can run Kodi if you're into that as well. YouTube in particular seems resource-hungry and unoptimized, so it performs worse and crashes more often on smart TVs and weaker Rokus, but it's fast on Shield. The whole Shield interface is snappy and has the full Google Play (Android) store and the Nvidia game store available.

I'd prefer my TVs completely without any "smart" capability at all. It's getting very difficult to buy one, though. You could choose to never plug one into Ethernet or give it your Wi-Fi password, though.

IMHO, anyone who has an Amazon Echo or Google Home on their network at all is a huge sucker.
 
I really don't even get why companies like Samsung are at the forefront of stuff like MicroLED. You'd think that AMD and Intel would want to get in on that action since making stuff small is basically their area of expertise.
 
Yeah no, it's big stuff.

Plus there's a ton of patents and know-how involved.
 
I wonder if NVIDIA will find itself in a similar situation to 2018, where RTX inventory could stagnate because gamers will hold off for HDMI 2.1. As a result, we could see big sales as we head into spring and summer.

I have no horse in the race when it comes to HDMI 2.1 on the GPU end, though. I picked up a 1080 Ti when there were fire sales going on this fall, so I won't be back in the market until 2021 at the earliest. At least by that point, HDMI 2.1 will be firmly established.
 
Perhaps this is why they chose to release a low-inventory, high-price card when they did.
They knew there would be excess inventory leading up to HDMI 2.1.
The high price also helped shift excess 10-series stock.
This way they make a ton more profit on the new cards and have fewer issues moving forward.
 
I really don't even get why companies like Samsung are at the forefront of stuff like MicroLED. You'd think that AMD and Intel would want to get in on that action since making stuff small is basically their area of expertise.

Samsung is actually at the forefront. AMD doesn't even have fabs anymore; they just design products with software that is made to work on other companies' fabs. That's not to say the engineers doing those designs aren't brilliant, but AMD isn't actually doing the small-scale fabrication itself.

Samsung, on the other hand, does the R&D and has its own fabs, some of the most advanced in the world. GlobalFoundries (formerly AMD's fabs) licensed Samsung foundry tech in the past and was going to do so again on an upcoming node before it was cancelled. Samsung will also be first with a 7 nm EUV process, and it will be significantly better than TSMC's 7 nm process. They have already been shipping 10 nm and 7 nm processors for smartphones. On top of that, Samsung designs its own Exynos processors, its own DRAM and NAND, and the controller chips for those products.

Then we have MicroLED being, at the end of the day, a display technology, which is exactly what Samsung excels at with its TVs. So Samsung really is far better suited to push MicroLED than AMD or Intel.





Perhaps this is why they chose to release a low inventory high price card when they did.
They knew there would be excess inventory leading up to HDMI 2.1.
The high price also helped shift excess 10xx series stock.
This way they make a ton more profit on the new cards and have less issues moving forward.
Nvidia chose to release Turing when they did because they had already spent the R&D money on ray-tracing cores and an architecture for them aimed at self-driving cars; then that deal fell through, and they had to make money off the R&D they had already spent.
 
This is all so confusing.

So this TV is coming out in 2019 and will do 120 Hz at 4K? What about at 1080p? (The Samsung Q6N from last year did 120 Hz at 1080p with an input lag of 9.4 ms.)

Any idea when these will hit or what they will cost? What about input lag? Sub-10 ms?

Lastly, price point? I am in the market for a multipurpose 65" set for the media area, and this looks promising.
 
It's far less complicated than what Intel already does.

It might actually be hard for AMD and Intel to find a fab that can make something as big as a MicroLED, lol. Transistors are in the low-nanometer size range; a MicroLED is roughly 1,000x larger than the transistors AMD, Intel, Samsung, Qualcomm, and Apple use to make CPUs and SoCs. The fab used to make these has to support a device that large.
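Roughly where that 1,000x comes from, taking ~10 nm-class transistor features versus the 10 μm MicroLED emitters mentioned above (both round-number assumptions, just for scale):

```python
# Scale comparison: ~10 um MicroLED emitter vs ~10 nm-class transistor feature.
# Both figures are round-number assumptions for illustration only.
microled_um = 10.0      # MicroLED subpixel width, micrometers
transistor_nm = 10.0    # ballpark logic feature size, nanometers

ratio = microled_um * 1_000 / transistor_nm   # convert um to nm, then compare
print(f"MicroLED emitter is ~{ratio:,.0f}x wider")   # ~1,000x
```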
 
Both TFTcentral and digitaltrends articles say the Dell Alienware 55" OLED gaming monitor has both DP 1.4 and HDMI 2.1.


The Samsung Q9 series ~1,800-nit LED FALD HDR VA LCD TVs (phew, that's a lot of acronyms) can do 1440p 120 Hz at about 10 ms input lag with VRR over HDMI 2.0b. The LG C8 OLED is still around 21 ms, I think, and can't do 1440p 120 Hz, so that could be the difference there.
The HDMI 2.1 spec has QFT (Quick Frame Transport) for low-input-lag gaming, which will hopefully help all TVs hit that kind of input lag or better at 4K 120 Hz 4:4:4 over HDMI 2.1.

The Dell Alienware, being marketed as a monitor and having G-Sync tech, could have even lower input lag, and it would have OLED response times as well, though response time doesn't matter as much at 120 Hz as it would at higher refresh rates. You also still need to feed high frame rates to a high refresh rate (120 Hz in this case) to combat sample-and-hold blur, even on OLED.

100 FpsHz on a 120 Hz monitor = 40% blur reduction vs. 60 Hz smearing, and a 5:3 increase in motion definition (10 unique frames of world action shown for every 6 at 60 FpsHz).

120 FpsHz on a 120 Hz monitor = 50% blur reduction (more of a soft-blur-filter look) vs. 60 Hz, and double the motion states shown (12 frames for every 6 at 60 FpsHz).
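For reference, a minimal sketch of where those percentages come from, assuming an ideal sample-and-hold display where perceived blur scales with per-frame persistence (1/fps):

```python
# Blur and motion-definition gains vs a 60 FpsHz baseline, assuming ideal
# sample-and-hold behavior where blur is proportional to frame persistence (1/fps).
def vs_60(fps):
    blur_reduction = 1 - 60 / fps    # shorter persistence per frame vs 60 FpsHz
    motion_definition = fps / 60     # unique frames shown per unit of time
    return blur_reduction, motion_definition

for fps in (100, 120):
    blur, motion = vs_60(fps)
    print(f"{fps} FpsHz: {blur:.0%} less blur, {motion:.2f}x motion definition vs 60")
# 100 FpsHz -> 40% less blur, 1.67x (5:3); 120 FpsHz -> 50% less blur, 2.00x (2:1)
```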

I'm pretty happy if I can dial in a 100 FpsHz average, landing in roughly a 70-100-130+ fps range in demanding games.

The higher-end TVs have other technologies like 480 Hz flicker, optional black frame insertion (BFI), and optional interpolation. It would be nice if modern monitors and TVs shared those capabilities across the board, so you could use them for consoles, emulators and indie games, low-Hz-limited games, etc.
 
Oh, and I just realized: if you run everything through a stereo receiver, that will have to have HDMI 2.1 as well.

More waiting on updates, I guess.
 
Well, you can use ARC, depending. You'd run direct to your TV for video, and the TV would output the audio back to the receiver, sort of as if the receiver were an external speaker.

But yes, if you want to use HDMI switching and all that, it can be more complicated, and newer receivers will come out that support it directly, as well as higher-bandwidth audio if that ever lands.
 
Nvidia chose to release Turing when they did because they had already spent the R&D money on ray-tracing cores and an architecture for them aimed at self-driving cars; then that deal fell through, and they had to make money off the R&D they had already spent.

Source? How would ray-tracing be useful for self-driving cars?
 
All of this is why I'm thrilled TVs are finally getting connectors and features that are more gaming-focused. When gaming-focused monitors with lower latency and higher variable refresh rates were walled off from TVs with better image quality, display makers got to feed PC users shit displays for top dollar. Now the latest display tech will have the gaming attributes baked in, and we're no longer locked into the leftovers.
 