Sweet Baby Jeebus Alienware 55" 4k120 OLED Displayport

The RTings tests are somewhat torture tests. Playing the same content for 20 hours a day, in 5-hour stretches. The average user surely wouldn't be using their OLED display 20 hours a day, and would be playing lots of different content on it, I'd imagine. Any OLED I purchase today, I would expect to EASILY be burn-in free for many years in my use case scenario.
 
Still not entirely realistic. What consumer is going to play FIFA for 800 hours straight with zero other content? People play other games, use TV and console menus, watch TV programs and movies, and even do some PC desktop use. And he stated it's not visible on normal content.

Yes, I will be the first to say that OLED shouldn't be used for airport displays. But normal consumer use there is NO issue.

The RTings tests are somewhat torture tests. Playing the same content for 20 hours a day, in 5-hour stretches. The average user surely wouldn't be using their OLED display 20 hours a day, and would be playing lots of different content on it, I'd imagine. Any OLED I purchase today, I would expect to EASILY be burn-in free for many years in my use case scenario.
Burn-in results from cumulative wear, i.e., it should not matter whether different content is mixed in or not, or over what period of time that 800 hours of wear occurs. It can be 800 hours of the higher-risk content interspersed with 2000 hours of random content, for ~4 hours a day over two years, and you should in theory get the same degree of burn-in as 800 hours straight, 24/7, of just the high-risk content (minus the overall aging of the 2000 hours of random content).

From Rtings:
Note that we expect burn-in to depend on a few factors:

  • The total duration of static content. LG has told us that they expect it to be cumulative, so static content which is present for 30 minutes twice a day is equivalent to one hour of static content once per day.
  • The brightness of the static content. Our maximum brightness CNN TV has more severe burn-in than our 200 nits brightness CNN TV.
  • The colors of the static areas. We found that in our 20/7 Burn-in Test the red sub-pixel is the fastest to degrade, followed by blue and then green.
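To make that cumulative-wear argument concrete, here's a toy sketch (my own illustrative numbers and a simple linear brightness-hours model, not LG's or RTINGS' actual aging data): the 800 hours of high-risk content produce the same differential wear between the static element and the rest of the panel whether they run back to back or are spread out among 2,000 hours of varied content; the varied hours only add uniform aging on top.

```python
# Toy model of cumulative OLED wear. Assumes wear is simply proportional to
# brightness-hours (a deliberate simplification; real aging curves differ).

def wear(sessions):
    """Total wear over a list of (relative_brightness, hours) sessions."""
    return sum(brightness * hours for brightness, hours in sessions)

HUD = 1.0     # assumed relative brightness of a static, saturated HUD element
VARIED = 0.4  # assumed average relative brightness of mixed/varied content

# Scenario A: 800 hours of the high-risk content back to back.
a_static_area = wear([(HUD, 800)])
a_background  = wear([(VARIED, 800)])

# Scenario B: the same 800 hours spread over two years, interleaved with
# 2,000 extra hours of varied content.
b_static_area = wear([(HUD, 800), (VARIED, 2000)])
b_background  = wear([(VARIED, 800), (VARIED, 2000)])

# Burn-in is the differential wear between the static area and its surroundings;
# under this model it comes out identical in both scenarios.
print(a_static_area - a_background)  # 480.0 (arbitrary units)
print(b_static_area - b_background)  # 480.0: same differential wear
# The extra 2,000 varied hours only add uniform aging across the whole panel.
```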
 
The RTings tests are somewhat torture tests. Playing the same content for 20 hours a day, in 5-hour stretches. The average user surely wouldn't be using their OLED display 20 hours a day, and would be playing lots of different content on it, I'd imagine. Any OLED I purchase today, I would expect to EASILY be burn-in free for many years in my use case scenario.

Maybe not while gaming, but I do put in multiple non-gaming sessions at least that long every week with fixed taskbar and semi-fixed titlebar elements on screen continuously.
 
Burn-in results from cumulative wear, i.e., it should not matter whether different content is mixed in or not, or over what period of time that 800 hours of wear occurs. It can be 800 hours of the higher-risk content interspersed with 2000 hours of random content, for ~4 hours a day over two years, and you should in theory get the same degree of burn-in as 800 hours straight, 24/7, of just the high-risk content (minus the overall aging of the 2000 hours of random content).

From Rtings:

Incorrect. Most cases of burn-in in televisions is a result of static images or on-screen elements displaying on the screen uninterrupted for many hours or days at a time – with brightness typically at peak levels.

https://www.lg.com/us/experience-tvs/oled-tv/reliability

RTINGS burn-in tests are NOT realistic for consumer use. That is why the displays in their test showing varied content have zero burn-in, and zero loss of brightness or color volume, at 6,000+ hours.

If RTINGS hadn't done their severe CNN abuse scenario, we wouldn't even be talking about "burn-in".
 
"OLED still can't do high HDR color, and part of the reason they can't and why that gap will grow even worse is for the exact same reason they fared well on those tests - that reason is avoiding the burn in risks. So it is sort of like reducing your color gamut out of fear that flying too high into it will wreck your display."

"... they are much less color shown past 400 nit or so even if their brightness says 800nit+ for brief periods. They also do brightness roll downs dimming the whole screen on higher overall *color* brightness scenes as a safety trigger so again, their colors are retarded in both of these dynamic usage scenarios to save the screen from burning in."

Very interesting video. Thanks. It would be nice if they'd back it up with a warranty period of some sort. Samsung's LED FALD LCDs have a 10-year burn-in guarantee. With an OLED, if it happens (with many claims of it in product reviews on major sites)... you are screwed.
6:15 "This also indicates that there may be a 'luck component' that in some cases may be more significant than the theoretical lifespan"




It is encouraging, especially for OLED enthusiasts, considering HDMI 2.1 sets are coming out.
Regardless,
OLED still can't do high HDR color, and part of the reason they can't, and why that gap will grow even worse, is the exact same reason they fared well on those tests - avoiding the burn-in risks. So it is sort of like reducing your color gamut out of fear that flying too high into it will wreck your display.
They do 400 nit color or so, and with the added white subpixel they can get brighter whites but not brighter HDR color volumes. So they show much less color past 400 nits or so, even if their brightness reads 800+ nits for brief periods. They also do brightness roll-downs, dimming the whole screen on scenes with higher overall *color* brightness as a safety trigger, so again, their colors are held back in both of these dynamic usage scenarios to save the screen from burning in.


If true HDR color past SDR-ish/quasi-HDR volumes doesn't matter to you, so you don't need the gamut extended into HDR color in the 1000 - 2000 nit ranges (and further toward the HDR 4000 and 10,000 that HDR is mastered for, in future TVs), and you are willing to take the burn-in risk, then the black depths and per-pixel side-by-side contrast on OLED are amazing.
If you are going to want real HDR color volumes of 1000 - 2000 nit color out of HDR 1000 and HDR 4000/10,000 content, and don't want to be held back by safety mechanisms and caps, or to play the burn-in luck factor, OLED might not be for you.

Of course the current state of FALD and small dynamic backlight arrays has its own tradeoffs, with algorithms favoring either dimming or blooming in parts of the screen since they aren't per-pixel, so they aren't a perfect solution either. But for me, I am more interested in achieving real HDR color volume going forward personally.
...

That said, the Dell Alienware display is pretty interesting. It doesn't look like I'll be able to use a Samsung Q9FN in 2019 with HDMI 2.1 off of an Nvidia card, with VRR on HDMI 2.0b, at least not for now. It might be 2020 by the time Nvidia gets HDMI 2.1 on their GPUs too. Just like when they had HDMI 1.4 on their 700 series GPUs, which could only do 4K at 30Hz --- then a year later they released the 900 series with HDMI 2.0, which could do 4K at 60Hz.
 
According to LG's roadmap (published in Nov 2017), they actually do have plans to introduce 40-49" OLEDs in 2020:

"The company also revealed that starting in late 2019 or early 2020 it will introduce a smaller size between 40 to 49 inches."

[Attachment 134022: LG OLED panel size roadmap]

They seem to be roughly on target, just having announced the 88" 8K 88Z9 at CES. This means the same panel substrate could theoretically be used to cut ~44" 4K panels, from larger sheets that fail QC.

Things are looking good, we just need patience guys. ;)
Good to hear. My wallet will be ready!
"OLED still can't do high HDR color, and part of the reason they can't and why that gap will grow even worse is for the exact same reason they fared well on those tests - that reason is avoiding the burn in risks. So it is sort of like reducing your color gamut out of fear that flying too high into it will wreck your display."

"... they are much less color shown past 400 nit or so even if their brightness says 800nit+ for brief periods. They also do brightness roll downs dimming the whole screen on higher overall *color* brightness scenes as a safety trigger so again, their colors are retarded in both of these dynamic usage scenarios to save the screen from burning in."


...

That said, the Dell Alienware display is pretty interesting. It doesn't look like I'll be able to use a Samsung Q9FN in 2019 with HDMI 2.1 off of an Nvidia card, with VRR on HDMI 2.0b, at least not for now. It might be 2020 by the time Nvidia gets HDMI 2.1 on their GPUs too. Just like when they had HDMI 1.4 on their 700 series GPUs, which could only do 4K at 30Hz --- then a year later they released the 900 series with HDMI 2.0, which could do 4K at 60Hz.
You also have LG doing ABL now, which as far as I know can't be disabled. So while a 2% window will hit about 900 nits it will quickly dim to around 600 nits. I still don't think color can be beat with an OLED, though, even with the compressed dynamic range. But I tell you what: the 2000 nit peak brightness on the Q9FN and P Series Quantum is a sight to behold with properly mastered HDR content.
 
Yes, there is no denying that high brightness has its advantages for image quality. However, these advantages only seem to pertain to HDR, something that I feel is still in its early days for now. Unless I will be viewing something in HDR for more than half the time I am using my display, I would rather pick the OLED for its superior SDR performance.
 
So does it really matter if the GTX 1000 or RTX series doesn't have HDMI 2.1? Couldn't they theoretically be overdriven with something like CRU to get higher refresh rates and/or VRR?
 
So does it really matter if the GTX 1000 or RTX series doesn't have HDMI 2.1? Couldn't they theoretically be overdriven with something like CRU to get higher refresh rates and/or VRR?

Doesn't matter with this Alienware as it comes with DP1.4 for 4k120Hz anyways. If you want to use an LG C9 then that's where the lack of hdmi 2.1 becomes an issue. I've heard of people overclocking the HDMI port before so....maybe?
 
I had not bothered with the latest and greatest hardware for almost 10 years until last January, and to my surprise I see the 2560x1440 resolution as being something "in demand" when I had a 2560x1600 30" display 10 years ago.
Ridiculous! And I don't care that they are 144Hz now - we should be @ 4K already. And after experiencing how crisp 4K looks, I hate to say it but 1440p looks low-res to me, 144Hz or not.
 
I had not bothered with the latest and greatest hardware for almost 10 years until last January, and to my surprise I see the 2560x1440 resolution as being something "in demand" when I had a 2560x1600 30" display 10 years ago.
Ridiculous! And I don't care that they are 144Hz now - we should be @ 4K already. And after experiencing how crisp 4K looks, I hate to say it but 1440p looks low-res to me, 144Hz or not.

Call AMD and Nvidia. With current GPU performance levels, you generally get to choose one or the other.

[*really just call Nvidia, AMD will be three or four years behind]
 
Incorrect. Most cases of burn-in in televisions is a result of static images or on-screen elements displaying on the screen uninterrupted for many hours or days at a time – with brightness typically at peak levels.

https://www.lg.com/us/experience-tvs/oled-tv/reliability

RTINGS burn-in tests are NOT realistic for consumer use. That is why the displays in their test showing varied content have zero burn-in, and zero loss of brightness or color volume, at 6,000+ hours.

If RTINGS hadn't done their severe CNN abuse scenario, we wouldn't even be talking about "burn-in".
That passage on LG's site doesn't refute the claim that it's cumulative hours that cause burn-in.

"Most cases of burn-in in televisions is a result of static images or on-screen elements displaying on the screen uninterrupted for many hours or days at a time"


Note "most cases." Not all cases. Read between the lines. This is probably-legally-defensible weasel wording intended to allay the fears of a consumer audience - NOT a definitive statement on the cause of burn-in.

This has been discussed extensively over on AVS Forum (e.g. this thread). LG themselves have apparently told Rtings it is cumulative time that counts. I'll quote it again:

"LG has told us that they expect it to be cumulative, so static content which is present for 30 minutes twice a day is equivalent to one hour of static content once per day."

If you understand how OLED works, it only really makes sense for it to be cumulative hours that counts. Burn-in is caused by differences in the rates of degradation between different subpixels; the brighter any single subpixel is run over time, the more it will degrade in that time. If a subpixel is run at a high brightness for 100 hours (displaying e.g. a highly saturated logo or HUD), it will have 100 hours of high brightness wear. Spreading those hours out over a longer period of time and running the subpixel at a lower brightness in addition to those 100 hours won't somehow undo the wear caused by those 100 hours. That's just not how the technology works.
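As a rough sketch of that subpixel argument (again a toy linear model with made-up drive levels, not real panel data): the red subpixel under a static, saturated-red logo racks up extra wear relative to the same subpixel elsewhere on the panel, and later varied content ages both areas equally, so the gap never closes.

```python
# Toy per-subpixel wear tracker (linear aging assumption, made-up drive levels).
# Burn-in shows up where a subpixel has aged more than the same subpixel elsewhere.

logo_red_wear = 0.0        # red subpixel under a static, saturated-red logo
background_red_wear = 0.0  # red subpixel in the rest of the screen

# 100 hours with the logo up: its red subpixel runs flat out, the background's doesn't.
logo_red_wear       += 1.0 * 100
background_red_wear += 0.3 * 100

# 2,000 hours of varied content afterwards: both areas age at the same average rate.
logo_red_wear       += 0.3 * 2000
background_red_wear += 0.3 * 2000

print(logo_red_wear - background_red_wear)  # 70.0: the gap from those 100 hours remains
# Later varied content adds equal wear to both areas; it never cancels the earlier gap.
```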
 
Yes, there is no denying that high brightness has its advantages for image quality. However, these advantages only seem to pertain to HDR, something that I feel is still in its early days for now. Unless I will be viewing something in HDR for more than half the time I am using my display, I would rather pick the OLED for its superior SDR performance.
It's only advantageous for LCD in very specific cases: when the entire frame is bright.

If you have a very bright area and a dark area next to it, OLED is far superior.


In my experience the second is much more common and the difference is much more impactful.
 
According to LG's roadmap (published in Nov 2017), they actually do have plans to introduce 40-49" OLEDs in 2020:

"The company also revealed that starting in late 2019 or early 2020 it will introduce a smaller size between 40 to 49 inches."

[Attachment 134022: LG OLED panel size roadmap]

They seem to be roughly on target, just having announced the 88" 8K 88Z9 at CES. This means the same panel substrate could theoretically be used to cut ~44" 4K panels, from larger sheets that fail QC.

Things are looking good, we just need patience guys. ;)



This can't happen soon enough! Then I can relegate my 55" OLED to TV duties. It's a little larger than my ideal size, but I just can't go back to LCD for PC gaming and movies now. These sizes NEED to happen.
 
This is definitely it. Dell doesn't make display panels, so like almost everyone else in the business of selling monitors/TVs they're limited to what the panel makers produce. On the monitor side only Samsung and LG are vertically integrated to both make raw panels and sell finished monitors. AUO and Innolux are also monitor panel makers, but AFAIK don't sell anything directly to consumers. I don't follow the TV market closely, so I'm not sure if there are any additional panel makers of note there or not.


TFT Central's most recent high refresh roadmap doesn't include any OLED gaming panels, and I don't recall having seen any talk of 30/40" class OLED displays coming from the TV world so we probably won't see one anytime soon.

http://www.tftcentral.co.uk/articles/high_refresh_rate.htm
AUO does sell direct too, but it's not so common.
 
I just cannot imagine using an OLED for day-to-day computer use, at least not for a few more years. The cost of these beasts right now is crazy, and knowing that the moment it's turned on the pixels are literally dying, ouch, that's a considerable expense. And considering there's a lot of content on most computer displays for most people that's static in nature (the toolbars, desktop icons if people have 'em, Rainmeter/system monitor type utilities, etc.), the chance for actual burn-in of those organic LEDs, wow.

I mean, yes, OLED tech is absolutely gorgeous to look at, with great colors (sometimes a bit over-saturated, but that can be accounted for in settings and profiles) and pure blacks like they should be, but even so, that pricing is certainly for the [H]ardest of the [H]ardcore people.
 
I just cannot imagine using an OLED for day-to-day computer use, at least not for a few more years. The cost of these beasts right now is crazy, and knowing that the moment it's turned on the pixels are literally dying, ouch, that's a considerable expense. And considering there's a lot of content on most computer displays for most people that's static in nature (the toolbars, desktop icons if people have 'em, Rainmeter/system monitor type utilities, etc.), the chance for actual burn-in of those organic LEDs, wow.

I mean, yes, OLED tech is absolutely gorgeous to look at, with great colors (sometimes a bit over-saturated, but that can be accounted for in settings and profiles) and pure blacks like they should be, but even so, that pricing is certainly for the [H]ardest of the [H]ardcore people.

I actually agree with you. I don't even expect any of my OLED sets to last something like 10 years, but I think most people here also view OLED as disposable and just need theirs to last a few years until something better like MicroLED becomes available.
 
I agree too. It's just so damn good it's worth it to me. But I think for most people the cost is way too much.
 
I actually agree with you. I don't even expect any of my OLED sets to last something like 10 years

You know what, though? I remember people expressing these same lifespan concerns over plasma TVs, MLC and TLC SSDs, etc. Yet, most of those devices lasted WAY longer than the fearmongers predicted and it proved to be a non-issue for most people who didn't abuse them. There are plenty of those devices still in use today.

I expect the same for OLED...heck, it's playing out as we speak, but every time there's an OLED thread on here the same concerns get brought up just as they did for those other new technologies. And that's OK. I've been using my OLED for over a year as a monitor with absolutely zero signs of burn-in or degradation, but I'd much rather have this level of visual splendor for a few years less than yet another LCD/LED with evolutionary rather than revolutionary improvements in image quality because I tend to upgrade before wearing anything out anyway. Guarantee I'll move on to a 4K/120Hz OLED or a MicroLED well before this set shows any signs of wear, but I realize everyone doesn't have the budget or desire to do that.
 
That passage on LG's site doesn't refute the claim that it's cumulative hours that cause burn-in.

"Most cases of burn-in in televisions is a result of static images or on-screen elements displaying on the screen uninterrupted for many hours or days at a time"


Note "most cases." Not all cases. Read between the lines. This is probably-legally-defensible weasel wording intended to allay the fears of a consumer audience - NOT a definitive statement on the cause of burn-in.

This has been discussed extensively over on AVS Forum (e.g. this thread). LG themselves have apparently told Rtings it is cumulative time that counts. I'll quote it again:

"LG has told us that they expect it to be cumulative, so static content which is present for 30 minutes twice a day is equivalent to one hour of static content once per day."

If you understand how OLED works, it only really makes sense for it to be cumulative hours that counts. Burn-in is caused by differences in the rates of degradation between different subpixels; the brighter any single subpixel is run over time, the more it will degrade in that time. If a subpixel is run at a high brightness for 100 hours (displaying e.g. a highly saturated logo or HUD), it will have 100 hours of high brightness wear. Spreading those hours out over a longer period of time and running the subpixel at a lower brightness in addition to those 100 hours won't somehow undo the wear caused by those 100 hours. That's just not how the technology works.

Nope. The 6,000-hour FPS game test with the HUD up, showing zero burn-in, is completely contrary. There was ZERO degradation in the OLED displays' brightness or color volume over 6,000+ hours in those tests. Burn-in occurs due to the pixels being driven extremely hard in a single state, uninterrupted, for extremely long periods of time. If you give the pixels a rest, they recover. That is the whole point behind LG's built-in "pixel refresher".

If that CNN test were quartered, with 3/4 of the time (15 of the 20 hours per day) being varied content over 18,000 hours and CNN for the other 1/4 (5 of the 20 hours per day, broken up into one-hour blocks), for the same 6,000 cumulative hours, the TV wouldn't have NEARLY as much burn-in. Probably none at all. The lower-brightness CNN test under the same scheme I just described wouldn't have any burn-in at all, guaranteed. And on top of it all, these are the older OLEDs with smaller pixels that need to be driven harder. OLED panels have been getting larger sub-pixels and less dead space between pixels in 2018 and now again in 2019, lessening the issue even more.

RTINGS won't do a test like I describe above, because they know they'd be wasting their time.
 
The Alienware unit is most likely an LG 2019 series OLED Television with an additional DisplayPort 1.4

The 2019 LG OLED TV series will have:
- full HDMI 2.1 (incl. full bandwidth, VRR, ALLM, the works)
- 120Hz OLED refresh rate and input (via HDMI 2.1)
- 13 ms lag

The reason why this is not in smaller sizes is that LG doesn't currently produce such OLED panels.

Alienware could have at the very least made it a real G-Sync Ultimate monitor.

Now we don't even know if it's G-Sync Compatible, because G-Sync Compatibility is NOT currently supported via the HDMI 2.1 interface AND we do not know if the Alienware 55 monitor supports VESA Adaptive-Sync via the DisplayPort input (if it doesn't, bye-bye to G-Sync compatibility).

Besides all that, it will probably be way overpriced, and as such it'll make more sense to buy the corresponding LG 2019 OLED TV with the 120Hz panel, full HDMI 2.1, Variable Refresh Rate, and the whole shebang for a cheaper price.
 
Nope. The 6,000-hour FPS game test with the HUD up, showing zero burn-in, is completely contrary. There was ZERO degradation in the OLED displays' brightness or color volume over 6,000+ hours in those tests.
Judging by some Call of Duty: WWII multiplayer video, it's a low-degradation rate game: the HUD is mostly semi-transparent and not very saturated. It's not that surprising it didn't show any signs of burn-in.

Burn-in occurs due to the pixels being driven extremely hard in a single state, uninterrupted, for extremely long periods of time.
Can you substantiate the claim that it is specifically uninterrupted display that causes burn-in? This is contrary to LG's own statement regarding aging being dependent on cumulative display time.

If you give the pixels a rest, they recover. That is the whole point behind LG's built-in "pixel refresher".
Can you explain the mechanism behind the pixel refresher function? "Giving the pixels a rest" doesn't make much sense as a means of reversing OLED subpixel aging. I believe this post is a more likely explanation of what it does, in which case it could only reduce temporary image retention, while having no effect on already degraded subpixels.

If that CNN test were quartered, with 3/4 of the time (15 of the 20 hours per day) being varied content over 18,000 hours and CNN for the other 1/4 (5 of the 20 hours per day, broken up into one-hour blocks), for the same 6,000 cumulative hours, the TV wouldn't have NEARLY as much burn-in. Probably none at all. The lower-brightness CNN test under the same scheme I just described wouldn't have any burn-in at all, guaranteed.
Can you substantiate this claim? Otherwise it's just more assertions without evidence.

Look, I'm not anti-OLED. It's clear that it provides the best image quality money can buy in a consumer display, and I'll probably pick one up myself when I'm comfortable with the size/features/price. But let's be honest and realistic about its shortcomings - which it does have. Burn-in is a real issue that will affect some people based on their usage patterns. And not just marathoning CNN, either: it's entirely possible that over a couple years of regular (but not excessive) displaying of content with bright, saturated static elements, you will get some permanent burn-in. Games like League of Legends or Overwatch have health bars that are always up and are moderately bright, saturated green/blue, making them more likely to cause burn-in over months/years. Most of my gaming is limited to games I'll play for a few dozen hours and be done with, so I'm personally not that worried about burn-in, but there are definitely lots of people that do play one game for thousands of hours.
 
It's worth pointing out that Rtings conducted these tests using the C7, which features a different pixel structure from the C8 and presumably the C9. LG increased the size of the red subpixels last year, the end result being that they don't have to be driven as hard to produce the same amount of light.

I'd like it if Rtings did this test again using the C9, with more realistic daily usage and displayed content.
 
I just cannot imagine using an OLED for day-to-day computer use, at least not for a few more years. The cost of these beasts right now is crazy, and knowing that the moment it's turned on the pixels are literally dying, ouch, that's a considerable expense. And considering there's a lot of content on most computer displays for most people that's static in nature (the toolbars, desktop icons if people have 'em, Rainmeter/system monitor type utilities, etc.), the chance for actual burn-in of those organic LEDs, wow.

I mean, yes, OLED tech is absolutely gorgeous to look at, with great colors (sometimes a bit over-saturated, but that can be accounted for in settings and profiles) and pure blacks like they should be, but even so, that pricing is certainly for the [H]ardest of the [H]ardcore people.

I've used an LG 2016 C series 55" OLED as my primary PC monitor in my home office for all sorts of use, from business work to gaming (Xbox, PC, PS4, etc.), and have zero burn-in. Lots of static content, but realistically even in normal PC use you are constantly shifting pixels no matter how long you are actively using it.

I'm over the 2-year mark in use and still going strong... but this Alienware as well as the new 2019 LGs are tempting to upgrade to.
 
Great to see what is coming; good thing I just bought a 120Hz G-Sync monitor :(. Someone will get a deal on it, as my wife is very happy with the 8-year-old monitor she has on her PC. I will definitely be buying one of these if it supports some sort of VRR tech I can use with my 2080 Ti.
 
I can believe that they are avoiding most of the burn-in risk by using shackled color brightness limitations, full-screen brightness rolldowns, and their 400 - 500 nit color ceiling before white subpixel brightness takes over, followed by ABL screen dimming. You could still be unlucky and get some burn-in though, with no manufacturer coverage period whatsoever.

For me a huge issue is that OLED will never be capable of HDR color volumes. At best it is "SDR+" with strong color brightness limitations and "cooldown" reflexes in order to avoid real burn-in damage and shorter lifespans.

That color brightness ceiling (not just "brightness" crushing to white like SDR) will grow. There are already near-2000 nit HDR color TVs. Movies are mastered in HDR 10,000 color volume. UHD HDR discs are a mix of HDR 4000 and HDR 1000 color titles, with at least one HDR 10,000 color title (Blade Runner 2049). The handful of HDR PC games varies too, with several that support HDR 10,000 and 4000 already.

--------------------------------

A big limitation of OLED TVs so far has been the lack of 120Hz + VRR though. Personally I have zero interest in PC gaming on 60Hz displays and without variable refresh rates. If they can provide both, even on SDR-ranged color, with strong burn-in safety features, it will definitely be interesting.

I wouldn't buy one for HDR color though... even the HDR 2000 color TVs are fractional HDR, around 1/2 of HDR 4000 content's color and 1/5 of full HDR 10,000 content's color volume. 400 nit color is 1/10th of HDR 4000 color and 1/25th of HDR 10,000 color volume, which is hardly "HDR" color. Black depths, on the other hand, and side-by-side pixel contrast are enormous gains, but pick your poison.
Unfortunately we might not have as many choices as I'd like since it's in nvidia's hands whether they will support VRR over HDMI. Their current GPU gen also lacks HDMI 2.1 obviously for full 4k 120hz 4:4:4 chroma bandwidth.
 
For those of you who use 55+ screens for gaming/office use - how far away do you sit from the screen?

I am looking to replace my 3415 Dell with something with a higher refresh rate and slightly larger real estate, but I feel like 55 is just too large. Something in the 32-40 range with 4k/120Hz/OLED sounds just about perfect.

Also I am guessing I don't have enough body parts to sell to get one of these things regardless.
 
For those of you who use 55+ screens for gaming/office use - how far away do you sit from the screen?

I am looking to replace my 3415 Dell with something with a higher refresh rate and slightly larger real estate, but I feel like 55 is just too large. Something in the 32-40 range with 4k/120Hz/OLED sounds just about perfect.

Also I am guessing I don't have enough body parts to sell to get one of these things regardless.
I sit about 4 feet away from the 49" TV I use for console gaming. It is curved, though, but only very slightly, like 3800R.
 
I sit about 8' away from my 70" Vizio FALD VA (non-HDR gen) with my laptop ported to it, with no problem for media and games... so a 55" would probably be fine at around 4' I'm guessing, in line with what Armenius said.

I do have to look up to the corners for text at that distance on my 70", though, so it lends itself more to a multiple monitor/window usage scenario when occasionally used for desktop/apps, web browsing, etc., rather than a full focal view of the whole screen, so it depends. If you have the room to face your desk away from the wall, or face the desk out from the corner with the monitor mid-room, or set the desk mid-room facing the monitor on a wall, those usually allow for a better kind of setup for larger screens rather than trying to maintain the common sitting-at-a-bookshelf type arrangement.
 
For those of you who use 55+ screens for gaming/office use - how far away do you sit from the screen?
I have a 55" Samsung on a standard computer desk. Sitting about 2 feet away if I'm typing (I sit more like 3 or 4 feet away when gaming or watching movies).

It's a tad too big for this setup. I was on 40" before, and that was a nicer size for daily use. I still like it for media usage, though I have another computer with a normal monitor I use for work.
 
Meh. Can't go back to a flat display.

It makes sense at typical TV distance, but that would be rather underwhelming.
 
Meh. Can't go back to a flat display.

It makes sense at typical TV distance, but that would be rather underwhelming.

LG is releasing a rollable TV. The screens can probably be bent to whatever curve you want if you're willing to mod it. I wouldn't be surprised if there are some [H]ard members crazy enough to curve-mod their OLEDs once they release.
 
It's worth pointing out that Rtings conducted these tests using the C7, which features a different pixel structure from the C8 and presumably the C9. LG increased the size of the red subpixels last year, the end result being that they don't have to be driven as hard to produce the same amount of light.

I'd like it if Rtings did this test again using the C9, with more realistic daily usage and displayed content.
Indeed, last year saw a relatively small increase in subpixel size, but this year's models look like they might have a much more substantial improvement in that area. Someone posted this on AVS Forum, though note the 2019 numbers are only a preliminary estimate right now:

[Image: AVS Forum chart of LG OLED subpixel sizes by model year]



Next year should see substantial gains yet again, assuming LG gets top emission production up and running.


I've used an LG 2016 C series 55" OLED as my primary PC monitor in my home office for all sorts of use, from business work to gaming (Xbox, PC, PS4, etc.), and have zero burn-in. Lots of static content, but realistically even in normal PC use you are constantly shifting pixels no matter how long you are actively using it.

I'm over the 2-year mark in use and still going strong... but this Alienware as well as the new 2019 LGs are tempting to upgrade to.
I actually wouldn't be that concerned about desktop use as long as you take a few precautions. The static parts of a desktop are usually white, shades of grey, or muted colors. Compared to saturated colors, white/grey is much easier on an LG WOLED, since you can use all four subpixels (R,G,B,W) to display it. On top of that, the white subpixel doesn't lose light to a color filter.

By comparison, a highly saturated color like red or green needs to be generated entirely by one subpixel (white can be used to increase brightness, but it loses saturation), AND you lose much of the light to the color filter. Both of these factors means it needs to work extra hard to generate a bright, saturated color. (Side note: this is where QD-OLED could improve efficiency dramatically, since it converts much more of the OLED light to the right color than a color filter can.)

So the main thing you'd want to avoid with desktop use are any static icons that use highly saturated colors. Hide/move/replace them, set a black or desaturated image as the desktop background, and you should be good for quite a while.
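A rough sketch of that RGBW point, under assumptions of my own (the min-RGB white extraction and the 25% filter efficiency are illustrative, not LG's actual subpixel rendering): neutral greys ride mostly on the unfiltered white subpixel, while a saturated red has to come almost entirely from the filtered red subpixel, which has to be driven much harder for the same light output.

```python
# Simplified WRGB load estimate (assumed model, not LG's real rendering pipeline).
# The common (white) component of a color goes to the unfiltered W subpixel; the
# remaining saturated part must come from the filtered R/G/B subpixels, which
# lose most of their OLED light in the color filter.

FILTER_EFFICIENCY = 0.25  # assumed fraction of light that survives the color filter

def subpixel_load(r, g, b):
    """Rough relative OLED drive needed per subpixel for an RGB color (0..1)."""
    white = min(r, g, b)  # common component -> unfiltered white subpixel
    return {
        "W": white,
        "R": (r - white) / FILTER_EFFICIENCY,
        "G": (g - white) / FILTER_EFFICIENCY,
        "B": (b - white) / FILTER_EFFICIENCY,
    }

print(subpixel_load(0.8, 0.8, 0.8))  # light grey: all load on W, filtered subpixels idle
print(subpixel_load(0.8, 0.1, 0.1))  # saturated red: R subpixel driven ~2.8x in relative units
```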
 
Nope. The 6,000-hour FPS game test with the HUD up, showing zero burn-in, is completely contrary. There was ZERO degradation in the OLED displays' brightness or color volume over 6,000+ hours in those tests. Burn-in occurs due to the pixels being driven extremely hard in a single state, uninterrupted, for extremely long periods of time. If you give the pixels a rest, they recover. That is the whole point behind LG's built-in "pixel refresher".

If that CNN test were quartered, with 3/4 of the time (15 of the 20 hours per day) being varied content over 18,000 hours and CNN for the other 1/4 (5 of the 20 hours per day, broken up into one-hour blocks), for the same 6,000 cumulative hours, the TV wouldn't have NEARLY as much burn-in. Probably none at all. The lower-brightness CNN test under the same scheme I just described wouldn't have any burn-in at all, guaranteed. And on top of it all, these are the older OLEDs with smaller pixels that need to be driven harder. OLED panels have been getting larger sub-pixels and less dead space between pixels in 2018 and now again in 2019, lessening the issue even more.

RTINGS won't do a test like I describe above, because they know they'd be wasting their time.

All things considered, burn-in is a legitimate concern on something as expensive as a TV, because:
A) it is not exactly known what causes more degradation, and
B) a TV is expected to last thousands of hours.
 
I guess it depends how long you think you will keep it. I usually upgrade monitors every few years, in which case burn-in may not be too big an issue (though it could affect resale value).
 
If I had one I'd probably keep it in a multi-monitor setup and only use the big screen for games and vids. Shut it off when not doing those activities, and perhaps also have a screen timeout/black screen visualization you could activate manually for in-between times when you don't want the monitor "off" but are jumping out of the game/vid for a short time to focus on something else. You don't need OLED burn time for desktop and apps really in a multi-monitor setup, IMO. It's more of a game theater and fullscreen video usage scenario.

A display like an OLED should have its own internal sleep timer option, separate from the OS keeping the monitor awake, so that if something goes stupid with Windows or an app (a crash, a random reboot, a video playing, the mouse moving, a weight on a keyboard, a controller stick upside down on its own button, etc.), it won't stay on all night or the whole time you are away from home or out of the room, AFK. Human error and random occurrences shouldn't burn up your OLED's lifespan.
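Until panels get that kind of independent timer, one crude PC-side stopgap (a sketch of my own, Windows-only, using standard Win32 idle/power calls; it won't help if a stuck key or a weight on the mouse keeps registering as input, since that resets the idle counter) is a little watchdog that forces the display signal off after a set idle period, regardless of whether some app is blocking the normal screensaver/power timeout:

```python
# Windows-only sketch: force the display off after N minutes with no input,
# independent of the OS power plan (which applications can override).
import ctypes
import time

HWND_BROADCAST = 0xFFFF
WM_SYSCOMMAND = 0x0112
SC_MONITORPOWER = 0xF170
MONITOR_OFF = 2

IDLE_LIMIT_SECONDS = 30 * 60  # chosen limit: blank the panel after 30 idle minutes

class LASTINPUTINFO(ctypes.Structure):
    _fields_ = [("cbSize", ctypes.c_uint), ("dwTime", ctypes.c_uint)]

def idle_seconds() -> float:
    """Seconds since the last keyboard/mouse input."""
    info = LASTINPUTINFO()
    info.cbSize = ctypes.sizeof(LASTINPUTINFO)
    ctypes.windll.user32.GetLastInputInfo(ctypes.byref(info))
    return (ctypes.windll.kernel32.GetTickCount() - info.dwTime) / 1000.0

while True:
    if idle_seconds() >= IDLE_LIMIT_SECONDS:
        # Broadcast "monitor off"; any real input wakes the display again.
        ctypes.windll.user32.SendMessageW(
            HWND_BROADCAST, WM_SYSCOMMAND, SC_MONITORPOWER, MONITOR_OFF)
    time.sleep(60)
```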
 
A display like an OLED should have its own internal sleep timer option, separate from the OS keeping the monitor awake, so that if something goes stupid with Windows or an app (a crash, a random reboot, a video playing, the mouse moving, a weight on a keyboard, a controller stick upside down on its own button, etc.), it won't stay on all night or the whole time you are away from home or out of the room, AFK. Human error and random occurrences shouldn't burn up your OLED's lifespan.

Yeah, like recent Battlefield games, after you've played them, the screen saver won't work again until reboot. Situations like that could definitely be harmful to an OLED.
 
I can believe that they are avoiding most of the burn-in risk by using shackled color brightness limitations, full-screen brightness rolldowns, and their 400 - 500 nit color ceiling before white subpixel brightness takes over, followed by ABL screen dimming. You could still be unlucky and get some burn-in though, with no manufacturer coverage period whatsoever.

For me a huge issue is that OLED will never be capable of HDR color volumes. At best it is "SDR+" with strong color brightness limitations and "cooldown" reflexes in order to avoid real burn-in damage and shorter lifespans.

That color brightness ceiling (not just "brightness" crushing to white like SDR) will grow. There are already near-2000 nit HDR color TVs. Movies are mastered in HDR 10,000 color volume. UHD HDR discs are a mix of HDR 4000 and HDR 1000 color titles, with at least one HDR 10,000 color title (Blade Runner 2049). The handful of HDR PC games varies too, with several that support HDR 10,000 and 4000 already.

--------------------------------

A big limitation of OLED TVs so far has been the lack of 120Hz + VRR though. Personally I have zero interest in PC gaming on 60Hz displays and without variable refresh rates. If they can provide both, even on SDR-ranged color, with strong burn-in safety features, it will definitely be interesting.

I wouldn't buy one for HDR color though... even the HDR 2000 color TVs are fractional HDR, around 1/2 of HDR 4000 content's color and 1/5 of full HDR 10,000 content's color volume. 400 nit color is 1/10th of HDR 4000 color and 1/25th of HDR 10,000 color volume, which is hardly "HDR" color. Black depths, on the other hand, and side-by-side pixel contrast are enormous gains, but pick your poison.
Unfortunately we might not have as many choices as I'd like since it's in nvidia's hands whether they will support VRR over HDMI. Their current GPU gen also lacks HDMI 2.1 obviously for full 4k 120hz 4:4:4 chroma bandwidth.


The thing is, most PC displays top out at 400 nits anyway. I'm using a Sony 43" 4K TV that tops out around 400 nits, and that is basically white light. It would be a massive improvement if the COLORS topped out at 500 nits, on top of deep blacks, instead of this VA panel with zero FALD at the smaller size.

I see any of these OLEDs as a massive upgrade and a landing zone for wonderful-looking content and games for years to come, while the rest of the industry pushes ahead to higher peak brightness levels and color volume. It's going to be a LONG time before we get displays that can hit 10,000 nits of brightness, or even 4,000. And how long before we get that AND emissive displays?

When it comes to displays, there are these sorts of... base camps; going beyond one is a lot of incremental improvement until you get to another local base camp.



Cheap 1440p Korean displays were a base camp. 43" 4K was, to me, another base camp. 4K 120Hz OLED is another base camp, and that is already enough VRR for me. Higher color volumes would be another base camp, but not just a little bit higher with a little bit more content; I'd want a MUCH bigger leap to jump again, and I can wait for those improvements to build up. But once these OLED VRR 120Hz 4K displays hit, that is a high enough local peak to jump toward and live with for a while.
 
I actually wouldn't be that concerned about desktop use as long as you take a few precautions. ... So the main thing you'd want to avoid with desktop use are any static icons that use highly saturated colors. Hide/move/replace them, set a black or desaturated image as the desktop background, and you should be good for quite a while.

This basically reads like:
"I actually wouldn't be that concerned about desktop use as long as you're concerned about it."

No. It's not acceptable. Thoughts and considerations like these shouldn't enter the mind of people dropping $2,000+ on a monitor.
 