Sweet Baby Jeebus Alienware 55" 4k120 OLED Displayport

Discussion in 'Displays' started by l88bastard, Jan 8, 2019.

  1. MistaSparkul

    MistaSparkul Gawd

    Messages:
    904
    Joined:
    Jul 5, 2012
The RTings tests are somewhat torture tests: the same content running 20 hours a day, in 5-hour blocks. The average user surely isn't using their OLED display 20 hours a day, and would be playing lots of different content on it, I'd imagine. Any OLED I purchase today I would expect to EASILY be burn-in free for many years in my use case.
     
  2. Necere

    Necere 2[H]4U

    Messages:
    2,654
    Joined:
    Jan 3, 2003
    Burn-in results from cumulative wear, i.e., it should not matter whether different content is mixed in or not, or over what period of time that 800 hours of wear occurs. It can be 800 hours of the higher-risk content interspersed with 2000 hours of random content, for ~4 hours a day over two years, and you should in theory get the same degree of burn-in as 800 hours straight, 24/7, of just the high-risk content (minus the overall aging of the 2000 hours of random content).
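As a toy model of that argument (a sketch of my own, assuming wear simply accumulates linearly with drive level and hours - real degradation curves are more complicated):

```python
# Toy model: a subpixel's wear accumulates as hours * relative drive level.
# The linear rate is an illustrative assumption, not a measured figure.

def wear(sessions):
    """Total accumulated wear over a list of (hours, drive_level) sessions."""
    return sum(hours * drive for hours, drive in sessions)

# Scenario A: 800 hours of high-risk static content (drive 1.0), back to back.
a = wear([(800, 1.0)])

# Scenario B: the same 800 hours interspersed with 2000 hours of mixed
# content at a lower average drive (0.25), spread over two years.
b_static = wear([(800, 1.0)])
b_mixed = wear([(2000, 0.25)])

# The static element's differential wear (what shows up as burn-in) is the
# same either way; scenario B just adds uniform aging on top of it.
print(a, b_static, b_mixed)  # 800.0 800.0 500.0
```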

    From Rtings:
     
    Last edited: Jan 10, 2019
  3. DanNeely

    DanNeely 2[H]4U

    Messages:
    3,437
    Joined:
    Aug 26, 2005
    Maybe not while gaming, but I do put in multiple non-gaming sessions at least that long every week with fixed taskbar and semi-fixed titlebar elements on screen continuously.
     
  4. Vega

    Vega [H]ardness Supreme

    Messages:
    5,878
    Joined:
    Oct 12, 2004
    Incorrect. Most cases of burn-in in televisions is a result of static images or on-screen elements displaying on the screen uninterrupted for many hours or days at a time – with brightness typically at peak levels.

    https://www.lg.com/us/experience-tvs/oled-tv/reliability

RTINGS' burn-in tests are NOT realistic for consumer use. That is why their varied-content displays show zero burn-in at 6,000+ hours, and zero loss of brightness or color volume.

If RTINGS hadn't done their severe CNN abuse scenario, we wouldn't even be talking about "burn-in".
     
  5. elvn

    elvn 2[H]4U

    Messages:
    3,004
    Joined:
    May 5, 2006
    "OLED still can't do high HDR color, and part of the reason they can't and why that gap will grow even worse is for the exact same reason they fared well on those tests - that reason is avoiding the burn in risks. So it is sort of like reducing your color gamut out of fear that flying too high into it will wreck your display."

"... there is much less color shown past 400 nits or so even if their brightness says 800nit+ for brief periods. They also do brightness roll-downs, dimming the whole screen on higher overall *color* brightness scenes as a safety trigger, so again, their colors are retarded in both of these dynamic usage scenarios to save the screen from burning in."

    ...

That said, the Dell Alienware display is pretty interesting. It doesn't look like I'll be able to use a Samsung Q9FN in 2019 - no HDMI 2.1, and no VRR on HDMI 2.0b off of an NVIDIA card - at least not for now. It might be 2020 by the time NVIDIA gets HDMI 2.1 on their GPUs too. Just like when they had HDMI 1.4 on their 700-series GPUs, which could only do 30Hz 4K - then a year later they released the 900 series with HDMI 2.0, which could do 60Hz 4K.
     
    Last edited: Jan 10, 2019
  6. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    15,469
    Joined:
    Jan 28, 2014
    Good to hear. My wallet will be ready!
    You also have LG doing ABL now, which as far as I know can't be disabled. So while a 2% window will hit about 900 nits it will quickly dim to around 600 nits. I still don't think color can be beat with an OLED, though, even with the compressed dynamic range. But I tell you what: the 2000 nit peak brightness on the Q9FN and P Series Quantum is a sight to behold with properly mastered HDR content.
     
    elvn likes this.
  7. MistaSparkul

    MistaSparkul Gawd

    Messages:
    904
    Joined:
    Jul 5, 2012
Yes, there is no denying that high brightness has its advantages for image quality. However, these advantages only seem to pertain to HDR, something that I feel is still in its early days for now. Unless I will be viewing HDR content for more than half the time I am using my display, I would rather pick the OLED for its superior SDR performance.
     
  8. Enhanced Interrogator

    Enhanced Interrogator Gawd

    Messages:
    842
    Joined:
    Mar 23, 2013
    So does it really matter if the GTX 1000 or RTX series doesn't have HDMI 2.1? Couldn't they theoretically be overdriven with something like CRU to get higher refresh rates and/or VRR?
     
  9. MistaSparkul

    MistaSparkul Gawd

    Messages:
    904
    Joined:
    Jul 5, 2012
    Doesn't matter with this Alienware as it comes with DP1.4 for 4k120Hz anyways. If you want to use an LG C9 then that's where the lack of hdmi 2.1 becomes an issue. I've heard of people overclocking the HDMI port before so....maybe?
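A quick back-of-envelope check of the link budgets explains both halves of that answer (raw active-pixel rates only; real signaling also carries blanking overhead, so actual margins are thinner):

```python
# Raw pixel-data rate for a given mode, in Gbit/s (no blanking overhead).
def pixel_rate_gbps(w, h, hz, bits_per_channel, channels=3):
    return w * h * hz * bits_per_channel * channels / 1e9

uhd120_8bit  = pixel_rate_gbps(3840, 2160, 120, 8)   # ~23.9 Gbps
uhd120_10bit = pixel_rate_gbps(3840, 2160, 120, 10)  # ~29.9 Gbps

HDMI20_EFFECTIVE = 14.4   # 18 Gbps link rate minus 8b/10b coding overhead
DP14_EFFECTIVE   = 25.92  # 32.4 Gbps (HBR3) minus 8b/10b coding overhead

print(uhd120_8bit <= DP14_EFFECTIVE)    # True: DP 1.4 fits 4K120 8-bit 4:4:4
print(uhd120_8bit <= HDMI20_EFFECTIVE)  # False: far beyond any HDMI 2.0b overclock
print(uhd120_10bit <= DP14_EFFECTIVE)   # False: 10-bit needs DSC or chroma subsampling
```

So DP 1.4 carries 4K120 at 8-bit 4:4:4 outright, while 4K120 needs well over half again the entire HDMI 2.0b budget - more than any plausible CRU-style overclock could recover.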
     
    Armenius likes this.
  10. JargonGR

    JargonGR Limp Gawd

    Messages:
    461
    Joined:
    Dec 16, 2006
I had not bothered with the latest and greatest hardware for almost 10 years until last January, and to my surprise I see the 2560x1440 resolution as being something "in demand" when I had a 2560x1600 30" display 10 years ago.
Ridiculous! And I don't care that they are 144Hz now - we should be at 4K already. And after experiencing how crisp 4K looks, I hate to say it, but 1440p looks low-res to me, 144Hz or not.
     
    DanNeely likes this.
  11. IdiotInCharge

    IdiotInCharge Not the Idiot YOU are Looking for

    Messages:
    7,944
    Joined:
    Jun 13, 2003
    Call AMD and Nvidia. With current GPU performance levels, you generally get to choose one or the other.

    [*really just call Nvidia, AMD will be three or four years behind]
     
  12. Necere

    Necere 2[H]4U

    Messages:
    2,654
    Joined:
    Jan 3, 2003
    That passage on LG's site doesn't refute the claim that it's cumulative hours that causes burn-in.

    "Most cases of burn-in in televisions is a result of static images or on-screen elements displaying on the screen uninterrupted for many hours or days at a time"


    Note "most cases." Not all cases. Read between the lines. This is probably-legally-defensible weasel wording intended to allay the fears of a consumer audience - NOT a definitive statement on the cause of burn-in.

    This has been discussed extensively over on AVS Forum (e.g. this thread). LG themselves have apparently told Rtings it is cumulative time that counts. I'll quote it again:

    "LG has told us that they expect it to be cumulative, so static content which is present for 30 minutes twice a day is equivalent to one hour of static content once per day."

    If you understand how OLED works, it only really makes sense for it to be cumulative hours that counts. Burn-in is caused by differences in the rates of degradation between different subpixels; the brighter any single subpixel is run over time, the more it will degrade in that time. If a subpixel is run at a high brightness for 100 hours (displaying e.g. a highly saturated logo or HUD), it will have 100 hours of high brightness wear. Spreading those hours out over a longer period of time and running the subpixel at a lower brightness in addition to those 100 hours won't somehow undo the wear caused by those 100 hours. That's just not how the technology works.
     
  13. sharknice

    sharknice [H]ard|Gawd

    Messages:
    1,349
    Joined:
    Nov 12, 2012
It's only advantageous for LCD in very specific cases: when the entire frame is bright.

    If you have a very bright area and dark area next to it OLED is far superior.


    In my experience the second is much more common and the difference is much more impactful.
     
    Armenius and IdiotInCharge like this.
  14. Lateralus

    Lateralus More [H]uman than Human

    Messages:
    13,021
    Joined:
    Aug 7, 2004

    This can't happen soon enough! Then I can relegate my 55" OLED to TV duties. It's a little larger than my ideal size, but I just can't go back to LCD for PC gaming and movies now. These sizes NEED to happen.
     
    Gatecrasher3000 and Armenius like this.
  15. N4CR

    N4CR 2[H]4U

    Messages:
    3,249
    Joined:
    Oct 17, 2011
AUO does sell direct too, but it's not so common.
     
  16. Tiberian

    Tiberian DILLIGAFuck

    Messages:
    5,764
    Joined:
    Feb 12, 2012
I just cannot imagine using an OLED for day-to-day computer use, at least not for a few more years. The cost of these beasts right now is crazy, and knowing that the moment it's turned on the pixels are literally dying, ouch, that's a considerable expense. And considering there's a lot of content on most computer displays that's static in nature (toolbars, desktop icons if people have 'em, Rainmeter/system monitor type utilities, etc.), the chance for actual burn-in of those organic LEDs, wow.

    I mean, yes, OLED tech is absolutely gorgeous to look at, with great colors (sometimes a bit over-saturated but that can be accounted for in settings and profiles), and pure blacks like they should be, but even that pricing is certainly for the [H]ardest of the [H]ardcore people.
     
  17. MistaSparkul

    MistaSparkul Gawd

    Messages:
    904
    Joined:
    Jul 5, 2012
    I actually agree with you. I don't even expect any of my OLED sets to last something like 10 years, but I think most people here also view OLED as disposable and just need theirs to last a few years until something better like MicroLED becomes available.
     
  18. sharknice

    sharknice [H]ard|Gawd

    Messages:
    1,349
    Joined:
    Nov 12, 2012
    I agree too. It's just so damn good it's worth it to me. But I think for most people the cost is way too much.
     
  19. Lateralus

    Lateralus More [H]uman than Human

    Messages:
    13,021
    Joined:
    Aug 7, 2004
    You know what, though? I remember people expressing these same lifespan concerns over plasma TVs, MLC and TLC SSDs, etc. Yet, most of those devices lasted WAY longer than the fearmongers predicted and it proved to be a non-issue for most people who didn't abuse them. There are plenty of those devices still in use today.

    I expect the same for OLED...heck, it's playing out as we speak, but every time there's an OLED thread on here the same concerns get brought up just as they did for those other new technologies. And that's OK. I've been using my OLED for over a year as a monitor with absolutely zero signs of burn-in or degradation, but I'd much rather have this level of visual splendor for a few years less than yet another LCD/LED with evolutionary rather than revolutionary improvements in image quality because I tend to upgrade before wearing anything out anyway. Guarantee I'll move on to a 4K/120Hz OLED or a MicroLED well before this set shows any signs of wear, but I realize everyone doesn't have the budget or desire to do that.
     
    Armenius and gan7114 like this.
  20. Vega

    Vega [H]ardness Supreme

    Messages:
    5,878
    Joined:
    Oct 12, 2004
Nope. The 6,000-hour FPS game test, with the HUD up and zero burn-in, is completely contrary to that. There was ZERO degradation in the OLED display's brightness or color volume over 6,000+ hours in those tests. Burn-in occurs when the pixels are driven extremely hard in a single state, uninterrupted, for extremely long periods of time. If you give the pixels a rest, they recover. That is the whole point behind LG's built-in "pixel refresher".

If that CNN test had been quartered - 3/4 of the time (15 of the 20 hours per day) varied content over 18,000 hours, and CNN for 1/4 (5 of the 20 hours per day, broken up into one-hour blocks) - then even at the same 6,000 cumulative CNN hours the TV wouldn't have NEARLY as much burn-in. Probably none at all. The lower-brightness CNN test, under the same schedule I just described, wouldn't have any burn-in at all, guaranteed. And on top of it all, these are the older OLEDs with smaller pixels that need to be driven harder. OLED panels got larger subpixels and less dead space between pixels in 2018, and again in 2019, lessening the issue even more.

    RTINGS won't do a test like I describe above, because they know they'd be wasting their time.
     
    Last edited: Jan 11, 2019
    Armenius likes this.
  21. Halc

    Halc n00bie

    Messages:
    8
    Joined:
    Feb 28, 2016
The Alienware unit is most likely an LG 2019-series OLED television with an additional DisplayPort 1.4 input.

The 2019 LG OLED TV series will have:
- full HDMI 2.1 (incl. full bandwidth, VRR, ALLM, the works)
- 120Hz OLED refresh rate and input (via HDMI 2.1)
- 13 ms lag

The reason this doesn't come in smaller sizes is that LG doesn't currently produce such OLED panels.

Alienware could at the very least have made it a real G-Sync Ultimate monitor.

Now we don't even know if it's G-Sync Compatible, because G-Sync Compatibility is NOT currently supported via the HDMI 2.1 interface, AND we do not know if the Alienware 55 monitor supports VESA Adaptive-Sync via its DisplayPort input (if it doesn't, bye-bye to G-Sync compatibility).

Besides all that, it will probably be way overpriced, in which case it'll make more sense to buy the corresponding LG 2019 OLED TV with the 120Hz panel, full HDMI 2.1, variable refresh rate and the whole shebang for a cheaper price.
     
  22. Necere

    Necere 2[H]4U

    Messages:
    2,654
    Joined:
    Jan 3, 2003
    Judging by some Call of Duty: WWII multiplayer video, it's a low-degradation rate game: the HUD is mostly semi-transparent and not very saturated. It's not that surprising it didn't show any signs of burn-in.

    Can you substantiate the claim that it is specifically uninterrupted display that causes burn-in? This is contrary to LG's own statement regarding aging being dependent on cumulative display.

    Can you explain the mechanism behind the pixel refresher function? "Giving the pixels a rest" doesn't make much sense as a means of reversing OLED subpixel aging. I believe this post is a more likely explanation of what it does, in which case it could only reduce temporary image retention, while having no effect on already degraded subpixels.

    Can you substantiate this claim? Otherwise it's just more assertions without evidence.

Look, I'm not anti-OLED. It's clear that it provides the best image quality money can buy in a consumer display, and I'll probably pick one up myself when I'm comfortable with the size/features/price. But let's be honest and realistic about its shortcomings - which it does have. Burn-in is a real issue that will affect some people based on their usage patterns. And not just marathoning CNN, either: it's entirely possible that over a couple years of regular (but not excessive) display of content with bright, saturated static elements, you will get some permanent burn-in. Games like League of Legends or Overwatch have health bars that are always up and are moderately bright, saturated green/blue, making them more likely to cause burn-in over months or years. Most of my gaming is limited to games I'll play for a few dozen hours and be done with, so I'm personally not that worried about burn-in, but there are definitely lots of people who do play one game for thousands of hours.
     
    Last edited: Jan 11, 2019
  23. gan7114

    gan7114 Limp Gawd

    Messages:
    163
    Joined:
    Dec 14, 2012
It's worth pointing out that Rtings conducted these tests using the C7, which features a different pixel structure from the C8 and presumably the C9. LG increased the size of the red subpixels last year, the end result being that they don't have to be driven as hard to produce the same amount of light.

I'd like it if Rtings ran this test again using the C9, with more realistic daily usage and displayed content.
     
    Armenius likes this.
  24. TheGameguru

    TheGameguru [H]ard|Gawd

    Messages:
    1,825
    Joined:
    May 31, 2002
I've used an LG 2016 C-series 55" OLED as my primary PC monitor in my home office for all sorts of use, from business work to gaming (Xbox, PC, PS4, etc.), and have zero burn-in. Lots of static content, but realistically, even in normal PC use you are constantly shifting pixels no matter how long you are actively using it.

    I'm over the 2 year mark in use and still going strong.. but this Alienware as well as the new 2019 LG's are tempting to upgrade to.
     
  25. Sorcerer_Tim

    Sorcerer_Tim [H]Lite

    Messages:
    111
    Joined:
    Oct 8, 2009
Great to see what is coming; good thing I just bought a 120Hz G-Sync monitor :(. Someone will get a deal on it, as my wife is very happy with the 8-year-old monitor she has on her PC. I will definitely be buying one of these if it supports some sort of VRR tech I can use with my 2080 Ti.
     
  26. elvn

    elvn 2[H]4U

    Messages:
    3,004
    Joined:
    May 5, 2006
    I can believe that they are avoiding most of the burn in risk by using shackled color brightness limitations, full screen brightness rolldowns, and their 400 - 500nit color ceiling before white subpixel brightness takes over followed by ABL screen dimming. You could still be unlucky and get some burn in though with no mfg coverage period whatsoever.

    For me a huge issue is that OLED will never be capable of HDR color volumes. At best it is "SDR+" with strong color brightness limitations and "cooldown" reflexes in order to avoid real burn in damage and shorter lifespans.

That color brightness ceiling - not just "brightness," but crushing to white like SDR does - will only grow more significant. There are already near-2000-nit HDR color TVs. Movies are mastered in up to HDR 10,000 color volume. UHD HDR discs are an array of HDR 4000 and HDR 1000 color titles, with at least one HDR 10,000 color title (Blade Runner 2049). The handful of HDR PC games ranges too, with several that support HDR 10,000 and 4000 already.

    --------------------------------

    A big limitation of OLED tv so far was lack of 120hz + VRR though. Personally I have zero interest in pc gaming on 60hz displays and without variable refresh rates. If they can provide both, even on SDR ranged color, with strong burn in safety features it will definitely be interesting.

    I wouldn't buy one for HDR color though.. even the HDR 2000 color tvs are fractional HDR.. around 1/2 of 4000 content's color and 1/5 of full HDR 10,000 content's color volume. 400nit color is 1/10th of HDR 4000 color and 1/25th of HDR 10,000 color volume, which is hardly "HDR" color. Black depths on the other hand, and side by side pixel contrast.. are enormous gains but pick your poison.
    Unfortunately we might not have as many choices as I'd like since it's in nvidia's hands whether they will support VRR over HDMI. Their current GPU gen also lacks HDMI 2.1 obviously for full 4k 120hz 4:4:4 chroma bandwidth.
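The fractions above are simple linear nit ratios, easy to sanity-check (nits are compared linearly here, which ignores that brightness perception is roughly logarithmic):

```python
# Sanity-checking the color-volume fractions quoted above. Treat these
# as rough mastering-peak ratios, not perceptual brightness ratios.

masters  = {"HDR 4000": 4000, "HDR 10000": 10000}
displays = {"OLED ~400 nit color": 400, "LCD ~2000 nit color": 2000}

fractions = {
    (d, m): peak / target
    for d, peak in displays.items()
    for m, target in masters.items()
}

for (d, m), frac in fractions.items():
    print(f"{d} reaches 1/{round(1 / frac)} of an {m} master's peak")
```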
     
    Last edited: Jan 11, 2019
  27. mbakalski

    mbakalski n00bie

    Messages:
    62
    Joined:
    Oct 9, 2013
    For those of you who use 55+ screens for gaming/office use - how far away do you sit from the screen?

    I am looking to replace my 3415 Dell with something with a higher refresh rate and slightly larger real estate, but I feel like 55 is just too large. Something in the 32-40 range with 4k/120Hz/OLED sounds just about perfect.

    Also I am guessing I don't have enough body parts to sell to get one of these things regardless.
     
  28. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    15,469
    Joined:
    Jan 28, 2014
I sit about 4 feet away from the 49" TV I use for console gaming. It is curved, though only very slightly, at 3800R.
     
  29. elvn

    elvn 2[H]4U

    Messages:
    3,004
    Joined:
    May 5, 2006
    I sit about 8' away from my 70" Vizio FALD VA (non-hdr gen) with my laptop ported to it with no problem for media and games.. so a 55" would probably be fine at around 4' I'm guessing, in line with what Armenius said.

I do have to look up to the corners for text at that distance on my 70", though, so when I occasionally use it for desktop/apps, web browsing, etc., it works more like a multiple monitor/window scenario than a full focal view of the whole screen. So it depends. If you have the room to face your desk away from the wall, face it out from a corner with the monitor mid-room, or set the desk mid-room facing a monitor on the wall, those arrangements usually suit larger screens better than trying to keep the common sitting-at-a-bookshelf setup.
     
    Last edited: Jan 11, 2019
    Armenius likes this.
  30. cybereality

    cybereality 2[H]4U

    Messages:
    3,670
    Joined:
    Mar 22, 2008
    I have a 55" Samsung on a standard computer desk. Sitting about 2 feet away if I'm typing (I sit more like 3 or 4 feet away when gaming or watching movies).

    It's a tad too big for this setup. I was on 40" before, and that was a nicer size for daily use. I still like it for media usage, though I have another computer with a normal monitor I use for work.
     
    Armenius likes this.
  31. Desert Fish

    Desert Fish n00bie

    Messages:
    31
    Joined:
    May 30, 2016
    Meh. Can't go back to a flat display.

    It makes sense at typical TV distance, but that would be rather underwhelming.
     
  32. linuxdude9

    linuxdude9 Gawd

    Messages:
    513
    Joined:
    Dec 25, 2004
  33. sharknice

    sharknice [H]ard|Gawd

    Messages:
    1,349
    Joined:
    Nov 12, 2012
LG is releasing a rollable TV. The screens can probably be bent to whatever curve you want if you're willing to mod it. I wouldn't be surprised if there are some [H]ard members crazy enough to curve-mod their OLEDs once they release.
     
    Armenius likes this.
  34. Necere

    Necere 2[H]4U

    Messages:
    2,654
    Joined:
    Jan 3, 2003
    Indeed, last year was a relatively small increase in subpixel size, but this year's models look like they might have a much more substantial improvement in that area. Someone posted this on AVS Forum, though note the 2019 numbers are only a preliminary estimate right now:

[image: chart of estimated LG OLED subpixel sizes by model year]


    Next year should see substantial gains yet again, assuming LG gets top emission production up and running.


    I actually wouldn't be that concerned about desktop use as long as you take a few precautions. The static parts of a desktop are usually white, shades of grey, or muted colors. Compared to saturated colors, white/grey is much easier on an LG WOLED, since you can use all four subpixels (R,G,B,W) to display it. On top of that, the white subpixel doesn't lose light to a color filter.

By comparison, a highly saturated color like red or green needs to be generated entirely by one subpixel (white can be used to increase brightness, but it loses saturation), AND you lose much of the light to the color filter. Both of these factors mean it needs to work extra hard to generate a bright, saturated color. (Side note: this is where QD-OLED could improve efficiency dramatically, since it converts much more of the OLED light to the right color than a color filter can.)
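A rough way to put numbers on that difference (the 30% filter transmission and the 50/50 white/RGB load split are illustrative assumptions of mine, not LG's actual panel figures):

```python
# Toy WRGB OLED model: how hard must the hardest-working subpixel run to
# reach a target luminance? Filter transmission and load split are
# made-up illustrative numbers.

FILTER_TRANSMISSION = 0.3  # assume a color filter passes ~30% of the light

def drive_for_white(target_luma):
    # All four subpixels share the load; W has no filter loss.
    # Crudely: W carries half, R/G/B split the rest through filters.
    w_drive = 0.5 * target_luma
    rgb_drive_each = (0.5 * target_luma / 3) / FILTER_TRANSMISSION
    return max(w_drive, rgb_drive_each)

def drive_for_saturated_red(target_luma):
    # One subpixel does all the work, and through a filter.
    return target_luma / FILTER_TRANSMISSION

print(round(drive_for_white(100), 1))          # 55.6
print(round(drive_for_saturated_red(100), 1))  # 333.3
```

With these made-up numbers the lone red subpixel works roughly six times harder than any subpixel does for the same luminance of white, which is the intuition behind avoiding bright, saturated static elements.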

    So the main thing you'd want to avoid with desktop use are any static icons that use highly saturated colors. Hide/move/replace them, set a black or desaturated image as the desktop background, and you should be good for quite a while.
     
    sharknice and Lateralus like this.
  35. prava

    prava [H]ard|Gawd

    Messages:
    1,342
    Joined:
    Nov 14, 2010
All things considered, burn-in is a legitimate concern on something as expensive as a TV, because:
A) it is not exactly known what causes more degradation
B) a TV is expected to last thousands of hours
     
    bigbluefe likes this.
  36. cybereality

    cybereality 2[H]4U

    Messages:
    3,670
    Joined:
    Mar 22, 2008
I guess it depends how long you think you will keep it. I usually upgrade monitors every few years, in which case burn-in may not be too big an issue (though it could affect resale value).
     
  37. elvn

    elvn 2[H]4U

    Messages:
    3,004
    Joined:
    May 5, 2006
If I had one, I'd probably keep it in a multi-monitor setup and only use the big screen for games and videos. Shut it off when not doing those activities, and perhaps also have a screen timeout/black screen visualization you could activate manually for the in-between times when you don't want the monitor "off" but are jumping out of the game/video for a short while to focus on something else. You don't really need to burn OLED time on desktop and apps in a multi-monitor setup, imo. It's more of a game-theater and fullscreen-video usage scenario.

A display like OLED should have its own internal sleep timer, separate from the OS monitor keep-alive, so that if something goes stupid - Windows or an app crashing, a random reboot, a video left playing, the mouse moving, a weight on a keyboard, a controller resting on its own stick button, etc. - it won't stay on all night or the whole time you are away from home, out of the room, or afk. Human error and random occurrences shouldn't burn up your OLED's lifespan.
     
  38. Enhanced Interrogator

    Enhanced Interrogator Gawd

    Messages:
    842
    Joined:
    Mar 23, 2013
Yeah - in recent Battlefield games, for example, after you've played them the screen saver won't trigger again until you reboot. Situations like that could definitely be harmful to an OLED.
     
  39. tybert7

    tybert7 2[H]4U

    Messages:
    2,628
    Joined:
    Aug 23, 2007

The thing is, most PC displays top out at 400 nits anyway. I'm using a Sony 43" 4K TV that tops out around 400 nits, and that is basically white light. It would be a massive improvement if the COLORS topped out at 500 nits, on top of deep blacks, instead of this VA panel with zero FALD in the smaller sizes.

    I see any of these oleds as a massive upgrade and landing zone for wonderful looking content and games for years to come, while the rest of the industry pushes ahead to higher peak brightness levels and color volume. It's going to be a LONG time before we get displays that can hit 10k nits of brightness, or even 4k. And how long before we get that AND emissive displays?

    When it comes to displays, there are these sorts of... base camps, going beyond that is a lot of incremental improvement until you get to another local base camp.



Cheap 1440p Korean displays were a base camp. 43" 4K was, to me, another base camp. 4K 120Hz OLED is another - and that is already enough refresh and VRR for me. Higher color volumes would be another base camp, but not just a little bit higher with a little bit more content; I'd want a MUCH bigger leap to jump again, and I can wait for those improvements to build up. But once these OLED VRR 120Hz 4K displays hit, that is a high enough local peak to jump to and live with for a while.
     
  40. bigbluefe

    bigbluefe Gawd

    Messages:
    512
    Joined:
    Aug 23, 2014
    This basically reads like:
    "I actually wouldn't be that concerned about desktop use as long as you're concerned about it."

    No. It's not acceptable. Thoughts and considerations like these shouldn't enter the mind of people dropping $2,000+ on a monitor.
     
    DanNeely likes this.