Even OLED is a flawed technology

The first is a 2017 model and had more issues than the 2018/19 models. The Rtings testing is not a normal daily use case; if you plan on watching/gaming 20/7 then yes, OLEDs are not for you. Basically the test simulated over 5 years of TV use before substantial burn-in occurred, which still doesn't represent normal use. Five years is a pretty reasonable life for TVs nowadays, and more than likely you will still get a lot more out of an OLED. I don't baby my C9, I haven't changed my viewing habits, and I will enjoy the beauty of this TV. If I have issues after a year I will eat my words.
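To put rough numbers on that claim (a quick Python sketch; the 20 hours/day is from the RTINGS setup quoted below, while the 4 hours/day of "normal" viewing is my own assumption):

```python
# Rough arithmetic behind the "over 5 years of use" claim.
# 20 h/day is RTINGS' actual test schedule; 4 h/day normal viewing is assumed.
test_hours_per_day = 20
normal_hours_per_day = 4

hours_per_test_year = test_hours_per_day * 365
equivalent_years = hours_per_test_year / (normal_hours_per_day * 365)
print(f"1 year on the test rig ~= {equivalent_years:.0f} years of normal use")
# -> 1 year on the test rig ~= 5 years of normal use
```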

The Rtings tests were done at 175 nits and 200 nits. The second one they set up on CNN as an "extreme" test was 380 nits, so none of them were showing any HDR brightness whatsoever. I'm sure it's no coincidence that the 55" Alienware by Dell has a solid 400 nit SDR color brightness peak. I believe LG OLEDs spike into the high 700 to 800 nit range on small-percentage windows, while the overall scene is kicked back to 600 nits max by the ABL (auto brightness limiter) dimming the whole picture well below that 600 nit ceiling. Of course you usually aren't watching HDR content as static images, but HDR should eventually take over as a higher-range color brightness 3D color gamut, so someday it will be in photos, wallpapers, apps, streaming services, more games, and the editing of HDR images, graphics, and videos. It would be interesting to see the results if they had run HDR material on a loop on a few OLED TVs as well.
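To illustrate what ABL is doing, here's a toy model. The breakpoints are invented for illustration and only loosely shaped like review measurements, not LG's actual curve:

```python
# Toy ABL model: the allowed peak shrinks as the bright "window" portion
# of the scene grows. Small highlights can spike high while a mostly
# bright scene gets pulled back down. Numbers are illustrative only.
def abl_limit_nits(window_pct: float) -> float:
    curve = [(2, 800), (10, 700), (25, 600), (50, 450), (100, 150)]
    for pct, nits in curve:
        if window_pct <= pct:
            return nits
    return curve[-1][1]

for w in (2, 10, 25, 50, 100):
    print(f"{w:3d}% window -> ~{abl_limit_nits(w)} nits allowed")
```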

=================================================================

RTINGS https://www.rtings.com/tv/learn/permanent-image-retention-burn-in-lcd-oled
TEST SETUP

"The TVs are placed side-by-side in one of our testing rooms as shown to the right. The TVs will stay on for 20 hours per day, 7 days per week, running our test pattern in a loop. They will be turned off for 4 hours each day using USB infrared transmitters connected to each TV and controlled by a PC to better represent normal (but still very heavy) usage. Calibration settings have been applied, with the backlight or OLED light set to produce 175 nits on our checkerboard pattern. On the B6, the 'Pixel Shift' option is enabled. A single Android TV Box is used as a source, with a HDMI splitter used to provide the same material to each display."

"A 5.5 hour video loop is used as the test pattern. It has been designed to mix static content with moving images to represent some typical content. The base material is a recording of over the air antenna TV with RTINGS overlay logos of different opacities and durations, and letterbox black bars added. These additional elements are:

  • Top and bottom: Letterbox bars present for 2 hours, then absent for 3.5 hours (movie example)
  • Top left: 100% solid logo, present for the whole clip (torture test)
  • Top right: 50% opacity logo, present for the whole clip (network logo torture test)
  • Bottom left: 100% solid logo, present for 2 hours then absent for 3.5 hours (video games example)
  • Bottom right: 50% opacity logo, present for 10 minutes then absent for 2 minutes (sports or TV shows example) "
----------------------------------

  • The total duration of static content. LG has told us that they expect it to be cumulative, so static content which is present for 30 minutes twice a day is equivalent to one hour of static content once per day.
  • The brightness of the static content. Our maximum brightness CNN TV has more severe burn-in than our 200 nits brightness CNN TV.
  • The colors of the static areas. We found that in our 20/7 Burn-in Test the red sub-pixel is the fastest to degrade, followed by blue and then green.

CNN "MAXIMUM" Brightness test they did is only 380nits:
"As above, live CNN is played on the TV through a cable feed. However, for this TV, the 'OLED Light' is set to maximum, which corresponds to a brightness of 380 nits on our checkerboard pattern. This is to show the relationship between burn-in rate and 'OLED Light' with the exact same content and over the same time period.
=================================================================

If you use a big OLED as another monitor in an array, set up as a dedicated "cinema/game screen", and never put icons or taskbars or anything on it regularly, it would probably be fine within its ABL limits, at least for 3-5 years anyway I'd guess, unless you got unlucky... but for regular PC desktop/app use with the screen designated as a monitor, I'm sure Dell was being smart about using a 400 nit SDR color peak.


Most of the OLED laptops are also low-nit screens well below the 1000 nit HDR standard, so they are mostly SDR screens at ~400 nit color and less, with 600 nits being more like "SDR+" (400 nit SDR plus a few hundred more nits to the peak color ceiling). Since the 483 nit HP and the 626 nit XPS 15 are still within the limits the TVs use, they might be relatively safe, and laptops tend to power-save and blank the screen more by default unless you change it, compared to using a PC at a desk. (The XPS 15 screen reportedly has complaints of grey banding and black crush though, so it might not be that great overall anyway.)
 
Also, to add my own two cents to the fray...

No technology is perfect. Every technology has its drawbacks. Every. Single. One. CRT had drawbacks, LCD has drawbacks, and OLED also has drawbacks. For everything that OLED does though, I think it gets more right than wrong. The motion clarity issue with OLEDs should be solved by now (rolling scan, please!), and once that's implemented in consumer sets, the only real drawback it will have versus CRT is not being able to do any resolution natively.

EDIT: And regarding burn-in: just don't do anything stupid. There are lots of techs out there that are susceptible to uneven wear due to prolonged use. Did you know that LCD projectors shouldn't be run too long in a single sitting either? The reason is that the polarizers are susceptible to being burned (particularly blue) by the UV light coming off the lamp. True story.
 
Burn-in is overblown in most scenarios... you have to be hammering the same game for hours a day, every day, weeks on end, for it to be of any real concern. Or watching the SAME channel with the same logos, again, for hours a day, every day.

HOWEVER, when it comes to PC use, it's not uncommon (and certainly not stupid) for people to be in front of their screens for 12-16 hours a day with the same windows open... Photoshop, video editing, DTP applications, etc. OLED is never going to be suitable for these use cases. That rules out a large potential consumer base right away. So that just leaves gamers... I would venture many have already bought an OLED TV anyway, or plan to, given Nvidia will now be supporting VRR on LG OLEDs. LG have a cheaper 48" OLED due next year also, so it's looking pretty sweet for those with lounge gaming set-ups. This really doesn't leave any room for smaller affordable OLED PC gaming monitors, which would need to come in well under the price of a TV to stand a chance of selling (therefore cannibalising the LCD monitor market)... and looking at how expensive 21" OLEDs currently are ($4,000+), I don't see a snowball's chance in hell of this ever happening.
 

Most enthusiasts use more than one monitor. Monitors good for gaming are usually not good for productivity and vice versa. For ages there have been multiple monitors on my desk because there is no single solution that covers all bases.

LCDs are good for workhorse productivity and OLEDs are ideal for entertainment consumption. Currently OLED products suffer from the Goldilocks problem: too small or too big. I believe there is a nice-sized market for a 32" OLED 4K120 VRR at the $2,000-3,000 price point.

Burn-in is a poor term for OLED's problem; it should really be called "wear-in", because pixels that get used more intensely wear faster than others, which produces the burn-in-looking effect. If you shuttle the workhorse load off onto an LCD and are mindful of taking care of the side OLED (no taskbar, no icons, etc.), you can get a decade of problem-free entertainment consumption use from the OLED.
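A minimal sketch of that "wear-in" idea, with made-up numbers, just to show that the visible ghost is the difference in cumulative wear rather than any absolute amount of damage:

```python
# Burn-in as *differential* wear: damage tracks cumulative drive level,
# so a static logo region ages faster than the surrounding picture.
# All numbers here are invented for illustration.
hours = 10_000
logo_drive, background_drive = 1.0, 0.4   # relative subpixel intensity
wear_rate = 1e-6                          # luminance loss per hour per unit drive

logo_loss = hours * logo_drive * wear_rate
bg_loss = hours * background_drive * wear_rate
print(f"logo area lost {logo_loss:.1%}, background lost {bg_loss:.1%}")
print(f"visible ghost = difference of {logo_loss - bg_loss:.1%}")
```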
 
I believe there is a nice-sized market for a 32" OLED 4K120 VRR at the $2,000-3,000 price point.

I don't disagree, but we currently have 60Hz 21" OLEDs and 32" LCD monitors at twice that price... this presents a MASSIVE chasm to cross before a 32" OLED 4K 120Hz VRR at $2K-3K becomes even a distant dream, because here and now, it's an absolute fantasy. But yeah, if by some miracle it happened, it would sell.
 
Probably.....hopefully 2021 - 2022ish
 
For the next 18-20 months I'll be looking at what they'll make with HDMI 2.1 120Hz HDR VRR in OLED and mini-LED (and at what sizes and prices, in both monitors and TVs). After that, I'm hoping dual-layer LCD can come into consumer products in the next few years to fill the gap until micro-LED is a real thing, even if it shows up first in HDMI 2.1 TVs. I'm guessing we can look to what the super expensive reference and professional editing monitors are using at each stage and hope it trickles down. The ProArt is mini-LED and has pretty good specs and features, but it isn't really the extreme reference-monitor high end to look to. The real Eizo, Sony, and Panasonic reference monitors just moved from OLED to dual-layer LCD with 1000 nit peak color brightness HDR, and I'm guessing they will eventually switch to micro-LED when it becomes available, way ahead of everyone else... at astronomical prices, way before consumers get them.

As it is looking now, I'll probably skip dropping any huge money (over $1k) on a desk/command-center monitor until HDMI 2.1 displays are out and more options are available. I will keep an eye out for a 77" LG C9 OLED to drop closer to the $3,200-ish range +tax on sale someday for the living room in the meantime (I wish they made a 70" at a 70" price), and I'll probably get a 120Hz-capable quasi-4K PS5 when they come out, or whenever their top-end model is out (Pro again? :b). I'm also waiting on Nvidia to release a die-shrink GPU with HDMI 2.1 output in the top-level Ti tier for true 4:4:4 10-bit 4K 120Hz HDR.

If I'm being honest with myself, if one of the 43" 120Hz 4K DisplayPort monitors goes on sale cheap enough months from now I may swap out my 32" GK850G, but I'm still on the fence even at $800-850 +tax instead of $1,100 +tax, since that money could be better spent on what I listed above. A sale could tempt me though, I guess. If Nvidia starts banging a drum about a die-shrink HDMI 2.1 GPU in the overlap, I'd probably just save to buy a 55" C9 OLED TV with HDMI 2.1 48Gbps (for $1,500 or less, the way prices have been going) once I could get a top-tier HDMI 2.1 output GPU.

The wait for an HDMI 2.1 GPU is looking to be so long that I could conceivably blow money on a 77" C9 OLED + PS5 for my living room and be playing 120Hz quasi-4K games well before Nvidia releases a top-tier HDMI 2.1 GPU, pushing the PC GPU and HDMI 2.1 TV/monitor-for-PC purchases further out.
 
The thing is, monitor tech is pretty transparent... we know what's coming a long way out. Panel production plans are very frequently known about more than a year before they even start, and you can add another year on top of that before any actual monitor becomes available... plus this doesn't even factor in the inevitable delays. So even if they announced an amazing new monitor tomorrow, we wouldn't see it for at least a couple of years. However, given how expensive everything has become, even bog-standard LCD, there is literally zero chance we'll see affordable OLED, mini-LED, dual-layer, and definitely not micro-LED, in the next few years. Outside of TVs of course (which ARE getting cheaper), but that's at sizes impractical for desktop use. It's a sad state of affairs, but it's just not going to happen.

I look at the recent high-refresh 43" 4K monitors and it seems patently obvious manufacturers are putting as little effort into these as possible. The XG438Q was a major disappointment, and I don't hold out any hope for the CG437K or XG43UQ, which will be using virtually the same panel. The monitor market clearly exists separately from TVs, and it does seem as though the TV side holds FAR more promise for affordable options... the only problem is the larger size.
 
Yes, once I can get a die-shrink Nvidia HDMI 2.1 output 48Gbps 120Hz 4K VRR Ti GPU, I'll seriously consider moving my desk back a bit more, to around ~4' from my eyeballs, in order to use a 55" HDMI 2.1 4K 120Hz VRR TV. That's the most likely scenario over something like a $4,300-5,000 (+8.75% tax here) ProArt monitor, unless they start making a much cheaper gaming model based on the same tech with HDMI 2.1 someday, perhaps.

My main sticking point for dropping a decent amount of money is HDMI 2.1 across the board, in both display and GPU, for real 4:4:4 10-bit 120Hz 4K HDR, instead of pushing 4K 120Hz down too narrow a pipe on DisplayPort.

Everything else is jury-rigged to me, and massively overpriced as such.
 
HDMI 2.1 could in theory give monitor manufacturers a kick up the backside, given how many people I see talking of moving to TVs... but again, it doesn't change how slow the industry moves, and even if they actioned plans for a killer affordable 32"-43" high-refresh VRR monitor tomorrow, we wouldn't see it available for years. It's very frustrating, especially when you see flawed LCD panels selling for thousands. I am struggling to see light at the end of the tunnel really, and think OLED TVs hold far more promise than any monitor (outside of micro-LED, but that's forever away), but with the caveat of moving away from my traditional desktop set-up.

Almost feels like the monitor industry is trying to kill off traditional PC desktop gaming... won't be long before PC couch gamers have the best of both worlds.
 
If they push some dual-layer LCD TVs to market they could end up being a segment in the next 2-3 years, but I won't hold my breath on that becoming a reality. They run hot as of now, so they need active cooling almost like a small PC case, and they use a lot of power. They do hit 1000 nit HDR color brightness on the reference monitors and are said to be able to go to 3000 nits in TVs, with super deep blacks (0.0003 nit black depth), no haloing, and no burn-in risk or limitations from burn-in safety features. Micro-LED is farther out at consumer-level pricing, many years. That's why I said I'm hoping dual-layer LCD will bridge the gap in a few years until micro-LED arrives a few years after that. Like I said, I won't hold my breath that dual-layer LCD will come to the consumer market, but Hisense said they were going to release a 65" dual-layer LCD TV in China "this year" (2019), so I'll be eager for info on how that works out. There are also cheaper and smaller JOLEDs that are going to be produced, so at least there will be another option there as well.
 
With OLED it looks so pristine it looks artificial; it doesn't have much depth, so it all looks the same, one pristine image after another pristine image. Even CRT monitors looked better than LCD monitors. I'm really impressed with the 24" Samsung VA I picked up last week. I went past the OLED monitors at Walmart, then I went past the QLED HDR Samsungs, and the image just looks better on the Samsung QLED TV. OLED is perfection, but it doesn't have much character. I assume most OLED panels will look the same without much variation.
 
2. Trailing/ghosting/stuttering on text scrolling
There is actually an interesting phenomenon that can come into play here. LCD and OLED are both sample-and-hold type displays, so the pixels change and then they remain solid until they're changed. The response time on OLED is so fast that you're perceiving all the intermediate steps of the animation, where with LCD the response time would blur it more and your eyes would see it as more smooth. The ghosting might be just retinal persistence. Both issues would improve with an increase to 120Hz.

I'd have to see it to know if it was an actual issue with the screen or digitizer. You could compare it to the ones at an Apple store or retailer to see if it's normal or not, but I'd wager it is.
 
The response time for OLED should be tiny though; I've certainly never seen a problem on my B6.
 
He's talking about sample-and-hold blur from the way our eyes work, not response time. Bad response time can cause black smear, but sample-and-hold blur is intrinsic. You still get massive smearing at 60Hz/60fps compared to 100-120fps on a 120Hz OLED or LCD. Sample-and-hold blur at the 100fps range (not average) is 60% of the 60fps/60Hz baseline, and at the 120fps range it's 50% (halved), softening the blur so it stays more "within the lines" or shadow masks of objects and the whole scene as they move around in your viewport.
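The arithmetic behind those 60%/50% figures is just per-frame persistence relative to the 60fps baseline:

```python
# Sample-and-hold persistence per frame, relative to a 60 fps baseline.
baseline_ms = 1000 / 60                  # ~16.7 ms held per frame at 60 fps
for fps in (60, 100, 120, 240, 480, 1000):
    persistence_ms = 1000 / fps          # how long each frame stays on screen
    print(f"{fps:4d} fps: {persistence_ms:5.2f} ms held "
          f"-> {persistence_ms / baseline_ms:.0%} of the 60 fps blur")
```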

In a demanding game you aren't looking at a single cartoonish UFO bitmap but rather an entire viewport full of high-detail textures, depth via bump mapping, fine detail, and even text, all being smeared as you move the entire game world around relative to you in your viewport when mouse-looking, movement-keying, or controller-panning.

----------------------------------------------

https://www.blurbusters.com/faq/oled-motion-blur/

The answer lies in persistence (sample-and-hold). OLED is great in many ways, however, many of them are hampered by the sample-and-hold effect. Even instant pixel response (0 ms) can have lots of motion blur due to sample-and-hold.

Motion blur occurs on the Playstation Vita OLED even though it has virtually instantaneous pixel response time. This is because it does not shorten the amount of time a frame is actually visible for, a frame is continuously displayed until the next frame. The sample-and-hold nature of the display enforces eye-tracking-based motion blur that is above-and-beyond natural human limitations.

The only way to reduce motion blur caused by sample-and-hold, is to shorten the amount of time a frame is displayed for. This is accomplished by using extra refreshes (higher Hz) or via black periods between refreshes (flicker).

--------------------------------------------------

People also notice that OLEDs are so fast in response that they can actually show stutter-type effects that would be smoothed over by slower response times.

https://www.rtings.com/tv/tests/motion/stutter

  • Stutter shouldn't be confused with judder, which is a result of an inconsistent frame rate. Stutter is a result of low frame rate, but consistent frame timing.
  • When compared to LCD TVs, OLED TVs are much more likely to suffer from noticeable stutter due to their almost instantaneous response time.

Stutter produces an image which appears to jump between frames and is much more noticeable for low frame rate content such as movies and 30 fps video games. To evaluate the stutter of a TV, we measure the response time which shows the transition time between frames and use this to determine the time that a static frame is shown, which depends on the frame rate of the content.

If you want the smoothest image possible, look for a TV which has a longer response time to extend the transition between frames. You can also help to reduce the amount of stutter by watching higher frame rate content, enabling motion interpolation, or by flickering the backlight. These workarounds do have other side-effects though.

This is an unfortunate side effect of ever faster response times. Faster response times results in less motion blur, but a longer hold time, so more stutter. This is especially noticeable in OLED TVs with their nearly instantaneous response time.

So OLED has the potential to stay crisp at the higher frame rates and Hz of high-Hz displays, where other displays would not be able to keep up and would smear as their response times fell behind. However, both LCD and OLED are bound by the way our eyes work with sample-and-hold blur. Even if you had a 1000Hz display with the response time to keep up, you'd still get blur based on your frame rate.
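To put numbers on the stutter side (the response times here are ballpark assumptions, not measurements):

```python
# Why instant pixels can look *more* stuttery: the dead-still "hold" per
# frame is the frame time minus the response time spent transitioning.
# Assumed responses: ~0.1 ms for OLED, ~8 ms for a slow-ish LCD.
def static_hold_ms(fps: float, response_ms: float) -> float:
    return max(1000 / fps - response_ms, 0)

for name, response in (("OLED", 0.1), ("LCD", 8.0)):
    print(f"24 fps movie on {name}: frame held static ~{static_hold_ms(24, response):.1f} ms")
# The OLED holds each frame ~41.6 ms dead-still; the LCD spends ~8 ms of
# that blending into the next frame, which reads as smoother.
```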
 
IMO people are too obsessed with the sample-and-hold effect.

That's just how we actually see in reality. Move anything and it blurs. Move your hand quickly in front of you and your fingers blur together. Spin a spoked wheel and the spokes blur out. That is just natural.

Putting a strobe effect on top is like wanting to walk around in reality with a strobe light, getting rid of that terrible sample-and-hold effect that reality has.

As long as the display itself isn't smearing/ghosting, I am happy.

I remember my first LCD. It was a Dell U2405 PVA display. Now that was something to complain about. A smeary, ghosting mess.

Though OLED can do dark frame insertion to give that strobing effect, for those so obsessed.
 
Real life has infinite frame rate and if you follow an object with your eyes there is no blur.

If you follow an object on a sample and hold display with your eyes it will be blurry because it's making small steps instead of smooth infinite steps.

The real fix is infinite frame rate, but we don't have the technology to do it. Strobing reduces the blur by at least removing some of the bad data between frames, letting our eyes/brain fill them in without blur.
 
So do very high frame rates at high Hz... in the future most likely via interpolation, but there are other interpolation-like tricks that consoles and especially VR systems already use.

You'd need 1000fps (using interpolation technologies, of course) at 1000Hz for essentially "zero" display blur, like a CRT, but you get really big gains at very high fps even at 240Hz and 480Hz (with 240fps and 480fps filling them).
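Translated into on-screen smear (the pan speed is an arbitrary example):

```python
# Persistence as smear: blur (px) ~= tracking speed (px/s) x frame
# visibility (s). 960 px/s is just an example pan speed.
speed_px_per_s = 960
for fps in (60, 120, 240, 480, 1000):
    blur_px = speed_px_per_s / fps       # pixels traversed while one frame is held
    print(f"{fps:4d} fps sample-and-hold: ~{blur_px:4.1f} px of eye-tracking blur")
# 60 fps -> 16 px of smear; 1000 fps -> ~1 px, i.e. effectively "zero".
```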

[chart: motion blur from persistence on sample-and-hold displays]


Strobing has some big tradeoffs.

BFI problems/trade-offs are severe in the tech's current state
-----------------------------------------------------------------------------------

- BFI/strobing has to be at a 120Hz flicker rate or higher or people will get PWM eyestrain/fatigue. I don't know if this can be done without interpolation to increase effective frame rates to match the strobing... which in the current generations usually produces artifacts and adds a lot of input lag.

- BFI or strobing also has to start from a much higher brightness, otherwise SDR color/brightness will be much dimmer and HDR color volume is thrown out entirely. This is a big problem for all displays, but especially for the already dimmer SDR and ABL-capped HDR OLEDs. I've read reports that strobing can cut the display's color brightness/overall brightness by 2/3 (see the sketch after this list).

- BFI has limited motion definition gains unless you are running higher fps (impossible from a non-HDMI 2.1 GPU to HDMI 2.1 4K 120Hz TVs) or using motion interpolation to get a soap opera or VR timewarp/spacewarp effect.
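On the brightness point in that list, the duty-cycle arithmetic looks like this (the 400 nit peak is an assumed SDR panel, not a measurement):

```python
# Average light output scales with the strobe duty cycle, so the panel
# must drive much brighter to compensate. Illustrative numbers only.
panel_peak_nits = 400
for duty in (1.0, 0.5, 0.33, 0.25):
    apparent = panel_peak_nits * duty
    needed = panel_peak_nits / duty      # drive needed to keep 400 nits apparent
    print(f"duty {duty:4.0%}: {apparent:5.1f} nits apparent, "
          f"needs {needed:6.1f} nits of drive to compensate")
# A ~1/3 duty cycle loses ~2/3 of the brightness -- the figure cited above.
```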

https://forums.blurbusters.com/viewtopic.php?t=5262

This means:
- A non-strobed OLED has identical motion blur to a fast non-strobed TN LCD (a model with excellent overdrive tuning).
- An OLED with a 50%:50% on off BFI will reduce motion blur by 50% (half original motion blur)
- An OLED with a 25%:75% on off BFI will reduce motion blur by 75% (quarter original motion blur)

Typically, most OLED BFI is only in 50% granularity (8ms persistence steps), though the new 2019 LG OLEDs can do BFI in 25% granularity at 60Hz and 50% granularity at 120Hz (4ms persistence steps)

Except for the virtual reality OLEDs (Oculus Rift 2ms persistence), no OLEDs currently can match the short pulse length of a strobe backlight just yet, though I'd expect that a 2020 or 2021 LG OLED would thus be able to do so.
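The persistence math for the quoted modes works out like this:

```python
# Persistence = on-time per refresh = duty cycle x refresh period.
cases = [
    ("60 Hz, 50%:50% BFI", 60, 0.50),
    ("60 Hz, 25%:75% BFI", 60, 0.25),
    ("120 Hz, 50%:50% BFI", 120, 0.50),
]
for name, hz, duty_on in cases:
    persistence_ms = duty_on * 1000 / hz
    print(f"{name}: {persistence_ms:.2f} ms persistence")
# -> 8.33, 4.17 and 4.17 ms: the "8 ms" and "4 ms" steps in the quote.
```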

Vega from that thread:
I've tested this on my LG C8 OLED versus my 165 Hz 1440p TN. With the OLED set to 1080p 120 Hz, the OLED pixels are so fast (sample and hold in this case) that I can see each individual frame. It doesn't "appear" as smooth as say the TN panel set to 1080p/120 Hz because even a fast TN will "smear" the images together. OLED doesn't blur one frame to the next.

To me, seeing as OLED pixels are so darn fast, if kept sample-and-hold, the refresh rate even needs to be higher than LCD to get that silky smooth fast refresh feeling.

https://www.blurbusters.com/faq/motion-blur-reduction/
Sometimes blur reduction looks very good — with beautiful CRT-style motion clarity, no microstutters, and no noticeable double images.

Sometimes blur reduction looks very bad — with distracting side effects such as double images (strobe crosstalk), poor colors, very dim, very microstuttery, flickery.


You need an extremely high frame rate in order to keep a high minimum frame rate unless you are using some kind of interpolation. Running lower strobe rates in order to accommodate lower frame rate graphs means worse PWM flicker.
Strobing visually looks highest-quality when you have a consistent frame rate that match the refresh rate. If your framerate slows down, you may see dramatic microstuttering during Blur Reduction.

In some cases, it is sometimes favourable to slightly lower the refresh rate (e.g. to 85Hz or 100Hz for ULMB) in order to allow blur reduction to look less microstuttery — by more easily exactly matching frame rate to a lower refresh rate — if your GPU is not powerful enough to do consistent 120fps.

Frame rates lower than the strobe rate will have multiple-image artifacts:

  • 30fps at 60Hz has a double image effect. (Just like back in the old CRT days)
  • 60fps at 120Hz has a double image effect.
  • 30fps at 120Hz has a quadruple image effect.

[chart: inadequate frame rates at higher strobing Hz]
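The duplicate-image counts fall straight out of the ratio of strobe rate to frame rate:

```python
# Each unique frame gets flashed refresh_hz / fps times.
for fps, hz in ((30, 60), (60, 120), (30, 120)):
    print(f"{fps} fps at {hz} Hz strobing -> {hz // fps} flashes per unique frame")
# -> 2, 2 and 4: the double- and quadruple-image effects listed above.
```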

Even if you had VRR + BFI/strobing to match the strobe rate to the frame rate, riding a roller coaster of different strobe lengths into lower strobe rates and back due to your varying frame rate would be eye-fatiguing, and anything under 120Hz strobing/blanking is bad in the first place IMO.

You'd also be running lower frame rate ranges at 4K resolution, with no hope of keeping a raw frame rate high enough to stay above 120Hz-matched strobes, unless you were running a very easy to render or old game that gets over 150fps average.
 
With OLED it looks so pristine it looks artificial; it doesn't have much depth, so it all looks the same, one pristine image after another pristine image. Even CRT monitors looked better than LCD monitors. I'm really impressed with the 24" Samsung VA I picked up last week. I went past the OLED monitors at Walmart, then I went past the QLED HDR Samsungs, and the image just looks better on the Samsung QLED TV. OLED is perfection, but it doesn't have much character. I assume most OLED panels will look the same without much variation.

Character? High refresh rate OLED is an uncaged ANIMAL!

C6, C7, C9, AW5520QF... even the short-lived 30" Dell OLED... I tested them all, and the only one which disappointed me was the 30", because they bungled it with 60Hz and a weird 120Hz strobe.

Up until now, there has never been a native 4K resolution 120Hz OLED with VRR... but the AW5520QF changed all of that and is an absolute freaking beast. Yeah it's expensive, yeah the LG TVs represent a better value if/when HDMI 2.1 support ever comes out... but the AW5520QF gives us a glimpse into the future, and it is insane. It destroys LCDs for gaming... simply destroys them.

I don't know about character, but high refresh rate OLED has a very unique feel to it; there is nothing else like it (not even the FW900 on the floor of my closet)... it's really incredible and needs to be experienced. You will never look at gaming LCDs the same.
 
The closet? Better than out on the lawn I guess. :) (I've actually been hearing some good things about the FW900 lately.)

Wouldn't it be better to wait for next year's LG OLEDs with a possible solution for motion? (120 Hz BFI?)
 

BFI is no good unless it's paired with VRR, and nobody has successfully pulled that off yet. That recent ELMB VRR VA panel was a disaster.
 
The second is harder to explain, probably because I’m too much of a noob to use the right terms to describe it. But eventually, when I scroll through text rapidly, the text seems to leave a trail behind it. And unlike the smooth natural blur my eyes perceive in moving objects in real life (and text in LCDs), this trail is awfully discrete and stutter-y. I don’t tend to consider my eyes good enough to see this stuff usually, but this stuck out like a sore thumb.
Mobile OLEDs have smear. My Note 9 smears worse than VA at low brightness :/

I understand that this is more of a TFT issue rather than an OLED issue though.
 
Wouldn't it be better to wait for next year's LG OLEDs with a possible solution for motion? (120 Hz BFI?)

BFI is no good unless it's paired with VRR, and nobody has successfully pulled that off yet. That recent ELMB VRR VA panel was a disaster.


I don't know that BFI would be any good without the really good interpolation of the future, VR warping tricks, or console quasi-4K tricks on PC games to keep the frame rate at 100Hz/100fps or better. Really, 120Hz strobing in my opinion is the minimum for PWM/strobing/BFI not to cause eye fatigue over long periods of regular use, and anything under 100 is really bad IMO. It's basically PWM.

If you could use VRR for a 1:1 frame-rate-to-Hz synced strobe, your frame rates at 4K would mean varying strobe lengths and much lower strobe/blanking rates across a demanding game's frame rate graph (e.g. an 80fps game might be a 50-80-120 roller coaster of strobes; 60fps at 4K might be 30-60-90). Maybe on a simple game with an extreme frame rate it could work, but then you wouldn't need VRR in the first place. Also, at 100fps on a high-Hz monitor you are already cutting the blur by about 40%, and at 120fps you are cutting it by 50% (again, interpolation and/or console quasi-4K tricks would help here).
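That roller coaster in numbers, with a made-up fps trace and assuming one strobe per displayed frame:

```python
# With VRR-synced strobing, the flicker rate IS the frame rate, so a
# swinging fps graph swings the flicker right along with it.
fps_trace = [50, 80, 120, 60, 95]
for fps in fps_trace:
    print(f"{fps:3d} fps -> {fps} Hz flicker, {1000 / fps:5.1f} ms between strobes")
# Everything under ~100-120 Hz flicker is PWM-eyestrain territory.
```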
- An OLED with a 50%:50% on off BFI will reduce motion blur by 50% (half original motion blur)
- An OLED with a 25%:75% on off BFI will reduce motion blur by 75% (quarter original motion blur)

Typically, most OLED BFI is only in 50% granularity (8ms persistence steps), though the new 2019 LG OLEDs can do BFI in 25% granularity at 60Hz and 50% granularity at 120Hz (4ms persistence steps)

Except for the virtual reality OLEDs (Oculus Rift 2ms persistence), no OLEDs currently can match the short pulse length of a strobe backlight just yet, though I'd expect that a 2020 or 2021 LG OLED would thus be able to do so.

If you don't use VRR and your frame rate doesn't match or exceed the strobe/BFI rate across the entire graph, it can get really messy with microstutter, flicker, and double images:
  • 30fps at 60Hz has a double image effect. (Just like back in the old CRT days)
  • 60fps at 120Hz has a double image effect.
  • 30fps at 120Hz has a quadruple image effect.
It also reduces the overall screen brightness by up to 2/3, which can mute colors and brilliance on an already dimmer-running OLED, and it obviously isn't very compatible with HDR.
 
Character? High refresh rate OLED is an uncaged ANIMAL!

C6, C7, C9, AW5520QF... even the short-lived 30" Dell OLED... I tested them all, and the only one which disappointed me was the 30", because they bungled it with 60Hz and a weird 120Hz strobe.

Up until now, there has never been a native 4K resolution 120Hz OLED with VRR... but the AW5520QF changed all of that and is an absolute freaking beast. Yeah it's expensive, yeah the LG TVs represent a better value if/when HDMI 2.1 support ever comes out... but the AW5520QF gives us a glimpse into the future, and it is insane. It destroys LCDs for gaming... simply destroys them.

I don't know about character, but high refresh rate OLED has a very unique feel to it; there is nothing else like it (not even the FW900 on the floor of my closet)... it's really incredible and needs to be experienced. You will never look at gaming LCDs the same.

What will really bake their noodle is that, as a technology, OLED is actually capable of pulling 240Hz (4.17ms) without overdrive.

Most simply have not seen the true unlocked potential of OLED yet.
 

There isn't really such a thing as overdrive for OLED. The response time is microseconds. It will never be an issue. The refresh rate could be measured in kHz or MHz before pixel response became a bottleneck in image quality.

With an LCD you're supplying a specific voltage to get a specific state, and using overdrive to make it transition faster; then it changes back without power. The transitions typically take milliseconds or even tens of milliseconds. You're twisting a crystal around.

With OLED you simply provide the voltage and it's essentially instantly at the correct brightness. It's like turning on a light.
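For contrast, the classic LCD overdrive idea in sketch form (the gain constant is invented; real panels use per-transition lookup tables tuned at the factory):

```python
# Overdrive: request a value *past* the target so the slow liquid crystal
# lands on target within one refresh, then settle on the real value.
def overdriven_level(current: int, target: int, gain: float = 0.4) -> int:
    boosted = target + gain * (target - current)
    return max(0, min(255, round(boosted)))

print(overdriven_level(0, 128))    # -> 179: overshoot on the way up
print(overdriven_level(255, 128))  # -> 77: undershoot on the way down
# An OLED is simply driven to 128 directly; there is nothing to compensate.
```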
 

Yeah I get that, my point was that the response time capabilities of OLED far exceed any standard currently available. It's funny to me when I read that, for example, Eizo's new OLED monitor, with JOLED as the panel manufacturer, is quoted at a 0.04ms response time... but it's a 60Hz panel, haha.

I think a lot of people (as in, not [H] folks) read "0.04ms 60Hz panel" and don't realize the panel is being heavily restricted in terms of capability. We won't really see OLED unleashed until HDMI 2.1 / DP 2.0 arrive.
 

Yeah, I would love it if the LG TVs supported 1080p at 480Hz. They have enough bandwidth with HDMI 2.1.
 
Theoretically, but there are still limitations in reality.

https://forums.blurbusters.com/viewtopic.php?t=5262

- An OLED with a 50%:50% on off BFI will reduce motion blur by 50% (half original motion blur)
- An OLED with a 25%:75% on off BFI will reduce motion blur by 75% (quarter original motion blur)

Typically, most OLED BFI is only in 50% granularity (8ms persistence steps), though the new 2019 LG OLEDs can do BFI in 25% granularity at 60Hz and 50% granularity at 120Hz (4ms persistence steps)

Except for the virtual reality OLEDs (Oculus Rift 2ms persistence), no OLEDs currently can match the short pulse length of a strobe backlight just yet, though I'd expect that a 2020 or 2021 LG OLED would thus be able to do so.

That quote applies to frame rate and max Hz just as much as to BFI.
Like gan7114 said, and as that quote above says, 2019 OLEDs are capable of 4ms persistence, which is a 75% blur reduction with BFI, but it would also be a 75% blur reduction at 240fps on a 240Hz OLED without using BFI, if such a thing as a 240Hz consumer OLED existed. It also says that in the following years, 2020 or 2021 OLED TVs should hit 2ms, which equates to 7/8ths of the blur eliminated compared to a 60fps-at-60Hz baseline smear, and which outside of BFI would require a 480Hz consumer OLED running 480fps.



The conclusion that article and most others come to is that we are eventually going to need a really advanced interpolation implementation and very high-Hz monitors, in order to multiply frames several times each so that we can hit very high frame rates to fill the high Hz. That should be a much better solution in the future than running BFI with its really bad tradeoffs, or trying to squeeze slightly higher graphics settings straddling a moderately bumped frame rate with VRR on demanding games, especially on very high resolution monitors. As it is we've just hit 120Hz 4K and are pushing the limits of the display connection bandwidths yet again, but there are lower-resolution 1080p 1ms TN monitors capable of 240Hz that can fit (assuming your game is capable of feeding them 240fps).

-------------------------------------------------------

Regarding these speeds, keep in mind that OLEDs are so fast they are already outpacing the hold time of frame rates. Essentially they can be "too fast" if not adjusted for somehow. This can appear stuttery, especially with low frame rate content (like 24fps movies), since there is no way to smooth the transitions (where there would otherwise be moderate response-time blur on other displays to smooth things over) unless you use interpolation. At least with 24fps you could theoretically use a clean multiple interpolation of 24fps x5 to hit 120fps constantly, but in games that goes out the window due to the frame rate graphs and the unacceptable input lag added by current forms of interpolation. The Oculus Rift, however, was using a tech which cut any frame rate beneath 90fps down to 45fps and doubled it in order to constantly feed its 90Hz screen so people wouldn't suffer motion sickness, and there are some timewarp etc. techs for VR. Anyway, once again, some kind of interpolation tech is going to be needed eventually for gaming displays.
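A sketch of that Rift-style fallback, just the decision rule (no real compositor API here, and 90Hz is the Rift's display rate):

```python
# If rendering can't hold the display rate, halve it and synthesize
# every other frame (reprojection), rather than letting fps free-fall.
def plan_frame_delivery(render_fps: float, display_hz: int = 90):
    if render_fps >= display_hz:
        return display_hz, "native frames every refresh"
    return display_hz // 2, "render at half rate, reproject the in-between frames"

for fps in (95, 72, 50):
    rate, plan = plan_frame_delivery(fps)
    print(f"renderer at {fps} fps -> lock to {rate} fps: {plan}")
```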

Vega from that thread:
I've tested this on my LG C8 OLED versus my 165 Hz 1440p TN. With the OLED set to 1080p 120 Hz, the OLED pixels are so fast (sample and hold in this case) that I can see each individual frame. It doesn't "appear" as smooth as say the TN panel set to 1080p/120 Hz because even a fast TN will "smear" the images together. OLED doesn't blur one frame to the next.

To me, seeing as OLED pixels are so darn fast, if kept sample-and-hold, the refresh rate even needs to be higher than LCD to get that silky smooth fast refresh feeling.

That was the first complaint I'd heard of it at higher Hz, but there are a lot of people who don't like the stutter effect of the incredibly fast OLED response times at lower frame rate ranges, especially for movies. If a really high quality interpolation tech were developed for gaming, we could fill every Hz with a frame, which would go a long way toward minimizing or eliminating these problems (sample-and-hold blur, stutter, and the inability to supply enough frames at very high resolutions).
 
Current OLED sets support either 4K 60Hz or 1440p 120Hz, since we don't have HDMI 2.1 cards yet. I wonder if next-gen sets will allow for 1440p 240Hz in addition to 4K 120Hz; that would be awesome.
 
BFI is no good unless it's paired with VRR, and nobody has successfully pulled that off yet. That recent ELMB VRR VA panel was a disaster.

Honestly, it doesn't need to be a disaster. Just allow the user to adjust the panel overdrive while in ELMB mode and you're right as rain. ARE YOU LISTENING, ASUS?!!
 
I hope so, but I don't think they will. Supporting a 120Hz signal was easy since the panel already did it for their smooth motion interpolation. Going even higher would require a lot more work, because they'd have to make the panel capable of it.

But they worked with Nvidia to add G-Sync compatibility, so they at least have some interest in gamers, and that gives me hope.
 

I mean, they've got to do something to entice people to upgrade from older OLED sets as well. They can either try to boost the brightness, which is known to be one of the cons of OLED for HDR content, or make the panel 240Hz capable and use that as marketing: "hey, now we have a 240Hz capable OLED!!! Upgrade from your old 120Hz set today!" And making 240Hz seems like an easier task than trying to overcome the brightness obstacle. I don't see many improvements being made elsewhere that would entice people to upgrade to the newer sets.
 
I'd rather they worked on advanced interpolation with negligible lag and no artifacts. 240Hz isn't very useful to me unless I am feeding it over 200fps. While there are some easy to render games that can do that even at 4K, most of the more demanding games are lucky if they can do 100fps average at 4K, and that's with a few over-the-top settings turned off. Higher-rez 5K, 8K, etc. down the road are pretty much going to have to do something about it. Consoles did some 4K checkerboarding and used dynamic rez to keep even lower console fps solid, and VR does warp tricks and/or cuts anything under 90fps down to 45 and doubles it back to 90, I think, so it's needed in those segments too.
 
Is that even technically possible to do at all? I know the Samsung TVs can do 60/120fps interpolation for games, but it obviously adds lag and artifacting. I don't think this is even possible, seeing as how filling in the missing frames is going to require some additional processing, which will mean lag. It just doesn't sound technically possible to do, IMO.
 
It is possible to do lagless interpolation using only the last 2 (or more) frames already rendered (i.e. extrapolation), but it is less accurate than interpolating between 2 frames.
 
If the tech concentrated on duplicating an already decent frame rate, say 75fps x3, or 100 x3 or even x4 or x5, and up to 100 x10, it would probably be a lot easier than manufacturing/guessing tween frames. You wouldn't get any increased motion definition past your root frame rate, but feeding a much higher Hz display its full frame rate, even as "duplicate pages in an animation flip book", would still give the massive motion clarity benefits (blur reduction).
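The flip-book arithmetic from the paragraph above, i.e. how many times each rendered frame would repeat to fill a high-Hz panel:

```python
# Repeats per unique frame when filling a high-Hz panel from a lower
# root frame rate. (Repeats alone add no motion definition past the
# root rate, as noted above.)
def repeats_per_frame(root_fps: int, panel_hz: int) -> int:
    return panel_hz // root_fps

for root, hz in ((75, 240), (100, 480), (100, 1000)):
    print(f"{root} fps on a {hz} Hz panel -> each frame shown {repeats_per_frame(root, hz)}x")
```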


https://www.blurbusters.com/blur-bu...000hz-displays-with-blurfree-sample-and-hold/
 
Most enthusiasts use more than one monitor. Monitors good for gaming are usually not good for productivity and vice versa. For ages there have been multiple monitors on my desk because there is no single solution that covers all bases.

LCDs are good for workhorse productivity and OLEDs are ideal for entertainment consumption. Currently OLED products suffer from the Goldilocks problem: too small or too big. I believe there is a nice-sized market for a 32" OLED 4K120 VRR at the $2,000-3,000 price point.

Burn-in is a poor term for OLED's problem; it should really be called "wear-in", because pixels that get used more intensely wear faster than others, which produces the burn-in-looking effect. If you shuttle the workhorse load off onto an LCD and are mindful of taking care of the side OLED (no taskbar, no icons, etc.), you can get a decade of problem-free entertainment consumption use from the OLED.

LOL, "wear-in" is such a silly, dumb term. You have zero originality.
 
Seriously, based on your post history, you have added absolutely nothing to any discussion. All I see is drive-by comments like this. If you don't have anything meaningful to say, just don't post, or go troll somewhere else.
 