HDR is very hard to implement on small screens my ass!!

Andyk5

I have been shopping for an OLED HDR computer monitor for about a year now. After using the 65" LG OLED HDR for television I was hooked. Every review site and so-called tech expert pointed out that small monitors were going to take a long time to get HDR, since HDR10 is hard to implement on small screens; that's why it came to 65 and 55 inch TVs first. Now we are expected to pay $1200-$1300 for 24-27 inch HDR 144hz monitors that are not even OLED, just because the almighty display industry allowed us to have HDR in computer monitor format.


Enter the iPhone X: 5.8 damn inches, OLED, HDR10. What happened to the whole "HDR is hard to implement on small screens" idea?
 
What happened is OLED.

HDR is easy on OLED since you can adjust the brightness of each pixel individually.

HDR is hard on LCD because the brightness of the backlight sets the minimum black level. Doing HDR on an LCD requires splitting the backlight up into hundreds of separate zones - the repeatedly delayed 27" 4k 144hz HDR screens use 384 of them - without letting light from one zone bleed into adjacent ones.
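
To make the zone idea concrete, here's a toy sketch (purely illustrative - the function and the max-per-zone rule are my own simplification; real FALD controllers are a lot smarter about halos and temporal behavior):

```python
import numpy as np

# Toy local dimming: drive each backlight zone to the brightest pixel it
# covers, then let the LCD layer attenuate back down to the target image.
def local_dimming(frame, zones=(16, 24)):          # 16*24 = 384 zones
    """frame: HxW target luminance in [0, 1]; returns zone levels + LCD map."""
    h, w = frame.shape
    zh, zw = h // zones[0], w // zones[1]
    backlight = np.zeros(zones)
    for r in range(zones[0]):
        for c in range(zones[1]):
            backlight[r, c] = frame[r*zh:(r+1)*zh, c*zw:(c+1)*zw].max()
    # Nearest-neighbour upsample of the coarse zone grid to pixel resolution.
    bl_full = np.kron(backlight, np.ones((zh, zw)))
    # LCD transmission compensates for the coarse backlight where it can.
    lcd = np.divide(frame, bl_full, out=np.zeros_like(frame),
                    where=bl_full > 0)
    return backlight, lcd
```

One bright pixel forces its entire zone up, lifting the blacks around it - that blooming is exactly why zone counts like 384, and the bleed control between zones, matter.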

Big OLED screens are expensive because OLED is really expensive per square inch and for a computer monitor or TV you've got a huge number of square inches vs a phone or tablet.
 
Ok, that's fair. So if they don't insist on using LED-backlit LCD, we could have many other HDR computer monitors to choose from. A 55 or 65 inch OLED TV is about $2000 and a 5.8 inch OLED-screened all-in-one computer/tablet is $1000, so why can't we have 24, 27, or 30 inch OLED HDR monitors? I don't understand. What is so magical about the computer monitor size, that very small or very big OLED can be done but medium size is very hard...
 

The desktop market is always last to get technologies like these because it's the smallest market. Samsung is going to charge Apple $120 or $130 PER PANEL for their tiny 5.8" OLED panels. If you could make a 27" display out of 5.8" phone displays (and you can't, because as the display size goes up the yield goes down, since you need a bigger panel without defects), just the panel cost alone would be $2600. Dell built a 30" OLED desktop display and priced it at $3500, which was probably a firesale price to get rid of the few units they had, since it was only on the market for about a month.
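
Back-of-the-envelope version of that math (assuming the same aspect ratio, so panel area scales with the diagonal squared - the figures are just the ones quoted above):

```python
# Area-based panel cost scaling, using the quoted ~$120-130 per 5.8" panel.
phone_diag, monitor_diag = 5.8, 27.0       # inches, same aspect ratio assumed
cost_per_phone_panel = 120.0               # USD (low end of the quoted range)

area_ratio = (monitor_diag / phone_diag) ** 2      # ~21.7x the screen area
panel_cost = area_ratio * cost_per_phone_panel
print(f"~{area_ratio:.1f}x area -> ~${panel_cost:.0f} in panel cost alone")
# ~21.7x area -> ~$2600, before the yield penalty of needing one large
# defect-free panel instead of ~22 small ones.
```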

When you can buy a cheap LCD desktop monitor for a couple of hundred dollars, the market for monitors that cost thousands is very small, and the margins are low because the panels are so expensive, so there's little incentive to sink R&D time into building a $3-5K PC desktop monitor that will sell a few thousand units.

LG may eventually build PC desktop panels. It's true that they've had success selling large TV-sized panels, but the market for those is much bigger: many, many more people are willing to pay $3-5K for a 65" TV than for a smaller monitor, and volume OLED panel production is still limited. LG themselves are only just starting to get into building and selling phone-sized panels, and they've had some quality issues.

It all comes down to where companies think they can make money, and premium PC desktop monitors are not that place for LG or Samsung. They'll eventually get there, but it's not a priority since there are more important markets to serve first.
 
Screen burn-in. The reason you are less likely to see an OLED monitor anytime soon is screen burn-in. A TV or even a phone screen rarely sits on a static image, but a desktop can have static images sit on it for extended periods of time.
 

Lots of people use the LG OLED TVs as computer monitors or game on them for hours at a time and don't see any issues with burn-in. It does apparently happen, but rarely, and I doubt this is a serious barrier for a consumer display. However, it might be a barrier for professional displays, which are used for many more hours before replacement and show a lot more static UI content. Professionals are the most profitable target market for an expensive $3000+ display, so it's possible this is why we won't see OLED monitors until they are cheap enough to manufacture to target gamers (a much lower-end market than professionals).
 
That was my point. We are more likely to see more LED-backlit LCD HDR displays first.
 
You can get an amazing brand new 55 inch 4k OLED TV for $2000 today (before any sales). It is unlikely that a monitor-sized OLED would cost less if made today (economies of scale). Not many people would buy a 30 inch monitor for $2000 when they could get a better quality 55 inch screen for the same price.

Be happy that modern 4k TVs are now including computer (monitor) friendly features. It allows us to get large high quality screens for way less money than ever before. The screen becomes more of a "display wall" rather than a small monitor. It alters the way you use the computer.
 
https://hardforum.com/threads/am-i-...-3-feet-away-from-me.1943328/#post-1043203788
27" viewed near from a fully adjustable desk chair is not a small gaming viewport as some make it out.

A good monitor arm makes a big difference. There's a gif made from a LAWRENCEcanDraw Ergotron LX monitor arm review demonstrating this.

*Note that the 21:9 shown in the comparison is based on a 34" 21:9, not a 35" (or 38"). At desk/arm-length distances, 40"+ monitors blow the gaming viewport up well past your focal view.



I'm holding out for the PG35VQ: 35" UW, 1000nit, QD, HDR, 200hz, G-sync, 3440x1440.

The difference with 21:9 content is that the sides (+440 pixels on each side vs 2560x1440) actually add game world real estate for immersion, rather than zooming the same 16:9 viewport and scene contents jumbo-sized out of the bounds of your focal view. You could technically run 21:9 resolutions on a 4k 16:9 with bars though.
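
For anyone who wants numbers on that: assuming the game scales Hor+ (vertical FOV held fixed, horizontal grows with aspect ratio - common behavior, but it varies per game), the widescreen gain looks like this:

```python
import math

# Horizontal FOV for a fixed vertical FOV at different aspect ratios (Hor+).
def hfov_deg(vfov_deg, width, height):
    vfov = math.radians(vfov_deg)
    return math.degrees(2 * math.atan(math.tan(vfov / 2) * width / height))

vfov = 59.0  # illustrative vertical FOV (~90 deg horizontal at 16:9)
print(f"16:9 (2560x1440): {hfov_deg(vfov, 2560, 1440):.0f} deg horizontal")
print(f"21:9 (3440x1440): {hfov_deg(vfov, 3440, 1440):.0f} deg horizontal")
# ~90 deg -> ~107 deg: the extra 440 columns per side render new scene,
# not a magnified version of the same 16:9 image.
```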

PG35VQ trade show gallery

After HDR content (on a truly HDR-capable monitor, especially one enhanced by a high density FALD backlight), all other monitors are going to look narrow-band by comparison. Cameras can take HDR photos (RAW) too, so even for photo/still work and appreciation, all non-HDR monitors are going to be left behind within the next few years. Any HDR data shown on a narrow SDR range will clip, crushing to white or to mud-dark once it hits the SDR monitor's limits, instead of continuing to show the color volume throughout the HDR luminance range.

Like others have mentioned - I also suspect that OLED fading is a problem and will affect color accuracy, as will IR/burn-in concerns on computer monitors/desktops. Another thing to remember is that while the UHD Premium standard for LCD is 1000nit peak and .05 black depth, that means AT LEAST 1000nit / .05. They make an exception for OLED (roughly 540nit peak / .0005 black depth), but 1000nit is a real HDR minimum considering HDR movies are mastered at 4000nit. Whenever a display hits its peak luminance or darkest black depth, it clips - crushing blacks to mud and bright color highlights to white instead of showing the increased HDR color volume throughout. The OLED will go a lot darker, but at a 500 or 700nit peak it will clip to white the top 50% to 30% of the bright color highlights that a 1000nit display could show. 700 is a lot better than 500 though, for sure. However, from the avsforum quotes below: "So what if an OLED measures 600 or 700 nits peak brightness with a white window? They have an extra big fat white subpixel they are using to cheat the numbers higher. But this will dilute color volume (and hurt resolution) compared to RGB displays."
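
A toy version of that clipping behavior (a naive hard clamp - real displays tone-map with a roll-off near peak, but the lost highlight detail is the same idea):

```python
import numpy as np

# Highlight luminances as mastered (nits) vs. panels with different peaks.
highlights = np.array([300, 600, 900, 1500, 4000])

for peak in (500, 700, 1000):
    shown = np.minimum(highlights, peak)       # everything above peak clips
    print(f"{peak:>4} nit panel: {shown} (clipped: {highlights > peak})")
# All detail above the panel's peak lands on the same white value, so bright
# color highlights flatten out; the higher the peak, the more of the mastered
# 4000 nit range survives.
```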

I've wondered if OLED has to stay at around 500nit (700nit HDR in 2017) because of the fade/burn-out weakness, not just a hard limitation of the tech's brightness capability. As I understand it, LG gets around individual color fading by using all-white OLED subpixels, sort of like a per-pixel FALD array, but perhaps that just allows them to fade more uniformly (though still less uniformly on static image planes, like desktop use?). Dell pulling their OLED monitor seems telling.

It's fine if you are ok with a TV as a monitor, but it's missing out on all of the other modern gaming advancements, so it is a huge trade-off of features.
High hz (motion blur reduction, plus increased motion definition and path articulation) and g-sync (avoiding screen aberrations and having to use v-sync) are a necessity for gaming aesthetically imo, and the response times of monitors are 5 to 8ms on ips, while even low-lag game-mode TVs are usually around 15-25ms. The 4k TVs might do 120hz eventually at least.. a few can do native 120hz at 1080p right now.
FALD explained (cnet.com) | youtube.com: edge lit LED uncovered | FALD uncovered | Vizio Reference FALD versus Samsung HU8550 Edgelit Array

HDR Explained

High Hz + High FPS motion clarity and motion definition benefits explained

G-sync Explained (Blurbusters.com)

Personally I'd guess that a true full-featured OLED gaming monitor is still several years off, if ever. In the meantime, for a few years anyway - a high density FALD, QD-filtered P3 color, 1000nit HDR, 3440x1440 VA with g-sync, kept tight from around 100fps-hz to 120fps-hz (40-50% blur reduction and 5:3 to 2:1 motion definition increase vs 60fps-hz), sounds like the way to go for me.
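
The blur and motion definition figures there are just frame-time math, if anyone wants to check them (using sample-and-hold persistence - one frame time on screen - as a stand-in for motion blur):

```python
# Persistence blur and motion definition vs a 60fps-hz baseline.
base = 60.0
for hz in (100.0, 120.0):
    blur_reduction = 1 - base / hz     # shorter frame hold = shorter smear
    motion_def = hz / base             # ratio of unique frames per second
    print(f"{hz:.0f}fps-hz: {blur_reduction:.0%} less persistence blur, "
          f"{motion_def:.2f}:1 motion definition vs 60fps-hz")
# 100fps-hz -> 40% / 1.67:1 (i.e. 5:3); 120fps-hz -> 50% / 2:1.
```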



Some other interesting stand-outs about LG OLED TVs from http://www.avsforum.com/forum/40-ol.../2814081-rtings-review-up-2017-lg-oled-4.html
Several reviews of 2016 LG OLEDs recognized their above-black shadow detail, dithering noise, and blockiness issues and mentioned them in their reviews. Even LG has admitted the weaknesses of their 2016 OLEDs, claiming to have addressed these issues on their 2017 models, boasting of higher bit-depth allocation, a new decontour filter, and a revamped dithering algorithm.

Shadow detail isn't OLED's strongest suit, but all of the LGs were still very good in this area after calibration to fix the default settings' crushed blacks.

Considering the C7 has a whopping 67% higher peak brightness than the B6 on the SDR Sustained 50% Window, I'm disappointed that Rtings didn't test the SDR Sustained Window at anything between 50 and 100%.

This is particularly important to me as a heavy PC user, where the likes of a maximized explorer window or web browser would have a white amount equivalent to an 80 to 90% window.
So what if an OLED measures 600 or 700 nits peak brightness with a white window? They have an extra big fat white subpixel they are using to cheat the numbers higher. But this will dilute color volume (and hurt resolution) compared to RGB displays.

So peak luminance numbers do not tell the whole story. You have to actually WATCH with your eyes to see what looks better, side by side.

As others pointed out, what kind of measurement does Rtings have to measure dancing pixels/dithering noise/macroblocking on an LG OLED?

They can run some motion tests, which are all well and good, but are their generic tests going to pick up the tearing/frame-skipping/stuttering and motion artifacts that plagued B6 OLEDs, which they gave an "excellent" motion rating?

You wouldn't know that from reading a Rtings review now, would you? They totally missed that. Try reading a real review from HDTVtest, where they talk about the noise and "dancing pixels" on Skyfall, "one of the best Blu-ray transfers of all time".

It's just more painfully obvious on lower-quality sources, but it can be seen on Blu-ray as well.

21ms lag with near-zero response time. 1080p 120 fps mode. All lag numbers are the same with 4:4:4 at all resolutions. These are numbers a pc gamer should not wait for. BTW, 120 fps cuts lag in half.
Personally I think 2017 is an off year for OLED.
Don't forget, HDMI 2.1 opens up 120hz at 4K (WITH HDR at 4:4:4), which means 3D-ready as well, which means you can use aftermarket IR-based 3D transmitters.

And with the super fast transition time of OLED pixels that should mean no crosstalk either.

The review makes it seem frame interpolation isn't available at 4K.

I also wonder if the 120hz works only at 1080p (2160p120 at 4:2:0 is also possible and within HDMI 2.0a's 18gbps bandwidth limit), and whether 1080p120 can also include HDR.
 
The 4k tvs might do 120hz eventually at least.. a few can do native 120hz at 1080p right now.

My understanding is that the main limitation at the moment is HDMI 2.0 running out of bandwidth. The panels could do 4K @ 120 Hz, but we need HDMI 2.1 to actually have enough bandwidth for that refresh rate to work. I'm still kinda hoping that Samsung offers replacement input boxes and firmware updates to make this possible on their current sets, but it's unlikely.
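
The raw numbers back that up. A rough sketch (active pixels only - blanking overhead makes the shortfall worse; the ~14.4 and ~42.7 Gbps payload figures come from HDMI 2.0's 8b/10b and 2.1's 16b/18b coding on their 18/48 Gbps links):

```python
# Uncompressed RGB/4:4:4 video data rate vs HDMI payload capacity.
def gbps(width, height, hz, bits_per_channel, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

hdmi20_payload = 14.4   # Gbps usable on HDMI 2.0 (18 Gbps raw link)
hdmi21_payload = 42.7   # Gbps usable on HDMI 2.1 (48 Gbps raw link)

for desc, rate in [("4K 60hz 8-bit 4:4:4  ", gbps(3840, 2160, 60, 8)),
                   ("4K 120hz 10-bit 4:4:4", gbps(3840, 2160, 120, 10))]:
    print(f"{desc}: {rate:5.1f} Gbps | fits 2.0: {rate < hdmi20_payload} "
          f"| fits 2.1: {rate < hdmi21_payload}")
# 4K60 8-bit (~11.9 Gbps) just fits HDMI 2.0; 4K120 10-bit HDR 4:4:4 needs
# ~29.9 Gbps of pixel data alone, hence waiting on HDMI 2.1.
```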

Next few years will be interesting for 4K HDR and high refresh rate stuff.
 