Why OLED for PC use?

Without brightness you don't have better images.

This is subjective and your opinion. It also depends on the range you're talking about. The SDR on this display is, in my opinion, a "better image" than anything FALD is currently capable of. It definitely beats the SDR image on the ProArt (the ProArt was fine, but it was a choice between blooming or less contrast, and the perfect blacks and viewing angles on this can't be beat for SDR content).

Now for HDR, I'd agree with you FALD is going to be better because of that extra brightness headroom. I never argued against that. I still think OLED can be pleasing and the extra contrast helps with that, though.
 
My environment has low to moderate brightness and 160 nits is generally plenty for me.




This is false. 160 nits is what I play games at. I'm playing Atomic Heart (which doesn't have native HDR) and it looks beautiful at 160 nits in the sRGB colorspace. I play SDR games on my TV at ~160 nits as well. I know some people prefer brighter, and that's fine; it depends on preference and room conditions, but saying "games in SDR require a much higher brightness than 150 nits" is not accurate.

As far as flickering, supposedly the new ASUS OLED (same panel as the LG) can do higher-nit SDR without a problem, but obviously I haven't tried it; it'll be interesting to see if there are any flickering/eyestrain reports.




Cyberpunk looks gorgeous on my display in HDR. It also has many settings to match HDR to your display. I'm sure it can get brighter on a FALD display, but it's certainly no slouch on OLED, especially if you're not comparing side-by-side.
It's your preference again? 160 nits sRGB looks like crap compared to 400 nits wide-gamut SDR. HDR300 looks like crap compared to HDR1400.
 
In your opinion. Lots of people view all SDR/sRGB content (movies, TV, games, photos, and work stuff too) at the same sRGB brightness range appropriate to their room. You may think it looks like crap, and you're welcome to your opinion, but wide gamut is not closer to reference (it's objectively farther from it), and the word "require" indicates some sort of objective standard you're citing, when in fact there is no such thing. Preference certainly comes into play, but it varies person to person.
 
You don't have an option to see the higher range.
A better display is the one that shows higher range.
 
It's your preference again? 160 nits sRGB looks like crap compared to 400 nits wide-gamut SDR. HDR300 looks like crap compared to HDR1400.

So tell me how this wide gamut SDR400 stuff works. I'm playing Shadow Warrior 3 on game pass and there is ZERO HDR support whether natively or through Auto HDR. I'm stuck with 120 nits SDR.
 
Without brightness you don't have better images.

There are many variables, the combination of which result in good image quality. And I mean MANY.

Peak brightness is probably not even in the top 100.

Sure, it is a nice-to-have for the content that supports it, but it is far from the be-all and end-all of display specs.

And again, it is not the peak brightness that really matters. It is the ratio between the darks and the peak brightness that results in the dynamic range. Since OLED screens have very good darks, they don't need as high peak brightness to have good dynamic range, and will generally look absolutely fantastic, as long as your room isn't too bright.
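
To put rough numbers on that (the nit values here are made-up illustrations, not measurements of any particular panel), contrast is just peak luminance divided by black level, so a dimmer panel with a much deeper black can still come out way ahead:

# Illustrative contrast-ratio arithmetic; the figures below are hypothetical.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Static contrast: peak luminance over black level."""
    return peak_nits / black_nits

# Hypothetical FALD LCD: very bright peak, but a little backlight leak.
fald = contrast_ratio(peak_nits=1000.0, black_nits=0.05)

# Hypothetical OLED: far dimmer peak, but a near-zero black floor.
oled = contrast_ratio(peak_nits=160.0, black_nits=0.0005)

print(f"FALD example: {fald:,.0f}:1")  # 20,000:1
print(f"OLED example: {oled:,.0f}:1")  # 320,000:1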
 
So tell me how this wide gamut SDR400 stuff works. I'm playing Shadow Warrior 3 on game pass and there is ZERO HDR support whether natively or through Auto HDR. I'm stuck with 120 nits SDR.
Turn on local dimming. Crank up brightness. Turn off YCbCr sRGB gamma. Use the YCbCr444 format in the Nvidia control panel. Then choose wide gamut. It's basically HDR400. It needs a competent HDR display to do it properly.
 
There are many variables, the combination of which result in good image quality. And I mean MANY.

Peak brightness is probably not even in the top 100.

Sure, it is a nice-to-have for the content that supports it, but it is far from the be-all and end-all of display specs.

And again, it is not the peak brightness that really matters. It is the ratio between the darks and the peak brightness that results in the dynamic range. Since OLED screens have very good darks, they don't need as high peak brightness to have good dynamic range, and will generally look absolutely fantastic, as long as your room isn't too bright.
I already said color is lit by brightness. It's the number one factor. You can have infinite contrast with a black, dim picture as long as it has 0 nits.
 
You don't have an option to see the higher range.
A better display is the one that shows higher range.

Higher range does not matter for SDR/sRGB, especially if you prefer accuracy. For many people, probably most, perfect contrast trounces anything FALD can add here. As I've said before, I have zero interest in viewing sRGB in a higher range, and my experiences with it (including with FALD) have backed up that I prefer native sRGB.

Yes, brighter is better for HDR, but again, that's the minority of content I currently consume.
 
Turn on local dimming. Crank up brightness. Turn off YCbCr sRGB gamma. Use the YCbCr444 format in the Nvidia control panel. Then choose wide gamut. It's basically HDR400. It needs a competent HDR display to do it properly.

All that kinda did was just make the colors look super saturated. Sorta like forcing BT.2020 on my LG CX. Does it look better? Well kinda, but I would say it looks more different rather than flat out better and I can see why some people would rather just prefer the original SDR presentation as that's how the game was originally made. Left is that tweaked wide gamut mode, right is the original sRGB look. The left definitely has more color to it but in person it looks very oversaturated. I had to turn the brightness down because 400 nits was just too much also.
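
That oversaturated look is what the math predicts. A minimal numpy sketch (rounded, approximate matrices and an arbitrary example color - nothing taken from either monitor): "choosing wide gamut" for sRGB content just hands the same code values to more saturated primaries instead of converting them, so every color gets stretched outward rather than shown more correctly.

import numpy as np

# Approximate linear-light RGB -> XYZ matrices (D65), rounded reference values.
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
P3_TO_XYZ = np.array([[0.4866, 0.2657, 0.1982],
                      [0.2290, 0.6917, 0.0793],
                      [0.0000, 0.0451, 1.0439]])

srgb_linear = np.array([0.8, 0.3, 0.2])  # arbitrary linear-light sRGB color

# Correct handling: sRGB -> XYZ -> P3, so the color looks the same on the wider gamut.
correct_p3 = np.linalg.solve(P3_TO_XYZ, SRGB_TO_XYZ @ srgb_linear)

# "Wide gamut stretch": reuse the sRGB numbers on the P3 primaries unchanged.
stretched_p3 = srgb_linear

print("P3 values that preserve the original color:", correct_p3.round(3))
print("P3 values when sRGB is simply stretched:   ", stretched_p3)
# The stretched version keeps the full channel separation on wider primaries,
# which reads as extra saturation, not extra accuracy.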
 

Attachments: 20230314_234757.jpg, 20230314_234915.jpg
All that kinda did was just make the colors look super saturated. Sorta like forcing BT.2020 on my LG CX. Does it look better? Well kinda, but I would say it looks more different rather than flat out better and I can see why some people would rather just prefer the original SDR presentation as that's how the game was originally made. Left is that tweaked wide gamut mode, right is the original sRGB look. The left definitely has more color to it but in person it looks very oversaturated. I had to turn the brightness down because 400 nits was just too much also.
You should use your X27 to do it. Even without wide gamut, YCbCr still increases contrast.
 
Higher range does not matter for SDR/sRGB, especially if you prefer accuracy. For many people, probably most, perfect contrast trounces anything FALD can add here. As I've said before, I have zero interest in viewing sRGB in a higher range, and my experiences with it (including with FALD) have backed up that I prefer native sRGB.

Yes, brighter is better for HDR, but again, that's the minority of content I currently consume.
That accuracy means worse images. Are you viewing sRGB at 80 nits? It's not my concern that you prefer to see worse images. A better display is the one that has more range regardless.
 
You're declaring this absolutely, when it's simply your own personal preference/opinion. sRGB in the correct color space is very much a *better* image to me. Oversaturated colors like those shown above can appear to pop more, but you can lose detail and get incorrect colors, which would be extremely undesirable to me.

I view sRGB slightly higher than the reference nit level because of my setup and my preference. It's not an absolute; it's an opinion of what looks best for the variables of my environment. Calibrating to my brightness level removes most of the inaccuracy this could introduce, and the rest is acceptable to me. If I were grading SDR in a completely dark room, 80 nits might make sense, but it doesn't here - the added tolerance for any light condition is worth it to me, while still preserving a high degree of accuracy thanks to the ability to calibrate to my specific nit level (as well as gamma). But I'd never declare 160 nits the ONLY brightness level you can view SDR at, or this specific brightness level objectively better in all situations; it depends on each user's variables and preferences.

A (calibrated) custom brightness is a far cry from declaring sRGB should always be in a wide gamut it's not intended for. And that's what you're doing - you're declaring that SDR/sRGB is objectively better in a wider gamut, when that's just not true. Just because YOU prefer it that way, doesn't mean it's better. But feel free to point out any game developers, artists, or content creators who actively encourage viewing their sRGB content in a different color space than it was made in.
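
For what it's worth, the sRGB curve itself is relative; choosing 80 vs. 120 vs. 160 nits just scales the whole curve to your room, and the relative tone reproduction underneath stays the same. A minimal sketch of that idea (standard IEC 61966-2-1 decode; the white levels and the mid-gray sample are arbitrary examples):

def srgb_eotf(v: float) -> float:
    """Piecewise sRGB decode: code value 0..1 -> relative linear light 0..1."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def output_nits(code_value: float, white_nits: float) -> float:
    """Absolute output for an sRGB code value at a chosen white level."""
    return white_nits * srgb_eotf(code_value)

for white in (80, 120, 160):
    # Same relative curve, just scaled to the chosen white level.
    print(f"{white} nit white: code 0.5 -> {output_nits(0.5, white):.1f} nits")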
 
Only higher range can have better images. That's all the displays are trying to do. The problem is you don't have that option to see it with an OLED. Why does OLED try so hard to get bright if a dim image just looks fine?
 
"Only higher range can have better images" is not a factual statement. If the image is sRGB, and you can display it just fine at your target brightness, it doesn't get "better" than that, except based on your preference/opinion (which adds inaccuracy). HDR is a different story, but that's not what we've been talking about here.

You already know this... Increasing brightness for OLED is for brighter environments where it could come in handy for SDR, and obviously for more impactful HDR. Again, nobody has claimed extra brightness isn't beneficial for HDR. Of course it is, and FALD has the clear advantage here. That said, OLED can look quite good in HDR tone-mapped with current brightness levels. It's not going to be ideal without tone-mapping yet, but I'm fine with that for the better SDR since I spend most of my time there, and HDR still works quite well for gaming.
 
Can't say I understand a lot of what is said in this thread, but as a satisfied owner of an LG C2, what mini-LED monitor could I get that would smoke my OLED?
 
That's going to depend on a lot of factors and what "smoke" means to you.

IMO, OLED is better if:
- You highly value black levels/contrast and don't like blooming.
- You like really great viewing angles with little loss of image fidelity.
- You have a darker/more controlled room environment where lower OLED brightness levels aren't an issue.
- You watch as much or more SDR/sRGB content as HDR, as OLED particularly shines with that content IMO.

IMO, full-array local dimming mini-LED monitors are better if:
- You're in a brighter room and need the extra brightness for SDR.
- You watch a lot of high-brightness HDR content and want it as close as possible to how it was mastered.
- You worry about burn-in and want to minimize risk.
- Text fringing bothers you on OLED if you read a lot.

As far as models, that's going to depend on budget and size considerations. A lot of people consider the ASUS ROG Swift PG32UQX king and speak highly of it. (One note - it's a few years old and I believe a successor model, the QXR, is supposed to launch this year). I've tried its brother, the ASUS ProArt PA32UCQ, but it had too many quality control issues to keep (supposedly QC is better on the UQX tho'), and I found the blooming too much of an issue in SDR content to be a great fit for me, which is why I went for a slightly smaller OLED.

Since you're coming from a TV, you might look at FALD TVs as well. I tend to like my Sony FALD, which I use for movies and console gaming. A lot of people really like the Samsungs as well and use them as monitors. I'm not well-versed on the current lineup TV wise tho' as I haven't researched in a few years.

Good luck!
 
They are larger TVs. A larger screen is easier to make than a monitor. TVs tend to be less accurate than monitors too. And at 77 inches you can buy a MicroLED this year.
I actually started this thread with large OLED TVs in mind because I want to make a new media room/home theater and maybe connect a PC to it for gaming.

Based on your response, large OLED TVs don’t suffer from being dim like OLED monitors do. The new LG G3 and Samsung QD-OLED TVs offer much increased brightness this year (as seen in the video I linked). If these are the OLEDs in question, would all this talk of OLEDs being too dim become irrelevant?
 
You aren't going to get sensible answers from him because to him anything that isn't capable of 1000 nits full field brightness is crap. Most of the time for HDR content you aren't going to see that sort of brightness, it would be only something like a big explosion on screen. For SDR content it would mostly matter that you can see the screen comfortably. If it's going to be a media/home theater room then you are most likely controlling for light anyway so brightness is not a problem.

Realistically you will be very happy with the G3 or equivalent Samsung QD-OLED panel performance for TV use. Similarly using a PC for gaming will work nicely. I'd recommend going to an electronics store and checking these out for yourself because you might find that even the C2/C3 with their more limited brightness will look great to you and the money saved over the latest and greatest will be a better compromise.
 
Without brightness you don't have better images.
That's as bad as saying without blackness you don't have better images.
Your argument only applies to HDR, where more brightness is needed, and even then it is still brightness limited, weakening your argument somewhat unless you have proof current max brightness levels are optimal?
Both extremes of blackness and brightness are needed for perfect HDR.
For SDR more brightness isn't needed.
 
He'll argue, somehow, that stretching SDR out to wide gamut is "better images" but only if it's not on OLED.

This whole "better images" thing does my head in.
 
You should use your X27 to do it. Even without wide gamut, YCbCr still increases contrast.
Prove it.
YCbCr is a method of encoding colour to take less space; it cannot improve contrast.
If anything it's worse than RGB in its current implementation.
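
For anyone curious, here's a quick sketch of the full-range BT.709 RGB <-> YCbCr conversion (the coefficients are the standard ones; the sample color is arbitrary). It's a reversible rearrangement of the same three values, so there's no extra contrast hiding in it - and the limited-range, chroma-subsampled variants actually used on the wire can only lose information:

# BT.709 luma coefficients.
KR, KG, KB = 0.2126, 0.7152, 0.0722

def rgb_to_ycbcr(r, g, b):
    y = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))
    cr = (r - y) / (2 * (1 - KR))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 2 * (1 - KR) * cr
    b = y + 2 * (1 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return r, g, b

rgb = (0.9, 0.4, 0.1)
roundtrip = ycbcr_to_rgb(*rgb_to_ycbcr(*rgb))
print(rgb, "->", tuple(round(c, 6) for c in roundtrip))  # identical up to float noise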
 
I already said color is lit by brightness. It's the number one factor. You can have infinite contrast with a black, dim picture as long as it has 0 nits.
It is. All color perceived by the human eye is light. No one is arguing it isn't.

Light output is kind of the entire reason a screen exists.

It's the peak output numbers that aren't incredibly relevant except for a few flashy bright effects here or there.

You can make the analogy to a car. Sure, movement is kind of what the car is all about, but apart from bragging rights it really doesn't matter whether your car has a top speed of 150 mph or 160 mph. Even if you once in a while test it at that speed on a track, for your everyday life with that car you are never going to drive at that top speed. You are going to be driving to the store, or to and from work, etc.

Most monitors are going to spend 99% of their lives in SDR on the desktop with brightness set so they get a max light output of about 75-80 nits, and that is perfectly fine.

On the rare occasion you drive all fast and sporty, you probably won't even come close to your top speed. Same with a monitor. On the rare occasion you actually view anything with HDR content, you probably aren't going to notice even rather large differences in max light output, as long as you have plenty of dynamic range.

The max light output is really only used for parts of scenes that dazzle and blind you, and you don't have to get super bright to have the same effect, especially in a dark room, on a screen with dark blacks, so your eyes have adjusted.

This is why OLED screens continue to take the top ratings. Their output is the most pleasing, and they do this without unnecessary ridiculous top end light output.

If you go to Rtings.com and sort the top list by movie viewing (one of the use cases they test where HDR becomes the most relevant) you'll find the top list absolutely dominated by OLED screens. The first entry that is not OLED comes in at #22 and is a Hisense dual LCD panel.

OLED has all but completely taken over the top echelons of image quality, and for good reason. They look fantastic, better than anything else out there I have seen. And they do this, despite having a lesser top light output figure, which kind of disproves your theory that max light output is as important as you think it is.

You can of course use whatever the hell you want to use. That's only between you and your eyes, but don't come in here and try to tell the rest of us that OLED sucks because it doesn't have blinding levels of max light output, when we all know what we have seen, we like the results, and, oh by the way, the most respected testers of screen quality almost unanimously agree.

It almost makes me think you are trolling. Why don't you create your own thread to discuss the virtues of blinding light from a screen and stop crapping on this one.
 
You should use your X27 to do it. Even without wide gamut, YCbCr still increases contrast.

My X27 is currently boxed up for storage as a backup display in case my Innocn dies. But I'm not sure how it would fare better when its SDR brightness is capped even lower than the Innocn's.
 

Attachments: Screenshot_20230315_091141_Chrome.jpg
I can see the benefit of HDR in video and gaming content, but I usually keep it turned off on the desktop, because at least on my shitty current monitor, the ASUS XG438Q, turning HDR on overexposes the desktop to the point where it's practically unusably ugly and burns my eyes.

In Cyberpunk it looked pretty nice, but I kept the settings at about 600 max because any more than that started to become uncomfortable to my eyes.

That said, I will fully admit I am not very well read on HDR, its various technologies, and optimal settings. It hasn't been a big deal to me yet. In most cases it winds up being more of an annoyance I have to deal with than something I actually want.

Appreciate the honesty. When it comes down to it, it's each owner's display, and all of the displays are flawed, so people should purchase and use whatever suits their taste better.


Regarding HDR, I agree with you in principle about the value of high, inky contrast as framed by the deep end - individual pixels dropping into the depths of oblivion, which OLEDs are capable of, right next to lighter blacks/detail in darks and relatively bright colors and bursts of bright color at the per-pixel level. That is great. OLED has smaller brightness peaks (in both screen percentage and sustained output) and it hits an ABL wall in occasional scenes. FALD is a grid of non-uniform, fluctuating, bright-but-sloppy bucket lighting zones that color/lift/dim outside the lines at a sub-45x25 lighting resolution of tetris-block structures, which static scenes fall across and dynamic material moves across. They both work as well as they can for what each of their strengths are, but both degrade and sharply limit the picture with their own failings/cons.

In professionally mastered HDR, the whole screen isn't supposed to be cranked up as a unit. It's supposed to be a contrasted mixture across a large range in a typical scene:
.. from a depth down to oblivion and large areas of normal brightness for a large part of the screen, up to 50% typically
.. and bright colorful mids in another quad
...and up to bright light sources and highlights in the remaining quad worth of the screen more or less.

Those %s aren't necessarily all combined in place like a pie graph or an app window/tile on your screen all of the time, but they can be spread and mixed across the entire scene depending on the content and the location of the scene's light sources, depths of darkness, and "dodge and burn" light-vs-dark detail. E.g. scintillations on a fractured/disturbed lake surface in sunlight or bright moonlight would cumulatively add up to a decent % of the screen being that bright. That range is applied on the TV starting at the bottom ~50% of your display's range, and then the TV compresses what it can't fit into the remaining top ~50% of the display (especially for HDR 4000 and HDR 10,000 content, which have a lot left over to compress down). In real-world use, screens aren't hitting the high end at a very high brightness relative to the actual HDR 4000 and 10,000 nit metadata/mapping of values fed to the displays. But even an HDR 10,000 screen shouldn't be blasting your eyes badly all of the time with professionally mastered material, because typically only about 1/4 of the screen space would be 1000 nit to 10,000 nit, and that could be broken up across a lot of reflections, bright highlights, and smaller light sources (torches, lamps, headlights, etc). There could be some scenes where you'd see a very bright light source or area as a larger % of the screen than typical, and for longer than typical, on occasion, or vice-versa darker scenes and areas.
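
A rough sketch of that kind of roll-off (an illustrative curve only, not any particular TV's actual tone mapping, and the knee fraction and 800 nit display peak are arbitrary choices): track the content 1:1 up to a knee somewhere in the display's range, then squeeze everything above it into whatever headroom is left.

import math

def roll_off(scene_nits: float, display_peak: float, knee_fraction: float = 0.5) -> float:
    """Toy highlight roll-off: pass lower levels through 1:1, then compress
    everything above the knee into the display's remaining headroom."""
    knee = knee_fraction * display_peak
    if scene_nits <= knee:
        return scene_nits
    headroom = display_peak - knee
    # Exponential shoulder: approaches display_peak but never exceeds it.
    return knee + headroom * (1.0 - math.exp(-(scene_nits - knee) / headroom))

for nits in (100, 400, 1000, 4000, 10000):
    print(f"{nits:>6} nit scene value -> {roll_off(nits, display_peak=800):6.1f} nits on an 800 nit display")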


The end goal is indeed more realism, with a high-end capability of 4000 nit and 10,000 nit (even a 20,000 nit prototype monstrosity). That increased range mostly maps to longer-sustained 600 - 800 - 1000 nit mids, along with a segment of the screen displaying very bright highlights and light sources at 1000 nit to 10,000 nit - rather than the whole screen/scene being lifted brighter. That should get better once the tech catches up and avoids the tradeoff failings of both screen types currently available, hopefully with micro OLED someday, and likely with active cooling and thicker housings to deal with ABL/heat issues. Situationally, you shouldn't be staring into the sun all of the time in HDR content, and you probably aren't playing games in a daytime desert or a bright white sun-reflecting snowscape 24/7 either (though certain game genres might).


. . . . . . .


However, since you got me thinking about it - in real life people put shaded snow goggles on in bright snowscapes when hiking/backpacking/skiing, and they use sunglasses (even polarized sunglasses) on bright beaches, when Baja driving in deserts, or when out fishing with the sun reflecting off of the water. A lot of people aren't going to squint in bright environments for long periods of time IRL if they can get some good eyewear to use. So I get your point. I think ultimately VR will be a big sector for HDR in the future too, with very bright screen peaks from much brighter microOLED tech right near your eyes. Perhaps games on both desktop and VR screens will have virtual glasses that you can put on when in the brightest environments or something, lol - virtually donning and doffing the glasses. How is that for more realism? Your own ABL sunglasses in game. After all of that desire for brighter HDR peaks, right? But those kinds of 1000 nit to 10,000 nit peaks would still be great for regular bright highlights on metals, reflections, and more mixed environments where lasers and magic and short-duration lightning and such are in the scene, so I wouldn't want to wear sunglasses in every single scene, or necessarily set strict brightness limits on HDR via settings globally applied to every scene.

Even at a range up to HDR 10,000 nit, the HDR levels won't be as high as real-life scenes 1:1 across the whole range to begin with, but they will be pretty bright at times relative to a dim-to-dark viewing environment and to the changing scene content. I guess with more realism, if you were in a super bright area on a 10,000 nit peak screen - just as IRL - you'd turn some of the (virtual) lights off or turn a lighting dimmer down; if it were very bright you'd cover your eyes or look away, move to a different area in the shade or indoors, or put on a hat with a brim and some kind of sunglasses/eyewear. In space you'd put down the sun-shielding visor on your astronaut helmet, etc. I keep a set of driving sunglasses in my vehicle at all times and I have shaded ski goggles for snowblowing and such IRL. Some people use transition glasses too. I'd find it ironic if your HDR screen ended up activating your transition lenses to a darker effect in longer bright scenes, lol. I think these kinds of things will be some of the growing pains with HDR implementation and adoption.


 
Yeah, I was thinking the same thing.

In real life when I go outside I am often wearing sunglasses. I don't necessarily want to wear sunglasses in front of my screen :p
 
There's an increase in OLEDs being used for PC use (whether it be an OLED TV or the ever-increasing selection of OLED monitors). But isn't burn-in a concern? Those using LG OLED TVs as PC monitors, have you experienced burn-in? Do you do anything "special" to prevent burn-in, or just use it as if it were any other monitor?
Actually never even looked into the OP topic until now but will answer.

AW3423DW owner for over a year now officially, and I have always wanted smoother gameplay along with some inky blacks. It comes with a 3-year burn-in warranty, so I will put that to use when necessary. Monitors are a game of compromises, with pluses and minuses you just have to tally against your needs.

 
Do you have the U8H? That TV looks pretty damn amazing, much better than Samsung MiniLED. Wish we could get that kind of performance in a monitor form factor.
I have the Canuckistani version of the U7G; it's called the U78G. It's my main monitor (from the couch). :)
Win11 says it's 1500 nits, they say 1000... either way it's bright as fuck!
 
RTings has that one in their list of the best gaming TVs, winter 2023:

Best 4k Gaming TV: Samsung S95B QD-OLED - Video Games: 9.3, HDR Gaming: 9.3
Best Upper Mid-Range 4k Gaming TV: Samsung Q90B QLED (720-zone FALD LCD) - Video Games: 8.7, HDR Gaming: 8.7
Best Mid-Range 4k Gaming TV: LG C2 OLED - Video Games: 9.3, HDR Gaming: 9.0
Best Lower Mid-Range 4k Gaming TV: Hisense U8H LED (504-zone FALD LCD) - Video Games: 9.0, HDR Gaming: 9.0



The U8H FALD LCD actually scores better than the Q90B FALD LCD if you are only taking gaming performance into account. Their review does say the U8H's VA panel gets noticeable red ghosting in some gaming content though.

. .

The Q90B QLED FALD LCD has blooming and blurs a bit more than the S95B QD-OLED too. Also, game mode is worse on Samsung FALDs:

"Unfortunately, like most Samsung TVs, the local dimming feature performs worse overall in 'Game' Mode. The overall performance is pretty similar, but the TV seems to be spreading highlights out over a greater number of zones, so there's a bit more noticeable blooming. The processing is also slightly slower, so zone transitions are more noticeable."

"With local dimming on 'High', the contrast ratio is extremely high, better even than the Hisense U8H"

. . . .

Overall they all got good scores though.
 
Prove it.
YCbCr is a method of encoding colour to take less space; it cannot improve contrast.
If anything it's worse than RGB in its current implementation.
Cannot see much difference on your limited OLED huh?

YCbCr is used to optimize luminance. You can crank up brightness to see the increased contrast easily, as long as you can actually increase brightness. Since you cannot see it with OLED in SDR, I can show you what it looks like as emulated SDR in HDR mode.

I hope your OLED has enough range to see FALD miniLED SDR.

This is the screenshot from the game. Download the JXR image and view it in HDR mode to see the emulated version of SDR.
52750216436_c234b359eb_o_d.png


Emulated SDR sRGB 80nits


Emulated SDR BT.1886 + sRGB + high brightness


Emulated SDR BT.1886 + wide gamut + high brightness


The last two images are exactly what SDR looks like on FALD miniLED monitors such as the PG32UQX, PG27UQ, and PG35VQ. You should realize by now that the SDR you see on these monitors is your OLED's HDR.

Speaking of dynamic range, SDR BT.1886 + wide gamut + high brightness is easily 3x more than mere sRGB, with 500+ nits peak brightness. It's not my business if you like very accurate, dim sRGB. My eyes like to see better.
52749685962_d728ecf3a9_k_d.jpg
 
"Only higher range can have better images" is not a factual statement. If the image is sRGB, and you can display it just fine at your target brightness, it doesn't get "better" than that, except based on your preference/opinion (which adds inaccuracy). HDR is a different story, but that's not what we've been talking about here.

You already know this... Increasing brightness for OLED is for brighter environments where it could come in handy for SDR, and obviously for more impactful HDR. Again, nobody has claimed extra brightness isn't beneficial for HDR. Of course it is, and FALD has the clear advantage here. That said, OLED can look quite good in HDR tone-mapped with current brightness levels. It's not going to be ideal without tone-mapping yet, but I'm fine with that for the better SDR since I spend most of my time there, and HDR still works quite well for gaming.
Only higher range can have better images. Your dim OLED is incapable of doing that. If you want to see higher, you see ABL instead.
 