Why OLED for PC use?

Thank you for posting that, it follows my own experience.
I use an HDR2000 TV and found it to be so wonderful that my next one will be a good-quality HDR4000 set. I was close to getting one this year but am waiting to see how well next year's models behave.

This will be for material graded HDR4000 or higher; it doesn't make sense to pump HDR1000 material up to HDR4000.
I have at times pushed HDR1000 up to near HDR2000 because it works quite well (when the display is capable enough), but not for everything, as it can be fatiguing.
Doing the same to near HDR4000 won't be a good idea.
But a lot of material is not HDR4000 or higher, so an HDR4000+ display will have limited use until this changes.

I can't wait for the holy grail of inky blacks and HDR10,000, but I doubt it will happen within 15 years unless some super new tech comes to light ;)
We should have HDR4000 to HDR6000 with inky blacks by then though, the latter of which won't be too far off the HDR10,000 experience.
Fun :)

ps
The first iterations of microLED (corrected, wrote miniLED originally) won't be much better than HDR2000. The ability to shed high heat quickly needs to be addressed to prevent rapid burn-out like OLED currently has.
Smaller LEDs heat up faster without a decent heat sink.
This heat sink needs to be as small as possible, reliable, and silent. I suspect a lot of fins on the back of the display will become the norm, unless laws fail to prevent planned obsolescence.
Either that, or an even more efficient LED (or equivalent) becomes available with a response flat enough across the range to be useful for a high-end display.
(A lack of flat response means either that much more power is needed at certain light frequencies, wearing the LED out faster, or that the LED's max output must be derated; otherwise the response will not be linear enough for high-end displays.)
 
Last edited:
Part of Dolby Vision's PR:

10,000 nit Dolby HDR:
50%+ of the screen area is at 0 to 100 nits (the SDR range, and down to infinite black depth on OLED). This foundation probably remains the same even on screens that compress after their fall-off point.
25% of the screen is at 100 to 1,000 nits (the bulk of which is typical mids to high mids that many HDR screens can display more or less, at least briefly if not long-sustained).
25% or less is at the top (likely bright highlights, e.g. scintillation/glints/reflections of sources, and very bright direct light sources).

The top end will be compressed on today's screens, since all consumer screens are currently way below 10,000 nits, so those last two 25% bands won't come through the same. For example, the HDR10,000 curve on the first few gens of LG OLED tracks accurately to around 400 nits, then falls off, compressing the rest into the remaining ~400 nits (momentary ABL across dynamic media and gaming-scene considerations aside).
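The "accurate up to a knee, then compressed" behavior described above can be sketched as a simple tone-mapping curve. This is a generic illustration, not LG's actual algorithm; the knee (400 nits) and peak (800 nits) are just the approximate figures from the post:

```python
def tone_map(nits, knee=400.0, peak=800.0, max_in=10000.0):
    """Map mastered luminance (nits) to what a limited display shows:
    track the curve 1:1 up to `knee`, then compress everything from
    `knee`..`max_in` into the remaining headroom up to `peak`."""
    if nits <= knee:
        return nits                      # accurate region: 1:1 tracking
    x = (nits - knee) / (max_in - knee)  # 0..1 position above the knee
    shoulder = (x / (x + 0.5)) * 1.5     # Reinhard-style roll-off, reaches 1.0 at x=1
    return knee + (peak - knee) * shoulder

# tone_map(300)   -> 300.0  (below the knee, unchanged)
# tone_map(10000) -> 800.0  (the 10,000-nit top end lands at display peak)
```

The key property is that everything mastered between 400 and 10,000 nits has to share the remaining ~400 nits of display headroom, which is why the top two luminance bands lose their separation.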

As you can see, the big focus is on the highlights in that top 25% range. I agree with others that the increased color volume is ultimately the goal - but the very low resolution/density FALD matte screens we have now are not a good tradeoff vs. per-pixel emissive tech for my values, especially in the SDR and HDR1000 content world we currently live in. Also note that they had to liquid-cool the display. ABL and heat reduction remain a concern as we get 2000nit+ screens and head toward 4000nit and someday 10,000nit screens. Even now, the current Samsung 4k and 8k QD LED FALD LCD screens that do 2000+ nits have aggressive ABL.

That Dolby quote of yours only applies to the artificial split of 25% top vs. 50% mid vs. 25% low, which is based on the limitations of most displays, such as OLED ABL or Samsung VA ABL.

HDR content can easily surpass 25% bright-area coverage. You just don't see such content available because the majority of displays aren't ready yet. You won't see it on an OLED anyway, even if it's graded beyond that artificial "25%" gap. But current FALD can easily sustain brightness over 60% of the screen, not to mention that professional grading displays can do 100%.

The artificial gap doesn't apply to the content itself; it applies to the limitations of displays.



When you talk about future HDR, the trade-off is obvious: FALD, with its brightness, color, and medium contrast, will surpass OLED, with its very limited brightness, medium color, and worse effective contrast. Even though FALD has limited zones, it is OLED that falls further off the intended dynamic range, with much less accuracy in both contrast and color.
 
As you can see, the big focus is on the highlights in that top 25% range. I agree with others that the increased color volume is ultimately the goal - but the very low resolution/density FALD matte screens we have now are not a good tradeoff vs. per-pixel emissive tech for my values, especially in the SDR and HDR1000 content world we currently live in. Also note that they had to liquid-cool the display. ABL and heat reduction remain a concern as we get 2000nit+ screens and head toward 4000nit and someday 10,000nit screens. Even now, the current Samsung 4k and 8k QD LED FALD LCD screens that do 2000+ nits have aggressive ABL.
Also something else to remember about Dolby: they are a theater company, and push shit for the high-end commercial market. Whether homes are ever capable of it is less of a concern for them.

I'll again go back to audio and theater levels. How many of you actually have a system that can hit reference levels in your living room (or computer room)? Like actually get 105dB on all channels and 115dB on the sub? I'm going to bet it is pretty few. It takes some serious size and amplification to do that, unless you are real close to said speakers. There are also some quality tradeoffs unless you want to spend even more. You can design more efficient speakers that will get more SPL with less wattage, but they usually don't have distortion and imaging as good as some other designs. 115dB on the sub is super hard too; it just takes a lot more power than you'd think. You buy a monster 2000 watt sub, set it up in the room, measure it... and you find you are only at 109dB. You put in a second one in hopes of pushing it up and getting flatter bass response... and it is only 111dB instead of 112dB because of the interaction. So now you have to put in 2 more to maybe hit that reference.
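The subwoofer math above follows from how uncorrelated sources sum. A minimal sketch of ideal incoherent dB summation (real rooms, as the post notes, give you less than the ideal):

```python
import math

def combine_spl(*levels_db):
    """Ideal summation of incoherent sources: convert each dB SPL
    level to relative power, add the powers, and convert back to dB."""
    total_power = sum(10 ** (level / 10) for level in levels_db)
    return 10 * math.log10(total_power)

# Two identical 109 dB subs, perfectly incoherent:
# combine_spl(109, 109) -> ~112 dB (the ideal +3 dB from doubling)
# Room interaction and partial cancellation eat into that ideal,
# which is how you end up at 111 dB instead of 112 dB.
```

Each doubling of (incoherent) sources buys only ~3 dB, which is why chasing the last few dB to reference takes multiple extra subs.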

Nonetheless, I bet even with systems that don't quite do what Dolby and THX say they should, you are pretty happy and have excellent, dynamic sound. Maybe your mains only go to 100dB and your sub about the same; they can still rock the house pretty well at that level. Likewise, even if you DO have a system that can hit that hard, it wouldn't surprise me to find out that you don't set the volume dial to 0dB reference often. I actually did at my old place for my computer (big system, small room, short distances) and I almost never went above -10dB. It was just too intense. Only with things that were REALLY dynamic, that made very sparing use of those loudest hits, would I want to be near full volume. For other things mastered to be loud more often, I backed it off.


Basically, this is just a long-winded way of saying that just because Dolby is doing something and pushing it, doesn't mean that it matters to most consumers, or will ever come to most consumers.
 
I'm comparing C2 to PG32UQX right now and most of the time the C2 looks pretty dead and lifeless especially if there is any skybox in a game. Going to test Returnal right now since it should be where the C2 has the biggest advantage.

PXL_20230223_190439469~2[7112].jpg
 
It's the same old trick. Of course these guys keep using Samsung VA as the example of ABL.

But even with ABL, VA still sustains much higher brightness than OLED.
Brightness Chart 2.png
 
It's the same old trick. Of course these guys keep using Samsung VA as the example of ABL.

But even with ABL, VA still sustains much higher brightness than OLED.
View attachment 551369

No shit? We weren't saying that it has ABL that nerfs brightness down to OLED levels; we were saying that Samsung TVs tend to have more aggressive ABL behavior, so what's wrong with using Samsung as an example? Hell, even in your own damn graph the PG32UQX also suffers from ABL that drops brightness from 1500 nits to 1000 nits. Obviously still way higher than any OLED will achieve, but that wasn't even the point lmao.
 
Also something else to remember about Dolby: they are a theater company, and push shit for the high-end commercial market. Whether homes are ever capable of it is less of a concern for them.

I'll again go back to audio and theater levels. How many of you actually have a system that can hit reference levels in your living room (or computer room)? Like actually get 105dB on all channels and 115dB on the sub? I'm going to bet it is pretty few. It takes some serious size and amplification to do that, unless you are real close to said speakers. There are also some quality tradeoffs unless you want to spend even more. You can design more efficient speakers that will get more SPL with less wattage, but they usually don't have distortion and imaging as good as some other designs. 115dB on the sub is super hard too; it just takes a lot more power than you'd think. You buy a monster 2000 watt sub, set it up in the room, measure it... and you find you are only at 109dB. You put in a second one in hopes of pushing it up and getting flatter bass response... and it is only 111dB instead of 112dB because of the interaction. So now you have to put in 2 more to maybe hit that reference.

Nonetheless, I bet even with systems that don't quite do what Dolby and THX say they should, you are pretty happy and have excellent, dynamic sound. Maybe your mains only go to 100dB and your sub about the same; they can still rock the house pretty well at that level. Likewise, even if you DO have a system that can hit that hard, it wouldn't surprise me to find out that you don't set the volume dial to 0dB reference often. I actually did at my old place for my computer (big system, small room, short distances) and I almost never went above -10dB. It was just too intense. Only with things that were REALLY dynamic, that made very sparing use of those loudest hits, would I want to be near full volume. For other things mastered to be loud more often, I backed it off.


Basically, this is just a long-winded way of saying that just because Dolby is doing something and pushing it, doesn't mean that it matters to most consumers, or will ever come to most consumers.

Audio comparisons are best framed in terms of signal to noise/distortion (SINAD) when comparing quality - and power, to a useful extent. Max sustained power output beyond the level that damages hearing should not be considered a defining quality factor, because it isn't helpful.
For a display, SINAD equates more to the contrast ratio plus max brightness - AND how well the display's max brightness can be utilised.
Not quite straightforward, but pushing toward a better metric imo :)
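One way to make the analogy concrete: express the contrast ratio on a log scale, the way SINAD compresses an audio chain into a single dB figure. The numbers below are hypothetical, just for illustration:

```python
import math

def contrast_db(white_nits, black_nits):
    """Express a display's contrast ratio on a dB-like scale
    (10*log10 of the luminance ratio), as a loose analogue of how
    SINAD summarizes an audio chain in one number."""
    return 10 * math.log10(white_nits / black_nits)

# Hypothetical example figures:
# contrast_db(600, 0.46)   -> ~31 dB (about 1,300:1, e.g. LCD with local dimming off)
# contrast_db(600, 0.0005) -> ~61 dB (1,200,000:1, OLED-class blacks)
```

As with SINAD, a single figure like this hides where the errors occur (blooming near highlights, ABL dips), so it's a starting point for a metric rather than the metric itself.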
 
PXL_20230223_192355068.jpg
PXL_20230223_192259417~2.jpg


Blacks are quite a bit lifted on camera compared to in person vs the C2, but the highlights like those plants and blue tentacles are far more saturated and impactful on the PG32UQX. When there aren't many highlights on screen and the scene is dark, the C2's contrast advantage is significant. Otherwise, having compared a lot of games now, I don't really think the C2 does HDR justice in most games, and it seriously looks dead next to this monitor in most HDR titles. There are a few exceptions like Ori where I prefer the C2 unequivocally, but most of the time the PG32UQX is "good enough" to me in terms of contrast and blows way past the C2 in color volume and impact.
 
No shit? We weren't saying that it has ABL that nerfs brightness down to OLED levels, we were saying that Samsung TV's tend to have a more aggressive ABL behavior, what's wrong with using Samsung as an example? Hell even from your own damn graph the PG32UQX also suffers from ABL that drops brightness down from 1500 nits to 1000 nits. Obviously still way higher than any OLED will achieve but that wasn't even the point lmao.
I specifically added the second batch of PG32UQX results as an example to see if you could be baited into doing your trick.

I knew you would respond the same way, but I didn't even mention that VESA DisplayHDR 1400 requires at least 900 nits sustained over 30 minutes for certification.

Samsung is known for pushing VDE HDR2000 that is only as bright as VESA DisplayHDR 600. They don't get VESA certification on their VAs, either TVs or monitors.

This thread is more about OLED monitors for PC. It doesn't work when somebody busts out a Samsung VA for his tricks.
 
I specifically added the second batch of PG32UQX results as an example to see if you could be baited into doing your trick.

I knew you would respond the same way, but I didn't even mention that VESA DisplayHDR 1400 requires at least 900 nits sustained over 30 minutes for certification.

Samsung is known for pushing VDE HDR2000 that is only as bright as VESA DisplayHDR 600. They don't get VESA certification on their VAs, either TVs or monitors.

This thread is more about OLED monitors for PC. It doesn't work when somebody busts out a Samsung VA for his tricks.

picard-meme-facepalm.jpg
 
My 26.5" LG OLED is brighter than a supernova; it took me 3-4 days to get the settings just right. I turned everything down but cranked the contrast up, which made the screen more readable, not just text-wise but for viewing icons on the desktop in Windows. I can adjust the brightness with the remote on the fly if needed. I'm still using my 2011 Asus for general use and will just game on the OLED.
 
The brightest OLEDs you can get are the PA32DC or 32EP950. How can 600 nits be considered a supernova?
 
LOL are you LCD plebs really still using the same BS arguments?

How about comparing the image quality while actually PLAYING a game?

Show the color accuracy of an LCD in motion. Oh you don't measure that? You only measure a static image?

Because the LCD is constantly displaying inaccurate colors during motion. When an LCD says "5 ms" response time, that's how long it takes to go 90% of the way from one gray to another. Even on a "1ms" response time LCD, the actual pixel transitions don't complete for tens of milliseconds. If you have constantly changing colors displayed at 120Hz, the pixels on an LCD never even make it to 100% of their static color accuracy capabilities. Compare that to a typical OLED, where a pixel reaches 100% of its capable accuracy within a millisecond.
Now add FALD zone-transition speeds on top of that, and LCD is even more inaccurate during motion. LCD "color accuracy" is laughable during motion. OLED shits all over it.
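The 90%-GtG point the spec quotes is only one point on a settling curve. A first-order sketch makes the "doesn't complete for much longer" claim concrete (this is a simplification; real panel transitions aren't exactly first-order and depend on the start/end levels):

```python
import math

def settle_time(t90_ms, fraction):
    """First-order (RC-style) model of an LCD pixel transition.
    Given the quoted 90% GtG time, estimate how long the pixel takes
    to reach `fraction` of its final value. Illustrative only."""
    tau = t90_ms / math.log(10)      # the 90% point sits at tau * ln(10)
    return -tau * math.log(1 - fraction)

# For a quoted "5 ms" (90% GtG) panel under this model:
# settle_time(5, 0.99)  -> 10.0 ms
# settle_time(5, 0.999) -> 15.0 ms  (3x the quoted spec to fully settle)
```

So under this model, a frame time of 8.3 ms at 120 Hz ends before the pixel has even reached 99% of its target, which is the point being made about color accuracy in motion.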
 
LOL are you LCD plebs really still using the same BS arguments?

How about comparing the image quality while actually PLAYING a game?

Show the color accuracy of an LCD in motion. Oh you don't measure that? You only measure a static image?

Because the LCD is constantly displaying inaccurate colors during motion. When an LCD says "5 ms" response time, that's how long it takes to go 90% of the way from one gray to another. Even on a "1ms" response time LCD, the actual pixel transitions don't complete for tens of milliseconds. If you have constantly changing colors displayed at 120Hz, the pixels on an LCD never even make it to 100% of their static color accuracy capabilities. Compare that to a typical OLED, where a pixel reaches 100% of its capable accuracy within a millisecond.
Now add FALD zone-transition speeds on top of that, and LCD is even more inaccurate during motion. LCD "color accuracy" is laughable during motion. OLED shits all over it.
I compared them a long time ago, and I'm telling you OLED looks like crap. It's you who still can't see the better image. You could have seen better images 4 years earlier.

OLED doesn't have the brightness; it only shows SDR fine. It's OLED that consistently displays much worse accuracy in both contrast and color in HDR. If you have the tools to scan and compare FALD and OLED against a reference, OLED will have much worse accuracy overall; its brightness and color drop further. Eyes will easily tell which one looks better.

52143842621_bf8a9bf7bd_o_d.png


All you can bank on is OLED response time, which only shows in SDR gaming. Yet no serious gamers will use it, as the brightness is too low. I've played way more games just fine on FALD LCD for many hours without eyestrain. Once the brightness is higher, you get eyestrain from OLED flickering. What a paradox.

 
I compared them a long time ago, and I'm telling you OLED looks like crap. It's you who still can't see the better image. You could have seen better images 4 years earlier.

OLED doesn't have the brightness; it only shows SDR fine. It's OLED that consistently displays much worse accuracy in both contrast and color in HDR. If you have the tools to scan and compare FALD and OLED against a reference, OLED will have much worse accuracy overall; its brightness and color drop further. Eyes will easily tell which one looks better.

View attachment 551385

All you can bank on is OLED response time, which only shows in SDR gaming. Yet no serious gamers will use it, as the brightness is too low. I've played way more games just fine on FALD LCD for many hours without eyestrain. Once the brightness is higher, you get eyestrain from OLED flickering. What a paradox.



Lol completely ignoring what I said. LCD has garbage accuracy during motion. Go QQ in an LCD thread.
 
View attachment 551372View attachment 551373

Blacks are quite a bit lifted on camera compared to in person vs the C2, but the highlights like those plants and blue tentacles are far more saturated and impactful on the PG32UQX. When there aren't many highlights on screen and the scene is dark, the C2's contrast advantage is significant. Otherwise, having compared a lot of games now, I don't really think the C2 does HDR justice in most games, and it seriously looks dead next to this monitor in most HDR titles. There are a few exceptions like Ori where I prefer the C2 unequivocally, but most of the time the PG32UQX is "good enough" to me in terms of contrast and blows way past the C2 in color volume and impact.
I believe you that bright HDR scenes have more pop to them, but I just can't get over that crushed shadow detail and those gray blacks. They look awful.

The upper image looks much better to me overall, more natural and with, ironically, a higher dynamic range, but I'm viewing it on an SDR screen so it probably looks quite different in reality.
 
Lol completely ignoring what I said. LCD has garbage accuracy during motion. Go QQ in an LCD thread.
I already said you can only bank on response time in SDR.

OLED doesn't even have true HDR. OLED cannot even do HDR at 30fps.
 
I believe you that bright HDR scenes have more pop to them, but I just can't get over that crushed shadow detail and those gray blacks. They look awful.

The upper image looks much better to me overall, more natural and with, ironically, a higher dynamic range, but I'm viewing it on an SDR screen so it probably looks quite different in reality.
Yeah, a phone camera is really not doing it justice. I don't see any crushed shadow detail from the PG32UQX; if anything, it's the C2 that is crushing dark detail in person.

The C2 is carried really hard by its contrast in games like Returnal/Dead Space. As soon as there are any highlights though, even in dark content, it just can't keep up. If I had to quantify it, those blue tentacles in Returnal look 2x as bright/vibrant.

Even worse, if the game is outdoors or has a big skybox, it literally looks washed out beside the LCD. TBH, now that I'm actively comparing both with HDR enabled through a splitter box, I'm baffled at how I accepted the C2's HDR performance in a lot of these games this entire time. The localized bloom on the PG32UQX is nowhere near as bad to me as the washed-out, lifeless HDR dominating the entire screen on the C2. The lifeless portrayal extends to SDR too, because in Hi-Fi Rush your character's yellow jacket looks noticeably less vibrant on the C2, since the game is so bright and vibrant most of the time that ABL comes into play on the OLED.

Don't get me wrong, the C2 looks great and has what I consider "universal appeal" but now that I have both side by side to compare correctly, HDR goes to the PG32UQX no contest. I've shared my complaints about the PG32UQX here multiple times in regards to the main reason I can't main it (pixel response) but if you want to enjoy HDR content no question it destroys the C2. From testing across an assortment of games, I favor the C2 in 2 of the 10 HDR titles I test (Ori and Doom Eternal).

I've kept the C2 this entire time because the unfortunate reality is there's so little HDR content to consume on PC. Atomic Heart just released with no HDR support for example. Now that I see how even SDR stacks up between the two though I'm having second thoughts because the LCD can make SDR games look much better than the C2 thanks to that brightness advantage.

TLDR: I didn't think the C2's lack of brightness was hampering it too bad until I put a PG32UQX beside it. Now I know.
 
VR's ppd is disappointingly low; even when full screens get to ~60 ppd soon, virtual objects within the scene will be very low ppd - though some eye-tracking tricks with foveated rendering and the varifocal lenses could help some. Regardless, VR's next gens will soon use very bright per-pixel-emissive microOLED with HDR (and varifocal pancake lenses in a smaller, goggle-like form factor). Years later they'll use microLED similarly. So VR will be all per-pixel emissive soon and going forward, ahead of PC screens in that regard - at least in quality VR kits from the big players. Per-pixel emissive is a better way to technically do displays. Eventually everything will be per-pixel emissive.
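For context, ppd (pixels per degree) can be roughly estimated as panel pixels divided by field of view. The headset numbers below are hypothetical, just to show why VR ppd disappoints:

```python
def pixels_per_degree(pixels_across, fov_degrees):
    """Crude average angular resolution: pixels across the field of
    view divided by that FOV in degrees. Lens distortion makes the
    real value vary across the view, so this is a headline number only."""
    return pixels_across / fov_degrees

# Hypothetical headset: 2160 px per eye across a 100-degree FOV
# pixels_per_degree(2160, 100) -> 21.6 ppd
# "Retina-class" is usually put near 60 ppd, and a distant virtual
# monitor inside the scene only gets a small slice of those pixels.
```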

FALD is a workaround for the time being. Gaming monitor/TV FALD needs a massive reduction in cell sizes, by orders of magnitude, even as a stopgap solution as far as I'm concerned. Not having any glossy FALD options whatsoever currently, as far as I can tell, just makes it that much worse of a choice for now imo. Again, imo both types of screen are usable, with downsides to both - but I personally prefer the tradeoffs of a glossy per-pixel-emissive screen to the current matte, low-density FALD consumer implementations, no matter how bright they get. Perhaps those things will change in the next few years with competition and advances in tech across the board. On the FALD side, a large reduction in backlight-array cell sizes, resulting in a massive increase in FALD "lighting resolution" (and a glossy option would be great), could theoretically make a big difference. I won't hold my breath though. Every time I watch my OLED and see tiny, relatively bright details like the light reflected in people's eyes, and razor's-edge brighter color against blacks to the pixel on a gorgeous glossy screen, it brings a smile to my face.

I'm comparing C2 to PG32UQX right now and most of the time the C2 looks pretty dead and lifeless especially if there is any skybox in a game. Going to test Returnal right now since it should be where the C2 has the biggest advantage.

View attachment 551365

But when I look at that image, the brighter one can only look as bright as my screen can go. 🤔

Every time I see one of those I'm like... wow, the bright one looks good... but wait, this is my screen viewing it, so it looks just as good (actually better) on my screen, since this forum's images aren't HDR.

Maybe if you guys could just stream your games to my screen I could see brighter gameplay /s.

There are also different settings, like warm vs. cold, and how you tweak everything can make a difference, plus your camera's own biases. I used to take pics of my FW900 CRT next to an LCD, and the images would make the FW900 look very pale when it didn't look like that in real life. Cameras capture things relatively and have their own settings and biases. Is that a composite, or did you take a picture of both screens in the same frame? Either way it's pretty meaningless as a pic, but shooting both in one frame throws everything way off, just like my CRT shots, making the pic even more false. Our eyes also view things relatively, though not exactly the same way, so you'd have to view each screen separately to get a better gauge of how it looks even IRL. Your eyes adjust to different conditions and ambient lighting and even bias lighting, etc. Obviously FALD displays go much brighter, but your whole splitter-box, two-screens-in-one-camera-frame methodology is flawed. No offense. You aren't the only person posting side-by-side shots of two screens in the same camera frame. Not everyone realizes this (and those who do are purposefully showing exaggerated images, exploiting the way cameras work).
 
Elvn is like a virtual "cloud player" who talks about stuff he doesn't have, with nothing but imagination.

It's the same as how he denied the higher-contrast image with the obvious luminance curves right in front of his monitor.

It must be hard to imagine a sky over 2,000 nits.
 
Six months later, still loving the hell out of my OLED for PC use. For me, it's still by far the best overall PC display I've used and the only one I've been highly content with. I use it for gaming, media, and work.

100% brightness, ASBL off, logo dimming off, and energy saver off. Zero burn-in so far. Not that I'm worried; I got an additional store warranty for it since I got it so cheap. Pretty much a free unit or upgrade in 1.5 years.

I understand the advantages of miniLED and will switch if a good one (for my needs) ever comes out. But I did not like the current offerings, most of which I tried. The Neo G7/G8 make me sick, so it'll have to be IPS. The best IPS I've heard of is that Chinese Innoc, but it has no warranty and a fugly bezel (and is apparently laggy). LG is supposed to be making decent ones in Q4 (27", 4k, 144Hz, 1500 zones); hopefully they are good. Either way, I have a great display while I wait for these companies to get it together with miniLED.
 
I already said you can only bank on response time in SDR.

OLED doesn't even have true HDR. OLED cannot even do HDR at 30fps.

Lol, by claiming this you just admitted LCD "can't even do SDR" at 30 FPS. Holy shit LOL. Oh look, a black and white pixel next to each other. Now LCD can't even do SDR at any FPS. LOOOOOL
 
Lol, by claiming this you just admitted LCD "can't even do SDR" at 30 FPS. Holy shit LOL. Oh look, a black and white pixel next to each other. Now LCD can't even do SDR at any FPS. LOOOOOL
It's OLED that cannot display true HDR in the first place.

It's more like you are hallucinating hard enough to keep banking on response time in SDR.

And OLED is not even that fast. It gets wrecked by LCD in the end lol.

810563_1676661632051.png
 
It's OLED that cannot display true HDR in the first place.

It's more like you are hallucinating hard enough to keep banking on response time in SDR.

And OLED is not even that fast. It gets wrecked by LCD in the end lol.

810563_1676661632051.png
Look at the ghosting in the first image and the overshoot in the second. The third, from the OLED, has none of that, so despite the blur, motion clarity is still improved thanks to the near-instant pixel response time. You get blur because OLED still uses sample-and-hold technology with image persistence; this is reduced by using BFI. Since I don't know where this image came from, I don't know what settings were used to generate it. TFT Central just posted their review if you're a Patreon subscriber.
 
When HDR video became a thing, the UHD Alliance (consisting of companies like Dolby, Panasonic, etc.) established a standard called UHD Premium that defined what a TV had to be capable of to show HDR as it was meant to be back then - what "real HDR" is. The limits were 1000 nits peak brightness with 0.05-nit blacks (essentially FALD LCD), OR 600 nits with 0.0005-nit blacks (OLEDs, that is). These were very tight limits back then. Technology has since moved forward and these limits are now easily surpassed by both techs, but it is still a clear standard, and calling OLED's version of HDR "just SDR" because LCD has evolved further is completely asinine and dishonest.
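To see why both paths were accepted as "real HDR", compare the contrast ratios the two limits imply; the OLED path trades peak brightness for roughly 60x the contrast:

```python
def contrast_ratio(peak_nits, black_nits):
    """Static contrast ratio implied by a peak-white / black-level pair."""
    return peak_nits / black_nits

# The two UHD Premium qualifying paths described above:
# contrast_ratio(1000, 0.05)  -> ~20,000:1    (the FALD LCD path)
# contrast_ratio(600, 0.0005) -> ~1,200,000:1 (the OLED path)
```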
 
Pretty disingenuous image comparison post there (again)..

Like I said previously - - - -

Is the Zowie backlight operating as a 1000+ zone FALD array in strobing mode? I'm not up on that screen, but if it's not operating at peak performance in FALD lighting and HDR while strobing, then that's a moot point. Are any of them even 4k? Do they even have HDR? If not, I don't see how they are valid to the discussion.

edit: I looked up at least the Zowie XL2566K 360Hz review (a January 2023 screen/review), and it is:
. . 1080p
. . TN
. . no HDR
. . EDGE LIT , no FALD, not even local dimming
. . the contrast ratio is very low, so blacks look gray when viewed in a dark room.

  • Mediocre contrast ratio results in raised blacks in dark rooms.
"The BenQ ZOWIE XL2566K has just decent text clarity. The matte coating gives text a slightly hazy look, and due to the low pixel density, text isn't very sharp, even after optimizing the Windows ClearType settings for the display (top photo)."

Is there some other zowie that actually compares to a high end FALD LCD and a modern OLED we are discussing the tradeoffs of?

. .

Also, like I quoted from reviews before, you can't get accurate SDR with a FALD display unless you turn off the FALD, and then you are back to a very poor 1300:1 contrast ratio and the accompanying black depths. Which means FALD is never accurate. It's toning huge 7000-pixel zones (or 15k pixels at 8k) against each other, and all along those contrasted edges/areas it's smoothed across an even wider number of zones, lifting and dimming to blend it into a jumbo, low-lighting-resolution anti-aliasing gradient of huge cells. At best a 45x25 "lighting resolution", which is much too low (and probably even less than that, minus edge lighting). A more static or paused camera scene will have contrasted edges falling across the 7000-pixel zones' squares randomly, depending on the scene map, and in dynamic media and gaming, which is more common, the contrasted areas will constantly be jumping the fence between and across those ice-cube-tray zones along whatever movement vectors and pathing the FoV/camera/virtual camera takes.
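The zone arithmetic in the post checks out. A quick sketch (the 1,152-zone count matches the PG32UQX; the near-16:9 grid shape is an assumption, since vendors rarely publish the actual layout):

```python
import math

def zone_stats(width_px=3840, height_px=2160, zones=1152):
    """Pixels per backlight zone and an approximate grid shape for a
    FALD array. Defaults are 4K with 1,152 zones; the roughly-16:9
    grid layout is assumed for illustration."""
    px_per_zone = width_px * height_px / zones
    cols = round(math.sqrt(zones * width_px / height_px))  # ~16:9 grid
    rows = round(zones / cols)
    return px_per_zone, cols, rows

# zone_stats() -> (7200.0, 45, 26): ~7,200 px per zone, i.e. roughly
# the "45 x 25" lighting resolution described above.
```

So each zone is a block on the order of 85x83 pixels, which is why contrasted edges inevitably straddle zone boundaries as the camera moves.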
 
Pretty disingenuous image comparison post there (again)..

Like I said previously - - - -

Is the Zowie backlight operating as a 1000+ zone FALD array in strobing mode? I'm not up on that screen, but if it's not operating at peak performance in FALD lighting and HDR while strobing, then that's a moot point. Are any of them even 4k? Do they even have HDR? If not, I don't see how they are valid to the discussion.

edit: I looked up at least the Zowie XL2566K 360Hz review (a January 2023 screen/review), and it is:
. . 1080p
. . TN
. . no HDR
. . EDGE LIT , no FALD, not even local dimming
. . the contrast ratio is very low, so blacks look gray when viewed in a dark room.

  • Mediocre contrast ratio results in raised blacks in dark rooms.
"The BenQ ZOWIE XL2566K has just decent text clarity. The matte coating gives text a slightly hazy look, and due to the low pixel density, text isn't very sharp, even after optimizing the Windows ClearType settings for the display (top photo)."

Is there some other zowie that actually compares to a high end FALD LCD and a modern OLED we are discussing the tradeoffs of?

. .

Also, like I quoted from reviews before, you can't get accurate SDR with a FALD display unless you turn off the FALD, and then you are back to a very poor 1300:1 contrast ratio and the accompanying black depths. Which means FALD is never accurate. It's toning huge 7000-pixel zones (or 15k pixels at 8k) against each other, and all along those contrasted edges/areas it's smoothed across an even wider number of zones, lifting and dimming to blend it into a jumbo, low-lighting-resolution anti-aliasing gradient of huge cells. At best a 45x25 "lighting resolution", which is much too low (and probably even less than that, minus edge lighting). A more static or paused camera scene will have contrasted edges falling across the 7000-pixel zones' squares randomly, depending on the scene map, and in dynamic media and gaming, which is more common, the contrasted areas will constantly be jumping the fence between and across those ice-cube-tray zones along whatever movement vectors and pathing the FoV/camera/virtual camera takes.

Funny thing is that if we mention ABL on Samsung TVs he starts accusing us of "playing tricks" in an OLED thread or whatever, yet here he is throwing a shitty 1080p TN strobed BenQ into the discussion and using that as an example LOL.

tHiS tHrEaD iS mOrE aBoUt OLED MoNiToRs FoR PC. iT dOeSn'T wOrK wHeN sOmEbOdY bUsTeD oUt A BenQ TN fOr HiS tRiCkS.
 
Pretty disingenuous image comparison post there (again)..

Like I said previously - - - -

Funny, who's being disingenuous here, putting up these quotes from reviews to say how badly FALD's low zone count hammers the image, worse than OLED does?

You should've realized that even with 8 million zones, OLED hammers the image even more, with even less accuracy. FALD is at the top of HDR. You should've known better.

It's always the same trick when you say something like "FALD is bad" without even mentioning that something else, like OLED, is even worse.

Then you pulled out Samsung products to fit your narrative about ABL, like OLED suddenly has better ABL now?

Funny you talk about the 360Hz Zowie like it's 240Hz? If you want to compete in games, the XL2566K will destroy whatever OLED you have.

And you think people have only one monitor? If you want to see the best graphics, you have a FALD. If you want to be fast to compete, you have a TN.

OLED has no chance of being top dog. It only fits in between, as a slightly faster SDR monitor.
 
Look at the ghosting in the first image and the overshoot in the second. The third, from the OLED, has none of that, so despite the blur, motion clarity is still improved thanks to the near-instant pixel response time. You still get blur because OLED uses sample-and-hold technology with image persistence; this is reduced by using BFI. Since I don't know where this image came from, I don't know what settings were used to generate it. TFT Central just posted their review if you're a Patreon subscriber.
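To put rough numbers on the sample-and-hold persistence mentioned above (a sketch; the 960 px/s tracking speed is just a common pursuit-test convention, not a figure from this thread):

```python
# Sample-and-hold motion blur: with near-zero GtG, perceived blur is roughly
# eye-tracking speed x frame persistence (the MPRT idea).
def blur_px(speed_px_per_s, refresh_hz, persistence_fraction=1.0):
    """Perceived blur width in pixels for an eye-tracked moving object.

    persistence_fraction < 1.0 models strobing/BFI (shorter hold time).
    """
    frame_time_s = 1.0 / refresh_hz
    return speed_px_per_s * frame_time_s * persistence_fraction

speed = 960  # px/s, assumed pursuit-test speed
print(blur_px(speed, 240))         # 240Hz sample-and-hold: 4.0 px of blur
print(blur_px(speed, 360))         # 360Hz sample-and-hold: ~2.67 px
print(blur_px(speed, 240, 0.25))   # 240Hz with 25%-persistence strobing: 1.0 px
```

Which is why a strobed panel can out-do a higher-refresh sample-and-hold one on motion clarity, even with slower GtG.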
Don't forget that's 360Hz vs 240Hz in a competitive game. You talk like they are all 240Hz. You don't even need the full GtG to notice the pixel shift; 80% GtG is enough. You won't get an advantage using a 160-nit 240Hz OLED even with a slightly faster response time.
 
Funny, if you care about response time that much, you won't suddenly win a game with a 240Hz OLED against a 360Hz TN.
 
When HDR video became a thing, the UHD Alliance (consisting of companies like Dolby, Panasonic, etc.) established a standard called UHD Premium that defined what a TV had to be capable of to show HDR as it was meant to be back then, i.e. what counts as real HDR. The limits were 1000 nits peak brightness with 0.05 nits black (essentially FALD LCD), OR 600 nits with 0.0005 nits black (OLEDs, that is). These were very tight limits back then. Technology has since moved forward and both techs now easily surpass these limits, but it is still a clear standard, and calling OLED's version of HDR "just SDR" because LCD has evolved further is completely asinine and dishonest.
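For a sense of what those two certification tiers imply, the contrast ratios work out like this (straightforward arithmetic on the limits quoted above):

```python
# Contrast ratios implied by the two UHD Premium certification tiers.
def contrast_ratio(peak_nits, black_nits):
    return peak_nits / black_nits

lcd_tier  = contrast_ratio(1000, 0.05)    # FALD-LCD tier -> 20,000:1
oled_tier = contrast_ratio(600, 0.0005)   # OLED tier     -> 1,200,000:1

print(f"LCD tier:  {lcd_tier:,.0f}:1")
print(f"OLED tier: {oled_tier:,.0f}:1")
```

The standard traded peak brightness against black depth: the OLED tier allows 40% less peak light but demands blacks two orders of magnitude deeper.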
The current HDR standard is still rather demanding. There are not many good HDR1000 monitors out there. HDR needs many shades of 10-bit color lit at higher brightness, and we are only just starting to see 10-bit and 12-bit. OLED cannot show that much of the 10-bit range when ABL kicks in. You won't see 1024 shades crammed into 200 nits; you effectively see 8-bit at that level of brightness.
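For context on how 10-bit code values relate to brightness: HDR10 signals use the SMPTE ST 2084 PQ transfer function, and a quick sketch counts how many of the 1024 codes land at or below 200 nits (this doesn't by itself settle the 8-bit claim either way, since PQ deliberately concentrates codes in the darks):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: map luminance in nits to a normalized
# signal value, then count 10-bit codes at or below 200 nits.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    """Normalized PQ signal value in [0, 1] for a luminance in nits."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

codes_at_or_below_200 = int(pq_encode(200) * 1023) + 1
print(codes_at_or_below_200)  # roughly 590-595 of the 1024 codes
```

So well over half of the PQ code values sit below 200 nits, which is exactly why clipping or compressing the range above that point costs so much of the highlight detail the format reserves for bright content.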
 
Funny, somebody else said how important response time is. You want to add flames about how good OLED is at response time? Then get wrecked by TN.

This thread is more about OLED monitors for PC. It doesn't work when somebody busted out a strobed BenQ TN for his tricks.
 
This thread is more about OLED monitors for PC.
It doesn't work when OLED "HDR" falls off the chart.
It doesn't work when OLED is just a slightly faster SDR monitor.
It doesn't work when OLED has the most ABL, capping full-screen brightness around 200 nits APL.
It doesn't work when OLED has unsolvable flicker.
 
Funny that I never said anything about trying to win games.


Me either. We are talking about the tradeoffs of FALD HDR vs OLED HDR. Once you turn off FALD, or never had it on a screen to begin with, your screen is 1300:1 natively, or much less on other screens, which is very bad. A VA TV with no FALD can be 6100:1 or more, and an OLED is way beyond that: down to fully-off black, per pixel, right down to the razor's edge, pixel by pixel.


And competitive advantage?

Yeah, maybe if you are playing on a LAN or locally against AI/bots. That's how practically all of the testing videos do it.

Online gaming servers' low tick rates (horribly low in many games) and the server's ping compensation/interpolation more than muddy the results in online gaming. Add biased net-code choices depending on the devs' leanings on top of that. Marketing acts like display specs have a 1:1 relation to online gaming servers. That is false.
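As a rough illustration of the frame-time vs tick-interval mismatch being argued here (a sketch; the 20/64/128-tick figures are just common server rates, not numbers from this thread):

```python
# Display frame intervals vs server tick intervals, in milliseconds.
def interval_ms(rate_hz):
    return 1000.0 / rate_hz

for refresh in (240, 360):
    print(f"{refresh}Hz display: new frame every {interval_ms(refresh):.2f} ms")
for tick in (20, 64, 128):
    print(f"{tick}-tick server: new world state every {interval_ms(tick):.2f} ms")

# The ~1.4 ms gained going from 240Hz to 360Hz is small next to a 64-tick
# server's 15.6 ms update interval, before network jitter is even considered.
gain_ms = interval_ms(240) - interval_ms(360)
print(f"240Hz -> 360Hz gain: {gain_ms:.2f} ms")
```

The point being that the refresh-rate delta is an order of magnitude smaller than the server's own update granularity, so specs don't translate 1:1 into online results.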

Besides, online gaming is rife with thousands and thousands of cheaters, even clever low-key ones who keep it believable while carrying their teammates. Thousands of general-populace accounts banned, famous streamers of popular competitive games caught, and even professional competitors busted outright during LAN arena competitions. There are also dual-rig cheats using a secondary streaming rig to avoid detection, etc. Pretty competitive population, I guess, lol.

I really don't care about that. I play adventure games and some (sometimes relatively heavy) co-op games. You can go play on a practically black-and-white 1080p TN (exaggerating, but yeah..) with horrible contrast and black depth aggravated by a matte AG surface treatment and bad PPI-to-PPD all you want. Turn up the dark areas that are supposed to be shadowed while you are at it, and turn off all of the grass and anything else that can give you an angle better than your actual ability to play the game aesthetically. Minecraft gunners. :LOL:

. . .


That says nothing about the 4K FALD HDR media and gaming experience and its tradeoffs vs a modern 4K OLED HDR media and gaming experience anyway. Disingenuous misdirection: arguing the pros of FALD while referencing a non-FALD, edge-lit, non-HDR, 1080p TN with horrible blacks and low PPI-to-PPD, aggravated by a matte surface. Grasping at straws.
 
It's rather obvious a player like elvn, banging on his old CX, has seen neither better HDR images nor faster ones.

Who's being disingenuous in the first place, talking about HDR using native contrast without FALD?

Who's being disingenuous, dragging cheaters into a talk about games, lol.
 
The truth is always painful. There is money in it. You should check the sales of OLED monitors; it's pathetic. OLED doesn't work for monitors.

Even the XL2566K easily has 50 times the sales of that 27" 240Hz OLED. The monitor business is a grind. A fragile OLED doesn't stand a chance.
 