27" 240hz OLED Monitor!!! (but there's a catch...)

This thing is laughably only 200 nits bright. LG OLED is terrible.
200 nits is totally fine for SDR content in my experience. My Samsung G70A LCD is not the brightest, maxing out at around 400 nits but I run it at about 120 nits, day or night. I used the LG CX 48" at the same brightness level. For me this is totally ok because I can control for light in the room. It does not mean I use it in a dark cave, just that there is no direct sunlight hitting the screen or high brightness overhead lamps.

200 nits is also what the PG42UQ has for its "uniform brightness mode", so it's most likely how they avoid ABL in SDR content.

Hopefully in HDR it can perform similarly to the 42" OLEDs because that's where you want the higher brightness.
 
HDR needs brightness in addition to contrast. That's why my Samsung QD-OLED, which can do upwards of 1,500 nits, is so amazing. WRGB OLED is just poor tech unfortunately.
 
HDR needs brightness in addition to contrast. That's why my Samsung QD-OLED, which can do upwards of 1,500 nits, is so amazing. WRGB OLED is just poor tech unfortunately.
Except that very same QD-OLED is also limited to ~250 nits brightness for SDR. I would not be surprised if LG also adds a couple of different HDR modes like the AW QD-OLED: one for "True Black 400" DisplayHDR certification and an "about 1000 nits peak brightness in small window sizes" mode.

For a desktop display I would rather have an RWGB pixel structure over the triangle arrangement of QD-OLED. I do wish both companies just moved to a standard RGB structure for future desktop displays though.

As always, you can't win with display tech and get something that does everything just right.
 
Not surprised. People should be realistic about their expectations.

Current Samsung QD-OLED and LG W-OLED can only reach ~105 PPI. Suitable for 42" 4K, 34" UWQHD or 27" QHD.
Samsung also has other OLED tech used in laptops and mobile that can reach higher PPI, but their process seems to be limited to screen sizes of ~16 inches.
That leaves JOLED, which does have 27 inch 4K OLED panels, but they are uncompetitive as their process yield is not mass-market viable.

Also, given the generally stagnant economic outlook for 2023, much higher interest rates everywhere, and the massive flop of 8K display products, expect manufacturers to limit investments in bringing higher PPI devices to market.

One of the reasons why 42" 4K exists was the 80"+ 8K market. Unless we see smaller 8K TVs in the future, don't expect smaller 4K OLEDs!
I would expect to see a 50" UW4K before a 32" or 27" 4K OLED.

It's all about PPD (pixels per degree) and viewing angle. When comparing 4k screens, their perceived pixel density and viewing angle are the same when the viewing distance is scaled.

At 50 to 60 deg viewing angle on all sizes of 4k screens, you will get 64 PPD to 77 PPD.

Massaged or alternative text sub-sampling and aggressive graphics anti-aliasing (at a performance hit) start to compensate enough, versus grosser text fringing and graphics aliasing, at around 60 PPD, which on a 4k screen is about a 64 deg viewing angle. This works, though it's a bit outside of the 50 to 60 deg human viewpoint.

Beneath 60 PPD you will get text fringing and graphics aliasing more like what a 1500p screen would look like at traditional near-desk distances, and if you scale the text up you'll be dropping from 4k 1:1 to around 1500p-like desktop real-estate too. It's also worth noting that on the 2D desktop there is typically no AA for desktop graphics and imagery, just for text via text sub-sampling. So the aliasing there is entirely uncompensated outside of certain authoring apps' 3D viewports etc.
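To make those PPD figures concrete, here's a rough sketch of the arithmetic (my own back-of-the-napkin approximation, not from any spec: it just spreads the horizontal pixel count evenly across the horizontal viewing angle, and the helper names are made up for illustration):

```python
import math

def viewing_angle_deg(screen_width_in: float, distance_in: float) -> float:
    """Horizontal angle a flat screen subtends at a given viewing distance."""
    return math.degrees(2 * math.atan((screen_width_in / 2) / distance_in))

def approx_ppd(horizontal_px: int, angle_deg: float) -> float:
    """Average pixels per degree across that viewing angle (flat-screen approximation)."""
    return horizontal_px / angle_deg

# A 42" 16:9 screen is roughly 36.6" wide; at ~33" away it subtends about 58 degrees.
print(f"42in screen at 33in away: ~{viewing_angle_deg(36.6, 33):.0f} deg viewing angle")

for angle in (50, 60, 64):
    print(f"4k at {angle} deg viewing angle: ~{approx_ppd(3840, angle):.0f} PPD")
# -> ~77 PPD at 50 deg, ~64 PPD at 60 deg, ~60 PPD at 64 deg, matching the figures above.
```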


[image: tJWvzHy.png]


. .

[image: 3kU3adt.png]


. . .

On a flat screen, the edges of the screen are always off-axis by some amount. On OLED and VA screens, these off-axis extents of the screen show non-uniform color (OLED) or shift/shading (VA) gradients whose size grows the closer you sit to the screen.
The distortion field and eye-fatigue zone are still there at the optimal viewing angle (on a flat screen), but they are smaller. The edges of the screen are still as off-axis as if you were sitting an equivalent distance from them outside of the screen:

[image: XvKRu9t.png]


When you sit closer than 50 to 60 deg viewing angle, the sides are pushed more outside of your viewpoint causing a larger eye fatigue and non-uniform screen area on each side (as well as the PPD being driven down):

[image: RUdpoK8.png]
 
<...> the massive flop of 8K display products, expect manufacturers to limit investments in bringing higher PPI devices to market.

One of the reasons why 42" 4K exists was the 80"+ 8K market. Unless we see smaller 8K TVs in the future, don't expect smaller 4K OLEDs!
I would expect to see a 50" UW4K before a 32" or 27" 4K OLED.

I'd love a 48" curved 4k at 1000R for now but in the future a 55" 8k gaming display would be great. In an 8k - hopefully with gaming upscaling tech or nvidia AI upscaling hardware on the screen itself and DSC 3:1 if I'm making a wishlist, in order to bypass the cable/port bandwidth and upscale 4k and 4k-like uw resolutions. With a big wall of 8k screen you could run other window sizes or uw band on it while still getting 4k quads worth of desktop real-estate otherwise. I'm not holding my breath on that but it would be something to look forward to in the years ahead.

[image: MBPT56W.png]



From using the LTT resolution/bandwidth calculator, these uw resolutions look like they'd be nice to run on a 16:9 4k or 8k screen if they upped the Hz on OLED TVs, even with 4k upscaled to 8k and sent as an 8k signal as far as the display is concerned. They fit within HDMI 2.1's bandwidth, at least when using DSC's 3:1 compression ratio (rough math sketched after the list).

8k at 32:10 ultrawide 7680 × 2400 rez @ 200Hz, 10bit signal at 3:1 compression: 41.03 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

8k at 24:10 ultrawide 7680 × 3200 rez @ 150Hz, 10bit signal at 3:1 compression: 40.02 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

4k at 24:10 ultrawide 3840 × 1600 rez @ 500Hz, 10bit signal at 3:1 compression: 40.73 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

. . .

*could upscale 4k 32:10 to the 8k uw version at 7680 x 2400. As far as the screen is concerned 8k is 8k, though, if it's done at the gpu/pc end of the equation. You would get a higher frame rate when upscaling, of course.

4k at 32:10 ultrawide 3840 x 1200 rez @ 400Hz, 10 bit signal at 3:1 compression: 41.52 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/sec)

8k at 32:10 ultrawide 7680 × 2400 rez @ 200Hz, 10bit signal at 3:1 compression: 41.03 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)



. . .

8k at 32:10 ultrawide rez 345Hz, 10bit signal at 3:1 compression: 76.38 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)

8k at 24:10 ultrawide rez 270Hz, 10bit signal at 3:1 compression: 76.56 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)

4k at 24:10 ultrawide rez 780Hz, 10bit signal at 3:1 compression: 76.32 Gbit/s (DP 2.0 ~> 77.37 Gbit/s)
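For anyone wanting to sanity-check those numbers, here's a rough sketch of the math (my own approximation, not the LTT calculator: it counts only active pixels at 30 bits per pixel and divides by the DSC ratio, ignoring CVT-R2 blanking overhead, which is why it lands a bit under the calculator's figures; the dsc_rate_gbit helper is just made up for illustration):

```python
def dsc_rate_gbit(width: int, height: int, hz: int, bpc: int = 10, dsc_ratio: float = 3.0) -> float:
    """Approximate post-DSC data rate in Gbit/s for an RGB signal (active pixels only)."""
    bits_per_pixel = 3 * bpc                       # three color components per pixel
    raw_bits_per_sec = width * height * hz * bits_per_pixel
    return raw_bits_per_sec / dsc_ratio / 1e9

for w, h, hz in ((7680, 2400, 200), (7680, 3200, 150), (3840, 1600, 500)):
    print(f"{w}x{h} @ {hz} Hz, 10-bit, DSC 3:1: ~{dsc_rate_gbit(w, h, hz):.1f} Gbit/s "
          "vs HDMI 2.1's ~42 Gbit/s of usable data rate")
```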
 
200 nits is totally fine for SDR content in my experience. My Samsung G70A LCD is not the brightest, maxing out at around 400 nits but I run it at about 120 nits, day or night. I used the LG CX 48" at the same brightness level. For me this is totally ok because I can control for light in the room. It does not mean I use it in a dark cave, just that there is no direct sunlight hitting the screen or high brightness overhead lamps.

200 nits is also what the PG42UQ has for its "uniform brightness mode", so it's most likely how they avoid ABL in SDR content.

Hopefully in HDR it can perform similarly to the 42" OLEDs because that's where you want the higher brightness.

You are correct. RTINGS docked some screens a point or so in their monitor-usage scores for having slightly lower SDR brightness than screens that boost SDR brightness/whitepoint well above traditional ranges. However, they didn't account for the fact that, unlike the screens they gave the higher SDR score to, the ones with lower levels stay below the ABL/ASBL limit, which makes more sense when watching SDR. Unless you stay beneath that threshold, why boost your SDR higher when it can't be sustained and keeps dropping back down, and is already well above traditional SDR brightness/whitepoint levels anyway? Something to consider.

For gaming and movies, HDR (and, for movies, Dolby Vision movie and show releases, which Samsung doesn't support) is a priority for me. Auto HDR where possible in other games. It makes sense to have brightness limiters there because the tradeoff has much higher gains, and the brightness/color volume curve isn't relative like SDR's. For SDR I'd rather keep it in "SDR" ranges in the first place and avoid the more aggressive brightness/duration limiters entirely.
 
This thing is laughably only 200 nits bright. LG OLED is terrible.
These are ABL-disabled non-HDR settings.
Samsung is also only in the 200s league when ABL is disabled.

Just like a CRT/plasma shows brighter whites when the white areas are smaller, you also get nit-peaking behaviors on both Samsung and LG panels, but neither Samsung nor LG can do quadruple-digit nits when a large number of pixels are white.

(nit peaking? Sounds like I'm nit picking. Ha.)

This is discussing a difference smaller than IPS vs TN.

Both are vastly superior to LCD in sample and hold motion clarity.
 
These are ABL-disabled non-HDR settings.
Samsung is also only in the 200s league when ABL is disabled.

Just like a CRT/plasma shows brighter whites when the white areas are smaller, you also get nit-peaking behaviors on both Samsung and LG panels, but neither Samsung nor LG can do quadruple-digit nits when a large number of pixels are white.
I would also argue that for most desktop usage, that is plenty. If you like a very bright room, maybe not, but if you can control your light level you probably don't want it eye-searingly bright all the time. It's going to depend on your room and your preference, but even in a normally lit room, I find that I will set a display to less than 200 nits just picking what looks good to me. My laptop goes up to around 400 nits, and I've turned it all the way up in bright places like an airport with big windows and still wanted more, but when I set it to comfortable levels in my living room or at work and measure it, I'm usually at around 160-180 nits or so.

Not saying it wouldn't be nice if they could do higher levels, but they can't and I don't think it is a problem for most people. I think people forget how dim computer CRTs were.

Don't get me wrong, I'm looking forward to the day when display technology is such that they can maintain full-scene brightness level higher than we'd ever want so that it is never an issue, but I think people get a little too worked up about OLED brightness.
 
I think people forget how dim computer CRTs were.
Wrong, I am an active user of computer CRT monitors: Compaq 7550 and FW900 CRTs, and they are far from "dim". I can use them at their max luminance levels (contrast setting) in a moderately naturally lit room during the day, and also at night with a room light bulb turned on, with a perceivably bright, vivid, colorful, sharp image.

As for software BFI, for those interested or curious: there is a program called DesktopBFI that can be found by searching "desktopBFI"; it is hosted on GitHub. It doesn't work in fullscreen mode unfortunately, but it may be worth a try for those curious. In fact, I am curious to test it on my friend's LG C1 OLED to see what it does for 60 Hz content, where the C1's hardware BFI mode is poor even at its highest setting for 60 Hz (still noticeably blurry motion).
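For anyone curious what software BFI boils down to, here's a minimal conceptual sketch (my own toy example, not how DesktopBFI actually works: it assumes pygame is installed and that display.flip() happens to be vsync-locked to the panel refresh, which plain pygame does not guarantee; real tools hook the desktop compositor instead):

```python
import pygame

# Toy black-frame-insertion loop: show a content frame, then a black frame, repeatedly.
pygame.init()
screen = pygame.display.set_mode((800, 600))
clock = pygame.time.Clock()
show_black = False

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    # Stand-in "content": a mid-gray fill; a real tool would overlay the actual desktop.
    screen.fill((0, 0, 0) if show_black else (128, 128, 128))
    pygame.display.flip()
    show_black = not show_black   # every other refresh is black -> roughly 50% persistence
    clock.tick(120)               # pace at 120 Hz so the content cadence is effectively 60 Hz

pygame.quit()
```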
 
I think the dpi of 1440p at 27 inch is fine

on the plus side, this should lower the 120 hz models to well-below $500
1440p at 30,6" would be perfect 96PPI, where windows fonts are designed.
Sadly QHD is always 31.5" and 27" for some reason.
 
Wrong, I am an active user of computer CRT monitors: Compaq 7550 and FW900 CRTs, and they are far from "dim". I can use them at their max luminance levels (contrast setting) in a moderately naturally lit room during the day, and also at night with a room light bulb turned on, with a perceivably bright, vivid, colorful, sharp image.

The current calibration thread has them pegged at 105 nits, I think.
 
Wrong, I am an active user of computer CRT monitors: Compaq 7550 and FW900 CRTs, and they are far from "dim". I can use them at their max luminance levels (contrast setting) in a moderately naturally lit room during the day, and also at night with a room light bulb turned on, with a perceivably bright, vivid, colorful, sharp image.
Then please quote the measured luminance spec at 100% window. Back when I did CRTs (Lacie Electron22Blue IV) I calibrated it to about 80 nits. Pushed too high, and you'd get blur as the beam leaked to surrounding phosphors.

The current calibration thread has them pegged at 105 nits, I think.
That sounds like what I'd expect, particularly given they are older units and have probably faded over time. I don't remember what my Lacie quoted as its max, but it was less than 200, and 80 was the recommended calibrated level. Note that the original sRGB spec assumes an 80-nit white level and 64 lux of ambient illumination.
 
I don't have a colorimeter to take those measurements, and sure, something like around 100 nits sounds dim to some people "on paper", especially those who have not seen a CRT monitor working in good condition in a long time. In my opinion people should not base their judgments purely on measured numbers (nits). I cannot speak for a monitor like that Lacie, since I have not seen one; I am talking about what I am currently witnessing with my own eyes on the CRT monitors I mentioned, which are far from "dim" in the ambient light conditions I mentioned, and I definitely don't need the dark room that would be required to use a dim screen comfortably.
So "how dim computer CRTs were" is a wrong generalization.

as for "blur" at their max luminance levels, even modern games on the CRT monitors i mentioned look good sharp enough, clear readable smaller HUD, menu text even at their max brightness (luminance) levels and at a high enough resolutions and refresh rates.
 
Except that very same QD-OLED is also limited to ~250 nits brightness for SDR. I would not be surprised if LG also adds a couple of different HDR modes like the AW QD-OLED: one for "True Black 400" DisplayHDR certification and an "about 1000 nits peak brightness in small window sizes" mode.

Oh, I was referring to my S95B QD-OLED, not the 34" QD-OLED UW gaming panel. The S95B on older firmware gets insanely bright for an OLED. Here is a real pic of me playing The Callisto Protocol in the dark; it's like a 55" PG32UQX but without all of the LCD/FALD drawbacks:
 

Attachments

  • 774400_20221203_201450.jpg (191.6 KB)
Geez, even the 42" C2 can do over 300 nits sustained on a 25% window. I'm guessing this is going to be an "HDR 400 True Black" monitor, so obviously a peak brightness no greater than 400 nits, which isn't great. Doesn't exactly inspire confidence for the 32" version.
 
Oh, I was referring to my S95B QD-OLED, not the 34" QD-OLED UW gaming panel. The S95B on older firmware gets insanely bright for an OLED. Here is a real pic of me playing The Callisto Protocol in the dark; it's like a 55" PG32UQX but without all of the LCD/FALD drawbacks:

The S95B QD-OLED has ABL when pushing SDR content ranges to 470 nits, and it drops back and forth to 190 nits or less reflexively (maybe even 150 when done via ABL, not sure). We were saying you're better off just running 190-nit SDR than having ABL ping-pong in flatter SDR content. Making SDR's relative brightness range brighter isn't going to make it like HDR. HDR's mid ranges, low-mids, SDR ranges etc. remain in place while it shows brighter highlights and light sources and darker depths. HDR isn't just turning the brightness knob up. ABL is a worthwhile tradeoff for HDR's greater color volume throughout, but probably not worth having ABL kicking in within SDR's relative brightness range imo. If that's what you prefer, that's fine though. I just wouldn't run SDR like that personally.

RTings:

Update 08/05/2022: The peak brightness of the TV changed a bit with firmware update 1303. 2% windows are no longer dimmed by the TV, but nothing else has changed. The overall peak brightness of the TV in SDR is the same.

The Samsung S95B has good peak brightness in SDR. It's bright enough to overcome glare in bright rooms, but sadly, large bright scenes are dimmed considerably by the TV's Automatic Brightness Limiter (ABL). This is mainly distracting when watching sports with bright playing surfaces, like hockey. Setting Peak Brightness to 'Off' effectively disables the ABL feature, but also reduces the peak brightness to about 190 cd/m² in most scenes.

HDR and Auto HDR are a different matter (and I prioritize those anyway). Also if you were running brighter SDR on a FALD instead of an OLED's aggressive ABL it might make more sense to do so.
 
I don't have a colorimeter to take those measurements, and sure, something like around 100 nits sounds dim to some people "on paper", especially those who have not seen a CRT monitor working in good condition in a long time. In my opinion people should not base their judgments purely on measured numbers (nits),
That's saying "I want to ignore objective data because it doesn't confirm my feelings." Photons are photons to your eyes, there are not some magic photons that the CRT emits that other displays don't that look brighter, despite being the same level. So, if 100nits on a CRT is a brightness level that works well for you, then you'd discover that it would also work well on an OLED or other kind of display. On the other hand if it is serviceable, but less bright than you'd like, you'd probably find yourself turning up the brightness on a display technology that could do it.

My original point stands though: computer CRTs, even at their peak, didn't get as bright as OLED monitors do now, even though OLEDs don't get as bright as LCDs do. Thus I think that most people will find that OLEDs do just fine, unless they are in a really bright room. While most LCDs these days can get as bright as 400 nits or so, with some going beyond that, it is rare that you'd set them that high in a normal room. The 200 nits of sustained brightness that this screen provides is likely to be more than enough.
 
That's saying "I want to ignore objective data because it doesn't confirm my feelings." Photons are photons to your eyes, there are not some magic photons that the CRT emits that other displays don't that look brighter, despite being the same level. So, if 100nits on a CRT is a brightness level that works well for you, then you'd discover that it would also work well on an OLED or other kind of display. On the other hand if it is serviceable, but less bright than you'd like, you'd probably find yourself turning up the brightness on a display technology that could do it.

My original point stands though: computer CRTs, even at their peak, didn't get as bright as OLED monitors do now, even though OLEDs don't get as bright as LCDs do. Thus I think that most people will find that OLEDs do just fine, unless they are in a really bright room. While most LCDs these days can get as bright as 400 nits or so, with some going beyond that, it is rare that you'd set them that high in a normal room. The 200 nits of sustained brightness that this screen provides is likely to be more than enough.

That 200 nits is only on a 25% window. Full-field brightness will probably be between 100 and 150 nits, is my guess. That's good enough for me, as I run my desktop at 120 nits for SDR anyway, but I do understand that some people use their displays in a really bright room and would need a lot more than that. You are also absolutely right in that it's at least as good as a CRT in the brightness department.

[image: 1670267837051.png]
 
That 200 nits is only on a 25% window. Full field brightness will probably be between 100-150 nits is my guess. That's good enough for me as I run my desktop at 120 nits for SDR anyways but I do understand that some people use their displays in a really bright room and would need a lot more than that. You are also absolutely right in that it's at least as good as CRT in the brightness department.

View attachment 532043
Ahh gotcha, the earlier post in this thread seemed to imply 200 nits on a 100% window with ABL/APL disabled. That would also work for me (115 nits in my case), but ya, I can see people running into cases where they want more. I run about 160 nits at work, last time I measured, because the room is pretty bright. It also depends on how much you want it to pop against the background lighting. At work my display is pretty sedate since it is just for text; I am not interested in high contrast, rather the opposite, since high-contrast images can get fatiguing. However, at home I want it to pop more: more contrast and a brighter image relative to the ambient lighting.

I'm hopeful that OLEDs will work out the peak brightness, and the burn-in, issues within a few more generations. They have already gotten a lot better than when they first launched. Hopefully in 5 years it'll be to the point where they really don't have many tradeoffs vs LCD and they start to become the default technology for displays.
 
That 200 nits is only on a 25% window. Full-field brightness will probably be between 100 and 150 nits, is my guess. That's good enough for me, as I run my desktop at 120 nits for SDR anyway, but I do understand that some people use their displays in a really bright room and would need a lot more than that. You are also absolutely right in that it's at least as good as a CRT in the brightness department.

View attachment 532043



Yeah, I wouldn't watch a movie projector on the beach either. Exaggerating of course ☀️😎

..

Ahh gotcha, the earlier post in this thread seemed to imply 200 nits on a 100% window with ABL/APL disabled. That would also work for me (115 nits in my case), but ya, I can see people running into cases where they want more. I run about 160 nits at work, last time I measured, because the room is pretty bright. It also depends on how much you want it to pop against the background lighting. At work my display is pretty sedate since it is just for text; I am not interested in high contrast, rather the opposite, since high-contrast images can get fatiguing. However, at home I want it to pop more: more contrast and a brighter image relative to the ambient lighting.

I'm hopeful that OLEDs will work out the peak brightness, and the burn-in, issues within a few more generations. They have already gotten a lot better than when they first launched. Hopefully in 5 years it'll be to the point where they really don't have many tradeoffs vs LCD and they start to become the default technology for displays.


From the RTINGS review of the Samsung S95B QD-OLED, SDR brightness section, since that has been compared in recent comments:

Sustained 100% Window
200 cd/m²

Automatic Brightness Limiting (ABL)
0.057

It can kick small percentages of the screen up to 470 nits, but it's still a flatter SDR range, unlike HDR. And in doing so it ping-pongs back and forth as ABL gets triggered.

Update 08/05/2022: The peak brightness of the TV changed a bit with firmware update 1303. 2% windows are no longer dimmed by the TV, but nothing else has changed. The overall peak brightness of the TV in SDR is the same.
The Samsung S95B has good peak brightness in SDR. It's bright enough to overcome glare in bright rooms, but sadly, large bright scenes are dimmed considerably by the TV's Automatic Brightness Limiter (ABL). This is mainly distracting when watching sports with bright playing surfaces, like hockey. Setting Peak Brightness to 'Off' effectively disables the ABL feature, but also reduces the peak brightness to about 190 cd/m² in most scenes.
 
So this monitor is not using QD-OLED, and it also won't come with a burn-in warranty? That worries me a bit, but otherwise this seems like the dream monitor to me.
 
So this monitor is not using QD-OLED, and it also won't come with a burn-in warranty? That worries me a bit, but otherwise this seems like the dream monitor to me.

Definitely not QD-OLED. As for a burn-in warranty, we'll just have to wait and see if they announce one when they officially reveal the monitor at CES.
 
All of LG's displays have had 1-year warranties in North America. I don't see how this will be any different.
 
SDR/Rec.709 is based around reproducing the color range/gamut/luminosity of a CRT tube, which is generally between 100 and 120 cd/m².

If you make SDR content brighter, all you're doing is shifting everything up, including the black point, assuming you're still maintaining its reproducible gamut range. Otherwise, if the black point remains the same, you're stretching the image out (in terms of color/contrast).
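As a rough illustration of that "shifting everything up" point (a toy sketch assuming a pure 2.2 power gamma and a zero-nit black, ignoring sRGB's linear toe; the sdr_luminance helper is made up for illustration):

```python
def sdr_luminance(code: int, white_nits: float, gamma: float = 2.2) -> float:
    """Map an 8-bit SDR code value (0-255) to display luminance in nits."""
    return white_nits * (code / 255) ** gamma

# Raising the SDR white level scales every code value by the same factor,
# so shadows rise right along with the highlights.
for code in (16, 64, 128, 235):
    at_100 = sdr_luminance(code, 100)
    at_400 = sdr_luminance(code, 400)
    print(f"code {code:3d}: {at_100:8.3f} nits @ 100-nit white -> {at_400:8.3f} nits @ 400-nit white (x{at_400 / at_100:.1f})")
```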

If you’re trying to watch TV in a bright room, I get it, you just want to see the darn thing. But if you’re trying to view it accurately, you’d never want it to be 400nits.

For desktop, it’s much the same. If you’re going for SRGB/Rec709 and trying to reproduce things properly then calibrating for 100cdm2 is generally what is recommended (or 120 if you like it a bit brighter). And that is what I do if I’m grading SDR content. If accuracy isn’t something you care about, most of what you do is business/spreadsheets/gaming, whatever then obviously you don’t care. Film/TV isn’t the only application for a monitor, I’m not an idiot. But all of those things still are using the same color spaces to display those things, it’s just that perhaps for you, accuracy for those things doesn’t matter much.

Personally I don’t know how/why anyone would want to stare at 400nits of brightness on a computer display. That is very bright. Perhaps if and only if I needed to combat a room that I couldn’t control the light, but either way that would be an unpleasant experience.
 
For text viewing, definitely not.

But OLED appeals to computer gamers, and for games with sufficient adjustability there are definitely cases where you want to configure extremely high SDR peaking behavior at small window sizes (10% and under) for certain re-tunable SDR content.

Some great games like, say, "Bioshock Infinite" and "Half Life 2" have adjustable black points where you can keep caves inky black while making highlights brighter. Some of these games give excellent "HDR enhanced" experiences, so you can kind of turn them into very good looking pseudo-HDR from their existing SDR behaviors by allowing them to peak very high at sub-10% windows. Realtime SDR to pseudo-HDR is very viable with select older SDR games.

These games actually used to simulate "HDR" on SDR via auto-iris effects and bloom effects, but as a side effect, they are highly tweakable to behave very well with HDR displays with a bigger color gamut, especially when re-tuning their adjustable black point to stay inky black.

Also, instead of an HDR simulator mode, they can go true HDR with some minor modifications. Half Life 2 Remastered (the new 2023 edition), and variants such as the indie VR version, manages to unlock its SDR-simulated HDR into more true HDR by exposing the color gamut it was actually already running internally (and clipping to SDR in its HDR simulator mode), bypassing the SDR tonemapping for the HDR engine and replacing it with proper HDR tonemapping. Funny how mods to a nearly 20-year-old game manage to unlock HDR.
 
For text viewing, definitely not.

But OLED appeals to computer gamers,
On this we agree. Good display tech is good display tech. Everyone benefits regardless of purpose.
and for games with sufficient adjustability there are definitely cases where you want to configure extremely high SDR peaking behavior at small window sizes (10% and under) for certain re-tunable SDR content.

Some great games like, say, "Bioshock Infinite" and "Half Life 2" have adjustable black points where you can keep caves inky black while making highlights brighter. Some of these games give excellent "HDR enhanced" experiences, so you can kind of turn them into very good looking pseudo-HDR from their existing SDR behaviors by allowing them to peak very high at sub-10% windows. Realtime SDR to pseudo-HDR is very viable with select older SDR games.
Right, basically the short version is that if you can get a game to display luminosity past its original SDR range at least to some degree, there is a use case for a 10% window being at 400 nits.

I get it. But my responses were more geared towards people commenting that 400 nits sustained on a 100% window in SDR (aka full-screen brightness) was too low. I was mostly stating, in response to those comments, that that wouldn't be a pleasant experience, and I would say that regardless of whether we were talking about gaming, spreadsheets, or film. Again, unless you're in a very bright environment, which would be a sub-optimal viewing experience either way.
These games actually used to simulate "HDR" on SDR via auto-iris effects and bloom effects, but as a side effect, they are highly tweakable to behave very well with HDR displays with a bigger color gamut, especially when re-tuning their adjustable black point to stay inky black.

Also, instead of an HDR simulator mode, they can go true HDR with some minor modifications. Half Life 2 Remastered (the new 2023 edition), and variants such as the indie VR version, manages to unlock its SDR-simulated HDR into more true HDR by exposing the color gamut it was actually already running internally (and clipping to SDR in its HDR simulator mode), bypassing the SDR tonemapping for the HDR engine and replacing it with proper HDR tonemapping. Funny how mods to a nearly 20-year-old game manage to unlock HDR.
HDR gaming (and movies) is of course an entirely different matter. The goal will always be to get monitors that can finally get us to 10,000 nits. Until then, of course we want 1000 nits of peak brightness for a "minimum optimal" HDR experience. There is no disagreement about HDR. Or if you can hack games to get HDR, again that's a totally different thing.
 
Why are some people so obsessed with brightness? It's not like CRTs produced massive brightness either. With calibration, you usually end up with like 15-20% of the brightness setting anyway. What's the problem here? Enlighten me (no pun intended).
 
Why are some people so obsessed with brightness? It's not like CRTs produced massive brightness either. With calibration, you usually end up with like 15-20% of the brightness setting anyway. What's the problem here? Enlighten me (no pun intended).

Is it really an obsession to want a more similar brightness level across the size range, especially when the PPI is pretty similar? The problem here is that the larger models are capable of 800 nits peak without a heatsink, or 1000 nits with one, and that really helps in delivering a great HDR experience, so HDR is the problem here. This monitor will probably be no higher than 400 nits even though the PPI is about the same as the 4K 42" C2, which can still do 700 nits peak, so the whole argument about higher PPI = less brightness can be thrown out the window. HDR isn't just about true black levels; that's just one part of the equation. If one were truly obsessed with brightness, they would be flocking towards LCDs in the first place. If you do not care about HDR then there is no problem with the brightness at all, actually. It's totally suitable for SDR content, and who knows, maybe that's what the majority of users will use it for.
 
Why are some people so obsessed with brightness? It's not like CRTs produced massive brightness either. With calibration, you usually end up with like 15-20% of the brightness setting anyway. What's the problem here? Enlighten me (no pun intended).
Why is a monitor in 2022 being compared to a CRT? A lot has happened since then.
 
Why is a monitor in 2022 being compared to a CRT? A lot has happened since then.
Because like it or not, those are still the monitors to beat. Those of us who have had high-end CRTs have been awaiting this moment for a long time: something that trumps them on all fronts. OLED is certainly poised to do so. Motion clarity is the last frontier.
 
Because like it or not, those are still the monitors to beat.
That is HIGHLY subjective. If all you want to do is gaming, then MAYBE. But in terms of things like total luminance, peak luminance, width of gamut (or, otherwise stated, color volume/accuracy etc.), high Hz, and maximum resolution, CRT has been "destroyed" for a very long time. And this is not to say anything about CRT flicker, eye strain, geometric distortion (straight lines), or size/weight.

Basically if your standard is visual fidelity at all, then CRT has been underwater for some time. CRT's major advantage as you note has been latency and motion clarity. You can only make those subjective statements if you only prioritize those two things at the cost of every other metric a monitor has.
Those of us who have had high-end CRTs have been awaiting this moment for a long time: something that trumps them on all fronts. OLED is certainly poised to do so. Motion clarity is the last frontier.
Again, CRT has been destroyed basically on all the fronts that matter for >99.9% of people. If you are the small minority that want to play fighters with arcade cabinet levels of accuracy, you're in an extreme minority case. I don't disagree if all you prioritize is the things I mentioned in the first sentence and you're welcome to your position, but you're obviously finding that the "unwashed masses" don't feel that way. And not acknowledging your extreme bias is just going to raise eyebrows.

EDITs: spelling/grammar not content.
 
That is HIGHLY subjective. If all you want to do is gaming, then MAYBE. But in terms of things like total luminance, peak luminance, width of gamut (or, otherwise stated, color volume/accuracy etc.), high Hz, and maximum resolution, CRT has been "destroyed" for a very long time. And this is not to say anything about CRT flicker, eye strain, geometric distortion (straight lines), or size/weight.

Basically if your standard is visual fidelity at all, then CRT has been underwater for some time. CRT's major advantage as you note has been latency and motion clarity. You can only make those subjective statements if you only prioritize those two things literally at the cost of every other metric a monitor has.

Again, CRT has been destroyed basically on all the fronts that matter for >99.9% of people. If you are the small minority that want to play fighters with arcade cabinet levels of accuracy, you're in an extreme minority case. I don't disagree if all you prioritize is the things I mentioned in the first sentence and you're welcome to your position, but you're obviously finding that the "unwashed masses" don't feel that way. And not acknowledging your extreme bias is just going to raise eyebrows.

I used a graphics professional FW900 crt alongside LCDs for years.

The focus/convergence issues require working under the hood (with some electrical danger due to capacitors potentially). They all develop a glow and/or fading issue over time as well regardless and can eventually just start to pop or just die or become unusable - so if you think burn in is somewhat of a concern on an OLED, you are on a potential time bomb on a CRT produced in 2003.

They require around a half hour to warm up before they are at their full saturation and contrast (if they can even reach it anymore); that is a pretty big deal imo. They are subject to electrical interference (imperfect source/line noise, blips on the circuit from other things in the house or on the same circuit) and magnetic interference (speakers, etc), plus moire issues - all avoidable with some precautions or investments, but still. They are also very small in screen size compared to modern displays at 23.5" diagonal, and relatively small in max rez and desktop real-estate.

100 nit? HDR? ~700 nit+ HDR is the standard at this point imo for media excellence. HDR is a game changer, even more than 4k fidelity is, which is another reason BFI is mostly out now too imo, at least with current technologies. If you run 120fpsHz you can still reduce the blur by about 50%, down to where it starts being seen as more of a soft blur effect during viewport motion at speed, and it will be reduced more as we get to higher fpsHz with higher Hz screens plus as frame amplification technologies progress (perhaps borrowing from VR's more advanced tech in that area). I love motion clarity and it would be great to see full clarity while moving the viewport around at speed, but the tradeoffs are way too great now as far as I'm concerned. It's not even close anymore in overall definition, space (real-estate on screen as well as the physical dimensions of the screen), and most importantly picture quality in many facets plus the overall impact of HDR color volumes in media and game worlds.
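That ~50% figure is just the sample-and-hold persistence rule of thumb; a quick sketch of it (my own approximation, assuming full-persistence sample-and-hold with no strobing/BFI and perfect eye tracking; the speed value and helper name are made up for illustration):

```python
def blur_width_px(speed_px_per_sec: float, fps_hz: float, persistence_fraction: float = 1.0) -> float:
    """Approximate perceived blur width in pixels for eye-tracked motion."""
    frame_visibility_s = persistence_fraction / fps_hz
    return speed_px_per_sec * frame_visibility_s

speed = 960  # example panning speed in pixels per second
for hz in (60, 120, 240):
    print(f"{hz:3d} fps/Hz: ~{blur_width_px(speed, hz):.0f} px of blur")
# Doubling fps/Hz halves the blur; strobing/BFI lowers persistence_fraction instead.
```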

If you still want to mess with them, more power to you (literally) 😁 Some hobbyists still develop film too, while most just use digital imagery now. Nothing wrong with it really.

It's as if digital filming had ~50% motion blur when wheeling the camera around at high speed or during periods of high viewpoint/viewport motion (go-pro action video capture etc.), and was on the road to less, while film cameras could capture zero-blur motion shots (after tweaking your camera inside and out periodically, and after warming it up for a half hour on site every single time) - yet the film cameras were incapable of capturing and producing HDR color volume images and video (or any images above ~100 nit), incapable of developing action film/images at higher resolutions, and incapable of doing larger prints or displaying videos on larger screens. (I know that's not how film works; this is just an "as if" analogy.)
 
I used a graphics professional FW900 crt alongside LCDs for years.

The focus/convergence issues require working under the hood (with some electrical danger due to capacitors potentially). They all develop a glow and/or fading issue over time as well regardless and can eventually just start to pop or just die or become unusable - so if you think burn in is somewhat of a concern on an OLED, you are on a potential time bomb on a CRT produced in 2003.

They require around a half hour to warm up before they are at their full saturation and contrast (if they can even reach it anymore); that is a pretty big deal imo. They are subject to electrical interference (imperfect source/line noise, blips on the circuit from other things in the house or on the same circuit) and magnetic interference (speakers, etc), plus moire issues - all avoidable with some precautions or investments, but still. They are also very small in screen size compared to modern displays at 23.5" diagonal, and relatively small in max rez and desktop real-estate.

100 nit? HDR? ~700 nit+ HDR is the standard at this point imo for media excellence. HDR is a game changer, even more than 4k fidelity is, which is another reason BFI is mostly out now too imo, at least with current technologies. If you run 120fpsHz you can still reduce the blur by about 50%, down to where it starts being seen as more of a soft blur effect during viewport motion at speed, and it will be reduced more as we get to higher fpsHz with higher Hz screens plus as frame amplification technologies progress (perhaps borrowing from VR's more advanced tech in that area). I love motion clarity and it would be great to see full clarity while moving the viewport around at speed, but the tradeoffs are way too great now as far as I'm concerned. It's not even close anymore in overall definition, space (real-estate on screen as well as the physical dimensions of the screen), and most importantly picture quality in many facets plus the overall impact of HDR color volumes in media and game worlds.

If you still want to mess with them, more power to you (literally) 😁 Some hobbyists still develop film too, while most just use digital imagery now. Nothing wrong with it really.

It's as if digital filming had ~50% motion blur when wheeling the camera around at high speed or during periods of high viewpoint/viewport motion (go-pro action video capture etc.), and was on the road to less, while film cameras could capture zero-blur motion shots (after tweaking your camera inside and out periodically, and after warming it up for a half hour on site every single time) - yet the film cameras were incapable of capturing and producing HDR color volume images and video (or any images above ~100 nit), incapable of developing action film/images at higher resolutions, and incapable of doing larger prints. (I know that's not how film works; this is just an "as if" analogy.)
I think you're quoting the wrong person. FWIW, I agree. However, I focused my discussion on the easiest and most obvious things that are tangible for people "still" using a CRT, even if they don't believe the other issues.
 
I think you're quoting the wrong person. FWIW, I agree. However, I focused my discussion on the easiest and most obvious things that are tangible for people "still" using a CRT, even if they don't believe the other issues.

I was agreeing with you and going into more detail with my take on it. Not every quote is an argument against the quoted material, but I can understand how you might get punch-drunk from some discussions. :cool:
 
Deep breaths gentlemen.... :) I understand your perspective. And you're right. In terms of format compatibility, CRT was left back in the early 2000's:

Rec 601, SDR, resolution only barely surpassing 1080p. Yep! All valid points and totally a product of the era in which they were produced and when they were discontinued. And yeah, they had their weak points too. elvn - I totally forgot about the 30 min warm up. All Sony GDM's had that. Pretty annoying.

That's not what I'm talking about. Take a step back and look at it from a 100ft view. We used to have displays with rich contrast, excellent viewing angles, excellent motion resolution. Even though there were differences in CRT tech... Unless you bought a straight trash display, you were gonna get a decent image regardless.

We don't have that anymore. OLED's the closest thing to an all-around, excellent-at-everything monitor. Not perfect, but all-out excellent. CRT was just an excellent display. And let's admit it: if someone were to release a monitor that more-or-less resembled what the FW900 did even back in the day:

- 2560x1440 (yes I know it was 2304x1440 but for the sake of argument let's assume it's a 16:9 display)
- 10,000:1 contrast
- perfect, cross-talk free motion clarity that's truly 1ms persistence across the refresh range
- 100% sRGB.

You'd all preorder that thing in a heartbeat. (We all would…)
 
Deep breaths gentlemen.... :) I understand your perspective. And you're right. In terms of format compatibility, CRT was left back in the early 2000's:

Rec 601, SDR, resolution only barely surpassing 1080p. Yep! All valid points and totally a product of the era in which they were produced and when they were discontinued. And yeah, they had their weak points too. elvn - I totally forgot about the 30 min warm up. All Sony GDM's had that. Pretty annoying.

That's not what I'm talking about. Take a step back and look at it from a 100ft view. We used to have displays with rich contrast, excellent viewing angles, excellent motion resolution. Even though there were differences in CRT tech... Unless you bought a straight trash display, you were gonna get a decent image regardless.
I never got to own an FW900, but I did own a Viewsonic P95f+ (brand new even, at the time). There were better displays than what I owned obviously, however that is kinda the point. Even a monitor that was considered top 85% in terms of consumer displays is greatly outclassed by everything we have now.
I saw MANY trash CRT displays. I think you're misremembering things. Even my P95f+, which was considered to be an upper-tier display, maxed out at 1280x1024, and it was 4:3. Most of them didn't have super high scanning modes, and if they did, their accuracy might drop and the image became blurry. It's a similar story for high scanning modes and Hz. Also, before HDTV existed, everyone was using CRT TVs that were 480i. If you add TVs into the equation, the number of bad CRT displays I've seen goes up astronomically.
You'll have a hard time even listing half a dozen CRT monitors that anyone would consider using now, precisely because the specs are so far behind and anything related to fidelity is so poor, even if you could magically have them brand new as if manufactured yesterday.

It's also REALLY telling that other than the FW900, NO ONE is trying to preserve any other CRT (and yes, I'm including the HP-rebranded one as "an FW900"). Basically only the last CRT with the best technology was even worth looking at or salvaging. And even that's niche. (And yes, arcade cabinets, another niche, but that's neither here nor there, and a bad argument considering they were all 60 Hz 480i/p 4:3 TVs that no one would want to use for any other use case.)
We don't have that anymore. OLED's the closest thing to an all-around, excellent-at-everything monitor. Not perfect, but all-out excellent. CRT was just an excellent display. And let's admit it: if someone were to release a monitor that more-or-less resembled what the FW900 did even back in the day:

- 2560x1440 (yes I know it was 2304x1440 but for the sake of argument let's assume it's a 16:9 display)
- 10,000:1 contrast
- perfect, cross-talk free motion clarity that's truly 1ms persistence across the refresh range
- 100% sRGB.

You'd all preorder that thing in a heartbeat.
I really wouldn't. I'm only interested in professional displays at this point, because that's what I need for work. If it doesn't have 4k I won't even consider it. I would actually prefer DCI 6k 2:1 at 34", if I could magically make a display size/resolution of my choosing. Of course the display meets full sRGB; that's kind of a joke because that's literally the color space of a CRT as recreated on LCD and other display technologies. However, sRGB as a color space is far too narrow. I'm waiting on OLED and MiniLED price/maturity precisely because they can be used to grade in HDR10 color spaces and above. Even ignoring HDR, not being able to reproduce Adobe RGB, much less ProPhoto RGB, for stills photography sucks. And not being able to display DCI-P3 for films also sucks.

10,000:1 contrast ratio also doesn't exist in a vacuum (tube, dad jokes). Considering that 10-bit sRGB is capable of 1.06 billion colors, which also has luminosity requirements, you'd have a hard time showing/proving this accurately. 10,000:1 at 100 nits is basically stating a black level of 0.01 nits, i.e. that you can see a 1/100th of a nit change in value. I'll just say you probably can't see that with your eyes, and I would question the measuring devices as well. OLED makes those statements only because it's capable of true black, or a true off state. In other words, its contrast ratios are puffed up as well.

I won't argue the motion clarity, though I'd say that OLED is there, or nearly there. But considering all of the issues CRT has with flicker, alignment, etc., you may have motion clarity while at the same time just having a less usable (readable) image. And this is to say nothing about the perfect geometry and 1:1 pixels that other display tech can reproduce. I do not miss having to change/fix the scanning area, degaussing, or spending time in service menus at all. I also don't miss staring at a CRT, in any way, shape, or form. Seeing 50 or 60 Hz scanning on a CRT is absolute misery.

I'd say the only real advantage that CRT has over OLED/MiniLED is resolution independence. However even that is debatable because basically every monitor still worked better at certain resolutions than others even though there wasn't a "pixel grid".


Again, in that entire 4-item spec sheet, the only thing that isn't completely outclassed by modern displays is arguably "just" motion clarity. I do not envy those specs, nor would I purchase them if they were available on a brand new display. The OLED that we're talking about (and off topic on) in this thread is already a much better fit for me than a CRT matching those specs, even if I were not to prioritize HDR, resolution, or greater color gamut reproduction.

EDIT: Also, I'm not trying to be pedantic. Even if those specs were all released on an OLED or some other special magical display type and not a CRT, I still wouldn't buy it.
 
I never got to own an FW900, but I did own a Viewsonic P95f+ (brand new even, at the time). There were better displays than what I owned obviously, however that is kinda the point. Even a monitor that was considered top 85% in terms of consumer displays is greatly outclassed by everything we have now.
I saw MANY trash CRT displays. I think you're misremembering things. Even my P95f+, which was considered to be an upper-tier display, maxed out at 1280x1024, and it was 4:3. Most of them didn't have super high scanning modes, and if they did, their accuracy might drop and the image became blurry. It's a similar story for high scanning modes and Hz. Also, before HDTV existed, everyone was using CRT TVs that were 480i. If you add TVs into the equation, the number of bad CRT displays I've seen goes up astronomically.
You'll have a hard time even listing half a dozen CRT monitors that anyone would consider using now, precisely because the specs are so far behind and anything related to fidelity is so poor, even if you could magically have them brand new as if manufactured yesterday.

It's also REALLY telling that other than the FW900, NO ONE is trying to preserve any other CRT (and yes, I'm including the HP-rebranded one as "an FW900"). Basically only the last CRT with the best technology was even worth looking at or salvaging. And even that's niche. (And yes, arcade cabinets, another niche, but that's neither here nor there, and a bad argument considering they were all 60 Hz 480i/p 4:3 TVs that no one would want to use for any other use case.)

I really wouldn't. I'm only interested in professional displays at this point, because that's what I need for work. If it doesn't have 4k I won't even consider it. I would actually prefer DCI 6k 2:1 at 34", if I could magically make a display size/resolution of my choosing. Of course the display meets full sRGB; that's kind of a joke because that's literally the color space of a CRT as recreated on LCD and other display technologies. However, sRGB as a color space is far too narrow. I'm waiting on OLED and MiniLED price/maturity precisely because they can be used to grade in HDR10 color spaces and above. Even ignoring HDR, not being able to reproduce Adobe RGB, much less ProPhoto RGB, for stills photography sucks. And not being able to display DCI-P3 for films also sucks.

10,000:1 contrast ratio also doesn't exist in a vacuum (tube, dad jokes). Considering that 10-bit sRGB is capable of 1.06 billion colors, which also has luminosity requirements, you'd have a hard time showing/proving this accurately. 10,000:1 at 100 nits is basically stating a black level of 0.01 nits, i.e. that you can see a 1/100th of a nit change in value. I'll just say you probably can't see that with your eyes, and I would question the measuring devices as well. OLED makes those statements only because it's capable of true black, or a true off state. In other words, its contrast ratios are puffed up as well.

I won't argue the motion clarity, though I'd say that OLED is there, or nearly there. But considering all of the issues CRT has with flicker, alignment, etc., you may have motion clarity while at the same time just having a less usable (readable) image. And this is to say nothing about the perfect geometry and 1:1 pixels that other display tech can reproduce. I do not miss having to change/fix the scanning area, degaussing, or spending time in service menus at all. I also don't miss staring at a CRT, in any way, shape, or form. Seeing 50 or 60 Hz scanning on a CRT is absolute misery.

I'd say the only real advantage that CRT has over OLED/MiniLED is resolution independence. However even that is debatable because basically every monitor still worked better at certain resolutions than others even though there wasn't a "pixel grid".


Again, in that entire 4-item spec sheet, the only thing that isn't completely outclassed by modern displays is arguably "just" motion clarity. I do not envy those specs, nor would I purchase them if they were available on a brand new display. The OLED that we're talking about (and off topic on) in this thread is already a much better fit for me than a CRT matching those specs, even if I were not to prioritize HDR, resolution, or greater color gamut reproduction.
I'm probably not communicating clearly here. I'll just stop because it's off topic. Obviously you're a content creator/image professional, so your needs are different from my own. Back to the OLED at hand: I will say that what I AM excited about is that it's an OLED PC monitor... and it's $999. I'm not expecting BFI (but hoping - that would be a nice surprise), but I think it's a very nice step in the right direction. Until then, for gaming I'm sticking with the Viewsonic blurbusters. I think I do have a Viewsonic P95f+ UltraBright I still keep around; been forever since I used it though. I mostly relegate it to older 60 Hz console ports on the PC because damn - that motion fluidity.
 