Why OLED for PC use?

My statement was in regard to the transition from CRT, with full on/off contrast ratios or dynamic range in excess of 15,000:1, to LCDs with a small fraction of that. HDR with its further expanded color and contrast is great, but will not be as impressive to someone transitioning from another great technology.
Have you ever tested that contrast ratio? I'm not being snarky here; I'm curious. As I said, I didn't get to measure many CRTs - it wasn't until near the end of their run that I was able to get a calibrator - but when they were warmed up, I didn't measure their contrast as being that impressive.

When my TV broke last year, my GDM-F520 became my TV for a bit as I was figuring out what to do. Even though on paper it has a fraction of today's displays' color and contrast, it could still have a spectacular picture. One that already looked closer to HDR than, say, an LCD without FALD anyway...
I guess... if you only care about black point, and then not even about really using those near-black levels. With a 2.2 gamma curve and an 8-bit DAC, you don't get a ton of resolution in the low levels. So once you have enough contrast ratio that the difference between a code value of 0 and 1 is noticeable, any more CR than that just makes that step more noticeable. It doesn't give you additional detail.
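
To put rough numbers on that, here's a quick back-of-the-envelope sketch. It's minimal and hypothetical: a pure 2.2 power law and an assumed 100 cd/m² peak white, which are my own illustrative assumptions rather than anything stated above.

```python
# Back-of-the-envelope: how dark is the first 8-bit step under a pure 2.2 gamma?
# Assumes a hypothetical SDR display with 100 cd/m^2 peak white.

PEAK_WHITE = 100.0  # cd/m^2, assumed reference white

def luminance(code, bits=8, gamma=2.2, peak=PEAK_WHITE):
    """Luminance of an integer code value under a pure power-law gamma."""
    return peak * (code / (2**bits - 1)) ** gamma

l1 = luminance(1)      # first step above black
l255 = luminance(255)  # peak white

print(f"code 1   -> {l1:.5f} cd/m^2")   # ~0.0005, roughly 1/200,000 of peak
print(f"code 255 -> {l255:.1f} cd/m^2")

# Once the display's native black level sits below the code-1 luminance, extra
# contrast ratio no longer reveals new shadow detail in 8-bit SDR content --
# it mostly just makes the 0 -> 1 step stand out more.
for cr in (1000, 3000, 15000, 200000):
    black = l255 / cr
    where = "below" if black < l1 else "above"
    print(f"CR {cr:>7}:1 -> native black {black:.6f} cd/m^2 ({where} the code-1 level)")
```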

I mean, if all you care about, HDR-wise, is a lower black point / on-paper contrast... ok, I guess. But what I find impactful about HDR is the brightness differences. It's a light shining brightly in the darkness, bright glints of reflection off a shiny surface, etc. I never saw that on CRT; it wasn't until OLED/FALD LCD that I got that sort of thing.

Good reply. Have to say that FALD still raises blacks around lit areas in a scene though. It's non-uniform, using a tetris-like brickwork of zone shapes, often more than one cell wide, like a short gradient, in order to avoid more overt blooming (blending it across zones). Larger dark areas vs larger bright areas, like testing contrast ratings using large white and black tiles or full-screen fields, will produce the larger contrast numbers, but the mixed-contrast areas and the areas around their edges drop back to 3000:1 to 5000:1, since the brighter zones are lifting the blacks/darks and/or dimming the brights (and can lose some detail on either end).
Though again, you get into the limits of human perception with things like that. You are right that light from a zone affects other zones and that you can't do hard transitions from bright to dark. However, it turns out that our eyes aren't really great at that either. That's why things like bias lighting and so on work in the real world. If I shine a light at you, it ruins your ability to see darker details around the area of that light. It's a trick I've been known to use to hide Halloween scare props. Likewise, light in the real world bleeds, so the bleed-over between zones often works for how things actually look.

Still, individual per-subpixel lighting like OLED has is the ultimate path we want to take forward. Just saying that because MiniLED can push brightness higher than OLED, sometimes the HDR impact actually ends up being more impressive, depending on the content. Vincent Teoh of HDTVTest has said the next gen TCL TVs with 5000 nit brightness are pretty amazing for HDR because they can hit such a high level.
 
My statement was in regard to the transition from CRT, with full on/off contrast ratios or dynamic range in excess of 15,000:1, to LCDs with a small fraction of that. HDR with its further expanded color and contrast is great, but will not be as impressive to someone transitioning from another great technology.

When my TV broke last year, my GDM-F520 became my TV for a bit as I was figuring out what to do. Even though on paper it has a fraction of today's displays' color and contrast, it could still have a spectacular picture. One that already looked closer to HDR than, say, an LCD without FALD anyway...

I never noticed a change in contrast when moving from my last CRT (22" Iiyama VisionMaster Pro 510) to my first LCD (Dell 2405FPW) ~17 years ago, but then again, I never got to test them side by side. The CRT gave off the smoke of doom one day, and my local electronics repair shop couldn't get the part needed to fix it, so I scrapped it and replaced it a week or two later.

I do remember early panels in the 90s being really bad in this regard, but by the time my 2405FPW came around, the worst of the bad LCD days were over. I believe it shared the same panel as the contemporaneous Apple Cinema Display. At least that's what I remember from the reviews I read at the time.
 
Have you ever tested that contrast ratio? I'm not being snarky here; I'm curious. As I said, I didn't get to measure many CRTs - it wasn't until near the end of their run that I was able to get a calibrator - but when they were warmed up, I didn't measure their contrast as being that impressive.


I guess... if you only care about black point, and then not even about really using those near-black levels. With a 2.2 gamma curve and an 8-bit DAC, you don't get a ton of resolution in the low levels. So once you have enough contrast ratio that the difference between a code value of 0 and 1 is noticeable, any more CR than that just makes that step more noticeable. It doesn't give you additional detail.

I mean, if all you care about, HDR-wise, is a lower black point / on-paper contrast... ok, I guess. But what I find impactful about HDR is the brightness differences. It's a light shining brightly in the darkness, bright glints of reflection off a shiny surface, etc. I never saw that on CRT; it wasn't until OLED/FALD LCD that I got that sort of thing.


Though again, you get into the limits of human perception with things like that. You are right that light from a zone affects other zones and that you can't do hard transitions from bright to dark. However, it turns out that our eyes aren't really great at that either. That's why things like bias lighting and so on work in the real world. If I shine a light at you, it ruins your ability to see darker details around the area of that light. It's a trick I've been known to use to hide Halloween scare props. Likewise, light in the real world bleeds, so the bleed-over between zones often works for how things actually look.

Still, individual per-subpixel lighting like OLED has is the ultimate path we want to take forward. Just saying that because MiniLED can push brightness higher than OLED, sometimes the HDR impact actually ends up being more impressive, depending on the content. Vincent Teoh of HDTVTest has said the next gen TCL TVs with 5000 nit brightness are pretty amazing for HDR because they can hit such a high level.
The 15,000:1 was from an old magazine review where they compared a lot of LCDs against each other with that CRT as a reference. (I think some folks here have gotten even stronger results for CRT. Spacediver?)

Of course, CRT has weak ANSI contrast, due to internal reflections. However, that analog behavior is, I think, similar to how the eye reacts, and it never struck me like the on/off or dynamic range difference did. And of course, being emissive, when you went deeper into darkness, where internal reflections were less of an issue, you got that great detailing against those nice blacks.
 
That was in reply to the claim that FALD LCDs with current tech do 500,000:1 contrast, so that would be just fine. Maybe they do hundreds of thousands to one on a large black field/square or full-screen black measured against a large white field/square or full-screen white, which is great - but in mixed-contrast scenes and objects, and the areas around boundaries, they are back to 3000:1 to 5000:1 (thousands, not hundreds of thousands), because they have to either lift the whole area brighter using a tetris shape of brighter zones, or darken the whole area.

Either way, it's narrowing the contrast in those cutout shapes of zones, and the aura around them, by an extreme amount compared to the max FALD is capable of in larger, more solid/uniform fields of dark or white, and losing a little detail to boot. Not only is that contrast narrowed, it can also be sort of a puddle, blended like a gradient through more zones, like a lifted aura even if not outright blooming. So the dark areas around a light object will be lighter and have much less contrast compared to larger dark fields on the screen, which will be multitudes darker. Basically FALD is very non-uniform, so using some enormous black-field vs white-field contrast measurement to describe it isn't giving the real contrast, which is instead more like a relief map across a scene.

I'm not saying that makes FALD a bad choice. Both FALD and OLED have major cons. They both use some very clever tech implementations/hacks in an attempt to work around some of their big faults, but those tricks can only do so much. I'm just clarifying that the huge contrast numbers they market are not true across the whole screen in real content. That, and the fact that, while it is no fault of FALD tech itself, practically all FALD screens have matte, abraded coatings, which will raise the blacks, even the deep blacks, on any screen tech to more like grey-blacks whenever any ambient lighting hits the abrasions, "activating" the sheen/micro-frost texture.
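
To make the zone-contrast point a bit more concrete, here's a toy model of a FALD backlight. The numbers are purely illustrative assumptions of mine (a hypothetical panel with 5,000:1 native contrast, a 1,000-nit highlight, and made-up per-zone drive levels), not measurements of any particular screen:

```python
# Toy model of why FALD contrast is non-uniform ("a relief map"), not a single number.
# All numbers are hypothetical, chosen only to illustrate the mechanism.

NATIVE_CR = 5000      # assumed native LCD contrast ratio (what the panel can block)
HIGHLIGHT = 1000.0    # nits demanded by a bright object somewhere on screen

# Assumed backlight drive per zone: 100% under the highlight, partially lifted in
# the surrounding zones to blend the transition, fully off in distant dark zones.
zones = {
    "zone under the highlight": 1.00,
    "adjacent zone (blended)":  0.25,
    "two zones away (blended)": 0.05,
    "distant all-black zone":   0.00,
}

for name, drive in zones.items():
    backlight_nits = HIGHLIGHT * drive
    black_nits = backlight_nits / NATIVE_CR      # best black the LCD can hold there
    cr_vs_highlight = HIGHLIGHT / black_nits if black_nits else float("inf")
    print(f"{name:26s} black = {black_nits:.4f} nits "
          f"(highlight-to-black ~ {cr_vs_highlight:,.0f}:1)")

# Dark pixels near the highlight sit thousands of times brighter than dark pixels
# in a fully-off zone, so the marketed full-field contrast number doesn't describe
# mixed scenes -- those fall back toward the panel's native ratio.
```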



Thought it might be worth adding this reply of mine from the TCL 5000-zone, 5000-nit TV thread since it seems relevant.

-------------------------------------------------------------------------------------------------

From Vincent's HDTVTest impressions vid on the 5000-nit TCL:

"I was bored enough to measure the peak nits of several real world objects in my garden. It was an overcast day typical of manchester where I typically stay really - and even the reflections of a bicycle belt measure 3300 nits, and a flower, just a mere flower, measured over 500 nits. "


"While a 5000nit tv still won't come close to replicating what we see in real life when it comes to luminance, with what the sun at noon being rated at 1.6 billion nits - fortunately almost all HDR videos are graded at 4000 nits or below. Most consumer TVs with lower peak brightness will have to perform tone mapping when displaying 4000nit elements."

---------------------------------------------------------------------------------------------------


The Dolby Vision mastering vids I saw stated that for HDR 10,000-nit masters they do 50% of the scene at 0 to 150 nits, 25% of the scene at 150 to 1000 nits (mids), and 25% at 1000 to 10,000 nits (highlights/reflections/direct light sources).

Vincent's measurements are pretty interesting. It would be neat to see a more extensive list of measurements of everyday objects, but I haven't seen one. In graded/tone-mapped content the highlight end would obviously be cut down by a lot, but some brighter FALD screens can hold a 500-nit+ flower bed on screen, sustained. The 5000-nit TCL should be able to do the 3300-nit bicycle belt (on an overcast day) too. We're still pretty far from real highlight brightness in brighter situations though. With HDR4000 and HDR10,000, heat will be an issue. Some of the Samsung 2000-nit FALD screens, for example, suffer aggressive ABL. The UCG/UCX ProArt screens that do 1500 nits+ use boxy, vent-grille housings with active cooling fans on a cooling profile kind of like a GPU or laptop.
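
As a side note on how those nit bands sit in the HDR signal itself, here's a small sketch of the SMPTE ST 2084 (PQ) inverse EOTF used by HDR10/Dolby Vision. The constants are the standard ones from the spec; the nit values below are just the levels mentioned above, picked for illustration:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> normalized signal.
# Useful for seeing how much of the 10-bit code range the various nit bands occupy.

m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Map absolute luminance (0..10,000 nits) to a 0..1 PQ signal value."""
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

for nits in (0.05, 150, 1000, 4000, 10000):
    v = pq_encode(nits)
    print(f"{nits:>7} nits -> PQ signal {v:.3f}  (10-bit code ~ {round(v * 1023)})")

# Roughly: 0-150 nits takes up a bit over half of the code range, 150-1000 nits
# about another fifth, and 1000 up to 10,000 nits the remaining quarter or so --
# which lines up with splitting a master into those three bands.
```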
 
I never noticed a change in contrast when moving from my last CRT (22" Iiyama VisionMaster Pro 510) to my first LCD (Dell 2405FPW) ~17 years ago, but then again, I never got to test them side by side. The CRT gave off the smoke of doom one day, and my local electronics repair shop couldn't get the part needed to fix it, so I scrapped it and replaced it a week or two later.

I do remember early panels in the 90s being really bad in this regard, but by the time my 2405FPW came around, the worst of the bad LCD days were over. I believe it shared the same panel as the contemporaneous Apple Cinema Display. At least that's what I remember from the reviews I read at the time.
One of my cars gave off the smoke of doom. It wasn't a good day.
 
The 15,000:1 was from an old magazine review where they compared a lot of LCDs against each other with that CRT as a reference. (I think some folks here have gotten even stronger results for CRT. Spacediver?)

Of course, CRT has weak ANSI contrast, due to internal reflections. However, that analog behavior is, I think, similar to how the eye reacts, and it never struck me like the on/off or dynamic range difference did. And of course, being emissive, when you went deeper into darkness, where internal reflections were less of an issue, you got that great detailing against those nice blacks.
Understood, but my measurements were also on/off. It was the CR reported by the color calibration, which does a full dark screen, not an ANSI pattern. Now maybe the LaCie/NEC screen just had particularly aggressive anti-burn-in tech that others don't, but that was what I found, and, like Zarathustra, I didn't notice a big contrast difference going from my final CRT to my first LCD (though it was a VA LCD). I noticed other differences, some better, some worse - motion clarity, dark smearing, color, brightness, sharpness, resolution, etc. - but contrast just wasn't that different.

The Dolby Vision mastering vids I saw stated that for HDR 10,000-nit masters they do 50% of the scene at 0 to 150 nits, 25% of the scene at 150 to 1000 nits (mids), and 25% at 1000 to 10,000 nits (highlights/reflections/direct light sources).
Ideally that's how it'll work, and the brightest highlights will be reserved only for peak moments, kinda like theater sound, which can hit 105 dB while dialogue usually sits around 65-75 dB. We shouldn't need displays that can sustain those brightnesses full-field.
 
From Vincent's HDTVTest impressions vid on the 5000-nit TCL:

"I was bored enough to measure the peak nits of several real world objects in my garden. It was an overcast day typical of manchester where I typically stay really - and even the reflections of a bicycle belt measure 3300 nits, and a flower, just a mere flower, measured over 500 nits. "


"While a 5000nit tv still won't come close to replicating what we see in real life when it comes to luminance, with what the sun at noon being rated at 1.6 billion nits - fortunately almost all HDR videos are graded at 4000 nits or below. Most consumer TVs with lower peak brightness will have to perform tone mapping when displaying 4000nit elements."

I don't think this is terribly relevant, because most screens are watched indoors in the dark, so our eyes have adjusted already, and you get the same effect, with much less brightness.

My 42" LG C3 with its (according to RTINGS) ~815-nit peak in a 10% window is already enough to cause intense discomfort for me during nighttime use. Heck, even during the day, with indoor lights on, in a room without direct sunlight, it can be rather uncomfortably bright.

I don't know about you, but I wear dark sunglasses almost 100% of the time I am outdoors, because that shit is uncomfortable even on a grey overcast day.

It probably doesn't help that I am sitting about 2ft from a TV intended to be used at room distances, but still.

I'd argue that the absolute brightness of the real world is a rather silly thing to try to replicate in a screen. You'll probably never get there, and even if you do, it will be of limited value, and will likely cause a lot of discomfort in viewers.
 
I don't think this is terribly relevant, because most screens are watched indoors in the dark, so our eyes have adjusted already, and you get the same effect, with much less brightness.

My 42" LG C3 with its (according to RTINGS) ~815-nit peak in a 10% window is already enough to cause intense discomfort for me during nighttime use. Heck, even during the day, with indoor lights on, in a room without direct sunlight, it can be rather uncomfortably bright.

I don't know about you, but I wear dark sunglasses almost 100% of the time I am outdoors, because that shit is uncomfortable even on a grey overcast day.

It probably doesn't help that I am sitting about 2ft from a TV intended to be used at room distances, but still.

I'd argue that the absolute brightness of the real world is a rather silly thing to try to replicate in a screen. You'll probably never get there, and even if you do, it will be of limited value, and will likely cause a lot of discomfort in viewers.

That test he did was IMO one of the dumbest arguments I've ever seen. "This flower is 500 nits! Real world is so bright!" Ummm, bring that flower inside and it ain't gonna be anywhere near 500 nits anymore, lmfao. Not to mention our eyes adjust: something that is 1000 nits indoors would look insanely bright, but bring it outdoors and now it looks dim as hell. The brightness of the object didn't change, but our eyes did. Brightness measurements of objects outdoors can't be used to make the argument, because we would be viewing them indoors on our displays. Anybody who tries to use the "don't you ever go outside?" argument probably doesn't even understand the concept that our eyes behave differently outdoors vs indoors, lol.
 
That test he did was IMO one of the dumbest arguments I've ever seen. "This flower is 500 nits! Real world is so bright!" Ummm, bring that flower inside and it ain't gonna be anywhere near 500 nits anymore, lmfao. Not to mention our eyes adjust: something that is 1000 nits indoors would look insanely bright, but bring it outdoors and now it looks dim as hell. The brightness of the object didn't change, but our eyes did. Brightness measurements of objects outdoors can't be used to make the argument, because we would be viewing them indoors on our displays. Anybody who tries to use the "don't you ever go outside?" argument probably doesn't even understand the concept that our eyes behave differently outdoors vs indoors, lol.
Wasn't that more an answer to the "this 400 nits monitor would permanently damage my eyes" crowd though?
 
Wasn't that more an answer to the "this 400 nits monitor would permanently damage my eyes" crowd though?

No, I believe Vincent conducted that "experiment" in response to people commenting that displays are bright enough, and he was trying to make the argument that they aren't. But his testing methodology is completely flawed, because he is ignoring the fact that when you are outside your pupils constrict, while indoors they are dilated, so the perceived brightness of objects viewed outdoors in real life vs indoors on a TV screen can't even be compared apples to apples in the first place. A 3300-nit bicycle reflection is going to look WAY different when your pupils are constricted outside vs dilated when you are indoors watching TV. It's just a really useless experiment that tells us nothing, IMO.
 
No, I believe Vincent conducted that "experiment" in response to people commenting that displays are bright enough, and he was trying to make the argument that they aren't. But his testing methodology is completely flawed, because he is ignoring the fact that when you are outside your pupils constrict, while indoors they are dilated, so the perceived brightness of objects viewed outdoors in real life vs indoors on a TV screen can't even be compared apples to apples in the first place. A 3300-nit bicycle reflection is going to look WAY different when your pupils are constricted outside vs dilated when you are indoors watching TV. It's just a really useless experiment that tells us nothing, IMO.
The video is literally titled "Think Your OLED/ QLED TV is Too Bright? Here's Some Measured Brightness of Real-Life Objects" :)

We keep hearing people claiming that their OLED or QLED TV is already too bright, and that they'll need to wear sunglasses to watch HDR content. So we took some brightness measurements of random objects in real life, on a semi-sunny day in Manchester where it's not so bright that we felt compelled to wear sunglasses.


View: https://www.youtube.com/watch?v=PRSm2v7bfVo

Or are you referring to another video?
 
For those wondering why the levels were chosen, there's an AVS Forum thread about Dolby's research, or you can watch the talk it's about on YouTube. Long and short of it is that Dolby built a custom rig that pointed a cinema projector at an LCD screen to give test subjects extremely high dynamic range and stupidly high brightness capabilities, and then did testing on preferences and perception. People tended to prefer high brightness and also a low black point.
 
The video is literally titled "Think Your OLED/ QLED TV is Too Bright? Here's Some Measured Brightness of Real-Life Objects" :)

We keep hearing people claiming that their OLED or QLED TV is already too bright, and that they'll need to wear sunglasses to watch HDR content. So we took some brightness measurements of random objects in real life, on a semi-sunny day in Manchester where it's not so bright that we felt compelled to wear sunglasses.


View: https://www.youtube.com/watch?v=PRSm2v7bfVo

Or are you referring to another video?


Yeah, ok, fine, it's a response to "eye damage," but my point is that the experiment is USELESS. I'm not really nitpicking over the title of the video or over whether people claim the displays are bright enough or TOO bright. The point I'm trying to make is that the experiment is worthless.
 
Understood, but my measurements were also on/off. It was the CR reported by the color calibration, which does a full dark screen, not an ANSI pattern. Now maybe the LaCie/NEC screen just had particularly aggressive anti-burn-in tech that others don't, but that was what I found, and, like Zarathustra, I didn't notice a big contrast difference going from my final CRT to my first LCD (though it was a VA LCD). I noticed other differences, some better, some worse - motion clarity, dark smearing, color, brightness, sharpness, resolution, etc. - but contrast just wasn't that different.


Ideally that's how it'll work, and the brightest highlights will be reserved only for peak moments, kinda like theater sound, which can hit 105 dB while dialogue usually sits around 65-75 dB. We shouldn't need displays that can sustain those brightnesses full-field.
The 15,000:1 was in a dark room, in conjunction with the team at DisplayMate. However, the display has to be calibrated for that. I've seen CRTs whose black level is set high and I'm like "why?" It's a CRT. Go for those blacks and contrast. :)

Admittedly, none of this works if the room isn't dim - if it isn't one's man cave or such. In offices with overhead lighting, I was more than happy to see LCD, especially if it meant dual screens and such.

And another shout out to the CX. :)
 
Re: Vincent's nit measurements outdoors on a grey, dreary day --

I agree that our eyes view everything relatively. Which is why a "bright" light, phone, tablet, etc. in sunlight vs at night is a huge difference, or a dim phone/tablet setting in daylight vs at night in the dark. It's also why modern FALD *SDR* brightness levels in particular are touted for bright-room viewing - because you need higher brightness in a bright room just to get back to the way the "normal" lower-nit SDR levels would look in a dim-to-dark reference environment.

Yes, it's all relative, but it would still be interesting to see some of the levels. Just curious about more values read on a machine sensor at different levels of daylight for some comparisons.

Professionals mastering content, if given a large brightness range, could probably map some real-world values to reference-environment equivalents. Even just perceptually, to where it "looks natural" like it did when measured at 500 nits or 3300 nits. That's why a reference environment is a thing. It gives a normalized base state to work from (and also squeezes the most impactful range out of the display). Calibration is also typically done in the dark, so when screens are moved to environments where the ambient lighting changes (day to night, varying cloud cover, weather, seasons, room lighting changes), your eyes are going to see values differently than what you supposedly locked your calibration to (and differently than what your content was mastered to).

So I tend to agree with you. No, Vincent's overcast England sky was not a reference environment, and they could probably master content to look like that 500-nit measured flower with a lower-nit flower in a very dim/dark reference environment. A flower recorded in HDR in a very bright, sunny environment would probably end up a lot higher than wherever that 500-nit one might sink to in order to match a reference environment, though. Might be obvious, but there can probably still be a 500-nit reference-environment flower and a 3300-nit (or brighter) reference-environment bike chain - they just wouldn't represent Vincent's overcast-England-sky versions, but rather ones from a brighter environment (very bright sunny day, different climate, etc.).

The goal is a more realistic look, so if a "500-nit flower on an overcast England day" is something like 300 nits+ in a reference environment, and a 3300-nit dreary-day bike chain looks something like 2000 nits in a reference environment, that's fine as long as it looks the same, more or less. The impact would be the same to your eyes in each environment after they were mastered to look the same brightness to your eyes/brain. Brighter base environments, and brighter things in scenes, might need lower-than-real-world nits when viewed in a reference environment, but they'd still be quite high to look more realistic - even the mids, but especially reflections, highlights, and light sources (sun, bright moon, headlights, etc.). So you might get an 800-nit flower and a 5000-nit bike chain on a bright summer day in the USA. Just throwing some numbers around, theoretically. On the mids especially, sustained nits would come into play, which is something OLED so far has not been good at (sunlit scenes, but even bright white winter/ice scenes). Still love my 48CX per-pixel emissive with black depths to oblivion though. :cool:
 
Vincent's goal was pretty much to educate people that real-life dynamic range vastly exceeds that of a typical display.

Once you get past the bait headline, you can simply treat it as an educational video about displays vs real life.

Also, some people live in a bright condo (Toronto-style) where you have full height floor-to-ceiling windows. And others in a cave in a basement.

Everyone will benefit differently from the dynamic range of their TV.

And no, 1000-nit fullscreen is never too bright for the glass condo/apartment situation during daytime (I rented a glass condo in Toronto for a bit a number of years ago). It feels less bright than 200 nits fullscreen in the cave, given how the eye's iris responds to the ambient environment. This is also why many TVs have ambient light sensor features to compensate their baseline -- for the day-to-night fluctuations.
 
Yeah, the title of the video was silly. Unless it was aimed at something like: https://www.rtings.com/tv/reviews/best/by-usage/outdoor ?

The garden tour and measurements were interesting though. I do notice that when I walk outside with my phone, the screen may as well be off.

(Or yeah, I guess it can be indoors too. My projector in the other, more-lit room is certainly a different experience night versus day.)
 
Understood, but my measurements were also on/off. It was the CR reported by the color calibration, which does a full dark screen, not an ANSI pattern. Now maybe the LaCie/NEC screen just had particularly aggressive anti-burn-in tech that others don't, but that was what I found, and, like Zarathustra, I didn't notice a big contrast difference going from my final CRT to my first LCD (though it was a VA LCD). I noticed other differences, some better, some worse - motion clarity, dark smearing, color, brightness, sharpness, resolution, etc. - but contrast just wasn't that different.


Ideally that's how it'll work, and the brightest highlights will be reserved only for peak moments, kinda like theater sound, which can hit 105 dB while dialogue usually sits around 65-75 dB. We shouldn't need displays that can sustain those brightnesses full-field.
My first LCD monitor had a contrast ratio of about 400:1 compared to the CRT I came from that was 600:1.
 
Good aperture grille CRTs that were properly optimized got over 10k:1 (not ANSI), but bog-standard cheap office monitors could be sub-1k for sure. I had a good CRT when I got my first gaming LCD and kept them in a two-monitor side-by-side setup for many years; the CRT was significantly better in high-contrast scenes like a starry night sky, but in mixed brightness with lots of bright objects and no true blacks the difference was smaller.
 
Good aperture grille CRTs that were properly optimized got over 10k:1 (not ANSI), but bog-standard cheap office monitors could be sub-1k for sure. I had a good CRT when I got my first gaming LCD and kept them in a two-monitor side-by-side setup for many years; the CRT was significantly better in high-contrast scenes like a starry night sky, but in mixed brightness with lots of bright objects and no true blacks the difference was smaller.
Can confirm. I miss my old Sony GDM's.
 
Interesting. My first LCD, the 24" Dell 2405FPW, apparently had a 1000:1 contrast ratio.
Some of the old school TN panels had pretty shit contrast ratios. So if you got in on the LCD game early, you could have a pretty low contrast monitor potentially.
 
The 2405 was from 2005; I still have one lying around in the storeroom, as it refused to die. The calibrated contrast on that was about 620:1 - the biggest upgrade when I eventually changed off of this monitor was the off-axis color and the reduction in smearing. I'm guessing it was a VA panel, or at least it measured more like later VAs did, other than the not-great contrast.
 
Interesting. My first LCD, the 24" Dell 2405FPW, apparently had a 1000:1 contrast ratio.
Mine was a Samsung 204T, which had an S-PVA panel. Samsung advertised a 700:1 contrast ratio, while actual testing showed it to be around 400:1. I never tested it myself, as I "calibrated" it by eye. There is an old thread here that complimented its black level, but from what I recall it was pretty bad compared to my GDM-5010PT.
 
Mine was a Samsung 204T, which had an S-PVA panel. Samsung advertised a 700:1 contrast ratio, while actual testing showed it to be around 400:1. I never tested it myself, as I "calibrated" it by eye. There is an old thread here that complimented its black level, but from what I recall it was pretty bad compared to my GDM-5010PT.
I used a 2000:1 contrast ratio TV panel of the same technology as a monitor for a bit. It was lovely for what it was. Not a single bad pixel or such. However, its version of black would light up my room. It's just a limitation of the technology - liquid crystal panels aren't strong enough to block the powerful backlight behind them.
 
I used a 2000:1 contrast ratio TV panel of the same technology as a monitor for a bit. It was lovely for what it was. Not a single bad pixel or such. However, its version of black would light up my room. It's just a limitation of the technology - liquid crystal panels aren't strong enough to block the powerful backlight behind them.

LCD TVs and monitors advertised their dynamic contrast ratio numbers. The static contrast was much lower, and most people disabled dynamic contrast, especially for monitor use, because it is super annoying having huge brightness fluctuations.
 
LCD TVs and monitors advertised their dynamic contrast ratio numbers. The static contrast was much lower, and most people disabled dynamic contrast, especially for monitor use, because it is super annoying having huge brightness fluctuations.
2000:1 was the measured contrast. TVs had much higher contrast than computer monitors at the time, so I went with that, even though it was 40 inches...

Clearly though, it made its contrast in the brighter range. It couldn't have been the blacks, because there weren't any. That's why I kept going back to CRT.

FALD was an exciting development, but it was quickly derailed in smaller sets for years by that edge-lighting business.
 
Heh, my first LCD was a 50ms-response, 15", 1024x768 panel from Envision. I somehow still gamed on it though :ROFLMAO:. I got it on a killer deal for about $350 when it MSRP'd for $600-650+.
 
Heh, my first LCD was a 50ms-response, 15", 1024x768 panel from Envision. I somehow still gamed on it though :ROFLMAO:. I got it on a killer deal for about $350 when it MSRP'd for $600-650+.

Was this you? :p

(attached image)
 
Short answer: everything else looks like trash.

I'm not going to get a headache from an ancient CRT, and I'm not dealing with smearing from VA panels; IPS contrast makes things gray or has terrible backlight halos. OLED is the only acceptable currently available display technology. Yes, having to plan around avoiding burn-in is really frustrating at times, but it's better than the other options.
 
I'm not going to get a headache from an ancient CRT

Not to mention hot red ears and face.

The "CRT tan" was real. ;p

You have to wonder what the long term effects of having our faces bombarded with stray electrons for hours all of those years may have been. :p
 
Not to mention hot red ears and face.

The "CRT tan" was real. ;p

You have to wonder what the long term effects of having our faces bombarded with stray electrons for hours all of those years may have been. :p
You mean I could blame some of my problems on all those years of CRT use?
 
You mean I could blame some of my problems on all those years of CRT use?

Wouldn't surprise me if excessive close-up exposure to CRTs has some sort of impact. This isn't like cellphone towers, which are just a little EM radiation. CRTs actually bombard the screen with electrons, and some get through; those are what resulted in the hot, red face and ears, or the so-called "CRT tan" we all used to have.

Who knows what countless hours of electron bombardment might do to more sensitive things, like eyeballs.
 
Wouldn't surprise me if excessive close-up exposure to CRTs has some sort of impact. This isn't like cellphone towers, which are just a little EM radiation. CRTs actually bombard the screen with electrons, and some get through; those are what resulted in the hot, red face and ears, or the so-called "CRT tan" we all used to have.

Who knows what countless hours of electron bombardment might do to more sensitive things, like eyeballs.
I have never heard of a "CRT tan." Don't even google the term unless you want to see some bullshit about race on reddit. I've used them for years and I'm still white as a ghost.
 
I have never heard of a "CRT tan." Don't even google the term unless you want to see some bullshit about race on reddit. I've used them for years and I'm still white as a ghost.

It's not an actual tan. It's just that flushed and hot face and ears you get after using a CRT monitor (or TV up close) for an extended period of time.

It goes away after a little while (half an hour to an hour?) without leaving a permanent tan.

It was just the nickname for the phenomenon, at least in my neck of the woods.
 
It's not an actual tan. It's just that flushed and hot face and ears you get after using a CRT monitor (or TV up close) for an extended period of time.

It goes away after a little while (half an hour to an hour?) without leaving a permanent tan.

It was just the nickname for the phenomenon, at least in my neck of the woods.
You can't be serious? What kind of terrible CRTs were you using? Perhaps with Sony GDM CRTs I've been in a bubble...
 
You can't be serious? What kind of terrible CRTs were you using? Perhaps with Sony GDM CRTs I've been in a bubble...

This happened with every single monitor and TV from when I started using them in the 1980s, at least up until the early 2000s when CRTs started their slow fade into obscurity.

After a couple of hours on the NES and/or PC, one's face and ears would be flushed and hot feeling. I never encountered a CRT TV or monitor this didn't happen with.

I vaguely remember hearing that by ~2007 all new CRT monitors had to meet the FDA's new limits on ionizing radiation, so this may have actually gone away after that, but in all honesty there weren't all that many CRTs made in 2007 or after. They pretty much peaked in volume in the 90s.
 
With the FW900 as my main display at home and a pair of IPS panels at work, I noticed the former's richer, if smaller, picture when I got home, but I didn't perceive any delta in physical effects...
 
With the FW900 as my main display at home and a pair of IPS panels at work, I noticed the former's richer, if smaller, picture when I got home, but I didn't perceive any delta in physical effects...
Same here. I never experienced any physical effects other than the usual "I'm tired and staying up way too late" when I didn't exercise self-control and instead pulled all-nighter gaming sessions. :)
 