OLED Computer Monitors?

Why don't we just both leave it now. I'm worn out. I have only taken it this far because you originally came into this thread with a (perceived by me at least) tone that essentially said: "If you still believe in this tech, you're a fucking moron". . . and that grates on people when that tone is accompanied by questionable evidence and/or logic.
Completely regardless of anyone's opinion, my disgust is not only accompanied by but a direct result of the last 14 years of established history. We still wait for a true replacement (in performance, capability, flexibility and durability) for our CRTs, and still are forced to choose between video ghosting, color shift, input lag, or a price tag out of reach of everyone except a niche market.
 
Long story short, LG realized like everyone else that RGB OLED isn't viable for large panels, so they had to either abandon it entirely or come up with a kludge. It is impossible even in theory for WRGB to provide the same gamut range and purity as RGB; the white subpixels serve as little more than pixel-sized backlights.

Who cares how a technology compares to a commercially unviable and unused technology? The question is how it compares to other technology currently existing, and in that, LG OLEDs are simply far superior to every other screen technology produced today. It's not even in question. They trounce the best of the best plasmas in just about every measure.

Also, I dunno why everyone keeps referring to OLED as out of reach, pricing-wise. You can buy the LG 55EC9300 for $3000 now, which is only a few hundred dollars more than the best plasmas were selling for a couple of years ago, and it's a much better TV than those were. It's already competitive for a high-end TV.
 
Who cares how a technology compares to a commercially unviable and unused technology? The question is how it compares to other technology currently existing, and in that, LG OLEDs are simply far superior to every other screen technology produced today. It's not even in question. They trounce the best of the best plasmas in just about every measure.
For TVs I have no doubt that's true. It means diddly as long as it remains a TV technology. Unless (as someone suggested) we start strapping 55" screens onto our 30" desks.

What I don't get is, if it's now possible for LG to produce a $3K 55" OLED TV, why is it impossible for them to produce a $1500 27" computer monitor? If the answer is that performance limitations of WRGB OLED prevent it, my claim that it's inherently inferior to RGB OLED is validated. And if the answer is that it is possible, why haven't they done it, or even announced an eventual plan to do it? So far, the only reason I've seen offered in discussions is that LG believe the market for it isn't big enough, a truly ridiculous claim considering the entire world has been patiently waiting for an affordable replacement for CRT capabilities for over a decade. And the claim implies that people today are somehow better able to afford a $3K TV than a $1500 monitor.
 
What I don't get is, if it's now possible for LG to produce a $3K 55" OLED TV, why is it impossible for them to produce a $1500 27" computer monitor?
Many people have explained why this isn't the case. They haven't "fabbed up" to the point where making smaller screens is economically viable. They only have so many panels being cut right now. They need to use them in the most advantageous way until they get more manufacturing capability online (as they are in fact doing).

If the answer is that performance limitations of WRGB OLED prevent it, my claim that it's inherently inferior to RGB OLED is validated.
That does not logically follow. For that to be valid, RGB OLED would also need to be proven viable as a monitor. If WRGB OLED is demonstrated to be ill-suited to monitors, it would still remain to be seen whether RGB OLED is viable.

An analogy. If you claim that ketchup can be used as automobile fuel, and I claim that mustard can be used as automobile fuel. . . if we only try mustard and the car conks out, it does not follow that ketchup is then demonstrated to be "superior" to mustard as an automobile fuel. You also have to try ketchup.

And if the answer is that it is possible, why haven't they done it, or even announced an eventual plan to do it? So far, the only reason I've seen offered in discussions is that LG believe the market for it isn't big enough. . .
That's not the reason, though it is a factor, in combination with the current manufacturing base and the economics involved.

. . . a truly ridiculous claim considering the entire world has been patiently waiting for an affordable replacement for CRT capabilities for over a decade.
This is overstated at best. The vast majority of the "entire world" barely cares about their televisions and gives no thought to the underlying technology. They establish a budget, walk into the store, and are usually (mis)guided into what to buy by a salesperson. But it's simply not true that anywhere near a sizable portion of the customer base is as picky as you or I. This is even more true with computer monitors.

If you would, please refrain from describing perfectly reasonable and rational statements as "ridiculous". . . it's that sort of tone that makes it nearly impossible to argue with you in a civil manner. It's fine to be wrong. It's something else to be wrong while arrogantly and condescendingly dismissing those who are in fact presenting better information while trying to repeatedly address your questions and assertions.

--H

Edit/Addendum: They just got WRGB OLED TVs out the door. It's simply not fair or reasonable to judge its feasibility as a monitor either way based solely on its not having been made into a monitor yet. It has only just been released to the public. How can anyone expect them to release 55", 20", 34", 70", and 10" televisions, monitors, and smartphone screens all at once? You just can't.

Pointing to a lack of monitors only one year after WRGB OLED arrived as evidence that it won't eventually work in monitors is just not logical. As described prior, it's circular reasoning. The premise serves as the conclusion. And that ain't cricket.

They may never arrive. But the fact that they don't currently exist does not indicate either way whether they ever might exist.
 
Pointing to a lack of monitors only one year after WRGB OLED arrived as evidence that it won't eventually work in monitors is just not logical. As described prior, it's circular reasoning. The premise serves as the conclusion. And that ain't cricket.

They may never arrive. But the fact that they don't currently exist does not indicate either way whether they ever might exist.
My only point in all of this is that you can add your mights to the 14 years of mays and shoulds. Still zero options for an affordable OLED monitor of any flavor and still no plans from anyone to ever produce one.
 
My only point in all of this is that you can add your mights to the 14 years of mays and shoulds. Still zero options for an affordable OLED monitor of any flavor and still no plans from anyone to ever produce one.
Forgive me, but if after that long rebuttal that is your "only point" left standing, then you're not left with much. I'm not even sure it's a point worth making. As already said. . . the lack of one today says nothing about their feasibility in the future.

--H
 
LG is focusing on making their TV business viable. Once they have a ton of revenue coming from that and OLED is a proven commercial tech for larger screens, they'll probably get around to PC monitors. But it doesn't really make sense for it to be a priority.
 
Why did CRTs stop production? It was not some conspiracy to deny people high image quality. It was not some washy 'people accept bad pictures' argument either.

CRTs were simply really goddamn expensive to manufacture! Massive vacuum tubes, chemical coating processes, various metal screens/grills, electron guns, high-power high-precision analog electronics, multiple glass-welding stages, etc. So many ways to go wrong, so many production steps. Scales very poorly, with both display size and manufacturing capacity.

The much vaunted GDM-FW900 retailed for $2300 (or around $3000 today after inflation). Bigger tubes suffer a drop in picture quality (mainly resolution) because of fundamental physics issues (field collapse rate for large slew magnet fields), the extreme expense of the analog driving electronics for very large tubes at the required frequencies, and the beam scanning time demanding different phosphors with different emission characteristics.
Your bog-standard super-cheapie 15" CRT, which would look pretty awful even when calibrated by an expert (with whatever limited controls are available, even with the back open), would when bought new still not be as cheap as a modern 24" IPS, and would arguably look considerably worse in almost every situation.
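As a rough sanity check on the inflation claim above (the CPI multiplier here is an assumed approximation for 2000 to the mid-2010s, not an authoritative series):

```python
# Rough check: $2300 around the year 2000 expressed in mid-2010s dollars.
# The multiplier below is an assumed approximate US CPI ratio, not official data.
CPI_RATIO_2000_TO_2014 = 1.37

price_2000 = 2300
price_today = price_2000 * CPI_RATIO_2000_TO_2014

print(round(price_today))  # roughly in line with "around $3000"
```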

Small OLED panels fabbed on glass or plastic can achieve acceptable yields and viable prices (competitive with LCD) because you can nicely bin them around defects. Big panels are viable because OLED and LCD both scale in roughly the same way for very large panels, so the defect rates converge for high-end panels (i.e. it costs comparably to manufacture a no-defect 50" OLED and 50" LCD).
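The yield argument above can be sketched with a simple Poisson defect model (the defect density is an illustrative assumption; real fab figures are proprietary): the chance of a defect-free panel falls off exponentially with area, which is why small panels are cheap, huge premium panels can absorb the loss, and mid-size monitor panels get squeezed in between.

```python
import math

def poisson_yield(defects_per_m2: float, area_m2: float) -> float:
    """Fraction of panels with zero defects under a Poisson defect model."""
    return math.exp(-defects_per_m2 * area_m2)

# Illustrative (assumed) defect density of 2 defects per square metre.
D = 2.0

panels = {
    '5" phone': 0.005,    # approximate active area in m^2
    '27" monitor': 0.20,
    '55" TV': 0.83,
}

for name, area in panels.items():
    print(f"{name}: {poisson_yield(D, area):.1%} defect-free")
```

Under these assumed numbers nearly every phone panel is clean (and defective ones can still be binned around), while only a minority of TV-size panels are, which the high-end TV price absorbs; the 27" monitor sits in the middle with neither advantage until defect density drops.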

The middle ground of high-quality, high-resolution, low-defect panels is ruled out for OLEDs until production ramps up and defect rates decrease. This is inevitable as more lines come online, and as those lines run longer and iron the kinks out of larger-scale processes. In theory, there's no reason an OLED producer could not design and produce a small run of desktop-size OLED panels on the same lines as mobile device panels or TV panels. But the yield would be low (because the first run of ANYTHING is low), making them incredibly expensive. And guess what, this happens! The tiny quantity of viable panels ends up in broadcast and mastering monitors (like the one at the start of the thread). That market is small and well supplied, so there's no incentive to ramp up the lines needed to produce the volumes needed to get a handle on the process and raise yields, and therefore no drop in production cost.

To make desktop panels viable, either the process knowledge gained from current runs needs to be ported to desktop panels (current tablet OLEDs get bigger and get rehoused for desktop use, or 4K TV panels shrink, or are cut down, and get more suitable display controllers), or a change in demand prompts a manufacturer to produce OLEDs that can do what LCDs cannot. The most likely candidate there is the recent surge in high-framerate monitors. That niche is currently served by TN and a handful of VA and IPS panels (mainly golden samples and castoffs from the specific-application market). OLEDs here can achieve much faster pixel switching times and much higher refresh rates, without the image-quality drawbacks of TN. Current developments in VR are driving the controller and interface development needed for this, and raising awareness of the benefits of very high refresh rates and very low pixel persistence times (fittingly, likely inspired by Charles Poynton's comparisons of CRT and LCD psychovisual effects during the VOR for panning scenes).
TN is topping out at 144Hz, so if OLED can beat this that opens the market for desktop panels.

Remember when shite TN panels cost an arm and a leg compared to CRTs? Shite TN panels are now dirt cheap because years of making lots of them gives a lot of experience in how the process works and how to manage it (lowering the defect rate), and a growing market allows for larger production runs (which cut the raw production equipment cost as a percentage per panel, because mass production scales well).



tl;dr: CRTs stopped production because they ceased to be economically viable. OLEDs are being produced for small panels and large panels because they are economically viable. OLEDs are not being produced for desktop monitors because it is not yet economically viable. The market for desktop OLED panels may be emerging in the guise of ultra-high refresh rate and low pixel-persistence gaming displays.
 
At this point I'm fairly sure we'll get a 4k OLED Oculus Rift (which will provide infinite monitors through Virtual Desktop) a lot sooner than we'll get actual OLED desktop monitors...
 
Possibly, but with the FoV that the Rift covers (90° horizontal for the current DK2) to produce a 'virtual' display comparable to a desktop monitor at comfortable size and distance (around 40°) a 4K panel would produce a 'virtual' monitor 853 pixels across. To replicate a 1920x1200 monitor, an 8k panel (~4000x4000 per eye) would be needed.
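The arithmetic above can be reproduced directly, assuming (as the post implicitly does) one panel split evenly between two eyes and pixels spread uniformly across the field of view:

```python
# Pixels spanned by a 40-degree 'virtual monitor' inside a 90-degree-per-eye HMD,
# assuming one panel split between two eyes and uniform pixel spread over the FoV.
def virtual_pixels(panel_width_px: int, hmd_fov_deg: float, window_deg: float) -> float:
    per_eye = panel_width_px / 2          # half the panel serves each eye
    return per_eye * window_deg / hmd_fov_deg

print(virtual_pixels(3840, 90, 40))       # a '4K' panel gives ~853 px across

# Per-eye width needed to put 1920 px across that same 40-degree window:
print(1920 * 90 / 40)                     # 4320 px per eye, i.e. roughly '8K' total
```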
 
Why did CRTs stop production? It was not some conspiracy to deny people high image quality.

The much vaunted GDM-FW900 retailed for $2300 (or around $3000 today after inflation).

<Snip - lots of good CRT info>

tl;dr: CRTs stopped production because they ceased to be economically viable. OLEDs are being produced for small panels and large panels because they are economically viable. OLEDs are not being produced for desktop monitors because it is not yet economically viable. The market for desktop OLED panels may be emerging in the guise of ultra-high refresh rate and low pixel-persistence gaming displays.
I knew sooner or later someone would come along with the formal industry line. And most of it is absolute horseshit. E.g. "The much vaunted GDM-FW900 retailed for $2300". Yep, and let's simply ignore the dozens if not hundreds of other much vaunted CRTs that did not retail for $2300. Like my 2040u which I got brand new in 2000 for half that price.

Etc. I'll concede manufacturing costs were much higher with CRTs; you get what you pay for. Be honest, Ed. How many panels have you been through in the last 15 years? Add up the total cost and let us know.
 
Your ability to take vast swaths of densely packed information, pick one --usually tangential-- line out of it to superficially address, and then just blithely declare the rest to be "horseshit" will always be near and dear to my heart.

Bless you. Please keep fighting your long twilight struggle against the LCD threat.
 
Your ability to take vast swaths of densely packed information, pick one --usually tangential-- line out of it to superficially address, and then just blithely declare the rest to be "horseshit" will always be near and dear to my heart.

Bless you. Please keep fighting your long twilight struggle against the LCD threat.
I'll extend the same question to you as EdZ. How much have you spent on flatpanels in the last 15 years?
 
IIRC, after moving on from a 22" CRT (can't even remember the model now) around 2006 I've had 3 panels: an HP L2335 (but it needed to be manually set to Reduced Blanking timings to hit 1920x1200 due to an incorrectly set EDID), a Dell 2408WFP, and a Dell U2412m. The HP was around £200 (2nd hand), the 2408WFP £250 (new), and the U2412m £200 (new). So £650 in total, near enough $1000 (bear in mind we're hit for more VAT over here). Both the HP and the first Dell were in sufficient working order to be sold on, so knock around £200 off that figure for net spend.
 
I'll extend the same question to you as EdZ. How much have you spent on flatpanels in the last 15 years?
$225 on the LCD I'm currently using (sig). Or have you not been paying attention? Like you, I held on to CRTs for quite a long time.

I bought my wife a 22" panel about eight years ago. It's still going strong. I think it cost $200-300.

What's your point? That CRTs (which I prefer, remember) are oh-so-durable? Well, I had a Nokia 17" monitor that I paid over $1400 for die on me after only 3-4 years. I had a Mitsubishi 2020u die on me after 2-3 years. And all three of my (albeit used) FW900s died despite gentle usage on my own part. Despite your personal experience, CRTs failed all the time.

Has it occurred to you that LCD turnover has been high over the last several years because people have been voluntarily upgrading to something bigger or in a different format (from 4:3 to 16:10, etc.)? Enthusiasts around here are now often running 30+ inches on their desks. Sometimes they're running three monitors. CRTs aren't exactly viable in that space (I tried running two FW900s simultaneously once). They were at their theoretical max size for desktop usage right around 24". . . with only SED or FED a possibility for anything larger with the same performance.

Hey! While we're at it, can I throw in all the monitors I oversee at work? Those things have been dirt cheap (yet our users have no complaints) for years and we throw them on the desk and forget about them for 5-10 years. They don't need adjustment like CRTs did (I remember managing all of those). . . and they hardly ever fail.

Have I mentioned that you're adorable?

--H
 
If it's reliability figures you're after, then I can give a good estimate: I handle display faults for a large office (about 7000 occupancy, with anything from 2 to 6 panels per person, most having 3 or 4). My best estimate is around 26,000 displays in the building (mix of Dell U2312m, P2314h and U2414h). Over the last year, slightly more than 50 have failed (mix of dead panel, FRC, backlight, inputs, etc.), not counting physically damaged panels. ~15 were out of warranty, so had been live for at least 3 years. That gives a yearly failure rate of between 0.13% and 0.19%, depending on whether you count OOW failures.
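Re-running the arithmetic with the numbers from the post above (50 failures, 15 of them out of warranty, across roughly 26,000 displays):

```python
# Yearly failure rate from the fleet figures quoted above.
displays = 26_000
failed_total = 50     # all failures in the last year
failed_oow = 15       # out-of-warranty failures (live for 3+ years)

in_warranty_rate = (failed_total - failed_oow) / displays
all_rate = failed_total / displays

print(f"{in_warranty_rate:.2%} to {all_rate:.2%} per year")
```

With these inputs the rate works out to roughly 0.13% to 0.19% per year.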
 
<Snip - lots of good CRT info>

tl;dr: CRTs stopped production because they ceased to be economically viable. OLEDs are being produced for small panels and large panels because they are economically viable. OLEDs are not being produced for desktop monitors because it not yet is economically viable. The market for desktop OLED panels may be emerging in the guise of ultra-high refresh rate and low pixel-persistence gaming displays.

This is really what it comes down to in my opinion. I was reading this thread on an LG TV (55" OLED) yesterday, and have also perused it on my Nexus 6 (5.96" AMOLED WQHD Display). I own both of these devices because it is economically viable for a company to produce them with the intent of carving out a profitable segment in that market.

LG has stated that the new M2 production factory should produce between 70-80% yields, which is a vast improvement over previous initiatives. If they manage to take more market share with the new TVs, economies of scale should kick in and further boost OLED research and development and market expansion.

With this in mind, I do see more affordable OLED monitors as a very real possibility in the near future. I think OLED is currently the most viable technology to bring a true improvement to desktop monitors, and more companies will make that leap when they deem it can be profitable. As a person who has owned two FW900's over the years, I can't wait for that day to come.
 
IIRC, after moving on from a 22" CRT (can't even remember the model now) around 2006 I've had 3 panels: a 24" VA LG whose model I can't remember (but needed to be manually set to Reduced Blanking timings to hit 1920x1200 due to incorrectly set EDID).

LG 246W http://hardforum.com/showthread.php?t=1167222 ? Still have mine and loving it. Also, the only screen I have purchased since my 19" AOC CRT. I am planning on buying a 144Hz screen soon to run beside my LG.
 
Why did CRTs stop production? It was not some conspiracy to deny people high image quality. It was not some washy 'people accept bad pictures' argument either.

<Snip - lots of good CRT info>

tl;dr: CRTs stopped production because they ceased to be economically viable. OLEDs are being produced for small panels and large panels because they are economically viable. OLEDs are not being produced for desktop monitors because it is not yet economically viable. The market for desktop OLED panels may be emerging in the guise of ultra-high refresh rate and low pixel-persistence gaming displays.

Well, I actually believe there was a conspiracy, if a loose and not illegal one, to manipulate customers into buying an inferior product with higher profit margins. All that hype about thinness while the picture quality was garbage on those early models.

I do agree that CRTs were probably ultimately doomed due to scaling issues and it probably not making sense to maintain legions of CRTs, invariably running at 60 Hz, to run MS Office.

Oh well...what happened happened...if anybody here gets sucked into an alternate universe and makes it back, please tell us of the successors to the GDM-F520 and FW900 and the sweetness that could have been...
 
Does anyone have or have used one of Sony's professional line of OLED displays? I think they had a 24".

I think they have a long way to go before they are gaming ready (motion/refresh).
 
Well, I actually believe there was a conspiracy, if a loose and not illegal one, to manipulate customers into buying an inferior product with higher profit margins. All that hype about thinness while the picture quality was garbage on those early models.

I do agree that CRTs were probably ultimately doomed due to scaling issues and it probably not making sense to maintain legions of CRTs, invariably running at 60 Hz, to run MS Office.

Oh well...what happened happened...if anybody here gets sucked into an alternate universe and makes it back, please tell us of the successors to the GDM-F520 and FW900 and the sweetness that could have been...
I don't understand why it has to be one or the other. What angel of God descended 10 years ago and decreed that nobody on planet Earth either needed or wanted CRT technology? Why was it necessary to practically abandon the technology when, at the time, they literally couldn't even make them fast enough (it took us six months to find our 2040u), and when it still had major advantages over what replaced it? A decade later they STILL HAVE these same advantages.

I don't care whether any conspiracy existed to deprive the masses of affordable and truly professional image and video quality, it wasn't my claim. I simply said it has been the net effect of the demise of CRTs, a claim that still remains unanswered here or anywhere else I've ever raised it. If Sony can still produce $23K Trinitrons for Hollywood, I don't think it's unreasonable to claim they're still able to produce $2300 Trinitrons for eight billion other people. I understand the technology doesn't scale well and is expensive, but a well built CRT can and does last several times as long as flatpanel backlights and OLED panels (btw have they solved the problem with fading blue pixels?), and at least from sales figures most people find anything over 24-27" to be pointless on a desktop anyway.
 
Maybe OLED has the potential to become too good. It could improve, but not by much, become cheap in about 5-10 years, then go 8K, then flexible, then what? OLED clothes? OLED carpets?
Maybe LG are the idiots from a business perspective.
 
I don't understand why it has to be one or the other. What angel of God descended 10 years ago and decreed that nobody on planet Earth either needed or wanted CRT technology? Why was it necessary to practically abandon the technology when, at the time, they literally couldn't even make them fast enough (it took us six months to find our 2040u), and when it still had major advantages over what replaced it? A decade later they STILL HAVE these same advantages.

I don't care whether any conspiracy existed to deprive the masses of affordable and truly professional image and video quality, it wasn't my claim. I simply said it has been the net effect of the demise of CRTs, a claim that still remains unanswered here or anywhere else I've ever raised it. If Sony can still produce $23K Trinitrons for Hollywood, I don't think it's unreasonable to claim they're still able to produce $2300 Trinitrons for eight billion other people. I understand the technology doesn't scale well and is expensive, but a well built CRT can and does last several times as long as flatpanel backlights and OLED panels (btw have they solved the problem with fading blue pixels?), and at least from sales figures most people find anything over 24-27" to be pointless on a desktop anyway.

If you were not completely retarded, you would know that looking at sales figures from 10 years back, you would see people finding anything over 17-19" to be pointless.

Desktop displays are getting bigger and bigger, so unless you have something to prove otherwise, shut your shit hole.
 
If you were not completely retarded, you would know that looking at sales figures from 10 years back, you would see people finding anything over 17-19" to be pointless.

Desktop displays are getting bigger and bigger, so unless you have something to prove otherwise, shut your shit hole.
10 days? Was someone recently banned?

Between Sony, Mitsubishi, NEC, Iiyama, Viewsonic, etc., the world was swimming in wonderful 21-22" CRTs in the 1990s and 2000s. Well before 10 years ago. Not exactly pointless.
 
I don't understand why it has to be one or the other. What angel of God descended 10 years ago and decreed that nobody on planet Earth either needed or wanted CRT technology? Why was it necessary to practically abandon the technology when, at the time, they literally couldn't even make them fast enough (it took us six months to find our 2040u), and when it still had major advantages over what replaced it? A decade later they STILL HAVE these same advantages.
How many times do we have to explain this to you before you'll at least even acknowledge the attempt at an explanation?

The consumer made this happen. They liked LCDs and the vast majority of them don't even realize or recognize that their new LCD looks any worse than their CRT. Almost nobody (except us) ever put their new LCD next to their old CRT and compared the things we'd compare.

I can walk next door, and talk to my neighbor about his LCD TV. He'll say it's great. I'll mention black levels and contrast and he'll nod politely, and then repeat how happy he is with his TV. I'll visit my folks and have the same experience. And the in-laws. And my friend across town.

We are not entitled to CRTs. When the display manufacturers realized that they could pack ships full of easier-to-manufacture LCDs rather than expensive-to-manufacture, store, and transport CRTs, they took that opportunity. . . and they were able to do so because the vast majority of consumers didn't (and still don't) have a problem with LCDs. In fact, most were thrilled at how "thin and light" they are.

Why can't you buy a cheap CRT if Sony is still selling them to professionals? There's an economic principle called supply and demand. Look it up. They aren't producing a lot of them any more because there isn't as much demand as you merely assert. Those that truly need them will pay for them. Those that just really want them but aren't willing to pay the premium aren't entitled to what they want at whatever price they specify. Sorry, that's just the way the market works. The mob has spoken, and no reasonably priced CRTs for us.

Which is all to say. . . there are perfectly legitimate reasons why we can't have reasonably priced CRTs. You may not like the effect, but that doesn't make the reasons illegitimate. And the blame lies largely with the consumer.

If Sony can still produce $23K Trinitrons for Hollywood, I don't think it's unreasonable to claim they're still able to produce $2300 Trinitrons for eight billion other people.
Already addressed above. But in a nutshell: They make more selling LCDs to the people with no other options than they do manufacturing the $2300 CRT you claim so many people would buy. Otherwise, they'd do it. It's that simple.

If people were more discerning, and refused to settle for LCD, then we would be getting our CRTs. That's just the way it works. But the vast majority aren't that discerning, so they're happy to get their LCDs and the manufacturers are happy to shut down and retool the vast majority of their expensive CRT manufacturing plants.

All of this is why any technology that will come along in the future that will improve upon LCDs must also be relatively inexpensive (or rendered so quickly). Because the vast majority of people (absurdly, imo) consider LCDs "good enough". . . so any improvement brought along by things like OLED must essentially be "snuck in" to the mix and costs must quickly be brought down to be similar to LCD. . . that's where the "good enough" keeps smothering the "best" in the crib as happened with SED and FED. OLED is the first new technology to actually reach us, perform as advertised, and come down to a reasonable price ($3K is about where premium plasma was). That's why some of us have a little hope here. After being disappointed in the past.

If you're going to ask this question again, would you at least do us the courtesy of explaining where and how you find the explanation above insufficient? And please be aware that just blithely declaring swaths of it "bullshit" without substantiating that assertion really doesn't count.

--H

Shorter version: You can't have your $2,300 CRT because then Sony couldn't sell the product of their remaining/residual CRT manufacturing capacity for $23,000. The existence of residual manufacturing capacity and the premium price put on its product isn't a reasonable argument against the demise of CRTs, it's a result of the demise of CRTs. You have things precisely backwards.
 
Even shorter version: If the vast majority of consumers are willing (even enthusiastic) to buy objectively inferior (but superficially superior) product x versus the objectively superior (but superficially inferior) product y that is much more expensive to produce, ship, and store. . . then you stop selling y and you focus on x. No business is under any obligation to continue selling y merely because it's objectively superior in certain respects (while being inferior in other ways more readily visible to average consumers). And they won't/can't maintain the capacity to manufacture y (factories can't be left to sit idle or running below capacity) at a reasonable price so long as x is so well-received by the public, selling well, and undercutting the vast majority of the market for y. The End.
 
Even shorter version: If the vast majority of consumers are willing (even enthusiastic) to buy objectively inferior (but superficially superior) product x versus the objectively superior (but superficially inferior) product y that is much more expensive to produce, ship, and store. . . then you stop selling y and you focus on x. No business is under any obligation to continue selling y merely because it's objectively superior in certain respects (while being inferior in other ways more readily visible to average consumers). And they won't/can't maintain the capacity to manufacture y (factories can't be left to sit idle or running below capacity) at a reasonable price so long as x is so well-received by the public, selling well, and undercutting the vast majority of the market for y. The End.
Or as my cited article put it, the good is the enemy of the best. No argument there, and yet you continue to simply ignore the fact that China has technology "almost as good" as OLED at 1/20th the production cost, and continue to insist OLED has some kind of future for computer monitors. Please choose one of those two and go with it.
 
I bought my wife a 22" panel about eight years ago. It's still going strong. I think it cost $200-300.

What's your point? That CRTs (which I prefer, remember) are oh-so-durable? Well, I had a Nokia 17" monitor that I paid over $1400 for die on me after only 3-4 years. I had a Mitsubishi 2020u die on me after 2-3 years. And all three of my (albeit used) FW900s died despite gentle usage on my own part. Despite your personal experience, CRTs failed all the time.
I never said they didn't. I said quality CRTs have an expected lifespan several times that of backlights. So please explain what "going strong" means, and tell us how much original brightness a backlight is expected to lose after eight years of use, relative to a CRT.
 
If Sony can still produce $23K Trinitrons for Hollywood
They can't. The last Trinitron line shut down 6 years ago. The broadcast-grade monitors sold since have been stockpiled from that last run. That said stockpile is still going demonstrates just how low the demand for high-end CRTs is, and why it is uneconomical to keep a line going even when prices per tube are pushed up dramatically.

China has technology "almost as good" as OLED at 1/20th the production cost
What unspecified 'technology' is this? The return of SED? Yet another backlight variant for LCD? Breaking the minimum cel size for Plasma? Direct-emissive LED (i.e. OLED using inorganic phosphors rather than CHON-based ones)?
 
Odd how this thread went from OLED to CRT.
 
Or as my cited article put it, the good is the enemy of the best. No argument there, and yet you continue to simply ignore the fact that China has technology "almost as good" as OLED at 1/20th the production cost, and continue to insist OLED has some kind of future for computer monitors. Please choose one of those two and go with it.
You keep referring to one article (in ComputerWorld no less!) that merely repeats the claims of the Quantum Dot purveyors even as you dismiss the "marketing spin" of those pointing to actual OLED displays.

And now. . . this might sting a little. . . but QD does nothing to address the long-standing issues of LCDs that we all loathe and that OLED is essentially designed to address. . . black levels and contrast. Unlike you, I'll cite a source and actually quote it accurately:
We'll have to wait and see what effect quantum dot technology has on OLED TV development. QD sets might be able to produce great colors, but as LCD TVs they have shortcomings in areas such as black levels, contrast ratios, response times and refresh rates, and viewing angles.
This one link you've come up with about QD keeps being brought up by you whenever you want to change the subject and distract everyone from the shambles your position has become (Yields! No? WRGB is fake! No? Why can't I still have CRTs!?!). You ask a question, get long, detailed answers, ignore 98% of those detailed, substantive responses, and then cite this link again (while overstating its claims) as though it's relevant.

You handle these discussions like it's a partisan political argument. Your confirmation bias causes you to believe everything you read or see that excites your animosities towards your perceived/invented villains while dismissing outright anything that conflicts with your preconceived notions. All the while we hear about all your insider knowledge from within the industry and all your "reading" and yet the only link you've ever been able to produce is some claims from the LCD manufacturers that QD can be cheaply added to LCDs (Sony has already been doing so since 2013) and cause colors to look slightly better. Which, again, this might sting. . . does not address the primary reasons most people are so interested in OLED.

The hope is that those who covet contrast/black-level/dynamic-range ("videophiles") will buy the larger OLED screens now available while production ramps up and costs come down. As those economies of scale come into play and prices come down, the hope is that OLED will become price-competitive with LCD (and plausibly even cheaper to manufacture). QD doesn't enter into that equation. Because those hopefully willing to pay the premium for OLED displays through this current critical period won't be fooled by QD's marginal improvements to colors.

QD doesn't address what OLED addresses. And by the time the average consumer is interested in the trade-offs between the two technologies, OLED will be more price-competitive with LCDs, and the choice will be easier, even for non-videophile consumers.

Now, quick! Change the subject again!
 
I never said they didn't. I said quality CRTs have an expected lifespan several times that of backlights. So please explain what "going strong" means, and tell us how much original brightness a backlight is expected to lose after eight years of use, relative to a CRT.
Hey! Look! I'm Jeff and just had my questions answered and my points demolished! So now I'm going to bog us down in irrelevant minutiae as though the durability of a CRT tube vs an LCD backlight has suddenly been crucially relevant all this time (and as though CRT-based displays didn't also fail spectacularly in many different ways all the time just like any other consumer product)! Pay no attention to all of the ways my claims and assertions have been dismantled! Because I've found this tiny little redoubt upon which I can still stand and wage my war of rhetorical attrition against facts and inconvenient realities!

I'm honestly beginning to suspect that he's doing this on purpose. Since he clearly doesn't value intellectual honesty (look it up Jeff, that's not a psychological term, it's a rhetorical one), he doesn't mind seeing his larger points demolished again and again and feels no obligation to either modify his contentions or acknowledge the criticisms before he just restates them as though nobody has ever addressed them. He just wants to see how much he can get everyone to type with these bizarre tangents and one-sentence, obtuse replies to very long, carefully argued, and substantive posts.

So now, we'll either hear about how the lack of an OLED monitor today means we won't ever see one in the future (bad logic). . . or perhaps he'll bring up his favorite misrepresented/misleading article about Quantum Dot again? One can never tell, but that's half the fun! The anticipation!

--H

P.S. The answers to your questions are: "working great!" and "nobody f'ing cares in the context of this discussion." Nor should you, if you respected your own arguments if not those with whom you are arguing.
 
What unspecified 'technology' is this?
Quantum Dot technology. Already been in use by Sony since 2013. It's only news that other LCD manufacturers are now going to implement it too. It does nothing to improve black levels, contrast, or dynamic range (or viewing angles, response time, etc.). It just marginally improves colors on LCD. He brings it up whenever he is getting his ass (rhetorically) handed to him and he needs to change the subject to preserve the illusion that he can reply substantively at all. This happens often.
 
CRT vanished due to weight and transport costs.
If you can make a lighter screen, you fit more into a ship and sell more per unit of transport cost, so it was an easy choice for the manufacturers.
 
Returning the discussion from its off-topic tangents, the hard facts for the beginning of 2015 are:

1. Mass manufacturing of sizable 4K OLED panels is starting, and this is a light at the end of the tunnel for seeing OLED monitors in the future, or at least 4K OLED TVs in sizes suitable for desktops.

2. No OLED monitors in 2015 since LG will not produce any OLED panels below 55".
 
I pray this will be the last thing I feel compelled to say in here. . .

-------------------------------------------------------

Time and time again, you've said things that do not ring true. Whether it was about OLED yields, or the "reading" you've done that bizarrely informed you that WRGB OLED was "vastly inferior" and represented a "scam" with "washed out IQ" because it was "fake OLED", or your claims that even LG has admitted that WRGB is inferior and a "kludge". . . time and time again, we have asked for substantiation of these assertions and the only link you have ever provided was in support of a suddenly all-new argument about how Quantum Dot technology will squeeze OLED out of the market.

Not only do you clearly overstate that article's claims, but you conveniently fail to address the fact that (as already stated immediately above, I'm sure you'll ignore it all) QD does not address any of the long-standing problems with LCD panels (it does not improve black levels, contrast, dynamic range, viewing angles, response time, or any of the other things OLED was designed to improve upon). Funny how you never mentioned that! I've gone into more detail above (again, which you'll ignore) about why this wholly undercuts your argument that QD will strangle OLED in its crib. I need not repeat it here.

But this bears pointing out. . . you don't just exaggerate that article's claims. You don't just replace all the "mays" and "mights" with "is" and "will". . . but you outright misstate what's there. Repeatedly.

I stand by that claim. If you read my cite, China has technology almost as good as OLED and it's 1/20th the cost to produce. So if you can manage to rub those last two precious brain cells together, please tell us how many consumers you think will opt for OLED.

Or as my cited article put it, the good is the enemy of the best. No argument there, and yet you continue to simply ignore the fact that China has technology "almost as good" as OLED at 1/20th the production cost, and continue to insist OLED has some kind of future for computer monitors. Please choose one of those two and go with it.

Here's what that article (amidst what is obviously QD/LCD-slanted PR) actually states:
Then last week, China-based TCL, the third largest manufacturer of flat-screen TVs in the world, disclosed plans to ship a 55-in quantum dot LCD TV, which offers the same quality picture as OLED, but at a one-third the cost.
How do you get from 1/3 to 1/20? I hesitate to call this an outright lie. For all I know, you did your own bizarre math and failed to notice where they switch from discussing 1080P displays to 4K displays (or some other error). Maybe you posted that otherwise unsubstantiated "1/20" figure in good faith somehow. But, regardless, time and time again you have just been exposed as an unreliable and supremely biased purveyor of information. To the point where you assert LG's TV is $11k when it is $3k. Or that QD manufacturing costs are 1/20 that of OLED when your own oft-cited "source" specifies 1/3. And, one should point out, that's last year's OLED manufacturing cost and makes no mention of the targeted manufacturing costs (which OLED PR plausibly states will be below LCD's).

Argument by "whack-a-mole" isn't a terribly effective tactic. Because the point is that each of your arguments is always "whacked". . . but you make it entirely too easy when you couch your "arguments" (usually laced with condescension and/or disdain) with wholly invented facts that never stand up under scrutiny.

Argument by "whack-a-mole" is an effective tactic to just annoy and delay the inevitable, though. Hoping to just exhaust those who have repeatedly proven you wrong and even disingenuous. I think it's pretty clear that's your only real motivation now.

How is this discussion not over yet? You have been proven to be mistaken at best and outright dishonest at worst at every turn. Is your goal here just to keep plodding forward and changing the subject whenever you can in hopes we give up on you and you just then "win" through attrition? I have to believe that's the case.

Unless your goal here is merely to troll people, the only reasonable course of action for you here that would demonstrate any integrity would be to just admit that you can't substantiate the vast majority of the things that you've said and that your opinions on this matter are not as well-founded as you tried to make us believe. That doesn't necessarily mean your overall opinion is wrong. OLED could still turn out to be a bust in the monitor (and even TV) segment. But not for the reasons you specified. Because those have been dismantled. Repeatedly.

--H
 
 
I pray this will be the last thing I feel compelled to say in here. . .

-------------------------------------------------------

Time and time again, you've said things that do not ring true. Whether it was about OLED yields, or the "reading" you've done that bizarrely informed you that WRGB was "vastly inferior" and represented a "scam" with "washed out IQ" because it was "fake OLED", or your claims that even LG has admitted that WRGB is inferior and a "kludge"
Likewise on the final-shot wish, but in my experience people tend to accuse others of precisely what they themselves are guilty of. If you would learn to read what is written instead of what you wish to read, other people, including me, wouldn't have to waste their time responding to and correcting it.

My reading (in quotations no less by you) was and is on the topic of the addition of white pixels to RGB OLED, an idea that was not LG's, because everyone in the industry who actually knows wtf they're talking about is aware it's a kludge, and the only way to produce OLED panels with workable yields. When LG says they found a way to go from 10% yields to 80%, that's what they mean. They have not made any kind of major breakthrough or reinvented the technology. Long story short, these panels will contain bad pixels galore; they're simply harder to detect with added white pixels, which can and do function as little more than pixel-sized backlights.

As a matter of fact, if God forbid you were slightly better read, you would have already known the charge of "fake" isn't mine but Samsung's.

Again I hope to be through defending myself, my knowledge and experience with someone who clearly doesn't know half as much about the industry and current state of affairs in it as I do. For whatever reason you're the only member I find myself in constant contention with, aside from Mr. 12 Days Now.
 