Sony OLED PVM 2541A TRIMASTER EL

You also have the math backwards on the phone/TV screen economics. The coming OLED TVs have roughly 100 times the screen area and would require 100 times the OLED material to build a screen, IF both had 100% yield. Given realistic yields, we can expect OLED material usage to be several hundred times higher on a big-screen TV.

$50 cell screen * ~300 times the material cost = $15,000 for a TV screen.
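For illustration, here is that arithmetic as a minimal sketch; the yield figures are pure assumptions for the example, not industry data:

```python
# Back-of-envelope OLED material cost scaling, phone vs. TV.
# The yield figures below are illustrative assumptions, not industry data.

phone_material_cost = 50.0  # $ of OLED material per phone screen
area_ratio = 100.0          # TV screen area / phone screen area

phone_yield = 0.9           # assumed fraction of good phone panels
tv_yield = 0.3              # assumed fraction of good TV panels

# Every scrapped panel still consumed its material, so material cost
# per *good* panel scales with area and inversely with yield.
tv_material_cost = phone_material_cost * area_ratio * (phone_yield / tv_yield)
print(f"Material cost per good TV panel: ${tv_material_cost:,.0f}")  # $15,000
```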

So if anything, they would need to use cheaper materials in the TV to keep costs from running away.

Realistically, they use the same materials across the board. Everyone is fighting the same physics on OLED burn-in. There is zero evidence to the contrary.

While OLED material costs do contribute, the total cost per area is dominated by the TFT type, the defect rate, the number of steps required, and the process time.

The LG LC150WXD panel was estimated to cost $600 apiece to produce in 2010. That cost reflected its expensive LTPS backplane, run through a small-output 4.5G plant.

Future large OLED displays will use an 8.5G process with dimensions similar to LCD TV lines, as well as various cost-reduction methods you are already familiar with, such as IGZO TFTs and RGBW techniques.

The objective of these panel makers is to bring the per-area cost of an OLED panel down to that of LCD. That goal is still distant considering the capex and yield requirements, but OLED has an edge in eliminating the backlight lamp, light-guide plate, second polarizer, and so on.
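To make the yield problem concrete, here is a rough sketch using the standard Poisson defect-yield model Y = exp(-D*A); the defect density and panel areas are illustrative assumptions, not real fab figures:

```python
import math

# Poisson yield model: the probability a panel has zero killer defects
# is exp(-D * A), where D is defect density and A is panel area.
DEFECTS_PER_M2 = 1.0  # assumed killer-defect density, not a real fab figure

def panel_yield(area_m2: float) -> float:
    """Fraction of panels that escape all killer defects."""
    return math.exp(-DEFECTS_PER_M2 * area_m2)

phone_area = 0.005  # m^2, roughly a small phone screen (assumed)
tv_area = 0.5       # m^2, roughly 100x the phone area (assumed)

print(f"Phone panel yield: {panel_yield(phone_area):.1%}")  # ~99.5%
print(f"TV panel yield:    {panel_yield(tv_area):.1%}")     # ~60.7%
```

The same defect density that is negligible at phone size scraps a large fraction of full-size panels, which is why capex and yield dominate the cost gap.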

If anything, OLED so far has a way to go before it is even as good as plasma.

I don't think anyone should recommend OLED for desktop usage unless they are ready to be the first guinea pig, and I note you aren't.

I don't see any basis for you saying this. I don't know the details of PDP degradation except to note that it involves ion sputtering during the sustain discharge. OLED degradation is an effect of applied current. There is no need for a guinea pig, as it can be roughly calculated with a few known parameters. I'm not sure what the exact relationship is, but I suspect it is non-linear. Eventually there will be emitters for which the current required for 60-80 cd/m2 output is low enough to satisfy computer interface operation. Using a low-brightness colour scheme for static interface areas and clamping down on large static white fields will help too.
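For what it's worth, OLED lifetime testing commonly uses the acceleration relation t_half(L) = t_half(L0) * (L0/L)^n, with the exponent n typically quoted around 1.5-2, which is exactly the kind of non-linearity I mean. A rough sketch, where the rated half-life and exponent are assumptions for illustration:

```python
# Rough OLED half-life extrapolation via the common acceleration
# relation t_half(L) = t_half(L0) * (L0 / L) ** n.
# The rated figures and exponent below are illustrative assumptions.

RATED_LUMINANCE = 200.0     # cd/m^2, assumed test luminance L0
RATED_HALF_LIFE = 10_000.0  # hours to 50% luminance at L0 (assumed)
N = 1.7                     # assumed acceleration exponent (non-linear)

def half_life_hours(luminance: float) -> float:
    """Estimated hours until the emitter fades to half luminance."""
    return RATED_HALF_LIFE * (RATED_LUMINANCE / luminance) ** N

for L in (60.0, 80.0):
    print(f"{L:>4.0f} cd/m2 -> ~{half_life_hours(L):,.0f} h to half luminance")
```

The non-linearity means running interface areas at 60-80 cd/m2 buys far more than a proportional gain in emitter life.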

More tomorrow perhaps.
 

Obviously whoever buys this thing is going to be a guinea pig. Are you recommending zzcool purchase it and possibly see it burn in and be ruined while he's still making installment payments on it? And would the warranty cover usage that is explicitly against the user instructions?

This isn't some theoretical exercise. If I were rich, I'd get a few and throw them away as they burned out. Alas... I'm not.
 

Oh no, he'd be constantly fighting the anti-burn-in scheme if he tried. The manual gives clear instructions on what the monitor is for and what the consequences are of driving bright static images for long hours. He'd get away with occasional computer usage at low brightness levels, but that is all.

I route all my video and game activity to my 15EL. LCD and OLED together are an optimal match, especially when calibrated.
 
I guess this would not work well for gaming and PC usage?

It would work fine; however, it's limited to 60 Hz and costs $35K USD. Most people aren't interested in paying that kind of money for ANY gaming display, let alone one limited to 60 Hz.
 
Input lag is surprisingly low on that thing. Still, I think a much larger consumer-grade OLED would give you a better effect overall, even if the image isn't reference-quality like this.

Yeah, FWIW you could get three 77” LG OLEDs for a third of the “retail” cost of this masterpiece. Even if you were to pick this up “cheap”, I expect it wouldn't go for much under 10 grand even if it fell off the back of a truck. Only the most discerning videophiles would choose this one display over whole-FOV coverage, I would think, if you're going to play at a slow 60 Hz and below.
 