Whoisthisreally ([H]ard|Gawd) · Joined: Feb 14, 2009 · Messages: 1,143
You also have the math backwards on the phone/TV screen economics. The coming OLED TVs have roughly 100 times the area of a phone screen and would require 100 times the OLED material to build, IF both had 100% yield. Given realistic yields, we can expect OLED material usage per good panel to be several hundred times higher for a big-screen TV.
$50 cell screen × ~300 (material cost multiplier) = $15,000 for a TV screen.
So if anything, they would need to use cheaper materials in the TV to keep costs from running away.
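As a rough sanity check on the scaling above, a minimal sketch; the yields here are illustrative assumptions, not real production figures:

```python
# Rough sketch of OLED material usage scaling with area and yield.
# All numbers are illustrative assumptions, not actual fab data.

def material_multiplier(area_ratio, yield_small, yield_large):
    """Material consumed per *good* panel scales with area and
    inversely with yield (scrapped panels still consume material)."""
    return area_ratio * (yield_small / yield_large)

# Assume a TV has ~100x the area of a phone screen, phone-line yield
# is 90%, and early large-panel yield is a much lower 30%.
mult = material_multiplier(area_ratio=100, yield_small=0.9, yield_large=0.3)
print(round(mult))       # 300 -> "several hundred times" the material per good panel

phone_screen_cost = 50   # $, material-dominated cost assumed for illustration
print(round(phone_screen_cost * mult))  # 15000, matching the $15,000 figure above
```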
Realistically, they use the same materials across the board. Everyone is fighting the same physics on OLED burn-in. There is zero evidence to the contrary.
While OLED material costs do contribute, the total cost per area is dominated by the TFT type, the defect rate, the number of steps required, and the process time.
The LG LC150WXD panel was estimated to cost $600 each to produce in 2010; that cost reflected an expensive LTPS backplane run through a small-output 4.5G plant.
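The defect-rate point can be made concrete with the standard Poisson yield model, yield ≈ exp(−D·A) for defect density D and panel area A; the densities and areas below are assumptions for illustration:

```python
import math

def poisson_yield(defect_density, area):
    """Classic Poisson yield model: probability of zero killer
    defects landing on a panel of the given area."""
    return math.exp(-defect_density * area)

# Illustrative only: same process defect density, phone vs TV panel.
D = 0.5                          # killer defects per unit area (assumed)
phone_area, tv_area = 0.01, 1.0  # relative areas, TV ~100x the phone

print(poisson_yield(D, phone_area))  # ~0.995: small panels barely notice defects
print(poisson_yield(D, tv_area))     # ~0.61: the same process hurts at TV size
```

This is why the same defect density that is tolerable on a 4.5G phone line becomes a yield problem at 8.5G TV sizes.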
Future large OLED displays will use 8.5G processes with dimensions similar to LCD TV lines, along with cost-reduction methods you are already familiar with, such as IGZO TFTs and RGBW techniques.
The objective of these panel makers is to reduce the cost of the OLED panel to that of LCD per unit area. That goal is still distant given the capex and yield requirements, but OLED has an edge in eliminating the backlight lamp, light-guide plate, second polarizer, and so on.
"If anything, so far OLED has a way to go before it is even as good as Plasma. I don't think anyone should recommend OLED for desktop usage until they were ready to be the first guinea pig, and I note you aren't."
I don't see any basis for you saying this. I don't know the details of PDP degradation, except to note that it involves ion sputtering during the sustained discharge. OLED degradation is an effect of applied current, so there is no need for a guinea pig: it can be roughly estimated from a few known parameters. I'm not sure what the exact relationship is, but I suspect it is non-linear. Eventually there will be emitters for which the current required for 60-80 cd/m² output is low enough to suit computer interface operation. Using a low-brightness colour scheme for static interface areas and clamping down on large static white fields will help too.
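The "roughly calculated" claim can be sketched with the commonly quoted luminance-acceleration relation for OLED half-life, t50 ∝ 1/L^n, with n typically reported around 1.5-2. The rated lifetime, rated luminance, and exponent below are assumptions for illustration, not specs for any real panel:

```python
def lifetime_at(target_nits, rated_nits, rated_hours, n=1.7):
    """Scale a rated half-brightness lifetime to a lower drive luminance
    using the empirical acceleration law t50 ~ 1/L^n.
    n is an assumed exponent; real values depend on the emitter."""
    return rated_hours * (rated_nits / target_nits) ** n

# Assumed: a panel rated 10,000 h to half brightness at 200 cd/m2.
# Driving it at desktop-friendly 60-80 cd/m2 extends the half-life
# several-fold, which is the whole argument for dim static UI areas.
for nits in (80, 60):
    print(nits, round(lifetime_at(nits, 200, 10_000)))
```

The non-linearity works in the desktop user's favour here: halving the drive luminance more than doubles the projected half-life.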
More tomorrow perhaps.