Even a chance for OLED monitors?

"Oled backlight" does not exist.
Oled by definition means that there is no backlight at all, there is just the light generated by the pixels thselves. That is why this technology is completely angle view problem free and pixel response time is comparable with crt.
If they add a backlight to it to increase the life time artificially then they are bringing back the entire list of lcd problems, negating the entire purpose of oled technology.
I pray that when they do that it will be without any backlight.
 
Guys, no reason to get your panties in a bunch over some wild speculation. I'm just saying that lamp manufacturers focusing on simplified, white-only OLED panels might end up outperforming WLED backlights and pure OLED screens on cost and quality by a wide enough margin for backlight use to come into play before real OLED panels become mainstream. That, or if pure OLED ultimately fails as a computer-screen technology for some unlikely reason.

I guess I got the idea from the description of W-OLED being 'white led with a color filter'. Sounds very much like a 'backlight solution' to me. :)
 
Don't hold your breath. My advice is to stop being a display enthusiast for the next 5-10 years.
 
I guess I got the idea from the description of W-OLED being 'white led with a color filter'. Sounds very much like a 'backlight solution' to me. :)

W-OLED should be imagined as a white OLED sub-pixel with either an R, G, or B filter placed on top of it, which gives it its final colour. This is a much simpler arrangement than the ye olde backlight with a TFT matrix and LCD matrix plastered in front of it.

Basically this means fewer layers, easier production, and essentially none of the disadvantages of an LCD (viewing angle, ghosting, IPS glow, gamma shift, etc.).

Yes, you could use W-OLED for backlight purposes, but who the heck would want to do that? :p
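To make the distinction concrete, here is a minimal, purely illustrative sketch (Python, with made-up transmission and emission numbers, not measurements of any real panel) of the two stacks: an LCD sub-pixel attenuating a shared backlight versus a W-OLED sub-pixel emitting its own light through a colour filter. The point is simply that the W-OLED black level is whatever the sub-pixel emits at zero drive, not whatever leaks through the panel.

```python
# Illustrative toy model only; all constants are assumptions, not measured values.

BACKLIGHT_NITS = 500.0      # assumed luminance of the shared LED backlight
LC_LEAKAGE = 0.0005         # assumed fraction of light an LC cell passes when "off"
FILTER_TRANSMISSION = 0.30  # assumed colour-filter transmission

def lcd_subpixel(drive: float) -> float:
    """Luminance of one LCD sub-pixel for drive in [0, 1].
    The backlight is always on; the LC cell only attenuates it."""
    aperture = LC_LEAKAGE + (1.0 - LC_LEAKAGE) * drive
    return BACKLIGHT_NITS * aperture * FILTER_TRANSMISSION

def woled_subpixel(drive: float, peak_nits: float = 150.0) -> float:
    """Luminance of one W-OLED sub-pixel: a white emitter behind a colour
    filter. At zero drive it emits nothing at all."""
    return peak_nits * drive * FILTER_TRANSMISSION

print("LCD black  :", lcd_subpixel(0.0))    # non-zero: backlight leaks through
print("WOLED black:", woled_subpixel(0.0))  # 0.0: the emitter is simply off
```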
 
I guess I got the idea from the description of W-OLED being 'white led with a color filter'. Sounds very much like a 'backlight solution' to me. :)

But it isn't. There are still >6 million (Actually >8 million in LG set) individually controlled OLED subpixels. That isn't a backlight, those are active OLED pixels.

WOLED seems to confuse some people into thinking it isn't "Real OLED" and only R,G,B OLED is real OLED. This is nonsense.

WOLED with filters is actually a superior solution to a couple of big OLED problems.

Short blue life and uneven color wear: using WOLED completely fixes this issue. Each color sub-pixel now has exactly the same chemical composition and durability.

Color tuning: OLED has another problem matching the proper color spectrum, especially for the blue component. Color filters can be precisely tuned, giving a better match to the target Rec. 709 color primaries.


I think it is much more likely that Samsung will move to WOLED, than LG will move to RGB OLED. WOLED is mainly advantages, with minimal disadvantages.
 
WOLED seems to confuse some people into thinking it isn't "Real OLED" and only R,G,B OLED is real OLED. This is nonsense.

Yeah, I'm tired of hearing this. White OLED schemes were actually developed by Kodak, which employed Ching W. Tang, who while working there invented the OLED.
 
I am sure screen burn is an issue. One of the reasons it's taken so long to get OLED to market is because of problems with unequal ageing of the different colours. This must happen quite quickly, or it wouldn't be a problem. And that must mean susceptibility to screen burn.

That said, while there are reports of screen burn on Samsung Super AMOLED phone screens, these are not so widespread as to indicate any showstopping problem. My Samsung Galaxy S3 shows no signs of it after about a year of use, and I am not very careful about screen brightness and static images.

It is an issue. On my PS Vita (which has an OLED screen), if you watch in complete darkness, you can see image retention right away. It's fairly dim though - in a normally lit room, it's hard to see, even if you leave something like the netflix screen up with no motion for 20 mins.
 
I just want to see a monitor that looks exactly like my iPhone 4S. The rest I don't care about. I can read for hours on my iPhone, regardless of the text size, without any eye strain at all, and I can't say the same about any other monitor out there. That is all I want, an iPhone 4-like monitor: no backlight, perfect viewing angles, awesome colors, awesome response, no tons of light blasting towards my eyes like with current monitors. I have yet to see a single monitor on the market with proper, paper-like whites at a luminance of 80-90 cd/m². Or better, I want to see a monitor whose whites don't make you feel like you're staring at the sun. Is there any? I doubt it...
I came to one conclusion though: the only monitors that actually get really close to that are the glossy displays, since they let white be white - they don't make it sparkle or glow, and they don't force you to turn up the brightness to compensate for the AG coating.
 
Snowdog: What about that Sharp IGZO stuff? Somewhat confusing since I read they use it for both WOLED and TFT.

I just want to see a monitor that looks exactly like my iphone 4S.
Isn't that an LED-lit IPS? Most probably without PWM flicker, though. Then a regular ACD should do the job for you, no?
 
Snowdog: What about that Sharp IGZO stuff? Somewhat confusing since I read they use it for both WOLED and TFT.


Isn't that an LED-lit IPS? Most probably without PWM flicker, though. Then a regular ACD should do the job for you, no?

I am shocked. I never researched this, as I was told by some friends that it is AMOLED; guess it is IPS after all. Guess any good-quality glossy IPS with proper contrast should do it for me. :eek:

Guess I will see proof of the AG coating causing it for me when my glossy Crossover arrives. Fingers crossed!
 
I am shocked. I never researched this, as I was told by some friends that it is AMOLED; guess it is IPS after all. Guess any good-quality glossy IPS with proper contrast should do it for me. :eek:

Guess I will see proof of the AG coating causing it for me when my glossy Crossover arrives. Fingers crossed!

Well, you'd need an IPS screen with an A-TW polarizer to get rid of the IPS glow on large screens; on a small screen like yours you won't be able to notice it.
 
Snowdog: What about that Sharp IGZO stuff? Somewhat confusing since I read they use it for both WOLED and TFT.

IGZO is just a different transistor material, used instead of amorphous silicon. It can be used in OLEDs or LCDs, because both use transistors to control their display elements.

But that doesn't mean OLEDs and LCDs are being combined, any more than the fact that both use power does. All the Sharp IGZO displays I have seen discussed are standard LCDs with standard LED backlights.

No one is putting an OLED backlight in an LCD, because it would drive up costs significantly while offering no real visual benefit.

By the time OLED backlights were inexpensive enough to use, real OLED displays would be even less expensive than OLED-backlit LCDs, making the latter pointless.

There won't be OLED-backlit LCDs.
 
What about Sony Crystal LED?

AFAIK it is using non-organic LEDs, as used everywhere else.

No problems with uneven aging or wear - normal LEDs are quite long-lived.

So, why bother with this organic LED nonsense?
 
What about Sony Crystal LED?

AFAIK it is using non-organic LEDs, as used everywhere else.

No problems with uneven aging or wear - normal LEDs are quite long-lived.

So, why bother with this organic LED nonsense?

Image quality wise, an LED array like "Crystal LED" should be basically the same as OLED. LEDs are also a very mature technology, and I'm sure panel consistency and lifespan are much better with LEDs.

The first real problem is power consumption, which is much higher in LED displays.

The second, more serious issue is that you can only make LEDs so small. These Crystal LED displays are 1920x1080 at 55"; that's a really low DPI. It's likely that discrete LEDs simply can't be scaled down to monitor- or phone-sized displays - if it were possible, manufacturers would have started shrinking LEDs long before investing in highly experimental OLED tech.

tl;dr: Crystal LED is a great tech for as long as it's relevant. Unfortunately it's likely to hit a brick wall as DPI counts continue to rise.
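A quick back-of-the-envelope check of that DPI point (hypothetical Python sketch, using only the 55" / 1920x1080 figure mentioned above and a common 24" desktop size for comparison):

```python
import math

def ppi(diag_inches: float, w_px: int, h_px: int) -> float:
    """Pixels per inch for a given diagonal size and resolution."""
    return math.hypot(w_px, h_px) / diag_inches

# Sony Crystal LED prototype: 1920x1080 at 55"
print(round(ppi(55, 1920, 1080)))   # ~40 PPI, i.e. a pixel pitch of roughly 0.64 mm

# A typical 24" 1920x1200 desktop monitor, for comparison
print(round(ppi(24, 1920, 1200)))   # ~94 PPI - each pixel more than twice as small per side
```

So to reach even ordinary desktop-monitor density, every discrete LED (and its spacing) would have to shrink by more than half in each dimension, and far more than that for a phone.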
 
IGZO is just a different transistor material, used instead of amorphous silicon. It can be used in OLEDs or LCDs, because both use transistors to control their display elements.

But that doesn't mean OLEDs and LCDs are being combined, any more than the fact that both use power does. All the Sharp IGZO displays I have seen discussed are standard LCDs with standard LED backlights.
I never thought they were combined. I was merely curious about IGZO and how it related to OLED. Directed it to you, since you know your stuff.

IGZO OLED: http://www.engadget.com/2012/06/01/sharp-igzo-oled-lcd-sid-2012/
 
I never thought they were combined. I was merely curious about IGZO and how it related to OLED. Directed it to you, since you know your stuff.

IGZO OLED: http://www.engadget.com/2012/06/01/sharp-igzo-oled-lcd-sid-2012/

Amorphous silicon, or a:Si, used in current LCD monitors does not have enough carrier mobility to supply sufficient current to an OLED. Low-temperature polycrystalline silicon (LTPS) is used in current OLED devices, but its cost is very high and TV-sized devices are impractical.

IGZO/oxide has roughly 10x the mobility of a:Si, so it can drive AMOLED effectively. It is also compatible with existing LCD lines, so companies can retool an existing LCD TV production line for AMOLED without heavy expenditure. This is what LGD is doing with its 55" OLED TV. It also allows more efficient and higher-resolution LCDs compared to a:Si, as the transistor area can be reduced.
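To see why that mobility gap matters for driving an OLED pixel, here is a rough sketch using the standard square-law model for a TFT in saturation. All numbers are assumed, ballpark values for illustration, not specs of any real panel or process:

```python
# Square-law saturation current of a thin-film transistor:
#   I_D = 0.5 * mu * C_ox * (W / L) * (V_GS - V_th)^2
# Every constant below is an assumption chosen for illustration.

def tft_drive_current(mu_cm2_per_vs: float,
                      c_ox_f_per_cm2: float = 1.5e-8,  # assumed gate capacitance per area
                      w_over_l: float = 5.0,           # assumed channel geometry
                      v_overdrive: float = 5.0) -> float:
    """Saturation drain current (amps) for a given channel mobility."""
    return 0.5 * mu_cm2_per_vs * c_ox_f_per_cm2 * w_over_l * v_overdrive ** 2

i_asi  = tft_drive_current(mu_cm2_per_vs=0.8)   # a:Si mobility is typically ~0.5-1 cm^2/Vs
i_igzo = tft_drive_current(mu_cm2_per_vs=10.0)  # IGZO mobility is roughly 10x higher

print(f"a:Si : {i_asi * 1e6:.2f} uA")   # ~0.75 uA with this geometry and drive
print(f"IGZO : {i_igzo * 1e6:.2f} uA")  # scales linearly with mobility, so >10x more
```

With the same pixel-sized transistor and drive voltage, the deliverable current scales directly with mobility, which is why a:Si struggles to feed an OLED sub-pixel while IGZO (or LTPS) can.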

I received the EL9500 today. It lives up to the hype although I need to buy a colorimeter to correct the white point. Picture thread tomorrow, I hope.
 
I've been impressed every time I've seen a device with an OLED screen.

The cost of a decent (IPS, VA) LCD panel is still higher than most people are willing to pay for a computer monitor, though, so that will certainly limit the adoption of newer, even more expensive technologies for PC monitors.
 
Depends on what you call "quality". The kind of quality people looking at OLED usually want is "true" blacks: no 0.10 cd/m², no 0.03, but 0.00.

And for that, the best two monitors I could currently find are actually NOT expensive. More about them here

Expensive monitors may be great for accurate color reproduction for print purposes and such, but I doubt that entertainment content (which is why most people would spend a little more on a monitor) is designed with e.g. wide-gamut displays in mind. Hell, even watching "The Dark Knight" the other day with my monitor calibrated to an L* response curve (which is supposedly a "great choice" because it mimics the human eye best), the shadows became so bright that camera noise became way too dominant in the image. And of course the image loses contrast compared to e.g. gamma 2.2.
Which is why I think things like wide gamut and L* calibration may be great for professionals but not important when it comes to entertainment - they can even be a negative.
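That shadow-lifting effect can be sanity-checked numerically. Here is a small sketch comparing the target relative luminance of a gamma 2.2 curve with an L* (inverse CIE lightness) curve at a few dark input levels; the values are relative to white and are not tied to any particular monitor:

```python
def gamma_22(v: float) -> float:
    """Relative target luminance for video level v in [0, 1] on a gamma 2.2 display."""
    return v ** 2.2

def lstar(v: float) -> float:
    """Relative target luminance for video level v in [0, 1] on an L*-calibrated
    display: invert CIE lightness, including the standard linear segment near black."""
    lightness = v * 100.0
    if lightness > 8.0:
        return ((lightness + 16.0) / 116.0) ** 3
    return lightness / 903.3

for v in (0.05, 0.10, 0.20):
    print(f"input {v:.2f}: gamma 2.2 -> {gamma_22(v):.4f}   L* -> {lstar(v):.4f}")
# At the darkest levels, L* asks for up to ~4x more light than gamma 2.2
# (the curves converge by the mid-tones), which is why shadow detail - and
# the camera noise in it - becomes more visible.
```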
 
For the issue of burn-in, just remember that it's not an issue with the OLED displays found in phones and the Vita. Maybe some other weird issues with "blotches", but I haven't seen burn-in on the Samsung phones I've played with. Phones should be really susceptible to burn-in, right?
 
Here are some pictures I've taken. The pocket camera refuses to render sharply nor will it capture dark tones and highlights simultaneously, so I've given up. If it were up to scratch I'd make a new thread.

[Photo]

[Photo: Pulse, film]

[Photo: Genius Party]

[Photo: Nature's Great Events]


Something interesting to note is that black is not infinite: When viewed in a dark room with my peripheral vision, there is a just discernible glow. It is still pulling something like 30K:1 CR.
 
Yeah, one of course can't show the picture quality through devices that can't capture the whole spectrum.
The only way would be to create an HDR picture using a camera with manual exposure, and upload the 32-bit image.

But that's very unfortunate that it's not completely black in a darkened room... hopefully it's just this particular display and not an aspect of the technology in general.
 
But that's very unfortunate that it's not completely black in a darkened room... hopefully it's just this particular display and not an aspect of the technology in general.

It isn't unfortunate. Black is still jet black. It is so deep that it is not visible to the central, colour sensitive area of the retina in an environment excluding all light sources.
 
It is unfortunate for me because I don't care about "relative blackness". If I view a completely black image on my CRT projector, I don't have to talk about my retina, I can simply and confidently say that there is nothing visible on the wall. Nothing at all. Which is exactly what I hoped OLED displays could deliver.
 
CCD on maximum exposure cannot detect AMOLED. 3000:1 PVA is on the right

[Photo]


Cursor visible
[Photo]


To say that AMOLED glows in the dark would be to misconstrue me.
 
Do you have a reliable light meter? I'd wonder how it compares to displays like the Pioneer Kuro or the Sharp Elites. The Pioneer Kuros (9G) glow in a dark room too, but they can also pull off something like a 30,000:1 real static contrast ratio with a black level around 0.0031 cd/m². Honestly I don't see how anyone would need anything darker than that - when there is even a minimal amount of light on the screen it will look completely black, even in a dark room. I'd also be interested in the maximum brightness, as this is problematic with plasmas. Can they hit the same 300 cd/m²+ levels as LCDs while retaining that low black level, or are they more limited like plasma, in the sub-150 cd/m² range?
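Those two Kuro numbers pin each other down, so the comparison is easy to sanity-check (illustrative arithmetic only, using just the figures quoted above):

```python
# Illustrative arithmetic using the Pioneer Kuro figures quoted above.

black = 0.0031          # cd/m^2, quoted black level
ratio = 30_000          # quoted static contrast ratio

white = black * ratio
print(f"implied full-screen white: {white:.0f} cd/m^2")    # ~93 cd/m^2

# If a display could hold that same black level at an LCD-like 300 cd/m^2 peak,
# the on/off contrast ratio would grow accordingly:
print(f"ratio at 300 cd/m^2 peak: {300 / black:,.0f}:1")   # ~97,000:1
```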
 
It is unfortunate for me because I don't care about "relative blackness". If I view a completely black image on my CRT projector, I don't have to talk about my retina, I can simply and confidently say that there is nothing visible on the wall. Nothing at all. Which is exactly what I hoped OLED displays could deliver.

I owned a 34" CRT HD Pananonic TV. It could do what you're saying but whenever any light source came on the screen there would be blooming. So even though it had a theoretically infinite contrast ratio the ANSI contrast was very poor. Moving on I got my current plasma and let me tell you there is no comparison . Light would always spill into the black bars of movies on my old TV, any bright object in a scene would ruin the experience. From what I understand this is no different from CRT projectors.

It is also important to remember that the OLED monitor we are looking at here is already outdated and not nearly at the same price level as what is going to be coming out shortly OLED *should* bring us infinite on/off and ANSI contrast.
 
To say that AMOLED glows in the dark would be to misconstrue me.

Well, I guess I will just have to see what you meant with that peripheral vision glow for myself then...
Thanks for the evidence photos though :)

Moving on, I got my current plasma and let me tell you, there is no comparison. Light would always spill into the black bars of movies on my old TV; any bright object in a scene would ruin the experience. From what I understand this is no different with CRT projectors.

That is quite possibly true, though I don't think I've ever noticed it. The only thing one does of course notice is the natural indirect illumination on the wall. But projecting directly onto said wall (no screen edges), I am pretty sure I couldn't see where the projected image ended and the wall began (in black areas).
Can't try it out now though, since I unfortunately currently live a couple of hundred kilometers away...
At any rate... spill or not, I remember watching Equilibrium for the first time on it, where the main character stands in a completely darkened room and you hear people whispering from various sides... sitting there in complete darkness, I actually turned my head, thinking somebody was talking outside of my apartment (although I suppose THAT was more thanks to the combination of great recording and headphones). Quite a breathtaking moment. And when he then fired his pistols in the dark... spill around those muzzle flashes - not necessarily a bad thing... ;)

But... should I remember this thread once I visit my parents again in a couple of months or so, I'll check and take some pictures.
 
I have to admit when movie makers would fade the screen to black artistically there was nothing like a CRT though :eek: Look forward to the pics!
 
Do you have a reliable light meter? I'd wonder how it compares to displays like the Pioneer Kuro or the Sharp Elites. The Pioneer Kuros (9G) glow in a dark room too, but they can also pull off something like a 30,000:1 real static contrast ratio with a black level around 0.0031 cd/m². Honestly I don't see how anyone would need anything darker than that - when there is even a minimal amount of light on the screen it will look completely black, even in a dark room. I'd also be interested in the maximum brightness, as this is problematic with plasmas. Can they hit the same 300 cd/m²+ levels as LCDs while retaining that low black level, or are they more limited like plasma, in the sub-150 cd/m² range?

The 15" LG one I have is based around RGB WOLED and is rated around 200 cd/m2 full white. The 55" RGBW model should do 500 cd/m2 easily. Unlike PDP the contrast ratio is effectively free no matter the picture brightness level, but like PDP, peak white is brighter than full white. I think it might be regulated in software rather than current limitations.

StaticSurge said:
It is also important to remember that the OLED monitor we are looking at here is already outdated and not nearly at the price level of what is going to be coming out shortly. OLED *should* bring us infinite on/off and ANSI contrast.

I got it for AUD$622 shipped new from Amazon.de. The remaining stock of the production run is available for anyone here to buy. I see a lot of complaints about black levels and CRT replacements on this forum and well, here is the solution.

Also, if the residual emission is related to TFT leakage, then the metal oxide TFT of the upcoming AMOLED TVs should reduce the effect.
 
Also, if the residual emission is related to TFT leakage, then the metal oxide TFT of the upcoming AMOLED TVs should reduce it

Awesome. I can't wait to see what an infinite contrast ratio really looks like! If only I had the money for another TV in that price range!
 
I have to admit when movie makers would fade the screen to black artistically there was nothing like a CRT though :eek: Look forward to the pics!

I should also mention that my projector cost 300€ (used, obviously... but with only about 700 hours on it!!). That's also something to consider - at least for me.

This thread is actually about desktop use of OLED, but that's exactly why I think that, at least for home theater use, it'll probably be another 20 years until any technology gets far enough to rival a ~2000€ CRT projector that can do 1080p. At least based on my requirements.

Whoisthisreally said:
here is the solution.

Well... at least for me, a 15" screen doesn't cut it as a computer monitor. With widescreen (I mention that because up until recently, I used a 22" CRT at 1280x960), 23" and full HD are minimum.
Although... only for gaming... well, I'd actually get it instantly if it cost 100€. And I'd think about it until 200€. But above that, I'll stick with what I currently have ;)
 
Just as a quick aside to all this banter, it is interesting to bear in mind that what lots of people are aiming for is a movie to be rendered "as the director intended". This normally means as per the cinema, since that's what the director usually had in mind.

It's pretty ironic therefore to consider that the black levels in the cinema are actually pretty poor, even in a THX certified cinema. A Pioneer Kuro completely murders the black levels you see at the movies.
 
Just as a quick aside to all this banter, it is interesting to bear in mind that what lots of people are aiming for is a movie to be rendered "as the director intended". This normally means as per the cinema, since that's what the director usually had in mind.

It's pretty ironic therefore to consider that the black levels in the cinema are actually pretty poor, even in a THX certified cinema. A Pioneer Kuro completely murders the black levels you see at the movies.

Yeah, I've thought about this. None of the display technologies render in an interchangeable way; I do not think the widely used 2.2 gamma is intended to be applied to dark room viewing, but all of the review sites calibrate toward it. The gamma value of cinema is higher than 2.4, isn't it?

As consumer displays are as a whole arbitrary we may as well aim to simulate the display the film was edited on.
 
@ Chippy

They are? Well... I would love to see some actual numbers on that, including what type of projection was used.

Because I was indeed very displeased when I watched, e.g., Inception at the IMAX in Toronto, because of the awful gray mess I was confronted with (and paid quite a lot for). But I'm pretty sure I never saw anything like that at a small, local arthouse theater in Austria.

Also - isn't THX really rather meaningless? It seems to me they just love to put their logo on every other product out there. I mean... 5.1 speakers for 200 bucks or so, THX certified? I'm sure that's great quality right there...
 
Just as a quick aside to all this banter, it is interesting to bear in mind that what lots of people are aiming for is a movie to be rendered "as the director intended". This normally means as per the cinema, since that's what the director usually had in mind.

It's pretty ironic therefore to consider that the black levels in the cinema are actually pretty poor, even in a THX certified cinema. A Pioneer Kuro completely murders the black levels you see at the movies.

A director's intent would depend on the director. I've never heard a director say that he wanted his movies to look washed out. Most cinemas these days are just cash cows and use outdated projectors.

I've seen the difference between an 800:1 display and a 30,000:1 display, and I have a lot of trouble believing that what a director would want you to see is the former.

As far as home standards go, the ISF has always considered contrast ratio to be of the utmost importance.
 
@ Chippy

They are? Well... I would love to see some actual numbers on that, including what type of projection was used.

Because I was indeed very displeased when I watched, e.g., Inception at the IMAX in Toronto, because of the awful gray mess I was confronted with (and paid quite a lot for). But I'm pretty sure I never saw anything like that at a small, local arthouse theater in Austria.

Also - isn't THX really rather meaningless? It seems to me they just love to put their logo on every other product out there. I mean... 5.1 speakers for 200 bucks or so, THX certified? I'm sure that's great quality right there...

From what I understand, THX isn't meaningless, but having a THX-certified system does not mean it is reference grade; it just means it meets certain standards for grayscale, color gamut, gamma, and sometimes sound.

Anyways, directors for the most part aren't sitting in cinemas watching their own movies; they are using professional broadcast monitors, which do have very good black levels. I think I remember reading an interview with George Lucas about widescreen HDTVs bringing the user closer to what he intended.
 
Yeah, I've thought about this. None of the display technologies render in an interchangeable way; I do not think the widely used 2.2 gamma is intended to be applied to dark room viewing, but all of the review sites calibrate toward it. The gamma value of cinema is higher than 2.4, isn't it?

As consumer displays are as a whole arbitrary we may as well aim to simulate the display the film was edited on.

I found this on the AVS forum, posted by UMR, who has calibrated displays by the hundreds and is probably considered one of the best calibrators in North America.

I suspect if there is a change in the gamma spec to 2.4 it will be a huge mistake. It should be fixed at 2.22 or left alone. This is even more true as black levels fall with changing technology and near-black is obscured by inadequate ANSI contrast.

How the encode/decode process uses gamma should not be changed. If it is, it will be similar to deciding when to use SMPTE color space versus RGB for DVDs: you will not know when to switch. There is a ton of gear already out there with what is effectively a 2.2 gamma on the display side. Suddenly switching to 2.4 will wreck things for little if any gain. You can already alter the gamma of an image in post if you want to enhance contrast.

It is obvious looking at sources that almost all are set up for a 2.22 gamma, as Doug has said. If you look at the specs for the Sony reference monitors, they are 2.22 gamma by default. I have also worked in post with a RED camera, and a 2.22 gamma on the decode looks great while 2.4 is too dark.

Varying gamma with room light level does not appear to work for me. I would alter light output and black level instead.
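To put numbers on the mismatch UMR describes, here is a hypothetical sketch: content mastered against a 2.22 decode, then shown on a display tracking gamma 2.4. The values are relative luminance only and are not tied to any particular display:

```python
# Hypothetical illustration of an encode/decode gamma mismatch.

MASTERING_GAMMA = 2.22   # gamma the content was graded against
DISPLAY_GAMMA   = 2.40   # gamma of the display actually decoding it

def displayed_luminance(intended: float) -> float:
    """Encode a linear value assuming a 2.22 decode, then decode it at 2.4."""
    signal = intended ** (1.0 / MASTERING_GAMMA)
    return signal ** DISPLAY_GAMMA

for y in (0.01, 0.05, 0.20, 1.00):
    shown = displayed_luminance(y)
    print(f"intended {y:.2f} -> shown {shown:.3f}  ({shown / y:.0%} of intended)")
# Shadows land at roughly 70-90% of where the mastering display put them,
# while white is untouched - exactly the "2.4 is too dark" complaint.
```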
 
From what I understand, THX isn't meaningless, but having a THX-certified system does not mean it is reference grade; it just means it meets certain standards for grayscale, color gamut, gamma, and sometimes sound.

Anyways, directors for the most part aren't sitting in cinemas watching their own movies; they are using professional broadcast monitors, which do have very good black levels. I think I remember reading an interview with George Lucas about widescreen HDTVs bringing the user closer to what he intended.

Doesn't matter that he is a nutcase, then?

 
whoisthis,
ever notice any burn in?

It has only seen about 8 hours of video and I haven't observed any temporary IR nor burn-in. The software dims the set if a static image is left on. To test it I'd need a moving image with a fixed lit area, a colorimeter, and a good reason to again prove that emissive displays can be abused.
 