What do we think of Sony moving away from OLED and back to LED?

You still don't get it. The organic pixel degrades with heat. Over a small area it can go bright by sinking heat into the surrounding area or with improved cooling, but once you hit a large area, brightness drops like crazy. All OLEDs are pretty much limited to about 200 nits at a 100% window. This is actually worse because you get 3,000 nits at a 3% window but still drop to 200 nits at 100%, so the drop gets much more dramatic as the bright area increases.
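To put rough numbers on that roll-off, here's a minimal sketch, assuming a made-up ABL (automatic brightness limiter) curve loosely pinned to the figures above (3,000 nits at a 3% window down to 200 nits full screen) with simple interpolation in between; none of these values are measurements.

import numpy as np

# Hypothetical ABL curve: bright-window size (% of screen lit) -> sustained nits.
# Illustrative numbers only, not measured from any real panel.
window_pct = np.array([3.0, 10.0, 25.0, 50.0, 100.0])
sustained  = np.array([3000.0, 1000.0, 450.0, 280.0, 200.0])

def sustained_nits(pct):
    """Interpolate sustained brightness for a given bright-window size (log scale)."""
    return float(np.interp(np.log(pct), np.log(window_pct), sustained))

for p in (3, 10, 25, 50, 100):
    print(f"{p:3d}% window -> ~{sustained_nits(p):.0f} nits")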
You can't prove those claims without testing or experimentation, like RTings' extreme burn-in tests.
Saying OLEDs degrade is just as valid as saying humans approach death day by day.
Bruh... this is a TV. And what exactly did LCDs do to you? :ROFLMAO:
I guess this puts the "LCD is brighter" argument to rest. Feel free to point out other advantages if you wish to discuss in good faith.
 
Not everyone is that sensitive to motion blur, and even that only matters if all you do on your monitor is game. A lot of larger monitors fill many roles now: a desktop TV for streaming media, a large amount of screen real estate for work without having to use enlarged fonts, etc. A good QLED like a Samsung QN90B/C does a much better job overall than any OLED.
It's very easy to see the pixel response time issue on my 16" M2 Max MacBook Pro. Switching virtual desktops, scrolling a website... it's all very blurry in motion. You can live with it just fine for desktop use, but that doesn't change the fact that its response times are worse than those of a 15-year-old IPS display, all in exchange for extremely good HDR performance.

I'm hoping that in about 3 years, when I'm due for an upgrade on this company laptop, Apple has moved to OLED.
 
I guess this puts the "LCD is brighter" argument to rest. Feel free to point out other advantages if you wish to discuss in good faith.

Umm, LCD is brighter if you wanna compare flagship TV to flagship TV. Mini LED TVs are hitting 4,000-5,000 nits at the flagship level, my guy. But anyway, I'm not trying to convince you or anyone that LCD is better at all, because I actually don't think it is when you look at everything overall. In fact, I'll soon be getting a 32" QD-OLED as my primary display.
 
It's very good for HDR...but has the absolute worst pixel response times you can buy. Like truly terrible. It is a blurry mess in motion for anything above 30 fps.
I just tried the UFO test and thought I was having macular degeneration. That explains it!
 
You can't prove those claims without testing or experimentation, like RTings' extreme burn-in tests.
Saying OLEDs degrade is just as valid as saying humans approach death day by day.

I guess this puts the "LCD is brighter" argument to rest. Feel free to point out other advantages if you wish to discuss in good faith.

Those are not claims. They are facts well known in the industry. It is also one of the reasons for the long delay in producing a 4K HDR OLED smaller than 48" while maintaining specs similar to the LG C-series TVs.

Without a breakthrough in the organic portion of the pixel, the only way to increase large-area brightness is active cooling (fans, pumps, etc.). Power is another factor: the power-to-brightness ratio for OLED is poor compared to LED because the organic compound has more resistance by nature, so it requires more power to produce the same brightness. While that's not a problem when you're only achieving high brightness over a small area for a short time, any dramatic increase in brightness over a large area will also use more power than an LED and require a larger power supply. That's the main reason why, while they have been able to increase peak brightness on OLED in the 2% to 25% window range over the last 5 years, there's basically been only a minimal increase in brightness in the 50% to 100% range.
 
I know that most of you are not engineers, so I'll give you a simple explanation of the OLED problem. Let's say that one light bulb represents 2% of the screen, so the whole screen is a matrix of 50 light bulbs. If we start with one 100-watt light bulb, logic would say that total brightness increases linearly as we put more 100 W bulbs into the matrix, and the end result is a matrix of 50 x 100 W bulbs. But in the case of OLED, it's as if we start with one 100 W bulb and then, as we add more bulbs to the matrix, we reduce the power to each bulb, so at the end you end up with a matrix of 50 x 20 W bulbs. This is also a problem with current LED technology, since 100% window brightness is still below peak, but at least you can usually get over 800 nits at 100% on an HDR 1000 screen versus the 200 nits you see on the current best OLEDs.
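Here's that analogy as a minimal sketch, treating the panel as one fixed, shared power budget. The 100 W bulbs and the 50-bulb matrix come from the post above; the 1,000 W total budget is an assumed number chosen so the full matrix lands at 20 W per bulb.

# Toy model of the bulb analogy: a fixed panel-wide power budget is shared
# across however many bulbs are lit, so per-bulb power (and brightness) drops
# as the bright area grows. All numbers are illustrative.
PEAK_BULB_WATTS = 100.0      # one bulb driven flat out (small bright window)
TOTAL_BUDGET_WATTS = 1000.0  # assumed panel-wide power/thermal budget
TOTAL_BULBS = 50             # each bulb stands in for 2% of the screen

def watts_per_bulb(lit_bulbs):
    """Power each lit bulb actually receives once the shared budget kicks in."""
    return min(PEAK_BULB_WATTS, TOTAL_BUDGET_WATTS / lit_bulbs)

for n in (1, 5, 10, 25, TOTAL_BULBS):
    print(f"{n:2d} bulbs lit ({2 * n:3d}% window): {watts_per_bulb(n):5.1f} W per bulb")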
 
I think they're just tired of having their flagship fundamentally be a product made by another manufacturer, both in terms of PR and margins.

If they give up on OLED completely, it'll be a bit sad that we can't get QD-OLED with Dolby Vision anymore.

There is no indication whatsoever that they've overcome blooming, just talk about how they want more peak brightness, which is frankly marketing speak.
I feel like they're gradually moving to larger screens as time goes on. Larger screens mean you sit further away, and sitting further away means the local dimming zones appear smaller and are, in practice, more imperceptible to the human eye, lol. So if blooming is not noticeable and HDR color brightness is outstanding, then it makes sense to me. Also, no burn-in bullshit, lol.
 
Those are not claims. They are facts well known in the industry. It is also one of the reasons for the long delay in producing a 4K HDR OLED smaller than 48" while maintaining specs similar to the LG C-series TVs.
No, that has to do with fabs and investments. Each fab can only target a specific size of display substrate (known as the motherglass). Computer monitors have come last because that's the smallest segment.

The first OLED size that got investment was cellphone-sized panels, all of which have far higher PPI than a TV and at this point are capable of 1,000+ nits.
Those were done primarily by Samsung at first, followed years later by LG.

It then took forever to reach TVs, but that was the safe bet because TVs sell far more to consumers than specialized monitors, and it only happened once the price of a top-end TV came down to a "justifiable" level. While OLED tech was technically purchasable as early as 2004, it was basically demo-only at that point ($2,500 for an 11" display). The first commercially available display in a size people would actually want was LG's 55" set in 2012; it was only 1080p, and it cost $10,300. The market in which OLED TVs are ubiquitous and reasonably priced has only existed for about the past 5 years.

It has finally come to desktop monitor sizes last because LG and Samsung have now invested in the fabs necessary to build specifically monitor-sized displays.
Without a breakthrough in the organic portion of the pixel, the only way to increase large-area brightness is active cooling (fans, pumps, etc.). Power is another factor: the power-to-brightness ratio for OLED is poor compared to LED because the organic compound has more resistance by nature, so it requires more power to produce the same brightness. While that's not a problem when you're only achieving high brightness over a small area for a short time, any dramatic increase in brightness over a large area will also use more power than an LED and require a larger power supply. That's the main reason why, while they have been able to increase peak brightness on OLED in the 2% to 25% window range over the last 5 years, there's basically been only a minimal increase in brightness in the 50% to 100% range.
I'm not the other guy, so to be clear, I don't agree that OLED has brightness equal to LCD, particularly Mini LED. However, I will say that your point has limited importance, because it's rare at best to have scenes in any game or movie in which you'd want max brightness across the whole display.

How many times in a game or movie do you want your eyes blasted with 2,000 nits of light? That would mean the whole screen, end to end, is just white. While full-screen brightness is obviously a "weakness" of OLED, the controllers are all designed to maximize OLED's strengths and minimize how much of a problem this is. It's rare to need more than 20-25% of the display to reach peak brightness at any given time. I'm doing some color grading in HDR for fun right now (Rec. 2020, ST 2084, 1,000 nits), and I'm grading very little into that top zone, partly for impact reasons and partly because it's literally not enjoyable to be blasted by 1,000+ nits even in a small area. A good chunk of this footage contains sunsets and the sun, and basically only in shots where the sun is directly visible do I even want it to hit 1,000 nits. Otherwise, for people, places, skin, etc., it's all mostly between 1 and 300 nits.

Most of the Hollywood film community does similarly: 90%+ of most scenes will be graded to 200 nits, with perhaps some sections hitting 500. If something hits 1,000+, there's usually a stylistic reason to do it, or because something "should" actually be that bright.

While there is reason in the industry to have 5,000+ nit displays, in practice the justification is for smaller and smaller windows with greater luminance variance. For the same reasons listed above, there will never be a movie that wants to blast its audience with 5,000 nits end to end, unless the goal is to shock the audience with a depiction of a nuclear blast or something.
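If anyone wants to sanity-check those grading levels on their own material, here's a minimal sketch assuming you have a frame as normalized 10-bit PQ (ST 2084) code values; the random array below is just a placeholder for a real luma plane, and the thresholds are the ones discussed above.

import numpy as np

# ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code):
    """Decode normalized PQ code values (0..1) to absolute luminance in nits."""
    e = np.power(code, 1.0 / M2)
    return 10000.0 * np.power(np.clip(e - C1, 0.0, None) / (C2 - C3 * e), 1.0 / M1)

# Placeholder frame: a random 10-bit luma plane standing in for a graded shot.
frame_code = np.random.randint(0, 1024, size=(1080, 1920)) / 1023.0
nits = pq_to_nits(frame_code)

for threshold in (200, 500, 1000):
    pct = 100.0 * (nits > threshold).mean()
    print(f"pixels above {threshold:4d} nits: {pct:5.1f}%")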
 

Yes, but how long does an average cell phone last, and how many hours is the screen on at high brightness in a day?

Most bright daylight scenes in 4K HDR movies average 300 nits or higher, maybe a bit less due to compression on streaming media. I have routinely pointed a light meter at my old Q90T while watching 4K Blu-ray movies. Go watch Dune or something else with daylight desert scenes and you can average close to 400 nits. Same with a lot of large explosion scenes, but those last under a second.
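For what it's worth, you can do the same kind of scene-average check digitally instead of with a light meter. A minimal sketch, assuming you already have per-pixel luminance in nits for each frame (e.g., via a PQ decode like the one sketched earlier in the thread); the synthetic frames below are placeholders, not real footage.

import numpy as np

# Frame-average light level (FALL) over a short clip. Each frame here is a
# placeholder array of per-pixel luminance values in nits.
rng = np.random.default_rng(0)
frames = [rng.uniform(0.0, 600.0, size=(540, 960)) for _ in range(24)]

fall_per_frame = [float(frame.mean()) for frame in frames]
print(f"clip-average FALL: {np.mean(fall_per_frame):.0f} nits")
print(f"MaxFALL (brightest single-frame average): {max(fall_per_frame):.0f} nits")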
 
Yes, but how long does an average cell phone last, and how many hours is the screen on at high brightness in a day?
I would say a lot. For heavy users, being at max brightness constantly isn't uncommon.

Why? The difference is the use case: a cellphone has to be bright enough to be visible in direct sun. When I'm shooting, I would generally say the minimum brightness to be considered "daylight visible" is 1,000 nits, ideally higher. When shooting indoors, 300-400 nits is plenty. At night, 100-200 is plenty.

That's very different from a TV, which is ideally in a light-controlled environment.
Most bright daylight scenes in 4K HDR movies average 300 nits or higher, maybe a bit less due to compression on streaming media. I have routinely pointed a light meter at my old Q90T while watching 4K Blu-ray movies.
The very short version: I feel like what you're telling me is that "you agree."

Peaks are minimal at best, and averages are far lower. 300 nits of full-screen brightness is basically enough for watching movies, paired with a useful max peak of 1,000.

Everything else feels like "nit" picks.
Go watch Dune or something else with daylight desert scenes and you can average close to 400 nits. Same with a lot of large explosion scenes, but those last under a second.
I have, recently even, as well as Blade Runner 2049. I also have an Apple TV+ sub, which is pretty nice if your goal is to watch as much Dolby Vision/Atmos content as possible, as Apple maintains that standard across literally all of their programming. Even Ted Lasso is in HDR. Foundation has high production values, so it's also useful/fun for TV testing. I assume it's similar on HBO Max, though I don't have a sub.
 

You're not an average cellphone user, then. Not many people stream HDR media on their cellphone or use it as a screen all day, and as I said, you can push a cellphone OLED harder, as most cellphones last less than 4 years while TVs are expected to last up to 10.

Just put an OLED and a QLED in a room without blackout curtains. I've done both. I had a 77" LG CX for a month and promptly dumped it for my Q90T, as it was unwatchable in the daytime with normal light-leaking shades. Pretty much the same when I tried a 42" C2 vs. my QN90B.

The biggest problem with OLED, although it's mostly momentary, is that as the bright content spreads across a larger area, the screen actually gets dimmer as it goes. It's really noticeable.
 
You're not an average cellphone user, then.
What I was saying about max brightness was in reference to normal people.
Not many people stream HDR media on their cellphone or use it as a screen all day, and as I said,
I personally don't, but I'm also not a social media-ite. Though I will say I don't use my cellphones for monitoring; I was just using monitoring as a point of reference for the similar levels of brightness that normal people would want in a cellphone. And while running during the day, I for sure max my brightness out so it's usable in daylight and with sunglasses on.

It's well known that power users likely drain their batteries 1.5x a day (for all those kids that have their phones in their hands every second of every day). I don't get anywhere close to that.
you can push a cellphone OLED harder, as most cellphones last less than 4 years while TVs are expected to last up to 10.
Maybe? I think 5-7 is fairly reasonable. It's more than possible that hanging out here and with other enthusiasts has skewed my view of "normal" for TV buying.
Just put an OLED and a QLED in a room without blackout curtains. I've done both. I had a 77" LG CX for a month and promptly dumped it for my Q90T, as it was unwatchable in the daytime with normal light-leaking shades. Pretty much the same when I tried a 42" C2 vs. my QN90B.
I think there is some degree of preference there too. But you're basically describing the same issue that cellphones also must overcome, namely daylight and/or ambient light.

That also means you're likely not viewing HDR "in spec." If something was graded for 1,000 nits, then to overcome the issues you're talking about, you're likely intentionally increasing brightness. If you weren't, the difference between your displays wouldn't be substantive, because, as we've discussed, almost all HDR content sits around 200-300 nit averages. And 200-300 nits "is the same" as 200-300 nits; it does not change across displays.

If I've misunderstood, please feel free to correct me, but as far as I can tell you're really just talking about tradeoffs you've made due to the rooms your displays are in. We also just prioritize different things.
I like the motion of OLED, as well as the absence of haloing, almost no overshoot, and pure black. The big advantage of Mini-LED is brightness, and I would argue it's about a tie for contrast (with different display examples winning on one side or the other). However, considering my earlier statements about what is required for full-screen brightness, it's "my opinion" that OLED has the fewest downsides.
 

Yea, different priorities. I really don't care too much about spec if the picture is underwhelming in my viewing environment, and I actually buy movies and series on 4K HDR disc, like the last season of Picard and Strange New Worlds, since I hate the compression (especially the audio) from most streaming services. I have double-dipped on almost 80% of my Blu-ray movies that were remastered in 4K HDR. OLED does have an advantage in the areas you mentioned, but usually only if you watch for it. Most of the time when I watch something, I'm paying attention to the story/scene rather than trying to see whether a halo effect is present, unless it jumps out at me, which is super rare.
 
I hope it's true that Sony is working to produce something special with LCD again! I think the last time they really *tried* was on the venerable Z9D. Backlight Master Drive was a real thing, but Sony quietly shelved it to save money when they shifted focus to OLED. The term was still bandied about in later years even though it was no longer the real deal... but based on the strength of its legacy, they knew it would help marketing.

I've had some OLEDs since then, and without wanting to write a bunch of paragraphs, I'm familiar with the pros and cons. I'm definitely open to going back to LCD if Sony is able to thread the needle here.
 