LG 32GS95UE – OLED 31.5″ with 4K @ 240Hz and 1080p @ 480Hz Support - The death knell of LCD panels

I love higher fpsHz. However, I've always suspected that you'd get diminishing returns locally past 240fpsHz on the motion definition / motion articulation aspect (more dots along a dotted-line curve, more unique pages in an animation flip book that's flipping faster). For the blur reduction aspect, the sky is the limit up to 1000fpsHz (or higher, though OLED is theoretically capable of around 1000Hz), so the more Hz, with the fewest tradeoffs, the better.
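
To put rough numbers on the blur-reduction side, here's a quick back-of-envelope sketch (assuming an ideal sample-and-hold panel where the frame rate actually matches the refresh rate, i.e. a solid "fpsHz"):

```python
# Rough persistence math for an ideal sample-and-hold display, assuming the
# frame rate actually matches the refresh rate. Each frame is held on screen
# for 1/Hz seconds, and that hold time sets the baseline motion blur.

for hz in (120, 240, 480, 1000):
    persistence_ms = 1000.0 / hz
    print(f"{hz:4d} fpsHz -> each frame held for ~{persistence_ms:.2f} ms")

# 240 -> ~4.17 ms, 480 -> ~2.08 ms, 1000 -> 1.00 ms: every doubling of solid
# fpsHz halves the sample-and-hold blur, while the motion-definition gain
# (extra unique frames per second) keeps shrinking in relative terms.
```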



It seems like a very nice screen, and I can see why people would be happy with it. I just dislike when very high local performance is promoted as having a 1:1 relationship to an advantage in online gaming systems as they function now. Online games run through servers at much lower rates (some very, very low) and are rubberbanding all of the time; it's just a matter of whether the rubberband is short enough that you don't notice it much. Making the rubberband tinier and tinier locally isn't going to change the online latency, tick rates, or biased interpolated results of the game server's rubberband (online rather than LAN competition). Most testing of high-fpsHz screens in reviews showing the advantages is done locally or on a LAN, and often vs bots. I'm assuming your sphere-track was done locally in the same fashion, with the sphere being analogous to a bot.


[Attached GIF: 995654_IMG_1010.gif]


Still, it's pretty cool. Can you run a similar tracker that passes through a remote online game server's interpolation code, and get a report-card readout of the % accuracy / cursor hits delivered back to you? Then compare that to the local/LAN version.

The thing is, in online gaming server systems, when you see your cursor a certain % onto the center of the ball, or see how quickly you got your cursor onto the ball, that's not necessarily how the online server processes the result back to you, or even where the ball exactly "was" in the first place at any given time during its directional changes, according to the server's rates and the server's final processing. So even if you are tracking things better locally, that doesn't translate 1:1 to the online clockwork, a.k.a. rubberbanding, temporal shift, etc.
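
For anyone who hasn't dug into netcode, a minimal toy sketch of that idea is below - not any particular game's code; the 64 Hz tick rate and the two-tick interpolation buffer are just assumed, illustrative numbers - showing how a client that interpolates between server snapshots is always rendering a slightly old game state, no matter how high its local fpsHz is:

```python
# Toy model of client-side interpolation between server snapshots (assumed
# numbers, not any specific game's netcode). The client renders the world at
# its own high fpsHz, but only from snapshots the server sent at its tick
# rate, delayed by an interpolation buffer - so "now" on screen is the past.

TICK_RATE = 64                      # server snapshots per second (assumed)
TICK = 1.0 / TICK_RATE
INTERP_DELAY = 2 * TICK             # common two-tick interpolation buffer

def interpolate(snapshots, render_time):
    """Linearly interpolate an entity position between buffered snapshots."""
    target = render_time - INTERP_DELAY            # deliberately render the past
    for (t0, p0), (t1, p1) in zip(snapshots, snapshots[1:]):
        if t0 <= target <= t1:                     # find the straddling pair
            a = (target - t0) / (t1 - t0)
            return p0 + a * (p1 - p0)
    return snapshots[-1][1]                        # hold last known if we run out

# Entity moving at 1000 units/s, sampled by the server once per tick.
snapshots = [(i * TICK, i * TICK * 1000.0) for i in range(10)]

render_time = 5 * TICK                             # "now" on the client
shown = interpolate(snapshots, render_time)
actual = render_time * 1000.0                      # where it really is "now"
print(f"shown at {shown:.1f} units, actually at {actual:.1f} units "
      f"(~{(actual - shown):.1f} units behind, before any network latency)")
```

Real servers add network latency, lag compensation, and their own reconciliation rules on top of that buffer, which is the "biased processing" part; this only shows the baseline you're-looking-at-the-past piece.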

I'm curious what your 240fpsHz vs 480fpsHz blind-testing results would be across days of testing even just locally,
but I especially wonder what your results would be with some system recording how accurate to the ball you were able to be (at least comparing what you were seeing vs what the server processed) at a solid 240fpsHz vs 480fpsHz, after the latent round trip and through a typical game server's biased processing, over a bunch of testing days.
It would have to be blind testing of the fpsHz during each run, so there's no placebo or bias affecting your performance.
It might have to be multiple players, with whoever is quicker to the ball, and then more accurate to the moving ball, changing its color at any given moment, for it to hash out like online gaming would, since servers usually do a balancing act with latency compensation - though how they do it depends on each game's server code design choices.
Not trying to be too argumentative in saying that; I'm genuinely curious about things like this.

If 480fpsHz locally lets you flow better regardless of how in sync it is with the server clockwork, I can understand that, especially from a blur-reduction aspect. But I'm curious about the motion-definition benefits past a solid 240fpsHz, especially in regard to online gaming dynamics and server code.



The detail provided by 4k for far-away objects/opponents in games can be valuable. It depends on the game's graphics (detailed rather than cartoonish or older games), your graphics settings in that game, and the size and expanse of the game's arenas (large outdoor areas vs corridor shooters, for example). At 1080p you'd be losing that 4k detail, plus muddying the screen more than a native 1080p screen would.

Are you somehow implying that "server workings" are going to negate how much better I can track targets when I'm playing at higher frame rates? Because I hope not lol.
 
I'm saying what you see is not what you get. So in a way, yes. You can see an object on your screen, but that doesn't mean it's where the server determines it is or was - a.k.a. peeker's advantage and rubberbanding, which can be extremely overt or subtle. You can't track what you can't see, and even if you tracked it perfectly locally, that's not necessarily how the server determined the result, or even where the opponent exactly was on its end. You don't have a 1:1 relationship to the server state, or to its biased results, which can differ depending on how the game was coded. The server tick rate is way lower than your fpsHz, and on top of that it's making biased decisions with latency compensation and interpolation. You would be much tighter playing in a LAN tournament, and conversely, less accurate overall online.
 
Oh boy ok at this point I don't think there's any point in continuing this conversation. You seem to believe that server behavior will cancel out any advantage to higher refresh rates so I'll just leave you to it. The rest of us who actually do play online games will know that this isn't the case.
 
Oh, if you're saying the real world gaming performance benefit of 480hz vs. a good 240hz is small, trust me I'm with you. I value very different things in a display vs. what say a paid, professional Counter Strike player values. I want to sit back and play big open world story games, sometimes with a controller.
 
You would be much tighter playing in a LAN tournament, and conversely, less accurate overall online.

. .

Oh boy ok at this point I don't think there's any point in continuing this conversation. You seem to believe that server behavior will cancel out any advantage to higher refresh rates so I'll just leave you to it. The rest of us who actually do play online games will know that this isn't the case.

. . .

I specifically asked about blind-testing a solid 240fpsHz vs 480fpsHz while playing on *online gaming servers*, rather than locally vs bots and bot-balls, and rather than in LAN competitions - to see if there is any appreciable, measurable advantage in online gameplay from the increase in the motion-definition aspect, in this case 480fpsHz compared to 240fpsHz (locally: more dots per second along a dotted-line path shape than 240, more unique animation pages per second than a flip book flipping 240 pages/second).

I also mentioned that blur reduction can be an advantage, depending, so very high fpsHz could be valuable there - reducing the sample-and-hold blur of the entire game world during fast FoV movement so it's less smeary, letting you identify opponents and other things better while moving the viewport at speed. 1000fpsHz on an OLED, for example (probably using some super-advanced frame gen years down the road), would be great. That would be 1ms of persistence - roughly 1 pixel of blur at 1000 pixels/second of motion - which would be amazing (kind of like BFI without suffering its tradeoffs). However, lowering the rez and/or playing non-native can muddy detail, so there can be a tradeoff there if you sacrifice resolution for higher fpsHz.

So considering the above, I wouldn't say there is zero advantage to some high fps + high Hz rates in online gaming.

However, I am questioning the measurable limit of appreciable gains in the local motion-definition aspect, especially as that pertains to playing *online games* with network latency compensation, tick rates, and server interpolation machinations. On the other hand, we can never get enough fpsHz for the motion clarity / blur reduction aspect, whether local or online.

I also mentioned that 4k detail could be a small advantage, especially in high-detail games with far-view-distance arenas and the settings to match - which would also be more valuable if the screen blurred as little as possible while you move the whole game-world viewport around.

. . . .
 

Well of course there are huge diminishing returns past 144Hz. 240Hz is a mild improvement, but still one that I notice and very much appreciate. As for 480Hz, well, I don't actually have a 480Hz monitor so I can't comment, but I would imagine the gain would also be just as small, if not smaller. I personally don't think I would play any better even if I upgraded to anything beyond 240Hz, but there are certainly others who can get the most out of it. All of this "in theory" stuff is great but it doesn't always line up with real world performance. It seems like you're saying 4K 240Hz is the better option because 4K allows you to have more detail in motion vs 480Hz at 1/4 the resolution. Perhaps Optimum put too much emphasis on "seeing details" and motion clarity in his review, because I think the main advantage of 480Hz isn't the raw motion clarity but rather just how much more of a "connected" feel you have with your PC and the game. Not really something I can explain well in technical terms, but I think anyone who plays a lot of fast paced games would know what I mean by saying that a higher refresh rate allows for that more "connected" and locked-in feeling and thus allows you to play better, at least up to the point of diminishing returns, which varies from person to person.
 

Yes, that was one of my points more or less - that there are a lot of variables you'd be trading off one way or the other. And to beat the main point to death, very high fpsHz is probably more meaningful in local play and LAN tournaments, like driving a high-performance car on a racing circuit to get its full performance.

Online is not what-you-see-is-what-you-get, so while you might feel the flow locally, what you are aiming at isn't always where you're seeing it as far as the game server is concerned. "Peeker's advantage" is the most glaring, in-your-face example of that, but that mismatch is happening dynamically for a lot of things in the online game world's action states. You are always going back in time on game servers, so they are always rubberbanding at least a little, and on top of that they make biased, code-dependent decisions when choosing between different players' perspectives as to what "interpolated" event happened. Plus it's all on a much lower tick rate than your local fpsHz to begin with.

. . . . .

This video shows a Sony motion-capture camera at 480 vs 240 fps on a bouncing tentacled ball. They compare 960 as well earlier, but I linked the timestamp at 480 vs 240. While they say higher rates may be useful for recording some fast-action sports, they mean useful for making slow-motion clips of the sports action, not that it would look any better in "real time". Kind of obvious, since none of the formats they'd ultimately publish their material in would be showing people 240, 480, or 960 frames/second (unless converted to lower frame rates as slo-mo).


Going by that video, I don't think there would be much difference in the real-time motion definition of an object you are *watching*, other than the blur being cut in half.

However, Blur Busters did do a fast mouse-movement test at a 3840 pixels-per-second movement rate (i.e. mousing across a 4k screen from end to end in 1 second) to track the stroboscopic after-image effect of the mouse arrow, and 480Hz was smoother to the pursuit camera. In real use it may be less obvious than with the pursuit camera / after-image, but it does show a visible improvement - so there may still be some juice to squeeze out of 480fpsHz for mouse tracking after all, where it could make a difference especially locally and in LAN tournaments. That, and don't forget that the sample-and-hold blur is cut to half of what it is at 240. So there are benefits no matter what.
Like I said though, online is a different system where the difference between 240 and 480 is likely less meaningful, because what you are seeing and aiming at isn't necessarily exactly where it is/was as far as the server is concerned, given its own slower clockwork and the biased, interpolated, "balanced" results it delivers. You aren't getting a 1:1 relationship to the online game world - what you see is not what you get. (That, and lowering the rez and/or suffering a non-native resolution instead of 4k can be a tradeoff against other advantages in high-detail, expansive-arena games.)
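
For reference, the step-size math behind that 3840 pixels/second test is simple (assuming the frame rate matches the refresh rate):

```python
# Gap between successive "ghost" arrow positions in the mouse-arrow test:
# at a fixed cursor speed, it's just speed divided by refresh rate.

SPEED = 3840  # px/s, i.e. one 4k screen width per second

for hz in (240, 480):
    gap_px = SPEED / hz
    print(f"{hz} Hz -> one arrow image every {gap_px:.0f} px along the path")

# ~16 px gaps at 240 Hz vs ~8 px at 480 Hz, which is why the 480 Hz trail
# looks denser/smoother to the pursuit camera. The same number is also the
# per-frame sample-and-hold blur width at that motion speed.
```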



[Image: project480-mousearrow.jpg]



This is the actual test if you happen to have a very high refresh rate monitor to run it on:

https://www.testufo.com/mousearrow
 
Well I just wouldn't use the 1080p mode because I already think 1080p at 24 inches looks like ass so I can't imagine how bad it's going to look at 32. If this monitor instead had a 1440p 360Hz mode with 1:1 pixel mapping (no scaling at all) then it would be far more enticing. As it is though, I would be paying a premium for a feature I would never use. Whether or not 480Hz will make any meaningful difference in online play can be debated to death. You can make your arguments about tick rates, rubberbanding, etc. but there will always be those who claim it makes a world of a difference. Since I don't actually have the monitor I won't comment on whether or not 480Hz mode is a game changer but I know for sure that going from 144Hz to 240Hz made a difference for me.
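
Just to put numbers on that, a quick pixel-density sketch (assuming a 16:9 panel and this monitor's 31.5 inch diagonal):

```python
import math

def ppi(h_px, v_px, diag_in):
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(h_px, v_px) / diag_in

for w, h in ((1920, 1080), (3840, 2160)):
    print(f"{w}x{h} @ 31.5 in -> {ppi(w, h, 31.5):.0f} PPI")

# ~70 PPI for 1080p vs ~140 PPI for 4k at 31.5 in, and 1080p at 24 in is
# ~92 PPI, so the 480Hz mode is noticeably coarser than even a 24 in 1080p
# panel (4k is an exact 2x multiple of 1080p, so pixel-doubling is at least
# possible, but the coarseness remains).
```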
 
I probably wouldn't use the 1080p mode if I bought this monitor, but I'm tired of waiting for decent OLED to come out to replace my Predator X27. I had the LG 27in OLED that came out last year for a week or so, but couldn't really adjust to going down to 1440p and returned it. So I'll probably get this one and just pay the premium. One thing I hope they sorted out is the flicker with G-Sync and HDR enabled at the same time that would happen with the 27in at 240hz.
 

Are you talking about general VRR flicker? That won't ever be solved. My MSI QD OLED has it, my LG CX has it, and so does every other OLED in existence.
 
So that is an inherent issue with OLED?

Unfortunately yes. Maybe one day they will solve it but it's been years already so I'm doubting it. VRR flicker can at least be somewhat mitigated by maintaining a stable frame rate. If you have Rivatuner then you can actually visualize how frametime spikes correlate to the flickering.
 
That makes a lot of sense. The only game I actually had the issue with on my 27" OLED was Hogwarts Legacy, and it was probably because the frametimes in that game were all over the place.
 
It's not just OLED; LCDs also suffer from it to varying degrees. The only displays that in my experience are completely immune to it, even in the absolute worst cases like shader compilation or loading screens, are those with hardware Gsync modules like the PG32UQX.
 
I've only had displays with G-Sync modules over the last 10 years other than the short time I had the LG OLED, so that's probably why I wasn't aware of that.
 
It's not just OLED; LCDs also suffer from it to varying degrees. The only displays that in my experience are completely immune to it, even in the absolute worst cases like shader compilation or loading screens, are those with hardware Gsync modules like the PG32UQX.

The Alienware AW3423DW has a hardware Gsync module but still suffers from VRR flicker. I guess you need LCD + Gsync module if you don't want any flickering.
 
Oh yeah that's true but I do think it reduced it compared to the DWF model that had no Gsync module.
 
Unfortunately yes. Maybe one day they will solve it but it's been years already so I'm doubting it. VRR flicker can at least be somewhat mitigated by maintaining a stable frame rate. If you have Rivatuner then you can actually visualize how frametime spikes correlate to the flickering.

I think it's like MistaSparkul said, and that's probably because, from what I've heard, the gamma is calibrated at the peak Hz of the OLED display. So if you have a 120Hz display and your frame rate graph runs lower (with VRR active and dropping the Hz to match it), it'll lift the blacks some; but if your frame rate is swinging a lot from high to low and hitting "potholes", it will look like the blacks are flickering as the gamma rapidly changes along with VRR switching the Hz to match the frame rate.

Maybe if you set your screen to a lower Hz cap - low enough that 1/3 to 1/2 of your frame-rate graph is above that cap - you'd only get the variance of the low end of the graph and it would be less extreme. Best case, your common low frame rate in the graph would match the peak Hz setting of the display; then it wouldn't vary at all outside of a few fps potholes. But most people want higher graphics eye candy combined with the highest Hz they can get, leaning on VRR to smooth out the roller-coaster frame-rate graph and its variance range. Otherwise you could disable VRR and it probably wouldn't happen anymore, since the Hz always stays at 120, but then you have to deal with stutter (though you could frame-rate cap to avoid tearing). Another reason more advanced frame generation would be a good thing.
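
A toy sketch of that mechanism as described (made-up frametime numbers, and not how any particular panel's firmware actually works) - just mapping frametimes to the instantaneous VRR refresh rate and flagging the big swings that would show up as gamma shifts / flicker:

```python
# Under VRR the panel refreshes when the frame arrives, so the instantaneous
# refresh rate simply follows frame time; frametime spikes become big, rapid
# Hz swings, which is when the gamma/black-level shift reads as flicker.

frametimes_ms = [8.3, 8.4, 8.2, 35.0, 8.3, 8.5, 60.0, 8.4]  # made-up trace

prev_hz = None
for ft in frametimes_ms:
    hz = 1000.0 / ft
    swing = abs(hz - prev_hz) if prev_hz is not None else 0.0
    flag = "  <-- big swing, likely visible" if swing > 30 else ""
    print(f"{ft:5.1f} ms -> {hz:6.1f} Hz{flag}")
    prev_hz = hz

# Holding the frame rate steady (e.g. a cap at or below your typical lows)
# keeps these swings small, which is why capping helps even though it doesn't
# change the underlying gamma-vs-refresh behaviour.
```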

Some games seem worse than others, and loading screens are goofy in general. It hasn't really bothered me personally, but I try to keep relatively high frame rates too, so maybe that helps.
 
I've only had displays with G-Sync modules over the last 10 years other than the short time I had the LG OLED, so that's probably why I wasn't aware of that.

Honestly the only true upgrade to the Acer X27 would be the Asus PG32UQX. I have a feeling that it will be the next "FW900", as in nothing will beat it at what it's good at for well over a decade. After the FW900 there was pretty much no other display that could best it in terms of motion clarity, so people just held onto it while everyone else was buying LCD monitors. From the looks of it, no monitor coming out in the near term will be able to trounce the PG32UQX when it comes to HDR performance. So while everyone else is buying OLED displays, the PG32UQX will remain in a league of its own at what it excels at, and I'm sure it will continue to be sought out long after it's been discontinued, just like the FW900.
 
I really think anyone considering the PG32UQX should try to get the ViewSonic version. Basically inaudible fan, better-optimized overdrive, and it is as bright as the near-launch PG32UQXs, unlike the recently manufactured ones where they gimped the brightness a bit for who knows what reason.
 
There was one on eBay for $800 last week. Nobody in their right mind is paying $1700+ for these.
 
There are FALD gaming TVs that go very bright, near or at 2000 nits in a small window, but some of the Samsungs have aggressive ABL (even on FALD) - probably because they don't have active cooling fans on cooling curves like the UCG, UCX, and a few other monitors do.

A big thing is still that you can get OLED in glossy. Matte raises the black level in ambient lighting and can compromise the picture quality overall when that abraded outer-layer texture/glaze is activated by ambient light.

There are other tradeoffs between screens too (LCD response time, some ghosting, peak Hz, HDMI 2.1 port or not, potentially some fan noise, etc.) - and FALD raises blacks around brightly lit objects and areas as well, so the screen is non-uniform, especially in HDR, with far less contrast in those areas even when it isn't outright "blooming" (though they do that too).

OLED with phosphorescent blue, META/MLA micro lens array, and other advancements should continue to get better, with higher brightness sustained for longer than before - at least in models that have that tech as it comes out. So it won't be limited as much as it has been so far.


UCX TFTCentral review:

"The response times showed a few slower transitions in practice, especially for light to dark shade transitions which resulted in a bit of pale blurring to the moving image. At the very top end 144Hz the overall average response times were a little slower than the refresh rate window, and 60% of those measured transitions could keep up with the 144Hz frame rate properly. This results in a bit of added blurring to the image, another reason why the motion clarity isn’t quite as sharp as some other recent high refresh rate IPS screens.

[Image: pg32ucx_tftcentral_pursuit_comparison.jpg]
 
QD-OLED has raised blacks too in ambient light to a much worse degree than matte. The only consideration between matte/glossy right now IMO is if you prefer to lose some saturation with matte during the day vs raised blacks with QD-OLED semi glossy. In the dark they basically look the same.


View: https://youtu.be/VaX_dEvO1FA?si=Nt5S_4ajedhjkzNI&t=326


The UCX has slower response times, so it blurs/ghosts more. Plus 144fpsHz blurs more than 240fpsHz as a starting point, before even adding the response-time ghosting. It also raises blacks / drops contrast of everything next to dark areas and objects, so the scene is non-uniform overall, and it even outright blooms.

QD-OLED does raise blacks though, you are right. But matte will raise them too, and matte can compromise detail a bit when "activated", due to the textured abrasions and diffusion, and give the screen an overlayer look as well. There are still OLEDs, even 240Hz OLEDs I think, that are WOLED though. Edit: like the one this thread is about - https://tftcentral.co.uk/news/lg-32...de-oled-panel-full-specs-and-pricing-released

Plenty of tradeoffs to go around, but 144Hz with slow response times that cause ghosting and blur, plus FALD zone contrast drops / non-uniformity and even outright blooming (+ matte), are some big ones on the UCX.


Not the most detailed testing done or anything, but this vid was dropped yesterday. He does mention (the unfortunate) matte abraded surface briefly. So even though it's not QD-OLED, you can't get away from the matte tradeoff on this WOLED model.

LG’s NEW 32” 4k 240hz OLED Monitor Review - The End Game (32GS95UE)​


View: https://www.youtube.com/watch?v=-VlNyYu1eUg
 
What I've discovered is that at times, if my glossy screen is struggling to output more than 400 nits while my matte screen isn't, I'm not going to care that it's matte.
 
I'll probably get this display. Might just pre-order it. But people need to quit calling these OLEDs "end game displays"; we'll always get better displays in the future. OLED is still in its infancy: the lighting will get brighter, the burn-in prevention will get more sophisticated, and OLED will eventually be replaced with something superior. I buy a display expecting to keep it 3-5 years, not any longer than that. Not sure why reviewers keep calling them "end game".

This display does check off a lot of boxes. I'm glad it has HDMI 2.1 so G-Sync will work over HDMI instead of using DP1.4.
 
VRR flicker is a huge issue, IMO. Can't believe they haven't solved it by now. I love OLED. But, man, VRR flicker is a real bummer.

Alan Wake II starts with a super low light/low contrast scene. It was the first thing I booted up on an OLED. The next scene after is a slightly brighter, but still dim, forest area. And I was treated to a bunch of VRR flicker in there.
 
I'll probably get this display. Might just pre-order it. But people need to quit calling these OLEDs "end game displays"; we'll always get better displays in the future. OLED is still in its infancy: the lighting will get brighter, the burn-in prevention will get more sophisticated, and OLED will eventually be replaced with something superior. I buy a display expecting to keep it 3-5 years, not any longer than that. Not sure why reviewers keep calling them "end game".

This display does check off a lot of boxes. I'm glad it has HDMI 2.1 so G-Sync will work over HDMI instead of using DP1.4.

I guess it would be endgame for the present time. Something better will come out but until they do then these are the endgame for the time being. Well....endgame for everything except HDR that is :ROFLMAO:
 
VRR flicker is a huge issue, IMO. Can't believe they haven't solved it by now. I love OLED. But, man, VRR flicker is a real bummer.

Alan Wake II starts with a super low light/low contrast scene. It was the first thing I booted up on an OLED. The next scene after is a slightly brighter, but still dim, forest area. And I was treated to a bunch of VRR flicker in there.
If I encounter flickering I'll post it in this thread. I know some games to test - Hogwarts, Alan Wake 2 - and if anyone knows of other games with flicker, let me know and I'll test them. I pre-ordered the display; I should have it mid-April (week of the 15th).
 

COD MW3 is easily the quickest way to get flickering. If you haven't played the game in a while, it will start shader compilation upon bootup and will send any OLED display into a flicker madness.
 
LG's new OLED displays next year should finally have phosphorescent blue and alleviate a huge cause of burn-in. That is end game for me.

RGB subpixel PHOLED with MLA+ that doesn't have any brightness nerfs? Sign me up :) Really hope LG can deliver on it as it's not set in stone yet but just in planning phase or design phase or whatever.
 
I wouldn't expect PHOLED + MLA to have no brightness nerfs at all, especially in any model without a full panel heatsink and active cooling fans on cooling profiles, but they should be able to get very appreciable improvements - hopefully in the brightness and duration of the mids in HDR, specifically.

Even the ~2000 nit Samsung gaming TVs have aggressive ABL, and they are FALD. I'd guess that 4000-nit TCL or whatever does too - just guessing, I'd have to look into that one.
 

Ok fair enough perhaps asking for no brightness nerfs is asking a bit much. I would just like to have at least 1000 nits for most real scene content and I'll be happy for a while. The current QD OLEDs are more like 300-400 nits for a lot of real content.
 