42" OLED MASTER THREAD

Well, now that I bought the close-enough-to-holy-grail 4K TV (LG C2 42"), the only logical next question is: what year do you all predict bandwidth will be good enough and content will be plentiful enough that 8K displays will be an attractive proposition? 2027? 2030?
I thought we would be seeing more 8K screens by now, but it seems the above-4K sector is still a wasteland. HDMI 2.1 can handle 8K 60 Hz and possibly even 8K 120 Hz with DSC.

Maybe in 5 years we might have something?
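A rough sanity check on those HDMI numbers (my own back-of-envelope math, not figures from the HDMI spec; it ignores blanking intervals and FRL encoding overhead, and treats DSC's commonly quoted ~3:1 compression as an approximation):

```python
# Back-of-envelope HDMI bandwidth check (approximate; ignores blanking
# intervals and FRL encoding overhead, so real figures differ somewhat).

def data_rate_gbps(width, height, hz, bits_per_channel=10, channels=3):
    """Raw pixel data rate in Gbit/s for an uncompressed signal."""
    return width * height * hz * bits_per_channel * channels / 1e9

hdmi_2_1_gbps = 48   # HDMI 2.1 FRL max link rate
dsc_ratio = 3        # DSC is typically quoted as up to ~3:1, visually lossless

for hz in (60, 120):
    raw = data_rate_gbps(7680, 4320, hz)
    compressed = raw / dsc_ratio
    print(f"8K {hz} Hz 10-bit: ~{raw:.0f} Gbit/s raw, ~{compressed:.0f} Gbit/s "
          f"with ~3:1 DSC ({'fits' if compressed <= hdmi_2_1_gbps else 'does not fit'} in 48 Gbit/s)")
```

By this rough math, uncompressed 8K 60 Hz already exceeds 48 Gbit/s, so both 8K 60 and 8K 120 lean on DSC over HDMI 2.1.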
 
I thought we would be seeing more 8K screens by now, but it seems the above-4K sector is still a wasteland. HDMI 2.1 can handle 8K 60 Hz and possibly even 8K 120 Hz with DSC.

Maybe in 5 years we might have something?

I just don't see the point of 8K for gaming. For desktop work, yeah, I get it: higher resolution = higher PPI = crisper, sharper text. But for games, where we're at the point where rendering at 1440p with DLSS can provide a better image than actual native 4K, is there really much need for 8K? It seems like such a huge waste of GPU power to dedicate to pixel pushing when it could be used for rendering better images instead. Although even if we did have 8K screens, we could probably just render at an internal 5K with DLSS Quality mode and get just as good an image.
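The "internal 5K" figure checks out if you use the commonly cited DLSS per-axis render scales (the exact factors can vary by title and DLSS version, so treat these as approximations):

```python
# Commonly cited DLSS per-axis render scales (approximate; exact factors
# can vary by title and DLSS version).
SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_res(width, height, mode):
    """Approximate internal render resolution for a given output res and DLSS mode."""
    s = SCALES[mode]
    return round(width * s), round(height * s)

print(internal_res(3840, 2160, "Quality"))  # 4K Quality -> (2560, 1440)
print(internal_res(7680, 4320, "Quality"))  # 8K Quality -> (5120, 2880), i.e. "5K"
```

So 8K with DLSS Quality would render internally at 5120x2880, exactly the 5K the post describes.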
 
I remember when there was a push for 4K in phones, but people abandoned it in favor of 1440p-ish screens because "It's enough", and it's been like that for years. I think we've hit that same plateau in TVs as well. 8K is nice to have, but it's extremely difficult to drive with very little benefit. 4K seems to be the sweet spot.
 
So I'm not the only one who thinks this looks brighter and sharper even when set to the same res as the AW3423DW. The softness of the AW is not just a 1440p thing; to me it looks soft due to the pixel layout.
Do you have Sharpness at the default setting (10) on the LG? That actually oversharpens the image; the neutral setting is 0.

The AW does not look soft to me personally, but I'm mentioning it since it may be the reason the LG looks sharper.
 
Do you have Sharpness at the default setting (10) on the LG? That actually oversharpens the image; the neutral setting is 0.

The AW does not look soft to me personally, but I'm mentioning it since it may be the reason the LG looks sharper.
Yeah, I set it to the recommended 0, but even at 10 it's extremely difficult to tell apart from 0 unless my face is pressed close to the screen and I'm pixel peeping. Going up to 25+, I can see sharpness changes more easily.

I dunno, because I have my brother's PG35VQ to compare the AW with, and even though it's a 35" vs a 34", the AW looks noticeably softer. The softness I'm referring to is only noticeable to me with desktop use or text-heavy stuff, which I noticed right away the second I first powered it on, without comparing it to anything. In games I wouldn't be able to tell without something side by side to compare, like in that video.

On another note, my friend's C2 appears to be WBE, and I disabled TPC and GSR to get rid of the ASBL, which to me seems to activate even more aggressively than on the C1:

WBE.jpg


Also, the 2000-hour forced pixel refresh is set to 500 hours on the C2 42", probably because they know it will be used as a monitor or gaming display.
 
Just got my C2 42 -- Amazing display... but have an ultrawide issue:

NVIDIA RTX 3090 Ti. I can set the desktop just fine to 3840x1600 at 60 Hz (TV resolutions) and 3840x1600 at 120 Hz (PC resolution) in the NVIDIA Control Panel with G-SYNC enabled for full screen and windowed. The desktop works great, but every time I go into true fullscreen mode in a game, the TV reverts to a full 16:9 4K image.

All scaling items in the NVIDIA Control Panel are grayed out, scaling is set to be performed on the display, and the only aspect ratio options in the TV menu are Original, 16:9, and 4:3. None of them work correctly in ultrawide games.

Thanks in advance!


IMG_7454.jpg
 
I'd be allllll over an 8K monitor at 42" or less. I once had an IPS 4K 24", and the PPI made it like looking through a window. 8K at 42" would be even higher PPI and glorious. I sold it because 24" was just too small for me for productivity :( and, really, for gaming too. Far Cry 3 at 4K maxed out gobbled up about 3.4 GB of VRAM for me back then on my SLI GTX 970 setup.
 
I just don't see the point of 8K for gaming. For desktop work, yeah, I get it: higher resolution = higher PPI = crisper, sharper text. But for games, where we're at the point where rendering at 1440p with DLSS can provide a better image than actual native 4K, is there really much need for 8K? It seems like such a huge waste of GPU power to dedicate to pixel pushing when it could be used for rendering better images instead. Although even if we did have 8K screens, we could probably just render at an internal 5K with DLSS Quality mode and get just as good an image.
An 8K display at something like 40-50" would be excellent for both desktop use and gaming. 8K makes integer-scaled 1080p, 1440p, and 4K possible, so you can have a very sharp desktop UI while still retaining a lot of desktop space, and then game at those lower resolutions for performance. Similarly, DLSS can help with gaming performance.
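The integer-scaling claim is easy to verify: an 8K panel is exactly 4x, 3x, and 2x the linear dimensions of 1080p, 1440p, and 4K, while an ultrawide like 3440x1440 has no clean factor. A quick check:

```python
# Which common render resolutions divide evenly into an 8K (7680x4320) panel?
TARGET_W, TARGET_H = 7680, 4320

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160), (3440, 1440)]:
    fw, fh = TARGET_W / w, TARGET_H / h
    if fw == fh and fw.is_integer():
        print(f"{w}x{h}: clean {int(fw)}x integer scale")
    else:
        print(f"{w}x{h}: no clean integer scale")
```

Each pixel of a 1080p image maps to an exact 4x4 block of panel pixels, so integer scaling avoids the blur of fractional interpolation entirely.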
 
Just got my C2 42 -- Amazing display... but have an ultrawide issue:

NVIDIA RTX 3090 Ti. I can set the desktop just fine to 3840x1600 at 60 Hz (TV resolutions) and 3840x1600 at 120 Hz (PC resolution) in the NVIDIA Control Panel with G-SYNC enabled for full screen and windowed. The desktop works great, but every time I go into true fullscreen mode in a game, the TV reverts to a full 16:9 4K image.

All scaling items in the NVIDIA Control Panel are grayed out, scaling is set to be performed on the display, and the only aspect ratio options in the TV menu are Original, 16:9, and 4:3. None of them work correctly in ultrawide games.

Thanks in advance!


View attachment 474701
Try the borderless window option to see if that works. And make sure your game is actually set to 3840x1600 as well. I find the Tomb Raider games can sometimes be picky about how they are set up.
 
Try the borderless window option to see if that works. And make sure your game is actually set to 3840x1600 as well. I find the Tomb Raider games can sometimes be picky about how they are set up.
Hi there - yeah, that works, but a lot of games don't have a borderless window option. I found that there are ultrawide options in the Game Optimizer menu on the TV, but even with those, the results in true fullscreen are all over the place.
 
I just realized the softness I noticed on the AW is visible even in HDTVTest's comparison footage. The lady's face is a blur in comparison. I don't think this can be 100% attributed to 4K vs 1440p.

View attachment 474767

I have a 32" BenQ 4K IPS monitor from 2016. Obviously it has a much higher PPI than my current 48" LG CX OLED. However, to my eye the 48" still looks much sharper. Obviously the colors and motion are better on the OLED, which I'm sure contributes to the perception, but it must also have something to do with the matte coating.

Conclusion: I'm an LG OLED TV convert; waiting on my 42" to get here. I probably won't buy a dedicated "monitor" anytime in the foreseeable future.
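For reference, the pixel densities being compared here (and the hypothetical 8K 42" discussed elsewhere in the thread) work out as follows; the formula is just diagonal pixels divided by diagonal inches:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

panels = [
    ('32" 4K (BenQ)',            3840, 2160, 32),
    ('42" 4K (C2)',              3840, 2160, 42),
    ('48" 4K (CX)',              3840, 2160, 48),
    ('42" 8K (hypothetical)',    7680, 4320, 42),
]
for name, w, h, d in panels:
    print(f"{name}: {ppi(w, h, d):.0f} PPI")
```

So the 32" 4K panel has roughly 50% more PPI than the 48" CX, which makes the perceived-sharpness reversal here all the more striking; contrast and coating clearly carry a lot of weight.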
 
An 8K display at something like 40-50" would be excellent for both desktop use and gaming. 8K makes integer-scaled 1080p, 1440p, and 4K possible, so you can have a very sharp desktop UI while still retaining a lot of desktop space, and then game at those lower resolutions for performance. Similarly, DLSS can help with gaming performance.

I know, but I was saying that from a pure gaming standpoint, 8K seems like a huge waste. (Seriously, will anyone be able to tell the difference in-game between native 8K and 4K/5K with DLAA on screen sizes smaller than 55" without zooming in with a microscope?) But with more and more people using these large displays for both gaming and working, I would say the demand for 8K OLEDs will probably increase over time.
 
I've thought about it a lot, going back and forth on whether it would be better to get a mini-LED monitor or the C2 42. I weighed the pros and cons of both a standard monitor and a 42" OLED and determined that the OLED really is the best choice. I've been trying to follow the situation regarding the panel for the C2 42. Looking at the latest info, it seems that some units have the latest EVO panel from LG and some do not; is that still the case?

If you purchase the C2 42 now, is there a way to determine if it has the latest panel or not? Or would it be better to wait until the end of the year (Q4) to purchase the C2 42?

A quick rundown would be greatly appreciated and apologies if this has been discussed before.
 
I have a 32" BenQ 4K IPS monitor from 2016. Obviously it has a much higher PPI than my current 48" LG CX OLED. However, to my eye the 48" still looks much sharper. Obviously the colors and motion are better on the OLED, which I'm sure contributes to the perception, but it must also have something to do with the matte coating.

Conclusion: I'm an LG OLED TV convert; waiting on my 42" to get here. I probably won't buy a dedicated "monitor" anytime in the foreseeable future.

It's partially that the resolution is getting to the point of diminishing returns, but it's also OLED's infinite pixel-to-pixel contrast that makes it look much sharper.
Really, it isn't OLED making the image look sharper; it's the LCD's limited contrast softening the image.

Check out this video where they do a blind test comparing an 8k LCD to 4k OLED.



And that wasn't even with the lights off. In a dark room, OLED destroys it.
 
Hi there - yeah, that works, but a lot of games don't have a borderless window option. I found that there are ultrawide options in the Game Optimizer menu on the TV, but even with those, the results in true fullscreen are all over the place.
Rather than creating a custom resolution, try going to the 'Adjust desktop size and position' section of the NVIDIA Control Panel. You probably want Scaling mode = No scaling and Perform scaling on = Display, and check the box for 'Override the scaling mode set by games and programs'. On the Size tab of that same section, check 'Enable desktop resizing' and click Resize..., then lower the height to the minimum, which will give you 3840x1620.

That should create a custom resolution that has always worked for me. Some games require that you set your Desktop resolution to it before starting them, and some will just see it as an option and use it either way.
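For anyone wondering how those two resolutions relate aspect-wise (my own arithmetic, not anything from the NVCP docs): 3840x1600 is the standard 2.40:1 "21:9" ultrawide shape, while the resize tool's 3840x1620 minimum is slightly taller at about 2.37:1.

```python
from math import gcd

def aspect(w, h):
    """Reduced aspect ratio plus its decimal form."""
    g = gcd(w, h)
    return f"{w // g}:{h // g} (~{w / h:.2f}:1)"

print(aspect(3840, 1600))  # 12:5 (~2.40:1), the usual "21:9" ultrawide
print(aspect(3840, 1620))  # 64:27 (~2.37:1), the resize-tool minimum
```

The two are close enough that games treating either as "ultrawide" should behave about the same, with 3840x1620 costing only a sliver more GPU work.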
 
I'm typing this on a 48" LG CX OLED that has been great, but I'm considering getting the 42" C2 for my desk, as my bedroom TV is going to hell (an ancient 30-something-inch 1080p with a huge bezel; the backlights around the edges are starting to die, so there's a darker spot. It served me well enough for many years though, that old Samsung), and I can move the 48" CX over there. I had been considering the 42" C2 but was staying my hand knowing that QD-OLED would be coming out. Now that we see the only major QD-OLED, the Alienware 34", is not 4K resolution and there are some questions about image quality and other issues, it seems like the LG C2 may be a better option even in terms of image quality.

However, that one Alienware isn't everything. I'm curious about the ASUS 42" OLED, and surely there must be other 30-49" OLEDs and QD-OLEDs releasing soon? There are also, of course, all the mini-LED monitors, which seem to be the best of conventional backlighting; for instance, there's an ASUS ROG model with 1000+ FALD zones, but between the extreme price (sometimes close to $3,000) and the lack of OLED's contrast/backlighting benefits, I'm not sure it stacks up. I haven't done a ton of research on the newest versions, though. So if there's anything else in the 30-49" range arriving soon that competes with the LG C2 (especially a 4K apples-to-apples LG OLED vs QD-OLED comparison), let me know.
 
Broke down and ordered the C2 off Amazon today. 42" is what I would consider the perfect size for the upper end of a large desk monitor. The 48" C1 offering just struck me as a little too big. I've been running a 32" 4K Acer (XB321HK) for the last few years... time for a nice upgrade!
 
I know, but I was saying that from a pure gaming standpoint, 8K seems like a huge waste. (Seriously, will anyone be able to tell the difference in-game between native 8K and 4K/5K with DLAA on screen sizes smaller than 55" without zooming in with a microscope?) But with more and more people using these large displays for both gaming and working, I would say the demand for 8K OLEDs will probably increase over time.
I don't even disagree. I want 8K purely for desktop use because it opens up so many more DPI scaling options. Right now the only usable one on my LG CX 48" is 125%. Anything more is too large. 8K at the same size and I could probably pick between 150-250% or something so there's more flexibility for setting the text size vs available desktop space.
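The extra flexibility is easy to quantify: at a given Windows DPI scale factor, your "logical" desktop space is the panel resolution divided by the scale. A quick sketch (my own arithmetic; Windows rounds these slightly differently in practice):

```python
# Logical desktop size at common Windows DPI scale factors.
def effective_desktop(width, height, scale_pct):
    """Approximate logical desktop resolution at a given DPI scale percentage."""
    return round(width * 100 / scale_pct), round(height * 100 / scale_pct)

for pct in (125, 150, 200, 250):
    w4, h4 = effective_desktop(3840, 2160, pct)
    w8, h8 = effective_desktop(7680, 4320, pct)
    print(f"{pct}%: 4K -> {w4}x{h4} logical, 8K -> {w8}x{h8} logical")
```

At 8K, even 200% scaling still leaves a full 4K-equivalent workspace with double the pixel density behind every element, whereas on a 4K panel anything beyond 125% rapidly eats into usable space.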
 
I'm typing this on a 48" LG CX OLED that has been great, but I'm considering getting the 42" C2 for my desk, as my bedroom TV is going to hell (an ancient 30-something-inch 1080p with a huge bezel; the backlights around the edges are starting to die, so there's a darker spot. It served me well enough for many years though, that old Samsung), and I can move the 48" CX over there. I had been considering the 42" C2 but was staying my hand knowing that QD-OLED would be coming out. Now that we see the only major QD-OLED, the Alienware 34", is not 4K resolution and there are some questions about image quality and other issues, it seems like the LG C2 may be a better option even in terms of image quality.

However, that one Alienware isn't everything. I'm curious about the ASUS 42" OLED, and surely there must be other 30-49" OLEDs and QD-OLEDs releasing soon? There are also, of course, all the mini-LED monitors, which seem to be the best of conventional backlighting; for instance, there's an ASUS ROG model with 1000+ FALD zones, but between the extreme price (sometimes close to $3,000) and the lack of OLED's contrast/backlighting benefits, I'm not sure it stacks up. I haven't done a ton of research on the newest versions, though. So if there's anything else in the 30-49" range arriving soon that competes with the LG C2 (especially a 4K apples-to-apples LG OLED vs QD-OLED comparison), let me know.

The only smaller OLED panels at the moment are the 42" LG and the 34" Samsung QD-OLED. Samsung is making its own version of the latter, but who knows when it will release and whether it will have a pile of issues like their other monitors have. The panel is the same, though, so the pixel-structure problems will be there, and possibly the same antiglare coating.

For the 42" model, the only one to look out for is the ASUS. It is so far the only announced model said to have a heatsink, allowing for possibly brighter HDR, but I wonder if ASUS will renege on that, because the Philips models also specifically lack a heatsink on the 42" and 48" versions despite being positioned as gaming monitors. ASUS is also the only one to offer DisplayPort, which would be a welcome feature.

I think we will need to wait until next year for the bendable 42" LG was demoing recently and the rumored 49" QD-OLED, whether that turns out to be a 16:9 or super-ultrawide panel.

I think the LG is really good as it is. The only things I would like changed (based on my CX), ignoring any possible future panel-tech improvements:
  • An automatic static brightness limiter toggle in the regular menus. Having to use the service menu for that sucks.
  • Put that 4th HDMI port on the side rather than poking out the back. Even better would be a breakout box for connections, like older Samsung TVs had.
  • A DisplayPort connector. I'd sacrifice one HDMI port for this if necessary.
  • Picture-by-picture features: split in halves vertically/horizontally, or a 4x grid.
  • 120 Hz BFI, as it was removed from the 42" model.
  • The ability to disable those "instant game response has started" messages.
 
I got my C2 from Costco, and it has the same OLED module info as my friend's, indicating WBE.
Just wondering, are you going to be doing any kind of '42" C2 vs AW3423DW' head-to-head comparison?
I know they are very different screens and would excel at different things; however, say you needed a screen for an eight-hour workday that's 50% text-based work and 50% FPS gaming: which one would you choose?
 
Just wondering, are you going to be doing any kind of '42" C2 vs AW3423DW' head-to-head comparison?
I know they are very different screens and would excel at different things; however, say you needed a screen for an eight-hour workday that's 50% text-based work and 50% FPS gaming: which one would you choose?

Not SoCali, but I would choose neither option for that. I would get a separate setup with a proper RGB LCD monitor for text-based work, or I would go with the XG321UG or PG32UQX if I had to have a single setup for both work and gaming.
 
Just wondering, are you going to be doing any kind of '42" C2 vs AW3423DW' head-to-head comparison?
I know they are very different screens and would excel at different things; however, say you needed a screen for an eight-hour workday that's 50% text-based work and 50% FPS gaming: which one would you choose?
If I had to choose strictly between these two displays for a work/play combination, it would be the C2. I think its text quality is better, and it looks sharper while giving you more real estate for multiple windows. For games it's far more versatile, since you can run 3840x1600 and get almost a 38" UW, or 3440x1440 and get basically a 34/35" UW, or, if the game's 21:9 support blows, use native 4K.

I honestly think HDR performance between the two in real-world use, not involving test slides, is so close that nobody would be able to tell the difference without them side by side. I literally had to get a meter out to confirm my eyes were telling the truth. They are either equal, or one barely edges ahead of the other depending on the game/movie, based on ABL behavior. The C2's advantage here is that it's a TV, so you can play with tone mapping or HGIG based on the content, making it again more versatile in handling all types of HDR games, since so many nowadays just don't offer competent HDR controls. This same TV aspect, I think, makes the color volume difference a smaller deal than people are making it out to be. Yes, the AW provides more saturated reds, for example, but without having them side by side it's hard for me to definitively say whether it's an improvement or just a different presentation/calibration. In that regard, with the C2 you can just pump the color setting to 55 or 60 and that perceived difference in saturation is basically gone.

The AW's big advantage is that the slight curve is easy on the eyes given the width. Taking the curve into account, they aren't hugely different in width (a 12 cm difference), but I definitely notice that it's more comfortable. The other big benefit is 175 Hz vs 120 Hz, which I personally benefit from just for desktop use, but it's harder for me to tell the difference in games, as even at 3440x1440 I rarely get north of 120 FPS in many games with a 3090.

The one universal scenario where I think someone should buy the AW over the LG is if you require very high SDR/full-field brightness for your environment, or are scared of spending $6 on a service remote to disable ASBL. My C2 measures around 137 nits full field, and the AW does double that, so if lots of white docs/pages are part of your workflow and you're in a really bright room all day, the AW is the better choice. Its AR coating doesn't bother me, since I'm not in a super bright room during the day (it turns a slight/mild grey).

Right now, the biggest issues I have with the AW are that it looks soft and, now that I've spotted it, the fringing on text is visible even at a decent distance. Second, my fan is pulsing at random, which is super annoying. Lastly, I purchased a monitor for the sake of it being a monitor and sleeping/waking as it should, which the AW fails at miserably. In the grand scheme of things, the AW behaves no differently from the C2, since it requires physically powering on after it's done with a pixel refresh, except with the C2 you at least have a remote to do so. To many this is a minor thing, but it's a massive deal to me, as I never touch the power button on a monitor after the day it's set up on my desk, so it's a huge annoyance. Not to mention the rotation on my Ergotron stand is really sensitive, so pressing the power button rotates the monitor every single damn time and I have to adjust it.

EDIT: Also keep in mind I'm not a super heavy FPS player; I play basically all genres. For strictly shooters, the AW is probably the better choice (curve + 175 Hz), but keep in mind it actually has higher input lag than the C2, and I can see and feel it when moving windows around on the desktop.
 
If I had to choose strictly between these two displays for a work/play combination, it would be the C2. I think its text quality is better, and it looks sharper while giving you more real estate for multiple windows. For games it's far more versatile, since you can run 3840x1600 and get almost a 38" UW, or 3440x1440 and get basically a 34/35" UW, or, if the game's 21:9 support blows, use native 4K.

I honestly think HDR performance between the two in real-world use, not involving test slides, is so close that nobody would be able to tell the difference without them side by side. I literally had to get a meter out to confirm my eyes were telling the truth. They are either equal, or one barely edges ahead of the other depending on the game/movie, based on ABL behavior. The C2's advantage here is that it's a TV, so you can play with tone mapping or HGIG based on the content, making it again more versatile in handling all types of HDR games, since so many nowadays just don't offer competent HDR controls. This same TV aspect, I think, makes the color volume difference a smaller deal than people are making it out to be. Yes, the AW provides more saturated reds, for example, but without having them side by side it's hard for me to definitively say whether it's an improvement or just a different presentation/calibration. In that regard, with the C2 you can just pump the color setting to 55 or 60 and that perceived difference in saturation is basically gone.

The AW's big advantage is that the slight curve is easy on the eyes given the width. Taking the curve into account, they aren't hugely different in width (a 12 cm difference), but I definitely notice that it's more comfortable. The other big benefit is 175 Hz vs 120 Hz, which I personally benefit from just for desktop use, but it's harder for me to tell the difference in games, as even at 3440x1440 I rarely get north of 120 FPS in many games with a 3090.

The one universal scenario where I think someone should buy the AW over the LG is if you require very high SDR/full-field brightness for your environment, or are scared of spending $6 on a service remote to disable ASBL. My C2 measures around 137 nits full field, and the AW does double that, so if lots of white docs/pages are part of your workflow and you're in a really bright room all day, the AW is the better choice. Its AR coating doesn't bother me, since I'm not in a super bright room during the day (it turns a slight/mild grey).

Right now, the biggest issues I have with the AW are that it looks soft and, now that I've spotted it, the fringing on text is visible even at a decent distance. Second, my fan is pulsing at random, which is super annoying. Lastly, I purchased a monitor for the sake of it being a monitor and sleeping/waking as it should, which the AW fails at miserably. In the grand scheme of things, the AW behaves no differently from the C2, since it requires physically powering on after it's done with a pixel refresh, except with the C2 you at least have a remote to do so. To many this is a minor thing, but it's a massive deal to me, as I never touch the power button on a monitor after the day it's set up on my desk, so it's a huge annoyance. Not to mention the rotation on my Ergotron stand is really sensitive, so pressing the power button rotates the monitor every single damn time and I have to adjust it.

EDIT: Also keep in mind I'm not a super heavy FPS player; I play basically all genres. For strictly shooters, the AW is probably the better choice, but keep in mind it actually has higher input lag than the C2, and I can see and feel it when moving windows around on the desktop.

Having noticeably higher input lag almost seems like it would defeat the purpose of the Alienware's 175 Hz for fast-paced FPS. I'm not sure what's better: lower refresh rate + lower input lag, or higher refresh rate + higher input lag.
 
Having noticeably higher input lag almost seems like it would defeat the purpose of the Alienware's 175 Hz for fast-paced FPS. I'm not sure what's better: lower refresh rate + lower input lag, or higher refresh rate + higher input lag.
Lower input lag at 120 Hz is better, since you will likely be hard-pressed to actually notice the difference between 120 and 175 Hz most of the time.
 
It's partially that the resolution is getting to the point of diminishing returns, but it's also OLED's infinite pixel-to-pixel contrast that makes it look much sharper.
Really, it isn't OLED making the image look sharper; it's the LCD's limited contrast softening the image.

Check out this video where they do a blind test comparing an 8k LCD to 4k OLED.



And that wasn't even with the lights off. In a dark room, OLED destroys it.

I haven't watched the video, but I'm guessing the OLED won? Beyond a certain resolution, our eyes are drawn more to brightness and contrast than to resolution. For most people, 4K is beyond that resolution barrier.
 
Having noticeably higher input lag almost seems like it would defeat the purpose of the Alienware's 175 Hz for fast-paced FPS. I'm not sure what's better: lower refresh rate + lower input lag, or higher refresh rate + higher input lag.
It's definitely noticeable between the two to me with the mouse but I dunno if it would be super apparent to everyone.


Screenshot 2022-05-19 130821.png
 
I haven't watched the video, but I'm guessing the OLED won? Beyond a certain resolution, our eyes are drawn more to brightness and contrast than to resolution. For most people, 4K is beyond that resolution barrier.
Quality of pixels > quantity of pixels.

The enhanced contrast, especially in dark scenes, is what makes OLED look so much more eye-popping, even compared to a very bright FALD LCD set.
 
Well, now that I bought the close-enough-to-holy-grail 4K TV (LG C2 42"), the only logical next question is: what year do you all predict bandwidth will be good enough and content will be plentiful enough that 8K displays will be an attractive proposition? 2027? 2030?
We won't see 4K running as smoothly as 1080p or 1440p does today until at least 2027-2030, and fluid 8K with widely distributed content until around 2032, a decade from now. If you look at every past resolution, each took about a decade to mature while the next one climbed the ladder 🪜
 
Just ordered it from Amazon, and it gets here a week from tomorrow (42" C2). I've been using a Samsung 43" Neo QLED on a second PC. It's been called "a bright OLED" in articles, and it does look quite amazing for games with its FALD. So I'll see how they compare for PC games and movie streaming. I know the C2 will have better contrast and blacks.
 
If I had to choose strictly between these two displays for a work/play combination, it would be the C2. [snip]

Have you been using the C2 for productivity? Is the (lack of?) white uniformity annoying for daily driver work? What SDR brightness are you keeping it at while in a productivity role?
 
Have you been using the C2 for productivity? Is the (lack of?) white uniformity annoying for daily driver work? What SDR brightness are you keeping it at while in a productivity role?
Yeah, but I don't deal with much white-only content (dark mode everything). Even if I did, my panel's white uniformity is pretty clean and not an issue at all. Nothing at all like that HDTVTest sample.

Setting SDR brightness is kind of tricky, because if I calibrate for 100 nits full field, smaller windows will get much brighter, and if I calibrate for 100 nits in a small window, full-field brightness is super dim. I ended up just sticking with 80 nits full-field brightness, which is 35/100 and perfectly adequate for me.

Might sound super dim to some, but when I'm staring at a display this size 6+ hours a day in a medium-lit environment, it's necessary to keep it that low to prevent eye fatigue. When I game, I just use the Windows shortcut for Auto HDR/HDR and that's it.
 
Have you been using the C2 for productivity? Is the (lack of?) white uniformity annoying for daily driver work? What SDR brightness are you keeping it at while in a productivity role?
To piggyback on SoCali's comment: I have the 48" CX, and using a monitor this big with 100%-size windows for anything that isn't an IDE, a graphics/3D-modeling/video/music-editing suite, or a game is foolish. Multiple windows side by side or stacked is the way to go. I can tell that with a 100% white window there are some uniformity issues that manifest as a tint as you go horizontally from the center to the edges, but in real use this is never noticeable.

Like SoCali, I also use dark modes where available. My display is at about 120-130 nits even for daytime use. I use macOS in SDR mode (because macOS external display handling is a POS and can't even detect this as an HDR display). In Windows, I set the SDR brightness slider to 7% when running in HDR mode. This equated to 120 nits based on the Spyder5 calibrator I had, and it visually looks similar to SDR mode but with slightly worse color accuracy. This might differ between specific screens.
 
In everyday use, how much brighter is the C2 than the C1? I know it's rated like 200 nits brighter, but what does that actually look like in practice?
 
In everyday use, how much brighter is the C2 than the C1? I know it's rated like 200 nits brighter, but what does that actually look like in practice?
If you're referring to the 42" that this thread is about, it's basically the same brightness as the 48" C1.

The 55"+ C2s are like 50-75 nits brighter in small highlights compared to last year's 55" C1. Only the heatsinked G2 is significantly brighter.
 