LG 48CX

I don't know what it is with me, but I found my 27UQ to be vastly superior to my C9 or recent Q90R when it comes to HDR. To me the HDR impact wasn't even close. I'm wondering if it has something to do with the smaller screen. It just seems much brighter with HDR highlights.

That's interesting, because I compared my GX to the Acer X27 (which is basically the same monitor as the Asus) and did not find it to be much brighter than the OLED. Black levels and contrast were of course better on the OLED.
 
Only in bright scenes. I have the Acer X27, basically Acer's version of a PG27UQ, and it's absolutely terrible in dark scenes due to FALD blooming.

Wouldn't call it terrible, more that it's kind of what is expected. Let's remember that this is also an IPS panel, with the advantages and disadvantages that entails. I have yet to see a screen that handles text and the like better than the X27/PG27UQ.
 
Wouldn't call it terrible, more that it's kind of what is expected. Let's remember that this is also an IPS panel, with the advantages and disadvantages that entails. I have yet to see a screen that handles text and the like better than the X27/PG27UQ.

Text is great, which is why I'm still keeping it, but now it's going to be retired to desktop monitor duty instead of gaming.
 
I got my 55CX, ran a firmware upgrade (03.00.60), and both the pendulum demo (G-Sync was checked by default when it started, and it still worked windowed) and my eyes in games indicate G-Sync works at 120 Hz 4K 4:2:0, windowed and fullscreen. (I also did a full clean video driver installation.)
55" is borderline too tall, but I'm OK with it. I wouldn't recommend the 55" unless you're really sure it'll fit and it won't be too tall.

Game mode looks like garbage, no exaggeration, but it is "picture mode - game". In "picture mode - normal" it looks fine and still allows instant game response etc. Could it just be a misunderstanding, and "picture mode - game" is not actually necessary? It is just called "picture mode" after all. I can't imagine LG forcing people to use that mode (it really looks as bad as some monitor game modes).

I do not feel any extra input lag with picture mode - normal and instant game response on, but that doesn't mean it isn't there. I can test the input lag later or tomorrow. But most likely, game mode is just a trashy, unnecessary picture mode. The description in the picture sounds like "instant game response" includes all the normal game mode functions.

Ugh, so the game picture mode still looks terrible? I thought I read on AVS Forum that they fixed it. On my C9, I found the game mode was locked to the "Extended" color mode and didn't allow "Wide" color mode, which is much better for games. Being locked to "Extended" made everything look washed out and lifeless.

The big problem on the C9, though, was that the only way to get the super low input lag was to have the PC Input and Game picture preset modes active simultaneously, which had terrible picture quality. Any other picture mode outside of "Game", even with instant game response turned on, had additional input lag according to my high speed camera tests. I fear the CX may be the same...
 
Ugh, so the game picture mode still looks terrible? I thought I read on AVS Forum that they fixed it. On my C9, I found the game mode was locked to the "Extended" color mode and didn't allow "Wide" color mode, which is much better for games. Being locked to "Extended" made everything look washed out and lifeless.

The big problem on the C9, though, was that the only way to get the super low input lag was to have the PC Input and Game picture preset modes active simultaneously, which had terrible picture quality. Any other picture mode outside of "Game", even with instant game response turned on, had additional input lag according to my high speed camera tests. I fear the CX may be the same...
I did not even try to tweak it, but the default settings of the game picture mode looked really bad.
I tested input lag with the 960 fps mode on my S9+, and had previously done the same for the PG27UQ, and within reason they have the same input lag.

From finger impact on the mouse to CS:GO movement onscreen:
  • Full rendering chain, CS:GO at 450 fps, 120 Hz, PG27UQ: ~20 ms
  • Full rendering chain, CS:GO at a 117 fps cap (NVIDIA panel), LG CX, standard picture mode, instant game response on: ~25 ms
  • LTT baseline: 240 Hz TN, 15 ms (with TN being ~2 ms faster than the IPS in the PG27UQ, and 240 Hz averaging ~2 ms faster than 120 Hz, the difference between these three tests, adjusted for input-lag-increasing settings, puts all three equal within the margin of error). I posted this in more detail a couple of pages back.
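
Purely as illustration of the method (this is not the poster's actual script, and the frame counts below are assumed examples), converting high-speed camera frames to milliseconds works out like this:

```python
# Minimal sketch of the frame-count-to-milliseconds conversion behind a
# 960 fps slow-motion camera test like the one described above.

CAMERA_FPS = 960  # Galaxy S9+ super-slow-motion capture rate

def frames_to_ms(frame_count: int, camera_fps: int = CAMERA_FPS) -> float:
    """Each camera frame covers 1000 / camera_fps milliseconds."""
    return frame_count * 1000.0 / camera_fps

# Count the camera frames between the finger hitting the mouse button and the
# first visible change on screen, then convert. Resolution is ~1 camera frame,
# i.e. roughly +/- 1 ms.
for frames in (14, 19, 24):
    print(f"{frames} camera frames -> {frames_to_ms(frames):.1f} ms")
```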

The reason I tested with a 117 fps cap was that some review claimed turning VRR on increased input lag, which LG says it shouldn't, and it does indeed not seem to.
The other test, running 450 fps uncapped, lowers the input lag by a few ms, so 20 ms uncapped vs. 25 ms capped should be about equal, or at most something like 2 ms slower than the PG27UQ.
Since the results in standard mode were so good, I decided to be lazy and not even bother tweaking game mode, but I'll get around to it sometime soon.

I looked a bit at game mode now, and it looks like it can be tweaked to look just like Standard, for example, except that it lacks one option: color temperature.
Copying the values from "Standard", the mode that looks most accurate out of the box, makes game mode look fine. It just has pretty bad default settings.
 
The reason I tested with a 117 fps cap was that some review claimed turning VRR on increased input lag, which LG says it shouldn't, and it does indeed not seem to.
The other test, running 450 fps uncapped, lowers the input lag by a few ms, so 20 ms uncapped vs. 25 ms capped should be about equal, or at most something like 2 ms slower than the PG27UQ.

Rtings redid their input lag tests.

  • 1080p and 1440p @ 120 Hz: 7.4 ms
  • 4K @ 120 Hz: 11.6 ms
  • 1080p with Variable Refresh Rate: 6.4 ms
  • 1440p with VRR: 6.7 ms
  • 4K with VRR: 11.6 ms

So input lag on these is a complete non-issue.
 
Rtings redid their input lag tests.

  • 1080p and 1440p @ 120 Hz: 7.4 ms
  • 4K @ 120 Hz: 11.6 ms
  • 1080p with Variable Refresh Rate: 6.4 ms
  • 1440p with VRR: 6.7 ms
  • 4K with VRR: 11.6 ms

So input lag on these is a complete non-issue.
I'm still suspicious that their figures are double what LG claims (which my own tests indicate are probably accurate), and that hardware-scaled 1080p and 1440p gives them lower figures; it doesn't add up. Unless someone has a technical explanation for that? Some processing they forgot to turn off that simply takes less time at lower resolution despite hardware scaling? (Or some necessary background processing that can't be turned off?)
That would explain it actually, and there are a lot of little processing options here and there that are easy to miss.

I don't want to bash Rtings (I appreciate the site), nor claim my own test has perfect accuracy, but it seems off. Especially when their first test was bungled to the point of showing 4-5x higher input lag than it "should", and now 2x higher.
Maybe the third time's the charm? :)

I trust TFT Central most, but it seems they haven't reviewed it.

[Edit]: https://www.flatpanelshd.com/review.php?subaction=showfull&id=1585020920 FlatpanelsHD have reviewed it and measured input lag at 13 ms, equal between 1080p and 4K, though they do not specify whether that's at 60 Hz or 120 Hz. (If it is indeed 60 Hz, as is most common, then LG's claim of 5 ms and my tests line up decently with that too.)
 
Rtings often correct their scores within approximately a year and a half after their initial testing. After that it usually settles to actual values.
 
Anyone testing would have to make sure that they are running 120 fps (115 or 117 fps capped) as a minimum, not an average, in order to get 8.3 ms per frame at 120 fps (or ~8.7 ms at 115 fps) as the upper limit of what the monitor can do. However, a more realistic test would be a 90 to 100 fps average, where people are using higher graphics settings on demanding games and relying on VRR to ride a roller coaster of frame rates.

-------------------------------------------
If someone goes overboard with graphics settings, or has a modest GPU and cranks the graphics up at 4K resolution so that they are getting, say, a 75 fps average, their frame durations would be somewhere in these ranges (arithmetic sketched below):

  • 40 fps -> 25 ms
  • 60 fps -> 16.6 ms
  • 75 fps -> 13.3 ms (the example average)
  • 90 fps -> 11.1 ms
  • 105 fps -> 9.52 ms
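
The arithmetic behind those numbers is just the reciprocal of the frame rate; a minimal sketch:

```python
# Frame duration in milliseconds is simply 1000 / fps, so a VRR "roller
# coaster" of frame rates maps directly onto a roller coaster of frame times.

def frame_duration_ms(fps: float) -> float:
    """Duration of a single frame at the given frame rate."""
    return 1000.0 / fps

for fps in (40, 60, 75, 90, 105, 115, 120):
    print(f"{fps:>3} fps -> {frame_duration_ms(fps):5.2f} ms per frame")
```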

------------------------------------------
https://win.gg/news/4379/explaining-tick-rates-in-fps-games-difference-between-64-and-128-tick
  • CSGO official matchmaking: 64-tick
  • CSGO on FACEIT: 128-tick
  • CSGO on ESEA: 128-tick

Valorant tick rates:
  • Valorant official matchmaking: 128-tick

Call of Duty: Modern Warfare tick rates:
  • COD multiplayer lobbies: 22-tick
  • COD custom games: 12-tick
While that sounds fast, many CSGO players have monitors capable of running at 144Hz. In simple terms, the monitor can show a player 144 updates per second, but Valve's servers only give the computer 64 frames total in that time. This mismatch in the server's information getting to the computer and leaving the server can result in more than a few issues. These can include screen tearing, a feeling like the player is being shot when protected behind cover, and general lag effects.
---------------------------------------------------

You'd think that a tick of 128 would be 7.8 ms and a tick of 64 would be 15.6 ms, but it's not that simple... (see the quotes below)

----------------------------------------------------


http://team-dignitas.net/articles/b...-not-the-reason-why-you-just-missed-that-shot

interpolation. When your game client receives a package from the server, it doesn’t simply show you the updated game world right away. This would result in everyone breakdancing in 128 or 64 tick intervals across the map. Rather, it waits a set interpolation time called “lerp”, whose name probably originated by a network engineer stepping on a frog.

During this time, a set number of further packages arrived on the client’s side containing more updated ticks from the server. Through these ticks, the client is able to interpolate what has happened between these two points in time and display this assumption to the player (don’t get mad yet). Interpolation time is determined by the simple equation


cl_interp = cl_interp_ratio / cl_updaterate
So in our 128 tick server example from above, on otherwise default settings this would mean: You receive a new packet every 7.8 milliseconds (cl_updaterate 128) but the server waits until you received a third packet (cl_interp_ratio 2) before displaying the information, making the interpolation time 15.6 milliseconds for this example. On the other hand, a client running cl_interp_ratio 1 is presented with a renewed state of the game every 7.8 milliseconds – assuming all other hardware and software variables are optimal.
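
A minimal sketch of the arithmetic in that quote, covering only the tick period and interpolation delay (not a model of the full netcode):

```python
# Tick period is 1000 / tickrate ms, and the interpolation delay is
# cl_interp = cl_interp_ratio / cl_updaterate (in seconds), per the quote above.

def tick_period_ms(tickrate: int) -> float:
    """Time between server updates."""
    return 1000.0 / tickrate

def interp_delay_ms(cl_interp_ratio: int, cl_updaterate: int) -> float:
    """cl_interp expressed in milliseconds."""
    return 1000.0 * cl_interp_ratio / cl_updaterate

for tickrate in (64, 128):
    print(f"{tickrate}-tick: {tick_period_ms(tickrate):.1f} ms per tick, "
          f"interp at ratio 2: {interp_delay_ms(2, tickrate):.1f} ms, "
          f"at ratio 1: {interp_delay_ms(1, tickrate):.1f} ms")
```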


Of course, from everything we’ve learned in our long online gaming history we assume that a lower number in front of the ms sign is always preferable. But, you already guessed it, things aren’t so easy this time around as bad connections and lag compensation come into the picture.

Again, the people with unreliable connections are better off to accept higher interp times, as the game client requires a new package of information from the server precisely at the interpolation time to update your game. If the second package is lost, the client waits 250ms on another package before flashing that red warning message in the top right corner of the screen.


For someone who tends to experience any package loss at all, it is safer to set cl_interp_ratio to 2, especially since you regain the "lost" time in the lag compensation.

Lag Compensation


The inevitable conclusion from the preceding segment, and also from the fact that all players on the server have a ping, is that everything you see on your screen has already happened on the server a few milliseconds in the past.


Let’s leave any philosophical and Einsteinian implications of this to the side for the moment to focus on how a playable game is produced from this situation in which you don’t have to pre-aim your crosshair in front of the enemy.


The process responsible for this is lag compensation in which the server accounts for both ping and interpolation timings through the formula:


Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)


Put into English, this means that once you pull the trigger and this information package gets sent to the server, the server then goes back from the current server time (the time the trigger-pull package was received) by your ping plus your interpolation time. Only then is it determined whether the client hit the shot or not.
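
As a worked example of that formula (made-up numbers, not figures from the article), a minimal sketch:

```python
# Command Execution Time = Current Server Time - (Packet Latency + Client View Interpolation)

def command_execution_time_ms(current_server_time_ms: float,
                              packet_latency_ms: float,
                              client_view_interp_ms: float) -> float:
    """Point in the past the server rewinds to before checking whether a shot hit."""
    return current_server_time_ms - (packet_latency_ms + client_view_interp_ms)

# Example: a 30 ms ping and a 15.6 ms interp (128-tick, cl_interp_ratio 2) means the
# shot is evaluated against the world state from 45.6 ms before the packet arrived.
t = command_execution_time_ms(10_000.0, 30.0, 15.6)
print(f"shot evaluated at server time {t:.1f} ms (45.6 ms in the past)")
```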


Combining all of these factors, the tiny ms differences on the LG are really moot, especially if we're arguing latency for online (rather than LAN) gameplay, and without using solid frame rates that never dip below the max Hz of the monitor.
 
Anyone testing would have to make sure that they are running 120fps (115 or 117fps capped)
This is the most important thing when testing VRR, for sure. If they just claim "120 Hz" it's too sloppy a test to trust; that's why I specified exact settings, frame limiter, vsync off, etc. I KNOW from my own test that there is no input lag problem with 117 fps VRR; the Rtings/FlatpanelsHD tests, not so much.
 
Apparently Samsung is striking back hard with something called QD-OLED, which combines OLED tech with their quantum dot tech and may significantly reduce the burn-in issue (I am thinking in terms of non-gaming PC usage). Also, LED/LCD production is apparently being abandoned in favor of this newer tech, and LG is reportedly worried about it.

I would not touch a QD-LED, or QLED or whatever, but Q-OLED? That is a horse of a different color!

If that is the case, we may see significant new options for desktop-based displays, smaller sizes, etc., with OLED black levels and response times.

This looks like a killer display (every LG OLED I have seen is drool-worthy), but I may follow QD-OLED more closely before pulling the trigger on this. Such a display would get me off the couch and back into the office for gaming. I'd like to see maybe a little smaller size too.
 
Apparently Samsung is striking back hard with something called QD-OLED, which combines OLED tech with their quantum dot tech and may significantly reduce the burn-in issue (I am thinking in terms of non-gaming PC usage). Also, LED/LCD production is apparently being abandoned in favor of this newer tech, and LG is reportedly worried about it.

I would not touch a QD-LED, or QLED or whatever, but Q-OLED? That is a horse of a different color!

If that is the case, we may see significant new options for desktop-based displays, smaller sizes, etc., with OLED black levels and response times.

This looks like a killer display (every LG OLED I have seen is drool-worthy), but I may follow QD-OLED more closely before pulling the trigger on this. Such a display would get me off the couch and back into the office for gaming. I'd like to see maybe a little smaller size too.

QD-OLED is years away, if it ever even comes to market. Samsung had prototypes last year, but they were only shown privately behind closed doors because they sucked. They were super dim, not even as bright as LG's first generation of OLED TVs.
 
I'd rather they make a dual-layer LED LCD (also called ULED) like the top reference monitor manufacturers have switched to. ULED/dual-layer LCDs use a second LCD in monochrome as the backlight for near per-pixel lighting, and that second LCD layer also acts as an additional light filter/blocker for deeper blacks. The ones I've seen covered in articles use a 1080p monochrome screen behind a 4K screen, so one monochrome lit pixel for every four on the 4K panel, but that is still super tight and equates to 2,073,600 "backlights" instead of the 512 to 1500 zones of FALD, or the 25,000 zones of mini LED. They were quoted at 3000 nit peak and 0.0003 black depth for the consumer TV model Hisense released in China. I'm not sure about the $25,000+ reference monitors other than that they are listed as 1000 nit peak. They do run hot, have large chassis for active cooling, and probably use a lot of power, so they have room to improve, but they still sound promising, and the consumer models reportedly would be pretty cheap to make.
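
For scale, a quick back-of-the-envelope sketch of those zone counts (my own arithmetic, not figures from any spec sheet):

```python
# A 1080p monochrome backlight layer behind a 4K panel gives one "zone" per
# 2x2 block of front pixels; compare that to typical FALD / mini-LED zone counts.

UHD_W, UHD_H = 3840, 2160
FHD_W, FHD_H = 1920, 1080

uhd_pixels = UHD_W * UHD_H          # 8,294,400 front-panel pixels
dual_layer_zones = FHD_W * FHD_H    # 2,073,600 monochrome backlight "zones"

for zones in (512, 1500, 25_000, dual_layer_zones):
    print(f"{zones:>9,} zones -> ~{uhd_pixels / zones:,.0f} 4K pixels per zone")
```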

These $1500 (potentially $1000 to $1300 in November) OLED CX, C9, E9 TVs are still the solid choice for HDMI 2.1 120 Hz 4K HDR VRR displays by the end of this year. There really aren't many other HDMI 2.1 displays at all yet, and these have a good partnership with NVIDIA for VRR. One of those LG OLEDs will be my choice for the end of 2020, so I'll be running it through 2021-2022+. At that point I'll start deciding whether to spring for a more expensive VR/AR kit, if they have advanced enough, rather than dropping money on both types of displays or going cheap on one or the other.
 
I don't know if LCD has anything left to give anymore, even dual-layered or with some black-and-white OLED backlight variant; it still has slow-as-hell response times unless they figure out how to combine G-Sync or similar with strobing.
 
I don't know if LCD has anything left to give anymore, even dual-layered or with some black-and-white OLED backlight variant; it still has slow-as-hell response times unless they figure out how to combine G-Sync or similar with strobing.
Dual-layer LCD isn't ideal for gamers anyway. It's too reliant on local dimming, which causes high input lag.
 
I don't know if LCD has anything left to give anymore, even dual-layered or with some black-and-white OLED backlight variant; it still has slow-as-hell response times unless they figure out how to combine G-Sync or similar with strobing.

Yup, while having a dual layer would help give LCDs a good contrast ratio, sort of like having millions of dimming zones would, it does absolutely nothing for response times.
 
Yes, it would still have trade-offs, but it would also have no burn-in, so it would be capable of very high HDR color brightness at near pixel level combined with inky blacks: no more dim/glow halos from low-density FALD displays, which limit their actual color brightness and lose detail, and none of OLED's hard ABL drops, sub-HDR1000 color caps on narrow percentage windows, white subpixels, heat concerns, or burn-in concerns (burn-in risk mitigation being the reason for the lower brightness cutoffs and ABL in the first place). I'm really looking forward to my OLED by the end of the year, but if a well-performing dual-layer LCD came along in a few years it could make a good stopgap until something better comes along later, like actual LED emitters, which could be a long wait (even longer for expensive-yet-consumer-priced versions). I'd have to see it reviewed, of course.

---------------------------------------
The Q90R according to Rtings: 3.7 ms at 80%, 9.6 ms at 100%...
"The Q90 has an outstanding response time. There's only a very small blur trail behind fast-moving objects, but there's significant overshoot in the 0-20% transition, causing some artifacts in dark scenes."

The Q9FN according to Rtings: 3.5 ms at 80%, 15.5 ms at 100%...
"The Samsung Q9FN has an excellent response time. The blur in the photo is due to persistence, there is almost no motion trail." <-- no mention of overshoot on the most difficult transitions here, but I'm sure there is some

Where they suffer is in the worst 20% of transitions, which can be as high as 22-30 ms response time, meaning some black trailing on some grey-to-black transitions as a trade-off. That trailing is mainly visible if you are running 100 fps or higher; otherwise the sample-and-hold blur/smearing would be just as bad, more or less masking it.

Both of those displays have about 6 ms, 11 ms, or 14 ms input lag in game mode with VRR on, depending on the test, as they weren't both tested at 4K VRR. I'm assuming that since it was HDR gaming, local dimming was active; in fact I think you have to go into a special menu to turn it off on some models.

So not perfect by any means, but a dual-layer LCD would gain a lot of the OLED per-pixel SBS contrast and ultra-black depth goodness, along with very bright HDR color capability and detail-in-color that a FALD can't match with its blooming and dimming "halos". It also avoids restrictive "simmer" levels of color brightness and harsh ABL kicking on and off. That gains in several facets over both types of current HDR displays. I'd welcome the option, especially for my living room TV, but I'd also consider it on my PC, depending on whether one were available in a few years. By then I'll potentially be looking at spending more on a better VR headset than on a PC gaming pancake monitor though, depending how things go. :)
 
Can anyone confirm if HDR in Windows works at 4K 120 Hz 8-bit YCbCr420 on the CX?

You do not need 10-bit for HDR as Windows performs dithering at 8-bit. HDR will work perfectly at 4K 60 Hz 8-bit RGB without any banding. Many HDR displays are just 8-bit + FRC panels anyway.
While 8-bit HDR works in RGB and YCbCr444, I don't know if it works in YCbCr420 at 120 Hz. If it doesn't work on the CX, it still doesn't tell us if it's a Windows or LG limitation.
 
No, HDMI 2.0 does not have the bandwidth for 10-bit RGB at 4K. He is most likely running YCbCr 4:2:0 to get 120 Hz. You can only get a max of 4:2:2 at 60 Hz and 10-bit with HDMI 2.0.
You don't need 10-bit for HDR on PC. See above.
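
A rough bandwidth sketch to back that up (a simplified sanity check that counts active video only and ignores blanking overhead, so treat borderline cases loosely):

```python
# Approximate uncompressed video data rate vs. the usable data rates of
# HDMI 2.0 (18 Gbit/s raw, ~14.4 usable after 8b/10b) and DisplayPort 1.4
# HBR3 (32.4 Gbit/s raw, ~25.92 usable). Blanking is ignored here, so real
# requirements are somewhat higher than these numbers.

def data_rate_gbps(width, height, hz, bits_per_channel, chroma="444"):
    """Active-video data rate in Gbit/s for RGB/4:4:4, 4:2:2 or 4:2:0."""
    channels = {"444": 3.0, "422": 2.0, "420": 1.5}[chroma]
    return width * height * hz * bits_per_channel * channels / 1e9

HDMI20_MAX, DP14_MAX = 14.4, 25.92

cases = [
    ("4K60  10-bit RGB",   (3840, 2160, 60, 10, "444")),
    ("4K60  10-bit 4:2:2", (3840, 2160, 60, 10, "422")),
    ("4K120  8-bit RGB",   (3840, 2160, 120, 8, "444")),
    ("4K120 10-bit RGB",   (3840, 2160, 120, 10, "444")),
    ("4K120  8-bit 4:2:0", (3840, 2160, 120, 8, "420")),
]
for name, args in cases:
    rate = data_rate_gbps(*args)
    print(f"{name}: ~{rate:5.1f} Gbit/s | "
          f"HDMI 2.0 {'fits' if rate <= HDMI20_MAX else 'no'} | "
          f"DP 1.4 {'fits' if rate <= DP14_MAX else 'no'}")
```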
 
Need is relative. 8-bit, even dithered, on a native 10-bit panel is going to be a non-native color resolution, which means some banding. Native uncompressed is best in all video and audio sources, even if there are "okay enough to use" and "you won't notice it that often" signals available. You will have banding. Dithering adds noise to soften it a bit (at the cost of some clarity/detail), and increased viewing distance can help, but it's still there. If we don't have a GPU that can do it yet, sure, it's the best you can get for now and is usable.
 
Need is relative. 8-bit, even dithered, on a native 10-bit panel is going to be a non-native color resolution, which means some banding. Native uncompressed is best in all video and audio sources, even if there are "okay enough to use" and "you won't notice it that often" signals available. You will have banding. Dithering adds noise to soften it a bit (at the cost of some clarity/detail), and increased viewing distance can help, but it's still there.
I've tested RGB 8-bit with dithering in madVR, as well as 10-bit and 12-bit, on my OLED with a 16-bit lossless gradient test pattern. There is zero visible difference between them. If you switch off dithering at 8-bit, you can see the banding immediately. With dithering, it's visually identical to even 12-bit. I looked hard for dithering noise and could not see it even in the lossless static test pattern. In a noisy compressed 4:2:0 video, it's going to be completely invisible.

The dirty screen effect of the OLED panel was far worse than any noise or banding that might have been there.
 
It's so funny how we've all been using 6-bit+FRC and 8-bit-only LCD panels for years and years, but now suddenly 8-bit+FRC isn't good enough :LOL:
 
It's so funny how we've all been using 8 bit LCD panels for years and years but now suddenly 8 bit isn't good enough :LOL:
A clean 8-bit signal is not good enough for HDR. That will result in visible banding. You need something that's effectively 10-bit, either natively or with dithering.
 
A clean 8-bit signal is not good enough for HDR. That will result in visible banding. You need something that's effectively 10-bit, either natively or with dithering.

Ya, just edited my post. I have an Acer X27, which isn't 10-bit native but instead 8-bit+FRC, and it looks plenty fine to me.
 
Ya, just edited my post. I have an Acer X27, which isn't 10-bit native but instead 8-bit+FRC, and it looks plenty fine to me.
To use 8-bit + FRC you need to send a 10-bit signal to the display, which exceeds both HDMI 2.0 & DisplayPort 1.4 bandwidth for 60 Hz & 120 Hz 10-bit RGB respectively. What I'm talking about is sending a dithered 8-bit RGB signal to the display, so the panel runs in native 8-bit mode without FRC.

I have the Predator X27 as well and I run it at 8-bit RGB 120 Hz with dithering. There is absolutely no point in running it at 10-bit YCbCr422 since it's an 8-bit + FRC panel anyway.
 
To use 8-bit + FRC you need to send a 10-bit signal to the display, which exceeds both HDMI 2.0 & DisplayPort 1.4 bandwidth for 60 Hz & 120 Hz 10-bit RGB respectively. What I'm talking about is sending a dithered 8-bit RGB signal to the display, so the panel runs in native 8-bit mode without FRC.

I have the Predator X27 as well and I run it at 8-bit RGB 120 Hz with dithering. There is absolutely no point in running it at 10-bit YCbCr422 since it's an 8-bit + FRC panel anyway.

Is dithering something you can enable from NVCP or does it require another app? And does it only benefit HDR content? I don't really do HDR gaming on my X27 because the FALD blooming drives me nuts.
 
Is dithering something you can enable from NVCP or does it require another app? And does it only benefit HDR content? I don't really do HDR gaming on my X27 because the FALD blooming drives me nuts.
You must leave HDR permanently enabled in Windows to use 8-bit RGB with dithering. You can set the "SDR content appearance" slider to 30 to get ~200 nits for SDR content. You must disable "Dynamic Tone Mapping" on TVs for the desktop to look correct.

This will benefit any application that uses the new DXGI formats for HDR and WCG. You can watch YouTube in HDR and view wide gamut images in Edge / Chrome instead of being limited to sRGB. Most UWP apps also support HDR and WCG.
 
https://www.benq.eu/en-eu/knowledge...oes-monitor-panel-bit-color-depth-matter.html
A little exaggerated, and obviously marketing new monitors... but still...
[image from the BenQ article above]



..
Unaltered/native/uncompressed source material is always more desirable. The nearer you can get to it, the better, imo.

https://en.wikipedia.org/wiki/Frame_rate_control
FRC is a form of temporal dithering which cycles between different color shades with each new frame to simulate an intermediate shade. This can create a potentially noticeable 30 Hz flicker. FRC tends to be most noticeable in darker tones, while dithering appears to make the individual pixels of the LCD visible.[1] Modern TFT panels (as of 2020) often use FRC to display 30-bit deep color or HDR10 with 24-bit color panels.

This method is similar in principle to field-sequential color system by CBS and other sequential color methods such as used in Digital Light Processing (DLP).


[Wikipedia animation: green-cyan and cyan alternating to simulate an intermediate shade]

The demonstration mixes cyan and green-cyan statically (top) and by rapidly alternating the colors (bottom). The size of the thumbnail has been reduced to reduce the flicker people may experience in their peripheral vision while reading the article. In a monitor that uses FRC the alternating colors would be more similar, reducing the flicker effect.



https://www.eizo.com/news/2020/03/04/eizo-releases-worlds-first-true-hdr-reference/

ColorEdge PROMINENCE are the first HDR reference monitors to overcome the severe drawbacks of other HDR technologies available to the market – ABL (Auto Brightness Limiter) and local dimming. They achieve a true HDR visual experience without the limitations of these technologies to ensure users always see accurate colors and brightness in every pixel.

The ColorEdge PROMINENCE CG3146 supports HLG (hybrid log-gamma) and the PQ (perceptual quantization) curve for displaying and editing broadcast, film, and other video content in HDR. The optimized gamma curves render images to appear truer to how the human eye perceives the real world compared to SDR (standard dynamic range).

The color and brightness of an LCD monitor can shift due to changes in ambient temperature and the temperature of the monitor itself. The ColorEdge PROMINENCE CG3146 is equipped with a temperature sensor for accurately measuring the temperature inside the monitor, as well as estimating the temperature of the surrounding environment. With this temperature sensing and estimation technology, the monitor adjusts in real-time, so gradations, color, brightness, and other characteristics continue to be displayed accurately. Furthermore, EIZO uses AI (artificial intelligence) in the estimation algorithm of the monitor so it can distinguish between various temperature changing patterns to calculate even more accurate correction. EIZO’s patented digital uniformity equalizer (DUE) technology also counterbalances the influences that a fluctuating temperature may have on color temperature and brightness for stable image display across the screen.

Additional Features


  • Single-Link 12G/6G/3G/HD-SDI and Dual- or Quad-Link 3G/HD-SDI
  • VPID support for SDI connections
  • HDMI and DisplayPort inputs
  • 99% reproduction of DCI-P3
  • 3D LUT for individual color adjustment on an RGB cubic table
  • 10-bit simultaneous display from a 24-bit LUT for smooth color gradations
  • Quick adjustment of monitor settings via front bezel dial
  • Light-shielding hood included
  • 5-year manufacturer’s warranty
 
But I didn't say anything about HDR in the post you quoted.
It doesn't actually matter if HDR is enabled or not. Enabling dithering with 8-bit and retaining full colour resolution is still better than running 10-bit YCbCr422. With high quality dithering on 8-bit, you can get less banding than even native 10-bit on some panels. Without HDR you'll have to configure the NVIDIA / AMD driver to perform dithering, while with HDR, Windows enables it automatically.
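
A small illustrative sketch (not anyone's actual pipeline) of why dithered 8-bit quantization hides banding where plain rounding does not: the error becomes fine noise instead of wide visible steps.

```python
import numpy as np

rng = np.random.default_rng(0)

# A smooth gradient spanning only a couple of 8-bit steps: worst case for banding.
gradient = np.linspace(100.0, 102.0, 1920)                  # fractional values in 8-bit units

truncated = np.floor(gradient)                              # plain 8-bit rounding: hard bands
dithered = np.floor(gradient + rng.random(gradient.size))   # add noise before rounding

def longest_run(arr):
    """Length of the longest run of identical consecutive values."""
    change = np.flatnonzero(np.diff(arr) != 0)
    edges = np.concatenate(([-1], change, [arr.size - 1]))
    return int(np.diff(edges).max())

# Same three output levels either way, but plain rounding shows them as
# ~960-pixel-wide bands, while the dithered version interleaves them as noise
# the eye averages back into a smooth ramp.
print("longest flat band, truncated:", longest_run(truncated), "pixels")
print("longest flat band, dithered: ", longest_run(dithered), "pixels")
```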
 
I did not even try to tweak it, but the default settings of the game picture mode looked really bad.
I tested input lag with the 960 fps mode on my S9+, and had previously done the same for the PG27UQ, and within reason they have the same input lag.

From finger impact on the mouse to CS:GO movement onscreen:
  • Full rendering chain, CS:GO at 450 fps, 120 Hz, PG27UQ: ~20 ms
  • Full rendering chain, CS:GO at a 117 fps cap (NVIDIA panel), LG CX, standard picture mode, instant game response on: ~25 ms
  • LTT baseline: 240 Hz TN, 15 ms (with TN being ~2 ms faster than the IPS in the PG27UQ, and 240 Hz averaging ~2 ms faster than 120 Hz, the difference between these three tests, adjusted for input-lag-increasing settings, puts all three equal within the margin of error). I posted this in more detail a couple of pages back.

The reason I tested with a 117 fps cap was that some review claimed turning VRR on increased input lag, which LG says it shouldn't, and it does indeed not seem to.
The other test, running 450 fps uncapped, lowers the input lag by a few ms, so 20 ms uncapped vs. 25 ms capped should be about equal, or at most something like 2 ms slower than the PG27UQ.
Since the results in standard mode were so good, I decided to be lazy and not even bother tweaking game mode, but I'll get around to it sometime soon.

I looked a bit at game mode now, and it looks like it can be tweaked to look just like Standard, for example, except that it lacks one option: color temperature.
Copying the values from "Standard", the mode that looks most accurate out of the box, makes game mode look fine. It just has pretty bad default settings.

Ok good, thanks. I couldn't believe that LG locked the color gamut to only "Extended" in the Game picture mode on the C9. Some numbnuts over there made a mistake. I've got my 48CX pre-order in through B&H.
 
Can anyone confirm if HDR in Windows works at 4K 120 Hz 8-bit YCbCr420 on the CX?

You do not need 10-bit for HDR as Windows performs dithering at 8-bit. HDR will work perfectly at 4K 60 Hz 8-bit RGB without any banding. Many HDR displays are just 8-bit + FRC panels anyway.
While 8-bit HDR works in RGB and YCbCr444, I don't know if it works in YCbCr420 at 120 Hz. If it doesn't work on the CX, it still doesn't tell us if it's a Windows or LG limitation.
I have not been able to make HDR work at 120 Hz 4K 4:2:0. Not on the desktop, not inside Destiny 2.
 
It doesn't actually matter if HDR is enabled or not. Enabling dithering with 8-bit and retaining full colour resolution is still better than running 10-bit YCbCr422. With high quality dithering on 8-bit, you can get less banding than even native 10-bit on some panels. Without HDR you'll have to configure the NVIDIA / AMD driver to perform dithering, while with HDR, Windows enables it automatically.
Right, but that still has nothing to do with what I said in the post you quoted.
 
I don't know if LCD has anything left to give anymore, even dual-layered or with some black-and-white OLED backlight variant; it still has slow-as-hell response times unless they figure out how to combine G-Sync or similar with strobing.

I took my CX back at the last moment. I mean, after reading this thread, and reading nothing but concern and disappointment, it scared me enough to just rock out my Samsung NU8000 for another year or two.

This LCD looks great.
 
Is there a way to tell the exact video signal the TV is receiving at any given moment in webOS?
 
I took my CX back at the last moment. I mean, after reading this thread, and reading nothing but concern and disappointment, it scared me enough to just rock out my Samsung NU8000 for another year or two.

This LCD looks great.
People are just nitpicking possible pre-release flaws. I've used it for about a week now and I'm very happy.
Without a doubt it's in a league of its own above the PG27UQ/Acer X27/similar input-lag-free FALD displays with G-Sync. No other LCD can compare.
It's not perfect in every regard, but I have no doubt that the LG CX is the best overall gaming display ever made.
 
People are just nitpicking possible pre-release flaws. I've used it for about a week now and I'm very happy.
Without a doubt it's in a league of its own above the PG27UQ/Acer X27/similar input-lag-free FALD displays with G-Sync. No other LCD can compare.
It's not perfect in every regard, but I have no doubt that the LG CX is the best overall gaming display ever made.

Lol, even if those flaws weren't there, people will always find something to nitpick about the LG. No matter what, it's not going to be the perfect end-all-be-all endgame of displays. It's just another stopgap until we hit another major leap, and right now there are simply no better alternatives to what the LG offers. I can't wait to get mine as soon as they come in stock.
 
Lol, even if those flaws weren't there, people will always find something to nitpick about the LG. No matter what, it's not going to be the perfect end-all-be-all endgame of displays.
Relatively speaking, it's downright cheap. Perhaps Alienware will do a release at some point and extract the full gaming potential of the panel!

[and hopefully will ship a tub of lube to ease the burn from their likely MSRP...]
 