LG 48CX

Everyone buying the C9, E9, B9, or CX will have HDR capability: 10-bit for movies especially, since those aren't 120Hz or 4:4:4 chroma anyway (though 24fps film multiplies cleanly by 5 to 120Hz, so that interpolation can be useful). We will also get HDR in games, even at 4k 120Hz 4:4:4 10-bit, even on the CX, assuming Nvidia allows 10-bit over HDMI at least on its 3000 series (and assuming that series gets HDMI 2.1). The next generation of HDMI 2.1 consoles will have HDR too.

The only reason we are arguing about the visual leap in quality that HDR brings is because a few people started calling HDR a gimmick. For some reason, people are trying to ignore the huge visual benefit a good HDR display has with quality HDR material, both available now and upcoming, over monitors that lack real HDR color volume. So if anything, it was a sub-thread within this thread claiming HDR has no merit and calling it a gimmick that received a flood of detailed testing examples, quotes from experts, visual examples (including still color-temperature maps and videos of popular movies using HDR color mapping), and user experiences, including from owners in this thread.

A lot of people have said HDR is an even bigger visual benefit than 4k. To me it's just another one of those growing pains. Certain people dug their heels in and tried to ignore the benefits of 16:9 over 4:3, 1440p and later 4k over 1080p, 120Hz over 60Hz, and G-Sync/VRR over no variable refresh rate... also DVD over SD, 1080p over DVD, and 4k video content and displays... not needing a computer in the house was another one, not needing a separate internet service line, not needing a cellphone, etc., etc. I'm sure there will be some people that resist the move to AR in the future too. The world will move on without you. :LOL:
 
Funny examples, given that 16:9 isn't really better than 4:3. Every aspect ratio is a tradeoff: you're losing either horizontal or vertical visibility.

Really, the best aspect ratio is probably 1:1 just because it's the best middle ground and would work well for different types of content.

 
Most games use HOR+, and CGI suites typically use virtual cameras the same way, emulating real lenses. A wider-aspect lens will always show more (HOR+) game world or film a larger scene. Once resolutions (and desktop real estate) are high enough it really just becomes a game of leapfrog, since you can run ultrawide on a 16:9 screen for example, or just resize your app window's length-to-width ratio to taste if you want it more like a piece of paper or a newspaper.
 
For a one-eyed person, a square is probably close to ideal.
For people with two normally spaced eyes, however, 16:9 is close to the physical ratio of our field of vision.

I tested my field of view a few years ago by standing fairly close to a wall and moving my finger up, down, and to the sides until I couldn't see it while looking straight ahead, marking each spot (I had that blue sticky stuff you mount posters with and stuck a bit on the wall wherever my finger faded from vision). The ratio of the rectangle that created for me was, I shit you not, 16:9 plus decimal change. You can test your own ratio this way in a few seconds.
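
If you want to turn those wall marks into an actual number, it's just width over height; a quick sketch (the measurements in it are placeholders, not mine):

```python
# Quick sketch: turn the wall-mark test described above into a ratio.
# The measurements below are placeholders - plug in your own.

def aspect_ratio(width_cm: float, height_cm: float) -> float:
    """Width-to-height ratio of the visible rectangle (1.78 ~ 16:9)."""
    return width_cm / height_cm

measured = aspect_ratio(160.0, 90.0)   # e.g. marks 160 cm apart wide, 90 cm tall
print(f"measured ratio: {measured:.2f}:1")
print(f"16:9 reference: {16 / 9:.2f}:1")
print(f"21:9 reference: {21 / 9:.2f}:1")
```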

However, despite my own physical results, I prefer wider aspect ratios. I suspect this has to do with us evolving on a planet where up and down wasn't as important (no huge eagles or monkeys hunting us as we evolved), while almost everything important happened in the horizontal plane along the ground, so the brain probably evolved to process the horizontal plane better than the vertical.
 
There is no relationship between a human's field of vision and the ideal aspect ratio for a computer screen. That's a fallacy.

We're literally looking at a website right now that is basically newspaper/portrait aspect ratio.
 
Obviously there is an ideal aspect ratio as described above, though specifically for viewing video, playing games, or other fullscreen content.
For reading/productivity it's different for sure, and probably just comes down to preference in how you size your windows. It's mostly about having as massive an area as possible and not running out of space for however many custom-sized windows you have in use.
 
That's why I'm saying that 1:1 is a good aspect ratio for general computer use. Of course if you're going to optimize for one specific type of content there's going to be a best option.
 
Sure, I could very well see a 1:1 panel of 100" or so being ideal, and just running 21:9 or whatever you prefer with black bars or spare space above/under for games etc. It's just prohibitively expensive to make at the moment.
I'm going to try games at 3840x1600 windowed on this one when it arrives and see what can fit above/under in terms of video chat etc., same principle.
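
For reference, the spare space is easy to work out ahead of time; here's a rough sketch (the window sizes are just examples) of how much room different ultrawide-style windows leave on a 3840x2160 panel:

```python
# Rough sketch: spare desktop space left when running an ultrawide-style
# window on a 3840x2160 (16:9) panel. Window sizes are just examples.

PANEL_W, PANEL_H = 3840, 2160

def spare_space(window_w: int, window_h: int) -> tuple[int, int]:
    """Return (unused height, unused width) in pixels."""
    return PANEL_H - window_h, PANEL_W - window_w

for w, h in [(3840, 1600), (3840, 1080), (2560, 1080)]:
    free_h, free_w = spare_space(w, h)
    print(f"{w}x{h}: {free_h}px free vertically "
          f"({free_h // 2}px per bar if centered), {free_w}px free horizontally")
```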
 
For OLED I would recommend using 16:9 as much as possible to avoid burn-in.
Watching movies with black bars from time to time is fine, but playing games with black bars sounds like a bad idea...

Doom, Quake, and Tomb Raider with polygon smoothing were all dependent on Glide support at first though, and on OpenGL, which also required a pass-through add-on GPU to do it properly. But mainly, games needed to be patched to get polygon smoothing, so it was content-dependent in that respect.
What is "polygon smoothing"?
You mean bilinear texture filtering?
I hate this feature in early 3D-accelerated games and, where possible, disable it to get the unfiltered look. It's too bad 3dfx didn't allow disabling bilinear filtering system-wide; I consider that the biggest omission in their drivers.
 
Why do I get the feeling everyone in this thread is still waiting for their 48" CXs? :D

Seems like there are delays all over now; in the EU the date has been pushed back several times, from mid-May to now something like the end of June.
 
Just a few days ago my local retailer said 3.6.2020 and now it's been pushed back to 30.6.
 
It seems us Dutchies were the first ;)

I've been following the debate here on HDR, black-level problems, etc. For me the step up from a CRG9 to this is a massive difference.

Been playing quite a bit of Hell Let Loose lately, and the size and the dimming are just so much better. HDR is also significantly better on the OLED vs the CRG9, even though the CRG9's peak brightness was way higher; its 10 dimming zones killed it in darker scenes.
 
I have a CRG9 on the desktop and a 65" C9 in the living room, and I can agree with that. The LG is just a much better gaming display if you exclude support for custom resolutions. My C9 has refused to run at 3840x1600 fullscreen in some games, just showing a blank screen; borderless or windowed usually works. The CRG9 seems to handle custom resolutions more effortlessly.

How are you finding the difference in size, aspect ratio and resolution?

I feel the CRG9 is decently sharp at the roughly 0.5-0.8m viewing distance I use, but I would want to push a 48" further back to mitigate the size and the larger pixels (lower PPI). I am considering ordering one, but I really do enjoy working on my CRG9 with its PbP mode, and the unique aspect ratio is pretty cool in games too. I tried to play Yakuza Kiwami 2 at 3840x1080 on my C9 the other day and was just not feeling it, because I need to sit so much farther away from the TV since it's not curved etc. 3840x1600 seems to be the ideal ultrawide choice for these.
 
My CX is 20cm further back than the CRG9 I had. The pixel density on the CX is also lower than the CRG9's, but at this viewing distance I experience no difference between the two.

Custom resolutions also work on the CX. I tried 3840x1600 for Hell Let Loose at epic settings. It's limited to 60Hz, but you won't hit that reliably with a 2080 Ti anyway. So I switched back to 4k@120Hz with medium/high settings to get a constant 90-100 FPS, which is better :).

I like the height. It might depend on the game. Obviously for racing games UWHD is better, but for shooters such as HLL with many height differences (shooting from mountains) I think 16:9 is better.

I've noticed I don't really mind the lack of curve on the CX vs the CRG9. That's also because the viewing angles and color reproduction on the OLED are better than on the VA panel.
 
Anyone that says HDR is a gimmick needs to watch Gemini Man and the dark church fight scene. That immediately made a big impression on me. When we get games with UE5 and high-quality cinematic assets with HDR, people are gonna be blown away. Actually, games will look much more impressive than even the best reference movies, simply because of the higher fidelity.

I'm really looking forward to seeing the 48" CX and how the smaller pixels affect image quality. There is just something very uncanny about sitting close to a 65" C9; you need to sit further back. Maybe it's just that my sharp vision is better than 20/20 and I see things others don't notice. Sub-1080p content looks really, really bad on a 4k 65", that's for sure. With PC monitors I think 1080p gets uncanny over 21.5", even when you adjust viewing distance. It's kind of strange, and may just be the subpixel structure tricking me, but 24" 1080p looks bad. I'm curious whether this phenomenon exists with TVs from a few metres back. 4k at 65" certainly looks soft when you sit back in a recliner a few metres away. Don't trust the bullshit figures out there about viewing distance and sharpness.
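
If you'd rather work the numbers yourself than trust the published charts, the geometry is simple to estimate; a rough sketch (the sizes, distances, and the ~60 PPD rule of thumb are assumptions for illustration):

```python
# Rough sketch: pixel density (PPI) and angular resolution (pixels per degree)
# for a flat 16:9 panel at a given viewing distance. Sizes/distances are examples;
# treat ~60 PPD as a rule-of-thumb sharpness threshold, not an official figure.
import math

def panel_ppi(diagonal_in: float, res_w: int, res_h: int) -> float:
    return math.hypot(res_w, res_h) / diagonal_in

def pixels_per_degree(diagonal_in: float, res_w: int, res_h: int,
                      distance_m: float) -> float:
    width_in = diagonal_in * res_w / math.hypot(res_w, res_h)
    distance_in = distance_m / 0.0254
    h_fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return res_w / h_fov_deg

for size, res_w, res_h, dist in [(48, 3840, 2160, 1.0),
                                 (65, 3840, 2160, 2.5),
                                 (24, 1920, 1080, 0.6)]:
    print(f'{size}" {res_w}x{res_h} at {dist}m: '
          f"{panel_ppi(size, res_w, res_h):.0f} PPI, "
          f"{pixels_per_degree(size, res_w, res_h, dist):.0f} PPD")
```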
 
I got bored waiting, so I went ahead and bought the CX 55". No regrets.
 
I got bored waiting, so I went ahead and bought the CX 55". No regrets.

Same here, but I got the 55" GX instead. If the 48" ever becomes available I might "downgrade", as I use it mainly as a desktop monitor. The only real downside is that I would like more brightness than OLEDs can offer, with ABL kicking in from time to time (bright room, and I have a soft spot for that "pop" :) ). I actually have a Samsung Q95T on order as well, but honestly I don't really feel it can match the CX/GX. For the living room and similar it's OLED all the way, though.
 
The problem with LCD TVs is that their image quality relies on shitloads of processing to make up for the technical limitations of LCD. If you enable game mode to get good latency, the image quality is crap by comparison because they can't do any of their tricks to boost contrast.
OLED doesn't need any processing to look good. The only reason to use processing with OLED is to make imperfect sources look better, and games and general PC use don't need that.
 
This is a double-edged sword.

I own both a Samsung Q90R and an LG CX. Both are very much flagship TVs, but both go about processing the picture very differently. The Q90R does an amazing job at cleaning up low bitrate sources. The upscaler is top notch. Unfortunately, this adds latency, so it is most definitely not for gaming. Combine this with eye-watering brightness, and you get an image that is absolutely brilliant in its presentation in 95% of situations.

The LG CX produces a superbly clean image, but because of the way OLED works, lower bitrate content looks subpar. This isn't because the CX does a bad job. It's just that the image on the CX is so clean that it exposes the imperfections of lower bitrate content (macro-blocking being the most serious offender), something that is smoothed over very well on the Q90R.

In gaming, both sets are superb, but I'd give the edge to the CX simply because of gsync and a slightly cleaner image in motion. As a computer monitor, the CX wins hands down due to contrast afforded by OLED.
 
The problem with LCD TVs is that their image quality relies on shitloads of processing to make up for the technical limitations of LCD. If you enable game mode to get good latency, the image quality is crap by comparison because they can't do any of their tricks to boost contrast.
OLED doesn't need any processing to look good. The only reason to use processing with OLED is to make imperfect sources look better, and games and general PC use don't need that.

My thoughts exactly; there seems to be a lot of compensating for X, Y, and Z with LCD (sometimes marketed as QLED). OLED is just more effortless.
 
In gaming, both sets are superb, but I'd give the edge to the CX simply because of gsync and a slightly cleaner image in motion. As a computer monitor, the CX wins hands down due to contrast afforded by OLED.

I assume that is gaming without the game mode that seems to be somewhat broken on the Q90R (and the Q90T)? The only thing I am really missing with my GX is a bit more brightness / less ABL. Burn-in is still an unknown for the 2020 models, as no one has had one long enough to get it, even if I assume it is still a potential problem.

For anyone interested in gaming on the competition (Q90T)... what a shitshow (the TV, that is; the video is good and KG is an AVS member who seems to know his stuff).

 
RTINGS has just retested input lag on the CX and... lo and behold, the input lag is NOT 23ms lmfao.

[attached: screenshot of RTINGS' updated input lag results]

Pretty sure that's tested at 4k 60Hz, and 4k 120Hz with VRR will probably be in the 6ms range as it's supposed to be ;)

On a side note, they have not redone the LFC testing, as that still shows no VRR below 40Hz, but madpistol has already proven that LFC does indeed work below the 40Hz threshold.
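
For anyone unfamiliar with LFC, the idea is just frame repetition to stay inside the panel's VRR window; here's a simplified conceptual sketch (the 40-120Hz window is taken from the discussion here, not LG's actual firmware logic):

```python
# Simplified sketch of the LFC idea: if the frame rate falls below the panel's
# minimum VRR rate, repeat each frame enough times to land back inside the window.
import math

VRR_MIN, VRR_MAX = 40, 120  # Hz, the window discussed in this thread

def lfc_refresh(fps: float) -> tuple[int, float]:
    """Return (times each frame is shown, effective panel refresh in Hz)."""
    if fps >= VRR_MIN:
        return 1, min(fps, VRR_MAX)        # native VRR range, no repetition
    repeats = math.ceil(VRR_MIN / fps)     # smallest multiplier back into range
    return repeats, fps * repeats

for fps in (25, 35, 45, 90):
    repeats, hz = lfc_refresh(fps)
    print(f"{fps} fps -> each frame shown {repeats}x, panel refreshes at {hz:.0f} Hz")
```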
 
The difference in motion clarity between a Q90R and CX is not what I consider slight. It's massive.

The Q90R takes almost 30ms to do a 0-20% grey transition and does so with tons of overshoot. It's a total smearfest, and on top of that its 120Hz PWM causes duplications in motion.

I don't think people understand how huge the pixel response advantage of OLED is over LCD. It's clearly visible even between my X27 and C9 when I had both side by side.

Here is the Q90R vs C9.



My first question on this is what picture mode is being used on the Q90R? Both the Q90R and C9/CX have gaming modes which turn off a lot of picture processing and dramatically reduce input latency. Also, if that's the game that was used to test gaming (RDR2), that's not a very good example; on the Xbox One X, that's a 4k/30 game.

My main test is something like Forza Motorsport 7. I agree that the C9/CX does better, but the detail is fantastic on the Q90R in motion... as long as game mode is enabled. If you turn game mode off, it becomes unplayable due to input latency.
 
That visual difference you are seeing has nothing to do with input latency; it comes from the huge difference in pixel response times between an OLED and a VA panel. The much slower response times of the VA panel are what cause that extra blur compared to the OLED.
 
Maybe it just doesn't bother me that much. Again, there's no doubt the image is clearer on the OLED in motion (100% sure this is due to pixel response), but I have no problems gaming on the Q90R. I would even argue that slower adventure/RPG games (like FF7 Remake) are better on the Q90R, simply because the Q90R does better with extreme brights.

For fast-motion games, there's no doubt that the C9/CX is better, but the Q90R is not bad. The specs don't tell the full story. They both play games extremely well.
 
Yeah, slower games are definitely fine on the Q90R. Where VA really starts to show its response time weaknesses is when you go higher and higher in frame rate with fast games. Playing fast games at 120Hz, the difference between the two will be far more pronounced than when playing slow 30fps games, where the VA panel has no trouble keeping up.
 
Quite an update from RTINGS as mentioned. Looks like they rushed the original review. So 120 Hz BFI works fine in "Game" picture mode. MPRT looks to be around 3.5ms on their oscilloscope, so quite good motion clarity. (About 2.3 times clearer motion than sample-and-hold 120 Hz mode).
 
June can't come fast enough :(
Fuck June. June for me means running without G-Sync due to my GTX 1080. I want my 3080 Ti! Of course, I am trying to suppress the thought of what I am going to do if AMD releases first and it significantly beats the 2080 Ti.
 
If we go based off AMD's track record, they won't be significantly beating a 2080 Ti; they'll be lucky to even match it. But hey, maybe this time RDNA2 can finally break the trend and offer up serious competition.
 
Quite an update from RTINGS as mentioned. Looks like they rushed the original review. So 120 Hz BFI works fine in "Game" picture mode. MPRT looks to be around 3.5ms on their oscilloscope, so quite good motion clarity. (About 2.3 times clearer motion than sample-and-hold 120 Hz mode).


3.5ms is a little under the 4ms figure here. It's definitely a lot tighter, but BFI has several trade-offs. BFI on its max setting has been described as unusable on the CX, and the UFO in the attached picture should also show up 35-50% dimmer than the rest of the UFOs with BFI :cool:

[attached: UFO Test pursuit photo]

The PWM-like effect on your eyes even if "you can't see it", the dulling of the screen's color vibrancy, no HDR, visible flicker on max BFI, and potentially not working right with VRR rule out BFI for me, other than as a tech-demo mode for easy-to-render games with massive frame rates... but I'm glad people have options.
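
To put numbers on the motion-clarity claim in the quote at the top of this post, the persistence arithmetic is short; a quick sketch (only the 3.5ms MPRT figure comes from RTINGS' measurement, the rest is plain math):

```python
# Quick sketch of the persistence math behind the "~2.3x clearer" claim above.
# Persistence = how long each frame stays lit; shorter persistence = less blur.

def persistence_ms(refresh_hz: float) -> float:
    """Full frame duration, i.e. sample-and-hold persistence."""
    return 1000.0 / refresh_hz

sample_and_hold_120 = persistence_ms(120)   # ~8.3 ms
bfi_mprt = 3.5                              # ms, RTINGS' measured MPRT with 120Hz BFI

print(f"120Hz sample-and-hold persistence: {sample_and_hold_120:.1f} ms")
print(f"120Hz BFI persistence (MPRT):      {bfi_mprt:.1f} ms")
print(f"relative motion clarity:           ~{sample_and_hold_120 / bfi_mprt:.1f}x")
# The flip side: the panel is dark for the rest of each refresh, which is where
# the brightness loss and the flicker complaints above come from.
```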

From the RTings Review:
"Update 05/26/2020: 120Hz BFI only works properly in Game mode. Since BFI isn't available when G-SYNC is enabled, to display a 4k @ 120Hz signal with BFI, you have to disable VRR from the source and manually enter Game Mode. In any other picture mode, 4k @ 120Hz signals skip frames, causing duplications when BFI is enabled. "

--------------------------

If I keep to 100fps and higher, the sample-and-hold blur stays more of a fuzzy, softened blur rather than the smearing blur of 60fps at 60Hz, and that is mostly evident on fast viewport panning of the whole game world... not on viewport "flicking" like a gnat, since that is so fast it's more of a teleport from A to B. So it matters most on fast rolling pans, orbiting, or actively scanning the game world by moving the viewport around.

"G-SYNC stops working on all resolutions below 40Hz, and at 120Hz on a 4k resolution, there's some minor glitching.

For G-SYNC to work, the TV must be on 'Game' mode with Instant Game Response Time enabled.

Currently, FreeSync isn't supported, but if it's supported on a future firmware update, we'll update the review."

----------------------------------------


11.6ms is good. I'm assuming they ran a solid 120fps at 120Hz to test that, so each frame was 8.3ms. HDMI 2.1 GPUs might bring some additional benefit, along with potential firmware updates, who knows. Most people would be lucky to get a 100fps average (not minimum) in a game with demanding graphics at 4k resolution, then use G-Sync/VRR to ride that frame rate roller coaster, usually +/- 15 to 30fps around that average.

So say for example, a 100fps average frame rate graph:
70/85fps <<<---100fps avg. ---->>> 115/130fps (or 115capped)

That results in frames of
14.3ms/11.8ms <<<-----10ms---->>> 8.7ms/7.7ms


It would also reduce blur compared to a 60fps-60Hz baseline smearing blur:

~15% blur reduction? <<<------ 40% blur reduction ---->>> 50% blur reduction (max of 120fps at 120hz without BFI)


So 11ms of input lag isn't bad considering... and a lot of people will be over 10ms per frame (well under 100fps) anyway, going for maximum eye candy over high-frame-rate benefits, or using more affordable GPUs, or both.
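
If anyone wants to sanity-check those percentages, they fall straight out of 1000/fps; a quick sketch reproducing the figures above:

```python
# Quick sketch reproducing the frame-time / blur-reduction figures above.
# Blur reduction is measured against a 60fps-at-60Hz sample-and-hold baseline.

BASELINE_MS = 1000.0 / 60.0   # ~16.7 ms persistence at 60fps / 60Hz

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

def blur_reduction_vs_60(fps: float) -> float:
    """Percent less sample-and-hold persistence than the 60fps/60Hz baseline."""
    return (1.0 - frame_time_ms(fps) / BASELINE_MS) * 100.0

for fps in (70, 85, 100, 115, 120):
    print(f"{fps:>3} fps: {frame_time_ms(fps):4.1f} ms/frame, "
          f"~{blur_reduction_vs_60(fps):.0f}% less blur than 60fps/60Hz")
```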

Very good news. That takes one strike off the checklist for me.
 
I highly doubt RTINGS tested 4k latency at 120Hz; pretty sure the test was done at 4k 60Hz with VRR. 4k 120Hz mode should see even lower input lag figures.

EDIT: Double-checked the RTINGS tables; they show a value for 4k 120Hz and for 4k 60Hz with VRR, and both clock in at 11ms. There is no value for 4k 120Hz with VRR, but I guess it would be lower, since VRR has less lag based on their charts.

[attached: screenshot of RTINGS input lag table]
 
It's game mode with all the game-specific interpolation off, and a completely fair comparison.

The higher the game's frame rate, the more smeary the Q90R becomes. RDR2 is a great example because, even with all that artistic motion blur Rockstar added, the significant difference in motion is still visible. Forza 7 just exacerbates this.

Many don't notice it, but I saw the inverse ghosting from overshoot instantly in Shadow of the Tomb Raider, and that is what made my bro return it and get a C9 instead last year.

IMO the Q90R and the current Samsung lineup are okay for someone who doesn't have an OLED to compare against, but are terrible otherwise due to all the bad choices Samsung makes: 120Hz PWM, a poor FALD algorithm in game mode, a super slow VA panel, and peak brightness it never achieves outside of test slides. Its only advantage over an OLED is full-field brightness.

I do think the game mode interpolation is amazing when set low enough to minimize artifacts though. They are ahead of everyone else with that.

Let's not forget that turning on game mode will turn your high-end Samsung into something no better than a low-end cheapo $500 VA panel. :ROFLMAO:

 
I think this may be the worthy upgrade to my 2015 Samsung.

Is there a benefit to getting the GX over the CX?
 
Oh, yes. I had the JS9000 and even my 2017 LG OLED is a massive step up.

Not sure on the GX - I need to read about it but maybe someone will step in sooner.
 
Really?!

I've been sooo torn, as I love 3D. But for PC use I did not even begin to see the issues until I started playing Valorant.

I have the 55" JS series and was thinking maybe I could 'upgrade' to 65", but was not sure how it would look / work as a gaming monitor.

The G-Sync capability is really calling!
 
HDTVtest review of the CX.



edit:
I thought the RTINGS review would already have been posted here; guess not.

 
Informative videos, thanks. They took away some of my concerns about lossless audio formats. It supports Atmos/Dolby TrueHD (Atmos is the same format with some extra metadata added for mapping the height/ceiling speakers) and uncompressed PCM 7.1. It doesn't support lossless DTS Master Audio, but if your PC movie playback app, Nvidia Shield or similar box, or Blu-ray/UHD player can send it as uncompressed PCM 7.1, you are still good. Sources should be able to decode your DTS content on the fly and pass it as uncompressed LPCM 5.1 or 7.1.

Atmos support means it supports Dolby TrueHD too, but some older titles or encodings might still have a DTS-HD MA (lossless DTS Master Audio) track. DTS-HD High Resolution is a lossy HD codec; DTS-HD Master Audio is a lossless HD codec (competing with Dolby TrueHD) and is not supported even on pass-through via eARC.
https://www.soundandvision.com/content/how-do-dolby-truehd-and-dolby-atmos-differ
"Dolby TrueHD is a lossless audio codec that supports up to eight audio channels on Blu-ray Disc. ... While Dolby Atmos and Dolby TrueHD are two separate soundtrack formats, Atmos data on Ultra HD Blu-ray is actually an extension to TrueHD that is folded into the bitstream to maintain backwards compatibility. "

--------------------------------
Input Lag at 120hz 4k
==================

According to the updated test numbers from RTINGS, input lag shouldn't be an issue anymore at ~11ms.
At a solid 100fps you get 10ms frames; at a solid 120fps you get 8.3ms frames.
If you cap at 115fps to avoid V-Sync input lag on overages, you'd get ~8.7ms frames.

So say for example, a 100fps average frame rate graph:
70/85fps <<<---100fps avg. ---->>> 115/130fps (or 115capped)

That results in frames of
14.3ms/11.8ms <<<-----10ms---->>> 8.7ms/"7.7ms" (8.7ms capped, 8.3ms max 120fps)

So unless you are commonly running over 100fps as a minimum frame rate, that difference would never really come into play mathematically. Even going from a 100fps minimum to 115fps, i.e. 10ms down to 8.7ms frames, it would probably be undetectable. That doesn't even take into consideration that online gameplay (rather than LAN gameplay) has ISP latency, and the game applies latency compensation between player, server, and other players no matter what your "ping" is.

-----------------------------------

VRR Black Levels - still lose ultra blacks
===================================

Hopefully the black levels with VRR will be addressed eventually:

https://www.avsforum.com/forum/40-o...9-dedicated-gaming-thread-consoles-pc-31.html

Yesterday:
"I've actually just noticed it for the first time over the weekend, and it really stuck out and smacked me in the face. It really bugged me. So much so, that I disabled G-sync for the time being. The game was Ori and the Will of the Wisps. The gamma is all jacked up when G-S ync is on, highly visible in the map screen as posterization in the gradients surrounding anything white (like an icon, or the cursor) on the otherwise pure black screen. The same thing happens during gameplay in the Mouldwood Depths area in game that takes place in almost pure darkness; nasty gradients surrounding the illuminated areas. Pure black is pure black (no glow) but it's obvious there's something wonky going on.

It looks perfect with G-Sync off."


It's not a problem, it's a "feature": .... (n)

"I guess for now, we can consider the raised greys (near blacks) as something along the line of 'Dark Boost' or 'Black Equalizer' on gaming monitors where they boost the near blacks to allow you to see those Dark Corner McDougals hiding behind dark corners in Call of Duty and other FPS games.
You can call it LG's Gaming Feature for OLEDs if you want. :D
Else, if it really becomes an eyesore, then we have no choice but to just disable G-Sync for now on certain games and just enable Game Mode instead."


------------------------------------------

Dolby Vision HDR black levels - still sounds like it loses ultra blacks depth
==============================================================

I found references to Dolby Vision elevating black levels and "upcoming manufacturer fixes" going back to January 2018, so this doesn't sound like a new issue.

January 2018:
https://www.highdefdigest.com/news/...elevated-dolby-vision-hdmi-black-levels/40738

The link below goes to a 2019 point in the thread, which continues to the present:
https://www.avsforum.com/forum/40-o...olby-vision-raised-blacks-issue-oleds-45.html

"tested internal Netflix app on cx and I get raised black on the Sabrina fade to black screen you mentioned above, So from my findings so far, DV raised blacks don’t just occur on the Apple 4K tv "

------------------------------------------

Long wait for me but time flies
==========================
I'll be waiting for the November deals and the 3000-series GPUs, to see if there is HDMI 2.1 and 10-bit 4k 120Hz 4:4:4, before I get a 55" C9 or 48" CX. But it's good to hear more questions and issues getting raised and answered, and that at least some of them are more or less resolved. I'm looking forward to more feedback from owners in this and other threads.
 