LG 48CX

Here's some interesting info on banding in PC mode, 12 vs 10-bit: https://www.avsforum.com/forum/40-o...aming-thread-consoles-pc-14.html#post59699346
Interesting at a glance, thanks. I'll dig in more a bit later, but I am one of the people who unapologetically quote HDRTest and other purists and review sites regarding pure source material, fidelity, and downgrades in features on different TVs and monitors (though not expert calibration like they are).

No matter how you spin it, I'll always want 1:1 native wherever possible, and native will always be the pure choice. There is lossless transmission (essentially), and there is lossy transmission, whether it's a video signal or an audio signal. Otherwise signals are watered down by processing and compression... even if people try to say "it's fine", "hardly noticeable", "almost no noticeable difference", "usually not a visible/audible difference", or "to most people" to blow smoke at degrading the original source material.

.......Showing 8-bit dithered on a 10-bit panel is not native color resolution. (The 10-bit/12-bit improvement over 8-bit also matters in quality cameras, btw.)

https://dgit.com/4k-hdr-guide-51429/
When we talk about 8 bit color, we are essentially saying that the TV can represent colors from 00000000 to 11111111, or 256 possible values per channel. Since all TVs can represent red, green, and blue values, 256 variations of each essentially means that the TV can reproduce 256x256x256 colors, or 16,777,216 colors in total. This is considered VGA, and was used for a number of years as the standard for both TVs and monitors.



With the advent of 4K HDR, we can push a lot more light through these TVs than ever before. Because of this, it's necessary to start representing more colors, as 256 values for each primary color is not going to reproduce images nearly as lifelike as something like 10 or 12 bit can.



10 bit color

[10bit.png: 8-bit vs 10-bit gradient comparison image. Source: 4k.com]
10 bit color can represent between 0000000000 and 1111111111 in each of the red, green, and blue channels, meaning that one could represent 64x the colors of 8-bit. This can reproduce 1024x1024x1024 = 1,073,741,824 colors, which is vastly more than 8-bit. For this reason, many of the gradients in an image will look much smoother, like in the image above, and 10-bit images are quite noticeably better looking than their 8-bit counterparts.
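The arithmetic in those quotes is easy to sanity-check yourself; here's a minimal sketch (plain Python, nothing assumed beyond the math quoted above):

```python
# Distinct colors representable at each RGB bit depth.
for bits in (8, 10, 12):
    levels = 2 ** bits          # values per channel: 256, 1024, 4096
    total = levels ** 3         # R x G x B combinations
    print(f"{bits}-bit: {levels} levels/channel = {total:,} colors")

# 8-bit:  256 levels/channel = 16,777,216 colors
# 10-bit: 1024 levels/channel = 1,073,741,824 colors (64x the 8-bit total)
# 12-bit: 4096 levels/channel = 68,719,476,736 colors
```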

.......Lowering chroma compresses the signal from the original fidelity of 4:4:4 source material too (but not in the case of 4:2:0 movie material).
https://en.wikipedia.org/wiki/Chroma_subsampling
Digital signals are often compressed to reduce file size and save transmission time. Since the human visual system is much more sensitive to variations in brightness than color, a video system can be optimized by devoting more bandwidth to the luma component (usually denoted Y'), than to the color difference components Cb and Cr. In compressed images, for example, the 4:2:2 Y'CbCr scheme requires two-thirds the bandwidth of (4:4:4) R'G'B'. This reduction results in almost no visual difference as perceived by the viewer.
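That "two-thirds" figure falls straight out of counting samples per 2x2 pixel block, which a quick sketch can verify:

```python
# Samples per 2x2 pixel block under each Y'CbCr subsampling scheme:
# always 4 luma samples, plus however many Cb/Cr samples survive.
schemes = {"4:4:4": (4, 4, 4), "4:2:2": (4, 2, 2), "4:2:0": (4, 1, 1)}
full = sum(schemes["4:4:4"])                       # 12 samples per block
for name, samples in schemes.items():
    print(f"{name}: {sum(samples) / full:.2f}x the bandwidth of 4:4:4")
# 4:4:4: 1.00x   4:2:2: 0.67x (the "two-thirds")   4:2:0: 0.50x
```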

Anything less than native video and native audio is going to be compressed and degraded. They are just trying to tell you how much degradation is "ok".
 
So I read through more of that article. Some very enlightening information. Thanks for the link.

=====================================================

There are reports that the LG CX only supports 444 chroma in PC mode, and that this mode causes banding.

" For colored text, 4.2.2 makes it blurry, while 4.4.4 makes it clear. So for PC use, 4.4.4 really is much preferred. For video it totally doesn't matter, and for a lot of gaming it probably doesn't matter, but if the game has colored text content (and many do) then it can be a bit of a problem. "
-----------------
" I may now understand better what Vincent meant to say, which was that the TV accepts 4:4:4 outside PC mode (but then converts it to 4:2:2). "
------------------------
" there are those like me that sit 3-4 feet away from it or even closer, which definitely can tell the difference, source is other people viewing as well or friends w/ the same/ those here that have also chimed in on it. "
----------------------------
"I mean if that's the case then I guess I'll wait till next years mode, since 4:2:2 is definitely noticeable over 4:4:4 for PC use, anyone else that sits 7+ feet away though using it as a game console probably won't see the diff as many noted, just def for not up front PC use, having to compromise quality isn't what u spend this kinda money on."
-----------------------------
" According to the rtings review of the CX, the CX no longer has full chroma support for 1080p@120Hz, which is quite odd since even the C8 has it! "

--- We'll have to see how it operates overall over HDMI 2.1 4k 120Hz regarding 10-bit 444 on the 3000 series, I guess, but I think what I'm reading is that you can't get 444 chroma in game mode currently, or in any mode other than PC mode for that matter (and PC mode causes banding for some reason?). Apparently it's the same PC-mode-only 444 on the C9 as well, according to that thread. Also, the Game mode "near black flashing" fix via black flattening doesn't work with VRR enabled currently, giving some glitchy results around white objects on black backgrounds.

=============================================

==============================================

There is also talk earlier in the thread saying that having the ability to send a 12-bit signal gets a better result even on a 10-bit panel, due to how processing errors land and the extra information available. (Sounds kind of like downsampling from a higher resolution to get a better picture, as an analogy.)

When outside HDMI PC input mode, these LG OLEDs do add additional noise to the signal from any source; that's why HDR at YCbCr 4:2:2 10-bit appears to be banding free. It's not, btw, and the TV is kind of dumb about how it adds noise. Use a 10-bit grayscale ramp with the HDMI input set as anything but PC, with YCbCr 10-bit in HDR: with only the grayscale shown, it will look as if it has no banding. Then suddenly display something else, like a Windows Explorer window, over the grayscale, and you'll notice the banding appear over the grayscale as the TV algorithm controlling the added noise is tricked into decreasing the amount of noise shaping.
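If you want to try that grayscale-ramp test yourself, here's a minimal sketch that generates one; the resolution and output filename are just placeholders, and it packs the 1024 ramp steps into a 16-bit PNG (needs numpy and Pillow):

```python
# Generate a horizontal 10-bit grayscale ramp as a 16-bit PNG.
import numpy as np
from PIL import Image

W, H = 3840, 2160
ramp10 = np.linspace(0, 1023, W).astype(np.uint16)  # 1024 gray levels across
img16 = np.tile(ramp10 << 6, (H, 1))                # shift 10-bit into the 16-bit range
Image.fromarray(img16, mode="I;16").save("ramp10bit.png")
```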

"what's the point of 12 bit, the panel is 10 bit anyway?" There are multiple reasons why you want the signal that enters the TV to have as high quantization level as possible:
1. because at 12 bit, there is no banding visible by people due to the signal quantization itself. 8-bit and 10-bit (with no noise added somewhere in the signal) will have visible banding when displayed accurately simply because 8 or 10 bit are not enough to encode the colors without having banding visible by the human eyes (see the ITU report I mentioned) (10 bits is enough for SDR but not for HDR). 12 bit however is enough to have the color encoded so that there is no visible banding when displaying that image content accurately.
2. because any processing performed on a signal introduces errors in the least significant bits of the signal. When the signal it works on is a 12bit signal, these errors will have less visible consequences than when the signal is 10 bits or 8 bits.
3. because having a 12bit input signal allows to do both noise shaping and dithering to 10 bit that will result in an image which will preserve some of the extra detail in the 12 bit signal, and this means an image which appears more detailed than a 10 bit encoded one.
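Point 3 in particular is easy to demonstrate numerically. A toy sketch (a shallow near-black ramp quantized with and without dither; generic dithering, not LG's actual algorithm):

```python
# Quantize a smooth near-black ramp to N bits, with and without dither.
import numpy as np

rng = np.random.default_rng(0)
signal = np.linspace(0.0, 0.02, 1_000_000)           # smooth 0-2% gray ramp

def quantize(x, bits, dither=False):
    levels = (1 << bits) - 1
    x = x * levels
    if dither:
        x = x + rng.uniform(-0.5, 0.5, x.shape)      # +/- 0.5 LSB of noise
    return np.clip(np.round(x), 0, levels) / levels

for bits in (8, 10, 12):
    hard = np.unique(quantize(signal, bits)).size
    soft = np.unique(quantize(signal, bits, dither=True)).size
    print(f"{bits}-bit: ramp collapses to {hard} hard steps plain; "
          f"{soft} levels used with dither (banding traded for fine noise)")
```

The same idea is why a 12-bit feed dithered down to a 10-bit panel can look smoother than a straight 10-bit feed.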
 
That doesn't sound good. So in the process of removing the daughter chip for processing full HDMI 2.1 and moving to a fully integrated solution on the CX, LG actually downgraded the TV? Figures...

I mean, this is great because they were able to get cost down, but if it's at the expense of IQ on a premium product, that's never good.
 
Interesting at a glance, thanks. I'll dig in more a bit later, but I am one of the people who unapologetically quote HDRTest and other purists and review sites regarding pure source material, fidelity, and downgrades in features on different TVs and monitors (though not expert calibration like they are).


I guess you mean HDTVTest? :)
 


If you want to see the BFI in action at different camera shutter speeds. The interesting thing here is that Auto setting does something different than any of the BFI presets. Maybe it varies BFI based on screen motion or something?
 
Tonight (CET) we might get some answers... Nvidia is doing the keynote at Computex at 4:00. I hope it includes more details on consumer GPU Ampere! We might get more interesting discussions or disappointments here :)
 
I don't know how the CX operates, but on the B7 there is no such thing as "PC Mode". You can have your input classified as a "PC Input" and then have the picture mode set to Game Mode. That is how you obtain 444 chroma with the lowest lag. PC Mode and Game Mode are not two separate picture modes. It's PC Input with Game Mode picture mode.
 
Tonight (CET) we might get some answers... Nvidia is doing the keynote at Computex at 4:00. I hope it includes more details on consumer GPU Ampere! We might get more interesting discussions or disappointments here :)


No Computex tonight... no Nvidia news tonight.

" Computex usually takes place in early June, however, the show has since been pushed back to September 28-30 as a precautionary measure "
https://www.pcgamer.com/amd-nvidia-intel-computex-2020-attendance/

" We were expecting to hear more about Nvidia’s next-gen GPU plans at Computex 2020, but as the Taiwanese tech show has been postponed until the end of September due to the Covid-19 pandemic, we may have to wait a while to find out more. "
https://www.techradar.com/news/no-this-is-not-the-first-image-of-the-nvidia-geforce-rtx-3080
 
I don't know how the CX operates, but on the B7 there is no such thing as "PC Mode". You can have your input classified as a "PC Input" and then have the picture mode set to Game Mode. That is how you obtain 444 chroma with the lowest lag. PC Mode and Game Mode are not two separate picture modes. It's PC Input with Game Mode picture mode.

That's exactly how it works on the C9, and presumably the CX as well. You select "PC" for the type of your, say, HDMI 1 input; its name can be anything. Then you set it to whatever picture preset you want. PC mode gives noticeably better text rendering, so I would use it on the desktop at all times. Toggling between PC mode and non-PC mode isn't quick without using a second HDMI port or a switcher, and since GPUs usually have just one HDMI port and switchers don't support high refresh rates, it's not so easy. Personally I would just stick to PC mode, stop worrying, and love the OLED.
 
Thanks for clarifying the modes and inputs. I guess that thread was complaining that PC input was the only one that did real 444 output (and a few claims that PC input has banding for some reason... maybe they meant 8-bit output on a 10-bit panel, or the VRR issue, idk). They seemed to be complaining about the CX only doing 4:2:2 output on the other, non-PC inputs even when fed 4:4:4 content. It was a lot of information to digest in that thread; I tried to figure out what was going on as best I could at the time.

4:4:4 reportedly only on the PC input
=================================
As long as the CX can do game mode on the PC input at 444 output (and later 10-bit 4k 120Hz 444 VRR)... that's pretty solid for now on the 444 issue. Movies are 4:2:0 or upconverted to 4:2:2 anyway, so in regard to a UHD player or Nvidia Shield ~ streaming devices on the other input(s), not a big deal. If you had some 10-bit 444 HDR content off an HDMI 2.1 console or other device on the other inputs, it might be a little annoying though. I think they said it was the same on the C9, so if that's how it is, it sounds like that's how it is... unless they fix it with a firmware update someday.

12-bit signal benefits, CX 40Gbps limit, Nvidia 10-bit over HDMI
==================================================
The slight improvement from sending 12-bit rather than 10-bit won't be a big deal on the CX's 40Gbps limit vs the C9/E9's 48Gbps, as long as Nvidia 3000 GPUs have HDMI 2.1 and support 10-bit over that HDMI port; but it was definitely worth learning about. Without 10-bit over HDMI from Nvidia 3000, a 55" C9/E9 on sale would be my choice. The Nvidia 3000 series will probably have 10-bit HDMI 2.1 output, but I'm still waiting on confirmation.
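For the bandwidth side of that, here's a rough back-of-the-envelope sketch. It assumes the standard CTA-861 4K120 timing (4400 x 2250 total pixels including blanking) and ignores FRL encoding overhead, audio, etc., so treat it as ballpark only:

```python
# Uncompressed 4K120 4:4:4 video data rate at various bit depths.
total_pixels = 4400 * 2250      # 4K120 timing incl. blanking (CTA-861)
fps = 120
for bpc in (8, 10, 12):
    gbps = total_pixels * fps * bpc * 3 / 1e9   # 3 channels (RGB / 4:4:4)
    print(f"4K120 4:4:4 at {bpc}-bit: ~{gbps:.1f} Gbps")

# ~28.5 (8-bit), ~35.6 (10-bit), ~42.8 (12-bit) Gbps:
# 10-bit fits under the CX's 40Gbps ceiling; 12-bit is why you'd want
# the C9/E9's full 48Gbps.
```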

Near-black flashing fix (flattening of blacks*) not working in game mode when VRR is enabled
=========================================================================
I'm still pretty concerned about the near black flashing fix not working with VRR - reportedly making brighter blacks and weird gradients around white objects on black backgrounds in games... but hopefully they will fix that in a later firmware update on both the 2019 and 2020 models.

*LG uses dithering only on the near blacks when 12-bit or 10-bit native is sent to the 10-bit panel, in order to add noise that eliminates the near-black flashing issue. That apparently lowered the resolution in those areas too much in game mode, so they later applied a black-flattening fix of some sort for game mode instead. It's been reported that this near-black fix via flattening is bypassed in game mode when VRR is active, showing weird results.
 
No Computex tonight... no Nvidia news tonight.

" Computex usually takes place in early June, however, the show has since been pushed back to September 28-30 as a precautionary measure "
https://www.pcgamer.com/amd-nvidia-intel-computex-2020-attendance/

" We were expecting to hear more about Nvidia’s next-gen GPU plans at Computex 2020, but as the Taiwanese tech show has been postponed until the end of September due to the Covid-19 pandemic, we may have to wait a while to find out more. "
https://www.techradar.com/news/no-this-is-not-the-first-image-of-the-nvidia-geforce-rtx-3080

Wrong:

https://www.computextaipei.com.tw/

There will be an online event including the Nvidia keynote in early June, and the physical exhibition is pushed to September.
 
Wrong:
https://www.computextaipei.com.tw/
There will be an online event including the Nvidia keynote in early June, and the physical exhibition is pushed to September.

Oh nice, I did not know that! :)
Nvidia talks online on June 2 at 10am.

But it's too early, and the wrong place, for Nvidia 3000-series news... it would be cool if Nvidia dropped some info about it, but I think we have to wait until August/September (Gamescom timeframe, like 2018 with the 2000 series), but who knows.

Nvidia's talk is about:

" NVIDIA - AI innovation moves humanity forward
Jensen Huang, founder and CEO, NVIDIA, will be in this talk. As AI will bring improved health, safety, and productivity to people, Jensen will give a prospect how NVIDIA’s latest miracle technology supports both ecosystem partners and global customers"

Again... the wrong place for a gamer card.
 
Opted to buy a 55" (I can fit it/more useful as spare TV when a better "monitor" pops up, small price difference), will arrive tonight tomorrow. can't see from the specs if it comes with a proper "high speed" hdmi cable, or do I need to buy one? (bought one)
 
Opted to buy a 55" (I can fit it/more useful as spare TV when a better "monitor" pops up, small price difference), will arrive tonight. can't see from the specs if it comes with a proper "high speed" hdmi cable, or do I need to buy one?

A 40 or 43-inch would be great, so when I heard they were making a 48-inch I thought at least it's smaller than 55 and should be a more usable size... but lately, after going over the viewing-distance difference, like you I've been considering the overall usage, including after I retire it from PC use. I think a 55-inch is a smart move if you have the room. It really only needs another 6-inch increase in viewing distance compared to what I consider a 48-inch's viewing distance of ~40" (to a 48-inch screen).

Once the 3000 series performance is confirmed, I'll decide between a C9, E9, or CX. The one thing keeping the 48-inch in consideration for me is that I will be using 43-inch side display(s). The 6-inch increase in view distance might be enough to require moving up another notch in scaling, which would lose a little more desktop real estate on the side monitor(s).
 
I have to admit when I heard a US release of "June 2020" for the 48", I hoped that had meant June 1, 2020. I realize they kept it vague on purpose, but am checking Best Buy and Amazon daily... (not interested in the B&H Photo pre-order option)
 
The one thing keeping the 48-inch in consideration for me is that I will be using 43-inch side display(s). The 6-inch increase in view distance might be enough to require moving up another notch in scaling, which would lose a little more desktop real estate on the side monitor(s).
Yeah that's a good point for the 48". I once used a 50" plasma at a reasonable distance and 3x 24" near me, and shifting focus back and forth was tiresome.
 
Well, ordered an LG CX 48" today for 1,399 euros. If it does not get delayed, it should be available for pickup on the 22nd. Also ordered an adapter plate from VESA 100x100 to 300x200, as well as an HDMI 2.1 cable. Will see how that works out when it all gets here.
 
That's exactly how it works on the C9, and presumably the CX as well. You select "PC" for the type of your, say, HDMI 1 input; its name can be anything. Then you set it to whatever picture preset you want. PC mode gives noticeably better text rendering, so I would use it on the desktop at all times. Toggling between PC mode and non-PC mode isn't quick without using a second HDMI port or a switcher, and since GPUs usually have just one HDMI port and switchers don't support high refresh rates, it's not so easy. Personally I would just stick to PC mode, stop worrying, and love the OLED.


Actually, the trick is to use PC mode + Deep Color enabled (a setting which has changed names somewhat through the years). At least that's how it has been on my E6, C7 and GX. I assume it's the same on the 77" C8, but that's in the living room, so there's not too much text processing going on there. And yes, I do love my OLEDs :D
 
Man, the AVS forum is such a chore to read; there's like 200+ new posts daily in the LG OLED forum.

If I remember correctly, the C9 has bad banding issues in PC 4:4:4 mode. It's been tested multiple times that running the TV in "Console" mode (which allows 4:2:2 only) shows a lot less banding overall, especially in HDR mode. Many side-by-side tests show you actually get overall better picture quality in Console 4:2:2 vs PC 4:4:4, at least for general PC gaming.

Does anyone know if that's still the case on the CX and if that issue has ever been fixed on the C9 and all previous generations? I'm currently running on Console mode unless told otherwise but wouldn't mind the full 4:4:4 experience when doing office work. Thanks.
 
Yeah, it can get a little confusing. PC input + game mode... Console Game input and Game mode... (and no "PC Mode" per se). Some inputs accept 444 in but only show 4:2:2 out; PC input is the only one that does 444 out, at least according to that thread, I think. Also, if for some reason HDMI 2.0b is only putting out 8-bit to this screen, 8-bit + dithering still has banding compared to a 10- or 12-bit signal sent to a 10-bit panel. I don't know if we will have the full picture of the features and limitations until we have HDMI 2.1 GPUs outputting to the HDMI 2.1 TVs.
 


If you want to see the BFI in action at different camera shutter speeds. The interesting thing here is that Auto setting does something different than any of the BFI presets. Maybe it varies BFI based on screen motion or something?


Great video! Thanks man! The Auto vs High vs Medium comparisons this guy did are terrific.
 
Man, the AVS forum is such a chore to read; there's like 200+ new posts daily in the LG OLED forum.

If I remember correctly, the C9 has bad banding issues in PC 4:4:4 mode. It's been tested multiple times that running the TV in "Console" mode (which allows 4:2:2 only) shows a lot less banding overall, especially in HDR mode. Many side-by-side tests show you actually get overall better picture quality in Console 4:2:2 vs PC 4:4:4, at least for general PC gaming.

PC 4:4:4 SDR has no banding and it's the only way to get correct 4:4:4 chroma. But once you activate HDR in PC-Mode you indeed see worse banding than with HDR console-mode. That's at least my conclusion with the C9 after testing.
 
Yeah, it can get a little confusing. PC input + game mode... Console Game input and Game mode... (and no "PC Mode" per se). Some inputs accept 444 in but only show 4:2:2 out; PC input is the only one that does 444 out, at least according to that thread, I think. Also, if for some reason HDMI 2.0b is only putting out 8-bit to this screen, 8-bit + dithering still has banding compared to a 10- or 12-bit signal sent to a 10-bit panel. I don't know if we will have the full picture of the features and limitations until we have HDMI 2.1 GPUs outputting to the HDMI 2.1 TVs.

This link is a few years old, but I believe most of it is still true (mostly since HDMI 2.0 still is).

https://www.lg.com/uk/support/product-help/CT00008334-1437128590776

My impression is that PC mode mainly cuts down processing, which you can see by just moving the mouse cursor around to get a feeling for the lag. And Deep Color is the key to 4:4:4. Now, those of course go hand in hand, and sometimes Deep Color gets activated automatically based on settings (on the TV or PC). So in short, my conclusion is that to get sharp text, you need Deep Color.

For anyone unsure, I would recommend using a 4:4:4 test image, like the one at Rtings ("How to test for Chroma Subsampling")

https://www.rtings.com/tv/learn/chroma-subsampling
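If you'd rather roll your own pattern than grab the rtings image, here's a minimal sketch in the same spirit (hypothetical output filename; needs Pillow). Single-pixel red/cyan columns stay crisp at 4:4:4 and smear once chroma is subsampled, because the Cb/Cr values get averaged across neighboring pixels:

```python
# 1-pixel alternating red/cyan columns: a simple chroma subsampling check.
from PIL import Image

W, H = 256, 64
img = Image.new("RGB", (W, H))
for x in range(W):
    color = (255, 0, 0) if x % 2 == 0 else (0, 255, 255)  # red / cyan
    for y in range(H):
        img.putpixel((x, y), color)
img.save("chroma_444_test.png")  # view at 100% zoom, no scaling
```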
 
PC Input
=========

...Is there no "PC Mode" in the OSD per se, only one activated by naming an input PC or using one already named PC?

...Does having an input named PC make that input a "PC mode"? But can you further activate "game mode" in the OSD on top of it, or are they somewhat redundant? Different TVs have different names and ways of doing things.

...Also wondering if you can name more than one input "PC", or if that is just a single factory input named that which has the "PC mode" functionality.


PC input Banding
================

...Why do you think banding shows up in PC Input/"mode"?

...People in the avs thread were saying that while all of the HDMI inputs can accept 444, they all output 4:2:2 except for the PC input, which they said can output real 444 chroma. Yet the 444 output, which should be truer to sources that are 444 chroma, has banding.

... Do you think the PC input is outputting 8-bit or 8-bit dithered instead of processing a 10-bit (or 12-bit) signal? Both 8-bit and 8-bit with dithering will show more banding, since they are not native color resolution (10-bit or higher). If that were the case, do you think HDMI 2.1 GPUs and consoles have a chance of maintaining source fidelity 1:1 through the pipeline for 10-bit 444 chroma material and displaying it properly?


Game Mode with VRR active bypasses the near black fix
=============================================

...Black flattening was added by LG for game mode so that the near-black flashing wouldn't happen anymore in games. However, once they enabled VRR in firmware updates, the black flattening was bypassed, and from reports this is still an issue with the CX.

.. The "near-blacks fix via black flattening" being bypassed with VRR active reportedly makes brighter areas around bright/white objects with a weird gradient of some sort. The overall brightening of contrasted areas was originally assumed to be a brighter gamma issue but some owners claim that it is actually weird brighter gradients in black backgrounds off of bright object's edges and into near blacks. If that is the case, adjusting your gamma wouldn't be a uniform adjustment and would screw up how other things would look. That is, if there is a bright gradient "halo"/noise thing going on where near blacks are supposed to be adjusted cleanly, making those glitched near black areas darker via global gamma would darken all of the other colors and things in entire scenes needlessly and inappropriately (since the gamma of everything besides the near black areas was already correct to begin with).
 
PC 4:4:4 SDR has no banding and it's the only way to get correct 4:4:4 chroma. But once you activate HDR in PC-Mode you indeed see worse banding than with HDR console-mode. That's at least my conclusion with the C9 after testing.

ALL presets on the 2020 model have working Chroma 4:4:4 this year, LG finally fixed that.
 
ALL presets on the 2020 model have working Chroma 4:4:4 this year, LG finally fixed that.

I hope that is confirmed. It was talked about in the avsforum's "2020 LG CX–GX dedicated GAMING thread, consoles and PC"... some mentions that while every input took 4:4:4, the outputs were 4:2:2 except for the PC input, which actually output true 4:4:4 (and had banding).

Also, regarding my question about whether the possibility that HDMI 2.0b could be sending 8-bit instead of 12/10-bit native color resolution could explain the banding showing on the PC input:
https://www.avsforum.com/forum/40-o...aming-thread-consoles-pc-18.html#post59746482
"The main difference between normal mode and PC mode seems to be that the TV turns off its internal dithering (I'm now fairly sure that's what @stama was talking about when he mentioned noise). There is also an argument to be made (like Stacey Spears of Spears & Munsil has done several times in the past) that the OLEDs simply fall back to 8-bit entirely in PC mode. We'll never know for sure."

If you are sending 12-bit/10-bit to the 10-bit panel instead of 8-bit, you supposedly wouldn't need the dithering like you would with 8-bit anyway. But if the TV is actually falling back to 8-bit on the PC input instead of the 12/10-bit material you want to send, that would be a problem and would explain why banding is showing up, especially 8-bit without dithering, since dithering would cut down the banding (though there would still be visible banding vs. native).
 
PC Input
=========

...Is there no "PC Mode" in the OSD per se, only one activated by naming an input PC or using one already named PC?

...Does having an input named PC make that input a "PC mode"? But can you further activate "game mode" in the OSD on top of it, or are they somewhat redundant? Different TVs have different names and ways of doing things.

...Also wondering if you can name more than one input "PC", or if that is just a single factory input named that which has the "PC mode" functionality.

The name of the input does not matter, just setting the input icon to PC. A lot of TVs, like my Samsung KS8000, go into PC mode simply by naming the input "PC". I am glad the LG is at least a little more sensible than that, even though finding the setting is a bit buried within the Home dashboard. When I get my LG CX 48, I plan to name HDMI 1 as PC and HDMI 2 as Macbook Pro for their input devices.

You can separately activate the "Instant game response" mode for the currently used image preset for lower input lag. I don't know if running the actual Game mode results in even lower lag but I haven't noticed input lag issues in the games I've been playing.

The TV falling back to 8-bit in PC mode would not be all that out of character considering Nvidia GPUs don't support 10-bit output in anything but games (or apps using fullscreen DirectX apparently) unless you use the Studio driver which afaik enables 10-bit output on select graphics apps. AMD or Nvidia Quadro GPU would be the better use for testing this as afaik they don't have these artificial limitations. Most people will have a hard time telling 8- and 10-bit apart anyway and 8-bit should not cause severe banding issues.
 
Thanks for the clarification about the modes on the LG tvs.

8 bit and other lossy non-native material is "fine"
=========================================
Regarding 8-bit being "OK" compared to sending a more lossless 12-bit or 10-bit signal to a panel's 10-bit native color resolution: 8-bit dithered helps, but it still has banding, so it's not preferred, especially for HDR content. Even worse would be if the suspicions are true that the banding in PC input icon mode comes from dithering being disabled entirely there.

That's why some people in that avs thread are running YUV 444 8-bit instead of RGB: Nvidia forces dithering on YUV, making it look "almost as smooth in PC mode as YUV422 10bit normal mode". Even then there is the posterization issue, though, so some people just consider the PC input ("PC icon mode") 444 broken on LGs, at least for now. A lot of people are waiting to see what Nvidia's 3000 series supports: HDMI 2.1, 10-bit 444 RGB, etc., or not.

Bypassing of near black fix when VRR enabled in game mode
=================================================
Hopefully LG will also fix the reported "bypassing of the near black fix when VRR is active in game mode" issue at some point as well.
 
I got my 55CX, ran a firmware upgrade (03.00.60), and both the pendulum demo (G-SYNC checked by default when started; it still worked windowed) and my eyes in games think G-SYNC works at 4k 120Hz 4:2:0, windowed and fullscreen. (Also did a full clean video driver installation.)
55" is borderline too tall, but I'm ok with it. I wouldn't recommend the 55" unless you're really sure it'll fit and it wont be too tall.

Game mode looks like garbage, no exaggeration, but it is "picture mode - game". In "picture mode - normal" it looks fine and still allows instant game response etc. Could it just be a misunderstanding, and "picture mode - game" is not necessary? It is just called "picture mode" after all. I can't imagine LG forcing people to use that mode (it really looks as bad as some monitor game modes).

I do not feel any extra input lag with picture mode - normal and instant game response on, but that doesn't mean it isn't there. I can test the input lag later or tomorrow. But most likely, game mode is just a trashy, unnecessary picture mode. The description in the picture sounds like "instant game response" includes all the normal game mode functions.
 

Attachment: 20200603_192307.jpg (348.5 KB)
Game mode looks like garbage, no exaggeration, but it is "picture mode - game". In "picture mode - normal" it looks fine

Then change the settings. White balance to Warm 2 (or 3 if you like) would be the first and most important step to get rid of the too-bluish/cool image.

Instant game response mode is necessary for VRR/G-SYNC; you have to activate it. When instant game response is enabled you can use whatever picture-mode preset you like; every one will have the same input lag.
 
Just being a devil's advocate here so please don't take offense:

-For the "Game Mode" visual complaints, Don't you think the engineer's at LG probably have a better understanding of how their products work and what mode is best for each situation? I'd imagine the "look" of gamemode is that way for a specific set of reasons that you may not understand. I don't think they purposely make gamemode look like "trash" for just the hell of it or they would be out of a job. I kind of take their professional implementation a lot more seriously than what some dude says on the internet. I use gamemode and have no problems at all and I have pretty picky standards (obviously, as I use a $1,600+ TV as a computer monitor, when most people rock <$200 LCD TN's).

Edit: For example, in game mode, whites actually look white (maybe with a hint of blue). In ISF Expert, which Rtings calls out as "most accurate", my whites look piss-yellow/orange. I think game mode looks more accurate (especially more vivid and lifelike) than ISF Expert. I don't have the link, as it's lost somewhere in the 10,000+ page LG OLED AVSForum thread, but I believe "most accurate / ISF Expert" = picture as intended for old-school movie projectors as the "gold standard". AKA, not the same situation and environment as an OLED screen in a home environment.
 
Just being a devil's advocate here so please don't take offense:

-For the "Game Mode" visual complaints, Don't you think the engineer's at LG probably have a better understanding of how their products work and what mode is best for each situation? I'd imagine the "look" of gamemode is that way for a specific set of reasons that you may not understand. I don't think they purposely make gamemode look like "trash" for just the hell of it or they would be out of a job. I kind of take their professional implementation a lot more seriously than what some dude says on the internet. I use gamemode and have no problems at all and I have pretty picky standards (obviously, as I use a $1,600+ TV as a computer monitor, when most people rock <$200 LCD TN's).

Lol. Yeah. I remember having to do some adjustments on my JS9000 in order for it to look decent in Game mode also. The biggest thing was sharpness: the default value was crap and needed adjusting to match the regular mode's sharpness.

Out of curiosity, I looked in the menu on my B7 and I'm in Game Mode (User). It definitely doesn't look like crap, but then again I remember making some adjustments after choosing this mode. I switched to PC Mode and it didn't look any better, really...just brighter (because I never use that mode, it's set to the default brightness while I have Game Mode cranked way down for the sake of my retinas).
 
Rtings updated their review:

" Update 05/26/2020: 120Hz BFI only works properly in Game mode. Since BFI isn't available when G-SYNC is enabled, to display a 4k @ 120Hz signal with BFI, you have to disable VRR from the source and manually enter Game Mode. In any other picture mode, 4k @ 120Hz signals skip frames, causing duplications when BFI is enabled."

For me, this is a non-issue because gsync is useless above 120fps anyways.
 
Just being a devil's advocate here so please don't take offense:

-For the "Game Mode" visual complaints, Don't you think the engineer's at LG probably have a better understanding of how their products work and what mode is best for each situation? I'd imagine the "look" of gamemode is that way for a specific set of reasons that you may not understand. I don't think they purposely make gamemode look like "trash" for just the hell of it or they would be out of a job. I kind of take their professional implementation a lot more seriously than what some dude says on the internet. I use gamemode and have no problems at all and I have pretty picky standards (obviously, as I use a $1,600+ TV as a computer monitor, when most people rock <$200 LCD TN's).

For example, in game mode, whites actually look white (maybe with a hint of blue). In ISF Expert, which Rtings calls out as "most accurate", my whites look piss-yellow/orange. I think game mode looks more accurate than ISF Expert.
I know exactly why game mode is the way it is (exaggerated low gamma curve to see better in dark areas, etc.); I was just pointing out that it seems unnecessary to use at all, and is most likely completely unrelated to input lag performance. If you personally have tweaked game mode, then it obviously looks nothing like the default.
I just mentioned this because there was a lot of criticism of game mode here and in videos earlier. Unless I misunderstood and they said game mode but really meant "instant game response".
 
Rtings updated their review:

" Update 05/26/2020: 120Hz BFI only works properly in Game mode. Since BFI isn't available when G-SYNC is enabled, to display a 4k @ 120Hz signal with BFI, you have to disable VRR from the source and manually enter Game Mode. In any other picture mode, 4k @ 120Hz signals skip frames, causing duplications when BFI is enabled."

For me, this is a non-issue because gsync is useless above 120fps anyways.

Not sure what you mean; everything is useless above 120Hz since the panel does not support higher refresh rates, AFAIK?
 
Just being a devil's advocate here so please don't take offense:

-For the "Game Mode" visual complaints, Don't you think the engineer's at LG probably have a better understanding of how their products work and what mode is best for each situation? I'd imagine the "look" of gamemode is that way for a specific set of reasons that you may not understand. I don't think they purposely make gamemode look like "trash" for just the hell of it or they would be out of a job. I kind of take their professional implementation a lot more seriously than what some dude says on the internet. I use gamemode and have no problems at all and I have pretty picky standards (obviously, as I use a $1,600+ TV as a computer monitor, when most people rock <$200 LCD TN's).

Edit: For example, in game mode, whites actually look white (maybe with a hint of blue). In ISF Expert, which Rtings calls out as "most accurate", my whites look piss-yellow/orange. I think game mode looks more accurate (especially more vivid and lifelike) than ISF Expert. I don't have the link, as it's lost somewhere in the 10,000+ page LG OLED AVSForum thread, but I believe "most accurate / ISF Expert" = picture as intended for old-school movie projectors as the "gold standard". AKA, not the same situation and environment as an OLED screen in a home environment.

If you think the ISF Expert mode is piss yellow, that is you being used to a very blue-tinted image and seeing that as white. Our eyes are easily deceived like this, and many think a blue-tinted screen looks sharper and has "whiter" whites. Rtings uses display calibration hardware to measure against a 6500K white point, which is close to the daylight white you would see in real life and is usually warmer rather than bluish. When I calibrated my CRG9 to 120 nits and 6500K, at first I felt it was way too dark and looked too warm, but after getting used to it, now it feels comfortable and accurate. Color accuracy also depends on ambient lighting conditions as well as individual panel variance, which is why you can't just plop in calibration values made on monitor A in environment X and expect them to be perfect on monitor B in environment Y. But they might be a decent starting point nonetheless.

Often display presets are made for what the manufacturer thinks looks most impressive in a store at max brightness under fluorescent lights. That is a completely different situation from running the display at home at far lower brightness in a much darker room, or using it during the daytime in bright sunlight. People usually respond to the color settings that produce the most vivid-looking colors, despite these typically being very inaccurate. Wide-gamut LCDs and preset modes on displays clearly show a prevalence of these settings over more accurate ones.
 