Testing in game (Doom) - there's definitely an improvement in motion with BFI (use auto or high). It's most noticeable when strafing rather than aiming.
The problem is it gets thrown off by low frame rates or input lag; I'm not sure which exactly. It worked well at 1080p 120fps regardless of vsync, but if I ran at 4K with lower frame rates, input lag increased (not sure if that was just due to the lower FPS itself) and the BFI got thrown off to the point where it actually made the motion worse. HDR itself didn't seem to have an effect. Also, below 120Hz the flicker is terrible.
How are you testing this? Is there a way to enable BFI and GSYNC at the same time? It appears to be one or the other based on my testing.
I have a 7.1 surround setup on small Klipsch surround speakers plus a separate subwoofer in my PC room, and some decent headphones (AKG 712s) I can plug into a receiver too. I am looking forward to "full" HDMI 2.1 with eARC support, which would avoid having to use the "phantom monitor" trick to get HDMI audio output from nvidia. I find that trick buggy with my window placement software: the phantom monitor disappears any time I turn off the receiver, shuffling window positions, blinking monitors, and screwing up all memorized window positions and position hotkey placements. It also wastes an HDMI output on my GPU. The C9 has uncompressed audio format pass-through via eARC and 48Gbps HDMI 2.1, which is more likely to deliver 120Hz 4K 4:4:4 to the native 10-bit panel from an HDMI 2.1 GPU, and at least the black levels all work in non-VRR HDR right now.
I'm dedicating the 48"CX (or 55" C9 if these support issues don't pan out) to being a gaming and media stage, with other monitor(s) in the array for static desktop/app windows. The OLED will be playing games, showing movies and videos, streams, and even high resolution photo and art slideshows/screensavers and perhaps audio visualizations at times. Some of those sources are uncompressed audio, 7.1 DTS-HD, uncompressed PCM 7.1, ATMOS, etc.
Btw Tidal and Amazon also have ATMOS music libraries that they market as "growing". (cordcutters article about Tidal ATMOS music). I'm mainly going to be using uncompressed audio for movies and FLAC audio files, but the Tidal ATMOS music support is intriguing if the library grows a lot in the future.
I want the display to be able to show uncompressed native video, and to pass through uncompressed audio sources untouched.
That means 10-bit over nvidia HDMI 2.1 to the native 10-bit 4K 120Hz 4:4:4 panel, uncompressed. No DSC compression, no 8-bit dithering/banding, no being forced to lower chroma, lower the Hz, or upscale a lower resolution. No cutting it down from 1:1 source material where such sources are available.
That also means uncompressed audio formats passed through untouched, uncompressed, uncut from the original source fidelity.
HDMI 2.1 inclusion was supposed to check all the boxes for eventual hdmi 2.1 gpus and other hdmi 2.1 sources. LG decided to take away or omit a bunch of those check-boxes on the CX, at least as it stands right now.
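A quick sanity check on the bandwidth side of that wish list; a sketch assuming the standard CTA-861 4K120 total raster (4400x2250, including blanking) and HDMI 2.1's 16b/18b FRL coding overhead:

```python
# Rough bandwidth check for 4K 120Hz 10-bit 4:4:4 over HDMI 2.1.
total_w, total_h = 4400, 2250   # CTA-861 total raster for 4K120, incl. blanking
refresh_hz = 120
bits_per_pixel = 3 * 10         # RGB / 4:4:4 at 10 bits per channel

raw_gbps = total_w * total_h * refresh_hz * bits_per_pixel / 1e9
link_gbps = 48 * 16 / 18        # FRL 48G after 16b/18b coding overhead

print(f"needed: {raw_gbps:.2f} Gbps, usable: {link_gbps:.2f} Gbps")
# needed: 35.64 Gbps, usable: 42.67 Gbps -> fits without DSC
```

Which is exactly why a full 48Gbps link matters: the signal fits with room to spare, no DSC and no chroma subsampling needed.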
No, G-Sync has to be off for BFI on LG OLED. In order to use BFI properly, as said above, the game's FPS should be identical to the BFI refresh rate. That's why I prefer an RTSS FPS cap; you can set it down to 1/1000th of a frame to match the BFI refresh rate after confirming it here:
https://testufo.com/refreshrate
It's also one of the big negatives of BFI: you have to have enough GPU overhead (usually wasted resources) so that FPS doesn't dip below 120, which at 4K is usually a tall order. Unlike older games, where GPU utilization may have stayed within +/- 10% and frame capping without VRR was easy, in a lot of today's games GPU utilization can double or more depending on the scene, making VRR a necessity.
For best results it is recommended to give headroom of at least a few FPS, so more like 115 fps.

For G-SYNC on this monitor, I've set the Max FPS to 119 in the Nvidia Control Panel. I don't notice any tearing when I do that. It's also far more forgiving for my 2070 Super (compared to your Titan RTX).
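For what it's worth, both capping strategies above boil down to simple arithmetic; a sketch, where the measured_hz value is just an example reading from testufo.com/refreshrate:

```python
# Two capping strategies from the posts above, as plain arithmetic.
measured_hz = 119.998   # example reading from testufo.com/refreshrate

# For BFI: cap as close to the true refresh as possible
# (RTSS accepts fractional fps values).
bfi_cap = round(measured_hz, 3)

# For G-SYNC: cap a frame or so below refresh so frametime spikes
# stay inside the VRR window (119 on this 120Hz panel, as above).
gsync_cap = round(measured_hz) - 1

print(bfi_cap, gsync_cap)   # 119.998 119
```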
Is HDMI audio really going to be THAT superior over just using USB or optical audio? Or is it just because your receiver lacks any input besides HDMI? Every audiophile I know has never once mentioned HDMI audio; they all use external USB DACs. This is my buddy's setup with Sennheiser HD 800S. I get all my audio advice from him and he's never once told me that HDMI audio is the most superior. It seems to be for theatre types who can't run two cables, basically.
View attachment 246384
View attachment 246385
The main reason to use HDMI audio is format support: you simply cannot pass Atmos over USB. There is no such thing as a DAC that can decode Atmos, and no consumer software for it on PC either.
The benefit of this is somewhat limited if you don't have height speakers, however, as the AVR will just downmix, and you end up with a mix that is usually not very different from the native 7.1 mix. Depends on the content, though.
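To make the downmix point concrete, here's an illustrative fold-down of a height channel into its matching surround at -3 dB; this is just the textbook-style attenuation, not any particular AVR's actual matrix:

```python
import math

# Illustrative height-to-surround fold-down at -3 dB.
# Real AVRs use their own (often proprietary) downmix matrices.
ATTEN_3DB = 1 / math.sqrt(2)    # -3 dB in linear gain, ~0.707

def fold_heights(surround: float, height: float) -> float:
    """Mix a height-channel sample into its matching surround at -3 dB."""
    return surround + ATTEN_3DB * height

print(fold_heights(0.0, 1.0))   # ~0.707: full-scale height lands attenuated
```

Since the height content just gets blended into channels you already have, the result usually sounds close to the native 7.1 mix, as noted above.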
So it's purely for surround sound purposes then, I see. What's currently stopping Atmos from being passed over USB?

Nothing, really; just that such a device has yet to be brought to market.
How are you testing this? Is there a way to enable BFI and GSYNC at the same time? It appears to be one or the other based on my testing.

I was using the in-game vsync settings.
Were you monitoring your frame rates? Your frame rate has to precisely match your refresh rate to get proper motion on strobing/scanning displays.
Also try using "scanline-sync" from RTSS. It achieves v-sync without the input lag.
And have you tried any refresh rates between 60hz and 120hz with BFI? I'm wondering if all refresh rates are compatible, or just a select few.
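On the "frame rate has to precisely match" point: any mismatch produces a duplicated or dropped frame at the beat period of the two rates, which is a rough way to quantify how precise the cap needs to be. A sketch:

```python
# How often a strobed/BFI display hitches when fps != refresh rate:
# one duplicated or dropped frame per "beat" period of the two rates.
def beat_period_s(refresh_hz: float, fps: float) -> float:
    """Seconds between visible hitches for a given fps/refresh mismatch."""
    if fps == refresh_hz:
        return float("inf")     # perfect match: no hitches
    return 1.0 / abs(refresh_hz - fps)

print(beat_period_s(120.0, 119.0))    # 1.0 -> a hitch every second
print(beat_period_s(119.982, 119.9))  # ~12.2 -> a hitch every ~12 s
```

So even a cap that's off by 0.1 fps still stutters, just less often; hence the fractional RTSS cap.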
Also, Club3D's DP 1.4/HDMI 2.1 adapter is set to release in June and will give you access to 4K/120hz on the CX but they stated that this first iteration will not support VRR:
I tried the CG437K, but even at its current $1200 price it's just not worth it. $300 more for this OLED (plus whatever it costs to accommodate it desk/distance-wise) is a bargain for the massive jump in PQ.
TBH the brightness means nothing when the monitor only has 16 zones. I played FF7R on it swapping between my X27 to compare. The entire game had literally 1/2 the screen blasting 600nits+ because of HUD elements alone. It's super washed out in HDR compared to an actually capable HDR display (X27).
Not only that but it's too slow for 60hz let alone 120, see:
My suggestion to anyone out there considering both the CX and the CG437K is to pay the minuscule price difference (the Acer was $1500 new just a few months ago!) and do whatever it takes to make the OLED work for you instead of buying that terrible monitor.
I paid under $1K for the Acer, and I'm not going to change my room layout just for a monitor (if I were willing to use a 48/9", I would have gotten a Q70R or a 900F by now). I already tried a bunch of 4K Blu-rays on it and the HDR is beautiful (not as good/bright as my Q9, but good enough for computer work) and not washed out. I have a PS4 Pro coming in and I'll see how it goes. But what I meant was that I'm not doing console gaming on this. I do my console gaming and movie watching on the Q9 and will upgrade to a Q900T or whatever the current model is when FFVII R part 2 comes out. For a large-screen TV, I'll take the best QLED over OLED any day.
Pretty close to my experience as well. I'm exclusively using a CX 55" as a computer monitor, but I also have a 65" Samsung Q90R for my living room and consoles. The CX is magnificent as a computer monitor. However, the Q90R is big, bright, and beautiful. I also think that Samsung does a better job on low bitrate content (especially for sub-4k formats on youtube). That's not to say the CX does bad, but I can see a lot of macro blocking near black on the CX that just isn't there on the Q90R.
For console gaming, movies, and TV, the Q90R is amazing. I'm looking forward to seeing the Xbox Series X and PS5 in all their glory on the Q90R.
Also, in case anyone is wondering, FFVII Remake looks spectacular in HDR on the Q90R... and that's only on a PS4 slim. I can't imagine how good it would look on a PS4 Pro.
Yea, FFVII R looks great on the Q9 even just using a slim. Will have to run it again on the PS4 pro when it gets here on Weds. How far do you sit from the 55" when you're using your PC?
4 or 5 feet. I had a 27" Dell monitor before this (1440P, 144hz, gsync). In order to accommodate the 4x increase in size, I bought a floor stand VESA mount and moved my desk away from the wall.
Link to floor stand: https://www.amazon.com/gp/product/B01M3XIJRG/ref=ppx_yo_dt_b_asin_title_o03_s00?ie=UTF8&psc=1
As an unintended bonus, the stand has a glass base, so I was able to move my PC out from under my desk and just sit it on top of the base.
Honestly, the entire setup is pretty ghetto, but it works well. Lord knows the picture quality upgrade from a 27" TN to a 55" OLED TV is insane.
I personally would not even consider the Acer unless it was $650. Motion clarity is so poor that it's unusable as a gaming display.
People have different standards, and for me an edge-lit BGR monitor that has 16 huge zones, behaves globally lit in 98% of HDR content, and has the pixel response of a VA from 2014 is just unacceptable. There's literally no scenario outside of loading screens where its local dimming does anything.
I'm not trying to rain on your parade so don't take it the wrong way but it is objectively terrible in HDR compared to even a $499 TCL R625.
Help a fren out here guys:

I'm about to pull the trigger on the Alienware AW5520QF; they're $3K new on eBay (US shipping) (inb4 no HDMI 2.1, you have to use it in a dark room, and you need a giant desk).

Why is this LG half the price? What's the $1500 difference here??

Computer monitors are overpriced.
Interestingly I don't see any problems with fonts on my CX...

Curious: for those of you who have tried improving text quality on the OLED by changing ClearType settings, have you also tried changing the system font (Segoe UI) in Windows 10? If I remember correctly, that font is made for ClearType, so it would seem natural that other fonts like Sans Serif and Tahoma would look better without it.
Changing fonts in Windows is not trivial these days but here is how it's done:
https://www.faqforge.com/windows/windows-10/how-to-change-the-default-font-in-windows-10/
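For anyone who doesn't want to click through: guides like that one typically work through the FontSubstitutes registry key. A sketch of the kind of .reg file described there (Tahoma is only an example replacement, and this is the article's general method rather than something verified on every Windows build; back up the registry first):

```
Windows Registry Editor Version 5.00

; Blank out the built-in Segoe UI entries so the substitute takes effect
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Fonts]
"Segoe UI (TrueType)"=""
"Segoe UI Bold (TrueType)"=""
"Segoe UI Italic (TrueType)"=""

; Point requests for Segoe UI at the replacement font
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\FontSubstitutes]
"Segoe UI"="Tahoma"
```

Sign out and back in (or reboot) for the substitution to apply system-wide.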
I went with the Acer CG437K instead of waiting for this one. I need one for a desktop monitor, and 48" is just going to be too big for a desk where I'll be sitting about 3 ft away. HDMI 2.1 also doesn't really mean anything for me at this point, as I won't pick up a PS5 till Final Fantasy VII part 2 makes it out, and by then I'll be ready to replace my current 65" QLED with an 8K unit. Also, once you're used to getting over 1500 nits, anything under 1000 will seem lacking.
Interestingly I don't see any problems with fonts on my CX...
I don't on my B7, either. I know someone posted something above about LG changing the pixel structure but the text on mine has always been perfectly acceptable. And I have owned many, many monitors of all types.
Regarding the lack of brightness compared to QLED sets, I do not run my set at anywhere close to full brightness outside of HDR scenarios, so that's a non-issue for me.
Would you say that rendering of text is better on the Q90R than the CX when used as a normal PC (i.e. not for gaming)? Being an LCD it should probably be more suitable as a "text monitor" for Windows, even though VA panels have traditionally not been that great for this compared with IPS. OLEDs haven't been that great for this either, to be honest.

Honestly, I can't tell a difference. I've never really sat close enough to the TV to notice fine details in text rendering. If I had to guess, I'd say they're pretty much identical: Q90R = RGB, CX = WRGB. Because they both have all three color subpixels in each pixel, it should be basically identical.
I would tilt towards the CX being ever so slightly better simply because each of the pixels are self-lit, and because if the text is "White" on the CX, only the White subpixel is lit, while on the Q90R, White = RGB subpixels.
Thanks. Thing is that RGB should, at least in theory, work better with Windows and ClearType. Or at least that's the impression I have, but as seen in this thread there is some confusion surrounding this. I have also seen that due to the new wide-angle filter the pixels are more blurry on the new Samsungs. This video shows it in more detail (at the end, but the whole video is really interesting, I think, so worth watching all of it).
Yea, I remember seeing this video about a Q900 vs C9. Honestly, I think they were more excited about the OLED TV being so thin than the actual picture quality.