LG 48CX

What madpistol said. I keep my screen in SDR mode with a very low OLED light setting for general work and SDR gaming (~25-30). It only enters HDR mode when HDR content is displayed in my media player software or when a game turns it on, and then I let the brightness crank up. I fully realize that HDR mode should be allowed to take the pixels to 100%, but I only use my screen in a dim office, so I have set my HDR max to 80% OLED light... the HDR effect is still great compared to any consumer HDR monitor. It really is. These screens aren't about getting 1000 nits in HDR... they're about everything else. The image is so ridiculously good compared to ANYTHING else on the market that having the absolute brightest pop on a bright HDR scene just doesn't seem that important.

Sure, someday maybe we can have it all on brightness, but this is by far good enough for now. Every single evening when I get home and turn it on, I'm instantly struck by how much better it is than EVERY other top end IPS or VA panel I've ever used. PC or TV... it's just a world better.

And that makes me want to baby it just a LITTLE BIT because I want to get a bunch of years out of it before having to live with any burn in.

So I would also recommend:

Use dark mode religiously on all your common websites, set the taskbar to auto-hide, and set it to a dark grey color.

It really doesn't hurt to have a second monitor if you need to keep static info up for many hours on end or if you have to run something with solid white backgrounds all day long. I'm personally not willing to run Word or Excel or Fusion 360 full screen all day long at static full white on my CX. I think you could for several years at a low brightness. But it's personal choice. I put those on the secondary monitor for the mundane work. But I do browse and youtube and game etc. on it all night.

Just keep the remote in front of you and TURN IT OFF when you know you are walking away for a long period of time.
 
I suggest TranslucentTB for the taskbar. With a black desktop background, the only thing visible when the taskbar is hidden is the little slivers of color for the active app/window.
 
Ok I did update the firmware and then shut the wifi off on it, and some of these things don't seem to be in the same menus, names, or order anymore.

48C1 settings:
brightness 30, oled brightness 85
set the HDMI I'm using to PC
have "Game optimizer" on
under game optimizer I turned on 'reduce blue' to level 1, it scorches my eyes otherwise
edited the quick menu to have "Screen Off" and "Sleep Timer" on there.
"Adjust Logo Brightness" is set to low

Windows stuff:
made sure I've got Freesync on
screen set to turn off after 45min (screen saver didn't show setting??)
black desktop, will check out that taskbar mod
have been using dark mode on many things

Haven't needed Cleartype yet, but I might if I muck with Win10 UI scaling again.

and I have been just turning it off when I walk away for more than a few min
 
Last edited:
Haven't needed Cleartype yet
Cleartype is enabled by default... which is the exact problem - Cleartype expects a traditional RGB or BGR sub-pixel arrangement and doesn't know how to account for the white sub-pixel on LG's WOLED panels.

For this reason, setting it to "greyscale" or turning it off altogether can make text look better and/or clearer.


and I have been just turning it off when I walk away for more than a few min
It's my understanding that, in a multi-monitor configuration, turning off the TV the traditional way causes it to disappear from the OS, which can mess up the placement of open programs and such. That's why I mentioned the "turning off only the screen" ability that keeps the display visible to the OS.
 
Strange thing, nothing changes when I turn it off. It resets window positions when I turn it back on.
 
Thanks guys - if the C1 is just a minor step from the CX, I can see why this thread has lasted so long.

Any immediate tricks or tips for using it as a pure monitor? No sound, no switching, no apps.

screen savers / avoiding static content where possible

https://hardforum.com/threads/lg-48cx.1991077/post-1044971422

----
https://hardforum.com/threads/lg-48cx.1991077/post-1045026138

https://www.reddit.com/r/OLED/comments/j0mia1/quick_tip_for_a_fast_way_to_turn_off_the_screen/

http://www.itsamples.com/taskbar-hider.html

----------------------
View distance vs. text quality and AA

https://hardforum.com/threads/lg-48cx.1991077/post-1044972495

---
93 PPI (48" 4K) viewed at 33.5" = 163 PPI (27" 4K) viewed at 18.85".

Same sharpness and density in pixels per degree on your eyeballs. Sitting farther back, the density per degree is even higher.

So he, and some other people who complain about the PPI (and I suspect about the text subsampling to a large degree), are almost certainly sitting too close, perhaps much too close, for this screen.
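If you want to sanity-check that equivalence, pixels per degree is easy to compute. A rough sketch (the sizes and distances are just the figures quoted above):

```python
import math

def ppd(ppi, distance_in):
    """Approximate pixels per degree of visual angle.
    One degree of arc spans about 2 * d * tan(0.5 deg) inches at distance d."""
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

print(round(ppd(93, 33.5)))    # 48" 4K (~93 PPI) viewed at 33.5"   -> ~54 PPD
print(round(ppd(163, 18.85)))  # 27" 4K (~163 PPI) viewed at 18.85" -> ~54 PPD
```

Both land around 54 pixels per degree, so at those distances the angular pixel density is the same; sit much closer than that and the 48" panel's lower PPI (and the WOLED subpixel quirks) start to show.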
 
Last night was the first time I've ever seen the screen give me the message that it was about to start a pixel refresher routine - that if I saw a white line it was normal, that it would take a while, etc. I hit the OK button and let it run.

I don't know whether that activates based on the number of hours run (just to check in case), or by somehow measuring brightness degradation. If it's by hours the TV is on: I do leave it on, but with the emitters off via the "turn off the screen" trick part of the time, so it might be logging more hours on than actual hours of wear. I do play mostly HDR and fake-HDR ReShade-filtered games on it though, so HUDs could be degrading a little.

From what I've read, LG OLEDs reserve roughly the first 25% of brightness headroom to even out wear, so it should last a long time I think. I do have the BB warranty that covers burn-in if it ever came to that within 5 years.
 
Strange thing, nothing changes when I turn it off. It resets window positions when I turn it back on.

One of the latest Windows updates supposedly fixed the window-shuffling issue.
However, whenever you turn the screen off entirely to standby, you do lose eARC sound if you are using that method, so you'd have to switch to a different sound device while the screen was off if you still wanted to hear anything.

I use displayfusion pro which has a window position save function where you can save window positions to one or multiple named profiles.

I also set my regularly used apps to displayfusion window moving functions so they all have their own "home" position. I set those each to their own hotkey in displayfusion and link those hotkeys each to their own streamdeck button with a little icon for each app.
I cobbled together some DisplayFusion user scripts from different functions so that each time I hit an app's button it:
--checks whether the app is open; if not, it opens it and moves it to its set position.
--if it's open, it checks whether it's minimized.
--if it's open and not minimized, it minimizes it.
--if it's open and minimized, it restores it to the set position.

... so I can hit the same button a few times and get that app's window back to its home position, or minimize/restore it, basically (rough logic sketched below).
... I also have a window position profile button on the streamdeck set to the DisplayFusion window position profile hotkey - so I can simply shuffle them all back at once without having to use the scripted function buttons.
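(DisplayFusion's scripted functions are C#, but the toggle logic itself is simple. Here's a rough Python/pywin32 sketch of the same idea - the exe path, window title, and "home" coordinates are just placeholder examples, not anything from my actual setup.)

```python
# Rough sketch of the per-app toggle: launch if missing, restore+reposition
# if minimized, otherwise minimize. Requires pywin32 (win32gui/win32con).
import subprocess
import win32con
import win32gui

APP_EXE = r"C:\Windows\notepad.exe"   # placeholder example app
WINDOW_TITLE = "Untitled - Notepad"   # placeholder window title
HOME = (100, 100, 1200, 800)          # x, y, width, height "home" position

def toggle_app():
    hwnd = win32gui.FindWindow(None, WINDOW_TITLE)
    if not hwnd:
        # Not running: launch it. (A real script would wait for the window
        # to appear and then move it to HOME.)
        subprocess.Popen([APP_EXE])
        return
    if win32gui.IsIconic(hwnd):
        # Minimized: restore and snap back to the home position.
        win32gui.ShowWindow(hwnd, win32con.SW_RESTORE)
        win32gui.MoveWindow(hwnd, *HOME, True)
    else:
        # Visible: minimize it ("park" it out of the way).
        win32gui.ShowWindow(hwnd, win32con.SW_MINIMIZE)

if __name__ == "__main__":
    toggle_app()
```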

-------

Personally I use side monitors for static desktop/app material and reserve the oled as a media and gaming "stage". I use the "turn off the screen" without going into standby trick as a way to sort of "minimize" whatever I have paused on the OLED, or whenever I go afk since I tend to get sidetracked.
 
It gives that message at the 2000 hour mark for the deep/extensive compensation cycle.
 
For everyone new to the thread, this is a good place to post the link to the "Better ClearType Tuner" again. Setting ClearType to "greyscale" is the best overall compromise for the sub-pixel arrangement of the CX / C1. Turning ClearType off is not really an option; it looks far worse. Leaving it stock gives minor visible color halos around text.

https://github.com/bp2008/BetterClearTypeTuner/releases

If you are sitting at a more reasonable distance for a 48" screen (30+ inches) it will largely solve any complaints about text. If you keep it right in your face, nothing will completely solve it at this time as Microsoft doesn't support non-standard subpixel arrangements with cleartype.

I was totally satisfied after switching to greyscale.
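(For the curious: that tool is basically a friendly front end over the per-user font-smoothing settings. A minimal sketch of flipping them to greyscale from Python is below - it assumes the usual HKCU\Control Panel\Desktop values are the right ones to touch, and you may need to sign out/in, or just run the tuner, before every app picks the change up. The tool also exposes extra parameters like contrast that this sketch doesn't touch.)

```python
# Minimal sketch: switch Windows font smoothing to greyscale anti-aliasing
# by editing the per-user registry values. Value names/semantics assumed:
# FontSmoothingType 1 = standard (greyscale), 2 = ClearType (sub-pixel).
import winreg

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop",
                    0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "FontSmoothing", 0, winreg.REG_SZ, "2")       # smoothing on
    winreg.SetValueEx(key, "FontSmoothingType", 0, winreg.REG_DWORD, 1)  # 1 = greyscale

print("Done - sign out/in (or run a ClearType tuner) so apps re-read the setting.")
```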
 
Setting it to RGB literally fixed the text in Windows File Explorer, which was the only issue I was having with text. My LG C1 is now perfect. :D Thanks!
 

The crazy part about this screen is that it may "only" hit 750 nits in HDR, but it feels much brighter than that. That's because each pixel is self-lit, which means you can have insanely bright spots on the screen surrounded by total darkness. This is most apparent in a game like Doom Eternal; HDR is painful to look at on this screen because it is able to produce violently bright splashes of color instantly, then die back down; it doesn't give our eyes time to adjust, which makes the image unbelievably punchy. I have a Samsung Q90R which can hit 1500+ nits peak brightness in HDR, and believe me when I say it doesn't punch nearly as hard as the CX does in Doom Eternal.

People get obsessed about how many nits their HDR screen can put out... it's a useless measure. LG's OLED tech is the ONLY HDR experience anyone should be gaming with. The stupidly fast pixel response + per-pixel brightness control puts OLED on another level compared to its LCD counterparts. It's just that good.
 
Yeah, it's why I sold my PG32UQX. Even though it's capable of 1400 nits full-field, 90% of HDR content outside of high-APL scenarios looked better on the CX. Stuff with highlights, like the flamethrower in Metro, definitely had much more impact on the Asus, but under scrutiny that came at the cost of really bad surrounding light pollution (haloing) as well as overall worse image quality due to the huge difference in contrast.

Also, the first time I put up a 100% full-field white HDR slide, neither my wife nor I could look at the screen head-on. I really don't think that brightness level is healthy for your eyes, but thankfully not much content requires a sustained flash like that.
 

Yes - if you watch HDTVTest's HDR temperature-map footage of one of the newer Star Wars movies, you can see that most of the scene is within SDR range, and what is in HDR range is often no higher than ~700 nits in most places. He always says that in blind tests people prefer OLED HDR, since FALD has to play zones off against each other.


https://hardforum.com/threads/lg-48cx.1991077/post-1044646108




Not surprising. Vincent on HDTVTest often states that in order to render mixed/contrasted scenes, FALD displays have to dim or bloom ("halo") zones relative to each other, or one area would overwhelm the other. Therefore, outside of the brightest overall scenes (without darker areas, shadows, etc. mixed in) they can't hit their quoted peak numbers. They also lose detail where contrasted areas meet, when they offset their zones against each other.



The temperature map of an HDR 10,000-capable game here shows that only the sun, and its brightest reflection on the gun barrel, would be that bright. I don't think it would be worth the FALD zone issues just to see that highlight at more than 750-800 nits. I'm all for 10,000-nit HDR someday, at the per-pixel level, where it would mostly be things like the sun and its reflections off a gun barrel, because that would add realism. It would almost never be a full field of white, if ever. In fact, every TV's peak-nit spec is tied to a window size (a percentage of the screen) to begin with.
[attached image: in-game HDR temperature map]

Per-pixel emissive is way better overall. Ultimate black depth side by side with color at the pixel level beats trading down to LCD/FALD issues: worse black depth, worse side-by-side contrast, and detail lost where zones compete against each other. LCD also has slow response times and usually has uniformity issues (very bad ones if not FALD). Some of the newer LCD FALD TVs also use a viewing-angle filter that greatly reduces their contrast ratio and black depth.

Stuff with highlights like the flamethrower in Metro definitely had much more impact on the Asus

There is a lot of variance in the settings people use on their screens and in different games. By default, game mode on the LG CX is pretty low saturation. I've been using ReShade with a FakeHDR filter, a Lightroom filter, and a sharpness filter, which helps the rendered image. I also turned the game mode's color setting up in the TV's OSD to start with, before adjusting everything in ReShade. The result is hugely different from the default game mode's appearance for SDR games. Darksiders 3 is the only game I've been playing in SDR (with ReShade filters and increased OSD color) and I love how it looks now. The other two games I played and finished since I got the LG CX, Jedi: Fallen Order and Nioh 2, are HDR, so they have their own color metadata that is way brighter and more saturated than the default dull game mode. The next two games I have to play are AC: Odyssey and Immortals Fenyx Rising, which both have great HDR.
 

Yup I still have my Acer X27 1000 nit FALD display and I haven't had any desire to use it. Been sitting in the closet since December last year.
 
For everyone new to the thread, this is a good place to post the link to the "Better ClearType Tuner" again. Setting ClearType to "greyscale" is the best overall compromise for the sub-pixel arrangement of the CX / C1. Turning ClearType off is not really an option; it looks far worse. Leaving it stock gives minor visible color halos around text.

https://github.com/bp2008/BetterClearTypeTuner/releases

If you are sitting at a more reasonable distance for a 48" screen (30+ inches) it will largely solve any complaints about text. If you keep it right in your face, nothing will completely solve it at this time as Microsoft doesn't support non-standard subpixel arrangements with cleartype.

I was totally satisfied after switching to greyscale.
I'd experiment. Greyscale can have slightly weird effects on text in some apps. I find that with 125% scaling I prefer RGB with the contrast adjusted to something like 1600. I wish Windows had a greyscale font-smoothing option that looked like OSX without font smoothing; OSX looks perfectly fine to my eyes, whereas Windows is a bit of a compromise.
 
Out of curiosity, why do you ask? Are you wondering if it is not marketed as a TV?
For someone looking to import one into another country, it became a curiosity.

Also started wondering if they do label/market it as a TV. I suspected the CX might have a reference to it and that the C1 might have done away with it. Maybe someone at LG is enlightened.

Just search for LG C1 unboxing on youtube. I don't think it does.
Those unboxings were annoying while trying to find this - few if any of them thought to show off the box, as far as I could tell.
 
The crazy part about this screen is that it may "only" hit 750 nits in HDR, but it feels much brighter than that. That's because each pixel is self-lit, which means you can have insanely bright spots on the screen surrounded by total darkness. This is most apparent in a game like Doom Eternal; HDR is painful to look at on this screen because it is able to produce violently bright splashes of color instantly, then die back down; it doesn't give our eyes time to adjust, which makes the image unbelievably punchy. I have a Samsung Q90R which can hit 1500+ nits peak brightness in HDR, and believe me when I say it doesn't punch nearly as hard as the CX does in Doom Eternal.

People get obsessed about how many nits their HDR screen can put out... it's a useless measure. LG's OLED tech is the ONLY HDR experience anyone should be gaming with. The stupidly fast pixel response + per-pixel brightness control puts OLED on another level compared to its LCD counterparts. It's just that good.
Try explaining this to wizz33.
 
  • Keep display brightness as low as possible for longevity. (obviously, crank it up for HDR)
  • Set Screen Shift On
  • Set Logo Luminance Adjustment to Low.
  • Set Game Mode On (for fastest response)
  • On the Home Dashboard, set the HDMI port you have your PC plugged into as "PC" (it does change some settings).
  • Set your desktop background to flat black
  • Set a Screen Saver to 5 minutes.
  • Set Automatic Screen off (in Windows) to 1 hour or less.
  • Put a bucket under your mouth to catch the drool.
Automatic screen off in Windows makes no sense since it will put the TV in gallery mode.
Better off with just the blank Windows screen saver.
 
I wouldn't trust any of those screensavers, because they can crash and app crashes/notifications can take the top layer. Rare but possible - the OS can crash and get stuck on the BIOS screen on reboot, the GPU can crash leaving artifacts/noise on screen, etc.

I use the "turn off the screen" feature via two clicks on the remote, or by saying it into the mic on the remote. To the system the screen is still on and functioning, not in standby. Sound keeps running if you don't mute it. I do have a screensaver on a timer, but it effectively only kicks in on my side screens, because any time it would trigger, my OLED is already "minimized" in "turn off the screen" mode. Any time I'm not actively using the OLED, and any time I walk away, I "minimize" it to emitters-off with 'turn off the screen'. Hit the wheel on the remote and it turns back on in a fraction of a second.

https://www.reddit.com/r/OLED/comments/j0mia1/quick_tip_for_a_fast_way_to_turn_off_the_screen/
 
Automatic screen off in Windows makes no sense since it will put the TV in gallery mode.
Better off with just the blank Windows screen saver.
You can disable the gallery mode, which just leaves you with a bouncing fireworks animation and "no signal" text. I wish you could disable that too, but I expect it's still less wear than the bright white gallery thing.
 
It would be great if I could find a way to do the "minimize the screen" trick and switch picture modes, inputs, etc. from a Windows app hotkey mapped to a Stream Deck key, but I don't think it can be done, or at least not easily. If the LG app were on Windows and had hotkeys for input, picture mode, and the screen-emitters-off mode, that would be very handy.

It would be neat if LG ever expanded the Universal Control feature (which controls other devices from the TV) to a windows app connecting via wifi, then included the TV itself in the devices that could be controlled.

There might be some emitter I could buy for windows that could clone the LG Magic remote's signals though. I'll have to look into that.
Instead of infrared (IR), the LG Magic Remote uses radio frequencies (RF) or Bluetooth to communicate directly to the TV
Pretty sure it uses bluetooth mainly if not entirely.

Maybe windows 11's compatibility with phone apps could come into play with the LG remote app too.

Using the remote isn't that bad, I just prefer using my streamdeck keys for everything (via hotkeys mapped to it or via streamdeck apps). I leave the LG Magic remote in between my split keyboard halves so it's always accessible.
 
You can disable the gallery mode, which just leaves you with a bouncing fireworks animation and "no signal" text. I wish you could disable that too, but I expect it's still less wear than the bright white gallery thing.
Really?! I need to do that. How? I only found that you can press the mute button a few times to disable the no-signal text box.
 
At least on the C9/CX it's "No signal image" off in General settings.
On an E9 with that setting turned off, the fireworks never show up and this is all that appears (apologies for vertical video syndrome, the TV is actually owned by a family friend):
 
On an E9 with that setting turned off, the fireworks never show up and this is all that appears (apologies for vertical video syndrome, the TV is actually owned by a family friend):
I think they have changed it a bit on the CX; my C9 looks the same, with the no-signal text moving around the screen. I wish you could disable that too so it would be a blank screen, but unfortunately that doesn't seem to be an option.
 
Howdy folks, I'm a 55 inch C7 owner that used it as my primary computer monitor for 2 years, it's in my living room now. Good news, no burn in!

Eventually I went back to ultra high refresh rate monitors because I play a lot of Battle Royale games and high input lag with 60hz is a deal breaker. I still miss the OLED picture quality every day.

I'm considering a 48C1 or the 42 inch that LG has in production.

I have a couple questions about BFI on the C1 that I'm hoping others can clarify:

42 inch @ 1440P with No Scaling in NVCP would be 31.5 inches with black bars on all sides. That would be a much more manageable size for Warzone I think and would allow me to sit much closer. I'm putting the 48C1 or 42C2 on a monitor arm for easy repositioning.

If I'm running 1440p or 4K with BFI at 120Hz on High or Medium in Game Mode to get the 5ms input lag, then I can't run the PC input mode, right? This is annoying because I would like chroma 4:4:4 and BFI on High or Medium. I thought maybe a firmware update might resolve this?

My understanding of the "Prevent Input Delay" feature is that setting it to "Boost" only benefits 60z signals, is that correct?

Rtings says: "There's a new setting for 2021 models found in the Game Optimizer menu, called Prevent Input Delay. There are two options: 'Standard' and 'Boost'. We ran several input lag tests and found that the 'Boost' setting consistently lowers the input lag by about 3ms when the TV is running 60Hz compared to the LG CX OLED. It works by sending a 120Hz signal to refresh the screen more frequently, so it does not affect 120Hz input lag. The published results are what we measured using the 'Boost' setting. On 'Standard', we measured 13.1ms for 1080p @ 60Hz, 13.4ms for 1440p @ 60Hz, and 13.0ms for 4k @ 60Hz."
Rtings says: "In 'Game Optimizer' mode, the settings are the same, but note that you can't enable BFI if you have Prevent Input Delay set to 'Boost'. Unfortunately, you can't enable BFI in 'PC' mode."

I literally never use gsync and I only play Battleroyale games on low-ish settings to avoid any frame rate drops. I haven't used HDR for gaming in years because single player mostly bores me. The last HDR game I played was Shadow of War and it was both beautiful and fun. I have a 3080.

I'm coming from an Asus 280Hz TUF VG279QM; it's got shit picture quality but 280Hz is nice, and it has 3ms input lag, which would be very comparable to the 5ms on the LG C1. 120Hz OLED with BFI on Medium or High should give comparable motion clarity to what I have now, if not better, I think.

My use case for this display will be 80% battle royale games and other competitive FPS on mouse and key. I also do an hour of Kovaaks aim training every day. I wish this wasn't necessary but being 40 years old sucks.

Some of you guys are using a keyboard tray that would be barely large enough to support my mouse and I'd have no space for a keyboard at all. No clue how people game at high sensitivity like that!

This guy's video makes me think 4K @ 120Hz with BFI on High is going to be amazing! 4:22 has the comparison to the CX.

 
On the CX BFI and PC mode work together just fine. I still think it's the wrong display for your use cases and a fast LCD is more straightforward to use.
 



I'm just the opposite. I played online shooters starting in the dial-up days, but I haven't played arena shooters in years because they're very boring to me - like a demolition derby in a few small arenas over and over instead of a game with more content. Rock'em Sock'em Robots, if you will. That, and the fact that cheating is rampant in online games:
...Hundreds of thousands of cheaters have been caught - probably millions of players cheating, even counting the less obvious intermittent assist cheats (temporary wall-hack spotting, reticule/target-assist aimbots when near a hitbox "gravity well"~"bullet gravity", etc.). Over 420k banned on Apex; Overwatch and Destiny cheating up 50% since January; Call of Duty Warzone banned over 30,000 cheaters in a single wave of bans, and CoD WZ is at least 475,000 bans total. In 2018 over 13 million PUBG accounts were banned, and at the end of 2019 PUBG confirmed it was banning around 100,000 accounts per week. Several high-ranked personalities have been caught cheating on stream, and that's just the ones who were caught (I believe a few other personalities were even caught cheating during competition at esports LAN matches). The Chinese government recently raided a whole cheat-engine subscription service for à-la-carte abilities/cheats and arrested the people running it (one $76 million cheat-engine organization out of what's possibly a near $300 million cheat-service market across all of China). There are plenty of other cheats and cheat services online if you look. On top of all that, netcode in a game picks one side of a coin or the other when resolving near-simultaneous timings, which means even a slightly better ping is a big advantage. Also, tick rates in most games suck - tick-rate timing is actually the tick plus the interp time. Even the best online servers are interpolating server action states at around 15ms ~ 65Hz no matter what your local fps+Hz is, and most of them are wayyy worse. For example (see the quick check below these numbers):

Valorant: 128tick
Specific paid matchmaking services like ESEA: 128tick
CSGO ("normal" servers): 64tick
Overwatch: 63
Fortnite: 60 (I think, used to be 20 or 30)
COD Modern Warfare mp lobbies: 22tick
COD Modern Warfare custom lobbies: 12tick

128tick at interp_2 = .015 seconds
64 tick at interp_2 = .031 seconds
22 tick at interp_2 = .090 seconds
12 tick at interp_2 = .166 seconds
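(A quick sanity check of those interp figures, assuming "interp_2" means the client interpolates across two ticks' worth of time:)

```python
# Back-of-the-envelope check of the tick-rate figures above:
# built-in interpolation delay = interp ticks / tick rate.
TICK_RATES = {
    "Valorant / ESEA": 128,
    "CS:GO matchmaking": 64,
    "Overwatch": 63,
    "Fortnite": 60,
    "CoD MW public lobbies": 22,
    "CoD MW custom lobbies": 12,
}

INTERP_TICKS = 2  # "interp_2": two ticks of interpolation buffering

for game, tick in TICK_RATES.items():
    delay_ms = INTERP_TICKS / tick * 1000
    print(f"{game:22s} {tick:3d} tick -> ~{delay_ms:.0f} ms of interp delay")
```

So even a 128-tick server carries ~15ms of built-in delay before your ping or display lag enters the picture, which is the point being made above.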

/end rant

=============================================================

HDR on this screen is glorious in a well-done game. Even using the FakeHDR + Lightroom filters in ReShade on an SDR game looks great compared to anything non-HDR. That monitor you referenced has a 1,186:1 contrast ratio with the accompanying grey black depth and no local dimming, so it's just edge-lit "flashlights" from the sides. I'd never go back to something like that personally.

So this is a gorgeous goddess of a display.. supernatural beauty (infinite black depth, infinite-to-1 contrast, high color volume, HDR ~800), grace in movement (120Hz, VRR, minuscule response times), and per-pixel fidelity at 4K.. and you want a fast rooster-rat you've shaved the hair and plucked the feathers off of (low graphics settings) to put in a rat-race cockfight arena, from the sound of it. :inpain: ... So I have to agree with kasakka that you might want to look at something else. Though there are some high-Hz miniLED FALD LCDs coming out now at 144Hz or more that might be better all-rounders for people playing games that average over 300fps (in order to get appreciable benefits out of 240Hz+).. which isn't likely at 4K unless you are playing something with very weak demands/graphics settings (or stripping all of the graphics for some scoring/ladder advantage). It's hard enough to get a 100fps+ average on many top games at high-to-near-ultra settings on a 120Hz 4K screen, and that's not even with ray tracing on. The graphics are getting better and better with HDR, 4K, DLSS, VRR, even ray tracing if you have the wiggle room, etc.. I can't imagine (exaggerating) running a black-and-white 720p TV to get scoring advantages in some demolition derby, but that's me. :D👍

Worth noting that the best-use scenario for BFI is matching or exceeding the peak Hz with your frame rate (over a 120fps minimum frame rate on the LG CX). If you are getting 120 or 144 fps minimum on a very fast-response 120Hz/144Hz screen, you will already be getting a 50% to 60% sample-and-hold blur reduction even without BFI. OLED response times are extremely fast too, so there won't be any ghosting or anything muddying the clarity increase like there is on LCDs, so the results will probably be marginally clearer than the same fps+Hz on an LCD.

In this graphic you can substitute for the 1000 px/sec case by matching the fps values to the same Hz (e.g. 480fps at 480Hz). The 1000 figure is just proving a point about where you'd need to be for "zero" blur: 1000fps at 1000px/sec.
[image: Blur Busters motion-blur vs. frame rate chart]

~8 pixels of blur with a sharp OLED response-time cutoff at 120fps+Hz is still not bad. BFI without crosstalk at a solid/minimum 120fps is probably like (EDIT) 240fps at 240Hz blur-reduction-wise (~4 pixels), but it has other tradeoffs:
--the frame-rate minimum requirement (and the accompanying loss of graphics and FX quality)
--dimmer / less luminous colored screen
--no HDR
--no VRR
--can be eye-fatiguing/headache-inducing over time even if you can't "see it" (sort of like PWM in that respect, even if BFI has uniform insertion).
--If I'm not mistaken it also introduces a little more input lag. I remember something like 13ms with BFI off vs 21ms with it on at 120fps/Hz, but that might have been the C9. The C1 might have different numbers than the CX too.



My understanding of the "Prevent Input Delay" feature is that setting it to "Boost" only benefits 60z signals, is that correct?
Vincent of HDTVTest did a video that referenced it.
If you are playing a 60fps game, or running a GPU that only does 60Hz to the TV, it might still be useful. Vincent said it looks like it's doubling the frames for a slight input-lag reduction on 60fps content (12.5ms vs 9.6ms) and is not useful at 120fps. In fact, I think he said it makes the input lag worse at 120fps, so you should still use Standard mode for that.
 
https://www.resetera.com/threads/re...ame-insertion-on-and-off.332765/post-52471061

Unfortunately the CX has very limited options for BFI.
With its 120Hz BFI support, it should be able to offer 60Hz BFI with three levels of persistence.
Instead, the Auto/Low/Medium settings run at 120Hz, and only the High setting runs at 60Hz.

The display should be capable of the following combinations with a 60Hz source, where 1 is the source frame, and 0 is a black frame.

  • 1111 - 60Hz, no BFI. 16.67ms persistence, full brightness, no flicker. Lots of motion blur.
  • 1110 - 60Hz 1/4 BFI. 12.5ms persistence, 3/4 brightness, high flicker. Slightly reduced motion blur, smooth motion.
  • 1100 - 60Hz 1/2 BFI. 8.33ms persistence, 1/2 brightness, high flicker. Moderately reduced motion blur, smoother motion.
  • 1000 - 60Hz 3/4 BFI. 4.17ms persistence, 1/4 brightness, high flicker. Highly reduced motion blur, very smooth motion.
  • 1010 - 120Hz 1/2 BFI. 8.33ms persistence, 1/2 brightness, moderate flicker. Moderately reduced motion blur, double-images/judder in motion.
The highlighted options (the 60Hz 3/4 BFI and 120Hz 1/2 BFI rows) are the only ones available on the CX.
Without having one of these displays myself, I couldn't tell you what it's doing to differentiate Low/Medium BFI; only that it's running both at 120Hz like the last example.
The C9's BFI option is equivalent to 1/2 BFI rather than 3/4 - so it's brighter at 60Hz, but has less-clear/smooth motion than the CX.

Unfortunately BFI modes on most modern displays really highlight just how good CRTs were.
CRTs were able to pull off <1ms persistence while still remaining at roughly 100 nits brightness, which is the spec for SDR.
No other display using BFI that has options for sub-1ms persistence is anywhere close to reaching 100 nits brightness.


pswii60 said:
BFI doesn't support VRR which is a bit pants. And motion blur at 120hz is already massively reduced over 60hz even without BFI, so I'd rather take the brighter imager and lower latency at that point.
At 120Hz the brightness only drops by 1/2 rather than 3/4 to achieve the same 4.17ms persistence, so it's far more usable.
VRR is fundamentally incompatible with most BFI implementations, and there are a whole lot of technical challenges that would have to be overcome to make it work. I know ASUS have their ELMB-Sync monitors, but there's a reason that does not work well.

Correct me if I'm wrong, but I think according to that and other comments..
50% BFI (resulting in moderate flicker/crosstalk and a 50% brightness reduction) at 120fps+120Hz on the CX is 4.17ms persistence (half of the 120fps/Hz BFI-off persistence), which would be more like 240fps at 240Hz in the previous graphic.

That is:

(compare to a baseline of 60fps at 60Hz = 16.7ms ~ 17 pixels of motion blur)

8 pixels of motion blur at 120fpsHz with BFI off
vs.
4 pixels of motion blur at 120fpsHz with BFI on medium (and with the accompanying BFI tradeoffs)
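(The arithmetic behind those numbers is just persistence: blur width ≈ motion speed × how long each frame stays lit. A rough sketch, using the 1000 px/sec Blur Busters reference speed:)

```python
def blur_px(speed_px_per_sec, fps, lit_fraction=1.0):
    """Approximate sample-and-hold motion blur width in pixels.
    lit_fraction = portion of each frame the pixels stay lit
    (1.0 = no BFI, 0.5 = half-frame BFI, etc.)."""
    frame_time_s = 1.0 / fps
    return speed_px_per_sec * frame_time_s * lit_fraction

SPEED = 1000  # px/sec, the Blur Busters reference motion speed

print(blur_px(SPEED, 60))        # ~16.7 px - 60fps/Hz baseline, no BFI
print(blur_px(SPEED, 120))       # ~8.3 px  - 120fps/Hz, no BFI
print(blur_px(SPEED, 120, 0.5))  # ~4.2 px  - 120fps/Hz with 50% BFI (like 240fps/Hz)
```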



Generally, BFI tech dims the screen the same % it reduces the blur. So at 50% reduction, you get a 50% less luminous screen.



When objects move at speed, or when you move the FoV/game world around at speed, you'd get this kind of blur.
Outside of those maximum blur amounts, it varies mostly with how fast you are moving the viewport at any given time (or whether it's moving at all), and how fast other objects are moving on screen. So it's really zero (static) "up to" those peaks in each slot.

from https://blurbusters.com/ (edited by me showing the 240fpsHz darker to suggest BFI dimming)
[image: Blur Busters motion-blur comparison, with the 240fps/Hz example darkened to suggest BFI dimming]

1ms / "zero" blur of CRT or 1000fps at 1000Hz



--------------------------

All of those 120Hz blur reductions listed rely on 120fps minimums, not averages - something often omitted from comments that cite 120Hz alone as cutting sample-and-hold blur (like the comment by pswii60 in the quote above).

I shoot for at least 100fps average when possible.. relying on VRR to ride the roller coaster of variable frame rates and Hz. Which is typically something like:

(70) 85 fpsHz <<<< 100 fpsHz >>>> 115fpsHz (117 capped)

which results in something like:

~ 13.3px <<<<< 10 px >>> ~ 8.5 px of motion blur


Of course a 120fps minimum would be less, at ~8px of blur all of the time, but VRR lets you trade that off for much higher graphics settings while staying smooth, and you still gain a lot of motion definition plus some motion clarity/blur reduction. Combined with quality DLSS in a game that supports it well, the graphics-settings ceiling and/or frame rates can be higher too.
 



I would recommend trying the Aorus FI32U, which has BFI + VRR between 100-144Hz. It's a 32-inch 4K 144Hz IPS panel, and those who own the monitor say the BFI + VRR works pretty well, although I guess you can also wait for professional reviews before deciding.
 

I wish I didn't love competition so much, because the cheaters in Warzone are driving me insane! I wish I liked story-driven content, but it bores me to tears. I've tried dozens of story-focused games that have stellar reviews; I make it about three hours in, get bored, and quit. Even games like The Witcher and RDR... they just aren't competitive enough.

I agree that Arena shooters are boring, I quit FPS for years and then Battle royales happened. It pulled me back in like nothing I've ever tried other than WoW. Playing with a quality 4 man squad in Warzone is the most fun game I've played in 10 years, it's also the most frustrating.

The only other games I genuinely enjoy are strategy games. I'm a lifelong XCOM addict and I also enjoy Diablo-type games. As long as the game is light on story and deep on replayability, I'm interested. Recently I started playing Rocket League, which is entirely server-hosted so there are no cheaters, and that game is bloody awesome!

I'm not planning on playing FPS games at 4k. I'm not planning on using Gsync, VRR, or HDR. Raytracing is literally a joke IMHO.

It's actually pretty easy to run Fortnite, Warzone, and Apex Legends at higher than 120 fps at all times and that's where BFI is a huge plus. Graphics fidelity means nothing in those titles. I want OLED because the motion resolution is so good and the colors are so clear that spotting people will be easier.

My Warzone addiction has convinced me to do the following: Work out regularly to improve my reaction time, drink less booze, smoke less pot, aim train daily, upgrade my PC repeatedly, learn a fuck-load about Ram overclocking and go to bed earlier. All to be marginally better at a game that I will never truly master because I'm 40 years old. The hilarious thing is that now I'm in the best shape of my life and my yoga practice is really evolving.

I'm not looking for casual games... I WANT to be obsessive. Golf was too hard, video games are less agonizing.
 
I would recommend trying the Aorus FI32U, which has BFI + VRR between 100-144Hz. It's a 32-inch 4K 144Hz IPS panel, and those who own the monitor say the BFI + VRR works pretty well, although I guess you can also wait for professional reviews before deciding.

I don't understand why people want VRR. I've had gsync for years, there is literally zero point to it if you run games for max fps.
 
I don't understand why people want VRR. I've had gsync for years, there is literally zero point to it if you run games for max fps.
I have had GSYNC for the last 4 years, and now I can’t use a display without it, both for shooters and other games. Seeing tearing on screen or feeling the frame lag of vsync just drives me nuts.
 
I don't understand why people want VRR. I've had gsync for years, there is literally zero point to it if you run games for max fps.

The point of the VRR is that you wanted BFI to work at refresh rates other than a fixed 144 or 120. In the case of the Aorus, BFI will work at any refresh rate between 100-144Hz; that was the entire point of mentioning the VRR. Unless you absolutely need BFI to work under 100Hz, then yeah, the Aorus won't suit you.
 