LG 48CX

I dunno how someone can be so dead set on BFI and its motion clarity advantage but then have 0 visual perception of the advantage of VRR.

If I had a choice between BFI + tearing or Vsync stutter/input lag vs no BFI + VRR, I'd take no BFI + VRR all day as would anyone with working eyeballs.

Going back to Vsync, or worse yet no Vsync, is super jarring even at 144Hz; it looks like constant stutter.

https://www.testufo.com/vrr
 
He's running games at low settings, which I compared to plucking a rooster for a cockfight instead of flying with angels.. or to playing on a 720p black-and-white screen that makes him think he can aim better for some reason. He doesn't care if it's ugly or primitive looking as long as he gets higher frame rates; he probably doesn't care about lighting, foliage, weather, texture detail, worlds, or architecture, since it sounds like he's just going for the most stripped-down settings he can, as long as his minimum fps stays over the peak Hz of his screen.

The thing is, no matter what he does, not only are there rampant cheaters in the games he mentioned (including stealthy, low-key cheating methods used just to get an edge into the peak ladder tiers and/or to help carry teammates), but the servers' tick rates are way lower than 120Hz of game-world updates, let alone 240 or 360.

------------
Fortnite

I think Fortnite's tick rate is 30Hz, which means double that interval with interp ratio 2 in normal gameplay (unless you want to suffer a huge ~250ms penalty on any latency spike).. so that means 66.6ms between each server update to your machine. Even if it were 33.3ms it would be pretty bad compared to the local 120fps/Hz ~8.3ms of world states (or 240fps/Hz ~4.2ms).
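For a rough sense of the gap, here's a minimal sketch of that tick-rate math (assuming the tick rates quoted in this thread and an interp ratio of 2; none of these numbers are official):

```python
# Rough comparison of server update interval vs. local frame interval.
# Assumptions: tick rates as quoted in this thread (not official figures),
# and an interp ratio of 2 doubling the effective delay between displayed states.

def server_interval_ms(tick_rate_hz: float, interp_ratio: int = 2) -> float:
    """Milliseconds between game-world states the client actually displays."""
    return interp_ratio * (1000.0 / tick_rate_hz)

def local_frame_ms(fps: float) -> float:
    """Milliseconds between locally rendered frames."""
    return 1000.0 / fps

for game, tick in [("Fortnite (30 tick)", 30), ("Warzone (20 tick)", 20), ("Valorant (128 tick)", 128)]:
    print(f"{game}: ~{server_interval_ms(tick):.1f}ms between displayed server states")

for fps in (120, 240):
    print(f"Local {fps}fps/Hz: {local_frame_ms(fps):.1f}ms per frame")
# 30 tick with interp 2 -> ~66.7ms, vs 8.3ms locally at 120fps/Hz
```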

Fortnite players cheat the most of any online multiplayer game, with over 26,822,000 searches for hacks

https://us.blastingnews.com/gaming/...fter-fncs-cheating-controversy-003096277.html

https://fortnitetracker.com/article/1236/fortnite-cheating-crisis-reaches-new-highs

-----------------------
Call of Duty Warzone

Looks like Call of Duty Warzone's tick rate is even lower, at around 20Hz. 1000ms / 20 = 50ms, ×2 for interp ratio 2 ~> 100ms between each update on your machine, even when you are running 120fps/Hz (8.3ms per frame) or 240fps/Hz (4.2ms).

https://www.forbes.com/sites/paulta...rzone-has-now-banned-half-a-million-cheaters/


------------------------------

Apex Legends, surprisingly, is also a 20Hz tick rate! (For reference again, Valorant is 128 tick.) That makes it the same scenario as COD:WZ above in regard to how many action-state dots your machine is connecting along the dotted line, so to speak, as far as the server is concerned.

March 2021
Hundreds of high ranked Apex Legends players just got banned for cheating

April 2021
Apex Legends Devs Looking Into Compensation For Losing To Cheaters

Feb 2021
https://charlieintel.com/apex-legends-cheaters-hit-by-huge-ban-wave-with-more-to-come/85678/


-----------------------------------------------



People typically want VRR because they want games to look better using frame rate averages instead of locking to frame rate minimums. But it also helps compensate for judder/stutter frame rate "potholes" that can happen randomly in a frame rate graph for whatever reason (occasional game engine glitches, rendering pipeline hiccups, assets loading, whatever).

As I said in my previous post, I think BFI actually increases input lag somewhat, and it has some flicker and darkens the screen more the higher you set it. So low BFI is pretty much issue-free, but the gain is so small that it's useless, and high BFI is too aggressive and dark and results in flicker/artifacts. Medium can still be eye-fatiguing, darkens the screen's color luminance by 50%, adds input lag, disallows VRR, and is incompatible with HDR color volume --- all for 4px of blur at the fastest motion vs 8px of blur during the fastest motion or movement of the FoV (comparing a 120fps/Hz minimum with BFI on at 4px -vs- a 120fps/Hz minimum with BFI off at 8px).
 
Yeah, the only usable setting on LG OLEDs is the low one, but if I had to quantify it, it's like a 20% improvement to my eye. Everything else sacrifices too much brightness and, for me personally, makes flicker visible.
 
Flicker was only visible at 60Hz. At 120Hz I didn't notice any flickering even with BFI set to High. And Vsync shouldn't stutter as long as you are able to maintain the frame rate; the PS5 doesn't support VRR yet and just runs at 60Hz Vsync, but the games are still butter smooth.
 
Love my CX48 for gaming, but do you guys think the upcoming Samsung G9 Neo will beat it?

2000 nits brightness, miniLED with 2048 zones, fast pixel response (Samsung actually makes fast VA panels, unlike the rest of the garbage VA panels on the market).

The only limiting factor is that it's vertically tiny!! Why do they not make 32" 16:9 4K panels, or a 42" 4K, with this kind of panel technology?
 
Love my CX48 for gaming, but do you guys think the upcoming Samsung G9 Neo will beat it?

2000 nits brightness, miniLED with 2048 zones, fast pixel response (Samsung actually makes fast VA panels, unlike the rest of the garbage VA panels on the market).

The only limiting factor is that it's vertically tiny!! Why do they not make 32" 16:9 4K panels, or a 42" 4K, with this kind of panel technology?
Absolutely not. It's still LCD tech, so it's going to have worse response times, VA-panel viewing-angle issues, etc.

But I think it might turn out to be a very good compromise if it performs as well as the G9/G7 models. I really like the 5120x1440 form factor, though I would love if they made it slightly taller or alternatively higher res.

7680x2160 is just very heavy to run so I don't expect to see that anytime soon, as cool as that would be.
 
Absolutely not. It's still LCD tech, so it's going to have worse response times, VA-panel viewing-angle issues, etc.

But I think it might turn out to be a very good compromise if it performs as well as the G9/G7 models. I really like the 5120x1440 form factor, though I would love if they made it slightly taller or alternatively higher res.

7680x2160 is just very heavy to run so I don't expect to see that anytime soon, as cool as that would be.
And it's still going to be around the $3K price range or more once it hits the US like the PG32UQX.
 
Love my CX48 for gaming, but do you guys think the upcoming Samsung G9 Neo will beat it?

2000 nits brightness, miniLED with 2048 zones, fast pixel response (Samsung actually makes fast VA panels, unlike the rest of the garbage VA panels on the market).

The only limiting factor is that it's vertically tiny!! Why do they not make 32" 16:9 4K panels, or a 42" 4K, with this kind of panel technology?
It's still limited by backlighting. I have yet to find a backlighting tech that can even get close to OLED's per pixel brightness control.
 
Flicker was only visible at 60Hz. At 120Hz I didn't notice any flickering even with BFI set to High. And Vsync shouldn't stutter as long as you are able to maintain the frame rate; the PS5 doesn't support VRR yet and just runs at 60Hz Vsync, but the games are still butter smooth.

BFI can be eye fatiguing even if you don't "see" the flicker. Kind of like PWM (even if BFI is uniform where PWM is more aggressive, because its pulse time varies).

BFI is, I think, a 75% reduction.... 120fps solid/minimum at 120Hz would already cut sample-and-hold blur down to ~8.3ms of persistence (roughly 8px of blur at speed, and it would look somewhat tighter on an OLED at that rate vs an LCD's poorer response times). So BFI's 75% reduction could go down to ~2px of blur during FoV movement at speed, for example, where it would really show. ~2px of blur is very tight, but at a 75% loss in color luminance/screen brightness. Also no VRR, no HDR obviously, and at much lower graphics settings to maintain a 120fps minimum.

If that works for you with those trade-offs, go for it, but I'll take, when I can get it:

(100fps average):

13px << 10px >> 8px of blur
at
85fps << 100fps >> 117fps

.................

or (120fps average):

~10px << 8px >> 8px of blur
at
105fps << 117fps >> 117fps (capped)


and at much, much higher graphics settings, and utilizing VRR. Personally I use the fakeHDR and Lightroom filters in ReShade on SDR games, and while that's still inferior to an actual well-done HDR game.. wow, they really pop and look great compared to anything plain SDR.
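For reference, the blur figures I'm throwing around come from the usual persistence rule of thumb; a minimal sketch (assuming the ~960 pixels/second TestUFO panning speed, so roughly 1px of blur per 1ms of persistence; the numbers above are rounded):

```python
# Sample-and-hold motion blur estimate: blur (px) ≈ persistence (ms) × panning speed (px/ms).
# Assumption: 960 px/s panning speed, so roughly 1px of blur per 1ms of persistence.

PANNING_SPEED_PX_PER_MS = 960 / 1000.0

def blur_px(fps: float, bfi_black_duty: float = 0.0) -> float:
    """Blur width in pixels for a given frame rate, optionally with BFI.
    bfi_black_duty = fraction of each refresh held black (0.75 -> 75% blur reduction)."""
    persistence_ms = (1000.0 / fps) * (1.0 - bfi_black_duty)
    return persistence_ms * PANNING_SPEED_PX_PER_MS

for fps in (85, 100, 117, 120):
    print(f"{fps}fps sample-and-hold: ~{blur_px(fps):.1f}px of blur")

print(f"120fps with 50% BFI: ~{blur_px(120, 0.50):.1f}px")  # the 4px vs 8px comparison
print(f"120fps with 75% BFI: ~{blur_px(120, 0.75):.1f}px")  # the ~2px figure
```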

And Vsync shouldn't stutter as long as you are able to maintain the frame rate
V-sync, yes, but with input lag. Are you saying to use v-sync when running a minimum frame rate that is over the max refresh rate of the screen? Because I believe what he was talking about was running very high frame rates without any sync in an attempt to get an edge over the competition.


==========================================================

https://blurbusters.com/faq/benefits-of-frame-rate-above-refresh-rate/

....................QUOTE.....................

The disadvantage, however, is tearing during VSYNC OFF. This disadvantage doesn’t matter to many competitive/eSports players where winning the frag is more important.


At 432 frames per second, each frame fragment shown (frame slice) has only 1/432 second of input lag from GPU rendering. This means you can react more quickly to on-screen action at frames higher than refresh rate.


Advantage 3: Reduced Tearing & Stutters


Tearing is much fainter at framerates far higher than the refresh rate. This is useful if you run older games (CS:GO, Quake, etc) that run at extremely high frame rates of hundreds or thousands of frames per second:


VSYNC OFF during 150fps creates more visible tear lines than 500fps. This is because the horizontal offsets at the tearline boundaries are much smaller at ever higher frame rates.


At ultra-high frame rates, there are many more tear lines, but they are all at smaller offsets:
.................

Also, microstuttering become reduced at higher frame rates, since there’s less harmonic frequency effects between frame rate and refresh rate. 61fps at 60Hz will have 1 stutter per second, as will 145fps at 144Hz. Running at frame rates much higher than refresh rate, massively reduces microstutters caused by harmonics between fps and Hz. This is important during unsynchronized frame rates (VSYNC OFF, Fast Sync, low-lag triple-buffered modes, etc.)


Obviously, if you have a variable refresh rate (VRR) monitor (GSYNC or FreeSync), it reduces stutters at any frame rate within your VRR range. This is very good if your game cannot run at high frame rates.


However, if you use “VSYNC OFF” with ultra-high frame rates, visibility of stutters and tearing gradually reduces on average, the higher your frame rate goes above refresh rate.

...............End QUOTE..................................
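One way to see the 150fps vs 500fps point from that quote numerically; a toy sketch (assuming the usual ~960 px/s panning speed, and simply that the offset between adjacent frame slices shrinks as the frame rate rises):

```python
# Toy model of VSYNC OFF tearline offset size: for content panning horizontally at
# `speed` px/s, adjacent frame slices are offset by speed / fps pixels, so higher
# frame rates produce smaller, fainter tear offsets. Assumes 960 px/s panning.

def tear_offset_px(fps: float, speed_px_per_s: float = 960.0) -> float:
    return speed_px_per_s / fps

for fps in (150, 300, 500, 1000):
    print(f"{fps}fps VSYNC OFF: ~{tear_offset_px(fps):.1f}px offset at each tearline")
```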


=====================================================

https://blurbusters.com/howto-low-lag-vsync-on/

....................QUOTE..............................

There are people who play CS:GO with VSYNC OFF, and switches to using G-SYNC or FreeSync for other games for better, smooth motion without stutters or tearing.


If you have a very high refresh rate (240Hz), the input lag of G-SYNC becomes similarly low as VSYNC OFF (unlike at 60Hz where the difference is much bigger). 240Hz VRR is capable of eSports-quality gaming. By adding sheer Hertz, VRR becomes suitable for professional gaming.

[Image: 60Hz vs 240Hz G-SYNC input lag comparison]


......................end QUOTE............................

====================================================

https://blurbusters.com/network-lag/

.........................QUOTE...................................

Let’s say that the gameserver sends just 10 updates per second, like many Call of Duty games do when a client is hosting the match. At this update rate, we have 100ms between the data packets, which is the same time that we have between 2 bullets when a gun has firing rate of 600 rounds per minute. So at an update rate of 10Hz, we have one data packet per fired bullet, as long as there is no packet loss and as long as the gun has a firing rate of no more than 600RPM.


But many shooters, including Call of Duty, have guns which shoot 750 rounds per minute, or even more. And so we then have 2 or more bullets per update. This means that when 2 bullets hit a player, then the damage of these 2 hits will be sent in a single update, and so the receiving player will get the experience that he got hit by a “super bullet” that dealt more damage than a single hit from this gun is able to deal.

............... end QUOTE.........................

--------------------------------------------------------------------------------


https://dignitas.gg/articles/blogs/...-not-the-reason-why-you-just-missed-that-shot

.....................QUOTE........................

During this time, a set number of further packages arrived on the client’s side containing more updated ticks from the server. Through these ticks, the client is able to interpolate what has happened between these two points in time and display this assumption to the player (don’t get mad yet). Interpolation time is determined by the simple equation


cl_interp = cl_interp_ratio / cl_updaterate


So in our 128 tick server example from above, on otherwise default settings this would mean: You receive a new packet every 7.8 Milliseconds (cl_updaterate 128) but the server waits until you received a third packet (cl_interp_ratio 2) before displaying the information, making the interpolation time is 15.6 Milliseconds for this example. On the other hand, a client running cl_interp_ratio 1 is presented with a renewed state of the game every 7.8 Milliseconds – assuming all other hardware and software variable are optimal.


Of course, from everything we’ve learned in our long online gaming history we assume that a lower number in front of the ms sign is always preferable. But, you already guessed it, things aren’t so easy this time around as bad connections and lag compensation come into the picture.


Again, the people with unreliable connections are better off to accept higher interp times, as the game client requires a new package of information from the server precisely at the interpolation time to update your game. If the second package is lost, the client waits 250ms on another package before flashing that red warning message in the top right corner of the screen.


For someone who tends to experience any package loss pretty much ever, it is safer to set cl_interp_ratio to 2, especially since you regain the “lost” time in the lag compensation.

.....................END QUOTE...................................
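A quick sanity check of the interpolation math from that quote, using the Source-style cvars it references (a sketch; the ratio values of 1 and 2 are just the ones discussed above):

```python
# Source-engine style interpolation delay, per the quoted formula:
#   cl_interp = cl_interp_ratio / cl_updaterate
# Shown in milliseconds.

def interp_delay_ms(cl_updaterate: float, cl_interp_ratio: int) -> float:
    return 1000.0 * cl_interp_ratio / cl_updaterate

for updaterate in (128, 64, 20):
    for ratio in (1, 2):
        print(f"updaterate {updaterate}, ratio {ratio}: "
              f"{interp_delay_ms(updaterate, ratio):.1f}ms interpolation delay")
# 128 tick, ratio 2 -> 15.6ms (the quote's example); 20 tick, ratio 2 -> 100ms
```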


Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)

Put into English, this means that once you pull the trigger and this information packet gets sent to the server, the server then goes back from the current server time (the time the pull-the-trigger packet was received) by your ping plus your interpolation time. Only then is it determined whether the client hit the shot or not.

So once your packet is received it rolls back in time as much as your ping time + your interpolation time.

For example, let's say:

40ms ping on a 20 tick (50ms) server, which is 100ms with interp ratio 2 ---> 140ms time span..
Then rollback is applied when the server receives your packet, with the net code making a lot of decisions.

90ms ping on a 20 tick server (1000ms/20 = 50ms), which would be 100ms with interp ratio 2 ---> 190ms time span

A 20 tick/Hz server is still only trying to deliver you action states/frames every 100ms (interp 2).
The extra tick's worth of interp buffer gives some slack vs latency. If you ever ran without it and had any latency spike that resulted in a lost packet, you'd get whacked with a ~250ms wait until the server reoriented you.

Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)

However, the netcode makes a lot of choices for you, which makes the results kind of muddy within that rolled-back time frame. In some games there can be no simultaneous kills ever... in some, whoever's ping is even slightly better gets the shot/kill. These timing decisions in the "rollback in time" apply to a ton of things: ducking your head in and out of a window, jumping off a ledge, a reload finishing, throwing that grenade, finishing bandaging and getting the health back in time, whatever.
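A minimal sketch of the rollback-window math in those examples (assuming 20 tick, interp ratio 2, and ping added whole the way the examples above do; real netcode differs per game):

```python
# Command execution rollback, per the formula above:
#   rollback = packet latency + client view interpolation
# Assumptions: ping added whole (as in the examples above), and
# interpolation = interp_ratio * tick interval.

def rollback_window_ms(ping_ms: float, tick_rate_hz: float, interp_ratio: int = 2) -> float:
    interpolation_ms = interp_ratio * (1000.0 / tick_rate_hz)
    return ping_ms + interpolation_ms

for ping in (40, 90):
    print(f"{ping}ms ping on a 20 tick server: "
          f"server rewinds ~{rollback_window_ms(ping, 20):.0f}ms to judge the shot")
# -> 140ms and 190ms, matching the examples above
```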

--------------------------------------------------------------

Peeker's advantage:
Imagine you want to reach the boxes on top mid on Mirage while an AWP is perched in window. You jump out of the cover of the wall, fly and land safe behind the boxes. In the moment you land, the AWP shot goes off and you somehow die and end up right on the edge of the boxes, a good half meter away from where you stood on your screen. In the German scene, you would have just been “interped”, even though “being lag compensated” might be the more accurate term (I acknowledge that is way more clunky and less easy to complain about).


As the peeking CT moves into the gap of the double doors, his lag compensated hitbox and model are still behind the door, giving the Terrorist no real chance to respond. However, it is imperative in this scenario for the peeking player to actually hit (and in most cases kill) his opponent in the time it takes the server to compute all executed commands and the appropriate lag compensation. Of course, the showcased example is taken with a ping of 150ms, which is unrealistically high for most people, artificially lengthening that time.


Should any of you reading this have the good fortune to play on LAN one day, you should keep in mind that peeker's advantage is solely dependent on lag compensation, a big part of which is made up by a players ping. With the typical LAN connection ping of 3-7ms, peeker's advantage is practically non-existent anymore. Together with many other factors, this is one of the reasons why CS:GO has to be played differently in certain aspects on LAN than on the internet.

.................

https://www.reddit.com/r/Overwatch/comments/3u5kfg/everything_you_need_to_know_about_tick_rate/

Note: In an example where two players shoot eachother, and both shots are hits, the game may behave differently. In some games. e.g. CSGO, if the first shot arriving at the server kills the target, any subsequent shots by that player that arrive to the server later will be ignored. In this case, there cannot be any "mutual kills", where both players shoot within 1 tick and both die. In Overwatch, mutual kills are possible. There is a tradeoff here.

  • If you use the CSGO model, people with better latency have a significant advantage, and it may seem like "Oh I shot that guy before I died, but he didn't die!" in some cases. You may even hear your gun go "bang" before you die, and still not do any damage.
  • If you use the current Overwatch model, tiny differences in reaction time matter less. I.e. if the server tick rate is 64 for example, if Player A shoots 15ms faster than player B, but they both do so within the same 15.6ms tick, they will both die.
  • If lag compensation is overtuned, it will result in "I shot behind the target and still hit him"
  • If it is undertuned, it results in "I need to lead the target to hit them".

-----------------------------------------

Also keep in mind that the fastest touch reaction times are ~150ms, so the fastest players are reacting in a 150ms to 180ms range --- but only after actually receiving a new, interpolated/adjusted action-state slice from the server (most gamers are probably 180ms - 240ms). So you would be receiving a new (netcode-adjusted, results-wise) action slice 100ms later on a 20Hz server, and then you'd react to it ~180ms after that.

At 100fps solid, that means you would be receiving a new (netcode-adjusted) action slice every 10 frames, and then reacting to it after another 18 frames. :watching:
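Put in local-frame terms (a trivial calc using the same assumed numbers: 100fps, 20 tick with interp 2, ~180ms reaction time):

```python
# Convert the server-update interval and a human reaction time into local frames.
frame_ms = 1000.0 / 100   # 10ms per local frame at 100fps
server_slice_ms = 100.0   # 20 tick server * interp ratio 2
reaction_ms = 180.0       # fast-ish human reaction time

print(f"New server action slice every ~{server_slice_ms / frame_ms:.0f} local frames")
print(f"Reaction lands another ~{reaction_ms / frame_ms:.0f} frames after that")
```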
 
I dunno how someone can be so dead set on BFI and its motion clarity advantage but then have 0 visual perception of the advantage of VRR.

BFI is better than VRR, but you have to do the necessary work up front to make sure your game stays locked at the refresh rate at all times.

This isn't difficult. All you need to do is look at a well-produced console game, like Mario 3D World or Ratchet and Clank on PS5, to see how smooth things look when developers target an absolutely consistent frame rate with no variation in frame time.

The problem is that people are playing with BFI at unlocked frame rates, sometimes going above refresh, sometimes below, which totally defeats the purpose. With BFI, you need perfect 1:1 frame rate to refresh rate synchronization.

This is possible with a simple frame rate cap, but there are ways to have low-lag vsync as well.

One being Scanline-sync from RTSS: https://forums.blurbusters.com/viewtopic.php?t=4916. From the game's perspective, vsync is off, but RTSS is controlling where the tearing line is. You can either put it at the top or bottom of the screen, or completely off-screen in the blanking interval.

Another is simply setting a frame rate cap just below your refresh rate (a frame time a few microseconds longer than the refresh interval) with vsync on, which greatly reduces the input lag: https://blurbusters.com/howto-low-lag-vsync-on/

And beyond those, on modern DX12 and Vulkan engines, some developers, like id with id Tech, have really good vsync implementations. Vsync in Doom Eternal, for example (not the triple-buffered vsync, just the "vsync on" option), gives a basically imperceptible lag increase.
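A small sketch of the cap math for that second method (assumption-heavy: the 119.998Hz figure and the 0.05fps offset are just illustrative; you'd plug in your display's actual measured refresh rate and whatever offset you prefer):

```python
# Low-lag "VSYNC ON" style frame cap: cap slightly below the true refresh rate so the
# render queue never backs up. Illustrative numbers only, not measured values.

def low_lag_cap(true_refresh_hz: float, offset_fps: float = 0.05) -> float:
    return true_refresh_hz - offset_fps

refresh = 119.998                       # hypothetical measured refresh rate
cap = low_lag_cap(refresh)
refresh_interval_us = 1_000_000 / refresh
cap_frametime_us = 1_000_000 / cap

print(f"Cap at {cap:.3f}fps -> frame time {cap_frametime_us:.1f}us "
      f"vs refresh interval {refresh_interval_us:.1f}us "
      f"(~{cap_frametime_us - refresh_interval_us:.1f}us longer per frame)")
```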
 
BFI is better than VRR,

"Better" is subjective. When I play SDR games I use fakeHDR filter and lightroom filter (which is not quite as good as but is kind of like using tone mapping on HDR end result wise, at least it looks like SDR+ compared to regular SDR) and with graphics as high as I can get without going below 100fps average if I can help it. ... (using VRR to ride the roller coaster +/- my frame rate average).

Flaying and knocking the lights out of a great looking game for a potential scoring advantage, and across all of the internet gaming limitations that are there to begin with, isn't better to me at all.

All of the frame rate scoring/hit tests I've seen (on LTT for example) were done on a local LAN or vs bots. Online gaming is muddy due to net code going back in time and shuffling things as it sees fit, plus poor tick rates.

.....

Unless you meant BFI is better than VRR when both have frame rates greatly exceeding the refresh rate of the screen? Perhaps in that scenario, purely in an attempt at a scoring advantage with zero care for aesthetics.. though the Blur Busters link above indicates that at those frame rates VRR is competitive:


https://blurbusters.com/howto-low-lag-vsync-on/

....................QUOTE..............................

There are people who play CS:GO with VSYNC OFF, and switches to using G-SYNC or FreeSync for other games for better, smooth motion without stutters or tearing.


If you have a very high refresh rate (240Hz), the input lag of G-SYNC becomes similarly low as VSYNC OFF (unlike at 60Hz where the difference is much bigger). 240Hz VRR is capable of eSports-quality gaming. By adding sheer Hertz, VRR becomes suitable for professional gaming.

[Image: 60Hz vs 240Hz G-SYNC input lag comparison]


......................end QUOTE............................

All for this.. which affects the whole game world when moving the viewport at speed, of course (though when moving slowly or stationary, or during the fastest flick movements, it wouldn't really be perceptible either):

When objects move at speed, or when you're moving the FoV/game world around at speed, you'd get this kind of blur.
Outside of these maximum blur amounts, it varies mostly with how fast you are moving the viewport at any given time, or whether it's moving at all, and also with how fast other objects are moving on screen. So it's more like zero (static) "up to" those peaks in each slot.

from https://blurbusters.com/ (edited by me showing the 240fpsHz darker to suggest BFI dimming)

1ms / "zero" blur of CRT or 1000fps at 1000Hz


--------------------------

All of those 120Hz blur reductions listed are reliant on 120fps minimums, not averages.

That kind of blur reduction is appreciable, don't get me wrong, but it comes at huge tradeoffs: lower graphics quality in most games with modern graphics, no VRR or HDR, minus 50% to 75% color luminance, and with potential eye fatigue too, of course.

Also, I thought BFI increased the input lag to 21ms instead of 13ms? Can anyone confirm if that is true, and if not, is it the same ~13.x ms? Maybe once you set it to PC mode it goes back down to 13, but I remember that number for some reason.
 
"Better" is subjective. When I play SDR games I use fakeHDR filter and lightroom filter (which is not quite as good as but is kind of like using tone mapping on HDR end result wise, at least it looks like SDR+ compared to regular SDR) and with graphics as high as I can get without going below 100fps average if I can help it. ... (using VRR to ride the roller coaster +/- my frame rate average).

Flaying and knocking the lights out of a great looking game for a potential scoring advantage, and across all of the internet gaming limitations that are there to begin with, isn't better to me at all.

All of the frame rate scoring/hit tests I've seen (on LTT for example) were done on a local LAN or vs bots. Online gaming is muddy due to net code going back in time and shuffling things as it sees fit, plus poor tick rates.

.....

Unless you meant BFI is better than VRR when both have frame rates greatly exceeding the refresh rate of the screen? In that case, perhaps in that scenario purely in an attempt at scoring advantage with zero care for aesthetics.. though in the blurbuster's link above it indicates that at those frame rates VRR is competitive:


https://blurbusters.com/howto-low-lag-vsync-on/

....................QUOTE..............................

There are people who play CS:GO with VSYNC OFF, and switches to using G-SYNC or FreeSync for other games for better, smooth motion without stutters or tearing.


If you have a very high refresh rate (240Hz), the input lag of G-SYNC becomes similarly low as VSYNC OFF (unlike at 60Hz where the difference is much bigger). 240Hz VRR is capable of eSports-quality gaming. By adding sheer Hertz, VRR becomes suitable for professional gaming.

View attachment 375594


......................end QUOTE............................

All for this.. affecting the whole game world moving the viewport at speed of course, (though when slower movement or stationary, or at the fastest flick movement it wouldn't really be perceptible either):



That kind of blur reduction is appreciable don't get me wrong, but at huge tradeoffs in graphics quality on most games with modern graphics, minus VRR, HDR, minus 50% to 75% color luminance and with potential eye-fatigue too of course.

Also I thought BFI increased the input lag to 21ms instead of 13ms? Can anyone confirm if that is true and if not , is it the same ~ 13.x ms? Maybe once you reset it to PC mode it goes back down to 13 but I remember that number for some reason.

BFI is a niche use case for me nowadays, only used in super old games that lack HDR and where I can get hundreds of fps anyway, making VRR a moot point. I'd rather just lock it to 120Hz and turn BFI on. 99% of the time I am using VRR.
 
That's the thing, BFI would be much more attractive if they were to support arbitrary refresh rates between 60 and 120 with it. But currently, no manufacturer is doing that.

Like, I'm currently using Trinitron and Diamondtron CRTs. By the nature of the display, they strobe at any refresh rate you use. 90Hz and 80Hz look incredible on these monitors. Even 60Hz for games like Street Fighter 5 and Mega Man 11 that are locked at 60fps.

So that's the next step for BFI: arbitrary refresh rates. Because sometimes you want to run at a high resolution with max settings, and a locked 100fps isn't attainable. Being able to lock your frame rate at 80 and run BFI at 80Hz would be incredibly useful.
 
That's the thing, BFI would be much more attractive if they were to support arbitrary refresh rates between 60 and 120 with it. But currently, no manufacturer is doing that.

Like, I'm currently using Trinitron and Diamondtron CRTs. By the nature of the display, they strobe at any refresh rate you use. 90Hz and 80Hz look incredible on these monitors. Even 60Hz for games like Street Fighter 5 and Mega Man 11 that are locked at 60fps.

So that's the next step for BFI: arbitrary refresh rates. Because sometimes you want to run at a high resolution with max settings, and a locked 100fps isn't attainable. Being able to lock your frame rate at 80 and run BFI at 80Hz would be incredibly useful.
There have been monitors that combined BFI and VRR, but those implementations were not very good if I recall correctly.
 
There have been monitors that combined BFI and VRR, but those implementations were not very good if I recall correctly.
I don't mean w/ VRR, I mean BFI just needs to work at whatever static refresh rate you choose.

Like I should be able to set my game to 80Hz, and have BFI work. Currently, that's not the case. In my understanding, the LG will be strobing the pixels at 100Hz or 120Hz even though Windows is outputting 80Hz, which would give some pretty ugly motion artifacts.
 
I don't mean w/ VRR, I mean BFI just needs to work at whatever static refresh rate you choose.

Like I should be able to set my game to 80Hz, and have BFI work. Currently, that's not the case. In my understanding, the LG will be strobing the pixels at 100Hz or 120Hz even though Windows is outputting 80Hz, which would give some pretty ugly motion artifacts.

Lol, that is exactly what BFI + VRR does, at least in the monitors that it's implemented in. You want BFI working at 80Hz? Cap your FPS to 80 and the monitor will strobe at 80Hz. Honestly, I don't see why you'd want to go through the process of manually setting the refresh rate EVERY SINGLE TIME you want a specific strobing Hz; VRR+BFI makes this so much easier. Just enable it and cap your fps to whatever you want. This beats going into the NVCP to manually set my refresh rate to 75 or 85 or 95 every time I want some random strobe Hz value. You wouldn't be utilizing VRR for fluctuating fps, but rather to be able to get strobing working at ANY Hz just by capping your fps to whatever target value you want.
 
at least in the monitors that it's implemented in.
This is the key thing. Of course VRR+BFI would be the best thing, but it's not available on TVs, and pretty rare on monitors. It's not easy to implement, and sometimes when it is, it isn't implemented correctly.

Capping your frame rate to 80fps does not make the monitor strobe at 80Hz. If your monitor is set to 120Hz, it will be strobing at 120Hz. And on LG TVs, they don't strobe correctly at refresh rates that aren't 60, 100, or 120.

You can see for yourself here that capping a framerate while running at a different refresh doesn't look good: www.testufo.com/framerates-versus#photo=dota2-bg.jpg&pps=960&framepacingerror=0&direction=rtl&framerate=40&compare=2&showfps=0

And lots of games let you pick a refresh rate. But I don't mind switching at the desktop beforehand if they don't.
 
..... Re-posting some relevant info:

BFI isn't tied to the frame rate the way LCD backlight strobing is. Since BFI can blank per pixel, it can "strobe" at a rate independent of the refresh rate, and even fractionally per scan line ~ "rolling scan".

Per Blur Busters' site and forums, from Mark R.:


" , strobing on most OLEDs are almost always rolling-scan strobe (some exceptions apply, as some panels are designed differently OLED transistors can be preconfigured in scanout refresh, and then a illumination voltage does a global illumination at the end). "

"rolling strobe on OLED can be fractional refreshes, so OLED BFI can actually be arbitrary lengths unrelated to refresh cycle length. Since the off-pass can chase behind the on-pass simultaneously on the same screen at an arbitrary distance"

"Black duty cycle is independent of refresh rate. However, percentage of black duty cycle is directly proportional to blur reduction (at the same (any) refresh rate). i.e. 75% of the time black = 75% blur reduction. Or from the visible frame perspective: Twice as long frame visibility translates to twice the motion blur.

Does not matter if full strobe or rolling scan, as it is per-pixel duty cycle. "

" That said, non-global illumination can cause artifacts (e.g. skewing during scan of any CRT, OLED (including strobed and nonstrobed rolling scans) or nonstrobed LCD "

 
This is the key thing. Of course VRR+BFI would be the best thing, but it's not available on TVs, and pretty rare on monitors. It's not easy to implement, and sometimes when it is, it isn't implemented correctly.

Capping your frame rate to 80fps does not make the monitor strobe at 80Hz. If your monitor is set to 120Hz, it will be strobing at 120Hz. And on LG TVs, they don't strobe correctly at refresh rates that aren't 60, 100, or 120.

You can see for yourself here that capping a framerate while running at a different refresh doesn't look good: www.testufo.com/framerates-versus#photo=dota2-bg.jpg&pps=960&framepacingerror=0&direction=rtl&framerate=40&compare=2&showfps=0

And lots of games let you pick a refresh rate. But I don't mind switching at the desktop beforehand if they don't.

Right now, yes, capping your fps to 80 means the TV will still strobe at either 100 or 120 depending on what you set the refresh rate to. But instead of asking for the ability to strobe at any static Hz, why not just ask for BFI+VRR instead, where the TV will strobe at whatever the fps/Hz value is?
 
Right now, yes, capping your fps to 80 means the TV will still strobe at either 100 or 120 depending on what you set the refresh rate to. But instead of asking for the ability to strobe at any static Hz, why not just ask for BFI+VRR instead, where the TV will strobe at whatever the fps/Hz value is?

If you varied the strobing to match an actual varying Hz rather than a locked frame rate minimum when using VRR - congratulations, you've just re-invented brightness-changing PWM, more or less. That is extremely eye-aggravating. Most people avoid PWM screens when possible. But sure, if you had that ability and then locked your FPS to a frame rate minimum, that could work.

This is the key thing. Of course VRR+BFI would be the best thing, but it's not available on TVs, and pretty rare on monitors. It's not easy to implement, and sometimes when it is, it isn't implemented correctly.

Capping your frame rate to 80fps does not make the monitor strobe at 80Hz. If your monitor is set to 120Hz, it will be strobing at 120Hz. And on LG TVs, they don't strobe correctly at refresh rates that aren't 60, 100, or 120.

Lower Hz matched to BFI or backlight strobing flickers more noticeably and can cause more eye fatigue over time (whether you personally say you "see" it consciously or not - it's still hitting your eyes). Essentially, the lower the strobe rate, the more aggressive it is on your eyes. It can fatigue anyone's eyes, but some people have faster eyesight than others and can consciously see strobing even in some light bulbs.


https://www.oled-info.com/pulse-width-modulation-pwm-oled-displays
Pulse-Width Modulation, or PWM, is one of the ways display makers can use to adjust the display's brightness. PWM is considered to be an easy (or cost-effective) way to control the brightness, but it has serious drawbacks, such as flicker that may cause eye strain and headaches. In this article we'll discuss PWM and its effects on OLED displays.
[Image: display PWM duty cycle comparison]

-------------------------------------

I think the LG CX's BFI at 100%/full is a full black frame per Hz (resulting in the lowest brightness of the three settings, just like a very low brightness setting with a PWM LED backlight).. and the other settings blank less often.

-----------------------------------

https://iristech.co/pwm-flicker-test/


[Image: PWM flicker test animation]



------------------------

Technically OLED doesn't have to flash or black frame the entire frame like the backlight did in the old lightboost method. It can use rolling scan -- independent of the refresh rate, down to the pixel.

(blurbusters Mark R.)
"rolling strobe on OLED can be fractional refreshes, so OLED BFI can actually be arbitrary lengths unrelated to refresh cycle length. Since the off-pass can chase behind the on-pass simultaneously on the same screen at an arbitrary distance"

"Black duty cycle is independent of refresh rate. However, percentage of black duty cycle is directly proportional to blur reduction (at the same (any) refresh rate). i.e. 75% of the time black = 75% blur reduction. Or from the visible frame perspective: Twice as long frame visibility translates to twice the motion blur.

Does not matter if full strobe or rolling scan, as it is per-pixel duty cycle. "

" That said, non-global illumination can cause artifacts (e.g. skewing during scan of any CRT, OLED (including strobed and nonstrobed rolling scans) or nonstrobed LCD "

-- but that can cause artifacts if it's not full-screen black frames, i.e. non-global illumination.. so yeah, that's not optimal either.
 
If you varied the strobing to match an actual varying Hz rather than a locked frame rate minimum when using VRR - congratulations, you've just re-invented brightness-changing PWM, more or less. That is extremely eye-aggravating. Most people avoid PWM screens when possible. But sure, if you had that ability and then locked your FPS to a frame rate minimum, that could work.

Uhhh, no? I already tested out an Asus ELMB-Sync monitor and brightness remains constant even with varying fps, at least between 80-155, which is what I set the G-Sync pendulum demo to. It is possible to have varying backlight strobing without fluctuating brightness levels. As for the varying level of PWM causing more eye strain, I can't speak to that since I immediately returned the monitor to Amazon the next day and only used it briefly for a few hours.
 
You are right 👍, they figured out how to vary the brightness dynamically. :D
edit: apparently they are using double strobes instead of actually varying the brightness output by voltage, and it causes a lot of crosstalk :eek:.. unless they changed that in some other model.

Still, BFI doesn't work that way, at least on LG OLEDs, where my PWM comparison (PWM at 25% static brightness is something like BFI at 75%, since both strobes reduce the brightness by 75%) still stands. BFI/strobing typically reduces brightness by the same % it reduces blur.
Also, even though those ELMB-Sync monitors exist, the implementation is poor according to the quotes I put below from the Blur Busters forums.



https://www.tftcentral.co.uk/reviews/asus_tuf_gaming_vg279qm.htm
We measured the on/off strobing using our oscilloscope and confirmed that the strobing is in sync with the refresh rate. So at 280Hz for instance (shown above) the backlight is turned off/on every 3.57ms (280 times per second). As you reduce the refresh rate the strobing remains in sync with it well as you would hope.
The 'on' period reduces as the refresh rate increases as with most blur reduction backlights, which would normally mean that the image becomes a bit darker with the higher refresh rates. However, because this feature is designed to be used with VRR, Asus needed to develop it to avoid changes in brightness as the refresh rate changes. That would have been pretty distracting in use. While the strobing frequency and 'on' periods will change with the refresh rate, the backlight intensity is also being dynamically controlled to make up for it. We measured the same maximum brightness at a range of refresh rates as shown below. So there are no noticeable fluctuations in brightness as the refresh rate changes which is great news.​

These comparisons are with the refresh rate as high as is available for the blur reduction feature to function. For most this is at 100 - 144Hz. You can often achieve a slightly brighter display if you use the feature at compatible lower refresh rates since the strobes are less frequent, but it's not a significant amount. That can also introduce more visible flicker in some situations.
[Image: TFT Central brightness measurements at a range of refresh rates]
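A rough sketch of the brightness-compensation idea TFTCentral describes there (illustrative only; the actual pulse widths and firmware behavior aren't public):

```python
# Keeping perceived strobe brightness constant while the strobe rate changes:
# perceived brightness ~ backlight intensity * pulse width * pulses per second,
# so drive intensity inversely with the duty cycle. Illustrative numbers only.

def intensity_for_constant_brightness(target: float, pulse_width_ms: float, refresh_hz: float) -> float:
    duty = (pulse_width_ms / 1000.0) * refresh_hz  # fraction of the time the backlight is on
    return target / duty                           # scale intensity inversely with duty

TARGET = 1.0  # arbitrary perceived-brightness target
for refresh_hz, pulse_width_ms in [(100, 1.5), (144, 1.2), (280, 0.8)]:
    drive = intensity_for_constant_brightness(TARGET, pulse_width_ms, refresh_hz)
    print(f"{refresh_hz}Hz, {pulse_width_ms}ms pulse -> drive backlight at {drive:.2f}x")
```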

=================================

From here, regarding an Asus XG279Q with ELMB-Sync, at least according to that user. Maybe other models are better though idk.

https://forums.blurbusters.com/viewtopic.php?f=4&t=6386

ELMB-Sync is absolutely fucking amazing......on about 1/3 of the screen. It brings crystal clear CRT like motion clarity. The issue is the other 2/3 of the screen are full of crosstalk images where they actually make motion WORSE than when it was off. Theoretically this could be a non issue in fps type games if the good part was actually dead center on the crosshair or could be calibrated. Unfortunately, this isn't the case. On mine I have crosstalk starting at about the crosshair location and up so a target I'm aiming at commonly has different corona type effects applied to the same model, which is....a bit distracting. If you get lucky you can maybe get one that has the butter zone a bit more centered although the top and bottom 2/3rds will still be full of horrible crosstalk.

Is it decent? Yeah, I use it when I play CS and disable it for all other games. Would I choose this monitor instead of a GL850 if I were to go again? No. The BFI significantly dims the screen where I find that I need to run on 1.8 gamma instead of 2.2, while also increasing the brightness in games, which washes out the scene quite a bit. This is why CSGO is the only game I can really do it since I am not seeking eye candy. Also for fps standards CSGO is a slow moving game focusing around holding angles. So this system kinda works. Playing a very fast paced game like quake or Apex exacerbates the crosstalk further to make the choice less cut and dry than simply saying FPS game + ELMB = Good. Also as FPS gets lower, even in the 120s the crosstalk gets worse. So to effectively use this technology you need 1. Washed out colors or have an extremely difficult time spotting enemies on dark backgrounds. 2. at least 144+ fps. 3. Even in best case scenario you WILL have crosstalk blur on at minimum half of your screen. So this is a lot to sacrifice just for smoother motion on a small part of the screen. Why not just buy a TN panel since after all these compromises a TN panel will probably end up looking much brighter and still have better motion blur?

....
with ELMB-Sync on they apply double-strobe all the time, no matter what frequency is active! Rtings.com and TFTCentral both confirm this atrocious strobe algorithm behavior :( Obviously constant double-strobe destroys the motion blur reduction experience with overlapped images. It is very ridiculous to see this in a modern gaming monitor; that's good for April Fools' Day.

They use this algorithm to compensate brightness at all frequencies :lol: They should have worked to compensate these brightness changes using automatic dynamic backlight brightness, in the same way dynamic overdrive works in G-Sync module monitors, setting the correct voltage value at each frequency. It is very frustrating... :( seeing the marketing, all is perfect and all works fine
 
The implementation of ELMB Sync was absolutely poor, which is why I returned it the next day lol. As much as I do love me some excellent motion clarity, having used a CRT until late 2011, I have started to care less and less as more time passes. I can't even remember the last time I used BFI on my CX. If LG ever implements BFI Sync and it's excellent, great. If not, then oh well.
 
OLED has multiple advantages that should allow a unique implementation - it can have sub-1ms persistence and easily run an underlying 1000Hz strobe if the processing existed. In a VRR situation you could run a rolling strobe at the exact rate of the current frame if you could predict the next one (i.e. the duration for which the current frame would stay up) without introducing much lag or brightness variation, but without buffering that is not really possible.
One way to get around BFI+VRR is to keep repeating frames at a constant 1ms with a rolling strobe and use the next frame's rollover at the next 1ms window to do VRR. This would theoretically slightly affect frame rate and lag, but by sub-1ms amounts compared to a perfect VRR implementation, so no real loss.
You would get a clean implementation with VRR, BFI-sharp motion, and low lag. It is also a relatively simple algorithm if you can get a good 1000Hz-processing-capable display, which is the challenge. I am pretty sure Valve (or maybe Microsoft?) tested a 1000Hz display for VR/MR and saw the benefits of lower input-to-photon delay, which is a whole other discussion.
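A toy sketch of that 1ms-window idea (purely speculative; this isn't how any shipping display works, it just shows the scheduling described above):

```python
# Toy model: the panel strobes on a fixed 1ms grid (an underlying "1000Hz" cadence) and a
# new VRR frame simply takes over at the next 1ms slot after it finishes rendering, so the
# worst-case added latency vs. "perfect" VRR is under 1ms. Illustrative only.
import math

SLOT_MS = 1.0  # underlying 1000Hz strobe grid

def present_times(frame_ready_times_ms):
    """Map each frame's render-complete time to the next 1ms strobe slot."""
    return [math.ceil(t / SLOT_MS) * SLOT_MS for t in frame_ready_times_ms]

# Irregular VRR-style frame completion times (ms since start), hypothetical:
ready = [7.3, 16.9, 25.1, 38.6]
for t, shown in zip(ready, present_times(ready)):
    print(f"frame ready at {t:5.1f}ms -> shown at {shown:5.1f}ms (added {shown - t:.1f}ms)")
```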
 
Yes, you could theoretically develop a very good interpolation method that duplicated frames without bad artifacting or input lag. One method without VRR could be to do that off of a healthy fps minimum, for good motion definition/path articulation/animation cycle frames to start with. Say 100fps, for example, x10 interpolated = 1000fps at 1000Hz internally. At 1000fps at 1000Hz it would be "zero" sample-and-hold blur like a CRT (~1px), so you wouldn't need any black frame insertion/strobing at all.

Blur Busters Law: The Amazing Journey To Future 1000Hz Displays

[Image: chart from the Blur Busters Law article]
 
Hardware Unboxed did a review of the C1 48"

In summary, he's a fan.


I've been absent from HF for some time. Is the 48 C1 a newer model than the 48 CX? I may be getting my hands on a 3080, so I'm looking into 4K again. Last I looked, the 48 CX was the 4K TV king for gaming. Well, the whole CX line anyway.
 
In general usage the C1 is probably not that big of a difference. It also has a game boost thing for 60Hz content that drops the lag down from ~13ms to ~10ms. It actually makes lag worse at 120Hz according to HDTVTest.


https://www.rtings.com/tv/tools/compare/lg-cx-oled-vs-lg-c1-oled/10619/21421?usage=11&threshold=0.10
The LG C1 OLED replaces the LG CX OLED, and overall they're very similar TVs. The biggest differences are that the C1 comes in a larger 83 inch variant, has the newest version of webOS, and includes new 'Game Optimizer' settings, including an input lag boost that reduces input lag by a few milliseconds. Our unit of the C1 has poor out-of-the-box color accuracy and lower brightness compared to the CX, but this could just be due to panel variation. All things considered, if none of the minor additions are essential to you, the CX may offer a slightly better value.

Differences in brightness between this and the LG CX OLED may simply come down to panel variation. If you want something that has the new evo panel and gets brighter, then check out the LG G1 OLED.

---------------
https://www.rtings.com/tv/tools/compare/lg-cx-oled-vs-lg-g1-oled/10619/21422?usage=11&threshold=0.10
The LG G1 OLED and the LG CX OLED are two excellent TVs. They each have an OLED panel with a near-infinite contrast ratio and similar gaming features. The main difference is that the G1 has the new evo OLED panel, allowing small highlights to get brighter in HDR, but the CX still gets brighter in SDR. The GX has a unique design meant to sit flush against the wall with the dedicated wall-mount, while the CX comes with a stand. Other than that, there's very little difference between each TV, and they each deliver exceptional picture quality.

-----------------

 
So I see RTINGS tests input lag at the middle of the screen. And I presume they don't subtract the typical "scan out" time? So at 60Hz on a CRT, you'd get ~8ms at the middle, because that's how long scanout takes to reach the middle.

So if they're getting 10ms on the middle of the screen, and not subtracting scanout, then it's literally only 2ms slower than a CRT.
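The scanout arithmetic behind that (a trivial sketch, assuming the scanout sweep takes the full refresh interval from top to bottom):

```python
# Time for scanout to reach mid-screen, assuming scanout spans the whole refresh interval.
def scanout_to_middle_ms(refresh_hz: float) -> float:
    return 0.5 * (1000.0 / refresh_hz)

for hz in (60, 120):
    print(f"{hz}Hz: ~{scanout_to_middle_ms(hz):.1f}ms for scanout to reach mid-screen")
# 60Hz -> ~8.3ms (the ~8ms figure above); 120Hz -> ~4.2ms
```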
 
The input lag on the CX/C1 is seriously negligible, well, as long as you are in Game Mode. Nobody is going to feel any lag on these sets unless maybe you were in the top 0.01% of competitive gamers, in which case you wouldn't be doing your competitive gaming on this display anyway. For 99.99% of users the input lag is an absolute non-issue.
 
Looks like a new non-engineering test/alpha CX series firmware was pushed out in Korea at the end of June - 03.23.10 and then as the next post mentions - 03.23.15 a few days ago (mid July).

No info on what's changed or been updated since 03.23.06 or 03.23.10.
 
I've been absent from HF for some time. Is the 48 C1 a newer model than the 48 CX? I may be getting my hands on a 3080, so I'm looking into 4K again. Last I looked, the 48 CX was the 4K TV king for gaming. Well, the whole CX line anyway.
C1 is the newest version. It has a few more goodies and gets slightly brighter than the CX. The C1 also has a bit less input lag (unless you're an extreme hardcore gamer, you won't notice this).

In short, if the prices are equal, the C1 is the superior product. If the CX is any cheaper (even $100), get the CX. You won't notice the difference.
 
C1 is the newest version. It has a few more goodies and gets slightly brighter than the CX. The C1 also has a bit less input lag (unless you're an extreme hardcore gamer, you won't notice this).

In short, if the prices are equal, the C1 is the superior product. If the CX is any cheaper (even $100), get the CX. You won't notice the difference.

According to RTings the C1 isn't really brighter in practice but the G1 is.

CX vs C1 HDR

[Image: RTINGS HDR brightness comparison, CX vs C1]

=======================================================================================

CX vs G1 HDR

[Image: RTINGS HDR brightness comparison, CX vs G1]


=====================================

Input lag CX vs C1

[Image: RTINGS input lag comparison, CX vs C1]

============================


Those input lag numbers, with the latest tests on the latest firmware and over HDMI 2.1, are pretty low at 4K 120Hz with VRR.

If I'm reading it right, they are listing 4K + VRR as being 5.9ms vs 5.8ms, and 4K 120Hz (no VRR) as 6.7ms vs 5.3ms.

That's faster than the rate at which most people are being fed the next new frame to react to locally (e.g. 100fps = 10ms later, 117fps capped ~ 8ms later).

Also much lower than the tick interval of online game servers (50ms + another 50ms interp-2 buffer on 20 tick servers), though the second (interp 2) amount mostly gets subtracted back out in the net code's lag compensation history re-writes, sort of.

And that input lag is a drop in the bucket compared to the spans of time it takes the fastest gamers to react, with reaction times of 150ms to 180ms (to 250ms in general), after they see a new netcode-biased/constructed/guessed action state from the server's next tick.


=====================================
Valorant: 128tick
Specific paid matchmaking services like ESEA: 128tick
CSGO ("normal" servers): 64tick
Overwatch: 63
Fortnite: 60 (I think, used to be 20 or 30)
PubG: 60
COD Modern Warfare mp lobbies: 22tick
COD Modern Warfare custom lobbies: 12tick

COD Warzone: 20 tick


Some have guessed that League of Legends tick rate is around 30.
ESO PvE / PvP: ??
WoW PvE / PvP ?? .. World of warcraft processes spells at a lower 'tick rate' so is a bit more complicated, but overall the tick rate probably isn't that great.
https://us.forums.blizzard.com/en/wow/t/is-classic-getting-dedicated-physical-servers/167546/81


https://happygamer.com/modern-warfa...or-a-game-that-wants-to-be-competitive-50270/


---------------------------------------------

Keep in mind that a higher tickrate server will not change how lag compensation behaves, so you will still experience times where you ran around the corner and died.
-----------------------
An example of lag compensation in action:
  • Player A sees player B approaching a corner.
  • Player A fires a shot, the client sends the action to the server.
  • Server receives the action Xms later, where X is half of Player A's latency.
  • The server then looks into the past (into a memory buffer), of where player B was at the time player A took the shot. In a basic example, the server would go back (Xms+Player A's interpolation delay) to match what Player A was seeing at the time, but other values are possible depending on how the programmer wants the lag compensation to behave.
  • The server decides whether the shot was a hit. For a shot to be considered a hit, it must align with a hitbox on the player model. In this example, the server considers it a hit. Even though on Player B's screen it might look like he's already behind the wall, the time difference between what player B sees and the time at which the server considers the shot to have taken place is equal to: (1/2PlayerALatency + 1/2PlayerBLatency + TimeSinceLastTick)
  • In the next "Tick" the server updates both clients as to the outcome. Player A sees the hit indicator (X) on their crosshair, Player B sees their life decrease, or they die.
Note: In an example where two players shoot eachother, and both shots are hits, the game may behave differently. In some games. e.g. CSGO, if the first shot arriving at the server kills the target, any subsequent shots by that player that arrive to the server later will be ignored. In this case, there cannot be any "mutual kills", where both players shoot within 1 tick and both die. In Overwatch, mutual kills are possible. There is a tradeoff here.
----------------------------------------
  • If you use the CSGO model, people with better latency have a significant advantage, and it may seem like "Oh I shot that guy before I died, but he didn't die!" in some cases. You may even hear your gun go "bang" before you die, and still not do any damage.
  • If you use the current Overwatch model, tiny differences in reaction time matter less. I.e. if the server tick rate is 64 for example, if Player A shoots 15ms faster than player B, but they both do so within the same 15.6ms tick, they will both die.
  • If lag compensation is overtuned, it will result in "I shot behind the target and still hit him"
  • If it is undertuned, it results in "I need to lead the target to hit them".
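A minimal sketch of the rewind step in that lag-compensation walkthrough above (assuming the simple "current time minus half the shooter's latency plus their interpolation delay" rule from the example; real games each use their own variations):

```python
# Server-side lag compensation rewind, per the walkthrough above: the server keeps a short
# history of player positions and, when a shot packet arrives, rewinds the target to roughly
# where the shooter saw them. All numbers here are hypothetical.
from bisect import bisect_right

def rewound_position(history, server_time_ms, shooter_latency_ms, shooter_interp_ms):
    """history: list of (timestamp_ms, position) tuples sorted by timestamp."""
    rewind_to = server_time_ms - (shooter_latency_ms / 2.0 + shooter_interp_ms)
    timestamps = [t for t, _ in history]
    idx = bisect_right(timestamps, rewind_to) - 1  # latest snapshot at or before rewind_to
    return history[max(idx, 0)][1]

# Hypothetical 20 tick history of player B's x-position (one snapshot every 50ms):
history_b = [(t, t / 50.0) for t in range(0, 501, 50)]
pos = rewound_position(history_b, server_time_ms=500, shooter_latency_ms=80, shooter_interp_ms=100)
print(f"Server judges the shot against player B at x={pos:.1f}")
```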
 
It's still limited by backlighting. I have yet to find a backlighting tech that can even get close to OLED's per pixel brightness control.
MicroLED is probably equivalent, but it needs a few years to become economical and get down to 55" and smaller form factors. But yes, right now nothing comes close to OLED. Technologies like MiniLED are just trying to hide LED/LCD's increasingly obvious flaws.
 
Until they get it down to near pixel level it won't be the same, but the tinier the better. When you have zones that encompass a decent area of pixels, you can't really do highlights on edges and small side-by-side bright/dark details (stars twinkling in a dark night sky, the edge of a spaceship on an ink-black space background, twinkling reflections on a black nighttime lake, etc.). When trying to show very bright right next to very dark, the firmware has to play the zones off against each other and either make the darker area larger or the brighter area larger (I call it a "dim halo" vs a "glow halo" effect). That lowers the actual nits shown or loses detail in the scene either way. The smaller the zones, the less often that will be obvious, but for HDR highlights it will probably still be a trade-off until it's nearer the pixel level. It would have to be near 1080p worth of backlights on a 4K screen. That would be 2,073,600 backlights, one for every 4 pixels, compared to 8,294,400 pixels (and even more subpixels) lit individually on an OLED, with side-by-side contrast up to infinite:1 black.

Dual-layer LCD made a near-per-pixel backlight out of a 2nd monochrome LCD layer plus light filtering for black depth, but LCDs always have way worse response times. Those also had to deal with heat. They used a 1080p layer as a monochrome backlight, similar to what I was outlining above as a microLED threshold. Some of the top Eizo and Sony mastering screens (at $35k+) are dual-layer LCD now instead of OLED, in order to avoid ABL, peak brightness trade-offs, and burn-in concerns, but there was only one 55" TV released in China AFAIK.
 