What does high refresh rate mean? (100Hz-144Hz)

That's what I thought. Thanks.

I tried FPS gaming on a TV (a 49" KS8500) and it was unplayable. I guess because of the high response time and low refresh rate.

Though it's fantastic for work.
 
It's pretty dependent on the individual and what you're used to. I've always gamed on 60Hz screens so it doesn't bother me at all.
 
It also makes your mouse cursor extremely smooth in Windows.
I don't really care enough to spend cash on it and am still at 60Hz.
 
That's what I thought. Thanks.

I tried FPS gaming on a TV (a 49" KS8500) and it was unplayable. I guess because of the high response time and low refresh rate.

Though it's fantastic for work.
Those two things contribute to it, but the main factor in your experience was input lag. Even the fastest televisions have nearly 10x the input lag of gaming monitors.
 
The KS8500 when in game mode should have pretty low lag though. I'm guessing you weren't running in game mode.
 
TVs in game mode may or may not be as low-latency as PC monitors; there's a whole lot more going on in there between the output signal and the image actually being displayed on the screen.

Further, 'high refresh rate' makes a *huge* difference in gaming, something you really need to experience firsthand to appreciate.
 
For one, it makes the general non-gaming Windows experience insanely better. The refresh rate is also the cap on how many fps your display can actually show you.
 
It's great on a CRT, or in combination with ULMB or other backlight strobing (as long as the frame rate doesn't drop below refresh rate).

Otherwise, it's nothing to get excited about, in my experience. The motion resolution will still be terrible on an LCD.
 
Otherwise, it's nothing to get excited about, in my experience. The motion resolution will still be terrible on an LCD.

"Motion resolution" (assuming you're talking about motion blur due to persistence) is directly affected by refresh rate. With a high enough refresh rate (assuming that the pixel rise and fall times are fast enough so that the pixels are truly refreshed at this rate), you are effectively getting lower pixel persistence, and thus less motion blur.

Whether or not pixels can display information that is truly independent between refreshes, at such high rates, is of course an open question. But in theory, if they could, you'd get less motion blur with higher refresh rates.
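For anyone who wants rough numbers on that, here's a minimal back-of-the-envelope sketch in Python. It assumes an ideal sample-and-hold panel (instant pixel transitions, no strobing), and the 960 px/s pan speed is just an arbitrary example value, not anything measured:

```python
# Rough illustration of the persistence argument above: on an ideal
# sample-and-hold display (instant pixel transitions, no strobing),
# each frame is held for the full refresh period, so the eye-tracking
# blur width scales with 1/refresh_rate.

def persistence_ms(refresh_hz: float) -> float:
    """Frame hold time in milliseconds for a sample-and-hold display."""
    return 1000.0 / refresh_hz

def blur_width_px(speed_px_per_sec: float, refresh_hz: float) -> float:
    """Approximate eye-tracking motion blur in pixels: the distance a
    tracked object moves during one held frame."""
    return speed_px_per_sec * (persistence_ms(refresh_hz) / 1000.0)

if __name__ == "__main__":
    speed = 960  # px/s pan speed, arbitrary example value
    for hz in (60, 80, 100, 120, 144):
        print(f"{hz:>3} Hz: {persistence_ms(hz):5.2f} ms persistence, "
              f"~{blur_width_px(speed, hz):4.1f} px of blur at {speed} px/s")
```

At 60Hz that works out to ~16.7ms of hold time (~16 px of smear at that example speed) versus ~6.9ms (~6.7 px) at 144Hz, which is the "lower effective persistence" being described.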
 
Blur reduction / motion clarity increase (100 fps-Hz average highlighted):
0% (60 fps-Hz) -> 20% (~80 fps-Hz) -> <<40% (100 fps-Hz)>> -> 50% (120 fps-Hz) -> 60% (144 fps-Hz)
Motion definition / path articulation / smoothness:
1:1 (60 fps-Hz) -> 1.x:1 (~80+ fps-Hz) -> <<5:3 (100 fps-Hz)>> -> 2:1 (120 fps-Hz) -> 2.4:1 (144 fps-Hz)
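For what it's worth, those figures fall out of simple ratios against the 60 fps-Hz baseline. A quick sketch, treating blur reduction on an ideal sample-and-hold panel as 1 - 60/Hz and motion definition as Hz/60, lands close to the rounded numbers above:

```python
# Rough check of the chart above, taking 60 fps-Hz as the baseline.
# Blur reduction scales with the shorter frame hold time (1 - 60/hz);
# "motion definition" is how many unique frames you see per baseline frame.

BASELINE_HZ = 60

for hz in (80, 100, 120, 144):
    blur_reduction = 1 - BASELINE_HZ / hz   # persistence reduction vs 60 fps-Hz
    definition_ratio = hz / BASELINE_HZ     # frames shown per baseline frame
    print(f"{hz:>3} fps-Hz: ~{blur_reduction:4.0%} blur reduction, "
          f"{definition_ratio:.2f}:1 motion definition vs 60")
```

100 fps-Hz comes out to 1.67:1, i.e. the 5:3 ratio quoted above; 120 and 144 come out to 2:1 and 2.4:1.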


There are two benefits of higher Hz on a low response time monitor with a good modern gaming overdrive implementation:
- motion clarity (blur reduction), and
- motion definition (more world and object action state/position "slices" shown per second).

Even if you are championing backlight strobing, you have to admit that the motion definition side of the equation is way better at 100 to 144 fps-Hz regardless, so you are being dismissive in saying it's not a big difference from 60Hz.

--------------------------------------------------------------------------------------------------------------------

The thing about backlight strobing is that you can't use variable Hz with it, so you either have to use v-sync, with its downsides, at a very high frame rate, or run very high frame rates without v-sync and still suffer some of the screen aberrations that g-sync/variable Hz eliminates. Both of these options require you to turn down graphics fidelity (eye candy, fx, higher settings) on the most demanding games more than variable Hz does, because you can't be as flexible with your frame rate range when it's tied to the strobe frequency (as well as to the limitations of v-sync vs frame rate, if you are using that).

What variable Hz allows you to do, for example, is ride the frame rate graph 70 - 100 - 130(160) at a 100 fps-Hz average, or better yet 90 - 120 - 150+ at a 120 fps-Hz average, on a very high resolution monitor like a 2560x1440 or 3440x1440, and avoid the screen aberrations of not using v-sync without turning settings down as much on the most demanding games.
So you get a blend of those frame rates and corresponding Hz without:
- cutting off the high part of the graph's motion definition (and its motion clarity increases, even if that's not the near-total blur elimination of strobe mode), and
- having to turn the graphics fidelity down much further in the settings just to maintain a much higher minimum frame rate.
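As a toy illustration of that flexibility (made-up frame-rate trace and panel range, just to show the idea): with variable Hz, every sample inside the panel's range is displayed as it arrives, while a fixed 120Hz strobe only works cleanly for the samples that stay at or above it.

```python
# Toy illustration of the trade-off described above, using a fictional
# frame-rate trace. With variable Hz (g-sync/FreeSync) every sample inside
# the panel's range is simply displayed as it arrives; with a fixed
# strobe/v-sync rate, any sample below that rate means repeated frames,
# judder, or cutting settings until the *minimum* clears the target.

frame_rate_trace = [72, 88, 104, 131, 97, 84, 118, 146, 93, 76]  # fictional ~100 fps-Hz average ride
vrr_range = (40, 144)   # assumed panel variable-Hz range
strobe_rate = 120       # fixed ULMB strobe target

within_vrr = sum(vrr_range[0] <= f <= vrr_range[1] for f in frame_rate_trace)
meets_strobe = sum(f >= strobe_rate for f in frame_rate_trace)

print(f"Average frame rate: {sum(frame_rate_trace) / len(frame_rate_trace):.0f} fps")
print(f"Samples handled cleanly by variable Hz: {within_vrr}/{len(frame_rate_trace)}")
print(f"Samples meeting a {strobe_rate} Hz strobe: {meets_strobe}/{len(frame_rate_trace)}")
```

With numbers like these, the whole graph rides smoothly under variable Hz, but only a couple of peaks would satisfy a locked 120Hz strobe without cutting settings.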

Blur reduction / motion clarity increase (100 fps-Hz average highlighted):
0% (60 fps-Hz) -> 20% (~80 fps-Hz) -> <<40% (100 fps-Hz)>> -> 50% (120 fps-Hz) -> 60% (144 fps-Hz)
Motion definition / path articulation / smoothness:
1:1 (60 fps-Hz) -> 1.x:1 (~80+ fps-Hz) -> <<5:3 (100 fps-Hz)>> -> 2:1 (120 fps-Hz) -> 2.4:1 (144 fps-Hz)


From Blurbusters.com:


Sustaining a perfect 120fps @ 120Hz means your graphics settings would have to be low. At 1080p, and in non-demanding games (e.g. CS:GO), you would get a lot more mileage out of your GPU power, but at higher resolutions and in more demanding games, forget it.



This simple little single-object bitmap UFO below, going from 60Hz to 120Hz, may not seem like a big difference. But consider that at 60Hz, during mouse-looking and movement keying in 1st/3rd person games, the entire full-screen viewport is smearing like that relative to your viewpoint, and that at 100 - 120 - 144 fps-Hz it instead tightens to more of a soft blur within the shadow masks of everything on screen, while you ALSO gain all of the motion definition increases in motion flow and motion path articulation that the higher frame rates provide at high Hz. That is quite an improvement. And it comes while preserving the higher end of the average frame rate's roller coaster graph, not cutting it off with v-sync or turning the graphics settings down a lot more to bring up the low end. If getting high frame rates were not difficult against the graphics ceilings of the most demanding, envelope-pushing games, especially on 1440p or higher monitors, it might be a different story. As it is, it's a trade-off, and "better" is very subjective considering the following:
Backlight strobing main cons:
- Display "muting":
  - reduced brightness
  - degraded color quality
- Flicker (and eyestrain for some people, especially below a sustained 120fps @ 120Hz strobe)
- Input lag with v-sync, tearing without
- Requires some combination of a powerful GPU, lower resolution, and much lower settings to get the full benefit in demanding games

144 fps-Hz would be 6.94ms persistence btw, so somewhat tighter than the 8ms shown.
If this chart were more accurate, the display (the UFO and even the background in this case) would be dim and the color saturation muted if the last two examples were backlight strobing monitors.

lcd-blur_quick-chart.png
 
Neat. That's nice for a simple postage-stamp bitmap UFO, but it ignores a lot in the real-world scenario.
Your gif ignores the large graphics settings drop needed with ULMB/LightBoost on demanding games and resolutions above 1080p in order to get 100 fps-Hz solid (not average), which really should be 120 fps-Hz solid before the strobe effect stops bothering people's eyes.
And assuming you are avoiding v-sync, you go back to suffering some screen aberrations unless you use some form of adequate dynamic sync, or else you suffer the downsides of v-sync.

In reality it's more like THIS, except the last frame would be half as bright or worse, "muted" due to the strobing effect. The graphics settings would also be cut low compared to the first few example frames if it were a demanding game.
------------------------------------------------------------------------------------------------------

100fps average graph example (vs 60fps avg graph)


With variable Hz tech, you open up the whole range of the frame rate graph, which allows you to keep the graphics settings higher, targeting the average/middle of the graph and riding the frame rate graph smoothly.
With ULMB, you are stuck at whatever graphics settings you have to hack and slash down to until the lowest part of the graph is over 100fps (while in effect cutting off the top half of the graph). That is a lot of cutting to shift it up to 120 fps-Hz solid (not average) if running above 1080p in a demanding game.

When a hypothetical full-scene GIF got to its ULMB-mode example frames in any kind of demanding game (or even moderately demanding ones at 1440p or higher), the graphics settings would drop down a lot (including things like model detail, textures, depth via bump mapping, other shaders, FX, shadows, view distances, animated objects viewable in the distance, etc.) while the brightness and vibrancy would go dim. You could also get some stutter and stops unless you were running at your minimum frame rate constantly.

So say it was a very demanding game on a 2560x1440 monitor at very high+ to ultra graphics settings, in variable Hz mode at a 100 fps-Hz average. How low would the graphics settings have to be slashed to get 120 fps-Hz locked/minimum for ULMB mode? The scene would go from The Witcher 3 to N64 Zelda (exaggeration). For games like CS:GO and some simple isometric-style games, switching to ULMB might be fine. A very powerful GPU driving a 1080p monitor probably wouldn't have to gut the graphics settings quite as deeply to get 120 fps-Hz solid/minimum for ULMB. However, most people are looking toward 1440p and even 4k 144Hz monitors in the future, and even VR's current combined resolution is 2160 x 1200.

The counter-argument is that at speed, without strobe mode or black frame insertion, the viewport and objects will blur, which also drops the effective detail. However, at a 100 fps-Hz average and higher this becomes more of a soft blur at speed, within the "mask" of on-screen objects, architecture, and landscape, while you still get all of the motion definition increase. This blur can be intermittent, and its severity can vary depending on how fast you are looking around and what is going on at any given moment. Overall the trade-off is less exaggerated than the trade-offs of low graphics settings and a muted screen (and even eye strain) in strobe mode.

Another concern is that backlight strobing mode probably won't work with HDR mode on future monitors, so HDR would be a major loss. However, if a mfg did allow ULMB/strobe mode as an option on an HDR monitor outside of HDR mode, it could be a boon to ULMB users, since true HDR monitors would have at least 1000 nit peak brightness. That peak brightness could help against the muting of the screen in strobe mode. But again, I doubt any mfg would offer a monitor that allowed strobe mode to work alongside HDR mode (and variable Hz mode), which is where things seem to be going... if they even include strobing at all, even on g-sync HDR monitors.

At least when using ULMB mode, you would still get most of the other side of the equation - the high frame rate and hz that provide much greater motion definition (which differs from motion clarity/blur reduction). You'd get 5:3 ratio of frame increase at 100fps-hz (though flickery) or 2:1 at 120fps-hz, vs a 60fps-hz baseline of course.

Motion definition representations
 
ULMB doesn't mute the screen unless you're using over 120 cd/m2 brightness, which is bad. ULMB on every monitor that supports it reduces brightness to around 100-120 cd/m2, which is perfectly fine and what everyone calibrates to.
 
Is that brightness measurable externally by calibration hardware? Even then it might not reflect what you actually see. The problem is (or at least was when I tried it) that the strobing makes your eyes/brain perceive the panel brightness as much more muted and dim, even with the typical brightness maxed out. Much dimmer than the preferred vibrancy and brightness of a game outside of ULMB mode.

TFT Central PG278Q ULMB brightness per pulse width (max brightness OSD)


It is a major trade-off among the others: lowering graphics settings a lot more on demanding games and higher resolutions (to get the low end of the frame rate graph up to 100 or 120 fps-Hz for strobing) versus, for example, running 1440p at high+ settings to get a 100 fps-Hz average using variable Hz instead of strobe mode; having to suffer the downsides of v-sync (or similar) or no sync at all; and strobing potentially being incompatible with HDR-capable monitors going forward.

I've actually used ULMB mode in the past for Torchlight 2, L4D2, and Darksiders because they had extremely high frame rates. It works pretty well for those, but that's not true of most games. I found Darksiders looked a lot better in variable Hz mode, unmuted, though.
 
I am fine with ULMB at 85 Hz (near maximum strobe length). For people who need 100 or even 120 Hz, I guess ULMB is less useful.

I wonder if something like the reprojection they do in VR could be a good fit for games in general. But considering the small percentage of people who care about higher than 60 fps, I guess it's not gonna happen.
 
Is that brightness measurable externally by calibration hardware? Even then it might not reflect what you actually see. The problem is (or at least was when I tried it) that the strobing makes your eyes/brain perceive the panel brightness as much more muted and dim, even with the typical brightness maxed out.

Meters certainly measure luminance. And they, like the human visual system, integrate luminance over time. So, for example, if a display pulses light as follows: 100 cd/m^2 pulses, each lasting 1 ms and occurring every 10 ms, then over the course of 100 ms there will have been roughly ten 100 cd/m^2 pulses of 1 ms each. The total "energy" in this 100 ms period is equal to that of a display showing a constant (i.e. non-pulsing) field of light at 10 cd/m^2.

All meters have an integration time - for example, the i1Display Pro can be adjusted between 0.5 and 2 seconds (or something like that). So the i1Display Pro would measure both examples above at 10 cd/m^2.
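Put as a quick calculation, using the same numbers as the example above:

```python
# Numeric version of the example above: a backlight pulsing at 100 cd/m^2
# for 1 ms out of every 10 ms. Integrated over any window much longer than
# one pulse period, the average luminance (what a meter with a 0.5-2 s
# integration time reports, and roughly what the eye sums over short
# intervals) is just peak luminance times duty cycle.

peak_luminance = 100.0   # cd/m^2 during the pulse
pulse_ms = 1.0           # pulse (strobe) duration
period_ms = 10.0         # time between pulses

duty_cycle = pulse_ms / period_ms
average_luminance = peak_luminance * duty_cycle

print(f"Duty cycle:        {duty_cycle:.0%}")          # 10%
print(f"Average luminance: {average_luminance:.1f} cd/m^2")  # 10.0 cd/m^2
```

So a meter and, roughly, your visual system both land on peak luminance times duty cycle, which is part of why short strobe pulse widths read (and look) so much dimmer.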

The human visual system also integrates light over time (look up Bloch's law).

Not sure if this answers your question, but it's important to understand that temporal summation plays a big role in how most light measuring instruments (the human visual system being one of them) operate.
 