240Hz is best, quantitatively

Without blind testing, this is a personal anecdote that may be colored by the placebo effect.
So what if I said 4K was placebo? The human eye can't possibly see more than 1080P. That is just as ridiculous as this placebo argument.
 

Depending on viewing distance, it is for many people. Most people sit so far from their 4K TVs that they are barely inside the zone where 1080p is distinguishable, let alone the zone where 4K is, yet they rushed out to buy a cheap 4K TV upgrade because 4K is more, and "more is automatically better".
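For a rough sense of where those zones fall, here is a back-of-the-envelope sketch in Python, assuming the commonly cited ~60 pixels-per-degree limit for 20/20 acuity; the 65" panel and 9-foot couch distance are just example numbers, not anyone's actual setup:

import math

def max_resolvable_width_px(screen_width_in, viewing_distance_in, ppd_limit=60):
    # Horizontal field of view the screen subtends, in degrees, multiplied by an
    # assumed ~60 pixel-per-degree acuity limit (a rule of thumb, not a hard cutoff).
    fov_deg = 2 * math.degrees(math.atan((screen_width_in / 2) / viewing_distance_in))
    return fov_deg * ppd_limit

width_in = 65 * 16 / math.hypot(16, 9)                   # a 65" 16:9 panel is ~56.7" wide
print(round(max_resolvable_width_px(width_in, 9 * 12)))  # ~1760 px at a 9-foot couch distance:
# 1080p (1920 px wide) is already near the limit there, and 4K (3840 px) is far past it.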

It's exactly in line with the audio analogy I presented as well.

There are many practical limits to human perception, and many people will just assume more of something is always better and perceivable, when it isn't.

Until it is actually tested in a controlled blind experiment, it is nothing more than relating anecdotes, and you get decades of those from audiophiles, all of them complete rubbish.

Also note that I am not saying that it is placebo.

Just that until it is somewhat rigorously tested, we really can't say.

You need to put it to the test, not rely on notoriously bad self-reporting of individual perception.
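For what it's worth, "putting it to the test" doesn't need to be elaborate; a randomized ABX run plus a simple binomial check would do. A minimal sketch in Python, with purely illustrative trial counts:

from math import comb

def abx_p_value(correct, trials):
    # One-sided p-value: the chance of getting at least `correct` hits out of
    # `trials` ABX trials by pure guessing (probability 0.5 per trial).
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Hypothetical run: the viewer picks out the 240Hz setup 12 times in 16 blinded trials.
print(f"{abx_p_value(12, 16):.3f}")   # ~0.038, hard to explain as guessing at the usual 5% level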
 
Okay, I see what you are saying.

It's just that the difference seems clear to me, but I agree it is difficult to say for sure.
 
Well, I won't re-post the entirety of my previous reply, but it spells this out clearly and shows it tested with pursuit cameras.

240fps at 240Hz is a "haze": much less sample-and-hold blur during movement than the "fuzzy"/"soften blurred" look at ~120fps (at 120Hz and higher) or the "smearing" wash at 60fps (at 60Hz and higher).

This goes way beyond a single simple bitmap UFO and applies to the entire viewport movement of the game world relative to you during movement-keying, mouse-looking, and controller panning. That includes texture detail, depth via bump mapping, other effects, and text detail lost during panning at speed.

I have used 60Hz screens right next to 120Hz-and-higher ones (at high frame rates), experimented with different LightBoost amounts, and used 60Hz screens next to a graphics-professional "zero blur" FW900 CRT. I can definitely tell the difference between bad sample-and-hold blur, better sample-and-hold blur, and "zero" blur while moving the game world around at speed relative to me in 1st/3rd person games.

Always remember that you have to fill those Hz with new frames, though. Saying you don't notice a difference when you aren't running 240fps on a 240Hz monitor is irrelevant.

The third UFO pursuit-camera shot down the list is the 240fps at 240Hz example in this picture quoted from blurbusters.com. Extrapolate each UFO to the entire game world moving around relative to you during viewport movement at speed.

[Image: blurbusters.com pursuit camera comparison of UFO motion blur at increasing framerates/refresh rates]
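If it helps to see the arithmetic behind those pursuit shots: for an ideal sample-and-hold display the blur width is just panning speed times frame time. A quick Python sketch (960 px/s is the TestUFO default pursuit speed; the labels are my shorthand):

def persistence_blur_px(panning_speed_px_per_s, fps):
    # Ideal sample-and-hold: each frame is held for the full refresh period,
    # so a tracked object smears across (speed x frame time) pixels.
    return panning_speed_px_per_s / fps

speed = 960  # px/s, the TestUFO default pursuit speed
for fps in (60, 120, 240, 1000):
    print(f"{fps:>4} fps: ~{persistence_blur_px(speed, fps):.1f} px of blur")
# 60: ~16 px (smear), 120: ~8 px (fuzz), 240: ~4 px (haze), 1000: ~1 px ("zero", CRT-like)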
 

Missing the point. No one is arguing that equipment can't find a difference.

It's whether that difference is detectable in a semi-rigorous human perception test (preferably double blinded).

And given that the claimed importance is for game playing, that difference should be detectable in games, not in some kind of purpose-built artifact-inducing program.
 
I stated that I have had 60Hz smearing blur, a 120Hz 1080p monitor and later better ones, and professional CRT "zero" blur. The 60Hz screen and the CRT are at completely opposite ends, and obviously so to perception. 120fps at 120Hz is also an obvious improvement over 60Hz, while not as good as the "zero" sample-and-hold blur of a CRT.

This is obvious when moving the viewport at speed, because the whole game world goes from smearing, to visibly less blurring, to crystal-clear zero blur on the CRT.

I don't need a pursuit camera to see this; it is just a good way to show it to people who don't have the hardware to see it for themselves.

It also makes it obvious that the closer you get to the zero blur of 1000fps at 1000Hz the better, and as the equipment shows, higher Hz at higher fps makes gains back toward that "zero blur" goal.



Your frame rate is a huge factor though, especially since people generally try to squeeze out the highest settings, so they straddle a large frame rate range. In that case it muddies the results in actual gameplay for a lot of people. That's why I kept reiterating that you have to fill the Hz with new frames to get the improvements. Playing watered-down frame rate roller coasters on a 240Hz monitor, which is the common usage scenario, would be more in line with what you are claiming. Going from 120fps at 120Hz to 240fps at 240Hz is an incremental tightening of the sample-and-hold blur, from a fuzzy soften blur to a light haze so to speak, so it isn't as drastic as going from the smearing blur of 60fps at 60Hz+ to half of that at 120fps at 120Hz+, or as drastic as going from sample-and-hold blur to "zero" blur on a CRT.
If you are running a variable frame rate graph of +/- 30fps around your average in both scenarios, you aren't going to be as definitive about things. It might not be quite as obvious running 90 - 120 - 150 vs 190 - 210 - 240 ranged averages as running 120fps solid vs 240fps solid, but it will still be an improvement. 120fps solid at 120Hz+ cuts the blur 50%, 240fps solid at 240Hz+ cuts it down 75%: incremental but appreciable. Whether it's worth the frame rates that would be required is another matter. Running lower fps ranges on a higher Hz monitor is again irrelevant.
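The 50% / 75% figures fall straight out of persistence scaling as 1/fps. A small Python sketch, assuming the fps always fills the Hz, that also shows how a +/- 30fps swing muddies the comparison between the two tiers:

def blur_reduction_vs_60(fps):
    # Sample-and-hold persistence scales as 1/fps, so blur relative to 60fps is 60/fps.
    return 1 - 60 / fps

for fps in (120, 240):
    print(f"{fps} fps solid: {blur_reduction_vs_60(fps):.0%} less blur than 60fps")  # 50% and 75%

# Variable frame rate ranges blur the line between the tiers (persistence in ms per frame):
for lo, hi in [(90, 150), (190, 240)]:
    print(f"{lo}-{hi} fps range: {1000 / hi:.1f} to {1000 / lo:.1f} ms")  # ~6.7-11.1 vs ~4.2-5.3 ms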
 

I agree. I have a hard time telling a perceived difference between 120 Hz and 144 Hz on my display, but the difference is marked from 60 to 100 Hz and still noticeable from 100 to 120. Honestly though, we are very good at adapting, so after playing a while at any refresh rate you tend to just ignore it and concentrate on the game, in the same way that the 3D aspect of a 3D movie in the theater becomes less noticeable about 15 minutes in.

Since reducing motion blur by increasing refresh rate and frame rates to match is a tough performance challenge for both the display and the GPU, I wish there was more effort put into making ULMB-type tech better. I get by far the least blurry visuals running 120 Hz ULMB. I've even tried locked 60 fps games like Dark Souls with this and gotten better results. The primary issue with ULMB is lower brightness, but for me it's a non-issue because I normally run my PG278Q at something like a 22 brightness setting, so going to ULMB at 100 brightness is about the same. Obviously it's going to be a problem if you need more brightness and have nowhere to go. It's a shame that the HDR monitors that would have the potential for higher brightness don't include ULMB at all.
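To put rough numbers on that ULMB tradeoff: with a strobed backlight the perceived blur tracks the strobe pulse width rather than the full refresh period, while light output drops roughly with the duty cycle. A simplified Python sketch (the 1 ms pulse and 350 nit panel are illustrative, and real ULMB drives the backlight harder during the pulse, so the loss is less brutal than this pure duty-cycle model suggests):

def strobed_display(refresh_hz, pulse_ms, panel_nits, speed_px_s=960):
    frame_ms = 1000 / refresh_hz
    duty = pulse_ms / frame_ms                  # fraction of each frame the backlight is lit
    blur_px = speed_px_s * pulse_ms / 1000      # blur tracks the pulse width, not the frame time
    return blur_px, panel_nits * duty

for pulse_ms in (8.33, 1.0):                    # full persistence vs a ~1 ms strobe at 120 Hz
    blur, nits = strobed_display(120, pulse_ms, panel_nits=350)
    print(f"{pulse_ms} ms pulse: ~{blur:.1f} px blur, ~{nits:.0f} nits")
# ~8 px at full brightness vs ~1 px at roughly 12% of the panel's light output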

The only new thing in this area seems to be ASUS's "ELMB", which combines VRR with a strobing backlight. I hope that becomes a thing and works as expected. Having it in their upcoming XG438Q would be awesome, but that's probably not going to happen.
 
For reference, here is a reply from the thread about that monitor you are talking about, on why the current incarnations of strobing are not for me.


I've heard people's reports of strobing cutting their peak brightness by 2/3. If that were the case, then to do HDR1000 + strobing it would have to hit 3000 nit peak color brightness with strobing off, and for 350 nit SDR it would have to hit 1050 nit peak brightness with strobing off. As it is, according to that site these are 350 - 400 nit with strobing off depending on the model, so HDR is just a fake label.
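That is just backing out the required panel brightness from the reported loss; a tiny Python sketch, taking the "2/3 loss" figure above at face value:

def required_unstrobed_nits(target_nits, strobe_loss=2 / 3):
    # If strobing only passes (1 - strobe_loss) of the light, the panel needs
    # target / (1 - strobe_loss) nits with strobing off to hit the target while strobed.
    return target_nits / (1 - strobe_loss)

print(round(required_unstrobed_nits(1000)))  # ~3000 nits needed to strobe real HDR1000
print(round(required_unstrobed_nits(350)))   # ~1050 nits needed to strobe 350 nit SDR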

Most people avoid PWM like the plague.
With variable refresh rate, you'd have to have a typical LOW of 100fps to maintain 100fps-at-100Hz strobing, not an average of 100fps.
Since typical frame rate graphs tend to be +/- 30fps from the average for the most part, that could mean a 130fps average for a mostly 100 - 130 - 160 fps graph. Even then, 100Hz strobing and variable rate strobing over hours of game time could fatigue people's eyes, and some people are more sensitive to flicker than others as well.
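A tiny sketch of that low-vs-average point in Python, using the rough +/- 30fps swing described above (real frame time graphs obviously vary):

def required_average_fps(strobe_hz, swing=30):
    # The frame rate LOW, not the average, has to stay at or above the strobe rate,
    # so with a +/- `swing` fps graph the average must sit `swing` fps higher.
    return strobe_hz + swing

print(required_average_fps(100))  # ~130 fps average for a mostly 100 - 130 - 160 fps graph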

People use G-SYNC/VRR in order to push graphics settings higher while riding a frame rate range without hiccups. The desire for very high to ultra game settings, leaning on VRR to smooth things out, seems in direct opposition to the very high frame rates required for high-Hz strobing, and on top of that there is the very high peak color brightness required for even fractional HDR, so this kind of strobing seems very situational usage-wise. At least 1440p puts higher frame rate lows more within reach.

I could see this tech being much more useful if someone developed a very high quality interpolation to go along with it, doubling or tripling the typical frame rates and combining that with a very high peak brightness to start with, like a Q9FN's 1800 - 2000 nit peak color brightness or higher. (Those Q9FNs also have interpolation and black frame insertion, I believe, but I'm not sure of the quality vs artifacts. Those flagship TVs are still only a 60Hz 4K line without HDMI 2.1 for now, but they can do 1440p non-native at 120Hz with VRR/FreeSync from AMD GPU PCs and VRR from an Xbox One.)

With a theoretical very high quality interpolation that avoided input lag and artifacts, you could run a 70fps-average graph of something like 40 - 70 - 100 fps interpolated x3 to 120 - 210 - 300 fps, or multiplied even more. That way the strobing could be very fast. Incidentally, if we ever get 1000Hz super-high-response-time displays with interpolation of something like 100fps x 10 for 1000fps at 1000Hz, we wouldn't even need strobing, since there would only be about 1 pixel of sample-and-hold blur, just like a CRT: essentially "zero" blur.
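A sketch of that hypothetical interpolation math in Python (no actual interpolator is assumed, just the frame-rate multiplication and the resulting persistence at the TestUFO 960 px/s pan speed):

def interpolated_range(fps_range, factor):
    # Multiply a low/average/high frame rate graph by the interpolation factor.
    return tuple(fps * factor for fps in fps_range)

print(interpolated_range((40, 70, 100), 3))      # -> (120, 210, 300)

# At 1000fps on a 1000Hz display, a 960 px/s pan smears by about a pixel per frame:
print(f"{960 / 1000:.2f} px of sample-and-hold blur")  # effectively "zero", CRT-like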



The below is dated compared to combined VRR + strobing, but most of the tradeoffs still apply.


==================================
Easier-to-render games with very high fps work pretty well with ULMB.
Running 1440p or higher rez with any kind of high to ultra settings on the most demanding games won't let you sustain high fps, only average it.

As per blurbusters.com's Q&A:

-----------------------------------------------------
Q: Which is better? LightBoost or G-SYNC?
It depends on the game or framerate. As a general rule of thumb:
LightBoost: Better for games that sustain a perfect 120fps @ 120Hz
G-SYNC: Better for games that have lower/fluctuating variable framerates.

This is because G-SYNC eliminates stutters, while LightBoost eliminates motion blur. LightBoost can make stutters easier to see, because there is no motion blur to hide stutters. However, LightBoost looks better when you’re able to do perfect full framerates without variable frame rates.

G-SYNC monitors allows you to choose between G-SYNC and backlight strobing. Currently, it is not possible to do both at the same time, though it is technically feasible in the future.
......
Main Pros:
+ Elimination of motion blur. CRT perfect clarity motion.
+ Improved competitive advantage by faster human reaction times.
+ Far more fluid than regular 120Hz or 144Hz.
+ Fast motion is more immersive.

Main Cons:
– Reduced brightness.
– Degradation of color quality.
– Flicker, if you are flicker sensitive.
– Requires a powerful GPU to get full benefits. <edit by elvn: and turning down settings a lot more at higher resolutions>

--------------------------------------------------------
During regular 2D use, LightBoost is essentially equivalent to PWM dimming (Pulse-Width Modulation), and the 2D LightBoost picture is darker than non-LightBoost Brightness 100%.
--------------------------------------------------------
Once you run at frame rates above half the refresh rate, you will begin to get noticeable benefits from LightBoost. However, LightBoost benefits only become major when frame rates run near the refresh rate (or exceeding it).
-------------------------------------------------------
If you have a sufficiently powerful GPU, it is best to run at a frame rate massively exceeding your refresh rate. This can reduce the tearing effect significantly. Otherwise, there may be more visible tearing if you run at a frame rate too close to your refresh rate, during VSYNC OFF operation. Also, there can also be harmonic effects (beat-frequency stutters) between frame rate and refresh rate. For example, 119fps @ 120Hz can cause 1 stutter per second.
Therefore, during VSYNC OFF, it is usually best to let the frame rate run far in excess of the refresh rate. This can produce smoother motion (fewer harmonic stutter effects) and less visible tearing.
Alternatively, use Adaptive VSYNC as a compromise.
-------------------------------------------------------------
Pre-requisites
Frame rate matches or exceeds refresh rate (e.g. 120fps @ 120Hz).

  1. LightBoost motion blur elimination is not noticeable at 60 frames per second.

--------------------------------------------------------------

Sensitivity to input lag, flicker, etc. (You benefit more if you don’t feel any effects from input lag or flicker)

Computer Factors That Hurt LightBoost

  • Inability to run frame rate equalling Hz for best LightBoost benefit. (e.g. 120fps@120Hz).
  • Judder/stutter control. Too much judder can kill LightBoost motion clarity benefits.
  • Framerate limits. Some games cap to 60fps, this needs to be uncapped (e.g. fps_max)
  • Faster motion benefits more. Not as noticeable during slow motion.
  • Specific games. e.g. Team Fortress 2 benefits far more than World of Warcraft.
  • Some games stutters more with VSYNC ON, while others stutters more with VSYNC OFF. Test opposite setting.
-----------------------------------end-of-blurbuster's-quotes--------------------------
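As an aside, the "119fps @ 120Hz can cause 1 stutter per second" line in the quote above is just the beat frequency between frame rate and refresh rate; a minimal Python sketch:

def stutter_beats_per_second(fps, refresh_hz):
    # With VSYNC OFF, mismatched rates drift in and out of phase |refresh - fps|
    # times per second, which shows up as a periodic harmonic stutter.
    return abs(refresh_hz - fps)

print(stutter_beats_per_second(119, 120))  # 1 stutter per second
print(stutter_beats_per_second(118, 120))  # 2 per second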
 