ASUS/BENQ LightBoost owners!! Zero motion blur setting!

Your stance on this is already well documented. Also completely indefensible. How like you to completely ignore an entire thread full of information about how TERRIBLE sample-and-hold LCDs are with motion. Totally ignore the comparatively atrocious IPS response time and its effect on motion blur. Continue to congratulate yourself on your amazing purchase.

I don't know if he is right, but I will say this: I have asked before whether smaller pixels are capable of refreshing faster, and I think they are, so smaller pixels may have less motion blur. Many LCD panels from the same company in different sizes (i.e. different DPI) list the larger monitors as having a longer G2G response time.

I saw a lot of people who were very happy with the 2560x1440 monitors; maybe this could explain why they seem to perform well.
 
So unless you are going to look around like some kind of blinking gnat all the time, or maybe you don't play any games with gorgeous landscapes and textures, idk... to each his own, though. LightBoost 2 reports describe motion similar to a CRT. The car simulation below is just a single cel-shaded car object; a full scene of high-detail textures + bump mapping and high-detail object geometry blurred out is even worse.

As for 120Hz, the consensus is that it still blurs, just not as much. ...
The most modern, extremely high-resolution texture-mapped games, with bump-mapping depth and shaders, make LCD screen blur even more obvious and eye-wrenching than before, since the blur on fast FoV movement washes out the extremely high detail and 3D depth my eyes "have a lock on" every time. It strains my eyes and is much more obnoxious to me than simpler-textured/older games.

Most people seem to agree with this representation of 60Hz/120Hz LCD vs CRT blur in games.

[Image: lcd-blur.jpg -- 60Hz/120Hz LCD vs CRT motion blur comparison]


So it appears to me that 120Hz, against the limitation of LCD pixel response times and retinal retention blur, would still not be enough to retain focus on texture detail (much like fine text scrawled on a surface that gets smudged out) and bump-map depth.

It's like you have goggles filled with some liquid gel, and every time you turn quickly your eyes see all fine detail lost in a blur. 120Hz might replace your goggles with a fluid that has double the viscosity, blurring nearly half as much, but it's still a lousy prescription compared to clear sight, imo.
 
Vega's playstyle is a "workaround" vs blur, as I've said before - whether he started doing it because it fit his skill-style or not. Personally I hate blur and like having complete freedom of movement.
It really boils down to a matter of preference, especially if you're just playing casually and non-professionally.

I've noticed that the gameplay style of long-time CRT players in competitive FPS (especially those who play professionally) tends to be somewhat different from the gameplay style of LCD players. The professional competition players like the full freedom of movement and find color accuracy somewhat less important than the lack of motion blur. Certain types of multiplayer tactics, such as "circle-strafing", benefit more from CRT and LightBoost than hiding stationary behind a bunker gun port as a sniper. Playing "Scout" in TF2 is another excellent example of the CRT / LightBoost benefit -- Scout is a fast-moving class. People who have developed the ability to shoot while panning on a CRT are invariably the audience that tends to dislike LCDs the most, and they should give LightBoost a try if they need a non-CRT technology for any reason (portability, space considerations, etc.).

As the years go by, we need to whittle away at the LCD disadvantages. What we have proven is that it is possible to bypass the pixel persistence of LCD as the limiting factor of motion blur: the high-speed camera footage of LightBoost on YouTube shows a strobed backlight keeping the pixel transitions in the dark and strobing only on fully-refreshed frames. What we need to see is this applied to IPS as time goes on, and preferably (eventually) to 1440p and even 2160p. (To dream a decade or two ahead: NHK demonstrated an 8K 120Hz format too...) The technology is already here today, so let's run with it (unlike OLED, which will probably be more expensive than LCD for at least a decade).
 
I still maintain that Battlefield-type games aren't where CRT levels of motion clarity really pay off, which is why Vega is happy with the Catleap and the benefits it provides. I'm not bashing the game, before someone jumps in :)

The ultimate recommendation would be from a long-term Quake player with recent CRT exposure and a hatred of all LCDs. If no one else fits the bill, I'll give my verdict when the Asus VG248QE hits the UK :)
 
This would make a lot of sense. Battlefield games aren't as critical for zero motion blur as, say, TF2 or Quake Live.
 
If no one else fits the bill, I'll give my verdict when the Asus VG248QE hits the UK :)
I am looking forward to the VG248QE and how it tests out.
That said, here are some quotations I've collected from various forums, from CRT users who have confirmed the zero motion blur effect of LightBoost LCDs, including their original links. See below:

Mark Rejhon said:
Confirmed!! I've since purchased the Asus VG278H & a GTX680, and it works! Zero motion blur confirmed -- looks like CRT motion.
original post

Transsive said:
Then yesterday I, for some reason, disabled the 3d and noticed there was no ghosting to be spotted at all in titan quest. It's like playing on my old CRT.
original post

Inu said:
I can confirm this works on BENQ XL2420TX
EDIT: And OMG i can play scout so much better now in TF2, this is borderline cheating.
original post

Terrorhead said:
Thanks for this, it really works! Just tried it on my VG278H. Its like a CRT now!
original post

Vega said:
Oh my, I just got Skyrim AFK camera spinning (which I used to test LCD's versus the [Sony CRT] FW900) to run without stutters and VSYNC locked to 120. This Benq with Lightboost is just as crystal clear if not clearer than the FW900 motion. I am in awe. More testing tomorrow. Any of my doubts about this Lightboost technology have been vaporized! I've been playing around with this fluid motion on this monitor for like 6-hours straight, that is how impressive it is.
original post

Agreed that there are disadvantages -- it's a TN panel, so its color isn't as good as IPS. But obviously, the niche audience of CRT game players has already confirmed the "CRT feel" of LightBoost, confirmed by multiple sources. Some still prefer other displays (e.g. for resolution and color), but they are all nonetheless confirming the zero motion blur effect.

Let's hope this technology filters down to IPS!
And with easier methods of enabling/disabling the strobed backlight feature (like a button or hotkey) for those times we don't need it, rather than going into the nVidia Control Panel to enable/disable it.
 
Frame rate must match the refresh rate, otherwise the game starts to look bad -- the same issue as with CRTs.
So you have 100, 110, 120 Hz in 2D or 50, 55, 60 Hz in 3D.

Under 120 fps you get stuttering due to frame repeating (which also creates a blurring effect).
60 fps at 120 Hz shows a double-image effect.
 

It's bad, because I have 120 Hz but my framerate is always less than 120 fps in demanding games.

To conclude: at 120 Hz I must have 120 fps, isn't it? If not, I will have stuttering, blur or a double-image effect?

Frame rate must match the refresh rate, otherwise the game starts to look bad -- the same issue as with CRTs.
I never had this problem with a CRT. I always ran my CRT at 85 Hz with FPS at 85 or under, and I never had any problem like stuttering with that in lots of games.
 
I played several games on both the LCD and a CRT I borrowed. I did not see any difference in behavior.
Maybe you are used to stuttering/frame skipping, but since I haven't used a CRT in over 4 years the effect is new to me, so maybe more noticeable.

I think the pros and cons on framerate are the same as with CRTs.

You should wait for other opinions.
 
What is stuttering/frame skipping? I never use this.

I gave up my last CRT one year ago (an iiyama HM903DTB, after an iiyama A901HT, after a Sony 15", etc...) because of Windows 7, and I can say that with a CRT there aren't these problems.

The only game that had stuttering on my CRT has stuttering even on a 60 Hz LCD or a 120 Hz LCD without LightBoost. Stuttering doesn't come from the CRT light strobe.

You should wait for other opinions
Thanks for your answer. I will wait for other opinions.
 
The ultimate recommendation would be from a long-term Quake player with recent CRT exposure and a hatred of all LCDs. If no one else fits the bill, I'll give my verdict when the Asus VG248QE hits the UK :)

That's me spot on. Still on a 19" CRT for Quake and everyday use. Quake player since forever and extremely suspicious of LCDs. I'm going to get the new Asus when it arrives, and I hope I don't need to go back to my CRT.
 
It's bad, because I have 120 Hz but my framerate is always less than 120 fps in demanding games. To conclude: at 120 Hz I must have 120 fps, isn't it? If not, I will have stuttering, blur or a double-image effect?
If you are sensitive to this effect on CRT or plasma, then you will be sensitive to it on LightBoost.
If you are NOT sensitive to this effect on CRT or plasma, then you WON'T be sensitive to it on LightBoost.

Also, it helps to understand why it's easier to see on a CRT.
Stuttering is slightly harder to see on LCD because the blurring helps mask the stuttering effect more than it does on CRT.
Whatever you are (or are not) sensitive to on a CRT is the same as what you will (or won't) be sensitive to on LightBoost -- that's the point.

I never had this problem with a CRT. I always ran my CRT at 85 Hz with FPS at 85 or under, and I never had any problem like stuttering with that in lots of games.
Then you won't be sensitive to the same effect.

Sometimes, it depends on the person.
Sometimes it's a "knowing what to look for" type of thing.
(like a 30fps-versus-60fps side-by-side demonstration.)
 
That's me spot on. Still on a 19" CRT for Quake and everyday use. Quake player since forever and extremely suspicious of LCDs. I'm going to get the new Asus when it arrives, and I hope I don't need to go back to my CRT.
Let me know when it arrives -- did you order it already?

Note -- I'm still waiting for my BENQ XL2411T. I'm going to compare it to my ASUS, since I hear its LightBoost is much brighter than the ASUS's, and that it uses shorter backlight strobe lengths, which is better for the zero motion blur effect.
 
The only game that had stuttering on my CRT has stuttering even on a 60 Hz LCD or a 120 Hz LCD without LightBoost. Stuttering doesn't come from the CRT light strobe.
No, but it helps to understand why it's easier to see on a CRT.
Stuttering is slightly harder to see on LCD because the blurring helps mask the stuttering effect more than it does on CRT.

The bottom line: If you're not sensitive to stuttering, then it doesn't matter.
If you weren't sensitive enough to see it on CRT, you won't see it on LightBoost.
HOWEVER... if you are sensitive to stuttering, it's _definitely_ easier to "see" stuttering on CRT than on LCD, because motion blur on LCD masks the stuttering effect to an extent.

This is why, if you want the _best_ zero motion blur effect, you really do need the full framerate (e.g. 100fps@100Hz, or 120fps@120Hz). LightBoost strobing can be lowered to 100Hz on current LightBoost monitors.
 
I don't know if he is right, but I will say this: I have asked before whether smaller pixels are capable of refreshing faster, and I think they are, so smaller pixels may have less motion blur.
One big problem:

There is always motion blur on sample-and-hold displays, even with instant-responding (0ms) pixels. This motion blur is caused by eye-tracking-based motion, which is the major cause of motion blur. On a modern panel, pixel persistence is no longer the main cause of motion blur, because it's now only a tiny fraction (e.g. 2-5ms) of a 16ms refresh at 60Hz. See Science & References, which explains this further.

Your eyes are always moving while following a moving object. On a sample-and-hold display (non-strobed), your eyes have moved a tiny bit even over that 1/60th of a second (or 1/120th of a second), so the LCD frame is blurred across your retinas as the frames step forward one at a time. The edge blur thickness is equivalent to the step distance. For example, for a moving object that steps 16 pixels between frames, you get 16 pixels of edge blur -- even with instant-responding (0ms) pixels. Double the Hz and you halve the step, which is why 120Hz LCD has 50% less motion blur than 60Hz LCD. However, CRT at 60fps@60Hz has much sharper motion than LCD at 120fps@120Hz, and that is not only because of pixel persistence -- it has been scientifically measured to be eye-tracking-based motion blur on a sample-and-hold display. On newer LCD panels, retinal blurring during eye tracking is a MUCH BIGGER cause of motion blur than pixel persistence. (2ms out of a 16ms refresh means pixel persistence on a 2ms LCD is less than 20% of the cause of motion blur at 60Hz -- and, by itself, does not explain why motion blur is still so bad.)

You _need_ more frames (more Hz, or frame interpolation), _or_ strobing (CRT, LightBoost, scanning backlight, etc.). The stroboscopic effect eliminates the retinal blurring caused by eye tracking. In a perfect world we'd have infinite framerate, but that's technically impossible, so strobing is the easiest way (a la CRT) -- done at a high enough rate that the flicker doesn't bother the eyes. (Thus 120Hz and up is a good rate for strobing that looks flicker-free to most people.)
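To make the arithmetic above concrete, here's a minimal back-of-the-envelope sketch (mine, not from the post above; the speed and persistence values are simply the examples already mentioned):

```python
# Back-of-the-envelope eye-tracking motion blur estimate.
# Perceived smear ~= tracking speed * how long each frame stays lit while the eyes keep moving.

def blur_width_px(speed_px_per_s: float, visible_persistence_s: float) -> float:
    """Approximate retinal smear, in pixels, for an eye-tracked moving edge."""
    return speed_px_per_s * visible_persistence_s

speed = 16 * 60  # the example above: an object stepping 16 px per frame at 60 fps = 960 px/s

print(blur_width_px(speed, 1 / 60))   # 60 Hz sample-and-hold: ~16 px of blur
print(blur_width_px(speed, 1 / 120))  # 120 Hz sample-and-hold: ~8 px (half)
print(blur_width_px(speed, 0.002))    # ~2 ms strobe (CRT/LightBoost-like): ~1.9 px
```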
 
This is all great news, but I'm concerned that this development isn't getting the proper attention from news sites. I haven't seen anything about it on Gizmodo/Slashdot/Ars Technica. I just emailed Gizmodo about it, though, and submitted an article to Slashdot. Any others we should notify?

And help upvote my submission on Slashdot if you please. It's titled "Zero motion blur achieved..."

http://slashdot.org/recent
 

I'll be surprised if they take this up. I mean, telling the tech audience that your LCD has only just started catching up with CRT is kinda odd; it can make the tech industry look pretty bad.
 
If you are sensitive to this effect on CRT or plasma, then you will be sensitive to it on LightBoost.
If you are NOT sensitive to this effect on CRT or plasma, then you WON'T be sensitive to it on LightBoost.

Also, it helps to understand why it's easier to see on a CRT.
Stuttering is slightly harder to see on LCD because the blurring helps mask the stuttering effect more than it does on CRT.
Whatever you are (or are not) sensitive to on a CRT is the same as what you will (or won't) be sensitive to on LightBoost -- that's the point.

Then you won't be sensitive to the same effect.

Sometimes, it depends on the person.
Sometimes it's a "knowing what to look for" type of thing.
(like a 30fps-versus-60fps side-by-side demonstration.)
Thanks.

I am very sensitive to stuttering (the worst problem in video games for me), but I never had stuttering like Transsive describes with a CRT.

You speak about stuttering, but Transsive said that if I don't have 100 FPS at 100 Hz, I will have stuttering with a CRT or a LightBoost 120 Hz LCD. I don't understand why.
Even without LightBoost, on a 120 Hz LCD or a 60 Hz LCD, I will have stuttering.

I would like to know if not having the same FPS as Hz creates stuttering because of LightBoost, or because of the light strobe in a CRT, like Transsive describes.

For me, stuttering isn't created by the CRT or the LightBoost 120 Hz LCD, but by the game's 3D engine, graphics driver or something else. If stuttering arrives, it will arrive on a 60 Hz LCD, a CRT or a 120 Hz LCD with LightBoost, isn't it?

Maybe my definition of stuttering isn't the same as yours. For me, stuttering is jerkiness even at a high FPS like 50 or 60. For example, I have stuttering with DiRT 3 under Windows 7 even with a 60 Hz LCD, a CRT or a 120 Hz LCD.
With DiRT 3 under XP I don't have stuttering, even with a 60 Hz LCD, a CRT or a 120 Hz LCD. In both cases I don't have the same FPS as Hz.

I will try with a BenQ XL2420T to see if I see this problem.
 
I'm on a smartphone and a bit distracted, but I think you are talking about GPU micro-stutter, and I think the topic was more of an out-of-sync judder-type effect from the strobing not matching the framerate. MarkR could probably make a video of it if he wanted to, considering the ones he has shown already.
 
I've edited the original XL2411T EDID and saved it to an .inf file, changing only a few values: "Manufacturer ID", "Model ID", "Serial number", the 2nd description field (Display Product Serial Number) and the 4th description field (Display Product Name) to the Asus VG278 values.
Now I have the original XL2411T refresh rate timings, the ability to set 144Hz, and the ability to enable 3D in the nvidia control panel at 100-120Hz.
This .inf is for XL2411T owners: http://ge.tt/api/1/files/6FMs6UU/0/blob?download mirror: http://dl.dropbox.com/u/17188606/monitor.inf
Also, I found another trick to disable the 120 fps cap when LightBoost is ON. No stuttering or tearing noticed.
1. Enable 3D (at a 120Hz refresh rate) via the nvidia control panel.
2. Set the refresh rate to 144Hz via Windows. LightBoost will be auto-disabled.
3. Disable 3D via the nvidia control panel.
4. Set the refresh rate to 120Hz via Windows.
5. PROFIT
I hope someone will find a trick to enable LightBoost at 144Hz. Maybe via DDC/CI/I2C.
Tested all this on a BenQ XL2411T on Win7 x64 with nvidia beta 310.70 drivers and FIFA 13 with a FRAPS fps counter.
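For anyone curious what that EDID edit involves, here is a rough, hypothetical sketch (not the poster's actual tool) of where the fields mentioned above sit in a standard 128-byte EDID base block, and why the checksum byte has to be recomputed after any edit; the vendor letters and product code below are placeholders, not the real VG278 values:

```python
# Hypothetical illustration of patching a dumped 128-byte EDID base block.
# (The 18-byte descriptors at offsets 54/72/90/108 hold the product name / serial
# strings the poster also changed; only the numeric ID fields are shown here.)

def pack_pnp_id(letters: str) -> bytes:
    """Pack a 3-letter PNP vendor ID (A=1..Z=26, 5 bits each) into the 2-byte EDID field."""
    a, b, c = (ord(ch) - ord("A") + 1 for ch in letters.upper())
    return ((a << 10) | (b << 5) | c).to_bytes(2, "big")

def patch_edid(edid: bytearray, vendor: str, product_code: int, serial: int) -> bytearray:
    assert len(edid) == 128, "expects a 128-byte EDID base block"
    edid[8:10] = pack_pnp_id(vendor)                  # manufacturer ID
    edid[10:12] = product_code.to_bytes(2, "little")  # product (model) code
    edid[12:16] = serial.to_bytes(4, "little")        # serial number
    edid[127] = (-sum(edid[:127])) & 0xFF             # re-fix the block checksum
    return edid

# usage sketch: patch a dumped EDID, then feed the result into an override .inf
# raw = bytearray(open("xl2411t_edid.bin", "rb").read()[:128])
# open("patched_edid.bin", "wb").write(patch_edid(raw, "XYZ", 0x1234, 1))
```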
 
I'm on a smartphone and a bit distracted, but I think you are talking about GPU micro-stutter, and I think the topic was more of an out-of-sync judder-type effect from the strobing not matching the framerate. MarkR could probably make a video of it if he wanted to, considering the ones he has shown already.
Correct, I'm not really talking about GPU micro-stutter, but about generic stutter caused by single random framedrops, like the sudden transition between 60fps and 30fps.
So it helps to explain what I mean by "stutter", because different people have a different understanding of what "stutter" means.

Example List of generic stutters/judders
  • Random single frame drops (e.g. 1, 2 or 3 frames dropped every second, e.g. 57fps out of 60fps)
  • (for VSYNC-ON game players) Framerate suddenly halving (e.g. 30fps-vs-60fps, or even 60fps-vs-120fps). This typically causes a double-edge effect on _all_ strobe displays (CRT, plasma, LightBoost). If you play with VSYNC on, this becomes noticeable. (this stutter effect is usually N/A for VSYNC-OFF players)
  • Random noticeable freezes (e.g. network disruption, virus scanner, disk access) -- multiple frames dropped all at once, e.g. 1/10sec pause.
  • GPU microstutter, well documented lately, which is part of the same family of stutters
  • Inconsistent frame repeat sequence, e.g. 3:2 pulldown (a frame repeats 3 times, then 2 times, then 3 times, then 2 times, and so on), which is worse than a consistent frame repeat sequence. This is often called 'judder' instead of 'stutter', though not everyone makes the distinction.
  • Even with VSYNC turned off (or with triple buffering), running at odd framerates higher than the refresh rate (e.g. 73fps at 60Hz) can still (not always) lead to more judder and/or stutter than simply frame-limiting to 59fps or 60fps (either via a frame limiter or via VSYNC) at 60Hz. (It can depend on the game/software.)
  • Other types of stutters/judders/framedrops/freezes
All of the above is invariably more noticeable on impulse/strobe displays (CRT, plasma, LightBoost) than on sample-and-hold displays (most LCDs), if you run exactly the same game on exactly the same GPU, connected through a splitter outputting to both displays simultaneously. The exception is major stutters (freezes), which are almost equally noticeable on all displays, because they leak into real, measurable human reaction time (e.g. 200ms) rather than behaving as a mere artifact (simple motion glitches: edge-blurring effects on sample-and-hold displays, edge-doubling effects on strobe/impulse displays, etc.).
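As a toy illustration of the categories above (not from the original post; the thresholds and the synthetic frame times are made up), a small script can walk a list of frame-present timestamps and flag single repeats versus long freezes:

```python
# Toy classifier over frame-present timestamps (in seconds). Thresholds are illustrative only.

def classify_frame_times(timestamps, refresh_hz=120):
    period = 1.0 / refresh_hz
    events = []
    for prev, cur in zip(timestamps, timestamps[1:]):
        dt = cur - prev
        missed = round(dt / period) - 1  # whole refreshes that showed no new frame
        if dt > 0.1:
            events.append(("freeze", prev, dt))          # long pause, e.g. disk/network hitch
        elif missed == 1:
            events.append(("single repeat", prev, dt))   # one dropped frame -> brief double edge on a strobed display
        elif missed > 1:
            events.append(("multi-frame drop", prev, dt))
    return events

# usage: a steady 120 fps stream with one repeated frame and one ~150 ms freeze injected
times = [i / 120 for i in range(20)]
times[10:] = [t + 1 / 120 for t in times[10:]]  # one frame shown twice
times[15:] = [t + 0.15 for t in times[15:]]     # one long pause
print(classify_frame_times(times))
```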

This is what I mean by "stutter": _Everything_ that results in a _varying_ frame rate, or results in a _suboptimal_ frame rate.
For major freezes (large series of multiple frames dropped), there is little difference between impulse-driven displays and sample-and-hold displays.
For simple framerate fluctuations involving dropping or adding single frames, the fluctuation is more noticeable on impulse-driven displays.

What _really_ is the double-image effect that people see on CRT displays (and other strobed display technologies)? It's not strictly a true double image, but a double edge instead of a blurred edge. For slower game motion (where frames step only a pixel or a few per frame), it's hard to tell the two apart, because the sharp ghost copy of the edge simply looks like a blurred edge; it is distinct, though, because it's a sharp "ghost edge" instead of a "fuzzy blur". For faster game motion it becomes more noticeable. Turning on VSYNC makes it even more noticeable, and perfect consistency makes it far more noticeable still: it is especially easy to tell a double edge from a blurred edge when you're moving perfectly consistently (e.g. a perfect 30fps at 60Hz, VSYNC ON, during smooth pans, like strafing with the keyboard in front of close-up objects). That is the situation where the double-edge image looks most pronounced, and the effect still exists at higher framerates (e.g. 60fps@120Hz), even though the distance between frames is halved.

In short, that's what is meant by the double-image effect: along motion vectors, high-contrast edges are either blurred (LCD) or doubled up (CRT/plasma/LightBoost) at half framerate. Many people see the double edge simply as motion blur, so you need a side-by-side test to see the subtle difference. The "double image" occurs only on impulse-driven displays, because you get two strobes of the same frame while your eyes are still continuously tracking roughly the average movement of the object along the same motion vector. However, this effect is not the _only_ kind of framerate defect visible to the human eye.
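To put rough numbers on that geometry (the tracking speed here is just an example, not a measured value):

```python
# Tiny numeric sketch of the double-edge geometry described above (example speed only).
speed = 960          # px/s, eye-tracked pan
refresh_hz = 120
content_fps = 60     # VSYNC ON, framerate halved -> each frame is strobed twice

ghost_gap = speed / refresh_hz      # separation of the two sharp copies on a strobed display
hold_smear = speed / content_fps    # smear width when the same 60 fps content is sample-and-held

print(f"double-edge gap: {ghost_gap:.1f} px; sample-and-hold smear: {hold_smear:.1f} px")
```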

Connect the same game through a splitter to a CRT and to an LCD.
That single dropped frame? More easily noticed on strobed displays.
Sudden temporary halving of framerate? More easily noticed on strobed displays.
etc.

Generally, the worse the LCD (e.g. old 30ms+ persistence displays from 2002), the more it masks framerate variances; but even when persistence became less than a refresh cycle, the sample-and-hold nature of the display enforces one full refresh cycle's worth of eye-tracking-based motion blur (e.g. 16ms of motion blur at 60Hz).

A re-worded phrase, that omits the potentially contentious word "stutter" (which means different things to different people):
"Many kinds of frame-rate defects (varying frame rates, or frame rates below refresh rate) are generally more noticeable on strobed/impulse driven displays than on sample-and-hold displays. This is because impulse-driven displays do not have as much motion blur to hide subtle frame rate defects".
 
Thank you very much for this explanation. I am not good (the right word would be "very bad" :D ) at English, but I have understood it very well.

I see stuttering very well on my CRT when it happens, and on my 120Hz LCD without LightBoost. I think I will see stuttering with LightBoost at 120Hz too.
 
Thanks for this, though it must be done carefully -- we need to make sure that people are at least able to understand this technology without an immediate "It isn't possible".
Here are some VERY important links to give out, for people who disbelieve it is possible. Some of your Reddit posts are now being downvoted, so it's important to prepare a good (but short) defense of the technology:


Also, as a headline, I prefer "Zero motion blur LCDs now possible (allowing CRT-perfect sharp motion)."
Or even "Zero motion blur LCDs now possible, pixel persistence bypassed by backlight strobing"
(with a link to the high-speed video being prominent -- forum threads are less trustworthy than video proof).
 
I think it's hilarious and ironic that most people bought into LCDs with the promise that they did not need refresh rates; we had all those arguments years ago about how LCDs didn't need high refresh rates, and now we spend all our time trying to make them flash like CRTs. Industry 1, consumers 0. gg
 
I think it's hilarious and ironic that most people bought into LCDs with the promise that they did not need refresh rates; we had all those arguments years ago about how LCDs didn't need high refresh rates, and now we spend all our time trying to make them flash like CRTs. Industry 1, consumers 0. gg
To be fair, several manufacturers did understand strobing for 'premium' displays. That's why scanning backlights existed for a few years in HDTVs -- existing technology -- sold at an extra premium for that feature, and it does cost extra to manufacture.

Now we have strobed backlight technology for computer monitors, a simpler form than a scanning backlight, but with the same effect (and sometimes simpler is better -- no backlight diffusion limitation). The cost premium is apparently becoming relatively small over a non-LightBoost 120Hz monitor, since LEDs are falling fast in price -- cheap enough, bright enough, and fast enough to permit strobe operation that is now competitive with CRT. The use of 120Hz eliminates most of the flicker complaint of 60Hz CRTs, and the strobe operation can be conveniently enabled/disabled. It's just a "special PWM backlight mode" that's precisely synchronized at one strobe per refresh, with strobe timings specially optimized for motion blur elimination.
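A rough sketch of that "one strobe per refresh" idea, with made-up timings (the real LightBoost scanout, settle and flash lengths aren't given in this thread):

```python
# Rough sketch with assumed timings: one backlight flash per refresh, fired only after the
# panel has finished scanning out and the pixels have settled, so the (dark) transitions
# are never shown to the eye.

refresh_hz = 120
refresh_period_ms = 1000 / refresh_hz   # ~8.33 ms per refresh
scanout_ms = 5.6                        # assumed accelerated scanout time
settle_ms = 1.0                         # assumed time for pixel transitions to finish
strobe_len_ms = 1.5                     # assumed flash length (shorter flash = less motion blur)

strobe_start_ms = scanout_ms + settle_ms
assert strobe_start_ms + strobe_len_ms <= refresh_period_ms, "flash must end before the next refresh"

duty_cycle = strobe_len_ms / refresh_period_ms
print(f"flash at {strobe_start_ms:.1f} ms into each {refresh_period_ms:.2f} ms refresh "
      f"({duty_cycle:.0%} duty cycle)")
```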
 
Thanks for this, though it must be done carefully -- we need to make sure that people are at least able to understand this technology without an immediate "It isn't possible".
Here are some VERY important links to give out, for people who disbelieve it is possible. Some of your Reddit posts are now being downvoted, so it's important to prepare a good (but short) defense of the technology:


Also, as a headline, I prefer "Zero motion blur LCDs now possible (allowing CRT-perfect sharp motion)."
Or even "Zero motion blur LCDs now possible, pixel persistence bypassed by backlight strobing"
(with a link to the high-speed video being prominent -- forum threads are less trustworthy than video proof).

Yeah I like those headlines better too. Overcoming disbelief has been a big hurdle, and not just because people don't think it's possible. Many are so used to LCD they don't even notice the blur. I'm getting "Huh? What blur?" from a lot of people.
 
Yeah I like those headlines better too. Overcoming disbelief has been a big hurdle, and not just because people don't think it's possible. Many are so used to LCD they don't even notice the blur. I'm getting "Huh? What blur?" from a lot of people.
Give it at least 3 to 6 months. It will become easier to explain LCD motion blur when I release my modernized PixPerAn-like motion-testing software, which also includes several useful motion benchmarks. (As you already know, Vega is one of the beta testers.)
 
Many are so used to LCD they don't even notice the blur. I'm getting "Huh? What blur?" from a lot of people.

Reminds me of the "Allegory of the Cave".
Some are so used to wearing their slushy goggles that they don't know what it's like to see clearly.

The ways I've seen people "work around", "attempt to ignore", or pretend to be oblivious to it: flick FoV movement from A to B in an attempt to "blink" past the FoV arc motion that smears every time; or they just try not to pay attention during the FoV movement, trying not to register vision during it at all; or you let your locked-on eye focus strain at the textures and bump-mapped texture depth that smear out during each FoV movement, since your eyes always try to focus away blur (I know mine do). It's most annoying on the highest-detail, extreme-textured games.

As for 120Hz, the consensus is that it still blurs, just not as much. I'm finding that in the most modern, extremely high-resolution texture-mapped games, with bump-mapping depth and shaders, the blur is even more obvious and eye-wrenching than before, since the blur on fast FoV movement washes out the extremely high detail and 3D depth my eyes "have a lock on" every time. It strains my eyes and is much more obnoxious to me than simpler-textured/older games.

Most people seem to agree with this representation of 60Hz/120Hz LCD vs CRT blur in games.

[Image: lcd-blur.jpg -- 60Hz/120Hz LCD vs CRT motion blur comparison]


So it appears to me that 120Hz, against the limitation of LCD pixel response times and retinal retention blur, would still not be enough to retain focus on texture detail (much like fine text scrawled on a surface that gets smudged out) and bump-map depth.

It's like you have goggles filled with some liquid gel, and every time you turn quickly your eyes see all fine detail lost in a blur. 120Hz might replace your goggles with a fluid that has double the viscosity, blurring nearly half as much, but it's still a lousy prescription compared to clear sight, imo.
 
Another workaround: a lot of games actually have you aiming behind a person, so in a sense you don't have to work around it -- you just aim right at their ghost. Pretty messed-up shit; makes me wonder if this netcode crap was the result of devs using LCDs. Ironically, the explosion of CS / realistic shooters coincided with LCDs.
 
Still waiting for my BENQ XL2411T to arrive.

I did, however, test out my Arduino input-lag meter, and found very little input-lag difference between LightBoost and non-LightBoost. I now need to refine the input-lag meter software -- it has somewhat greater inaccuracy at 120Hz than at 60Hz (+/- 1ms), when the software is designed to be +/- 0.1ms at any refresh rate. Some debugging is needed.

Regardless, the good news is that the input lag difference between LightBoost and non-LightBoost is insignificant.
 
Looking forward to a review of the BenQ vs the Asus monitor! On the fence about which one to get for this :)

So excited!
 
I'm very interested in this tech, but I might be in for a wait. Since I don't already have a 680, this would be a considerable upgrade, buying both the monitor and the GPU. By the time I get my tax return and start looking around, it would be February. The next nVidia card will eventually come out -- some say March, but most likely something like June 2013 (the GTX 780, I guess, with a considerable performance increase). If it's a matter of waiting up to another 4 months after I get the tax return to get the 780 rather than a 680, I'd consider waiting for it. I'm also hoping that someone will release a 1ms-strobe LightBoost 2, 120Hz, glossy display, maybe even a 27" one, which would fit better in my monitor array scheme, but I won't hold my breath. Maybe by the time the 780 comes out there will be more options.
 
Just got the VG278HE and have tried LightBoost in 2D, which does help eliminate motion blur, but there seem to be a few drawbacks.

The colours -- although I guess these can be tweaked in driver settings; altering the digital vibrance in the drivers and setting it high does give the colours a little more pop (although they are nowhere near accurate, just brighter).
Saying that, the Splendid preset modes are poor without LightBoost anyway (game mode has a distinct blue hue).

144Hz -- the main selling point of the monitor, yet 3D is locked to 120fps. I swear I read in a review that a big selling point 3D-wise was having 72Hz per eye, so hopefully this will be tweaked in a new driver. 144fps with LightBoost on would be even better.

The last one may just be me getting used to my new mouse, but when locked at 120fps with LightBoost on, it felt like the mouse wasn't responding as it should. It felt like playing with vsync on, which I hate, to be honest, as it does affect the mouse movement.

I prefer the newer "adaptive" mode if I have to have vsync on, or I cap the in-game fps.

So in LightBoost mode, is it using proper vsync or the new adaptive mode (which seems to help with mouse movement)?

I only have a GTX 570, and in some games it will be impossible to get a solid 120fps, but in most games you can turn the settings down enough to run the display in 3D mode with LightBoost and get a solid framerate. Far Cry 3 is one I have had to play with LightBoost off, as I was barely scraping 120 anyway in standard mode.

An idea of the hit I took in Black Ops 2:

LightBoost off, standard mode -- 200fps (limit of the game), shadows low, textures extra, ambient occlusion on

LightBoost on -- 120fps (locked), shadows low, textures high, ambient occlusion off

If I didn't change the above settings I would dip below 120.


Final thing: PixPerAn seems to crash for me (Win 7 x64). Are there any other programs I can try out?
 
I do hope you've simply got a driver hiccup with regard to vsync. If vsync is forced on, it doesn't matter how good LightBoost is -- I won't be bothering. Can anyone else confirm this?

Vsync (any kind) vs blur -- I'll take blur every time.
 
Final thing: PixPerAn seems to crash for me (Win 7 x64). Are there any other programs I can try out?

PixPerAn crashed for me until I got rid of the beta Nvidia drivers, same OS.
 
Don't get me wrong, the option for vsync was not forced on with LightBoost enabled; you could still change it.

The mouse just felt different to me; it's probably just me getting used to it, though.
 
If the game is running at a fixed 120fps, vsync is enabled despite what the options might say.
 
I could run LightBoost without VSync, although in some games you have to manually go in and turn it off, like Skyrim.
 