ASUS/BENQ LightBoost owners!! Zero motion blur setting!

Yep. I'm just holding out for a later nVidia GPU upgrade before I buy a LightBoost 2 monitor -- Titan or GTX 780, depending on benchmarks. It's nice to get an idea of what strobed-backlight blur reduction (though not elimination, in this case) looks like on an LCD, though, like you said. I also still have an FW900 CRT, by the way, so I know what full motion clarity looks like. Looking forward to LightBoost 2 zero blur later this year.
True, the reduction versus elimination really depends on many factors:
-- Human sensitivity. How fast your eyes can track motion.
-- Speed of motion. The faster the motion, the more visible the blur.
-- One strobe per refresh (best) versus multiple strobes per refresh (bad).
-- Strobe length. Shorter (best) versus longer (bad).

There comes a specific point where the motion blur becomes so tiny (e.g. strobe lengths approaching the CRT 1ms-2ms phosphor illuminate-and-decay cycle) that motion blur is effectively eliminated, because it's now below human-perceptible levels and thus no longer the limiting factor. For example, LightBoost on an XL2411T or VG248QE configured to the 10% OSD setting (not "OFF") gives quite stunning clarity, with no afterimages/crosstalk in games and most tests. The motion blur only occurs at speeds that are too fast for my eyes to track, and thus it can legitimately lay claim to "zero motion blur" -- from the perspective of the human eye.

Even at the LightBoost 100% setting, I can't see the motion blur anymore except in fast motion (e.g. strafing a few inches in front of a wall, doing fast 180 degree flicks, or very fast-moving PixPerAn objects). Then, when I set LightBoost to the 10% setting, I can't see motion blur at all anymore (circle strafing is as good as CRT, and I can count single pixels in objects that are moving half a screen width per second).
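
To put rough numbers on that, here's a quick Python sketch of the usual approximation (perceived blur is roughly eye-tracking speed times persistence); the strobe lengths in it are assumptions for illustration, not measurements of any specific monitor:

```python
# Rough back-of-envelope model (not a measurement): during eye tracking,
# perceived blur is roughly tracking speed multiplied by how long each
# frame's light stays on screen (persistence / strobe length).

def blur_px(speed_px_per_sec, persistence_ms):
    """Approximate blur trail width, in pixels, for a tracked moving object."""
    return speed_px_per_sec * (persistence_ms / 1000.0)

speed = 960  # assumed: object crossing half a 1920px-wide screen per second

cases = [
    ("60 Hz sample-and-hold", 16.7),
    ("120 Hz sample-and-hold", 8.3),
    ("LightBoost 100% (~2.4 ms strobe, assumed)", 2.4),
    ("LightBoost 10% (~1.4 ms strobe, assumed)", 1.4),
    ("CRT phosphor (~1 ms, assumed)", 1.0),
]
for label, persistence in cases:
    print(f"{label}: ~{blur_px(speed, persistence):.1f} px of blur")
```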
 
Question: does anyone here have a Sony HDTV from the HX950 series (also called the XBR950)? They come in sizes from 46" to 65".
Some other lines also have the setting (e.g. the HX853 and a few others). Generally, these are quite expensive Sony LCD HDTVs.

These Sony HX950 series HDTVs have a LightBoost-like mode called "Motionflow Impulse" (non-interpolated). This is the Motionflow mode that does not use interpolation. Unfortunately, it flickers very badly at 60 Hz (like a CRT) and has excessive input lag -- albeit less input lag than interpolation. However, it eliminates about 50-75% of motion blur; not quite as good as LightBoost, but this may be of interest to people who wish to have the LightBoost effect on a living room HDTV for 60fps material (emulators, 60fps console games, etc). Most people did not like the 60 Hz flicker of this Impulse mode, but if you already have the TV anyway, can you please try turning on this mode with a 60fps video game?

The good news: You keep the good Sony color (far better looking than current LightBoost monitors), and a huge motion blur reduction.
The bad news: The nasty 60 Hz flicker and the extra input lag. (but not as bad as frame interpolation)
 
The good news: You keep the good Sony color (far better looking than current LightBoost monitors), and a huge motion blur reduction.
The bad news: The nasty 60 Hz flicker and the extra input lag. (but not as bad as frame interpolation)

Since the new Titan can support monitor refresh rate overclocking, it would be interesting to see how high the Sony TV could be pushed (if at all). Flickering drops off pretty quickly for most of us once you get to 72 Hz and above (based on CRT experience, not LCD). It's obviously not useful if you're gaming on the TV with an Xbox 360, etc., instead of a PC, which I suppose would be the main reason to be using the TV in the first place.
 
Since the new Titan can support monitor refresh rate overclocking, it would be interesting to see how high the Sony TV could be pushed (if at all). Flickering drops off pretty quickly for most of us once you get to 72 Hz and above (based on CRT experience, not LCD).
It works on the Vizio e3d420vx HDTV monitor.
Overclocking an HDTV to 120Hz native

It shall be very interesting to see whether this can be done with the Sony HDTVs. I *doubt* it -- but if you already have a 120 Hz HDTV, it's worth attempting to overclock it; this actually works on certain models. Even if it worked on the Sony, it's possible the strobe backlight only functions at certain refresh rates (e.g. 60 Hz), much like LightBoost is programmed to work only in the 100Hz-120Hz range. However, that is an independent question from whether or not you can overclock an HDTV to a higher refresh rate.

The answer is yes: *some* HDTVs can be overclocked to 120Hz native.
We need more guinea pigs to contribute to an "Official Overclockable HDTV List".
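
For anyone curious about the timing math behind such an overclock attempt, here's a rough Python sketch; the 1080p reduced-blanking totals in it are assumed typical values, not your TV's actual EDID timings:

```python
# Rough sketch of the pixel-clock arithmetic behind a custom refresh rate.
# The 1080p reduced-blanking totals below (2080 x 1111) are assumed typical
# CVT-RB values -- check your display's actual EDID before experimenting.

H_TOTAL = 2080  # assumed horizontal total (1920 active + blanking)
V_TOTAL = 1111  # assumed vertical total (1080 active + blanking)

def pixel_clock_mhz(refresh_hz, h_total=H_TOTAL, v_total=V_TOTAL):
    """Pixel clock (MHz) required for a given refresh rate with these timings."""
    return refresh_hz * h_total * v_total / 1e6

for hz in (60, 72, 96, 120):
    print(f"{hz} Hz -> ~{pixel_clock_mhz(hz):.1f} MHz pixel clock")

# Many HDTV scalers simply reject pixel clocks above what they were built for,
# which is why 120Hz-native overclocking only works on certain models.
```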
 
Btw, I did test my A750D and was able to get up to 17 on PixPerAn using Blur Busters' Samsung instructions. At 18 the afterimage shadow was superimposed on top of the original letters, which made it too hard to read at that speed. At different speeds the aftershadow of the text sat at different offsets, so I'm not sure whether it would become readable again at some higher speed if the aftershadow moved off the letters at some point. 17 still is not that bad, considering.


As I wrote earlier in this thread, PixPerAn started blurring at speed 5 on my SA750 in standard 120 Hz mode, so 17 is a very notable improvement. However, I read that article about enabling LightBoost on the Samsung when I had already sold the monitor :p

I had time to try the trick before the buyer came to pick it up. The method is so simple one can only wonder why no one tried it before. Going into 3D mode made the picture look dimmer and colder, but it also gave the players clearer, sharper outlines -- a visible effect of less motion blur.

I didn't have much time for playing, but I didn't notice any headache or motion sickness, which had occurred frequently after previous gaming sessions. The buyer is someone I know; I showed him the trick and I'll ask him about his experience with it.
 
The crosstalk and/or slower/different strobe definitely fatigued my eyes after playing L4D2 for a while on the A750D, which was just in frame-sequential mode with normal response time. It felt similarly fatiguing after I watched 3D video content on that monitor when I first got it. Hopefully the 1ms backlight and lack of afterimage/crosstalk for full clarity on the ASUS VG248QE or similar won't wrestle my eyes. It did look a lot clearer during gameplay, but my eyes are still feeling "sore". :rolleyes:
 
however, i read that article about enabling lightboost on samsung when i had already sold the monitor :p
One interesting clarification -- it's a strobe backlight similar to LightBoost, but it is not actually licensed to use nVidia's brand name "LightBoost". (It works on AMD too.)

The crosstalk and/or slower/different strobe definitely fatigued my eyes after playing L4D2 for a while on the A750D, which was just in frame-sequential mode with normal response time. It felt similarly fatiguing after I watched 3D video content on that monitor when I first got it. Hopefully the 1ms backlight and lack of afterimage/crosstalk for full clarity on the ASUS VG248QE or similar won't wrestle my eyes. It did look a lot clearer during gameplay, but my eyes are still feeling "sore". :rolleyes:
One factor is flicker. Not everyone is comfortable with CRT-style flicker (flicker-induced headaches can occur even if the flicker is invisible).
Another factor is increased eye tracking. Clear motion potentially encourages more eye tracking, which tires the eyes more (eye-tracking tiredness).
There are other factors. Some people feel nausea when they see a "Motionflow effect" (frame interpolation) too, even without flicker.
Sometimes you get used to it over time (like exercise) much like people take time to acclimate to 3D. But sometimes not.
There are several generic recommendations.

-- Add a gentle light in your room, such as a lamp behind your monitor. The lamp does not flicker, and it made a big difference for some people.
-- Calibrate the picture. Via monitor, via nVidia Control Panel, etc. Fix your contrast/gamma. Get a Spyder sensor or other colorimeter if needed.
-- Adjust the LightBoost. Try LightBoost 100% versus LightBoost 10%; the strobes/brightness may be more comfortable.
-- nVidia's 3D Vision warnings recommend plenty of rest periods, especially while you get used to it (the instructions are written for 3D glasses, but they also apply to LightBoost, even though it flickers far less than 3D shutter glasses). Similar instructions apply to the Samsung strobe backlight.
-- Disable the LightBoost (or Samsung strobe mode) when you exit the game.
-- Failing all the above, a strobe backlight may not be for you.

Manufacturer 3D glasses warnings apply here; strobe backlights were designed originally for 3D, with a secondary benefit of eliminating motion blur. While they flicker far less in 2D (120Hz) than in 3D (60Hz/eye), we heed the manufacturer disclaimers & recommendations applicable to 3D. The important thing is to take breaks. Do not force yourself. And if it's too much, you can't get used to it, or you're too worried, then LightBoost may not be for you. Your mileage may vary -- not everyone is affected. Some people even report less game eyestrain, possibly because motion blur itself causes eyestrain (straining to focus on motion-blurred objects). I don't feel any effect on my eyes, for example. Disclaimers and recommendations out of the way, keep us updated on your experience with the next monitor, too!!
 
I'm not assuming that the faster LightBoost 2 monitors would have the same eyestrain effect on me. I've used an FW900 at higher refresh rates with no problems. During the PixPerAn text testing of the Samsung A750D I got to 17, but at several speeds there was an aftershadow image at different spacings, and at 18 the afterimage was superimposed on top of the original letters, which was a mess. The two top 1ms strobe-backlight LightBoost 2 monitors should be a lot tighter and have a better chance of not inducing such overt eyestrain as my A750D's partial blur reduction. I'd definitely be willing to risk it, but I'm holding out for the next-gen nVidia cards (running an AMD card currently).
 
So what's the final verdict regarding LightBoost and input lag? It sounds like a frame is being buffered, similar to using VSYNC, based on the reports so far.
 
So what's the final verdict regarding LightBoost and input lag? It sounds like a frame is being buffered, similar to using VSYNC, based on the reports so far.
Perhaps, but this doesn't result in the same input lag behavior as VSYNC. If it's not buffered (e.g. BENQ Instant mode in non-LightBoost operation) and pixels are refreshed in real time in total darkness, then simply waiting until an LCD refresh is complete results in an average of half a frame of added lag. This is because the top of the screen was refreshed a full frame ago, while the bottom edge of the screen was refreshed just moments ago, yielding an average of half a frame of input lag for an optimized LightBoost. It's possible that some of the LightBoost monitors framebuffer the refreshes. Some tests will need to be done.
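
A tiny Python sketch of that "average half a frame" arithmetic (a simplified model, assuming a 120 Hz refresh and a single strobe at the end of scanout):

```python
# Minimal sketch of the "average half a frame" arithmetic: pixels are scanned
# top-to-bottom in the dark, and the strobe fires only once the refresh is
# complete, so the top row has waited a full frame and the bottom row almost none.
# Assumes a 120 Hz refresh and a single end-of-refresh strobe.

REFRESH_HZ = 120
FRAME_MS = 1000.0 / REFRESH_HZ  # ~8.33 ms per refresh

def added_wait_ms(row, total_rows=1080):
    """Extra time a given row sits dark before the end-of-refresh strobe."""
    scanout_position = row / total_rows  # 0.0 = top edge, ~1.0 = bottom edge
    return FRAME_MS * (1.0 - scanout_position)

rows = range(1080)
average = sum(added_wait_ms(r) for r in rows) / len(rows)
print(f"top row waits ~{added_wait_ms(0):.2f} ms, bottom row ~{added_wait_ms(1079):.2f} ms")
print(f"average added wait: ~{average:.2f} ms (about half of {FRAME_MS:.2f} ms)")
```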

That said, it feels like less input lag to me, due to the lack of motion blur / faster reaction times (less human brain lag!). Some people have posted that they feel less input lag, while others have posted that they feel more. It's somewhat contradictory, and probably depends on whether the rest of your chain has sufficiently small input lag that you don't feel the small bit added by LightBoost, while still fully benefiting from the lack of motion blur and getting better scores & more frags due to faster human reaction times (less human brain lag).
 
I'm fully aware of the top-bottom input lag issue.

I can physically tell the difference in mouse movement from top to bottom of a 120hz LCD, which gives a delta of ~8ms. Oddly doing the same on a CRT shows no visible input lag, despite the delta being similar (longer blanking interval but it's still around 7ms).

Why then are some people reporting issues with LightBoost? The strobing should have the same side effect of eliminating input lag as a CRT.
 
Why then are some people reporting issues with LightBoost? The strobing should have the same side effect of eliminating input lag as a CRT.
It seems to be very dependent on configuration and on the person.
A good article is Anandtech's Input Lag Article

It has this great illustrative image of a poor computer configuration:
[Image: longlag.png]


You can see very well, in AnandTech's high speed video, that there's a whole chain of input lag from pressing the "FIRE" button to some reaction being seen on a display. And most of it is actually not caused by the display at all! What might be happening is that some configurations have low enough total input lag that the added LightBoost lag is not noticeable. Having a 1000Hz mouse (mouse lag of 1ms), a fast i7 CPU (lower CPU lag), and a fast GPU (higher framerate reduces input lag) will compress this graph. Some parts of the input lag chain may be unavoidable (e.g. some games are pretty 'inefficient', or have an internal frame rate cap). To reduce input lag, it also helps to turn off VSYNC and raise your fps higher (e.g. 240fps or 360fps). Having framerates far in excess of the refresh rate benefits input lag, because the freshest possible frames are displayed. Although some prefer VSYNC ON to eliminate stuttering and tearing (especially for solo gaming).
With a well-optimized system, you can have a graph more similar to:

[Image: bestcase.png]


For some people, adding LightBoost is "the straw that broke the camel's back" between "I can't feel the input lag" and "I can feel the input lag". It all depends on what other lag you have in your input lag chain. Different people seem to have different input lag sensitivities. Some computer configurations seem to add extra input lag that we didn't anticipate (e.g. running in windowed mode in Windows 7/8 sometimes adds a frame of lag compared to running in full screen mode, because modern compositing managers such as Aero behave like double buffering, overriding VSYNC OFF).
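
As a rough illustration of why framerates far beyond the refresh rate shrink one link of the chain with VSYNC OFF, here's a simplified Python model (it ignores render time and game-engine latency, so treat it as an assumption, not a measurement):

```python
# Simplified model (an assumption, not a measurement): with VSYNC OFF, the
# scanout picks up whichever frame finished most recently, so the displayed
# frame data is on average about half a render interval old. Render time and
# game-engine latency are ignored here.

def avg_frame_age_ms(fps):
    """Average age of the frame data sampled by the scanout, in this model."""
    return 0.5 * 1000.0 / fps

for fps in (120, 240, 360, 600):
    print(f"{fps} fps -> frame data is on average ~{avg_frame_age_ms(fps):.2f} ms old")

# This is one reason to run 300+ fps with VSYNC OFF even on a 120 Hz panel:
# it shrinks one link of the input lag chain, and it makes each tear line's
# offset smaller, so tearing is less visible.
```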
 
I appreciate the effort explaining input lag and am sure it will be an eye-opener to others.

I've always aimed for the lowest possible input lag and would never play online games with less than 120fps, and never with VSYNC on. I can feel the lag from triple buffering at 120fps (with 300fps achievable, i.e. Source games) on a CRT.

My personal input lag for online gaming is something like this:

1ms - 500hz mouse average
2-8ms - 125 - 500 fps depending on game
4.2ms - Frame transmission average
5ms - LCD input lag (half brightness pixel response + signal delay)
=12.2ms to 18.2ms

A CRT at 120fps (game limited) and 120hz still has input lag of 13.3ms in the above setup btw.

Most hardcore FPS gamers will have similar values. If LightBoost buffers even a single frame (8.3ms), you would definitely feel it at these levels.
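
Just to re-add those numbers (they're the estimates from the post above, not measurements), a quick Python sketch:

```python
# Just re-adding the numbers from the post above (they are the poster's own
# estimates, not measurements of any particular monitor).

mouse_ms = 1.0                # 500 Hz mouse, average
frame_render_ms = (2.0, 8.0)  # 125-500 fps depending on game
transmission_ms = 4.2         # frame transmission average
lcd_ms = 5.0                  # half-brightness pixel response + signal delay
hypothetical_buffer_ms = 8.3  # a hypothetical one-frame buffer at 120 Hz

best = mouse_ms + frame_render_ms[0] + transmission_ms + lcd_ms
worst = mouse_ms + frame_render_ms[1] + transmission_ms + lcd_ms
print(f"baseline chain: {best:.1f} to {worst:.1f} ms")
print(f"plus a hypothetical 1-frame buffer: {best + hypothetical_buffer_ms:.1f} "
      f"to {worst + hypothetical_buffer_ms:.1f} ms")
```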
 
Most hardcore FPS gamers will have similar values. If LightBoost buffers even a single frame (8.3ms), you would definitely feel it at these levels.
If you have confirmed that you can feel this whole-chain difference, you are truly one of the people who is extremely sensitive to input lag. However, most people cannot feel an input lag difference between 18ms and 25ms (end-to-end chain). Also, it's worth factoring in that the lack of motion blur can speed up human reaction time (for certain game play styles) enough to overcome an additional frame of input lag. For example, here is someone who tested a Samsung display that is known to have very bad input lag with its strobe backlight:
Hi Mark,
I got really excited when I saw your update about two of the Samsung 3d monitors being capable of better motion performance without the need for a Geforce card. I just successfully tested my S23a700D and it worked so this one can be added to the list as well. I was able to get up to a tempo of 25 or so on the pixperan readability test and still make out the individual letters.

The only problem is that the screen dims, and I get an added 20-30ms of input lag that makes mouse movements feel a bit "soupy". I play Quakeworld online (usually at 600 fps to reduce tearing -- no vsync) and actually found that even with the added input lag, I was able to track other players much more easily and pull off twitch-shot kills that I normally wouldn't, because of the blur I normally get when spinning 180 degrees quickly. It would be incredible if someone could somehow hack the firmware of these monitors to remove the input lag; with some brightness/color tweaking I'd leave this mode on all the time.

Thanks for your continued research; I look forward to this becoming a feature of LCDs in the future. It really removes that last barrier to CRT-like performance.

Matt
He appears to be a hardcore FPS gamer -- if he's playing old games just to get insane framerates. He claims to be able to tell the benefit of having 600fps without VSYNC. (I believe him; I have personally seen the advantage of 360fps on my GTX680 myself: the advantage of reduced tearing (during VSYNC OFF) from having a massive overkill of framerate well beyond the refresh rate.)

From dozens of posts on multiple forums and blog comments:
-- Not all LightBoost displays seem to manifest noticeable input lag (more testing needs to be done)
-- Not all gamers notice the input lag of LightBoost
-- Several gamers find that faster reaction times with LightBoost (less human lag) massively outweigh the slight lag of LightBoost (more display lag).
-- For some gamers, LightBoost lag is annoying and distracting.
-- For some gamers, LightBoost has tolerable lag "but with benefits" (people like Matt fall into this category).
-- For some gamers, LightBoost has no discernible lag difference (people like me and Vega fall into this category).
-- For some gamers, LightBoost feels like less input lag (false impression, possibly due to faster reaction time)
-- Some of the additional lag is caused by configuration (e.g. fixing configuration can minimize any added LightBoost lag).

Human reaction times are measured not in single milliseconds, but in tens and hundreds of milliseconds. Even 100-meter Olympic sprinters don't react faster than approximately ~100ms from the starting pistol. Reacting even 20ms faster (100ms versus 120ms) far outweighs 8.3ms of added input lag, allowing you to shoot your enemy before the enemy shoots you. Some gamers are actually reacting hundreds of milliseconds faster with certain types of game motion (e.g. speeding along as the Scout character in Team Fortress 2 without stopping). Obviously, not all gameplay styles benefit, and slower games may not allow the lack of motion blur to give you enough of a reaction-time advantage over enemies that have less display-based lag than you do. But with dozens of reports, it is indisputable that some game players have a reaction-time advantage with LightBoost, including at least a few who seem to be hard-core game players. Multiple confirmations show LightBoost outweighs input lag (as long as the lag stays tiny), including some confirmations from professional competition gamers.
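
As a purely illustrative Python sketch of that arithmetic (the reaction-time numbers are hypothetical):

```python
# Illustrative arithmetic only -- both reaction times below are hypothetical.
# The point: a modest reaction-time improvement from clearer motion can dwarf
# a small amount of added display lag.

reaction_blurred_ms = 120.0  # assumed reaction time when the target is motion-blurred
reaction_clear_ms = 100.0    # assumed reaction time when the target is strobe-clear
added_display_lag_ms = 8.3   # hypothetical extra frame of LightBoost lag

without_lb = reaction_blurred_ms
with_lb = reaction_clear_ms + added_display_lag_ms
print(f"without LightBoost: ~{without_lb:.1f} ms to react")
print(f"with LightBoost:    ~{with_lb:.1f} ms "
      f"(~{without_lb - with_lb:.1f} ms ahead in this example)")
```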

From this isiforums thread, which I discovered only recently, Spinelli is raving about less input lag with LightBoost (an apparently inaccurate claim, but sort of correct if you include less "human brain lag").
Spinelli said:
120 Hz LightBoost: Best for simracing. Lowest input lag.
Spinelli said:
I'm so jealous of you guys, I only spent time with LightBoost once, at my friend's place last week. Going to try to sell my 3 Samsung PX2370s ASAP. They are 3ms monitors with some of the lowest input lag of all LCDs (reportedly only 3.3 ms input lag), yet the motion blur on them is still a complete joke.

If the world was perfect, we'd have CRT-like motion *and* CRT-like input lag. But nobody wants to lug around CRT's anymore.
Now confining ourselves to LCD's -- the question is how much input lag a gamer is willing to sacrifice to gain the CRT-style zero motion blur advantage?
The answer is -- yes, sometimes -- for some people, it's worth whatever small lag LightBoost seems to have -- even for some professional competition gamers!

There appear to be some professional gaming teams now advocating LightBoost. For example, see the Team Exile 5 Review of LightBoost -- a team that receives free hardware from vendors. This is the type of gaming team with sponsored T-shirts and free sponsored hardware; professional competition gamers. (You can see that in the photos!) Clearly, it is already benefiting several professional competition gamers, some of whom have contacted me profusely thanking me when they discovered LightBoost motion blur elimination.
Spachla said:
As shown in the video, with Lightboost disabled, each frame bleeds into the next due to the top-to-bottom refresh method of modern day LCDs. With Lightboost on, however, the backlight is switched on and off per refresh which almost eliminates bleeding frames; similar to a CRT monitor.
As you can gather, it is in the interest of competitive gamers to reduce motion blur as much as possible (especially for FPS gamers). This empowers you to see vital in-game details clearly, even when moving fast, which otherwise would have blurred into the background. In the top end of competition, this can mean the difference between winning or losing a match.
Spachla said:
I am the captain of the Call of Duty 4 division for Team Exile5 and have been competing in eSports for over 7 years. Alongside my team, I have won multiple tournaments at a National level and am currently working in the IT industry.
LightBoost may not be for everyone, but I have thus given proof that multiple competition gamers are now using LightBoost.
I, therefore, rest my case.
 
I'm sure you've seen this, but many have not:

www.youtube.com/watch?v=vOvQCPLkPt4&hd=1

Starting from around 1 minute, the box-dragging example is a perfect illustration of the floatiness even a few ms can introduce. You can clearly see a visible difference between 10ms and 1ms of input lag. Hell, the core input lag of the software and hardware can still be seen in the 1ms test. My main point, however, is the huge difference 9ms makes.
 
I'm sure you've seen this, but many have not:
www.youtube.com/watch?v=vOvQCPLkPt4&hd=1
Starting from around 1 minute, the box-dragging example is a perfect illustration of the floatiness even a few ms can introduce. You can clearly see a visible difference between 10ms and 1ms of input lag. Hell, the core input lag of the software and hardware can still be seen in the 1ms test. My main point, however, is the huge difference 9ms makes.
Yep. This often happens. I can tell when my mouse is lagged 8ms behind the window. I've sometimes seen this happen with dragging Google Chrome's scrollbar, where my mouse pointer often goes a little ahead of the scrollbar. I can feel an 8ms differential in input lag.

I should point out that the YouTube video shows FAR more than a 10ms difference. You can clearly frame-step the YouTube video; each video frame lasts 1/30 sec (33.3ms for video running at 30 frames per second), and there's still a very clearly noticeable lag. It might be the "full-chain" input lag (the AnandTech diagram) that you're seeing captured on video, even if one component in the chain takes under 10ms.

However, I seem to be unable to feel a small 8ms lag, provided the mouse stays in sync with the in-game motion. The display lags both the mouse cursor and the dragging/sliding/panning motions equally, so window dragging stays in sync with my mouse pointer no matter how much display lag there is (even if my physical hand on the mouse mat is physically lagged behind the screen). As a result, there is no floaty effect if the input lag is small enough. It does start to get noticeable once the total input lag chain becomes big enough. People with faster reaction times are more likely to feel the difference, but it's known that some can also get "used" to it. Upgrading from Windows XP (no window compositing manager) to Windows 7/8 (VSYNC ON window compositing) often added some input lag, yet most people weren't able to tell the difference during things like window dragging, because the mouse pointer still stayed in sync with the window equally, regardless of VSYNC ON (window compositing) versus VSYNC OFF, even though the VSYNC ON situation added lag. But some very sensitive people can.

Any differential lag between onscreen objects is caused by factors other than the hardware LightBoost (e.g. wrongly limiting the cursor to 60fps when enabling LightBoost, while running the cursor at 120fps without LightBoost). LightBoost was designed for 3D at 60 Hz per eye, so some game logic may incorrectly try to run at 60 Hz instead of 120 Hz; this possibility can't be excluded (especially during moments where you have to hit Control+T to disable LightBoost). I have heard from at least a few people that they were able to eliminate this on-screen differential input lag problem by using the ForceLightBoostWithoutGlasses registry tweak in combination with the EDID override INF file, so that games launch directly into 2D mode without the Control+T tweak. That's because the game now thinks it's running in 2D, and merrily runs everything as it did in 2D to begin with, so there are no accidental additional input lag variables beyond the unavoidable slight hardware input lag caused by the display's enabling of LightBoost. It seems there are a lot of driver bugs/issues, judging by the people who have to keep hitting Control+T to fix freezing problems. Long-term, we need a utility that covertly enables LightBoost on the monitor without letting the nVidia drivers or the video game know about it. That way, no differential lag between onscreen objects should occur; just slightly increased lag between the display and your hand sitting on the mouse/keyboard. It is important to track down all the causes of LightBoost input lag, as apparently there may be more than one kind of LightBoost lag (the hardware kind, which cannot be avoided, and the software kind, which is fixable).

The ability to detect input lag is different between:
- A real-world finger and a window (easy floaty effect)
- An on-screen cursor and a window (floaty mouse cursor feel can be reduced by keeping cursor in sync with window movement, despite display lag)

That said, I agree that input lag is generally always bad; though improvements in the other parts of the chain can compensate enough -- including human brain lag (faster reaction time by having no motion blur).

Back in 1994, I remember playing synchronous network games such as DOOM -- NETDOOM over a 14.4Kbps modem -- where lateral movement and shooting were always input-lagged relative to ping, because it was a synchronous protocol. You had to get used to it and compensate for the input lag with your reaction time. For example, shooting a moving target while heavily input-lagged requires you to press the fire button sooner. This is an acquired skill, much like compensating for real-world projectiles that fly faster or slower -- artillery physics 101; input lag is simply a variable to compensate for. Some professional competition gamers are already skilful at compensating for weapon latency and find it easy to adapt to a slight increase in input lag by using the same skills. The same goes for people who've played very old synchronous-network-protocol games on high-ping connections. People who became very good at compensating for input lag are the type of people for whom the improved reaction times of the zero-motion-blur effect massively outweigh whatever small input lag LightBoost causes.

It is very noteworthy that some game players still play at 60 Hz, which has more input lag than playing at 120 Hz. From what I'm able to tell so far, it seems that LightBoost at 120Hz has less averaged-out input lag than playing at 60 Hz; testing is needed to definitively confirm this. Some good competition gamers are able to tolerate (skilfully compensate for) 60 Hz, since significantly superior skills can easily win -- for example, on monitors provided at a friend's place, workplace, or gaming event. The problem occurs when the gamers are equally matched in skill. Here, reducing even 1ms of input lag can give you a competitive advantage, because you potentially get to shoot 1ms sooner.

The "shoot first" effect can matter even with a 1ms difference in input lag between two equally-matched gamers. Even a 1ms difference occasionally can amplify to as much a 16.7ms difference, if a certain chance moment of reacting 1ms faster "rounds over" to an earlier 60 Hz frame instead of the next 60 Hz frame. (This depends on the game engine & netcode logic, etc.). However, additional brain lag (e.g. increased motion blur during fast motion) can massively outweigh this, as apparently many rave reviews have reported about better reaction times and improved gaming abilities, despite tiny added lag caused by LightBoost.

Generally it is easier to detect differential lag between on-screen objects than between the display and off-screen objects not in your field of view (e.g. the mouse mat). The thresholds for detecting lag in these situations are different, and they vary between people. Input lag (excluding human brain reaction time = human brain lag) is arguably not a 100% holy grail, because improvements to external human reaction time (e.g. from LightBoost's lack of motion blur) can apparently outweigh the input lag disadvantage of the same enhancement (e.g. LightBoost lag), at least for some game players. That is especially true if you're used to CRTs and it's not adding enough input lag to be felt (or you are someone who skilfully compensates for input lag and finds that the benefits of zero motion blur far outweigh the felt input lag).

For the record, I am unable to feel the input lag of LightBoost.
It is noteworthy that some gamers still like LightBoost despite the lag, and still gain a competitive advantage.
True end-to-end input lag chain, includes the human brain lag (reaction time).
 
Has anyone figured out how to use LightBoost on the VG248QE with an AMD card yet? I tried talking to ASUS tech support but they couldn't find anything. He said that "LightBoost will be automatically on when the monitor is running @120hz" and that was just information he was told to pass on...
 
Has anyone figured out how to use LightBoost on the VG248QE with an AMD card yet? I tried talking to ASUS tech support but they couldn't find anything. He said that "LightBoost will be automatically on when the monitor is running @120hz" and that was just information he was told to pass on...
Unfortunately, at this time of writing, LightBoost is enabled only for nVidia cards.

There are some attempts (using SoftMCCS) to find a DDC/CI command that enables LightBoost. Only partial success has occurred (with a specific BENQ monitor). Some research is posted somewhere near page 25 of this thread.
 
Does LightBoost add any input lag? Turning Samsung's native 3D on in Frame Sequential mode on my S23A950D introduces a sizable amount of input lag, making it not worth using in my opinion despite the massive gain in image smoothness.
 
This was covered in detail on the same page as your question, four posts ago. Go back around seven or ten posts to #730 or #733 (yesterday) and start reading.
 
One of many of the posts I was referencing:

If you have confirmed that you can feel this whole-chain difference, you are truly one of the people who is extremely sensitive to input lag. However, most people cannot feel an input lag difference between 18ms and 25ms (end-to-end chain). Also, it's worth factoring in that the lack of motion blur can speed up human reaction time (for certain game play styles) enough to overcome an additional frame of input lag. For example, here is someone who tested a Samsung display that is known to have very bad input lag with its strobe backlight: He appears to be a hardcore FPS gamer -- if he's playing old games just to get insane framerates. He claims to be able to tell the benefit of having 600fps without VSYNC. (I believe him; I have personally seen the advantage of 360fps on my GTX680 myself: the advantage of reduced tearing (during VSYNC OFF) from having a massive overkill of framerate well beyond the refresh rate.)

From dozens of posts on multiple forums and blog comments:
-- Not all LightBoost displays seem to manifest noticeable input lag (more testing needs to be done)
-- Not all gamers notice the input lag of LightBoost
-- Several gamers find that faster reaction times with LightBoost (less human lag) massively outweigh the slight lag of LightBoost (more display lag).
-- For some gamers, LightBoost lag is annoying and distracting.
-- For some gamers, LightBoost has tolerable lag "but with benefits" (people like Matt fall into this category).
-- For some gamers, LightBoost has no discernible lag difference (people like me and Vega fall into this category).
-- For some gamers, LightBoost feels like less input lag (false impression, possibly due to faster reaction time)
-- Some of the additional lag is caused by configuration (e.g. fixing configuration can minimize any added LightBoost lag).


Human reaction times are measured not in single milliseconds, but in tens and hundreds of milliseconds. Even 100-meter Olympic sprinters don't react faster than approximately ~100ms from the starting pistol. Reacting even 20ms faster (100ms versus 120ms) far outweighs 8.3ms of added input lag, allowing you to shoot your enemy before the enemy shoots you. Some gamers are actually reacting hundreds of milliseconds faster with certain types of game motion (e.g. speeding along as the Scout character in Team Fortress 2 without stopping). Obviously, not all gameplay styles benefit, and slower games may not allow the lack of motion blur to give you enough of a reaction-time advantage over enemies that have less display-based lag than you do. But with dozens of reports, it is indisputable that some game players have a reaction-time advantage with LightBoost, including at least a few who seem to be hard-core game players. Multiple confirmations show LightBoost outweighs input lag (as long as the lag stays tiny), including some confirmations from professional competition gamers.

From this isiforums thread, which I discovered only recently, Spinelli is raving about less input lag with LightBoost (an apparently inaccurate claim, but sort of correct if you include less "human brain lag").


If the world was perfect, we'd have CRT-like motion *and* CRT-like input lag. But nobody wants to lug around CRT's anymore.
Now confining ourselves to LCD's -- the question is how much input lag a gamer is willing to sacrifice to gain the CRT-style zero motion blur advantage?
The answer is -- yes, sometimes -- for some people, it's worth whatever small lag LightBoost seems to have -- even for some professional competition gamers!

There appear to be some professional gaming teams now advocating LightBoost. For example, see the Team Exile 5 Review of LightBoost -- a team that receives free hardware from vendors. This is the type of gaming team with sponsored T-shirts and free sponsored hardware; professional competition gamers. (You can see that in the photos!) Clearly, it is already benefiting several professional competition gamers, some of whom have contacted me profusely thanking me when they discovered LightBoost motion blur elimination.

LightBoost may not be for everyone, but I have thus given proof that multiple competition gamers are now using LightBoost.
I, therefore, rest my case.
 
Each and every example quoted is someone giving their personal opinion. No one has yet done an SMTT test to give accurate information (assuming one can run SMTT with LightBoost).
 
I never said it was hardware tested information. He asked a question and that is the best answer available currently as far as I know.

An interesting test would probably be a blind test, but it would have to be 100% blind, with the exact same hardware, drivers, and settings right down to the keyboard and mouse, tested many times by different people, with the data crunched afterwards to get a refined picture of the results.

The fact that there is zero motion blur results in increased accuracy, which could possibly counter a small amount of input lag. Another type of tradeoff. How could you test the gaming benefit of zero blur and accurately measure it versus a (small?) increase in input lag, even if you had hardware-tested input lag numbers? Maybe another blind test, but there are so many variables in a game, especially a team game, that you would have to devise some kind of walk, run, spin, jump and ride-through robotic shooting-gallery arena to test eye-hand coordination and reaction times against the entire input lag chain during various types of mobile and stationary shooting, rather than muddle the results with too much game-arena variance.

There is a lot of variance in gaming arenas, especially online ones. To list a few sources: every player's skill level on both teams and how "in the zone", well-playing, and undistracted they are in any given session, the varied performance of their rigs, the varying online latency of teammates and opponents, game-engine code relating to latency, luck-of-the-draw item spawn timing, spawn camping, and varying knowledge (or lack of knowledge) of a game's maps and mechanics between players. Input lag does matter to a degree, but there are going to be a ton of other variables (especially the entire chain of latency, including ping and all hardware, of every player on both teams) that grossly outweigh and could essentially wash out very small differences in display input lag from a scoring perspective in online games, especially when counterbalanced against a display's zero-blur advantage.

I can understand that if input lag becomes noticeable in feel (and not just a "placebo" feeling based on quoted numbers), it could be annoying from an aesthetic perspective, much like I hate LCD blur aesthetically and find it aggravating when people frame blur reduction as something that only matters for a twitch gamer's scoring advantage. To that point, most people are not LAN gamers, and most people are not going to run bottom-level graphics settings or lower resolutions on modern games to get multiple hundreds of fps, so the people who have cut their performance edge to the point where it could matter are likely a very small sample.
 
I think within the next 3 months, one of the reputable reviewers will apply SMTT to a LightBoost monitor, or utilize my upcoming input lag testing. Objective measurements of input lag will be good; I'm just saying it's already confirmed that other factors can outweigh a small difference in input lag.
 
Mark, after doing the tweak, if I'm playing CS:GO and some old Q3-engine game and I'm using 4:3 resolutions in both, do you recommend changing the desktop resolution and using windowed mode, or can it always work fullscreen?
 
Mark, after doing the tweak, if I'm playing CS:GO and some old Q3-engine game and I'm using 4:3 resolutions in both, do you recommend changing the desktop resolution and using windowed mode, or can it always work fullscreen?
Hard to say; some people prefer these games with stretched resolutions and others prefer them in 4:3 mode. I have never tested LightBoost in 4:3 mode before. You could also embed 1280x1024 within 1080p timings using nVidia Custom Resolutions.

One caveat I can think of about windowed mode: it can potentially add input lag, due to Windows compositing / Aero / etc. (e.g. VSYNC OFF often does not work in windowed mode -- you'll notice that tearing disappears even though the in-game VSYNC is off). Turning off Aero in Windows 7 can help fix the input lag penalty of windowed mode. However, Windows 8 enforces compositing.
 
I think within the next 3 months, one of the reputable reviewers will apply SMTT to a LightBoost monitor, or utilize my upcoming input lag testing. Objective measurements of input lag will be good; I'm just saying it's already confirmed that other factors can outweigh a small difference in input lag.

Hm, I have SMTT 2.0 and could test one of my QE's versus a QE right next to it. One in LB mode and one not. Not sure how well my camera will handle the LB strobing though. I think it does 1/600th at f4.5.

That is, if I have time. PCMonitors just reviewed the QE and put input lag at ~2ms, so it is confirmed, along with the BenQ 11T, as being the best gaming monitor on the market.
 
If my in-game resolution is 1024x768 and I'm launching in fullscreen with a fixed aspect ratio, is there any reason for LB not to stay on? Or will I experience fps drops?
 
So I have a VG248QE and I ran into a bit of a problem after applying the LightBoost hack, as I don't really see any difference compared to normal 144Hz, except the screen is darker... I also noticed that on step 7 I don't have the option "Verify 'Enable Stereoscopic 3D settings for all displays' is enabled". I can't even test it, because when I start PixPerAn the screen just flashes a couple of times, changes to bright colors, and then nothing happens; the program doesn't start. So, anyone got any ideas what's wrong here :s? And why doesn't PixPerAn work, or is there perhaps another program to test LightBoost?
 
If my in-game resolution is 1024x768 and I'm launching in fullscreen with a fixed aspect ratio, is there any reason for LB not to stay on? Or will I experience fps drops?

Had this problem in the beginning. In the nVidia Control Panel, go to "Adjust desktop size and position", set "Perform scaling on" to Display, and also check "Override the scaling mode set by games and programs"; it should then work at lower resolutions. Some games, like Quake Live (which is a browser game), need it set to always-on before launching as well.
 
So I have a VG248QE and I ran into a bit of a problem after applying the LightBoost hack, as I don't really see any difference compared to normal 144Hz, except the screen is darker... I also noticed that on step 7 I don't have the option "Verify 'Enable Stereoscopic 3D settings for all displays' is enabled". I can't even test it, because when I start PixPerAn the screen just flashes a couple of times, changes to bright colors, and then nothing happens; the program doesn't start. So, anyone got any ideas what's wrong here :s? And why doesn't PixPerAn work, or is there perhaps another program to test LightBoost?

You could also try Chromium Wheel Smooth Scroller and scroll web pages up and down. In non-LB mode you will see some trails/blur; in LB mode it will be crystal clear.
 
So I just got my VG248QE monitor and followed the instructions from the HOWTO. Battlefield 3 runs fine in normal, but when I enable lightboost it barely loads (very laggy/choppy) and eventually crashes.

I did not have this issue with my VG278H, so I think it must be something with the inf and registry tweaks. Anything I can try? Running two 680's, and I believe I removed all of the Catleap stuff (drivers, toastyx patch, etc.)
 
Hm, I have SMTT 2.0 and could test one of my QE's versus a QE right next to it. One in LB mode and one not. Not sure how well my camera will handle the LB strobing though. I think it does 1/600th at f4.5.

I would very much like to see those results.
 
I would very much like to see those results.
I have the suitable equipment, including the high-speed camera, but I no longer have a CRT. I need to re-obtain a high-end CRT (with 120 Hz capability), then I'm all set. I need to eventually use my Arduino input lag meter, too (that project is sitting on the shelf awaiting further work while I work on the motion test I'm planning to launch).
 
So I just got my VG248QE monitor and followed the instructions from the HOWTO. Battlefield 3 runs fine in normal, but when I enable lightboost it barely loads (very laggy/choppy) and eventually crashes.

I did not have this issue with my VG278H, so I think it must be something with the inf and registry tweaks. Anything I can try? Running two 680's, and I believe I removed all of the Catleap stuff (drivers, toastyx patch, etc.)
Try disabling the "3D checkbox" and see if LightBoost sticks (stays enabled). That's the holy grail mode: LightBoost staying enabled in 2D mode, so games don't need Control+T. This appears to be the LightBoost mode that Battlefield 3 works flawlessly with.
 
The simplest test would be to use any two 120hz displays. They don't need to be the same model and a CRT isn't necessary. Measure input lag difference between them without LightBoost. Even with identical displays this may be greater than zero depending on graphics card drivers/settings. Then simply repeat the test with LightBoost on the primary display and calculate the difference.
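
The data crunching for that kind of comparison is simple; here's a rough Python sketch assuming an SMTT-style workflow of photographing both screens showing the same millisecond timer (all readings below are hypothetical):

```python
# Rough sketch of the data crunching for the two-display comparison described
# above. Assumed workflow (SMTT-style): photograph both screens showing the
# same millisecond timer and read the values off each photo. All readings
# below are hypothetical.

def mean_offset_ms(reference_ms, test_ms):
    """Average how far the test display trails the reference, per photo pair."""
    diffs = [ref - test for ref, test in zip(reference_ms, test_ms)]
    return sum(diffs) / len(diffs)

# Hypothetical timer readings (ms) pulled from photo pairs:
baseline_ref  = [10234, 11821, 13498, 15010]
baseline_test = [10233, 11819, 13497, 15009]  # both displays without LightBoost
lb_ref        = [20441, 22007, 23615, 25102]
lb_test       = [20437, 22003, 23610, 25098]  # test display with LightBoost enabled

baseline = mean_offset_ms(baseline_ref, baseline_test)
with_lb = mean_offset_ms(lb_ref, lb_test)
print(f"baseline offset: ~{baseline:.1f} ms, with LightBoost: ~{with_lb:.1f} ms")
print(f"estimated lag added by LightBoost: ~{with_lb - baseline:.1f} ms")
```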
 
I will compare a QE versus another QE in LB mode with SMTT 2.0 when I get some free time.
 