ASUS/BENQ LightBoost owners!! Zero motion blur setting!

If you're getting 100-110fps, just use the 100Hz LB mode.

Anyway, I've just installed Borderlands and I'm faced with a dilemma...

My PC can't do 120fps in the game... unless I'm looking at the ground (even with shadows and occlusion disabled).
So I have to choose between:
1. 40-60 fps 3D
2. uncapped FPS 60-120 LB
3. uncapped FPS 60-120 no LB
4. capped fps 60 LB
5. capped fps 60 no LB
6. 60Hz

Now for drawbacks:
1. it's 3D; works great at 60fps, looks horrible when it drops
2. inconsistent performance, various levels of blur
3. inconsistent performance, variance in blur not as noticeable
4. smooth performance, double-image effect blur...
5. smooth performance, consistent smooth blur
6. uhh, smooth performance, high blur

I will probably go for 5 since I consider the double image effect to be worse.
What would you choose?

If you have PhysX enabled in the game, turning it off or lowering the settings will maintain a more consistent frame rate. I had PhysX set to "High" previously and noticed my FPS would drop from 120 to 60 very frequently. I thought I had turned on an FPS limit somewhere, but it was just my GPU spending more resources on PhysX than on rendering the screen.
 
So is there a way to keep my 144Hz while also having the option to use 120Hz LightBoost whenever I want? I can't keep 120fps in a game like BF3, so I don't use the LightBoost trick. This is where 144Hz benefits me: if I set the fps cap to 75 instead of 60, the gameplay is smooth. I have the VG278HE monitor, and with the INF I lose 144Hz.

Also, does anyone have any recommended calibration settings for my monitor, with and without LightBoost?
 
http://www.lagom.nl/lcd-test/

Do the contrast, gamma, and black level tests. Color reproduction can only be fine-tuned with a proper calibration tool, and every monitor, even within the same model line, is different.

But if all you do is game, follow the link I gave you, do the minimal tweaking, and set the color temp on your monitor to whatever you feel is best, because short of spending at least $150 on a proper calibration kit, that's the best you are going to get.
 
I tend to agree with the tweaking sentiments, though hardware calibration would provide a solid foundation and a good starting point.

Don't forget that if you bathe your display in direct light sources, or allow the lighting conditions in your gaming "studio" to vary at all, the hardware-calibrated brightness, contrast, gamma, color saturation, etc. will no longer be what your eyes are seeing. Hardware calibration is usually done in a dark room right up against the screen, and the different lighting conditions I mentioned will affect the way your eyes perceive the settings quite a bit, in some cases drastically.

In my case, I keep a corner desk facing out from the corner. When a desk is up against a wall like a bookshelf, it acts as a catcher's mitt for light pollution, whether the screen is glossy or AG (though the effects are more obvious on glossy). For lights, I prefer keeping lighting behind my monitors, and since my desk is a considerably long "stealth bomber"/chamfered boomerang shape, I can keep a lamp in line at each end of my monitor array like bookends without them showing up in my glossy displays' surfaces. Btw, balanced lighting right/left also helps avoid eyestrain/headaches; one-sided lighting is bad.

Any settings you store on your monitors will be altered perception-wise at different room lighting levels, so unless you are going to keep several sets of settings hotkeyed for different lighting environments, you are better off trying to maintain similar lighting levels in the room all the time, imo. I keep three sets of settings on my living room TV for this reason, since the perceived settings are hugely different from daytime to lights-out/blinds-drawn, etc. In my computer room I just try to keep the room behind the monitors lit well with floor lamps at night, to maintain the light level from a daytime window.

My LED monitors are quite bright, so keeping the room at least moderately lit, with a lamp on each end of the monitor array, also helps prevent the monitors from looking too harshly bright by contrast.
 
I can't get LightBoost to stay on for games :( I launch Battlefield 3, hold Ctrl+T, then exit BF3 and turn stereoscopic 3D off, yet when I relaunch BF3 it's in non-LightBoost mode.
 
OK, I gave it one more try: decreased LightBoost in the monitor's OSD (it's at about 60% of the green meter on the BenQ -- it's way too bright at max), did the gamma correction in the nVidia software and... gaming is on another level.

Playing Hawken with no motion blur at all, fragging people like crazy :D. Frame rate about 50-70fps and no screen tearing, vsync off.
CS 1.6 -- 120fps, vsync on (with it off there's screen tearing; settings in CS: fps_max 120, developer 1). Old CRT times, baby!

BUT! No Chromium wheel smooth scroller for me! It gives me very bad eyestrain, especially reading high-contrast text like on hardforum (white text on gray background). I think it's because my eyes want to read everything while scrolling down or up.

About the crimson tint: my settings for RGB brightness and contrast are totally different than Mark's; I need to buy/borrow a colorimeter.
 
I can't get LightBoost to stay on for games :( I launch Battlefield 3, hold Ctrl+T, then exit BF3 and turn stereoscopic 3D off, yet when I relaunch BF3 it's in non-LightBoost mode.

Did you set one of the drop-down options in the 3D Settings page to "Always" on? After turning off 3D, LightBoost should still be turned on.
 
Mark, I've turned LightBoost 2D back on.

Right away, I noticed eye strain. One setting I tried was LB 100Hz vs 120Hz. That made no perceivable difference in eye strain; it was still there.

I've been using your 93 contrast setting both in and out of LB mode. I tried lowering that. Lo and behold, much, much less eye strain in-game. At the desktop, there's just as much, unfortunately. Contrast at 0 in/out of game. As soon as I leave the game, the eye strain cranks up a notch. Not sure why.

Anyhow, if LB 2D mode hurts your eyes, at least try lowering the contrast. It seemed to make very little difference in-game with LB enabled, imho.

I've also tried messing with different levels of LB. It seems not to matter too much. It's the contrast that was really driving the eye strain.

As it stands, I'm not sure I'll keep using LB in 2D mode. It's probably not worth it compared to 144Hz gaming without the eye strain.
 
If you have PhysX enabled in the game, turning it off or lowering the settings will maintain a more consistent frame rate. I had PhysX set to "High" previously and noticed my FPS would drop from 120 to 60 very frequently. I thought I had turned on an FPS limit somewhere, but it was just my GPU spending more resources on PhysX than on rendering the screen.

Borderlands 1 does not have PhysX support.
I get the same framerate with both
bDisablePhysXHardwareSupport=True
bDisablePhysXHardwareSupport=False

I've decided to play at 120Hz with the framerate fixed to 60fps (no LightBoost).

My E8500 doesn't cut it in most games anymore; I'll need to get a new CPU and maybe an SSD. Maybe then I'll be able to play more games at 120fps.
 
Hey guys, I'm new to all of this and I'm having some problems as I've never messed around with anything mentioned in the guide and was hoping I could get some help.

So I just purchased an ASUS VG248QE and started following the instructions for Option B.

Now on Instruction 2, I copied the file into Notepad, renamed it "Asus-VG278H-3D-Monitor-EDID-override.inf", and placed it on my desktop. I then right-clicked and hit "Install", and an error message came up stating "The INF file you have selected does not support this method of installation". I then tried installing it via Device Manager, and it would not show up on the manual search.

Would love to figure this out and just try LightBoost out; I loved gaming on CRTs back in the day and wouldn't mind replicating the experience with better colours.
 
You need to install the INF via Device Manager, manually. You're in the right place if you see the "Have Disk" button.
 
Now on Instruction 2, I copied the file into Notepad, renamed it "Asus-VG278H-3D-Monitor-EDID-override.inf", and placed it on my desktop. I then right-clicked and hit "Install", and an error message came up stating "The INF file you have selected does not support this method of installation". I then tried installing it via Device Manager, and it would not show up on the manual search.

Would love to figure this out and just try LightBoost out; I loved gaming on CRTs back in the day and wouldn't mind replicating the experience with better colours.
Are you using Windows 7 or Windows 8? You may need to override driver signature verification in order to install third-party INF files.

Let me know if you succeed; I'd love to help.
 
Are you using Windows 7 or Windows 8? You may need to override driver signature verification in order to install third-party INF files.

Let me know if you succeed; I'd love to help.

Any update on where to get that INF that lets me keep non-LightBoost 144Hz and LightBoost 120Hz?
 
Does anyone have three LightBoost monitors of any kind in Surround who could test whether LB stays enabled if you use them in portrait-mode Surround? (You don't have to physically rotate your displays, just attempt the setting.)
 
I have a question for MarkR, as well as Vega and others who have used a 27" 120Hz-133Hz Korean IPS. In debating against 60Hz monitors as gaming monitors in a few other threads (reminds me of before I actually used a 120Hz myself lol), I used a post by MarkR showing the pixel trail amounts on different monitors.

If you come from a CRT, then you will be disappointed to hear that 120Hz will only reduce motion blur, not eliminate it completely. You need a strobe backlight for a dramatic reduction. CRT and plasma are impulse-driven displays, but LCD normally isn't impulse-driven (except for LightBoost strobe mode).

PixPerAn chase test, 960 pixels per second:
60Hz -- blur trail length of about 16 pixels
120Hz non-strobed -- blur trail length of about 8 pixels
120Hz LightBoost strobe backlight -- blur trail length of ~1 pixel (CRT sharp)

The above quote was from when the LightBoost 2 LCDs tested did not yet include the newer 1ms backlight-strobe models, which are tighter. The latest ones do not even have a single 1-pixel afterimage and truly have zero blur, I believe. I didn't want to post outdated info without saying that.

As most people that follow this thread probably know, that blur trail length actually looks like the WYSIWYG blurry "blob" UFO photos (the ones in this thread and on the Blur Busters blog) to your eye, not the single trail that a non-motion-tracking camera shows, but it is still a useful measurement. The longer the trail, the more horrible the real-world blur is to your actual eyes, and of course it affects the entire viewport of high-detail objects and textures, not just a simple cel-shaded object.

Getting back to my question: is that trail length measurement only accurate on a 1080p "grid"? Do the 2560x1440 "overclockable" IPS monitors, with their higher number of pixels per inch, still only get a trail measurement of 8 (much tinier) pixels? Or would it be the same real-world-distance trail on a 27" 1920x1080 monitor vs a 27" 2560x1440 monitor? For example, if the trail was 1/8th inch, yielding the 8-pixel trail quote, would it be 1/8th inch full of 108.8ppi pixels on the IPS? If it is not a ratio like that, and only around 8 pixels in both cases, how would this relate to perceived blur comparisons between the two?

I know this isn't really a LightBoost 2 topic, but a lot of the data that has me asking this question comes from this thread, MarkR, and Vega, so sorry if I'm a little OT.
 
Hm, I really had not thought about that. Screens the same size, image the same size, pixels moving at the same speed, the only difference being PPI. My best guess, all things being equal, is that the higher-PPI screen would appear clearer, due to the reduced size of the pixels and the reduced distance they would make up of the trail.

But as Mark pointed out, most motion blur is actually from eye movement viewing sample and hold frames, not ghosting/trails.
 
But I thought the trail measurement was comparable to the motion-tracking camera's WYSIWYG "blur blob" extents, left end to right end, more or less.

That is, the distance from the solid left side of the UFO object to the right edge of the trail afterimage would be the full width of the "blur blob" you see with a motion-tracking camera and with your eyeballs, which I thought was the retinal-retention-blur version. I'll have to think about it... maybe it is just showing the response-time-related blur like you said. For some reason I thought the motion-tracking UFO image was emulating retinal retention blur, and that the extents of its blur-ball UFO object were comparable to the length of the trail-image version.
 
Maybe Mark will be able to chime in.

On another note, I have two more QEs inbound for de-matte'ing/de-bezeling to begin my 3x 120Hz LightBoost zero-motion-blur portrait Surround, 4x SLI Titan, geo-thermal-cooled awesome-sauce project. :D

You called it elvn!
 
Turned this on and tried it; I wasn't able to tell much of a perceivable difference in image quality, other than less brightness and more filled-out colors. Not at all to bash those who are enjoying this, but I wasn't able to tell a difference and didn't want to sacrifice too much image quality.
 
Lightboost doesn't do anything for image quality. Just motion clarity.
 
Iton, it's not image quality this feature is for. It's for motion blur reduction.
 
Yes, that's correct. If you use the registry tweak, you'll keep your display in LightBoost mode at all times, in and out of games. If you prefer your monitor this way, then you have to calibrate it using nVidia Control Panel -- there are R/G/B adjustments there for contrast/gamma/brightness. (Do adjust your monitor's OSD Contrast first, to set a baseline brightness for your whites.)

The problem with using the nVidia Control Panel to adjust the RGB settings is that some games seem to ignore it (Source based games for example). Have you seen that happen or is it just me?
 
Turned this on and tried it; I wasn't able to tell much of a perceivable difference in image quality, other than less brightness and more filled-out colors. Not at all to bash those who are enjoying this, but I wasn't able to tell a difference and didn't want to sacrifice too much image quality.

Make sure you are maintaining over 100fps at 100Hz, or over 120fps at 120Hz, or you won't be seeing what these monitors can do. For that matter, you wouldn't be seeing the full benefit that a 120Hz-or-greater LCD provides outside of LightBoost 2 synchronized strobing either.

I've heard people claim that they can see no benefit of 120Hz at high fps over 60Hz, or that they see no blur on 60Hz LCDs. Some people don't have fast eyesight. I consider my eyes pretty "fast", but my ex-gf could see certain light bulbs flickering, and fluorescents were ridiculous for her.

On gaming LCDs, some people have developed a "flick A to B" style of FoV movement, whether by inherent play-style or subconsciously, which acts as a workaround and helps avoid the FoV-movement blur duration (blinking your viewpoint from A to B like a gnat jumping around). I suspect others have developed a blasé gaze during FoV movement, since they have been trained to see everything blur out during the repeated fractions of a second that they are moving their FoV, over and over. They've just come to accept the blur like visual "noise" every time they move their FoV and don't even register it much, if at all, psychologically. Others, like me, just suffer the locked-on focus of the entire viewport (high-detail objects, architecture, landscape, textures, depth via bump mapping, other shaders, etc.) being blurred out every time they move their "head" (pan their entire viewport/FoV), even though it is blatantly obvious. Those of us who find it a huge inferiority of LCD tech are on a continued quest for a gaming display that can actually display motion clearly.
 
The problem with using the nVidia Control Panel to adjust the RGB settings is that some games seem to ignore it (Source based games for example). Have you seen that happen or is it just me?

I've been playing with this a bit. For Source games (at least some), you can run in borderless/windowed mode and keep the desktop calibration.
 
I've been playing with this a bit. For Source games (at least some), you can run in borderless/windowed mode and keep the desktop calibration.

The nVidia color settings (with the exception of digital vibrance and hue), as well as software-calibrated monitor profiles, adjust the video card's LUT. Games will often reset the LUT in fullscreen mode.

Download and install Monitor Calibration Wizard. It can capture and enforce the current LUT (don't actually click the 'Run Wizard' button, just save the profile), so that if a game tries to reset the LUT, MCW will override it.
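
If you're curious what that LUT enforcement amounts to, here's a minimal Python sketch of the same idea on Windows, using the GDI gamma-ramp calls. This is my own illustration of the technique, not MCW's actual code, and the 2-second re-apply interval is an arbitrary choice:

```python
import ctypes
import time

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

# A gamma ramp is 3 channels (R, G, B) x 256 entries of 16-bit values.
GammaRamp = (ctypes.c_ushort * 256) * 3

def capture_ramp():
    """Save the LUT currently loaded on the video card."""
    hdc = user32.GetDC(None)          # device context for the whole screen
    ramp = GammaRamp()
    ok = gdi32.GetDeviceGammaRamp(hdc, ctypes.byref(ramp))
    user32.ReleaseDC(None, hdc)
    if not ok:
        raise OSError("GetDeviceGammaRamp failed")
    return ramp

def enforce_ramp(ramp, interval_s=2.0):
    """Periodically re-apply the saved LUT, overriding games that reset it."""
    while True:
        hdc = user32.GetDC(None)
        gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
        user32.ReleaseDC(None, hdc)
        time.sleep(interval_s)

if __name__ == "__main__":
    # Run this after calibrating in nVidia Control Panel.
    enforce_ramp(capture_ramp())
```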
 
I've been playing with this a bit. For Source games (at least some), you can run in borderless/windowed mode and keep the desktop calibration.

You typically get a decrease in performance running in windowed mode, so that might not be the best way to solve it.
 
Maybe Mark will be able to chime in.

On another note, I have two more QEs inbound for de-matte'ing/de-bezeling to begin my 3x 120Hz LightBoost zero-motion-blur portrait Surround, 4x SLI Titan, geo-thermal-cooled awesome-sauce project. :D

You called it elvn!
To inspire you, I'm looking for these pictures I saw a few years ago where someone buried a house or car radiator in the ground outside his window and ran his water cooling loop through it.
 
You typically get a decrease in performance running in windowed mode, so that might not be the best way to solve it.

I can't think of any Source games that don't run well in excess of 120fps on any system powerful enough to think about running these faster monitors.
 
Maybe Mark will be able to chime in.

On another note, I have two more QEs inbound for de-matte'ing/de-bezeling to begin my 3x 120Hz LightBoost zero-motion-blur portrait Surround, 4x SLI Titan, geo-thermal-cooled awesome-sauce project. :D

You called it elvn!


Holy crap. You said you sold the de-bezeled FW900s with the Fresnel lens? This one should be even better: bigger screen, less clumsy and power-hungry, etc.
 
To inspire you, I'm looking for these pictures I saw a few years ago where someone buried a house or car radiator in the ground outside his window and ran his water cooling loop through it.

Hopefully he buried it deep; that is where you get the real benefit. ;)
 
While I can clearly see the difference with LightBoost on, I am unable to play any games with it because the input lag is unbearable. My mouse is so floaty. I am not sure if it's related to running SLI or what.
 
But as Mark pointed out, most motion blur is actually from eye movement viewing sample and hold frames, not ghosting/trails.
Correct. Human perception of motion blur can come from many sources, including pixel persistence, which leads to trailing and ghosting artifacts (asymmetric rise-and-fall effects of an LCD pixel) that are perceived as roughly equivalent to motion blur by the human eye (or a chase camera tracking the object).

There can be lots of confusion about "ghosting" versus "motion blur" since ghosting is not really motion blur, but is perceived as a motion artifact that affects clarity of motion, and is thus included in all possible motion blur "weak links".

For the purposes of the Blur Busters Blog own definitions:

"Ghosting" = a trailing artifact, caused by asymmetric rise and fall of LCD pixels. For example, a dark trail behind a red moving square on a cyan background (configurable in PixPerAn chase test). On many LCD's, the pixel fall-to-black is slower than the pixel rise-to-white

"Motion Blur" = blur that equally occurs on both leading edge and trailing edge of a moving object, completely unrelated to the normal ghosting artifact. The vast majority of this is caused by eye tracking, but longer pixel persistence will worsen this effect. However, pixel persistence on TN is an insignificant fraction of a refresh and thus pixel persistence is not the weak link in LCD motion blur (as proven by the ability to strobe the backlight to bypass the other motion blur weak link -- eye tracking-based motion blur caused by sample and hold)

Again, TN pixel persistence (1ms or 2ms) is now an insignificant fraction of a refresh. So most of the motion blur seen by the human eye is caused by eye-tracking a sample-and-hold display. LCD refreshes are displayed statically, but your eyes are continuously moving when tracking a moving object. Your eyes are in a different position at the end of a refresh versus the beginning of a refresh, so the frame has "smeared" across your vision. (Example: tracking a 960 pixels/second object on a 60Hz sample-and-hold display, your eyes move 16 pixels during each 1/60sec refresh -- hence the 16-pixel blur trail.) To eliminate that, you need shorter refreshes (either extra refreshes via higher Hz, or black periods between refreshes ala CRT flicker).

____________________

On a related topic:

Stationary camera images are excellent for measuring pixel persistence effects,
while tracking camera images are excellent for measuring eye-tracking motion blur (*and* all trailing effects, including ghosting and PWM effects).

...

An accurate pursuit camera (chase camera) tracking the object captures the same "motion blur" (trailing/ghosting/simple blur) that is perceived by the human eye. This is a method of recording the same blur that is perceived by the human eye, and it is sometimes done by TV manufacturers and monitor makers (google "MPRT pursuit camera" -- the rigs often cost $10K+ and are beyond the budget of blogs like mine at this time). The ideal litmus test would be a 1/10sec exposure (several refreshes) chasing the moving object -- and the photograph is still sharp when accurately following an object on a CRT or LightBoost monitor, despite the crazy-long exposure lasting multiple refreshes. Presently, my blog is attempting to find a sub-$1K solution for an accurate tracking camera that can follow a PixPerAn car perfectly (less than 1 pixel of inaccuracy) over a 1/10th-second period while snapping a photo. This records WYSIWYG motion blur, practically exactly as seen by the tracking human eye -- even if I have to build it from scratch (it can be done; inkjet printer cartridge heads are amazingly accurate in horizontal motion).
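
To put numbers on that tracking requirement, here's a quick back-of-envelope sketch; the 92 PPI figure is my assumption for a ~24" 1080p panel, not something from the post:

```python
# Back-of-envelope numbers for a pursuit camera rig.
speed_px_s = 960          # on-screen object speed (pixels/second)
exposure_s = 0.10         # exposure length (seconds)
max_err_px = 1.0          # allowed tracking error over the exposure

travel_px = speed_px_s * exposure_s       # 96 px of camera travel
speed_tol = max_err_px / travel_px        # ~1% allowed speed mismatch

ppi = 92                                  # assumption: ~24" 1080p panel
travel_mm = travel_px / ppi * 25.4        # ~26.5 mm physical slide
err_mm = max_err_px / ppi * 25.4          # ~0.28 mm positional accuracy

print(f"slide {travel_mm:.1f} mm in {exposure_s:.2f} s, "
      f"speed held within {speed_tol:.1%} (~{err_mm:.2f} mm error budget)")
```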

I did a manual slide test that succeeded (after 10 tries):
movingcamera.jpg

My motion test (currently in beta; including Vega testing it) moves my UFO avatar sideways, in a similar fashion as the PixPerAn car.
This above photograph is a 1/60th second exposure, and was done on a Samsung 245BW with a 180Hz PWM backlight.
This above photograph perfectly captured exactly what my human eye saw while I was tracking it: the PWM trailing effect! (180Hz PWM / 60Hz refresh = 3 copies)
The camera was sliding at virtually exactly the same speed as the moving object (within <1 pixel of onscreen object motion).
The above photo is a true WYSIWYG motion blur capture -- it captured exactly what the human eye perceived.
(At least within an error margin of approximately one pixel -- I was at least able to keep the vertical black screendoor gap mostly undisturbed; it shows up as horizontal lines between pixels in the moving-camera photo above.)

However... I'm looking to set up an automated rig to do this, for reviewing/comparing captured motion blur between different displays.
Instead of sliding the camera sideways manually to follow a moving object, I'd rather it be done automatically and mechanically, in a precision-controlled manner (less than +/-1 pixel of tracking inaccuracy over a 1/10sec period). Unfortunately, scientific-grade rigs capable of doing this (via camera rotation, rotating mirror, or sliding camera -- all equivalent from a motion-blur-capture perspective) cost over $10K, so I'm pursuing another, blogger-friendly method.
 
While I can clearly see the difference with LightBoost on, I am unable to play any games with it because the input lag is unbearable. My mouse is so floaty. I am not sure if it's related to running SLI or what.
Try these steps:
1. Turn off VSYNC
2. Turn off mouse smoothing (do not use software-based mouse smoothing)
3. Use a gaming mouse set to 1000 Hz. This has much less input lag!
4. Set the hardware mouse sensitivity VERY HIGH. (far too fast for Windows)
5. Set the game mouse sensitivity VERY LOW. (usually 5%; compensates for the crazy-high hardware sensitivity -- see the sketch after this list)
6. Silky smooth (even better than software-based mouse smoothing but without the lag)
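
For anyone wondering why steps 4-5 work: the idea is to hold the product of hardware sensitivity and in-game multiplier roughly constant while pushing the scaling into the mouse hardware. A rough sketch with illustrative numbers -- the 5700 CPI figure matches a G9X's maximum; the rest are made up:

```python
# Steps 4-5: keep effective sensitivity constant while moving the
# scaling out of software and into the mouse hardware.
def effective_cpi(hardware_cpi: int, game_multiplier: float) -> float:
    return hardware_cpi * game_multiplier

# Illustrative numbers, not from the post (5700 is a G9X's max CPI):
typical = effective_cpi(800, 1.00)    # 800 effective CPI, coarse counts
tweaked = effective_cpi(5700, 0.14)   # ~800 effective CPI, but each count
                                      # is finer, so motion is smoother
print(typical, round(tweaked))        # 800.0 798
```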

My mouse is not floaty at all during LightBoost.
It doesn't feel floaty even with VSYNC on; I prefer VSYNC ON when playing solo (VSYNC OFF for online competition).
If your mouse is still floaty after following these instructions, then something else is causing enough input lag to cause the mouse to be floaty, or it's a certain (specific) videogame.
LightBoost does add a very minor amount of input lag, but I don't feel it at all (no floaty mouse pointer here). It's generally insignificant, and the lack of motion blur (and improved reaction time) massively outweighs the slight added input lag. One professional gamer remarked that he prefers zero motion blur over an additional frame of input lag.

I use a Logitech G9X, a gaming mouse. It has extra buttons that allow me to adjust its sensitivity, so I can make it crazy-high sensitivity within games, but change it back to normal when I exit the video game so the pointer isn't too fast outside the game.

Try temporarily turning SLI off, as well.
 
I have a question for MarkR, as well as Vega and others who have used a 27" 120Hz-133Hz Korean IPS.

Getting back to my question, is that trail length measurement only accurate on a 1080p "grid"?
It's resolution and PPI independent.
PPI doesn't matter; resolution doesn't matter.
For 100% brightness (no PWM, no strobe), the blur trail is directly proportional to the pixel step between frames, no matter what the PPI is, no matter what the resolution is, as long as the pixel persistence is significantly less than a refresh (longer persistence would lengthen the blur trail).

Trail length is pretty accurate. For LightBoost at the 10% setting, it is actually roughly 1.3 to 1.4 pixels of blur trail for objects moving at about 960 pixels per second. But this is quite insignificant, and it looks like there's no motion blur for 960 pixels/sec moving objects. I can even count the pixels and jaggies inside the PixPerAn car -- the pixels have sharp corners even at 960 pixels per second (just like on a CRT). The text "I NEED MORE SOCKS" is clear.

Do the 2560x1440 "overclockable" IPS monitors with their higher number of pixels per inch still only get a trail measurement of 8 (much tinier) pixels?
Apples to apples, buddy.
960 pixels per second using tiny pixels is slower motion.
If you measure motion in inches/second instead of pixels/second, you come full circle back to the same old motion blur problem for a same-display-size, same-game-motion comparison.
You're still moving in video games at the same physical speed (to the eyes), so you're just taking bigger pixel steps per frame on a higher-PPI display.

If your motion is stepping 0.25 inch between frames, you get 0.25 inches of eye-tracking-based motion blur, no matter what the PPI is. (for the situation where pixel persistence is an insignificant factor)
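
Mark's rule in sketch form: eye-tracking blur equals object speed times how long each frame stays visible -- the whole refresh for sample-and-hold, just the flash for a strobe. Since everything is in pixels, the PPI-independence falls out automatically. The ~1.4ms strobe length below is inferred from his ~1.3-1.4 pixel figure, so treat it as an assumption:

```python
# Eye-tracking motion blur = object speed x time each frame stays visible.
def blur_px(speed_px_s: float, visible_s: float) -> float:
    return speed_px_s * visible_s

speed = 960  # PixPerAn chase-test speed (pixels/second)

print(blur_px(speed, 1 / 60))    # 60Hz sample-and-hold  -> 16.0 px
print(blur_px(speed, 1 / 120))   # 120Hz sample-and-hold ->  8.0 px
print(blur_px(speed, 0.0014))    # ~1.4 ms strobe (LightBoost=10%,
                                 # inferred from the ~1.3-1.4 px figure)
```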
 
That answers it for me. Blur really shouldn't be defined by a single, simple cel-shaded object anyway, and I remind people of that quite a bit. It is a good testing tool, but it can give a false impression of just how bad an entire viewport of a game looks when it is blurred during FoV movement (even worse in modern, very-high-detail games). I was just curious from a testing perspective.

If I'm walking down a mountain valley trail, or better yet moving at speed on a mount or vehicle in some game, and do a fast pan (not a flick/blink) around 90 degrees left and back to forward, the entire scene, i.e. the entire viewport, is going to blur no matter what the resolution is -- all high-detail objects, textures, depth via bump mapping, and all other shaders either totally blurred and smeared outside the lines (60Hz), or lost in a full "soften blur" type effect closer to the shadow mask of the onscreen objects at 120Hz (non-LightBoost 2), in my experience.

Btw, I did test my A750D and was able to get up to a 17 in PixPerAn using Blur Busters' Samsung instructions. At 18, the afterimage shadow was superimposed on top of the original letters, which made it too hard to read at that speed. At a few different speeds the aftershadow of the text was in different positions, so I'm not sure if it would be readable at any higher speed if the aftershadow moved off the letters again at some point. 17 still is not that bad, considering. I'll be testing it out in L4D2 and a few other easy-to-maintain-crazy-fps games soon. I'll try out Rift at high-detail texture settings, but I'm not sure what fps I can maintain; it's very demanding (not optimized well, nor coded very well for dual GPUs, unfortunately). I still have GW2 and BL2 too, and a backlog of Steam games with stuff like Darksiders, Dead Space, Mass Effect, etc. :b
 
Try these steps:
1. Turn off VSYNC
2. Turn off mouse smoothing (do not use software-based mouse smoothing)
3. Use a gaming mouse set to 1000 Hz. This has much less input lag!
4. Set the hardware mouse sensitivity VERY HIGH. (far too fast for Windows)
5. Set the game mouse sensitivity VERY LOW. (usually 5%; compensates for the crazy-high hardware sensitivity)
6. Silky smooth (even better than software-based mouse smoothing but without the lag)

My mouse is not floaty at all during LightBoost.
It doesn't feel floaty even with VSYNC on; I prefer VSYNC ON when playing solo (VSYNC OFF for online competition).
If your mouse is still floaty after following these instructions, then something else is causing enough input lag to cause the mouse to be floaty, or it's a certain (specific) videogame.
LightBoost does add a very minor amount of input lag, but I don't feel it at all (no floaty mouse pointer here). It's generally insignificant, and the lack of motion blur (and improved reaction time) massively outweighs the slight added input lag. One professional gamer remarked that he prefers zero motion blur over an additional frame of input lag.

I use a Logitech G9X, a gaming mouse. It has extra buttons that allow me to adjust its sensitivity, so I can make it crazy-high sensitivity within games, but change it back to normal when I exit the video game so the pointer isn't too fast outside the game.

Try temporarily turning SLI off, as well.

I do not enable vsync; I run a SteelSeries Sensei at 1000Hz polling. I run a mouse sensitivity of 1 in-game and adjust sensitivity on my mouse. My fps seems to be capped until I uncheck the Enable 3D checkbox, and then I get 160-200fps but it's still floaty. I will try turning off SLI and some other things and see.
 
I do not enable vsync; I run a SteelSeries Sensei at 1000Hz polling. I run a mouse sensitivity of 1 in-game and adjust sensitivity on my mouse. My fps seems to be capped until I uncheck the Enable 3D checkbox, and then I get 160-200fps but it's still floaty. I will try turning off SLI and some other things and see.
Sounds like you're following all the recommendations to eliminate the "floaty"/"laggy" mouse cursor. I would definitely like to know if SLI is affecting input lag. (I hear it can be a big contributing factor.)

Perhaps what is happening is that the "accumulated input lag" was already pretty high (somehow) and turning on LightBoost was the straw that broke the camel's back. In a good configuration for the whole input lag chain (input-to-software-to-display-pixels), LightBoost *seems* to add only about 10% (unconfirmed) of the input lag of the entire chain, from keypress/button press to the actual display reaction seen by the eye.

Ars Technica has a great article on the whole chain of input lag (though it does not cover LightBoost). High-speed 1000fps footage is included. It's amazing how much input lag accumulates from everything -- totaling more than 50ms -- contributed by every little piece of the chain (software, mouse, GPU, cable, display, etc.). No wonder that, with enough weak links in the chain, we start to feel the lag. Sometimes a quality improvement (e.g. elimination of motion blur, smoother mouse movements, less GPU microstutter, etc.) improves reaction time enough to outweigh a minor input lag disadvantage (but it's highly contentious/subjective whether X outweighs Y, etc.).

I have also heard (unconfirmed rumors) that configuring the computer to keep LightBoost turned on all the time, so that games launch in 2D (without hitting Ctrl+T), reduces input lag further. I also noticed that running games in fullscreen mode seems to help, as windowed mode adds a little input lag (due to Windows 7/8 compositing). Somehow, for some people, enabling LightBoost adds a bigger input lag penalty than for others -- I'm not sure why; this merits more investigation.
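
To illustrate the "whole chain" idea from the Ars piece, here's a toy model -- every number below is a placeholder I made up for illustration, not a measurement from the article or from any hardware:

```python
# Toy model of input lag accumulating along the whole chain.
# Every figure is an illustrative placeholder, not a measurement.
chain_ms = {
    "mouse polling (1000Hz, average)": 0.5,
    "game engine / software":          15.0,
    "GPU render + frame buffering":    20.0,
    "display processing":              10.0,
    "pixel response / scanout":         8.0,
}

for stage, ms in chain_ms.items():
    print(f"{stage:32s} {ms:5.1f} ms")
print(f"{'total':32s} {sum(chain_ms.values()):5.1f} ms")  # > 50 ms total
```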
 
That answers it for me. Blur really shouldn't be defined by a single, simple cel-shaded object anyway, and I remind people of that quite a bit. It is a good testing tool, but it can give a false impression of just how bad an entire viewport of a game looks when it is blurred during FoV movement (even worse in modern, very-high-detail games).
In certain cases, blur is actually very useful. (I am not 100% anti-blur)
-- Motion blur is great if used artistically (e.g. as a "injure effect" or a "nitro turbo effect", or a "temporary powerup" effect.)
-- Motion blur is great for hiding judder/stutters at lower framerates (e.g. 37fps motion-blurred looks nicer than 37fps without motion blur)
-- Motion blur is great for hiding undesirable stroboscopic effects (e.g. wagon wheel effects).
-- Flicker-free displays (no strobing) are easier on the eyes for many people, even though they come with a motion blur penalty.

However, motion blur can interfere with enjoyment of fast-action FPS games:
-- If your GPU/software/framerate is no longer a limiting factor, and you can play at fps=Hz (120fps@120Hz), then judder concerns are eliminated...
-- Zero motion blur gives you "high-definition graphics during motion"; fast motion is as perfectly clear as stationary images. Better image quality during fast 180-degree flick turns, circle-strafing, high-speed low passes (e.g. helicopter in BF3), shooting moving targets while running/strafing, etc.
-- You're letting your human eyes do motion blur naturally; GPU and display no longer a limiting factor in motion blur.
-- You have faster reaction time without motion blur, since you can identify enemies more quickly while moving (unless you're bothered by flicker or stroboscopic effects, or never had the habit of tracking moving objects on a display, e.g. you never gamed on a CRT and got used to common LCD gaming styles).

If I'm walking down a mountain valley trail, or better yet moving at speed on a mount or vehicle in some game, and do a fast pan (not a flick/blink) around 90 degrees left and back to forward, the entire scene, i.e. the entire viewport, is going to blur no matter what the resolution is.
Unless it's a CRT or LightBoost -- then you've practically eliminated the remaining sources of perceived motion blur that are related to display limitations.

At this point, any motion blur is naturally added by your eye/brain. Fast 180 flicks will have no more motion blur than real life -- compare a 180-degree turn in-game with a 180-degree turn in real life at the same pan speed (e.g. walking down a hall, then spinning around). With a CRT or LightBoost running at full fps=Hz, the display is no longer the limiting factor in motion blur; it no longer forces additional motion blur via artificial means or technological limitations.

Minor quirk: since real life does not flicker, there can be minor stroboscopic effects (wagon-wheel effects, phantom-array effects), but 120Hz flicker is already far better than a 60Hz CRT (and far less eyestrain than shutter-glasses 3D), so this is not a significant factor for most gaming. (Mind you, theoretically it would be nice to have 240fps@240Hz, to satisfy the few people who get headaches from 120Hz flicker.)

Btw, I did test my A750D and was able to get up to a 17 in PixPerAn using Blur Busters' Samsung instructions. At 18, the afterimage shadow was superimposed on top of the original letters, which made it too hard to read at that speed.
That's the crosstalk between refreshes (the sharp trailing afterimage, non-blurred). It also affects 3D operation (leakage between left/right eye). Some panels are better than others at this. Most LightBoost monitors seem to be better at it (with the sole exception of the VG278HE, which apparently has a nasty afterimage effect; the VG278H is a bit better). I think the 23"/24" panels do better than the 27" panels in terms of crosstalk, as some have gone up to a readability score of 25 on the Samsungs.

The older Samsungs aren't nearly as crosstalk-free as some new 1ms panels. There is virtually zero afterimage shadow on the new 1ms panels (BenQ XL2411T and ASUS VG248QE). I'm actually impressed by how much of a difference the "1ms" stuff makes for LightBoost in these PixPerAn tests -- I own both the BenQ XL2411T (1ms) and the ASUS VG278H (2ms), and if I had to give up one monitor, I'm keeping the smaller BenQ XL2411T (calibrated for fixed colors). In PixPerAn, I can't see afterimages or motion blur even at fast PixPerAn movements; I can do PixPerAn speed 30, and I have to move my head *and* eyes to track that fast. If I increase contrast to maximum and use black-on-white, there's a really, really faint afterimage if I look closely, but I can't see it from arm's length from the BenQ XL2411T, or when I reduce contrast.

Still, the Samsung's strobe backlight gives you an excellent demonstration of how a strobe backlight eliminates motion blur, even though the Samsung's version has its input lag disadvantages (and the crosstalk).
 
Yep. I'm just holding out for a later nVidia GPU upgrade before I buy a LightBoost 2 monitor -- Titan/GTX 780, depending on benchmarks. It's nice to get an idea of what strobed-backlight blur reduction (though not elimination, in this case) looks like on an LCD, though, like you said. I also still have an FW900 CRT, by the way, so I know the difference full motion clarity makes. Looking forward to LightBoost 2 zero blur later this year.
 