ASUS/BENQ LightBoost owners!! Zero motion blur setting!

If you can, try to follow the full LightBoost HOWTO on your laptop -- and enable 3D Vision on your monitor. Not all laptops can output 100-120 Hz to an external 120 Hz monitor. But once this is done, it remains an easily re-enabled feature (via CRU.exe, on either nVidia or Radeon) until the monitor is unplugged.

Success rate will be much higher from a desktop GeForce. But please report back.

I use a 120 Hz monitor at work, so it's capable of that. Unfortunately, I somehow don't have a DisplayPort cable lying around, and there is no DVI output on the laptop. Garrrgggggggg, this will have to wait until tomorrow.

My intention was to just do the CRU method on the laptop instead of the INFs and all that stuff....shouldn't that work?
 
One way to get it working with AMD cards is to use a cheap nVidia card (like an 8400GS) and enable LightBoost using the standard LightBoost HOWTO guide.
Once that's done, you can do a fresh install of an OS or whatever, use your Radeon card, and enter the correct custom resolution into CRU.exe.
This worked for me. During this time I'm pretty sure I never unplugged my monitor from the mains, so this must have saved the LightBoost toggle.

I got a (borrowed) VG248QE yesterday evening (mine is still on backorder from Amazon), so I've spent the time since then playing with it. As has been mentioned, the key is getting the monitor into LightBoost mode one time and then not unplugging it (you can turn it off, just don't remove power). If the steps required to get it into LB mode are painful, then buying a cheap UPS to plug the monitor into might be worthwhile... get it into LB mode once (with a borrowed nVidia card, or even another PC if needed) and then just never unplug it.

I originally did the full procedure for getting LB going: the modded 278 INF file, the nVidia 3D Vision drivers that I normally don't install, the reg file, then turned LB on in the nVidia control panel, tested, and turned it back off... LB stuck on OK. Then I uninstalled the nVidia 3D Vision drivers (which automatically removed the registry entries from the reg file) and uninstalled the monitor (with the 278 INF). Then I installed the VG248QE INF (modded to remove the references to the .cat and .icm files, since I was going to do my own calibration file). I rebooted and power cycled a few times. Then I went into Create Custom Resolution, changed Vertical Total pixels to 1149 as Vega recommended in his post, clicked "Test", and bam... into 3D mode. Unplug the monitor, though, and you can't toggle into 3D mode just by doing the Custom Resolution trick.
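For anyone curious what the Vertical Total tweak actually does to the signal timing, here's a rough sketch of the math. The horizontal total of 2080 is an assumption for illustration only; read the real value from your own CRU/Custom Resolution dialog:

```python
# Rough sketch of the timing math behind the "Vertical Total 1149" trick.
# H_TOTAL = 2080 is an assumed value (check your own CRU readout); the
# vertical total of 1149 is the value Vega recommends.

H_ACTIVE, V_ACTIVE = 1920, 1080
H_TOTAL = 2080          # assumed for illustration
V_TOTAL = 1149          # the LightBoost-triggering vertical total
REFRESH_HZ = 120

# Pixel clock the GPU must generate for this timing
pixel_clock_hz = H_TOTAL * V_TOTAL * REFRESH_HZ
print(f"Pixel clock: {pixel_clock_hz / 1e6:.2f} MHz")

# The enlarged vertical blanking interval this creates (extra time between
# refreshes, which is where a strobe backlight can flash)
vblank_lines = V_TOTAL - V_ACTIVE
line_time_us = 1e6 / (REFRESH_HZ * V_TOTAL)
print(f"VBI: {vblank_lines} lines ~= {vblank_lines * line_time_us:.0f} us")
```

The point of the oversized vertical total is that extra blanking time between refreshes, not the resolution itself.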

I've been using CRTs pretty much forever. My current one is getting old; I just recalibrated it recently and it's looking great again, but it's getting weak (85 cd/m^2 is as high as it'll go now, still OK for the room I'm in, which is just for gaming and TV), so its days are numbered. I tried an LCD back in early 2004 and it was terrible for the games I play; that's when I replaced the CRT I was using with my Mitsubishi 2070SB.

I just went back and checked the motion blur in the games that I found so horrible on LCDs back then, and surprisingly, the VG248QE is dang good even without LB turned on. The color accuracy sucks, though, even after calibrating the best I can with the monitor's controls (I'm used to a well calibrated display). Tweaking with the nVidia control panel can get it pretty decent (using HCFR and an X-Rite i1 Display 2), but then you have to use a utility of some type to make the tweaks stick in a lot of games, which is why I like to get my monitors as close to perfect as I can without relying on anything else. Things still look washed out with just tweaking the VG248, but adjusting it is a big improvement. Using the service menu in the display doesn't seem to help; tweaking the RGB gain and bias there doesn't seem to do anything.

So for the VG248QE... use LightBoost or not? The monitor still does a pretty darn good job even with it off. With LB off I can improve the blacks, since I can turn the brightness way down (and still easily get 100 cd/m^2 on the high side), so HCFR shows a decent post-calibration contrast ratio of about 850-900 without LB, and about 550 with LB on. Be careful not to turn contrast up too high, or you'll start clipping; it's easy to see by checking your greyscale with HCFR. And you can clip from turning up gain in the nVidia control panel, not just the LCD contrast.
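For what it's worth, that clipping check can be semi-automated from an HCFR-style greyscale sweep: if raising the stimulus near the top stops raising measured luminance, the contrast (or driver gain) is too high. This is a hypothetical sketch with made-up readings, not my actual measurements:

```python
# Hypothetical helper for spotting white clipping in a greyscale sweep.
# The luminance readings below are invented for illustration.

def find_clipping(stimulus, luminance, rel_tol=0.01):
    """Return stimulus levels where luminance stopped increasing meaningfully."""
    clipped = []
    for i in range(1, len(stimulus)):
        prev, cur = luminance[i - 1], luminance[i]
        if cur - prev <= rel_tol * cur:   # this step produced (almost) no gain
            clipped.append(stimulus[i])
    return clipped

# Made-up 10%-step sweep: the final step barely changes -> clipping at white
grey = [0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
cdm2 = [0.12, 1.1, 3.9, 8.6, 15.6, 24.9, 36.8, 51.2, 68.3, 99.5, 100.1]

print(find_clipping(grey, cdm2))
print(f"Contrast ratio ~= {cdm2[-1] / cdm2[0]:.0f}:1")
```

With these invented numbers the 100% step is flagged as clipped, and the white/black ratio lands around the mid-800s, in the same ballpark as the measurements above.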

Many thanks to both Mark and Vega, and others who've been contributing to getting this all figured out. I'm guessing BenQ and Asus will eventually add an LB-type setting that can be turned on from their monitors. It's too bad we don't have access to the monitor firmware, or I think we could actually do it ourselves.
 
So for the VG248QE, use LightBoost or not... the monitor still does a pretty darn good job even with it off. With LB off I can improve the blacks since I can turn the brightness way down, so HCFR shows a nice post-calibration contrast ratio of about 900 without LB, and about 550 with LB on. Be careful not to turn contrast up too high, or you'll start clipping; it's easy to see by checking your greyscale with HCFR, and you can clip from turning up gain in the nVidia control panel, not just the LCD contrast.
Thanks for reporting contrast measurements with and without LightBoost! This sounds about right:

The dynamic range of the LCD is intentionally reduced in order to precisely optimize the pixel response curve for the strobe backlight. Using a higher black floor and a lower white ceiling makes response-time-acceleration overshoots easier to compensate for. This helps prevent left/right eye crosstalk, which requires extremely accurate pixel responses.

Future LCDs will no doubt improve on this, perhaps even permitting the full dynamic range of the panel (full LCD black and full LCD white). Until then, we sacrifice some color quality to get the zero motion blur effect.
 
I use a 120 Hz monitor at work, so it's capable of that. Unfortunately, I somehow don't have a DisplayPort cable lying around, and there is no DVI output on the laptop. Garrrgggggggg, this will have to wait until tomorrow.

My intention was to just do the CRU method on the laptop instead of the INFs and all that stuff....shouldn't that work?

Monitor: Asus VG248QE

OK, I used a Dell Precision M70 laptop equipped with an nVidia Quadro 1000M graphics card, updated to the latest drivers. I had my AMD machine connected to DVI and the laptop to HDMI, went through the instructions listed on Blur Busters for the laptop machine, used the CRU method on the AMD machine, and BAM: LIGHTBOOOOOOOOOOOOOOOOOSSSSSSSSSSSSSTTTTTTTTTTTTTTTTT. Be back tomorrow with raw dick fapping to BF3 in LightBoost. This shit is legit.
 
I use a 120 Hz monitor at work, so it's capable of that. Unfortunately, I somehow don't have a DisplayPort cable lying around, and there is no DVI output on the laptop. Garrrgggggggg, this will have to wait until tomorrow.

My intention was to just do the CRU method on the laptop instead of the INFs and all that stuff....shouldn't that work?
We've since discovered that it only works if you've initialized the monitor into 3D Vision at least once before in its lifetime (and haven't unplugged the monitor since).
Possibly because it's an nVidia vendor lock-in: a feature needing to be unlocked by nVidia driver initialization.
 
I've been experimenting with LightBoost on my VG248QE, and comparing it to 60 Hz and 120 Hz without LB. I also have another 120 Hz monitor (a Planar) that I used for comparison.

At this point I would say that LightBoost makes a big difference in PixPerAn, especially with the chase test or the streaky pictures test. However, LB was less dramatic when I tested in Half-Life 2, maybe because moving textures are not the same as the high-contrast images in PixPerAn.

I made a "spin in place" demo for HL2 Episode 1, which I found very helpful in seeing the difference between 60 Hz, 120 Hz, and 120 Hz + LB. I think the biggest jump is from 60 Hz to 120 Hz.

Here is the demo file with a brief readme. Pay attention to the lettering on the phone booth if testing with LB.
HL2 Ep1 Spin Test

To sum it up: 60 Hz to 120 Hz, big difference (at least with my monitor); 120 Hz to 120 Hz + LB, less of a difference in game, but noticeable if you know how to test. Frankly, I'm not sure I could tell simply by starting a game and moving around.

I'm glad to have the LightBoost option, and it has caused me to pay attention to motion performance.
 
So for the VG248QE... use LightBoost or not? The monitor still does a pretty darn good job even with it off.
This is an excellent question, and it goes to the heart of the matter. The VG248QE at 120 Hz is much better than 60 Hz. There is clearly an added benefit with LB over 120 Hz, but not as dramatic as the jump from 60 Hz to 120 Hz (in my opinion). Whether LB justifies the tradeoffs will be a choice for the user. I'm glad to have the option. I also wonder if there is room for a bigger improvement over what LB is doing now; if so, it could become even more compelling.
 
Monitor: Asus VG248QE

OK, I used a Dell Precision M70 laptop equipped with an nVidia Quadro 1000M graphics card, updated to the latest drivers. I had my AMD machine connected to DVI and the laptop to HDMI, went through the instructions listed on Blur Busters for the laptop machine, used the CRU method on the AMD machine, and BAM: LIGHTBOOOOOOOOOOOOOOOOOSSSSSSSSSSSSSTTTTTTTTTTTTTTTTT. Be back tomorrow with raw dick fapping to BF3 in LightBoost. This shit is legit.

Sigh... can you please explain what "laptop instructions" exist on Blur Busters? I don't see any at all.

And I tried to install this on my 570M/MSI 1761 laptop through HDMI, but it only allows a 60 Hz refresh rate, and there is no option for stereoscopic 3D anywhere in the control panel. So what did I do wrong? Yes, the VG278 override is installed.
 
I don't know if I should get the VG248QE or the XL2420T. My main concern about the VG248QE is the low gamma at higher refresh rates. I hate washed-out gamma more than anything, and the TN gamma shift is just going to make that worse. I need a darker gamma to compensate. Adjusting the colors through the video card is not an option, because that doesn't work for most games.
 
Ectoplasmic - to me, the 50% blur reduction of 120 Hz non-LightBoost ends up more like a full soften blur, rather than the completely-outside-the-lines mess that 60 Hz LCDs produce. It still blurs out all texture detail, highly detailed objects, depth via bump mapping, and other shader effects, across the entire viewport, during FoV movement. On very simply textured older games, the textures are somewhat "muddy" and basic already, the objects are less detailed, and there is less shader detail if any, so the effects are less pronounced than in the highest-detail games. Similarly, the cel-shaded car is a poor detail sample outside of the actual text shown on the flag and the fact that it can show a trail image in simple camera shots (though that is not the WYSIWYG blur your eyes see, which would be a blurry mess filling the extent from the leading object edge to the end of the trail image). In actual use, FoV blur would blur the entire viewport, not just a single object (unless you were standing still with a single object passing by at high speed).

Another thing to keep watch on is that the fps remains over the refresh rate in both cases. As long as you keep the fps over the 120 Hz (or 144 Hz) refresh rate, you are still going to get much smoother and more accurate motion tracking, outside of any blur considerations. So at 120 Hz you get the greatly increased motion benefit, a full jump from 60 Hz, in both LB mode and non-LB mode. A game like HL2 starts from a much simpler texture and detail level, so halving the 60 Hz blur outside of LightBoost leaves blur that is less of a focal "distance" from the origin in a way, since the origin is not super fine to begin with.

Even though L4D2 is on the Source engine, the textures seem a step up from the old HL type, and I can tell the difference between 60 Hz and 120 Hz non-LightBoost, and even between 120 Hz non-LightBoost and my 750D's inferior-to-LightBoost frame-sequential mode. TF2 has its own Pixar-like / "Incredibles" soft cartoon 3D stylized look to it. A few of the games I played that were most drastically affected by FoV blur were Rift at high+ settings and The Witcher 2, both of which have very high-detail textures and shaders, including armors, creature skins, and outdoor wilderness landscapes. I'm assuming Skyrim would be similar, especially with high-detail mods.
 
Last edited:
Guys, no one can help me?

I managed to get the laptop to output 120 Hz to the VG248 by using a DVI-to-HDMI adapter, but only at 800x600, and the nVidia control panel STILL doesn't have anything for stereoscopic 3D! How did the other person get this working on a laptop?? Even the nVidia website SAYS that the 570M supports 3D Vision! But there's no option for it! And yes, I DID install the registry file.

So how do you get this working on a laptop so I can swap the monitor back to my desktop? How did the above person get this working on a laptop and get the Nvidia drivers to actually have the option?

And it's an MSI 1761 with a 570M, fully supported by 3D Vision... (or so the nVidia page claims... hell, it's right there: http://www./hardware/notebook-gpus/geforce-gtx-570m/features )
 
Don't know if this will help, but make sure that the external monitor is the only active monitor, and then perhaps reboot. You might also want to try a dual-link DVI-to-HDMI adapter, in case that makes a difference. Process of elimination; grasping at straws, perhaps. I'm assuming the Asus doesn't have a DVI input? The fewer adapters, the better.
 
Guys, no one can help me?

I managed to get the laptop to output 120 Hz to the VG248 by using a DVI-to-HDMI adapter, but only at 800x600, and the nVidia control panel STILL doesn't have anything for stereoscopic 3D! How did the other person get this working on a laptop?? Even the nVidia website SAYS that the 570M supports 3D Vision! But there's no option for it! And yes, I DID install the registry file.

So how do you get this working on a laptop so I can swap the monitor back to my desktop? How did the above person get this working on a laptop and get the Nvidia drivers to actually have the option?

And it's an MSI 1761 with a 570M, fully supported by 3D Vision... (or so the nVidia page claims... hell, it's right there: http://www./hardware/notebook-gpus/geforce-gtx-570m/features )

Did you install the nVidia 3D Vision drivers? And you might need the modded INF file; not sure if you've done that.
 
The problem is I don't even get the prompt to install the drivers in the driver package :( All it has is the display driver, HD audio, control panel, etc.

There's also supposed to be an nVidia stereoscopic service as well; nothing showing...
Apparently there's a big forum thread on the nVidia forums about people who didn't have the 3D option on laptops after upgrading from the original OEM-supplied drivers...

I'm browsing a few forums... read something about an emulator that tricks the driver, while installing, into thinking you have a 3D-capable device (emitter). Going to try this the hard way...
 
Ok this is utterly and completely *BS* to the EXTREME.
I'm using the monitor's dual-link DVI plug connected to a DVI-to-HDMI adapter (the DVI adapter is dual link, but I don't know whether the HDMI cable is 1.3 or 1.4).

I FINALLY managed to get the nVidia laptop installer to show the 3D Vision drivers as an installation option. I'm not sure HOW I managed it, but I think it was some combination of having the 3D emulator that I found on 3dvision-blog.com and the monitor running at 800x600 @ 120 Hz.

OK, first of all, while doing this, I entered that custom resolution of 1920x1080, because I could only display 60 Hz at the native resolution; 120 Hz was only available at 800x600. If I used the monitor's own HDMI connection to the laptop's HDMI, it would show "Out of range." But whatever... 120 Hz was working at 800x600 with the DVI-HDMI adapter.
I then tried that custom resolution of 1920x1080 with the nVidia custom resolution maker.
Then something bizarre happened.

The monitor popped up an error message on the OSD, saying "Timings not supported; please use a dual-link DVI cable" and telling me to press MENU.
The screen SHOWED what seemed to be an 800x600 image stretched out to 1920x1080, with half the screen off to the right. The OSD said it was at 1920x1080 at the correct kHz and 120 Hz, but it looked funny. I switched it back to 800x600 again.

Then I INSTALLED the drivers, and made sure the 3D Vision driver option was checked, and... bam... NOTHING. There was NO trace of the service, no trace of anything having been installed except the display driver.

I then rebooted and tried reinstalling, and still nothing.

On a hunch, I ran the 3D Vision emulator again, and then ran the nVidia 3D Vision installer (already unpacked in c:\Nvidia); it ran something but didn't prompt anything.

STILL nothing in control panel.

I edited the registry and added a few variables that the 3D Vision installer had created, then ran the 3D Vision emulator again, and all I saw was an empty nVidia 3D Vision folder. STILL nothing in the nVidia control panel for it.

Rebooted, ran the installer AGAIN, and suddenly 3D Vision showed up!
But when I clicked "Enable 3D Vision", nothing worked... the monitor was not in LightBoost mode.

Then I knew what to do.
I forced that custom timing and pressed Apply; the monitor gave that OSD error of "please use a dual-link DVI cable", but THIS time the screen had a weird pink tint on it.

I pressed Menu twice, and it said "3D mode" in red on the OSD...

First thing I noticed was that it felt like I was back on a CRT again.

When I unplugged the monitor from the laptop, the nVidia control panel reverted back to its old self...
They DEFINITELY don't want you to use 3D except on fully supported hardware...

BUT HOLY god...all this fucking work...

Anyway, it's running on my Radeon right now. Going to test it in some Black Ops 2...
 
I've been trying to use the search function but am not finding this readily. I seem to remember someone, probably Mark R., listing a bunch of display types and the % blur reduction of each, maybe even shown as "__ times clearer", I forget. He included 60 Hz, plasma, perhaps 120-133 Hz overclocked Korean IPS, 120 Hz TN, 144 Hz TN, LightBoost enabled in 2D at 100 Hz/100 fps+ and 120 Hz/120 fps+, and CRT. I'm not sure if it was even this thread or one of the other individual threads that had LightBoost questions posed in them. :rolleyes:

I know 60 Hz is very blurry and that 120 Hz TN is about "50%". I'm not sure what the plasma and some of the other figures were. There was also this quote of blur trail lengths:
PixPerAn chase test, 960 pixels per second:
60Hz -- blur trail length of about 16 pixels
120Hz non-strobed -- blur trail length of about 8 pixels
120Hz LightBoost strobe backlight -- blur trail length of ~1 pixel (CRT sharp)


Perhaps this post is what you're looking for.
 
I don't know if I should get the VG248QE or the XL2420T. My main concern about the VG248QE is the low gamma at higher refresh rates. I hate washed-out gamma more than anything, and the TN gamma shift is just going to make that worse. I need a darker gamma to compensate. Adjusting the colors through the video card is not an option, because that doesn't work for most games.
Here's some bad news: there is a forced LightBoost gamma on all LightBoost monitors while in LightBoost mode. There are different gamma settings, but I've never seen a low-gamma LightBoost mode. The only way to adjust that out is via either the nVidia Control Panel or the video game (fortunately, you can usually do either).

Unfortunately, we can't go by monitor-review gamma, because gamma settings are totally different during LightBoost mode. So just get the best 1ms monitor: either the VG248QE or the XL2411T (which has very slightly clearer LightBoost than the XL2420T). Both monitors will have very crappy color.

If you want the best color, you need to recalibrate the LightBoost mode with a colorimeter, using the nVidia Control Panel. Another idea is to go into your monitor's service menu and see if you're able to adjust the LightBoost picture that way. (Nobody has attempted this before... yet.)
 
OK, I used a Dell Precision M70 laptop equipped with an nVidia Quadro 1000M graphics card, updated to the latest drivers. I had my AMD machine connected to DVI and the laptop to HDMI, went through the instructions listed on Blur Busters for the laptop machine, used the CRU method on the AMD machine, and BAM: LIGHTBOOOOOOOOOOOOOOOOOSSSSSSSSSSSSSTTTTTTTTTTTTTTTTT. Be back tomorrow with raw [bleep] fapping to BF3 in LightBoost. This shit is legit.
They DEFINITELY don't want you to use 3D except on fully supported hardware...

BUT HOLY god... all this [bleep]ing work...

Anyway, it's running on my Radeon right now. Going to test it in some Black Ops 2.
Falkentyne and chuckinbeast, welcome aboard with Radeon LightBoost. At least you don't have to do it again, provided you don't unplug your monitor. Powering off your monitor should be OK (although I've heard the older BenQ XL2420T "forgets" the LightBoost setting, while the newer VG248QE does not).

Eventually, hopefully, ToastyX and I will break through with the necessary tweak to force LightBoost enabled regardless of the graphics product used. (Keep tuned.)

Also, I'd like you to be our guinea pig:
-- Does the Service Menu let you adjust the LightBoost color?
 
Even though L4D2 is on the Source engine, the textures seem a step up from the old HL type, and I can tell the difference between 60 Hz and 120 Hz non-LightBoost, and even between 120 Hz non-LightBoost and my 750D's inferior-to-LightBoost frame-sequential mode. TF2 has its own Pixar-like / "Incredibles" soft cartoon 3D stylized look to it. A few of the games I played that were most drastically affected by FoV blur were Rift at high+ settings and The Witcher 2, both of which have very high-detail textures and shaders, including armors, creature skins, and outdoor wilderness landscapes. I'm assuming Skyrim would be similar, especially with high-detail mods.
Interesting note that the motion blur reduction benefits high-detail content more than low-detail.
That's pretty much true and makes sense; it works best with your textures set to maximum, and when your GPU is fast enough (with enough memory) to keep all the textures flying past at 120 frames per second.
 
Falkentyne and chuckinbeast, welcome aboard with Radeon LightBoost. At least you don't have to do it again, provided you don't unplug your monitor. Powering off your monitor should be OK (although I've heard the older BenQ XL2420T "forgets" the LightBoost setting, while the newer VG248QE does not).

Eventually, hopefully, ToastyX and I will break through with the necessary tweak to force LightBoost enabled regardless of the graphics product used. (Keep tuned.)

Also, I'd like you to be our guinea pig:
-- Does the Service Menu let you adjust the LightBoost color?

No, but the "PanelSSC" is set to 0 when LightBoost is on and 30 when it's off (it's tweakable in the service menu, but I don't know what it does or changes...).
 
Interesting note that the motion blur reduction benefits high-detail content more than low-detail.
That's pretty much true and makes sense; it works best with your textures set to maximum, and when your GPU is fast enough (with enough memory) to keep all the textures flying past at 120 frames per second.

I think he was getting the "wow" factor from both the better motion tracking and smoothness of 120 Hz, combined with the 50% blur reduction outside of LightBoost mode. When he bumped it up to near-100% blur reduction using LightBoost strobing on a simpler (perhaps even borderline "muddy") textured game, it seemed like a less drastic change than 60 Hz to 120 Hz, since the motion-tracking increase (frame-wise) remained the same.

...Blurring your viewport during movement due to display inadequacies all these years on LCDs has been just sad. (Motion problems bother me on LCD TV video content too, by the way; waving hands and other very high-speed content are very ugly and obvious.) I have had an FW900 graphics-professional CRT at my desk and an XBR960 34" widescreen HDTV (with HDMI input) in the living room as additional displays for years. For me, blur bothers me most in the highest-detail games, but it still pulls my eyes' focus away every time, even in simpler games. It's like blurring out something I'm reading that my eyes have dialed-in focus on. Larger, easier-to-read "text", so to speak, being blurred in a low-detail game is less eye-wrenching than losing the much finer, smaller "reading" of highly detailed objects, textures, depth via bump mapping, and shaders to the full soften blur that 120 Hz non-LB blur reduction seems to leave.
 
Drum roll....



You have been Blur Busted!

The new logo of the Blur Busters website (the home of the LightBoost HOWTO, of course!)
 
edit: nice logo :)

<snip>

PixPerAn Tests on LightBoost monitors (I own both BENQ XL2411T and ASUS VG278H)

baseline - 60 Hz mode (16.7ms frame samples)
50% less motion blur than 60 Hz (2x clearer motion) - TN 120 Hz mode (8.33ms frame samples)
60% less motion blur than 60 Hz (2.4x clearer motion) - TN 144 Hz mode (6.94ms frame samples)
85% less motion blur than 60 Hz (7x clearer motion) - TN 120 Hz mode with LightBoost set at 100% (2.4ms frame strobe flashes)
92% less motion blur than 60 Hz (12x clearer motion) - TN 120 Hz mode with LightBoost set at 10% (1.4ms frame strobe flashes)
Versus:
40% less motion blur than 60 Hz (1.7x clearer motion) - IPS overclocked to 120Hz (8.33ms + excess pixel persistence) -- Test done by Vega

This clearly shows that not all 120 Hz is made the same: there is 7x less motion blur on a LightBoost-enabled 120 Hz TN than on an overclocked 120 Hz IPS. Stroboscopically shortening the individual refreshes, without increasing the refresh rate, makes a massive difference in motion blur elimination for people who are sensitive to motion blur and want the "CRT silky smooth" effect, as long as your eyes are comfortable with a CRT above 85 Hz.
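The percentages and "x clearer" figures above follow directly from the frame sample times; a quick sketch to check the rounding (sample times taken from the list above; the quoted percentages are rounded to the nearest 5-10%):

```python
# Sanity-check: motion blur on a sample-and-hold (or strobed) display scales
# with how long each frame is visible. Baseline is the 60 Hz 16.7 ms hold.

BASELINE_MS = 16.7

modes = {
    "120 Hz TN":              8.33,
    "144 Hz TN":              6.94,
    "120 Hz LightBoost 100%": 2.4,
    "120 Hz LightBoost 10%":  1.4,
}

for name, t in modes.items():
    reduction = 1 - t / BASELINE_MS     # fraction less blur vs 60 Hz
    clearer = BASELINE_MS / t           # "x clearer" factor
    print(f"{name}: {reduction:.0%} less blur, {clearer:.1f}x clearer")
```

This reproduces the 50% / 2x, ~85% / 7x, and 92% / 12x figures (144 Hz computes to ~58%, which the post rounds to 60%).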


Thanks for tracking down that post, Solhokuten. Is this still accurate information, or have the 1ms Lighboost2-backlight Asus and BenQ models been found to give up to 96% or so blur reduction? Also, where do the 10x-backlight-strobing-per-Hz plasma and FFD plasma types fit in? I know they only have 60 Hz input on the back, so they completely lack the increased motion-tracking accuracy and the more recent/current action frame "slice" data being shown, but they do have a blur reduction effect that is better than a 60 Hz LCD, which I am curious about. Where do the better models of those plasma types (in general) fit in that blur-reduction hierarchy? It comes up in conversation in threads often.

I also recall some CRT Hz equivalence/comparison info, 60 Hz CRT and 85 Hz CRT being compared to higher Hz on LCDs. I can try to dig that part up later tonight. There are also pixel trail lengths at extreme fps reported, among other measurement data. It would be nice to have all this in one table somewhere, ordered by motion improvement as logically as possible. I'll post it to my webspace/dump site eventually once I get all the info; once I do, it's easy to reference from the local mirror of my webspace or via bookmarks.
 
I would like to clarify some things in your post:
elvn said:
I also recall some CRT Hz equivalence/comparison info, 60 Hz CRT and 85 Hz CRT being compared to higher Hz on LCDs.
Motion blur at fps = Hz is almost exactly equivalent across all CRT refresh rates, because the CRT phosphor decay is the same.
However, higher rates can look flicker-free and smoother, so there are some marginal benefits.
But needless to say, 60fps@60Hz versus 120fps@120Hz is MUCH more apparent on non-LightBoost LCDs than on CRTs.

Motion blur is dictated by the length of the visible refresh, which is NOT necessarily equivalent to the total refresh length:
60 Hz CRT frame sample length = 1-2ms phosphor decay
120 Hz CRT frame sample length = 1-2ms phosphor decay
60 Hz LCD frame sample length = 16.7ms sample-and-hold
120 Hz LCD frame sample length = 8.3ms sample-and-hold

That's why people say 60fps@60Hz on a CRT has less motion blur than 120fps@120Hz on a traditional LCD.
Extra frames are still good: less flicker and more stroboscopic smoothness (fewer wagon-wheel effects, fewer phantom-array effects, fewer stepping effects). But for tracking moving objects at fps = Hz at the same speed, there is virtually no difference in motion blur if the frame sample length is the same across two different refresh rates. (You may have heard of "black frame insertion", the black gap between visible refreshes. LightBoost is simply doing the equivalent of approximately 12:1 black frame insertion; it can be conceptually visualized as 11 or 12 black frames for every 1 visible frame, and that provides 12x clearer motion.)

There's nothing we can do about eliminating all stroboscopic effects (until we invent 1000fps@1000Hz displays with 1-millisecond sample lengths and no flicker), so we're stuck with flicker (CRT style, plasma style, or LightBoost style) as the best method of eliminating motion blur for video games (for those of us sensitive to motion blur), for the time being. There's no practical way to get 1-millisecond frame sample lengths without flicker (stroboscopic effect) while still getting fps = Hz (a necessary ingredient), unless we want to do frame interpolation, which is not good for video games due to input lag.

LightBoost is hardware-locked to working in the 100-120 Hz range. Otherwise, we'd easily get the CRT 60fps@60Hz effect with LightBoost, without needing software-based black frame insertion to bypass the hardware limitation of not doing 60 Hz strobes.
1ms Lighboost2
I made a mistake a few months ago: it is LightBoost, not LightBoost2. My confusion was that 3D Vision doesn't have LightBoost, but 3D Vision 2 has LightBoost. I confused the sequel (of 3D Vision) with LightBoost, which is simply an added feature, not a sequel to a feature. So it's "LightBoost", not "LightBoost2". My apologies.

Asus and BenQ models been found to give up to 96% or so blur reduction?
92%, not 96%.
This is because of the strobe length of 1.4 milliseconds.
1.4 milliseconds is 92% shorter than 16.7ms sample-and-hold of a 60 Hz refresh.
It's confirmed that the PixPerAn motion blur trail length is exactly proportional to this, and scientific papers also confirm the equivalence (scroll down for the Academic links).

Also, where do the 10x-backlight-strobing-per-Hz plasma
Actually, it's just one strobe per refresh: one 1.4 ms strobe, occurring 120 times per second. Larger black periods between refreshes mean less motion blur.
If you did multiple strobes, you'd get the PWM repeated-image artifact, which you DO NOT want:

BlurLED.png

(Source: TFT Central)

This is because your eyes are always tracking to follow moving objects: your eyes are in a different position at the beginning of a refresh than at the end of it. That means PWM flashes (multiple strobes per refresh) cause a multiple-image trail, because your eyes are in a different position for each flash. A camera can capture this as well. To do motion blur elimination, you need a precisely synchronized single strobe per refresh, between refreshes (after the pixel persistence of the last refresh has finished, but before the next refresh begins). That's why the strobe needs to occur during the pause between refreshes (the blanking interval, essentially). This made 3D possible for LCD, and automatically made zero-motion-blur LCDs possible (shattering the pixel persistence barrier). You can see that a 2007 LCD (high speed video) doesn't make a stroboscopic backlight practical, but the 2012 LightBoost LCD (high speed video) finally makes the stroboscopic backlight practical.
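To see why a single synchronized strobe matters, you can estimate the spacing of the repeated images from the eye-tracking speed and the strobe period (the numbers here are illustrative, matching the 960 px/s PixPerAn speed quoted earlier):

```python
# Why multiple strobes per refresh cause a repeated-image trail: the eye keeps
# moving between strobes, so each flash lands at a different retinal position.
# Numbers are illustrative, not measured.

TRACK_SPEED_PX_S = 960   # assumed eye-tracking speed (PixPerAn chase test)
REFRESH_HZ = 120

def ghost_spacing_px(strobes_per_refresh):
    """Distance the tracked image moves between consecutive strobes."""
    strobe_period_s = 1 / (REFRESH_HZ * strobes_per_refresh)
    return TRACK_SPEED_PX_S * strobe_period_s

# One strobe per refresh (LightBoost-style): one image per refresh
print(f"1 strobe/refresh: images {ghost_spacing_px(1):.1f} px apart")
# Three strobes per refresh (PWM-style): three closely spaced copies of the
# SAME frame -> the visible ghost trail in the TFT Central image above
print(f"3 strobes/refresh: copies {ghost_spacing_px(3):.1f} px apart")
```

With one strobe per refresh the copies are a full frame apart (which reads as motion); with multiple strobes per refresh, the same frame repeats a few pixels apart, which reads as a ghost trail.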

FFD plasma types fit in?
I've posted some info about this on the 2500 Hz FFD page at Blur Busters. Plasma discussion is rather complicated; a good explanation requires an understanding of how plasma does temporal dithering (or at least DLP temporal dithering, which is somewhat different but very similar). That's high-speed pulse-width modulation at the per-pixel level. What FFD does is focus all of it into a smaller time period, with larger black periods between refreshes.

Alas, despite the "2500 Hz" (theoretical 0.4 millisecond strobe), the plasma is still hamstrung by the 5 millisecond red/green phosphor decay (far longer than CRT), so there's still more motion blur on the 2500 Hz FFD than on LightBoost strobe backlights (2.4 ms and less). So we've got a case of an LCD panel that actually beat the best plasma in terms of motion blur!! (but only if LightBoost is enabled and you've got 120 fps @ 120 Hz).... I brought motion tests to Future Shop (owned by Best Buy) and found the 2500 Hz FFD plasma (Panasonic VT50) is one of the best plasmas I've seen for motion clarity. Unfortunately this mode is not very useful for computer use, because it uses special motion interpolation algorithms in its dithered/subfield refreshes (plasma subfields can benefit from that), and there is too much input lag. You have to switch to game mode, which kills most of the 2500 Hz FFD benefits, alas. But you still get better motion than most flat-panel LCD HDTVs (with maybe the sole exception of the Sony HX950's somewhat-game-compatible "Motionflow Impulse" mode).

I know they only have 60hz in the back so completely lack the increased motion tracking accuracy
Theoretically, LightBoost at 60fps@60Hz can be just as clear as 120fps@120Hz when tracking moving objects (even if not as stroboscopically smooth when you're not intentionally tracking them). The problem is that LightBoost strobes only function in the 100 Hz to 120 Hz range. And you also need fps matching Hz, so that's why we've got the enforced high-GPU requirements of LightBoost today.
On the other hand, 120 Hz is good because it eliminates issues related to flicker (for most people), it's far more stroboscopically smooth (fewer wagon-wheel-style artifacts, etc.), and it has less input lag.

and more recent/current action frame "slice" data being shown, but they do have a blur reduction effect that is better than a 60hz lcd that I am curious about.
Actually, it has more to do with the LENGTH of a visible refresh. Scientifically, the "Hz" is meaningless for the purposes of motion blur (though as a side effect, more Hz means shorter visible refreshes on traditional sample-and-hold displays). LightBoost doesn't operate at 60 Hz. (Though we've successfully tricked LightBoost into producing exactly the same zero-motion-blur effect at 60 Hz in the MAME arcade emulator, via a software-based black frame insertion trick -- see www.blurbusters.com/mame -- simply by blacking out every other 120 Hz refresh, so that every 60 fps frame gets exactly one strobe per frame, which is necessary for the zero-motion-blur effect.)
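The software black frame insertion trick mentioned above can be sketched as follows. This is a simplified illustration of the idea, not the actual MAME implementation:

```python
# Drive a 120 Hz display with 60 fps content: each game frame is shown for
# exactly one refresh, and the alternate refresh is blacked out, so every
# frame receives exactly one strobe -- the requirement for the zero motion
# blur effect at 60 fps.
def black_frame_insertion(game_frames):
    """Interleave a black refresh after every 60 fps frame for 120 Hz output."""
    refreshes = []
    for frame in game_frames:
        refreshes.append(frame)    # visible, strobed refresh
        refreshes.append("BLACK")  # blacked-out refresh: suppresses 2nd strobe
    return refreshes

print(black_frame_insertion(["frame0", "frame1"]))
# ['frame0', 'BLACK', 'frame1', 'BLACK']
```

The 60 fps stream becomes a 120 Hz stream where every second refresh is black, matching the one-strobe-per-frame rule described above.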

Where do the better models of those plasma types (in general) fit in that blur reduction hierarchy since it comes up in conversation in threads often.
The best plasmas are great, but they are hamstrung by the 5 millisecond red/green phosphor decay, so they've got an effective strobe length of approximately 5 milliseconds, which dictates their absolute limit in motion blur elimination.

I'll close this out by quoting part of the pcmonitors.info VG248QE review, which I can vouch for as an extremely accurate plain-English version of what's already been scientifically written:
pcmonitors.info said:
A clever Canadian chap called Mark Rejhon (who runs The Blur Busters Blog) was the first to widely publicise the fact that this strobing can be used to beneficial effect during 2D viewing as well. LCDs normally display a given frame (sample) and continue to display it (hold) until the next frame is due – a process aptly named 'sample and hold'. Due to the way the human visual system works, your eyes constantly move to track motion on the screen. This results in your eyes being in different positions throughout the frame, even though the image itself is static during the held frame. The smooth tracking movement of your eye itself results in perceived motion blur. This actually accounts for a significant proportion of perceived blur you might see on an LCD no matter how fast the LCD appears on paper. It's also one of the major reasons a higher refresh rate reduces visible trailing and blur – frames are held for a shorter duration of time and your eyes are given more actual image information (frames) to focus on.

Because LightBoost strobes the backlight to give rapid 'on' and 'off' pulses you only see a given frame for a fraction of the time you normally would. This shortens the length of a visible refresh and reduces the amount of time eyes are tracking across the visible period of a refresh. The end result is a significant reduction in perceived motion blur. We did test LightBoost on the VG248QE in a 2D capacity using our secondary (Nvidia) GPU and do think that it is something some users will love to use. It gave a glassy CRT-like smoothness during motion that is far beyond what an LCD would normally produce regardless of its refresh rate.
One of my favourite links from the Science & References page at Blur Busters is a VERY good oldie Microsoft Research article from 2001, which I will quote:
Microsoft Research said:
Flat-panel displays have a sample-and-hold characteristic

gg463407.TempRate15(en-us,MSDN.10).gif


All of the newer display technologies such as LCD, plasma, DLP, and so on, have essentially a sample-and-hold characteristic. When a pixel is addressed, it is loaded with a value and stays at that light output value until it is next addressed. From an image portrayal point of view, this is the wrong thing to do. The sample of the original scene is only valid for an instant in time. After that instant, the objects in the scene will have moved to different places. It is not valid to try to hold the images of the objects at a fixed position until the next sample comes along that portrays the object as having instantly jumped to a completely different place.

gg463407.TempRate16(en-us,MSDN.10).gif


Your eye tracking will be trying to smoothly follow the movement of the object of interest and the display will be holding it in a fixed position for the whole frame. The result will inevitably be a blurred image of the moving object.

Sample and hold pixel characteristic causes blur.
For the first graph, the middle image (CRT) is the same thing LightBoost does -- it essentially point-samples the refreshes along your eyes' tracking trajectory, by using very short strobe lengths. (Remember, your eyes are continuously moving while tracking a moving object; your eyes are in a different position at the beginning of a refresh than at the end of it.) For the optimized LightBoost strobe backlight (OSD=10%), the 1.4 millisecond strobe lengths are 92% shorter than 1/60 sec (traditional 60 Hz LCD) and a bit over 80% shorter than 1/120 sec (traditional 120 Hz LCD). Even at maximized LightBoost (OSD=100%), the 2.4 millisecond strobe lengths are still 85% shorter than 1/60 sec. That's where my oft-repeated 85% and 92% numbers come from -- also confirmed by oscilloscope and via PixPerAn motion tests, as well as the upcoming new Blur Busters motion tests (three separate measured confirmations). pcmonitors.info vouches for the veracity of these "85% less motion blur" and "92% less motion blur" numbers too, in a pcmonitors.info forum post mentioning their upcoming article that covers LightBoost more in-depth.
pcmonitors.info said:
Again I wish it were something I was comfortable covering in more depth in the reviews but the reviews will link to this thread and I'm sure I will give similar examples in my upcoming article. For others reading this and wondering about those figures (vs CRT) above, they aren't plucked from thin air. I have also done similar testing to verify this which will be discussed in the article.
So my numbers are accurate, and reviewers are gradually starting to confirm this.

baseline - 60 Hz mode (16.7ms frame samples)
50% less motion blur than 60 Hz (2x clearer motion) - TN 120 Hz mode (8.33ms frame samples)
60% less motion blur than 60 Hz (2.4x clearer motion) - TN 144 Hz mode (6.94ms frame samples)
85% less motion blur than 60 Hz (7x clearer motion) - TN 120 Hz mode with LightBoost set at 100% (2.4ms frame strobe flashes)
92% less motion blur than 60 Hz (12x clearer motion) - TN 120 Hz mode with LightBoost set at 10% (1.4ms frame strobe flashes)
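The arithmetic behind these figures is simple enough to sketch, assuming (per the discussion above) that perceived blur is proportional to the visible duration of each refresh:

```python
# Perceived motion blur while eye-tracking is proportional to how long each
# refresh stays visible: the full frame period on a sample-and-hold LCD, or
# only the strobe length on a strobed backlight like LightBoost.
BASELINE_MS = 1000 / 60   # 60 Hz sample-and-hold baseline: ~16.7 ms

def blur_vs_60hz(visible_ms):
    """Return (% less blur than 60 Hz, times clearer motion)."""
    return (1 - visible_ms / BASELINE_MS) * 100, BASELINE_MS / visible_ms

modes = {
    "120 Hz sample-and-hold":   1000 / 120,  # 8.33 ms frame samples
    "144 Hz sample-and-hold":   1000 / 144,  # 6.94 ms frame samples
    "LightBoost 100% (2.4 ms)": 2.4,
    "LightBoost 10% (1.4 ms)":  1.4,
}

for name, ms in modes.items():
    pct, factor = blur_vs_60hz(ms)
    print(f"{name}: {pct:.0f}% less blur, {factor:.1f}x clearer")
```

Within rounding, this reproduces the list above: the 144 Hz and LightBoost 100% rows compute to ~58% and ~86%, rounded in the list to 60% and 85%, while the 50% and 92% rows come out exactly.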

It's quite telling that the ASUS VG278H, a 2 ms panel, is outputting measured 1.4 ms MPRTs (Motion Picture Response Time) -- living proof, sitting on our computer desktops, that pixel persistence is actually a bypassable variable. It's real simple conceptually (but hard from an LCD engineering standpoint): turn off the backlight during the pixel transition phase, strobing only on fully refreshed frames. The strobe stage of a refresh (seen by eye) can be shorter than the pixel persistence stage of a refresh (not seen by eye; kept in the dark). That's the breakthrough: shattering the pixel persistence barrier.
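A toy timing check illustrates why the strobe, not the persistence, sets the MPRT. All numbers here are assumptions drawn from figures quoted in this thread, not from any datasheet:

```python
# One strobed 120 Hz refresh: pixels transition in the dark, then the
# backlight flashes only after they have settled. The eye sees the strobe
# length (MPRT), not the pixel transition time.
REFRESH_MS = 1000 / 120   # 8.33 ms refresh period
PERSISTENCE_MS = 2.0      # assumed pixel transition time (kept in darkness)
STROBE_MS = 1.4           # backlight flash length (what the eye sees)

strobe_start = REFRESH_MS - STROBE_MS  # flash near the end of the refresh
# The strobe must begin only after the pixel transition has finished:
assert strobe_start > PERSISTENCE_MS

print(f"MPRT = {STROBE_MS} ms despite {PERSISTENCE_MS} ms pixel persistence")
```

As long as the transition finishes before the flash begins (here, 2.0 ms settling against a flash starting ~6.9 ms into the refresh), the visible duration is the 1.4 ms strobe.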

So here, I close out:
When tracking moving objects (and frame-rate matches refresh-rate), perceived motion blur on displays is directly proportional to the length of visible refresh.

(Note: Variables can affect this, such as phosphor ghosting, plasma temporal dithering, pixel persistence, etc. LightBoost uses pure strobes that cleanly bypass pixel persistence.)
 
Last edited:
LightBoost is awesome, finally got it working... what a great tweak. I'm loving it. What fun!


Mark Rejhon

OUR

LCD HERO
 
Thanks for the detailed answers as always, Mark. I read it, and will read it again later tonight when I have more time to follow the referenced links too.
.
I appreciate the refresher on CRT vs LCD. This thread is so huge, I'm hoping to make a table of the different display types' often-repeated measurements in regard to motion and put them in a logical hierarchy of some kind, if even just for myself as a quick reference when posting in discussion threads.

However you slice it, it seems like a plasma is essentially strobing or conversely blanking, whether at the backlight level or the pixel level, which results in less blurred motion to your eyes than a 60 Hz LCD. I'm curious where the average blur reduction % for plasma types fits in the blur-reduction hierarchy vs 60 Hz LCD, 120-133 Hz IPS LCD, 120 Hz TN LCD, 144 Hz LCD, 100 Hz and 120 Hz 1 ms LightBoost (not 2!) enabled LCDs, and CRT.
.
Another point you made -- I understand lower Hz can look as clear blur-wise in the right circumstances, but to people with "fast eyes", motion tracking is another facet and consideration. Higher Hz (with fps matching/exceeding it) provides more "dots per dotted-line length" of unique (and more current) action data, showing more unique motion "slices" over any movement distance. This provides smoother-looking and more accurate motion outside of any blur-reduction benefits (even if that smoother and more accurate motion tracking is full of "soften blur" across the whole viewport during FoV motion). Having zero blur combined with this provides an even more drastic difference in accurate portrayal of motion, not to mention keeping all detail, textures, and crisp scene beauty from being washed out aesthetically (and eye-strain wise, focally). I was sure to mention that I was interested in plasma's blur reduction in itself, and to make clear that plasma only has 60 Hz of action slices coming in at the input, so it completely loses out on that aspect of more current action data (it can't show any new motion data beyond 60 Hz of input frames and has to rely on frame duplication, "tweening" hybrid frames, etc. if anything).
.
 
I appreciate the refresher on crt vs lcd. This thread is so huge, I'm hoping to make a table of the different display type's often repeated measurements in regard to motion and put them in a logical hierarchy of some kind, if even just for myself as a quick reference when posting in discussion threads.
Just make sure you always cite me and BlurBusters.com when you quote any of my publicly posted posts. Also, I'm collecting snippets for future posting at the Blur Busters website.

However you slice it, it seems like a plasma is essentially strobing or conversely blanking whether at the backlight level or the pixel level
[dont-copy-and-paste]Both. It's black between refreshes, then the subfield comes on (essentially a kind of PWM modulation at the pixel level), then it goes black again. Rinse and repeat for each of the 600 Hz subfields. Most of the subfields are tantamount to frame repeats, so on better plasmas as much of the information as possible is compressed into one brighter subfield per refresh. From a vision science perspective, plasma is horrendously more difficult to explain than LCDs are.[/dont-copy-and-paste]

I am not currently going to go into the business of explaining plasma subfields -- for practical reasons -- I don't have 2 lifetimes to educate the masses, one by one :D .... There are better experts at plasma displays than me. I highly advise people who wish a better understanding of plasma to go to www.avsforum.com and other places on the Net.

blur reduction % on average for plasma types fits in the order of the % blur reduction hierarchy vs 60hz lcd, 120hz-133hz IPS lcd, 120hz TN LCD, 144hz lcd, 100hz and 120hz 1ms Lightboost(not 2!) enabled LCDs , CRT.
They are all over the map; the worst and the best plasmas can differ greatly. Even the ones with less blur can have very noisy-looking dark colors. And there's the question of whether or not the subfield refreshes are motion-interpolated (Panasonic 2500 Hz FFDs use interpolation to keep the repeated subfield refreshes from contributing a repeated-frame effect to motion blur).

Another point you made - I understand lower hz can look as clear blur wise in the right circumstances, but to people with "fast eyes", motion tracking is another facet and consideration. Higher hz (with fps matching/exceeding it) provides more "dots per dotted line length" of unique (and more current) action data shown, showing more unique motion "slices" during any movement distance. This provides smoother looking and more accurate motion outside of any blur reduction benefits
That's true; that's the stroboscopic effect I'm talking about in my previous post.

I'm not currently an authority of sufficient repute on plasmas, though I've written some info about it in some posts on the AVSFORUM site.
 
Thanks again, and sorry for the sidetracking. It just comes up fairly often in other threads, so it would be nice to know where in the blur-reduction hierarchy (outside of the lack of new unique motion slices) a few common plasma types would sit. It sounds like a slippery concept and a hard-to-define amount of blur reduction, though. Sorry for trying to pigeonhole you like that if it's that much of a mess to define. I assumed better than 60 Hz LCD (0% blur reduction) and worse than 120 Hz LCD (40-50% reduction), and their corresponding pixel trail lengths at extreme fps... but no idea where exactly, %-wise. Input lag makes motion interpolation scenarios moot in my opinion, outside of movies and TV, so that throws a lot out the window.
.
When I make a table up I'll just fit it in there somewhere in between, in grey, with caveats ** regarding the 60 Hz max input and the lack of real, more current, game-generated unique motion frames beyond that. :p
 
All hail Mark Rejhon, for delivering us from the dark ages of LCD gaming. Truly great work: well-written guidelines, interesting, easy-to-understand insights into the "why" of how this works, very informative and a lot of fun. I swear sometimes it's more fun to play with the hardware than to actually use it! This is a great step forward in tech, and I'm excited to see how it progresses. I'm interested to see what Mark and ToastyX come up with, because I bet some other discoveries will be made. Great stuff!

Been using lightboost for just under a day now and I have the following comments:

1. It's amazing. The glass-like smoothness is just unreal, incredible.
2. Even in a dark room with no outside light sources, I find the display to be dim, specifically in game (like BF3). Pulsing the backlight naturally results in less overall light output. Going forward, this will be a very important consideration for future monitor purchases -- "how well does it LightBoost", brightness in LightBoost mode, etc.
3. The colors are off, but again that's to be expected; it's the other trade-off at this point in time. Again, all future monitors had better improve on color when 2D LightBoosting.
 
LightBoost is awesome, finally got it working... what a great tweak. I'm loving it. What fun!

Mark Rejhon
OUR
LCD HERO
All hail Mark Rejhon, for delivering us from the dark ages of LCD gaming. Truly great work, well written guidelines, interesting, easy to understand insights into the "why" this works, very informative and a lot of fun.
I am flattered, but let me credit the esreality forums (whose people discovered this before I did) and TechNGaming's HOWTO (which predates the LightBoost HOWTO I created from scratch) for helping popularize this too. And we have to remember, not everyone benefits, as not everyone is as sensitive to LCD motion blur as we are; they think I'm just a fanboy! ;)

I'm simply helping make this a little more widespread, because thousands of people have been missing CRT quality for years, and we've discovered the LCD holy grail in the motion clarity department (which the sales departments of monitor manufacturers don't even understand, and aren't marketing properly). I'm doing a large amount of research reading (and research), becoming an associate member of the Society for Information Display (SID.org), and have the knowledge to package the boring scientific mumbo jumbo (sample-and-hold, MPRT, etc.) into easy-to-understand stuff.

Who you gonna call -- The Blur Busters!
 
Interesting note that the motion blur reduction benefits high-detail content more than low-detail content.
That's pretty true and makes sense; it works best with your textures set to maximum, and with a GPU fast enough (with enough memory) to keep all the textures flying past at 120 frames per second.

I think this is a very important note for persuading more people to be interested in LightBoost. My friend didn't understand at face value why LightBoost would be such an improvement. Then I explained that for years we have been adding higher-res textures, alpha-to-coverage transparency, parallax occlusion mapping, cascaded variance shadow mapping, etc., but very little of this processing the GPU is doing is even visible while moving. Oh great, we can see the amazing graphics when we stand still and stare. LightBoost lets us see the detail while in motion (aka, while PLAYING the game).
 
I think this is a very important note for persuading more people to be interested in LightBoost. My friend didn't understand at face value why LightBoost would be such an improvement. Then I explained that for years we have been adding higher-res textures, alpha-to-coverage transparency, parallax occlusion mapping, cascaded variance shadow mapping, etc., but very little of this processing the GPU is doing is even visible while moving. Oh great, we can see the amazing graphics when we stand still and stare. LightBoost lets us see the detail while in motion (aka, while PLAYING the game).
Yep, excellent point.

P.S. It's noteworthy that LightBoost is an nVidia-specific name. I've used the generic term "strobe backlight" or "stroboscopic backlight", since some display manufacturers such as Samsung have the same kind of backlight (but don't use the nVidia brand name LightBoost). I've run into Samsung users who are surprised that their monitor had this feature. Marketing will have to come up with better ways to market this technology; even things like "92% LESS MOTION BLUR!*" stickers (*relative to 60 Hz LCD), or "Zero Motion Blur Mode" or something. And make it easy to enable with a button. Blur Busters will continue to publish HOWTOs for new methods of enabling stroboscopic backlights in different models of computer monitors (and HDTVs -- there's "Sony Motionflow Impulse" in the HX950, a special LightBoost-like Motionflow mode that does not use interpolation). Blur Busters has been discovering poorly documented strobe backlight modes in displays that are scientifically the same as LightBoost but do not license that name.
 
Yep, excellent point.

P.S. It's noteworthy that LightBoost is an nVidia-specific name. I've used the generic term "strobe backlight" or "stroboscopic backlight", since some display manufacturers such as Samsung have the same kind of backlight (but don't use the nVidia brand name LightBoost). I've run into Samsung users who are surprised that their monitor had this feature. Marketing will have to come up with better ways to market this technology; even things like "92% LESS MOTION BLUR!*" stickers (*relative to 60 Hz LCD), or "Zero Motion Blur Mode" or something. And make it easy to enable with a button. Blur Busters will continue to publish HOWTOs for new methods of enabling stroboscopic backlights in different models of computer monitors (and HDTVs -- there's "Sony Motionflow Impulse" in the HX950, a special LightBoost-like Motionflow mode that does not use interpolation). Blur Busters has been discovering poorly documented strobe backlight modes in displays that are scientifically the same as LightBoost but do not license that name.

Facepalmish really that they don't advertise this more.
 
baseline - 60 Hz mode (16.7ms frame samples)
50% less motion blur than 60 Hz (2x clearer motion) - TN 120 Hz mode (8.33ms frame samples)
60% less motion blur than 60 Hz (2.4x clearer motion) - TN 144 Hz mode (6.94ms frame samples)
85% less motion blur than 60 Hz (7x clearer motion) - TN 120 Hz mode with LightBoost set at 100% (2.4ms frame strobe flashes)
92% less motion blur than 60 Hz (12x clearer motion) - TN 120 Hz mode with LightBoost set at 10% (1.4ms frame strobe flashes)
(I know the above is the 27H)

I have to honestly say that the ASUS VG248QE is a pretty darn good monitor even without Lightboost.

Using Lightboost at 10%, despite its incredible motion-blur numbers, is not an enjoyable experience. TOO DARK! I've got to have it at 100%.

Despite LB, I still suck at FPS... just kidding!
 
Using Lightboost at 10%, despite its incredible motion-blur numbers, is not an enjoyable experience. TOO DARK! I've got to have it at 100%.

Despite LB, I still suck at FPS... just kidding!
Even LightBoost at 100% still beats all other mere-mortal LCDs hands-down, including all of yesterday's 120 Hz LCDs. So you're not missing much. But some people think LightBoost 100% is still TOO BRIGHT for a dark room (Vega, for one).

I personally like LightBoost=60% nowadays, for a dark room. It's a good balanced tradeoff.

Manufacturers need to build in brighter backlight during LightBoost mode (and allow PWM-free non-LightBoost modes)
 
Even LightBoost at 100% still beats all other mere-mortal LCDs hands-down, including all of yesterday's 120 Hz LCDs. So you're not missing much. But some people think LightBoost 100% is still TOO BRIGHT for a dark room (Vega, for one).

I personally like LightBoost=60% nowadays, for a dark room. It's a good balanced tradeoff.

Manufacturers need to build in brighter backlight during LightBoost mode (and allow PWM-free non-LightBoost modes)

For my gaming/TV room I'm liking the 10% setting. It gives me 101 cd/m^2 and a 560:1 contrast ratio. That's with contrast at 78% and some RGB tweaks in the drivers. I'll do a full calibration with Lightboost when my monitor shows up and I return this one that I was loaned. I get an 890:1 contrast ratio with my current non-Lightboost settings, but as has been mentioned, the blur (while not bad) is still pretty noticeable in modern games without Lightboost enabled.

A bit of information on my tweaking. As was mentioned in the pcmonitors VG248QE review, the monitor looks absolutely horrible without at least tweaking brightness and contrast. Not all presets are suitable as a starting point for calibration; pretty much only the Standard mode is. As the review pointed out and my testing confirmed, all the others do weird things with the greyscale/gamma curve, and it can't be fixed by lowering the contrast setting, even though it looks like the high end of the curves is being crushed. That's not a problem when one starts with the Standard preset, or Lightboost; just don't push the contrast too high (or push too high in the RGB driver settings if tweaking there). To really look excellent you need to do more calibration than can be done with just the monitor controls, but right now I'm looking at a calibrated (beyond just the monitor's built-in adjustments) VG248QE (non-Lightboost at the moment) and it looks pretty darn good. It's sitting next to my CRT, so I can compare the two side by side. It's really a shame that the built-in monitor adjustments are so limited and additional calibration tweaks are needed.
 
(I know the above is the 27H)

I have to honestly say that the ASUS VG248QE is a pretty darn good monitor even without Lightboost...

Try comparing the same scene with LB on and off. The difference with LB is not that much on my VG248QE compared to 120hz without LB. I made a post here describing how to run a spin demo with HL2.

It was suggested after that post that HL2 does not have detailed enough textures, and maybe Left 4 Dead would have a stronger LB effect. This sounds like rationalizing to me, but I made the same type of spin demo with L4D2 and the difference is the same (actually it's hard to see a difference without a test).

I'm not trying to rain on the parade, and I acknowledge there is a difference, especially with PixPerAn, but not as much in game that I can see. Frankly I think a lot of people could miss the difference even when shown an A/B test unless they were told what to look for.
 
Try comparing the same scene with LB on and off. The difference with LB is not that much on my VG248QE compared to 120hz without LB. I made a post here describing how to run a spin demo with HL2.

It was suggested after that post that HL2 does not have detailed enough textures, and maybe Left 4 Dead would have a stronger LB effect. This sounds like rationalizing to me, but I made the same type of spin demo with L4D2 and the difference is the same (actually it's hard to see a difference without a test).

I'm not trying to rain on the parade, and I acknowledge there is a difference, especially with PixPerAn, but not as much in game that I can see. Frankly I think a lot of people could miss the difference even when shown an A/B test unless they were told what to look for.

So I was just comparing some ME3 textures with and without LB. Get in an area with a "grate" on the floor and move... Without LB it doesn't look all that bad, but the textures look like they've dropped to low-res; with LB on they stay sharp. As someone said before, it's most noticeable with newer games with high-res/detailed textures.
 
Even LightBoost at 100% still beats all other mere-mortal LCDs hands-down, including all of yesterday's 120 Hz LCDs. So you're not missing much. But some people think LightBoost 100% is still TOO BRIGHT for a dark room (Vega, for one).

I personally like LightBoost=60% nowadays, for a dark room. It's a good balanced tradeoff.

Manufacturers need to build in brighter backlight during LightBoost mode (and allow PWM-free non-LightBoost modes)

Remember, I also have 3 times the amount of photons coming from my 3x portrait setup, so 10% LB works out really well.

And lol at Mark making huge explanation posts like ten separate times in this thread now. :D

It works people! I don't know how many times Mark can explain the same thing. ;)
 