ASUS/BENQ LightBoost owners!! Zero motion blur setting!

One ‘make or break’ point and contentious issue on the XL series thus far has been the pixel overdrive, which on the 10T in particular was too aggressive. This led to noticeable ‘inverse ghosting’ in some instances. The grey-to-grey response time is now quoted as ‘1ms’ rather than ‘2ms’, which could indicate some tuning has been done to the overdrive algorithm. Alternatively, it could just be a mild exaggeration by BenQ – they wouldn’t be the first company to make misleading response time claims, at any rate. One particularly attractive feature of the XL24 series monitors has always been the low input lag, which is further lowered by an ‘Instant Mode’ that bypasses extraneous image processing. The company is claiming ‘0.001 frame’ of input lag, which to all intents and purposes means no input lag whatsoever – it looks as if that won’t be something to worry about. We will of course find out more about what differentiates the XL2411T from its predecessors when we review it shortly. BenQ have told us that it should be available to test in November, and to expect full retail availability in the UK around the same time. Some retailers are now stocking or taking pre-orders for around £235. We will bring any further details on price and availability as the news comes in.
This would no longer be an issue if the strobing hides the overdrive artifacts. A properly timed strobe backlight can hide overdrive artifacts as well as the pixel persistence. It doesn't matter how "noisy or bouncy" the pixel transition is, as long as all of that is kept in total darkness. For example, the first 4ms of a 1/120sec refresh could be a bouncy 1ms overdriven transition plus a 3ms "settle-down" safety margin (or whatever settle-down margin it uses). Then you strobe the backlight on a pretty stable, ghost-free frame, in time before the next refresh. Done. Obviously, it depends on how the refresh is scanned out, and on how long the monitor (or graphics card) artificially lengthens the vertical blanking interval, in order to allow a settle-down time and to allow a strobe to happen before the next refresh.
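To make that timing budget concrete, here's a minimal C++ sketch of the arithmetic; the 1ms transition, 3ms settle margin and 1.4ms strobe length are illustrative assumptions, not measured values from any particular monitor:

Code:
#include <cstdio>

int main() {
    const double refresh_hz    = 120.0;
    const double period_ms     = 1000.0 / refresh_hz; // 8.333ms per refresh
    const double transition_ms = 1.0;  // assumed: bouncy overdriven pixel transition
    const double settle_ms     = 3.0;  // assumed: "settle-down" safety margin
    const double strobe_ms     = 1.4;  // assumed: strobe flash length

    double strobe_start = transition_ms + settle_ms; // 4.0ms into the refresh
    double strobe_end   = strobe_start + strobe_ms;  // 5.4ms into the refresh

    printf("Refresh period: %.3f ms\n", period_ms);
    printf("Strobe window:  %.1f ms -> %.1f ms\n", strobe_start, strobe_end);
    printf("Margin before next refresh: %.3f ms\n", period_ms - strobe_end);
    // As long as the margin stays positive, the noisy transition is never
    // illuminated -- the backlight only ever shows the settled frame.
    return 0;
}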

Provided LightBoost is enabled, BENQ might actually be telling the truth, from the strobe backlight perspective. I will be testing out the XL2411T against my VG236H and verifying these claims.

P.S. On my VG236H, I've confirmed that strobing can, to /some/ extent, hide most of *both* the pixel persistence _and_ the overdrive overshoots.
Not all of it though (that's why we get a little bit of crosstalk -- like 3D). Pre-calibrated/optimized settings are used, and that's why the settings are locked in the monitor, to prevent users from worsening LightBoost behavior by adjusting these settings.
 
^^ And that was gonna answer my next question hah. Looking forward to your results.
 
P.S. There is practically no limit to how short the strobes can be in a strobed backlight. LED's can be switched even faster (microseconds), and using RGB LED's means no phosphor decay. You can strobe for shorter time periods than the pixel persistence, so it's possible for an LCD monitor to have a legitimate claim of "0.1ms", if it's accomplished through the strobing instead of the panel itself. I have already successfully confirmed the "bypass-the-pixel-persistence" effect.

Many find it hard to believe, unless you remind yourself that a 1/120sec refresh is 8ms long.
The majority of the pixel persistence is far less than that, only 2ms.
You're simply strobing after the pixel persistence is practically finished, but before the next refresh.
Simply strobing during a different part of the refresh cycle than the pixel persistence occupies.
The strobes can be far shorter than the pixel persistence itself.
 
I've done some photo tests of the xl2411t if you need it https://dl.dropbox.com/u/11777259/xl2411t_test.rar
lightboost tends to bypass most of the pixel-transition time indeed, but imho it's not timed perfectly, as you can see, and it could be improved (afaik the pixel-transition time for any colour is less than 4ms, even on the 2420t which has a slower panel, so you have at least a 4.33ms free slot for a perfect picture, but for some reason they didn't time it perfectly)
time to disassemble it and implement my own backlight at 144hz! :p
 
I've done some photo tests of the xl2411t if you need it https://dl.dropbox.com/u/11777259/xl2411t_test.rar
lightboost tends to bypass most of the pixel-transition time indeed, but imho it's not timed perfectly, as you can see, and it could be improved (afaik the pixel-transition time for any colour is less than 4ms, even on the 2420t which has a slower panel, so you have at least a 4.33ms free slot for a perfect picture, but for some reason they didn't time it perfectly)
If you're strobing instead of scanning, then don't forget that there's the timeslot of the refresh cycle, and the timeslot of the stable frame (like a lengthened vertical blanking interval between frames). Those timeslots have to be separate. You can only strobe during the vertical blanking interval.

If the typical timeslot of a 120hz refresh cycle is 8ms and the blanking interval is only 0.33ms, you only have 0.33ms to wait out the pixel persistence and strobe. That's not enough time. So the blanking interval is artificially lengthened (either on the graphics card side, e.g. a massive vertical blanking/front/back porch, or internally in the monitor), so that there's enough time after the end of the previous refresh to wait out the pixel persistence, before strobing, before the next refresh begins.

So, the timeslot of a refresh might actually be 1/240sec, and the timeslot of the blanking interval (where you can strobe) might also be 1/240sec. Home theater HDTV's have been able to refresh at 240Hz (via frame interpolation), so it's possible to "accelerate" the individual refresh of an LCD to faster than 1/120sec, in order to allow plenty of idle time (blanking interval) before the next 120Hz refresh -- this is the time that allows the pixels to settle near their final values before the backlight is strobed.
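As a back-of-the-envelope sketch of that blanking-interval trick (the 1080 active lines and the 50/50 scan/blank split are assumptions for illustration, not actual LightBoost timings):

Code:
#include <cstdio>

int main() {
    const double refresh_hz    = 120.0;
    const int    active_lines  = 1080;  // visible lines per frame
    const double scan_fraction = 0.5;   // assume scanout squeezed into half the period

    // If scanout occupies only half of each 1/120sec period, the vertical
    // total must be active_lines / scan_fraction lines at a fixed line rate.
    int    vertical_total = (int)(active_lines / scan_fraction); // 2160 lines
    double period_ms      = 1000.0 / refresh_hz;                 // 8.333 ms
    double scan_ms        = period_ms * scan_fraction;           // ~4.17 ms (1/240sec)
    double blank_ms       = period_ms - scan_ms;                 // ~4.17 ms to settle + strobe

    printf("Vertical total: %d lines (%d active + %d blanking)\n",
           vertical_total, active_lines, vertical_total - active_lines);
    printf("Scanout: %.2f ms, blanking: %.2f ms\n", scan_ms, blank_ms);
    return 0;
}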

I'm unable, at present, to measure how much the frame refresh cycle is accelerated, since it happens in total darkness (backlight turned off), so it's not captured by my high speed camera. This reminds me, I still have high speed footage of LightBoost that I need to publish to YouTube -- I'll get to that shortly.
 
I have a VG278H monitor and none of these tricks are working for me.

Not sure if I found a new trick, but: you enable stereoscopic 3D and launch the game. I put it on 1% so it isn't too distracting when not putting on my glasses. Once in game I press Ctrl+T and that disables 3D. It puts me back into 2D mode, but the brightness is severely lowered. I think the game does feel a little smoother.

If anyone else could verify this LMK.
 
I've tried this hack with my VG236 (using the VG278h inf). The OSD is showing it to be locked in 3D mode. I'm certain lightboost (or something of that kind) is enabled because I can detect a slight amount of eye discomfort from the flicker. Not enough to be a problem during gaming sessions, but it makes me wonder how bad an LED screen would be.

The only improvement 3d mode/lightboost hack provides with the VG236 is removal of overshoot/trailing. I'm hoping the slow pixel response is why I see no increase in motion clarity. I get the same pitiful score of 7 in pixperan.

****

I'm beginning to wonder if lightboost even exists on these older monitors. You keep referring to "lightboost 2", but the nvidia list of monitors only references the term "lightboost" on nvidia 3d vision 2 displays:

http://www.nvidia.com/object/3d-vision-system-requirements.html
 
i wish you could control the backlight to bring it to 60hz! would have been a motion-blur-free moment for ALL LCD owners.

It's possible on Vista64 (maybe Win7, not Win8) and all 120hz 3DVision displays. A simple bug activates black frame insertion for every second frame. It's perfect for 60fps-limited games or slow graphics cards. Games are limited with v-sync on to 60fps, but the display runs in 3d mode at 120hz. I checked it with a DVI capture card, and yes, that's really black frame insertion. 60fps games are extremely sharp (they look just like real 120hz).

--> connect the display with a dual-link DVI cable
--> set the monitor refresh rate to 120Hz
--> make sure the nvidia usb-dongle is connected
--> go to "Nvidia Control Panel"
--> activate "Enable stereoscopic 3D"
--> click on "Test Stereoscopic 3D"
--> set the resolution and refresh rate to 1920x1080@120hz and launch the test application
--> when you see the moving Nvidia letters, press the Win key or Alt+Tab
--> now the Windows desktop is darker and flickers heavily like an old 60hz CRT (the 3DVision glasses don't work correctly; the right lens is darker than the left) (the display is in 3d mode)
--> close "Nvidia Control Panel" (leave 3DCP/nvsttest.exe open!) and reopen it
--> deactivate "Enable stereoscopic 3D"
--> start a game at 120hz (the display and usb-dongle don't turn off 3d mode)
 
when the backlight isn't strobing in lightboost mode, is the display PWM-free for either asus or benq?
 
when the backlight isn't strobing in lightboost mode, is the display PWM-free for either asus or benq?

No.

Benq XL2420t - 180hz
Asus VG278h - 360hz
Asus VG278he - 432hz

Benq XL2411t - unknown, probably 180hz
Asus VG248qe - unknown, but should be 432hz
 
I have a VG278H monitor and none of these tricks are working for me.

Not sure if I found a new trick, but: you enable stereoscopic 3D and launch the game. I put it on 1% so it isn't too distracting when not putting on my glasses. Once in game I press Ctrl+T and that disables 3D. It puts me back into 2D mode, but the brightness is severely lowered. I think the game does feel a little smoother.

If anyone else could verify this LMK.

Did you try the method I got my vg278h working with? It's on pg. 8 of this thread. Also, a few people were having problems getting this activated on Win 8. No problems on Win7.

You can verify that the monitor has Lightboost enabled by checking that all the Asus menu options are greyed out except contrast & LB.

I have found that keeping 3d enabled and using the Ctrl+T method gives lower FPS performance compared to having 3d disabled in the control panel.
 
I looked into DDC software last night.

And both my LCD's are not DDC compliant.

From there I checked out the AMD "ADL API" which allows the user to control driver settings.

I was able to run their sample code which changes brightness on a loop. (max low, max high)

But we can't achieve what we're looking for through the video output; brightness should(?) only get sent once per refresh by the videocard. i.e. even if I could tell the videocard to change brightness at the exact intervals we want, and keep it all accurate, it's really only sending that data to the screen once per refresh. (If I'm way off base here please let me know)
 
I'm beginning to wonder if lightboost even exists on these older monitors. You keep referring to "lightboost 2", but the nvidia list of monitors only references the term "lightboost" on nvidia 3d vision 2 displays:
Right.
Yes, when I said LightBoost2, I meant "3D Vision 2 with LightBoost". It's good for more than 3D, as it benefits 2D too.

I was informed that the original 3D Vision 1 (including VG236H) has no strobed backlight -- no LightBoost. :(
So calling it LightBoost2 is somewhat confusing, which is why I stopped referring to it as LightBoost2 a couple of days ago.

Not surprised you're still getting a low score on the VG236H.
Thanks for confirming that none of the hacks work on it. I guess we'll have to wait for the VG248QE if we want an ASUS 24"-sized LightBoost.
 
I looked into DDC software last night.
And both my LCD's are not DDC compliant.
From there I checked out the AMD "ADL API" which allows the user to control driver settings.
I was able to run their sample code which changes brightness on a loop. (max low, max high)
But we can't achieve what we're looking for through the video output; brightness should(?) only get sent once per refresh by the videocard. i.e. even if I could tell the videocard to change brightness at the exact intervals we want, and keep it all accurate, it's really only sending that data to the screen once per refresh. (If I'm way off base here please let me know)
I'm a C/C++ programmer, I'd like to hack the LightBoost in monitors, so that I can control the strobes (as much as API's allow).

I don't think software-based control of brightness will be reliable because of CPU jitter, etc. However, it's possible to do a software-based VSYNC by precision time-coding, to signal an external circuit (e.g. Arduino) *when* to strobe -- e.g. time-stamping a Direct3D RasterStatus.ScanLine() with a QueryPerformanceCounter() and sending a signal to the Arduino, which uses its microsecond-accurate timer to undo the CPU/USB latencies. RasterStatus.ScanLine(), with appropriate timing formulas (e.g. VESA GTF, or another algorithm), can be converted to a fairly accurate "microseconds-since-last-VSYNC" value. It can all be timestamped accurately: you simply make sure you timeslice your thread first (e.g. Thread.sleep(1)), then immediately run a high-priority critical section on two lines of code (reading RasterStatus.ScanLine() followed immediately by reading QueryPerformanceCounter()), with a near-perfect guarantee that the CPU core won't get interrupted/timesliced to a different thread between those two lines, allowing a very accurate timestamping operation.

Some cross-checks on accuracy can be done relative to previous readouts, and perhaps against a readout of the current monitor timings (the numbers seen in the nVidia Control Panel Custom Resolutions feature, for example), to exclude "outliers" from the unlikely situations of bad timestamps. At this point you've essentially got a guaranteed +/- 10 microseconds accuracy on a VSYNC timestamp -- possibly as little as +/- 1 microsecond -- which can now be signalled to an external circuit at your leisure (the external circuit uses its own microsecond-accurate timer to compensate, based on the timestamp). The ping between the computer and the external circuit may vary quite a bit, but the relative values from readout to readout would remain accurate, and you'd only need to signal the VSYNC roughly once a second or so, letting the Arduino 'extrapolate' the intermediate VSYNC timings in between; it would continue strobing quite accurately, and the occasional timestamps would keep it strobing accurately. (You only need approximately 0.1ms accuracy on the strobe timing.) CPU-fluctuation proof, CPU-freeze proof, USB-latency-jitter proof.

However, I don't think it can be done reliably if the software *does* the strobing itself.
That's best left to an external circuit, even if you use a software-based VSYNC algorithm (timestamping technique) like what I mentioned above, to signal an external strobe circuit. The software-based VSYNC algorithm would work on both nVidia and ATI cards and bring a home-made LightBoost to a computer monitor, or LightBoost to an LED monitor that wasn't originally designed to strobe (it would require some modifications to wire into its existing backlight, etc).
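For the curious, a minimal C++ sketch of that two-line critical section, under stated assumptions (the 120Hz mode and the vertical total of 1125 scanlines are placeholders -- the real number comes from your modeline/Custom Resolution; error handling omitted):

Code:
#include <windows.h>
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

const double REFRESH_HZ     = 120.0;   // assumed mode
const double VERTICAL_TOTAL = 1125.0;  // assumed; read yours from the custom resolution

int main()
{
    // Minimal D3D9 device on a tiny stand-in window; no rendering needed,
    // we only want IDirect3DDevice9::GetRasterStatus().
    HWND hwnd = CreateWindowA("STATIC", "", WS_POPUP, 0, 0, 16, 16, 0, 0, 0, 0);
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    IDirect3DDevice9* dev = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &dev);

    LARGE_INTEGER freq;
    QueryPerformanceFrequency(&freq);

    for (int i = 0; i < 10; i++) {
        Sleep(1);  // yield first, so the critical pair starts on a fresh timeslice

        // The critical pair: read the scanline, then the counter, back-to-back.
        D3DRASTER_STATUS rs = {};
        LARGE_INTEGER now;
        dev->GetRasterStatus(0, &rs);
        QueryPerformanceCounter(&now);

        // Assume a linear scanout: scanline N is N/VERTICAL_TOTAL into the refresh.
        double nowUs        = (double)now.QuadPart * 1e6 / (double)freq.QuadPart;
        double periodUs     = 1e6 / REFRESH_HZ;
        double sinceVsyncUs = (double)rs.ScanLine / VERTICAL_TOTAL * periodUs;
        printf("estimated last VSYNC at %.1f us (scanline %u)\n",
               nowUs - sinceVsyncUs, rs.ScanLine);
    }
    return 0;
}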
 
I looked into DDC software last night.
And both my LCD's are not DDC compliant.
From there I checked out the AMD "ADL API" which allows the user to control driver settings.
I was able to run their sample code which changes brightness on a loop. (max low, max high)
But we can't achieve what we're looking for through the video output; brightness should(?) only get sent once per refresh by the videocard. i.e. even if I could tell the videocard to change brightness at the exact intervals we want, and keep it all accurate, it's really only sending that data to the screen once per refresh. (If I'm way off base here please let me know)
I'm a C/C++ programmer, I'd like to hack the LightBoost in monitors, so that I can control the strobes (as much as API's allow).
I don't think software-based control of brightness will be reliable because of CPU jitters, etc.
And the ADL API might be manipulating only the graphics output brightness, not hardware-based LED brightness on the monitor...

However, it is possible to do a software-based VSYNC by precision time-coding, to signal an external LED-flashing circuit (e.g. Arduino) *when* to strobe -- e.g. time-stamping a Direct3D RasterStatus.ScanLine() with a QueryPerformanceCounter() and sending a signal to an external strobing circuit such as an Arduino, which uses its microsecond-accurate timer to undo the CPU/USB latencies. RasterStatus.ScanLine(), with appropriate timing formulas (e.g. VESA GTF, or another algorithm), can be converted to a fairly accurate "microseconds-since-last-VSYNC" value, which could then be transmitted to an external strobing circuit. After timestamping, CPU freezes, CPU fluctuations and USB fluctuations would not matter, as long as the external circuit had a microsecond-accurate timer too, to compensate based on the timestamp calculations. You wouldn't even need to signal every single refresh, perhaps even just once a second, as the external circuit can extrapolate strobe timings until the next received signal. The accuracy of the strobes only needs to be ~0.1ms (~100 microseconds), so a timestamped software VSYNC is achievably accurate...

However, I don't think it can be done reliably if the software *does* the strobing itself. That should be left to an external circuit, or to the monitor's electronics.
 
Did you try the method I got my vg278h working with? It's on pg. 8 of this thread. Also, a few people were having problems getting this activated on Win 8. No problems on Win7.

You can verify that the monitor has Lightboost enabled by checking that all the Asus menu options are greyed out except contrast & LB.

I have found that keeping 3d enabled and using the Ctrl+T method gives lower FPS performance compared to having 3d disabled in the control panel.
I'm running Windows 8.

This thread is only 7 pages long right now. You need to post a link to your thread.
 
I'm a C/C++ programmer, I'd like to hack the LightBoost in monitors, so that I can control the strobes (as much as API's allow).

I don't think software-based control of brightness will be reliable because of CPU jitter, etc. Although it's possible to do a software-based VSYNC by precision time-coding to signal an external circuit (e.g. Arduino) *when* to strobe (e.g. time-stamping a Direct3D RasterStatus.ScanLine() with a QueryPerformanceCounter() and sending a signal to the Arduino, which uses its microsecond-accurate timer to undo the CPU/USB latencies).
RasterStatus.ScanLine(), with appropriate timing formulas (e.g. VESA GTF, or another algorithm), can be converted to a fairly accurate "microseconds-since-last-VSYNC" value.

However, I don't think it can be done reliably if the software *does* the strobing itself.

I understand.

I was playing around with QueryPerformanceCounter() to create a high precision sleep. The results seemed pretty good. When I requested a 120hz sleep interval I was getting 8.33353ms and 8.33324ms intervals. The average of that comes out to 8.3334ms.

Now I agree that an external hardware clock would be better suited. But I was nonetheless impressed with the software based timer.
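For reference, a hybrid QPC sleep along those lines might look like this sketch (coarse Sleep() for most of the interval, then a busy-spin for the final stretch; the function names are mine, not from any library):

Code:
#include <windows.h>

// Wait until an absolute QPC deadline: coarse Sleep() first, spin the rest.
void PrecisionWaitUntil(LONGLONG deadlineQpc, LONGLONG qpcFreq)
{
    for (;;) {
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        LONGLONG remainingUs = (deadlineQpc - now.QuadPart) * 1000000 / qpcFreq;
        if (remainingUs <= 0) return;                                    // deadline reached
        if (remainingUs > 2000) Sleep((DWORD)(remainingUs / 1000 - 1));  // coarse wait
        // else: busy-spin the final ~2ms for sub-millisecond accuracy
    }
}

int main()
{
    LARGE_INTEGER freq, start;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&start);
    LONGLONG interval = freq.QuadPart / 120;  // one 120hz tick in QPC units
    LONGLONG deadline = start.QuadPart;
    for (int i = 0; i < 1200; i++) {          // ten seconds of 120hz ticks
        deadline += interval;
        PrecisionWaitUntil(deadline, freq.QuadPart);
        // fire the strobe / tick here
    }
    return 0;
}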

I confess I did not set up/test any jitter measurement, and did not even lock my CPU clock nor pin the thread to a specific CPU core. So I'm sure I would have gotten some jitter. I would think we would need to test before dismissing it. One thing I like to think is that the strobing is not tied to a specific frame, so as long as it's somewhat in line with the refresh, it should look ok to the user. In the end we could find a middle ground between mathematical accuracy and acceptable strobing. From personal experience in design, fabrication and deployment: while the absolutely perfect implementation would be awesome, 99% of the time the cheaper, less perfect solution ends up being the chosen one (because of various decisions/parameters that are, in the end, not as important).


What magnitude of precision do we think is required for strobing? Even the refresh rate usually fluctuates a bit in regular use.

For example at 100hz, my display fluctuates between 99.99533 and 99.99535 (ok, that's not a lot, I had more in mind lol)

Same for jitter: is an offset of 0.1ms for 1 frame something we can notice? (note that I don't know how much jitter we would even get from a software solution, could be 0.001ms, could be 100ms).

On another note, has anybody taken apart a lightboost 2 monitor yet? I'm still not convinced nvidia implemented a super-high-tech solution. It would be nice, but if in their tests they could get "ok" results with a basic, non-costly approach, I'm sure the managers opted for that.
 
I'll go buy a lightboost LCD + nvidia card.

Do I "need" the emitter ? i.e. XL2420T vs XL2420TX
 
I understand.
I was playing around with QueryPerformanceCounter() to create a high precision sleep. The results seemed pretty good. When I requested a 120hz sleep interval I was getting 8.33353ms and 8.33324ms intervals. The average of that comes out to 8.3334ms.
That's a synchronous approach. The timestamping method (RasterStatus.ScanLine) allows high accuracy while doing it asynchronously. You know what I mean?

Now I agree that an external hardware clock would be better suited. But I was nonetheless impressed with the software based timer.
Yes, an external hardware clock is needed too -- if we're trying to override the LightBoost strobes, we need one.

On the other hand, if we're doing a home-made LightBoost modification to an existing computer monitor, we can just signal the VSYNC over a USB cable _asynchronously_ (because the ScanLine's are timestamped with QueryPerformanceCounter's). It could even be randomly signalled (e.g. 1 second, then 0.35 seconds, then 1.5 seconds later), because the timestamping keeps it accurate. The advantage of this approach is that no modifications are needed to the computer monitor's electronics (just to its backlight). Basically, the software-timestamped VSYNC co-operates with an external circuit timer (e.g. the Arduino micros() timer) to make accurate strobing possible _without_ needing to solder anything into the monitor's circuit, and _without_ a hard-to-make video cable dongle (because of HDCP). We'd simply use a strobe-timing-offset (phase) adjustment to manually calibrate it to the point of least motion blur (using a test pattern). Ping over a USB cable to an Arduino is 4.091 milliseconds +/- 0.001ms (1 microsecond) if you average over 1,000 pings, so even though a single USB ping can vary a lot, the average is rock solid. Even so, USB ping jitter isn't the limiting factor, as long as the strobe-timing-offset (phase) adjustment can undo the latency. Besides, the timecoding still undoes even worse ping jitter.
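A minimal Arduino-side sketch of that extrapolation idea; the four-byte serial packet (carrying "microseconds since the last VSYNC", pre-corrected on the PC for the averaged USB latency), the pin number and the phase/strobe lengths are all my assumptions for illustration:

Code:
// Extrapolate 120Hz strobes between occasional timestamped VSYNC packets.
const int           STROBE_PIN = 9;
const unsigned long PERIOD_US  = 8333;  // one 1/120sec refresh
const unsigned long STROBE_US  = 1400;  // assumed strobe length
const unsigned long PHASE_US   = 4000;  // calibrated strobe-timing offset

unsigned long nextStrobe = 0;

void setup() {
  pinMode(STROBE_PIN, OUTPUT);
  Serial.begin(115200);
  nextStrobe = micros() + PERIOD_US;
}

void loop() {
  // Re-anchor the phase whenever the PC sends a timestamped VSYNC packet.
  if (Serial.available() >= 4) {
    byte buf[4];
    Serial.readBytes((char*)buf, 4);
    unsigned long ageUs;                 // microseconds since last VSYNC on the PC
    memcpy(&ageUs, buf, 4);
    unsigned long lastVsync = micros() - ageUs;
    nextStrobe = lastVsync + PHASE_US;
    while ((long)(nextStrobe - micros()) < 0) nextStrobe += PERIOD_US;
  }
  // Fire the strobe on schedule; keep extrapolating until the next packet.
  if ((long)(micros() - nextStrobe) >= 0) {
    digitalWrite(STROBE_PIN, HIGH);
    delayMicroseconds(STROBE_US);
    digitalWrite(STROBE_PIN, LOW);
    nextStrobe += PERIOD_US;
  }
}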

I confess I did not setup/test any jitter measurement, and did not even lock my CPU clock nor the thread to a specific CPU Core. So i'm sure I would have gotten some jitter. I would think we would need to test before dismissing. One thing I like to think is that the strobing is not related to a specific frame. So as long as it's somewhat in line with the refresh then it should look ok to the user. In the end we could find a middle ground between mathematical accuracy and acceptable strobing.
I did some tests on a flashing high-brightness white LED running at 1-to-10 kilohertz, and found that a >0.1ms random insertion of delay (permanently delaying all future strobes) led to a barely-perceptible flicker. It was nonetheless potentially annoying. So I've set 0.1ms as the accuracy target -- ONLY because of the flicker issue. But if you delay an individual strobe while still maintaining the same number of strobes per second (i.e. future strobes are not permanently delayed), then there's no flicker issue.

The LCD panel is not even the limiting factor itself, because even with an 8ms refresh cycle, assuming you've got a long-enough VSYNC interval, it is acceptable for a strobe to be even 0.1 to 0.2ms late without too much increased ghosting. A mis-timed strobe gives graceful degradation: the later the strobe, the more ghosting gradually comes back (at least along the top edge of the screen).

One solution is to have the external circuit keep its own timer, and use an asynchronous timecoding technique (e.g. timecoded RasterStatus.ScanLine(), which makes asynchronous timecoding possible) -- letting the external circuit subtract the timestamp, to undo the CPU fluctuations and USB ping jitter -- converting the asynchronous incoming timecoded VSYNC signals into a synchronous internal strobe that's extrapolated until the next timecoded VSYNC signal arrives from the computer.

From personal experience in design, fabrication and deployment: while the absolutely perfect implementation would be awesome, 99% of the time the cheaper, less perfect solution ends up being the chosen one (because of various decisions/parameters that are, in the end, not as important).
For manufacturers: They have the display signal to work from, so that's the cheapest; just work with the display signal to time the strobes.

For homebrew (custom hacking of a non-LightBoost monitor): this is the cheapest way to do it. Software-based timestamped asynchronous VSYNC signalling (sent at random intervals from the computer to the external circuit), with the external circuit simply calculating its own synchronous strobe timing from the occasional incoming timestamped VSYNC's. No monitor modifications (except the backlight), no external dongle, etc. The problem is incompatibility with Macs and consoles, etc., since a tiny software program would always need to be running in the background on the PC to send the asynchronous time-stamped VSYNC's (RasterStatus.ScanLine timestamped with QueryPerformanceCounter's).

What magnitude of precision do we think is required for strobing? Even the refresh rate usually fluctuates a bit in regular use.
Less precision is required for strobing than most expect, as long as one strobe's timing error does not shift future strobes (e.g. a delayed strobe does not permanently delay future strobes) -- and VSYNC is a very synchronous timer, so that solves that problem, since we're using VSYNC as our time reference. Varying the timing of individual strobes by 0.1ms from their mathematically perfect timing has no noticeable flicker artifacts, as long as a tiny imperceptible delay in one strobe does not permanently delay future strobes (based on LED flicker tests on my Arduino electronics circuit). BUT, permanently delaying all future strobes, or missing a strobe entirely, _can_ potentially cause a noticeable flicker, especially at higher frequencies (e.g. 1KHz or 10KHz PWM with one strobe suddenly skipped or delayed -- amazingly, I _can_ notice that even at 10KHz PWM; it appears as a single faint flicker).

Same for jitter: is an offset of 0.1ms for 1 frame something we can notice? (note that I don't know how much jitter we would even get from a software solution, could be 0.001ms, could be 100ms).
Nope. The strobe delay won't be noticed by the human eye from the flicker perspective. HOWEVER, there are possible analog effects on the LCD: a delayed strobe may cause increased faint ghosting along the top edge of the screen, because the LCD panel might have already started refreshing the next frame, etc. But if you've added lots of safety margin (e.g. a huge vertical blanking interval), you've got safety margin for delayed strobes. It's only a graceful, slow degradation to increased ghosting, as the strobe encroaches into a tiny part of the LCD refresh cycle. (You may have seen my high speed videos recently posted on blurbusters.com)

On another note, has anybody taken apart a lightboost 2 monitor yet? I'm still not convinced nvidia implemented a super-high-tech solution. It would be nice, but if in their tests they could get "ok" results with a basic, non-costly approach, I'm sure the managers opted for that.
Nope. Are you aware of my home-made strobe backlight under construction at www.scanningbacklight.com? (Scroll down near the bottom and start scrolling upwards.) I started this project before I discovered LightBoost was a strobe backlight. I've even created a FAQ at the Scanning Backlight FAQ.
 
I'll go buy a lightboost LCD + nvidia card.
Do I "need" the emitter ? i.e. XL2420T vs XL2420TX
BTW, I hear the XL2411T is a better LightBoost monitor than the XL2420T. I've ordered one from the UK; it's still on sale, and there's no VAT when shipping to North America -- even with $50 shipping, it's still cheaper than buying one at a local computer store.

I'm going to compare my ASUS with the BenQ when it arrives. I've heard multiple reports that it uses LightBoost strobes of 1ms, which would mean half the motion blur of my ASUS VG278H. This may explain why some say the VG278H has a PixPerAn readability score of 25, while Vega reports a PixPerAn readability score of 30 (CRT-quality motion). (Compare this to a score of 7-8 for non-LightBoost monitors.)
 
It's possible on Vista64 (maybe Win7, not Win8) and all 120hz 3DVision displays. A simple bug activates black frame insertion for every second frame. It's perfect for 60fps-limited games or slow graphics cards. Games are limited with v-sync on to 60fps, but the display runs in 3d mode at 120hz. I checked it with a DVI capture card, and yes, that's really black frame insertion. 60fps games are extremely sharp (they look just like real 120hz).

--> connect the display with a dual-link DVI cable
--> set the monitor refresh rate to 120Hz
--> make sure the nvidia usb-dongle is connected
--> go to "Nvidia Control Panel"
--> activate "Enable stereoscopic 3D"
--> click on "Test Stereoscopic 3D"
--> set the resolution and refresh rate to 1920x1080@120hz and launch the test application
--> when you see the moving Nvidia letters, press the Win key or Alt+Tab
--> now the Windows desktop is darker and flickers heavily like an old 60hz CRT (the 3DVision glasses don't work correctly; the right lens is darker than the left) (the display is in 3d mode)
--> close "Nvidia Control Panel" (leave 3DCP/nvsttest.exe open!) and reopen it
--> deactivate "Enable stereoscopic 3D"
--> start a game at 120hz (the display and usb-dongle don't turn off 3d mode)
Excellent tip, thank you. Although flicker makes it mostly unusable, it definitely shows that combining LightBoost with black frame insertion gives you the zero-motion-blur effect at 60Hz. This is perfect for MAME, emulators, etc. The CRT flicker adds an air of authenticity. And Super Mario "Nintendo smooth" scrolls with no blur or ghosting, too.

Hopefully this becomes an official feature sometime, or somebody creates a System Tray Utility to do software-based Black Frame Insertion for 120Hz displays.
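For the curious, a minimal C++/Direct3D 9 sketch of what such a utility's core loop could do (the white clear is a stand-in for real game content, which an actual utility would have to hook or overlay; a 120Hz mode and a vsync-locked Present() are assumed):

Code:
#include <windows.h>
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

int main()
{
    // Stand-in window (message pump omitted for brevity).
    HWND hwnd = CreateWindowA("STATIC", "BFI sketch", WS_POPUP | WS_VISIBLE,
                              0, 0, 1920, 1080, NULL, NULL, NULL, NULL);
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // lock Present() to vsync
    IDirect3DDevice9* dev = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &dev);

    bool black = false;
    for (;;) {
        // On a 120Hz display this alternation yields 60 visible frames plus
        // 60 inserted black frames per second.
        D3DCOLOR c = black ? D3DCOLOR_XRGB(0, 0, 0)        // inserted black frame
                           : D3DCOLOR_XRGB(255, 255, 255); // stand-in for a game frame
        dev->Clear(0, NULL, D3DCLEAR_TARGET, c, 1.0f, 0);
        dev->Present(NULL, NULL, NULL, NULL);  // blocks until the next refresh
        black = !black;
    }
}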
 
Some great stuff in this thread, have to admire the enthusiasm and work going into this.

Personally I'd love to see this available at lower refresh rates. Having to get to 100-120fps is the kicker in terms of it being cost-efficient in newer games. The other way around could work as well: strobing + black frame insertion at 144Hz would be survivable in terms of flicker in games, for me at least.

On another note; would an overdriven IPS-panel be quick enough for this to work at say 75Hz-85Hz?
 
Excellent tip, thank you. Although flicker makes it mostly unusable, it definitely shows that combining LightBoost with black frame insertion gives you the zero-motion-blur effect at 60Hz. This is perfect for MAME, emulators, etc. The CRT flicker adds an air of authenticity. And Super Mario "Nintendo smooth" scrolls with no blur or ghosting, too.

Hopefully this becomes an official feature sometime, or somebody creates a System Tray Utility to do software-based Black Frame Insertion for 120Hz displays.
I wonder if black frame insertion would be good with a 240hz capable monitor, running at 120hz with black frame insertion every other frame...
 
well I purchased a 660 Ti (not a wasted purchase, will give it to my sister for xmas) and a VG278 to test this. (I normally run a 7970 + Tempest OC)

As expected it makes every frame clearer. Pretty good stuff. I thoroughly enjoyed my quick Richard Burns Rally session. It did bring me back to the CRT feeling. (Haven't used one in over 2 years)

The monitor appears to be fully DDC compliant. I'll try and see how they tell it to activate the strobing over the next few days. I'll try and scan DDC codes when it's in lightboost mode and compare it to when it's at normal 120hz mode.

If it's just a "profile" in the screen it must be possible to activate it with any videocard. (unless nvidia sneak in some sort of handshake and we can't find the codes required). They might even be talking via DDC but outside the specs and it's something we won't be able to find using normal DDC software.

My takeaway from all this was similar to Vega's: we need to get this onto higher resolution displays and IPS. I've only been using a 2560x1440 @ 120hz for 1 week... and there is no way I'd switch full time to a 27" 1080p. (well, unless my Tempest OC explodes lol.)

I guess I'll go dig up my arduino, haven't played with it in over a year lol. And start tinkering.
 
ips response time is slower so the results won't be the same. 120hz + very high fps results in smoother motion tracking regardless, but not reduced blur. Not sure how a 10ms+ ips compares to the 1ms XL2411T or other 2ms 120hz TNs, or even 3ms 120hz TNs, if lightboost strobing were enabled on all of them. I also doubt that the ips panels have the same amount of aggressive response time compensation available in them.
I'd rather have crt-like performance at 1080p even if I had to use a 24" TN monitor. I play on a 120hz 27" TN at 1080p now (w/o lightboost), but it is dedicated to gaming only. I use a 2560x1440 ips for everything else on the desktop.
I'd say that it's quite similar. Although TechNGamer is saying it's better than CRT, I think it's not better (yet) but it's definitely in the same neighbourhood. Far more CRT than LCD (even when comparing to normal 120Hz LCD)

Most people are using 60Hz LCD's, so based on math from my strobe measurements (Casio 1000fps camera), the VG278H has 85% less motion blur than most people's computer monitors (2.5ms of eye-tracking-based blur, rather than 16.7ms of eye-tracking-based blur). I'd say that throws it within a stone's throw of CRT, which I feel has 90-95% less motion blur than LCD (primarily the green ghosting effects, etc). Very hard to tell. On my VG278H, motion looks more CRT than LCD, for sure. The limiting factor is the response-time-acceleration artifacts, which trail faintly. But there's absolutely no phosphor ghosting, and edges are sharp in fast motion, like CRT. It's much better than I expected, and can only get better with improved LCD's and improved strobe backlights (if the manufacturers are willing to chase down this path).

I can now easily see a 1-pixel gap in the PixPerAn chase test, something never seen before on any other LCD, although it does have visible distortions caused by response-time acceleration. I'm able to read the "I NEED MORE SOCKS" text on the PixPerAn racecar perfectly even when it's running at 960 pixels per second. Being able to read tiny text while it's zooming across my screen at 960 pixels per second is something I've never been able to do with any LCD, ever -- until now. I estimate approximately half a pixel of motion blur at this speed, hard to tell. (960pps = 8 pixels step per frame; configure PixPerAn accordingly.) It's a blurry mess on a 60Hz LCD.

Once you move the car faster than roughly this speed, it's hard to track with my eyes (even if it were a CRT), but there now finally looks to be a very ultra-faint amount of motion blur when you start moving objects at 1920 pixels per second (very hard to track my eyes that fast), so you can now finally, just barely, notice a possible minor shortfall relative to CRT (psychologically it feels like only a "1%" shortfall). I'll have to do more testing in pushing the limits. For all practical purposes, it looks like zero motion blur unless I'm really trying damn hard to tire my hand out flicking my mouse so fast that I can only barely track my eyes on the almost-too-fast motion, and then I finally, barely notice maybe 1 or 2 pixels of motion blur -- and only if I'm super aggressive at trying to move objects across my screen extremely fast.

On the other hand, I've also seen worse CRT's -- long-persistence-phosphor CRT's with more objectionable phosphor ghosting than what I'm seeing with the VG278H. I will say this much thus far: given a scale of traditional LCD=1 and CRT=10, I'm giving the VG278H a "9" in sharpness of fast motion (zero-motion-blur effect). To my eyes, motion on the VG278H (+ zero motion blur tweak) looks 10 times better than a 60Hz LCD.

What's far more noticeable (but must be weighed against CRT phosphor ghosting artifacts) is some minor trailing RTC/ghost/corona artifacts, much like you see in some reviewer photos. But surprisingly, the RTC ghosts/artifacts don't show up very much in games; they only affect edges, and only between certain colors and at certain speeds. It's hard to describe; the ghost/corona artifact is very different from what you normally see on non-strobed displays. Basically, the RTC ghosts/coronas are an extremely razor-sharp double image chasing a few pixels behind a sharp moving image: that's the side effect of a very slight (1%) pixel-persistence leakage into the next frame for certain color contrasts. I can't actually see this effect in videogames, and it's actually less noticeable than phosphor ghosting trails (though it depends on the colors and the game -- it could be worse or better). It only happens when you've got big edges between big solid colors, and only between certain colors -- it's similar to 3D shutter glasses leakage, except it's a faint razor-sharp double edge chasing a razor-sharp edge. So I've traded CRT phosphor ghost artifacts (green ghosting in dark scenes) for a faint-but-razor-sharp ghost/corona artifact.

However, the edges of everything remain razor sharp during fast movement, just like CRT. (I may replay Half Life 2 again like this. It's a new experience now, with CRT-sharp motion)

Watch your GPU if you want to do zero-motion-blur gaming; don't bother if you don't have a high-end 600-series Geforce card. Also, adjust video game settings to turn off GPU-based motion blur effects. (Although artistically desirable for some moments, like being injured, I don't like it when artificial motion blur is enabled at all times in a videogame.) You may also have to lower your detail level a bit, just to cap out at 120fps@120Hz. With zero-motion-blur gaming, you will now be able to notice frame drops even at 120Hz (where you weren't able to before, because of LCD motion blur). Much like a CRT at 60fps@60Hz, where you quickly notice single frame drops, or a CRT at 120fps@120Hz, where you occasionally notice them. The zero-motion-blur effect makes frame drops far more noticeable; some people will hate this, others don't mind it all that much. CRT video gamers are already familiar with this -- the effect is the same.

Remember, I'm talking about motion blur, not other aspects (color quality). I've had time with a friend's FW900, and I also used to have a 21" CRT that could do 120Hz. For my Asus, color quality (e.g. for photoshop) is not as good as a well-calibrated FW900. It's a TN panel -- temper your expectations for color quality; it will be a disappointment for color-quality snobs. But I can say the motion blur is in "CRT territory". It totally feels like zero motion blur. Not noticeably better, not noticeably worse. Just different artifacts (very faint sharp-edged RTC artifacts instead of CRT phosphor ghosting). It may be that I'm more picky about IPS quality than TechNGamer is, so you've got to pick your poison. (You want good photo viewing? IPS! You want zero motion blur on LCD? TN+LightBoost, baby!!!)

P.S. Although it looks great at just about any fps that doesn't microstutter too much --
Did I say this already: you _really_ do need to cap out the fps at the same rate as the strobes (120fps@120Hz) for the _best_ zero-motion-blur effect.
Very hard to do in the newest video games without a GTX680 SLI, or without an aggressive lowering of detail.

P.P.S. Regarding the XL2420TX or XL2411T, I'm pretty unsure. I'd lean towards the one with the 1ms rating, but the bottom line is the length of the strobes on each monitor. Is the shutter-glasses emitter built into the XL2411T like it is for the Asus VG278H? If so, then it's relatively easy to force-enable LightBoost in 2D mode -- get the XL2411T; the lower ms rating is a good sign, because better response-time acceleration (fewer RTC artifacts) is useful. But it's not a guarantee of being a better strobed monitor, or of remaining bright during strobes (non-strobed brightness doesn't always guarantee sufficient brightness during strobed modes). We need someone well-connected to test them all, or someone near a massive elite computer store, to do some tests...
 
ips response time is slower so the results won't be the same. 120hz + very high fps results in smoother motion tracking regardless, but not reduced blur. Not sure how a 10ms+ ips compares to the 1ms XL2411T or other 2ms 120hz TNs, or even 3ms 120hz TNs, if lightboost strobing were enabled on all of them. I also doubt that the ips panels have the same amount of aggressive response time compensation available in them.
I'd rather have crt-like performance at 1080p even if I had to use a 24" monitor. I play on a 120hz 27" at 1080p now (w/o lightboost), but it is dedicated to gaming only. I use a 2560x1440 ips for everything else on the desktop.

Did you really just say a 2560x1440 60hz IPS panel has the exact same motion blur as a 2560x1440 120hz IPS panel??? I have a catleap 2B and an Overlord Tempest OC and the blur is night and day different between 60hz and 120hz.
 
I'm saying from what I understand they don't have the same combo of factors to reduce blur as much as a much lower response time 120hz TN with aggressive RTC.
 
Depends on what kinds of games you play. Fast-twitch online games benefit hugely from CRT-style motion (provided by LCD LightBoost). Or are you playing a slower exploration/adventure game, or Civilization/SimCity-type games? (You'll appreciate the color quality of IPS.)

Also, IPS 120Hz has slightly more motion blur than non-LightBoost TN 120Hz LCD.
Gotta pick your poison. For me, being used to CRT, regular 120Hz is NOT "a little motion blur" -- it's still a blurry mess in Arena-type games compared to CRT. Just half as blurry as 60Hz.

Hopefully someday, we will get 1440p+IPS+LightBoost. Probably not for several years, alas.
 
Did you really just say a 2560x1440 60hz IPS panel has the exact same motion blur as a 2560x1440 120hz IPS panel??? I have a catleap 2B and an Overlord Tempest OC and the blur is night and day different between 60hz and 120hz.
Agreed. Although 120Hz TN (non-LightBoost, too) has less motion blur than 120Hz IPS, 120Hz IPS still has less motion blur than 60Hz IPS. The pixels on IPS cannot transition fast enough for an exact halving of motion blur from 60Hz->120Hz, but there IS a reduction of motion blur from 60Hz->120Hz IPS -- just not as much as the blur reduction from 60Hz->120Hz TN, due to the slower IPS pixel response.

If the strobe backlight is a sequential scanning backlight (like those used in some existing high-end HDTV's), with the LED's flashing in sync in a top-to-bottom scanning fashion, then you don't need to wait for the pixel persistence to finish within the VSYNC -- as long as some part of the display has adequately refreshed while the parts of the display still refreshing are kept in the dark. The disadvantage of a sequential scanning backlight is much more complexity in the electronics, and backlight diffusion (on-segments of the backlight leaking into off-segments). That puts a limit on the motion blur reduction.
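To illustrate the scanning pattern, an Arduino-style sketch with 8 assumed backlight segments on pins 2-9, each flashed one slot later than the segment above it (real hardware would need LED drivers and a VSYNC reference to phase-lock against):

Code:
// Top-to-bottom scanning backlight, 8 segments, one 120Hz frame per sweep.
const int           NUM_SEGMENTS = 8;
const int           SEGMENT_PINS[NUM_SEGMENTS] = {2, 3, 4, 5, 6, 7, 8, 9};
const unsigned long PERIOD_US  = 8333;                      // one 120Hz refresh
const unsigned long SEGMENT_US = PERIOD_US / NUM_SEGMENTS;  // ~1ms flash per segment

void setup() {
  for (int i = 0; i < NUM_SEGMENTS; i++) pinMode(SEGMENT_PINS[i], OUTPUT);
}

void loop() {
  // Each segment strobes one slot later than the one above it, chasing the
  // LCD scanout so rows that are still refreshing stay in the dark.
  unsigned long frameStart = micros();
  for (int i = 0; i < NUM_SEGMENTS; i++) {
    while (micros() - frameStart < (unsigned long)i * SEGMENT_US) {}  // wait for slot
    digitalWrite(SEGMENT_PINS[i], HIGH);
    delayMicroseconds(SEGMENT_US);
    digitalWrite(SEGMENT_PINS[i], LOW);
  }
  while (micros() - frameStart < PERIOD_US) {}  // pad out the full refresh
}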

However, it should be possible to reduce motion blur on IPS displays by another approximately 50-75% using a good sequential scanning backlight. You just won't get the 94% motion blur reduction (versus a 60Hz LCD) that a 1ms strobe backlight would provide (94% comes from 1ms versus 16.67ms) -- at least not until IPS speeds up and uses response-time acceleration (which has its disadvantages, alas, but is also an ingredient that makes 3D and zero motion blur possible). Some IPS HDTV's work with shutter glasses, so it's possible to make IPS fast enough. It just has not arrived on computer-monitor-sized panels.

Click the link: strobed/scanning backlights in existing high-end HDTV's. You'll see that IPS 3D HDTV's exist, and with scanning backlights too -- strobe backlights in your home theater display, combined with IPS, are already here today. But, alas, they are not videogame-friendly, because they use motion interpolation, and they don't reduce motion blur nearly as much as LightBoost does. But if you like the Motionflow effect, then you'll probably like, say, the Elite(tm) LCD HDTV, a high-end flanker brand by Sharp Electronics -- it is one of the best LED-backlit LCD HDTV's out there, using IPS LCD technology, and it provides the zero-motion-blur effect on video material (at least video shot with a fast shutter speed), if you don't mind motion interpolation being combined with a scanning backlight. It even has local dimming, which means it turns off LED's behind parts of the picture that are black. Alas, it's not very videogame-friendly: it has lots of input lag, and it costs several thousand dollars.

But a lot of this technology could be transferred to a computer monitor in a low-input-lag manner. It's definitely technically doable, given the right panel and electronics, as well as the right backlight.
 
If it comes down to the difference being clearly readable text and clearly "readable" high texture detail (incl. bump mapping, etc.) during FoV movement at speed -- or not -- for any time in the foreseeable future, my choice for (dedicated) gaming would be the clear one.
 
I'm saying from what I understand they don't have the same combo of factors to reduce blur as much as a much lower response time 120hz TN with aggressive RTC.

Ya, I don't agree with that. Perhaps not against the Asus 144hz or BenQ 144hz TN panels, but I find my IPS Catleap 2B & OC 2B have less blur than my old Samsung S27A750 & A950 120hz TN panels. And my Catleap 2B overclocks to 130hz, and blur is almost a non-factor at that refresh rate. I downclocked my 2B to 60hz today and tried a game of Battlefield, and it felt like my soldier's boots were filled with gummy bears and he'd had ten shots of Jack Daniels... that's how drastic 60hz versus 120hz is on these panels.

It would be interesting if Mark could test out an overclocked 120HZ IPS panel and show the actual numbers on how it improves once overclocked.
 
If it comes down to the difference being clear readable text and clear "readable" high texture detail incl. bump mapping, etc during FoV movement at speed - or not - for any time in the foreseeable future, my choice for (dedicated to) gaming would be the clear one.
Yah, um, no... sorry, but you just don't know what you are talking about. You're letting YOUR 60hz IPS 2560x1440 panel bias your opinion versus the 120hz 2560x1440 panels. I HAVE OWNED the VG278HE 144hz Asus, Samsung S27A750 & S27A950 (the key word is owned, NOT talk out my ass), and the 2560x1440 120hz IPS panels blow them out of the fucking water. Whatever math applies to the 60hz versions (pixel response, fov movement, bump mapping, etc.) gets thrown out the window once you go 120hz, which is why it would be very interesting if somebody like Mark could test them out and give us the real numbers on exactly what happens when an IPS goes 120hz. If you don't believe me, ask Vega, because he's owned all of those displays too, and he will be the first to tell you that 60hz 2560x1440 IPS is SHIT compared to 120hz 2560x1440 IPS.
 
It kind of also depends on what you're sensitive to.

What I notice most is that my brain likes smoothness more than clarity. So for sure I prefer the highest number of images per second I can get.

But then, comparing the two 120hz TNs I have laying around vs. my 120hz IPS, I do notice the slowness of the IPS panel. But it's still overall the same for perceived "smoothness" (one could argue it even appears smoother because the pixels create a blurry mess of smoothness hehe).

I'm glad we are seeing improvements, but for the moment, for me, I don't mind staying with 120hz non-lightboost. Testing the VG278 with lightboost last night confirmed that I would personally take smoothness over clarity every time.
 
Well, 60 Hz anything is garbage compared to 120 Hz. I have measured the full-color pixel change speed on the Catleap at ~10ms. So beyond 100 Hz you are no longer getting any motion blur reduction, just an increase in the feeling of smoothness from the extra Hz.

There are some TN panels that have just as slow a pixel response, then there are panels like the Asus and BenQ 120-144 Hz panels that blow it out of the water in that regard. On the Catleap at 130 Hz, I can read a speed of about 11 in Pixperan, which isn't too bad for an IPS panel.

I believe Elvn has an Apple 27", which only gets around 6-7 in Pixperan if my memory serves me correctly. I've tested that monitor; while it does have a great picture and build quality, it is pretty much the slowest-pixel-response LCD I've ever tested.

I think the real breakthrough will be when some smart guy like Mark gets a Lightboost-type mode working on an IPS panel, as if you go solely one way or the other, the compromises are huge.
 