ASUS/BENQ LightBoost owners!! Zero motion blur setting!

The HE also has a frame of input lag @60Hz and can't do 3D with consoles
 
I don't understand German, but I want to make sure you are aware that capturing a flickering monitor with a video camera results in out-of-sync timing between the monitor's refresh and the camera's recorded frames. That's why CRTs look like they have bands cycling or pulsing across them when you see them in internet videos, TV news reports, etc. In that video, the entire scene of the reviewer and the room the monitor is in is pulsing and flickering throughout much of the footage.
Yes, that's correct. That's the flicker from the fluorescent lights.

Flicker artifacts are more easily caught by a camera than by the human eye. Hopefully they will obtain a high-speed camera (e.g. Casio Exilim EX-FC200S).

Good coverage! I've blogged. I'm glad magazines are starting to cover this.
This should snowball even more once I've released my motion tests to the public. (21st century PixPerAn replacement).
 
Sorry to ask the same question again.

How well does this work on TV sets like the Sony line with CMR960, or whatever they call it?
 
From a computer/game usage perspective, there are currently some rather severe limitations -- they all currently have massive input lag, since they combine backlight scanning/strobing with interpolation. Couple that with the usual situation that HDTV's, even in Game Mode, tend to be worse at input lag than computer displays.

Also, 60 Hz scanning/strobing is rarely done because of the annoying flicker. Not many scanning backlights in HDTV's have a mode that excludes interpolation (which is bad for input lag). However, there's a Motionflow "Impulse" setting that does this in certain newer Sony HDTV's. This "Impulse" setting in Sony HDTV's would probably be the most videogame-compatible strobe backlight mode available at the moment. There's supposed to be less input lag in this mode, but very few tests have been done. It flickers at a nasty 60Hz, but it would be fully console compatible, assuming its input lag is adequate. Impulse is non-interpolated.

See Existing Technology for more info about Samsung/Sony/Panasonic HDTV's with scanning backlights.

Searches:
- Motionflow Impulse Setting
- Review of Sony KDL-55HX853 commenting about Impulse

If anyone wants a "LightBoost" equivalent in an HDTV set, then the Sony Motionflow "Impulse" setting (non-interpolated) in recent high-end sets with the Motionflow XR 960 feature is your best bet at this stage. However, the input lag is probably approximately 2 frames (untested though!) and you will get horrendous 60 Hz flicker. It will probably result in less motion blur than a 120Hz computer monitor without a strobe backlight, but not fully zero motion blur, due to the longer strobes (~4ms-ish). Input lag and flicker would probably be pretty bad, but potentially usable in a partially darkened room. If someone wants to test this out, be my guest -- the Blur Busters Blog would love to cover testing of this feature once the input lag meter software is fixed. The goal is to simply bring my laptop with the lag meter, a retractable HDMI cable & a universal remote into a Best Buy, and measure a few TV's at various strobe/scanning backlight settings, hopefully not too long after I've released my motion tests.
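For reference, converting a frames-of-lag estimate like that "~2 frames" into milliseconds is trivial arithmetic; a quick sketch:

```python
# Convert an input-lag estimate in frames to milliseconds at a given refresh rate.
def frames_to_ms(frames, refresh_hz):
    return frames * 1000.0 / refresh_hz

# ~2 frames of lag at 60Hz:
print(round(frames_to_ms(2, 60), 1))   # 33.3 (ms)
```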

Note: The LightBoost HOWTO is not applicable. Just configure the Sony HDTV set's menu and change the Motionflow setting to "Impulse" (only found in certain high end models) and you're ready to go, for any 60 Hz computer or game console, with probably worse input lag than a computer monitor, but much less input lag than interpolation. You will have less motion blur than a non-LB 120Hz monitor, but worse motion blur than a LB 120Hz monitor.

Enterprising people can bring their game console to a store that sells the higher-end Sony "960" HDTV's, and try out the Impulse setting and report back. You won't be able to go into Game Mode with the Impulse setting (as far as I know), but it would be good for someone here to give it a test go, to see. Be my guest & please post back!
 
TV's have always been the realm of high input lag and various other silly electronics that make them unfit for computer gaming.

Mark, I was thinking about 60 Hz 4K displays. Granted, 60 Hz motion won't be that great with its ~16 ms sample-and-hold, but what if you flashed the backlight twice inside that window instead of once? That would really help with the annoying visible flicker. I am sure even with IPS on a 4K screen there can be time in a 16 ms frame to flash a couple of pulses. Do the pulses have to be evenly spaced, or can they be staggered to fit into non-pixel-transition time without that screwing with the brain?
 
No, they do not have to be evenly spaced.

However, this wrecks MPRT significantly. It would be equal to the span from the leading edge of the first strobe to the trailing edge of the last strobe. So if you had 2ms strobes spaced 8ms apart, your MPRT would be 10ms (2ms first strobe + 6ms black gap + 2ms final strobe). That's 5 times worse trailing-edge artifact than the 2ms strobe itself, and 2.5 times worse than just putting the strobes together (one 4ms strobe per refresh). However, the reduction in flicker can win out, because people are simply so annoyed at 60 Hz flicker. This could be an additional "custom backlight mode".

The MPRT of PWM-based backlights (multiple strobes per refresh) is usually a few milliseconds lower (e.g. 14ms) than that of steady backlights (e.g. 16ms). This is because of the tiny gap of blackness between the last strobe of the previous refresh and the first strobe of the next refresh.

(MPRT = industry standard measurement called Motion Picture Response Time; also measures sample-and-hold lengths, not just pixel persistence)
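The strobe-train arithmetic above can be written out as a toy calculation (a paraphrase of the reasoning, not a measurement):

```python
# Toy MPRT arithmetic for strobed backlights: MPRT spans from the leading edge
# of the first strobe in a refresh to the trailing edge of the last strobe.

def mprt_ms(strobe_ms, gap_ms, strobes_per_refresh):
    """MPRT for a train of equal strobes separated by equal black gaps."""
    if strobes_per_refresh == 1:
        return strobe_ms
    return strobes_per_refresh * strobe_ms + (strobes_per_refresh - 1) * gap_ms

# Two 2ms strobes with a 6ms black gap between them (leading edges 8ms apart):
print(mprt_ms(2, 6, 2))   # 10  (2 + 6 + 2)
# One combined 4ms strobe per refresh:
print(mprt_ms(4, 0, 1))   # 4
```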

The long-term solution is for all video formats to go to 120Hz or 240Hz native (no interpolation), or beyond, in the next 50+ years. (How about the Japanese NHK 8K 120Hz format, anyone?) It would solve so many problems with motion blur, flicker, game-friendliness, computer-friendliness, and avoiding interpolation, all at once.
 

I'm actually more interested in TV/Movie use :)
 
In that case, then input lag doesn't matter all that much. You can then also tolerate some high-quality interpolation, at least for video-based material.

-- For video (full framerate): if you want minimal motion blur during sports, then you'll appreciate the "960" look. It makes the motion on these LCD's look as smooth as plasma.

-- For movies, it's a matter of personal preference.
......I prefer the "24fps motion-blurred look". Films are one thing I like to see as the director originally intended, so I always turn off interpolation and motion enhancements for movies. A rare exception is certain animated movies (e.g. Pixar), where it can be interesting, but even for those I usually leave it off.
......It's challenging to interpolate 24fps to look crystal sharp, as interpolation does not work reliably for all types of motion. TV's differ greatly in their Motionflow-like effects for movies, since the result is very dependent on the quality of interpolation. Newer and higher-end TV's will generally have better interpolation, especially those displays with a "960" rating.
 

I remember reading about that super 8K 120 Hz camera. The bandwidth required for that is incredible. Doing the math a while back, you would need something like eight DisplayPort 1.2 connections to run it!
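That back-of-envelope math is easy to redo (active pixels only, 24-bit color, ignoring blanking and protocol overhead; the ~17.28 Gbit/s figure is DisplayPort 1.2's effective 4-lane data rate):

```python
# Rough uncompressed bandwidth for an 8K 120Hz video format.
pixels_per_frame = 7680 * 4320
gbit_per_s = pixels_per_frame * 120 * 24 / 1e9   # 24 bits per pixel
print(round(gbit_per_s, 1))                      # 95.6

dp12_effective_gbit = 17.28                      # DP 1.2, 4 lanes (HBR2)
print(round(gbit_per_s / dp12_effective_gbit, 1))   # 5.5
# ...so ~6 links minimum at 8-bit color; 10-bit color plus blanking overhead
# pushes the requirement toward the "eight connections" estimate.
```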
 
I tried lightboost out today with my ViewSonic V3D245 (they don't make them anymore).

I was really surprised how much of a difference it makes. However, my eyes did get tired after a while just like they used to with CRT. With normal LCD my eyes never fatigue.

But it is still totally worth turning lightboost on while gaming.
 

Sure it wasn't just placebo? From what I can see, the V3D245 doesn't support LightBoost (Nvidia Vision 2). Maybe it does and simply isn't documented, although that would be odd. It could be that 3D mode forces the ViewSonic into a more aggressive level of overdrive, which would indeed make a large difference in games.
 
Actually, it could be quite real. Undocumented or poorly-advertised LightBoost apparently exists!

I've lately been hearing from other people that some displays work with my LightBoost HOWTO (a newer model of the Samsung S23A950D apparently has it too, though only one report). So it already has undocumented backlight strobe control, and I'm not surprised. We can speculate for a moment: it might be that some monitors have the equivalent of LightBoost but didn't fully meet nVidia certification for naming it "LightBoost". Or maybe they tweaked it to be compatible with 3D Vision 2. Others may have been advertised as "3D Vision" without really being advertised as "3D Vision 2" ready (LightBoost). There could be many reasons. Try the PixPerAn motion test: if the text balloon above the PixPerAn racing car is clearly readable at 960 pixels per second (tempo 8 at 120Hz) -- the "NEED MORE SOCKS" text -- then LightBoost is probably enabled, as LCD is not capable of that otherwise.

Meanwhile, I'm now dying to see if there's a VESA DDC command for LightBoost. Alas, I'd need to purchase the latest VESA MCCS (Monitor Command Control Set) to find out (and a VESA membership costs thousands of dollars). Or it might be an nVidia-specific custom command (e.g. one of those "0xE0" commands). There's an older version of the MCCS findable via Google, and an EntechTaiwan utility called softMCCS allows sending custom datagrams to the monitor, so maybe one of us can determine what the signal is.

BTW, to reduce eyestrain (applies to 3D shutter glasses /or/ when using 2D LightBoost)
-- Use a behind-the-screen light (lamp behind the monitor). Don't make the room too dark.
-- Try putting the monitor a few more inches further away.
-- Make sure you take breaks.

The same problem applies to 3D shutter glasses use. However, for me, 2D LightBoost at 120Hz causes less eyestrain than 3D LightBoost, because of the 60Hz-per-eye flicker (nVidia originally designed LightBoost for 3D). Fortunately, it has no effect on my eyes -- I'm used to CRT at 85Hz and beyond. LightBoost does bring the same CRT advantages/disadvantages, and it is very person-dependent. For me, the advantages outweigh the disadvantages.
 
I had the opportunity to run Entech Taiwan softMCCS to attempt to reverse-engineer the LightBoost control command. I have not yet discovered the command, but the utility works as a tool for tracing it manually.

Check out the freely downloadable utility: http://www.entechtaiwan.com/lib/softmccs.shtm


(Observe the Command-Line Editor -- the "Send Datagram" information!)

I can play with common settings such as modifying my monitor's internal Contrast setting, from a software program, over VESA DDC -- using the VESA MCCS

VESA = Video Electronics Standards Association, an organization for standardization of display stuff
MCCS = VESA Monitor Command Control Set (http://www.google.com/#hl=en&q=VESA+MCCS )
VCP = Virtual Control Panel code (0x00 through 0xFF); each value controls a setting (e.g. brightness or contrast). Some are undocumented and not available in the monitor menus.

softMCCS tells me what commands are supported over VESA DDC.

ASUS VG278H Capabilities String (VESA MCCS)
((prot(monitor)type(lcd)modelVG278Hcmds(01 02 03 07 0C F3)vcp(02 04 05 06 08 0B 0C 10 12 14(01 04 05 08 0B) 16 18 1A 60(01 03 04) 62 6C 6E 70 A8 AC AE B6 C6 C8 C9 D6(01 04) DF FE)mccs_ver(2.1)asset_eep(32)mpu(001)mswhql(1)))

BENQ XL2411T Capabilities String (VESA MCCS)
(prot(monitor)type(lcd)model(xl2411t)cmds(01 02 03 07 0c e3 f3)vcp(02 04 05 08 0b 0c 10 12 14(04 05 08 0b) 16 18 1a 52 60(01 03 11) ac ae b2 b6 c0 c6 c8 c9 ca(01 02) cc(01 02 03 04 05 06 08 09 0a 0b 0d 12 14 1a 1e 1f 20) d6(01 05) df)mswhql(1)mccs_ver(2.0))

There's a "Command-Line" editor in softMCCS that allows me to transmit manually-typed VESA DDC commands, so I can experiment with these numbers and see if any of them is LightBoost. It's going to be a little time consuming, but I figured out that softMCCS requires you to prefix a datagram with a 0x03 (hex). Then you can use any VCP command found in the VESA MCCS (the PDF can be downloaded via Google Search).

It says that contrast setting is via command 0x12 (hex). The setting is 2 bytes, so therefore:

03 -- First byte seems always 03 for a write command via softMCCS
12 -- The VCP command for "Contrast" (list of VCP commands are found in the VESA MCCS PDF)
00 -- upper byte for contrast value
5C -- lower byte for contrast value (5C hexadecimal = contrast 92 in decimal)

I did a test... This datagram works!
So when I transmit "03 12 00 5C" to my ASUS VG278H, it works! It sets my OSD Contrast to "92".
(I chose Contrast as a first test, since I can't adjust brightness when LightBoost is enabled)
So, this confirms I can control some aspects of my monitor via the free download, softMCCS, for testing.
NOTE: Since I'm using the EDID override, the BENQ shows as an ASUS too, except when I peek more closely -- then I see the difference in the Capabilities Strings.

To read a VCP setting instead of writing one, I send a "01" instead of "03". So to read the Contrast setting, I send "01 12". The relevant values will be the last few hex values displayed, excluding the final byte (the checksum byte). I immediately notice the 5C value.
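Incidentally, the checksum byte itself can be predicted: from logged replies in this thread, it appears to be the XOR of every reply byte together with the 0x50 "virtual host address". A small sketch (this rule is my inference from the logs, not a quote from the spec):

```python
# DDC/CI reply checksum, as inferred from softMCCS logs in this thread:
# XOR all reply bytes together with the 0x50 virtual host address.

def ddcci_reply_checksum(reply_bytes):
    checksum = 0x50
    for b in reply_bytes:
        checksum ^= b
    return checksum

# A logged luminance (VCP 10) read reply, minus its final checksum byte:
reply = [0x6E, 0x88, 0x02, 0x00, 0x10, 0x00, 0x00, 0x64, 0x00, 0x10]
print(hex(ddcci_reply_checksum(reply)))   # 0xd0 -- matches the logged D0
```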

One way to figure out the LightBoost command is to find VCP codes that exist in both Capabilities Strings (BENQ XL2411T and ASUS VG278H) but are not documented in the MCCS .pdf file, and then attempt to control those values. It might, however, be a value that's not part of the Capabilities Strings at all, or a signalling method other than MCCS.

However, MCCS is something we can work from (on a free basis) for now.
Happy experimenting! (I shall continue to do so, but I need help.)
Keep the factory reset code handy. (I disclaim any damage from sending random undocumented control codes to your monitor.)
 
So what happens if I use that string with an ATI card?

Doesn't the string change when you disable 3D mode and thereby disable LightBoost?
Why not test that, or am I thinking too easy here?
 
For ASUS VG278H
Playing more with softMCCS to try to figure out the LightBoost control code.

In the MCCS PDF file, it says VCP values E0 to FF are manufacturer-specific VCP commands. I used the 01 read command with 120Hz LightBoost versus no LightBoost, taking the last 4 bytes of the read result (excluding the final byte, which is the checksum byte). I ignored VCP values that returned FF FF 00 00, which suggests they are unsupported. Below are what seem to be the manufacturer-specific VCPs for the ASUS VG278H, with LightBoost-specific behaviors:

Command = Result
01 E0 = 00 03 00 01 (same non-LB and LB)
01 E1 = 00 01 00 00 (same non-LB and LB)
01 F9 = 00 03 FF 07 (same non-LB and LB)
01 FA = 00 03 00 0A (same non-LB and LB)
01 FB = 00 03 00 00 (Becomes 00 03 07 33 with LightBoost, appears to be a 2-byte write)
01 FC = 00 03 00 00 (Becomes 00 03 07 D1 with LightBoost, appears to be a 2-byte write)
01 FD = 00 03 01 01 (same non-LB and LB)
01 FE = 00 01 05 55 (Becomes E0 01 05 55 with LightBoost, appears to be a 4-byte write)

I turned off LightBoost again and verified the values reverted back.
Now, I attempted to trigger LightBoost by writing a few bytes at a time, sending these three commands, one by one:
03 FB 07 33
03 FC 07 D1
03 FE E0 01 05 55
And though reading back with 01 verifies the changed values correctly, LightBoost is NOT triggered, alas.

Still need to keep hunting.... but someone else may want to give this a go!
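If anyone else wants to repeat this probing, the read commands for the whole manufacturer-specific range don't need to be typed from memory; a one-liner can generate the list to work through:

```python
# Generate softMCCS command-line strings for reading each manufacturer-specific
# VCP code (E0 through FF), matching the "01 E0" style used above.
commands = ["01 %02X" % vcp for vcp in range(0xE0, 0x100)]
print(commands[0], commands[-1], len(commands))   # 01 E0 01 FF 32
```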
 
Are you talking about the Capabilities String? No, it doesn't work that way. That's an incoming (received) capabilities string, not an outgoing (sent) string. It stays constant no matter what the monitor settings are; it just tells the software what MCCS settings are supported.

Theoretically, sending the correct MCCS values *might* trigger LightBoost completely independently of the graphics card, if it's an MCCS-controllable setting. It might have monitor-side nVidia verification of some kind (I'm not sure that's even possible, but there's a lot of neat little technology behind the scenes that I didn't know about until reading the VESA MCCS). My goal is making LightBoost more convenient, even if it stays nVidia-specific. That is, if LightBoost is not controlled by another means (e.g. an embedded timing parameter, such as a Timings Flags bit in the same byte normally used to configure polarity and sync-on-green, etc. That would not be controllable via MCCS, but theoretically controllable via the PowerStrip API, assuming the graphics card was still supported).
 
I tried it on BenQ XL2420T. No different VCP values with Light-Boost on or off.:(

Light-Boost on.txt
Light-Boost off.txt
Thanks for pitching in some help!
Try testing undocumented/unreported VCP values. To read the value of VCP E0, you need to read it manually by typing "01 E0" in the command line and observing the output. Repeat the command for all unassigned codes, for both LB on and LB off. Do this for the manufacturer-assigned codes E0 through FF, as well as any undocumented/unreported values between 00 and DF. The unassigned codes are the gray rows in the table beginning at page 116 of VESA MCCS v3 (mccsV3.pdf), in chapter "11 VCP Code Index".

Keep figuring this out along with me!
....This is to create an app that turns on/off LightBoost.
-- Turning on LightBoost via a hotkey
-- Turning on LightBoost via an easy click on system tray
-- Turning on LightBoost automatically only for certain games, etc.

To make LightBoost easy and more convenient, without INF or REG files, even without an emitter.
 

I think LightBoost must be enabled via a DDC/MCCS signal that activates a flag. The VESA Coordinated Video Timings (CVT-R) are the same with LB on or off. Once my BenQ activates LightBoost (green LED is on), I can't turn LightBoost off anymore. A system restart or disconnecting the DVI cable has no effect. LightBoost turns on automatically when the monitor is running @120Hz. LightBoost stays active even when connecting the DVI cable to a second PC with an ATI graphics card.

To deactivate LightBoost, I must power off the monitor.


I have already tried it. Please look at the txt files. No different values @ VCP 01 E0 --> 01 FF.
 
Interesting -- that would prove it's not a video-signal-controlled feature. LightBoost remains enabled with ATI if you swap plugs from nVidia to ATI! nVidia would not be happy about that, however... (even though we simply want improved convenience of LightBoost, even for nVidia cards)

Oh, now I looked closely: you did all 256 commands with LB ON/OFF. That would have been a lot of manual work -- you manually typed commands 512 times! I did, however, notice two differences when I ran WinMerge on the two files. (WinMerge is a popular free diff utility: install it, select both files in Windows Explorer, right-click and select "WinMerge", and it instantly displays a color-coded list of differences between the two text files.) Just helping out readers -- this tool might speed up future research.

LightBoost OFF
*********************************************
Command: 01 0b
Answer: 6E 88 02 00 0B 01 5D DC 00 01 3E
*********************************************
Command: 01 10
Answer: 6E 88 02 00 10 00 00 64 00 10 D0
*********************************************

LightBoost ON
*********************************************
Command: 01 0b
Answer: 6E 88 02 00 0B 01 5D DC 00 02 3D
*********************************************
Command: 01 10
Answer: 6E 88 02 00 10 00 00 64 00 64 A4
*********************************************

That said, 0x0B is color temperature and 0x10 is luminance. They would have nothing to do with LightBoost, methinks. (hmmm, custom color temperature easter egg? Let me try.)
IDEAS:
Maybe try writing values higher than 100 for Luminance -- maybe it's a special Luminance setting?
Maybe try writing unusual values for Color Temperature -- maybe it's a special Color Temperature setting?

Perhaps it might be some more complex write-only VCP command that can't be read, or it might be yet another, more complicated, undocumented DDC/CI mechanism than the industry standard MCCS method. For example, maybe a different prefix than 01h and 03h .... If only we knew! We can keep experimenting, perhaps create an Arduino unit to snoop the DDC channel (see Hacking Displays Made Interesting), so that we have output that allows us to determine the signalling mechanism needed to turn on LightBoost. Alas, I've got little time to create this as I would prefer to debug my input lag meter first, and finish my motion tests, so maybe some other volunteer could create the Arduino circuit to snoop the DDC/CI channel?
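The WinMerge comparison could also be scripted for future captures; a sketch that assumes the Command:/Answer: log format shown above:

```python
import re

# Parse a softMCCS-style log ("Command: ..." / "Answer: ..." pairs) into a
# dict, then report commands whose answers differ between two captures --
# the same thing the WinMerge comparison above did by hand.

def parse_log(text):
    pairs = re.findall(r"Command:\s*(.+?)\s*\n\s*Answer:\s*(.+)", text)
    return {cmd.strip(): ans.strip() for cmd, ans in pairs}

def diff_logs(off_text, on_text):
    off, on = parse_log(off_text), parse_log(on_text)
    return {cmd: (off.get(cmd), on.get(cmd))
            for cmd in sorted(set(off) | set(on))
            if off.get(cmd) != on.get(cmd)}

lb_off = "Command: 01 10\nAnswer: 6E 88 02 00 10 00 00 64 00 10 D0\n"
lb_on  = "Command: 01 10\nAnswer: 6E 88 02 00 10 00 00 64 00 64 A4\n"
print(diff_logs(lb_off, lb_on))   # shows the luminance answer changing
```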

[edit] Thread Post #500 FTW!
 
The BENQ XL2411T and the ASUS VG248QE are the 1ms LightBoost 2 monitors with the best-tested, near-zero-blur results, as far as I've seen in posts. Reviews are still coming in on the ASUS VG248QE, which has the same panel as the BenQ but might have different circuitry. I posted some questions in the ASUS VG248QE thread asking for comparisons between the two on various points (crimson tint?, brightness, inversion artifacts, color saturation, input lag, etc. during LB2-synced 2D mode).
 
Based on the figures and reports of eye-strain, I'll probably be giving this a miss.

VG278h - 2ms strobe = 24% duty cycle @ 120hz
XL2411t - 1.4ms strobe = 17% duty cycle @ 120hz

A duty cycle of <40% at 180Hz PWM is considered bad. The above figures would cause more eye-strain than the worst PWM LED display at 0% brightness.
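Those duty-cycle figures are easy to verify; a quick sketch of the arithmetic:

```python
# Strobe duty cycle = strobe length / refresh period, at a given refresh rate.
def duty_cycle_pct(strobe_ms, refresh_hz):
    return 100.0 * strobe_ms * refresh_hz / 1000.0

print(round(duty_cycle_pct(2.0, 120)))   # 24  (VG278H, 2ms strobe)
print(round(duty_cycle_pct(1.4, 120)))   # 17  (XL2411T, 1.4ms strobe)
```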
 
How does that compare to an FW900 CRT at 85Hz and higher? Because that is awesome outside of the other CRT tradeoffs (geometry, size, calibration tinkering, etc.).
How does that compare to people using 3D gaming with glasses?
How much eyestrain do you get when your locked-on gaze sees the entire viewport go blurry every time you move your FoV? (Your eyes always try to focus away blur -- I know mine do, and it's instant.)
FoV blur may even contribute to some people actually feeling sick from FoV movement + FoV blur in some FoV-cinematic games, and in games with the FoV tied to the character's head animation cycles, etc.
 
The BENQ XL2411T Crimson Fix

Zero crimson tint with BENQ LightBoost!
On the subject of BENQ color calibration, I cancel out the crimson tint via the nVidia Control Panel by doing the following:

Via Monitor Menus
Monitor's Contrast Setting = 65 .... (If using nVidia Gamma 1.10)

Via nVidia Control Panel - "Adjust Desktop Color Settings"
Adjust them individually, R, G, B.
---
R Contrast = 30%
G Contrast = 45%
B Contrast = 30%
---
R Brightness = 10%
G Brightness = 40%
B Brightness = 10%
---
R,G,B Gamma = 1.10

Make sure to readjust the monitor's OSD contrast every time you change Gamma. nVidia Gamma 1.10 worked best with a monitor OSD Contrast of 65.
With these settings, I can see the difference between RGB(0,0,0) and RGB(1,1,1), without a wrongly tinted color in backgrounds or in dim colors.

Test Patterns Used
-- Lagom Contrast Pattern
-- Lagom Black Level
 
Based on the figures and reports of eye-strain, I'll probably be giving this a miss.
---
VG278h - 2ms strobe = 24% duty cycle @ 120hz
XL2411t - 1.4ms strobe = 17% duty cycle @ 120hz
---
A duty cycle of <40% at 180hz PWM is considered bad. The above figures would cause more eye-strain than the worst PWM LED display at 0% brightness.
Depends on what your eyestrain is caused by. CRTs have similar duty cycles due to phosphor decay. Eyestrain is different in games (fast motion) versus on the desktop (static objects). During videogames, three people reported less eyestrain with LightBoost than without, because of zero motion blur. You can turn LightBoost on/off after exiting games.

Although several people reported eyestrain, others reported none.
I have had some VERY surprising reports of people getting less eyestrain playing videogames with LightBoost than without.

But this isn't applicable to everyone.
Are your eyes bothered more by flicker?
Are your eyes bothered more by motion blur? (Focussing muscles getting tired trying to focus well on motion-blurred objects)
....Everyone seems different!

Eyestrain with CRTs is very common, depending on the person.
As you may remember from the CRT days, some people had eyestrain even with a 120Hz CRT.

2d Lightboost has less eye strain than when my LED monitors are in default mode.
^^^^ This. (and multiple other reports!)
And myself -- I don't have eyestrain differences.

Also.... 3D shutter-based stereoscopic is almost universally MORE STRESSFUL on eyes than using 2D LightBoost

If your eyes are not bothered by CRT, or you have already used nVidia 3D Vision 2 before (shutter glasses), then you've already experienced worse flicker, and LightBoost 2D zero motion blur will probably not bother you. You won't know unless you try it yourself.
____

Every reader -- as a good favour -- if people ask "Does LightBoost give you eyestrain?" on any Internet forum, tell them the widely-agreed truth, "It depends on the person" (especially on whether their eyes are used to CRT), and link to this post (http://hardforum.com/showpost.php?p=1039557336&postcount=506) to be fair that LightBoost eyestrain does not affect everyone. It is very person-dependent.

It is CRITICAL to put this in perspective: LightBoost is designed for 3D, and 3D eyestrain is worse (due to 60 Hz flicker per eye) than 2D LightBoost (which flickers at a far more comfortable 120Hz). People had eyestrain with CRTs long before the days of LCD, and some young game players have never used a CRT -- a display that has flashed PWM-like strobes (phosphor illuminate-and-decay) at people's eyes since the 1940s and 1950s. Everyone's eyes are different, and different people have different flicker sensitivity: some people can't stand fluorescent light flicker, while others are unaffected. Therefore, it is wrong to universally say that everyone will have eyestrain -- "it depends".

Also, LightBoost can be turned off, and most LightBoost monitors have great 120Hz modes even without it. It is worth observing this objectively if you are in the market for 120Hz anyway and want a bonus feature handy to try out, since LightBoost is a 'free' feature built into more than half of year-2012 and year-2013 model 120Hz TN monitors, and can be turned on and off as needed.
 
Last edited:
Crimson tint fix is great news. One of the things that would really have bothered me a lot about these. Thanks.
 
Crimson tint fix is great news. One of the things that would really have bothered me a lot about these. Thanks.
Yep. My BENQ LightBoost now looks clearly superior to the ASUS VG278H LightBoost in almost every single category (including color). Vertical color shift is less objectionable on the BENQ XL2411T than on the ASUS VG278H. Contrast is duller with LightBoost, but all things equal, the BENQ now appears to look better. However, I have not yet re-calibrated my colors during ASUS VG278H LightBoost for a more apples-to-apples calibrated comparison, and that is something I shall do next.
 
Last edited:
Yep. My BENQ LightBoost now looks clearly superior to the ASUS VG278H LightBoost in almost every single category (including color). Contrast is duller with LightBoost, but all things equal, the BENQ now appears to look better. However, I have not yet re-calibrated my colors during ASUS VG278H LightBoost for a more apples-to-apples calibrated comparison, and that is something I shall do next.

Interesting news. How is the eyestrain from PWM flicker on the 11T BenQ during LightBoost? I guess I never used mine long enough to test that. I know I am sensitive to PWM flicker. I set my Catleap at 100% brightness for the fastest Hz and have to dim the image down via the NVIDIA control panel. Not the best solution, but it's better than eye-strain.
 
Interesting news. How is the eyestrain from PWM flicker on the 11T BenQ during LightBoost? I guess I never used mine long enough to test that. I know I am sensitive to PWM flicker. I set my Catleap at 100% brightness for the fastest Hz and have to dim the image down via the NVIDIA control panel. Not the best solution, but it's better than eye-strain.
No, I haven't noticed any eyestrain.

I can tell the display is flickering if I move my eyes around fast (the stroboscopic effect / phantom array effect, explained at bottom of Science & References)

I sometimes get surprising reports: more eyestrain with 360Hz non-motion-optimized PWM than with 120Hz motion-optimized LightBoost, at least when looking at motion. It seems to have something to do with eye-focussing muscles getting tired trying to focus on motion-blurred objects. These people then turn off LightBoost when going back to the desktop.

Other people say they hate LightBoost. Invariably, most (not all) also hated CRT too. It truly depends on the person.
A CRT also behaves like PWM on your eyes at the refresh rate (60Hz, 85Hz, etc.), although with a softer curve from the phosphor illuminate-and-decay cycle, rather than LightBoost's digital on-and-off pulses. Monitor manufacturers should ideally experiment with softening PWM using a simple capacitor. That might make PWM more comfortable.

I get more eyestrain from the ASUS VG278H's vertical color shift, for example. It's rather annoying compared to my XL2411T. Which is too bad, because the ASUS is bigger and the BENQ is smaller. Gotta choose. I have both side by side, and can't decide which one becomes my Primary Monitor. (I game with only 1 monitor.) Eventually I'll relocate one of the monitors...

Most PWM backlights in LCDs run at a constant frequency at all refresh rates. PWM stands for Pulse Width Modulation: the width of each flash changes when you change brightness. At 100% brightness, the pulse widens until it touches the next pulse, so there's no black gap between pulses. That is probably what happens when you set your Catleap to 100%. The problem with setting brightness to 100% and adjusting nVidia downwards is a lower contrast ratio, because your LCD blacks stay the same. Some people use a neutral density filter instead, but that can be a problem too (glossy sheen, etc).
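A minimal sketch of that pulse-width arithmetic, showing why the black gap vanishes at 100% brightness (the fixed 360 Hz PWM frequency is just an illustrative assumption, not a measured Catleap value):

```python
# Why 100% brightness removes PWM flicker gaps on a fixed-frequency
# PWM backlight: the pulse width scales with brightness, and at 100%
# the pulse fills the whole period.

def pwm_gap_ms(brightness: float, pwm_hz: float = 360.0) -> float:
    """Dark gap per PWM cycle, in ms, for a given brightness (0.0-1.0)."""
    period_ms = 1000.0 / pwm_hz
    pulse_ms = brightness * period_ms   # wider pulse at higher brightness
    return period_ms - pulse_ms          # 0.0 at 100% -> no black gap

print(pwm_gap_ms(1.0))              # 0.0 -> backlight effectively steady
print(round(pwm_gap_ms(0.5), 3))    # ~1.389 ms of darkness per cycle
```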
 
Last edited:
Yup, the contrast ratio does suffer. During the day I leave the NVIDIA settings untouched and run the monitor at full brightness. I only have to worry about the lower contrast at night, when I need to dim the screen while leaving the Catleap itself set at 100% brightness.
 
This or the Benq XL2411T?

They will be pretty much identical but the Asus is available in the US and for a pretty cheap price IMO. I am interested in playing around with the control panel settings that Mark came up with to get rid of the color tint when in LB mode. Since the screen is so cheap I will most likely also convert it to a gloss screen with a real protective layer on it and not just leave the polarization layer unprotected.
 
They will be pretty much identical but the Asus is available in the US and for a pretty cheap price IMO. I am interested in playing around with the control panel settings that Mark came up with to get rid of the color tint when in LB mode. Since the screen is so cheap I will most likely also convert it to a gloss screen with a real protective layer on it and not just leave the polarization layer unprotected.

Guess I'm concerned with Asus quality control. Uniformity and all that jazz. I have a cheap Asus TN now that I've been using. Mainly purchased due to budget and all that. But with my tax refund and stuff I'm looking to spring a bit more for a good monitor with 3D and 120hz.

Annoys me that the XL2411T isn't available in the US yet. I was almost going to buy the XL2420T as well (which is a lot more for some reason).

....damn out of stock now.
 
Annoys me that the XL2411T isn't available in the US yet. I was almost going to buy the XL2420T as well (which is a lot more for some reason).

Hmm, I was going to say you should check the prices again, but I see now that there is an actual price difference of 120 euros between the 11T and 20T, unlike the US where they are similarly priced. After I factor in shipping, the price of importing an 11T is similar to the 20T's, which suggests the 11T is specifically designed to cater to the EU market by offering 20T-class performance at a lower price point.

The tradeoff for the lower price point in the EU market is that the 11T doesn't have an S-Switch. It also lacks a DisplayPort, audio port and the additional HDMI/USB ports the 20T has.


Thank you very much for your diligence and hard work.
 
Last edited:
Hmm, I was going to say you should check the prices again, but I see now that there is an actual price difference of 120 euros between the 11T and 20T, unlike the US where they are similarly priced. After I factor in shipping, the price of importing an 11T is similar to the 20T's, which suggests the 11T is specifically designed to cater to the EU market by offering 20T-class performance at a lower price point.

The tradeoff for the lower price point in the EU market is that the 11T doesn't have an S-Switch. It also lacks a DisplayPort, audio port and the additional HDMI/USB ports the 20T has.

The lack of some connections and stuff doesn't bother me, I just want the better display. The XL2411T seems to be the better one, 1ms and all that (if that matters).
 
The lack of some connections and stuff doesn't bother me, I just want the better display. The XL2411T seems to be the better one, 1ms and all that (if that matters).
FYI -- many people doubt the 1ms claim. Although 1ms is an exaggeration in /some/ ways, the benefit is real: the improvement is clearly apparent as reduced crosstalk between refreshes.
This is important if you're a stereoscopic 3D user.

Surprisingly, 1ms does matter, at least indirectly -- it results in less crosstalk between refreshes, since the pixels are at least 99.5%+ finished refreshing before the next refresh.

It's quite obvious on black-and-white moving edges and text. This results in much better 3D stereoscopic usage, and it also translates to the elimination of the faint double-image effect during LightBoost.

I own the BENQ XL2411T (1ms), the Samsung 245BW (older 2ms), and the ASUS VG278H (newer 2ms). Motion tests clearly show the benefit of the incremental improvements in technology. Although we can't perceive 1ms directly, it matters because the pixels are more completely refreshed for 3D stereoscopic (less leakage between eyes), and more completely refreshed for backlight strobes (the backlight flashes on a completely refreshed frame -- the pixel persistence is kept in total darkness while waiting for pixels to finish transitioning -- see the high speed video of LightBoost bypassing pixel persistence as the motion blur limiting factor). The faster the pixels finish, and the faster RTC can finish rippling/bouncing/overshooting in total darkness before the backlight is strobed, the better.

The 1ms panels make a clearly noticeable difference for stereoscopic 3D (virtually no leakage between eyes)
The 1ms panels with LightBoost enabled allow the "1-pixel-gap" PixPerAn chase test -- CRT territory -- it's quite neat to see this happening on an LCD.
The VG278H can do it too, but there's some faint crosstalk artifact that's virtually invisible on the XL2411T.
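The timing budget described above can be sketched with simple arithmetic, using the panel figures quoted in this thread (the subtraction model and function name are my own illustration, not BENQ/ASUS firmware behaviour):

```python
# Back-of-envelope strobe timing at 120 Hz: pixels must finish
# transitioning in the dark before the backlight flashes. The more
# dark settling time left over, the less crosstalk between refreshes.

def dark_time_ms(refresh_hz: float, transition_ms: float, strobe_ms: float) -> float:
    """Dark settling time per refresh, after the pixel transition and strobe."""
    period_ms = 1000.0 / refresh_hz
    return period_ms - transition_ms - strobe_ms

# XL2411T: 1 ms transition, 1.4 ms strobe -> ~5.9 ms of darkness to settle
print(round(dark_time_ms(120, 1.0, 1.4), 1))
# VG278H: 2 ms transition, 2.0 ms strobe -> ~4.3 ms
print(round(dark_time_ms(120, 2.0, 2.0), 1))
```

On this rough model, the faster panel leaves noticeably more dark time for RTC overshoot to settle before the strobe, which is consistent with the XL2411T showing less crosstalk.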
 
Thank you very much for your diligence and hard work.
I just got information that this may be panel-dependent. Some panels required a very different adjustment than this one, so I just want to add that the settings I've posted may not be one-size-fits-all. More testing is needed to find out whether it truly varies from panel to panel, or only between specific manufacturing runs, etc.
 
absolutely love this thread. I've been struggling with motion blur on an older Samsung TV/monitor LCD for a while now, since I got back into gaming with CS:GO.

Attempting to focus on a moving body over and over while trying to shoot has gotten tiring and frustrating. Going to buy one of these monitors and free myself from bondage.
 