ASUS/BENQ LightBoost owners!! Zero motion blur setting!

Interested in purchasing the VG248QE monitor... is LightBoost guaranteed to work at 1920x1080?
Yes, LightBoost will definitely work, assuming you have a recent nVidia card supporting 3D Vision, use the full nVidia drivers with 3D Vision (even if you don't use stereoscopic 3D), and run a supported version of Windows (Vista, 7, or 8).
 
I must say this: I was skeptical and unsure as well before purchasing this monitor, especially since I had to order it overseas and pay the shipping accordingly. Will I be able to run all games with LB with no issues? Is it guaranteed/stable? Will I be limited to something in my games?

NO NO NO, it's extremely easy to turn on, especially now with CRU (takes a minute?), and it just works flawlessly; there is no reason IMO to use 144 Hz non-LB instead. Games/resolutions: I haven't run into any trouble, and I'm playing some old games based on the Q3 engine, plus Source engine games.

YES YES YES, colours are bad if you come from a good IPS monitor. Other than that? Don't know what to say that hasn't already been said.
 

*EVERY* game will run in LightBoost mode, provided you use CRU and the lightboost.bin file to add the custom timings and remove all of the refresh rates under 100 Hz. This will force not only those DirectX 10 60 Hz games (remember Crysis DX10 mode = 60 Hz?) but even old games, like those Windows 95/98/XP games, to run with LightBoost too.

The DX10 60 Hz thing actually has nothing to do with LightBoost directly, since you can also remove the same <100 Hz timings and add non-LightBoost 100/120/144 Hz modes (if you didn't care about LB) to stop DX10 games from running at 60 Hz, without third-party override tools. So you can use CRU.exe even on a CRT to remove the 60 Hz modes, for example.

The only thing I've noticed about LightBoost is that 60 fps @ 120 Hz on a LightBoost LCD looks slightly worse (a more distinct double image) than 60 fps @ 120 Hz does on a CRT. I'm not sure why that is, though. But I've gamed on a CRT for many, many years, and believe me, I can tell the difference.

I'm sure Mark can explain more, but it's probably just due to inherent differences in the electronics / pixel / phosphor hardware... after all, an LCD isn't a CRT :p
 
I didn't really understand what you meant with DirectX and 60fps/Hz etc; I haven't experienced any of this anyway :D Running 120 FPS+ all the time in the (SH!TTY) games I play :D

Anyway, out of curiosity: do you find it as good as or better than the CRT you had?
 
The only thing I've noticed about LightBoost is that 60 fps @ 120 Hz on a LightBoost LCD looks slightly worse (a more distinct double image) than 60 fps @ 120 Hz does on a CRT. I'm not sure why that is, though. But I've gamed on a CRT for many, many years, and believe me, I can tell the difference.
Very interesting. There are some common-sense theories to explain this. The artifact is intrinsically very similar -- a double-image effect at half frame rate on a stroboscopic display. However, the distinctness of the double-image effect can look slightly different if you are extremely sensitive to these subtleties.

LightBoost has the faint-sharp-ghost effect, and sometimes the scrolling-checkerboard-pixel-pattern artifact (worse on the VG278HE / XL2420T than on the VG278H / XL2411T / VG248QE). CRT can have non-crispness and the green phosphor ghosting artifact. These could possibly explain part of the differences. Worse LightBoost artifacts can make the motion look harsh (amplified LCD inversion artifact, i.e. increased checkerboard pixel patterns, etc.). That isn't generally noticeable in games with my VG278H (Contrast=65%, LB=50%) or my XL2411T (Contrast=65%, LB=50%), but I see it more if my VG278H is at Contrast=90%, and it's even worse on the VG278HE (with the "E"). This can make the motion look harsher on those models.

Also, LCDs have essentially 'perfect' geometry (no distortions, same focus all over the surface, etc.), but an LCD can look more pixelated: more visible aliasing, requiring AA to be turned on more often than on a CRT to eliminate pixel jaggies. With the near-complete motion blur elimination of LightBoost, the aliasing even remains during motion; you can see the pixel jaggies during fast movement. This LCD crispness and lack of CRT ghosting can make the double-image effect look far more crisp on LightBoost than on CRT. If you compare a good LightBoost monitor with little LightBoost artifact (a well-calibrated XL2411T at LB=10%) against a CRT with a green ghosting artifact (e.g. the FW900's medium-persistence phosphor), the LCD can potentially show clearer motion and less motion blur than the CRT. Some people have remarked that LB=10% can sometimes outperform the Sony FW900 in terms of motion blur. The sharper the images and motion are, the easier it is to see the double-image effect at higher framerates. Unfortunately, it also makes it easier to see the temporal artifacts (e.g. LCD inversion patterns, like checkerboard pixel patterns). Fortunately, it isn't bad on many new LightBoost monitors...

Different strobed displays can show differences in the behaviour of half framerate. For example, 30fps@60Hz on plasma versus CRT: it's still a double-image effect, but it looks "noisier" on plasma than on CRT because of the dithered subfield refreshes.

So in short, the double-image effect at 30fps@60Hz, which also manifests at 60fps@120Hz, looks better on a CRT because it's a cleaner and softer double image, thanks to the lack of temporal artifacts (e.g. plasma subfields, LCD inversion, checkerboard pixel patterns) and CRT's natural soft focus/antialiasing and slight amount of ghosting. Even so, the half-framerate artifacts still look more similar to CRT behavior than traditional non-LightBoost 120 Hz LCD does.

I didn't really understand what you meant with DirectX and 60fps/hz etc, I haven't experienced any of this anyway :D Running 120FPS+ all the time in the (SH!TTY) games I play :D
Not everyone can closely notice the subtle artifacts of half-framerate operation (30fps@60Hz, or 60fps@120Hz). The most common artifact of half-framerate operation on flicker displays (CRT, plasma, LightBoost) is the double-image effect; that's the easiest part to see. The subtler differences are harder to spot in gameplay, but some people pay darn close attention to motion behavior, staring closely at the screen and examining motion test patterns in PixPerAn, even though you may not see most of them during normal gameplay.

____

Multiple Edge Motion Artifact

Example: The double-edge motion artifact at 30fps @ 60Hz on a CRT during VSYNC ON operation

For flicker displays:
The number of copies of edges in motion is (flickers per second / fps). As a rule of thumb, the faster the motion, the easier it is to see the multiple edges (until a certain point where the motion is too fast for your eyes to track reliably). Also, the higher the numbers (e.g. higher Hz), the faster the motion needs to be for you to notice the artifact. And it's much easier to see the multiple-edge artifact with VSYNC ON than with VSYNC OFF.
-- CRT -- If you're consistently running at half the framerate of your refresh rate (flicker rate), you get 2 flickers per frame, causing a double-edge effect (e.g. 30fps@60Hz, 60fps@120Hz). If you're consistently running at a third of your refresh rate, you get 3 flickers per frame, causing a triple-edge effect (20fps@60Hz, 40fps@120Hz).
-- LightBoost strobes -- LightBoost strobes at 120 Hz by default, and if your fps is 60, you get 2 flickers per frame, leading to a double-edge effect. The LightBoost refresh rate is less flexible than a CRT's: only a 100 Hz to 120 Hz range if you want the stroboscopic feature.
-- LCD Brightness PWM strobes -- Likewise, you can get multiple-edge motion artifacts even at 120fps@120Hz on non-LightBoost monitors when Brightness=0%, due to the PWM artifact (flicker modulation used to generate dimmed brightness). For example, with 360 Hz PWM you get 3 strobes per refresh, so you get a triple-edge artifact (see LCD Motion Artifacts 101).
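The rule of thumb above can be sketched in a few lines of Python. `edge_copies()` is a hypothetical helper name, not from any real tool; it just divides the light-flash rate (the strobe, refresh, or PWM rate, whichever is modulating the light) by the frame rate:

```python
# Sketch of the multiple-edge rule of thumb: copies of a moving edge
# seen on a flicker display = flashes per second / frames per second.

def edge_copies(flash_hz: float, fps: float) -> float:
    """Number of copies of a moving edge the eye sees while tracking."""
    return flash_hz / fps

# The examples from the list above:
print(edge_copies(60, 30))    # CRT, 30fps@60Hz         -> 2.0 (double edge)
print(edge_copies(120, 60))   # LightBoost, 60fps@120Hz -> 2.0 (double edge)
print(edge_copies(120, 40))   # CRT, 40fps@120Hz        -> 3.0 (triple edge)
print(edge_copies(360, 120))  # 360 Hz PWM at 120fps    -> 3.0 (triple edge)
```

The same division also hints at why VSYNC ON makes the artifact easier to see: the framerate locks to an exact integer fraction of the flash rate, so the edge copies land in stable positions instead of smearing around.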

For non-flicker displays:
Half-framerate operation on non-flicker displays (e.g. 60fps@120Hz on non-LightBoost) can often look identical or almost identical to 60fps@60Hz, because the sample-and-hold nature shows each refresh for exactly the same amount of time. Differences in pixel response time can play a role (e.g. different RTC at 60 Hz than at 120 Hz, faster LCD refresh leading to poorer colors, etc.), but otherwise 60fps@120Hz would look identical to 60fps@60Hz in terms of motion blur and artifacts.

(Interesting side note: people have reported flicker eyestrain for CRT, for PWM, or for LightBoost, but for some people it only happens under certain conditions. Some people don't get eyestrain with CRT flicker at 120 Hz but do get eyestrain with PWM flicker at 360 Hz. This suggests your eyes might be more bothered by the multiple-edge motion artifact than by flicker itself. Keeping LightBoost at 120fps@120Hz will eliminate these artifacts and behave like a 120 Hz CRT, reducing eyestrain if your eyes are the type that prefer CRT flicker over PWM flicker, especially if you turn the LightBoost OSD percentage down to avoid brightness-related eyestrain. So some people report more eyestrain, while others report less, depending on what their eyes are bothered by.)
 

Thank you; that definitely makes sense.
 
Well I have definitely got the 120 Hz / Lightboost bug after getting the BenQ XL2411T. Since I'm still a little underwhelmed by its picture quality, when I saw a refurbished Samsung S23A700D on ebay for a reasonable price I bought it. So, I have some comparative data to share.

The Samsung's out-of-the-box picture quality is much better than the BenQ's (reasonable gamma, for one), and the difference remains after calibration. At least some of it is the glossy panel; the colors are more vivid. However, the usual caveat about reflections applies. For example, right now, browsing this forum, I can see my reflection in the darker colors. This monitor really requires a light-controlled room, while the BenQ is usable everywhere.

After turning on frame sequential 3D, the picture on the Samsung gets quite dim and very very blue. On default settings, there is about 180% blue compared to 6500K target. The maximum luminance I can get out of the monitor in this mode was about 64 cd/m^2, and after turning down green and blue to match the maxed-out red and with a calibrated profile, maximum luminance is just 55 cd/m^2 which is a bit dim, even for me. Ok for gaming in a dark room, but not much else. The BenQ Lightboost is much more usable, and the lightboost/frame sequential 3D color shift evens out the picture quality differences although the Samsung still looks a fair bit more "punchy".

When it comes to settings, I tried the settings for SA950D posted here as a starting point and they turned out to be pretty good when it came to setting contrast to the "correct" value. This is the closest I could get to an accurate picture before creating colour profile with the ColorMunki software:
Red 100
Green 84
Blue 40
Brightness 100
Contrast 26 (increasing this setting will push color balance out of whack since red output is maxed out)
gamma mode1 (with this, gamma bounced around 2.1 - other modes were more off from the 2.2 target)
Magic Angle Off (I tried Group view as recommended, but it made the picture very dark - measured gamma was around 2.8-2.9. Other settings were not really better than the Off setting either)

I tried an input lag tester exe (for the first time; I probably made some beginner error) to compare the BenQ and the Samsung, and in non-frame-sequential 120 Hz mode the Samsung was about 0-30 ms behind the BenQ. When I turned on frame sequential 3D, the Samsung dropped to 20-40 ms behind the BenQ. I think my input lag tool is not up to the task of accurate measurement, but the comparative difference between normal and "lightboosted" mode seems to indicate that changing to frame sequential mode does add some input lag, as reported. This is not noticeable if you are only using the Samsung, but if you use the BenQ for a while and go back to the Samsung, the mouse feels just a little laggy at first. I think it is a pity; the instant response of the BenQ goes very well with the general smoothness of 120 Hz.

The Samsung has little clouding, just small shadows in the corners of the display. The BenQ has quite a lot of clouding, not all of it due to gamma shift. The TN gamma shift on the Samsung is much less than on the BenQ, perhaps due to the slightly smaller size. The blacks are a little better on the Samsung according to colorimeter measurement - around 0.14 cd/m^2 with 105 cd/m^2, and about the same in Frame Sequential mode, while for BenQ the values are somewhere around 0.18 cd/m^2 and rise somewhat when turning on Lightboost. Perceptually the blacks are much more inky on the Samsung, no doubt due to the glossy panel which is similar to that in the Samsung televisions.
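For what it's worth, the static contrast ratios implied by those colorimeter readings are easy to work out. Note the white level for the BenQ wasn't stated above, so taking 105 cd/m^2 for it too is purely an assumption for comparison's sake:

```python
# Rough contrast-ratio arithmetic from the colorimeter readings above.
# ASSUMPTION: the BenQ's white level is taken as ~105 cd/m^2, since only
# its black level (~0.18 cd/m^2) was stated in the post.

def contrast_ratio(white_cd_m2: float, black_cd_m2: float) -> float:
    """Static contrast ratio: white luminance divided by black luminance."""
    return white_cd_m2 / black_cd_m2

samsung = contrast_ratio(105, 0.14)  # both values measured
benq = contrast_ratio(105, 0.18)     # white level assumed
print(f"Samsung ~{samsung:.0f}:1, BenQ ~{benq:.0f}:1")
# prints: Samsung ~750:1, BenQ ~583:1
```

Either way, both are in the typical TN range; the perceptually inkier blacks on the Samsung come mostly from the glossy panel rather than the measured numbers.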

Originally I thought I would pick one of the monitors and sell the other one on, but now I'm in the tricky situation where if I sold one of the monitors, I would regret some very nice aspect of its performance compared to the other. So I'm keeping both for the time being.

Oh, speaking of bargain monitor mania, I also have a potentially 120 Hz overclockable Qnix PLS monitor on the way from Korea. If FedEx gets around to delivering it instead of sending it to tours of Asia, I will have a third option with its own strengths and weaknesses to choose from. I think in the end I'm going to need a bigger desk :)
 
I didn't really understand what you meant with DirectX and 60fps/Hz etc; I haven't experienced any of this anyway :D Running 120 FPS+ all the time in the (SH!TTY) games I play :D

Anyway, out of curiosity: do you find it as good as or better than the CRT you had?

Some games FORCE the refresh rate to be 60 Hz (usually some DX10 games), regardless of what Windows/the control panel is set to (unless you use third-party tools to override it).

This was first noticed in Crysis, which, when installed on Vista (at the time) and run in DX10 mode, would run at 60 Hz regardless of what monitor you had. CRT users had a ball with that....
 
Oh, speaking of bargain monitor mania, I also have a potentially 120 Hz overclockable Qnix PLS monitor on the way from Korea. If FedEx gets around to delivering it instead of sending it to tours of Asia, I will have a third option with its own strengths and weaknesses to choose from. I think in the end I'm going to need a bigger desk :)
I've been tempted to obtain one of these for testing, since it's so cheap for 1440p 120Hz and new nVidia drivers now finally make it easy to overclock monitors.

That said, I know it will have, at most, about 50% less motion blur than a regular 60 Hz LCD (roughly the same as non-LightBoost 120 Hz), while LightBoost 120 Hz can have 85%-92% less motion blur than a regular 60 Hz LCD. If that is all the motion blur reduction you need, and you have priorities other than games, it's good. Another tough tradeoff to weigh, in order to get the better colors of PLS.
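Those percentages line up with simple persistence arithmetic: on a sample-and-hold LCD each frame stays visible for the full refresh period, while a strobe backlight shows it only for the flash, and motion blur on a tracked object is roughly proportional to that visibility time. A sketch, where the ~1.4-2.4 ms strobe lengths are an assumption chosen to illustrate the quoted 85%-92% range:

```python
# Hedged sketch: blur reduction versus a 60 Hz sample-and-hold LCD,
# assuming perceived motion blur is proportional to persistence
# (how long each frame remains visible while the eye tracks motion).

def blur_reduction_vs_60hz(persistence_ms: float) -> float:
    hold_60hz_ms = 1000 / 60  # ~16.7 ms per frame at 60 Hz
    return 100 * (1 - persistence_ms / hold_60hz_ms)

print(blur_reduction_vs_60hz(1000 / 120))  # 120 Hz sample-and-hold -> 50.0
print(round(blur_reduction_vs_60hz(2.4)))  # ~2.4 ms strobe -> 86
print(round(blur_reduction_vs_60hz(1.4)))  # ~1.4 ms strobe -> 92
```

So the 50% figure falls out of the refresh period alone, and the 85%-92% range corresponds to strobe flashes a few milliseconds long or shorter.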
 

I have both an ASUS LightBoost monitor and just got the Qnix monitor (mine overclocks to 110 Hz without batting an eye, and I made a thread about it), and I can tell you in terms of motion there is a difference. However, the tradeoff between the two is massive. One is at a lower resolution, and LightBoost has a tendency to wash out games; it forces you to turn brightness down to zero and still shows an image that is too washed out.

You end up giving up a serious amount of PQ to obtain the near-smoothness of a CRT, and I personally don't think it's worth it in the end. The higher-resolution, far-better-PQ, easily overclocked Qnix monitors that are currently $318 are cheaper and still give you some reduced motion blur, and for me that's a better deal.

If you are a motion hound and that's all you care about, then a LightBoost monitor or a CRT of some kind should be your focus.
 
Yeah, that is why I'm leaning toward the Tempest Overlord.

Well, from what I can tell, the Overlord monitors are not superior monitors. They are not certain to hit 120 Hz (I think 85 Hz is the promised minimum).

Right at this moment, in this shipping cycle, the Qnix is where it's at.
 
You end up giving up a serious amount of PQ to obtain the near-smoothness of a CRT, and I personally don't think it's worth it in the end. The higher-resolution, far-better-PQ, easily overclocked Qnix monitors that are currently $318 are cheaper and still give you some reduced motion blur, and for me that's a better deal.
Understandable, but don't forget the increased cost of the GPU necessary to get 120fps@120Hz at 1440p; e.g. you can pull off Crysis 3 with a single Titan at 1080p on a single monitor, whereas 1440p may need Titan SLI. You can also get the LightBoost monitor de-matted by Vega, if you want the ultimate LightBoost monitor. The tradeoffs certainly have different value to everyone...
 
Well, from what I can tell, the Overlord monitors are not superior monitors. They are not certain to hit 120 Hz (I think 85 Hz is the promised minimum).

Right at this moment, in this shipping cycle, the Qnix is where it's at.

All the research I have seen says otherwise... 120 Hz is the norm more often than not, and the picture is great. This info comes from MORE than one source. I am very picky when it comes to spending $500+... still leaning this way... not final though.
 

However, it is important to note that we here at Overlord cannot and do not guarantee that your monitor will hit a particular rate. Too many factors come into play when trying to OC a Tempest (your rig, your drivers, your ability to modify timings, your cables, your aptitude, your shoe size, etc.)

Taken from Overlord's site. I've actually seen one in action, running at 120 Hz, but not everyone can get that rate, as stated, and the PQ from what I've witnessed is not superior to what I have in front of me. Since they are IPS, they have the well-known IPS glow (my friend's monitor wasn't that bad, though; pretty minimal compared to what I've had), and it was glossy (which I don't like, since it makes it harder to use with any kind of annoying reflections).

There is no doubt that Overlord provides a prettier package; the monitor's casing is superior, no question, and that's great. But is it $230+ great (not including shipping)? With almost everyone who has bought a Qnix monitor able to hit at least 96 Hz (the majority faster than that), and many arriving without dead pixels even on non-pixel-perfect shipments, the only assurance you are getting from Overlord is basically that it will turn on.

I commend Overlord for offering such good support and a nice, active forum for help/questions and feedback. But it seems like you are paying for more "comfort" than for any kind of guarantee. With the Qnix, you simply buy a 3-year SquareTrade warranty and you're basically covered (versus Overlord's one-year warranty) if it shits the bed. Although, with Overlord, you would get a replacement that is guaranteed to overclock well, versus what you might get from a third-party warranty.

I understand buying "peace of mind", but when you almost double the cost of a product, it often isn't worth it.
 
Taken from Overlord's site. I've actually seen one in action, running at 120 Hz, but not everyone can get that rate, as stated, and the PQ from what I've witnessed is not superior to what I have in front of me.
What's in front of you right now? Curious which 120 Hz display you've settled on.
 
Has there been any mention at all from nVidia's side about whether they will add it as an option to the drivers?
 
After another evening of use, I think I prefer the BenQ XL2411T to the Samsung S23A700D. After calibration, desktop use is not so much worse than the Samsung (disregarding the matte vs glossy difference) - just a little bit different when viewed side by side. Games are another matter, it feels like at least some of them ignore colour profiles - and without a colour profile the BenQ has too low gamma causing a somewhat washed out look. Clouding is more evident too. But, the immediacy of the experience (low input lag compared to Samsung LB) is much better when playing games, and that feels more important. At least tonight, I may go back to the Samsung tomorrow ;)

I also tried going back to "regular" 120 Hz mode when playing games. The feel was quite a bit different when making quick turns, at least in Source engine games at 120 fps. I guess I never noticed LightBoost actually working until I disabled it... now my eyes are ruined for "normal" LCDs - thanks to mr. Chief Blur Buster's methods :p
 
What's in front of you right now? Curious which 120 Hz display you've settled on.

27" QNIX QX2710 Evolution 2. It's $317 on eBay, it came pixel-perfect, it overclocks to 110 Hz for me, and with my extra SquareTrade warranty it's covered for 3 years.

I almost bought an Overlord, but I'm frankly glad I didn't. This panel, being a Samsung PLS, also gives better contrast than its IPS counterparts.

If you can buy this monitor from hulustar, then right now you'll be getting a hell of a deal. There is always a risk, as with any of these Korean knock-off monitors, but the vast majority of people are very happy with their Qnix.

I seriously doubt that Qnix will continue to produce 1440p panels with whatever PCB allows them to overclock so well in the next production cycle, after they run out of stock.
 
Remember that LightBoost 2D is for fast-paced FPS games. One thing you need to remember is that on a non-LightBoost monitor, even at 120 or 144 Hz, texture quality is substantially degraded when moving or turning in FPS games, since everything becomes blurry. So you lose any texture-quality benefit you would get from an IPS panel in FPS games.

Now, browsing, image creation, and much slower-paced games are a different story.
 
After another evening of use, I think I prefer the BenQ XL2411T to the Samsung S23A700D. After calibration, desktop use is not so much worse than the Samsung (disregarding the matte vs glossy difference) - just a little bit different when viewed side by side. Games are another matter, it feels like at least some of them ignore colour profiles - and without a colour profile the BenQ has too low gamma causing a somewhat washed out look. Clouding is more evident too. But, the immediacy of the experience (low input lag compared to Samsung LB) is much better when playing games, and that feels more important. At least tonight, I may go back to the Samsung tomorrow ;)

I also tried going back to "regular" 120 Hz mode when playing games. The feel was quite a bit different when making quick turns, at least in Source engine games at 120 fps. I guess I never noticed LightBoost actually working until I disabled it... now my eyes are ruined for "normal" LCDs - thanks to mr. Chief Blur Buster's methods :p

You need to use something like CPKeeper to try to lock your tweaked LUT if you want things to look decent. I know some people don't care about visual quality, but probably most do, and you really can't make the VG248QE look all that good without external calibration. There's RGB gain/offset in the service menu, but it doesn't seem to do anything. My VG248QE looks really good now, but I spent a lot of time with an i1 display pro tweaking things.
 
Remember that LightBoost 2D is for fast-paced FPS games. One thing you need to remember is that on a non-LightBoost monitor, even at 120 or 144 Hz, texture quality is substantially degraded when moving or turning in FPS games, since everything becomes blurry. So you lose any texture-quality benefit you would get from an IPS panel in FPS games.

Now, browsing, image creation, and much slower-paced games are a different story.

I don't really agree. Very rarely do I NOT notice the textural difference, even in fast-paced games.

What you're talking about is much more a subjective user experience and less a technical certainty.
 
Well, even in a fast-paced game there are plenty of times when you slow down or are mostly facing one direction. Of course you will notice the quality of a nice, higher-density IPS display. However, the blurry image is not subjective, unless you have bad vision, which of course would hurt an IPS display too.

Ultimately, though, I would rather have a pretty good image I can see even when I am moving around than a great image that is only great when things are slow.

But that does come down to personal preference: the age-old argument of graphics versus speed, and where people choose to sit on that spectrum. I highly disagree with it not being a technical certainty, godmachine; it is a technical certainty. We can all pull out PixPerAn and prove the blur objectively, without any doubt.
 
You need to use something like CPKeeper to lock your tweaked LUT if you want things to look decent. I know some people don't care about visual quality, but most probably do, and you really can't make the VG248QE look all that good without external calibration. There's an RGB gain/offset in the service menu, but it doesn't seem to do anything. My VG248QE looks really good now, but I spent a lot of time with an i1 Display Pro tweaking things.
Thanks, that's good information. I'll have to try CPKeeper.
 
I don't really agree. Very rarely do I NOT notice the textural difference, even in fast-paced games.
You two are correct, but for different reasons. Games are not always in motion; you are continually starting and stopping, and you aren't always turning.

However, if you enforce the following conditions:
(1) Material is always running 120fps@120Hz. No slowdowns, no frame drops, no stutters.
(2) Maximum fluidity (either VSYNC ON, or a framerate massively exceeding the refresh rate, to eliminate microstutters and the visible harmonic stutters of a framerate-vs-Hz mismatch).
(3) Your motion is sufficiently fast, especially when you move farther between refreshes than the width of a texel. That is, you're using high-resolution textures AND moving fast enough (not standing too close to a low-resolution, blurry wall, where you won't be able to see the motion blur as much).
(4) Your eyes continuously track the panning screen motion, rather than always staring at the center of the screen while the screen is in motion.

Then the difference between LightBoost vs non-LightBoost is amplified -- where you were unable to see detail in fast motion, you now see detail in fast motion.

Good test games if you don't have a powerful card: Quake Live and Source Engine games, despite their typically lower resolutions. They tend to show an amplified LightBoost-vs-non-LightBoost difference compared to newer games such as Crysis (which requires a Titan), because even minor stutters greatly diminish the CRT/LightBoost motion blur advantage.
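The reason conditions (1) through (4) matter can be put in rough numbers: eye-tracking motion blur is approximately panning speed multiplied by how long each frame stays lit, about 8.3 ms per refresh for sample-and-hold at 120 Hz, versus a strobe on the order of 1.4-2.4 ms for LightBoost depending on the OSD setting. A back-of-the-envelope sketch, with those persistence figures as assumptions:

```python
def blur_width_px(speed_px_per_sec, persistence_ms):
    """Approximate eye-tracked motion blur width: pixels the image
    travels across the retina while one frame stays lit.
    Persistence figures below are ballpark assumptions."""
    return speed_px_per_sec * persistence_ms / 1000.0

speed = 1920.0  # panning one screen width per second

# Sample-and-hold at 120 Hz: each frame is lit for the full refresh.
sample_and_hold_120hz = blur_width_px(speed, 1000.0 / 120)  # 16.0 px

# LightBoost strobe (assumed ~1.4 ms at a low LightBoost OSD setting).
lightboost_strobe = blur_width_px(speed, 1.4)  # ~2.7 px
```

Once the blur width exceeds a texel's on-screen width (condition 3), texture detail washes out; the strobed figure stays far below the sample-and-hold one, which is the detail-in-motion difference described above.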
 
Another great test game to try is Borderlands 2, which I just started playing this week. It's an excellent motion blur test case, and LightBoost appears to benefit it significantly, more so than the average game.

It has those very high-contrast edges and black outlines, and it REALLY benefits from LightBoost 120fps@120Hz. I set the framerate to unlimited and VSYNC ON for maximum fluidity (and the best LightBoost benefit), so that I can more or less run 120fps@120Hz most of the time on a lowly GTX 680 (except in the widest open areas). Minor input lag occurs whenever the framerate drops to 60fps or less, but otherwise no input lag is felt in solo gameplay at 120fps@120Hz. Mind you, I use VSYNC OFF if I am playing competitively, for the reduced lag, but it adds a bit of microstutter, so I prefer VSYNC ON during solo gameplay whenever I can't "feel" the input lag and framerate drops are reasonably infrequent.

TIP for better Borderlands 2 colors during LightBoost (for solo play, if you don't need to brighten the blacks to the point of degrading color quality): to keep the colors saturated, set monitor OSD contrast to 65% and turn Borderlands 2 Brightness down to 5. This eliminates the usual LightBoost "gamma bleach" effect, and Borderlands 2 stays pretty bright and colorful. Maybe only 80% as colorful as non-LightBoost, but to my eyes the gamma bleaching is gone in this game.
 
Since this chart has LB monitors on it-

Found a cool comparison of what pixels look like close up on different monitors and film coatings. I highlighted some notables:

[Image: close-up comparison of subpixels under different monitor film coatings]


I'll take gloss. ;)
 
I got the tempered glass installed on my two AG-removed Asus VG248QE panels. Inspecting the glass out of the packaging, I'll say it would be fine as a window in a door, but for a display application it was FILTHY. I had to scrub it with white ammonia until it was acceptable.

I used bare latex gloves to avoid smudging and secured the glass to the naked aluminium display frame with clear packaging tape.

The glass looks and fits great; it was worth the money, and I am pleased that I did it because it adds peace of mind. However, I am also glad that I only used packing tape, because I can see one tiny, annoying hair from my Chihuahua that I missed on one of the displays, and it shows up on light backgrounds. I also see a few specks of dust on the displays, but for me a few specks of dust are far easier to deal with than that horrid AG coating. My third display will get here soon, and I will be running 3x1 LightBoost portrait like Vega.

Safety First


Glass lying over the naked polarizer before packaging tape application


Finished product


 
Since this chart has LB monitors on it-

Found a cool comparison of what pixels look like close up on different monitors and film coatings. I highlighted some notables:

[Image: close-up comparison of subpixels under different monitor film coatings]


I'll take gloss. ;)

Oh, would you look at that. Thanks for making my life easier. Just look at how "orange" some reds are, how "yellowish" some greens are, and how washed out some blues are. You can conclude for yourself that blood on those panels will look orange, grass will never be spring green, and the sky will never look summer blue.

I'll go with gloss myself.
 
Since this chart has LB monitors on it-

Found a cool comparison of what pixels look like close up on different monitors and film coatings. I highlighted some notables:
Good comparison. I saw that a month ago on a Russian review website (overclockers.ru) which reviewed the latest LightBoost monitors, the VG248QE and XL2411T, and was impressed by the benefits of LightBoost.

I got the tempered glass installed on my two AG removed Asus VG248QE panels.
That looks good! Do you see any double-reflection effect in a totally dark room, due to internal reflection off the rear and front surfaces of the tempered glass?
 
Good comparison. I saw that a month ago on a Russian review website (overclockers.ru) which reviewed the latest LightBoost monitors, the VG248QE and XL2411T, and was impressed by the benefits of LightBoost.

Color me confused - I thought you were one of the main people behind LightBoost testing, and as recently as a month ago you only learned about its benefits?

Also, if I'm reading the graphic Vega posted correctly, there are only a couple of gloss displays - are they LB-capable?

(I feel as if I'm missing something here!)
 
Color me confused - I thought you were one of the main people behind LightBoost testing, and as recently as a month ago you only learned about its benefits?

Also, if I'm reading the graphic Vega posted correctly, there are only a couple of gloss displays - are they LB-capable?

(I feel as if I'm missing something here!)

He is saying the people from the Russian review website were impressed by LightBoost.
 
Good comparison. I saw that a month ago on a Russian review website (overclockers.ru) which reviewed the latest LightBoost monitors, the VG248QE and XL2411T, and was impressed by the benefits of LightBoost.

That looks good! Do you see any double-reflection effect in a totally dark room, due to internal reflection off the rear and front surfaces of the tempered glass?

No, there is no image distortion with the glass applied. I would say it was a 100% success. The glass was a bit dusty and smudgy when I got it, but after a thorough cleaning it is excellent. These displays are much, much better with the AG removed, and the glass adds some peace of mind, so you don't have to worry about spitting coffee all over your unprotected polarizer ;)
 
You need to use something like CPKeeper to lock your tweaked LUT if you want things to look decent. I know some people don't care about visual quality, but most probably do, and you really can't make the VG248QE look all that good without external calibration. There's an RGB gain/offset in the service menu, but it doesn't seem to do anything. My VG248QE looks really good now, but I spent a lot of time with an i1 Display Pro tweaking things.

This might be a dumb question, but do I have to enable Start With Windows, apply profiles at startup, and minimize to tray in the CPKeeper settings menu for the program to work correctly, or can I have it closed and it will still lock my tweaked LUT?

Thanks
 
This might be a dumb question, but do I have to enable Start With Windows, apply profiles at startup, and minimize to tray in the CPKeeper settings menu for the program to work correctly, or can I have it closed and it will still lock my tweaked LUT?

Thanks

It needs to be running, so minimize it to the tray.
 
Mark, do you mind sharing your NVIDIA Control Panel settings? Values for RGB Brightness, Contrast, Gamma, and Digital Vibrance? Thanks
 
This might be a dumb question, but do I have to enable Start With Windows, apply profiles at startup, and minimize to tray in the CPKeeper settings menu for the program to work correctly, or can I have it closed and it will still lock my tweaked LUT?

Thanks

Yes, click the start and minimize buttons and voilà, you're using your desktop colours in 3D.

I'm tweaking with my AMD control panel, and I've found I get very nice, vibrant colours in 3D while keeping the brightness I require.

It's so good now, I'm actually hard pressed to really remember default settings.
 