H - Built-in emitter or comes with 3d Vision kit
HE - requires 3D Vision kit for PC only use
Yes, that's correct. That's the flicker from the fluorescent lights.

I don't understand German, but I want to make sure you are aware that capturing a flickering monitor with a video camera produces out-of-sync timing between the two. That is why CRTs look like they have bands cycling across them, or appear to pulse, in internet videos and TV news reports. In that video, the entire scene, including the reviewer and the room the monitor is in, is pulsing and flickering throughout much of the footage.
Sorry to ask the same question again: how well does this work on TV sets like the Sony line with CMR 960, or whatever they call it?
TVs have always been the realm of high input lag and various other silly electronics that make them unfit for computer gaming.
Mark, I was thinking about 60 Hz 4K displays. Granted, 60 Hz motion won't be that great with its ~16 ms sample-and-hold, but what if you flashed the backlight twice inside that window instead of once? That would really help with the annoying visible flicker. I am sure that even with IPS on a 4K screen there is time within a 16 ms frame to flash a couple of pulses. Do the pulses have to be evenly spaced, or if they are staggered to fit into non-pixel-transition time, does that screw with the brain?
From a computer/game usage perspective, there are currently some rather severe limitations -- they all currently have massive input lag, since they combine backlight scanning/strobing with interpolation. Couple that with the usual situation that HDTVs, even in Game Mode, tend to have worse input lag than computer displays.
Also, 60 Hz scanning/strobing is rarely done because of the annoying flicker, and not many scanning backlights in HDTVs have a mode that excludes interpolation (which is bad for input lag). However, there is a Motionflow "Impulse" setting that does this in certain newer Sony HDTVs. This "Impulse" setting would probably be the most videogame-compatible strobe backlight mode available at the moment. There is supposed to be less input lag in this mode, but very few tests have been done. It flickers at a nasty 60 Hz, but it would be fully console compatible, assuming its input lag is adequate. Impulse is non-interpolated.
See Existing Technology for more info about Samsung/Sony/Panasonic HDTV's with scanning backlights.
Searches:
- Motionflow Impulse Setting
- Review of Sony KDL-55HX853 commenting about Impulse
If anyone wants a "LightBoost" equivalent in an HDTV set, then the Sony Motionflow "Impulse" setting (non-interpolated) in recent high-end sets with the Motionflow XR 960 feature is your best bet at this stage, but the input lag is probably around 2 frames (untested!) and you will get horrendous 60 Hz flicker. It will probably produce less motion blur than a 120 Hz computer monitor without a strobe backlight, but not fully zero motion blur, due to the longer strobes (~4 ms). Input lag and flicker would probably be pretty bad, but potentially usable in a partially darkened room. If someone wants to test this out, be my guest -- the Blur Busters Blog would love to cover testing of this feature once the input lag meter software is fixed. The goal is to simply bring my laptop with the lag meter, a retractable HDMI cable, and a universal remote into a Best Buy, and measure a few TVs at various strobe/scanning backlight settings, hopefully not too long after I've released my motion tests.
Note: The LightBoost HOWTO is not applicable. Just go into the Sony HDTV set's menu and change the Motionflow setting to "Impulse" (only found in certain high-end models) and you're ready to go with any 60 Hz computer or game console -- probably with worse input lag than a computer monitor, but much less input lag than interpolation. You will get less motion blur than a non-LightBoost 120 Hz monitor, but more motion blur than a LightBoost 120 Hz monitor.
Enterprising people can bring their game console to a store that sells the higher-end Sony "960" HDTVs and try out the Impulse setting. You won't be able to combine Game Mode with the Impulse setting (as far as I know), but it would be good for someone here to give it a test and see. Be my guest, and please post back!
In that case, input lag doesn't matter all that much. You can then also tolerate some high-quality interpolation, at least for video-based material.

I'm actually more interested in TV/movie use.
No, they do not have to be evenly spaced.
However, this wrecks MPRT significantly. MPRT becomes the interval from the leading edge of the first strobe to the trailing edge of the last strobe. So if you had 2 ms strobes spaced 8 ms apart, your MPRT would be 10 ms (2 ms first strobe + 6 ms black gap + 2 ms final strobe). That's 5 times worse trailing-edge artifact than the 2 ms strobes themselves, and 2.5 times worse than just putting the strobes together (one 4 ms strobe per refresh). However, the reduction in flicker can win out, because people are simply so annoyed by 60 Hz flicker. This could be an additional "custom backlight mode".
The MPRT of PWM-based backlights (multiple strobes per refresh) is usually a few milliseconds lower (e.g. 14 ms) than that of steady backlights (e.g. 16 ms). This is because of the tiny gap of blackness between the last strobe of the previous refresh and the first strobe of the next refresh.
(MPRT = industry standard measurement called Motion Picture Response Time; also measures sample-and-hold lengths, not just pixel persistence)
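The trailing-edge arithmetic above is simple enough to sanity-check in a few lines. This is a minimal sketch of the calculation only, not of any actual backlight driver; the function name is made up for illustration.

```python
# MPRT for a double-strobe refresh: measured from the leading edge of
# the first strobe to the trailing edge of the last strobe.

def mprt_double_strobe(strobe_ms, spacing_ms):
    """spacing_ms is leading edge to leading edge between the two strobes."""
    gap_ms = spacing_ms - strobe_ms            # black gap between the strobes
    return strobe_ms + gap_ms + strobe_ms      # first strobe + gap + last strobe

# The example from the post: 2 ms strobes spaced 8 ms apart.
double = mprt_double_strobe(2.0, 8.0)          # 2 + 6 + 2 = 10 ms
single = 4.0                                   # same light as one 4 ms strobe

print(double)           # 10.0 -> 5x worse than a lone 2 ms strobe
print(double / single)  # 2.5x worse than one combined 4 ms strobe
```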
The long-term solution is for all video formats to go to 120 Hz or 240 Hz native (no interpolation), or beyond, over the next 50+ years. (How about the Japanese NHK 8K 120 Hz format, anyone?) It would solve so many problems -- motion blur, flicker, game-friendliness, computer-friendliness -- and avoid interpolation, all at once.
I tried LightBoost out today with my ViewSonic V3D245 (they don't make them anymore).
I was really surprised how much of a difference it makes. However, my eyes did get tired after a while, just like they used to with CRT. With a normal LCD my eyes never fatigue.
But it is still totally worth turning LightBoost on while gaming.
Actually, it could be quite real. Undocumented or poorly-advertised LightBoost apparently exists!

Sure it wasn't just placebo? From what I can see, the V3D245 doesn't support LightBoost (Nvidia 3D Vision 2). Maybe it does and it simply isn't documented, although that would be odd. It could be that 3D mode forces the ViewSonic into a more aggressive level of overdrive, which would indeed make a large difference in games.
Are you talking about the Capabilities String? No, it doesn't work that way. That's an incoming (received) capabilities string, not an outgoing (sent) one. It stays constant no matter what the monitor setting is; it just tells the software which MCCS settings are supported.

So what happens if I use that string with an ATI card?
Doesn't the string change when you disable 3D mode and thereby disable LightBoost?
Why not test that, or am I thinking about this too simply?
I tried it on a BenQ XL2420T. No difference in VCP values with LightBoost on or off.
Light-Boost on.txt
Light-Boost off.txt
Theoretically, sending the correct MCCS values *might* trigger LightBoost, if it is an MCCS-controllable setting -- that is, if LightBoost is not controlled by some other means, e.g. an embedded timing parameter such as a Timing Flags bit in the same byte normally used to configure polarity, sync-on-green, etc. That would not be controllable via MCCS, but would theoretically be controllable via the PowerStrip API, assuming the graphics card is still supported.
Thanks for pitching in some help!
Try testing undocumented/unreported VCP values. To read the value of VCP E0, you read it manually by typing "01 E0" on the command line and checking the output. Repeat the command for all unassigned codes, for both LightBoost on and LightBoost off. Do this for the manufacturer-assigned codes E0 through FF, as well as for any undocumented/unreported values between 00 and DF. The unassigned codes are the gray rows in the table beginning at page 116 of VESA MCCS v3 (mccsV3.pdf), in chapter "11 VCP Code Index".
Keep figuring this out along with me!
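If typing all those commands by hand gets tedious, the list can be generated programmatically. A small sketch, assuming the "01 XX" read-command format quoted above; adapt it to whatever your DDC tool actually expects.

```python
# Generate the "01 XX" VCP read commands for a range of codes, e.g. the
# manufacturer-specific range E0 through FF, so they can be pasted into
# the DDC control tool instead of typed one by one.

def vcp_read_commands(first=0xE0, last=0xFF):
    return [f"01 {code:02X}" for code in range(first, last + 1)]

cmds = vcp_read_commands()
print(len(cmds))            # 32 commands
print(cmds[0], cmds[-1])    # 01 E0 01 FF
```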
This is to create an app that turns LightBoost on/off:
-- Turning on LightBoost via a hotkey
-- Turning on LightBoost via an easy click on system tray
-- Turning on LightBoost automatically only for certain games, etc.
To make LightBoost easy and more convenient, without INF or REG files, even without an emitter.
Interesting -- that would prove it's not a video-signal-controlled feature. LightBoost remains enabled with ATI if you swap plugs from nVidia to ATI! nVidia would not be happy about that, however... (even though we simply want the improved convenience of LightBoost, even for nVidia cards).

I think LightBoost must be enabled via a DDC/MCCS signal that activates a flag. The VESA Coordinated Video Timings (CVT-R) are the same with LB on or off.
Once my BenQ activates LightBoost (green LED on), I can't turn LightBoost off anymore. Restarting the system or disconnecting the DVI cable has no effect. LightBoost turns on automatically whenever the monitor runs at 120 Hz, and it stays active if you connect the DVI cable to a second PC with an ATI graphics card.
To deactivate LightBoost, I must power off the monitor.
Oh, now I looked closely: you ran all 256 commands with LB on and off. That was a lot of manual work -- you typed commands 512 times! I did, however, notice two differences when I downloaded the two files and ran WinMerge on them (a popular free diff utility: install WinMerge, select both files in Windows Explorer, right-click and choose "WinMerge", and it instantly displays a color-coded list of differences between the two text files). Just helping out readers -- this tool might speed up future research.

I have already tried it. Please look at the txt files. No different values at VCP 01 E0 through 01 FF.
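For anyone repeating this test, the WinMerge comparison can also be done with a few lines of script. This sketch assumes a dump format of one "CODE VALUE" pair per line, which is a guess -- adjust the parser to match what your tool actually writes into the txt files.

```python
# Compare two VCP dumps (LightBoost on vs. off) and report codes whose
# values differ. The sample strings below stand in for the two txt files.

def parse_vcp_dump(text):
    values = {}
    for line in text.strip().splitlines():
        code, _, value = line.partition(" ")
        values[code] = value.strip()
    return values

def diff_vcp(on_text, off_text):
    on, off = parse_vcp_dump(on_text), parse_vcp_dump(off_text)
    return {code: (on.get(code), off.get(code))
            for code in sorted(set(on) | set(off))
            if on.get(code) != off.get(code)}

lb_on  = "E0 00\nE1 01\nE2 10"   # hypothetical sample data
lb_off = "E0 00\nE1 00\nE2 10"

print(diff_vcp(lb_on, lb_off))   # {'E1': ('01', '00')}
```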
It depends on what your eyestrain is caused by. CRT has similar duty cycles with phosphor. Eyestrain is different in games (fast motion) versus desktop use (static objects). During videogames, three people reported less eyestrain with LightBoost than without, because of the zero motion blur. You can turn LightBoost off after exiting games.

Based on the figures and reports of eye strain, I'll probably be giving this a miss.
---
VG278H - 2 ms strobe = 24% duty cycle @ 120 Hz
XL2411T - 1.4 ms strobe = 17% duty cycle @ 120 Hz
---
A duty cycle of <40% at 180 Hz PWM is considered bad. The above figures would cause more eye strain than the worst PWM LED display at 0% brightness.
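The duty-cycle figures quoted above follow directly from the strobe length and the refresh period; a quick sketch of the arithmetic:

```python
# Strobe duty cycle: the fraction of each refresh period the backlight is lit.

def duty_cycle(strobe_ms, refresh_hz):
    period_ms = 1000.0 / refresh_hz   # 120 Hz -> ~8.33 ms per refresh
    return strobe_ms / period_ms

print(f"VG278H  2.0 ms @ 120 Hz: {duty_cycle(2.0, 120):.0%}")   # 24%
print(f"XL2411T 1.4 ms @ 120 Hz: {duty_cycle(1.4, 120):.0%}")   # 17%
```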
^^^^ This. (And multiple other reports!)

2D LightBoost gives me less eye strain than my LED monitors in their default mode.
Yep. My BenQ LightBoost now looks clearly superior to the ASUS VG278H LightBoost in almost every single category (including color). Vertical color shift is less objectionable with the BenQ XL2411T than with the ASUS VG278H. Contrast is duller with LightBoost, but all things equal, the BenQ now appears to look better. However, I have not yet re-calibrated my colors on the ASUS VG278H during LightBoost for a more apples-to-apples calibrated comparison, and that is something I shall do next.

The crimson tint fix is great news. That is one of the things that would really have bothered me a lot about these. Thanks.
No, I haven't noticed any eyestrain.

Interesting news. How is the eyestrain from PWM flicker on the BenQ 11T during LightBoost? I guess I never used mine long enough to test that. I know I am sensitive to PWM flicker. I set my Catleap at 100% brightness for the fastest Hz and have to dim the image via the NVIDIA Control Panel. Not the best solution, but it's better than eye strain.
Better be quick, 2 left:
http://www.newegg.com/Product/Product.aspx?Item=N82E16824236313
Fabulous price.
This or the Benq XL2411T?
They will be pretty much identical, but the Asus is available in the US at a pretty cheap price, IMO. I am interested in playing around with the control panel settings that Mark came up with to get rid of the color tint in LB mode. Since the screen is so cheap, I will most likely also convert it to a gloss screen with a real protective layer, rather than leaving the polarization layer unprotected.
Annoys me that the XL2411T isn't available in the US yet. I almost bought the XL2420T as well (which costs a lot more for some reason).
The BENQ XL2411T Crimson Fix
Zero crimson tint with BENQ LightBoost!
On the subject of BENQ color calibration, I cancel out the crimson tint via nVidia Control Panel, by doing the following:
Via Monitor Menus
Monitor's Contrast setting = 65 (if using nVidia Gamma 1.10)
Via nVidia Control Panel - "Adjust Desktop Color Settings"
Adjust them individually, R, G, B.
---
R Contrast = 30%
G Contrast = 45%
B Contrast = 30%
---
R Brightness = 10%
G Brightness = 40%
B Brightness = 10%
---
R,G,B Gamma = 1.10
Make sure to readjust the monitor's OSD contrast every time you change the gamma. nVidia Gamma 1.10 worked best with a monitor OSD contrast of 65.
With these settings, I can see the difference between RGB(0,0,0) and RGB(1,1,1), without a wrongly tinted color in backgrounds or in dim colors.
Test Patterns Used
-- Lagom Contrast Pattern
-- Lagom Black Level
Hmm, I was going to say you should check the prices again, but I see now that there is an actual price difference of 120 euros between the 11T and the 20T, unlike the US where they are similar. After I factor in shipping, the cost of importing an 11T is similar to a 20T, which suggests the 11T is specifically designed for the EU market as a lower-priced 20T.
The tradeoff for the lower price point in the EU market is that the 11T doesn't have an S-Switch. It also lacks a DisplayPort, audio port and the additional HDMI/USB ports the 20T has.
FYI -- many people doubt the 1 ms claim. The 1 ms benefit exists and is clearly apparent.

The lack of some connections and stuff doesn't bother me; I just want the better display. The XL2411T seems to be the better one, 1 ms and all that (if that matters).
I just got information that this may be panel-dependent. Some panels required a very different adjustment than this. So I just want to add: the settings I've posted may not be one-size-fits-all. More testing is needed to know whether it truly varies from panel to panel, or only between specific manufacturing runs, etc.

Thank you very much for your diligence and hard work.