BenQ XL2411T eh?
Ya, anyone want one?
> One 'make or break' point and contentious issue on the XL series thus far has been the pixel overdrive, which on the 10T in particular was too aggressive. This led to noticeable 'inverse ghosting' in some instances. The grey-to-grey response time is now quoted as '1ms' rather than '2ms', which could indicate some tuning has been done to the overdrive algorithm. Alternatively, it could just be a mild exaggeration by BenQ – they wouldn't be the first company to make misleading response time claims, at any rate. One particularly attractive feature of the XL24 series monitors has always been the low input lag, which is further lowered by an 'Instant Mode' that bypasses extraneous image processing. The company is claiming '0.001 frame' of input lag, which to all intents and purposes means no input lag whatsoever – it looks as if that won't be something to worry about. We will of course find out more about what differentiates the XL2411T from its predecessors when we review it shortly. BenQ have told us that it should be available to test in November, and to expect full retail availability in the UK around the same time. Some retailers are now stocking or taking pre-orders for around £235. We will bring any further details on price and availability as the news comes in.

This would no longer be an issue if the strobing hides the overdrive artifacts. A properly timed strobe backlight can hide overdrive artifacts as well as the pixel persistence. It doesn't matter how "noisy or bouncy" the pixel transition is, as long as all of that is kept in total darkness. For example, the first 4ms of a 1/120sec refresh could be a bouncy 1ms overdrive plus a 3ms "settle-down" safety margin (or whatever settle-down margin the monitor uses). Then you strobe the backlight on a pretty stable, ghost-free frame, in time before the next refresh. Done. Obviously, it depends on how the refresh is scanned out, and on how long the monitor (or graphics card) artificially lengthens the vertical blanking interval in order to allow a settle-down time and to allow a strobe to happen before the next refresh.
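The timing budget described above can be sketched in a few lines of C. The 1ms overdrive "bounce" and 3ms settle-down margin are the post's illustrative figures, not measured panel data:

```c
/* Hypothetical strobe-timing budget for one refresh cycle, in ms.
   Figures are the example values from the post, not panel measurements. */
typedef struct {
    double refresh_ms;   /* full refresh period (e.g. 1000/120 at 120 Hz) */
    double overdrive_ms; /* bouncy pixel-transition time                  */
    double settle_ms;    /* extra settle-down safety margin               */
} strobe_budget;

/* Longest backlight strobe that fits after the pixels have settled
   and before the next refresh begins. */
double max_strobe_ms(strobe_budget b) {
    double window = b.refresh_ms - b.overdrive_ms - b.settle_ms;
    return window > 0.0 ? window : 0.0;
}
```

With these figures, a 120Hz refresh (~8.33ms) minus 1ms of overdrive and 3ms of settling leaves a ~4.33ms window in which to strobe a clean frame.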
> I've done some photo tests of the XL2411T if you need it: https://dl.dropbox.com/u/11777259/xl2411t_test.rar

If you're strobing instead of scanning, then don't forget that there's the timeslot of the refresh cycle, and the timeslot of the stable frame (like a lengthened vertical blanking interval between frames). Those timeslots have to be separate. You can only strobe during the vertical blanking interval.
LightBoost does tend to bypass most of the pixel-transition time, but in my opinion it isn't timed perfectly, as you can see, and could be improved. (As far as I know, the pixel transition time for any colour is under 4ms even on the XL2420T, which has the slower panel, so you have at least a 4.33ms free slot with a perfect picture, but for some reason they didn't time it perfectly.)
I wish you could control the backlight to bring it to 60Hz! That would be a motion-blur-free moment for ALL LCD owners.
When the backlight isn't strobing in LightBoost mode, is the display PWM-free for either the Asus or the BenQ?
No.
BenQ XL2420T - 180Hz
Asus VG278H - 360Hz
Asus VG278HE - 432Hz
BenQ XL2411T - unknown, probably 180Hz
Asus VG248QE - unknown, but should be 432Hz
I have a VG278H monitor and none of these tricks are working for me.
I'm not sure if I found a new trick, but: enable stereoscopic 3D and launch the game. I set it to 1% so it isn't too distracting when I'm not wearing the glasses. Once in-game, press Ctrl+T, which disables 3D. It puts me back into 2D mode, but the brightness is severely lowered. I think the game does feel a little smoother.
If anyone else could verify this LMK.
> I'm beginning to wonder if LightBoost even exists on these older monitors. You keep referring to "LightBoost 2", but the nvidia list of monitors only references the term "LightBoost" on nvidia 3D Vision 2 displays:

Right.
I'm a C/C++ programmer, and I'd like to hack the LightBoost in monitors so that I can control the strobes (as much as the APIs allow). I looked into DDC software last night.
And neither of my LCDs is DDC-compliant.
From there I checked out the AMD "ADL API" which allows the user to control driver settings.
I was able to run their sample code which changes brightness on a loop. (max low, max high)
But we can't achieve what we're looking for through the video output; brightness should(?) only get output once per refresh by the video card. That is, even if I could tell the video card to change brightness at the exact intervals we want, and keep it all accurate, it's really only sending that data to the screen once per refresh. (If I'm way off base here, please let me know.)
> I'm running Windows 8.

Did you try the method I used to get my VG278H working? It's on pg. 8 of this thread. Also, a few people were having problems getting this activated on Win 8. No problems on Win 7.
You can verify that the monitor has LightBoost enabled by checking that all the Asus menu options are greyed out except Contrast & LightBoost.
I have found that keeping 3D enabled and using the Ctrl+T method gives lower FPS performance compared to disabling 3D in the control panel.
> I'm a C/C++ programmer, I'd like to hack the LightBoost in monitors, so that I can control the strobes (as much as APIs allow).
I don't think software-based control of brightness will be reliable because of CPU jitter, etc. It is, however, possible to do software-based VSYNC detection with precision time-coding, to signal an external circuit (e.g. an Arduino) *when* to strobe (e.g. time-stamping a Direct3D RasterStatus.ScanLine with QueryPerformanceCounter() and sending a signal to the Arduino, which uses its millisecond-accurate timer to undo the CPU/USB latencies).
RasterStatus.ScanLine can, with appropriate timing formulas (e.g. VESA GTF, or another algorithm), be converted into a fairly accurate "microseconds since last VSYNC" value.
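The scanline-to-time conversion mentioned here reduces to simple proportional arithmetic. A minimal sketch, assuming the vertical total (active lines plus blanking) is known from the mode timings; the 1125-line figure used in the usage example is illustrative, not a real readout:

```c
/* Convert the current raster scanline into an approximate time offset
   since the last VSYNC. total_lines is the full vertical total (active
   lines + blanking), as derived from VESA GTF/CVT or the actual mode. */
double us_since_vsync(int scanline, int total_lines, double refresh_hz) {
    double frame_us = 1.0e6 / refresh_hz;  /* one refresh period, in us */
    return frame_us * (double)scanline / (double)total_lines;
}
```

For example, at 120Hz with a hypothetical 1125-line vertical total, scanline 562 maps to roughly 4163µs into the refresh; scanline 0 is the moment of VSYNC itself.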
However, I don't think it can be done reliably if the software *does* the strobing itself.
> That's a synchronous approach. The timestamping method (RasterStatus.ScanLine) allows high accuracy while doing it asynchronously. You know what I mean?

I understand.
I was playing around with QueryPerformanceCounter() to create a high-precision sleep. The results seemed pretty good. When I requested a 120Hz sleep interval, I was getting intervals of 8.33353ms and 8.33324ms; the average comes out to 8.3334ms.
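The way these measurements are being judged can be sketched as a mean-and-jitter summary against the ideal 120Hz period. The two sample values are the ones reported above; the comparison against a 0.1ms threshold reflects the flicker-accuracy discussion later in the thread:

```c
#include <math.h>

/* Mean of measured sleep intervals, in ms. */
double mean_ms(const double *samples, int n) {
    double sum = 0.0;
    for (int i = 0; i < n; i++) sum += samples[i];
    return sum / n;
}

/* Worst single deviation from the ideal interval -- a rough jitter bound. */
double worst_jitter_ms(const double *samples, int n, double ideal_ms) {
    double worst = 0.0;
    for (int i = 0; i < n; i++) {
        double dev = fabs(samples[i] - ideal_ms);
        if (dev > worst) worst = dev;
    }
    return worst;
}
```

With the reported samples {8.33353, 8.33324} against the ideal 1000/120 ≈ 8.3333ms, the worst deviation is well under a microsecond, comfortably inside the 0.1ms flicker budget discussed below.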
> Now I agree that an external hardware clock would be better suited. But I was nonetheless impressed with the software-based timer.

Yes, an external hardware clock is needed too; if we're trying to override the LightBoost strobes, we need one.
> I confess I did not set up or test any jitter measurement, and did not even lock my CPU clock nor pin the thread to a specific CPU core, so I'm sure I would have gotten some jitter. I think we would need to test before dismissing it. One thing I like to think is that the strobing is not tied to a specific frame, so as long as it's roughly in line with the refresh, it should look OK to the user. In the end we could find a middle ground between mathematical accuracy and acceptable strobing.

I did some tests on a flashing high-brightness white LED running at 1 to 10 kilohertz, and found that a >0.1ms random insertion of delay (permanently delaying future strobes) led to a barely perceptible, but nonetheless potentially annoying, flicker. So I've set 0.1ms as the accuracy target -- ONLY because of the flicker issue. But if you delay an individual strobe while still maintaining the same number of strobes per second (i.e. future strobes are not permanently delayed by that one delay), then there's no flicker issue.
> From personal experience in design, fabrication and deployment: while the absolutely perfect implementation would be awesome, 99% of the time the cheaper, less perfect solution ends up being the chosen one (because of various decisions/parameters that are, in the end, not as important).

For manufacturers, they already have the display signal to work from, so that's the cheapest: just work from the display signal to time the strobes.
> What magnitude of precision do we think is required for strobing? Even the refresh rate usually fluctuates a bit in regular use.

The precision required for strobing is lower than most expect, as long as earlier imprecision does not carry over into future timing (e.g. a delayed strobe does not permanently delay future strobes). VSYNC is a very regular timer, so that solves the problem if we use VSYNC as our time reference. Varying the timing of individual strobes by 0.1ms from their mathematically perfect timing has no noticeable flicker artifacts, as long as a tiny imperceptible delay in one strobe does not permanently delay future strobes (based on LED flicker tests on my Arduino electronics circuit). BUT permanently delaying all future strobes, or missing a strobe entirely, _can_ potentially cause a noticeable flicker, especially at higher frequencies (e.g. 1KHz or 10KHz PWM with one strobe suddenly skipped or delayed -- amazingly, I _can_ notice that even at 10KHz PWM; it appears as a single faint flicker).
> Same for jitter: is an offset of 0.1ms for 1 frame something we can notice? (Note that I don't know how much jitter we would even get from a software solution; it could be 0.001ms, it could be 100ms.)

Nope. The strobe delay won't be noticed by the human eye from the flicker perspective. HOWEVER, there are possible analog effects on the LCD: a delayed strobe may cause increased faint ghosting along the top edge of the screen, because the LCD panel might have already started refreshing the next frame, etc. But if you've added lots of safety margin (e.g. a huge vertical blanking interval), you've got safety margin for delayed strobes. It's only a graceful, slow degradation into increased ghosting as the strobe encroaches into a tiny part of the LCD refresh cycle. (You may have seen my high-speed videos recently posted on blurbusters.com.)
> On another note, has anybody taken apart a LightBoost 2 monitor yet? I'm still not convinced nvidia implemented a super-high-tech solution. It would be nice, but if in their tests they could get "OK" results with a basic, non-costly approach, I'm sure the managers opted for that.

Nope. Are you aware of my home-made strobe backlight under construction at www.scanningbacklight.com? (Scroll down near the bottom and start scrolling upwards.) I started this project before I discovered LightBoost was a strobe backlight. I've even created a FAQ at Scanning Backlight FAQ.
> I'll go buy a LightBoost LCD + nvidia card. Do I "need" the emitter? i.e. XL2420T vs XL2420TX

BTW, I hear the XL2411T is a better LightBoost monitor than the XL2420T. I've ordered one from the UK; it's still on sale, and there's no VAT when shipping to North America -- even with $50 shipping, it's still cheaper than buying one at a local computer store.
> It's possible on Vista64 (maybe Win7, not Win8) and all 120Hz 3D Vision displays. A simple bug activates black frame insertion for every second frame. It's perfect for 60fps-limited games or slow graphics cards. Games are limited with V-Sync on to 60fps, but the display runs in 3D mode at 120Hz. I checked it with a DVI capture card, and yes, that's really black frame insertion. 60fps games are extremely sharp (they look just like real 120Hz).
>
> --> connect the display with a DVI dual-link cable
> --> set the monitor refresh rate to 120Hz
> --> make sure the nvidia USB dongle is connected
> --> go to "Nvidia Control Panel"
> --> activate "Enable stereoscopic 3D"
> --> click on "Test Stereoscopic 3D"
> --> set the resolution and refresh rate to 1920x1080@120Hz and launch the test application
> --> when you see the moving Nvidia letters, press the Win key or Alt+Tab
> --> now the Windows desktop is darker and has heavy flickering like an old 60Hz CRT (the 3D Vision glasses don't work correctly; the right lens is darker than the left) (the display is in 3D mode)
> --> close "Nvidia Control Panel" (leave 3DCP/nvsttest.exe open!) and reopen it
> --> deactivate "Enable stereoscopic 3D"
> --> start a game at 120Hz (the display and USB dongle don't turn off 3D mode)

Excellent tip, thank you. Although flicker makes it mostly unusable, it definitely shows that combining LightBoost with black frame insertion gives you the zero-motion-blur effect at 60Hz. This is perfect for MAME, emulators, etc. The CRT flicker adds an air of authenticity. And Super Mario "Nintendo smooth" scrolls with no blur or ghosting, too.
> Excellent tip, thank you. Although flicker makes it mostly unusable, it definitely shows that combining LightBoost with black frame insertion gives you the zero-motion-blur effect at 60Hz. This is perfect for MAME, emulators, etc. The CRT flicker adds an air of authenticity. And Super Mario "Nintendo smooth" scrolls with no blur or ghosting, too.

I wonder if black frame insertion would be good with a 240Hz-capable monitor, running at 120Hz with black frame insertion every other frame...
Hopefully this becomes an official feature sometime, or somebody creates a System Tray Utility to do software-based Black Frame Insertion for 120Hz displays.
> I wonder if black frame insertion would be good with a 240Hz-capable monitor, running at 120Hz with black frame insertion every other frame...
I'd say it's quite similar. Although TechNGamer is saying it's better than CRT, I think it's not better (yet), but it's definitely in the same neighbourhood. Far more CRT than LCD (even when comparing to a normal 120Hz LCD).
Most people are using 60Hz LCDs, so based on math calculations from my strobe measurements (Casio 1000fps camera), the VG278H would have 85% less motion blur than most people's computer monitors (2.5ms of eye-tracking-based blur, rather than 16.7ms). I'd say that throws it within a stone's throw of CRT, which I feel has 90-95% less motion blur than LCD (primarily the green ghosting effects, etc.). Very hard to tell. On my VG278H, motion looks more CRT than LCD, for sure. The limiting factor is the response-time-acceleration artifacts, which trail faintly. But there's absolutely no phosphor ghosting, and edges are sharp in fast motion, like CRT. It's much better than I expected, and can only get better with improved LCDs and improved strobe backlights (if the manufacturers are willing to chase down this path). I can now easily see a 1-pixel gap in the PixPerAn chase test, something never seen before on any other LCD, although it does have visible distortions caused by response-time acceleration. I'm able to read the "I NEED MORE SOCKS" text on the PixPerAn racecar perfectly even when it's running at 960 pixels per second. Being able to read tiny text while it's zooming across my screen at 960 pixels per second is something I've never been able to do with any LCD, ever -- until now. I estimate approximately half a pixel of motion blur at this speed; hard to tell. (960pps = an 8-pixel step per frame; configure PixPerAn accordingly.) It's a blurry mess on a 60Hz LCD. Once you move the car faster than roughly this speed, it's hard to track with my eyes (even if it were a CRT), but there finally appears to be a very ultra-faint amount of motion blur when you start moving objects at 1920 pixels per second (very hard to track the eyes that fast), so you can now finally just barely notice a possible minor shortfall relative to CRT (psychologically it feels like only a "1%" shortfall). I'll have to do more testing in pushing the limits.
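The 85% figure and the 8-pixel-per-frame figure above both fall out of simple arithmetic. A minimal sketch, assuming (as the post does) that eye-tracking blur is proportional to how long each frame stays visible, and taking the ~2.5ms strobe and 16.7ms (60Hz) hold times as given:

```c
/* Percent blur reduction when each frame is visible for visible_ms
   instead of baseline_ms (eye-tracking blur assumed proportional to
   visible time; figures from the post, camera measurement not reproduced). */
double blur_reduction_pct(double visible_ms, double baseline_ms) {
    return 100.0 * (1.0 - visible_ms / baseline_ms);
}

/* Pixel step per refresh for a given scroll speed, as used to configure
   PixPerAn (e.g. 960 px/s at 120 Hz = 8 px per frame). */
double px_per_frame(double speed_pps, double refresh_hz) {
    return speed_pps / refresh_hz;
}
```

Plugging in 2.5ms against a 60Hz hold of 1000/60 ≈ 16.7ms gives the quoted ~85% reduction, and 960px/s at 120Hz gives the 8-pixel PixPerAn step.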
For all practical purposes, it looks like zero motion blur, unless I'm really trying damn hard to tire my hand out flicking my mouse so fast that I can only barely track my eyes on the almost-too-fast motion, and then I finally barely notice maybe 1 or 2 pixels' thickness of motion blur -- and only if I'm super aggressive at trying to move objects across my screen extremely fast. On the other hand, I've also seen worse CRTs: long-persistence-phosphor CRTs with more objectionable phosphor ghosting than what I'm seeing on the VG278H. I will say this much thus far: given a scale of traditional LCD = 1 and CRT = 10, I'm giving the VG278H a "9" in sharpness of fast motion (the zero-motion-blur effect). To my eyes, motion on the VG278H (+ zero-motion-blur tweak) looks 10 times better than on a 60Hz LCD.
What's far more noticeable (but must be weighed against CRT phosphor-ghosting artifacts) is that there are some minor trailing RTC/ghost/corona artifacts, much like you see in some reviewer photos. But surprisingly, the RTC ghosts/artifacts don't show up very much in games; they only affect edges between certain colors at certain speeds. It's hard to describe; the ghost/corona artifact is very different from what you normally see on non-strobed displays. Basically, the RTC ghosts/coronas are an extremely razor-sharp double image chasing a few pixels behind a sharp moving image: that's the side effect of an extremely slight (1%) pixel-persistence leakage into the next frame for certain color contrasts. But I can't actually see this effect in video games, and it's less noticeable than phosphor ghosting trails (though it depends on the colors and the game -- it could be worse or better). It only happens when you've got big edges between large solid colors, and only between certain colors. It's similar to 3D shutter-glasses leakage, except it's a faint razor-sharp double edge chasing a razor-sharp edge. So I've traded CRT phosphor ghost artifacts (green ghosting in dark scenes) for a faint-but-razor-sharp ghost/corona artifact.
However, the edges of everything remain razor sharp during fast movement, just like CRT. (I may replay Half Life 2 again like this. It's a new experience now, with CRT-sharp motion)
Watch your GPU if you want to do zero-motion-blur gaming; don't bother if you don't have a high-end 600-series GeForce card. Also, adjust video game settings to turn off GPU-based motion blur effects. (Although artistically desirable for some moments, like being injured, I don't like artificial motion blur being enabled at all times in a video game.) You may also have to lower your detail level a bit, just to cap out at 120fps@120Hz. With zero-motion-blur gaming, you will now be able to notice frame drops even at 120Hz (where you weren't able to before, because of LCD motion blur). Much like CRT at 60fps@60Hz, where you quickly notice single frame drops, or CRT at 120fps@120Hz, where you occasionally notice them. The zero-motion-blur effect makes frame drops far more noticeable; some people will hate this, others don't mind it all that much. CRT video gamers are already familiar with this effect -- it's the same.
Remember, I'm talking about motion blur, not other aspects (color quality). I've had time with a friend's FW900, and I also used to have a 21" CRT that could do 120Hz. For my Asus, color quality (e.g. for Photoshop) is not as good as a well-calibrated FW900. It's a TN panel -- temper your expectations for color quality; it will be a disappointment for color-quality snobs. But I can say the motion blur is in "CRT territory". It totally feels like zero motion blur. Not noticeably better, not noticeably worse -- just different artifacts (very faint sharp-edged RTC artifacts instead of CRT phosphor ghosting). It may be that I'm more picky about IPS quality than TechNGamer is, so you've got to pick your poison. (You want good photo viewing? IPS! You want zero motion blur on an LCD? TN + LightBoost, baby!!!)
P.S. Although it looks great at just about any fps that doesn't microstutter too much -- have I said this already? -- you _really_ do need to cap out the fps at the same rate as the strobes (120fps@120Hz) for the _best_ zero-motion-blur effect. That's very hard to do in the newest video games without GTX680 SLI, short of aggressively lowering detail.
P.P.S. Regarding the XL2420TX vs the XL2411T, I'm pretty unsure. I'd lean towards the one with the 1ms rating, but the bottom line is the length of the strobes on each monitor. Is the shutter-glasses emitter built into the XL2411T like it is on the Asus VG278H? If so, then it's relatively easy to force-enable LightBoost in 2D mode -- get the XL2411T. The lower ms rating is a good sign, because better response-time acceleration (fewer RTC artifacts) is useful, but it's not a guarantee of a better strobed monitor, or of remaining bright during strobes (non-strobed brightness doesn't always guarantee sufficient brightness during strobed modes). We need someone well-connected to test them all, or someone near a massive elite computer store, to do some tests...
IPS response time is slower, so the results won't be the same. 120Hz + very high fps results in smoother motion tracking regardless, but not reduced blur. I'm not sure how a 10ms+ IPS compares to the 1ms XL2411T or other 2ms 120Hz TNs, or even 3ms 120Hz TNs, if LightBoost strobing were enabled on all of them. I also doubt that IPS panels have the same amount of aggressive response-time compensation available in them.
I'd rather have CRT-like performance at 1080p even if I had to use a 24" monitor. I play on a 120Hz 27" at 1080p now (without LightBoost), but it is dedicated to gaming only. I use a 2560x1440 IPS for everything else on the desktop.
Depends on what kinds of games you play. Fast-twitch online games benefit hugely from CRT-style motion (provided by LCD LightBoost). Or are you playing slower exploration/adventure games, or Civilization/SimCity-type games? (You'll appreciate the color quality of IPS.)
Also, IPS 120Hz has slightly more motion blur than non-LightBoost TN 120Hz LCD.
Gotta pick your poison. For me, being used to CRT, regular 120Hz is NOT "a little motion blur" -- it's still a blurry mess in arena-type games compared to CRT. Just half as blurry as 60Hz.
Hopefully someday, we will get 1440p+IPS+LightBoost. Probably not for several years, alas.
> Did you really just say a 2560x1440 60Hz IPS panel has the exact same motion blur as a 2560x1440 120Hz IPS panel??? I have a Catleap 2B and an Overlord Tempest OC, and the blur is night-and-day different between 60Hz and 120Hz.

Agreed. Although 120Hz TN (non-LightBoost, too) has less motion blur than 120Hz IPS, 120Hz IPS still has less motion blur than 60Hz IPS. The pixels on IPS cannot transition fast enough for an exact halving of motion blur from 60Hz to 120Hz, but there IS a reduction of motion blur from 60Hz to 120Hz IPS. Just not as much as the blur reduction from 60Hz to 120Hz TN, due to the slower IPS pixel response.
I'm saying that, from what I understand, they don't have the same combination of factors to reduce blur as much as a much-lower-response-time 120Hz TN with aggressive RTC.
> If it comes down to the difference being clear readable text and clear "readable" high texture detail (incl. bump mapping, etc.) during FoV movement at speed -- or not -- for any time in the foreseeable future, my choice for (dedicated-to) gaming would be the clear one.

Yah, um, no... sorry, but you just don't know what you are talking about. You're letting YOUR 60Hz IPS 2560x1440 panel bias your opinion versus the 120Hz 2560x1440 panels. I HAVE OWNED the Asus VG278HE 144Hz and the Samsung S27A750 & S27A950 (the key word is owned, not talk out my ass), and the 2560x1440 120Hz IPS panels blow them out of the fucking water. Whatever math applies to the 60Hz versions (pixel response, FoV movement, bump mapping, etc.) gets thrown out the window once you go 120Hz, which is why it would be very interesting if somebody like Mark could test them out and give us the real numbers on exactly what happens when an IPS goes 120Hz. If you don't believe me, ask Vega, because he's owned all of those displays too, and he will be the first to tell you that 60Hz 2560x1440 IPS is SHIT compared to 120Hz 2560x1440 IPS.
> Where is there a 240Hz monitor?

Nowhere, yet. But there are televisions refreshing at 600Hz+, so it's easily feasible.