ASUS/BENQ LightBoost owners!! Zero motion blur setting!

Is there a reason for this, or is it just a software limitation?

My understanding is that any LCD manufacturer could implement a similar solution to "lightboost" which could run independently of the video input. I just see it as entries in the LCD's EDID list which would be strobe presets.

However, all current "LightBoost mode" LCDs seem to be locked to nVidia hardware.
 
I really doubt that. For one thing, IPS response time is much too high to reach CRT clarity. It would have to speed up drastically to 1ms~2ms (it's typically 10ms+ now, or thereabouts; regardless, much too high by comparison). Also, no IPS monitors have the 2ms (let alone the 1ms that achieves 94% CRT clarity) LightBoost 2 strobed backlight technology, or any equivalent tech.
A correction: it is not necessary to have a 1ms pixel response to benefit from a 1ms strobe backlight. A 5ms monitor could successfully achieve zero motion blur if the remaining ~3ms (of an 8.33ms 120Hz refresh) provides a clear frame suitable for a strobe backlight.

Repeat: it is not necessary for an LCD to go all the way down to 1ms or 2ms persistence to achieve MPRTs of 1ms, provided it can finish its pixel transitions in total darkness BEFORE the backlight strobe, which in turn comes BEFORE the next refresh.

Example IPS 120Hz refresh with CRT quality motion:
5ms -- IPS persistence
1ms -- Safety margin & response time acceleration timing
1ms -- strobe the backlight
1.3ms -- safety margin before the next refresh pass
TOTAL 8.3ms (1/120sec)
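The budget above is simple arithmetic; a minimal sketch (using the hypothetical figures straight from the example, not a real panel spec) that checks the numbers fit inside one 120Hz refresh:

```python
# Hypothetical strobe-timing budget for one 120 Hz refresh (~8.33 ms).
# All figures come from the example above, not from a measured panel.

REFRESH_HZ = 120
refresh_ms = 1000 / REFRESH_HZ  # ~8.33 ms per refresh

budget = {
    "IPS pixel persistence": 5.0,
    "safety margin / RTC settling": 1.0,
    "backlight strobe": 1.0,
    "margin before next refresh": 1.3,
}

used_ms = sum(budget.values())
# The whole budget must fit inside a single refresh for clean strobing.
assert used_ms <= refresh_ms, "budget must fit inside one refresh"

for item, ms in budget.items():
    print(f"{item}: {ms} ms")
print(f"total used: {used_ms:.1f} ms of {refresh_ms:.2f} ms available")
```

The 1ms strobe is the only part of the cycle the eye ever sees, which is the whole point: the 5ms of pixel transitions happen in darkness.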

Presto, you've turned a 5ms IPS panel into a zero-motion-blur panel at 120Hz refresh (8.33ms per refresh) with potential MPRT benchmark measurements of 1 millisecond. The challenge is good response-time-acceleration logic to eliminate the ghosting and bleed beyond around 5ms. That can be a big challenge.
But this is already being done successfully with full-size HDTVs, apparently: 3D IPS active-shutter-glasses sets. Just not with computer monitors.

This is easier at a lower refresh rate; doing it at a higher refresh rate (120Hz) is harder, because a 5ms pixel persistence often bleeds quite a lot beyond 5ms, and a single 1/120sec refresh leaves little safety margin (only ~3.3ms left over after the 5ms pixel persistence) of a stable frame during which to strobe the backlight.


MarkR has hypothesized that if someday someone developed a new scanning backlight, or a similar LightBoost-2-equivalent 1ms strobe for IPS, IPS blur could maybe be reduced 50-75%, but still nothing like full CRT clarity.
This is already being done with full size HDTV's so it definitely can be done.
See http://www.scanningbacklight.com/existing-tech.
The problem is they do it in a way that adds a lot of input lag, and often combines it with motion interpolation.

The response times would have to be improved drastically to 1ms on IPS, combined
False. Read above.
Pixel persistence is bypassed by the strobes, and therefore the strobe length (seen by the eyes) can be shorter than the pixel persistence cycle itself (unseen, in the dark). The trick is to erase all remnants of the previous frame BEFORE the backlight strobe, which comes BEFORE the next refresh.

Also, it's already being accomplished successfully in full-size HDTVs. Otherwise, how would IPS active 3D be possible? It's just a matter of a proper strobe backlight, and of making them low-input-lag (game-friendly) like LightBoost.

The correct question to ask is: why do we have IPS active-3D displays for living rooms, but not computer monitors? (IPS active 3D automatically means a zero-motion-blur-capable panel. It just needs a game-friendly, low-input-lag strobe backlight for 2D usage. Alas, the manufacturers aren't doing that yet.)
 
My understanding is that any LCD manufacturer could implement a similar solution to "lightboost" which could run independently of the video input. I just see it as entries in the LCD's EDID list which would be strobe presets.

However, all current "LightBoost mode" LCDs seem to be locked to nVidia hardware.
Correct.
The limitation to nVidia exists because nVidia came up with the technology specifications and shopped them out to the manufacturers, creating an artificial lock-in to nVidia.

I am sure that a software utility sending the appropriate DDC commands to the monitor (to enable LightBoost) would do the trick. In addition, if necessary, it could send the correct timings (e.g. an enlarged VSYNC interval that's equal to, or longer than, the pixel persistence).

Someone needs to reverse engineer the NVTIMINGS file and then decode the custom timings used for LightBoost modes, to allow:
- Different LightBoost refresh rates with nVidia cards (ideally 85Hz);
- Compatibility with ATI cards (when set to the same timings, and DDC-commanded to turn on strobing).

Does anyone here have recent DDC logging/monitoring experience? I know ToastyX does; we could ask him. We need to crack the DDC command to turn on LightBoost. The timings crack is the easy part, as I am very familiar with front/back porches/intervals, and horizontal/vertical timings.
 
Glad to hear another report of awesomeness. However, a few things in your further thoughts struck me right off the bat.

First of all, when it comes to gaming, PQ includes movement, so opting for a higher resolution and more accurate, uniform color trades PQ *during* movement for a sloshy mess that obliterates all object detail, texture detail, shaders, depth via bump mapping, text/nameplates; basically the entire scene. Personally, I think jack-of-all-trades isn't the way to go anyway, and I prefer to dedicate at least one monitor to desktop use and one to gaming. I've been doing it for years with different panel types.

Understandable. However, this is purely a personal preference. I too have a different monitor with higher PQ, but if I could get a single monitor that does it all, then by all means I would jump aboard. It's also important to keep in mind that not everyone wants a multi-monitor setup, whether because of a tight budget or limited space. So it's important to remember that while this is an enthusiast forum, not everyone has the same tastes and/or concerns. I tried to scope my impressions broadly to include anyone who is interested but has limited funds. But I do get your opinion overall.

I really doubt that. For one thing, IPS response time is much too high to reach CRT clarity. It would have to speed up drastically to 1ms~2ms (it's typically 10ms+ now, or thereabouts; regardless, much too high by comparison). Also, no IPS monitors have the 2ms (let alone the 1ms that achieves 94% CRT clarity) LightBoost 2 strobed backlight technology, or any equivalent tech.

Perhaps, but advances are being made all the time. IPS as a panel technology probably won't ever be able to hold a candle to faster panels' response times. Again, this is more of a personal desire, and therefore not really a matter of what's likely to reach production.


In order to get the most out of even a non-LightBoost-2 120Hz monitor's motion-tracking smoothness (not to be confused with blur reduction) and the more recent action data shown per Hz, you need to keep a very high fps; ideally higher than a 120fps average, to cover dips during complex scenes. Suggesting a 120Hz or 120Hz+ IPS at the more demanding resolution of 2560x1440 as a way to reduce demand on your GPU doesn't sound completely right to me. Yes, you can get some motion tracking and blur reduction at over 70fps (like 75-90fps), but the resolution jump is also much more demanding, so your argument sounds flawed: that maintaining 100fps+ on a 1ms LightBoost 2 monitor for CRT clarity should be given up on, and that you should instead opt for a 120Hz IPS at a much more demanding 2560x1440, with no better than half the blur of a 60Hz TN LCD, and run it below 120fps for even less than its maximum motion-tracking and blur improvement.

I was just suggesting that if you don't have good enough hardware to run your games at the required FPS, then you will see some sluggishness. Yes, 1440p monitors running at 120Hz are far more demanding, but as you pointed out, one of the biggest flaws of IPS is its motion handling capability. Plus, it's important to remember that many GPUs simply can't show all their strengths at a lower resolution; in fact, these days running 1920x1080 at 60+ fps is pretty easy to attain. The user knows this going in, and while running a 1440p panel at 120Hz provides far smoother motion, it still doesn't hold a candle to the CRT-like smooth results of LightBoost 2.0 at 100-120Hz. Both are very demanding on hardware, and if you don't have a strong enough setup, you will probably not be happy, since achieving the effect on a consistent basis is that much harder. So consider this more of a warning for those of you who might not have an aggressive gaming PC.

If any of you are lucky enough to have a local Fry's or Micro Center, I recommend you buy one of these monitors and see the results for yourself. At least if you find you can't run it smoothly, you can simply return the monitor without a hassle.
 
I say GODDAMN Mark's explanations are good. It's like listening to a really good professor. I feel like I should be paying for this. Mad props, Mark.
Thanks for the compliment!
Pay attention to the BlurBusters.com Blog -- and keep tuned.
 
Perhaps, but advances are being made all the time. IPS as a panel technology probably won't ever be able to hold a candle to faster panels' response times. Again, this is more of a personal desire, and therefore not really a matter of what's likely to reach production.
One correction. It is NOT necessary to use a 1ms-2ms pixel persistence to permit 1ms strobes. See my explanation above about how a 1ms strobe is possible with a 5ms pixel persistence.

The problem is in panel refresher design (including RTC) & backlight design.
There are successful active 3D IPS 120Hz panels in the home theater HDTV industry as proof.
"Active 3D" equals zero motion blur capable panel, technically. (With a proper strobe backlight)

They just don't exist yet in a game-friendly, near-zero-lag form.
Nor are they available in computer-monitor sizes yet.
 
I am sure that a software utility sending appropriate DDC commands to the monitor (to enable LightBoost) would do the trick.
I'm sure it is not via DDC. Just disable DDC in the monitor menu, try to send DDC commands, and then try to enable LightBoost. On my BenQ XL2411T, when I turn DDC off in the monitor's menu, DDC commands don't work, but enabling 3D mode (LightBoost) still works.

Someone needs to reverse engineer the NVTIMINGS file and then decode the custom timings used for LightBoost modes
I guess that's the way: "C:\Program Files (x86)\NVIDIA Corporation\3D Vision\nvtimings.ini". There is an "EDID model" entry (BNQ_7F10 for the BenQ XL2411T and ACI_27F8 for the ASUS VG278). It seems these are timings for 3D output devices.
 
I guess most of us here would like to be able to use cards from both manufacturers with LightBoost monitors instead of being tied to nVidia. Sea Islands beating Kepler, or just being better value, would make for a tough choice ;)
 
Mark, a user tells me that NVIDIA has a license for this technology, and that bypassing it (without buying the 3D Kit) is not allowed.
Is this true?
Or does someone know more about this?
 
So they would need a 5ms response time and then a 1ms backlight strobe or equivalent, then. I appreciate the correction. At least that is less of a crunch goal for IPS to hit. It still stands that the tech doesn't exist in current IPS panels, so it's not as if you can "hope to make it work" to CRT-like clarity on IPS panels currently being manufactured/owned, and it's unlikely that both criteria will be produced any time in the near future in a 120Hz or 120Hz+ input desktop IPS:
--much faster IPS response times
(a solid 5ms, corrected from the 1ms-2ms I supposed in my previous post; current IPS panels often have 10ms+)
--a backlight capable of 1ms strobing, or equivalent
I'd love for something like that to be available in a desktop gaming display at 120Hz or 120Hz+. I just get the feeling it would be years away, if ever; I'd gladly be wrong about that. Most higher-res desktop IPS panels due out (Quad Full HD, retina displays, etc.) are still 60Hz, so gaming motion doesn't seem to be a priority.
An aggressive PC is definitely needed for most modern titles to maintain 100fps, let alone 120fps, on any 120Hz monitor type (LightBoost 2 or not), so this is not for the budget PC on most games if you want to get the most out of 120Hz or 120Hz+LB2. Of course, it is easier to hit the higher fps goals at 1080p than at 2560x1440. A single 680 should be able to get 120fps in Skyrim at 1080p on high, I think, so 100 and even 120 might be doable in some games. Source games can fly and are a lot of fun, so they're no problem either. I'm sure a lot of games in my Steam library would run at 100fps for 100Hz LightBoost 2 at fairly high custom game settings (rather than at ultra++^10), other than some of the most modern and most demanding ones. I might be in for a wait, though, since buying both the monitor and nVidia card(s) is a good chunk of upgrade money, and within a quarter (after my tax return gets here) the GTX 780 might be out. I agree it would be better if we had AMD as at least an option. There could also be more LightBoost-2-style monitor options around if I wait it out (no idea on this, just a hope). I'm still debating what I'll do after I get my tax return, but I am leaning toward holding out, especially if I keep a dual-GPU upgrade in consideration.
 
I've tested my 130Hz Catleap; it takes a full 10ms for its pixels to make a complete change.
 
Hey Mark, how did the tests with your Arduino+photodiode input lag meter go? I know you were unhappy with the accuracy, but do you have a ballpark figure for input lag compared with non-LightBoost?
 
Mark, some user tells me that NVIDIA has a license for this technology and bypassing it (without buying the 3D-Kit) is not allowed.
Is this true?
Or does someone know more about this?
I have not seen the specifics of the nVidia license, but some facts:
(1) There's nothing forbidding end users from bypassing or hacking the technology;
(2) There are non-LightBoost methods of doing strobe backlight technologies, see Existing Technology as well as Scanning Backlight FAQ.

I assume that's an nVidia request to computer monitor manufacturers, for the "whole 3D + LightBoost experience", but they don't have to specifically license nVidia's variant of the technology. Various kinds of strobe/scanning backlights have been patented; nVidia is not the only game in town. There are also ways to do it in a patent-free manner.
 
Hey Mark, how did the tests with your Arduino+photodiode input lag meter go? I know you were unhappy with the accuracy, but do you have a ballpark figure for input lag compared with non-LightBoost?
I found the problem, and it is something I can fix, but it's just a simple time issue -- I've been distracted by many projects. It will be a big BlurBusters.com Blog update once it's refined enough to release.
 
I've tested my 130 Hz Catleap, it takes a full 10ms for pixels to make a complete change.
Yes. That said, there are 3D IPS 120Hz shutter-glasses HDTVs. That's not possible without bringing the pixel response to within one 120Hz refresh, so apparently they've somehow succeeded. This means there are sufficiently clear refreshes for shutter glasses. This also automatically means those panels can do zero motion blur, since there are clear frames for a strobe backlight. Anything shutter-worthy is strobe-worthy.
 
If all this works, why can't we buy/hack 3D glasses and get them synced on both eyes, so we get LightBoost on any monitor?
 
So they would need 5ms response time and then a 1ms backlight strobe of equivalent then. I appreciate the correction. At least that is less of a crunch goal for ips to hit. It still stands that the tech doesn't exist in current ips so its not like you can "hope to make it work" on current ips being manufactured/owned (to crt-like clarity), and its unlikely that any time in the near future both criteria will be produced in a 120hz or 120hz+ input desktop ips:
That said, there are 3D IPS 120Hz shutter glasses HDTV's. That's not possible without bringing the pixel response to within one 120Hz refresh, so apparently, they've somehow succeeded. This means there are sufficiently clear refreshes for shutter glasses. This also automatically means those panels can do zero motion blur, since there's clear frames for a strobe backlight. Anything shutter-worthy is strobe-worthy.

Just as a 2ms TN panel with lots of trailing/ghosting can have it almost eliminated with good response-time-acceleration technology, a 5ms IPS with 10ms trailing can be converted to a 5ms IPS with nearly zero trailing by good response-time-acceleration technology. I think that's how they made it possible.

Otherwise, can you explain how these 3D IPS 120Hz HDTVs exist today, in elementary terms? I'm not talking about the polarized 3D displays. Clearly, they've managed to response-time-accelerate the IPS LCDs to less than 10ms on those panels. Otherwise, it's a physics violation: you can't cram 10ms of pixel response into a 1/120sec refresh.

Example: Panasonic VIERA TH-L42ET50
It's an IPS panel.
It's shutter glasses 3D.
If what you are saying is true, then tell me: how is this TV possible?!

Somehow, Panasonic managed to erase IPS smearing between 8.33ms refreshes (1/120sec = 8.33ms) with various kinds of response-time-acceleration technologies. This panel would therefore be zero-motion-blur compatible with the appropriate strobe backlight, because shutter-friendly refreshes equal strobe-friendly refreshes. So, there you go! It's possible; it's already done. (Incidentally, Panasonic uses an "800Hz" scanning backlight on this HDTV -- but unfortunately it has lots of input lag.)

We need to see this technology arrive in computer monitor size panels; that's the problem -- it's going to take a while.
 
If all this works, why can't we buy/hack 3D glasses and get them synced on both eyes, so we get LightBoost on any monitor?
Do you mean "LightBoost" as the zero-motion-blur strobe effect? Yes, theoretically, 3D shutter glasses could provide the strobe effect for the zero-motion-blur effect: open and close the shutter quickly in 1ms bursts for each eye, as an example. However, that is not practical, because:
-- Shutter glasses have a persistence (like pixel persistence)
-- 3D shutter glasses make everything very dark. A 1ms shutter open-close would make things about 8 times darker within a 1/120sec (8.33ms) refresh
-- Synchronizing the glasses precisely to different monitors is a big challenge, with different monitors having different input lags and synchronization needs
-- You need response-time acceleration and picture settings very precisely optimized for the strobes (that's why picture adjustments are locked in LightBoost)
-- You need a blanking interval period longer than the pixel transition time.

It's a lot better to use the backlight as the method of strobing, since LEDs switch really fast and you can get a lot more light. And it's compatible with both 2D and 3D.
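The "about 8 times darker" figure in the list above comes straight from duty-cycle arithmetic: light reaches the eye only while the shutter (or strobe) is open. A minimal sketch using the 1ms/120Hz numbers from that bullet:

```python
# Duty-cycle arithmetic behind the "about 8 times darker" estimate.
# Figures taken from the post: 1 ms open time per 120 Hz refresh.

refresh_ms = 1000 / 120   # ~8.33 ms per refresh at 120 Hz
open_ms = 1.0             # 1 ms shutter/strobe open-close per refresh

duty_cycle = open_ms / refresh_ms     # fraction of time light gets through
dimming_factor = 1 / duty_cycle       # how many times darker than always-on

print(f"duty cycle: {duty_cycle:.1%}")      # ~12%
print(f"~{dimming_factor:.1f}x darker")     # ~8.3x, matching "about 8 times"
```

The same math explains why a strobe backlight is the better tool: LEDs can be driven much brighter during their short "on" window to compensate, while shutter glasses can only block light.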
 
Yes, I recall that it seems to exist in some form in TVs (considering the evidence of the speed required for IPS 3D), since I did read all of your prior posts. Maybe it is aggressive response-time compensation added to the IPS TVs, as you guessed; I don't know. I should have been more specific throughout my post: what I was saying referred to it not existing in current desktop IPS panels, for the outlined reasons (they have higher response times, perhaps lacking aggressive RTC, and no 1ms backlight or equivalent tech that can be enabled to a LightBoost-2-type result), and to the fact that I didn't think it would show up in desktop IPS panels any time soon. The comments were more directly a reply to people who seemed to think this could somehow be made to work to full clarity on current *desktop* IPS panels, or most likely on any desktop IPS panel in the near future. I didn't mean to imply that it couldn't be done, just that I think it would be quite a wait (years?), if they ever deign to bring it to a desktop IPS. As I said, I'd gladly be wrong about that, and have some 3D IPS desktop monitor (1080p or not) come out on which this could work to the full benefit of 94%/CRT clarity (5ms or less response time, a 1ms backlight strobe or equivalent that can be enabled/matched, and 120Hz input). The push on high-detail screens now seems to be retina displays, QFHD, 4K, etc. for higher-res displays, all of which are to be 60Hz as far as I have seen. It doesn't seem like desktop IPS has a strong gaming (or 3D) push. Even the 120Hz+ Korean IPS panels were not really manufactured intentionally to be 120Hz gaming monitors.
 
Just tested a friend's low-end Samsung 3D HDTV. I think it's a Series 6.
It only does 3D at 720p@60Hz and 1080p@24Hz.

I can't play games in 3D, and I also didn't manage to play a movie; only the nVidia 3D wizard and test app worked.
The interesting thing is that when running a 3D resolution, the TV is in strobe mode (active shutter glasses). You can play games at 60fps@60Hz with the glasses on, and they look blur-free.


Also, why is it important that an IPS screen finish the pixel transition in under 5ms?
Since the picture will be strobed, it should look blur-free even if the final color has not been reached. At least that's what I'm assuming.
 
That way you would always be looking at colors that are off. The only way to strobe is with a clean picture for at least the duration of the strobe, or to use a scanning backlight to give the pixels a tiny bit more time to settle. But I think in order to do a decent scanning backlight, there should be at least 1ms for the pixels that are lit at that moment, so at 120Hz you would still need to be able to fully switch the pixels in at most 7.3ms (for scanning a single pixel), and quicker depending on how many zones you are using. Anyway, you will always want a pixel to be fully done switching before beaming it to your eyes, or you will see the ghosting, blur, and other artifacts that you are actually trying to block out by using the strobes, made super-obvious.
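The 7.3ms figure above is simple subtraction; a tiny sketch of that reasoning (assuming, as in the post, a single zone that needs 1ms of stable, lit pixels per 120Hz refresh):

```python
# Back-of-envelope: if the pixels under a scanning-backlight zone must be
# stable for at least 1 ms while lit, they must finish switching within
# (refresh period - stable time). Single-zone best case; all times in ms.

refresh_ms = 1000 / 120   # ~8.33 ms at 120 Hz
lit_ms = 1.0              # minimum stable/lit time per zone, per the post

deadline_ms = refresh_ms - lit_ms
print(f"pixel-switch deadline: {deadline_ms:.1f} ms")   # ~7.3 ms
```

More zones shrink this deadline further, since each zone's pixels must settle before that zone's slice of the refresh lights up.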

Edit: correct me if I'm wrong, please!

Btw, I have a VG278HE and a 6970; if anybody needs some testing done for a possible hack, I'd be happy to help.
Let's get this stuff working on all cards! I see it more as a screen setting anyway, and I paid for that screen including LightBoost, so let me use it, please.
 
Just tested a friend's low end Samsung 3d HDTV. I think it's series 6.
It only does 3d at 720p@60Hz and 1080p@24Hz.

Can't play games in 3d and also didn't manage to play a movie, only the nvidia 3d wizard and test app worked.
The interesting thing is that when running a 3d resolution, the TV is in strobe mode (active shutter glasses). You can play games at 60fps@60Hz with the glasses and they look blur free.
Interesting. You must have triggered black frame insertion -- running at 120Hz, but blacking out every other refresh. Does the picture dim significantly in strobe mode?

Also, why is it important that an IPS screen would finish the pixel transition in under 5ms?
It doesn't _have_ to be under 5ms, just that these criteria are met:

(1) Pixel transitions are practically finished (99%+) BEFORE the next refresh begins.
So you've got a pixel transition deadline of 8.33ms for a 120Hz refresh, or 16.7ms for a 60Hz refresh.

(2a) ...For entire-backlight strobing (a la LightBoost), you also have to account for the top-to-bottom LCD "scan":
VIDEO: http://www.youtube.com/watch?v=hD5gjAs1A2s
You need to wait for the refresh "scan" to finish, THEN wait for the pixel persistence of the most recently scanned pixels to finish. You need a long vertical blanking interval, at least as long as the pixel persistence itself, so that the last-scanned pixels have finished transitioning before you strobe the backlight.

(2b) ...For a scanning backlight (one row of LEDs flashing at a time, in a sequential pattern, top to bottom), there's no need to wait for the top-to-bottom LCD "scan" to complete before strobing, but you have the additional disadvantage of backlight diffusion (LEDs that are on will leak light all over the panel, putting an upper limit on the reduction of motion blur).

Your zero-motion-blur strobe backlight margin is (total_refresh_length - (LCD_panel_scan_time + pixel_transition_time)). If you take 4ms to scan the panel top-to-bottom, and 2ms for pixel persistence, then you've got (8.33 - (4 + 2)) = 2.33ms left out of an 8.33ms refresh during which to strobe the backlight. When I say "pixel transition", I mean the time until pixel transitions are virtually complete (at least 95%+, preferably 99%+); you can strobe just as the pixel passes its final value, and turn off the backlight before the pixel overshoots too much (pixel ripple). That's probably why most overshoot artifacts often disappear when you enable LightBoost -- coronas disappear! The strobe is essentially equivalent to the proverbial "camera flash" that captures the current state of the LCD for the instant the backlight is on. At 120 flashes per second, persistence of vision (flicker fusion) makes it look flicker-free to the eyes.

Your (near) zero-motion-blur scanning backlight margin is (total_refresh_length - pixel_transition_time - backlight_bleed_factor - granularity_factor). So at 120Hz, with a 5ms panel (response-time compensated), backlight bleed from a row of LEDs (10% of screen height = ~0.8ms), and a granularity of 20 rows of LEDs (5% of screen height = ~0.4ms), you have: (8.33ms - 5ms - 0.8ms - 0.4ms) = ~2.1ms of strobe margin per row of scanning-backlight LEDs. Also, you still have minor backlight bleed covering 100% of the LCD surface, tantamount to approximately 5-10% of the brightness of the LEDs, so you will have slightly more ghosting artifacts with a scanning backlight than with just a simple strobe backlight.
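The two margin formulas above are plain arithmetic; here they are as a small sketch (the numbers are the examples from these paragraphs, not measurements of any real panel):

```python
# The two "hobbyist" strobe-margin formulas from the post. All times in ms.

def full_strobe_margin(refresh, scan, transition):
    """Time left to flash the whole backlight after the top-to-bottom scan
    AND the pixel transitions of the last-scanned rows have both finished."""
    return refresh - (scan + transition)

def scanning_margin(refresh, transition, bleed, granularity):
    """Per-row strobe margin for a scanning backlight; bleed and granularity
    are the backlight-diffusion and LED-row-height penalties."""
    return refresh - transition - bleed - granularity

refresh_120 = 1000 / 120  # ~8.33 ms

# Post's examples: 4 ms scan + 2 ms persistence -> ~2.33 ms full-strobe margin;
# 5 ms RTC'd panel, 0.8 ms bleed, 0.4 ms granularity -> ~2.1 ms per-row margin.
print(f"full strobe: {full_strobe_margin(refresh_120, 4, 2):.2f} ms")
print(f"scanning:    {scanning_margin(refresh_120, 5, 0.8, 0.4):.2f} ms")
```

A negative result from either function means the pixel transitions leak into the next refresh, so there is no clean frame to strobe at all.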

The above becomes much easier to understand if you watch the high-speed video of an LCD scanning pattern: http://www.youtube.com/watch?v=hD5gjAs1A2s .... This is a 2ms panel running at 120Hz refresh (flickering back and forth between black and white). You'll notice that the pixel transitions are mostly finished after approximately 1/4 the height of the screen, in a top-to-bottom fashion. It so happens that 1/4 of 8.33ms is approximately 2ms. So pixels are mostly stable for 6ms out of an 8.33ms refresh. But you'll have to account for the low-resolution granularity of the LEDs as well as backlight diffusion, to fudge the safety margin.

You will still get significant motion blur reduction even if you violate these margins; it just won't be as high-quality. The ghosting gradually becomes more and more visible the longer the strobe lengths are, especially when you've leaked into your margin. You will also start to notice asymmetric ghosting artifacts (e.g. the top edge having more ghosting than the bottom edge) if you try to push the limits.

Since the picture will be strobed it should look blur free even if the final color has not been reached. At least that's what I'm assuming.
You will have sharp ghost double-image problems (crosstalk) -- like leakage between eyes in 3D shutter glasses. Panels that leak too much between refreshes have very bad 3D crosstalk, and this will interfere with the "zero motion blur" effect.
 
BTW.... Compare the high-speed video of a slow 2007-era LCD and a fast 2012-era LCD .... You'll clearly see that zero motion blur is impossible with the 2007 LCD, because the pixel persistence leaks into the next refresh: there's no clear refresh during which to strobe a backlight. Whereas you'll notice this is no longer a problem with 2012-era 3D-compatible LCDs.
 
So are you saying:
for a "full screen" simple strobe backlight, a 4ms response time in order to get a 2.x ms strobe margin,
and for a scanning-line backlight, a 5ms response time in order to get a 2.x ms margin?
(And faster response times just increase that margin?)
Plus with scanning backlight
-- "you have the additional disadvantage of backlight diffusion (LED's that are on, will leak light all over the panel, putting an upper limit to the reduction of motion blur)."
--"still have minor backlight bleed covering 100% of the LCD surface, tantamount to approximately 5-10% of the brightness of LED's, so you will have slightly more ghosting artifacts with a scanning backlight than with just a simple strobe backlight. "
For me personally, partial blur reduction (~50%, 70%, etc. versus 60Hz blur) that falls short of "CRT clarity", where the original content (high-detail objects: characters, devices, landscapes, architecture; high-detail textures; texture depth via bump mapping; other shaders; text/nameplates) is not kept clear during motion, does not interest me as much for gaming if there is an LCD option (instead of the FW900 CRT) that does keep it clear. That's just my opinion.
It's pretty demanding to maintain 100fps+ for 100Hz at high or high+ settings, let alone ultra, in some games, and more so for 120fps+ at 120Hz at 1080p; 2560x1440 would be a lot more demanding. Do you think that if a *desktop* IPS came out (if ever) with a low enough response time and a capable backlight, it would similarly be a side benefit of a display geared for 3D? And if geared toward 3D, would it be 3D gaming or 3D movies, or both? And has any (120Hz-input) 3D desktop display come out to date that was designed for either (or both) and was not 1080p? Totally hypothetical, just some thoughts.
 
So are you saying
for "full screen" simple strobe backlight 4ms response time in order to get 2.x ms strobe margin,
and for a scanning line backlight 5ms response time in order to get a 2.x ms margin?
(And faster response times just increase that margin?).
And a faster scanning speed. Don't forget that all the numbers in the listed "simplified" formulas are variables.

EXAMPLE (of a theoretical future LCD panel):
Fast scanning speed within a refresh (e.g. a top-to-bottom scan in 1/500sec = 2ms)
Fast pixel response (e.g. pixel persistence of 1.2ms)
This results in (8.3ms - (2 + 1.2)) = a big 5.1ms of strobe margin.
This is unlikely in the near term, but you can see that all three numbers before the equals sign are variables.
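That example is the same full-strobe margin arithmetic as before, just with faster hypothetical numbers; sketched out:

```python
# The full-strobe budget again, with the hypothetical "future panel" figures
# from the example above: a 2 ms top-to-bottom scan and 1.2 ms persistence.

refresh_ms = 1000 / 120                 # ~8.33 ms at 120 Hz
scan_ms, persistence_ms = 2.0, 1.2      # hypothetical, per the post

margin_ms = refresh_ms - (scan_ms + persistence_ms)
print(f"strobe margin: {margin_ms:.1f} ms")   # ~5.1 ms
```

A margin that wide would allow longer (brighter) strobes, or much shorter ones for even lower persistence, which is exactly why all three inputs being variables matters.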

The real engineering formulas are FAR more complicated than this, and there are many variables (e.g. the length of the blanking interval, the type of RTC used, what accuracy we want the pixel transitions to reach -- e.g. +/- 1% of final value? +/- 10%?). All of these complications can vastly change the numbers. But when designing a home-made strobe backlight by trial and error, the formulas I've written are sufficiently accurate for hobbyist/hacker needs.

Of course, I'm writing this in a simplified mainstream way.
From an engineering perspective, it is significantly more complicated, as there is a pixel response curve and there are complications such as pixel bounce/ripple (overshoots in response-time compensation). LCD pixels are never *perfectly* still. But you still get a very accurate impulse-driven display (more accurate impulse-driving than plasma, since there's zero temporal dithering; even though LCD is not as color-accurate as plasma, and even if the impulse-driving is not as CRT-accurate), with the right panel and the right strobe backlight.

Its pretty demanding to maintain 100fps+ for 100hz at high or high+ let alone ultra settings on some games, and more so for 120fps+ 120hz+ at 1080. 2560x1440 would be a lot more demanding. Do you think that if a *desktop* ips came out (if ever) with a low enough response time and a capable backlight , that it would be similarly a side benefit to a display geared for 3D?
A display that is good enough for shutter-glasses 3D is automatically a display that works well with a full-strobe backlight. Conversely, a display that works well with a full-strobe backlight is automatically capable of shutter-glasses 3D. So once one of them happens, the other is automatically possible.

Both situations have the same "practically complete refresh for an entire frame" requirement for shutter operation or strobe backlight operation -- pixel transitions practically finished for the entire screen, at least 99% of the way to their final values.
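As a toy illustration of that 99% criterion, here is a first-order exponential response model -- the model and the time constant are made-up illustrative assumptions, not measured panel data:

```python
import math

# Toy first-order (exponential) pixel-response model. The time constant
# tau_ms is a made-up illustrative number, not measured panel data.
def settled_fraction(t_ms, tau_ms):
    """Fraction of a pixel transition completed t_ms after it starts."""
    return 1.0 - math.exp(-t_ms / tau_ms)

# With tau = 1.0ms, the pixel is ~99% settled by about 4.6ms...
print(settled_fraction(5.0, 1.0) >= 0.99)   # True
# ...but is nowhere near 99% after only 3ms:
print(settled_fraction(3.0, 1.0) >= 0.99)   # False
```

The point: whether a panel is "shutter-worthy"/"strobe-worthy" depends on whether the transition has settled to ~99% before the shutter opens or the backlight flashes, not on the nominal ms rating alone.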

(Scanning backlights are another "complicated" dimension to this -- a scanning backlight can be combined with 3D shutter glasses to make reasonable 3D possible on a display that is otherwise not very practical for a full-strobe backlight. However, there are more cross-talk problems with this approach, due to backlight diffusion. So let's stick with the simpler matter of LCD's compatible with full-strobe backlights, since such LCD's are vastly superior from a zero motion blur perspective.)

And if geared toward 3D, would it be 3D gaming or 3D movies, or both? And has any (120Hz input) 3D desktop display come out to date that was designed for either (or both) and was not 1080p? Totally hypothetical, just some thoughts.
For 3D/zero motion blur, I'm more focused on video games and computer usage than movies.
I prefer to watch most of my movies at native 24fps -- I don't like the "Motionflow" effect or "Soap Opera" effect on my movies.

However, I appreciate that some people prefer that, and the new 48fps movies (e.g. The Hobbit) are a rather interesting topic of discussion. 48fps movies are very difficult to display smoothly at 60Hz or even 120Hz. Native 1080p/48fps playback with 3D shutter glasses could work well at 96Hz, and LightBoost is already compatible with 100Hz (pretty close) -- probably only a driver upgrade would be needed to bring 48fps 3D movies to current LightBoost monitors in the future (theoretically).
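The 48fps cadence problem above is really just divisibility arithmetic -- a quick sketch (illustrative only; "even cadence" here simply means the refresh rate is an integer multiple of the content frame rate):

```python
# Even, judder-free cadence requires the refresh rate to be an integer
# multiple of the content frame rate (illustrative arithmetic only).
def cadence(refresh_hz, fps):
    repeats = refresh_hz / fps
    return repeats if repeats == int(repeats) else None  # None = judder

for hz in (60, 96, 100, 120, 144):
    print(hz, cadence(hz, 48))
# 60 None   (1.25 repeats per film frame: judder)
# 96 2.0    (each 48fps frame shown exactly twice)
# 100 None
# 120 None  (2.5 repeats: judder)
# 144 3.0   (each 48fps frame shown exactly three times)
```

This is why 96Hz is a natural fit for 48fps 3D content, while 60Hz and 120Hz both judder on it.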

That said, I'm not really interested in watching movies on computer monitors -- my main interest in strobe backlights is for video games. Strobe backlights benefit games a lot more than they do movies. However, others have said that a strobe backlight simulates the flicker of an old-fashioned movie projector far better than a regular LCD does.
 
Thanks again for your detailed responses. I really enjoy reading them.
The only "correction" I'd like to make is that the last thing you quoted about 3D IPS was posted by me because I was trying to suggest that if an IPS were released capable of this type of backlight-induced clarity in the relatively near future, it would likely be a side effect of 3D tech in the same way as the current panels are -- and that if this were true, it would likely be a 1080p IPS. No 3D movie/3D gaming desktop display intentionally manufactured to accept 120Hz input (for gaming and/or 3D games~movies) has been released at higher than 1080p.

To be more concise, I was suggesting that if an IPS were released with this CRT clarity capability, it might similarly be a side effect of 3D backlight tech, and that it would likely be a 1080p IPS if anything, not 2560x1440 -- and I was wondering what others thought about that. Personally, I have no interest in 3D either, outside of this backlight side effect for 2D. I didn't mean to suggest that was my focus.
 
When you enable LightBoost in 2D like this, will it halve your framerate just like when you enable 3D?
 
3D doesn't halve your frame-rate, you artificially do that by having the glasses block out half the frames for each eye. In 2D Lightboost you see all of the frames.
 
3D doesn't halve your frame-rate, you artificially do that by having the glasses block out half the frames for each eye. In 2D Lightboost you see all of the frames.

Ahh, that's great. I tried anaglyph 3D and that halved my fps; I guess real 3D works differently :D
 
3D doesn't halve your frame-rate, you artificially do that by having the glasses block out half the frames for each eye. In 2D Lightboost you see all of the frames.

Actually, for active 3D the GPU has to draw separate frames for each eye, which does effectively halve the frame rate. I've got a Samsung S27A950 and you can clearly see the performance impact when you enable 3D mode, both in smoothness and on the frame rate counter. If you're just enabling LightBoost and are viewing in 2D, then the GPU won't actually be doubling the frames like this, so there shouldn't be a performance hit.
 
That said, there are 3D IPS 120Hz shutter glasses HDTV's. That's not possible without bringing the pixel response to within one 120Hz refresh, so apparently, they've somehow succeeded. This means there are sufficiently clear refreshes for shutter glasses. This also automatically means those panels can do zero motion blur, since there's clear frames for a strobe backlight. Anything shutter-worthy is strobe-worthy.

Just like 2ms TN with lots of trailing/ghosting can be almost eliminated with good response-time-acceleration technology... 5ms IPS with 10ms trailing can be converted to 5ms IPS with nearly zero trailing with good response-time-acceleration technology. I think that's how they made it possible.

Otherwise, can you explain how these 3D IPS 120Hz HDTV's exist today, in elementary terms? I'm not talking about the polarized 3D displays. Clearly, they've managed to response-time-accelerate IPS LCD to well under one refresh (8.33ms) on those panels. Otherwise, it's a physics violation -- you can't cram 10ms of pixel transitions into 1/120sec.

Example: Panasonic VIERA TH-L42ET50
It's an IPS panel.
It's shutter glasses 3D.
If what you are saying is true, then, tell me, how is this TV possible?!?!

Somehow Panasonic managed to erase IPS smearing between 8.33ms refreshes (1/120sec = 8.33ms) with various kinds of response time acceleration technologies. This panel would, therefore, be zero motion blur compatible, with the appropriate strobe backlight, because shutter-friendly refreshes equal strobe-friendly refreshes. So, there you go! It's possible, it's already done. (Incidentally, Panasonic uses an "800Hz" scanning backlight on this HDTV -- but unfortunately it has lots of input lag.)

We need to see this technology arrive in computer monitor size panels; that's the problem -- it's going to take a while.

Mark, that Panasonic doesn't state 120Hz anywhere, and another indicator is that it claims 48Hz for 24p playback -- why not 72/96/120?

edit: you linked to the wrong model
http://www.panasonic.ae/EN/Pages/TH-L42ET50.aspx
but this one also never mentions 120Hz?
it says active shutter progressive 3D; can't shutter glasses work at a lower Hz?
 
Note: these are assumptions

HDTVs don't input more than 60Hz. They use interpolation to boost the refresh rate by creating 'fake' frames.
So I'm guessing most of them run at 48 to 240Hz interpolation. (input signal 24 to 60Hz)
With movies being 24 or 30 fps, there should be no problem in displaying 3D.

Don't know if any 3D TV can handle 3D games. And if they could, 30fps would be the framerate.

Opinions?
 
Note: these are assumptions

HDTVs don't input more than 60Hz. They use interpolation to boost the refresh rate by creating 'fake' frames.
So I'm guessing most of them run at 48 to 240Hz interpolation. (input signal 24 to 60Hz)
With movies being 24 or 30 fps, there should be no problem in displaying 3D.

Don't know if any 3D TV can handle 3D games. And if they could, 30fps would be the framerate.

Opinions?

Well, with HDTV's you get 120Hz, for example -- or 60 fps per eye if it's active shutter tech. If it's higher than 120Hz, then you get an increase in fps per eye and thus less crosstalk. This matters more for LCD's, as they cannot handle motion as well as a 3D plasma. Passive 3D is exactly what it sounds like: it displays both left-eye and right-eye frames at the same time on a polarized screen. The disadvantages aren't many, but they significantly impact the resolution of the image, which is why purists prefer active shutter; most people prefer passive 3D because it has very low crosstalk and shouldn't give you a headache after long usage. You could also make your own passive 3D monitor very easily, since it's just a matter of software and a polarized screen, so there is also that advantage.

Consoles, by the way, do 3D -- the PS3 does active shutter, and as far as I know the Xbox 360 only does passive 3D.
 
When you enable LightBoost in 2D like this, will it halve your framerate just like when you enable 3D?
No, it's always 120 refreshes per second in both cases.

You still get the temporal resolution equivalent of 120fps in 3D, it's just 60fps per eye.
You're still delivering 120 game time slices per second.
Just alternating between eyes.
Left eye, then right eye, then left eye, then right eye. 60fps/60fps.
But there's 120 game time slices per second.
So the frame shown to the next eye is rendered 1/120sec after the frame shown to the previous eye.

One 3D frame is actually 2 frames in one (L/R framebuffers). So you are essentially rendering 240 framebuffers for 60fps/60fps per eye (120 visible presentations of framebuffers) and throwing away the rest. The extra frame of the pair is there in case the framerate slows down, since you need the other frame "just in case". That said, driver/game/software optimizations reduce GPU workload to only ~50% more than plain 2D.

For maximum motion fluidity, during 3D, you actually have this in one second of 3D stereoscopic gaming:
Refresh 1: Wait for VSYNC and then Show L eye frame of frame pair for T+1/120sec
Refresh 2: Wait for VSYNC and then Show R eye frame of frame pair for T+2/120sec
Refresh 3: Wait for VSYNC and then Show L eye frame of frame pair for T+3/120sec
Refresh 4: Wait for VSYNC and then Show R eye frame of frame pair for T+4/120sec
...etc...
Refresh 119: Wait for VSYNC and then Show L eye frame of frame pair for T+119/120sec
Refresh 120: Wait for VSYNC and then Show R eye frame of frame pair for T+120/120sec

This is what happens when you play a Source engine game at full framerate (with at least fps_max 120).
To the Source game and the drivers, one "frame" == one 3D frame == two framebuffers of exactly the same game time slice. (Only one eye sees the 3D frame of this game time at 120fps, unless you've got a repeat refresh of the same 3D frame -- then, yes, both eyes see the same 3D frame.)

There is a lot of confusion about "halving the framerate" with 3D, which is actually true, but only on a per-eye basis. You're still delivering 120 refreshes per second (at 60fps/60fps per eye, alternating, 1/120sec apart). This presents an opportunity to increase time resolution by having independent game-world time slices for each refresh, allowing 120 moving-object movements per second. The human brain is able to feel the improved fluidity, even though it is alternate-eye delivery.

During maxed-out framerates in Source engine video games, you've got 120 different game-world time slices, so moving objects are still in 120 different "3D space" positions even when you use 120Hz 3D shutter glasses. I've confirmed that Portal 2 does things this way. The interlacing of frames between eyes preserves "120fps" of temporal resolution, and the LightBoost zero motion blur effect, even though only one eye sees a frame at a time. It's bad to present the exact same game world time slice to different eyes at different times (e.g. left eye and then 1/120sec later by the right eye). The correct & proper way to do it is present a game world time slice to the left eye, and present a new game-world time slice (all moving objects moved forward 1/120sec) to the right eye. The game-world time slice should always be the same time as the /presentation/ to the eye (the moment that the frame becomes visible to the eye); this preserves temporal resolution and the zero motion blur effect.
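The delivery scheme described above (a fresh game-world time slice every refresh, alternating eyes) can be sketched like this -- the function and its structure are hypothetical, purely to illustrate the timing, not any real driver's API:

```python
# Alternate-eye delivery at 120Hz: every refresh carries a NEW game-world
# time slice, alternating L/R, preserving 120 time slices per second.
# (Hypothetical sketch for illustration only.)
def shutter_schedule(refresh_hz=120):
    schedule = []
    for n in range(refresh_hz):  # one second's worth of refreshes
        eye = "L" if n % 2 == 0 else "R"
        game_time_sec = (n + 1) / refresh_hz  # fresh time slice per refresh
        schedule.append((eye, game_time_sec))
    return schedule

sched = shutter_schedule()
print(sched[0], sched[1])            # L then R, 1/120sec apart
print(len({t for _, t in sched}))    # 120 distinct game-time slices
```

Note the key property: each eye only sees 60 of the slices, yet the schedule contains 120 distinct game-world times per second -- which is the "120fps temporal resolution" being preserved.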

For more reading, see this thread about 3D shutter glasses operation -- it will help you understand how frames are delivered using 3D shutter glasses. I will have high-speed video shot through shutter glasses within the next 2 months, which will help explain shutter glasses behavior and the 120fps temporal-resolution preservation via interleaving at 60fps/60fps.

Please read this thread about 3D shutter glasses operation before replying to this post with questions about framerate and human vision behaviors. I spent hours trying to explain what I already know about the function of 3D shutter glasses, and would rather simply talk about 2D LightBoost strobe backlights, even though I've got a useful brain to pick for 3D behavior.
 
Actually, for active 3D the GPU has to draw separate frames for each eye, which does effectively halve the frame rate. I've got a Samsung S27A950 and you can clearly see the performance impact when you enable 3D mode, both in smoothness and on the frame rate counter. If you're just enabling LightBoost and are viewing in 2D, then the GPU won't actually be doubling the frames like this, so there shouldn't be a performance hit.

This is all implied in what I posted. Naturally in 2D mode both eyes are seeing all frames so it will be a much smoother image. Effectively blocking 50% of all frames one eye at a time is completely silly IMO and a waste of GPU resources, hence why I do not like 3D. Shutter glasses flicker is also the suck.

The only good thing about 3D is that it may push technology into areas that these very conservative companies might never have ventured into. I am all for the 4K/8K revolution, but I don't think you will see any 4K/8K screens above 60Hz (real input) for very many years.
 
Mark, that Panasonic doesn't state 120Hz anywhere, and another indicator is that it claims 48Hz for 24p playback -- why not 72/96/120?

edit: you linked to the wrong model
http://www.panasonic.ae/EN/Pages/TH-L42ET50.aspx
but this one also never mentions 120Hz?
it says active shutter progressive 3D; can't shutter glasses work at a lower Hz?
Yes, you're right.

However, that exactly means you can do zero motion blur at the same Hz.
E.g., if you are able to do shutter glasses at 96Hz, you can already achieve zero motion blur with an appropriate 96Hz strobe backlight.
(Replace "96" with whatever shutter glasses refresh rate Panasonic uses.)
 
This is all implied in what I posted. Naturally in 2D mode both eyes are seeing all frames so it will be a much smoother image. Effectively blocking 50% of all frames one eye at a time is completely silly IMO and a waste of GPU resources, hence why I do not like 3D. Shutter glasses flicker is also the suck.
Agreed on both counts -- I don't use 3D often, except for certain games like Portal 2 where it's well-optimized by the game developer to work properly in 3D. The 3D is just there as a bonus, for me. It's an interesting topic to me as well, as I have an understanding of how it all works.
I think you already know this, but a lot of people don't, so I'll repeat here for HardForum readers, just to be safe:

However, on a scientific basis, there's a good reason for the "waste" of GPU resources -- using the GPU horsepower of 240fps to generate 120fps 3D at 60fps/60fps per eye. One 3D frame is two framebuffers of the same game-time. If there are stutters, or the game is running at half framerate internally (60fps), then both eyes see the same 3D frame of the same game-time. But if you go beyond 60fps while wearing 3D shutter glasses, there are 3D frames of a game time slice that only one eye sees. Theoretically, you could predictively skip rendering "the other frame" if you know you're maxed out at full framerate, but that would either cause massive input lag (due to lookahead/lookbehind logic) or very nasty stutter artifacts that look far worse than 2D stutters whenever a single frame is dropped. So for best motion fluidity, you always render both framebuffers of a single stereoscopic 3D frame (two frames in one).

And there are 120 refreshes per second, so you've got 120 stereoscopic 3D frames per second, which is 240 framebuffers' worth. (Essentially equivalent to 120 2D frames for the left eye and 120 2D frames for the right eye -- but since you only have 120 refreshes per second, at 60/60 per eye, half of the framebuffers are never seen by an eye.) With VSYNC on, you're limited to 120 3D stereoscopic frames per second, but that does not technically prevent drivers from rendering both eyes internally.

Normally this would require double the GPU power; however, there are optimizations/efficiencies in games, GPUs, and drivers that make it only ~50% more, so you can get a maxed-out framerate in Source engine games on a GTX 680 at 1080p with all settings enabled.

Polarized 3D solves the problem by presenting 3D frames of the same game time slice, to both eyes simultaneously, and thus polarized 3D uses a lot less GPU power. That said, polarized 3D has other corresponding disadvantages, too. (e.g. temporal resolution of only 60fps, and less benefit from a strobe backlight)
 
Asus VG248QE 144hz 3D Vision 2: The Official Thread

3D Vision 2, 1ms, 144Hz ... so that one should be similar to the BenQ XL2411T (as people said in that thread already). Due out this month sometime. Still 24" and the same matte coating -- basically the same panel. Hoping someone will make a glossy one someday, and a 27" to match my monitor array better. I could go down to 24" if I had to, as it has better ppi anyway, but it wouldn't line up well with my current monitor array setup.
 
No, it's always 120 refreshes per second in both cases.

Thanks for the answer. I think I expressed myself a bit wrong -- I meant the GPU increase :) I saw half the framerate from Fraps when I enabled 3D :D

But thats great if thats not the case with lightboost.

edit: I lied, I get 50% lower fps.. :)
 
Hey guys, I am planning on buying the new 144Hz Asus monitor when it is released. If I want to have this LightBoost CRT-smoothness feature/setting, do I have to have an Nvidia card? Can I not use an AMD 7000 series card?
 