Home-made Arduino scanning LED backlight to simulate 480Hz or 960Hz in a 120Hz LCD?

mdrejhon

Limp Gawd
Joined
Mar 9, 2009
Messages
128
Hello,

Goal: Eliminate motion blur on an LCD, and allow LCD to approach CRT quality for fast-motion.

Scanning backlights are used in some high-end HDTV's (google "Sony XR 960" or "Samsung CMR 960"). These high-end HDTV's simulate 960 Hz using various techniques, including scanning backlights (sometimes also called "black frame insertion"). The object is to greatly reduce motion blur by pulsing (flickering) the LCD -- scanning the backlight, much like a CRT scans (and flickers) the phosphor. These home theater HDTV's are expensive, and scanning backlights are not (yet) really taken advantage of in desktop computer monitors. Although there are diminishing returns beyond 120Hz, it is worth noting that 120Hz eliminates only 50% of motion blur versus 60Hz, whereas 480Hz eliminates 87.5% of motion blur versus 60Hz. Scanning backlights can simulate the motion-blur reduction of 480Hz without added input lag, and without needing to increase the actual refresh rate beyond the panel's native rate (e.g. 120Hz). Your graphics card would not need to work harder.

I have an idea for a home-made scanning backlight, using an Arduino project, some white LED strips, and a modified monitor (putting Arduino-driven white LED strips behind the LCD glass).

Most LCD's are vertically refreshed, from top to bottom.
The idea is to use a homemade scanning backlight, by putting the LCD glass in front of a custom backlight driven by an Arduino project:

Parts:
1. Horizontal white LED strip segments, put behind the LCD glass. The brighter, the better! 4 or 8 strips.
2. Arduino controller (to control LED strip segments).
3. 4 or 8 pins on Arduino connected to a transistor connected to the LED strip segments.
4. 1 pin connected to a vertical sync signal (could be software, such as a DirectX program that relays the vertical sync state, or hardware that detects the vertical sync state on the DVI/HDMI cable). The vsync signal ideally needs to be precise; this might still be possible over USB if you can achieve sub-millisecond precision (or use timecoding on the signal to compensate for USB timing fluctuations). If done using software signalling over USB, you can eliminate this pin.

The Arduino controller would be programmed to flash the LED strip on/off, in a scanning sequence, top to bottom. If you're using a 4-segment scanning backlight, you've got 4 vertically stacked rectangles of backlight panel (LED strips), and you flash each segment for 1/4th of a refresh. So, for a 120Hz refresh, you'd flash one segment at a time for 1/480th of a second.
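The per-segment timing above is simple enough to sketch. A minimal illustration in plain C (the function name is hypothetical, not part of any actual firmware):

```c
#include <assert.h>

/* Flash duration for each segment of a scanning backlight:
 * one refresh period divided evenly among the segments.
 * refresh_hz: display refresh rate (e.g. 120)
 * segments:   number of backlight strips (e.g. 4 or 8)
 * Returns the per-segment flash time in microseconds. */
static unsigned long segment_flash_us(unsigned int refresh_hz, unsigned int segments)
{
    return 1000000UL / ((unsigned long)refresh_hz * segments);
}
```

At 120Hz with 4 segments this gives 1/480th of a second (about 2083 microseconds) per segment, matching the figure above; 8 segments would give 1/960th.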

The Arduino would need to be adjustable to adapt to the specific refresh rate and the specific input lag specific to the monitor:
- Refresh rate automatically configured over the USB
- Configurable on/off setting, to stop the flicker when you aren't doing fast-motion stuff (FPS gaming, video camera playback, etc.)
- Upon detecting signal on the vsync pin the Arduino would begin the flashing sequence to the first segment. This permits synchronization of the scanning backlight to the actual output.
- An adjustment would be needed to compensate for input lag (either via a configurable delay or via configuring the flash sequence on a different segment than the first segment.)
- Configurable pulse length, to optimize image quality with the LCD.
- Configurable panel flash latency and speed (to sync to the LCD display's refresh speed within a refresh) -- this would require one-time manual calibration, via testing for elimination of tearing/artifacts. For example, a specific LCD display might only take 1/140th of a second to repaint a single 120Hz frame, so this adjustment allows compensation for this fact.
- Configurable number of segments to illuminate -- e.g. illuminate more segments at a time for a brighter image, at a trade-off (e.g. simulating 240Hz rather than 480Hz, with a double-bright image, by lighting up two segments of the scanning backlight at once)
- If calibrated properly, no extra input lag should be observable (at most, approximately 1-2ms extra, simply to wait for pixels to fully refresh before re-illuminating backlight).
- No modifications of computer monitor electronics is necessary; you're just replacing the backlight with your own, and using the Arduino to control the backlight.
- Calibration should be easy; a tiny computer app to be created -- just a simple moving test pattern and two or three software sliders -- adjust until motion looks best.
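To make the adjustable settings above concrete, here is a hedged sketch in plain C of the scheduling logic (all names and parameters are hypothetical illustrations, not tested Arduino firmware): a function deciding which segment, if any, should be lit at a given moment after the vsync pulse.

```c
/* Which backlight segment should be lit at t_us microseconds after
 * the vsync pulse, or -1 for dark.
 * delay_us:  input-lag + pixel-settle compensation (calibrated)
 * period_us: one refresh period, e.g. 8333 for 120Hz
 * segments:  number of strips (4 or 8)
 * pulse_us:  configurable pulse length per segment */
static int lit_segment(unsigned long t_us, unsigned long delay_us,
                       unsigned long period_us, unsigned int segments,
                       unsigned long pulse_us)
{
    if (t_us < delay_us)
        return -1;                              /* still waiting out the calibrated delay */
    unsigned long t = t_us - delay_us;
    unsigned long slot = period_us / segments;  /* time window per segment */
    unsigned long seg = t / slot;
    if (seg >= segments)
        return -1;                              /* past the last segment; stay dark */
    /* lit only during the first pulse_us of the segment's window */
    return (t % slot) < pulse_us ? (int)seg : -1;
}
```

On an actual Arduino, the main loop would call something like this against a timer started by the vsync pin interrupt, and drive the 4 or 8 transistor pins accordingly; the delay, pulse length, and segment count map directly onto the calibration adjustments described above.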

Total cost: ~$100-$150. Examples of parts:
- $35.00 (RadioShack) -- Arduino Uno Rev 3. You will need an Arduino with at least 4 or 8 output pins and 1 input pin (i.e. most Arduinos).
- $44.40 (DealExtreme) -- LED tape -- White LED's 6500K daylight LED's, 50 watts worth (5meter of 600x3528 SMD LED 6500K).
- Plus other appropriate components as needed: power supply for LED's, wire, solder, transistors for connecting Arduino pins to the LED strips, resistors or current regulators or ultra-high-frequency PWM for limiting power to the LED's, etc.

LED tape is designed to be cut into segments (most LED tape can be cut in 2-inch increments). Google or eBay "White LED tape". A 5 meter roll of white LED tape is 600 LED's at a total of 50 watts, which is more than bright enough to illuminate a 24" panel in 4 segments, or can be doubled up. This LED tape is now pretty cheap on eBay, sometimes under $20 for Chinese-made rolls, but I'd advise 6500K full-spectrum daylight white LED's with reasonably high CRI, or color quality will suffer. Newer LED tape designed for accent lighting applications would be quite suitable, though you want daylight white rather than warm white or cold white -- to match the color of a typical computer monitor backlight. For testing purposes, cheap LED tape will do. You need extra brightness to compensate for the dark time: a 4-segment backlight that's dark 75% of the time would ideally need to be 4 times brighter than the average preferred brightness setting of an always-on backlight. For even lighting, a diffuser (e.g. translucent plastic panel, wax paper, etc.) is probably needed between the LED's and the LCD glass.
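The "4 times brighter" rule of thumb above is just the inverse of the backlight duty cycle. A small illustrative helper in plain C (function name hypothetical):

```c
#include <assert.h>

/* How much brighter the LED's must be than an always-on backlight
 * to keep the same average brightness, given that each segment is
 * lit only pulse_us out of every period_us microseconds. */
static double brightness_multiplier(unsigned long period_us, unsigned long pulse_us)
{
    return (double)period_us / (double)pulse_us;
}
```

A 4-segment scan at 120Hz (8333 µs refresh, ~2083 µs lit per segment) gives a multiplier of about 4x; a shorter 1/960th-second pulse would need roughly 8x.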

This project would work best with 120Hz LCD panels on displays with fast pixel response, rather than 60Hz LCD panels, since there would not be annoying flicker at 120Hz (each segment of the scanning backlight would flicker at 120Hz instead of 60Hz), and also because the pixel decay needs to be quick enough to be virtually complete before the backlight segment behind it is flashed.

Scanning backlight is a technology that already exists in high-end home theater LCD HDTV's (960Hz simulation in top-model Sony and Samsung HDTV's -- google "Sony XR 960" or "Samsung CMR 960"), and most of those include motion interpolation and local dimming (turning off LED's behind dark areas of the screen) in addition to the scanning backlight. We don't want input lag, so we skip the motion interpolation. Local dimming is complex to do cheaply. However, a scanning backlight is rather simple -- and achievable via this Arduino project idea. It would be a cheap way to simulate 480Hz (or even 960Hz) via flicker in a 120Hz display, by hacking open an existing computer monitor.

Anybody interested in attempting such a project?
 
just use incandescent bulbs - flicker free viewing and much cheaper too.
Hopefully you knew that was in jest. ;) Obviously -- that would do nothing to improve motion blur.

Display engineers know that flicker reduces motion blur (e.g. permitting crystal-sharp fast panning), and the flicker should preferably be at a speed high enough to be unnoticeable to the human eye. I remember videogaming on a CRT monitor at 120Hz on a Voodoo2 SLI :)
 

how was that experience on crt for you?
 
Fast motion on a CRT at 120Hz is still much sharper than on an LCD at 120Hz. Phosphor on a CRT decays in a millisecond, while LCD pixels at 120Hz are generally continuously displayed for the full 1/120th second. Not everyone cares, but some of us do -- that's why some of us love CRT (including those threads of users loving the Sony 24" widescreen CRT), and even though we enjoy 120Hz LCD, some of us miss the motion clarity of CRT.

Consider motion blur. An example of a fast panning scene is motion moving across the screen at 1 inch every 1/60th of a second. Let's say your eye is tracking a sharp object during the screen pan. So, strictly by the numbers, for fast-panning motion moving at 1 inch every 1/60th second:
At 60Hz, the motion blur is 1" thick (entry level HDTV's, regular monitors) ...
At 120Hz (or CMR 120), the motion blur is 0.5" thick (120Hz computer monitors) ...
At 240Hz (or CMR 240), the motion blur is 0.25" thick ...
At 480Hz (or CMR 480), the motion blur is 0.125" thick ...
At 960Hz (or CMR 960), the motion blur is 0.0625" thick (CRT style, high end HDTV's) ...

The same effect is simulated at a lower refresh rate, when using flicker. A 120Hz display with a backlight briefly pulsed at 1/960th of a second, will have the same motion blur elimination as 960Hz. A 60Hz CRT with 1ms phosphor decay (essentially pulsed 1/1000sec), would be similar.
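The numbers in the table above all come from one relationship: for an eye-tracked pan, the blur trail width is the pan speed divided by the effective sample rate (the refresh rate, or the pulse rate for a strobed backlight). A sketch in plain C (function name hypothetical):

```c
#include <assert.h>

/* Blur trail width (inches) for an eye-tracked pan:
 * the image smears across the retina for the time each
 * frame sample stays visible (1 / effective_hz). */
static double blur_width_inches(double pan_speed_in_per_sec, double effective_hz)
{
    return pan_speed_in_per_sec / effective_hz;
}
```

The 1-inch-per-1/60th-second pan above is 60 inches/second: 60/60 = 1", 60/480 = 0.125", 60/960 = 0.0625", matching the table.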
For a more detailed discussion about motion blur, see this thread. However, until then, it appears to be something we can do ourselves with homebrew parts, without a display manufacturer's involvement.
 
give me a sample video (youtube?) to test this. i have CRTs to check this out.
 
YouTube can't be used for this test, because you need a motion resolution test pattern that plays at a framerate perfectly matching the refresh rate. The YouTube/Flash layer interferes with this on too many systems.

Try the Blu-Ray Disc, "FPD Benchmark Software for Professional", which contains a motion resolution benchmark. Unfortunately, it's only up to 60fps-native, so it's not useful for testing 120Hz-native displays (though it's fine for testing 120Hz-interpolated displays). It is also certainly useful for testing the "960Hz" Sony and Samsung displays.
It's also good for comparing CRT versus LCD versus plasma, too.

Someone needs to create a high-end computer benchmark program for testing motion blur. There are some downloadable computer-based MP4 files on AVSFORUM, in this thread, although the links are spotty. Bear in mind that most of these motion test videos are mastered at 60fps, so they are not as useful for testing the motion resolution of 120Hz-native LCD's. (Someone needs to address this with a 120fps video, or by writing a PC application that does a motion resolution test.) Also, many of the existing motion video benchmarks move too slowly (e.g. a single pixel or two per frame); fast motion simulating fast turning in FPS shooters is required to show the benefits of scaling past 120Hz-equivalence.
 
Won't work. Simply flashing the backlight faster will not refresh the pixels faster. The controller has to tell the pixels to refresh faster, along with the backlight, to reduce image retention.
 
if even tests arent available for such a thing then whats the point? stick to the crt and call it a day.
 
CRT are fluid because they go black after every frame. Mind that 60fps content is much more fluid at 60Hz on CRT than in 120Hz mode, where every frame is doubled and fluidity is lost... it's something very similar to what destroys LCD fluidity: sample&hold blur, but with fewer frames in between (on LCD there is a virtually infinite number of frames inserted)

and flashing the backlight at 960Hz won't do anything to the image at all. It's kind of a stupid idea if you ask me...

it would make practical sense if the flashing was at 120Hz, but it would only help with 120Hz content. Lower-fps content would have afterimages (two with 60fps, three at 40fps, four with 30fps, five with 24fps...) so it would not look much better than with just a non-flickering backlight, maybe even worse ...

I have this issue on my CRT. Movie playback is not super fluid because I can't run it at 24Hz. At 120Hz it looks pretty much the same as on a fast 120Hz LCD, but e.g. panning stars are visible as 5 stars one after another, while on LCD there is a blurry line. Adding 120Hz black frame insertion to LCD would turn this blurry line into five stars. Improvement? Don't think so...

Adding 960Hz it would make 40 stars and it would look exactly the same as without flicker => pointless

edit://
best demonstration of CRT weakness in this area are console games that run 30fps in 60Hz mode. There is just so much motion blur caused by this fake sample&hold effect that there is actually less fluidity than on a fast TN LCD !!!
Only 60fps console games are more fluid on CRT.
Adding flicker of any kind (30Hz is out of the question, obviously) to LCD would make those 30fps games look either worse (60Hz, 120Hz, 240Hz) or the same (480Hz and 960Hz)
 
Won't work. Simply flashing the backlight faster will not refresh the pixels faster. The controller has to tell the pixels to refresh faster, along with the backlight, to reduce image retention.
You are misinterpreting how things work. Google "scanning backlight" and "black frame insertion". The technology is already implemented in some high end HDTV's, so the TV engineers disagree with you.

LCD motion blur is mainly caused by the continuously-shining nature of the backlight, rather than the LCD response, since the pixel response (2ms, or 1/500th sec) is now faster than the duration of a refresh (16ms for 1/60th sec @ 60Hz, or 8ms for 1/120th sec @ 120Hz)

Image retention in the LCD is 'hidden' by the black periods of the backlight. (See HDTV Blur).

Today, LCD is not taken to its maximum advantage: today's best LCD panels have a 2ms response speed, but because of continuously-lit pixels, we are seeing a full "16ms" worth of motion blur (for 60Hz) and "8ms" worth of motion blur (for 120Hz) as our eyes track fast-moving objects across the screen. You can sharpen motion by turning off the backlight while the pixel is refreshing (e.g. taking 2ms to gradually change from one pixel color to another).

Also see http://www.techmind.org/lcd/
The visual effect of motion blur is self-explanatory and it is fairly intuitive to realise that a slow pixel response-time will cause this problem. What is less obvious, but at least as important in causing motion-blur, is the 'sample-and-hold' effect: an image held on the screen for the duration of a frame-time blurs on the retina as the eye tracks the (average) motion from one frame to the next. By comparison, as the electron beam sweeps the surface of a cathode ray tube, it lights any given part of the screen only for a minuscule fraction of the frame time. It's a bit like comparing film or video footage shot with low and high shutter speeds. Motion-blur originating from sample-and-hold in the display can become less of an issue as the frame (refresh) rate is increased... provided that the source material (film, video, or game) contains that many unique frames. For LCD TV there is significant interest in the industry in deliberately strobing (flickering!) the backlight so as to reduce sample-and-hold motion-blur; the manufacturers have various tradenames for this, including Samsung's LED Motion Plus, Philips' ClearLCD scanning LED backlight, etc.

Basically, a scanning backlight, behind a specific segment of the screen would be, every refresh:
- Wait for pixels to fully refresh (LCD response of 2ms)
- Briefly flash the backlight behind LCD for 1/480th second.
- Cycle repeats during next refresh.
- This allows you to have the motion sharpness and clarity of 480Hz without needing a 480Hz-class LCD pixel response (as long as the pixel response is faster than the length of a refresh, it becomes possible)

On most displays, LCD pixels are refreshed gradually from top to bottom, so instead of flashing the whole backlight behind the LCD, you need to flash one backlight segment at a time. That's why you need 4 or 8 backlight segments (horizontal stacked rectangles of backlight) for scanning.

Here's example of Sony's literature:
https://news.sel.sony.com/en/press_room/consumer/television/release/58840.html
The model also features Sony’s MotionFlow XR 960 featuring a precise backlight control that is synchronized with the liquid crystal movement from frame to frame creating clearer, sharper moving images.

Here's example of Samsung literature:
http://www.samsung.com/us/article/clear-motion-rate-a-new-standard-for-motion-clarity
Backlight technology: Samsung's backlight regulates output precisely in synchronization with the screen refresh to lessen the time it is lit, reducing ghosting and motion blur.

Here's example of Elite(tm) HDTV literature:
http://elitelcdtv.com/technology/fast-240hz-smooth-fluidmotion/
FluidMotion, which combines an advanced frame creation system with unique scanning backlight technology, to create a greater than 240Hz effect – improving picture clarity and smoothness in movies and sports content.

See? You're being outvoted by professional TV engineers; those engineers (if they read what you said) would argue that your comment sounds similar to those inexperienced "I can't tell 30fps versus 60fps" claims from 1998, which have long since been dispelled with lots of proof. However, let's forgive this, as this is really an obscure topic to many people. So, education is key -- it's useful to learn how motion blur works, and why scanning backlights can really benefit LCD, especially at 120Hz where it's flicker-free (to most people, anyway). Note that it's possible to turn off the scanning backlight in most of these models, to stop flicker (at the trade-off of bringing back motion blur).

This technology is found in high-end HDTV's. What is needed is for this technology to be brought to computer monitors (but excluding the motion interpolation, and using only lag-free motion enhancement technologies such as scanning backlights). I'm happy to pay extra to a computer monitor manufacturer that comes out with "480Hz" equivalence via a scanning backlight (without added input lag), as this proven technology actually works (but was formerly undesirable at 60Hz for LCD's, due to flicker). With 120Hz, flicker is a non-issue for most people, and it's time to bring the scanning backlight concept to computer monitors.
 
I like your Idea, however I'd comment that 2ms response time is horribly optimistic garbage listed on monitor specs because marketing people want to. Also, TN panels (which is what I'm assuming you would use for their fast response times?) are kinda ugly... :p
 
I like your Idea, however I'd comment that 2ms response time is horribly optimistic garbage listed on monitor specs because marketing people want to. Also, TN panels (which is what I'm assuming you would use for their fast response times?) are kinda ugly... :p
As long as the LCD response is much faster than the length of a refresh, that provides enough time margin to use a scanning backlight. At that point, it doesn't matter whether the response time is truly 2ms; what you need is, mathematically, enough time for a fully refreshed LCD pixel to have the backlight pulsed behind it.

Scanning backlight can still emulate 480Hz or 960Hz even on a 5ms LCD panel, as long as there's a virtually ghost-free moment in the refresh cycle to pulse the backlight through.
For a 120Hz panel, a refresh cycle is 1/120th second. 1/120 ≈ 0.00833 seconds. That's about 8.3 milliseconds.

Now you've got an 8.3 millisecond budget.
For a 5 millisecond LCD panel, you simply wait 5ms for the pixel to refresh (in the dark), THEN you've got a roughly 3 millisecond leftover time budget (for the remainder of the refresh) to pulse the backlight through. If you flash the backlight for 1/960th of a second, that's just barely more than 1 millisecond. Now you've got plenty of time (5 milliseconds waiting for pixel refresh, then a 1 millisecond backlight pulse). For a 5ms LCD panel, that fits quite fully within the 8.3 millisecond refresh cycle of a 120Hz signal, with safety margin left over.

Ghosting can still be an issue, but so is phosphor retention on a CRT. Current fast-refresh 2ms LCD's could still take advantage of a scanning backlight with nearly no ghosting, simply by doing this during each refresh cycle:
2ms -- wait for pixels to refresh (in the dark)
4ms -- safety margin to wait for most of ghosting to disappear
1ms -- flash the backlight (scanning backlight) behind the pixels.

This can all be adjustable in the Arduino (pixel refresh waiting time, safety margin wait, and length of backlight pulse), to find out the best motion-blur-elimination effect. The capacity of firmware flexibility here, for experimentation, is excellent.
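The timing budget described above can be sanity-checked mechanically. An illustrative helper in plain C (name hypothetical; times in milliseconds) that verifies settle time + safety margin + pulse fits within one refresh:

```c
#include <assert.h>

/* Returns 1 if pixel settle time + safety margin + backlight pulse
 * fits within one refresh period, 0 otherwise. */
static int budget_fits(double settle_ms, double margin_ms,
                       double pulse_ms, double refresh_hz)
{
    return (settle_ms + margin_ms + pulse_ms) <= (1000.0 / refresh_hz);
}
```

The 2ms + 4ms + 1ms schedule above fits a 120Hz refresh (7ms against ~8.3ms); a 5ms panel with the same 4ms margin would not, which is exactly the kind of trade-off the adjustable Arduino settings let you explore.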
 
CRT are fluid because they go black after every frame. Mind that 60fps content is much more fluid in 60Hz on CRT than on 120Hz mode where there is double to every frame and fluidity is lost... it's something that is very similar to what destroys LCD fluidity: sample&hold blur but there is less frames in between (on LCD there is virtually infinite number of frames inserted)
Agreed on this, but a continuously-shining display has an infinite number of frames, which adds more motion blur than simply repeating frames.

Movie theaters flash each frame two times at 24fps, so you're getting 48Hz flicker. This adds motion blur for sure. However, many digital movie theaters continuously shine for the full 1/24th of a second, and this adds slightly more motion blur than simply repeating frames two times. The added motion blur is VERY subtle though, but noticeable.
and flashing the backlight at 960Hz won't do anything to the image at all. It's kind of a stupid idea if you ask me...
Wrong. It's not a continuous 960hz flicker. It would be a single 1/480 or 1/960 sec flash, once per refresh (120 flashes a second at 120Hz). See my reply to vick1000.

it would have practical sense if flashing was at 120Hz but it would only help in 120Hz content. Lower fps content would have afterimages (two with 60fps, three at 40fps, four with 30fps, five with 24fps...) so it would not look much better than with just nonflickering background, maybe even worse ...
Wrong. Continuously-shining displays have an infinite number of repeated frames between refreshes, while simply repeating frames leaves at least a small amount of black period. It's proven that there's less motion blur if you simply repeat frames.
Example #1: 30fps on a CRT looks sharper than 30fps on an LCD.

I have this issue on my CRT. Movie playback is not super fluid because I can't run it at 24Hz.
True. But it's still more fluid than it looks on an LCD. You can change your refresh rate to 48Hz or 72Hz, like some people on AVSFORUM did ten years ago with CRT projectors and HTPC's (PowerStrip was popular for this). This produces clearer film motion, using only two or three repeats per frame rather than continuous shine. The fewer the repeats, the more black period, so displaying film at 48Hz has sharper pans than displaying film at 72Hz or 96Hz on a CRT.

At 120Hz it looks pretty much the same as on a fast 120Hz LCD
Not surprising. At 120Hz, you're leaving very little black period between the last frame repeat of the previous frame and the first frame repeat of the next frame. Try comparing CRT vs LCD at a lower refresh rate, such as 48Hz or 72Hz. (BTW, my very old Samsung LCD was able to sync to 48Hz using PowerStrip tweaks)

Adding 960Hz it would make 40 stars and it would look exactly the same as without flicker => pointless
Wrong. You're flickering 1/960th of a second ONLY ONCE every refresh, e.g. 120 times per second. So you only get ONE flicker sample per pixel, much like a CRT.

edit://
best demonstration of CRT weakness in this area are console games that run 30fps in 60Hz mode. There is just so much motion blur caused by this fake-sample&hold effect that there is actually less fluidity than on fast TN LCD !!!
True in many ways. Frame limiters are annoying -- they usually exist for performance reasons and to prevent the annoying 30-vs-60 jumping up and down (double-buffering behavior).

However, let's be apples vs apples here: run exactly the same console on a CRT, and you'll notice that 30fps on a CRT, even with the motion blur of the extra frame repeat, is still slightly sharper than the same 30fps on an LCD, due to the continuous illumination (infinite frames) between refreshes. The difference in motion blur during the same fast panning motion (e.g. turning left/right fast in an FPS) is pretty clear in an A/B side-by-side test. There's no contest here when you truly split the same console's output to two displays (a CRT and a typical LCD) and play exactly the same console. For the CRT, that extra 1/30th second of blackness between the last frame repeat of the previous frame and the first frame repeat of the next frame actually reduces motion blur by approximately 50% in this specific situation; i.e. 30fps on a CRT looks approximately 50% sharper in motion than 30fps on an LCD (without any motion enhancement technologies), for exactly the same material output at 60Hz.

However, you may also be observing another effect that makes you prefer LCD over CRT: random framerate fluctuations (e.g. 20-to-40fps). Fluctuations in framerate are annoying, and it is very true that motion blur (on LCD) masks this somewhat due to the panel's response, so for some gamers it "looks better" because the framerate fluctuations are masked by the LCD response. So you're right, in one specific sense. But it also proves the widely known point that CRT "reveals" a lot more of the original picture, including the imperfections -- yes, you're right, if that's a flaw of a CRT. However, for games that run at exactly 30fps at all times without frequent fluctuations, there is clearly less motion blur on CRT. (Note -- when you're PC gaming, you can enable triple buffering or even disable vsync, which smooths out the random frame fluctuations.)

adding flicker of any kind (30Hz is out of question obviously) to LCD would make those 30fps games look either worse (60Hz, 120Hz, 240Hz) or the same (480Hz and 960Hz)
Only if you're flickering it wrong, e.g. at a low rate (annoying flicker) or too high a rate (unnecessary repeat frames). If you flicker only ONCE per refresh behind a given area of LCD (e.g. the scanning backlight scans in sequence only once per refresh), you're now ahead (less motion blur) of continuous illumination. Obviously, some people are sensitive to flicker (e.g. CRT) and prefer steady illumination (even I like that too, when just reading text).

Again, this can be an adjustable setting; and can be turned on/off (e.g. "Game Mode"), to satisfy the widest audience.
 
It is a good idea. Go ahead and do it.
I'd love to. I certainly have the knowledge and skills -- including programming low-level languages that apply very well to the Arduino. Alas, I don't have the time (at least this year). Due to this, it is very possible that someone else will beat me to this project. I might even slowly work my way towards it, as I'm almost ready to discard my old LCD (when replacing it with one of the upcoming 1440p 120Hz displays), and don't mind hacking it open.

An interesting note is that my old 60Hz LCD works up to 72Hz (I successfully 'overclocked' the refresh rate using PowerStrip). Scanning backlights can still be used at any refresh rate, including 60Hz or 72Hz (the Arduino project can be tunable to any refresh). Mind you, if I am pulsing the backlight for 1/480th of a second only 60 times a second, I'll probably need really bright LED's to compensate -- perhaps $100 worth of LED tape (100 watts of LED crammed into a 24" area). I might even get a 2nd 120Hz monitor just for hacking apart, since 120Hz makes brightness a lot easier to maintain, since I get to pulse it 120 times a second (any more pulses and I'd be repeating frames, which is bad). For display manufacturers, scanning backlights with ultra-short pulses are now practical on a sub-$1000 budget, since LED brightness has gone up dramatically in recent years; one more good reason why scanning backlights should now be revisited for computer monitors.

Although, I admit, I'd be willing to incentivize this (e.g. a reward pot on top of the cost of parts) to build this project, if one's skills were proven (good credentials, good reputation, and speaks the right electronics/programming stuff).
 
A scanning backlight does nothing to implement BFI, they have to work in unison to reduce image retention. Take a PWM LED backlight, it's already cycling faster than you intend to modify, but does nothing to improve blur.

The pixels have to be refreshed faster, or cleared with BFI, to reduce blur (image retention). A scanning backlight is used to "trick" the brain into releasing the image, BFI is used to clear the image on the pixel, and pixel response allows more rapid changes of pixel display.
 
A scanning backlight does nothing to implement BFI, they have to work in unison to reduce image retention. Take a PWM LED backlight, it's already cycling faster than you intend to modify, but does nothing to improve blur.
A PWM backlight is not a scanning backlight. One could modify the PWM to flicker the backlight once per frame to gain the motion blur benefit, but that is problematic, because the whole LCD is not refreshed instantly all at once. You need to pulse the backlight very close to the correct point in the LCD pixel refresh cycle for a given part of the panel -- that's why scanning is used rather than a whole-panel flash.

Since the backlight is turned off while the LCD is refreshing (the roughly 2 milliseconds it takes a pixel to change from one color to another), the refresh no longer contributes to motion blur, because the display is completely black during this period. So you're wrong on this count...

The pixels have to be refreshed faster, or cleared with BFI, to reduce blur (image retention).
BFI doesn't affect the LCD refresh, so you don't "clear" with BFI. All BFI (and a scanning backlight) does is keep the backlight dark while the pixels are being refreshed. Computer monitor manufacturers did work on this (see TFT Central), but that was in the 60 Hz era, where the flicker of a scanning backlight outweighed the motion quality improvements, and before the days of LED backlights. The limited brightness meant the dark cycle had to be quite short. With brighter, cheaper LEDs that can be pulsed very fast, much larger motion-enhancement factors are possible.

A scanning backlight is used to "trick" the brain into releasing the image, BFI is used to clear the image on the pixel, and pixel response allows more rapid changes of pixel display.
You mis-worded it. A scanning backlight is used to trick the brain into capturing an instantaneous image sample, whereupon flicker fusion merges it with the next flash, so you perceive crystal-sharp motion, much like the flicker of a CRT.

Good, modern LCDs no longer have many temporal dithering artifacts, so the backlight can be strobed properly (at the correct point of the pixel refresh -- e.g. lighting rows of LEDs behind already-refreshed parts of the LCD, while waiting for other parts of the LCD to finish refreshing).

If what you were saying were true, you'd be claiming the equivalent of: video clips shot at 60fps with a 1/60th second shutter show no difference in motion blur versus clips shot at 60fps with a 1/960th second shutter. We all know that when panning the camera fast (e.g. ski racing, hockey, etc.), a faster shutter gives sharper motion, as long as there's enough light to compensate for the fast shutter speed.

For example, see the bright-dark-bright graph at http://www.tftcentral.co.uk/advancedcontent.htm#ama. You can see the long bright period and the short dark period. BENQ used this tech on a stupidly expensive 60 Hz LCD to simulate only 120 Hz. It didn't even reduce motion blur by 50%, because the bright period was longer than the dark period (see the graph image at that link). The flicker wasn't worth it, even though there was a motion benefit.

But today, in an era of 120 Hz displays, LED backlights, and extremely bright LEDs, we can use very short bright pulses with long dark periods (e.g. keep the backlight off almost 90% of the time behind any given LCD area). This would reduce motion blur by almost 90% on an LCD. (Math: a 1/960th second flash, 120 times per second, means the backlight is bright behind a specific part of the LCD for a grand total of only 120/960ths of a second -- an 87.5% dark / 12.5% bright cycle -- which would reduce motion blur by 87.5%!) Yes, this requires the backlight to be 8 times brighter, but we're already there with LEDs. Some computer monitors are almost 3 times too bright for a typical darkened home computer room in the evening. Just double or triple up the LEDs from there, and you've got plenty of excess brightness to turn the backlight off 7/8ths of the time (87.5%) and still have a decently bright image, but with CRT-style motion on an LCD!

Anyway, the science is sound (please click on the various links above), the technology has improved, and it's time to try this again on LCD monitors, in today's era of 120 Hz displays, bright LEDs, and inexpensive controllers. I have also done tests with flicker fusion, and it is remarkably consistent regardless of display technology (CRT, LCD, plasma, etc.). Engineers confirm a direct relationship between the bright:dark cycle and motion blur: the longer the dark period versus the bright period in a refresh, the less motion blur there is.

You only need knowledge of refresh rate and connection to the VSYNC signal, plus a few timing calibration sliders (e.g. to manually do a one-time adjustment to compensate for input lag for a specific display mode on a specific LCD). All achievable with a wholly external Arduino project, and a simple software app, without needing to modify the LCD monitor's firmware, etc.
 
Good, modern LCDs no longer have many temporal dithering artifacts, so the backlight can be strobed properly (at the correct point of the pixel refresh -- e.g. lighting rows of LEDs behind already-refreshed parts of the LCD, while waiting for other parts of the LCD to finish refreshing).
Replying to myself here.

Older scanning backlight technologies had a long bright cycle with only a short dark cycle.
ama_diagram.jpg

However, scanning backlights have been achieving 75%:25% dark:bright cycles in some motion-enhancement technologies on current HDTV displays (Sony XR 960, Samsung CMR 960), though these also combine it with unwanted motion interpolation. It should be possible to do the same (or better, an 87.5%:12.5% dark:bright cycle) for a scanning backlight in a computer monitor, and skip the motion interpolation.

The response curve of a pixel is much flatter nowadays right before the next refresh; LCD makers have achieved this in recent years. High-speed camera snapshots of an LCD panel (1/1000th sec) now show a usable image free of time-based artifacts (e.g. temporal dithering), although there are some gamma differences. LCD makers have had to do this to minimize ghosting as much as possible for active-shutter glasses.

Testing will be needed to figure out whether the best image quality comes from IPS or TN panels, since TN panels have a faster response, more suitable for a scanning backlight. Remember, the scanning backlight can be turned off to maximize color quality, but many of us are happy with 97-98% color accuracy (only a few percent reduction in greyscale accuracy) during backlight flicker.

For a scanning backlight, we're flashing the backlight only during a tiny part of the LCD pixel cycle: the portion where the pixel is relatively stable and at its final color -- long after its refresh and just before the next one. An adjustment can be provided as a trade-off between input lag (flash the backlight sooner in a refresh cycle) and image quality (flash the backlight later, for less ghosting and better color). Flashing later adds a slight amount of input lag, but with 1/120th second (8 millisecond) refresh cycles, an extra 1ms or 2ms isn't going to hurt if we're dramatically improving color quality by waiting for the LCD pixels in a region to stabilize before flashing the backlight behind them.

Again, no interface into the firmware is necessary -- just access to the refresh rate and the VSYNC, all of which can be done independently (e.g. controlled by the computer and the Arduino), with fixed manual adjustments to sync the Arduino scanning backlight with the optimal period in the LCD refresh cycle and the LCD electronics' own input lag.
 
I would LOVE to try this (maybe with the help of someone) but I don't have the skills and most importantly lack the time for the moment since I just got a new job (lucky me :) ) that is fairly time consuming.
 
@Mark Rejhon
big thank for explanation.

Now a 60 or 120Hz flicker with a smaller "black time" would certainly help. It would not give the full "CRT effect", because a CRT is quite the opposite (most of the time there is blackness; the picture is only drawn for a very short time), so in my star example it wouldn't make five perfectly visible stars (24fps@120Hz) but rather five short blurry lines. Still, it would be quite beneficial to overall sharpness, while not being "perfect" like on a CRT.

And personally, I like the CRT as it is and prefer playing console games on it anyway, even though most games are 30fps. Unfortunately there is this double vision on panning shots everywhere, and in some games, in some places, it looks just ridiculously BAD. It's still sharper than on an LCD, especially on moving enemies, but the eyes can't adapt to it as well. LCD blur looks much more natural than this double vision, and after some time the eyes don't notice it so much.

Now if this 1/480 or 1/960 flickering were something in the middle, it would probably be the best compromise. So now I think it's a good idea :)

But implementation... it's nearly impossible to do... if there were already tweaked 120Hz monitors for a reasonable price, then yeah, I think some people would buy them... but to open the monitor and open the LCD panel... I don't think it would be very popular, especially since, like I said before, pure sample-and-hold is something the eyes get used to after a while. Mind you, a 120Hz game on a 120Hz LCD already looks very good...

But yeah: if you have the time and spare monitors to experiment, then do so. It might at least be a very good microcontroller learning opportunity :) Personally I will stick with my good old CRT and eventually replace it with an OLED, hoping there will be some black frame insertion feature available in some models.
 
For some reason, the image at TFT Central does not show up in this thread anymore, so I've rehosted the image on my own bandwidth, for reference purposes and citing them.
Replying to myself here.

Older scanning backlight technologies had a long bright cycle with only a short dark cycle.
ama_diagram.jpeg

(Original: TFT Central)
However, scanning backlights have been achieving 75%:25% dark:bright cycles in some motion-enhancement technologies on current HDTV displays (Sony XR 960, Samsung CMR 960), though these also combine it with unwanted motion interpolation. It should be possible to do the same (or better, an 87.5%:12.5% dark:bright cycle) for a scanning backlight in a computer monitor, and skip the motion interpolation.

The response curve of a pixel is much flatter nowadays right before the next refresh; LCD makers have achieved this in recent years. High-speed camera snapshots of an LCD panel (1/1000th sec) now show a usable image free of time-based artifacts (e.g. temporal dithering), although there are some gamma differences. LCD makers have had to do this to minimize ghosting as much as possible for 3D with active-shutter glasses.
 
Last edited:
Not quite sure I understand that image, but I do understand what your goals are and how you expect to achieve them.

IMO the biggest hurdle here is actually taking apart the physical LCD panel and replacing the LED backlighting with backlights you can control, that give good picture uniformity etc. without substantial bleed etc.

Oh, and PLEASE stop mentioning 2ms.

I will agree that the biggest problem with the 120Hz 1440p screens today is motion blur.
 
@Mark Rejhon
big thank for explanation.

Now a 60 or 120Hz flicker with a smaller "black time" would certainly help. It would not give the full "CRT effect", because a CRT is quite the opposite (most of the time there is blackness; the picture is only drawn for a very short time), so in my star example it wouldn't make five perfectly visible stars (24fps@120Hz) but rather five short blurry lines. Still, it would be quite beneficial to overall sharpness, while not being "perfect" like on a CRT.

And personally, I like the CRT as it is and prefer playing console games on it anyway, even though most games are 30fps. Unfortunately there is this double vision on panning shots everywhere, and in some games, in some places, it looks just ridiculously BAD. It's still sharper than on an LCD, especially on moving enemies, but the eyes can't adapt to it as well. LCD blur looks much more natural than this double vision, and after some time the eyes don't notice it so much.

Now if this 1/480 or 1/960 flickering were something in the middle, it would probably be the best compromise. So now I think it's a good idea :)

But implementation... it's nearly impossible to do... if there were already tweaked 120Hz monitors for a reasonable price, then yeah, I think some people would buy them... but to open the monitor and open the LCD panel... I don't think it would be very popular, especially since, like I said before, pure sample-and-hold is something the eyes get used to after a while. Mind you, a 120Hz game on a 120Hz LCD already looks very good...

But yeah: if you have the time and spare monitors to experiment, then do so. It might at least be a very good microcontroller learning opportunity :) Personally I will stick with my good old CRT and eventually replace it with an OLED, hoping there will be some black frame insertion feature available in some models.
Ok, now we're on the same page of understanding. Yes, you're right that it is probably a niche feature.

HOWEVER.... As for perfectly visible five stars, it can be done on an LCD, given a sufficiently bright LED backlight and a sufficiently fast flash. All the scanning backlight needs to do is flash as briefly as the average phosphor decay (approximately 1-2 milliseconds). The image remains on your retina thanks to persistence of vision and flicker fusion.

Once the scanning backlight flashes as briefly as the phosphor decay on a CRT, you're in exactly the same ballpark as a CRT for persistence of vision -- and you get five equally clear stars for your 24fps@120Hz example. So if you're doing 1/960th second, you're already there. I'm willing to bet that, given a sufficiently brief flash and sufficiently fine granularity (at least an 8-segment scanning backlight), you can approach it.

....Now if we wanted to go insane, we could have a 1080-segment LED scanning backlight (i.e. LEDs that are only one pixel tall on a 1080p display), but we don't really need to go that overkill. For a scanning effect, human eyes won't notice the difference between a sufficient-granularity scanning backlight and a CRT. (A good engineering question: what granularity do we need? I think 8 is sufficient, though experimentation might later determine that's not the final frontier, and we might benefit from 16-segment and greater scanning backlights, still doable with Arduino projects, especially with extra chips for extra outputs.)
....Heck, you want to get closer to CRT? You could even fade-in/fade-out the LED's quickly in each scanning backlight segment -- at the same speed of a phosphor decay. But that's not necessary. Instant on/off for a scanning backlight is acceptable, though you can use small capacitors to simulate the decay (it might even be slightly more comfortable on the eyes of some sensitive humans, at the cost of adding slightly extra motion blur, which longer phosphor decay also adds)
....With an ultra-fine scanning backlight (many segments), you get the same "scanning effect" (skews, etc) if you roll around your eyes in front or shake a video camera in front of it.
....Realistically: Heck, if you have the budget and can cram slightly over 200 watts of LEDs behind the LCD glass, you could go 16-segment and use 1/1920th second flashes -- and start to slightly exceed a CRT in motion sharpness, since now you're flashing faster than the phosphor decay of some CRT computer monitors. Obviously, with a 16-segment scanning backlight, only approximately 15 watts of LED is illuminated at a time, so your power supply for the LEDs doesn't need to be that beefy, and 15 watts of modern high-efficiency LED already brightly illuminates a 24" display (some 22"-24" 1080p monitors use only 15 watts at an average picture setting).
....The eye discomfort for a specific human for flicker for CRT vs LCD scanning backlight, should be similar, given sufficiently fine-granularity scanning backlight (8 segment or better). Which means, that humans not uncomfortable with 120Hz flicker on a 120Hz CRT, will not notice the flicker for a proper scanning backlight on a 120Hz LCD.

Proof of phosphor decay on a CRT (high speed video of CRT scanning)
http://www.youtube.com/watch?v=zVS6QewZsi4
The phosphor decay on some displays is known to be approximately 2ms, resulting in approximately 1/8th of the screen being brightly illuminated at any instant. This 60 hertz CRT takes 16 milliseconds to be fully scanned by the electron gun, and you can see in this YouTube video that the phosphor is bright over 1/8th of the screen height -- 1/8th of 16 milliseconds, which works out to a phosphor decay of about 2 milliseconds. (When testing with high-speed videos of a CRT, make sure the camera shutter is faster than the phosphor decay, otherwise you're not properly measuring it.) It's widely known by CRT aficionados that CRT phosphor decay is commonly approximately 1 to 2 milliseconds, so the video appears correct when doing the simple math.
....THEREFORE, you can sufficiently equal CRT quality in motion blur (5 sharp stars), using an 8-segment scanning backlight and 1/480 second bright flashes (for a 60Hz display).
Exceed this (1/960th second flashes at 120Hz) and you potentially get even less motion blur than the slower CRT phosphors -- all with a 2ms-response LCD! Exceed this some more (1/1920th second flashes), and you're exceeding even the best CRT's. You just need sufficiently overkill in LED illumination. LED's are getting cheaper and cheaper, it's a matter of when someone decides to nibble on this carrot.
Though of course, better CRT's have faster-decay phosphors that are illuminated more brightly for shorter time periods. (Good Trinitrons are one example)
NOTE: It doesn't have to be *exactly* a 1/480sec flash per scanning backlight segment; it could be 1/500sec, as long as the next segment illuminates at the correct timing in the LCD refresh cycle, regardless of when the previous segment turns off. (I'm just using 1/480 for convenience, since the screen refresh divides evenly into 480.)

So to go "For The Win", the next monitor manufacturer that attempts a scanning backlight should go all the way to 1/960th second flashes; don't bother with 1/240th or 1/480th second flashes. There is a point of diminishing returns beyond around 120Hz, so you need a dramatic jump beyond 1/120th second for the scanning backlight to really be noticeable and worthwhile. The jump between 1/120 and 1/960 (a further 43.75 percentage-point reduction in motion blur) is roughly as noticeable as the jump between 1/60 and 1/120 (a 50% reduction).
(960 - 60)/960 = grand total of 93.75% reduction in motion blur over 60Hz.
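These percentages all follow from the simple model that perceived motion blur is proportional to how long each frame stays illuminated, with 60 Hz sample-and-hold as the baseline. A quick sanity check (the function name is mine, not from any library):

```cpp
#include <cmath>

// Blur reduction versus a 60 Hz sample-and-hold baseline, using the model
// that perceived motion blur is proportional to per-frame illumination time.
double blurReductionVs60Hz(double flashSeconds) {
    const double baseline = 1.0 / 60.0;                   // 60 Hz hold time
    return (baseline - flashSeconds) / baseline * 100.0;  // percent reduction
}
```

This reproduces the numbers in the thread: 1/120 sec gives 50%, 1/480 sec gives 87.5%, and 1/960 sec gives 93.75%.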

Then at this point, with one sufficiently fast & bright flash per frame behind already-refreshed LCD pixels, motion on the LCD will look similar to motion on a CRT, with exactly the same effect, including five perfectly clear stars. (Except for various LCD characteristics such as imperfect gray levels, bad image scaling, and possibly extremely faint ghost afterimages similar to those you get from 3D. Just get a very good 3D LCD that minimizes that as much as possible.)

Yes, it's a niche, but I need to point out that you can still theoretically equal (or exceed) a CRT in motion sharpness, even with a 2ms LCD panel. The LCD response is not the limiting factor; the LCD's image bleed into the adjacent frame is (e.g. the ghosting seen during 3D on a 3D LCD with active-shutter glasses). That's the limiting factor for a scanning backlight -- you will not get less ghosting than that -- but ghosting is an independent and totally different artifact from eye-tracking motion blur and persistence of vision.

It'll just mean a single, very faint but sharp afterimage (similar to those seen when using active-shutter glasses on a 3D LCD near dark/bright boundaries), in this case showing up after very sharp dark/bright boundaries during fast motion (fast pans) with a high-speed scanning backlight. It won't be noticeable most of the time for most images and games, and is no more annoying than phosphor-persistence effects on a CRT. The afterimage has no effect on the sharpness of the motion; a faster flash of the scanning backlight just sharpens all the edges during fast motion (including the original bright/dark boundaries and any faint ghost boundaries). The ghosting may even be nearly completely eliminated on some of the better 3D LCDs, but it's important to know that this is an artifact independent of the persistence-of-vision motion blur that a scanning backlight solves. Anyway, back on track:

Nowadays, we are finally at the correct point of LCD development (minimal ghosting into the next frame, thanks to modern 3D LCDs and active-shutter glasses; some of them are actually pretty good now) and LED development (LEDs are cheap, bright, and very easily controllable by an Arduino and other homebrew electronics). It's TIME to equal or exceed CRT motion sharpness with an LCD, which is finally possible with a /properly engineered/ scanning backlight. Hopefully a display manufacturer is brave enough to do this properly and go "FTW" -- ultrashort flashes (1/960sec or faster).

Even with a niche market, I think an Alienware-style maker could still be profitable manufacturing just 500 or 1000 units of a $500-$600 computer monitor, of which only $100 would be the cost of the ultrabright LEDs necessary for an ultra-short-flash scanning backlight. The fact that this can be done as an Arduino project shows that the BOM (bill of materials) for a scanning backlight is cheap enough to be profitable, even at the $500-600 level. Take any good $300 "120Hz" LCD, add $100 of scanning-backlight over-engineering, add $100 of extra profit margin, and you've got a $500-$600 computer monitor with fully CRT-equalling motion look-and-feel. (Note: I'm specifically talking about motion, not things like black levels or scaling deficiencies.) Heck, I'm willing to pay a little more above and beyond that, too.
 
Last edited:
IMO the biggest hurdle here is actually taking apart the physical LCD panel and replacing the LED backlighting with backlights you can control, that give good picture uniformity etc. without substantial bleed etc.
I agree that's the biggest hurdle, but it's actually simpler if you use a full-array backlight and a diffuser panel.

The $40 roll of LED tape has 600 white 6500K LEDs on it, 50 watts total, and illuminates more brightly than two 100-watt incandescent bulbs, given that modern LEDs are roughly 4 times brighter than the equivalent incandescent wattage. It should be sufficiently bright for a 75%:25% dark:bright scanning backlight with 1/480th second flashes. The 5 meter roll would be cut up into 16 horizontal strips, spaced vertically less than an inch apart. Put a piece of translucent white plastic (possibly a piece of fluorescent light diffuser) in front of those 600 LEDs, and you've got pretty even backlighting. Two rolls would be even better: 32 horizontal strips for a grand total of 1200 LEDs and 100 watts (though in scanning fashion, only a fraction is illuminated at a time). Still less than $100 for the LEDs.

The Arduino has almost enough outputs for a 16-segment scanning backlight already, so you can easily do an 8-segment scanning backlight (2 horizontal rows out of 16, if cutting the tape into 16 rows). eBay search "5 meter white LED strip" or "5 meter white LED tape" and you get a lot of remarkably cheap Chinese-made strips. The overkill number of LEDs, combined with simple translucent white plastic, potentially provides superior evenness to most common LCD backlights, assuming you get LEDs with a good CRI (color rendering index). Many LEDs exceed CRI 85, superior to many CCFL backlights, but beware of cheap strips that are not pre-tested for white quality -- do your homework.

I've bought this LED tape for lighting up my house, and I guarantee you -- when laid out in such a compact area -- it is more than bright enough to light up a little 24" monitor for less than $40. It's easier to make a homemade diffused backlight with a full array than with an edgelight, so it's quite doable as a homebrew: 600 or 1200 LEDs, easily diffused by a piece of white translucent plastic. Others have already cut up monitors and put them on overhead projectors for "cheap LCD projectors" (google "home made LCD projector"), so taking apart a computer monitor is a common skill in the homebrew community. Nothing new here.

Oh, and PLEASE stop mentioning 2ms.
Agreed -- 2ms only means most of the refresh is done; the pixel still decays slowly to its final color, which is why ghosting still exists (8ms later) during 3D on an LCD. It's already gotten so good on many of the faster panels in better 3D LCD displays, though, that most of the ghosting between adjacent frames is now gone (an important prerequisite for high-speed-flash scanning backlights).

I will agree that the biggest problem with the 120Hz 1440p screens today is motion blur.
You and me both! Mind you, experimentation will be needed to determine whether scanning backlights are more suitable for TN or IPS screens. The pixel response (2ms vs 5ms) might end up being less important than the stability of the pixel immediately after its refresh completes. But I think the panels suitable for 3D will end up being the panels most suitable for scanning backlights, due to the prerequisite of minimizing ghosting between adjacent frames.

On a more practical matter -- IPS panels and 1440p -- it may be easier to do a scanning backlight at 72Hz on a 1440p IPS panel (e.g. modify a Catleap 2B or Overlord with an Arduino scanning backlight!). You'd get less motion blur than 120Hz without a scanning backlight, and you wouldn't need SLI to sustain 72fps in many videogames to gain the benefit of 'perfect' CRT-quality motion, without the flickery disadvantage of a 60Hz CRT, since most humans are okay with 72Hz in a darkened room. (One tradeoff: you may need brighter LEDs for the short flashes, to compensate for fewer flashes per second at 72Hz than at 120Hz.)
 
Last edited:
I was thinking 96Hz myself, but I suppose 72 works as well.

Thinking about it, how important is it that the flashes are so short? Wouldn't clarity be the same regardless of flash duration so long as the pixel was stable throughout the flash? (okay okay pixels are never fully stable even @60hz I guess)

Ah well. I agree it is a cool project and all, but posting long-winded (although clearly well thought out and researched) responses in a thread here doesn't seem to be making a ton of progress :p

I have a lot of things higher on my "why don't manufacturers do X" list, to be honest. A lot of them pertaining to monitors. It is borderline criminal how little advancement has been made with respect to LCD technology (available to users) over the last 10 years.
 
Thinking about it, how important is it that the flashes are so short? Wouldn't clarity be the same regardless of flash duration so long as the pixel was stable throughout the flash? (okay okay pixels are never fully stable even @60hz I guess)
It's very important that the flash be short in duration. There's a direct relationship between flash duration and motion blur.

Your eye continuously tracks an object as it moves across the screen. If the flash is longer, your eye tracks further during the illuminated part of the flash. The LCD image is stationary during the illuminated frame, even while your eye keeps moving; your eye may have tracked a few millimeters across the screen by the time the backlight segment turns off. The longer the flash, the further your eye moves during an illuminated frame, and the more motion blur. So you NEED shorter flashes!

A scanning backlight that's dark 50% of the time will reduce motion blur by 50%.
A scanning backlight that's dark 75% of the time will reduce motion blur by 75%.
A scanning backlight that's dark 90% of the time will reduce motion blur by 90%.

So you really, really, really, really NEED a high-speed flash (very short duration) to dramatically reduce motion blur. Most display makers have not bothered, because you need a really bright flash to compensate for the long dark period, but today we're at the correct technological point to do a scanning backlight that equals (or surpasses) CRT-quality motion blur reduction -- with modern 3D-ready LCD panels and very bright LEDs available at reasonable prices. I remember when I paid $1249 for a 17" CRT computer monitor in the early 1990's :-D -- most could only afford 13" or 14" then -- even a 24" monitor with a good high-speed (1/960sec) scanning backlight for $600+ would be an incredible bargain in comparison.

Too bad the race-to-bottom does not leave too much room for high sales in Alienware-niche monitor brands, but I think there's enough market to be profitable even at just 500 or 1000 monitor units with a high speed (1/960sec) scanning backlight -- within reach of small makers such as Overlord or Catleap, assuming they hired an engineer with the correct understanding of scanning backlights. (Because it's doable as a homebrew Arduino project, less than $100,000 of R&D is needed, easily amortized at $100 extra price over sales of 1000 units of CRT-quality LCD monitors. Most of this cost is simply prototypes and sourcing to appropriate manufacturers to create the custom backlight. Add ~$100 BOM/factory cost additional per display, and there's room left to be profitable at $600 pricing per CRT-quality LCD monitor. Heck, start with $800-900 or even $999, to be safe, till numbers ramp up. I'd buy a CRT-quality LCD monitor with a 1/960sec scanning backlight at $1000, or 1/480sec scanning backlight at $700-$800, and would pay even more if it is a 1440p display instead of 1080p or 1200p.)

Also, practical application: A video game player -- FPS shooter running at 60fps on a 60fps CRT (even back to the 3Dfx Voodoo2 SLI days on Quake in the late 90's), doing fast horizontal turns, causing fast horizontal pans, watching for that faraway sniper (WHILE turning fast) -- far more easily done on CRT when running full 60fps at 60Hz thanks to lack of motion blur. However, this is not as easily done on LCD due to motion blur. Many FPS gamers have lost the art of watching threats during fast pans, and often stop horizontal fast-turning to inspect the scene for hazards (e.g. faraway enemies in scenes), due to introduced motion blur when the whole world switched from CRT's to LCD's. There are far more benefits of CRT-quality gaming too, but this is one of the many examples. Now that it is possible to bring CRT-quality motion blur elimination to LCD's given the proper scanning backlight tech, why not?
Ah well. I agree it is a cool project and all, but posting long-winded (although clearly well thought out and researched) responses in a thread here doesn't seem to be making a ton of progress :p
To the contrary, I corresponded with a monitor manufacturer expressing interest, though not to the extreme interest (yet). I plan to contact a few others, try to get some geeks/engineers within one of these companies (I could use help -- message me).

There are also at least a few posters who said "wow, interesting info" or similar. If 100 people are sufficiently educated thanks to this thread (on this forum and cross-posted to others on the net), one or two might help me bring this goal alive (e.g. I program the software, someone else designs the Arduino hardware, a third person pays to bring it all to a convention floor to show off to monitor makers). I've done this before with other technologies over the last decade: ten years ago, five of my visits to Las Vegas and Indianapolis were subsidized -- to conventions such as CES, COMDEX, and CEDIA -- all spinoffs from my online work on other technologies, including video algorithms. Among them: the world's first open-source 3:2 deinterlacer algorithm (see the dScaler authors list; my algorithm dates from 1999), found in dScaler (very old TV-viewer software with good deinterlacing). I was the moderator of the AVSFORUM Home Theater Computers forum from 1999 for a few years; they missed me quite a lot when I left, eventually to work in a different field. So who knows? Nothing ventured, nothing gained, and scanning backlights are, for me, a fun topic to write about anyway -- since I understand this science very well. It might even someday lead to something, if somebody, somewhere, gets interested enough. :)

If I can educate at least 5 or 10 people, and interest at least one or two monitor makers -- my deed's done! My long posts tend to repel most people except the very interested / smart / educated ones. The Internet is full of too much noise. Also, my name ("Mark Rejhon") is easily googled, after all.

Back in 1992, I made a 30fps versus 60fps comparison demo for MS-DOS, found at http://www.marky.com/files/dos/motion.zip ... (yes, my website is www.marky.com, check it out -- though it is very, very, very old, and I really need to redesign it). Back in 1992, many people disbelieved one could tell 30fps and 60fps apart; demos like this (and many other people's) helped change that. Today, not so much -- many people in this forum believe it nowadays, especially after witnessing modern videogames.

It is borderline criminal how little advancement has been made with respect to LCD technology (available to users) over the last 10 years.
Agreed on that last point!
 
What you are describing is something LG or Samsung or another Panel manufacturer would have to do. 3rd party vendors like Dell who purchase the panels couldn't (imo) do something like this since LG wouldn't sell them panels to hack apart and replace the backlighting on.

If you talk to HyperMatrix (I'm sure you know who he is), he talks to Witech a lot and while I just said that isn't the kind of company that could do this, discussions with them could be profitable.

Also, while this thread got a bit off topic, the OP seemed to be in contact with the company that makes Overlords, and they seem to be willing to do a decent amount of engineering work on their monitors.

Both might be substantially more interested if presented with a legitimate proof-of-concept. This thread wouldn't count XD.

Oh yeah, and "CRT Quality" is gonna be a tough thing to market because marketers have spent the last 10 years telling us how LCD is good and CRT is bad, however false that may be (and is). I mean, informed people will be interested for sure, but very few people count as "informed" with respect to monitor technology and quality.

EDIT: Have you ordered a catleap and/or overlord to try this stuff out with?
 
Both might be substantially more interested if presented with a legitimate proof-of-concept. This thread wouldn't count XD.
Agreed, BUT, it *could* lead to a proof-of-concept, either by me, by others, or by a co-operation between me and others.

Oh yeah, and "CRT Quality" is gonna be a tough thing to market because marketers have spent the last 10 years telling us how LCD is good and CRT is bad, however false that may be (and is). I mean, informed people will be interested for sure, but very few people count as "informed" with respect to monitor technology and quality.
Marketing will be a challenge, but there should be enough of an Alienware-style market (e.g. 1000 units) to make this endeavour worthwhile -- maybe even more if rave reviews go viral (maybe, maybe not, who knows?). The proof is in the actual result, anyway -- not in the marketing.

Such vendors can figure it out; what's important is that the motion clarity is dramatically better -- enough to be noticed by many people. And for that, you really need to go "FTW" and high end: a high-speed scanning backlight with single 1/960sec flashes per frame. Thankfully, based on my research, even that should fit within $100 of extra BOM on top of a monitor's existing cost.

Yes, maybe not Alienware (now owned by Dell), but some similar vendor targeting a niche market.
EDIT: Have you ordered a catleap and/or overlord to try this stuff out with?
Almost. I'm waiting on reviews of Overlord first, before making a decision on what panel to purchase. I almost bought a 1080p display, but I stopped because my current Samsung is a 1920x1200, and as a computer programmer, I disliked the idea of losing vertical resolution. I almost bought a 2560x1600 monitor too, but stopped because I didn't want to commit to upgrading my graphics card (I do some gaming, not many games -- just a few games a year, mainly hit series like HL, Bioshock, Crysis, Portal, etc.). But a computer monitor is something I keep for many years. The newer 1440p panels solve that problem for me, I don't lose vertical resolution, and the Catleap/Overlords now let me play in 120Hz territory. Alas, I'd love for it to have a scanning backlight also, but I'd need to experiment first.

This will free up my old Samsung for hacking/modding. (Although, being a 60Hz panel, it might not be a good proof-of-concept; I can still easily recycle the same Arduino scanning backlight with a different similar-size panel. The LED's can just be mounted to plywood, with a jury-rigged makeshift LCD glass holder.) ;) I tested my Samsung using custom PowerStrip tweaks, and found that the Samsung 226's and 245's are overclockable to 72Hz, which will be sufficient to reduce flicker when I add a scanning backlight. There are already instructions for taking apart my old Samsung monitor (google "home made LCD projector"). I can still run proof-of-concepts with an old LCD before buying a disposable 120Hz 3D LCD monitor, though some of those panels are harder to take apart than my old Samsung (thanks to being ultrathin, with backlights built into the panel assemblies, which forces some delicate surgery). I'd rather test on something disposable first; the Arduino scanning backlight is a project that is recyclable with any LCD panel of approximately the same size, permitting experimentation after initial success with the first panel.

Right now, the problem is that I'm working lots of overtime on many projects, and my time to do this endeavour on my own is limited. Are there electronics-experienced people here (e.g. people who know how a transistor works -- NPN, PNP, bipolar, MOSFET, etc.)? Are there Arduino enthusiasts here? Or perhaps I can pick the brains of fiverr's Arduino pay-helpers section, to save maybe 50% of my hours really cheaply -- all I have to say is that I want a flashing Christmas decoration that flashes 8 separate LED tapes sequentially, upon a signal trigger. Done. Then I take over and do the rest of the hacking (turn the flashing sequence into high speed, connect the signal trigger to VSYNC, rearrange the LED's onto a sheet of plywood, add time-tuning features for input lag, etc.). I can even design the schematic myself; I know enough about 555 timers, diodes, resistors, transistors, and voltage regulators -- and Arduino makes that all even easier by eliminating most of those components; I would only need current regulators for the strips of LED's. Then I give that schematic to someone else to create the starter hardware for my scanning backlight. I can do the programming myself. I'm thinking seriously about doing this within a few months, when things slow down -- unless someone else says they're going to try first, or a monitor maker becomes fully interested enough (I have to exhaust that avenue first -- thanks -- yes, I will contact Overlord and see what they think, even if it's only a stepping stone rather than an actual product).
 
Oh -- one other thing I need to think of:
A high-speed camera would be handy (either visiting a studio with one, renting one, or purchasing a cheap one). Something that can do 1000fps is desired, since I want to capture the scanning backlight in action, in similar fashion to this YouTube high-speed-camera video of CRT scanning. It would be good verification that the scanning backlight is scanning properly. It would also be useful for testing input lag on existing LCD monitors, to assist in manual calibration of the input-lag compensator in my Arduino program (to properly sync up the scanning backlight with the monitor, without needing to modify the monitor's electronics). I wonder if prosumer cameras such as the Samsung TL350, Casio Exilim EX-FH20, and others with a 1000fps mode (although very low-resolution and poor quality at 1000fps on these non-studio cameras) are sensitive enough to capture the scanning in motion. Only a few hundred dollars, not the thousands that professional high-speed cameras cost. For this testing, low definition at 1000fps is okay, since I only need to tell *which* segments of the scanning backlight are currently illuminated (at millisecond precision). So: buy one of the cheaper high-speed cameras, and use it for proof-testing and verifying the timings of the scanning backlight, and for creating scientifically-acceptable graphs and measurements to wow reviewers, readers and manufacturers.
 
You keep editing in extra info on me XD.

I don't have a fast camera, and I don't have the knowledge/wherewithal to actually do this, but if you move to Calgary lemme know and I'd chip in however I could.

That youtube video was pretty sick though. It was cool how you could see which colour phosphors decayed at what rate.

Lastly, I feel like whole-screen flashing would be simpler and easier than 1/4 or 1/8 at a time, but I agree it might not be realistic. I was gonna comment that you would need a way to control the bleed from that portion of the screen (since LED's are essentially omnidirectional) when using 4 or 8 separately controlled strips, but after watching the CRT video I'm less sure.
 
I don't have a fast camera, and I don't have the knowledge/wherewithal to actually do this, but if you move to Calgary lemme know and I'd chip in however I could.
At least we're in the same country. (I'm in Canada, though I go to the US often).

That youtube video was pretty sick though. It was cool how you could see which colour phosphors decayed at what rate.
Yep. That's also why ghosting is often greenish colored on CRT's, because the green phosphor often takes longer to decay than red or blue.

Lastly, I feel like whole-screen flashing would be simpler and easier than 1/4 or 1/8 at a time, but I agree it might not be realistic.
Yes, it would work, but you'd potentially have some nasty LCD ghosting/crosstalk artifacts from partially-refreshed pixels (similar to the artifacts seen in incorrectly-calibrated 3D active-shutter glasses -- e.g. not calibrated for ghosting minimization, so the shutter in your glasses opens while a part of the LCD is still refreshing, and you witness increased ghosting in a certain vertical region of the screen).

3D LCD's using active shutter glasses (alternate frames flickering after each other) have run into this problem to a certain extent, where the top edge or bottom edge of the image has more ghosting than the other edge -- that's because the 3D active shutter for a specific eye opens right after the bottom edge of the LCD has finished refreshing. This is most apparent with bright moving white objects on a black background.

Calibration is done to eliminate this ghosting/crosstalk --
Example: instructions on calibrating 3D shutter glasses for crosstalk minimization.
For a scanning or strobed backlight, a software program for calibrating the timings on the Arduino would be very similar to calibrating 3D shutter glasses -- basically, tweak until the blurring/crosstalk in a test pattern is minimized. With such a simple program created in conjunction with the Arduino, any user can easily do it, and no integration into the monitor's own electronics is necessary (except for knowing the VSYNC signal, which can be tapped in software via the computer, or via the cable).
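The calibration loop above can be sketched in a few lines. This is an illustrative back-of-envelope sketch, not real firmware: `measure_crosstalk()` is a hypothetical stand-in for the user judging a test pattern (or a photodiode reading), and the numbers are assumptions.

```python
# Sketch of "tweak until crosstalk is minimized": sweep the delay between
# VSYNC and the first segment flash, keep whichever offset scores lowest.
# measure_crosstalk() is a stand-in for a real measurement (human judgment
# of a test pattern, or a photodiode); 7000us is an arbitrary assumed ideal.

def measure_crosstalk(offset_us, ideal_offset_us=7000):
    """Stand-in metric: crosstalk grows as we drift from the ideal offset."""
    return abs(offset_us - ideal_offset_us)

def calibrate_phase(step_us=250, frame_us=8333):
    """Sweep one full 120Hz frame period; return the least-crosstalk offset."""
    offsets = range(0, frame_us, step_us)
    return min(offsets, key=measure_crosstalk)

print(calibrate_phase())  # -> 7000 with the stand-in metric above
```

In a real rig, the sweep would be driven interactively (user taps a key to nudge the offset) rather than computed, but the search structure is the same.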

As an example of minimizing crosstalk/ghosting with a scanning backlight: you'd rather be flashing segment #6 out of 8 while waiting for the LCD pixels near segment #2 out of 8 to finish refreshing. A scanning backlight is therefore better, although if you time a full-panel bright flash during the VSYNC interval (a very brief time interval), it might prove acceptable (at least for experimentation). Heck, the Arduino project could potentially be programmable to do full-flash, 4-segment, and 8-segment modes, and compare the modes for experimentation!
Ideally, you want to flash the scanning backlight 'out of phase' with the internal LCD pixel-refresh scanning, at a point in the refresh where the LCD pixels are stable, since virtually all LCD panels use a row-by-row (or, rarely, column-by-column) pixel-refresh mechanism: LCD pixels are changed to their next color in a raster-like sequence. You don't want to flash the backlight during the period when LCD pixels are changing at their fastest (see the graph image), otherwise you get nasty artifacts.
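The out-of-phase timing can be made concrete with a little arithmetic. A rough sketch, assuming a 120Hz panel (8.33ms frame), 8 segments (numbered 0-7 here), rows refreshed top-to-bottom at a constant rate, and an assumed 5ms settle delay -- all illustrative numbers, not measured values:

```python
# Each segment is flashed a fixed settle delay after its own rows finished
# changing, so the flash lands several segments "behind" the row the LCD
# is currently driving -- the 'out of phase' behaviour described above.

FRAME_US = 8333      # one 120 Hz refresh, in microseconds
SEGMENTS = 8
SETTLE_US = 5000     # assumed wait for pixel transitions/ghosting to settle

def flash_time(segment):
    """Microseconds after VSYNC at which to flash segment 0-7."""
    rows_done_us = (segment + 1) * FRAME_US // SEGMENTS  # rows finished changing
    return (rows_done_us + SETTLE_US) % FRAME_US         # wrap into next frame

def segment_being_refreshed(t_us):
    """Which segment the LCD raster is scanning out at time t after VSYNC."""
    return (t_us % FRAME_US) * SEGMENTS // FRAME_US

print(flash_time(1), segment_being_refreshed(flash_time(1)))  # -> 7083 6
```

So with these numbers, segment 1 gets its flash while the panel raster is already down at segment 6 -- roughly the half-frame phase lag the post describes.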

Also, whole-screen flashing is more uncomfortable on human eyes, because instead of some LED's always being illuminated (as in a scanning backlight), you've got a totally dark screen between flashes. I once encountered an old LCD portable television with a 60Hz PWM backlight (ouch), and it WAS more uncomfortable than a 60Hz CRT, due to strobing instead of scanning. I've also noticed that the flicker fusion threshold is higher with strobing than with scanning. At 120Hz it might be acceptable, and it might make for a much easier proof-of-concept, but it will still require the Arduino to listen to the VSYNC signal to properly time the flashes. So the BOM (bill of materials) is not much lower; you still need the same number of LED's (the primary cost driver for a backlight that's dark 90% of the time).

Related thought: perhaps all we need is 4 segments -- flash each segment for 1/960th of a second, then 1/480th of a second later, flash the next segment. You don't necessarily have to have one segment illuminated at all times; the segment flashes can be briefer than the stepping between segments. However, that's not ideal; it's preferable to have some illumination somewhere on the panel at all times, for better eye comfort (a more consistent average photon level reaching the eyeballs at all times) and a better flicker fusion threshold. Humans can 'feel' flicker more easily during strobing than during scanning.

Also, it's much better for the power supply/electronics to draw a steady 15 watts at a time out of a 100-watt backlight, rather than sudden, single 100-watt surges, which place demanding peak-current requirements on the supply. The monitor's power supply can provide power factor correction (PFC 0.99+), but regardless, the increased complexity of a scanning backlight is somewhat compensated by the simplification of power supply requirements, since you're illuminating essentially the same wattage at all times by always having one segment lit. For a 24" panel, you need an average LED backlight power consumption of 15+ watts for an acceptably bright image in video games. For a fixed average backlight power, the same number of photons hit your eyeballs, and perceptually it's the same brightness thanks to flicker fusion -- regardless of how many segments are lit up, how brightly, how frequently (provided it's beyond the flicker fusion threshold), or even whether the entire backlight is strobed at once -- provided you're outputting exactly the same average power with the same LED's. (For the sake of argument, I'm ignoring power lost in switching inefficiencies, different LED efficiencies at different voltages/currents/heat levels, etc.)
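The steady-draw argument above is just duty-cycle arithmetic. A trivial sketch with illustrative wattages (120W of peak LED capacity is an assumption, not a spec):

```python
# With one of N segments lit at a time, the supply sees a constant
# peak/N draw instead of full-backlight surges.

def average_power_w(peak_w, segments_lit, segments_total):
    """Continuous draw of a scanning backlight lit segments_lit/segments_total
    of the time."""
    return peak_w * segments_lit / segments_total

print(average_power_w(120, 1, 8))  # -> 15.0 (watts, continuous)
```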

I was gonna comment that you would need a way to control the bleed from that portion of the screen (since LED's are essentially omnidirectional) when using 4 or 8 seperately controlled strips, but after watching the CRT video I'm less sure.
No, you don't need to. The diffuser (a sheet of white plastic) does that for you. If you illuminate the whole panel, there's no noticeable bleed. Provided each section is illuminated at EXACTLY the same point in each refresh, for EXACTLY the same time period, the bleed won't be visible at all; it'll look as if the whole panel were illuminated evenly.
You don't want to vary the illumination (e.g. introduce inaccuracies into the scanning-backlight timings), or you might see some faint bleeding between adjacent sections.

But if you do it properly and synchronously, illuminating each section exactly the same percentage of the time, this is not a problem, thanks to flicker fusion -- it'll look like the whole panel being illuminated continuously, just with fewer photons hitting your eyeballs. No bleed compensation is necessary; just make sure you have a good diffuser sheet, and test by illuminating the whole panel (scanning mode turned off) and making sure white-image uniformity is proper. Then, when scanning, the uniformity looks exactly the same to the human eye thanks to flicker fusion.
 
Don't forget, when you're flashing LEDs for such a short time, they are gonna be extremely dim.
 
I really think you should try to get in contact with John Carmack. He is probably already working on something similar for his virtual reality glasses project.

By the way, this thread makes me happy! :D


Would be so badass for a manufacturer to release a dedicated/minimalist monitor for gaming:

1440p
120Hz native (960 Hz simulated)
Displayport
Internal LUT calibration
IPS or OLED
10 or 12-bit panel
Super low input lag

DONE! No extra BS inputs, features, etc.
 
Don't forget, when you're flashing LEDs for such a short time, they are gonna be extremely dim.
Just covered this. You just add extra LED's. Today, you can buy 100 watts of LED's for less than $100 (less than $50 off eBay -- search keyword "600 white LED strip", $30 for two strips). The total light output of 1,200 LED's is almost 10 times brighter than a 24" monitor backlight, sufficient for a scanning backlight that's dark 90% of the time.

(NOTE: when buying LED tape for prototyping a monitor backlight, you want the daylight white 6500K ones with CRI 80. I have purchased LED strips for home accent lighting before, and man, one $15 strip (5 meters, 600 LED's) is plenty bright enough to illuminate a single 24" panel more brightly than its included backlight -- probably enough for experimentation, at least for a 75%:25% dark:bright scanning backlight. Two strips would illuminate an LCD panel about 10 times brighter than its backlight, enough for scanning backlights approaching 90%:10% dark:bright. These ribbon strips can be cut in 2-inch increments, so you can cheaply lay rows upon rows on a sheet of plywood for a prototype backlight to show off. This provides a backlight almost 10 times brighter than today's backlights -- today's efficient 24" LED 1080p monitors consume only 15 watts of LED power. Technology has already solved the brightness problem; just manufacture more LED brightness into a monitor to make a high-speed scanning backlight possible. 100 watts of LED's is overkill for a 24" display, unless you want a scanning backlight that's dark almost 90% of the time.)
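The "add extra LED's" math works out like this: perceived brightness tracks average light output, so a backlight lit only a fraction of the time needs proportionally more instantaneous output. A sketch with illustrative wattages (the 15W steady-state figure comes from the post; the rest follows from it):

```python
# To look as bright as an always-on backlight, a strobed/scanned backlight
# must compensate its short duty cycle with higher peak output.

def required_peak_w(steady_backlight_w, bright_fraction):
    """Peak LED power needed to match an always-on backlight's brightness."""
    return steady_backlight_w / bright_fraction

print(round(required_peak_w(15, 0.10)))  # 90%:10% dark:bright -> 150 (watts)
print(round(required_peak_w(15, 0.25)))  # 75%:25% dark:bright -> 60 (watts)
```

Which is why "100+ watts of LED's" keeps coming up for a 90%-dark scanning backlight on a panel that normally only needs ~15 watts.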

Summary:
1. LCD's are now fast enough to finish refreshing before the next frame (requirement of 3D LCD's)
2. LED's are now bright and cheap enough (requirement of extra brightness needed in ultra-short flashes in scanning/strobed backlight)
3. Today's 120Hz LCD's mean that the flicker of a scanning backlight will not bother _most_ people. (3D LCD's brought us 120Hz LCD's.)
4. Controllers for scanning backlights are now cheap (it can be done with an Arduino)
5. Scanning backlights make it possible for LCD blur to be *better* than the LCD's own response speed.
6. Finally, scanning backlights can be made configurable in on-screen menus (e.g. turn off scanning and make it behave conventionally), if you only want to use it during videogames.

LCD blur can be less than CRT, since LCD pixel response no longer matters (once you meet prerequisite #1 above).
For example, a single 8ms refresh (1/120th second) on a 120Hz display can be enhanced with a scanning/strobed backlight:
2ms -- wait for LCD pixel to finish refreshing (unseen, while in the dark)
5ms -- wait a little longer for most of ghosting to disappear (unseen, while in the dark)
1ms -- flash the backlight quickly. (1/960th second or 1/1000th second)

See? You just bypassed LCD pixel response as the factor in motion blur.
Heck, you could strobe for only 0.5ms instead of 1ms -- and you get sharper motion on an LCD than on a CRT, imagine that!
Flicker fusion and persistence of vision (exactly the same principle as using a CRT display), does the rest for you -- motion is crystal sharp.
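The persistence arithmetic behind that example can be written down directly: motion blur on a sample-and-hold display is proportional to how long each frame stays lit, and a strobed backlight lights each frame only for the flash. A small sketch (the 50% and 87.5%-class figures match the reductions quoted earlier in the thread):

```python
# Blur is proportional to per-frame persistence. A 60 Hz hold-type LCD
# lights each frame for ~16.7 ms; a strobed backlight, only for the flash.

HOLD_60HZ_MS = 1000 / 60   # ~16.67 ms of persistence per 60 Hz frame

def blur_reduction_vs_60hz(persistence_ms):
    """Fraction of 60 Hz sample-and-hold motion blur eliminated."""
    return 1 - persistence_ms / HOLD_60HZ_MS

print(round(blur_reduction_vs_60hz(1000 / 120), 3))  # plain 120 Hz -> 0.5
print(round(blur_reduction_vs_60hz(1.0), 3))         # 1 ms flash   -> 0.94
print(round(blur_reduction_vs_60hz(0.5), 3))         # 0.5 ms flash -> 0.97
```

So a 1ms (roughly 1/960sec) flash removes ~94% of 60Hz hold blur, and halving the flash to 0.5ms pushes past what a typical CRT's phosphor persistence achieves.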

All these values could be adjustable in the Arduino scanning backlight project, to reduce input lag, etc. Heck, if your backlight were bright enough, you could do 0.5ms flashes of each backlight segment and still have a very bright image. Also, scanning the backlight means you don't need a power supply for the whole LED panel (e.g. only 15 watts out of a 100-watt backlight is illuminated at a time). All you need is a sufficiently overkill amount of LED's, for the extra brightness required by a backlight that's dark 90% of the time. Look around -- the good news is that LED's have fallen a lot in price in the last few years. (Don't believe me? Visit Las Vegas or Times Square. The video screens you see are LED's, and stadium jumbotrons typically have about a million LED's on them. LED billboards are often underdriven at 5% of their maximum brightness, only because of local bylaws banning overly-bright billboards as a hazard; they need to be very bright to be daylight-visible.) LED's ARE bright enough now for a high-speed scanning backlight that's dark 90% of the time. And search "600 white LED strip" on eBay, only $15 each -- that gives you more light than a typical computer monitor's built-in backlight. Today, we're finally at the point where we can have a backlight 10 times brighter than necessary, in order to have a scanning backlight that's dark 90% of the time, to achieve CRT quality in an LCD. Yes, for a monitor manufacturer this feature is hard to market, but as long as you manage to equal or better CRT (1/960th sec), reviews by discerning gamers will tell the story and self-sell the monitor.
 
Replying to myself, about combining 3D glasses and scanning backlight:
Yes, it would work, but you'd potentially have some nasty LCD ghosting/crosstalk artifacts from partially-refreshed pixels (similar to the artifacts seen in incorrectly-calibrated 3D active-shutter glasses -- e.g. not calibrated for ghosting minimization, so the shutter in your glasses opens while a part of the LCD is still refreshing, and you witness increased ghosting in a certain vertical region of the screen).

3D LCD's using active shutter glasses (alternate frames flickering after each other) have run into this problem to a certain extent, where the top edge or bottom edge of the image has more ghosting than the other edge -- that's because the 3D active shutter for a specific eye opens right after the bottom edge of the LCD has finished refreshing. This is most apparent with bright moving white objects on a black background.
Thinking about this some more -- if you combine 3D and scanning backlight -- the scanning backlight (or backlight strobing) needs to stay in sync with the active shutters, too. This requires fine-tuning of the shutter timing and/or the scanning backlight, so that it all looks fine through 3D active shutter glasses.

Monitor manufacturers may want to continue to allow 3D as an option in a monitor that has a high-speed scanning backlight. They are not mutually exclusive, and the motion-blur elimination benefits 3D as well. The motion sharpness would be the same in 3D mode, assuming rendering is done quickly enough. On a practical basis, you need double the GPU horsepower to get the same "120fps feel" through 60Hz/60Hz 3D glasses on a 120Hz display (versus plainly doing 120fps at 120Hz without 3D glasses), since two images are rendered at all times (left and right eye) and you only show one of them at a time through the active shutter glasses. (Creative optimizations can lessen the GPU-doubling overhead, but there are complexities beyond the scope of this post.) So you need the GPU horsepower of 240fps in order to get the "120fps feel" through 3D shutter glasses on a 120Hz monitor. Ouch. It's easier to allow the monitor and 3D glasses to be reconfigurable down to 72Hz or 96Hz, reducing the GPU requirements if you want "perfect blur-free motion" and "3D" at the same time.

BTW -- over ten years ago, in the year 2000, I enjoyed 3D on a CRT. I used to have the Asus V7700 GeForce 2 GTS with the 3D glasses (see Tom's Hardware, circa 2000; one of the first competitor graphics cards to finally, truly surpass 3Dfx). That was when CRT was still popular. It also happened to be compatible with standard CRT projectors -- in 2000 I hooked it up to my NEC XG135 CRT projector (my own pics from 12 years ago! Cost me a pretty penny, but I got a bargain on it used, back in 1999!) projecting a huge image on the wall, and I enjoyed 3D (60Hz, at 30/30 per eye). I was able to run 800x600 at 120Hz to get 120fps in Star Wars Racer. Instead, I ran it in 3D (800x600 at 60Hz with the old bulky shutter glasses of circa 2000) and gained the full "60fps perfect motion feel" while still having 3D! The feel of "600mph 4 feet above the ground" (the Star Wars Racer tagline) combined with the full perfect 60fps CRT feel was amazing on a wall-sized image, in full 3D at 60fps (a flickery 30/30 per eye). :D Flicker wasn't too bad in a darkened room.

3D gaming on a wall sized screen using a 3-lens CRT projector -- this was 12 years ago -- That's more than a decade ago! (Man, in my late thirties today, I look super young in that old photo!) I sometimes miss the days when I used to work in the home theater industry -- That's a big source of all my imagery knowledge, and intimate understanding of scanning mechanisms, etc.

...This self-reply is just really to say that scanning backlights aren't mutually exclusive from 3D, and it's possible to combine 3D and scanning backlight in the same display, and can still be done using an Arduino, too. (Just need to add a software "phase adjustment", or even use the shutter glasses transmitter as the source of VSYNC signal for the Arduino-powered scanning backlight, to keep the backlight scan timings aligned with shutter timings)
 
Endorsement from John Carmack on Twitter. Sounds like I am on the right track!!!

Mark Rejhon @mdrejhon
@ID_AA_Carmack I'm researching home-made Arduino scanning backlight (90%:10% dark:bright) using 100-200 watts of LED's. 120hz.net/showthread.php…

John Carmack @ID_AA_Carmack
@mdrejhon Good project. You definitely want to find the hardware vsync, don't try to communicate it from the host.
 