Home-made Arduino scanning LED backlight to simulate 480Hz or 960Hz in a 120Hz LCD?

(Warning, geek stuff below. Requires programming and basic electronics knowledge)
(Crossposted to equivalent concurrent thread in hardforum.com, overclock.net and 120hz.net)

After hearing back from John Carmack of id Software saying that this is a good project, I'm proceeding with some preliminary 'build' research, e.g. creating a small-scale breadboard trailblazer for this project. I've created electronics before, and I have programmed for more than 20 years, but this will be my first Arduino project. I've been researching, including Arduinos, to determine the best way to program one for a scanning backlight experiment.

Goals For Scanning backlight:

- At least 8 segments.
- Reduce motion blur by 90%. (Ability to be dark 90% of the time)
- Tunable in software. (1/240, 1/480, 1/960, and provisionally, 1/1920)
- Manual input lag and timing adjustment.
___

1. Decide a method of VSYNC detection.

Many methods possible. Will likely choose one of:
....(software) Signalling VSYNC from computer, using DirectX API RasterStatus.InVBlank() and RasterStatus.ScanLine .... (prone to CPU and USB timing variances)
....(hardware) Splicing video cable and use a VSYNC-detection circuit (easier with VGA, harder with HDMI/DP, not practical with HDCP)
....(hardware) Listen to 3D shutter glasses signal. It's conveniently synchronized with VSYNC. (however, this may only work during 3D mode)
....(hardware) Last resort: Use oscilloscope to find a "VSYNC signal" in my monitor's circuit. (very monitor-specific)

Note: Signalling the VSYNC from the host is not recommended (John Carmack said so!), likely due to variances in timing (e.g. CPU, USB, etc). Variances would be a problem, but this approach gives maximum flexibility for switching monitors in the future, and makes the project monitor-independent. I could stamp microsecond timecodes on the signal to compensate (RasterStatus.ScanLine may play a role in 'compensating'). In this situation, an LCD monitor's natural 'input lag' plays into my favour: it gives me time to compensate for delays caused by timing fluctuation (wait shorter/longer until 'exactly' the known input lag). I can also use averaging algorithms over the last X refreshes (e.g. 5 refreshes) to keep things even more accurate. The problem is that Windows is not a real-time operating system, and there's no interrupt/event on the PC to catch InVBlank behavior. Another idea is almost-randomly reading "ScanLine" and almost-randomly transmitting it (with a USB-timing-fluctuation-compensation timecode) to the Arduino, and letting the Arduino calculate the timings needed. This is far more complex software-wise, but far simpler and more flexible hardware-wise, especially if I want to be able to test multiple different LCD's with the same home-made scanning backlight.
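Here's a minimal sketch of what the software VSYNC detection could look like on the PC side -- not final code, just the shape of the idea. It assumes a native Direct3D 9 device (IDirect3DDevice9::GetRasterStatus is the native equivalent of the managed RasterStatus.InVBlank / ScanLine properties), and "g_device" plus "sendTimestampToArduino()" are hypothetical placeholders for a device created elsewhere and a serial/USB helper.

Code:
// Poll the Direct3D raster status and timestamp each VBlank entry with
// QueryPerformanceCounter, then forward the timestamped event to the Arduino.
#include <windows.h>
#include <d3d9.h>

extern IDirect3DDevice9* g_device;                // assumed: created elsewhere
void sendTimestampToArduino(double microseconds); // assumed: serial-port helper

void pollVsyncLoop()
{
    LARGE_INTEGER freq, now;
    QueryPerformanceFrequency(&freq);
    bool wasInVBlank = false;

    for (;;) {
        D3DRASTER_STATUS rs;
        if (SUCCEEDED(g_device->GetRasterStatus(0, &rs))) {
            if (rs.InVBlank && !wasInVBlank) {
                // Entered vertical blanking: timestamp it in microseconds.
                QueryPerformanceCounter(&now);
                double usec = (double)now.QuadPart * 1e6 / (double)freq.QuadPart;
                sendTimestampToArduino(usec);  // the timecode travels with the event,
                                               // so later USB delay doesn't matter
            }
            wasInVBlank = rs.InVBlank != 0;
        }
        Sleep(0);  // yield; Windows is not real-time, hence the timecoding
    }
}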
___

2. Verify the precision requirements that I need.

- What are the precision requirements for length of flashes (amount of time that backlight segment is turned on)
- What are the precision requirements for sequencing (lighting up the next segment in a scanning backlight)
- What are the precision requirements for VSYNC (beginning the scanning sequence)

Milliseconds, microseconds? Experimentation will be needed. People who are familiar with PWM dimming already know that microseconds matter a great deal here. Scanning backlights need to be run very precisely: sub-millisecond-level jitter _can_ be visually noticeable, because a 1.0 millisecond versus 1.1 millisecond variance means a flash is 10% brighter! That 0.1 millisecond makes a mammoth difference. We don't want annoying random flicker in a backlight! It's the same principle as PWM dimming -- if the pulses are even just 10% longer, the light is 10% brighter -- even when the pulses in PWM dimming are tiny (1ms versus 1.1ms pulses). Even though we're talking about timescales normally not noticeable to the human eye, precision plays an important role here, because the many repeated pulses over a second _add_ up to a very noticeably brighter or darker picture. (120 flashes of 1.0 millisecond equals 120 milliseconds of light, but 120 flashes of 1.1 milliseconds equals 132 milliseconds.) So we must be precise here; pulses must not vary from refresh to refresh.

However, we're not too concerned with the starting brightness of the backlight -- if the backlight is 10% too dim or too bright, we can deal with it -- it's the consistency between flashes that is more important. The length of the flash is directly related to the reduction in motion blur: the shorter the flash, the less motion blur. Since we're aiming for 1/960th second flashes (with a hopeful 1/1920th second capability), that's approximately 1 millisecond.

As long as the average brightness remains the same over approximately a flicker fusion threshold (e.g. ~1/60sec), variances in the flicker timing (VSYNC, sequencing) aren't going to be as important as the precision of the flashes, as long as the flashes get done within the flicker fusion threshold. There may be other human vision sensitivities and behaviors I have not taken into account, so experimentation is needed.

Estimated precision requirements:
Precision for length of flashes: +/- 0.5 millisecond
Precision for consistency of length of flashes: +/- one microsecond
Precision for sequencing: +/- somewhere less than 1/2 the time of a refresh (e.g. (1/120)/2 ≈ 4 milliseconds)
Precision for VSYNC timing: +/- somewhere less than 1/2 the time of a refresh (e.g. (1/120)/2 ≈ 4 milliseconds)

Goal of the precision requirements is to beat these requirements by an order of magnitude, as a safety margin for more sensitive humans and for errors. That means the consistency of flash length would be precise to 0.1 microseconds.
This appears doable with an Arduino. Arduinos are already very precise and very synchronously predictable; Arduino projects include TV signal generators -- THAT requires sub-microsecond precision for good-looking vertical lines in a horizontally-scanned signal.
Example: http://www.javiervalcarce.eu/wiki/TV_Video_Signal_Generator_with_Arduino
___

3. Arduino synchronization to VSYNC

...(preferred) Arduino Interrupt method. attachInterrupt() on an input pin connected to VSYNC. However, at 120Hz, the VSYNC pulse is less than a millisecond long, so I'll need to verify that I can detect such short pulses via attachInterrupt() on the Arduino. Worst comes to worst, I can add a simple toggle circuit inline on the VSYNC signal, so that the signal changes only 120 times a second (e.g. on for even refreshes, off for odd refreshes), a frequency low enough to be detectable by the Arduino. attachInterrupt() can interrupt any in-progress delays, which is convenient, as long as I don't noticeably lengthen the delay beyond my precision requirements. (See the sketch at the end of this section.)
...(alternate) Arduino Poll method. This may complicate precise input lag compensation, since I essentially need to do two things at the same time precisely (one task for precise VSYNC polling and input lag compensation, the other for precise scanning backlight timing). I could use two Arduinos running concurrently, side by side -- or run an Arduino along with helper chips such as an ATtiny -- to meet my precision requirements for the two time-critical tasks.

I anticipate being able to use the Interrupt method; but will keep the poll method as a backup plan.
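A minimal Arduino sketch of the interrupt method, to show the shape of it (pin choice, active edge, and variable names are my own assumptions -- the real signal may need the toggle circuit first if the pulse is too short):

Code:
const byte VSYNC_PIN = 2;
volatile unsigned long lastVsyncMicros = 0;
volatile bool vsyncFlag = false;

void onVsync() {
  // Keep the ISR tiny so it barely disturbs delayMicroseconds() elsewhere.
  lastVsyncMicros = micros();
  vsyncFlag = true;
}

void setup() {
  pinMode(VSYNC_PIN, INPUT);
  attachInterrupt(digitalPinToInterrupt(VSYNC_PIN), onVsync, RISING);
}

void loop() {
  if (vsyncFlag) {
    vsyncFlag = false;
    // ...start (or schedule) the next scanning sequence based on lastVsyncMicros...
  }
}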
___

4. Dimming ability for scanning backlight

...(preferred) Voltage method. A voltage-adjustable power supply to the backlight segments. (Note: A tight voltage range can dim LED's from 0% through 100%)
...(alternate) PWM method. Dimming only during the time a backlight segment is considered 'on'. e.g. a 1/960th second flash would use microsecond delays to PWM-flicker the light over the 1/960th second flash, for a dimmed flash. A tight PWM loop on an Arduino is capable of microsecond PWM (it can do it -- Arduino software is already used as a direct video signal generator).

The dimming of the backlight shouldn't interfere with its scanning operation. Thus, the simplest non-interfering method is a voltage-controlled power supply that can dim the LED's simply by varying voltage. Adding PWM to a scanning backlight is far more complicated (especially if I write it as an Arduino program), since I can PWM only during the intended flash cycle, or I lose the motion-blur-eliminating ability.
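For the alternate PWM method, here's a rough sketch of "PWM only during the flash" -- a 1/960sec (≈1042 µs) flash chopped into short on/off slices for dimming. The pin number and 50 µs slice length are illustrative assumptions, not measured values:

Code:
const byte SEGMENT_PIN = 5;   // hypothetical pin driving one segment's switching transistor

// Flash one segment for flashMicros, at dutyPercent brightness (0-100),
// by chopping the flash into short PWM slices.
void dimmedFlash(unsigned long flashMicros, byte dutyPercent) {
  const unsigned long slice = 50;                       // 50 µs PWM slice
  unsigned long onTime  = (slice * dutyPercent) / 100;
  unsigned long offTime = slice - onTime;
  unsigned long start = micros();
  while ((unsigned long)(micros() - start) < flashMicros) {
    if (onTime)  { digitalWrite(SEGMENT_PIN, HIGH); delayMicroseconds(onTime); }
    if (offTime) { digitalWrite(SEGMENT_PIN, LOW);  delayMicroseconds(offTime); }
  }
  digitalWrite(SEGMENT_PIN, LOW);                       // segment stays dark until its next flash
}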
___

5. Adjustable Input lag compensation

...(preferred) Use the Arduino micros() function to start a scanning sequence exactly X microseconds after the VSYNC signal.

Hopefully this can be done in the same Arduino, as I have to keep completing the previous scanning backlight refresh sequence (1/120th second) while receiving a VSYNC signal. Worst comes to worst, I can use two separate Arduinos, or an Arduino running along with an ATtiny (one for precisely listening to VSYNC and doing input lag compensation, the other to do precise backlight sequencing). If I use attachInterrupt() for the VSYNC interrupt on the Arduino, I can capture the current micros() value and save it to a variable, wait for the current scanning-backlight sequence to finish, and then start watching micros() to time the next scanning backlight refresh sequence.
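A minimal sketch of that micros()-based phase delay, assuming lastVsyncMicros was captured by the VSYNC interrupt (earlier sketch) and a made-up 3000 µs offset to be tuned manually:

Code:
extern volatile unsigned long lastVsyncMicros;   // captured in the VSYNC ISR (earlier sketch)
unsigned long lagCompensationMicros = 3000;      // example value; adjusted by hand

void waitForScanStart() {
  // Unsigned subtraction keeps this correct even when micros() wraps around.
  while ((unsigned long)(micros() - lastVsyncMicros) < lagCompensationMicros) {
    // busy-wait; keep this loop free of slow calls so the start time stays precise
  }
  // ...now begin the scanning sequence for this refresh...
}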

___

6. Precise sequencing of backlight segments.

...(preferred) Tiny delays are done on the Arduino with delayMicroseconds(). Perfect for sequencing the scanning light segments: turn one backlight segment on, delay, turn it off, repeat for the next backlight segment.
...(alternate) Use the PWM outputs (six of them) of an Arduino, or use a companion component to do the pulsing/sequencing for me. These PWM outputs can be configured to pulse in sequence. However, these outputs won't give me the precision needed for a highly-adjustable scanning backlight capable of simulating "1920Hz".

Tiny delays on the Arduino are currently my plan. I also need to do input lag compensation, so I have to start sequencing the backlight at the correct time delay after a VSYNC. I am also aware that interrupt routines (attachInterrupt()) will lengthen an in-progress delay, but I plan to keep my interrupt very short (less than 0.5 microsecond execution time, see precision requirements at top) to make this a non-issue.
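A minimal sketch of one scanning pass, assuming 8 segments on hypothetical pins 3-10 and a 1/960sec (≈1042 µs) flash per segment -- 8 × 1/960sec fills exactly one 1/120sec refresh:

Code:
const byte segmentPins[8] = {3, 4, 5, 6, 7, 8, 9, 10};
const unsigned long FLASH_MICROS = 1042;   // ≈ 1/960 sec per segment

void scanOnce() {
  for (byte i = 0; i < 8; i++) {
    digitalWrite(segmentPins[i], HIGH);    // light this segment only
    delayMicroseconds(FLASH_MICROS);
    digitalWrite(segmentPins[i], LOW);     // dark again; next segment follows immediately
  }
}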

Even though my goal is "960Hz" equivalence, I want to be able to play with "1920Hz" equivalence just for experimentation and overkill's sake, and simply, literally, "pwn" the "My LCD is better than CRT" prize, even though it will probably require a 200-watt backlight to do so without a dim picture.
___

Likely Steps

-- The next step is to download an electronics schematic creator program and create the schematic diagram. Virtual Breadboard (http://www.virtualbreadboard.com/) has an electronics circuit simulator including an Arduino emulator. It would work well for my testing needs: it can run in slow-motion mode for visual verification of behavior, and although it won't be timing-precise, it would at least allow me to visually test the code in slow motion even before I buy the parts.
-- After that, the subsequent step is to breadboard a desktop prototype with 8 simple LED's -- more like a blinky toy -- that can run at low speed (human visible speeds) and/or high speed (scanning backlight).
-- Finally, choose the first computer monitor to hack apart. Decide if I want to try taking apart my old Samsung 245BW (72Hz limit) or buy a good high speed panel (3D 120Hz panel). My Samsung is very easy to take apart, and it is disposable (I want to replace it with a Catleap/Overlord 1440p 120Hz or similar within two or three months), so it is a safe 'first platform' to test on. Even though its old technology means its response speed will cause more ghost after-images than today's 3D 120Hz panels, it will at least allow a large amount of testing before risking a higher-end LCD.
-- Create a high-power backlight (200 watts). This will be the fun part of the project: buying 20 meters of 6500K LED tape and cramming all 2,400 LED's into a 2-foot-wide 16:9 rectangle (suitable for 24"-27" panels). This might be massive overkill, but I want to eventually nail the "1920Hz"-equivalence "My LCD is better than CRT" prize. Only 10-20 watts of LED's would be lit up at a time, anyway. Appropriate power supply, switching transistors for each segment (25+ watt capable), etc. Attach it to the Arduino outputs, put LCD glass in front, and tweak away.
___

Although I do not expect many people here are familiar with Arduino programming, I'd love comments from anybody familiar with an Arduino, to tell me if there's any technical Arduino gotchas I should be aware of.
 
Update. I've designed a draft schematic. There may be errors, and there's no protection (e.g. overcurrent, overvoltage, etc), but this shows how relatively simple an Arduino scanning backlight really is. Most of the complexity is in the timing, synchronization -- still relatively simple Arduino programming.



Full size version: LINK
 
P.S. Can a Catleap owner tell me how easy it is to disassemble a Catleap monitor? Is the backlight easily removable from the LCD glass, or hard to remove? (Just for my future knowledge)

Disassembly is a pain, but doable. Be sure to take photos during the process so you know how to re-assemble it later.

One of the most difficult parts is removing the front bezel, which is held on by around a dozen plastic clips. The clips hold the front bezel to the back bezel, and must be unhooked by applying inward (towards the monitor center) pressure at the clip locations. I was able to use an old credit card and screwdriver to get mine apart, but be aware that the bezel plastic is very soft and will scratch easily.

Four screws hold the LCD in its metal casing to the plastic back bezel/plate. Removing these allows the parts to separate, but be careful not to stress the cables which run between each piece. The metal casing and plastic back can only be separated by a few inches before unplugging the connecting cables.

At this point (LCD casing and back are separated), you can remove the back arch and stand foot if desired to expose the VESA mount.

From front to back the LCD casing consists of: A front metal bezel, the LCD itself, an inner plastic bezel/spacer, several layers of diffusion material, a clear plastic light guide, white paper, and the metal backing. The data cables are on the top edge of the display, and the LEDs on the larger bottom edge. It appears that the single strip of LED PCB are held to the bottom edge of the casing by screws and should be easily removable.

Since the Catleaps use an edge-lit design you'll need to find a way to remove or cut the rearmost metal back for your LEDs. This will most likely make it difficult, if not impossible, to reassemble the outermost parts of the casing in their original configuration because of the extra space needed for your LEDs and circuits.

Other gotchas/ideas:
Make sure the diffusion materials are returned to their original orientation exactly, as in the wrong configuration it can act to reduce the amount of useful light reaching the LCD.
The diffusion materials, especially the middle one, can be scratched easily and may create visible marks later.
The middle diffusion material seems to recycle the incorrectly polarized light by reflecting it back off the white paper behind the light guide, meaning the paper does contribute to the overall display brightness. Removing this to add your LEDs may result in the display being dimmer than expected.
Dust WILL be a problem, and any stuck forward of the rearmost diffuser will be visible once the display is reassembled.
The color of your LEDs may vary somewhat, and will be more visible in your proposed configuration since it will not be diffused as much as the original edge-lit design. Adding more diffusion behind the light guide may help with this.
If you can get your hands on an old/broken LCD, it's very instructive on how the components fit together since almost all displays use similar internal layouts. That way your first experience is not with your much larger and more expensive display.

Let me know if I didn't explain anything well enough.
 
Disassembly is a pain, but doable. Be sure to take photos during the process so you know how to re-assemble it later.
Thanks for your informative instructions. I have not decided what future 120Hz display to disassemble, but will probably begin with my 'disposable' Samsung 245BW, because its backlight is a true backlight and easily separated. It will limit me to 72 Hz scanning backlight operations, and probably have more ghost afterimages trailing motion, but it would be a good proof-of-concept for me to test with.

Since the Catleaps use an edge-lit design you'll need to find a way to remove or cut the rearmost metal back for your LEDs. This will most likely make it difficult, if not impossible, to reassemble the outermost parts of the casing in their original configuration because of the extra space needed for your LEDs and circuits.

Other gotchas/ideas:
Make sure the diffusion materials are returned to their original orientation exactly, as in the wrong configuration it can act to reduce the amount of useful light reaching the LCD.
The diffusion materials, especially the middle one, can be scratched easily and may create visible marks later.
The middle diffusion material seems to recycle the incorrectly polarized light by reflecting it back off the white paper behind the light guide, meaning the paper does contribute to the overall display brightness. Removing this to add your LEDs may result in the display being dimmer than expected.
Good point, I agree that buying mass-manufactured LED ribbons will give me imperfections. This may be less of an issue if I use #5050 LED ribbons (wide ribbons) rather than #3528 LED ribbons (narrow ribbons). A #5050 is about three times brighter than a #3528 ... However, there are twice as many #3528's on a ribbon (600 LED's per 5 meters), and the narrow ribbon allows me to space them more closely. This might cause me to lean closer to buying #3528 LED ribbons instead of #5050 LED ribbons, and it would be easier to diffuse-mix the color inconsistencies when cramming a whopping 2,400 LED's behind a computer monitor LCD.

Yes -- for testing -- it would be preferable for me to get a broken LCD monitor that's simply broken in a fixable or disposable area (e.g. good glass but cracked bezel/stand, dead backlight, etc), though I'd have to really confirm it had everything else I needed to perfectly function.

As for dust, thanks for the warning. My first prototype probably won't be very 'clean', but that tip may lead me to use a clean metal or plastic piece to mount the LED's on, rather than a dusty surface like plywood. Most LED tape uses adhesive for mounting, so that should be fine even if I mount to a piece of aluminum for heat dissipation, though I'm only dissipating 10-20 watts of heat (I'm not lighting all 200 watts at once, nor should I!).
 
I would also like to say this project would be awesome. I'm often pissed that I can't read text while scrolling, and I hate motion blur with a passion.
 
Update! I'm proceeding with prototyping for the project. I've bought the Arduino and parts that I need for "kitchen countertop experimentation", using single LED's and an oscilloscope. This will be the experimentation stage before deciding to construct the LED array.

Also, there's some amazing graphic charts and intelligence being posted by some AVSFORUM users in my corresponding thread on that forum. It's been very useful in refining the plans for the scanning backlight experiment.
 
A current limiting resistor is a bad idea for an application like this, because it's likely to result in an uneven and inconsistent backlight. You don't sound like you're really into doing a ton of hardware design, but I would recommend looking into using LDOs in current mode at the very least. These are very simple. You can use an LM317 to start out, but you may end up wanting to go with a higher current IC. If you really want to go all out, use a dedicated switching LED driver IC.

As a hardware guy, I feel compelled to point out that a simple 555 timer and shift register would be a far more elegant, and in my opinion, easier approach than using an Arduino, but for a one off, you may want to just go with what you are comfortable with. Even better, it probably wouldn't be too hard to sync up your clock signal with the monitor, which is something that might be important.
 
A current limiting resistor is a bad idea for an application like this, because it's likely to result in an uneven, and inconsistent backlight.
That's not a problem if there are only a few LED's per resistor. There are already 800 current limiting resistors built into 20 meters of LED ribbon. I can't remove these surface-mounted current limiting resistors. The light on these is very consistent because there are a whopping 2,400 LED's for my planned scanning backlight, and a whopping 800 separate current-limiting resistors already built into the LED tape.

Less than one dollar per foot when purchased off eBay (search terms: "600 LED ribbon"). Getting a 5 meter (16 foot) is a total of 50 watts, and I can get higher quality 6500K light for $30-$40, suitable for a monitor backlight.
Bottom line: These ribbons are easy, cheap, and consistent -- even though they have built-in current limiting resistors. Ribbons with SMT#3528 LED's are only 8 millimeters wide. 600 LED, 50 watts per 5 meters/16 foot. I can cram more than 100 watts of ribbon per square foot, when tightly spaced (more than 40 segments of 2 feet each for a 27" panel -- more than 3,000 LED's, totalling more than 200 watts). Perfect for a scanning backlight that's dark 90% of the time. (10 watt average consumption per square foot). I need lots of easy-to-obtain brightness for very short flashes.

I do lose about 20% as heat in current limiting resistors, but I'll take that -- I need simplicity. I'm not going to put a soldering iron to thousands of LED's :)

As a hardware guy, I feel compelled to point out that a simple 555 timer and shift register would be a far more elegant, and in my opinion, easier approach than using an Arduino, but for a one off, you may want to just go with what you are comfortable with. Even better, it probably wouldn't be too hard to sync up your clock signal with the monitor, which is something that might be important.
Yes, you're right, it's just a simple LED sequencer you can do with a shift register. But I am prototyping, and I want to do it in a fun way. I am a software guy and I need very flexible adjustability and software programmability.

I also want to add some intelligence I can't easily do otherwise. Things such as precise timecoded VSYNC signalling over USB from computer to Arduino -- and programmatically compensating for CPU fluctuations/USB signalling delays. Also other creative stuff such as automatically calculating the refresh rate based on Discrete Fourier Transforms if I need them; easily done in a small Arduino program. I'd like to add some creative scanning adjustments too, and I want to try to avoid using hardware VSYNC, so I have maximum flexibility in using various different LCD panels, without needing to modify monitor electronics. I found a software method of timecoding that allows 1/135,000th second accuracy even when signalling the VSYNC over an inaccurate high-latency connection, and can even mathematically compensate for missed VSYNC's. Also, there are far more software developers than electronics hobbyists, so this open source project will be open to more people, too. I'd also benefit from source code improvements.
 
Also, the software-based nature of the Arduino allows the easy flexibility of presets. I want automatic memorization of different scanning backlight "modes" (presets), so I can call them up for different video modes (e.g. 1080p@120Hz versus 1080p@72Hz). A rough idea of what a preset could store is sketched below.
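(Field names and example numbers here are my own illustrative assumptions, not a final design.)

Code:
// One scanning-backlight preset; an array of these could be stored in flash or EEPROM.
struct BacklightPreset {
  unsigned long refreshMicros;      // e.g. 8333 for 120 Hz, 13889 for 72 Hz
  unsigned long flashMicros;        // strobe length per segment
  unsigned long phaseOffsetMicros;  // input-lag / VSYNC phase adjustment
  byte segments;                    // number of backlight segments used
  byte dimmingDutyPercent;          // PWM dimming within the flash
};

BacklightPreset presets[] = {
  { 8333, 1042, 3000, 8, 100 },     // 1080p @ 120 Hz, 1/960 sec flashes
  { 13889, 1736, 3000, 8, 100 },    // 1080p @ 72 Hz, 1/576 sec flashes
};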
 
I've had this thread bookmarked for a while. Good to see you're still enthusiastic about this project. Good luck in your efforts.
 
I am also going to experiment with PWM dimming, so I can keep the LED power supply simple (I could then just use a PC power supply to power the backlight, and be able to run it at any power range, all the way up to full 200 watt surges).

Instead of a single short strobe, I can use an ultrashort PWM dimming sequence in place of the strobe, for a dimmer strobe. It could even be just two PWM pulses.

With a 200 watt backlight for a 24" display, I need to dim more during the longer-strobe scanning backlight modes (e.g. 1/240sec) than during the shorter-strobe modes (e.g. 1/960sec), or the backlight is too bright during the longer strobes. So this is where PWM dimming comes into play during longer strobes.

For example, these would result in the same human-perceived average backlight brightness:
1/960 with no PWM (single 1/960sec strobe, not split into two separate 1/1920sec strobes)
1/480 with 50% dimming by PWM (two 1/1920sec strobes spaced 2/1920sec apart = total 4/1920 = 1/480)
1/240 with 75% dimming by PWM (two 1/1920sec strobes spaced 6/1920sec apart = total 8/1920 = 1/240)

I did some tests with an Arduino today (tests are up and running!), and have successfully validated the ability to do this. In fact, PWM dimming works even up to 100 kHz -- and with some optimizations (port writes instead of C functions), more than 1 MHz. That's overkill, though. The challenge will be to write the Arduino program to treat each sequence as its own separate strobe sequence, so that the strobe sequences can overlap. I have come up with a way to do this, keep tuned.
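As a quick sanity check of the three example modes listed above (this is just arithmetic in code form, not part of the backlight program), the total lit time per refresh works out identical in all three, which is why the perceived brightness matches:

Code:
#include <cstdio>

int main() {
  // Units are 1/1920 sec. Duty is relative to each mode's own strobe window
  // (duty = 100% minus the dimming percentage quoted above).
  struct { const char* name; int strobes, litPerStrobe, window; } modes[] = {
    {"1/960  (no PWM)",      1, 2, 2},
    {"1/480  (50% dimming)", 2, 1, 4},
    {"1/240  (75% dimming)", 2, 1, 8},
  };
  for (auto& m : modes) {
    int lit = m.strobes * m.litPerStrobe;   // always 2/1920 sec of light per refresh
    printf("%s  lit=%d/1920s  window=%d/1920s  duty=%d%%\n",
           m.name, lit, m.window, 100 * lit / m.window);
  }
  return 0;
}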
 
Check out these FPGA based Arduino-like boards. http://papilio.cc/ You can drop an "Arduino" soft core on them. Some of the newer models have actual processors in addition to the FPGA.

It would be much easier to do hardware sync detection and LED scanning in HDL. Use the processor (hard or soft) to compute the delays and timing offsets.

Software people can usually pick up HDL fairly quickly.
 
Check out these FPGA based Arduino-like boards. http://papilio.cc/ You can drop an "Arduino" soft core on them. Some of the newer models have actual processors in addition to the FPGA.

It would be much easier to do hardware sync detection and LED scanning in HDL. Use the processor (hard or soft) to compute the delays and timing offsets.

Software people can usually pick up HDL fairly quickly.
Excellent idea, I'll keep that page bookmarked.

I just constructed the first "test" scanning backlight program with 16 small indicator LED's connected to it -- and it is running accurately enough for software-adjustable 1/960 and 1/1920 operation. The programming was a bit easier than I thought, but I have to be careful about consistency from cycle to cycle (e.g. refresh to refresh) -- e.g. running the same number of program instructions during consecutive cycles. Technically, I've got a test program already written -- now I've got to wait for the oscilloscope, buy $200 of LED's, and find a donor monitor! Going to FPGA may be computing overkill at this stage, but if I need more advanced fine-tuning beyond the capabilities of the Arduino (and I might), that will be a useful platform!

I also created a blog website, I'm building it right now -- to post my first photographs of my first "kitchen countertop experiment". (using red indicator LED's for initial tests, before purchasing LED ribbons)
 
I want to try to avoid using hardware VSYNC, so I can do maximum flexibility in using various different LCD panels, without needing to modify monitor electronics.

Not using a hardware vsync is bad for a number of reasons:
1) The system becomes not only OS-dependent, but also requires a computer.
2) Most consumer OSes are not real-time, so the time you get the vsync is non-deterministic. You may not even have access to the real vsync since the video card is generating this.
 
Not using a hardware vsync is bad for a number of reasons:
1) The system becomes not only OS-dependent, but also requires a computer.
2) Most consumer OSes are not real-time, so the time you get the vsync is non-deterministic. You may not even have access to the real vsync since the video card is generating this.
1) Yes, I agree, that's a disadvantage. I believe for the first prototype, I will design the circuit to have flexibility to use both hardware and software VSYNC, so that I can choose either way depending on the panel I am testing.

2) True -- however -- I discovered algorithms to get software VSYNC accurate to less than +/- 1 microsecond! I monitor VSYNC over a 10-second period (Direct3D API RasterStatus.InVBlank), measuring the time of entry into VSYNC using QueryPerformanceCounter, then average the interval lengths (over 600 VSYNC intervals) and reject obvious outliers like doubled values (e.g. missed VSYNC's due to CPU starvation) -- and I found that my mathematical result is accurate to 1 microsecond. From there, I can timecode my signals to the scanning backlight with high-precision microsecond timestamps (so that I'm USB-jitter independent). Both the PC end and the Arduino end have access to microsecond-accurate timers, and I can improve accuracy even further by using the Direct3D API RasterStatus.ScanLine (which tells me how long ago the last VSYNC was), although I'll be trying to avoid using that to maximize compatibility.

Also, I discovered that I can extrapolate quite accurately for several seconds to many minutes -- meaning I only need to re-synchronize the VSYNC timing once in a while (e.g. I don't need to signal VSYNC every 1/60 second, so I'm 100% immune to CPU starvation issues, and 100% immune to short-term missed VSYNC's!). Once the VSYNC timing is sync'd, the drift is so low that the PC and Arduino timers drift apart very slowly, so the scanning backlight flash sequence stays in sync unattended for a very long time. When the VSYNC timing is resynchronized, I can slowly slew the Arduino imperceptibly to the new VSYNC timing, to prevent even a single flicker from a sudden VSYNC timing correction (in the unlikely event that things are very out of sync). I just need the VSYNC timing to be accurate to approximately +/- 1 millisecond, which is 1000 times less accurate than the precision I'm actually getting (1 microsecond) from averaging the performance counters over 600 VSYNC intervals, giving me an accurate VSYNC timing basis that successfully extrapolates flaw-free for many minutes.

There would be a phasing adjustment (timing of starting the scanning sequence relative to the timing of VSYNC) to compensate for other latencies, including input lag and average USB signalling lag (I've found an algorithm that compensates for jitter and returns a stable average round-trip USB latency of 4.092ms, with the trailing 1-second average latency fluctuating only +/-0.002ms on my system -- very low jitter after subtracting calculated jitter in my jitter-compensation math! Works on XP, Vista, 7 and 8). So software VSYNC is quite unexpectedly accurate, with the proper software algorithms and proper extrapolation algorithms, so that VSYNC timecodes only need to be signalled/synchronized occasionally (e.g. once every second, once every minute, etc). The one-time manual phase adjustment does the rest, adjusting for input lag and the "relative" timing of VSYNC to the scanning sequence. Requirement = 1 millisecond precision; actual achieved = 1-2 microsecond precision -- plenty of safety margin!
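A hedged sketch of the averaging idea (not my exact code): collect the VBlank-entry timestamps for ~10 seconds, reject outliers such as doubled intervals from missed VSYNC's, and average the rest.

Code:
#include <vector>
#include <algorithm>

// Input: QueryPerformanceCounter-based timestamps (in microseconds) of each VBlank entry.
// Output: the average refresh interval in microseconds.
double averageRefreshMicros(const std::vector<double>& vblankTimesMicros)
{
    std::vector<double> intervals;
    for (size_t i = 1; i < vblankTimesMicros.size(); i++)
        intervals.push_back(vblankTimesMicros[i] - vblankTimesMicros[i - 1]);

    // Use the median as a robust estimate, then reject intervals far away from it
    // (e.g. doubled values caused by a missed VSYNC during CPU starvation).
    std::vector<double> sorted = intervals;
    std::sort(sorted.begin(), sorted.end());
    double median = sorted[sorted.size() / 2];

    double sum = 0; int count = 0;
    for (double iv : intervals)
        if (iv > 0.5 * median && iv < 1.5 * median) { sum += iv; count++; }

    return count ? sum / count : median;   // microseconds per refresh
}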

I've now updated the Scanning Backlight FAQ, with new answers for:
"Q: How can you synchronize the backlight to VSYNC?"
"Q: How do you synchronize the scanning backlight to the LCD panel?"

A small status update on my prototype: I have the Arduino circuit and sketch, including synchronization logic tested, with simple LED's. The next big step would be for me to build the 240 watt of LED backlight, to attach to the circuit. First, before that step, I'm currently working on the Arduino Input Lag Meter (which you can also build yourself for about $25, with no soldering!). Keep an eye on my blog when I release the software and schematic for the build-it-yourself Arduino input lag tester.
 
Some of the reels of ultrabright LED ribbons have now arrived!

...

...
They are very BRIGHT. Read more on my blog at www.scanningbacklight.com.

I'll be using a whopping 900 of these LED's as an active strobed/scanning backlight in one 24" monitor, to permit a sufficiently bright image at 0.5 millisecond impulses per refresh -- sufficient to reduce LCD motion blur to less than CRT. At 21-22 lumens per LED, this is a total of about 20,000 lumens in the strobes! (possibly up to 60,000 lumens if I carefully overvolt the LED's during the short pulses, as LED's will tolerate surge current in short pulses).

Reasonably inexpensive #5050 Epistar LED chips (darn near 6500K) -- color purity is reasonably good at approximately CRI ~70-75; it's much better quality white light than CCFL -- but not as good as high end Samsung-manufactured LED's. (I'd end up spending 4 figures on CRI 90+ LED's alone for just one monitor.) Phosphor persistence is very low on these white LED's, supposedly far less than 0.5ms, I'll be conducting oscilloscope+photodiode tests in the coming weeks. Primary goal is simply eliminate visible motion blur, not have stunning IPS-LCD style photo quality (for now).

I'll be testing strictly with active 3D panels, since those are the only LCD panels reliably able to clear the vast majority of pixel persistence (>99%) within the same frame refresh (by design necessity), and such panels are capable of less motion blur than CRT, when coupled with this 150watt/sqft backlight in full-array-at-once strobing mode. The full strobe mode would eliminate any backlight diffusion issues (between on-segments and off-segments during sequential scanning modes), that would interfere with motion blur elimination, as this had been expressed to me by other experts. Strobe flicker is not a concern at 120Hz native refresh rate (= 120Hz strobe rate). The first prototype will be an ultimate videogaming computer monitor for my desk.

P.S. I had spots in my eyes. Need sunglasses when working with many of these LED's squeezed in a small space.
 
Yeah -- almost -- at least when I'm showing these ribbons directly in continuous non-stop illumination.

Behind an LCD, it will lose some brightness due to LCD inefficiencies, and then when strobed (dark 95% of the time -- e.g. strobed for only 1ms out of a 16ms refresh at 60Hz, or strobed for only 0.5ms out of 8ms refresh at 120Hz) -- then in this case, it will quite look like CRT -- comfortable brightness, not too dim and not too bright.

CRT phosphors briefly shine *this* brightly too (see the overexposed high-speed camera image of a CRT) -- just for a very short time period (millisecond-league). My calculations showed that I need at least about 150 watts of LED per square foot in order to equal CRT phosphor in illumination output, for the same impulse length as the phosphor decay.
 
That's new info. I'm just looking for a non-flickering, low-brightness backlight. Simple demands.
 
My modified monitor project isn't of interest to people who are sensitive to PWM flicker or CRT flicker.
The goal of my project is to completely eliminate human-perceptible motion blur on LCD, by strobing or scanning the backlight.
People who really like CRT's during videogames, for the zero-motion-blur ability, are the audience that's interested in my monitor hack.

Fortunately, it's software configurable, so it can go into non-flicker and flicker modes, depending on whether you want rock-steady static images or zero-motion-blur video gaming... So you can turn off the scanning/strobing modes when you're just web browsing or editing a Word document.

Meanwhile, there's lots for me to do over the coming several weeks, so keep an eye on my blog.
 
I recently did LED phosphor tests, since white LED's have phosphor, which has a finite decay time. I measured my purchased ribbons using an Arduino "oscilloscope" circuit (photodiode sampled at 10 kHz) -- when power is turned off, the light decays to less than 10% brightness after 0.1ms, and to less than 1% brightness after 0.2ms. This is a lot faster than CRT phosphor, so this won't be my limiting factor for motion blur elimination on an LCD.
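For anyone wanting to repeat this, here's roughly what the photodiode "oscilloscope" test looks like as an Arduino sketch (pin numbers and buffer size are assumptions; analogRead() takes about 100 µs per sample, which is where the ~10 kHz rate comes from):

Code:
const byte LED_PIN = 9;          // drives the ribbon's switching transistor
const int  PHOTODIODE_PIN = A0;  // photodiode/amplifier output
const int  SAMPLES = 200;        // ~20 ms of decay data at roughly 10 kHz
int samples[SAMPLES];

void setup() {
  Serial.begin(115200);
  pinMode(LED_PIN, OUTPUT);

  digitalWrite(LED_PIN, HIGH);   // ribbon fully lit
  delay(100);                    // let the brightness stabilize
  digitalWrite(LED_PIN, LOW);    // cut the power...

  for (int i = 0; i < SAMPLES; i++) {
    samples[i] = analogRead(PHOTODIODE_PIN);   // ~100 µs per read => ~10 kHz
  }
  for (int i = 0; i < SAMPLES; i++) {
    Serial.println(samples[i]);  // paste into a spreadsheet to plot the decay curve
  }
}

void loop() {}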

I've purchased LED ribbon (two of the packages arrived; 792 watts of LED's!), the 1000fps Casio camera (arrived at FedEx depot, waiting for pickup), an oscilloscope and logic analyzer (basic USBee clone, in the mail), MOSFET amplifiers (in the mail), 750 watt PC power supply (power for LED ribbons), and now have most of the parts necessary to build a 250 watt strobed backlight for a 23-27" class LCD panel. I now have to decide the following:

(1) Do I hack apart a cheap Asus VG236H obtained on sale, or do I wait for the Asus VG248QE which is a vastly superior panel almost practically guaranteed to be possible to have less motion blur than CRT (provided I have enough lumens in sufficiently short strobes)? Or go big and buy the VG278HE (expensive $600 monitor I'd be destroying). I am slowly leaning towards hacking a cheap 120Hz panel first, even if I don't "go better than CRT in motion blur".

(2) What kind of backlight diffuser should I use? I need to choose carefully. Are there LCD parts suppliers that sell diffuser sheets compatible with full-array backlighting (which is essentially what I'm creating)? Do I recycle existing edgelight-compatible diffusers for a behind-glass full-array backlight, or get a backlight-optimized diffuser sheet? Find a cracked high-end LCD HDTV, take its diffuser out, cut it up, and use it for my display? I essentially have to diffuse the LED dots into a continuous bright rectangle of light behind an LCD panel. I'm currently researching my options for efficient diffuser sheets.

(3) For the first prototype, do I lower costs and forgo the scanning modes, and only go for full-strobe modes? Scanning modes have the disadvantage of backlight diffusion interfering between adjacent segments, and they complicate the circuitry. Full-strobe backlight flashing is good enough at 120Hz (it does not flicker for most people), as already evidenced by the recent nVidia 3D LightBoost tests (nVidia LightBoost is a strobed backlight).

(4) Do I test out a voltage/current boost circuit, so I can do 500-700 watt surges through LED's rated for 250 watts? LED's can handle more power when strobed briefly. The extra brightness is important so I can use strobes that are as short as possible -- my goal strobe length is 0.5 millisecond flashes (1/2000sec) per refresh, and my Arduino circuit has the capability to go down to 1/10,000sec strobes. Since I've now proven the LED ribbon phosphor persistence is in the neighbourhood of only 0.1ms, the LED's and the Arduino are not the limiting factor -- it's the amount of wattage I can output during short strobes, to make sure that the average picture level isn't dim.


These are questions I have to research the answers to, next.

Also, the Xmas season and personal family responsibilities are coming, so the project may go into hiatus from mid-December until early January. Construction will probably wait until after then, but research and tests will continue before then. I'd like to get plenty of groundwork done beforehand! (And hopefully some interesting imagery in the form of photos/data/graphs to blog.) :)
 
Very exciting!

Would it be feasible to use LCD or OLED panels from smartphones? IIRC there have been a couple of 3D LCD phones, and I'm sure an OLED screen meets the minimum refresh time requirements. Using a small display could make prototyping much easier and cheaper. Replacement screens for the phones can be found on eBay for cheap (though OLED screens are much more expensive).

Also, have you talked to Vega? I'm sure your project would interest him greatly as he's in search of the perfect display setup for gaming and has modified several LCD and CRT setups including doing a 3x portrait FW900 setup with a fresnel lens.

His thread - http://hardforum.com/showthread.php?t=1675965
 
The science of this project is sound and I am very interested. I currently have two 130 Hz 1440P monitors and I can use one as a test-bed project. To replace the back-light properly would require a clean room, which I plan to build to eliminate dust as you separate the factory LCD panel from the back-light and the usually "loose" plastic light diffusion layers and polarizers.

My largest concern would not be the hardware modification for the new back-light which would be relatively easy, it would be the very precise timing needed for synchronization. Source (host) VSYNC signal is unusable. 3D shutter glasses signal is no good. Really the only way I could see the signal timing being perfect and have no drift is if you tapped into the internally generated signal of the monitor itself with an oscilloscope. This of course would be very monitor specific which could cause implementation problems.

I guarantee this is how Sony and others implement their scanning back-lights. Using external sources is no good (especially the further beyond 60 Hz you go). All monitors have some degree of circuit delay, but this delay is usually a constant so it can be adjusted for manually. I know nothing about Arduino, but I can wield a soldering iron well and modify monitors adeptly so I am definitely interested. The market for a CRT-like motion LCD with associated LCD's higher resolution than any Plasma or CRT could be huge.

The reason this hasn't been accomplished in computer displays is cost. The cost could easily escalate on a given monitor by many hundreds of dollars with the associated circuits and high-intensity LED's. I guess most businesses have deemed the market too small. For those of us that demand the best, this could be a very interesting prospect.
 
I prefer hardware VSYNC, but for practical reasons, software VSYNC is also going to be supported to maximize monitor flexibility (eliminate requirement to modify monitor except only for its backlight)...
My largest concern would not be the hardware modification for the new back-light which would be relatively easy, it would be the very precise timing needed for synchronization. Source (host) VSYNC signal is unusable.
Would you be surprised I was able to do source (host) VSYNC with less than a +/- 10 microsecond accuracy?

The trick is made possible by "timecoding" (then it doesn't matter how late the VSYNC signal is randomly delayed). In a critical section, I can timecode a Direct3D RasterStatus.ScanLine reading (which tells you how long ago the last VSYNC was, in number of scanlines -- easily converted to a microseconds value using the "VESA Generalized Timing Formula") with a call to QueryPerformanceCounter(), which is a microsecond-accurate timer. I've accurately timecoded to +/- 1 microsecond. Now it doesn't matter how quickly I send the VSYNC out: timecoded VSYNC signals are 100% immune to CPU fluctuations and USB ping variability. The Arduino already has a microsecond-accurate timer (micros()) which is used to compensate the timecoded value. Due to the slower performance of the Arduino, it becomes +/- 10 microseconds. On my system, the ping between PC and Arduino is a tested average of 4.092ms (varying only +/- 0.001 millisecond), averaged over many pings. This is consistent. I subtract half the ping for one-way latency (not even needed; merely to simplify manual adjustment), and then use a manual phasing adjustment (timing the backlight relative to VSYNC) -- and it's accurate enough, since I only need ~1ms or ~0.5ms precision in timing the beginning of the scanning backlight scan or strobe. (0.5ms = a refresh of approximately 1/32nd of screen height at 60Hz -- which is acceptable -- and even at 120Hz, 0.5ms is still only a refresh of 1/16th of screen height.) So host signalling is quite surprisingly accurate, if you use the timecoding technique. A rough sketch of the scanline-to-microseconds conversion is below.
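(Values here are illustrative assumptions for a 1080p @ 120Hz timing with 1125 total scanlines per frame; the real numbers come from the actual video timing in use.)

Code:
const double refreshMicros  = 8333.3;   // one 120 Hz refresh, in microseconds
const double totalScanlines = 1125.0;   // active + blanking lines per frame (assumed timing)

// Microseconds elapsed since the last VSYNC, given a RasterStatus.ScanLine reading.
double microsSinceVsync(unsigned int scanLine)
{
    return (scanLine / totalScanlines) * refreshMicros;
}

// A timecoded VSYNC message is then: (QueryPerformanceCounter time at sampling)
// minus microsSinceVsync(scanLine) = the absolute time of the last VSYNC,
// which stays valid no matter how long the USB transmission later takes.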

In fact, I don't even need to signal VSYNC 60 times a second -- once I know the refresh rate to +/- 10 microseconds (doable after counting VSYNC's for several seconds, as a dynamically maintained average), all I have to do is send the Arduino a timecoded refresh-length value, and the Arduino can extrapolate the scanning sequences for at least 10 seconds -- meaning, theoretically, I can just signal the Arduino about once a second, and the scanning or strobing sequence stays accurate for several seconds. No need to signal a VSYNC every time a VSYNC occurs. To prevent jarring transitions (slight flickers, etc), I slowly slew the Arduino's internally generated VSYNC timing to the most recently signalled VSYNC timecode from the PC (which might, say, be 5 microseconds early, or 7 microseconds late, etc).

Thus, I can successfully:
(1) Have variable latency for VSYNC signalling
(2) Don't care about CPU delays to VSYNC (CPU fluctuation proof)
(3) Don't care about USB delays to VSYNC (USB ping fluctuation proof)
(4) Extrapolate future VSYNC's. Don't even need to signal VSYNC every single time VSYNC occurs. (freeze-proof)
Thanks to the magic of timecoding the signal, and because both the PC and the Arduino have microsecond-accurate system timers:

Precision Requirement for VSYNC: ~1 millisecond
Achieved Precision of Software VSYNC: ~0.010 millisecond
Two orders of magnitude safety margin.

See, microsecond-level timecoding makes host signalling of VSYNC accurate!

I guarantee this is how Sony and others implement their scanning back-lights.
Agreed. You are 100% correct, we already both know this.
However, this is a user modification. I want to try to make this a monitor-independent modification, so that future hobbyists don't need to reverse-engineer the monitor's electronics to succeed. That's why I like keeping the VSYNC option flexible -- support for both hardware *and* software based VSYNC. I feel it would be wonderful to have "no monitor modifications absolutely required except the backlight" ... That is the engineering challenge I have successfully made possible: do it with zero monitor electronics/circuit modifications!

The reason this hasn't been accomplished in computer displays is cost. The cost could easily escalate on a given monitor by many hundreds of dollars with the associated circuits and high-intensity LED's.
Right. It's the cost of the insane wattage in LED's.
The circuit is actually cheap (sub-$100) but the LED's are insanely expensive.

And you have to do it efficiently -- pushing enough of the photons from 250 watts of LED through the panel at 10 times brighter than current monitors -- and doing it efficiently (e.g. parabolic reflectors or lens). The efficiency is the difficult part, and that's the part I am currently working on -- LCD panels absorb quite a lot of light, even during white screen. Doing it efficiently is expensive (otherwise you need 1000 watts of LED's instead of 250 watts to have less motion blur in a 24" LCD than a CRT)

I am also testing other approaches (making a massive edgelight consisting of a series of 3-5watt strobe LED's, and reusing the LCD panel's existing edgelight optics because keeping the existing optics simplifies things efficiency-wise.) .... the problem is the edgelight will only work with full-screen strobes, but that won't be a flicker problem (for most people) at 120Hz / 144Hz. (nVidia 3D Lightboost is a strobed backlight that also reduces motion blur, but not nearly as much as I'd like, and is mainly enabled only during 3D).

P.S. Just got my 480fps/1000fps camera for refresh-pattern captures, posted my first video at www.scanningbacklight.com
 
Don't get me wrong, if you could accomplish this it would be amazing. On paper it sounds like using a host VSYNC signal could work, but a lot of time in practice it doesn't work out well. I assume your first test-bed will be an 8-segment scanning back-light on a 120 Hz screen to simulate 960 Hz and use an adapter to splice between the monitor cable from the GPU and the input of the monitor to steal the VSYNC signal?

Will the GPU always be required to have VSYNC enabled (which isn't optimal in some cases) or does a usable time signal get sent at all with VSYNC disabled? Do you envision this only properly working with fast TN panels? I would like to try this out on my 130 Hz 1440P IPS but the pixel rise and fall is around 5-7 ms. With a refresh period of about 7.7ms at 130 Hz, that hardly leaves any time between pixel refreshes. As soon as the pixels are done changing they are already starting to change once again.

So it leaves plenty of time ~5ms to get the eight sequenced light pulses in but hardly any room for error and buffer when the pixels are transitioning so the timing would have to be spot on to not illuminate pixels in transition state.

It would be great if you could implement an adapter that you could simply splice between the output of the GPU and the input of the monitor, that would eliminate any hardware specific engineering which would severely limit the application. Although, every single monitor is going to need to be back-lit modified which is only going to be a niche market, unless you market ready made adapter and monitors. There could be a nice market for enthusiasts.

One other issue is capturing the VSYNC signal. This is easy with DVI and HDMI as they both use constant-stream digital signals, but it may be more complicated with DisplayPort. That signal is packet based. Currently most monitors still use DVI but I could see DP becoming the only standard here within a couple of years. That is until and if they get HDMI chips with pixel clocks high enough to support 120+ Hz, but then again who would want to do such an extensive and awesome modification of this type on only a 1080P monitor? I know I wouldn't. I've even been considering purchasing the Sharp IGZO 4K 31.5" display this spring. That monitor combined with this scanning back-light would surely make the best display on the planet for computer use.

Please let me know if you get beyond the proof-of-concept stage with the host VSYNC timing with Arduino actually working. I'd gladly love to buy some of these LED strips, modify one of my 130 Hz 1440P monitors to test this out. It may be beneficial to have third party(s) work on this purchasing their own hardware to get the process moving along at a quicker pace and help you validate stuff. I could also see a 120 Hz 1440P screen that costs $500 easily be made into a 120 Hz refresh/960 Hz scanning back-light be worth a couple of grand at least if the motion blur is dramatically reduced.
 
I've always wondered why scanning backlights weren't used in gaming monitors. I see the cost is quite high, infinitely worth it though.
 
Don't get me wrong, if you could accomplish this it would be amazing. On paper it sounds like using a host VSYNC signal could work, but a lot of time in practice it doesn't work out well. I assume your first test-bed will be an 8-segment scanning back-light on a 120 Hz screen to simulate 960 Hz and use an adapter to splice between the monitor cable from the GPU and the input of the monitor to steal the VSYNC signal?
I'm using 16 segments now (see the photo of the bench test unit, attached to single LED's rather than to MOSFET amplifiers connected to powerful ribbons hiding behind an LCD panel). Tests have shown I'm able to do well-timed 1/10,000sec strobes with some optimized Arduino programming. However, I'm leaning towards skipping sequential scanning and using only full-panel strobes -- at least for a first prototype. The circuit is otherwise quite simple -- anyone who can put together LEGO can follow instructions to build the circuit. The smarts/complexity is in the software and the synchronization of the flashes, and the money/wallet is in the cost of the LED's.

The good news is that I now know this is possible with 3D-compatible panels without artifacts. The primary disadvantage of full-strobe is at least half a frame of average added input lag (+3.5ms at 144Hz refresh) but the perfect clarity (identify faraway snipers in 3D games without stopping turning first) will likely more than outweigh the added input lag.

Will the GPU always be required to have VSYNC enabled (which isn't optimal in some cases) or does a usable time signal get sent at all with VSYNC disabled?
VSYNC via GPU has nothing to do with VSYNC on display. VSYNC stands for Vertical SYNChronization, which *all* display signals have. If you lived through the 70's or 80's, you've seen VSYNC before: It's that black bar between frames when VHOLD is mis-adjusted on an analog TV. Displays require that synchronization signal in order to begin the next frame, otherwise the picture is scrambled, garbled, skewed, or rolling. There's guaranteed always a VSYNC being output to the display.

VSYNC in videogames is simply telling the game to wait for the hardware vertical sync signal (that always exists), or not to wait for the hardware vertical sync signal (that always exists). Turning this off in videogames has absolutely nothing to do with eliminating the signal, simply telling the game to wait for it, or to not wait for it.

Do you envision this only properly working with fast TN panels? I would like to try this out on my 130 Hz 1440P IPS but the pixel rise and fall is around 5-7 ms. With a refresh period of about 7.7ms at 130 Hz, that hardly leaves any time between pixel refreshes. As soon as the pixels are done changing they are already starting to change once again.
It will work far better with TN panels than with IPS panels, but there will be benefit. To have less motion blur than CRT, you need a good TN panel and refresh electronics that can finish erasing pixel persistence on time before the next refresh (3D 120Hz/144Hz compatible).

But it will benefit any LCD (to varying extents). You can still have lots less motion blur than you do right now (e.g. 75% less). Your limiting factor is how many % of pixel persistence from the last frame is left in the next frame. If that figure is about 20%, then that becomes your limiting factor -- approximately 80% motion blur reduction with the last 20% impossible to eliminate due to pixel persistence. For an IPS panel, you'll definitely need a sequential scanning backlight, rather than a full-panel strobe.

So it leaves plenty of time ~5ms to get the eight sequenced light pulses in but hardly any room for error and buffer when the pixels are transitioning so the timing would have to be spot on to not illuminate pixels in transition state.
You need a lot of safety margin because of backlight diffusion. The boundaries between lit segments and unlit segments are extremely diffuse unless you build a lot of optics (which is hard for a 900 LED backlight!), and it will still be somewhat diffuse even then. Even the diffusers built into panels spread light throughout the panel to an extent -- so lighting even a tiny corner of the panel unavoidably, faintly lights up the rest of the panel. So that becomes a limiting factor, which also contributes to my decision to prefer full-panel strobes (which then forces the active-3D-compatible panel requirement, limited to TN for monitors at this time).

It would be great if you could implement an adapter that you could simply splice between the output of the GPU and the input of the monitor, that would eliminate any hardware specific engineering which would severely limit the application. Although, every single monitor is going to need to be back-lit modified which is only going to be a niche market, unless you market ready made adapter and monitors. There could be a nice market for enthusiasts.
Yes, a dongle would be far the most user-friendly method, and it can be hardwired to the Arduino, without any requirement for timecoding.

One other issue is capturing the VSYNC signal. This is easy with DVI and HDMI as they both use constant-stream digital signals, but it may be more complicated with DisplayPort. That signal is packet based. Currently most monitors still use DVI but I could see DP becoming the only standard here within a couple of years. That is until and if they get HDMI chips with pixel clocks high enough to support 120+ Hz, but then again who would want to do such an extensive and awesome modification of this type on only a 1080P monitor? I know I wouldn't. I've even been considering purchasing the Sharp IGZO 4K 31.5" display this spring. That monitor combined with this scanning back-light would surely make the best display on the planet for computer use.
Indeed, especially if the IGZO is 3D compatible.

Even if not used for 3D, only 3D-compatible panels are capable of having less motion blur than CRT, with the right kind of backlight strobes. 3D-compatible panels require fast pixel persistence and a fast refresh pattern, with a long static period after a refresh ends and before the next refresh begins. This refresh pattern (fully refreshed static image) allows me to use full-panel strobes instead of sequential scans. That way, all of the pixel persistence artifacts are kept in total darkness between refreshes (and there is no backlight diffusion to worry about), and I'm only strobing light through a fully-refreshed frame. Tests have shown that it's not possible to have less motion blur than CRT without doing a full-panel-only strobe. So my preference has shifted to full-panel strobes, relative to sequentially-scanned strobes. Sequential scanning will still reduce motion blur a lot, but backlight diffusion (between on segments and off segments) becomes a major limiting factor to motion blur reduction.

Please let me know if you get beyond the proof-of-concept stage, with the host VSYNC timing on the Arduino actually working. I'd gladly buy some of these LED strips and modify one of my 130 Hz 1440p monitors to test this out.
The LED strip is the easy part -- you can get them by searching for "5050 LED ribbon" on eBay, Alibaba or Amazon, though many are too dim for this purpose; the trick is in selection, getting the brightest ribbon.

However, one warning: the optics are a big challenge (focussing & diffusing the LED's as efficiently as possible into a solid white rectangle of bright light to put the LCD panel in front of); stay tuned. This part may end up being beyond the scope of most other DIY users. To keep this DIY accessible, I may go with the edgelight-and-full-strobe-only method, since that's simpler from an efficiency perspective: just reuse the optimized optics already built into the LCD panel's edgelight, replacing the edgelight LED's with far more powerful ones. This technically eliminates the need to build backlight optics, and keeps my power requirements down.

I'm not in a hurry to build the desktop-sized prototype until I can push roughly 10 times the amount of light (relative to a built-in backlight) through to the other side of an LCD during bench tests. LCD panels are light sponges; they are quite inefficient at transmitting light! I'm quite intent on matching CRT quality in motion blur elimination, so that's forced me to go with full-panel strobes (to eliminate the backlight diffusion problem), which subsequently forces me to stick to 3D 120Hz/144Hz TN panels that can finish their pixel persistence before the end of each refresh, but also fortunately opens the door to turbocharged edgelight replacement options (far simpler for DIY).
 
I've always wondered why scanning backlights weren't used in gaming monitors. I see the cost is quite high, infinitely worth it though.
The good news is that nVidia has started encouraging manufacturers to put strobed backlights in their displays -- 3D LightBoost is a strobed backlight that flashes the backlight brightly whenever the shutter glasses are open. Both shutters of the glasses are closed while a 3D monitor refreshes between frames; instead of wasting brightness keeping the backlight turned on while both shutters are closed, the monitor saves the light and drives the LED's more brightly for shorter periods, only while the shutters are open. That's why they call it "LightBoost" -- a brighter image when used with 3D shutter glasses. But it's also exactly what's needed for motion blur reduction, and this excellent AnandTech article covers the topic.

3D LightBoost also reduces motion blur, thanks to its strobed nature. The problem is that 3D LightBoost is not usually used in 2D mode (where its motion blur elimination could help), and the strobes are not short enough to produce less motion blur than CRT.

However, it's definitely a start in reducing motion blur in monitors. The upcoming ASUS VG248QE monitor has a 1ms pixel response, and it is supposed to be the world's least-motion-blur 24" gaming monitor out of the box when it hits the stores. People who don't want to do a strobed/scanning backlight mod should pay attention to this monitor -- a big bonus if its 3D LightBoost strobed backlight can be enabled during 2D mode for motion blur reduction (not yet determined whether it can).
 
Ah yes, I hadn't thought about the fact that virtually all LED back-lit LCD computer monitors are edge lit with a diffusion layer. The optical properties of these layers would be monitor specific. The layers take the light entering from the edge and redirect it 90 degrees across the surface to illuminate the panel area.

Simply putting tons of these bright LED's behind that same diffusion layer may not produce the results you are after, like you said, and the local strobe would migrate to much of the rest of the panel's surface area. With a direct-lit panel, how close the individual LED light sources are to each other determines the required diffusion amount. Of course, all of these calculations are done by the manufacturer, which is why there are virtually no direct-lit computer monitors -- cost is a huge factor.

So if the goal is not to change the optical diffusion layer, you would simply have to replace the edge-lit LED's with far, far brighter ones, like you said. Now isn't that where the problem lies? If you went with a direct-view LED backlight, you could use many more LED's than edge-lit allows, which would significantly help the luminosity of the screen. Is it technically possible to get LED's that are tens of times brighter than current edge-lit LED's? That I would have to question.

I think changing from a direct back-lit (with proper optics) scanning matrix to a full panel edge lit strobe would be simpler, but also not produce nearly as good of a result. I don't think finding an acceptable direct light diffusion layer optic would be as troublesome as you think.

Another thing to consider is whether the technology will be limited to TN panels. This will shrink the market considerably, as TN panels have inferior image quality to any other panel type, are generally small screens, and are limited to a rather measly resolution of 1080p. One of the reasons I gave up my FW900 CRT is that the 22.5" image and 1920x1200 resolution is just too small for me.

I don't have time to do much research at work, but what panel types are in those 960 Hz scanning-backlight TV sets (not TN), and what type of backlight LED's do they use? Direct or edge? Usually those companies have done all of the R&D on something like this; it's just up to smart guys like you to adapt it to computer monitors, which those same large companies feel there is no market for.
 
Simply putting tons of these bright LED's behind that same diffusion layer may not produce the results you are after, like you said, and the local strobe would migrate to much of the rest of the panel's surface area.
I've been able to remove these layers, use a different diffuser, and test other diffusers that look like wax paper. However, I'm not reaching the 10x I need. I'll end up needing to do the following:
1. Backlight focussing optics (e.g. parabolic reflectors, lenses, etc.), which is an awful lot of extra work -- 900 lenses or parabolas. There are ideas I'm going to attempt, including special diffuser sheets optimized for behind-the-panel backlights, which are used in some home theater HDTV's. This will increase brightness by about 2x.
2. Surge power during strobes. LED's can be overdriven for short periods. This can increase the brightness of each strobe by about 3x, compared to the steady-light state.
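As a rough sanity check on item 2 (the 1/960 sec strobe length and 3x overdrive factor are the targets discussed in this thread, not measurements): a 1/960 sec flash repeated 120 times a second is only a 12.5% duty cycle, so even a 3x surge keeps the time-averaged LED load well under continuous operation, along these lines:

```cpp
// Back-of-the-envelope: time-averaged LED load when overdriving during short strobes.
#include <iostream>

int main() {
    const double refresh_hz      = 120.0;        // strobes per second
    const double strobe_s        = 1.0 / 960.0;  // target flash length
    const double overdrive_ratio = 3.0;          // surge drive vs. steady-light state

    const double duty_cycle = refresh_hz * strobe_s;          // fraction of time the LEDs are lit
    const double avg_load   = duty_cycle * overdrive_ratio;   // relative to continuous operation

    std::cout << "Duty cycle:         " << duty_cycle * 100.0 << " %\n";                 // 12.5 %
    std::cout << "Time-averaged load: " << avg_load * 100.0   << " % of steady-state\n"; // 37.5 %
    return 0;
}
```

The driver and wiring still have to handle the 3x peak current, and each individual pulse must stay within the LED's pulsed-current spec, but the averaged dissipation stays comfortably below continuous ratings.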

These are all extra considerations, which I now need to weigh against the edgelight-strobing alternative. I will be proceeding with one of these approaches; the question is which one actually represents the best chance of success.
So if the goal is not to change the optical diffusion layer, you would simply have to replace the edge-lit LED's with far, far brighter ones, like you said. Now isn't that where the problem lies? If you went with a direct-view LED backlight, you could use many more LED's than edge-lit allows, which would significantly help the luminosity of the screen. Is it technically possible to get LED's that are tens of times brighter than current edge-lit LED's? That I would have to question.
It's certainly possible to do.
What this simply means is that I might end up with a gigantic fan-cooled shoebox-sized box along the bottom or top edge of my computer monitor -- in order to push 10 times the amount of light through the edge of an existing LCD that has had its old edgelight removed.

LED's are already used to light up street lamps and stadiums, and are also used in some digital projectors. There are even 100 watt LED's on eBay. You need to think three-dimensionally. The edge lights are designed to fit in a tiny thin strip. Have you ever seen an edgelight LED strip? It's often a tiny, puny strip, often just 3mm x 5mm x 500mm. (Example Photos). I don't have to stay within the confines of a tiny strip. My bulk budget isn't the same as a manufacturer that wants to make pretty, compact LCD monitors; I don't mind being as bulky as a 2006-era CCFL LCD monitor (but with its bulk pushed to the bottom edge, literally!)

Presently, I don't mind requiring a big 300 watt fan-cooled box for the first prototype, sitting at the bottom of my computer monitor, as long as I get less motion blur than CRT while still being less than 1/10 the weight of a CRT (given an unlimited budget, quite easy to achieve, as the panel technology is already here; the backlight technology isn't) -- if it's easier to do it this way than to squeeze enough LED ribbons efficiently behind an LCD panel as a backlight. By being more flexible with aesthetics than the manufacturers, I can do something that the manufacturers are not willing to do -- a bulbous box along the top or bottom edge of an LCD monitor (for the high-power edgelight).

Theoretically, something more Rube Goldberg: an arc lamp with a spinning black wheel containing a white slit (for 1/1000 sec strobes) -- a modification of an existing DLP projector color wheel -- focusing all that projector/arc-lamp light into a thin strip via a prism lens, and sending it through an edge backlight entry at the bottom of an LCD. There are many, many ways to solve this problem; some of these solutions are bulkier than a CRT, so they're not practical, and some are extremely Rube Goldberg.

Ideally, I'd still like to be much, much more compact than a CRT. One method is strobe LED's (e.g. one hundred cameraphone flash LED's crammed into a tiny strip), or some other approach, on a 1cm-wide aluminum strip heatsink cooled by a small fan -- as long as I can get rid of the heat safely. Or just a linear row of expensive Luxeon/CREE 5-to-10-watt-league LED's focussed through a small glass prism (to concentrate the light). With appropriate surge current (>10W during strobes), this could cram sufficient wattage through an edge (10x the light of an existing backlight) using only a reasonably compact bar at the top or bottom edge of a monitor: approximately 5cm x 5cm along the whole width of the monitor (most of which might be hideable behind the monitor, if using angled optics), and only a couple of pounds -- bulky for an LCD monitor, but far more compact than a CRT. An ugly box, but it might be a much simpler DIY route to the motion resolution I'm aiming for: an equivalent MPRT (Moving Picture Response Time) of 0.5 millisecond -- at least 95% motion blur elimination -- practically CRT-equalling or beating.

The bottom line: which is easier?
1. Using a turbocharged backlight. (Disadvantage: lots of work on backlight optics will be needed)
2. Using a turbocharged edgelight. (Disadvantage: full strobes only; may lead to an ugly bulbous box along the top or bottom edge, but may ultimately be easier)
I think changing from a direct back-lit (with proper optics) scanning matrix to a full panel edge lit strobe would be simpler, but also not produce nearly as good of a result. I don't think finding an acceptable direct light diffusion layer optic would be as troublesome as you think.
Existing home theater tests on local-dimming panels (zone backlights -- a good case study) show only about a 15:1 contrast ratio attributable to the backlight zones themselves. In the same checkerboard contrast test, instead of million-to-one contrast ratios, backlight diffusion limits the measurement to about 15000:1 (on-backlight zones behind white LCD pixels versus off-backlight zones behind black LCD pixels), while the LCD panel is only 1000:1 natively. Comparing on-backlight versus off-backlight behind the same white LCD pixels -- measured simultaneously, to capture backlight diffusion effects -- 15000 divided by 1000 is only 15. So backlight diffusion means I only get a 15:1 contrast ratio between lit and unlit segments. Tests at home appear to confirm this. Ouch.
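Expressed as a formula, using the same two measurements quoted above:

```latex
\text{backlight on/off contrast} \;\approx\;
  \frac{\text{measured checkerboard contrast}}{\text{native panel contrast}}
  \;=\; \frac{15000}{1000} \;=\; 15
```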

Another thing to consider is whether the technology will be limited to TN panels. This will shrink the market considerably, as TN panels have inferior image quality to any other panel type, are generally small screens, and are limited to a rather measly resolution of 1080p. One of the reasons I gave up my FW900 CRT is that the 22.5" image and 1920x1200 resolution is just too small for me.
No, it won't be limited to TN panels. It's just that the display panel makers haven't yet started manufacturing computer-monitor-sized IPS panels with a fast-refresh pattern resembling that of 3D TN 120Hz monitors, which are so well-suited to strobed backlights. All the current IPS panels designed for monitors use a traditional sequential row-by-row refresh at slow speed, rather than refreshing the whole panel rapidly and then pausing (so that pixel persistence can be cleared before the end of the refresh cycle -- a necessary ingredient for 3D).

I don't have time to do much research at work, but what panel types are in those 960 Hz scanning-backlight TV sets (not TN), and what type of backlight LED's do they use? Direct or edge?
It depends. Some strobed backlights use edge backlights, but most of the premium displays use behind-the-LCD scanning backlights. Many use IPS panels -- 3D shutter-glasses-compatible panels exist in HDTV sizes in IPS format -- so there's hope for the future of LCD manufacturing for monitors. Time will tell, but today (as of 2012), a zero-motion-blur LCD computer monitor can only be achieved with a TN panel and appropriately ultrashort, ultrabright strobes (extremely expensive to achieve), which manufacturers aren't currently doing.

Usually those companies have done all of the R&D on something like this; it's just up to smart guys like you to adapt it to computer monitors, which those same large companies feel there is no market for.
Yep, they have done lots of R&D. Some of them have done the zero-motion-blur LCD in the laboratory before, I'm pretty sure. But never in public, because of the strong wattages required to equal the brightness of CRT phosphor during short flickers, which are needed for the equivalent motion resolution... Nobody will pay $50,000 for an LCD monitor that simply has less motion blur than a CRT. The panel technology (at least for 3D TN) is already here to make that possible, but cheap backlight technology isn't yet.
 
BTW, I'm now collecting dead LCD panels (preferably non-cracked) for experiments/tests that include developing methods of cutting the top/bottom metal cover edge to permit external edgelight mechanisms bulkier than the panel's existing edgelight. I need to do these tests before I risk damaging an expensive 120Hz panel.

Send me a PM if you have any dead monitors/panels to donate/contribute. Preferably those that simply have a dead edgelight, so the panel will work once I replace its edgelight with my own. Also, I'm still collecting backlight panels (e.g. old CCFL backlit panels), since I'm testing different diffusers that are optimized for backlights versus edgelights.

Send me a PM if you have panels to donate (dead or alive). Toronto, Canada area.
 
(more or less talking to myself post)

I played around with my Arduino a bit to create a PWM signal, etc... Seems like it would/will work great.

What I really wanted to try was simply connecting a display's built-in lights to the circuit to see what the light output would be like. I've measured my light output to be 330 cd at max brightness. From the other thread (I'll post this stuff here from now on since it's more appropriate; I missed it last time), we said we could probably do a linear interpolation between PWM'ed perceived brightness and max brightness. I usually calibrate to 120 cd, which in my case is sufficiently bright even during daylight (closed windows).

So if I aim for 120 cd at 120 Hz, the backlight would need to be on ~3.03 ms per 8.33 ms cycle. At this point I'm still debating whether it's worth a try (whether 3 ms strobes will make enough of a difference for me to use it daily).

What I do see now is that the higher the refresh rate, the less "powerful" your LEDs would have to be for a chosen perceived brightness.

So I guess it would make sense to build prototypes with crazy-bright LEDs, so it's easier to tune in a usable brightness range for all the user scenarios.

I guess it all comes down to the user, and what brightness he needs. I ran mine at 90 cd for most of Saturday, and it was "ok", but a bit on the dim side. Strobing 2.25 ms shots @ 120 Hz would (I'm guesstimating) yield ~90 cd.
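For what it's worth, here's that linear-interpolation model written out as a quick calculation (the 330 cd/m² max brightness is the measurement quoted above; everything else is straightforward arithmetic):

```cpp
// Linear strobe-brightness model: perceived ~= max_brightness * (on_time / refresh_period).
#include <iostream>

int main() {
    const double max_cd    = 330.0;           // full-on brightness measured above (cd/m^2)
    const double period_ms = 1000.0 / 120.0;  // 120 Hz refresh -> ~8.33 ms per cycle

    // On-time needed to hit a 120 cd/m^2 calibration target:
    const double on_ms = (120.0 / max_cd) * period_ms;   // ~3.03 ms

    // Brightness expected from 2.25 ms strobes:
    const double cd_out = max_cd * (2.25 / period_ms);   // ~89 cd/m^2

    std::cout << "On-time for 120 cd/m^2: " << on_ms  << " ms\n";
    std::cout << "2.25 ms strobes yield:  " << cd_out << " cd/m^2\n";
    return 0;
}
```

It reproduces both numbers -- ~3.03 ms for a 120 cd target and ~89 cd from 2.25 ms strobes -- so the guesstimate is consistent with the model.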

Edit: Another risky option would be to overvolt the LEDs. Because it would be for "short-ish" periods of time, it might be possible to gain quite a bit of brightness with no serious side effects (not sure on this, I'll have to read up).
 
(This thread was created in 2012 -- long before I discovered that LightBoost strobe backlights were already successfully doing this in out-of-the-box monitors.)

I have my Arduino project on standby -- I've been focussing on several sub-projects first (motion test software and input lag meter), and I'm now deciding I may go down a different route because of simplicity:

Are you doing IPS? If IPS, you'll need to stick with the scanning backlight approach (since IPS pixel persistence can't easily fit inside a vertical blanking interval yet). But if you're doing TN, the strobe backlight is a much superior and potentially simpler way of eliminating motion blur... but it has the tricky requirement of squeezing the pixel persistence into the interval between LCD refreshes... (as seen in my high speed video)

-- Instead of a scanning backlight, which I now feel has critical fundamental limitations with contrast ratio and motion blur elimination (due to diffusion between on segments and off segments), I'd like to make it a strobe backlight. Optics for a backlight are extremely challenging, and an edge light is much easier and more efficient. I can buy an existing edge-lit LCD, rip out its old edgelight, bend open the metal cover where the edgelight used to be, and put a thick, turbocharged edgelight there (high-brightness, high-CRI LED's) so I can run at 0.5ms strobes while still having over 100cd of average brightness. Optics are much easier since I'd reuse the existing optics in the existing LCD. The disadvantage of a vastly brighter edgelight is bulk, so the monitor will have to be debezelled to accommodate the bulky bar at the top/bottom edge of the LCD.

-- The response time acceleration abilities of the LCD will be key. Artifacts caused by leftover pixel persistence leaking between refreshes would result in the "sharp trailing ghost" effect. You might want an LCD that allows you to tweak all the various RTC parameters where possible, or even just turn RTC off and see what happens -- there will still be remnant artifacts (e.g. 'faint-but-razor-sharp' ghost afterimages) from any pixel persistence that isn't fully finished.

-- If you use a real-time-scan mode (e.g. BENQ instant mode) -- the type of LCD that has less than one frame of lag by immediately scanning out the pixels -- it will benefit you to lengthen the vertical blanking interval (via nVidia Custom Resolution) to give more time for the pixel persistence to finish before strobing. For a 3ms strobe, you need a vertical blanking interval that is 3ms long -- which is long and extremely difficult to do. Many panels framebuffer the refresh before displaying it, so lengthening the blanking interval via custom timings won't have an effect, which makes this job harder. It's impossible on some LCD monitors because you can't override their buffering.
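To put numbers on that: the blanking-interval length follows directly from the custom timings, as in this sketch (the 1350-line vertical total is a made-up example, not a real monitor limit):

```cpp
// Blanking-interval length implied by custom video timings:
//   VBI = frame_period * (vertical_total - vertical_active) / vertical_total
#include <iostream>

int main() {
    const double refresh_hz      = 120.0;
    const double vertical_active = 1080.0;  // visible lines
    const double vertical_total  = 1350.0;  // hypothetical custom vertical total

    const double frame_ms = 1000.0 / refresh_hz;  // ~8.33 ms
    const double vbi_ms   = frame_ms * (vertical_total - vertical_active) / vertical_total;

    std::cout << "Frame period:      " << frame_ms << " ms\n";
    std::cout << "Blanking interval: " << vbi_ms   << " ms\n";  // ~1.67 ms
    return 0;
}
```

Under that assumption you only get about 1.7 ms of blanking; reaching the 3 ms mentioned above at 120 Hz would need a vertical total of roughly 1690 lines (with a correspondingly higher pixel clock), which is why it's described as extremely difficult.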

-- LCD's absorb a lot of light; more than 90%. You could need hundreds of watts in the strobes. I actually underestimated this, and need more than 250 watts of LED for a 24" monitor to achieve my goals, unless I do it through the edgelight method instead of the backlight method. LightBoost actually does a highly efficient job (but not with enough wattage to achieve the brightness goals I wanted).

-- IMHO, this project will be easier and have more "bang for the buck" with TN panels:
......Use the edgelight approach (reuse LCD's existing edgelight optics, replace edgelight with a bulky turbocharged edgelight)
......Use panels that can fit pixel persistence inside the time period of a vertical blanking interval. (easier to strobe without artifacts). Active stereoscopic 3D panels made this possible.
......This project is much easier if you use fast 1ms or 2ms panels.

-- If we were doing this with IPS, it would be extremely hard to squeeze IPS pixel persistence into a blanking interval unless you dramatically lower the refresh rate and artificially lengthen the blanking interval. A strobe backlight with an IPS panel will lead to lots of crosstalk between refreshes, even if there's a good deal of motion blur reduction. The increased intensity of double-image artifacts may kill the motion quality. A scanning backlight would, in this specific case, give better motion quality than a strobe backlight, but impose an upper limit on motion blur elimination due to the backlight diffusion effect.

-- A high speed camera is a big help in this research & development. I highly recommend the Casio EX FC-2000S (or a similar Casio model) because it costs only $300 from Japan on eBay, and it has a 480fps mode that's full-frame, allowing 8 captures per 60Hz refresh and 4 captures per 120Hz refresh. It also has a 1000fps mode (but at a low 224x64 resolution, using only a tiny sliver of its sensor). It allows you to watch the top-to-bottom "wiping" pattern of an LCD refresh, in order to determine how to time your strobe backlight or scanning backlight, to observe backlight diffusion patterns (if doing a scanning backlight), and to determine how much you can lengthen your vertical blanking intervals (either through the panel or via custom timings in the signal).

That said, even experimenting on just a 60 Hz LCD panel will yield some very interesting results, demonstrating the physics of motion blur reduction (albeit with lots of double-image artifacts, due to remnant pixel persistence).
 