Has anyone replaced LEDs or strips

auntjemima

in an LED TV before?


A few days ago someone had an LG 55LN5750 out at the road, for free. I like to tinker, and I want a new computer monitor (lol, right), so I took it home.

I have determined, through YouTube videos and other forums, that a few backlight LEDs are shorted and causing the driver to trip a breaker. I get about half a second of picture before it goes dark, but I can still see the picture if I use a flashlight.

I have tested each LED with a multimeter in the diode setting and have come across three that do not light. I cannot tell whether they are open or shorted, however, since I can only test a single LED at a time. If I were at work, I would be able to use our Sorensen power supply to test the whole strip.

I was just curious if anyone here has had luck replacing individual LEDs, or should I just replace the strips? My plan, since they already don't work, is to repair two of the three bad strips with LEDs from the third and just purchase one new strip. I figure that even if I end up fucking it all up, they didn't work before anyway.
 
I've been working with monitor LED strips attempting to modify them to act as a scanning backlight:

xBn9omy.jpg


That strip has four 11 LED segments that were controlled by a TI TPS61199 before I took the monitor apart.

How are the strips you have configured? It may be different for a TV than for an edge-lit monitor. It seems like most LED driver ICs will just shut down the shorted (or open) pin while leaving the other segments alone. If the whole strip lights up and then goes out, you may have a different problem.
 
I've been working with monitor LED strips attempting to modify them to act as a scanning backlight:

xBn9omy.jpg


That strip has four 11 LED segments that were controlled by a TI TPS61199 before I took the monitor apart.

How are the strips you have configured? It may be different for a TV than for an edge-lit monitor. It seems like most LED driver ICs will just shut down the shorted (or open) pin while leaving the other segments alone. If the whole strip lights up and then goes out, you may have a different problem.

That's pretty awesome!

The LEDs are driven by two separate rectifying circuits, one for the top half and one for the bottom. The LEDs in this set are run in series, so an open LED drops all the rest on that half of the screen, and a shorted LED will usually cause a breaker on the LED driver board to blow until restart. Reading a bit has me believing that one or two gone should still allow the TV to work, if they are shorted, but more than that and it's a no-go. I was able to get into work today and verified that my issue is two shorts and one open, with the open likely being the cause of the failure.
 
I've been working with monitor LED strips attempting to modify them to act as a scanning backlight:

xBn9omy.jpg


That strip has four 11 LED segments that were controlled by a TI TPS61199 before I took the monitor apart.
Very nice!

Blur Busters originally got its start due to scanning backlight research, and I helped Zis (with my pages on earlier research) with the Zisworks scanning backlight. While I can't link the open source research due to [H] rules (others unaffiliated with Blur Busters are allowed to, though), you're probably already familiar with the prior Blur Busters research by now if you're creating a homebrew scanning backlight!

You'll want multiple adjustments (a rough parameter sketch follows this list):
-- Scanning begin phase adjust (timing of the first segment flash at the top, relative to the VSYNC pulse)
-- Scanning flash length adjust (how long each segment is pulsed)
-- Scanning velocity adjust (how fast the scanning backlight is swept from top to bottom)
-- Scanning versus full-strobe mode
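
To make those adjustments concrete, here's a minimal sketch (my own illustration, not Zisworks firmware) of how they might be held as runtime parameters on an Arduino-class microcontroller; all names and default values are hypothetical:

Code:
// Hypothetical runtime-adjustable scanning backlight parameters (illustrative values only).
struct BacklightConfig {
  uint32_t phaseDelayUs;     // delay from VSYNC to the first (top) segment flash
  uint32_t flashLengthUs;    // how long each segment stays lit (brightness vs. blur)
  uint32_t segmentSpacingUs; // time between successive segment flashes (scan velocity)
  bool     globalStrobe;     // true = flash all segments at once, false = sequential scan
};

BacklightConfig cfg = { 4000, 1000, 2000, false };

Setting globalStrobe (or forcing segmentSpacingUs to zero) collapses the sequential scan into a full-strobe flash, which is the fourth adjustment in the list.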

Scanning phase adjustment
Usually you want the start phase to be roughly 180 degrees from the GtG midpoint of the LCD scanout. This is where you get the point of minimum strobe crosstalk with a scanning backlight. It won't be exactly 180 degrees of phase relative to the LCD scan pulse, since the GtG (pixel response) is "lagged" relative to that. So you'd probably want to eyeball the adjustment while viewing the "TestUFO crosstalk" test, adjusting until the minimum strobe crosstalk occurs. A common temptation in homebrew builds is to try to math-calculate the strobe phase relative to VSYNC, when it's actually best to eyeball it with a realtime phase adjustment while watching horizontally panning motion. You can see the GtG double-image zones (or in your case, quadruple-image/octuple-image zones) shift in and out of visibility as you adjust the phase. LCD GtG often behaves in totally unexpected ways, e.g. most of the GtG occurs at the beginning of the transition, but certain color pairs might be extremely slow/bad (e.g. VA panel dark greys), which might force a little biasing in your strobe phase to reduce the crosstalk visibility of those very-bad colors.

Scanning flash length adjustment
This is a brightness-versus-blur tradeoff: the longer the flash length, the brighter the image, but the more motion blur. Also, you may or may not want the flash of the next scanning backlight segment to overlap the previous flash, depending on the multi-image strobe crosstalk effect caused by backlight diffusion (light from ON segments diffusing into the OFF segments). The strobe crosstalk looks different (sometimes worse, sometimes better). With short flashes, a segment turns off long before the next segment illuminates; with longer flashes, segment illumination overlaps. Both are acceptable and have pros/cons from a strobe-crosstalk appearance perspective, so an adjustment is favourable.
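
As a rough illustration of the tradeoff (my own example numbers, not measurements): perceived brightness scales with the duty cycle, while the persistence blur of a strobed segment is roughly the flash length multiplied by the on-screen motion speed:

Code:
// Illustrative brightness/blur arithmetic for one segment (example values only).
const float refreshPeriodMs = 1000.0f / 120.0f;   // 120Hz refresh -> ~8.33ms per cycle
const float flashLengthMs   = 1.0f;               // 1ms segment flash
const float motionSpeedPps  = 960.0f;             // horizontal pan speed, pixels per second

const float dutyCycle = flashLengthMs / refreshPeriodMs;            // ~12% of always-on brightness
const float blurPx    = motionSpeedPps * (flashLengthMs / 1000.0f); // ~1 pixel of motion blur

Halving flashLengthMs halves both numbers, which is exactly the brightness-versus-blur tradeoff described above.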

Scanning velocity adjustment
Logically, you may want to scan synchronously with the LCD scanout. However, visually, this isn't always ideal all the time, depending on LCD GtG factors + backlight diffusion factors. From a strobe crosstalk perspective, if you use large blanking intervals (long pauses between fast LCD scanouts), global flash backlights have a big advantage over scanning backlights -- you don't have to worry about backlight diffusion between OFF segments and ON segments -- which creates amplified strobe crosstalk artifacts. However, if you do a tradeoff between a full-strobe (infinite scan velocity) and a sequential scan (scan synchronously with LCD scanout), you can actually balance the pros/cons of global-strobe strobe crosstalk versus the pros/cons of sequential-scanning-backlight strobe crosstalk (the crosstalk artifacts look extremely different).

Scanning versus strobe mode
Without testing it out, it is hard to say whether the strobe crosstalk of a global strobe is better or worse than the strobe crosstalk of a sequential scan. But as a general rule of thumb, if you can achieve better than a 100:1 contrast ratio between the OFF and ON segments [usually not possible with an edgelight, without a FALD array], then a synchronous sequential scanning backlight is usually easier. Thus, it's good for a homebrew strobe backlight to have both a scanning mode and a strobe mode, since the crosstalk appearance may be preferable in one mode or the other. Also, if the panel does not support large vertical totals (accelerated scanout + a long pause between scanouts), then the strobe crosstalk artifacts are more likely to be preferable in scanning mode than in strobe mode.

Strobe crosstalk appearance of global strobe:
Usually very clear in one part of the screen, with very ugly double-image effects in another part. It helps to get better overdrive, speed up pixel response, squeeze pixel response into the VBI, and use a Custom Resolution Utility to create a large VBI (e.g. a Vertical Total of VT1500 or VT2000 is extremely favourable for 1080p). For example, the Zisworks display supports over VT2200 for 1080p 120Hz, using a 240Hz scanout velocity with 120 refresh cycles per second, which works out to roughly 4.1ms of scanout and 4.1ms of VBI. That makes it easy to squeeze the 1ms TN GtG into the VBI, letting the pixels settle between refresh cycles, which can greatly reduce strobe crosstalk for a global-flash strobe backlight.
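
Those ~4.1ms figures fall straight out of the timing arithmetic (a back-of-envelope sketch, assuming the quoted VT2200 / 240Hz-scanout-velocity numbers):

Code:
// Back-of-envelope timing for 1080p @ 120Hz refresh with an accelerated (240Hz-velocity) scanout.
const float refreshHz     = 120.0f;
const float scanoutHz     = 240.0f;                    // visible lines scanned at 240Hz speed
const float framePeriodMs = 1000.0f / refreshHz;       // ~8.33ms per refresh cycle
const float scanoutMs     = 1000.0f / scanoutHz;       // ~4.17ms spent scanning visible lines
const float vbiMs         = framePeriodMs - scanoutMs; // ~4.17ms of blanking left over

With roughly 4ms of blanking, a ~1ms TN GtG transition finishes comfortably inside the VBI before the global flash fires.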

Flicker at lower Hz:
Also, at 60Hz (low strobe rates), scanning backlights "appear to flicker less" because the human eye is seeing at least some light continuously (just like a CRT), so a 60Hz scanning backlight may produce a less painful flicker than a 60Hz global-flash strobe backlight (even if the latter has fewer strobe crosstalk artifacts).

Strobe crosstalk appearance of scanning:
Consistent crosstalk from top to bottom. Usually worse than the clearest zone of a strobe backlight, but better than the worst zone of a strobe backlight. You will usually get multi-image effects from the internal diffusion of a scanning backlight, e.g. the ON segments leaking into the OFF segments; this creates 4-image or 8-image effects with a 4-zone scanning edgelight. It takes a contrast ratio of well over 100:1 between the ON and OFF segments to really make the strobe crosstalk close to invisible. I generally only see numbers like that barely being approached in FALD implementations of scanning backlights, rather than edgelight implementations. But when well done (with a slight scanning-velocity speedup relative to the LCD scanout to "compress"/"unify" the multi-image crosstalk artifacts a bit), it may actually be very preferable to the overall artifacts of a global-flash strobe backlight.

Fixing erratic flicker in homebrew scanning backlight:
Erratic flicker was observed to be a common issue with complex homebrew scanning backlight logic. You need precision control over the strobe flash length: strobe phase can safely jitter a bit (e.g. +/- 0.1ms), but strobe flash length should not vary by more than ~0.1%. If the strobe flash length varies by 1%, you're getting 1% modulations in brightness, which means that for a 1ms strobe backlight, a 10 microsecond variance in flash length creates a 1% brightness flicker! Even 0.25%-0.5% flicker is sometimes still noticed: open a full-white window (Windows Notepad -> Maximize) and then defocus into it; that amplifies your sensitivity to the erratic flicker of a microcontroller-driven backlight. It's surprising how important microseconds are in preventing flicker, so try to get your strobe length variance below 1 microsecond. That means turning off interrupts and the like when using microcontroller logic to time the strobe flash length.
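
For example, on an AVR-based Arduino, a hedged sketch of jitter-free flash timing might look like the following (the pin number and flash length are placeholders; delayMicroseconds() busy-waits, so disabling interrupts around it keeps the pulse width from being stretched by an ISR):

Code:
const int      STROBE_PIN      = 9;     // placeholder: output pin driving the LED enable line
const uint32_t FLASH_LENGTH_US = 1000;  // example: 1ms flash

void flashSegment() {
  noInterrupts();                       // prevent timer/serial ISRs from stretching the pulse
  digitalWrite(STROBE_PIN, HIGH);
  delayMicroseconds(FLASH_LENGTH_US);   // busy-wait for a repeatable pulse width
  digitalWrite(STROBE_PIN, LOW);
  interrupts();                         // re-enable ISRs between flashes
}

Per the Arduino reference, delayMicroseconds() is only accurate up to about 16383 microseconds, which is still far longer than any sensible flash length.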

Commanding of adjustability
The Zisworks open-source scanning backlight controller (which recently became available) uses serial commands for some of the above adjustments, such as strobe phase. One solution the Zisworks display used for commanding adjustments (strobe phase) was to disable serial interrupts except during a very specific tight window every blanking interval, and to read the serial buffer only during those times. Most microcontrollers have a small hardware serial buffer, so keep commands very short if you're writing your own custom backlight controller firmware.
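
A minimal sketch of that idea (not the actual Zisworks firmware; the command format and window length are made up for illustration): poll the serial buffer only during a short window right after VSYNC, before the first segment needs to fire, and keep each command to a few bytes:

Code:
// Illustrative only: handle short 3-byte commands during a small idle window after VSYNC.
const uint32_t COMMAND_WINDOW_US = 500;   // placeholder idle time right after VSYNC

void serviceCommands() {
  const uint32_t start = micros();
  while (micros() - start < COMMAND_WINDOW_US) {
    if (Serial.available() >= 3) {        // e.g. 'P' + 16-bit little-endian value
      const char    cmd = Serial.read();
      const uint8_t lo  = Serial.read();
      const uint8_t hi  = Serial.read();
      const uint16_t v  = lo | (hi << 8);
      if      (cmd == 'P') cfg.phaseDelayUs  = v;   // cfg from the earlier parameter sketch
      else if (cmd == 'L') cfg.flashLengthUs = v;
    }
  }
}

Three-byte commands fit comfortably in the small serial receive buffer even if a couple of them arrive back to back.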

Input lag with computer gaming:

Input lag is lower with a scanning backlight than with a full-strobe backlight; this is applicable if you play videogames. Also, lag uniformity with VSYNC OFF is better with a scanning backlight (e.g. cable scanout versus strobe flash), while lag uniformity with VSYNC ON is better with a full-strobe backlight. This is very subtle and only of interest to people who care about the lag uniformity of the screen (e.g. lag differentials between the top edge and bottom edge, relative to GPU scanout).

Heck, maybe get Zisworks' scanning backlight controller (4-segment compatible) and use that, since you'd get most of the above adjustments, without all the hard trial-and-error work. The scanning backlight open source code is available for Arduino 1.8.2 IDE that way. Or even repurpose the Zisworks source code for your controller, in exchange for contributing your source code improvements back to Zisworks display users.

Regardless, the TestUFO Strobe Crosstalk test (or a similar test) will probably be your best friend in calibrating your scanning backlight.

If you do choose to write your own microcontroller code though, hopefully you will publish the source code to your scanning backlight for others to tweak...

I'm very interested to see the results of your homebrew scanning backlight!
 
Thanks for the very detailed response! I've been a member of the BlurBusters forum for quite some time, which is where the interest in this project originated.

As you've mentioned, one of the biggest issues with segmented backlights is actually how well the LED diffuser works; the ON segments bleed throughout most of the panel.

Shee7lP.png


I resorted to cutting the diffuser panel into four segments (picture was from before the cutting). This was messy and far from ideal (it would be best to order custom segments), but it does keep the light from each segment mostly contained.

My biggest issue now is reliably reading the VSync signal with the Arduino. The code for the LED pulses isn't very complicated. The first image I posted had the strip refreshing globally at 120 Hz with an LED pulse time of only 72 μs, which was very dim! Both of those are adjustable, and it's easy enough to add an offset to allow for LCD transition times.
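
For reference, that dimness lines up with simple duty-cycle arithmetic (back-of-envelope, ignoring LED drive current):

Code:
// Why a 72 microsecond pulse at 120Hz looks dim: duty cycle relative to an always-on backlight.
const float refreshPeriodUs = 1000000.0f / 120.0f;       // ~8333 microseconds per refresh
const float pulseUs         = 72.0f;
const float dutyCycle       = pulseUs / refreshPeriodUs; // ~0.9% of full brightness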

If I make more progress, I'll keep this updated.
 
I resorted to cutting the diffuser panel into four segments (picture was from before the cutting). This was messy and far from ideal (it would be best to order custom segments), but it does keep the light from each segment mostly contained.
Clever solution! Hopefully the "seams" aren't too noticeable in non-strobed mode, but if you've cut them carefully, the seams should be sufficiently diffused together to not really be noticeable -- probably less noticeable than the wire on old Sony Trinitron monitors!

My biggest issue now is reliably reading the VSync signal with the Arduino. The code for the LED pulses isn't very complicated. The first image I posted had the strip refreshing globally at 120 Hz with an LED pulse time of only 72 μs, which was very dim! Both of those are adjustable, and it's easy enough to add an offset to allow for LCD transition times.
There are a couple of approaches for homebrew:

(1) Hardware VSYNC, which is preferred.

Ideally, it is preferable to solder a wire to some kind of motherboard joint and run it directly into a digital GPIO pin of your backlight microcontroller (optoisolated, if needed). With your equipment (logic analyzer or oscilloscope), you can probe the monitor motherboard for a signal that corresponds to the VSYNC interval. Anything with a pulse frequency matching the refresh rate means you've found something to piggyback on! There are many such solder joints available on a monitor motherboard; Zis might even be able to help you find one (though you might want to buy his backlight microcontroller too). One bonus of this approach is that you may be able to support variable-refresh-rate strobing. You'll need to specially shape your strobes to try to maintain constant brightness (photons per second) during erratic refresh rates. That can be tricky, but it works if you keep a large vertical total margin (e.g. if you are modifying a 144Hz monitor, limit your VRR range to 75Hz-120Hz). The bonus is that you can then play videogames capped at 120fps and permit the framerate to fall slightly (120-110-100fps) without seeing any stutters. Strobing amplifies the visibility of stutter (which is why strobing fans try to run at a perfectly constant frame rate), so VRR strobing can help with near-fixed-framerate gameplay that has many framerate dips.
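
Once you have that signal on a GPIO pin, the microcontroller side is simple. A hedged Arduino-style sketch (pin number and edge polarity are placeholders, and any optoisolation/level shifting happens in hardware before this pin):

Code:
const int VSYNC_PIN = 2;              // placeholder: an external-interrupt-capable pin
volatile uint32_t lastVsyncUs = 0;    // timestamp of the most recent VSYNC edge

void onVsync() {
  lastVsyncUs = micros();             // phase delay and segment scan are timed from this
}

void setup() {
  pinMode(VSYNC_PIN, INPUT);
  attachInterrupt(digitalPinToInterrupt(VSYNC_PIN), onVsync, RISING);  // polarity depends on the signal you tapped
}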

(2) There are now proof-of-concept algorithms that allow software signalled VSYNC to be extrapolated to microsecond-accurate VSYNC

Are you using a newer ARM-based 32-bit Arduino or a Teensy microcontroller? If you are, then there's another option: let the computer signal the microcontroller whenever VSYNC occurs, and extrapolate from that. The PC-side signal can be very approximate (within +/- 1-2ms of jitter 95% of the time, with a few missed signals being OK). Using D3D9's RasterStatus.InVBlank from a realtime-priority process, and then transmitting a low-latency USB signal (a Teensy 3.x controller can do <0.2ms USB jitter in 8kHz poll mode, plugged directly into a USB3 motherboard port, bypassing hubs), the microcontroller can extrapolate a very accurate VBlank from the erratic VBlank signal, using time-smoothing and snap-to-grid (best-fit of the erratic VSYNC signal) logic. You can reach microsecond accuracy after about 30 seconds of erratic USB-based VSYNC signalling. The Javascript source code on vsynctester.com actually becomes more and more accurate at predicting VSYNC over time (~30-60 seconds), filtering the noisy signal to compute an ultra-precise refresh rate.
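
A sketch of the microcontroller-side filtering (my own simplification of the snap-to-grid idea, not vsynctester.com's actual algorithm): keep a slowly-corrected estimate of the refresh period and snap each noisy USB timestamp to the nearest whole number of periods:

Code:
// Simplified snap-to-grid VSYNC extrapolation (illustrative, not production code).
float    periodUs   = 8333.3f;   // initial guess for 120Hz; converges over ~30 seconds
uint32_t lastEdgeUs = 0;         // extrapolated timestamp of the last "true" VSYNC

void onNoisyVsyncMessage(uint32_t hostTimestampUs) {
  if (lastEdgeUs == 0) { lastEdgeUs = hostTimestampUs; return; }
  const float cycles  = (hostTimestampUs - lastEdgeUs) / periodUs;
  const long  n       = (long)(cycles + 0.5f);      // nearest whole number of refresh periods
  if (n < 1) return;                                // duplicate or out-of-order message
  const float errorUs = (cycles - n) * periodUs;    // residual jitter of this message
  periodUs   += (errorUs / n) * 0.01f;              // tiny correction filters out USB jitter over time
  lastEdgeUs += (uint32_t)(n * periodUs);           // snap forward onto the refresh grid
}

The actual strobe timing then runs from lastEdgeUs plus whole multiples of periodUs, so occasional late or missed USB messages don't disturb the flash cadence.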
 