Digital Viper-X-
> Could they hit 72Hz? Those are IPS panels, right?

No idea on 72 Hz. And yes, they're IPS, 6-bit + FRC.
P.S. Can a Catleap owner tell me how easy it is to disassemble a Catleap monitor? Is the backlight easy or hard to separate from the LCD glass? (Just for my future knowledge.)
> Disassembly is a pain, but doable. Be sure to take photos during the process so you know how to reassemble it later.

Thanks for your informative instructions. I haven't decided which future 120 Hz display to disassemble, but I'll probably begin with my 'disposable' Samsung 245BW, because its backlight is a true direct backlight and is easily separated. That will limit me to 72 Hz scanning-backlight operation, and it will probably show more ghost afterimages trailing motion, but it would be a good proof of concept to test with.
> Since the Catleaps use an edge-lit design, you'll need to find a way to remove or cut the rearmost metal backing for your LEDs. This will most likely make it difficult, if not impossible, to reassemble the outermost parts of the casing in their original configuration, because of the extra space needed for your LEDs and circuits.

Good point. I agree that buying mass-manufactured LED ribbons will give me imperfections. The effect may be smaller if I use #3528 LED ribbons (narrow ribbons, smaller LEDs) rather than #5050 LED ribbons (wide ribbons, larger LEDs). A #5050 is about three times brighter than a #3528; however, there are twice as many #3528s on a ribbon (600 LEDs per 5 meters), and the narrower ribbon lets me space them more closely. That might make me lean toward buying #3528 ribbons instead of #5050: it would also be easier to diffuse-mix the color inconsistencies when cramming a whopping 2,400 LEDs behind a computer monitor LCD.
> Other gotchas/ideas:
> - Make sure the diffusion materials are returned to exactly their original orientation; in the wrong configuration they can reduce the amount of useful light reaching the LCD.
> - The diffusion materials, especially the middle one, can be scratched easily and may create visible marks later.
> - The middle diffusion material seems to recycle incorrectly polarized light by reflecting it back off the white paper behind the light guide, meaning the paper does contribute to overall display brightness. Removing it to add your LEDs may result in the display being dimmer than expected.
> I would also like to say this project would be awesome. I'm often pissed that I can't read text while scrolling, and I hate motion blur with a passion.

You and me both. Text should be razor sharp during smooth scrolls.
> A current-limiting resistor is a bad idea for an application like this, because it's likely to result in an uneven, inconsistent backlight.

That's not a problem if there are only a few LEDs per resistor. There are already 800 current-limiting resistors built into 20 meters of LED ribbon; they're surface-mounted, and I can't remove them. The light on these is very consistent, because there are a whopping 2,400 LEDs in my planned scanning backlight and a whopping 800 separate current-limiting resistors already built into the LED tape.
> As a hardware guy, I feel compelled to point out that a simple 555 timer and shift register would be a far more elegant and, in my opinion, easier approach than using an Arduino. For a one-off, though, you may want to just go with what you're comfortable with. Even better, it probably wouldn't be too hard to sync your clock signal with the monitor, which is something that might be important.

Yes, you're right: it's just a simple LED sequencer that you can do with a shift register. But I am prototyping, and I want to do it in a fun way. I'm a software guy, and I need very flexible adjustability and software programmability.
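For anyone curious what that 555-plus-shift-register suggestion boils down to, here's a minimal C sketch (my own illustration, not anyone's actual circuit; 8 segments assumed) of the walking-bit pattern such a recirculating shift register would clock out, one lit segment per tick:

```c
#include <stdint.h>

/* Simulate one clock tick of a recirculating 8-bit shift register:
 * a single '1' bit walks through the outputs, lighting one backlight
 * segment at a time. */
uint8_t next_segment_pattern(uint8_t pattern)
{
    uint8_t shifted = (uint8_t)(pattern << 1);
    /* Recirculate: when the bit falls off the top, feed it back in. */
    if (pattern & 0x80)
        shifted |= 0x01;
    return shifted;
}
```

Starting from 0x01, eight ticks walk the lit segment across the panel and return to 0x01; an Arduino version would do the same thing in software with adjustable per-segment delays, which is exactly the flexibility being argued for above.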
> Check out these FPGA-based Arduino-like boards: http://papilio.cc/ You can drop an "Arduino" soft core on them. Some of the newer models have actual processors in addition to the FPGA.
> It would be much easier to do hardware sync detection and LED scanning in HDL; use the processor (hard or soft) to compute the delays and timing offsets. Software people can usually pick up HDL fairly quickly.

Excellent idea, I'll keep that page bookmarked.
I want to avoid using hardware VSYNC, so that I have maximum flexibility to use various different LCD panels without needing to modify monitor electronics.
> Not using a hardware VSYNC is bad for a number of reasons:
> 1) The system becomes not only OS-dependent, but also requires a computer.
> 2) Most consumer OSes are not real-time, so the time you get the VSYNC is non-deterministic. You may not even have access to the real VSYNC, since the video card is generating it.

1) Yes, I agree, that's a disadvantage. For the first prototype, I will design the circuit with the flexibility to use either hardware or software VSYNC, so I can choose depending on the panel I am testing.
> My largest concern would not be the hardware modification for the new backlight, which would be relatively easy; it would be the very precise timing needed for synchronization. A source (host) VSYNC signal is unusable.

Would you be surprised that I was able to do source (host) VSYNC with better than +/- 10 microsecond accuracy?
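One plausible way to get that kind of accuracy from jittery host timestamps (a sketch of the general technique, not necessarily what was actually implemented): average the refresh period over many frames, then extrapolate. Individual VSYNC timestamps on a non-real-time OS jitter by far more than 10 µs, but the averaged period converges quickly:

```c
#include <stdint.h>

/* Estimate the true refresh period from a window of host-reported
 * VSYNC timestamps (microseconds).  Individual timestamps jitter,
 * but (last - first) spans (n - 1) whole frames, so the per-frame
 * error shrinks as the window grows. */
uint64_t estimate_period_us(const uint64_t *stamps, int n)
{
    return (stamps[n - 1] - stamps[0]) / (uint64_t)(n - 1);
}

/* Predict when the next VSYNC should occur, for scheduling a strobe. */
uint64_t predict_next_vsync_us(const uint64_t *stamps, int n)
{
    return stamps[n - 1] + estimate_period_us(stamps, n);
}
```

With, say, a 1000-frame window, a timestamp jitter of a few hundred microseconds contributes well under a microsecond of error to the period estimate, which is what makes software-derived VSYNC timing viable at all.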
> I guarantee this is how Sony and others implement their scanning backlights.

Agreed. You are 100% correct; we both already know this.
> The reason this hasn't been accomplished in computer displays is cost. The cost of a given monitor could easily escalate by many hundreds of dollars with the associated circuits and high-intensity LEDs.

Right. It's the cost of the insane wattage in LEDs.
> Don't get me wrong: if you could accomplish this, it would be amazing. On paper it sounds like using a host VSYNC signal could work, but a lot of the time it doesn't work out well in practice. I assume your first test-bed will be an 8-segment scanning backlight on a 120 Hz screen to simulate 960 Hz, using an adapter spliced between the monitor cable from the GPU and the input of the monitor to steal the VSYNC signal?

I'm using 16 segments now (see the photo of the bench test unit, attached to single LEDs rather than to MOSFET amplifiers driving powerful ribbons hidden behind an LCD panel). Tests have shown I'm able to do well-timed 1/10,000-second strobes with some optimized Arduino programming. However, I'm leaning towards skipping sequential scanning and using full-panel strobes only, at least for a first prototype. The circuit itself is quite simple: anyone who can put together LEGO can follow instructions to build it. The smarts/complexity is in the software and the synchronization of the flashes, and the wallet-hit is the cost of the LEDs.
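The timing math behind a sequenced scan is straightforward; here's an illustrative C sketch using the 120 Hz / 8-segment example from the thread (the numbers and function names are mine, for illustration only):

```c
#include <stdint.h>

/* One frame at the given refresh rate, in microseconds.
 * 120 Hz -> 8333 us. */
uint32_t frame_period_us(uint32_t refresh_hz)
{
    return 1000000u / refresh_hz;
}

/* How long after VSYNC segment `index` (0-based) should strobe,
 * spreading the segments evenly across one frame. */
uint32_t segment_offset_us(uint32_t refresh_hz, uint32_t segments,
                           uint32_t index)
{
    return index * frame_period_us(refresh_hz) / segments;
}
```

At 120 Hz with 8 segments, each segment fires roughly 1,041 µs after the previous one; a 1/10,000-second (100 µs) strobe is short relative to that slot, which is what makes the per-segment timing feasible on an Arduino.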
> Will the GPU always be required to have VSYNC enabled (which isn't optimal in some cases), or does a usable timing signal get sent at all with VSYNC disabled?

VSYNC via the GPU setting has nothing to do with VSYNC on the display signal. VSYNC stands for Vertical SYNChronization, which *all* display signals have. If you lived through the 70's or 80's, you've seen VSYNC before: it's that black bar between frames when VHOLD is mis-adjusted on an analog TV. Displays require that synchronization signal in order to begin the next frame; otherwise the picture is scrambled, garbled, skewed, or rolling. There's guaranteed to always be a VSYNC being output to the display.
> Do you envision this only properly working with fast TN panels? I would like to try this out on my 130 Hz 1440p IPS, but the pixel rise and fall is around 5-7 ms. With a refresh period of 7.7 ms at 130 Hz, that hardly leaves any time between refreshes; as soon as the pixels are done changing, they are already starting to change again.

It will work far better with TN panels than with IPS panels, but there will still be benefit. To get less motion blur than a CRT, you need a good TN panel and refresh electronics that can finish erasing pixel persistence in time before the next refresh (3D 120Hz/144Hz compatible).
> So it leaves plenty of time (~5 ms) to get the eight sequenced light pulses in, but hardly any room for error or buffer while the pixels are transitioning, so the timing would have to be spot-on to avoid illuminating pixels in a transition state.

You need a lot of safety margin because of backlight diffusion. The boundaries between lit and unlit segments are extremely diffuse unless you build lots of optics (which is hard for a 900-LED backlight!), and they will still be somewhat diffuse even then. Even the diffusers built into panels spread light throughout the panel to an extent: lighting a tiny corner of the panel unavoidably lights up the rest of the panel faintly. That becomes a limiting factor, which also contributes to my decision to prefer full-panel strobes (which in turn forces the active-3D-compatible panel requirement, limited to TN for monitors at this time).
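To put numbers on that margin concern, here's a tiny C sketch (illustrative figures only, taken from the 130 Hz / 5-7 ms transition example above) of how little of the frame is left for a clean full-panel strobe once pixel transitions are accounted for:

```c
#include <stdint.h>

/* Microseconds left in a frame for a clean strobe after the pixel
 * transition time is subtracted.  Purely illustrative; real margins
 * shrink further once diffusion bleed is considered. */
int32_t strobe_window_us(uint32_t refresh_hz, uint32_t transition_us)
{
    int32_t frame_us = (int32_t)(1000000u / refresh_hz);
    return frame_us - (int32_t)transition_us;
}
```

At 130 Hz (7,692 µs frames), a 5 ms transition leaves about 2.7 ms, and a 7 ms transition leaves under 0.7 ms, which is why slow-transitioning IPS panels make the timing so unforgiving.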
> It would be great if you could implement an adapter that you could simply splice between the output of the GPU and the input of the monitor; that would eliminate any hardware-specific engineering, which would otherwise severely limit the application. Although every single monitor is going to need its backlight modified, which makes this a niche market, unless you sell ready-made adapters and monitors. There could be a nice market for enthusiasts.

Yes, a dongle would be by far the most user-friendly method, and it can be hardwired to the Arduino without any requirement for timecoding.
> One other issue is capturing the VSYNC signal. This is easy with DVI and HDMI, as they both use constant-stream digital signals, but it may be more complicated with DisplayPort, since that signal is packet-based. Currently most monitors still use DVI, but I could see DP becoming the only standard here within a couple of years; that is, until and unless HDMI chips get pixel clocks high enough to support 120+ Hz. But then again, who would want to do such an extensive and awesome modification of this type on only a 1080p monitor? I know I wouldn't. I've even been considering purchasing the Sharp IGZO 4K 31.5" display this spring. That monitor combined with this scanning backlight would surely make the best display on the planet for computer use.

Indeed, especially if the IGZO is 3D compatible.
> Please let me know if you get beyond the proof-of-concept stage with the host VSYNC timing actually working on the Arduino. I'd gladly buy some of these LED strips and modify one of my 130 Hz 1440p monitors to test this out.

The LED strip is the easy part: you can get them by searching "5050 LED ribbon" on eBay, Alibaba, or Amazon, though many are too dim for this purpose. The trick is in the selection, getting the brightest ribbon.
> I've always wondered why scanning backlights weren't used in gaming monitors. I see the cost is quite high; infinitely worth it, though.

The good news is that nVidia has started encouraging manufacturers to put strobed backlights in their displays: 3D LightBoost is a strobed backlight that flashes the backlight brightly whenever the shutter glasses are open. Both shutters in the glasses are closed while waiting for a 3D monitor to refresh between frames. Instead of wasting brightness keeping the backlight on while both shutters are closed, the idea is to save the light and drive the LEDs more brightly for shorter periods, only while the shutters are open. That's why they call it "LightBoost": a brighter image when used with 3D shutter glasses. But it's also exactly what's needed for motion blur reduction, and this excellent AnandTech article covers the topic.
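The brightness tradeoff behind any strobed backlight is simple arithmetic; here's an illustrative C sketch (my own example numbers, not LightBoost's actual specifications) of how much brighter the LEDs must be driven to keep the same average light output:

```c
#include <stdint.h>

/* Instantaneous-brightness multiplier (scaled x100 to stay in
 * integers) needed for a strobed backlight to match the average
 * light output of an always-on backlight.
 * duty = strobe/frame, so boost = 1/duty = frame/strobe. */
uint32_t boost_factor_x100(uint32_t strobe_us, uint32_t frame_us)
{
    return frame_us * 100u / strobe_us;
}
```

For example, a 1.5 ms strobe per 8.33 ms frame (120 Hz) needs the backlight driven roughly 5.5x brighter during the flash; shrink the strobe to 100 µs and the factor balloons to over 80x, which is where the "insane wattage" cost mentioned earlier comes from.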
> Simply putting tons of these bright LEDs behind that same diffusion layer may not produce the results you are after, like you said, and the local strobe would migrate to much of the rest of the panel's surface area.

I've been able to remove these layers, use a different diffuser, and test other diffusers that look like wax paper. However, I'm not reaching the 10x I need. I'll end up needing to do the following:
> So if the goal is not to change the optical diffusion layer, you would simply have to replace the edge-lit LEDs with far, far brighter ones, like you said. Now isn't that where the problem lies? If you went with a direct-view LED backlight, you could increase the number of LEDs versus edge-lit, which would significantly help the luminosity of the screen. Is it technically possible to get LEDs that are tens of times brighter than current edge-lit LEDs? That I would have to question.

It's certainly possible to do.
> I think changing from a direct-backlit scanning matrix (with proper optics) to a full-panel edge-lit strobe would be simpler, but it would also not produce nearly as good a result. I don't think finding an acceptable direct-light diffusion-layer optic would be as troublesome as you think.

Existing home-theater tests on local-dimming panels (zone backlights, a good case study) show only about a 15:1 contrast ratio on the same checkerboard test pattern. Instead of million-to-one contrast ratios, backlight diffusion limits it to about 15,000:1 for on-backlight behind white LCD pixels versus off-backlight behind black LCD pixels, while the LCD panel is natively 1000:1. So if you compare on-backlight versus off-backlight behind the same white pixels, measured simultaneously to capture diffusion effects, 15000/1000 equals only 15. Backlight diffusion means I only have a 15:1 segment contrast ratio. Tests at home appear to confirm this. Ouch.
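The 15:1 figure falls out of dividing the measured ratio by the panel's native contrast; as a sanity check, here is the arithmetic spelled out in C (numbers from the measurements described above):

```c
#include <stdint.h>

/* Effective segment (backlight-diffusion) contrast: measured
 * on-vs-off backlight ratio divided by the panel's native contrast.
 * 15,000:1 measured / 1,000:1 native -> 15:1 left for the backlight. */
uint32_t segment_contrast(uint32_t measured_ratio, uint32_t native_ratio)
{
    return measured_ratio / native_ratio;
}
```

Put differently: the panel's own 1000:1 contrast is already "spent", so whatever ratio the backlight zones can add on top of it is only the quotient, and diffusion caps that quotient at around 15.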
> Another thing to consider: will the technology be limited to TN panels? That would shrink the market considerably, as TN panels have inferior image quality to any other panel type, are generally small screens, and are limited to a rather measly resolution of 1080p. One of the reasons I gave up my FW900 CRT is that the 22.5" image and 1920x1200 resolution are just too small for me.

No, it won't be limited to TN panels. It's just that the panel makers haven't yet started manufacturing computer-monitor-sized IPS panels with a fast-refresh pattern resembling that of the 3D TN 120 Hz monitors that are so well suited to strobed backlights. All the current IPS panels designed for monitors use traditional sequential row-by-row refresh at slow speed, rather than refreshing the whole panel rapidly at once (so that pixel persistence can be cleared before the end of the refresh cycle, a necessary ingredient for 3D).
> I don't have time to do much research at work, but what panel types are in those 960 Hz scanning-backlit TV sets (not TN), and what type of backlight LEDs do they use: direct or edge?

Depends. Some strobed backlights use edge backlights, but most of the premium displays use behind-the-LCD scanning backlights. Many use IPS panels; 3D shutter-glasses-compatible panels exist in HDTV sizes in IPS format, so there's hope for the future of LCD manufacturing for monitors. Time will tell, but today (as of 2012), a zero-motion-blur LCD computer monitor can only be achieved with a TN panel and appropriately ultrashort, ultrabright strobes (extremely expensive to achieve), which manufacturers are not yet doing.
> Usually those big companies have done all the R&D on something like this; it's just up to smart guys like you to adapt it to computer monitors, where those same large companies feel there is no market.

Yep, they have done lots of R&D. Some of them have probably achieved the zero-motion-blur LCD in the laboratory already, but never in public, because of the strong wattages required to equal the brightness of CRT phosphor during short flashes, which is needed for equivalent motion resolution. Nobody will pay $50,000 for an LCD monitor that simply has less motion blur than a CRT. The panel technology (at least for 3D TN) is already here to make that possible, but cheap backlight technology isn't yet.