Black frames/Scanning Backlight + 120Hz for less motion blur? (240Hz/480Hz/960Hz)

mdrejhon

Limp Gawd
Joined
Mar 9, 2009
Messages
128
Hello,

Short form question: When will computer monitor makers produce computer monitors with a CMR of 480? (480Hz-like)
The top-end HDTV's are already doing this, running at "960 Hz equivalence" -- or rather, CMR 960 for Samsung models.


(CMR = Clear Motion Rate = a rating derived from how long the same pixel stays continuously illuminated, taking everything into account: panel latency, black frame insertion, scanning backlight, frame interpolation, etc. It's a marketing *gimmick* from Samsung, but since there's no other good name elsewhere in the industry, let's use "CMR" just for convenience's sake. If we wish, we can co-opt or invent a different brand-generic term. A need exists for a standard way of rating motion-blur equivalence that takes *everything* into account (black frames if any, existence/lack of frame repetitions, constant backlight vs scanning backlight, frame interpolation if any, actual native panel refresh, etc.), and so far, "CMR" comes closest to such a measurement standard.)

Long-form question:
Have any monitor manufacturers considered black frame insertion or a scanning backlight, combined with 120Hz, to reduce motion blur even further? CRT-quality motion blur reduction doesn't happen until each rendered pixel is illuminated for only about 1/1000th of a second (either via 1000Hz refresh (impractical), via frame interpolation (laggy), via long black frames between refreshes, or via a scanning backlight).

The best (e.g. $3000+) HDTVs use many techniques (e.g. scanning backlights) to achieve an approximately 1 millisecond pixel illumination time. This is different from pixel response (e.g. a store-and-hold display illuminating the same pixel continuously for the full refresh). Those displays are often "Clear Motion Rate 960", which is like 960Hz, or a 1/960sec sampling time (very close to the ~1msec of CRT phosphor illumination, for CRT-style motion quality). The use of black frame insertion (which would no longer flicker noticeably beyond 120Hz) or a scanning backlight would eliminate the lag problems caused by the frame interpolation used in HDTV displays. (Scanning backlights in high-end home theater HDTVs simulate the scanning illumination of a CRT and more closely approach phosphor decay, allowing LCD pixels to be quickly de-illuminated between refreshes, for sharper motion -- less motion blur -- than is otherwise possible.)

I personally would like to see this technology (e.g. scanning backlights) reach computer monitors sometime, and I'd be glad to pay about $500-$800 for a 27" 1440p monitor (120Hz) with a Clear Motion Rate of 480 or 960.

NOTE: Before you get in a huff about 60/120/240/480/960 -- there is a point of diminishing returns -- given a human eye tracking across a rapidly-panning scene. The human eye CANNOT tell apart the flicker, but the MOTION BLUR *does* sharpen progressively as you go to higher refresh rates (or higher CMRs).

EXAMPLE SCENARIOS:
1. Television set showing a "fast pan" in a hockey game / football game (using HDTV cameras with a 1/1000sec shutter speed setting)
*or*
2. Computer monitor with a high-sample-rate mouse (e.g. 1000Hz mouse) turning quickly on the spot in an FPS game, for the "fast pan".

Now assume the fast panning scene is moving across the screen at 1 inch every 60th of a second (e.g. taking 1 second to cross the width of a large-screen HDTV, one that's approximately 60 inches wide). Let's say, your eye is tracking a sharp object during the screen pan. So strictly by the numbers:
At 60Hz, the motion blur is 1" thick (entry level HDTV's, regular monitors)
At 120Hz (or CMR 120), the motion blur is 0.5" thick (120Hz computer monitors)
At 240Hz (or CMR 240), the motion blur is 0.25" thick
At 480Hz (or CMR 480), the motion blur is 0.125" thick
At 960Hz (or CMR 960), the motion blur is 0.0625" thick (CRT style, high end HDTV's)

NOTE: Replace "Hz" with "CMR" for those displays that use black frames or scanning backlights; the science is similar. Also, for video on television, it is assumed the camera shutter is faster than the refresh rate, in order to see the motion blur benefit. For games, it is assumed the computer mouse sample rate is higher than the refresh rate, and the graphics card is able to max out the native refresh rate (e.g. maxing out at 120Hz is all that's needed to benefit from CMR 960 when using a scanning backlight) in order to see the motion blur benefit.
NOTE2: CRTs generally have a CMR very similar to CMR 960, because the phosphor mostly decays within approximately 1 millisecond (and more fully by 2 milliseconds).
NOTE3: There are many variables that fudge the numbers, such as panel response speed. Many panels are limited to 2ms (which maxes out at CMR 500 equivalence), requiring black frames or scanning backlights to go beyond the LCD response limit (e.g. getting 960Hz-like lack-of-motion-blur clarity out of a 2ms or even 4ms+ panel). This will often make the actual measured "CMR" (photography-measured) differ from the mathematically computed "CMR" -- e.g. a CMR 960 display actually only measuring ~CMR 500 equivalence in a scientific high-speed-camera test, etc. But let's assume all the bottlenecks are sufficiently resolved (e.g. a good scanning backlight eliminating panel response from being a factor). A rough calculation sketch of these numbers follows below.
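Here is a rough sketch of the arithmetic behind the numbers above (my own simplification; the pan speed and panel response figures are just illustrative). With the eye tracking the pan, the perceived blur width is roughly pan speed times persistence, and the effective persistence can't be shorter than the panel's pixel response (per NOTE3):

Code:
# Rough sketch only -- illustrative numbers, not a measurement standard.
def blur_width_inches(cmr, pan_inches_per_60th=1.0, panel_response_s=0.0):
    pan_speed = pan_inches_per_60th * 60.0          # inches per second
    persistence = max(1.0 / cmr, panel_response_s)  # seconds each frame stays visible
    return pan_speed * persistence

for hz in (60, 120, 240, 480, 960):
    print(hz, "Hz/CMR ->", blur_width_inches(hz), "inch of blur")
# 60 -> 1.0", 120 -> 0.5", 240 -> 0.25", 480 -> 0.125", 960 -> 0.0625"

# NOTE3 in action: a 2ms panel caps the benefit near CMR 500 equivalence
print("CMR 960 claim on a 2ms panel:", blur_width_inches(960, panel_response_s=0.002))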

Granted, most FPS players (and even many competition FPS players) do not try to identify background objects (e.g. small snipers) while spinning fast, and thus can't tell apart 60Hz vs 120Hz -- other players play FPS single-player to enjoy the graphics, and the motion blur becomes pretty apparent. There's an obvious point of diminishing returns. But it is interesting to note that the difference between 60Hz and 120Hz (0.5" less motion blur) is similar to the difference between 120Hz and 480Hz (0.375" less motion blur). Obviously, some video game players (like me) really notice the improvement when trying to track small fast-moving objects. So comparing 60Hz versus 480Hz for the same panning scene, you're comparing 1" motion blur versus 0.125" motion blur -- an 87.5% reduction in motion blur! I'm not a competition FPS gamer, but I now understand why even some competition FPS gamers still prefer CRT, even with today's 120Hz (translation: they're not as crazy as you think -- they're benefitting from reduced motion blur). To others, it's just a gimmick, but the science of motion blur is real, scientifically measurable, and detectable by human eye (even by videophiles enjoying CMR 960 (960Hz-style) HDTV television sets).

Obviously, smaller TVs will not benefit as much as bigger HDTVs, because the motion blur is less pronounced. On a bigger HDTV, it takes 1 second for a 1-inch-per-1/60sec movement to travel from one edge of the screen to the other, so you have enough time to detect the sharpness of the pan by tracking a single object (e.g. golf ball, hockey puck, soccer ball) from one edge of the screen to the other. On a smaller computer monitor, at half the width, you may have only 0.5 second. So instead of CMR 960 being the final point of diminishing returns for HDTV sports (a 960Hz-like picture), it would be more like CMR 480 being the final frontier for computer monitors.
...And scientific tests of sports on the high-end HDTVs (CMR 960) already show that it's not baloney (i.e. it's actually worthwhile, since some people are genuinely sensitive to it).

Now, my question is...
When do you think computer manufacturers will finally begin to introduce scanning backlights into computer monitors?
480Hz-style would be fine (CMR 480 achieved using 120Hz + scanning backlight) -- CMR 960 is overkill unless you have a 60" display.
Mathematically, 480Hz would have 87.5% less motion blur than 60Hz
(whereas 120Hz only has 50% less motion blur than 60Hz).

(Note: It's also confirmed by human eyes, too -- several of us, myself included, are able to tell the motion blur difference between 120Hz and 480Hz (via CMR 480 or 480Hz interpolation) during super-fast pans on some of these expensive HDTV displays at home theater showrooms, even in a random blind test where somebody toggles the setting on the HDTV without us looking.)

Thanks,
Mark Rejhon
 
I don't know the answer to your question, but step one is making people aware of the technology. That was a great write-up; I wasn't aware of this technology that seems to emulate CRT displays. Very interesting.

If I read this right, you're saying the backlight lights up across the screen the same way an old CRT's electron gun scans across and down at the given refresh rate? And that effect hides the somewhat lacking response time of the pixels?
 
Yep, that's right, though there are a number of differences.

CRT: Electron beam controlled by magnets, striking phosphor to illuminate it. Phosphor illuminates for a short period. (mostly decayed by 1ms).

Scanning backlight: One (or a few) rows of LED lights illuminate at a time over an already-refreshed part of the LCD panel, while the display controller is refreshing the un-illuminated portion of the panel. The scanning is done top-to-bottom, much like the scan lines of a CRT, but in this case a few hundred rows of pixels are fully illuminated at once.

HowStuffWorks.com "How Scanning Backlights Work"
http://electronics.howstuffworks.com/scanning-backlight.htm

About.com -- info about LED HDTV's and local dimming
http://hometheater.about.com/od/televisions/qt/ledlcdtvfacts.htm

YouTube: High speed camera on a scanning backlight
LG display - http://www.youtube.com/watch?v=sk_X1oCWjd8
Samsung display - http://www.youtube.com/watch?v=QGrkw9OFA8s
Sony Motionflow - http://www.youtube.com/watch?v=prF93YQJW_0

Example brands
Elite -- http://elitelcdtv.com/technology/fast-240hz-smooth-fluidmotion/ -- more emphasis on scanning backlight (less artifacts) instead of aggressive motion interpolation (artifacty)
Sony with "XR 960" -- http://store.sony.com/webapp/wcs/st...10151&langId=-1&productId=8198552921666312874
Samsung's CMR page -- http://www.samsung.com/us/article/clear-motion-rate-a-new-standard-for-motion-clarity

Example of controversies
Controversies of interpolation vs scanning backlight -- http://www.koreatimes.co.kr/www/news/tech/tech_view.asp?newsIdx=53673&categoryCode=129
Gimmickery done by manufacturers -- http://www.techlicious.com/blog/beware-of-inflated-lcd-tv-specs/
(it also explains the lingo that is used to attempt to explain 480 Hz or 960 Hz equivalence via scanning backlight)
For high-end LCD/LED HDTVs (usually $3000 and up), you may have heard of "local dimming" (zone-based dimming) to enhance contrast ratio, by using a brighter backlight behind bright parts of the image and a dimmer backlight behind dim parts (and the backlight off behind the black parts of the image). One type of local dimming simply uses a full array of LEDs (white or RGB), run like a low-resolution video image behind the LCD glass. That greatly enhances contrast ratio.

Most gamers will be more interested in local dimming than in simulated "960 Hz" (CMR 960), because it enhances contrast ratio in FPS shooters (as acclaimed by videogamer owners of the ultra-expensive HDTVs), e.g. 100,000:1 contrast ratio within the same static image (none of the on/off contrast ratio gimmickery). Local dimming provides CRT-style contrast ratio on high-end LCD TVs; it is a method of allowing (hyper-expensive) LCD to outperform (cheap) plasma.

The electronics for driving LEDs in local dimming are still quite expensive and have not yet hit consumer computer monitors.

BUT... a good side effect: local dimming (for contrast enhancement) makes it cheap to add a scanning backlight. Pennies extra, in fact. Because the full array is individually LED-controllable, that same controllability makes a scanning backlight possible. And software can adjust the scanning to balance the tradeoff between flicker and motion blur, etc. Full-matrix local dimming LED arrays are very programmable!
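To make the idea concrete, here is a minimal sketch (not any vendor's actual algorithm -- the zone counts and the brightest-pixel rule are just illustrative assumptions) of how a full-array local-dimming backlight could be driven: downsample the frame's luminance to the LED zone grid, then compensate the LCD pixel values so the final image (LCD transmittance x local backlight) still approximates the source frame:

Code:
# Illustrative sketch of full-array local dimming, not a vendor algorithm.
import numpy as np

ZONE_ROWS, ZONE_COLS = 16, 24          # assumed LED zone grid

def drive_local_dimming(frame):
    """frame: float array (H, W), luminance in 0..1."""
    h, w = frame.shape
    zh, zw = h // ZONE_ROWS, w // ZONE_COLS
    cropped = frame[:ZONE_ROWS * zh, :ZONE_COLS * zw]
    # Backlight per zone = brightest pixel in that zone (avoids clipping highlights)
    zones = cropped.reshape(ZONE_ROWS, zh, ZONE_COLS, zw)
    backlight = zones.max(axis=(1, 3))
    # Upsample backlight to pixel resolution (nearest-neighbour for simplicity)
    bl_full = np.kron(backlight, np.ones((zh, zw)))
    # LCD pixel value = desired luminance / local backlight level
    lcd = np.clip(cropped / np.maximum(bl_full, 1e-3), 0.0, 1.0)
    return backlight, lcd

# A mostly-black frame with one bright patch keeps most zones dark, so black
# areas emit almost no light -> far higher static contrast than a uniform backlight.
frame = np.zeros((480, 720)); frame[100:140, 200:260] = 0.9
backlight, lcd = drive_local_dimming(frame)
print("fully-dark zones:", int((backlight < 0.01).sum()), "of", backlight.size)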

NOTE: Enhancement of the "Clear Motion Rate" is often done by combining many factors. The native refresh may only be 60 Hz, with frame interpolation used to reach "240 Hz" and a scanning backlight used to reach "960 Hz" (called CMR 960 in Samsung literature). We don't want that for computer monitors, since frame interpolation is going to add gaming lag. What we want is a computer monitor that has a native 120 Hz refresh rate and can use a scanning backlight (at least 8 horizontal segments) to simulate 480 Hz or 960 Hz equivalence, and finally exceed CRT/plasma quality.
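Here is a back-of-the-envelope sketch (my own arithmetic, not a vendor spec) relating native refresh rate and backlight duty cycle to a "CMR-like" equivalence -- persistence is how long each pixel stays lit per refresh, and the equivalence is roughly 1 / persistence:

Code:
# Rough arithmetic only -- not how any manufacturer officially computes CMR.
def cmr_equivalent(refresh_hz, lit_fraction):
    """lit_fraction: fraction of each refresh period a given pixel is illuminated."""
    persistence = (1.0 / refresh_hz) * lit_fraction   # seconds each pixel stays lit
    return 1.0 / persistence, persistence * 1000.0    # (CMR-like number, persistence in ms)

print(cmr_equivalent(120, 1/4))   # 120 Hz, lit 1/4 of the time -> (480.0, ~2.08 ms)
print(cmr_equivalent(120, 1/8))   # 120 Hz, 1 of 8 segments lit  -> (960.0, ~1.04 ms)
print(cmr_equivalent(60, 1.0))    # plain 60 Hz sample-and-hold  -> (60.0, ~16.7 ms)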

The best of both worlds (as found in ultra-pricey LED HDTVs) is local dimming combined with a scanning backlight, on a good fast-response panel. With this, you can get a picture superior to plasma, and you finally approach CRT-style contrast and motion.

I suspect that those strongly attached to 24" tubes (e.g. FW900) are hoping that such an LCD display becomes cheap. That may be a long while, though.

Recommended Exercise for Rich Power Users or Reviewers
(or those with access to showroom or a manufacturer)
Try testing the new pricey HDTVs with PCs. A temporary possible solution (while waiting for PC monitors with scanning backlights): evaluate the use of these ultra-pricey LED HDTVs as gaming computer monitors. (Unfortunately, there's still some lag, and frame interpolation is sometimes part of the scanning backlight equation. Some of them let you turn off interpolation, but some *may* lose the scanning backlight if you turn off frame interpolation.) A rich PC gamer is needed to test multiple $3000-$5000 HDTVs with scanning backlights, to find out which one produces the most CRT-like video gaming quality. It is also possible that some of them happen to accept a native 120 Hz input (due to 3D compatibility) after some tweaking via an HDMI adaptor and maybe PowerStrip (similarly to the tactics used by Catleap 2B owners). Who wants to be a guinea pig and test a CMR 960 HDTV with PC gaming? :) Obviously, a lot of Service Menu headaches will be involved (trying to keep the scanning backlight enabled while turning OFF frame interpolation), and maybe a high-speed camera (for testing motion blur). Video gaming magazines, want to team up with a home theater magazine? Ideas?

Recommended Exercise for Computer Monitor Makers
If you're reading this, please bring this technology to computer monitors. Alienware succeeded with ultra-expensive PCs. Panel and electronics prices have fallen so much that you can still find a market for a $1000 computer monitor using local dimming. The technology for local dimming is now cheap enough to sell a $1000 24" computer monitor with CMR 960 capability. (Samsung, are you listening? You invented CMR 960 for LED HDTVs; bring it to an Alienware-league priced luxury computer monitor!) (Dell, ViewSonic, LG, etc. -- feel free to beat Samsung at their own game.) You'll be the talk of competition videogamers, especially if you tape down the left or right arrow key in a video game for an A/B comparison test. Make sure you demo side by side with a 60 Hz monitor, so the 60 Hz monitor looks like a smeary mess. The rapid panning will be 87.5% sharper on a CMR 480 display than on a 60 Hz LCD. (Or if you're daring, CMR 960 -- the rapid panning is 93.75% sharper than a 60Hz monitor; compared with a 120Hz monitor, that's a further 43.75 percentage points of blur reduction. Obviously, you'd prefer to do CMR 960 on a 120 Hz signal rather than on a 60 Hz signal, to reduce gaming latency as much as possible, and to reduce flicker from a scanning backlight. And SLI setups are technically able to pull off capping at 120fps at 1080p, affording the full benefits of 960Hz-like motion clarity via a scanning backlight.) Monitor manufacturers, how about it? You're already doing it for high-end home theater HDTVs. Formerly, it was too expensive for computer monitors. Bring it now. I think it could become a hit with the "Alienware" audience.
Note: One can still do a scanning backlight on a 60 Hz signal without frame interpolation, albeit with more flicker (that's why 120 Hz is recommended, to eliminate flicker with a scanning backlight).
Note2: It is possible to combine scanning backlights with 3D too. The high-priced LCD/LED home theater HDTVs do that already. So one can still advertise 3D.
Note3: Bonus points if this is pulled off on a 2560x1440 panel, instead of a 1920x1080 panel. (even if done at only 100 Hz and an 8-section scanning backlight to multiply it to CMR 800 instead of CMR 960)
Note4: You get to brag about the world's best contrast ratios in an LCD computer monitor, too (thanks to the local dimming that often goes hand-in-hand with a scanning backlight). This could be a bigger sales incentive than the "960Hz" quality.
So I'd issue the challenge -- bring the existing home theater "960 Hz" LED HDTV technology to computer monitors, by combining 120Hz and a scanning backlight (flicker problem solved, motion blur solved, frame interpolation turned off). I think it's now finally manufacturable in a $1000 "Alienware audience" high-end price range for a 24" display. We just need to coax a manufacturer (e.g. Samsung) to try this market. Obviously, most of us won't be able to afford it, but some of us obviously are interested (me), and this trailblazing will mean lower prices in a few years. :)

Recommended Exercise for Hard Core Modders
It's possible for an electronics major (with several thousand dollars) to create a homebrew local dimming panel and rip apart a computer monitor, e.g. modify a Catleap 2B. Some high-speed programming and some careful synchronization adjustment is needed (scanning backlights are mathematically complicated to synchronize to LCD panels -- though you could even use rheostat-style knobs in a jerry-rigged setup). But creating a full-matrix LED array is quite expensive if you do it from scratch using Digikey parts, and making it sufficiently high-performance (with minimum gaming lag) to sync up with 100Hz or 120Hz LCD glass taken from a computer monitor is going to cost a few thousand -- you might as well spend the same money on a fully built HDTV with CMR 960 capability. (See the section "Recommended Exercise for Rich Power Users or Reviewers".)

____
 
given a human eye tracking across a rapidly-panning scene. The human eye CANNOT tell apart the flicker, but the MOTION BLUR *does* sharpen progressively as you go to higher refresh rates (or higher CMRs)

Sensitivity to flicker varies greatly from person-to-person. Personally I'm trying to avoid exactly what you describe here with black frame insertion and scanning backlights, since I know that even 420Hz backlight flicker is easily visible to me and will cause problems. I'll gladly accept some smearing in exchange for no eyestrain/migraines. Some studies have also shown that the human nervous system remains sensitive to flicker even after the frequency becomes too high to be visible.

That said, I have no problem making these features an option for people who want to use them. Just make sure that the option to turn off black frames/scanning/motion interpolation remains.

Thanks for taking the time to write this up and provide links.

When do you think computer manufacturers will finally begin to introduce scanning backlights into computer monitors?

There's at least one high-end reference monitor that can sync its scanning backlight to the video playback rate (can't remember who makes it).
 
I've asked in the past why "Gaming" monitors haven't been made this way yet. Probably the next step will be this though.
 
I think the proliferation of OLEDs will eventually give you what you want, as they have response times measured in μs instead of ms. In this scenario, you would still be bandwidth-limited when it comes to the scanning time unless you wanted to add latency and do it with a scanning backlight as you went over in your post. I think there is a lot of merit in the scanning backlight for LCDs, but for gaming I'd still take direct vertical refresh to avoid every ms of latency possible.

A few interesting comments from John Carmack when discussing how to get around the vertical scan problem were to think about playing with the order in which the lines come in. The two ideas I saw mentioned were interlaced (first the odds, then the evens, for example), or randomized, to try and get around the slow downward/vertical look.

I do think it's important to keep the distinction clear between blur caused by pixel response time and blur as it was generated by the camera at the source. Blur from pixel response time isn't desirable, but I firmly believe that motion blur from the source is good and correct as dictated by the frame exposure time set on the camera. Tech that messes with this, like frame interpolation, needs to go away IMO—it goes against the spirit of high fidelity (faithful reproduction).
 
2012 NeoPlasmas have at least 2000Hz. LCD is a bad technology hop. OLED monitors will come out soon at a normal price.
 
Native 120 Hz OLED tech is the answer..., but I would be surprised if we see this in the next few years. The next big push will probably be 4K before we see OLED.
 
I think the proliferation of OLEDs will eventually give you what you want, as they have response times measured in μs instead of ms. In this scenario, you would still be bandwidth-limited when it comes to the scanning time unless you wanted to add latency and do it with a scanning backlight as you went over in your post. I think there is a lot of merit in the scanning backlight for LCDs, but for gaming I'd still take direct vertical refresh to avoid every ms of latency possible.
OLED is a good solution, if they turn off the OLED pixels between frame refreshes. OLED, strictly by itself, will not solve the motion blur problem -- if you "store-and-hold" the OLED pixel (e.g. Active Matrix OLED does this), you're still going to get motion blur during fast pans, even with 0ms instantaneous pixel response.

OLED monitor manufacturers must bring it upon themselves, to provide a "Gaming Mode" toggle in their OSD menus to turn off the OLED pixels between refreshes. This will make the display darker (since the pixel is not illuminated as long), but will eliminate motion blur.

For OLED computer monitor manufacturers, a pixel illumination time of 2ms at 120Hz would be a good compromise (CMR 480 equivalence). That would illuminate pixels for 240ms out of every 1000ms, meaning 24% of the time. That means that, at the same drive level, the display would be only about 1/4 as bright as the monitor's maximum native brightness. (Then again, electronics majors know you can slightly overdrive an LED if you blink it rapidly -- so you might be able to bring some of the brightness back by overdriving the pixels more brightly than usual if you're turning them off within 2ms.)

Yes, the brightness problem also applies to LCDs with scanning backlights -- they also have the disadvantage of darkening the display -- but many LED computer monitors are already too bright, so we've got plenty of excess backlight brightness to "use up" for enhancing motion clarity.
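A small sketch of this brightness-versus-persistence tradeoff (my own arithmetic, not a manufacturer spec) -- it applies equally to pulsed OLED pixels and to strobed backlight segments:

Code:
# Rough tradeoff arithmetic only.
def persistence_tradeoff(refresh_hz, lit_time_s):
    duty_cycle = lit_time_s * refresh_hz     # fraction of time each pixel is lit
    cmr_like = 1.0 / lit_time_s              # rough motion-clarity equivalence
    return duty_cycle, cmr_like

duty, cmr = persistence_tradeoff(120, 0.002)   # 2 ms illumination at 120 Hz
print(f"duty cycle {duty:.0%} -> roughly {duty:.0%} of full brightness, ~CMR {cmr:.0f}")
# -> duty cycle 24% -> roughly 24% of full brightness, ~CMR 500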

Again, it's important to know that "OLED" != "lack of motion blur". There's more to the story (store-n-hold versus pulsing the pixels in an OLED).

A few interesting comments from John Carmack when discussing how to get around the vertical scan problem were to think about playing with the order in which the lines come in. The two ideas I saw mentioned were interlaced (first the odds, then the evens, for example), or randomized, to try and get around the slow downward/vertical look.
Yup -- this is quite interesting -- but this is actually a separate problem to solve from the motion blur problem.
-- Also, at 120 Hz the vertical scan effect is far less noticeable than at 60 Hz, so it's a non-issue (from when I gamed at 120Hz on CRT "back in the day").
-- Also, multi-scan displays (refreshing different parts of the LCD simultaneously) can reduce this, but they also complicate the logic in scanning backlights. We used to have "dual-scan LCD" and similar, but now that modern LCDs can refresh the whole panel quickly within a frame (less than 1/120th of a second), the vertical scan effect is theoretically a non-issue. I'm no longer concerned about vertical scan effects/skews/etc.

I do think it's important to keep the distinction clear between blur caused by pixel response time and blur as it was generated by the camera at the source. Blur from pixel response time isn't desirable, but I firmly believe that motion blur from the source is good and correct as dictated by the frame exposure time set on the camera. Tech that messes with this, like frame interpolation, needs to go away IMO—it goes against the spirit of high fidelity (faithful reproduction).
Bingo. However, you want to use a high speed shutter for sports. Hockey, downhill skiing, etc. Here, we're hitting a motion blur bottleneck at the LCD panel level.

Motion blur occurs at the weakest link anywhere between the source and the destination (a rough numeric sketch follows the list):
- Camera shutter
- Panel response
- Store-n-hold (adds motion blur) versus pulsed pixels (reduces motion blur) -- e.g. scanning backlights, or modulation of OLED pixels
- Refresh rate
- Graphics card effects (artificially added motion blur)
- Graphics card performance (fps)
- Etc.
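As a rough "weakest link" model (a simplification of the list above, not a formal standard): each stage that holds the image for some time contributes its own blur, the contributions roughly add, and the slowest stage dominates:

Code:
# Rough model only -- illustrative numbers.
def total_blur_inches(pan_inches_per_sec, camera_shutter_s, persistence_s, panel_response_s):
    exposure_like = camera_shutter_s + max(persistence_s, panel_response_s)
    return pan_inches_per_sec * exposure_like

# 1/1000 s sports shutter on a 60 Hz sample-and-hold LCD: the display dominates
print(total_blur_inches(60, 1/1000, 1/60, 0.002))    # ~1.06 inches of blur
# Same source on a CMR-960-style display: camera and display are now comparable
print(total_blur_inches(60, 1/1000, 1/960, 0.001))   # ~0.12 inches of blur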

To achieve the nirvana of no motion blur (8/16-bit-Nintendo-style sharp pans on a CRT), you need to eliminate as many weak links as possible.

Note: Artificially added motion blur is very beneficial in many ways; it hides other deficiencies quite well (e.g. LCD-specific ghosting artifacts such as greenish-tinted blur, etc.). It also helps mask random fluctuations in framerates. However, if you've got an SLI setup, a very fast i7, a 120Hz monitor, and a game that can max out the fps (running 120fps at all times), you often *DO* want to turn off the videogame's artificial motion blur setting to gain the full benefit of 120 Hz, since the game's own artificially-added motion blur now becomes the weak link.
(And in the situation of CMR 480 or CMR 960 displays, the monitor can do the rest to achieve 480Hz- or 960Hz-like equivalence via a scanning backlight or pulsed pixels, etc. -- as long as interpolation and other lag-introducing algorithms are avoided as much as possible.)
 
Native 120 Hz OLED tech is the answer..., but I would be surprised if we see this in the next few years. The next big push will probably be 4K before we see OLED.
OLED is the answer *only* if the OLED pixels are pulsed (rather than store-and-hold). The store-and-hold effect adds motion blur, even if response is 0ms.
 
2012 NeoPlasmas have at least 2000Hz LCD is a bad technology hop OLED Monitors will come out soon at a normal price.
NeoPlasmas are amazing in their response, but note -- at 2000Hz they still often repeatedly pulse the same pixel over the same timescale in order to eliminate flicker, so they don't necessarily achieve "CMR 2000" equivalence. I'm interested in proof (e.g. a high-speed 1/2000sec photograph of the display during a fast pan). If the pan isn't crystal sharp, it's repeating at least some frames within that 2000Hz, or it isn't interpolating enough frames, etc.

Different 600 Hz plasmas have very different motion blur, because their electronics do different things (e.g. better or poorer interpolation of frames, repeating refreshes to brighten the display because pulsing only once would make for a very dim display, etc.). From high-speed camera tests, plasma almost always wins over LCD, but the worst 600 Hz plasma (especially at its maximum brightness setting) has more motion blur than the best 240 Hz LCD, because of these factors. (Examples of the better LCDs for the motion blur metric: the Elite with scanning backlight, or the Sony XBR-55HX929 at its XR 960 setting during high-speed sports, etc.)
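A rough illustration (my own simplification, with made-up numbers) of why a high subfield/pulse rate alone doesn't guarantee low motion blur: what matters to a tracking eye is the time span over which light from the SAME frame is emitted, not how many pulses are used to emit it:

Code:
# Rough illustration only -- not a model of any specific plasma's drive scheme.
def effective_persistence_ms(frame_hz, pulse_spread_fraction):
    """pulse_spread_fraction: how much of the frame period the repeated
    pulses of one frame are spread across (1.0 = the whole frame period)."""
    return (1000.0 / frame_hz) * pulse_spread_fraction

# 60 Hz frames, pulses repeated across the whole frame period ("600/2000 Hz" drive):
print(effective_persistence_ms(60, 1.0))    # ~16.7 ms -> blur similar to sample-and-hold
# Same frame rate, but all the light concentrated into ~1/16 of the frame period:
print(effective_persistence_ms(60, 1/16))   # ~1.0 ms -> CRT-like motion clarity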
 
Sensitivity to flicker varies greatly from person-to-person. Personally I'm trying to avoid exactly what you describe here with black frame insertion and scanning backlights, since I know that even 420Hz backlight flicker is easily visible to me and will cause problems. I'll gladly accept some smearing in exchange for no eyestrain/migraines. Some studies have also shown that the human nervous system remains sensitive to flicker even after the frequency becomes too high to be visible.
You're certainly right that there are rare exceptions.
Plus, there are many indirect ways to detect flicker too -- I can actually detect whether a 1000Hz LED is flickering, but ONLY if I am rolling my eyes fast (electronics project, 555 timer, an oscilloscope showing a 1000Hz square wave, LED flickering at a 10% on / 90% off duty cycle). This is related to the "wagon wheel effect" or the "stroboscopic effect" (a spinning wheel looks stationary under a strobe light: the wheel might be spinning 100 times a second under a 100 Hz strobe; drop the strobe to 99 Hz and the wheel appears to creep forward once a second, raise it to 101 Hz and the wheel appears to creep backward once a second -- like a beat frequency effect). Rolling your eyes around really fast in front of a flickering source can show a dotted blur rather than a continuous blur. This is the stroboscopic effect, and in some situations humans have been shown to detect flicker beyond 1000Hz (indirectly).
--- This is the same as the "Rainbow Effect" that some people see when using DLP projectors -- if they roll their eyes, or there's fast-moving scenery (e.g. bright objects on dark backgrounds, high-contrast edges), the color components visually break apart, because each color is shown stroboscopically on 1-chip DLP projectors. These color fields cycle at hundreds of hertz.
(Note: I'm slightly sensitive to the rainbow effect. Not to the point where it hurts my eyes, though -- so I'm okay with DLP projectors with at least a 6X-speed color wheel)

So yes, you are right, too. Different people have different sensitivities to flicker. Including different sensitivities to stroboscopic effects, and the eyestrains related to them, too.
(Though my OP was about motion blur, not about flicker)

That said, I have no problem making these features an option for people who want to use them. Just make sure that the option to turn off black frames/scanning/motion interpolation remains.
Yes, exactly -- this accommodates people like you. I am fine with a scanning backlight and/or black frames once the native refresh is 120 Hz or beyond (because the flicker frequency will then be at least 120Hz), since I am far more sensitive to motion blur than to the flicker.
I anticipate that properly-designed gaming OLED monitors would have similar options (enabling/disabling black frames or other OLED-compatible technique)
 
Addendum --

(1) "Motion resolution" is a good test of motion blur:
http://news.cnet.com/8301-17938_105-10020262-1.html
This isn't a widespread test, but it is of great interest to me, as a recreational gamer sensitive to motion blur. As we already know, plasma displays (with their general 600 Hz subfield drive) score excellently in the motion blur metric, meaning they have "high motion resolution".

The best LCD's (e.g. $5,000 Elite LCD) have a motion resolution exceeding 1,000 in this same test, which is better than many cheap/poor plasmas. The unfortunate thing is that many cheap $600 plasmas still beat many $2000 HDTV LCD's (and 100% of all known LCD computer monitors), and we are still waiting for the Holy Grail of "Elite"-league computer monitors, as in my OP.

It shows that motion-resolution tests are pretty easy to make; and there are motion-resolution tests for computers.
I also wrote a motion-resolution test back in 1992 to prove the human eye could tell apart 30fps versus 60fps, called "MOTION", for a BBS. It's now found in the "Mini Graphics Demos" area of the old Nostalgia Programming section of my ancient website (requires MS-DOS; it still kind of works in DOS emulators that have access to the graphics card's VSYNC, and also works in Windows XP full-screen DOS, but not windowed DOS)...
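For anyone who wants to try something similar today, here is a minimal modern analogue (NOT the original MOTION program -- just a quick sketch using pygame, assumed installed): an object moving at a constant speed, drawn at a selectable frame rate, so you can compare the smoothness/blur while tracking it with your eyes:

Code:
# Minimal modern analogue of a motion test -- not the original 1992 program.
import pygame, sys

FPS = 60            # try 30 vs 60 (or 120 on a 120 Hz monitor)
pygame.init()
screen = pygame.display.set_mode((800, 200))
clock = pygame.time.Clock()
x = 0.0
while True:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            pygame.quit(); sys.exit()
    x = (x + 480.0 / FPS) % 800          # constant 480 px/s regardless of FPS
    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (255, 255, 255), (int(x), 50, 20, 100))
    pygame.display.flip()
    clock.tick(FPS)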

(2) I've made a new thread about OLED motion blur, since there's a widespread misconception about OLED being the holy grail. It's cheaper to make an LCD computer monitor with a scanning backlight, so let's target the LCD computer monitor manufacturers FIRST -- actual tests show that a "scanning-backlight" LCD can beat a regular OLED in motion sharpness (even though OLED is going to be better in many other ways, such as color and contrast).
New thread: http://hardforum.com/showthread.php?t=1711833
 
I'd like to chime in with this:

If you intend to buy a plasma to play 30FPS limited console games, don't do it unless you are sure you can turn on motion interpolation (on some Samsung plasmas you can't)... because the resulting judder will be incredibly bad, to the point of being unplayable (flickering double images like mad). Now at 60FPS, the judder is non-existent on the plasma (same with blur), and no motion interpolation is needed. But VERY FEW console games besides Rage can lock in 60FPS.

What you really want is a high-end LCD with 240Hz native refresh and, yes, one of these newfangled advertising acronyms like CMR being as high as possible... i.e. 960. Yes, you might have some lag issues. But this only really affects online FPS (since music games let you account for lag). The horrible judder, however, affects almost ALL console games.

So do yourself a huge favor and buy an LCD and put on all the motion smoothing, anti-judder tech you can. Your eyes will really thank you for it. I learned the hard way. Of course, if you intend to buy the TV only for PC gaming and can lock in a guaranteed 60FPS with a high end card, plasma is the better choice.
 
ElecktroDragon,

Good idea, what you describe is a good "proof of concept" to prove that a market exists for such a computer monitor.

However, I prefer no added input lag over current technology. It's possible by following these steps:
1. Use 120Hz native input. This technology already exists, via 120Hz computer monitors.
2. Use scanning backlight to enhance 120Hz to 480Hz. This technology already exists, via high end HDTV.
3. Skip motion interpolation; the technology is (usually) unwanted in a computer monitor for gaming.
4. You gain "CMR 480" with NO EXTRA INPUT LAG over native refresh rate.

I want to eliminate motion interpolation from the equation; computer monitors exclude motion interpolation due to lag. But a lot of the "enhancements" (excluding motion interpolation) found in these high-end HDTVs can be taken advantage of in computer monitors. The tech exists today; somebody just needs to bring it to market.

It is possible to do so without adding extra input lag, simply by excluding the one technology (motion interpolation) that cannot avoid adding input lag. A scanning backlight, however, can be achieved without adding input lag: the backlight in each area of the LCD is controlled in sync with that area's pixel refresh, and because LCD pixels take a finite time to finish their transition (e.g. 2ms) while fast electronics can switch the backlight essentially instantly, the backlight strobe simply follows the panel's own scanout -- VOILA -- a scanning backlight without *added* input lag!
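Here is a timing sketch of that idea (illustrative numbers only -- the settle time, strobe length and segment count are assumptions, not a real controller's firmware): each segment is strobed within the SAME refresh, right after its rows have finished transitioning, so nothing waits a whole frame:

Code:
# Timing sketch only -- illustrative numbers, not real controller firmware.
REFRESH_HZ      = 120
SEGMENTS        = 8
PANEL_SETTLE_MS = 2.0      # assumed LCD pixel response time
STROBE_MS       = 1.0      # ~1 ms flash per segment -> roughly "CMR 960"-like

frame_ms   = 1000.0 / REFRESH_HZ              # ~8.33 ms per refresh
segment_ms = frame_ms / SEGMENTS              # scanout time per segment

for seg in range(SEGMENTS):
    scanout_done = (seg + 1) * segment_ms     # when this segment's rows are written
    strobe_start = scanout_done + PANEL_SETTLE_MS
    print(f"segment {seg}: rows written by {scanout_done:5.2f} ms, "
          f"strobe {strobe_start:5.2f}-{strobe_start + STROBE_MS:5.2f} ms")

# The last strobe lands only ~2-3 ms after the end of scanout -- well under one
# frame -- so the scanning backlight adds no whole-frame latency of its own.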

The graphics card would not need to work harder; it would just output only 120fps, just like for 120Hz monitors -- so computer requirements are not increased. The scanning backlight would do the motion enhancement to produce motion blur elimination equivalent to "480Hz".

Note -- I am a software developer (programmer) and have programmed algorithms for video (google "Mark Rejhon 3:2 pulldown" or "Mark Rejhon dScaler"). More than ten years ago, I created the world's first open-sourced 3:2 pulldown deinterlacing algorithm, which was brought into dScaler, and similar algorithms are now taken advantage of in other software such as ffdshow, etc. That was back in the day when "Faroudja" was the gold standard in deinterlacers and computers were only 400MHz -- barely fast enough to do basic deinterlacing using Pentium MMX assembly language; but the dScaler team (including myself) showed that it could be done 100% in software, processing pixels completely in software in real time at 640x480x30fps deinterlaced (or, interpreted another way, 640x240x60 "fields" per second from the original interlaced signal) on just a 400MHz CPU back in 1999-2000!

(BTW, I'm talking to you, big and small vendors -- whether big ones like Samsung or small ones like Catleap or Overlord -- consider a "480Hz" monitor. And make it configurable/adjustable via the OSD menu. I am willing to pay extra for such a computer monitor.)
 
Update to Recommended Exercise for Hard-Core Modders
I did some research, and apparently, it's cheaper than expected to do a scanning backlight now.
A $35 Arduino and about $40 of LEDs, an appropriate power supply, plus a few other parts can be used to create a home-brew scanning backlight (modding a computer monitor) -- if we don't care about local dimming. See this thread:
http://hardforum.com/showthread.php?t=1716564

Based on this new research, a display manufacturer can build this technology into a computer monitor for less than $100 additional cost -- and the vast majority of that cost is LED's.
 