Hello,
Short form question: When will computer monitor makers produce computer monitors with a CMR of 480? (480Hz-like)
The top-end HDTV's are already doing this, running at "960 Hz equivalence" -- or rather, CMR 960 for Samsung models.
(CMR = Clear Motion Rate = an effective refresh rate derived from how long the same pixel stays continuously illuminated, taking into account everything: panel latency, black frame insertion, scanning backlight, frame interpolation, etc. It's a marketing *gimmick* from Samsung, but since there's no other good name elsewhere in the industry, let's use "CMR" just for convenience's sake. If we wish, we can co-opt or invent a brand-neutral term. A need exists for a standard way of rating motion-blur equivalence that takes *everything* into account (black frame insertion if any, existence/lack of frame repetitions, constant backlight vs. scanning backlight if any, frame interpolation if any, actual native panel refresh, etc.), and so far, it seems "CMR" comes closest to that kind of measurement standard.)
Long form of question:
Have any monitor manufacturers considered black frame insertion or a scanning backlight, combined with 120Hz, for reducing motion blur even further? CRT-quality reduction of motion blur doesn't happen until each rendered pixel illuminates for only 1/1000th of a second -- either via 1,000Hz refresh (impractical), via frame interpolation (laggy), via long black frames between refreshes, or via a scanning backlight.
The best (e.g. $3000+) HDTV's use many techniques (e.g. scanning backlights) to achieve approximately 1 millisecond of pixel illumination time. This is different from pixel response (a sample-and-hold display illuminates the same pixel continuously for the full refresh). Those displays are often rated "Clear Motion Rate 960", which is like 960Hz, or a 1/960sec sampling time (very close to the ~1msec of CRT phosphor illumination, for CRT-style motion quality). The use of black frame insertion (which would no longer flicker noticeably beyond 120Hz) or a scanning backlight would eliminate the lag problems caused by the frame interpolation used in HDTV displays. (Scanning backlights in high-end home theater HDTV's simulate the scanning illumination of a CRT and more closely approximate phosphor decay, allowing LCD pixels to be quickly de-illuminated between refreshes, for sharper motion -- less motion blur -- than is otherwise possible.)
I personally would like to see this technology (e.g. scanning backlights) reach computer monitors sometime, and I'd be glad to pay about $500-$800 for a 27" 1440p monitor (120Hz) with a Clear Motion Rate of 480 or 960.
NOTE: Before you get in a huff about 60/120/240/480/960 -- yes, there is a point of diminishing returns -- consider a human eye tracking across a rapidly-panning scene. The human eye CANNOT tell apart the flicker, but the MOTION BLUR *does* sharpen progressively as you go to higher refresh rates (or higher CMR's).
For EXAMPLE SCENARIOS:
1. Television set showing a "fast pan" in hockey game / football game (using HDTV cameras with 1/1000sec shutter speed setting)
*or*
2. Computer monitor with a high-sample-rate mouse (e.g. 1000Hz mouse) turning fast on the spot in an FPS game, for the "fast pan"....
Now assume the fast panning scene is moving across the screen at 1 inch every 1/60th of a second (e.g. taking 1 second to cross the width of a large-screen HDTV, one that's approximately 60 inches wide). Let's say your eye is tracking a sharp object during the screen pan. So, strictly by the numbers:
At 60Hz, the motion blur is 1" thick (entry level HDTV's, regular monitors)
At 120Hz (or CMR 120), the motion blur is 0.5" thick (120Hz computer monitors)
At 240Hz (or CMR 240), the motion blur is 0.25" thick
At 480Hz (or CMR 480), the motion blur is 0.125" thick
At 960Hz (or CMR 960), the motion blur is 0.0625" thick (CRT style, high end HDTV's)
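The scaling in that list is just division: blur width equals pan speed multiplied by per-frame illumination time. Here's a minimal Python sketch of the arithmetic (the function name is mine, purely illustrative; it assumes an eye-tracked pan on a sample-and-hold display):

```python
PAN_SPEED_INCHES_PER_SEC = 60.0  # 1 inch per 1/60 sec, as in the example above

def blur_width_inches(effective_hz):
    """Blur trail width for an eye tracking a pan at the given effective rate (Hz or CMR)."""
    return PAN_SPEED_INCHES_PER_SEC / effective_hz

for hz in (60, 120, 240, 480, 960):
    # e.g. 60Hz -> 1.0000" blur, 960Hz -> 0.0625" blur
    print(f'{hz:>4} Hz (or CMR {hz}): {blur_width_inches(hz):.4f}" blur')
```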
NOTE: Replace "Hz" with "CMR" for those displays that use black frame insertion or scanning backlights; the science is similar. Also, for video on television, it is assumed the camera shutter is faster than the refresh rate, in order to see the motion blur benefit. For games, it is assumed the computer mouse sample rate is higher than the refresh rate, and that the graphics card is able to max out the native refresh rate (e.g. maxing out at 120Hz is all that's needed to benefit from CMR 960 when using a scanning backlight), in order to see the motion blur benefit.
NOTE2: CRT's generally have a CMR very similar to CMR 960, because the phosphor mostly decays within approximately 1 millisecond (and more fully by 2 milliseconds).
NOTE3: There are many variables that fudge the numbers, such as panel response speed. Many panels are limited to 2ms (which maxes out at roughly CMR 500 equivalence), requiring black frames or scanning backlights to go beyond the LCD response rate (e.g. getting 960Hz-like lack-of-motion-blur clarity out of a 2ms or even 4ms+ panel). This will often make the actual measured "CMR" (photography-measured) differ from the mathematically computed "CMR" -- e.g. a CMR 960 display actually only measuring ~CMR 500 equivalence in a scientific high-speed-camera test. But let's assume all the bottlenecks are sufficiently resolved (e.g. a good scanning backlight eliminating panel response from being a factor).
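The conversion between pixel illumination time and a CMR-style equivalence in these notes can be sketched in a couple of lines -- a minimal sketch, assuming the equivalence is simply the reciprocal of illumination time (function name is mine, illustrative only):

```python
def cmr_equivalent(illumination_ms):
    """CMR-style equivalence, assuming it is simply 1000 / (pixel illumination time in ms)."""
    return 1000.0 / illumination_ms

# A 2ms panel response floor caps the measured equivalence near CMR 500:
print(cmr_equivalent(2.0))  # 500.0
# A ~1ms CRT-like phosphor illumination lands in "CMR 960" territory:
print(cmr_equivalent(1.0))  # 1000.0
```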
Granted, most FPS players (and even many competition FPS players) do not try to identify background objects (e.g. small snipers) while spinning fast, and thus can't tell apart 60Hz vs 120Hz -- while other players play single-player FPS to enjoy the graphics, where the motion blur becomes pretty apparent. There's an obvious point of diminishing returns. But it is interesting to note that the difference between 60Hz and 120Hz (0.5" less motion blur) is similar to the difference between 120Hz and 480Hz (0.375" less motion blur). Obviously, some video game players (like me) really notice the improvement when trying to track small fast-moving objects. So comparing 60Hz versus 480Hz for the same panning scene, you're comparing 1" of motion blur versus 0.125" of motion blur -- an 87.5% reduction in motion blur! I'm not a competition FPS gamer, but I now understand why even some competition FPS gamers still prefer CRT over today's 120Hz (translation: they're not as crazy as you think -- they're benefitting from reduced motion blur). To others, it's just a gimmick, but the science of motion blur is real & measurable, and detectable by the human eye (even by videophiles enjoying CMR 960 (960Hz-style) HDTV television sets).
Obviously, smaller TV's will not benefit as much as bigger HDTV's, because the motion blur is less pronounced. On a bigger HDTV, a movement of 1 inch per 1/60sec takes 1 second to travel from one edge of the screen to the other, so you have enough time to detect the sharpness of the pan by tracking a single object (e.g. golf ball, hockey puck, soccer ball) across the screen. On a smaller computer monitor, at half the width, you may have only 0.5 second. So instead of CMR 960 being the final point of diminishing returns, as it mostly is for HDTV sports (960Hz-like picture), it would be more like CMR 480 being the final frontier for computer monitors.
...And scientific tests on sports on the high end HDTV's (CMR 960), already show that it's not baloney (e.g. it's actually worthwhile, since some people are actually sensitive to it)
Now, my question is....
When do you think computer monitor manufacturers will finally begin to introduce scanning backlights into computer monitors?
480Hz-style would be fine (CMR 480 achieved using 120Hz+scanning backlight) -- CMR 960 is overkill unless you have a 60" display.
Mathematically, 480Hz would have 87.5% less motion blur than 60Hz (whereas 120Hz only has 50% less motion blur than 60Hz).
(Note: It's also confirmed by human eyes -- several of us, myself included, are able to tell the difference in motion blur between 120Hz and 480Hz (via CMR 480 or 480Hz interpolation) during super-fast pans on some of these expensive HDTV displays at home theater showrooms, even in a random blind test where somebody toggles the setting on the HDTV without us looking.)
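For completeness, those percentage figures reduce to one line of arithmetic, assuming blur scales as 1/Hz as in the table above (function name is mine, illustrative only):

```python
def blur_reduction_pct(base_hz, new_hz):
    """Percent reduction in motion blur when going from base_hz to new_hz (blur ~ 1/Hz)."""
    return (1.0 - base_hz / new_hz) * 100.0

print(blur_reduction_pct(60, 120))  # 50.0
print(blur_reduction_pct(60, 480))  # 87.5
```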
Thanks,
Mark Rejhon