LCD can be better than OLED in motion blur. (actual test)

mdrejhon

Some people are saying that 120Hz OLED is the ultimate solution....
(Related thread -- http://hardforum.com/showthread.php?t=1711216 -- which also mathematically explains why 480 Hz has 87.5% less motion blur than 60 Hz, while 120 Hz has only 50% less motion blur than 60 Hz, so it's still beneficial to keep pressing past 120 Hz despite the point of diminishing returns.)
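
(Quick arithmetic, assuming motion blur on a sample-and-hold display scales with how long each frame persists: 60 Hz holds each frame for 1/60 s = ~16.7 ms; 120 Hz holds for ~8.3 ms, which is 50% less; 480 Hz holds for ~2.1 ms, which is 1/8 of 16.7 ms, i.e. 87.5% less.)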

Tests were done by home theater people on an OLED HDTV, using a moving motion-blur test pattern from a test pattern Blu-ray disc.

OLED can be worse than plasma and high-end LCD (if the OLED does not use black frames) in a motion resolution test:
Quote from VierraFan at http://www.avforums.com/forums/plasma-tvs/1641343-plasma-vs-oled-motion-resolution.html
So, here is the result.

A measurement (of a sort) was performed using a 60 fps clip in which the test pattern moved at 6.5 pixels per frame -- as far as I know, these are the 'standard' conditions for these tests, and using a similar clip on plasma, I got results similar to those in TV reviews.

The result is a bit of a disappointment. On plasma, it was about 900 lines with IFC off. The AMOLED managed only slightly more than 350, not much better than LCD, which is around 300 (note: the pattern is meant for an HD display, so it gave the result as for an HD display, although this screen's resolution is 800x480). Between 350 and 700, our eyes see 5 lines instead of 4 -- similar to my LCD monitor, just sharper, because of this screen's much better response time.

So, why is this?

This display is, like LCD, a continuous type -- the picture is displayed the whole time. It turned out that it has a 60 Hz refresh frequency, just like my PC monitor, meaning that the state of the pixels changes 60 times per second, from top to bottom (except that this screen has a portrait orientation, so when turned 90 degrees into landscape, it's from left to right), and 1/60 s is needed to change all pixels (after that, the refresh of the next frame begins). The difference is that on my LCD monitor, pixels need about 7 ms on average to change state after a change of voltage. Here, it's much shorter -- well under 1 ms.

Unfortunately, it's still a continuous-type display, unlike plasma and CRT, which are pulse-type displays. So the motion resolution perceived by our eyes still isn't much better than on LCD.

However, there is good news -- on TV sets, I suppose higher refresh frequencies will be used, like 100/120 or 200/240 Hz or even higher (depending on technical limitations). They are also needed for 3D, which I suppose is a must for a TV at that price level. Since the response time is very short, it will allow insertion of black frames, similar to a scanning backlight on LCD sets, turning the display into a pulse type, which will increase perceived motion resolution. Perceived motion resolution depends on the ratio between the time the picture is displayed on the screen and the time the screen is dark (assuming the framerate is the same), so with this ratio high enough, it can be better than plasma and as good as CRT.

However, there is also bad news. Black frame insertion will introduce flicker. To combat this, I suppose manufacturers will use 'double scanning', showing 50 Hz material at 100 Hz and possibly 60 Hz material at 120 Hz. But that will introduce judder (double edges), popularly known as the 50 Hz (and possibly 60 Hz) bug. To combat that, frame interpolation (IFC) will be used, introducing the soap opera effect. There will always be some compromise here.

Nothing new -- we are already familiar with these things.
[snip]
End of quote from VierraFan on http://www.avforums.com/forums/plasma-tvs/1641343-plasma-vs-oled-motion-resolution.html
See? Plasma is roughly 3 times better than a store-and-hold OLED.

OLED isn't, by itself, the solution to motion blur, UNLESS it pulses the pixels rather than storing-and-holding them. (Plasma scored about 900 lines of "motion resolution" versus about 300 for the OLED, from the moving-test-pattern motion test on the test pattern Blu-ray disc.)

Scanning backlights on LCD TVs are cheaper, and they make motion much better than OLED (without black-frame insertion) at a far lower cost.
(Example: the $5000 Elite brand LCD HDTV for home theater -- www.elitelcdtv.com -- has a motion resolution of over 1000 lines from the SAME motion test pattern testing that gave only a 300 rating to the OLED. That's a rare LCD that is better than many plasmas AND OLED, in ACTUAL tests. Yes, LCD can beat plasma, in the "top 1%" -- and no LCD computer monitors exist with this technology yet, as far as I know. Unfortunately, it requires $5000 of technology to make it happen.)

Let's press LCD computer monitor makers FIRST to add a scanning backlight (which greatly helps overcome the blur introduced by the LCD's response rate and store-n-hold effect). These high-end LCD computer monitors could perhaps cost only $1,000. (Imagine: a good $400 Catleap-like monitor with $600 of scanning backlight technology, for example -- manufacturable at today's prices.) They would bring plasma-quality motion resolution to your computer desktop, today, with today's technology. Let's press computer monitor makers to do it now! It's cheaper than OLED, anyway -- the technology is already cheap and profitable enough, even if the scanning backlight costs more than the LCD panel itself. Some of us are willing to pay Alienware prices for the luxury of 90% less motion blur on our computer desktops in a flat panel!

Look at the prices of OLED displays; they're almost always more than the LCD HDTVs with scanning backlights. Lower-end versions of scanning backlights have finally reached some of the "medium-high-end" $2000-$2500 (approx.) HDTVs, even though they aren't as well-tuned as the high end. A much smaller scanning backlight (e.g. only 24" or 27" diagonal) should be much cheaper to manufacture than those, and we can settle for CMR 480 (good enough to nearly equal a well-tuned 600 Hz plasma).

Local dimming is a bonus (it often accompanies scanning-backlight technology in high-end LCD HDTVs), since it adds roughly a couple of orders of magnitude of contrast within the same image (e.g. over 100,000:1 contrast ratio in the *same* frame) by darkening the LED backlight in the areas of the panel that are dark. This is found in high-end home theater LCD HDTVs but not in consumer computer monitors.
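
(Rough arithmetic behind that figure, assuming zone dimming multiplies the panel's native contrast: a 1000:1 native LCD whose backlight zones can dim to 1/100 of full output yields 1000 x 100 = 100,000:1 between the brightest and darkest areas of the same frame.)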

So you can see -- it should already be possible to build a $1000 LCD computer monitor that has better motion resolution than an average OLED and plasma.

Note -- motion resolution testing on computer monitors: someone needs to produce a better motion resolution test program than the existing motion tests, for today's 120 Hz era, since monitor manufacturers should use a 120 Hz PC-generated motion resolution test pattern rather than testing with the 60 Hz Blu-ray disc (we aren't going to be using frame interpolation for PC output). Any novice DirectX programmer could write a motion resolution test pattern in less than 5 days of programming work in C/C++ with the DirectX APIs -- all you do is bit-blit a high-resolution test pattern bitmap, scrolling it horizontally every frame, every VSYNC, preferably at high panning speeds simulating the speed of fast turns in FPS videogames. (Adjustable test pattern speed is useful.)
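
To make that concrete, here's a minimal C/C++ sketch of the core loop (my own illustration, assuming a Win32 + Direct3D 9 toolchain; the window size, bar pattern, and pan speed are arbitrary choices, and error checking is omitted -- it's the idea, not a finished benchmark):

Code:
// motiontest.cpp -- sketch of the scrolling-test-pattern idea above.
#include <windows.h>
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

static const int W = 1280, H = 720;    // test window size (arbitrary)

static LRESULT CALLBACK WndProc(HWND h, UINT m, WPARAM w, LPARAM l) {
    if (m == WM_DESTROY) { PostQuitMessage(0); return 0; }
    return DefWindowProcA(h, m, w, l);
}

int main() {
    WNDCLASSA wc = {0};
    wc.lpfnWndProc = WndProc; wc.hInstance = GetModuleHandleA(0);
    wc.lpszClassName = "motiontest";
    RegisterClassA(&wc);
    HWND hwnd = CreateWindowA("motiontest", "Motion resolution test",
        WS_POPUP | WS_VISIBLE, 0, 0, W, H, 0, 0, wc.hInstance, 0);

    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferWidth = W; pp.BackBufferHeight = H;
    pp.BackBufferFormat = D3DFMT_X8R8G8B8;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;   // lock to VSYNC
    IDirect3DDevice9* dev = 0;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
        D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &dev);

    // Build the test pattern once: bands of 1- to 4-pixel line pairs
    // (a disc-style pattern uses groups of varying pitch like this).
    IDirect3DSurface9* pat = 0;
    dev->CreateOffscreenPlainSurface(W, H, D3DFMT_X8R8G8B8,
        D3DPOOL_DEFAULT, &pat, 0);
    for (int i = 0; i < W; i++) {
        int pitch = 1 + (i / 160) % 4;
        RECT r = { i, 0, i + 1, H };
        dev->ColorFill(pat, &r, ((i / pitch) & 1) ? 0xFFFFFFFF : 0xFF000000);
    }

    int x = 0; const int speed = 7;  // px/frame; odd, so every band moves
    for (MSG msg;;) {
        while (PeekMessageA(&msg, 0, 0, 0, PM_REMOVE)) {
            if (msg.message == WM_QUIT) return 0;
            DispatchMessageA(&msg);
        }
        IDirect3DSurface9* back = 0;
        dev->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &back);
        // Blit the pattern in two wrapped slices so it scrolls seamlessly.
        RECT s1 = { x, 0, W, H }, d1 = { 0, 0, W - x, H };
        RECT s2 = { 0, 0, x, H }, d2 = { W - x, 0, W, H };
        dev->StretchRect(pat, &s1, back, &d1, D3DTEXF_NONE);
        if (x) dev->StretchRect(pat, &s2, back, &d2, D3DTEXF_NONE);
        back->Release();
        dev->Present(0, 0, 0, 0);    // returns after the next VSYNC
        x = (x + speed) % W;
    }
}

At 120 Hz, speed 7 is an 840 pixel/sec pan; photographing the moving bands at a known shutter speed then shows directly how many line pairs survive the motion.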

The world's first "Alienware league" 24" or 27" LCD computer monitor that can simulate 480 Hz (or better) using 120 Hz native + local dimming + a scanning backlight (or black frame insertion with no more flicker than 120 Hz flicker), with configurable options (the ability to turn the scanning backlight and black frames on/off), costing $1000 or less, will get my money -- we need such a monitor to bring CRT-quality freedom from motion blur to an LCD display. (But it must exclude motion interpolation; that adds too much lag.) (And of course, we still get to keep 3D as a nice-to-have feature, too.)
The technology already exists in a few models of $5000 home theater LCD TVs (and even that price is cheaper than many OLEDs of the same size).

OLED can be better in many ways such as color and contrast...
But LCD with a *good* high-end scanning backlight, can be better AND CHEAPER than OLED, while being almost as good in contrast.

So -- *logically* -- we should target the LCD computer monitor manufacturers first, to get closer to the "motion blur" holy grail.
A good first target would be Samsung -- they already make HDTVs with scanning backlights (achieving "960 Hz"-like CRT operation, or "CMR 960"), so they just need to transfer the technology to their "high end monitor" department (sans the motion interpolation, of course).
Scanning backlights allow LCD panels to have CRT-quality motion sharpness.
Samsung employees -- are you reading this? :)
Give your bosses a link to this thread; maybe you need to create a high-end monitor department to target the Alienware-and-up crowd.
Or somebody else -- partner up with a Korean/Chinese manufacturer -- and start a Kickstarter project for the "CRT-quality LCD" (which IS possible now).
 
This is always going to be a compromise I think...

For gaming and movies, there is merit to reducing how long each frame is shown for, but for general desktop use and eye comfort, continuous will always win. There are people today who shy away from purchasing certain LCDs because of low PWM duty cycles on 180 Hz backlights due to the induced flicker.

Back in the CRT days, with the super fast phosphor illumination times, flickering was pretty terrible, but we got around it by pushing the refresh rate. I think 120 Hz (or some other suitably high standard) will need to come first before this shortening of the frame display time catches on. People, myself included, would not want to see this at 60 Hz.

This is all mainly within the context of gaming, of course, as this topic is not so clear cut with film.
 
This is always going to be a compromise I think...
Agreed.

For gaming and movies, there is merit to reducing how long each frame is shown for, but for general desktop use and eye comfort, continuous will always win. There are people today who shy away from purchasing certain LCDs because of low PWM duty cycles on 180 Hz backlights due to the induced flicker.
Agreed, but such a high-end monitor can be made *configurable*. You could turn black frame insertion and the scanning backlight on/off, so you'd still get full store-n-hold behavior at the desktop. Smart computer monitors could even switch modes automatically whenever a game starts -- via some kind of signal from the PC (perhaps a new signal over DPMS, or via a system tray utility, such as building it into ATI CCC or PowerStrip, etc.).
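
As a sketch of the PC-side half of that idea: Windows already exposes DDC/CI monitor controls through the Monitor Configuration API (dxva2.dll), so a tray utility could flip a manufacturer-defined control whenever the foreground app goes fullscreen. Note the VCP code 0xE0 below is purely hypothetical -- no shipping monitor is known to map it to a scanning backlight -- and the fullscreen check is deliberately simplistic:

Code:
// tray_switch.cpp -- hypothetical "game mode" toggler over DDC/CI.
#include <windows.h>
#include <physicalmonitorenumerationapi.h>
#include <lowlevelmonitorconfigapi.h>
#pragma comment(lib, "dxva2.lib")

static const BYTE VCP_GAME_MODE = 0xE0;  // made-up manufacturer VCP code

static bool ForegroundIsFullscreen(HMONITOR mon) {
    HWND fg = GetForegroundWindow();
    if (!fg) return false;
    RECT w; GetWindowRect(fg, &w);
    MONITORINFO mi = { sizeof(mi) };
    GetMonitorInfoA(mon, &mi);
    return EqualRect(&w, &mi.rcMonitor) != 0;  // window covers the monitor
}

int main() {
    bool gameMode = false;
    for (;;) {
        HMONITOR mon = MonitorFromWindow(GetForegroundWindow(),
                                         MONITOR_DEFAULTTOPRIMARY);
        bool fs = ForegroundIsFullscreen(mon);
        if (fs != gameMode) {
            DWORD n = 0;
            PHYSICAL_MONITOR pm;
            if (GetNumberOfPhysicalMonitorsFromHMONITOR(mon, &n) && n &&
                GetPhysicalMonitorsFromHMONITOR(mon, 1, &pm)) {
                // 1 = scanning backlight/BFI on, 0 = store-and-hold
                SetVCPFeature(pm.hPhysicalMonitor, VCP_GAME_MODE, fs ? 1 : 0);
                DestroyPhysicalMonitors(1, &pm);
                gameMode = fs;
            }
        }
        Sleep(1000);  // poll once per second; cheap enough for a tray app
    }
}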

It's just a full matrix array of high-speed LEDs being used as a backlight, which is what makes local dimming and a scanning backlight possible. So just keep the LEDs illuminated, store-and-hold style, instead (many of these are active matrixes requiring a refresh even to turn the LEDs off), since you need to keep illuminating the LEDs anyway for maximum panel brightness. (Though high-end LED displays at maximum brightness can often be too bright for my eyes; there's plenty of spare brightness to use up for 3D and/or local dimming.)

The same goes for other technologies like OLED, too -- motion blur elimination via black frame insertion (with the same flicker compromises).

Back in the CRT days, with the super fast phosphor illumination times, flickering was pretty terrible, but we got around it by pushing the refresh rate. I think 120 Hz (or some other suitably high standard) will need to come first before this shortening of the frame display time catches on. People, myself included, would not want to see this at 60 Hz.
I totally agree. Black frames and scanning backlights should only be combined with 120 Hz+. 120 Hz is a good compromise: it is high enough to eliminate most flicker for most people, while being low enough that today's graphics cards can actually sustain it (especially with SLI, and by turning off certain GPU-hungry settings such as soft shadows and/or 4x+ FSAA).

120 Hz computer monitors are finally here today, so we've finally solved that piece of the problem -- now the road is clear for a premium maker to add a scanning backlight. But 120 Hz vs 60 Hz only reduces motion blur by 50%; I'd say a monitor maker should be convinced to show off a more dramatic improvement by eliminating 87.5% of motion blur via simulated 480 Hz.

This is all mainly within the context of gaming, of course, as this topic is not so clear cut with film.
Agreed. I hate motion interpolation too. That said, such a monitor should be configurable (e.g. a "Game Mode"). I'll pay an Alienware-league price for that. The technology is already here today; it just needs to arrive in an enthusiast-grade computer monitor.
 
You make a good point with these things being configurable. Like a lot of things, having that flexibility is probably the best solution, and something that OLEDs will make possible given the right control circuitry.
 
You make a good point with these things being configurable. Like a lot of things, having that flexibility is probably the best solution, and something that OLEDs will make possible given the right control circuitry.
Bingo -- yes.
However, it's cheaper to do *exactly* the same thing by using a scanning backlight with a 120 Hz LCD.

A pulse-modulated (black frame insertion) OLED would have the same flicker problem as a scanning-backlight LCD
...so, why not convince the computer monitor makers to do the cheaper route first, within 6 to 12 months (tech is already here today)
-- rather than waiting 5 years for the same OLED equivalent??

Cheap OLEDs with pulse modulation/black-frame insertion don't seem to exist yet (as far as my research could find -- I could be wrong -- tests show that many are continuously illuminated), and early OLEDs have to store-n-hold because OLED pixels aren't bright enough for black frame insertion without severe degradation in brightness. Cheaper OLEDs often already have brightness problems with bright images (some of them dim, plasma-style).
In contrast, scanning backlights are already a few-years-old technology in the ultra-expensive LCD HDTVs.
(OLED has other obvious benefits -- contrast ratio, color quality, etc. -- but unfortunately motion blur IS NOT CURRENTLY an OLED advantage (YET), because of the above.)

The existence and evolution of LCD computer monitors with scanning LED backlights (LEDs are already bright enough, too) will, over the long term, help pave the way toward convincing manufacturers to introduce OLED computer monitors with the proper circuitry (black frames) for an enhanced gaming experience. For LCD monitors with scanning backlights, makers can also advertise the enhanced contrast such an active backlight array enables -- local dimming can be designed flicker-free, too -- so (configurable) motion blur elimination isn't the only benefit.

We gotta pave the way for a "480 Hz" desktop gaming experience (via LCDs using 120 Hz + scanning backlights), and the cheapest and quickest 24"/27" desktop technology to introduce today, manufacturable now (as of mid-2012), is LCD with a scanning backlight. The BOM (Bill of Materials) is low enough to go under $1000 now, with parts already manufactured and priced today. No new custom chips needed. The $1000 "luxury" price is within reach of the Alienware audience.

Time to market: six months. This could be designed and introduced to market in just six months -- the technology is much easier to design and manufacture than a new cellphone or new motherboard. Both Sony and Samsung already have the parts and the engineers, thanks to their top-end HDTVs; just remove the motion interpolation. You can still get to "480 Hz" using only minor modifications of the firmware already built into LCD HDTVs doing "960" simulation via scanning backlight. High-end HDTVs (e.g. Sony with XR 960, Samsung with CMR 960, or an Elite HDTV) often create an internal 240 Hz via motion interpolation (ugh!!) and pump that up to a simulated "960 Hz" via a scanning backlight at 4:1 -- so all you need to do is apply the same 4:1 enhancement to a 120 Hz computer monitor signal to gain "480 Hz"-like motion quality. You do lose 75% of the brightness with a scanning backlight doing 4:1 enhancement, but just add brighter LEDs -- many LED computer monitors are already too bright at their brightest settings anyway. Saying goodbye to the motion interpolation also means saying bye-bye to most of the display lag; we don't need motion interpolation added to the monitors.

There would be little software programming work for Samsung in creating the world's first "480 Hz" LCD (CMR 480) computer desktop monitor, since they've already developed the technology in their top-model LCD HDTVs. Add proper settings to turn the backlight enhancement features on/off (e.g. local dimming; scanning backlight modes of 2:1, 4:1, and always-on; etc.). The hardware is already invented (120 Hz panels, LEDs, local dimming electronics). Software development costs stay low by recycling the firmware driving their best HDTVs and transferring the tech to their best computer monitors. They'll really need an enthusiast branding (e.g. Samsung's own "Alienware"-style brand, etc.) because it'll be quite expensive, but if they can make it for, say, $899 or $999 -- I'm standing in line!
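
For the arithmetic behind that 4:1 figure, here's a tiny C/C++ sketch that prints the strobe schedule of a hypothetical 4-segment scanning backlight driven by a 120 Hz signal (the segment count and timings are illustrative, not taken from any shipping product):

Code:
#include <cstdio>

int main() {
    const double refresh_hz = 120.0;
    const int    segments   = 4;                   // backlight strips, top to bottom
    const double frame_ms   = 1000.0 / refresh_hz; // 8.33 ms per 120 Hz frame
    const double strobe_ms  = frame_ms / segments; // each strip lit 2.08 ms

    std::printf("Frame: %.2f ms, per-segment strobe: %.2f ms\n",
                frame_ms, strobe_ms);
    std::printf("Persistence ~%.2f ms, i.e. \"%.0f Hz\"-like motion clarity\n",
                strobe_ms, 1000.0 / strobe_ms);
    std::printf("Duty cycle %d%% -> %d%% brightness loss without brighter LEDs\n",
                100 / segments, 100 - 100 / segments);
    for (int s = 0; s < segments; s++)             // when each strip lights up
        std::printf("Segment %d: ON at %.2f ms, OFF at %.2f ms\n",
                    s, s * strobe_ms, (s + 1) * strobe_ms);
    return 0;
}

It prints ~2.08 ms of persistence ("480 Hz"-like) at a 25% duty cycle -- which is exactly the 75% brightness loss mentioned above.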
 
Tests were done by home theater people on an OLED HDTV, using a moving motion-blur test pattern from a test pattern Blu-ray disc.


I followed your link, and he says he is testing a plasma TV against a smartphone with an 800x480 AMOLED screen, not an OLED HDTV. So you need to realize there are all kinds of potential failure points in that kind of flimsy comparison.

I'm currently trying to make some measurements on a Samsung Galaxy S II OLED screen....

==================================================

DISCLAIMER: I'm aware this is a comparison of a MOBILE PHONE and a DEDICATED DISPLAY DEVICE, which is very questionable. Also, like my previous experiments with a plasma screen (a dedicated display device), this isn't intended to be a test -- I'm just trying to predict what the new OLED TV technology will look like, because we don't have much information at the moment. I don't take any responsibility for my conclusions being correct -- I tried to do my best, but if you read on, it's at your own risk; or wait for real tests of real dedicated display devices.

==================================================

A picture from this experiment is attached below. It was much harder to record what I see on the small screen of the mobile phone than on the 42" plasma...
 
I followed your link, and he says he is testing a plasma TV against a smartphone with an 800x480 AMOLED screen, not an OLED HDTV. So you need to realize there are all kinds of potential failure points in that kind of flimsy comparison.
True, but the same "store-and-hold" issue applies. A full-size OLED would have the same problem -- including the early, overpriced Sony XEL-1: while very good, it's extremely overpriced (four figures), and plasma still has less motion blur than the XEL-1. Very few people have ever tested OLED for motion blur, and we need more testers. (Someone should go to CES or CEDIA with the motion test pattern Blu-ray and do some testing!)

If there's a *way* to produce cheap OLED and simultaneously eliminate motion blur as well as (or better than) plasma, then by all means -- it's a welcome technology. It's possible, it's doable, but can it be done cheaply? I think the scanning-backlight LED LCD is the low-hanging fruit today; it can be picked first.

Some new large OLED panels were demonstrated for the home theater, but the prices are still insane -- though irregardless, we need some real motion tests on those!
 
so basically CRT is the pinnacle for fps gamers... sad :( thought oled would be as good.
 
so basically CRT is the pinnacle for fps gamers... sad :( thought oled would be as good.

I play FPS games on my IPS monitor. Anything below 10 ms is perfectly acceptable; only extensive testing would make you notice any difference.


OLED is the future. Organic light-emitting diode <--- organic is the key word. It's currently still being developed and isn't in the consumer market yet. LG recently made an OLED TV that will come out this year, but it costs around 10,000 USD. Likely in around 5 years they will be affordable, and the technology will also have developed enough to produce better response times. OLED has many, many advantages.
 
They compared the Galaxy S II OLED screen to an HDTV plasma!? Really!? And then we're talking about comparing it to a $5k+ LCD? The screen in the SGS2 costs about $35 according to iSuppli, and we're comparing it to a roughly $1k plasma and a $5k+ LCD...

Sorry folks, but this isn't a legitimate comparison. This is like saying "Ya know, if we put an LS2 into it, the Focus wouldn't suck as much compared to a Corvette...just sayin'..."

Mobile screens have to be tailored to deal with power restrictions, so they have many limitations compared to their desktop/living-room counterparts. Compare it to the LCDs used in similar-generation (2011) high-end smartphones, perhaps the iPhone 4, and then tell me that "LCDs can be better than OLED."
 
local dimming + scanning backlight or black frame insertion
What's the current state of play re: input lag with these technologies? When they first appeared on TVs, the lag was 120 ms-200 ms! I haven't really considered this stuff since, but I'm particularly sceptical that you can have local dimming without lag.

so basically CRT is the pinnacle for fps gamers... sad :( thought oled would be as good.

Here's a sadder thought: the Field Emission Display (FED) had enormous potential to be the best possible solution for gamers; it promised 240 Hz CRT-like screen refreshing and a native contrast of 20,000:1.

The Field Emission Technologies company (a Sony subsidiary) couldn't afford, nor get funding to buy, the production facility from Pioneer after Pioneer shut down its plasma manufacturing business. Ultimately FET was wound down and the technology was sold to AU Optronics. They promised very high-end monitors would be produced for broadcast and mastering purposes, but these never appeared. The FED tech description has been pulled from AUO's site as well now, so presumably the FED monitors will never be sold.

Aside from potentially better contrast, I always thought OLED would be second best vs FED, but we're still stuck with LCD and all its limitations.
 
They compared the Galaxy S II OLED screen to an HDTV plasma!? Really!? And then we're talking about comparing it to a $5k+ LCD? The screen in the SGS2 costs about $35 according to iSuppli, and we're comparing it to a roughly $1k plasma and a $5k+ LCD...

Sorry folks, but this isn't a legitimate comparison. This is like saying "Ya know, if we put an LS2 into it, the Focus wouldn't suck as much compared to a Corvette...just sayin'..."

Mobile screens have to be tailored to deal with power restrictions, so they have many limitations compared to their desktop/living-room counterparts. Compare it to the LCDs used in similar-generation (2011) high-end smartphones, perhaps the iPhone 4, and then tell me that "LCDs can be better than OLED."
True, but the same problem also affects larger OLED screens. It's the manufacturer's responsibility to make sure the logic board does what it needs to do to give the OLED good motion resolution. But that can't be done as cheaply, or brought to market as quickly, as a scanning backlight on an LED-backlit monitor.

EDIT: New, improved article at: Why Do Some OLED's Have Motion Blur?
 
OLED is the future. Organic light-emitting diode <--- organic is the key word. It's currently still being developed and isn't in the consumer market yet. LG recently made an OLED TV that will come out this year, but it costs around 10,000 USD. Likely in around 5 years they will be affordable, and the technology will also have developed enough to produce better response times. OLED has many, many advantages.
Agreed, but it could be at least 5 years before good OLED with good motion resolution comes to the computer desktop at high three-digit prices (e.g. $999 or less).

Good "480 Hz"-like clarity in LCD is possible in less than 1 year. It's a good first step.
 
so basically CRT is the pinnacle for fps gamers... sad :( thought oled would be as good.
OLED can do it, but not quite yet at affordable prices. Cheap OLED panels have poor motion resolution. My point is that OLED needs additional logic/electronics to give it good motion resolution, combined with sufficient OLED brightness to compensate for the black-out periods. OLED is eventually the holy grail, but the first panels at the cheapest prices aren't going to be stunningly better than LCD + scanning backlight.
 
BenQ already tried selling a model with scanning backlights 5 years ago. Apparently it didn't catch on.

http://www.behardware.com/articles/646-1/benq-fp241wz-1rst-lcd-with-screening.html
http://hardforum.com/showthread.php?t=1161471
Yeah, no wonder it failed -- it was a first-generation attempt:

1. That's BFI (black frame insertion), not a scanning backlight. The two are different, although they serve a similar purpose.
2. The BenQ is a 60 Hz display.
3. It used a very slow backlight: CCFL, not LED.
4. The BFI was used to simulate something only marginally higher (less than 120 Hz) -- a motion sharpening factor of less than 2:1 in the BenQ.

Five years have passed. LEDs are brighter and faster than CCFL; backlight technology can now sharpen motion by approximately a 4:1 factor (or better), and we can combine that with 120 Hz. Given today's scanning backlight technology and very bright LEDs, it can have 4 times less motion blur than 120 Hz alone. (Even 120 Hz alone, today, is still sharper than the old first-generation BFI in the BenQ.) The BenQ, therefore, can be ignored as a precedent.
 
What's the current state of play re: input lag with these technologies? When they first appeared on TVs, the lag was 120 ms-200 ms! I haven't really considered this stuff since, but I'm particularly sceptical that you can have local dimming without lag.
Most of your quoted lag comes from the motion interpolation, so make sure you're not measuring combined lag (local dimming + motion interpolation + video processing + etc.).

However, you're also right that it introduces a tiny lag. The lag of local dimming has greatly improved, and you can theoretically refresh the local-dimming backlight in parallel with refreshing the LCD panel.
Although not done by all HDTVs, real-time local dimming is already possible with zero added lag.

Technically, from an engineering perspective, you can do a top-down refresh of the LCD at the same speed as the top-down refresh of a scanning backlight. Since the two already have to be in sync for a scanning backlight to work at all, you can take the opportunity to do high-speed matching of each LED's brightness to the LCD pixels being refreshed in front of it. In a scanning backlight, the LCD pixels are refreshed a fraction of a millisecond before the LED backlight "pixels" behind them are turned on, and you can use that time difference to do the math operations needed to correctly do local dimming with ZERO added lag (or less than 1 millisecond of added lag).

That said -- yes -- many HDTVs don't do that: for engineering simplicity, they buffer the whole frame and send it to the LED array as if it were a separate low-resolution display running independently of the LCD panel. But it doesn't have to be that way; no predictive look-forward/look-backward logic is needed (unlike operations such as line doubling, or deinterlacing 1080i -> 1080p). Technically, nothing stops an electronics engineer from doing zero-lag local dimming. There's enough computing power in today's chips to do it already, especially if you put a GPU or even just a modest FPGA into the TV chassis.

MATH EXAMPLE: Consider a scanning backlight that illuminates 1/4 of the panel at a time. An LCD panel with a 2 ms response is common, so the trailing 'scanning' rectangular block of the LCD is refreshed about 2 ms (or slightly more) before the LED backlight illuminates behind that area. (Observe that 1/120th of a second divided by 4 is also about two milliseconds.) The purpose of this is to give the LCD time to fully refresh the pixels before turning on the backlight behind them. So the LCD response speed (2 ms) is approximately how much time you have to compute the new brightness of an LED backlight "pixel", and today's chips are fast enough to refresh an LED local-dimming backlight array in real-time sync with the LCD panel refresh. An additional processor chip, an FPGA, an entry-level GPU, etc. has enough power to do it -- real-time, zero-added-lag local dimming uses simpler mathematics than deinterlacing 1080i -> 1080p (far fewer math operations; you could even pull off the local dimming math in 2 ms real-time using MMX/SSE/SSE2 assembly language on an old 2 GHz Core 2 Duo, and a simple GPU or FPGA program could do it easily). Again, full-frame look-forward and look-backward logic is not necessary to successfully pull off local dimming.

I helped write a deinterlacing algorithm, and invented a 3:2 pulldown detection algorithm, about ten years ago (back in the day when Faroudja ruled the line doubler world) (old link (C) 2000 -- 12 years ago), for the open-source dScaler project, so I've been there. A Pentium II 400 MHz was able to do real-time deinterlacing using MMX assembly language, at 640x480, 60 times per second, from a Hauppauge TV card, more than ten years ago. I am sufficiently familiar with the mathematics behind a deinterlacer, and it's far more complex than the math required for local dimming. (Most local dimming is simply averaging a group of pixels and assigning the result to a low-resolution LED. It's a bit more complicated if you're doing software-driven PWM for intermediate LED values rather than resistive LED brightness, but local dimming is otherwise simpler than the math behind a good motion-adaptive deinterlacer, much simpler than a motion-compensating deinterlacer, and several orders of magnitude simpler than the math in an MPEG-4 compressor, which is already done in real time at 60 fps in handheld pocket cameras and smartphones.)
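
To show how little math that is, here's a minimal C/C++ sketch of the averaging step just described (the zone grid size and the 8-bit grayscale input are my own illustrative assumptions):

Code:
#include <cstdint>
#include <vector>

// frame: w*h 8-bit luma values; returns a zonesX*zonesY array of LED
// drive levels (0-255), one per local-dimming zone.
std::vector<uint8_t> computeLedZones(const std::vector<uint8_t>& frame,
                                     int w, int h, int zonesX, int zonesY) {
    std::vector<uint8_t> zones(zonesX * zonesY);
    const int bw = w / zonesX, bh = h / zonesY;   // pixels per zone block
    for (int zy = 0; zy < zonesY; ++zy)
        for (int zx = 0; zx < zonesX; ++zx) {
            uint32_t sum = 0;
            for (int y = zy * bh; y < (zy + 1) * bh; ++y)
                for (int x = zx * bw; x < (zx + 1) * bw; ++x)
                    sum += frame[y * w + x];
            zones[zy * zonesX + zx] = uint8_t(sum / (bw * bh)); // block average
        }
    return zones;
}

Run one zone row at a time, as the corresponding LCD rows finish refreshing, and you get the low-resolution "LED backlight image" with no full-frame buffering at all -- the zero-added-lag pipeline described above.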

But then again, that may be overkill....
....All that engineering or programming may not even be necessary.
I'm even okay if it introduces 8ms lag (the duration of a single 120Hz frame) or even 16ms (the duration of two 120Hz frames) for local dimming; as long as we can turn the local dimming feature on/off, it's a non-issue. The first $999 display can just do the lazy "full frame" buffering to compute the local dimming frames for the low-resolution "LED backlight image" behind the LCD panel (even though you can engineer it to be integrated with the LCD refresh for 100% zero-added-lag local dimming).

Bottom line:
Most of the input lag comes from deinterlacing (1080i -> 1080p) and from artificial frame interpolation (60 Hz -> 240 Hz); if you eliminate those, you've eliminated most of the lag.
Besides, local dimming can also be turned off (perhaps as a setting independent of the scanning backlight setting), balancing out any minor input lag.
 
I think we first need to see the OLED computer monitors before we can start comparing them to LCDs, and it will be some time -- after all, LG and Samsung haven't even started selling TV sets :)

Besides, it's a new technology; it will take some time to mature. It's like comparing the first TN models to the likes of current 120 Hz TNs... I believe it will take at least 3-4 years before the first good OLED computer monitors hit the market at reasonable prices.
 
Yeah, no wonder it failed -- it was a first-generation attempt:

1. That's BFI (black frame insertion), not a scanning backlight. The two are different, although they serve a similar purpose.
2. The BenQ is a 60 Hz display.
3. It used a very slow backlight: CCFL, not LED.
4. The BFI was used to simulate something only marginally higher (less than 120 Hz) -- a motion sharpening factor of less than 2:1 in the BenQ.

Five years have passed. LEDs are brighter and faster than CCFL; backlight technology can now sharpen motion by approximately a 4:1 factor (or better), and we can combine that with 120 Hz. Given today's scanning backlight technology and very bright LEDs, it can have 4 times less motion blur than 120 Hz alone. (Even 120 Hz alone, today, is still sharper than the old first-generation BFI in the BenQ.) The BenQ, therefore, can be ignored as a precedent.

It was advertised as BFI, but tests show that it was actually a scanning backlight. You can find the details in my first link.
 
I'm even okay if it introduces 8ms lag (the duration of a single 120Hz frame) or even 16ms
I can't agree with this; there is too much latency already in the vast majority of LCD displays. In terms of input lag caused by circuit delay, there are a lot of people already sensitive to 8 ms, and 16 ms is very perceptible. You might get away with adding <8 ms of circuit delay to TN panels that have only a few ms of pixel response and no significant circuit delay, but I am sure there are a few gamers who will complain vehemently if you do, and a lot of them are probably users who also crave less motion blur. So your theoretical gaming panel needs <4 ms circuit delay with TN-like 1 ms-2 ms pixel response.

If you consider IPS panels, 16 ms+ circuit delay is easily in the perceptible or very annoying category (according to user sensitivity). Speaking for myself, I'm not too conscious of ~8 ms circuit delay, so as there are IPS panels with approximately this amount or less, I could live with that as a maximum total circuit delay including local dimming etc. Other, more 'hardcore' gamers who place a premium on IQ will probably still demand IPS with <4 ms circuit delay, but they are willing to live with the ~6 ms pixel response as a compromise.

Aside from user perception, there is another 'unseen' effect of latency on games: it limits your experience whether you notice or not. I've worked on the development of sports games, and responsiveness has always been one of the key concerns or problems to overcome. Because events happen in such a time- and space-compressed environment, calling and playing back the correct animations in response to user input is sometimes difficult or impossible, i.e. the game is unresponsive, and sometimes there is actually a zero time window in which the user can press a button. Other times the opportunity for the user to do something is very short, <1/2 frame - 1 frame. You can't react in time; as a gamer you have to predict, and that's fine, it happens in all games, but as you add latency all events appear later, so the time available to the user to predict decreases. If all other things are equal, games appear less responsive and are harder to play. How do you compensate? Make the game slower overall, and easier to play!

Games are being made easier anyway to make them widely accessible, and a generation has grown up with games, so younger players' skills are increasing dramatically. It's no wonder the 'hardcore' complain about games being too easy. I think the latency battle is already lost: 40 ms-50 ms lag is generally dismissed by TV reviewers and buyers as acceptable, but if you're a 'hardcore' gamer you should continue to reject high latency and keep asking for CRT-equivalent or better.

irregardless
Oh dear, you've ruined everything you've said up to this point, you should be banned from the internet :p
 
I can't agree with this; there is too much latency already in the vast majority of LCD displays. In terms of input lag caused by circuit delay, there are a lot of people already sensitive to 8 ms, and 16 ms is very perceptible. You might get away with adding <8 ms of circuit delay to TN panels that have only a few ms of pixel response and no significant circuit delay, but I am sure there are a few gamers who will complain vehemently if you do, and a lot of them are probably users who also crave less motion blur. So your theoretical gaming panel needs <4 ms circuit delay with TN-like 1 ms-2 ms pixel response.

If you consider IPS panels, 16 ms+ circuit delay is easily in the perceptible or very annoying category (according to user sensitivity). Speaking for myself, I'm not too conscious of ~8 ms circuit delay, so as there are IPS panels with approximately this amount or less, I could live with that as a maximum total circuit delay including local dimming etc. Other, more 'hardcore' gamers who place a premium on IQ will probably still demand IPS with <4 ms circuit delay, but they are willing to live with the ~6 ms pixel response as a compromise.

Aside from user perception, there is another 'unseen' effect of latency on games: it limits your experience whether you notice or not. I've worked on the development of sports games, and responsiveness has always been one of the key concerns or problems to overcome. Because events happen in such a time- and space-compressed environment, calling and playing back the correct animations in response to user input is sometimes difficult or impossible, i.e. the game is unresponsive, and sometimes there is actually a zero time window in which the user can press a button. Other times the opportunity for the user to do something is very short, <1/2 frame - 1 frame. You can't react in time; as a gamer you have to predict, and that's fine, it happens in all games, but as you add latency all events appear later, so the time available to the user to predict decreases. If all other things are equal, games appear less responsive and are harder to play. How do you compensate? Make the game slower overall, and easier to play!

Games are being made easier anyway to make them widely accessible, and a generation has grown up with games, so younger players' skills are increasing dramatically. It's no wonder the 'hardcore' complain about games being too easy. I think the latency battle is already lost: 40 ms-50 ms lag is generally dismissed by TV reviewers and buyers as acceptable, but if you're a 'hardcore' gamer you should continue to reject high latency and keep asking for CRT-equivalent or better.
Distinction: I'm not a hardcore gamer. I'm a high-end recreational casual gamer who plays offline more than 50% of the time. For that, response time isn't as important as maximizing image quality.

But I agree: online gaming is sometimes about who shoots first, and things like that. Imagine two players with exactly equal reaction times (e.g. 150 ms): both pull the trigger the moment they see something, reacting exactly 150 ms later. The player whose display has 1 ms of extra lag could be the one who loses, because the game software thinks he shot 1 ms later (only because his display was 1 ms slower than his competitor's). Even if humans can't tell 1 ms apart, I understand it can be like the Olympics -- fractions of a second (seeing the image) beating each other to the finish. A good margin of skill can overcome this, but given gamers of virtually equal skill and virtually equal reaction time, fractions of milliseconds can count here. :) I'm not of that league, but I understand.

That said -- as I've explained -- zero-added-lag local dimming is technically possible.
Perhaps it's a good idea to spend an extra month or two of engineering to do that (if zero-added-lag local dimming is not already being done).
But as long as it can be turned on/off -- I'm fine.
They should be configurable.

Oh dear, you've ruined everything you've said up to this point, you should be banned from the internet :p
Guilty as charged for the word -- c/irregardless/regardless/
 
It was advertised as BFI, but tests show that it was actually a scanning backlight. You can find the details in my first link.
OK, but the same problem applies. It improved motion resolution only very slightly -- it didn't even double it.
 
Those are compromises I refuse to accept, since the only real compromise I had to "suffer" by using a good-quality 21-inch CRT was that the geometry of the image wasn't perfect.
Don't tell me about it being heavy, as I only had to carry it 2 TIMES: once when I got it home, and the other time when I took it in for repair (where it also died). So if you compare carrying it 2 times in my life with using an LCD with various problems every day of my life ever since, I think you know what my thoughts are.

I will help, in any way possible, any company that comes out with a product that gets these 3 things into an existing LCD:

1. An AG coating removal option. (Or come with a decent AG coating already -- or are you waiting for the 20-year mark?)
2. No more MOTION BLUR. This is what has been taken from us, and we will never again get it from current manufacturers. They only care about business users; they don't care about gamers. Some people say, "lol, it's almost as good as a CRT" -- well, it IS NOT. Do you know how my eyes suffer, how much I had to change my game style, and how much time I need to stare at a fixed angle before I can move my mouse, just to make sure I noticed any slight movement, even a single pixel showing the edge of a foot somewhere far, far away on the other side of a huge room? I guess only FPS gamers know how important it is to know whether your opponent is somewhere close or not.
3. Input LAG. Is it that hard to add a toggle to bypass the scaler? Most gamers do not want to sacrifice input lag to a stupid scaler that would only be used anyway by someone who, for whatever reason, doesn't have the possibility of an optimal setup. Most gamers? Yes, you already guessed: they WILL use an optimal setup, and no scaler is needed.

So after so many years in this situation, I am thinking very seriously about REWARDING the very first company that delivers these (MOTION BLUR being the biggest difference) by doing everything possible to buy products only from them. Be it OLED, ELECTRONIC INK, a NEW CRT, LCD -- I DON'T CARE. All I want is a good monitor that is also good for GAMING. One other thing I hate is that we feel forced to buy monitors for specific jobs; many people already have 2 different monitors, one for gaming and one for work/Photoshop/text work/etc. I do not like being FORCED to buy 2 or more monitors. CRTs offered a good-quality product, and it has been taken away from us. I will reward the company that changes this in my own way: by buying their products.

One final thought: regardless of the state of the technology at this point, 2 of the 3 points I talked about are perfectly possible (coating and input lag), yet they are not standard, and can only be found in very specific/rare models.
 
Those are compromises I refuse to accept, since the only real compromise I had to "suffer" by using a good-quality 21-inch CRT was that the geometry of the image wasn't perfect.
Don't tell me about it being heavy, as I only had to carry it 2 TIMES: once when I got it home, and the other time when I took it in for repair (where it also died). So if you compare carrying it 2 times in my life with using an LCD with various problems every day of my life ever since, I think you know what my thoughts are.

I will help, in any way possible, any company that comes out with a product that gets these 3 things into an existing LCD:

1. An AG coating removal option. (Or come with a decent AG coating already -- or are you waiting for the 20-year mark?)
2. No more MOTION BLUR. This is what has been taken from us, and we will never again get it from current manufacturers. They only care about business users; they don't care about gamers. Some people say, "lol, it's almost as good as a CRT" -- well, it IS NOT. Do you know how my eyes suffer, how much I had to change my game style, and how much time I need to stare at a fixed angle before I can move my mouse, just to make sure I noticed any slight movement, even a single pixel showing the edge of a foot somewhere far, far away on the other side of a huge room? I guess only FPS gamers know how important it is to know whether your opponent is somewhere close or not.
3. Input LAG. Is it that hard to add a toggle to bypass the scaler? Most gamers do not want to sacrifice input lag to a stupid scaler that would only be used anyway by someone who, for whatever reason, doesn't have the possibility of an optimal setup. Most gamers? Yes, you already guessed: they WILL use an optimal setup, and no scaler is needed.

So after so many years in this situation, I am thinking very seriously about REWARDING the very first company that delivers these (MOTION BLUR being the biggest difference) by doing everything possible to buy products only from them. Be it OLED, ELECTRONIC INK, a NEW CRT, LCD -- I DON'T CARE. All I want is a good monitor that is also good for GAMING. One other thing I hate is that we feel forced to buy monitors for specific jobs; many people already have 2 different monitors, one for gaming and one for work/Photoshop/text work/etc. I do not like being FORCED to buy 2 or more monitors. CRTs offered a good-quality product, and it has been taken away from us. I will reward the company that changes this in my own way: by buying their products.

One final thought: regardless of the state of the technology at this point, 2 of the 3 points I talked about are perfectly possible (coating and input lag), yet they are not standard, and can only be found in very specific/rare models.
Hear, hear, hear!
I am with you about the motion blur problem.

I have a proposal.

I propose starting a Kickstarter project for the best PC-based motion resolution benchmark, capable of measuring up to approximately 1000 Hz equivalence(ish).

There are existing motion resolution benchmarks, but they are not good enough to show the usefulness of better motion resolution. Also, Blu-ray-based motion resolution test pattern discs (moving test patterns) are not useful for gaming purposes.

The benchmark program, written in Visual Studio C/C++/C#, and to be open-sourced (if mutually agreed, unless teamed up with a big-donating sponsor), would have the following components in the same app:

(1) Fast pan benchmark. A test pattern scrolls horizontally very fast, advancing every VSYNC. It allows measurement both by human eyes (similar to the existing Blu-ray motion resolution test disc) and scientifically (configure an SLR digital camera to different shutter speeds and take a photo of the pattern).

(2) Slow pan benchmark. Same as above.

(3) Browser scroll benchmark. It simulates a browser window and continuously scrolls text vertically at medium speed.

(4) Moving/flashing squares benchmark, similar to my 1992 DOS program at http://www.marky.com/files/dos/motion.zip

(5) Scrolling ticker benchmark, like those used by HDTV manufacturers for comparing Hz. It will also have a mode to compare 60 Hz vs 120 Hz. There would be a test pattern track underneath the scroller for resolution measurement purposes.

It would be great for comparisons between displays, and (for magazine reviewers) ....

Goals
- More publicity of motion blur problem
- Distinguish the failures (bad BFI, bad enhancement) from the good stuff
- Easy side-by-side testing
- Make blog reviews possible
- Make forum reviews possible
- Make magazine reviews possible

Plus, long-term future improvements are possible.
...For example, future versions of this benchmark software could connect to an external high-speed camera (1000 fps) with a PC interface, to mathematically calculate a motion blur score (if such high-speed cameras exist, even if they only support low resolutions).
...Another idea is a simple photodiode sensor with millisecond resolution -- maybe the same sensors some use for testing display lag -- and simply using the benchmark to flash a square on the screen, to measure the screen response rate AND the length of the light pulse. This is relevant to CRT scanning, to OLED/plasma refresh, and to LCD backlight scanning/BFI (to compute a predicted CMR-style factor based on the light pulse length -- see the sketch after this list).
...Somebody would finally be able to set up a motion resolution test lab with industry-standard measurements.
...All sorts of future feature enhancements are possible, but I propose K.I.S.S.: start with a simpler benchmarking app that's doable in not too much time.
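
As a sketch of that pulse-length-to-rating conversion (assuming, as elsewhere in this thread, that perceived motion blur is proportional to how long each frame's light pulse persists; the sample pulse widths below are illustrative):

Code:
#include <cstdio>

// A display lit for pulse_ms per refresh has roughly the motion clarity
// of a sample-and-hold display refreshing at 1000/pulse_ms Hz.
double hzEquivalent(double pulse_ms) { return 1000.0 / pulse_ms; }

int main() {
    const double samples_ms[] = { 16.7,   // 60 Hz sample-and-hold LCD/OLED
                                  8.3,    // 120 Hz sample-and-hold
                                  2.08,   // 120 Hz + 4:1 scanning backlight
                                  1.0 };  // short-persistence CRT phosphor
    for (double p : samples_ms)
        std::printf("pulse %5.2f ms -> ~%4.0f Hz equivalent\n",
                    p, hzEquivalent(p));
    return 0;
}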

I am also a standards-document writer (an example is the XMPP extension I authored at www.xmpp.org/extensions/xep-0301.html, which has my name... I spent hundreds of hours refining that standard), and if unexpected extra funds come in, I might try writing an industry standard merging all the superfluous manufacturer ratings (e.g. Samsung CMR 960, Sony XR 960, etc.), although that would be a more future item (since I find programming more fun first).

With a small Kickstarter fundraiser, I can create such a benchmark.

My (outdated) website: http://www.marky.com
My resume: http://www.marky.com/resume

Although I do not work in the gaming industry, I spent a lot of the late 80s and early 90s on video-game-related stuff, including programming hobby games myself with raster interrupts on the Commodore 64 (giving me excellent VSYNC familiarity) and a motion resolution test I created for DOS in 1992 (MOTION.ZIP), in my Nostalgia Programming section at http://www.marky.com/programming/ .... Combined with my good understanding of this topic, and some basic DirectX programming (it's just panning full-screen images and/or simple sprites at refresh-rate-matched fps), I have the underlying skills required to make such a benchmark.

So, perhaps, I should start a small Kickstarter project to fund my programmer time/resources to create a motion resolution benchmark for PCs (capable of measuring all the way to 960 Hz equivalence, unlike the existing Blu-ray test pattern motion benchmark discs).

I am ready to step up to the plate and create a Kickstarter, if a small group of us is willing to pledge a bit towards the project.

It's a possible first step toward convincing an Alienware-style vendor to release a flat-panel monitor with CRT-quality motion resolution. The benchmark has practical demos (e.g. panning imagery and text) to prove that there are practical applications for better motion resolution. It would also let monitor manufacturers test their monitors, possibly (eventually). And it would show the "I cannot tell 240 Hz apart" masses that text is much clearer in a scrolling browser window, so they'd say "oh wow, it is beneficial after all".

If we can bring in enough pledges, why don't I start a Kickstarter project for such a motion-resolution benchmark program? I might need to get HardOCP's permission (or even use angel funding from one of us to buy a HardOCP sponsorship to advertise the Kickstarter project), since this forum's audience would be an obvious prime audience for Kickstarter donations. It would satisfy the moderators too, and meet forum rules better, I think.

We need a few people like you and me to recruit Kickstarter pledges, and it'll help get the ball rolling in the long term by raising awareness.

This is a good, doable, and realistic first step toward helping any future display tech head in the right direction, at least for enthusiast-league displays.

What do you think about the Kickstarter project idea?

Sincerely,
Mark Rejhon
http://www.marky.com
(old outdated site -- just supplying my credentials/rep for the Kickstarter idea)
 