sharknice
I see someone has a penchant for hyperbole. A budget CRT from the '90s was a 15" 800x600 60 Hz flickering 4:3 display. A 27" 4K 144 Hz display with FALD and HDR can't compete with that? lol. OLED doesn't have any crippling, ridiculous problems. The only reason the motion quality isn't better is the way it's packaged in today's 60 Hz TVs, not the technology itself. OLED picture quality absolutely destroys CRT.
If you use the entire previous frame, there will be lag in how the FALD responds. There's a reason why on every TV made, turning off FALD is a requirement for any decent input lag. I'm just wondering what the geniuses at NVIDIA came up with to minimize the lag.
It seems like they could just control the FALD on the fly with essentially no latency, setting each zone's brightness as the data arrives, the same way the panel sets the color of each pixel as it receives that data.
Since each pixel carries a brightness value, the controller just keeps a running value for each zone, and once all the pixels for that zone have been received, it adjusts that zone's brightness. The backlight would basically be driven as the frame is scanned out, just like the pixels are, only instead of millions of pixels it's hundreds of lighting zones.
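Something like this rough sketch, in pseudocode-ish Python, is what I mean (not claiming this is how NVIDIA actually does it): track a running peak brightness per zone during scanout and commit each row of zones as soon as its last scanline arrives. The zone grid size and the set_zone_backlight() call are made up for illustration.

```python
# Sketch only: drive FALD zones during scanout instead of waiting for a full frame.
# ZONE_ROWS/ZONE_COLS and set_zone_backlight() are assumptions, not real hardware APIs.

ZONE_ROWS, ZONE_COLS = 24, 16          # hypothetical FALD grid (384 zones)
WIDTH, HEIGHT = 3840, 2160             # 4K panel
ROWS_PER_ZONE = HEIGHT // ZONE_ROWS
COLS_PER_ZONE = WIDTH // ZONE_COLS

# Running peak luminance seen so far for each zone in the current frame
zone_peak = [[0.0] * ZONE_COLS for _ in range(ZONE_ROWS)]

def set_zone_backlight(zone_row, zone_col, level):
    """Placeholder for whatever call actually sets one zone's LED drive level."""
    pass

def on_scanline(y, pixels):
    """Called once per scanline as pixel data arrives from the GPU.

    pixels: iterable of per-pixel luminance values (0.0-1.0) for row y.
    """
    zr = y // ROWS_PER_ZONE
    # Fold this scanline into the running peak of every zone it crosses
    for x, lum in enumerate(pixels):
        zc = x // COLS_PER_ZONE
        if lum > zone_peak[zr][zc]:
            zone_peak[zr][zc] = lum

    # Once the last scanline of a zone row has arrived, that row of zones is
    # complete: commit its backlight levels right away instead of waiting
    # for the rest of the frame.
    if (y + 1) % ROWS_PER_ZONE == 0:
        for zc in range(ZONE_COLS):
            set_zone_backlight(zr, zc, zone_peak[zr][zc])
            zone_peak[zr][zc] = 0.0    # reset for the next frame
```

Worst case the zone lags its own pixels by one zone row of scanlines, which is a tiny fraction of a frame rather than a whole frame of buffering.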