Perfect scaling (doublescan)

Kamamura (n00b, joined Mar 12, 2012, 16 messages)
Logic would suggest that when the monitor resolution is exactly 2x the resolution of the output (for example, a 1920x1080 output on a 4K monitor), each output pixel would be displayed as a 2x2 block of monitor pixels, and the result would be a pixel-perfect, sharp image. Yet if you try that in practice, the image is smeared, as if the pixel colors were interpolated. I know this function was called "doublescan" in older Nvidia drivers, and if the resolution was too low (like the 320x200 mode from ol' MS-DOS), the driver "doublescanned" it to 640x400 with pixel-perfect clarity.
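To illustrate why integer scaling is lossless, here is a minimal NumPy sketch of 2x nearest-neighbor duplication (the "doublescan" idea): every output pixel is an exact copy of a source pixel, so no colors are mixed. The pixel values here are made up for the example.

```python
import numpy as np

# Hypothetical 2x2 source frame with four distinct pixel values.
src = np.array([[10, 20],
                [30, 40]], dtype=np.uint8)

# Integer 2x scaling: duplicate every row, then every column,
# so each source pixel becomes a 2x2 block of identical pixels.
doubled = np.repeat(np.repeat(src, 2, axis=0), 2, axis=1)

print(doubled)
# Every value in the 4x4 result is one of the original four colors;
# a bilinear resample would instead blend neighboring values,
# which is the "smearing" described above.
```

Because no new intermediate colors are created, this mapping is exactly invertible, which is what makes the scaled image look sharp.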

Why can't we have these simple, nice functions?