GotNoRice · [H]F Junkie · Joined Jul 11, 2001 · Messages: 12,017
I have a 120Hz monitor (Samsung S27A950D) and it spawned a discussion about refresh rates with my friend. We were talking about how many monitors are able to accept a 75Hz input signal, and were debating whether that meant the LCD panel was actually operating at 75Hz or not.
I know that the default video mode used when you boot most pre-UEFI computers is 640 × 400 @ 70Hz, and that even includes the Windows logo pre-Vista. So obviously an LCD will need to be able to accept a signal of at least 70Hz regardless of what refresh rate it actually operates at. I've also had LCDs exhibit strange behavior when set to 75Hz in Windows.
But my friend really wants to believe that 75Hz means the panel is actually refreshing 75 times per second, not just accepting the signal while still operating at 60Hz internally. Can anyone shed some light on this?
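To make the debate concrete, here's a quick back-of-envelope sketch of what it would mean if a monitor *accepts* a 75Hz signal but its panel only refreshes at 60Hz (the 60Hz figure here is just the assumed internal rate, not a measured one):

```python
# Back-of-envelope: a panel that accepts a 75 Hz input signal
# but whose pixels only actually update 60 times per second.

def frame_time_ms(hz):
    """Time between refreshes, in milliseconds."""
    return 1000.0 / hz

input_hz = 75   # refresh rate the monitor advertises and accepts
panel_hz = 60   # assumed rate the LCD panel actually updates at

# The panel can only show 60 of the 75 incoming frames each second,
# so the rest are silently dropped (or blended), not displayed.
dropped_per_second = input_hz - panel_hz

print(f"75 Hz input frame time: {frame_time_ms(input_hz):.2f} ms")
print(f"60 Hz panel frame time: {frame_time_ms(panel_hz):.2f} ms")
print(f"Frames the panel cannot show each second: {dropped_per_second}")
```

If the panel really were stuck at 60Hz internally, 15 of every 75 input frames would never reach the screen, which would show up as uneven frame pacing / judder rather than smoother motion. That mismatch might also explain the strange behavior at 75Hz mentioned above.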