After a lot of googling I've read many contradictory viewpoints on this issue (even on this forum), so I thought I'd create a HardOCP account and ask this question one last time, sorry.
There seem to be two prevailing opinions:
Opinion 1) - Say we have an LCD with a response time of 8ms, yielding a theoretical maximum frame rate of 125 FPS. If I were to play a game with VSYNC turned off and the max FPS capped at 125 in the game's configuration file, then I could assume (provided my video card is good enough) that the LCD is literally displaying 125 full frames per second, regardless of what refresh rate is set in Windows.
Opinion 2) - The other school of thought says that even though the LCD's pixels are theoretically capable of changing 125 times per second with VSYNC turned off, the display is still limited by the refresh rate/sampling rate/vertical frequency/some_strange_thing of LCD technology, which basically means the LCD and video card are not communicating fast enough to produce the full 125 frames per second.
So to put it simply, for an LCD: response time = the speed at which a given pixel can physically change state, while refresh rate = how often new frame data is actually sent to the screen.
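To make the two opinions concrete, here's a minimal sketch of the arithmetic behind each one. The 8ms response time comes from the example above; the 60Hz refresh rate is just an assumed typical value for an LCD of that era, not something stated in the question:

```python
# Example values: 8 ms response time (from the question), 60 Hz refresh
# rate (assumed typical LCD setting in Windows).
response_time_ms = 8   # how fast one pixel can physically change state
refresh_rate_hz = 60   # how often new frame data reaches the panel

# Opinion 1's reasoning: invert the response time to get a frame rate.
theoretical_max_fps = 1000 / response_time_ms
print(theoretical_max_fps)  # 125.0

# Opinion 2's reasoning: delivered frames are capped by the refresh
# rate of the link between video card and panel, whichever is lower.
delivered_fps = min(theoretical_max_fps, refresh_rate_hz)
print(delivered_fps)  # 60
```

Under Opinion 2, the 125 figure only describes what the pixels could physically do, not how many distinct full frames the panel ever receives.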
The most reputable source I could find for Opinion 2 (aside from various forum posts) was tweakguides.com, on this page: http://www.tweakguides.com/Graphics_8.html
> Let's look at an LCD's theoretical refresh rate, based on its response time rating. Consider the example of an LCD monitor nominally rated at an 8ms response time. Given 8 milliseconds is 8/1000ths of a second, in one full second it can refresh all the pixels on the screen (if necessary) 1000/8 = 125 times, which makes it equivalent to a 125Hz refresh rate. Yet no 8ms LCD monitor allows you to set a refresh rate even remotely close to this in Windows, nor do even 4ms LCD monitors.
Interestingly, that article also says the reason the standard was built with such a limitation is that LCDs must accommodate the current Windows/video card architecture, which works on a full frame-by-frame basis rather than a per-pixel one:
> Well it appears that LCD monitors need to emulate a refresh rate in Windows primarily for compatibility purposes with games and hardware. Games, Windows and your graphics card are all still designed around composing individual frames in the frame buffer, and sending these whole frames to your monitor one by one, with the timing for buffer flipping typically based on Vertical Blank Intervals - all things which were originally required for CRT monitors. Therefore LCD panels have to try to operate on the same basis, despite the fact that they don't have the same physical limitations of a CRT.
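One consequence of the frame-by-frame model described in that quote: with VSYNC off, the video card can swap buffers mid-scanout, so a single refresh can contain slices of more than one rendered frame (the familiar tearing effect). A hedged back-of-the-envelope, again assuming a 60Hz refresh rate that is not stated in the question:

```python
# Example values: game renders 125 FPS (from the question), panel is
# scanned out at an assumed 60 Hz over the video link.
render_fps = 125
refresh_hz = 60

# With VSYNC off, each 60 Hz scanout contains pieces of roughly this
# many rendered frames (hence tearing), but the panel still only
# receives 60 scanouts per second.
frames_per_refresh = render_fps / refresh_hz
print(round(frames_per_refresh, 2))  # 2.08
```

So even if the game renders 125 frames per second, the extra frames show up only as partial slices inside the 60 scanouts the monitor actually receives.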
I still don't know who is right! :D