Something I have noticed with LCDs...

Mikerocks2112

I have noticed while gaming that if the FPS is locked at 60 fps, it looks absolutely fantastic and very smooth. The moment it begins to drop below that, it just looks so stuttery, and it kind of makes me nauseated if it's in the 30 fps range. Even at 55 FPS... it just doesn't look smooth to me. Honestly, whoever said the human eye can't see a difference past 30 fps is full of it. It seems that LCDs exaggerate the problem, because I really don't remember gaming in the 40-50 fps range being a problem when I used a CRT.
 
It's got nothing to do with LCDs (unless yours is defective).

Anything below a solid 50+ FPS gives me a headache pretty quickly.
 
I think it is more of a psychological effect. I have gamed on many LCDs and CRTs, and I typically was/am fine at 30 fps or more. Take Crysis, for example: when I play with settings a bit too high, it will drop to 35-40 fps, and it still looks smooth as silk. It is when it drops below the 28 fps mark that it starts to look stuttery. You may have a defective monitor, though, if there truly is an issue with the image at those framerates.
 
Vsync has been shown to lower your minimum FPS versus having it off and possibly having tearing.

With vsync on, my FPS would sometimes drop into the low 20s, which was annoying; with it off, I never see numbers that low, nor do I see any noticeable tearing.

It isn't psychological at all. Some people can perceive well over 100 FPS; it has been proven time and time again.
 
The human eye can definitely distinguish past 30 fps.

People wrongly assume we can't because video is around 30 fps and we don't complain... but they didn't take motion blur into account.
 
I don't think the number of frames per second is the real problem; anything above 30 should be playable and appear smooth. The problem is when the frame count is jumping all over the place: the larger the change in rate, the more pronounced the stutter. Say you are cruising along at 60 fps with vsync enabled, then for a few seconds the rate drops to 40; it will appear as stuttering. But if the rate was a constant 40 fps, the stutter would not be there, and to the eye/brain the game would appear perfectly smooth.
 
With vsync on, you get either 60 fps, 30 fps, 20 fps, 15 fps, 12 fps, 10 fps, etc.

(i.e. 60/1, 60/2, 60/3, 60/4, 60/5, 60/6, ...)
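
If it helps, here's a rough sketch of that divisor math (my own illustration, assuming a 60 Hz refresh and plain double buffering, not anything from an actual driver):

```python
# With double-buffered vsync on a 60 Hz display, a frame that misses the
# 16.667 ms refresh deadline waits for the next refresh, so the steady
# rates you can lock to are 60/n for whole numbers n.
refresh_hz = 60
for n in range(1, 7):
    print(f"{n} refresh(es) per frame -> {refresh_hz / n:.1f} fps")
# 60.0, 30.0, 20.0, 15.0, 12.0, 10.0
```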
 

Uh, what? I thought it just capped FPS at the refresh rate :confused: Not to mention FRAPS shows numbers other than those when I game with vsync on.
 
If your PC hardware can't render the game at 60 fps, then it simply can't. Vsync isn't some magic technology that means you never need a faster PC.

It synchronizes the frames to your display's refresh rate, so if your PC can't do the 60 fps (60 Hz), then it will drop down in large increments, as the previous poster outlined, in order to maintain the sync.

I don't know how FRAPS works. It may be that vsync is broken in that game; it is in some new games. Another guess is that FRAPS is displaying what your computer is rendering, not what is being displayed by the video card. Thus, if your hardware is only rendering at 50-55 fps and not reaching the 60 needed to lock in at a full 60 Hz, then you may only be seeing 30 fps.
 

My PC runs my games at 60 fps, and vsync works just fine. What I'm saying is that FRAPS (a tool that tells you the current FPS) displays numbers other than 60, 30, 20, 15, and 12; it displays all numbers. Usually, with vsync on, the performance hit lands me in the 40s or 50s. I don't know what that guy is talking about.
 

That's where the funny stutter with vsync comes in. 45 fps could look like:
16.7 ms, 16.7 ms, 33.3 ms, 16.7 ms, 16.7 ms, 33.3 ms, ...
i.e.: [frame], [frame], [frame, pause], [frame], [frame], [frame, pause], ...

Make sense?
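
A quick sanity check on that pattern (a sketch I put together, assuming a 60 Hz display where every frame is held for a whole number of refreshes):

```python
# A repeating [1, 1, 2]-refresh pattern on a 60 Hz display: a counter
# averages this out to 45 fps, but the uneven frame pacing is what you
# see as stutter.
refresh_ms = 1000 / 60
pattern = [1, 1, 2]  # refreshes each frame stays on screen
frame_times = [n * refresh_ms for n in pattern]
avg_fps = 1000 * len(frame_times) / sum(frame_times)
print(f"average: {avg_fps:.1f} fps")  # 45.0 fps, with visible judder
```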
 
And I'll leave on that note, because I don't get it :p

Exactly. But that's why you post here: to get the answer. And you definitely got an answer. Albeit a little complex, it is definitive; and you have to admit, it shows the people here really know what they are talking about. ;)
 

Okay, time for a more detailed explanation with vsync on.

Your LCD (at 60 Hz) shows a frame every 16.667 milliseconds. So your video card ideally needs to pump out a new frame at least every 16.667 ms as well.

If your "effective vsync-off framerate" drops to 59 fps, then each frame takes 16.949 ms. It misses its window of opportunity by 0.282 ms and has to wait until the next 16.667 ms interval comes by, which is a long time, relatively. So the previous frame is stuck on your screen for two refreshes, for a total of 33.333 ms. 1000/33.333 is 30 fps.

With vsync on and at least 30 fps, each frame is on your screen for either 1 or 2 refreshes, so the framerate averages somewhere between 30 and 60.
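
Here's a little simulation of that (my own sketch, assuming classic double buffering, where the card can't start the next frame until the previous one is swapped at a refresh boundary):

```python
# A GPU that renders a frame every 16.949 ms (59 fps with vsync off)
# presenting on a 60 Hz display: each finished frame waits for the next
# 16.667 ms refresh boundary, so nearly every frame occupies 2 refreshes.
import math

refresh_ms = 1000 / 60
render_ms = 1000 / 59
t_done = t_shown = 0.0
shown = []
for _ in range(120):
    t_done = max(t_done, t_shown) + render_ms              # rendering finishes
    t_shown = math.ceil(t_done / refresh_ms) * refresh_ms  # wait for vsync
    shown.append(t_shown)
avg_fps = 1000 * (len(shown) - 1) / (shown[-1] - shown[0])
print(f"displayed: {avg_fps:.1f} fps")  # ~30 fps, not 59
```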
 
With vsync on, you get either 60 fps, 30 fps, 20 fps, 15 fps, 12 fps, 10 fps, etc.

(i.e. 60/1, 60/2, 60/3, 60/4, 60/5, 60/6, ...)

Well, keep in mind I also have triple buffering running. Doesn't TB kind of alleviate what you are talking about?
 
Now that, I don't know! (Wouldn't that introduce more lag?) I actually mused about that a bit earlier, but removed it before posting. At 59 fps throughput, though, it will eventually stutter 1 frame per second regardless.
 
This issue is also where 120 Hz LCDs come in. They have double the 'slots' in which to change the displayed image, and thus the image can appear smoother. Naturally, relatively few 120 Hz LCDs can also accept more than 60 FPS of input, but that's another matter :)
 
Now that, I don't know! (Wouldn't that introduce more lag?) I actually mused about that a bit earlier, but removed it before posting. At 59 fps throughput, though, it will eventually stutter 1 frame per second regardless.

Triple buffering does add lag: image lag. It's just what it says, a third frame buffer, although nVidia allows you to increase/decrease the frame count in its drivers, and therefore increase/decrease the lag. It does not eliminate the lower portion of synced frames, but it reduces the number of low synced frames to make the average frame rate smoother.
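
For what it's worth, here is the earlier 59 fps simulation redone with an idealized triple buffer (my own sketch; real drivers differ), which helps explain the in-between FRAPS numbers reported above:

```python
# With triple buffering the GPU never stalls waiting for the swap, so at a
# 59 fps render rate almost every refresh gets a fresh frame and only about
# one refresh per second repeats -- the average lands near 59, not 30.
import math

refresh_ms = 1000 / 60
render_ms = 1000 / 59
done = [i * render_ms for i in range(1, 241)]      # render completion times
slots = {math.ceil(t / refresh_ms) for t in done}  # refresh slot each one uses
span_s = 240 * render_ms / 1000
print(f"displayed: {len(slots) / span_s:.1f} fps")  # ~59 fps
```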
 
The human eye can detect up to 500 fps (without blur) ;)

Considering the eye sees a constant stream of light data, the statement "the eye can only see 30 frames per second" is ignorant at best.

Now, saying the brain can only interpret 30 fps, that is debatable. I know personally I can see the difference between 60 fps and 100 fps on a CRT; the difference between 100 fps and 200 fps, however, is less pronounced.
 

Yes, the statement about the constant stream is absolutely correct. The human eye doesn't see light in frames like a computer or TV does.

And, seeing as I have no research in the area, I cannot comment on whether we can interpret 100 fps, etc. But I will say that, personally, I enjoy 100+ fps and can't usually tell a difference when it drops to the 50s and 60s (more like "if", depending on the game). But I can tell when it gets to the 30s. I don't know if this is true for everyone, as I'm sure people's eyes are all different, and people's brains interpret at different speeds. Just some food for thought =D
 
It's definitely not psychological.

With VSYNC turned off, there's little noticeable difference between 50fps-ish and 60fps-ish. This is because the images get displayed immediately as they are rendered, even mid-scan (basically the video output immediately splices over to the next frame -- that's why you sometimes see tearing).

With VSYNC turned on, there's a MASSIVE noticeable difference between 50 fps and 60 fps. That's because it waits until the beginning of the next refresh to display the newly rendered image. (Often this will drop to 30 fps, not to 50 fps, but the game may, due to intermittent CPU usage, rapidly alternate between 60 fps and 30 fps, to the point where the average framerate is more like 50 fps. This results in REALLY AWFUL MOTION: you're not just witnessing 30 fps judder, but ALSO the stutter of the game rapidly switching between 30 fps and 60 fps.)

VSYNC-on at 60 fps looks much better than VSYNC-off, *if* your graphics card is fast enough to stay at 60 fps at almost all times without dipping much. I can play all Half-Life 2-based games (the previous-generation engine) at an almost always full 60 fps framerate from start to end. In this situation, VSYNC-on is definitely preferable. Most arcade video games are designed to operate with VSYNC on, for their silk-smoothness.

While human eyes can't really tell apart 30 fps versus 60 fps /directly/ (you can't count the frames fast enough), the human eye can see the indirect effects -- even at 1,000 fps. (Yes, one thousand frames per second.) Whether by reduced motion blur, wagon-wheel optical effects, or other types of side effects that change with the refresh/response.

Another kind of stutter can be detected by motion blur that 'varies' (there is, for example, more motion blur at 30 fps than at 60 fps). The human eye can even tell apart 60 fps and 120 fps when it comes to 'stutter' and 'motion blur', as has been evidenced by the 60fps-versus-120fps demos now found in home theater stores, showing fast-panning horizontal motion. Just like camera photography, where a fast-action photograph at a 1/1000th second shutter is sharper than the same photograph at a 1/250th second shutter, there is no hard limit for the human eye but rather a point of diminishing returns, based on how fast the action is. Motion blur on an LCD is inversely proportional to the monitor's refresh rate when you're able to play material matching the refresh rate, and it is remarkably still noticeable even up to 1000 Hz. That is why home theater companies are hyping those silly 240Hz and 480Hz displays (they /actually/ make a difference for hockey and football -- and I can vouch for this too.)
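
Some back-of-the-envelope numbers on that inverse relationship (my own sketch; the 960 px/s pan speed is just an assumed example):

```python
# On a sample-and-hold LCD, the perceived blur width of an object your eye
# is tracking is roughly speed x frame hold time, which is why doubling the
# refresh rate (with matching content) halves the blur.
speed_px_per_s = 960  # e.g. an object crossing a 1920-wide screen in 2 s
for hz in (60, 120, 240, 480):
    print(f"{hz:3d} Hz -> ~{speed_px_per_s / hz:.0f} px of blur")
# 60 Hz -> 16 px, 120 Hz -> 8 px, 240 Hz -> 4 px, 480 Hz -> 2 px
```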

Flicker -- this disappears at roughly 60 Hz (but it can be higher or lower: 30 Hz for dark environments, 100 Hz for bright environments). Office environments made 85 Hz the standard for CRTs, until LCDs (which don't flicker) made this moot.
Wagon-wheel effects -- these can disappear if the framerate is high enough or the shutter is slowed down (introducing source-based motion blur). Wagon-wheel effects can still be observed even with a strobe light running at 1000 Hz (yes, 1 kHz) when the wheel is spinning fast enough. Anything with a repeating pattern (like a picket fence) can show wagon-wheel effects; see the sketch after this list.
Motion blur effects -- these reduce with a higher refresh rate, or with more blackness between frames (CRT scanning, LCD black-frame insertion, LCD scanned backlights, LCD frame interpolation as in 120Hz/240Hz/480Hz, etc.).
Stop-motion effects -- (stroboscopic motion) often seen while watching high-contrast sports material, like downhill skiing shot with a high-speed sports-camera shutter. Many people have observed this even at only 60 fps (1280x720p 60 Hz): crystal-clear frames with the skier moving several inches between every frame. The stop-motion becomes noticeable when you're staring at a stationary background object. Most sports cameras use a slightly slower shutter to reduce this stop-motion effect, and it is often more noticeable on plasma than on LCD because of the LCD's motion blur. High-speed-shutter cameras running at higher framerates on higher-refresh displays show a remarkable lack of stop-motion effects, which shows that human eyes can still see frame-stepping effects even at only 60 frames per second. Most things look smooth, but when you stare at stationary background elements (like an advertisement in the background of a skiing broadcast) while things move past, fast-moving objects move in big 'jumps'. (Note: the camera was confirmed to be faster than a 1/1000th sec shutter and running at a full 60 frames per second, so it's definitely not the camera.)
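
The wagon-wheel point is just temporal aliasing, and it is easy to demonstrate in a few lines (my own sketch, not tied to any particular display):

```python
# Temporal aliasing (the wagon-wheel effect): the perceived rotation per
# frame is the true rotation folded into +/- half a revolution, so a
# fast-enough wheel looks stationary or even reversed at ANY finite rate.
def perceived_rev_per_frame(wheel_hz: float, frame_hz: float) -> float:
    step = wheel_hz / frame_hz       # true revolutions between frames
    return (step + 0.5) % 1.0 - 0.5  # aliased into [-0.5, 0.5)

for wheel_hz in (900, 1000, 1100):
    print(wheel_hz, round(perceived_rev_per_frame(wheel_hz, 1000), 2))
# 900 -> -0.1 (spins backwards), 1000 -> 0.0 (stationary), 1100 -> 0.1
```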

There's a lot of knowledge among professionals in this area, and it is really a myth to say human eyes can't tell apart 30 fps versus 60 fps. There's a point of diminishing returns, but it doesn't even end at 120 Hz.

For most people, 30 fps is plenty, especially gaming on a regular mass-market display. But it's like standard definition to some eyes, once others have been spoiled by 1600x1200 at 100 fps on a 100 Hz CRT monitor before switching to, say, an LCD monitor -- that's high-definition motion fluidity.
 
The human eye can detect up to 500 fps (without blur) ;)
There's no specific number, actually. It can be as low as approximately 30 fps for certain material in certain environments, while it can be well beyond 1000 fps in others. (See my previous post.)

There are many image metrics that are affected by a 'frequency' (a framerate, a refresh rate) at different 'critical point' levels:
- motion blur
- flicker effects
- stopmotion / stroboscopic effects
- wagon wheel effects
- judder effects (i.e. 30fps on a 60Hz CRT)
- stutter effects (or rather, "variance in the judder", or "random judder")

Some material really shows off one or many of these, to varying extents at varying framerates, and other factors affect this (viewing distance, brightness of the display, brightness of the lighting in the room -- i.e., brighter lighting makes flicker more noticeable). Some people can tell better than others, and untrained viewers often don't notice these things (people who can't tell apart CRT and LCD motion blur until it's pointed out to them in side-by-side demos -- then, "eureka").

Of all these effects, motion blur is the toughest one to solve when it comes to LCDs. One technique to solve motion blur on an LCD is overkill (i.e. motion interpolation to 240 Hz or 480 Hz): tests have shown that motion blur at 120 Hz is 1/2 as much as at 60 Hz. At 240 Hz, motion blur is 1/4 as much. At 480 Hz, motion blur is 1/8 as much, and we hit the limits of the LCD pixels (a 2 ms response only allows up to approximately 500 Hz; beyond that, further motion interpolation is useless). Another technique is black-frame insertion or scanned backlights, but that introduces a new undesired effect (flicker) that LCDs normally do not have. So a good compromise is to use a combination of technologies to try to reduce ALL of the aforementioned effects. It's a complex engineering science. (It's very complex stuff; I understand the difficulties monitor manufacturers encounter trying to implement all such technologies in their displays. I know enough about assembly-language programming -- machine language, embedded software -- that I can assure you it's complex. Years ago, I worked on the dScaler open-source deinterlacer and invented the 3:2 pulldown algorithm that's used within it.)

500 fps would look virtually perfect. However, for some material, the difference between 500 fps and 1000 fps would be very faintly noticeable -- most particularly differences in wagon-wheel artifacts (imagine a wagon wheel, car tire, disc, etc. spinning fast enough to look stationary or almost stationary; higher framerates are one fix, but wagon-wheel effects can occur at any framerate, even 500 fps -- they just become less noticeable at higher framerates). Some very minor motion blur differences may still be observed (especially on a continuously lit display, like an LCD with a non-flickering backlight), because photographers can still see a difference between a 1/500 sec shutter and a 1/1000 sec shutter for the same fast-action material, but it would take a very critical eye at very close distances to notice.

The point is there /is/ a point of diminishing returns, where someone doesn't really care anymore. I think 500 fps would probably be a good target. My opinion is that a "near"-ultimate LCD display is probably a "480Hz" display showing a 120 fps native source, running 75% dark and 25% bright (using a scanned backlight to simulate CRT scanning at 120 Hz), so that each part of a 120 Hz frame is shown for only 1/480th of a second. The backlight cycling at 120 Hz (instead of 60 Hz) eliminates flicker, the 120 fps native source eliminates most judder/stutter/wagon-wheel effects (relative to 60 Hz), and the "480Hz" via scanned backlight eliminates the motion blur (or rather, makes motion blur approximately 1/8th as intense as at 60 Hz -- basically 480 divided by 60). Currently, we are getting close to this technology: home theater LCD makers are introducing "240Hz" and "480Hz" sets (I hate this marketing hype, as these are really technologies that simulate such rates, but the benefits are still real; see my explanation in this and other posts), and a couple of new 120 Hz native (non-interpolated) 22"/23" computer monitors are now starting to become available. Within about 5 years, by about 2014, I believe technologies like these will give us CRT quality on LCD monitors (at least on high-end consumer enthusiast displays). Probably sooner, but I like to be conservative.
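
The arithmetic behind that 75%/25% scanned-backlight figure, as a quick check (my own sketch of the numbers above):

```python
# A 120 fps source where each frame is lit for only 25% of its 1/120 s slot
# has the same per-frame persistence as a 1/480 s hold -- hence "480 Hz"
# motion clarity, roughly 1/8th the blur of a full-persistence 60 Hz hold.
frame_s = 1 / 120
lit_fraction = 0.25
persistence_s = frame_s * lit_fraction
print(f"persistence: {persistence_s * 1000:.3f} ms = 1/{1 / persistence_s:.0f} s")
# persistence: 2.083 ms = 1/480 s
```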
 
Very informative posts, Mark Rejhon :) Sounds like definite material for a FAQ or so :D

Mind if I borrow from it some time in the future? :)
 
NFS Carbon on my old 7600GT XXX would play at 1280x1024 on medium at about 39-45 FPS. Very little lag, if any. It's a Westinghouse monitor. But I think it's just your monitor or something.

Now, with my new 9600GT, I can play NFS Carbon at 70+ fps on high with the particle system on and everything. It has dipped below 60 fps and been fine; it starts to lag around 35-36 fps.
 
Very informative posts, Mark Rejhon :) Sounds like definite material for a FAQ or so :D

Mind if I borrow from it some time in the future? :)
Feel free. There's still so much misinformation out there.

But if anything gets published (blog, magazine, etc), please pass it by my eyes first. Google "marky.com" and you can track down my email address that way.
 

Okay, I will keep it in mind :)
 