Why don't people use vsync?

What does semi-competitively mean? The difference between fps_max 100 and 60 is quite noticeable with any type of snap aiming, and I don't know of a single serious player who would back up your statement. fps_max 100, 100Hz or higher (CRT only for competitive gaming) and Vsync off. The main reason for disabling Vsync was the FPS hit. Even on the super old HL engine, smokes caused FPS problems 3+ years ago.
after the HL1 engine removed FPS movement limitations, there was no benefit to getting FPS over your monitor's refresh rate, for the many reasons listed previously. your GFX card can render images at whatever FPS it wants, but the monitor can't display them faster than it can refresh itself (60Hz for the vast majority of LCDs) without getting page tearing. years ago it was 60, 75, and 80Hz for CRTs, but the effect was pretty much the same as far as i know. if it's not, then it's moot, as not many people still use CRTs these days.

you can't argue with the facts, which prove my above statements. i may not have a computer engineering degree like defaultluser, but i do understand the basics of how monitors display an image.
 
Doesn't the refresh rate of the monitor "cap" the framerate to 60 anyway?

Yes.

I meant that the movement and such is smoother, letting you react faster. Of course, the flashbang is still going to wear off at the same rate, but you're going to get more frames of it going away, meaning you can see again sooner, whereas with say 30 frames it's going to be choppier and you can't react properly. Same goes for my statement on player movement. I know they're not moving faster, but they're moving more smoothly, as is your aim, which enhances your own gameplay.

No.
 
defaultluser's post cleared everything up for me :)

I will continue not using VSYNC now, as input lag is a huge turnoff, while I don't really notice much tearing (in most cases).
 
I like using it for single player games. Like the Half-Life 2 series. Looks so smooth and beautiful.

But I cannot stand the mouse lag I get when playing CS:S, CoD and other FPS games.

So for me:
SP = Vsync On
MP = Vsync Off
 
I haven't used Vsync since a few days after I first learned it existed, because of its deficiencies. Input lag, less recent frames, jittering/staggering frame rates, etc. are all a "no go" for me. I hardly ever see screen tearing without it, and it doesn't bother me anyway.
 
game-by-game basis for me. in some older games vsync on is a must-have because it eliminates a lot of issues.
 
I haven't used Vsync since a few days after I first learned it existed, because of its deficiencies. Input lag, less recent frames, jittering/staggering frame rates, etc. are all a "no go" for me. I hardly ever see screen tearing without it, and it doesn't bother me anyway.

+1
input lag is too noticeable compared to v-sync off and I can't stand that.
 
No, you can't, not at a constant 45fps. The buffer will eventually empty if the video chip is supplying frames at 45fps and the screen is eating them at 60fps; at that point, the triple buffer becomes a double buffer and has the same problems.

What triple buffering CAN do is give you a fairly smooth experience if you have, say, a 60fps AVERAGE. The buffer will empty as it smooths out temporary dips in framerate, and will refill if the framerate temporarily spikes above 60fps.

HOWEVER: triple buffering takes 50% more framebuffer memory, and doubles your video lag. In other words, there is no easy solution to the problem; they all have their drawbacks.

Which one drives you crazier? Input lag, or tearing? For me, it's definitely input lag.

NO VSYNC: best performance, has tearing.

VSYNC: no tearing, but requires a MINIMUM framerate equal to the monitor refresh rate to prevent jitter.

VSYNC WITH BUFFER: no tearing; the AVERAGE framerate required for smooth gameplay is the same as the monitor refresh. HOWEVER: with AVERAGE framerates slower than the monitor refresh, you get jitter. Also takes more memory, and adds input lag.
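
To make the double-buffer case concrete, here is a rough sketch (illustrative numbers only, not a claim about any particular game or driver) of what vsync does to a renderer that can only manage 45fps on a 60Hz screen once there is no spare buffered frame left:

```python
import math

# Rough sketch, illustrative numbers only: double-buffered vsync with a GPU
# that needs 1/45 s per frame on a 60 Hz display.
REFRESH_HZ = 60.0
RENDER_FPS = 45.0
refresh = 1.0 / REFRESH_HZ          # ~16.7 ms per scanout
render = 1.0 / RENDER_FPS           # ~22.2 ms to draw one frame

def next_vblank(t):
    """First vblank at or after time t."""
    return math.ceil(t / refresh) * refresh

t = 0.0                             # GPU starts drawing frame 0 at t = 0
flips = []
for _ in range(10):
    done = t + render               # back buffer finished
    flip = next_vblank(done)        # with vsync, the swap waits for a vblank
    flips.append(flip)
    t = flip                        # double buffering: the GPU stalls until
                                    # the back buffer is released at the swap

intervals = [b - a for a, b in zip(flips, flips[1:])]
print([round(1.0 / i) for i in intervals])
# -> [30, 30, 30, ...]: the 45 fps renderer gets forced down to 30 fps,
#    which is the "same problems" case described above.
```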

I don't think you understand how triple buffering works. It doesn't fill and empty, nor will a triple buffer ever turn into a double buffer.

There is a technique called triple-buffering that solves this VSync problem. Let's go back to our 50FPS, 75Hz example. Frame 1 is in the frame buffer, and 2/3 of frame 2 is drawn in the back buffer. The refresh happens and frame 1 is grabbed for the first time. The last third of frame 2 is drawn in the back buffer, and the first third of frame 3 is drawn in the second back buffer (hence the term triple-buffering). The refresh happens, frame 1 is grabbed for the second time, and frame 2 is copied into the frame buffer and the first part of frame 3 into the back buffer. The last 2/3 of frame 3 is drawn in the back buffer, the refresh happens, frame 2 is grabbed for the first time, and frame 3 is copied to the frame buffer. The process starts over. This time we still got 2 frames, but in only 3 refresh cycles. That's 2/3 of the refresh rate, which is 50FPS, exactly what we would have gotten without it. Triple-buffering essentially gives the video card someplace to keep doing work while it waits to transfer the back buffer to the frame buffer, so it doesn't have to waste time. Unfortunately, triple-buffering isn't available in every game, and in fact it isn't too common. It also can cost a little performance to utilize, as it requires extra VRAM for the buffers, and time spent copying all of them around. However, triple-buffered VSync really is the key to the best experience as you eliminate tearing without the downsides of normal VSync (unless you consider the fact that your FPS is capped a downside... which is silly because you can't see an FPS higher than your refresh anyway).
http://www.hardforum.com/showthread.php?t=928593
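
Here is a rough sketch of that 50fps / 75Hz walkthrough (my own illustration, not code from the linked article): frame i is assumed to finish at (i + 1)/50 s, and at each 75Hz refresh the display grabs the newest completed frame, which is what the spare back buffer lets the card keep producing.

```python
RENDER_FPS = 50.0
REFRESH_HZ = 75.0

def newest_complete_frame(t):
    """Index of the latest frame finished by time t (-1 if none yet)."""
    return int(t * RENDER_FPS) - 1

shown = []
for j in range(1, int(REFRESH_HZ) + 1):     # one second's worth of refreshes
    t = j / REFRESH_HZ
    shown.append(newest_complete_frame(t))

unique = len(set(f for f in shown if f >= 0))
print(unique)      # -> 50 distinct frames across 75 refreshes
# Two new frames every three refreshes on average, matching the
# 2-frames-per-3-refresh-cycles pattern described in the quote: 50 fps.
```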
 
I don't think you understand how triple buffering works. It doesn't fill and empty, nor will a triple buffer ever turn into a double buffer.


http://www.hardforum.com/showthread.php?t=928593

There is a technique called triple-buffering that solves this VSync problem. Let's go back to our 50FPS, 75Hz example. Frame 1 is in the frame buffer, and 2/3 of frame 2 is drawn in the back buffer. The refresh happens and frame 1 is grabbed for the first time. The last third of frame 2 is drawn in the back buffer, and the first third of frame 3 is drawn in the second back buffer (hence the term triple-buffering). The refresh happens, frame 1 is grabbed for the second time, and frame 2 is copied into the frame buffer and the first part of frame 3 into the back buffer. The last 2/3 of frame 3 is drawn in the back buffer, the refresh happens, frame 2 is grabbed for the first time, and frame 3 is copied to the frame buffer. The process starts over. This time we still got 2 frames, but in only 3 refresh cycles. That's 2/3 of the refresh rate, which is 50FPS, exactly what we would have gotten without it.

Think about what you just described. You just described my EMPTY BUFFER case, except that since we are working with FRAMERATE and 2/3 FRAMERATE, the ratio is 2:3 for extra frames. Notice above that frame 1 is grabbed twice. So will frames 3, 5, 7, 9... which gives you the same jitter problem I mentioned above.

Don't believe me? Do the math. You get a rate of 75fps on one out of every three refreshes, and a rate of half that (37.5fps) on the other two:

75 * (1/3) + 37.5 * (2/3) = 50 fps.

The math doesn't lie. To "average" 50fps with VSYNC and triple buffering, your card is JERKILY moving between a rate of 75 and 37.5 fps once every second.
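
For anyone who wants to check that weighting, here is the same arithmetic spelled out, using the 50fps / 75Hz numbers from the quote (nothing measured, just the claimed pattern of one short hold and one long hold per pair of frames):

```python
refresh = 1.0 / 75.0                 # one scanout
short_hold = 1 * refresh             # frame held for one refresh (75 fps instantaneous)
long_hold = 2 * refresh              # frame held for two refreshes (37.5 fps)

# Two frames delivered every three scanouts:
print(2 / (short_hold + long_hold))          # 50.0 fps average

# Same number, weighted the way the post does it: 1/3 of the time at 75 fps,
# 2/3 of the time at 37.5 fps.
print(75 * (1 / 3) + 37.5 * (2 / 3))         # 50.0
```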
 
Think about what you just described. You just described my EMPTY BUFFER case, except that since we are working with FRAMERATE and 2/3 FRAMERATE, the ratio is 2:3 for extra frames. Notice above that frame 1 is grabbed twice. So will frames 3, 5, 7, 9... which gives you the same jitter problem I mentioned above.

Don't believe me? Do the math. You get a rate of 75fps on one out of every three refreshes, and a rate of half that (37.5fps) on the other two:

75 * (1/3) + 37.5 * (2/3) = 50 fps.

The math doesn't lie. To "average" 50fps with VSYNC and triple buffering, your card is JERKILY moving between a rate of 75 and 37.5 fps once every second.

It's not once every second, it's more like 50 times every second, and it's not going to be any jerkier than without vsync.
 
i run with vsync on because i don't see a reason for the system to output images that won't get used...

I agree, Vsync makes a world of difference in the quality of the rendered image on my screen. This is another of those threads where people will get their panties in a bunch because they can "see" 120FPS. There are tons of threads out there just like this one. Play the game the way YOU want to play it :) if Vsync looks better to you on than off then go for it :) if not then hey, live with the tearing :) either way, as long as you're having fun it doesn't really matter :D
 
after the HL1 engine removed FPS movement limitations, there was no benefit to getting FPS over your monitor's refresh rate, for the many reasons listed previously. your GFX card can render images at whatever FPS it wants, but the monitor can't display them faster than it can refresh itself (60Hz for the vast majority of LCDs) without getting page tearing. years ago it was 60, 75, and 80Hz for CRTs, but the effect was pretty much the same as far as i know. if it's not, then it's moot, as not many people still use CRTs these days.

you can't argue with the facts, which prove my above statements. i may not have a computer engineering degree like defaultluser, but i do understand the basics of how monitors display an image.

The facts are that no one plays FPS games competitively on LCD monitors if they have the choice. Most CRTs supported well over 100Hz at low resolutions, even 5+ years ago. I'm not talking about casual players who want a great experience, I'm talking about people who want the very best performance, so they remove every detail and limitation possible. That's why most stuck around the 640x480 range.
 
The facts are that no one plays FPS games competitively on LCD monitors if they have the choice. Most CRTs supported well over 100Hz at low resolutions, even 5+ years ago. I'm not talking about casual players who want a great experience, I'm talking about people who want the very best performance, so they remove every detail and limitation possible. That's why most stuck around the 640x480 range.

no, we used 640x480 b/c it made enemy heads GINORMOUS and very easy to pop. the bigger your target, the easier to hit. that was the concept behind running low resolutions for FPS games. i had a quality 19" CRT that did 1600x1200 at 75hz, yet i played CS 1.x @ 800x600 for just this reason. i also enabled vsync and max FSAA/AF to compensate for the jaggies of such a low res.

i was a huge CRT supporter, and it wasn't until the 2ms panels came out (really turbo-charged 5ms) that i finally jumped to LCD. plus my CRT was dying, and the only CRTs i saw myself buying were used Sony 24" widescreen CAD displays @ $600-700 w/ 1-3 years of use already under their belts.
 
no, we used 640x480 b/c it made enemy heads GINORMOUS and very easy to pop. the bigger your target, the easier to hit. that was the concept behind running low resolutions for FPS games. i had a quality 19" CRT that did 1600x1200 at 75hz, yet i played CS 1.x @ 800x600 for just this reason. i also enabled vsync and max FSAA/AF to compensate for the jaggies of such a low res.
ok I'm confused, because 640x480 or 1280x960 on the same monitor still results in the same size head along with everything else. :confused:
 
vsync is good for single player, sucks for fast multiplayer. It adds a noticeable amount of input lag to the game. I myself was doing really poorly at TF2 a few weeks ago and thought I'd just had a really bad night. A few days later I noticed that vsync was on. I switched it off and my accuracy and score were back to normal.

So yeah, in some cases it definitely makes a difference.
 
vsync is good for single player, sucks for fast multiplayer. It adds a noticeable amount of input lag to the game. I myself was doing really poorly at TF2 a few weeks ago and thought I'd just had a really bad night. A few days later I noticed that vsync was on. I switched it off and my accuracy and score were back to normal.

So yeah, in some cases it definitely makes a difference.

Great analysis.

I drank mountain dew one night, and got into a car accident.
I drank coke the next night, and DIDN'T get into a car accident.

MAH GAWD, DEW CAUSES ACCIDENTS.

If I'm using an LCD, I turn Vsync on... plain and simple. Tearing is 10x more annoying than getting 60fps vs. 100fps...

If you're one of the spazzballs that still bitch about millisecond differences in input-lag for games, you shouldn't be using an LCD in the first place.
 
Great analysis.

I drank mountain dew one night, and got into a car accident.
I drank coke the next night, and DIDN'T get into a car accident.

MAH GAWD, DEW CAUSES ACCIDENTS.

If I'm using an LCD, I turn Vsync on... plain and simple. Tearing is 10x more annoying than getting 60fps vs. 100fps...

If you're one of the spazzballs that still bitch about millisecond differences in input-lag for games, you shouldn't be using an LCD in the first place.

Agree :) I love that analysis HAHA. Just because he was having a bad night he blamed it on Vsync ROFLMAO OMG THAT'S FUNNY! ok sorry hehe, if you like to play without vsync go for it :) who am I to tell you to stop, but I don't care who you are, THAT'S FUNNY!!! (sorry Larry the Cable Guy, I had to say it)
 
I guess I will turn on v-sync when playing The Witcher, because I get tearing at times and during in-game movie playback, so this should help. Otherwise I have always left it off.
 
Great analysis.

I drank mountain dew one night, and got into a car accident.
I drank coke the next night, and DIDN'T get into a car accident.

MAH GAWD, DEW CAUSES ACCIDENTS.

If I'm using an LCD, I turn Vsync on... plain and simple. Tearing is 10x more annoying than getting 60fps vs. 100fps...

If you're one of the spazzballs that still bitch about millisecond differences in input-lag for games, you shouldn't be using an LCD in the first place.

It's a stupid analysis. Maybe if you wrote

I drank whiskey one night, and got into a car accident.
I drank coke the next night, and DIDN'T get into a car accident.

MAH GAWD, WHISKEY CAUSES ACCIDENTS

It would be better, because Vsync on impairs your ability to play just like alcohol impairs your ability to drive. Or maybe your ability to drive is as bad as your ability to play FPS shooters, and you wouldn't notice the difference between driving drunk and sober, just like you don't notice the difference between vsync on and vsync off.

Just because you can't see or feel the difference between vsync on and vsync off doesn't mean that other people can't. Serpico has it right, vsync on really is bad for multiplayer FPS.

You would have to be completely and totally useless at FPS shooters not to notice the lag that occurs with Vsync on.

Let me explain some of the things that happen with vsync on. In Counter-Strike, for instance, a guy comes around a corner and kills you, but on your screen he hasn't appeared yet because of the lag introduced by vsync. It's only a fraction of a second, but that is long enough. Or you shoot at a guy standing still and miss, and he just turns and kills you easily? Well, it's because he had moved before you started to shoot, but because of vsync he hadn't started to move on your screen. He is alerted by your missed shot and turns and kills you.

Yes, it's only fractions of a second, but it really does affect your play to always be that little bit behind the game.
 
It's a stupid analysis. Maybe if you wrote

I drank whiskey one night, and got into a car accident.
I drank coke the next night, and DIDN'T get into a car accident.

MAH GAWD, WHISKEY CAUSES ACCIDENTS

It would be better, because Vsync on impairs your ability to play just like alcohol impairs your ability to drive. Or maybe your ability to drive is as bad as your ability to play FPS shooters, and you wouldn't notice the difference between driving drunk and sober, just like you don't notice the difference between vsync on and vsync off.

Just because you can't see or feel the difference between vsync on and vsync off doesn't mean that other people can't. Serpico has it right, vsync on really is bad for multiplayer FPS.

You would have to be completely and totally useless at FPS shooters not to notice the lag that occurs with Vsync on.

Let me explain some of the things that happen with vsync on. In Counter-Strike, for instance, a guy comes around a corner and kills you, but on your screen he hasn't appeared yet because of the lag introduced by vsync. It's only a fraction of a second, but that is long enough. Or you shoot at a guy standing still and miss, and he just turns and kills you easily? Well, it's because he had moved before you started to shoot, but because of vsync he hadn't started to move on your screen. He is alerted by your missed shot and turns and kills you.

Yes, it's only fractions of a second, but it really does affect your play to always be that little bit behind the game.

from what i read... vsync adds minute INPUT lag in some games, not OUTPUT lag. that means what YOU do is slightly delayed, not what you see. there may be a slightly longer delay b/w your mouse click and the gun firing, but you'll see enemies peeking corners exactly the same with both settings...
 
If your FPS is over 60 you *CAN'T* see a difference between vsync on and off (and there isn't any lag, either). Anything you do notice is likely a side effect or some random driver bug (input lag can occur, though, as the game only checks for events once every 1/60th of a second or so).
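
To put a rough number on that polling point, here is a back-of-envelope sketch; the polling model and the figures are assumptions for illustration, not measurements of any game or driver:

```python
# Assumptions (mine): input is sampled once per rendered frame, the finished
# frame then waits for the next vblank, and vsync caps the game at 60 fps.
REFRESH_HZ = 60.0
frame_time = 1.0 / REFRESH_HZ        # ~16.7 ms per frame with vsync on

sample_wait = frame_time             # worst case: the click lands just after the poll
display_wait = frame_time            # worst case: the frame just missed a vblank

print(round((sample_wait + display_wait) * 1000, 1))   # ~33.3 ms worst case
# With vsync off the finished frame is scanned out immediately (with tearing),
# so the display_wait term mostly disappears; the sampling term stays.
```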
 
I am seriously feeling the input lag issue. Definitely the reason v-sync is off in most of my FPSing.
However, in most older games v-sync is the best way to play, because they weren't designed to run at the obscene frame rates modern hardware takes them to. Some newer games definitely need to be v-sync'd also; anything Quake 4-based and anything UT3-based seems to just love v-sync on my box.

If your FPS is over 60 you *CAN'T* see a difference between vsync on and off

This is close but not entirely correct. I have to run, but this site explains it quite well:
http://www.100fps.com/how_many_frames_can_humans_see.htm
 
i can notice a difference between 60 and 100fps. just play counter-strike 1.6 at 60fps for 1 month, then play it at 100fps for 1 month. Go back to 60fps and you will notice the difference. (you should really start noticing the difference when you play at 100fps instead of 60)


and btw, have you noticed why CoD4 defaults to trying to reach 90fps instead of 60? take a big wild guess.
 
i can notice a difference between 60 and 100fps. just play counter-strike 1.6 at 60fps for 1 month, then play it at 100fps for 1 month. Go back to 60fps and you will notice the difference. (you should really start noticing the difference when you play at 100fps instead of 60)

This is close but not entirely correct. I have to run, but this site explains it quite well:
http://www.100fps.com/how_many_frame...humans_see.htm

Ultimately you end up getting 60FPS regardless, hence you can't see a difference. It is not possible to see the difference between 60fps and 60fps (even if it's rendering 100fps, you still only get 60 of them), as they are the same.

and btw, have you noticed why CoD4 defaults to trying to reach 90fps instead of 60? take a big wild guess.

CoD4 defaulted to averaging ~40fps for me, nowhere near 90. And have you noticed that consoles get well below 60fps, yet people still indicate that a game is smooth by saying that it gets 60fps?
 
ok I'm confused, because 640x480 or 1280x960 on the same monitor still results in the same size head along with everything else. :confused:

It is the same head size. The only difference you encounter is quality and xhair size. I'm not sure why he's saying head sizes are the reason for choosing 800x600. I'm not sure why he would do anything to remove jaggies anyways.
 
models are the same size but playing at a small rez made your sights and Xhair a larger proportion of your screen, and made it easier to hit targets.

and why wouldn't i remove jagged edges? larger resolutions produce smoother model edges, so when playing in a small rez, you need to enable antialiasing (or higher levels of it) to make edges look smooth.
 
Interesting to read what everyone has to say about this whole vsync @ 60FPS debate. What I'm more interested in is the people here who "can notice a difference" with a frame rate of 61 or more in their games.

I can say with proof that ANYONE who claims to notice a difference when dropping from 150 FPS down to 60 FPS is full of shit, and I can prove it too.

60 FPS is an output of 60Hz in electrical terms. What else in everyday life runs at 60Hz? AC current. Yes, that wonderful stuff that powers the lights in your house operates at the same rate as 60 FPS in your games.

The way a light bulb runs on 60Hz AC current is that it turns ON then OFF 60 times a second, but your eyes CAN'T SEE IT, so it appears as constant light. So, if we were to increase from 60Hz up to 150Hz, then drop it back to 60Hz again, you won't notice a thing. You can try to prove me wrong by looking at a light, but I doubt your eyes can render that fast. LIKEWISE WITH YOUR GAMES.

And... if you think I'm full of shit about this, I'm currently being trained as an avionics aeronautical engineer in the RNZAF. I've been exposed to lots of systems running somewhat the same as games, and it makes no difference once you're above the 60Hz mark.
 
The way a light bulb runs on 60Hz AC current is that it turns ON then OFF 60 times a second, but your eyes CAN'T SEE IT, so it appears as constant light. So, if we were to increase from 60Hz up to 150Hz, then drop it back to 60Hz again, you won't notice a thing. You can try to prove me wrong by looking at a light, but I doubt your eyes can render that fast. LIKEWISE WITH YOUR GAMES.

Well, to be fair, that's not entirely correct. Standard incandescent lamps have heat-up and cool-down periods. The time it takes a 60Hz AC cycle to switch from current flowing in one direction to the other is far, far, far less than it takes for the lamp to cool down (and thus give off less light) by any perceivable amount. The lamp doesn't completely switch on and off like your example requires. CRTs can switch off their pixels fast enough to be noticed. 60Hz on a CRT will drive me mad. I could tell the difference between 75Hz and 85Hz at 1280x1024 on my old 19" CRT.

Driver inefficiency with vsync could be a legit reason, though. With vsync, it's up to the card and the driver to determine which frames will be rendered and at what times. Without it, it would be the framebuffer's job to just drop frames that cannot be displayed. To me, though, it sounds much more like snake oil. It kind of reminds me of the attempts at overclocking the PS/2 ports for faster mouse and keyboard response times. :rolleyes:
 
Most people have taken this the wrong way. It's the difference between vsync on and vsync off, not between 60 and 100 fps. Vsync introduces lag, which gets you killed in first-person shooters. It makes mouse movements very smooth, which is terrible for any FPS player.

The reason framerate is important is not because things happen faster or anything like that, but because in FPSs, when there are firefights or heavy action, frame rates drop. I have seen good systems fall from a 100fps average to around 30-40fps in large firefights. This affects your play and causes stuttering etc. So that is why people want as much framerate as possible, so that their average frame rate doesn't drop below 60fps and stays smooth.

To the guy with the light bulbs: it's not really a great example, because your light bulb doesn't turn on and off 60 times in a second. Go sit in front of a CRT monitor and change the refresh rate to 60Hz and compare it to 100Hz. You will notice (most people do) a flickering at 60Hz, but not at 100Hz. Some people can still notice the flickering at 100Hz.
 
and why wouldn't i remove jagged edges? larger resolutions produce smoother model edges, so when playing in a small rez, you need to enable antialiasing (or higher levels of it) to make edges look smooth.

Because seriously competitive players don't want the game to look nice. :p Have you ever played Q3 with a pro's config? The game looks absolutely terrible, but it has fewer distractions, which helps with reaction times. The same would have happened in CS if those settings weren't banned from leagues. Jaggies provide a slight advantage as well. I understand that this doesn't apply to normal gamers here, but you did bring up competitive CS, so that's why I was explaining why vsync and most other enhancements were turned off.
 
Because seriously competitive players don't want the game to look nice. :p Have you ever played Q3 with a pro's config? The game looks absolutely terrible, but it has fewer distractions, which helps with reaction times. The same would have happened in CS if those settings weren't banned from leagues. Jaggies provide a slight advantage as well. I understand that this doesn't apply to normal gamers here, but you did bring up competitive CS, so that's why I was explaining why vsync and most other enhancements were turned off.
now i'm just curious. how would it reduce reaction times? if your machine can display full quality at a low rez without any slowdowns, how would that affect the time it takes for your machine to process an input or output?


i can definitely tell the difference b/w 60 and 85hz on a CRT. that was a major reason for running 85-100hz, and with vsync on you'd get over 60FPS because of this. 60hz gave me a headache after an hr or 2. i could walk up to a public PC and tell right away if it was 60 hz or higher.
 
now i'm just curious. how would it reduce reaction times? if your machine can display full quality at a low rez without any slowdowns, how would that affect the time it takes for your machine to process an input or output?


i can definitely tell the difference b/w 60 and 85hz on a CRT. that was a major reason for running 85-100hz, and with vsync on you'd get over 60FPS because of this. 60hz gave me a headache after an hr or 2. i could walk up to a public PC and tell right away if it was 60 hz or higher.

60Hz CRTs give me a headache too, but on LCDs it's not the same effect, since the backlight doesn't flicker. I only started using Vsync when I got an LCD, due to tearing. I play a lot of shooters, and honestly the input lag issue from vsync is an invisibly fine hair to split compared to the input lag some (poorer) LCDs give naturally. (Love my 226BW though!) If you are really that hardcore about it, you should probably stick with a CRT so you can ramp up the refresh rate to 100Hz, if that's what you really need.
 
It makes my mouse lag and kills my framerate. Do we need more than 60fps on 60Hz displays? No, but it does make the mouse feel a whole lot smoother. Also, tearing doesn't bother me; I can't remember the last time I even noticed it.
 
now i'm just curious. how would it reduce reaction times? if your machine can display full quality at a low rez without any slowdowns, how would that affect the time it takes for your machine to process an input or output?


i can definitely tell the difference b/w 60 and 85hz on a CRT. that was a major reason for running 85-100hz, and with vsync on you'd get over 60FPS because of this. 60hz gave me a headache after an hr or 2. i could walk up to a public PC and tell right away if it was 60 hz or higher.

I meant that the entire config is designed to reduce game detail, which improves reaction times. The equivalent in CS would be changing texture sizes to 1x1 (gl_max_size 1), essentially a white-walls effect. More contrast and less to focus on = advantage. For the purposes of this discussion, it would be silly to use the worst possible graphics and keep AA and Vsync.
 
I meant that the entire config is designed to reduce game detail, which improves reaction times. The equivalent in CS would be changing texture sizes to 1x1 (gl_max_size 1), essentially a white-walls effect. More contrast and less to focus on = advantage. For the purposes of this discussion, it would be silly to use the worst possible graphics and keep AA and Vsync.
ah, mipmap detail and such. i guess overall it could decrease response time a little. i'm sure back in those days you'd notice it more than today, with multi-cores and stripped-to-the-bone XP OSes w/ 2+GB of RAM, even in a modern game.
 
Interesting to read what everyone has to say about this whole vsync @ 60FPS debate. What I'm more interested in is the people here who "can notice a difference" with a frame rate of 61 or more in their games.

I can say with proof that ANYONE who claims to notice a difference when dropping from 150 FPS down to 60 FPS is full of shit, and I can prove it too.

60 FPS is an output of 60Hz in electrical terms. What else in everyday life runs at 60Hz? AC current. Yes, that wonderful stuff that powers the lights in your house operates at the same rate as 60 FPS in your games.

The way a light bulb runs on 60Hz AC current is that it turns ON then OFF 60 times a second, but your eyes CAN'T SEE IT, so it appears as constant light. So, if we were to increase from 60Hz up to 150Hz, then drop it back to 60Hz again, you won't notice a thing. You can try to prove me wrong by looking at a light, but I doubt your eyes can render that fast. LIKEWISE WITH YOUR GAMES.

And... if you think I'm full of shit about this, I'm currently being trained as an avionics aeronautical engineer in the RNZAF. I've been exposed to lots of systems running somewhat the same as games, and it makes no difference once you're above the 60Hz mark.

Your argument here only applies to fluorescent lights. In typical incandescents, the hot filament stays hot, and therefore bright, despite alternations in current; it can't cool fast enough to dim or flicker visibly. People DO notice flickering in fluorescent lamps. This is why you often see them in pairs: one will be on when the other is off. Also, fluorescent lamps which operate directly from mains-frequency AC will flicker at twice the mains frequency, since the power being delivered to the lamp drops to zero twice per cycle. This means the light flickers 120 times per second (120Hz) in countries which use 60-cycle-per-second (60Hz) AC. And how often do you stare directly at a light bulb?

It's been proven by the USAF that humans can not only detect an image but also identify it (USAF pilots saw the image of a plane) even if it is shown for only 1 frame in 220 (the other frames were all black). This is primarily due to afterimage. So there is DEFINITELY a difference even above 200fps, but the question becomes how significant this difference is in a realistic gaming situation. Also, even if you can't notice the difference, it can still affect you. In high school my friends and family were amazed that I could sit in front of my computer for over 30 hours on a weekend and feel fine afterwards. I imagine it was because my monitor was at either 100 or 120Hz (I always used 800x600). And the green move from incandescent to fluorescent has often been hindered by legitimate complaints that some people are bothered by the flickering.

Personally, yes, I can EASILY tell the difference between 60fps and 100fps. I still use a CRT, and I can't stand the 60Hz setting; I always have it at 100+. I can also tell the difference between 100 and 120, but it is negligible enough for me to stay at 100Hz so that I don't risk damaging my monitor. LCDs don't work the same way, though; you can't just talk about the Hz.

So for v-sync: because I use a CRT at 100Hz, I rarely have v-sync on, for reasons talked about earlier in this thread. And you have to consider the FPS you get when crap is going on and you have to do and shoot stuff, not when you are staring at a wall.

I'm still waiting for an LCD that is 22+ inches, 2ms or lower, <6ms input lag, and <$1k. Viewing angle doesn't concern me, but color fidelity might. Finally, with the Iiyama ProLite E2201W, I may just get an LCD.
 
I generally use vsync. But in Counter-Strike 1.6, vsync gives me mouse delay, and it's veryyyyyyyy annoying. I enable triple buffering, and it still lags. I mean... it looks so much better with vsync on, but the mouse lag is really awful.
 
Well, I almost always use V-Sync, but this thread has made me question that. I always bench first without it just to get a feel, but if it's at a good speed (like over 60fps) then I'll just turn it on. The tearing is extremely annoying, and it's worth losing some speed for clarity in the picture. I'm not sure how people don't notice it; it is really obvious in a lot of games. For me, V-Sync on just looks better subjectively.

Also, I cannot stand people that keep spreading these lies about the human eye not seeing past 60Hz. That is totally bogus. You think a CRT monitor is more advanced than a human eye?!?! I can say with absolute certainty that there is a difference up to 120Hz. The flicker on a CRT monitor is very obvious, at least to me. Even 85Hz looks like garbage, to be honest. 100Hz is smoother, but it can definitely go higher, since there's a noticeable difference at 120Hz. If you think otherwise, you clearly have never seen a game running at over 100fps on a suitable monitor.

Also, Samsung has 120Hz LCD monitors that should hit late this year or early next. Then we can finally see all those glorious frames the GPUs are chucking down the drain.
 