wow...V-Sync sux !

Lately I've been playing a lot of Hellgate: London and I've been noticing some shitty frame rates with my 7800GTX. Playing at 1280x1024 with everything set at high (very high for a few of them) and I had V-sync enabled. This was with the monitor somehow defaulting to a 60Hz refresh.

The FPS would drop into the teens and 20s and never get above 50 or so.

Well I did a little research and saw that V-sync was locking my max FPS at no higher than 60.

So I turned that shit off..lol

Now FPS was in the 120-150s and never dropped below 80 or so !!!!!!
 
That's the purpose of V-sync... usually you want to run V-sync if you're getting screen tearing, or if your (old) games are running super fast because the FPS is uncapped, if I recall right...
 
V-sync helps with the tearing problems a lot of people have (me included). I'm locked at 60, and that's all that matters for me, because it never drops below that in most games I play. 100+ is pretty nice to have, but it really doesn't mean much for the most part. Now I know I'll probably get flamed by people pulling the whole "MY EYES SEE 100FPS" thing.

I call bull; as long as it's around 60, imo, it's eye pleasing.
 
If your FPS never went below 80, vsync would lock your frames at 60 the entire time. If your frames were in the teens and 20s with vsync on and your refresh rate at 60, that means your "real" frames per second were somewhere below 30. Here's how it works at a 60Hz refresh rate -

Code:
vsync off | vsync on
60+       | 60
30-59     | 30
20-29     | 20
15-19     | 15
12-14     | 12
10-11     | 10

etc. Always your refresh rate divided by a whole number, to eliminate any possibility of tearing. Something in your post was wrong - most likely your made-up FPS values... :rolleyes:
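
To make that concrete, here's a rough sketch of the rule the table shows (my own, not from anyone in the thread), assuming plain double-buffered v-sync at a 60Hz refresh; the function name and test values are just for illustration:

Code:
import math

REFRESH = 60

def vsync_fps(raw_fps, refresh=REFRESH):
    """Displayed FPS when the GPU can only manage raw_fps frames per second."""
    if raw_fps >= refresh:
        return refresh
    # each frame takes ceil(refresh / raw_fps) refresh intervals to finish,
    # so the displayed rate snaps down to refresh divided by that number
    intervals = math.ceil(refresh / raw_fps)
    return refresh // intervals

for raw in (80, 59, 25, 17, 13, 11):
    print(raw, "->", vsync_fps(raw))
# 80 -> 60, 59 -> 30, 25 -> 20, 17 -> 15, 13 -> 12, 11 -> 10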
 
If you enable triple buffering, that may help with the frame rate dips caused by V-sync. But it may also work only in OpenGL. Hopefully someone else can shed light on that.
 
Or... how many of the people who posted in this thread have played Hellgate: London?

It could be a game issue. Hellgate has a lot of issues to be patched still.
 
Thank you, voice of reason.. If it locks at 60Hz/FPS then he has an LCD monitor that can't display higher than 60FPS anyway, so he would just be getting an assload of tearing. Unless of course he is making this all up. Plus V-sync wouldn't matter if you always got more than 60FPS anyhow.

I'm on a CRT.. Never did I say I was on an LCD, did I? :rolleyes:

Call it what you guys want, I get minimal tearing with v-sync off, and FRAPS says I'm running along at 120 or so on avg.

The only time I get a big drop is in the Terminals, high 70s - low 80s.

I understand what you're saying though: if my minimum now is in the high 70s, then when I turn V-sync on I would get a constant 60 FPS.

Yeah, I'm making this up.. sure :rolleyes:
 
HG:L is a buggy piece of poo. This isn't some universal v-sync issue, it's a game-specific one.
 
If you enable triple buffering, that may help with the frame rate dips caused by V-sync. But it may also work only in OpenGL. Hopefully someone else can shed light on that.

TB does indeed eliminate the frame rate dips. It is natively supported for OGL, but there are ways to force it in D3D titles (e.g., ATI Tray Tools).
 
HG:L is a buggy piece of poo. This isn't some universal v-sync issue, it's a game-specific one.

Dunno about the second part, but I agree with the first. I usually leave vsync off for my games though, and live with the occasional tearing.
 
If you enable triple buffering, that may help with the frame rate dips caused by V-sync. But it may also work only in OpenGL. Hopefully someone else can shed light on that.


That does help keep frames close to your V-sync max; also turn up AA and AF.

In reviews I have seen, V-sync can let major drops happen that otherwise would not, for some reason; it's like the video card is working super hard to push out as much as it can and then being denied at the monitor. So if you do use V-sync, try cranking up all the details, even AA and AF, so your frames drop a little below the V-sync max you have.

V-sync is something brought up primarily with LCDs, simply because most CRTs these days can be set at a 100+ refresh rate @ 1280x1024 unless you have a 17" or something. So it was just assumed you had an LCD, because 99% of the time V-sync is brought up, it is in relation to LCDs.
 
HG:L is a buggy piece of poo. This isn't some universal v-sync issue, it's a game-specific one.

It applies to all games; it's a V-sync issue.

HG:L is really buggy though :)
 
TB does indeed eliminate the frame rate dips. It is natively supported for OGL, but there are ways to force it in D3D titles (e.g., ATI Tray Tools).

How do you enable TB in ATI Tray Tools? I've always used DXTweaker; it works fine but it's a bit cumbersome to set up.
 
How do you enable TB in ATI Tray Tools? I've always used DXTweaker; it works fine but it's a bit cumbersome to set up.

DXTweaker is what I use and it works very well. You would never know how to enable TB if someone didn't tell you but it isn't bad at all once you know.
 
V-sync puts your monitor's refresh rate in sync with your GPU. Your GPU is capable of producing X frames per second, but if they aren't synced up with the monitor, the frame is dropped, which is why you get the drop in frame rates. V-sync is usually a big hit to FPS, unless you have a powerful enough video card. I personally rarely notice tearing, but if you have an 8800 series, you can play most games with V-sync on and very acceptable FPS.

It doesn't matter what type of display you have, I don't think, only the refresh rate. A lot of people don't understand that if your refresh rate is 60Hz, the most FPS you can see is 60. When you're getting 150+ FPS... your eyes aren't actually seeing that. For most LCDs your eyes never see more than 60. Realize though, it's not just a frame rate cap.
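
If it helps, here's a tiny sketch of that point (my own toy numbers, nothing from the thread): the panel can't show more new images per second than its refresh rate, no matter what the GPU renders.

Code:
# made-up numbers for illustration: GPU output vs. what a 60Hz panel can show
REFRESH_HZ = 60
for gpu_fps in (150, 90, 60, 45):
    shown = min(gpu_fps, REFRESH_HZ)   # the panel can't display more refreshes than that
    print(f"GPU renders {gpu_fps} fps -> panel shows at most {shown} images/sec")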
 
I don't care what the stupid online guides are telling you kiddies, but I can definitely notice an increase in smoothness between 60 and 80.
 
I don't give a shit what you tell me, I can tell a difference in smoothness between 60 and 80.

You sense the difference in reaction time on a frame-by-frame basis, with respect to the acceleration/position of your hand/mouse. Your eyes don't sense the difference.
 
I don't care what the stupid online guides are telling you kiddies, but I can definitely notice an increase in smoothness between 60 and 80.

You aren't getting 80FPS in anything with that FX5500 PCI card. :D
 
I'm on a CRT.. Never did I say I was on an LCD, did I? :rolleyes:

Call it what you guys want, I get minimal tearing with v-sync off, and FRAPS says I'm running along at 120 or so on avg.

The only time I get a big drop is in the Terminals, high 70s - low 80s.

I understand what you're saying though: if my minimum now is in the high 70s, then when I turn V-sync on I would get a constant 60 FPS.

Yeah, I'm making this up.. sure :rolleyes:

If you play Portal with V-sync off, you will get massive (and unplayable) tearing. In some cases, V-sync is good. I mostly play with it off, but if you have a decent enough card, you might as well run it.
 
I don't care what the stupid online guides are telling you kiddies, but I can definitely notice an increase in smoothness between 60 and 80.

Yeah, I was wondering how you get above 60FPS with that card. Do you play games on another computer? Shit, not even Doom 3 on its lowest settings would get above 60FPS with that card.

PCI is just too much of a bottleneck with today's (and yesterday's) games. Even if it is the BFG Tech OC 5500 (which I used to own; try playing Quake 4 on it at any settings, yuck).


...and kiddies? lol

Ok, 'old man'. You've got to be old when you're using a card that ancient.
 
I don't care what the stupid online guides are telling you kiddies, but I can definitely notice an increase in smoothness between 60 and 80.

I've always found V-sync'd 60 FPS to look much smoother than 80+ unsynced. Even if you don't actively notice tearing, syncing the frames makes the video look smoother. Note that I'm making a distinction between how smooth it 'looks' and how smooth it 'feels'.

nissanztt90's explanation sounds right. The higher FPS will be more responsive to your controls and movements (since the frame is displayed as soon as possible, instead of waiting for the next display refresh).

PS, a little note about triple buffering: enabling it will keep v-sync from dropping your frame rate, but TB also introduces a little lag time. That may or may not be an issue.
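
As a rough illustration of that trade-off (my own simplified model, not anything official from the thread): with double buffering plus v-sync the GPU stalls until the next vblank whenever a frame misses one, while a third buffer lets it keep rendering, so the displayed rate is no longer forced down to 30/20/15. The function names and render times below are made up for the example:

Code:
import math

REFRESH_HZ = 60.0
VBLANK = 1.0 / REFRESH_HZ          # one refresh interval at 60Hz

def double_buffered_fps(render_time):
    # the GPU can't start the next frame until the buffers flip at a vblank,
    # so the frame interval rounds UP to a whole number of refresh intervals
    return 1.0 / (math.ceil(render_time / VBLANK) * VBLANK)

def triple_buffered_fps(render_time):
    # with a spare buffer the GPU keeps rendering while it waits, so the rate
    # you see is just capped at the refresh instead of quantized down
    return min(REFRESH_HZ, 1.0 / render_time)

for raw_fps in (100, 59, 45, 25):
    rt = 1.0 / raw_fps
    print(f"GPU can do {raw_fps:3d} fps -> double buffer {double_buffered_fps(rt):4.0f}, "
          f"triple buffer {triple_buffered_fps(rt):4.0f}")

This ignores the extra VRAM it costs and the up-to-one-frame of added latency, which is the lag mentioned above.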
 
"Note that I'm making a distinction between how smooth it 'looks' and how smooth it 'feels'."

That's more or less what I meant, between what your mouse movements feel like vs. what your eyes perceive... basically the reaction time. You move quick, the frame is spit out quick, the guy's head is there... BAM... dude's dead. You move quick, you have to go through a few vsync frames first, then the guy's head is there... but BAM... you're dead because your PC took too long to process the frame.
 
"Note that I'm making a distinction between how smooth it 'looks' and how smooth it 'feels'."

That's more or less what I meant, between what your mouse movements feel like vs. what your eyes perceive... basically the reaction time. You move quick, the frame is spit out quick, the guy's head is there... BAM... dude's dead. You move quick, you have to go through a few vsync frames first, then the guy's head is there... but BAM... you're dead because your PC took too long to process the frame.

Technically, the aiming has already been processed; your FPS just hasn't caught up with how you moved. Much like "flinging" AWP shots in CS: you learn how hard and how far to move your mouse and click, and if you record a demo and play it back, you'll notice the shot going off well before the crosshairs move over the player, yet they still die.
 
DXTweaker is what I use and it works very well. You would never know how to enable TB if someone didn't tell you but it isn't bad at all once you know.

I still want to know if there's an easier way to enable TB in ATI Tray Tools. :confused::confused::confused:
 
Technically, the aiming has already been processed; your FPS just hasn't caught up with how you moved. Much like "flinging" AWP shots in CS: you learn how hard and how far to move your mouse and click, and if you record a demo and play it back, you'll notice the shot going off well before the crosshairs move over the player, yet they still die.

I've read about that before, and it's supposedly called "hitbox lag"; I'm pretty sure it's in the FAQ or something on Steam's website. Not saying you're wrong or anything, but learning how far and how hard to move your mouse goes hand in hand with reaction time, and a lack of frames slows that down... I hate AWPs anyway.
 
I've noticed that with vsync on with an LCD, in some games you get a slight hitching, but I never noticed it on a CRT. I set vsync to off by default because of this, and for any games that need it on because of really bad tearing, I use the Nvidia game launcher to create custom settings with it on just for that game. With a 2ms LCD tearing isn't as noticeable, but on my older LCD with a slower response time it was very noticeable in some games. A CRT is so fast you can play just about all games with vsync off and the tearing isn't nearly as noticeable as on an LCD.
 
Most LCD panels these days have a refresh rate of more than 60Hz and have for a while. I have a BenQ 17" that does 75Hz no problem, and my Epson 3LCD projector does 120Hz. Tearing isn't really an issue for me personally. Most will tell you I'm a true believer when it comes to high FPS, and I am. However, there is certainly some truth to the whole you-feel-it-before-you-see-it argument. I can't stand either. I have noticed that tearing is more prevalent on a CRT than an LCD. LCDs with a faster GtG time also exhibit more tearing. True, you want a fast LCD, but in the 4ms range the tearing becomes prevalent and annoying. Going on personal experience, 6-9ms is the sweet spot. Get any slower and you get into ghosting territory... ugh. FPS > tearing.

I've noticed that with vsync on with an LCD, in some games you get a slight hitching, but I never noticed it on a CRT. I set vsync to off by default because of this, and for any games that need it on because of really bad tearing, I use the Nvidia game launcher to create custom settings with it on just for that game. With a 2ms LCD tearing isn't as noticeable, but on my older LCD with a slower response time it was very noticeable in some games. A CRT is so fast you can play just about all games with vsync off and the tearing isn't nearly as noticeable as on an LCD.


I have observed the opposite myself, as I stated above. Going from a 20" Trinitron to the 17" BenQ, I get much less tearing with the same settings. My 3LCD projector is set at 120Hz and has some noticeable tearing.
 
You move quick, the frame is spit out quick, the guy's head is there... BAM... dude's dead. You move quick, you have to go through a few vsync frames first, then the guy's head is there... but BAM... you're dead because your PC took too long to process the frame.
With double buffering, the longest possible delay in terms of frames is one. Divide that by 60, and you're obviously talking about 1/60th of a second. If the frame rate is below 60, then it's 1/30th of a second and so on and so forth (and you can start to see how big a problem this might be). The card never draws a number of 'old' frames, as there wouldn't be any buffers for them. The GPU renders the current frame to the back buffer, then the buffers are simply flipped.
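
Putting rough numbers on that one-frame delay (my own arithmetic, using the locked rates from the table earlier; a simplified model that only counts the wait for the flip):

Code:
# worst-case wait between "frame finished" and "frame on screen" when the
# displayed rate is locked by double-buffered v-sync
for locked_fps in (60, 30, 20, 15):
    print(f"locked at {locked_fps:2d} fps -> up to {1000.0 / locked_fps:.1f} ms of flip delay")
# 60 -> 16.7 ms, 30 -> 33.3 ms, 20 -> 50.0 ms, 15 -> 66.7 ms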

Personally, vsync completely throws my game off in CS:S, but it isn't a problem in others. I think Tech 4 games get it right with their fixed tic rate.

and correct me if I'm wrong... LCDs do not go by Hz...
They do. Your GPU is spitting out frames regardless of what kind of monitor you have, and while the individual pixels on the panel may not refresh, the active frame is still refreshing, and your display can't draw over 60 of them per second if your panel's refresh rate is set to 60 Hz.
 
LCD panels don't "refresh" like a CRT does; they "refresh" on a per-pixel basis. Each pixel changes, as opposed to the entire image on the screen being redrawn like on a CRT. So yes, it does "all" change, but through each individual pixel changing, if that makes sense?
 
Well, I was going to write a big accusatory essay telling you how stupid some of the posts have been, but I won't...

With V-sync off you can't get an accurate FPS reading, for obvious mathematical reasons.

On neither an LCD nor a CRT can you reach an FPS higher than your monitor's refresh rate.

Tearing on an LCD has no particular reason (that I'm aware of) to be better/worse than it is on a CRT.

Ultimately, turning V-sync on or off has no effect on your graphics card's total pixel output (i.e., FPS), bar a software bug. However, it does re-prioritize what pixels appear and when.

I really wish some of the people here would educate themselves on how vertical sync works; the misunderstanding is clearly rampant. I'd expect as much out of the Steam forums, but certainly not [H].
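
For what "what pixels appear and when" means in practice, here's a back-of-envelope sketch (mine, not the poster's) of where the tear line lands when v-sync is off and the buffer swap happens partway through a refresh. It assumes the screen scans out top to bottom over one refresh interval; the resolution and swap times are made-up example values:

Code:
REFRESH_HZ = 60.0
SCAN_TIME = 1.0 / REFRESH_HZ   # time to draw one full screen, top to bottom
HEIGHT = 1024                  # vertical resolution (example value)

def tear_row(swap_time_into_refresh):
    """Row where the tear shows up if the swap happens mid-scanout."""
    return int(HEIGHT * swap_time_into_refresh / SCAN_TIME)

for ms in (2.0, 8.0, 15.0):
    print(f"swap {ms:.0f} ms into the refresh -> tear around row {tear_row(ms / 1000.0)}")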
 
On neither an LCD nor a CRT can you reach an FPS higher than your monitor's refresh rate.

Interesting...

How is my refresh rate on my CRT at 100Hz at 1280x1024, but my FPS is going 120+?

Is this a FRAPS bug ?

Not flaming, just asking ;)
 
That is what your GPU is rendering, but you aren't seeing those extra 20 frames.
 
That is what your GPU is rendering, but you aren't seeing those extra 20 frames.

Wow....That I never knew :cool:

I always assumed that whatever FPS FRAPS was showing is what you were seeing... lol

Damn, I learned something today!
 
Hellgate is buggy, yeah, but V-sync does limit FPS.

You can definitely notice the difference between 60FPS and 100FPS.

Use /fps in Hellgate to show your FPS.

CRTs still rape LCDs for gaming performance. 120Hz, 150Hz > 2ms GtG LCDs.
 
^^ Due to the higher refresh rates CRTs allow.

But for many, 60FPS is more than enough; others prefer more, others like less. It is personal preference.
 
I hate it. It doesn't just limit FPS to 60: if your FPS drops below 60 it will limit it to 30, if it drops below 30 it will limit it to 20, and so on, so that it prevents tearing and stays in sync with the refresh rate. Sucks.

(might be wrong about the number specifics, but I do know it cuts FPS pretty badly when they drop below 60)


Better to crank settings as far as you can so that you average 40-60 FPS just because the graphics settings are so high.
 