How VSync works, and why people loathe it

Your framerate can be higher if you have vsync off, but of course you'll never be able to see more than the actual refresh rate of your monitor; it simply can't draw faster than that. At most you'll see multiple tears in one frame, which only gets more disorienting.
 
Brent_Justice said:
Tearing is still a problem on LCDs as well. Yes, they don't refresh, but tell Windows that :D Notice they do run at 60 Hz in Windows. VSYNC is a part of games and the OS.
Considering this, and how LCDs operate, would it be possible, using DVI output (and appropriate software, so let's assume something other than Windows), to have the video card directly update the output pixel by pixel, or rather frame by frame, as fast as the video card can draw each frame? From that point it's only a matter of whether the response time of the LCD is low enough to keep up with the drawing.

Again, I understand that the way software works on a Windows machine won't allow for this, but hypothetically speaking, and considering hardware exclusively: is it possible with existing hardware, or is all of that still stuck in a legacy mode, for lack of a better term, and still dependent on designated refresh rates?

EDIT:
Arkalius said:
I should test this in SC3... I used another utility to force TB in DX while playing SC3 and it worked (apparently). I haven't tried using the driver setting. I should also test in Doom 3...
Doom 3, I know, uses triple buffering by default. You should be able to disable/enable it through the game console. Forcing triple buffering through the drivers on Doom 3 may cause issues, so be warned if it doesn't work out well. SC3 I'm not too sure about. I believe it also uses triple buffering by default, but if nothing else, you should be able to enable it by means of the ini files. Can't give you specifics, but it was easy to pull off in Pandora Tomorrow. In fact, it was about the only way to get frame rates above half your refresh, because the damned thing was coded so poorly.
 
Kiggles said:
Considering this, and how LCDs operate, would it be possible, using DVI output (and appropriate software, so let's assume something other than Windows), to have the video card directly update the output pixel by pixel, or rather frame by frame, as fast as the video card can draw each frame? From that point it's only a matter of whether the response time of the LCD is low enough to keep up with the drawing.

Again, I understand that the way software works on a Windows machine won't allow for this, but hypothetically speaking, and considering hardware exclusively: is it possible with existing hardware, or is all of that still stuck in a legacy mode, for lack of a better term, and still dependent on designated refresh rates?

What do you mean exactly? To use a 'variable' refresh rate which always matches your game's output?
In theory that is possible. Generally video cards can handle pretty much any refresh rate already, even with analog... and so can monitors (multisync).
The problem is in re-syncing the signals; I don't think that can be done quickly enough, or accurately enough. As you may know, changing from e.g. 60 to 75 Hz generally also means you need to re-center your screen, and modern monitors store these settings for various refresh rates after you've adjusted them. Some even have an auto-center function. But both switching to a different refresh rate and using the auto-center take quite a bit of time, far too long for realtime use.
 
Awesome Thread. Turned off Vsync in WoW and my Frame rate shot through the roof. Very happy.
 
The part about triple buffering being uncommon in games is moot, since you can enable it in the video card drivers (ATI, not sure about NVidia).

ATI released a driver about a year ago that enabled triple buffering for OpenGL games; D3D games are triple buffered by default.

E.g. Quake 3 has no option for it; after ATI's new driver I could play Q3 with triple buffering, which means high res with AA and AF on a 9800, and it's always smooth. Without triple buffering you have to drop the res, and the framerate still dropped sharply on Q3tourney4.
 
Deimos said:
D3D games are triple buffered by default.

Not really, but the specifics of that were already covered earlier in this thread.
 
What about triple buffering and laggy mouse movement? I remember this being a big problem for me (it was especially horrible in Max Payne).
 
Trematode said:
What about triple buffering and laggy mouse movement? I remember this being a big problem for me (it was especially horrible in Max Payne).

That is indeed a problem... We all seem to have forgotten about that.
Of course the mouse position has to be read before the rendering of a frame can be done. With double buffering, you render one frame ahead, so you have up to one frame of mouse lag. At 60 fps that would be 16.7 ms of lag. With triple buffering you render two frames ahead, so your lag could go up to 33.3 ms. That is enough lag for humans to notice.
And that is still at a reasonably high rate of 60 fps... of course if the framerate drops to something like 20 to 30 fps, it becomes really laggy.

So perhaps it's better to use double-buffering or no v-sync at low framerates, to avoid the lag. But that is all up to personal taste.
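To put the numbers in one place, here's a quick back-of-the-envelope sketch (plain Python, just arithmetic); the frames_ahead values of 1 for double buffering and 2 for triple buffering are the usual render-ahead assumption:

```python
# Worst-case input lag from rendering ahead of the display.
# frames_ahead: 1 for double buffering, 2 for triple buffering (assumed).
def worst_case_lag_ms(fps, frames_ahead):
    return frames_ahead * 1000.0 / fps

for fps in (60, 30, 20):
    print(f"{fps} fps: double buffering up to {worst_case_lag_ms(fps, 1):.1f} ms, "
          f"triple buffering up to {worst_case_lag_ms(fps, 2):.1f} ms")
```

At 60 fps that prints the 16.7 ms and 33.3 ms figures above; at 20 to 30 fps the triple-buffered worst case climbs to 67 to 100 ms, which is why it feels so laggy there.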
 
Scali said:
What do you mean exactly? To use a 'variable' refresh rate which always matches your game's output?
In theory that is possible. Generally video cards can handle pretty much any refresh rate already, even with analog... and so can monitors (multisync).
The problem is in re-syncing the signals; I don't think that can be done quickly enough, or accurately enough. As you may know, changing from e.g. 60 to 75 Hz generally also means you need to re-center your screen, and modern monitors store these settings for various refresh rates after you've adjusted them. Some even have an auto-center function. But both switching to a different refresh rate and using the auto-center take quite a bit of time, far too long for realtime use.
Not necessarily a variable refresh rate, but a way of telling each pixel to change colour. Think of it like addressing RAM: telling each cell (or address) to change to another value when required. But when you think about it, if you're running at a decent res, let's say 1280x1024, you have to address over 1.3 million pixels, telling them to change. That requires a lot of power to render a frame AND do the addressing at the same time, and you'd still need to develop a protocol that designates how to address the pixels, and then the hardware would need to be redesigned to do it... it creates a bit of a nightmare. So that idea is a fair way off, but it is still plausible, and it gets rid of tearing (even on a slow-response LCD).
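Just to spell out that 1.3 million figure (trivial arithmetic; the 24-bit colour depth is only an assumption):

```python
# Pixel count and raw frame size at 1280x1024 (assuming 24-bit colour, no alpha).
width, height = 1280, 1024
bytes_per_pixel = 3

pixels = width * height
print(f"{pixels:,} pixels to address per frame")                # 1,310,720
print(f"{pixels * bytes_per_pixel / 2**20:.2f} MiB per frame")  # ~3.75 MiB
```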
 
Yeah, that idea will probably fall down because it takes longer to figure out which pixels to update than it does to just update the entire screen. Currently DVI just mindlessly blasts all the pixels through the cable at high speed; in situations like these, I suppose it's better not to try to be smart.
 
Tearing is a side effect of the realities of CRTs and the ramifications of that on the VGA standard. Since a CRT needs to update tens of times per second to keep a non-flickering image on its screen, the standard was developed so that the monitor didn't have to rely on the video rendering device to update it, as that could lead to situations where the renderer couldn't do it fast enough. Instead it's set up to use the frame buffer method, and the monitor comes to get the frame whenever it needs it, thereby allowing it to maintain a constant (and therefore sufficient) refresh rate. LCDs overcome the shortcoming that necessitated this standard, but they still have to use it, and are therefore bound by its limitations.

In a perfect world, we'd have perfect LCDs with perfect response times, we'd get rid of all the CRTs, and we could change our display standard to something that relies on a push system rather than a pull system. What I mean by that is that the current system is a pull system: the monitor has to get the frame data from the video renderer. A push system would mean the renderer sends frames to the monitor as they are ready. An LCD could support this method, since it doesn't need to refresh itself constantly. An update cap would have to be set to prevent overloading the device with frames, but that's relatively easy to do. With this method, the whole tearing issue is solved without any need for VSync, because the refresh rate is the same as the framerate. The disparity between the two would be eliminated and the problem solved.

Unfortunately, such a system would be very different from the VGA standard currently in use, and it would be completely incompatible with it. The differences would be so pronounced that it would be very hard to implement both VGA and such a new standard on the same device. As long as there are CRTs, we will have to live with this "pull" style system.
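To make the pull/push distinction concrete, here is a minimal sketch of the two loops. This is hypothetical pseudo-hardware in plain Python, not any real display API; render_frame, front_buffer and display are made-up stand-ins:

```python
import time

def display(frame):
    pass  # stand-in for actually driving the panel

# "Pull" model (what VGA/DVI scanout does today): the display reads the front
# buffer at a fixed interval, whether or not a new frame has just been swapped
# in -- which is exactly where tearing appears if the swap isn't synced.
def pull_loop(front_buffer, refresh_hz=60, seconds=1.0):
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        display(front_buffer())          # may catch a partially updated buffer
        time.sleep(1.0 / refresh_hz)

# "Push" model (the hypothetical LCD-only scheme): the renderer hands over a
# complete frame as soon as it is finished, capped at some maximum update rate.
def push_loop(render_frame, max_hz=85, seconds=1.0):
    end = time.monotonic() + seconds
    min_interval = 1.0 / max_hz
    while time.monotonic() < end:
        start = time.monotonic()
        display(render_frame())          # always a whole frame -> no tearing
        spare = min_interval - (time.monotonic() - start)
        if spare > 0:
            time.sleep(spare)            # respect the display's update cap

# Toy usage: both loops run for one second against dummy frame sources.
pull_loop(front_buffer=lambda: "frame-in-front-buffer")
push_loop(render_frame=lambda: "freshly-rendered-frame")
```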
 
Brent_Justice said:
Did you write this or did you copy and paste it from somewhere else? Because that is an absolutely perfect and easy-to-understand description of VSYNC.

In addition I would add that enabling triple buffering solves the halving-of-the-framerate problem with VSYNC enabled.

Easy there Mr. Super Editor. I think your job is to quietly check out this kind of thing and then show the guy the respect he deserves.
 
Arkalius said:
In a perfect world, we'd have perfect LCDs with perfect response times, we'd get rid of all the CRTs, and we could change our display standard to something that relies on a push system rather than a pull system. What I mean by that is that the current system is a pull system: the monitor has to get the frame data from the video renderer. A push system would mean the renderer sends frames to the monitor as they are ready. An LCD could support this method, since it doesn't need to refresh itself constantly. An update cap would have to be set to prevent overloading the device with frames, but that's relatively easy to do. With this method, the whole tearing issue is solved without any need for VSync, because the refresh rate is the same as the framerate. The disparity between the two would be eliminated and the problem solved.

Basically you're then moving the front buffer into the monitor. Somehow I don't think it will work... because the refresh rate effectively dictates the speed at which a frame can be sent to the monitor. So the video card will still need some kind of front buffer to store finished frames, because it takes about as long to transfer a frame to the monitor as it does to render it. Without a buffer, the video card would still have to wait until the monitor has received the frame before it can start rendering again. You must realize that at e.g. 1600x1200 at 60 fps, you are transferring 1600*1200*3*60 = 330 MB/s of data. That is already about twice as much as a normal PCI bus can handle. So it's not like you can just create an interface that will 'instantly' transfer an entire frame from the video card to the monitor... even at the modest refresh rate of 60 Hz, you already need quite a sophisticated interface.
So as you can see, even with an LCD, triple buffering is not that far from the ideal situation with current technology.
The DVI interface could be improved so that higher 'refresh rates' become possible, but that's not much use yet, since LCDs aren't actually capable of refreshing that fast.
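The 330 MB/s figure spelled out, so you can plug in other resolutions (assuming 24-bit colour and ignoring blanking overhead):

```python
# Raw pixel data rate of an uncompressed video link, in MiB/s.
def link_rate_mb(width, height, refresh_hz, bytes_per_pixel=3):
    return width * height * bytes_per_pixel * refresh_hz / 2**20

print(f"{link_rate_mb(1600, 1200, 60):.0f} MB/s")  # ~330, matching the figure above
print(f"{link_rate_mb(1280, 1024, 85):.0f} MB/s")  # ~319 for 1280x1024 at 85 Hz
```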
 
Yes, you'd still need a framebuffer. However, my idea is that as soon as a frame is done, the card sends it to the monitor and starts rendering a new one, instead of the monitor coming to get it at specific intervals. The monitor just sits and waits for finished frames as they're ready. I'm not saying the rendering device should draw the frame directly to the monitor. Instead of the monitor saying "OK, give me your frame buffer" 60 times every second, the video card says "The frame in the buffer is done, here you go" as soon as it's finished, thereby making the monitor's "refresh rate" equal to the framerate. This can't work on a CRT, because you'd need to maintain a framerate of at least 60 or the screen would start flickering horribly, but LCDs don't have to continually refresh. They might still get redundant frames of course, but there wouldn't be a disparity between the rate at which the display updates itself and the rate at which the video card is drawing frames, thereby eliminating the tearing problem without VSync. It's really a reverse kind of VSync... instead of forcing the video card to stay in pace with the monitor, it makes the monitor stay in pace with the video card. Obviously there'd need to be some kind of maximum rate, as the monitor can only update so fast, but that's fine. Our eyes can't see a framerate much higher than around 75 to 85 anyway, and it's not hard to implement an FPS cap on the video card.
 
But with triple buffering, you're not forcing the video card to stay in pace with the monitor.
The only issue is the delay, but if you can run at a high enough refresh rate, say 100 or 120 Hz, then you won't really notice. Running at 120 Hz gives as much delay with triple buffering as 60 Hz does with double buffering.
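Same arithmetic as the mouse-lag example earlier in the thread:

```python
# Worst-case render-ahead lag in ms: frames_ahead * 1000 / refresh_hz.
print(f"{2 * 1000 / 120:.1f} ms")  # triple buffering at 120 Hz
print(f"{1 * 1000 / 60:.1f} ms")   # double buffering at 60 Hz -- the same 16.7 ms
```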
 
yeah TB VSync is fine... just saying that a newer standard would be more efficient :)
 
Yea, you're right. I'm just putting it in perspective with what's available today.
 
Good thread. Before I speak, let me give some background on my PC. I have an NEC DiamondPro 930SB 19" CRT monitor connected to a GF6600GT. I mostly play older games like UT99 (a fast-paced game) and will eventually play newer games.

My general question is: if there are no slow framerates and no low refresh rates, what are the ideal settings to use? Right now I play at 1024x768, vsync off, refresh rate 120, with a framerate limit of 100. Should I set the framerate limit to 120 to be in sync with the refresh rate, or set no limit at all? I've noticed some tearing when both the refresh rate and the framerate limit are set to 120.

I've noticed the mouse movement is very different between vsync on and off. With vsync on, the mouse movements feel kind of delayed, yet both gameplay and mouse movements feel smoother. The slight mouse delay is probably the main reason I prefer vsync off.
 
That system of sending the frame to the monitor as it is finished would be wonderful... if we all had LCDs. But since we don't, I guess we just have to live with our current system.
 
Sorry for resurrecting this thread, but I found it extremely helpful! I have always seen the terms "ghosting" and "tearing" used interchangeably, and I thought that they were used to describe the same thing, but it turns out that they are very different. I recently upgraded my video card from a 7900GT to an 8800GT, and I seem to be noticing a lot more tearing now, probably due to the higher performance of the 8800GT. Before reading this thread, I was blaming the problem on my Dell 2005FPW LCD monitor. I even captured a video of the problem with my camera (it is the intro sequence to Crysis). For anyone wanting to see another video example of "tearing", here is a link to the video file (~9 MB):

http://www.box.net/shared/ydwx9e40sk

Also, I would be grateful if someone could take a look at the video and confirm my suspicion that the problem is indeed "tearing", and not a problem with my monitor. Thanks again for this wonderful write-up!
 
Thanks, quite informative. It helps explain why my 8800GTX doesn't feel much faster than my old 8800GTS 640 @ 192x12 in COD4.
 
I always have vsync ON. I can see tearing in almost everything. It's really bad in Portal and I think in all the HL games. I forced it on and it all went away. Interesting about the triple buffer. I'll have to see if I can turn that on also. This should be in the FAQs.
 
LOL at "people" who loathe vsync. All those who have no idea how FPS and refresh rate are linked or have crappy hardware that can't keep up :p

EDIT: And now having actually read it, I can't believe there are people who actually think that vsync drops the framerate because it's an intensive operation or some such. Thank you for writing this; it may stop many dumb questions.
 
Awesome post. I hate tearing, and this gave me a much better understanding of what's going on and when or when not to turn vsync on. Bravo. Sticky nominated.
 
I guess I'm the only one that notices this thread is over two years old. :confused:

Good read though. I wasn't around when it was originally posted.
 
Great read. I get tearing a lot too. With most LCDs capped to 60 or 75 Hz, enabling vsync is a good option. I know it helps me :)
 
WOW

Definitely sticky this. Your grammar, punctuation and spelling are near perfect.
 
I can't stand the tearing. If it is present I get very distracted by it.
 
Yeah I did post this a long time ago, and I'm glad it's still helping people understand this often misunderstood concept. I absolutely hate tearing so I generally always use vsync. nVidia drivers support forcing triple buffering these days but I've not really spent much time testing how well that works.
 
Now here's the ultimate question: do I live with tearing or bad framerates? I play WoW at 1920x1080 on an LCD, and with vsync off I get around 70ish FPS, which causes tearing. If I use vsync the tearing goes away, but when I enter a major city I get around 20 fps. What I went with was vsync off in cities, and vsync on out in the wild. I just need a better video card :p.
 
Correct me if I'm wrong, but doesn't the option for triple buffering appear in the same drop-down menu or list of choices as AF? I thought it often goes like this:

double buffering
triple buffering
2x AF
4x AF
8x AF
16x AF
(etc etc)

But the two are not related? Why do they show up in the same menu? How could I enable triple buffering and AF at the same time?
 
That's texture filtering, and you're thinking of bilinear and trilinear filtering, which is an entirely different and unrelated issue to this.

Bilinear filtering takes the weighted average of the 4 nearest texels to the pixel to figure out what color to paint it. It will show noticeable changes in quality at mipmap boundaries, though. Trilinear filtering solves this by blending between the two nearest mipmap levels, smoothing it out more. Anisotropic filtering helps deal with the blurriness of surfaces at oblique angles by accounting for the fact that texture minification can occur at different factors in different dimensions (i.e. on the ground near the horizon, the texture is minified much more in the direction going away toward the horizon than in the direction parallel to it). MIPmaps are isotropic; anisotropic filtering solves this by doing what you could call RIPmapping.
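For anyone curious what "weighted average of the 4 nearest texels" looks like in practice, here's a toy bilinear sampler (plain Python on a single-channel grid; real hardware does this per color channel, with mipmaps on top):

```python
# Toy bilinear texture sampling: texture is a 2D grid of greyscale texel values.
def bilinear_sample(texture, u, v):
    """Sample texture at floating-point texel coordinates (u, v)."""
    h, w = len(texture), len(texture[0])
    x0, y0 = int(u), int(v)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = u - x0, v - y0  # fractional distance toward the next texel

    # Weighted average of the four nearest texels.
    top    = texture[y0][x0] * (1 - fx) + texture[y0][x1] * fx
    bottom = texture[y1][x0] * (1 - fx) + texture[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

tex = [[0, 255],
       [255, 0]]
print(bilinear_sample(tex, 0.5, 0.5))  # 127.5, halfway between the texels
```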

I remember back in the days of Doom when nearest-neighbor filtering was all we had... blocky textures up close and shimmering aliasing all over the place in distant areas. To see it now is hideous, but we loved the graphics back then... Bilinear filtering was a blessing! Today it would look like shit... go ahead and try it ;)
 
Yeah, I just went into a game and realised I was thinking of bi- and tri-LINEAR (filtering).

So is there a list of games that support triple buffering? Because I have seen that option too.

Is there any point to turning on triple buffering without using vsync as well?
 
LOL at "people" who loathe vsync. All those who have no idea how FPS and refresh rate are linked or have crappy hardware that can't keep up :p

Even with decent hardware, vsync will lose you frames and increase mouse input lag.

Anyone who seriously plays FPS games online or just generally likes a responsive mouse will disable vsync without even thinking about it. Triple buffering certainly doesn't help the mouse lag either.

I always have vsync off for this sole reason. And yes, the tearing does annoy me, but I'd rather that than have a laggy game experience.
 
Now here's the ultimate question: do I live with tearing or bad framerates? I play WoW at 1920x1080 on an LCD, and with vsync off I get around 70ish FPS, which causes tearing. If I use vsync the tearing goes away, but when I enter a major city I get around 20 fps. What I went with was vsync off in cities, and vsync on out in the wild. I just need a better video card :p.

I run V-Sync and I don't have bad frame rates. I also don't notice any input lag in most games. On the rare occasions I have noticed it, the input lag has been so slight that I haven't worried too much about it.
 