How VSync works, and why people loathe it

Scali said:
I meant the specific extensions that enable triple-buffering, obviously :)

Obviously a semantic error. :D As for the rest of your lecture on semantic errors, you're preaching to the choir.

Scali said:
These modes are often emulated.

I don't think you can know that unless you work for an IHV. Emulation is probably more work than maintaining the few tens of thousands of transistors needed to implement VGA functionality. Furthermore, the emulation I'm talking about would have to be on-chip anyway, as I'm talking about writing asm directly to hardware IO ports in a non-PM OS and bypassing the BIOS completely.

This VGA spec is what allows PCs to implement the 'copper bar' which you said

Scali said:
no longer possible on modern videocards

Implying that modern video cards do not implement VGA compatibility. I'm calling bullshit.


Scali said:
Also, nobody would still use a palette mode today for any kind of graphics (for one, there is no 3d acceleration in palette mode), so the copper bar effect cannot be applied in any practical situations.

I know plenty of fellow engineers who would love to see paletted textures. Besides the obvious memory and bandwidth savings, they can also offer better quality than other hardware compression techniques like DXTn. But I digress...

The 'copper bar' effect was never practical; it's like a guitar solo or showing your dick. It's an effect you whip out to demonstrate your ability to perform precise timing on limited hardware. If it were easy there would be no point in doing it.
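For anyone curious what that kind of trick actually looks like at the register level, here's a rough, untested sketch of the classic VGA approach (assuming a DOS-era compiler that provides inportb()/outportb(), e.g. Borland's dos.h, mode 13h, and direct, unvirtualized port access):

    #include <dos.h>  /* inportb/outportb in Borland-style compilers */

    /* Change DAC palette entry 0 (the background) once per scanline by
       polling the VGA Input Status #1 register (0x3DA). Bit 0 is set
       during the horizontal/vertical blanking period. */
    void copper_bars(void)
    {
        for (int line = 0; line < 200; ++line) {
            while (inportb(0x3DA) & 0x01) { }    /* wait for active display   */
            while (!(inportb(0x3DA) & 0x01)) { } /* wait for next blank start */

            unsigned char shade = (unsigned char)(line & 0x3F); /* DAC is 6-bit */
            outportb(0x3C8, 0);      /* select palette index 0 */
            outportb(0x3C9, shade);  /* red   */
            outportb(0x3C9, 0);      /* green */
            outportb(0x3C9, 0);      /* blue  */
        }
    }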

Scali said:
PC hardware has always been relatively poor compared to home computers.

LOL!
 
Personally, I can never notice tearing on my 21" Sony SPD monitor, so I keep vsync off. If anyone could ever capture a tearing effect I would like to see it. BTW, I just ran a demo at over 400 FPS and I can't see any difference with vsync on or off, even though my monitor is running 1600x1200 at 100 Hz.
 
Shane said:
If anyone could ever capture a tearing effect I would like to see it.

I'm not sure how you'd capture the tearing effect, because the effect is not in VRAM; it's on the output device. You'd have to place a camera in front of your display and record it. But for CRTs this has the added problem of synchronizing the capture device to the display refresh. This means you'd have to capture at a fairly high frame rate - not exactly easy for consumer-level hardware.

That said, I'll try to capture the tearing effect tonight and post my results for you.
 
Dark_Seph said:
Same here, I can live without AA or AF but when I see it flash by, O_O.

Very good write-up.

Same here. Tearing is just too much. I notice it much more on my LCD than on my CRT. I keep going back and forth on swapping my LCD out for my CRT. My LCD is way better for everything but games, but the colors and vibrance of the LCD make me keep it anyway. Plus it's much easier on my eyes when I've been staring at the monitor for hours on end.
 
I edited the original post to include double and triple buffering discussions, as they are relevant to the vsync issue. If you guys think the original was easier to follow or was better for whatever reason, let me know via PM and I'll put it back if enough people think so.
 
Okay, I've captured the tearing effect. You'll need DivX 5.2.1 or later to view the video. I apologize for the crappy capture, the grainy nature of the video, and the low (320x240) resolution.

Tearing can clearly be seen in the double yellow line on the roadway between the 4 and 6 second marks.
 
Oh yeah, there's a good video of tearing. Watch the street lines carefully, folks. If you don't see it, then you've trained yourself to be blind to it... Might not be such a bad thing if that's true, hehehe.
 
So, how do you enable triple buffering? Where is the setting? Does this information tell me what kind of monitor I should buy? What are optimal settings?
 
kmeson said:
Obviously a semantic error. :D As for the rest of your lecture on semantic errors, you're preaching to the choir.

You make a lot of semantic errors!

I don't think you can know that unless you work for an IHV. Emulation is probably more work than maintaining the few tens of thousands of transistors needed to implement VGA functionality. Furthermore, the emulation I'm talking about would have to be on-chip anyway, as I'm talking about writing asm directly to hardware IO ports in a non-PM OS and bypassing the BIOS completely.

What makes you think these hardware IO ports are real? It's a PCI/AGP/PCI-e card for starters.

This VGA spec is what allows PCs to implement the 'copper bar' which you said

Yes, in a very limited way, whereas on a C64 or Amiga you would have interrupts for this, or you could write cycle-exact routines because every machine has identical hardware.

Implying that modern video cards do not implement VGA compatibility. I'm calling bullshit.

I did explain what I meant by that.

I know plenty of fellow engineers who would love to see paletted textures. Besides the obvious memory and bandwidth savings, they can also offer better quality than other hardware compression techniques like DXTn. But I digress...

But it's not available on most hardware. Besides, even if you do have paletted textures, that still doesn't mean you can actually do the copper bar effect. And these paletted textures are normally still used in a truecolour environment.

The 'copper bar' effect was never practical; it's like a guitar solo or showing your dick. It's an effect you whip out to demonstrate your ability to perform precise timing on limited hardware. If it were easy there would be no point in doing it.

It actually is piss-easy on proper hardware like the Amiga.


If you don't know what the Amiga was capable of, I suggest you do some research. EGA, the PC counterpart of the time, was a joke compared to the Amiga. Heck, even VGA can't do a lot of things that the Amiga can. I find it sad that you appear to be completely ignorant about this matter, judging from your 'LOL!' reaction.
 
forumposter32 said:
So, how do you enable triple buffering? Where is the setting? Does this information tell me what kind of monitor I should buy? What are optimal settings?

Triple buffering generally has to be supported by the game. There is a utility someone linked in this thread that allows you to force triple buffering in DirectX games, but it doesn't always work right. OpenGL has no way to force triple-buffering because it's not part of the OpenGL standard, and can only be implemented through an extension that the game itself has to include.
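For the curious, here's roughly what the game-side request looks like in Direct3D 9: a sketch only, with error handling omitted and the d3d/hWnd objects assumed to already exist, not taken from any particular game. The point is simply that the application asks for two back buffers, which together with the front buffer gives triple buffering.

    #include <d3d9.h>

    IDirect3DDevice9* CreateTripleBufferedDevice(IDirect3D9* d3d, HWND hWnd)
    {
        D3DPRESENT_PARAMETERS pp = {};
        pp.Windowed             = TRUE;
        pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
        pp.BackBufferCount      = 2;                        // 2 back buffers + front = triple buffering
        pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // vsync on

        IDirect3DDevice9* device = NULL;
        d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                          D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
        return device;  // NULL if creation failed
    }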

As far as monitor choice goes, there really isn't much to say with regard to this issue. Higher refresh rates are good, but they don't solve the problem completely.
 
Scali said:
What makes you think these hardware IO ports are real? It's a PCI/AGP/PCI-e card for starters.

Because when the processor boots up it is in real mode, and since no PM software will be loaded, port virtualization cannot occur.

All hardware on PCs uses either memory-mapped IO or ports. In the PC architecture, port virtualization requires a PM OS. Specifically, the OS must either configure IOPL relative to the CPL, or set up the TSS with the IO permission bitmap. Once port IO is disabled, either globally or for the specific port (as in the latter case), a general protection exception will be generated. Catching this exception then provides the means for virtualization.
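To illustrate the trapping mechanism: a single direct port write from user code, like the sketch below (GCC-style inline asm, just an example of mine), raises a #GP fault under a PM OS if IOPL < CPL and the TSS IO bitmap doesn't grant that port; catching that fault is where virtualization could hook in. In real mode it just hits the hardware.

    /* Illustrative only: a raw write to the VGA DAC index port. Under a
       protected-mode OS this traps (#GP) unless the port is permitted,
       which is the hook an OS needs to virtualize the access. */
    static inline void outb(unsigned short port, unsigned char value)
    {
        __asm__ volatile ("outb %0, %1" : : "a"(value), "Nd"(port));
    }

    int main(void)
    {
        outb(0x3C8, 0);  /* VGA DAC write index; faults if port IO is not allowed */
        return 0;
    }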

If you are suggesting the video card is virtualizing a port, then we're back to my original assertion that "emulation is probably more work than maintaining the few tens of thousands of transistors needed to implement VGA functionality."

Scali said:
Scali said:
PC hardware has always been relatively poor compared to home computers
I find it sad that you appear to be completely ignorant about this matter, judging from your 'LOL!'-reaction.

If you don't find humor in your quote, I'm sorry. At this point I don't feel like trying to have a discussion with you anymore.
 
kmeson said:
If you are suggesting the video card is virtualizing a port, then we're back to my original assertion that "emulation is probably more work than maintaining the few tens of thousands of transistors needed to implement VGA functionality."

Despite your lengthy write-up, yes, that is what I'm saying. The emulation happens at the videocard level; the rest of the system doesn't have to know about it.
Also, you overlook a lot of details, namely that palette modes require specific hardware support in the DAC, and things like that. Perhaps you'll reconsider after you've looked into the hardware side of things.

If you don't find humor in your quote, I'm sorry. At this point I don't feel like trying to have a discussion with you anymore.

Well, I do think it's worth discussing the virtues of the Amiga chipset over the PC chips of the time, but at this point I think you're just trying to get out of it, because you realize that you were wrong and know absolutely nothing about the subject.
 
Arkalius said:
OpenGL has no way to force triple-buffering because it's not part of the OpenGL standard, and can only be implemented through an extension that the game itself has to include.

It can also be forced by the display driver, but as far as I know, only ATi has this feature in its drivers. You can just turn it on in the ATi Control Panel, somewhere in the OpenGL section.
 
Scali said:
at this point I think you're just trying to get out of it, because you realize that you were wrong and know absolutely nothing about the subject.

This right here is why you got banned from Beyond3D. Clearly you've not learned a damn thing.
 
kmeson said:
This right here is why you got banned from Beyond3D. Clearly you've not learned a damn thing.

It's not actually, but I don't feel like discussing that here.
And why don't we keep this discussion at a technical level? That is, if you can.
 
Arkalius said:
Triple buffering generally has to be supported by the game. There is a utility someone linked in this thread that allows you to force triple buffering in DirectX games, but it doesn't always work right. OpenGL has no way to force triple-buffering because it's not part of the OpenGL standard, and can only be implemented through an extension that the game itself has to include.

As far as monitor choice goes, there really isn't much to say with regard to this issue. Higher refresh rates are good, but they don't solve the problem completely.

I'm not sure, but I think my old monitor can't go above 70 Hz. If I buy a new monitor with higher refresh rates, should I go ahead and set the refresh rate higher in the Nview thing? (LOL, I haven't even used it yet. I don't know how to do the settings.)
 
Firstly, STICKY!!!! For sure, it has to be a sticky, a must-read for anyone who posts anywhere on anything to do with video cards. A damn good read.

Secondly, kids (Scali and kmeson, I'm looking at you), don't make me come back there...

Finally, if I may add something here: it seems a few people have missed an important point about LCD displays that was made right from the start. Yes, an LCD display is made up of pixels that are not constantly refreshed like a CRT's; rather, they are told when to change colour by the signal. The point made was that they still use an interface that updates at a fixed rate to carry the information from the video card to the monitor, i.e. VGA or DVI. DVI still has to update, at around 60 Hz depending on the configuration and exact timing. Either way, an LCD panel is no substitute for V-Sync. Why? Because the interface is still updating at a fixed rate. Using an example from Arkalius in the guide:

Why does this happen? Let's take a specific example. Let's say your monitor is set to a refresh rate of 75Hz. You're playing your favorite game and you're getting 100FPS right now. That means that the monitor is updating itself 75 times per second, but the video card is updating the display 100 times per second, that's 33% faster than the monitor. So that means in the time between screen updates, the video card has drawn one frame and a third of another one. That third of the next frame will overwrite the top third of the previous frame and then get drawn on the screen. The video card then finishes the last two-thirds of that frame, and renders the first two-thirds of the next frame, and then the screen updates again. As you can see, this would cause this tearing effect, as 2 out of every 3 times the screen updates, either the top third or bottom third is disjointed from the rest of the display. This won't really be noticeable if what is on the screen isn't changing much, but if you're looking around quickly or whatnot this effect will be very apparent.
You will still see tearing on an LCD, as it still has to, effectively, 'update'. When the signal is sent to the monitor, the frame may not be totally rendered yet, so you will still see some of one frame and some of another. Don't ever think of an LCD as a way to eliminate V-Sync AND tearing on the screen. Until they invent an interface that tells each individual pixel when to change, with no 'refresh rate' at all, you won't be able to eliminate both V-Sync and tearing. But I highly doubt they will do that; way too much bandwidth needed.
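To put numbers on that 75Hz/100FPS example from the guide, here's a tiny toy calculation of my own (an illustration only, not from the guide) showing where the tear line lands on successive refreshes:

    #include <cstdio>

    int main()
    {
        const int refresh_hz = 75;   // monitor scanouts per second
        const int render_fps = 100;  // frames the card completes per second
        int phase = 0;               // progress into the current frame, in 1/75ths of a frame

        for (int refresh = 1; refresh <= 6; ++refresh) {
            phase = (phase + render_fps) % refresh_hz;
            // prints 33%, 66%, 0% (no visible tear), repeating
            printf("refresh %d: tear line %d%% of the way down the screen\n",
                   refresh, phase * 100 / refresh_hz);
        }
        return 0;
    }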

I hope this helps, and that my self-fulfilling raving had a point.
 
This thread is *so* good; it brings a tear to my eye, it really does.;)

Now anytime I see a "Vsync, on or off?" thread I can just link to this :cool:


If you could just add a little mention of the inevitable mouse input lag produced by vsync when framerates are low, that'd be the cherry on the icing on the cake.
 
Hey guys, I was checking some stuff out. How does this work with frame rate counters? I turned vsync on in BF2, and while the counter doesn't go 60...30...60...30 exactly, it does kind of hang out at those numbers; it might go 60...58...55...30. Maybe it's a little harder to catch than that, but it certainly hangs around those numbers. I know Doom 3 did this 60/30 thing constantly, and you might only see a flash of another number besides 60 and 30. I guess this counter just isn't as accurate? I think it's just the metering, I don't have triple buffering going on, and it really is 60...30. :( If I turn vsync off, everything seems smoother despite the tearing, and I have a much harder time catching my FPS meter showing 30.

So I imagine I'm on the right track: of course my GTX is not hanging around 30 with vsync off, it's a nice card, but it obviously will drop below 60 a lot with good settings in the game, hence the frequent 30s with vsync. I understand the vsync premise; it's just that the counter has me a bit confused as to whether BF2 might have triple buffering. It really doesn't look like it, though. Any FPS calculation would involve an average of sorts across the second, I think; it's just software. That hack I posted to force triple buffering crashed my game when a map finished loading. :(

I just have a real nice BF2 system whose potential I'd like to realize. I've got an LCD here... my 60Hz is not some crappy monitor. Man, we really need to rally for this option. I guess it might hurt us the most... 60/30 sounds pretty bad compared to 85/42.5, though we should all be free to try it out. That's why I asked about the performance... I know holding an image can't help, but I'm trying to figure out why life has to be so difficult, unless triple buffering is just a total dog for performance. Then why not talk about it in the 512MB reviews? I dunno... computers suck!!! Lol

Sorry, every comp thread I read now has a BF2 angle.
 
Scali said:
It can also be forced by the display driver, but as far as I know, only ATi has this feature in its drivers. You can just turn it on in the ATi Control Panel, somewhere in the OpenGL section.

Someone in the thread concerning the new Beta drivers (77.76 I think?) from nVidia said triple buffering was now an option.
 
ROCK!!!! Just loaded up the driver and triple buffering is right there under advanced settings, where AA and the optimizations are, at least with Coolbits and whatnot. It says nothing about being OpenGL-specific! To be sure, I checked the "What's This?" help and crossed my fingers that it didn't mention OpenGL; all that greeted me was the definition (roughly "on maintains higher frame rates with vsync on"). Dudes, I gotta say, I had a 6800GT die on me, went with an XL, and was kind of hoping to get back to Nv; I really do like their drivers better, and my GT was a fine card. I couldn't be happier with this GTX purchase. To me, at least today, Nvidia is king. Dude, if I'm not pegged at 60 then 30 on my GTX in BF2 now, with no tearing, I could probably dance in the streets. If they have this working well in DX now, after a thread like this one, how could I not hold them in high regard? DX, damnit! DX. I'll own Quake 4 and all, but.......
 
Which drivers have this? I have 77.76 and there's nothing about triple buffering on my advanced settings...

EDIT: Never mind, I have 77.72, oops.
 
Definitely working wonders in BF2 for me. :D I suspect my early inclinations were right: the FPS counter hanging around those numbers was the ol' 60/30 trick, and triple buffering was not built into the game, it seems. I fired up some different apps with vsync and no TB, some with their own frame rate counters like 3DMark, and they would jump around in numbers to some extent but locked onto those digits without triple buffering. BF2's counter jumps a bit more than most, but my GTX wasn't hanging around 30 so frequently once I had triple buffering forced. It works; it's just that, as you can see, I feel my tools for measuring are inadequate to be more, uhm... scientific or something. :p Well, even your immediate FPS is an average of some kind.
 
Russ said:
Someone in the thread concerning the new Beta drivers (77.76 I think?) from nVidia said triple buffering was now an option.

Triple buffering can now be forced in ForceWare 77.76.
 
First off, before I make a fool of myself... I should ask: BF2 is DX this time, is it not?

What I would say... I hope you don't just disbelieve me because you think I'm stupid, and that it's not the ATi card in your box that makes you so skeptical. :D jk. The vid card game is follow-the-leader anyway; ATi will probably follow suit. If it wasn't for their OGL implementation, Nv probably wouldn't have it either.

However, to clarify, maybe Brent can help me understand some of my own testing... it's not uncommon to see some fluctuation on an FPS counter even if your monitor is doing the vsync 60/30 trick, is it? To beat a dead horse: in BF2, before, it was hanging around 60 and 30. Now it seems more all over the place, like it should be.
 
The thing with framerate is that it's a number of frames measured over a certain period of time.
Within your measurement interval, the framerate may not be entirely constant. That's why you may get fluctuations and such, and even with vsync on and double buffering, the framerate doesn't always have to be 60 or 30. If it goes back and forth within the measurement interval, the actual measured framerate will be somewhere between 60 and 30.
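A trivial way to see it (my own example, not anything from the drivers): the counter just divides frames by the length of its sampling window, so a window that straddles a 60 FPS stretch and a 30 FPS stretch reads something in between.

    #include <cstdio>

    int main()
    {
        const double window = 1.0;                       // one-second sampling window
        const double frames = 0.5 * 60.0 + 0.5 * 30.0;   // half the window at 60 FPS, half at 30 FPS
        printf("counter reads %.0f FPS\n", frames / window);  // prints 45, not 60 or 30
        return 0;
    }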
 
Scali said:
The thing with framerate is that it's a number of frames measured over a certain period of time.
Within your measurement interval, the framerate may not be entirely constant. That's why you may get fluctuations and such, and even with vsync on and double buffering, the framerate doesn't always have to be 60 or 30. If it goes back and forth within the measurement interval, the actual measured framerate will be somewhere between 60 and 30.

That's the hunch I was going on; I just didn't want to sing from the rooftops that it's working and be wrong. Even the "immediate" FPS some games show next to an average is an average of sorts. If you want to know how many frames are getting drawn in an exact instant... well hell, it's 1. Instead it's: in the first half of a second I drew x frames, in the second half I drew y frames, so I drew x + y, even if the two halves were different. Say for half a second my 60Hz, vsynced, double-buffered setup got 15 of the 30 images it had a chance to draw, and for the other half all 30 frames were ready in time: I got 45 FPS for that second. The average, I'm sure, would be way up now with triple buffering, as my counter doesn't have a serious magnetic attraction to 30. Bleh... I think I have it right now.
 
Yeah, what you'll notice is if you get to an area where the FPS drops below 60 for a moment and then goes back up, the counter will drop somewhat, to some number like 40, then go back up to 60, because it didn't have enough time to go all the way down to 30. It doesn't mean you were actually getting 40 FPS; it's just "counter lag", as it were.

I've been playing with the 77.76 triple buffering option, and I could swear I've gotten it to work in Far Cry, but it doesn't stick. It's like it works for a while but then it stops or something. Either way I get some artifacting in a few small places when I have TB on with Far Cry.

I should test this in SC3... I used another utility to force TB in DX while playing SC3 and it worked (apparently). I haven't tried using the driver setting. I should also test in Doom 3...

So has anyone gotten TB working in BF2?
 
Well I still don't know what was meant by this here:

kmeson said:
Scali said:
PC hardware has always been relatively poor compared to home computers.
LOL!

Anyone else have any idea? I fail to see what's so funny, and kmeson seems reluctant to explain it.
If something I say is so damn funny, I'd like to know why.
 
Scali said:
Well I still don't know what was meant by this here:



Anyone else have any idea? I fail to see what's so funny, and kmeson seems reluctant to explain it.
If something I say is so damn funny, I'd like to know why.

What is the difference between a PC and a home computer?

Terra...
 
Scali said:
Well I still don't know what was meant by this here:



Anyone else have any idea? I fail to see what's so funny, and kmeson seems reluctant to explain it.
If something I say is so damn funny, I'd like to know why.

Maybe you should switch to decaf.
 
I just want to say thanks for this thread. I've been using vsync for years and people always call me a retard when I say so. It's nice to FINALLY see someone even halfway defending it. It actually hurts my eyes to play with vsync off. I can't focus on anything.
 
Terra said:
What is the difference between a PC and a home computer?

A PC is just a 'personal computer', as in, it is small enough to fit on a desk, so each employee can have his or her own personal computer, as opposed to timesharing systems etc., where only a terminal would fit on a desk and the actual computer is in a central room, serving many people at once. Although in this case I specifically meant the IBM-compatible PC as we now know it, and how the term PC is applied in everyday life.

A home computer is of course a personal computer, but it is designed for the home. So it is small, silent, affordable, can be used with a standard television set, and has special features mainly interesting for home use (better graphics and sound for games and other entertainment, for example). In a way they are 'game consoles with a keyboard'. Popular examples of home computers are the ZX Spectrum, Commodore 64, Atari ST, and Amiga.
In the early 1990s, PCs started to take over from the home computers for some reason, and home computers went out of fashion, and out of production of course.
 
You say my FPS can't be higher than my refresh rate, but in Enemy Territory I have com_maxFPS 125, it's maintained 95% of the time, and my LCD refresh rate is 75 Hz.
 