Does vertical resolution have any correlation with tearing with vsync on?

uh?

Negative or positive? Do you mean it maybe makes input lag longer or something?

There shouldn't be any (relevant) effect whether it's 600px or 1600px.
 
No, not really. A higher vertical resolution will give more horizontal lines for the tearing to show up on, and because your frame rate will be lower at the higher resolution, it may change the frequency of the tearing.

As for input lag, running v-sync OFF reduces input lag because v-sync requires buffering one frame, so it automatically adds about 16.7 ms of input lag at 60 Hz. Enabling triple buffering can add even more input lag, though supposedly NVIDIA's method of triple buffering adds less lag than AMD's, at least according to AnandTech's article on input lag.
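
If it helps to see where those figures come from, here's a rough back-of-the-envelope in Python (the 60 Hz refresh and the one- or two-frame buffer depths are illustrative assumptions, not measurements):

# Rough frame-time / buffering arithmetic (illustrative; assumes a 60 Hz display).
REFRESH_HZ = 60
frame_time_ms = 1000.0 / REFRESH_HZ            # ~16.7 ms per refresh at 60 Hz

# With v-sync on and double buffering, a finished frame can wait up to about one
# refresh before it is scanned out; triple buffering can queue roughly one more.
double_buffer_lag_ms = 1 * frame_time_ms       # ~16.7 ms of added lag
triple_buffer_lag_ms = 2 * frame_time_ms       # up to ~33.3 ms in the worst case

print(f"frame time at {REFRESH_HZ} Hz: {frame_time_ms:.1f} ms")
print(f"v-sync (double buffer) can add ~{double_buffer_lag_ms:.1f} ms")
print(f"triple buffering can add up to ~{triple_buffer_lag_ms:.1f} ms")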
 
...tearing with vsync on?
Huh?

If you're getting tearing, vsync isn't on.

With vsync off, I'd say ...maybe. A 2560x1600 frame is around twice the size of a 1600x1200 frame. So it should take twice as long to send to the monitor, which means it's twice as likely to be interrupted by a buffer flip (i.e. twice as likely to tear). But this is assuming transfer rates are the same regardless of the monitor, and I have no idea if that's the case.
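
To put rough numbers on that, here's a quick Python sketch (the 24-bit colour depth and the equal-link-rate assumption are mine, and I haven't verified either):

# Compare raw frame sizes at two resolutions and, assuming the same link rate
# for both monitors, the relative time each frame spends in transit (and hence
# the relative chance of a buffer flip landing mid-transfer).
def frame_megabytes(width, height, bits_per_pixel=24):
    return width * height * bits_per_pixel / 8 / 1e6

small = frame_megabytes(1600, 1200)    # ~5.8 MB
large = frame_megabytes(2560, 1600)    # ~12.3 MB

print(f"1600x1200 frame: {small:.1f} MB")
print(f"2560x1600 frame: {large:.1f} MB")
print(f"transfer time (and tear odds) ratio at equal link rate: ~{large / small:.1f}x")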
 
No, not really. A higher vertical resolution will give more horizontal lines for the tearing to show up on, and because your frame rate will be lower at the higher resolution, it may change the frequency of the tearing.

That would apply only to CRTs
 
That would apply only to CRTs

No, LCDs actually refresh much the same way CRTs do, as far as updating horizontal lines on the screen from top to bottom. They do not refresh the entire screen all at once; plasmas do do that, however.


Here is a type of input lag testing software running on my PC, testing the LCD TV that I use as a monitor vs. a CRT:
DSCN0535.jpg

Same monitor now vs. a plasma TV:
DSCN0551.jpg

Notice the difference: same PC and same program running; the difference is how the displays work.
 
Huh?

If you're getting tearing, vsync isn't on.

With vsync off, I'd say ...maybe. A 2560x1600 frame is around twice the size of a 1600x1200 frame. So it should take twice as long to send to the monitor, which means it's twice as likely to be interrupted by a buffer flip (i.e. twice as likely to tear). But this is assuming transfer rates are the same regardless of the monitor, and I have no idea if that's the case.

How long it takes to send the data from the PC to the display is quite insignificant, and with vsync off, the delay caused by the PC syncing to the display's frame update is removed from the situation. With analog connections (VGA) it's near instantaneous, and DVI/HDMI is not far from instantaneous either.
Think of local LAN pings: I have a server here in my room connected to the same hub as my PC, and if I ping it I get a <1 ms response. That is a two-way latency reading over a digital connection, to a device that's further from my PC than my monitor is, going through a "middle man" device. Latency is not an issue with video connections; bandwidth can be, depending on the resolution, and that is why dual-link DVI exists to fix that.
 
How long it takes to send the data from the PC to the display is quite insignificant, and with vsync off, the delay caused by the PC syncing to the display's frame update is removed from the situation. With analog connections (VGA) it's near instantaneous, and DVI/HDMI is not far from instantaneous either.
It's fast, but it's not insignificant. If it were nearly instantaneous, then it would be nearly impossible for a buffer flip to land in the middle of a frame transmission, but that's exactly what happens every time you see a tear.

There is a limit to the resolution that DVI and HDMI can support, and presumably it's a limit because the link is saturated at this point, i.e. it's taking an entire refresh interval to send each frame. A 1920x1200 signal accounts for about 84% of the available bandwidth over DVI / HDMI 1.2, which would suggest that each frame is taking 14ms to transmit.
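
Here's the arithmetic behind those two figures, as a quick Python sketch (I'm assuming single-link DVI / HDMI 1.2's 165 MHz pixel clock and ignoring blanking intervals):

# 1920x1200 @ 60 Hz over single-link DVI / HDMI 1.2 (165 MHz pixel clock assumed,
# blanking intervals ignored).
PIXEL_CLOCK_HZ = 165_000_000
width, height, refresh = 1920, 1200, 60

pixels_per_second = width * height * refresh        # 138.24 Mpx/s of active pixels
utilisation = pixels_per_second / PIXEL_CLOCK_HZ    # ~0.84 -> "about 84%"

frame_transmit_ms = width * height / PIXEL_CLOCK_HZ * 1000   # ~14 ms per frame
print(f"link utilisation: {utilisation:.0%}")
print(f"time to send one frame: {frame_transmit_ms:.1f} ms")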

Analog signals are a bit more clear-cut. The signal must be coming through continuously as long as the electron gun is firing, blanking intervals aside. So the time it takes to send a full frame over VGA / DVI-A is exactly the frame interval minus the vertical blanking interval.

Think of local LAN pings: I have a server here in my room connected to the same hub as my PC, and if I ping it I get a <1 ms response.
Yeah, but a ping is not a full frame's worth of pixel data. Your monitor may receive the first bit <1ms after it was sent, but it's not going to receive the last bit <1ms after the first.
 
Huh?

If you're getting tearing, vsync isn't on.

With vsync off, I'd say ...maybe. A 2560x1600 frame is around twice the size of a 1600x1200 frame. So it should take twice as long to send to the monitor, which means it's twice as likely to be interrupted by a buffer flip (i.e. twice as likely to tear). But this is assuming transfer rates are the same regardless of the monitor, and I have no idea if that's the case.
Thanks for the reply. Sorry about the typo in the title.

I have a 24-inch Apple LED Cinema Display, which has a native resolution of 1920x1200, and an EVGA 560 Ti 2GB.
What do you mean by transfer rates? I seem to get pretty bad tearing in Turok with nGlide at 640x480 (with the NVIDIA control panel set to "do not scale" and "override application setting") if I have vsync off. That could be because it never runs at less than 60fps, though, as I have no way to limit the frame rate for that app.
 
What do you mean by transfer rates? I seem to get pretty bad tearing in Turok with nGlide at 640x480
I mean, DVI allows up to ~4Gb/s, but I don't know if the link always runs at this speed. Your monitor could have half the resolution, but if it's also transferring at half the rate, then your chances of tearing at a given framerate are the same.

This is all kind of pointless theorising, though... As for your actual problem (which you might want to bring up earlier next time ;) ), lower res = higher framerate, and higher framerate = more tearing. Running a 15-year-old game at a measly 640x480, you've probably got framerates in the thousands, which is going to mean dozens of tears per frame. If frame limiters aren't working, there's not much to be done besides turning on vsync, I'm afraid...
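
For a rough sense of scale, here's a tiny Python sketch (the 2000 fps figure is just a made-up example; the real number depends on the game and your hardware):

# How many buffer flips (potential tears) land inside each refresh when the
# render rate is far above the display's refresh rate.
refresh_hz = 60
render_fps = 2000                              # hypothetical uncapped framerate

tears_per_refresh = render_fps / refresh_hz    # ~33 flips per displayed frame
print(f"~{tears_per_refresh:.0f} buffer flips (potential tears) per refresh")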
 
I mean, DVI allows up to ~4Gb/s, but I don't know if the link always runs at this speed. Your monitor could have half the resolution, but if it's also transferring at half the rate, then your chances of tearing at a given framerate are the same.

This is all kind of pointless theorising, though... As for your actual problem (which you might want to bring up earlier next time ;) ), lower res = higher framerate, and higher framerate = more tearing. Running a 15-year-old game at a measly 640x480, you've probably got framerates in the thousands, which is going to mean dozens of tears per frame. If frame limiters aren't working, there's not much to be done besides turning on vsync, I'm afraid...
Okay, thank you :) NVIDIA is going to add an FPS limiter in future drivers, so I'll see what it does then.
 