Vertical Sync in LCD Games - 30 or 60?

  • Thread starter: Deleted member 23757 (Guest)
Quick question. I have an LCD (in sig) set to a 60 Hz refresh rate. When I enable Vertical Sync in games, something weird happens: the FPS is either 30 or 60. If it can render 60 it stays at 60, but if it drops below 60 the frame rate automatically falls to 30. Is this normal behavior for Vertical Sync on an LCD?

Edited the title.
 
kronchev said:
no sync will ever be 30 FPS. how are you getting that reading?

Using Fraps and the in-game FPS meters. For some reason it seems like it is halving the frame rate when it cannot keep 60 FPS going, so it drops to 30 FPS. Very weird. I hope there are some gurus out there that can help :)
 
PerfectCr said:
Using Fraps and the in-game FPS meters. For some reason it seems like it is halving the frame rate when it cannot keep 60 FPS going, so it drops to 30 FPS. Very weird. I hope there are some gurus out there that can help :)

I have a 174V and I experience no such problem with vsync, so sorry
 
Searched Google and I can only find one forum where someone said enabling VSync halves the FPS when it drops below the max of your refresh rate :(
 
I thought if you enable triple buffering (I think?) it doesn't do that
 
Isn't the game capped at 60 fps?

The only way to exceed that is to disable vsync and hope that you don't get tearing or other issues... but then again I may be full of crap. :D
So you should max out if your computer/video card can handle it.

Works fine on my LCD....

Try using:

com_showfps 1

for in-game FPS
 
msny said:
Isn't the game capped at 60 fps?

The only way to exceed that is to disable vsync and hope that you don't get tearing or other issues... but then again I may be full of crap. :D
So you should max out if your computer/video card can handle it.

Works fine on my LCD....

Try using:

com_showfps 1

for in-game FPS

Um, thanks, but I am not talking specifically about Doom 3 and I don't think you understand my question.
 
I notice this as well. However, I'm not noticing any in-game differences when it's jumping from 30 to 60, so I'm guessing vsync is screwing with Fraps, which is reporting incorrect numbers.
 
PerfectCr said:
Where do you enable that?

I'm not sure, I have an ATI card so I'm not familiar with NVIDIA's menus, but it's probably somewhere in the OGL/D3D options
 
Well well well... some further digging led me to this thread at Guru3D. It looks like enabling triple buffering would solve the FPS-halving issue, but NVIDIA removed the triple buffering option for OpenGL from their drivers some time back. :( Oh well, guess I gotta leave VSync off.
 
I experience the exact same thing on my 23" LCD. For the most part in D3 it is 60 fps, and every once in a while it will drop to exactly 30 fps. It doesn't affect the playability at all. One thing that is for sure is D3 does look awesome on Ultra detail.
 
I have a CRT and noticed that I either get 30 or 60 FPS constantly (usually 30). It's not monitor related. Also, the game runs my monitor at 60 Hz and it's pissing me off. I want 85 Hz.
 
MiXdNuTs said:
I have a CRT and noticed that I either get 30 or 60 FPS constantly (usually 30). It's not monitor related. Also, the game runs my monitor at 60 Hz and it's pissing me off. I want 85 Hz.

Yeah, if you read the thread I linked you'd see that it's DRIVER related, not monitor related. Your issue is easily solved by forcing the refresh rate at the driver level.
 
MiXdNuTs said:
I have a CRT and noticed that I either get 30 or 60 FPS constantly (usually 30). It's not monitor related. Also, the game runs my monitor at 60 Hz and it's pissing me off. I want 85 Hz.


You can use a program called nvhardpage (google it) that allows you to turn ON triple buffering for OGL. Maybe NVIDIA will pull their head out of their backside and give us back this option in the driver control panel one day...
 
TheRapture said:
You can use a program called nvhardpage (google it) that allows you to turn ON triple buffering for OGL. Maybe NVIDIA will pull their head out of their backside and give us back this option in the driver control panel one day...

Actually, I did download that tool and I could not find the option. I read that NVIDIA actually REMOVED support for it from the driver itself, so there is no way to enable it even with a tool like that.
 
PerfectCr said:
Actually, I did download that tool and I could not find the option. I read that NVIDIA actually REMOVED support for it from the driver itself, so there is no way to enable it even with a tool like that.


Well, I have it set in that prog, and I am able to get frame rates between 30 and 60... in fact quite often I am running right around 45 fps or so. Here is a graph of a level I recorded using FRAPS...

[attached graph: DOOM3-1280-HIGH.jpg]
 
TheRapture said:
Well, I have it set in that prog, and I am able to get frame rates between 30 and 60... in fact quite often I am running right around 45 fps or so. Here is a graph of a level I recorded using FRAPS...

Your graph really does not prove anything (assuming you are using an NVIDIA card; ATI SUPPORTS triple buffering), since the transition between 60 and 30 is not immediate. It takes a few seconds, but it does level off at 30. Read the thread I linked to above and you will see.
 
If I remember right, this is why that happens: vsync is meant to reduce screen tearing, so it caps your frame rate at your monitor's refresh rate. If your card cannot produce that FPS, it has to cut the frame rate to an even divisor of the refresh rate so that new frames still line up with refreshes; hence, no screen tearing.

I could be mistaken though :(
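
Here's a rough way to see the pattern (just a sketch that assumes a constant render time and an idealized double-buffered swap; real games and drivers aren't this tidy):

import math

# With double-buffered vsync a finished frame has to wait for the next
# refresh, so the displayed rate snaps to refresh / N for a whole number N.
def vsync_fps(render_fps, refresh_hz=60):
    if render_fps >= refresh_hz:
        return refresh_hz
    intervals = math.ceil(refresh_hz / render_fps)   # refreshes each frame stays up
    return refresh_hz / intervals

for fps in (75, 59, 45, 31, 29, 15):
    print(fps, "->", vsync_fps(fps))
# 75 -> 60, 59 -> 30.0, 45 -> 30.0, 31 -> 30.0, 29 -> 20.0, 15 -> 15.0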
 
Nihilanth99 said:
If I remember right, this is why that happens: vsync is meant to reduce screen tearing, so it caps your frame rate at your monitor's refresh rate. If your card cannot produce that FPS, it has to cut the frame rate to an even divisor of the refresh rate so that new frames still line up with refreshes; hence, no screen tearing.

I could be mistaken though :(

Yeah I think you are right. The more I read the more I am coming to this conclusion.
 
I thought everyone who knew about vsync understood its by-products. Here's what is happening. Start counting after your first frame is drawn. After 1/60th of a second, the monitor starts another screen draw. However, your system can only draw 59 frames per second, so it takes barely longer than 1/60th of a second to render a frame. Since your system isn't finished with the 2nd frame, the monitor just gets refreshed with the frame from the 1st 60th of the second. So, we've just completed 2/60ths of a second and only put 1 image on the screen.

Now, just after the screen starts redrawing, your system finishes the 2nd image that needs to be displayed. Since you have VSync on, your system won't try to push this 2nd image to the monitor. The monitor is already drawing frame 2 by repeating frame one. Now, your system could go ahead and start rendering the next image for the game, but since you only have double buffering available, you can't render. Your first buffer is what is showing up on the screen. Your second buffer is what you just finished rendering. The computer wants to start image 3, but has nowhere to put it. Soooooo, we have to wait until the 3rd 60th of this second, when the frame we just finished rendering starts getting displayed. Then, the old display buffer will be available for rendering, and the old storage buffer will become the display buffer.

So now we are displaying our 2nd image in the 3rd 60th of the second. While this one is drawing on screen, the computer starts rendering image 3 in the old display buffer. When this 60th of a second is up, our system isn't quite done with the 3rd image. Sooooo, the system just re-uses the 2nd image and draws that for the 4th 60th of this second. And once we finish, just barely after the 2nd image was put onscreen again, we are ready to render the 4th image but have nowhere to render it. So, we wait for the 60th of a second to end, make the buffer we just finished rendering into the new display buffer, and start rendering the 4th image into the old display buffer.

This continues throughout the display process. If you can't keep up with the screen's refresh rate, you'll drop to an FPS rating of 1/2 the refresh rate. And if you render slower than 1/2 the refresh rate, you'll drop to the next even divisor, 1/3 of the refresh rate (20 FPS on a 60 Hz display), then 1/4, and so on.

From this, you should be able to see why triple buffering will boost your FPS rating. If you can start rendering into an unused buffer, you can sometimes catch up with the monitor/LCD. That's because when you average 59 FPS, individual frames actually take different amounts of time to render. So frame 1 might take barely longer than 1/60 of a second, as might frames 2, 3, 4, and 5. Then frame 6 comes up and is very plain and easy to draw, so the system renders it in 1/90 of a second. Frames 1 through 5 display on the 1st, 3rd, 5th, 7th, and 9th refreshes, but frame 6 is ready in time for the 10th refresh, breaking the every-other-refresh pattern. So whenever frames can render in less than one refresh interval, you get new images on consecutive refreshes; when a frame takes longer than that, it gets displayed more than once.

Wow, that explanation sucks. Hopefully, though, you can make enough sense of it to see what's happening.
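
To put some numbers on it, here's a small simulation of that walkthrough (only a sketch with a constant render time and an idealized swap chain, not how any real driver actually schedules frames):

import math

REFRESH = 1.0 / 60     # 60 Hz display
RENDER  = 1.0 / 45     # the card on its own can only manage 45 fps

def simulated_fps(buffers, frames=1000):
    present = []       # vblank time at which each finished frame hits the screen
    gpu_free = 0.0     # when the GPU finishes the frame it is working on
    for i in range(frames):
        # frame i re-uses the buffer of frame i-buffers, which only comes
        # free once frame i-(buffers-1) has made it onto the screen
        j = i - (buffers - 1)
        start = max(gpu_free, present[j]) if j >= 0 else gpu_free
        done = start + RENDER
        # the new frame goes up at the next refresh after it is finished,
        # and every frame stays on screen for at least one full refresh
        flip = math.ceil(done / REFRESH) * REFRESH
        if present:
            flip = max(flip, present[-1] + REFRESH)
        present.append(flip)
        gpu_free = done
    half = frames // 2   # average over the back half to skip the start-up transient
    return (frames - 1 - half) / (present[-1] - present[half])

print("double buffering:", round(simulated_fps(2)))   # 30 - locked to half the refresh
print("triple buffering:", round(simulated_fps(3)))   # 45 - the card's real rate

With two buffers the GPU sits idle waiting for the swap, so a 45 fps card gets quantized down to 30; with a third buffer it keeps rendering and the average lands back around 45, which is the kind of in-between reading FRAPS shows when triple buffering is on.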

RagManX
 
Thanks, I think I got it :) Good explanation, thanks for taking the time to write it!
 
Nice explanation, but we still have no way to get triple buffering to work. Anyone have any info?

G
 
gthompson20 said:
Nice explanation, but we still have no way to get triple buffering to work. Anyone have any info?

G


um,

control panel for the vid card?
 
gthompson20 said:
Nice explanation, but we still have no way to get triple buffering to work. Anyone have any info?

G

Try reading the entire thread and see the link to the Guru3D thread I posted.
 