Refresh Rate

MaMMa

I just went from an ATi 9800XT to a BFG 6800 GT OC

I used to be able to run my monitor at a 120Hz refresh rate @ 1600x1200. Now with the 6800 it can only get to 75Hz! This is lowering my FPS in all games. What's the deal? I've got a Samsung SyncMaster 957MB and I'm sure it can handle these high refresh rates.

I've tried the official NVIDIA drivers, the beta ForceWare, and the Omegas. All the same, and the Omegas really screwed things up.

Anyone know why this is happening?
 
The deal is that you need to uncheck "Hide modes that this monitor cannot display". 120Hz at a decent resolution is probably going to wreck any normal CRT, though. What kind is it?
 
You used that registry tweak where you can pick unsafe (non-default) refresh rates with your XT, right??

I think you did, because I've never heard of 120Hz, especially at a resolution that high :p so reinstall it
 
I used no registry tweaks. My monitor can handle these rates, hidden or not. I did check that box to get to the higher rates, but it should work either way. Since it caps at 75 now, my FPS aren't as high as they should be. In CS:S I'm getting anywhere from 60-113 FPS, and I believe it should stay constant without dropping. My 9800XT did better than this, and I'm sure the 6800 is the better card.

My brother is currently running my 9800XT and he gets better FPS, what gives :confused:
 
If it's got vsync enabled, that's the only way it should stop at 75Hz. If not, you're using different settings or need to use Driver Cleaner. Your new card is much more powerful than the XT was. Are you getting a constant 75 FPS without dropoff? Does the FPS go lower than the XT did? Is vsync enabled in the game config files? Are your NVIDIA drivers the most current?
 
Drivers are the latest. I did a clean format not too long ago after my Raptor died. Vsync is off, and if it were on the FPS would be lower, right? I get up to 130 when no one is around, and it drops into the 60s. I would have to assume this card's lowest FPS should be a lot higher than a measly 60.

I run the game at 1280x960.
 
Well, this is the main thing: how come my ATi card can handle the high refresh rate and the Nvidia can't?
 
How long has that monitor been running at 120Hz at 1600x1200? I just want to know if running high refresh rates can cause an early death to your monitor.
 
It's run at 120Hz for about 1.5 years or so. I've got a 5-year warranty on my CRT, so I'm good.

As for the link above, if it can run 85Hz at that high a resolution, why can't it even handle 100Hz at 1600x1200? This is just sad :mad:
 
Yes I would. Read the thread. Any gamer would notice a drop from 100 FPS to 60.
 
Like you're gonna notice a difference between 85Hz and 100Hz+

I notice a huge difference. 100Hz is the minimum refresh rate where I don't notice any flickering. I would love to know how to run 1280x1024 @ 100Hz... the most Windows will let me do is 85.
 
MaMMa said:
Yes I would. Read the thread. Any gamer would notice a drop from 100 FPS to 60.

And you should learn the difference between REFRESH RATE and FRAMES PER SECOND...

A card can easily give you 110 FPS while your monitor is locked at 75Hz...

Terra - Stop mixing things!
 
I understand that. My monitor is not locked at 75Hz, that's the thing. It only started after I went from ATi to Nvidia. You people aren't understanding me.
 
What a cluster we have going on here. Let's talk some basics.

Assuming vsync is off, FPS is the count of how many frames the video card can throw into the output buffer over a given second. Note that these frames are NEVER delivered to the output buffer evenly distributed across one second. Consider them to be random in distribution and you'll be able to piece the rest together.

Refresh rate is the number of times per second that the output device (CRT or LCD) reads the output buffer and displays a new image. Refresh rate IS evenly distributed across one second.

Knowing that, you can assume two things. First off, even with 80FPS and 60Hz refresh rate, it is possible, and even likely, that several times per second the image in the output buffer will be the same from one refresh cycle to the next. This would not be true if FPS was constant. Second, you can assume that in the same scenario, the output buffer's image would have changed more than once between refresh cycles. This would be true regardless of whether or not FPS were constant.

What does this all mean? Not much, except for the fact that it appears there are a few people in this thread who already knew all of this, and there are several who clearly don't.

Expecting any video card to produce a constant FPS output regardless of scene composition is silly. Let me correct that. Expecting a COMPUTER to output a constant FPS regardless of what's going on in the scene is stupid. Don't forget that the CPU plays a role in rendering a scene as well. You can have quad SLI'ed 6950 Ultra 2's in your system for all it matters. Sometimes there are going to be constraints that the CPU imposes.

And finally, 60 FPS with vsync on can be a hell of a lot more impressive and better looking than 85 FPS with vsync off. The ultimate question at the end of the day is does it look smooth, and can you hang when things get intense and everyone else is bogged down?
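
If it's easier to see with numbers, here's a quick Python sketch of my own (not anything from the drivers or the game) that simulates one second of randomly timed frame completions against an evenly spaced refresh clock, using the 80 FPS / 60Hz example above. It counts the refresh cycles that end up redisplaying an old frame and the frames that get overwritten before the monitor ever reads them.

```python
import random

def simulate(fps, refresh_hz, seed=0):
    """Simulate one second of frame delivery vs. a fixed refresh clock."""
    random.seed(seed)
    # Frames land in the output buffer at random times within the second
    # (the point above: delivery is NOT evenly spaced).
    frame_times = sorted(random.random() for _ in range(fps))
    # The monitor reads the buffer at perfectly even intervals.
    refresh_times = [i / refresh_hz for i in range(1, refresh_hz + 1)]

    repeated_refreshes = 0   # refresh cycles that showed the same frame again
    wasted_frames = 0        # frames overwritten before any refresh read them
    frame_idx = 0
    for t in refresh_times:
        new_frames = 0
        while frame_idx < len(frame_times) and frame_times[frame_idx] <= t:
            new_frames += 1
            frame_idx += 1
        if new_frames == 0:
            repeated_refreshes += 1
        elif new_frames > 1:
            wasted_frames += new_frames - 1
    return repeated_refreshes, wasted_frames

repeats, wasted = simulate(fps=80, refresh_hz=60)
print(f"80 FPS into a 60Hz monitor: {repeats} refreshes repeated an old frame, "
      f"{wasted} frames were never displayed at all")
```

Run it with a few different seeds and you'll see both counters are almost never zero, which is the whole point.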
 
Anything above 85Hz is flicker-free, so it's not like 85 vs 100Hz is going to make a noticeable difference. All Samsung CRTs are the cheapo shadow-mask type, so if you're running refresh rates that high you're probably getting image distortion and turning the invar white hot. Not even my aperture-grille Trinitron goes above 72Hz at 1600x1200, and the highest-end Sony CRT that costs $1,700 (walk into any graphic design studio or place where image quality is paramount and you'll see these) won't go above 85Hz @ 1600x1200.
 
I never expected the FPS to remain constant in all scenes; it just bothers me that for some reason my 9800XT ran performance and quality better in CS: Source than my 6800. In any other game, however, the 6800 easily dominates the 9800.
 
Finally, a little bit of real information in this thread...

Basically, your refresh rate has nothing to do with your FPS. And I know people can tell the difference between 60 and 75 or 85Hz, but I really doubt you can tell the difference between 75 or 85 and 120.
 
When I have my refresh rate at 60, my FPS is capped at 60. When my refresh rate is at 75-85 it's higher? How does refresh rate have nothing to do with FPS? :confused:
 
Maybe you've got the drivers set up improperly or something, because the 6800 is way more powerful than the 9800. The 6600GT is even more powerful.
 
My drivers are fine, I've used three different sets, all with the same issue :(
 
MaMMa said:
When I have my refresh rate at 60, my FPS is capped at 60. When my refresh rate is at 75-85 it's higher? How does refresh rate have nothing to do with FPS? :confused:

AFAIK, with vsync off, the two are not related. And how has your monitor not caught on fire or something while supposedly playing at 120Hz? Unless you had it at like 640x480 at 120Hz... which of course is pointless in a way I don't even need to explain.
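
For what it's worth, here's a toy Python loop (my own illustration, not how Source or the drivers actually work) showing why vsync is the thing that ties the two together: with vsync on, a finished frame has to wait for the next refresh before the buffers can flip, so FPS can never exceed the refresh rate; with vsync off, the loop runs flat out and the monitor just grabs whatever is in the buffer at each refresh.

```python
def frames_in_one_second(render_time, refresh_hz, vsync):
    """Count frames finished in one simulated second.

    render_time: seconds the card takes per frame (assumed constant here)
    refresh_hz:  monitor refresh rate
    vsync:       if True, each finished frame waits for the next refresh
    """
    refresh_period = 1.0 / refresh_hz
    t, frames = 0.0, 0
    while t < 1.0:
        t += render_time                      # render the frame
        if vsync:
            # wait until the next refresh boundary before flipping buffers
            t = ((t // refresh_period) + 1) * refresh_period
        frames += 1
    return frames

# A card that could do ~110 FPS on its own:
print(frames_in_one_second(1 / 110, refresh_hz=75, vsync=False))  # ~110
print(frames_in_one_second(1 / 110, refresh_hz=75, vsync=True))   # ~75, capped by the refresh rate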
 
MaMMa said:
I just want it to run at 100Hz @ 1280x960

Yeah, you can say that all you want, but is it necessary? I don't see how 85 or even 75Hz is such a step down. Now, the FPS issues sound like a different story. What game are you playing?
 
I'm playing Counter-Strike: Source, and my FPS is now in the 49-80s. It's not smooth at all. I can't play it like this. This really blows. I couldn't care less about the refresh rate, really; I just want the game to run smooth and look good :(
 
You do realize that Doom 3 and other games are capped at 60 or even 30 FPS and are totally playable, right? 60-80 is still really high.
 
Yes, I understand that. I can play Doom 3 at 1600x1200, same with HL2, but the second I get into CS: Source, everything feels crappy. I tried playing it at 800x600, 1024, 1280, and 1600; all the same.

I know this card should be able to handle this game.
 
I just turned my AA and AF down a notch to 4x/8x I think, and now it's running a lot better.

What's fast writes?
 
You had AA and AF at those levels with the 9800? My 9800 can barely do 2xAF @ 1280x1024. I can't even run higher than 2xAF on my 6800 @ 1600x1200 + maximum settings in every game (including Ultra on Doom 3).

Fast writes is an AGP setting that you can disable in the low-level system settings in RivaTuner from www.guru3d.com
 
Some monitors still flicker at 85Hz. The only way to get rid of it is to go to 100Hz. I had a KDS 195vf? (the Trinitron 19"), but it sucked even at 85Hz at any resolution. I could definitely tell the screen was flickering. However, I do wonder if running it at such a high desktop resolution caused the monitor to die in less than 3 years. With this new NEC P95f+, I can do 1280x1024 @ 85 and there's no flickering. I know it can do 1600x1200 @ 85, but at that resolution I think the screen is too small. There is a difference between the "good" Trinitrons and the "entry-level" Trinitrons, and the refresh rate is the only way you can tell them apart when you buy the monitor. The entry-level ones only manage up to 1600x1200 @ 60Hz. The good Trinitrons do 1940x(something) @ 75Hz.
 
85Hz is the VESA standard for flicker-free operation. The max my Trinitron will do at 1600x1200 is 72Hz, according to the drivers.
 
I'd never run that high. My monitor can address up to 2048x1536 at 80Hz without a problem, but at 16x12 the thing's rated for 85Hz from the factory (again, it can address higher, though). Higher will cut down on its lifespan and can cause distortion (I don't see it on mine, but I don't look for those things).

If you really want higher refresh rates, just go into the Monitor tab in Display Properties and deselect "Hide modes that this monitor cannot display". That, or you can go into the nvidia tab and use the refresh rate override.
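
If you're curious exactly which modes and refresh rates the driver is exposing before and after you untick that box, you can dump them with the Win32 EnumDisplaySettings call. Here's a rough Python/ctypes sketch I'd use as a starting point (Windows only, obviously); the DEVMODE layout below is trimmed, with the printer union and trailing fields collapsed into padding we never read, so double-check it against wingdi.h if it misbehaves.

```python
import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Layout follows wingdi.h; fields we don't read are collapsed into padding.
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmUnion", ctypes.c_byte * 16),      # printer/position union (unused)
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmTrailing", wintypes.DWORD * 8),   # ICM/panning fields (unused)
    ]

user32 = ctypes.windll.user32

def list_modes():
    """Yield (width, height, refresh_hz) for every mode the driver exposes."""
    i = 0
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    # None = the primary display adapter; iModeNum counts up from 0
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
        yield dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency
        i += 1

for w, h, hz in sorted(set(list_modes())):
    print(f"{w}x{h} @ {hz}Hz")
```

If 1600x1200 @ 100 or 120 never shows up in that list, the driver simply isn't offering it, and no amount of checkbox toggling in the game will get you there.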
 