no need for >60fps guys..

Contrary to popular belief, your average police department helicopter *might* hit 140mph with the wind behind it on half a tank of fuel... most of the time they are lucky to break 100-120mph. Outrunning a police helicopter isn't impossible; it's actually pretty easy on a sport bike. However, they have 3-4 miles of visibility, so you have to outrun them for quite some time before you can disappear.

They might not be as fast, but they are not limited to roads. ShortCuts FTW!!!

Running from the police is a dumb thing to do unless you're living in a movie. I have a sport bike and don't plan on ever trying to run from the po-po. Yamaha R6 (I'm a new rider :) )
 
There are lots of games where this happens; the id engines are famous for it. Doom 3 is capped internally at 60FPS for this reason.

Well, that's sad then. I remember reading your posts in some other threads and know you know a bit about game development, so I will believe you. I just didn't think professional developers would do something like that. I would never lock a game's logic to its framerate; that's just stupid.
 
LMAO, I love that part about how true gamers don't use LCDs.. LMAO!! Good one, Blaze.. :rolleyes:

I'm saying that if you honestly give a shit about refresh rates, frame rates, etc., you wouldn't be using an LCD in the first place... and if you play games casually and then bitch about your $600 LCD's shitty performance in games, you shouldn't have bought an LCD to begin with... People who buy LCDs can accept their limitations and use them for what they're good for... saving space...
 
Well, that's sad then. I remember reading your posts in some other threads and know you know a bit about game development, so I will believe you. I just didn't think professional developers would do something like that. I would never lock a game's logic to its framerate; that's just stupid.

I think I may have been slightly off the mark; I've just searched around on this matter some more and found this:

John Carmack of id Software: "The game tic simulation, including player movement, runs at 60hz, so if it rendered any faster, it would just be rendering identical frames. A fixed tic rate removes issues like Quake 3 had, where some jumps could only be made at certain framerates. In Doom, the same player inputs will produce the same motions, no matter what the framerate is."

All id engines before Doom 3 ran the tic rate of the game at the same speed as your frame rate. The tic rate is the rate at which the game code checks all the world variables and alters them based on user inputs. For things like jumping, the height of the jump is no doubt calculated each tic, so depending on your frame rate you may or may not be able to get over certain heights, based on how frequently your height is checked.

I believe having a frame rate of 125, or any multiple of it, meant that a sample of your jump height was probably taken exactly at the apex of your jump, whereas with other tic rates (frame rates) there may be no measure of your height at exactly the apex, only approaching or leaving it, which gives a total height less than the intended height.
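Here's a tiny C sketch of the sampling effect I'm describing. The jump velocity and gravity values are made up for illustration, and this is not id's actual movement code; it's just the idea that the apex only "exists" if a tic happens to land on it:

```c
/* Illustrative only: sample a ballistic jump at a given tic rate and
 * record the highest height the game logic ever "sees". The v0/g
 * values are arbitrary; this is not id's actual movement code. */
#include <stdio.h>

static double max_sampled_height(double v0, double g, double tic_hz)
{
    double dt = 1.0 / tic_hz, best = 0.0;
    /* sample each tic until the jump lands (height returns to 0) */
    for (double t = 0.0; v0 * t - 0.5 * g * t * t >= 0.0; t += dt) {
        double h = v0 * t - 0.5 * g * t * t;
        if (h > best)
            best = h;
    }
    return best;
}

int main(void)
{
    const double v0 = 270.0, g = 800.0;           /* arbitrary units */
    const double rates[] = { 30.0, 60.0, 125.0, 333.0 };
    printf("true apex: %.4f\n", v0 * v0 / (2.0 * g));
    for (int i = 0; i < 4; i++)
        printf("%4.0f Hz tic rate -> max sampled height %.4f\n",
               rates[i], max_sampled_height(v0, g, rates[i]));
    return 0;
}
```

Run it and the sampled apex creeps toward the true v0²/2g as the tic rate rises, which is the same kind of effect I mean with the 125fps thing.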

I'm sure it also affected weapon fire speeds. There was some debate about this issue with the minigun in Epic's Unreal Tournament 2004 as well: although ticrates were variable by server admins for server-side code, I think players were altering their own game's ticrate, which wasn't tied to the frame rate but set by a client-side command. Altering this gave potentially slightly higher rates of fire. I read a thread on this ages ago, so how much of it is true I'm not sure.
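On the fire-rate point, the quantisation is easy to show: if a weapon can only fire on a tic, the refire interval gets rounded up to a whole number of tics. Rough C sketch, with a made-up refire interval rather than UT2004's real numbers:

```c
/* Sketch of refire quantisation: a weapon that can only fire on a tic
 * has its desired interval rounded up to whole tics. The interval here
 * is hypothetical, not UT2004's actual value. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double desired_interval = 0.087;   /* seconds between shots */
    const double tic_rates[] = { 30.0, 60.0, 85.0, 100.0 };
    for (int i = 0; i < 4; i++) {
        double dt = 1.0 / tic_rates[i];
        /* round the interval up to a whole number of tics */
        double actual = ceil(desired_interval / dt - 1e-9) * dt;
        printf("%5.0f Hz: %.1f rounds/sec (wanted %.1f)\n",
               tic_rates[i], 1.0 / actual, 1.0 / desired_interval);
    }
    return 0;
}
```

So a higher ticrate lets the effective rate of fire sit closer to the intended one, which fits the minigun debate.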
 
I think I may have been slightly off the mark; I've just searched around on this matter some more and found this:

...

In Quake 2 CTF online, there was this map with a tiny window that normally was just low enough that you couldn't get your character through... but if you capped your FPS low enough (or just had bad FPS), you could "squeeze" your way through by grappling.

I didn't play Q3 because of real life, but I've heard of similar tricks that can only be done at certain FPS (usually higher than 100 or something).

Guess you never tried playing some of the much older games out there on newer hardware. Lots of them were broken when your FPS was too high; the whole game moved like a blur..
 
Guess you never tried playing some of the much older games out there on newer hardware. Lots of them were broken when your FPS was too high; the whole game moved like a blur..

GTA (2?) with uncapped speed FTW!! Nothing like getting in a vehicle and going forward only to instantly crash into the first wall miles away :p
 
If you've ever programmed games before (I've actually coded in every id engine, with the exception of Quake 4), you can see the advantages and disadvantages of your tic update. Capping at 60Hz is really a rigged-up way to address the issue. The more complex a given game is, the more it can benefit from a higher tic rate. FPSs really don't need tic rates higher than 60Hz, because there are very few things that require that kind of precision (like certain jumps). However, if you do lower the update rate, you increase the probability of inaccurate data from client to server, and that can cause problems. You always want to avoid something happening differently on a client's computer than on the server.
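For anyone curious what decoupling the tic rate from the frame rate looks like in practice, this is the standard fixed-timestep pattern (an accumulator loop). It's a generic sketch with placeholder update()/render() functions, not code from any id engine:

```c
/* Generic fixed-timestep loop: game logic advances in fixed 60 Hz
 * tics no matter how fast frames are drawn. update() and render()
 * are placeholders; this is the common pattern, not id's code. */
#include <stdio.h>
#include <time.h>

static double now_seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

static void update(double dt) { (void)dt; /* advance game logic one tic */ }
static void render(void)      { /* draw whatever the current world state is */ }

int main(void)
{
    const double tic_dt = 1.0 / 60.0;   /* 60 Hz game logic */
    double accumulator = 0.0;
    double previous = now_seconds();

    for (int frame = 0; frame < 1000; frame++) {  /* stand-in for "while running" */
        double current = now_seconds();
        accumulator += current - previous;
        previous = current;

        /* run as many whole tics as real time has made due */
        while (accumulator >= tic_dt) {
            update(tic_dt);
            accumulator -= tic_dt;
        }
        /* frames drawn faster than the tic rate just repeat identical state */
        render();
    }
    return 0;
}
```

This is exactly why rendering above the tic rate can only give you identical frames, as the Carmack quote above says.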
 
GTA (2?) with uncapped speed FTW!! Nothing like getting in a vehicle and going forward only to instantly crash into the first wall miles away :p

Yeah, loads of old games had frame limiters which weren't strictly needed at the time but are sure as hell needed now. GTA 2 does spring to mind, and I think some of the older Command & Conquers as well, no doubt countless others.
 
And to avoid the head-splitting eye ache of the CRT.

and to avoid having to shield your eyes from the glare while you peck at the tiny buttons on the bottom of it, adjusting the screen dimensions for 20 minutes.
 
IMO a good CRT trumps a good LCD; however, an average LCD is just as good as, if not better than, an average CRT. The problem is you cannot buy good CRTs anymore.

LCDs, despite their downsides, do sport very large panel sizes for relatively cheap, have good brightness (my old Iiyama CRT is very dark now :(), display no flicker, are very low profile, require no resizing at different resolutions or refresh rates, and don't have issues with colour at the corners of the screen, interference, or glare/reflection.

Still, LCDs have plenty of issues, though a lot of them aren't a problem if you get a good panel.
 
And to avoid the head-splitting eye ache of the CRT.

Then there's the major win -- they don't weigh 20-40 kilograms. There's also the minor part where you don't need lots of spare space behind your desk for the tube to stick out, or spare desk space for it to stick forward.

As for the OP -- eh, it keeps them happy and off the streets, let them have fun. Plus, as they said, it keeps the minimum and average FPS off the floor.
 
Well, I'm reading up on vsync, and my LCD refresh rate is 60Hz.
Ew! That's like touching another man's arm... it's just wrong. It doesn't do 75? Regardless, 100Hz on a good CRT kicks LCD ass. I'd take it over widescreen gaming any day.
 
It's not so much that people want 100+ fps for the 100+ fps average, but for getting their minimum fps to something like 70+, because the minimum fps always occurs when you need the high frame rate most.
 
and brighter colors
and sharper images
etc...

Debatable... You'll never get good black levels with an LCD...

The ability to make it to LANs in one trip.

I need 2 trips with or without a CRT... I carry all the Cat-5 cables, power strips, hubs, games, extra headsets/mice... Even then, it's risky trying to carry a computer under one arm and an LCD under the other... I've almost killed a couple of computers that way :eek:

And to avoid the head-splitting eye ache of the CRT.

Amazing how we lived without LCDs for so long... Maybe it's a flaw with your eyes and not the technology itself? Turn up your refresh rate and you no longer have that "issue".

and to avoid having to shield your eyes from the glare while you peck at the tiny buttons on the bottom of it, adjusting the screen dimensions for 20 minutes.

How many LANs are played outdoors? Are you kidding? LOL... You set up the CRT once... you shouldn't have to adjust it afterwards...

And the price point... for HD video editing I can grab an FW900 for $200... or a Dell 2407 for $600... Hmmmmmmmm, 3 monitors or 1 for the same price? :rolleyes:

I'll admit that LCD technology has gotten better, but it still isn't on par with CRTs... I remember buying an $800 ViewSonic 19" LCD and returning it that same day because the ghosting with its 16ms response time was so bad I couldn't fucking play UT2004... it was soooo bad I went back to my Sony Trinitron E200... 17"... kicked many asses with that monitor, then gave it away :(

I'll enjoy my 100fps in Source at 98Hz on a 24" CRT screen :D
 
It's still a pain to find and adjust a good CRT. And since all the good ones will be used, you don't get much life expectancy out of them. I'd love to get my hands on a good one, but I don't really want to risk eBay, especially considering I'll end up paying the cost of the monitor again in shipping.

I want to know why the heck they don't make them anymore. They are absolutely required for serious photo and graphics work, and new technologies won't fill that niche for quite some time.

That being said, I love my 2005fpw.
 
Pro hardcore elite skilled pwnage REAL gamers only use CRT monitors... and yet I have been consistently skooling all these so-called pro-pub monkeys on my LCD monitors for close to 5 years now... oh well, I guess I'll never be a real gamer... :(
 
and brighter colors
and sharper images
etc...
Don't forget pixel-perfect geometry, no EM issues (particularly when facing north/south), and my favorite, the ability to go 16:10 widescreen :cool:

While I can still see motion blurring on "2ms" panels (which I suspect are not that much better than 6ms, 12ms, or 16ms panels), and the dithering on TN panels is noticeable in a lot of cases, I just won't ever go back to CRT. In twitch games I will disable vsync and deal with the tearing for more responsiveness (particularly compared with a triple-buffering setup).
 
But if you get 60FPS constantly, that is good. You will have a consistent gameplay experience that way.

You get better image quality with vsync on than with it off.
 
Contrary to popular belief, your average police department helicopter *might* hit 140mph with the wind behind it on half a tank of fuel... most of the time they are lucky to break 100-120mph. Outrunning a police helicopter isn't impossible; it's actually pretty easy on a sport bike. However, they have 3-4 miles of visibility, so you have to outrun them for quite some time before you can disappear.
In a straight line, maybe, but since there are few, if any, places where you can go 200mph unfettered for long enough, the difficulty of outrunning a helo is very real.

Also, take it from someone familiar with a case in NY: if, for instance, a cop should wreck during the chase and die, they WILL catch you, even if it is a week or two later.
 
I'm pretty sure that in games such as Counter-Strike and Day of Defeat, having high fps is very important because you cannot send data to the server any faster than your computer updates itself. High fps is so important that server companies market servers as capable of 'x' number of fps (500 is what most people shoot for). There are also ways to set the number of updates you send to and receive from the server, as well as the amount of interpolation you use (cl_cmdrate, cl_updaterate, rate, ex_interp). I always get the commands mixed up because my computer is fast enough that I get a constant 100fps, but I believe you're supposed to set cl_cmdrate (or maybe cl_updaterate) to your average fps for the most accurate gameplay.

I've noticed that recoil control is especially affected by the framerate. When my framerate dips and I'm using a high-recoil weapon in DoD:S, the bullets continually rise even though I'm pulling down enough to account for recoil and I can see my crosshair staying steady at a certain level.

Edit: I believe the human eye can detect variation and choppiness up to around 75Hz on average, but if the motion is predictable and smoothed out, as is the case with TV/movies, roughly 30Hz is sufficient.
 
What you just said has nothing at all to do with framerate. You are talking about the tick rate of the game's internal clock. In any well-written game, the display update will be entirely independent of the game's logic/network/IO updates.
 
I misspoke. cl_cmdrate should be set to reflect the server's fps.

And if you're experiencing fps lag, there is a good chance that other things will be lagging as well. This is especially noticeable with keyboard/mouse input.

Edit: If your network/IO/logic lags, the display will reflect that (though it may not seem laggy in and of itself; i.e., other players may just skip around while your movement stays smooth). But if the display lags, the network/IO/logic will not directly reflect that, though they may be affected by whatever is causing the display to lag. (We might be agreeing but just misunderstanding each other?)
 
What you just said has nothing at all to do with framerate. You are talking about the tick rate of the game's internal clock. In any well-written game, the display update will be entirely independent of the game's logic/network/IO updates.

It might not be locked to the frame rate, but that doesn't mean the two things don't affect each other. The ticrate defines the maximum unique frame rate (the rate of unique frames, not ones that are identical): if you draw a new frame but the world hasn't updated, the positions of everything will be identical and the frame should render identically to the previous one.

If your ticrate is very low, for example around 10, then you will see definite stutter of game objects as everything moves around, whether your frame rate is 50 or 500.

The idea of tying the ticrate to the frame rate seems silly at first glance, but in singleplayer games it really makes no difference. If, for example, your ticrate is high but your frame rate is low, you will have smooth in-game mechanics but you're going to perceive them in a jerky manner.

Ideally the two should be matched in speed, so that each new frame you draw is unique and there's no power wasted rendering duplicate frames you don't need, or calculating game logic that's never going to be displayed to the user.

Multiplayer is different, and was no doubt the driving force behind switching to a set ticrate so that all clients are equal, at least in theory.
 
It might not be locked to the frame rate, but that doesn't mean the two things don't affect each other. The ticrate defines the maximum unique frame rate (the rate of unique frames, not ones that are identical): if you draw a new frame but the world hasn't updated, the positions of everything will be identical and the frame should render identically to the previous one.

...

That's not necessarily true, as what the player sees can be independent of what the server is logging. Server data generally gives an object's location, trajectory, speed, and acceleration/deceleration. That position may only update 5 times a second, yet on a client computer it could be updating 100 times a second. This has no effect on the accuracy of the object, because on the client side the path is interpolated between the updated points to maintain smoothness. Using algorithms like this increases efficiency for online play: if an object always moves uninterrupted at a constant speed, there's only a need to transmit its initial location and its final destination, and let all client computers draw the path, to avoid unneeded bandwidth usage.

This is how rockets in id games work: you click to fire, the rocket's path is calculated and logged across all client machines, and every computer knows where the rocket will be at any time, since it never changes speed. Unless a player intersects the path at the right time, the rocket travels uninterrupted and explodes just as it would have. The server/client do not spend time updating the rocket's location in real time; it's unneeded. Now, in a mod I made for Quake 3, I changed these parameters for rockets, made them update in real time, and made them entities affected by gravity, etc... Quake 3 almost had a stroke.
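A bare-bones C sketch of that dead-reckoning idea: transmit the fire event once, and every machine can evaluate the rocket's position locally at any time. Purely illustrative numbers, not id's actual netcode:

```c
/* Sketch of dead reckoning: a projectile with constant velocity only
 * needs its spawn state sent once; every client computes its position
 * from the elapsed time. Illustrative only, not id's netcode. */
#include <stdio.h>

typedef struct { double x, y, z; } vec3;

typedef struct {
    vec3   origin;      /* where it was fired from            */
    vec3   velocity;    /* constant units per second          */
    double spawn_time;  /* server timestamp of the fire event */
} projectile;

static vec3 projectile_pos(const projectile *p, double t)
{
    double dt = t - p->spawn_time;
    vec3 pos = {
        p->origin.x + p->velocity.x * dt,
        p->origin.y + p->velocity.y * dt,
        p->origin.z + p->velocity.z * dt,
    };
    return pos;
}

int main(void)
{
    projectile rocket = { {0, 0, 0}, {900, 0, 0}, 10.0 };  /* made-up values */
    for (double t = 10.0; t <= 10.5; t += 0.1) {
        vec3 p = projectile_pos(&rocket, t);
        printf("t=%.1f  x=%.0f\n", t, p.x);
    }
    return 0;
}
```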
 
Yes, Counter-Strike handles this through interpolation. The problem is that you need to know precisely when the other player changes direction and shoots, and if there's lag somewhere along the way, things can get messed up. Here's a scenario that comes up all the time in Counter-Strike and DoD when players are trying to quickly peek a corner and shoot:

Player1 is staring at a wall.
They hold down left, let off, tap right, shoot, and press right again.
Depending on the rate of updates, different things could happen:
- Laggy updates somewhere: Player2 sees Player1 keep running out and has more time to shoot. When Player1 sees himself as already back around the corner, running in the opposite direction, the lag causes Player2 to shoot and kill him anyway.

Also, Player1 could run toward the corner and never peek, but the interpolation could make him appear to peek from Player2's point of view.

I would argue that 100 updates per second is a much better number than 5, especially with the prevalence of broadband connections. Of course, if you have high latency to a server or lower bandwidth, you would want fewer, as a higher number of updates would choke your connection.

Basically, you want as many as possible without saturating your connection.
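For the curious, the interpolation being argued about here boils down to rendering other players slightly in the past and blending between the two snapshots that bracket that time. A simplified one-axis C sketch (roughly the idea behind ex_interp, not Valve's actual code; all numbers are illustrative):

```c
/* Simplified entity interpolation: the client draws other players at
 * (render_time - interp delay), blending between the two snapshots
 * that bracket that time. One axis only; not Valve's actual code. */
#include <stdio.h>

typedef struct { double time, x; } snapshot;

static double interp_x(const snapshot *a, const snapshot *b, double t)
{
    if (t <= a->time) return a->x;
    if (t >= b->time) return b->x;
    double f = (t - a->time) / (b->time - a->time);
    return a->x + f * (b->x - a->x);
}

int main(void)
{
    /* two server snapshots 50 ms apart (20 updates/sec) */
    snapshot s0 = { 1.000, 100.0 };
    snapshot s1 = { 1.050, 140.0 };
    double interp_delay = 0.1;          /* e.g. ex_interp 0.1 */

    double render_time = 1.120;         /* client clock "now" */
    double t = render_time - interp_delay;
    printf("drawing player at x=%.1f (for world time t=%.3f)\n",
           interp_x(&s0, &s1, t), t);
    return 0;
}
```

The delay is what makes the "he was already back behind the wall" cases above possible: you are always shooting at a slightly old version of the other player.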
 
I didn't read the whole thread, but in HL1- and HL2-based games the amount of data transferred depends on the fps. Also, in most games people play competitively you need at least 100fps to make certain jumps; some require 333. So while you sit there thinking you only need 60fps, I'll be playing with 300+ on my CRT @ 150Hz :) You can go along thinking you're right, but you're quite far from it.
 
I didn't read the whole thread, but in HL1- and HL2-based games the amount of data transferred depends on the fps. Also, in most games people play competitively you need at least 100fps to make certain jumps; some require 333. So while you sit there thinking you only need 60fps, I'll be playing with 300+ on my CRT @ 150Hz :) You can go along thinking you're right, but you're quite far from it.

This is an exaggeration - frames per second does not directly correlate with the amount of data being updated. It's definitely a derivative of it, but not directly connected.
 
Yes, Counter-Strike handles this through interpolation. The problem is that you need to know precisely when the other player changes direction and shoots, and if there's lag somewhere along the way, things can get messed up.

...

I would argue that 100 updates per second is a much better number than 5, especially with the prevalence of broadband connections. Of course, if you have high latency to a server or lower bandwidth, you would want fewer, as a higher number of updates would choke your connection.

It was really just an exaggeration to give an example of the differences between server/client. It's been a while since I programmed game data, but this particular bit is something I was very familiar with.
 
Vsync = input lag.

This is the biggest reason why gamers turn vsync off. Obviously vsync also reduces framerate, and if you enable triple buffering to recover frames when the fps dips below your monitor's refresh rate, you'll suffer even greater input lag.

Personally I always keep vsync off because I find the input lag too intrusive (even in single player games). I can't enjoy a game unless it's really responsive. Some games fare better than others. I enable it for old games though (HL1 era).
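For rough numbers (assuming a 60Hz refresh and nothing else in the pipeline): a frame that just misses the refresh waits up to 1/60 s ≈ 16.7ms before it's shown, and a triple-buffered queue can hold one more finished frame ahead of it, so you can be looking at an extra one to two frames (roughly 17-33ms) of input-to-screen delay on top of the render time itself. Back-of-envelope figures only, but that's the scale of lag being discussed.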
 
That's not necessarily true, as what the player sees can be independent of what the server is logging.

...

Server/client sync is more complex, indeed prone to more issues, and tends to be simplified to save on server CPU load and optimise bandwidth usage. I'm mainly talking about the client side, or singleplayer, where there's a direct relationship between the ticrate and the game's frame rate.
 
A couple of you seem confused when talking about rate, cl_updaterate, cl_cmdrate, and their relation to fps.

In Source games, your client cannot send more updates than the number of frames it can render. Period. No ifs, ands, or buts.

Set your cl_cmdrate to 100, join a 100-tick server, and run net_graph 3. Now type fps_max 30 and watch what happens to your cmdrate (i.e., your output). Yep, you guessed it: it's 30. Source games have always been this way. The client should set their cmdrate to their average fps.

That's why people with low fps in close combat in Source games are so hard to hit. They get close to you, their fps drops to 10, and so does their cmdrate. In essence they're "rating" via a crappy fps, sending the server fewer updates on their position, which makes them lag or skip across the screen.

Tick rate is indeed how often the server updates the game world, but it also depends on the rate at which clients can send updates. Are they sending at the same 100 tick? They're not, unless their cl_cmdrate is set to 100 AND the client can sustain 100 fps.

For example, let's say everyone playing on a 100-tick server gets 40 fps: you bet you're now playing, in essence, on a 40-tick server. The server can only update the world at the rate at which it receives updates, which is very much limited by the clients' fps. In this extreme example the effective tick rate becomes 40, as the server doesn't just "make up" the 60-tick difference.
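Put another way, the rate a client effectively plays at is just the smallest of its fps, its cl_cmdrate, and the server tick. A trivial C sketch of that claim (illustrative only, not Valve's networking code):

```c
/* Tiny sketch of the claim above: a client's effective update rate
 * can't exceed its framerate, its cl_cmdrate, or the server tick.
 * Illustrative only, not Valve's networking code. */
#include <stdio.h>

static int min3(int a, int b, int c)
{
    int m = a < b ? a : b;
    return m < c ? m : c;
}

int main(void)
{
    int server_tick = 100;
    int cl_cmdrate  = 100;
    int client_fps[] = { 300, 100, 40, 10 };

    for (int i = 0; i < 4; i++)
        printf("fps %3d -> effective updates/sec: %d\n",
               client_fps[i],
               min3(server_tick, cl_cmdrate, client_fps[i]));
    return 0;
}
```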

Hope that helps.
 