Left 4 Dead FPS?

[attached screenshots: left4dead2009-02-1902-28-30-29.jpg, left4dead2009-02-1904-23-49-02.jpg]
 
^^^ now that's [H]ard. 2560x1600 with everything maxed... oh, and getting 236fps. :D
 
The game is easy to max; the graphics are weak by today's standards. I get over 200 FPS on GTX 285 SLI @ 1920x1200, 16xAA/16xAF. It occasionally dips to ~150 for hordes. A 260 should spit this thing out easily.

I know Valve tries to make it playable for a large audience, but I just wish they had higher-res texture packs for high-end systems that could be downloaded from Steam or something.
 
PC in sig. 1920x1080, max settings, 4x MSAA and 16x AF, and the lowest I saw was 65 FPS, but I'd say the average is easily around 100 FPS or better. I don't know how some people are having trouble with fire and explosions, because I can fire off molotovs and my framerate doesn't seem to really change at all. In fact, it stayed above 120 FPS during the explosions at the particular spot I was testing.
 
With the newest NVIDIA driver installed and using SLI it sits between 200 and 250.
 
Well, my current 8800 GT 512 MB OC'd with a stock Q6600, which will do me till June when I'm doing an overhaul on the whole system, sits at 40+ at 1920x1080 if that gives you any idea (completely maxed settings, 16xQ CSAA and 16xAF). And most humans can't really see a difference once you get past 30-40 FPS anyway.
 
Well, my current 8800 GT 512 MB OC'd with a stock Q6600, which will do me till June when I'm doing an overhaul on the whole system, sits at 40+ at 1920x1080 if that gives you any idea (completely maxed settings, 16xQ CSAA and 16xAF).
Overclock your CPU, it'll help a ton in any Source game.

And most humans can't really see a difference once you get past 30-40 FPS anyway.
It's a misconception: humans can perceive upwards of 80 FPS (more correctly, the difference between 70 and 80 FPS) depending on the conditions (light, color, etc.). The "humans can't see past 30 FPS" myth for some reason keeps being propagated on enthusiast forums.
 
And most humans can't really see a difference once you get past 30-40 FPS anyway.

Here, let me help you into your flame suit. :D

Clearly you have no idea that it has nothing to do with what we can "see", but more with what we can feel. Playing an FPS doesn't involve just looking at the screen, it involves actively interacting with it. The difference between playing an FPS at 40 FPS and at 80 FPS is night and day.

Please, let this horrible "myth" just die...
 
Overclock your CPU, it'll help a ton in any Source game.

It's a misconception: humans can perceive upwards of 80 FPS (more correctly, the difference between 70 and 80 FPS) depending on the conditions (light, color, etc.). The "humans can't see past 30 FPS" myth for some reason keeps being propagated on enthusiast forums.

Source games are more CPU than GPU dependent?
 
Source games are more CPU than GPU dependent?
At this stage, yes. Graphically, the Source engine doesn't stress video cards much anymore (it's over four years old). However, there are still many cases where it will stress the living hell out of your CPU, such as 40+ player CS:S matches, 25+ player TF2 matches, rendering tons of zombies in L4D, doing some insane physics mash-ups in any game, etc. Overclocking your CPU raises your minimum framerate in these situations and others, and makes quite a visible difference in the fluidity of gameplay.
 
Source games are more CPU than GPU dependent?
Source is weird. A wimpy CPU can handle Source games, but at the same time the engine responds well to even faster CPUs. Of course, if you don't have a decent GPU to begin with, then that needs to be the first step.
 
Not sure what my exact numbers are, but it feels like a pretty much constant 60+ FPS at 1920x1200.

Specs in sig
 
lol... no kidding, why burn resources with that program when you can easily use the built-in commands?
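
For reference, something like this typed in the developer console (or dropped into autoexec.cfg) covers it; I think these cvars work in most Source games, but double-check them on your install:

cl_showfps 1    // simple FPS counter (cl_showfps 2 gives a smoothed average)
net_graph 1     // FPS plus latency and network info
fps_max 0       // removes the engine's own framerate cap while you're testing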
 
I'm sorry, I was mistaken, 72 FPS is the maximum... Flaming is not mature, criccio, and it is what you're doing, not what I was doing... I can't OC my proc yet because I need a new mobo, as mine's proprietary. June 16th I'm getting a new X58 mobo, i7 920, 6 GB of 1600 DDR3, a new vid card (DX11 hopefully), maybe a new VelociRaptor drive, and maybe Win 7, so I can wait till then. Really it's irrelevant, because with my 40+ FPS I still pwn hardcore at L4D.
http://www.daniele.ch/school/30vs60/30vs60_3.html
 
I'm sorry, I was mistaken, 72 FPS is the maximum... Flaming is not mature, criccio, and it is what you're doing, not what I was doing... I can't OC my proc yet because I need a new mobo, as mine's proprietary. June 16th I'm getting a new X58 mobo, i7 920, 6 GB of 1600 DDR3, a new vid card (DX11 hopefully), maybe a new VelociRaptor drive, and maybe Win 7, so I can wait till then. Really it's irrelevant, because with my 40+ FPS I still pwn hardcore at L4D.
http://www.daniele.ch/school/30vs60/30vs60_3.html
Well-written article, but unfortunately the author came to the wrong conclusions, or dumbed it down for the general reader so much that it's incorrect. The fact of the matter is that our vision is incredibly adaptive and its quality is conditional. The article covers the general nature of the eye, but what it doesn't cover is how adaptive it is: lighting conditions, rhodopsin characteristics, time of day, even how much sleep the viewer has had, and much more all contribute to how much and how well you can process visual information, in addition to personal differences between two people. Technically, you could quantitatively characterize visual capability in set situations, but I haven't taken nearly enough physical biochemistry or neurology to do that.

Anyway, it seems that the article was written to debunk the "30 FPS" myth (10 years ago!), and I think it's a valid point that monitors should be set at 60-72 Hz, since most people have no problem seeing 60-72 FPS and noticing differences around that range. However, it should also be noted that this is in no way a constant, and that the human eye is capable of much more.
 
I said it above, but I will say it again. Game FPS has absolutely nothing to do with what we can "see". Playing a game at 30-40 FPS is entirely different than playing at 80-90 FPS, especially a shooter. It completely transforms the game.
 
My laptop @ 1080p probably averaged 20-40 FPS in L4D with no action, and that's with mid-grade settings. When I upgraded to my Q9550 w/ GTX 260, my minimum with massive hordes is probably 40-50, but it usually sits anywhere from 90-110 with mild action. Interestingly, a friend of mine plays perfectly fine at 30-40 FPS, so I can only imagine what he'd be like with a better card. Personally, I noticed a difference back in the days of CS when I upgraded from my crap card to my Gainward GF3, though some of it was probably due to getting cable vs. dial-up too, lol.
 
The newer drivers (182.06) gave me a nice performance boost. Now I play at 1920x1200, 8xMSAA, 8xAF, no vsync, and everything else maxed without dropping below 60 FPS. They also fixed an annoying shadow-flickering bug :)
 
I said it above, but I will say it again. Game FPS has absolutely nothing to do with what we can "see". Playing a game at 30-40 FPS is entirely different than playing at 80-90 FPS, especially a shooter. It completely transforms the game.
Yet some humans actually like believing that their eyes are lame and can't see faster than a regular TV frame rate. N00bs. I hate that g'damn myth.
 
My aging vid card is the special-edition G80 8800 GTS 640 SSC with 112 SPs like the 8800 GT (see sig), and it handles L4D better than I thought it would @ 1920x1200 with 4xMSAA/4xAF and everything else maxed.

I'm typically getting ~56-68 FPS, which can dip to ~42ish when the hordes come. And it's much smoother with vsync on + triple buffering than it is with it disabled.

Frankly, I'm a bit surprised my fps is as low as it is considering how smooth the game seems to run for me.
 
alright, hovers at around 60 FPS solid the entire time

... I think I'm CPU limited.
 
alright, hovers at around 60 FPS solid the entire time

... I think I'm CPU limited.
At what settings? Are you sure you don't have vsync on? Your CPU isn't bad, and a 4850 is only about 20% slower than my card. I hit over 120-130 FPS easily at times, even with stuff going on, and that's at 1920x1080, max settings, 4x MSAA and 16x AF.
 
Wow, so many flamers lately. Grow up, man, calling less informed people noobs is immature. To be honest, just because your hardware's better doesn't mean you won't get pwned by someone running minimum settings on a P4...
 
When I do the console showfps command it flickers between green and yellow. I know it's yellow when it's under 60 FPS.

I have Vsync turned off as well, but everything else is maxed out, with 16x AF and 4xMSAA. Maybe the Catalyst drivers forced Vsync on somewhere...
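
Next thing I'll try is ruling out vsync from the game side in the console; I think these are the right cvars for Source games, though I'm not 100% sure they all apply to L4D:

mat_vsync 0     // turns vsync off in the engine itself
fps_max 0       // removes the engine's own framerate cap
cl_showfps 2    // smoothed FPS readout, less flickery than the default counter

If it still pins near 60 after that, then the Catalyst control panel is probably forcing vsync and I'll have to set it back to application-controlled there.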
 
1920x1080
All settings low except model and shaders on medium.
20-30 fps nearly the whole time.

Athlon 64 3500+
6800gt

Will be upgrading as soon as I decide whether to stick with a Q6600/Q9550 or spend the little bit extra for an i7. Oh, and for another paycheck or two. Can't wait!!
 
1920x1080
All settings low except model and shaders on medium.
20-30 fps nearly the whole time.

Athlon 64 3500+
6800gt

Will be upgrading as soon as I decide whether to stick with a Q6600/Q9550 or spend the little bit extra for an i7. Oh, and for another paycheck or two. Can't wait!!
At this point you might as well go i7 for a new build.
 
On my side/temp PC (sig rig down atm): Athlon X2 2.4 GHz, 2 GB DDR2-800, HD 3870, 1920x1080.
40-50 FPS with nothing going on / in a small area, 20 FPS when all fucking hell breaks loose.
Everything maxed.
 
GTX 280 SLI - NVIDIA 180.48 drivers (I need to update, I know)
E8600 @ 4.4 GHz
1920x1200
8xMSAA, 16xAF, everything at max settings

Played the game on Advanced difficulty on the last mission of Blood Harvest, and only two were able to make it out alive. I got caught by the Smoker just as the rescue vehicle pulled in.

The FPS was monitored with Fraps, and the benchmark was only started after the beginning of the mission and stopped before the game ended. This was done to eliminate any irregular FPS dips and spikes due to level loads.

Max: 284
Min: 75 (it rarely fell below 100 FPS - it's all the hordes' fault)
Avg: 168
 