Another Crysis performance moan: GTS to GT

Very impressive, thanks for the confirmation. I'm waiting on my 8800GT to get here on Monday. What doesn't make sense is that it beats a GTX in that test; that doesn't sound right.

Most of the time it's only a few frames behind a GTX/Ultra in many of the games I've personally benchmarked.
 
Oh, let me also add this about the 8800GT (Superclock/KO/SSC)

They are single-slot, for one thing.
They do draw less power than GTS/GTX/Ultras.

However, there is one issue that seems to be prevalent with respect to the GTs.

If you don't change the default/auto fan speed for gaming, then expect crashes/lockups/freezes/etc.

The default/automatic fan speed for them is set at 29%.
Now, I'm not sure if this is a driver issue or a hardware bug, but the fan doesn't seem to speed up during gameplay unless you do a bit of tweaking, or you could always directly set the fan speed yourself.

I just use nTune and set the fan speed to 60% (the noise is not really noticeable with speakers on during a game), and that way I haven't experienced any freezes, lockups, or anything of that nature since.

Hope my posts have helped to shed some light on the GT for ya. :)

You have - I'll be looking to sell my GTS 640 Superclocked as soon as I find GTs in stock at a reasonable price again. Many thanks; I appreciate the testing you did.

Work/school, I know the feeling... I'm done with college in 2 weeks. Heh.
 
Try this out; lots of people, including me, have seen huge improvements visually and performance-wise. http://files.filefront.com/Cubans+CustomCrysisConfigs+12/;9133381;/fileinfo.html

taken originally from here: http://www.incrysis.com/forums/viewtopic.php?id=14544

Jesus Christ, thanks for the link!

Game looks awesome and runs pretty damn smooth with the highest settings config @ 1600x1200.

*spoilers*




I was playing through the Alien ship and HDR made that level look 10x more bad ass.
 
You have - I'll be looking to sell my GTS 640 Superclocked as soon as I find GTs in stock at a reasonable price again. Many thanks; I appreciate the testing you did.

Work/school, I know the feeling... I'm done with college in 2 weeks. Heh.

:p Yeah, I'm so slammed right now. Oh, and glad to be of service; good luck finding a card in stock, and good luck in school.

By the way, I got my card from http://www.xpcgear.com/

Seems like they have some stock still. ;)
http://www.xpcgear.com/nvidia8800.html

I personally prefer EVGA (due to the factory overclocks, warranty, overall customer service, and their step up program), but if you are really eager to get a GT, they do have some supply on hand.
 
I have to say, this thread got me worried I wasted money on the 8800 I bought a few days ago off the Egg... I also have a 165; hope I get good performance.
 
I have to say, this thread got me worried I wasted money on the 8800 I bought a few days ago off the Egg... I also have a 165; hope I get good performance.

If it's an 8800GT, GTX, or Ultra, you have NO need to be worried.

If, however, it's a 320MB or 640MB 8800GTS that we're talking about (not counting the G92-based 512MB GTS soon to be released), then you've got a problem.

A GTS, on average, loses out to a GT by about 33% in Crysis.
That IS the difference between a smooth and hiccup-free experience with HIGH settings versus a jerky game experience at mostly MEDIUM settings.

Once again, if it's a GT/GTX/Ultra flavor of the 8800 series that you have purchased, then you need not be apprehensive.
 
If it's an 8800GT, GTX, or Ultra, you have NO need to be worried.

If, however, it's a 320MB or 640MB 8800GTS that we're talking about (not counting the G92-based 512MB GTS soon to be released), then you've got a problem.

A GTS, on average, loses out to a GT by about 33% in Crysis.
That IS the difference between a smooth and hiccup-free experience with HIGH settings versus a jerky game experience at mostly MEDIUM settings.

Once again, if it's a GT/GTX/Ultra flavor of the 8800 series that you have purchased, then you need not be apprehensive.

It is a GT, but what I mean is: the OP and the rest of the people complaining have a 165 Opteron like I do, so it sounds like a 165 is a major bottleneck with an 8800GT compared to the C2D people here. I hope not! :)
 
Oh, let me also add this about the 8800GT (Superclock/KO/SSC)

They are single-slot, for one thing.
They do draw less power than GTS/GTX/Ultras.

However, there is one issue that seems to be prevalent with respect to the GTs.

If you don't change the default/auto fan speed for gaming, then expect crashes/lockups/freezes/etc.

The default/automatic fan speed for them is set at 29%.
Now, I'm not sure if this is a driver issue or a hardware bug, but the fan doesn't seem to speed up during gameplay unless you do a bit of tweaking, or you could always directly set the fan speed yourself.

I just use nTune and set the fan speed to 60% (the noise is not really noticeable with speakers on during a game), and that way I haven't experienced any freezes, lockups, or anything of that nature since.

Hope my posts have helped to shed some light on the GT for ya. :)

I HIGHLY second his statement on the fan speed.

I sat debugging this for a WEEK before someone mentioned that, and it made all the difference
 
It is a GT, but what I mean is: the OP and the rest of the people complaining have a 165 Opteron like I do, so it sounds like a 165 is a major bottleneck with an 8800GT compared to the C2D people here. I hope not! :)

Oh I see, my bad.

Well, Google is your friend, I think :p

I personally haven't seen any benchmarks using that combination, but I would be truly shocked to see it performing at around C2D/Quad levels, irrespective of how Crysis scales with CPUs.

Sorry to sound pessimistic, but I hope you can prove me wrong, for your sake :)

I HIGHLY second his statement on the fan speed.

I sat debugging this for a WEEK before someone mentioned that, and it made all the difference

I was one click short of RMA'ing mine, until I read about others having this same problem, and of course reading about the subsequent fix as well.

I still don't like the fix; it feels like a duct-tape solution, but hey, whatever works at the end of the day, ya know?
 
@ OP: just turn the shaders and shadows down! I play everything on High except shaders and shadows at Medium, at 1680x1050 on an 8800GTS 640MB.
 
@ OP: just turn the shaders and shadows down! I play everything on High except shaders and shadows at Medium, at 1680x1050 on an 8800GTS 640MB.

But the game looks almost entirely different with shaders set to Medium instead of High.

He can get away with shadows set to Medium, as that doesn't result in a very noticeable difference in graphics.
High shaders, however, make things look very noticeably better.

I personally have my post-processing set to medium (that speeds things up), as well as shadows.
Also, what I would recommend is a system.cfg file in his root directory which enables almost entirely free edgeAA for him, making the game look even better.

The command is:
r_useedgeaa=2

That clears up the spotty looking leaves/branches on trees as well.
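
In case it helps, the whole file is just a couple of lines dropped into the Crysis root directory. A minimal sketch (I also keep con_restricted=0 at the top, as in my full config later in this thread, so the game doesn't ignore the custom cvar, as far as I understand it):

con_restricted=0
r_useedgeaa=2

Save it as system.cfg and, if I remember right, it gets read automatically the next time the game starts.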
 
Jesus Christ, thanks for the link!

Game looks awesome and runs pretty damn smooth with the highest settings config @ 1600x1200.

*spoilers*




I was playing through the Alien ship and HDR made that level look 10x more bad ass.

You're welcome; just glad it helped you out. I haven't played SP yet 'cause I'm holding out for the 9800 myself.
 
Try this out; lots of people, including me, have seen huge improvements visually and performance-wise. http://files.filefront.com/Cubans+CustomCrysisConfigs+12/;9133381;/fileinfo.html

taken originally from here: http://www.incrysis.com/forums/viewtopic.php?id=14544

Other than popping up an annoying overlay in the upper right, those don't do anything except make it look piss-poor and perform just as badly, in my experience (and yes, I followed the directions). Level 4: loaded game to main menu, set all to Low, apply, exit, reload game...

Loads of bloom doesn't make something look good, and 14 fps isn't what I'd call fast on an 8800GT/4GB/5000+.

And no, it didn't look like "low" either, so I know it worked. What in the world is that supposed to do?
 
It is a GT, but what I mean is. The OP and th rest of the people complaining have a 165 Opteron like I do.. so it sounds like a 165 is a major bottleneck with a 8800GT compared to the C2D people here. I hope not! :)

Please let me and the rest concerned know. I'm interested to see what kind of performance you get, since we have similar setups.
 
Other than popping up an annoying overlay in the upper right, those don't do anything except make it look piss-poor and perform just as badly, in my experience (and yes, I followed the directions). Level 4: loaded game to main menu, set all to Low, apply, exit, reload game...

Loads of bloom doesn't make something look good, and 14 fps isn't what I'd call fast on an 8800GT/4GB/5000+.

And no, it didn't look like "low" either, so I know it worked. What in the world is that supposed to do?

Runs just fine for me, so I don't know what to tell you. The bloom can be a bit much at times, but when it looks good, it looks DAMN good. And you can get rid of the display in the upper-right-hand corner by typing r_displayinfo 0 in the console.
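
If you'd rather not type that every session, I believe the same thing can go into the system.cfg tweak mentioned earlier in the thread, written as a cvar assignment (I haven't tested it that way myself):

r_displayinfo=0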
 
Other than popping up an annoying overlay in the upper right, those don't do anything except make it look piss-poor and perform just as badly, in my experience (and yes, I followed the directions). Level 4: loaded game to main menu, set all to Low, apply, exit, reload game...

Loads of bloom doesn't make something look good, and 14 fps isn't what I'd call fast on an 8800GT/4GB/5000+.

And no, it didn't look like "low" either, so I know it worked. What in the world is that supposed to do?

I agree for the most part, but it "felt" like it helped me out. I'm using the level 3 config and it looks like High settings. The FPS dips down to 17 sometimes, but throughout the game it's been kinda playable, and I'm enjoying the game very much, so I haven't screwed around with the settings since. I'm gonna try removing the custom config file and setting everything to High except shadows, post-processing, and maybe shaders to Medium.
 
I agree for the most part, but it "felt" like it helped me out. I'm using the level 3 config and it looks like High settings. The FPS dips down to 17 sometimes, but throughout the game it's been kinda playable, and I'm enjoying the game very much, so I haven't screwed around with the settings since. I'm gonna try removing the custom config file and setting everything to High except shadows, post-processing, and maybe shaders to Medium.

Looks like total poo on my system compared to High. Everything is blurry, edges look nasty and broken, shadows are goofy, and it still runs piss-poor. I'll admit the water is better, though.

Meh. The game isn't worth it. I'll try it again in a year when they fix the engine, and better cards are out. I'm tired of 1152x864 on an 8800GT, especially on a big monitor.
 
Please let me and the rest concerned know. I'm interested to see what kind of performance you get, since we have similar setups.

Ditto this. If this game only runs well on Intel boxes, I'm gonna be pissed.
 
Other than popping up an annoying overlay in the upper right, those don't do anything except make it look piss-poor and perform just as badly, in my experience (and yes, I followed the directions). Level 4: loaded game to main menu, set all to Low, apply, exit, reload game...

Loads of bloom doesn't make something look good, and 14 fps isn't what I'd call fast on an 8800GT/4GB/5000+.

And no, it didn't look like "low" either, so I know it worked. What in the world is that supposed to do?

I'm just the messenger; many people are thrilled with these tweaks to the cfg files. Sucks that it didn't work out for you. What's your system like?
 
I'm just the messenger; many people are thrilled with these tweaks to the cfg files. Sucks that it didn't work out for you. What's your system like?

eVGA 8800GT Superclocked
Brisbane 5000+ X2
Biostar TForce 570 SLI board
Audigy 2
4GB G.Skill (2x2GB)
 
Damn. This thread makes me want to eBay my 8800 GTS 320 and get an 8800 GT if it's only going to be about a $10-50 difference, judging by closed auctions.

Should I, or is my aging CPU too much of a bottleneck to bother?
 
Crysis is a 9800GTX game, period. I'm not even buying it until there are graphics cards out that can run it with shaders and post-processing on Very High with 4xAA.
 
Well, I didn't upgrade for just Crysis. I upgraded because it would cost me nothing except shipping charges through EVGA's Step-Up program, and the Step-Up window was closing in about 20 days. Plus, I wanted a card with more RAM than my GTS 320MB, since I run at 1680x1050. And I'm sticking with this card at least till the 9800 comes out. Plus, this card runs 11C cooler than my GTS did, so that's always welcome.
 
eVGA 8800GT Superclocked
Brisbane 5000+ X2
Biostar TForce 570 SLI board
Audigy 2
4GB G.Skill (2x2GB)

Don't take this personally, but it's definitely user error with that system. There are other cfg tweaks around the web, and a few I've seen posted on this forum; try those out.
 
Crysis is a 9800GTX game, period. I'm not even buying it until there are graphics cards out that can run it with shaders and post-processing on Very High with 4xAA.

For Very High, perhaps, and even then only time will tell.

However, there is absolutely no reason why anyone can't enjoy smooth framerates, 16xAF, 2xEdgeAA, and all settings on HIGH, right now!

An 8800GT/GTX/Ultra with plenty of RAM and a fast CPU will afford you that opportunity.

Also, you can edit the system.cfg file to include some of those "Very High" effects without taking a noticeable performance hit.

My setup is a testament to that fact.
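
To give one concrete example (just a sketch; the full file I actually run is posted further down the thread), the sunshaft effect from the Very High spec can be switched on with a couple of cvar lines in system.cfg:

con_restricted=0
r_sunshafts=1

On my rig that one is basically free, performance-wise.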
 
Don't take this personally, but it's definitely user error with that system. There are other cfg tweaks around the web, and a few I've seen posted on this forum; try those out.

Everything else clips along just like it should: Gears of War, UT3, BioShock, Hellgate, 9300 3DMarks, etc. There's nothing else out there I can't run like I want to. Crysis is the only thing that doesn't run like it should.

It's not worth futzing with tweaks. It'll either run or it won't, and since it won't right now (mostly outside; anything inside is fine, but outside is 15 fps, and there are enough major outside sections to matter), I'll wait for the next gen of cards and play it right. There are plenty of other, better games out there right now that I can spend time with.
 
Agreed.

To me it's like a car that is insanely fast and powerful, but uses up a 20 gallon tank of gas in 5 mins. What good is it?

It's annoying that people are praising Crytek here. IMO they did a piss-poor job coding this engine to work with current hardware. I remember that whenever id released a new game based on their new engine, if you had a top-of-the-line PC, the game ran @ playable levels with max settings. Games are still being developed using their engines; for example, COD4 uses a heavily modified Quake 3 engine.

COD4 @ 1920x1200 4xAA, max settings, runs smooth as butter on my system.

Here is the key: At those settings, it looks BETTER THAN CRYSIS @ A MUCH LOWER RES WITH MEDIUM-HIGH SETTINGS, AND RUNS MUCH FASTER... and is far more than playable.

So what's the explanation here? "Oh, there is a lot more going on behind the scenes in Crysis." What a load of BS. If I can't see it, WTF does it matter to me?

"Oh, it's for the future!" Uh huh. THEN RELEASE IT IN THE FUTURE.

Anyone played/remember when EQ2 came out? @ its highest settings, it looked and still looks incredible. But at the time of its release, it couldn't run anywhere near those settings on bleeding-edge hardware. And SOE was always saying, "We built this game with the future in mind." Huh, some good it did them. WoW's 9 million subscribers, anyone?

And can someone tell me what games used the Far Cry engine? I haven't heard of any, but I could be wrong. Did they even license it?

As soon as id Tech 5 is released for licensing, IMO no one is gonna care one bit about Crysis's engine, developer-wise. Not because id Tech 5 will look better @ max settings (it probably won't), but because it will run on current hardware, and it will make coding and releasing games on multiple platforms a godsend for them.

Anyhow, 2+ cents from me again :/

Please stop using that 'this game will scale well in the future' marketing bullshit.

They released the game now. We are playing the game now.

If it was meant for a system next year or two years from now, then why did they release it now?

Why not keep working on the game to make it even better?

Every time I hear that PR hype, I can smell the bullshit wafting through the monitor screen.

I bet next year at this time (when Crysis is playable) there will be a few more games that look just as good, if not better than Crysis.

Then what will people say? It looked great at release even though no one could play it at those 'uber' settings? It was the first to look that good? So what, if less than 1% of gamers can play it at those settings?
 
For Very High, perhaps, and even then only time will tell.

However, there is absolutely no reason why anyone can't enjoy smooth framerates, 16xAF, 2xEdgeAA, and all settings on HIGH, right now!

An 8800GT/GTX/Ultra with plenty of RAM and a fast CPU will afford you that opportunity.

Also, you can edit the system.cfg file to include some of those "Very High" effects without taking a noticeable performance hit.

My setup is a testament to that fact.

I still call shens, at least with an AMD setup. :p Especially since the config tweak at the beginning really does look pretty piss-poor in comparison to the true settings. I bet you can't run it smoothly at the settings you claim without the config tweak.

I don't call 15 fps playable, and that's the best I can get outside with everything on High on an 8800GT, without the tweak, and I've even started overclocking the thing.
 
Agreed.

To me it's like a car that is insanely fast and powerful, but uses up a 20 gallon tank of gas in 5 mins. What good is it?

It's annoying that people are praising Crytek here. IMO they did a piss-poor job coding this engine to work with current hardware. I remember that whenever id released a new game based on their new engine, if you had a top-of-the-line PC, the game ran @ playable levels with max settings. Games are still being developed using their engines; for example, COD4 uses a heavily modified Quake 3 engine.

COD4 @ 1920x1200 4xAA, max settings, runs smooth as butter on my system.

Here is the key: At those settings, it looks BETTER THAN CRYSIS @ A MUCH LOWER RES WITH MEDIUM-HIGH SETTINGS, AND RUNS MUCH FASTER... and is far more than playable.

So what's the explanation here? "Oh, there is a lot more going on behind the scenes in Crysis." What a load of BS. If I can't see it, WTF does it matter to me?

"Oh, it's for the future!" Uh huh. THEN RELEASE IT IN THE FUTURE.

Anyone played/remember when EQ2 came out? @ its highest settings, it looked and still looks incredible. But at the time of its release, it couldn't run anywhere near those settings on bleeding-edge hardware. And SOE was always saying, "We built this game with the future in mind." Huh, some good it did them. WoW's 9 million subscribers, anyone?

And can someone tell me what games used the Far Cry engine? I haven't heard of any, but I could be wrong. Did they even license it?

As soon as id Tech 5 is released for licensing, IMO no one is gonna care one bit about Crysis's engine, developer-wise. Not because id Tech 5 will look better @ max settings (it probably won't), but because it will run on current hardware, and it will make coding and releasing games on multiple platforms a godsend for them.

Anyhow, 2+ cents from me again :/

COD4 is not based on the Quake/Doom engine anymore. Infinity Ward made a custom engine this time, entirely proprietary. I agree otherwise, though ;)
 
OK, so I got my GT today. According to the GPU stress-test utility, I get around 33 FPS average. In-game I get 25-45 FPS, it seems. I'm happy :cool:
Oh, forgot to mention settings:
1440x900
Everything on High except post-processing (Medium)
 
Correct me if I'm wrong, but given the engine and the development put into this game, wouldn't a sequel released by the same company require significantly less time, money, and effort? I can see why it sucks that you can't play the game now with max settings, but by the time a sequel comes out, wouldn't they still be in the sweet spot, or no?
 
I still call shens, at least with an AMD setup. :p Especially since the config tweak at the beginning really does look pretty piss-poor in comparison to the true settings. I bet you can't run it smoothly at the settings you claim without the config tweak.

I don't call 15 fps playable, and that's the best I can get outside with everything on High on an 8800GT, without the tweak, and I've even started overclocking the thing.

You must be doing something wrong, very wrong then.
Either that, or it's a combination of your CPU/GPU/RAM/OS/drivers.

I get TWICE the performance you do.

Once again, for reference:

(benchmark chart: http://www.asharinet.com/Pics/crysisspd_g_02.gif)

http://www.tweaktown.com/articles/1211/2

There are numerous sites which can corroborate what I have said.

With a sufficiently high-clocked C2D/Quad Core, 4GB of RAM, and either an 8800GT (Superclocked, KO, SSC, and the like help), 8800GTX, or Ultra, you will see performance as indicated above.

That is on HIGH settings by the way.

Oh, and it helps to run it in DX9 mode, and I also happen to run the 64-bit .exe as well.
Not to mention the fact that I can run 16xAF and still maintain a 30+ FPS average.
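
For reference, the way I launch it (going from memory on the exact switch, so double-check it): point a shortcut at the 64-bit executable in the Bin64 folder and tack on the DX9 flag, something like

"...\Crysis\Bin64\Crysis.exe" -dx9

with the ... replaced by wherever you installed the game.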

For what it's worth, here are the contents of my custom system.cfg file (short and simple, so no, no major tweaking there to get the type of smooth performance I have been alluding to):

con_restricted=0
r_useedgeaa=2
r_sunshafts=1
e_detail_materials_view_dist_xy=4096

That enables sunshafts (an otherwise DX10-only feature), enables edgeAA for virtually free (it makes many objects look better and also removes the spottiness from trees), and also increases the view distance for detail textures on objects.
 
OK, so I got my GT today. According to the GPU stress-test utility, I get around 33 FPS average. In-game I get 25-45 FPS, it seems. I'm happy :cool:
Oh, forgot to mention settings:
1440x900
Everything on High except post-processing (Medium)

Exactly.
30+ FPS average is very attainable in this game, and that's with a mere $250 card (in the form of an 8800GT).

There is no excuse for anyone to be getting poor performance in this game, unless of course they can't get their hands on an 8800GT/GTX/Ultra. The rest of their rig has to be up to par as well, both hardware- and software-wise (i.e., a clean install of drivers, a properly defragged drive, no spyware, not too many apps running in the background, etc.).
 
You must be doing something wrong, very wrong then.
Either that, or it's a combination of your CPU/GPU/RAM/OS/drivers.

I get TWICE the performance you do.

Once again, for reference:

(benchmark chart: http://www.asharinet.com/Pics/crysisspd_g_02.gif)
http://www.tweaktown.com/articles/1211/2

There are numerous sites which can corroborate what I have said.

With a sufficiently high-clocked C2D/Quad Core, 4GB of RAM, and either an 8800GT (Superclocked, KO, SSC, and the like help), 8800GTX, or Ultra, you will see performance as indicated above.

That is on HIGH settings by the way.

Oh, and it helps to run it in DX9 mode, and I also happen to run the 64-bit .exe as well.
Not to mention the fact that I can run 16xAF and still maintain a 30+ FPS average.

For what it's worth, here are the contents of my custom system.cfg file (short and simple, so no, no major tweaking there to get the type of smooth performance I have been alluding to):

con_restricted=0
r_useedgeaa=2
r_sunshafts=1
e_detail_materials_view_dist_xy=4096

That enables sunshafts (an otherwise DX10-only feature), enables edgeAA for virtually free (it makes many objects look better and also removes the spottiness from trees), and also increases the view distance for detail textures on objects.

Go look at the system they're using to get those numbers.

Yeah, if you have a Core 2 Quad, significantly overclocked, with a bunch of significantly overclocked RAM and video cards that most likely aren't stock either, sure.

Again, you won't get that at stock speeds on our hardware, especially with an AMD setup. Note that everyone having performance issues is on some kind of AMD CPU, either an Opteron or the new Brisbane chips, and most of us are at stock speeds or lightly overclocked.

It's a brand new, fresh install of Vista Home Premium, with the latest drivers for everything, and everything else runs like a scalded cat. Crysis is the ONLY thing which is slow as shit, so I'm pretty damned sure it's safe to say it's the game and not the hardware this time.

Find ONE of those sites using a dual core AMD, or everything at stock speeds, and then we'll compare numbers. :) Either processors are finally making a difference, or Crytek just doesn't know shit about programming for AMD folk.

Edit: I'm also not using some canned demo that they provide. I'm telling you what I see playing the game in outside environments. Inside, sure, I get 40+ FPS easily. But most of the game isn't inside!
 
Exactly.
30+ FPS average is very attainable in this game, and that's with a mere $250 card (in the form of an 8800GT).

There is no excuse for anyone to be getting poor performance in this game, unless of course they can't get their hands on an 8800GT/GTX/Ultra. The rest of their rig has to be up to par as well, both hardware- and software-wise (i.e., a clean install of drivers, a properly defragged drive, no spyware, not too many apps running in the background, etc.).

He's running it at a medium res, honestly; significantly lower than you were, IIRC.

And average means there are PLENTY of places where you're getting less than 30 fps, and for some of us, that's not acceptable.

And WTF is it with Crysis not supporting any of the middle-of-the-road 4:3 resolutions? You get 1600x1200, and then 1152x864. WTF happened to 1280x960?
 
Go look at the system they're using to get those numbers.

Yeah, if you have a Core 2 Quad, significantly overclocked, with a bunch of significantly overclocked RAM and video cards that most likely aren't stock either, sure.

Again, you won't get that at stock speeds on our hardware, especially with an AMD setup. Note that everyone having performance issues is on some kind of AMD CPU, either an Opteron or the new Brisbane chips, and most of us are at stock speeds or lightly overclocked.

Your previous post almost made it sound like you were calling into question MY results specifically, but thanks for clearing that up :)

And yes, my system is actually very similar to the one they were using in their benchmarks over at TweakTown. Just read my sig, and then you'll come to realize why I can get a 30+ FPS average much like they do, and enjoy a smooth ride from beginning to end. :)
 