Holy crap my Crysis framerate!!

I have high settings and 8x AA at 1280x1024 (my 20.1-inch Samsung monitor's native), and the game ran beautifully!!

I beat the demo and I have to say... I can't wait for 11/12 when I can play more! I already started over on the hardest setting.

No graphical glitches at all, and averaging between 35 and 55fps (verified through FRAPS). I am very happy with this game so far. I do hope that the code is a little more optimized at release, which I believe it will be.

Rig in Sig. I am using the BETA Nvidia drivers just released for Crysis.
 
You F'n kidding, right? One of the best video cards you can buy and I'm averaging 15 FPS at 1920x1200, Very High settings, no AA. Give me a break... I bumped everything down to High and now I'm getting around 20-25 FPS.

Aside from that, this is a badass game... graphics are sick...

Yes, it's one of the best you can buy, but it's also been out for almost a full year now. Just like last-gen cards couldn't run everything at high resolutions with maxed settings and AA a year ago, these cards can't handle the newest games flawlessly either.

This whole exchange right here pretty much sums up why I stopped dumping money into high-end PC builds.

And for laughs, tomorrow evening I'm going to benchmark it on my rig, starting at 640x480 @ low. Then I'm going to laugh bitterly, go right back to playing my 360, and file all hope of playing Crysis under "sometime in the next two years I'll build a brand-new computer".
 
I'm running the game at 1280x1024 with a combination of high/medium settings and 2xAA and I'm averaging about 25fps with the rig in my sig.

That said, I'm glad I was able to hold out on any kind of upgrade. Crysis was the one game I was worried about, and I wanted to be sure that my next rig would be able to play it in all its glory, so until we get some GPUs that can handle the game at its highest settings, I'll be content with the 25fps.
 
I'm running it on an AMD 3800+ (single core) CPU, a 7900 GTX GPU, 2GB of RAM, WinXP, and the new 169.01 drivers.

At 1024x768 with everything set to medium (no AA) it runs pretty well, by which I mean it's running smoothly, with no hitching or stuttering. I haven't looked at fps because I prefer to go by feel.

At those settings it doesn't look much better than Far Cry, and the textures are a bit blurrier, although the world is certainly a little "denser".

I seem to recall pretty much the same discussion when Far Cry was released, namely "OK on decent current-gen hardware, but didn't stretch its wings until the next gen came out" :p
 
Liked the game a lot so far; it just didn't run so well. 25ish fps on medium @ 1680x1050. I just hope one of these new GPUs comes soon.
 
I'm getting 22 - 37 FPS with all settings at medium except water detail (water in this game is BEAUTIFUL) at 1280x1024 with no AA. Perfectly playable with the built-in motion blur. My system is as follows:

AMD Opteron 165 (oc'ed to 2.7GHz)
ASRock Dual939-SATA2
2GB OCZ Platinum PC3200
PNY 7900gt (1.45V modded, oc'ed at 636/717, Z-Tweaked_v158.19 drivers)
WinXP Pro 32bit
 
I tried it just now with everything on high; 1680x1050 w/ 8xQ AA is smooth as butter.

1920x1200 w/ 16x AA = painful death.

Using the new beta drivers in XP.

Will try with the new drivers in Vista tomorrow and see if DX10 is sexier.
 
Hey, I get 7.69 FPS average on the GPU benchmark with my system at 1680x1050, everything on HIGH.
 
My performance in Crysis ain't too hot in DX10/Vista either. 1680x1050, Very High, no AA, I get terrible fps. I feel bad for the people who wanted to run it higher.
 
High, AA 4x, 16x10: I'm getting around 25-35fps with a Q6600 @ 3.7GHz, 2GB DDR2, GTS 640, 169.01 drivers, Vista 32. But it seems very fluid.
I'm using FRAPS. How do I run the GPU bench? (=noob)
 
Just to add to my previous post...

While I think it's good they're pushing the envelope here, I find it a little stupid that even a one-month-old laptop, maxed out, can't reasonably play this game at MINIMUM settings. Sure, it's an unoptimized demo, but still, I think we should be seeing SOMEWHAT better performance on the low end.

I agree with you, but considering how this game is a completely different experience between minimum and high settings, you're cheating yourself by playing it at low, IMO...

I certainly (just me, perhaps) will wait until I've done an upgrade to play this game. High is a slideshow right now, but it looks so completely different that I've decided to wait...

But, yes, all those videos we've seen of Crysis in action, I'd really like to know exactly what hardware they were running those on... Because right now I'm not even expecting Nvidia's NEXT-gen hardware to run Crysis that well...
 
^what he said^

Can we wait for the game and official reviews before we cut this game's balls off?

The game's supposedly due to be released in just three weeks...

It's my expectation that the demo is the gold version; otherwise, how could they have the game out by the release date? Printing media, boxing it up, shipping... takes time...
 
The demo ran fine for me. 1600x1200 everything on high besides post-processing and I was getting in the 20-30 fps range with FRAPS. I'm sure I could tweak some of the settings but even at 25 or 30 fps, I felt the game was playable and didn't detract from my overall gameplay experience. It wasn't butter-smooth but I had no trouble running around wasting baddies.
 
I agree with you, but considering how this game is a completely different experience between minimum and high settings, you're cheating yourself by playing it at low, IMO...

I certainly (just me, perhaps) will wait until I've done an upgrade to play this game. High is a slideshow right now, but it looks so completely different that I've decided to wait...

But, yes, all those videos we've seen of Crysis in action, I'd really like to know exactly what hardware they were running those on... Because right now I'm not even expecting Nvidia's NEXT-gen hardware to run Crysis that well...

Exactly. I want to experience it just like all the hype material that has been released. There is a HUGE quality difference between medium and high, let alone low, and an equally huge difference between high and very high. Also, everyone so far has listed no AA/AF; what a joke. Until I can get FPS that never drops under 40 and averages around 60, at 16x10, VERY HIGH, 8x MSAA/16x AF, I will not be buying the game. Let's face it: nothing about the game, neither gameplay nor story, is unique; only its visuals are. And seeing how radical the G92 architecture is, with its 56 TMUs, I don't think I'll be waiting long. ;) And of course, all of you who want to play it now, full of jaggies, not maxed out quality-wise, and at sub-30 fps, have that right.
 
What bothers me isn't that the GTX struggles. I can accept that because I've been gaming for a long time and it isn't the first time a high-end card has fallen under the sword of a next gen game.

What bothers me is the way the industry has misrepresented the functionality of DirectX 10 and Vista's suitability as a gaming platform. I think if you look at the total mix of information from statements by Microsoft, Crytek, and Nvidia about DX10 and Crysis in particular prior to the release of this demo, you would reasonably think that DX10 and the 8800 series marked the beginning of some kind of significant improvement in PC gaming.

In interviews with Crytek developers where they discuss the demonstration videos, I have seen them tout DX10 and Nvidia's hardware. The demo videos show outrageously good-looking scenes with almost zero "pop-in," at least 4x anti-aliasing, full shadows, and frame rates of at least 40fps.

The bottom line is that I think all those involved with publishing and promoting this game have either made misleading statements or omissions (by touting 8800 hardware while showing videos of the game using settings far beyond the 8800's capabilities) to push the Vista operating system and DX10 hardware.

The reality has been slower performance in Vista than in XP almost across the board and only a marginal image quality improvement. In Crysis, the image quality improvement from DX10 is offset by poor performance of the 8800 series to such an extent that I am guessing some people with Vista and DX10 cards will want to dual boot XP and run in DX9 mode to gain much needed frames.

So while the developers all this time have been plugging multicore CPUs (Intel?), Nvidia products, and Vista/DX10, they must have known the whole time that to get an acceptable frame rate on existing hardware the game would have to be scaled down to the point where it looks more like Far Cry than the demo videos they were showing. Yet they have been bullshitting us for months about how well it "scales."

I think it is obvious that Crytek, Intel, Nvidia, and Microsoft have some sort of arrangement to use Crysis as a mutual marketing vehicle. Yet, the Crysis demo only shows that all of their products are insufficient to produce something substantially better than Far Cry.

In other words, yes, Crysis will eventually look like the demo videos, but only through a brute-force improvement in GPU processing power. It obviously has very little to do with DX10 and Vista.

After all that, what we get is the same ugly trees and foliage we saw in Oblivion a year ago. That is a raw deal for consumers, even by PC gamer standards.

Other than this game, is there any other reason to even consider a graphics card faster than the GTX, adding 2 more gigs of memory, going to 64-bit, getting Vista, and buying a quad-core processor? Name one other game.

In conclusion, Crysis is the son of Satan, its mother is a jackal, and it is trying to establish its counterfeit kingdom here on earth using the power of Microsoft, Intel, and Nvidia.

Ok I'm done.


QFT

sticky
 
What an ass whuppin' from hell this demo was on my rig last night.

An AGP-based 7800 GS Superclock, folks. A Pentium 4 3.0GHz HT from Intel that's about four years old now. 2 gigs of PC3200 RAM.

All this on a 24-inch NEC 2490WUXi monitor.

Uh yeah, not even close to native rez.

I had to dumb this thing down to 1280x720 with medium settings to get it to run halfway decently. AA? Ha, the thing won't even let me turn on AA.

Yes, I've been planning to upgrade soon, but I'm waiting to see what comes out in November from Intel and from ATI and Nvidia before I finish it.

All I know is I have to do something before long! Whew!
 
Bottom line: never, ever buy current hardware and expect it to run next-gen games at playable frame rates with visuals even close to a next-gen game's hype material, no matter what developers tell you. Their job is to sell you the best hardware NOW, not in the future.
 
Installing the demo right now... man, this thing is going to rape my 7900 GTO SLI setup. Damn my 24" monitor!!
 
My gaming rig can handle most settings on high at 1280x1024. When intense combat swings into motion, however, then things get ugly. :(
 
I really hope I'm not the only one that laughed under my breath at that statement.

:D :p

I didn't laugh, but considering the expectations he mentioned, I'm not sure you can even do that with Far Cry yet...

I only have 7800GTX XXX 512MB in SLI, not exactly an 8800, true, but I don't think my gear will do what he mentioned in Far Cry...

My point is he's going to be waiting possibly a few generations of hardware... Just when everything's going to be looking like Crysis, anyway... So, it's no longer a demonstration of new technology, it's just a game, just like others around it...

Putting all of this together, with Crysis not being playable at settings we've been shown, I think it's going to hurt sales... Without the eye candy, it's just another game, so no rush to get it... If the new candy won't play on what we have now, again, no rush to get it...

I'll buy it just to support the devs, but not everyone's going to do that for them...

(I can still remember Far Cry being almost unplayable on my 9800pro, until I got an x800xt, or whatever it was... Which wasn't exactly all that much better, anyway...)
 
My gaming rig can handle most settings on high at 1280x1024. When intense combat swings into motion, however, then things get ugly. :(

The game experience isn't going to last too long if the framerate's so low that you can't aim at your target...

My framerate's so low that I turned to a "spray and pray" approach to fighting some baddies...
 
Is there any kind of built-in benchmark we can use?

I played the demo through at 1680x1050, no AA, medium settings, and it looked and ran great, so now I'd like to test it and see what my actual FPS was. I'm going to use FRAPS, but if there is a built-in test I'd like to try it.
 
There's a benchmark batch file in the bin32/bin64 folders of the game.

There's one GPU batch file and one CPU batch file. The test loops four times and outputs results to a DOS window.
 
from http://forum.beyond3d.com/showpost.php?p=1082458&postcount=314

benchmark_cpu.bat
Code:

@echo Running CPU benchmark 1
@echo Results will depend on current system settings
@pause
@echo Running...
@crysis.exe -DEVMODE +map island +exec benchmark_cpu
@type "..\Game\Levels\island\benchmark_cpu.log"
@pause

benchmark_gpu.bat
Code:

@echo Running GPU benchmark 1
@echo Results will depend on current system settings
@pause
@echo Running...
@crysis.exe -DEVMODE +map island +exec benchmark_gpu
@type "..\Game\Levels\island\benchmark_gpu.log"
@pause
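
If you want to force a specific resolution and quality level for the benchmark instead of whatever the menu last saved, you should be able to pass console variables on the command line the same way +map and +exec are passed above. A minimal sketch, assuming the stock CryEngine cvar names (r_Width, r_Height, sys_spec) are accepted from the launcher; check the in-game console if they don't take:

Code:

@echo Running GPU benchmark at a forced resolution and spec level
@echo NOTE: r_Width, r_Height, and sys_spec are assumed cvar names
@pause
@echo Running...
@rem sys_spec 3 = the High quality preset; 1280x1024 keeps the run modest
@crysis.exe -DEVMODE +r_Width 1280 +r_Height 1024 +sys_spec 3 +map island +exec benchmark_gpu
@type "..\Game\Levels\island\benchmark_gpu.log"
@pause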
 
I ran the benchmarks and got an average of 30 FPS with all settings on high, no AA, at 1920x1200 under XP 32-bit. In Vista, the thing won't even run: I start the demo and get a black screen. I'll update my findings when I get it running with very high settings.
 
I'm glad I'm not the only one having issues with Vista + 169.01

I can't get the game to run in 64-bit mode while in full screen with that combo.

I can get it to run in 32-bit mode with no issues. 64-bit + 169.01 is 3-4 fps faster, but that's only comparing windowed modes between 163.75 and 169.01.

Running the very high spec, I hit peaks in the 30-33 range, but the average is around 22fps, which isn't playable in my opinion. To get a good playable fps, I had to drop from 19x12 to 16x10 and cut the shaders and shadows to medium. The game still looks great, but it peaks in the 50s now and holds a 30fps average much better.
 
There's a benchmark batch file in the bin32/bin64 folders of the game.

There's one GPU batch file and one CPU batch file. The test loops four times and outputs results to a DOS window.

Found it, but I had to install it myself. Running some tests now.

Did anyone else have the CPU test crash on them? So far the GPU test works with no problems.
 
I had both tests crash in 32-bit mode for some reason, but it was random: sometimes I could get it running 4 times in a row (16 loops), and one time it would just crash out after the 2nd or 3rd loop.

In 64-bit mode it was more stable, but 169.01 had an annoying bug for me that would not let it run fullscreen in 64-bit mode. 163.75 and 163.71 worked fine in 64-bit full screen, yet were 2-3 fps slower.
 
I got 25-40 fps (mostly around 33fps) on my rig at 1200x700, all medium, with high settings for sound. It's okay if you ask me, but I'm definitely waiting for the next series of cards to come out to play this game.
 
:( So I have to downgrade my monitor and upgrade my video cards? Guess I'll just skip this title, then.

I bet my westy it will have a better framerate upon release.

You only get to play a game for the first time once. Sure, we can turn the settings down, take shadows and AA out, maybe lower the texture res. But what's the point? To me, this is the equivalent of downloading some theater vidcap of a great movie and watching it in mono sound.

I waited a few years for FEAR and got it for $12 at Target the other day. It plays great at 1920x1080 with an 8800GTS. I'll be looking for Crysis at the same place and price two years from now.
 
Time for lawsuits......

Oh, for cryin' out loud... on what grounds? :rolleyes:


Bottom line: never, ever buy current hardware and expect it to run next-gen games at playable frame rates with visuals even close to a next-gen game's hype material, no matter what developers tell you. Their job is to sell you the best hardware NOW, not in the future.

We all knew this game would be pushing the envelope and then some. We knew it was going to be a resource hog.

I think a lot of this comes down to how well a game is coded, among other things.

Even on my humble rig, I can run other games and demos (including COD4 and the new UT) with far more generous settings than what I had to use here. I ran those at 1920x1200 with OK settings and was just fine. I played Half-Life 2: Episode Two at that native rez with OK settings just fine as well.

But not this thing! So, there's got to be a message in there somewhere.

And of course, finally: it's just a demo. Usually the final product ends up being better, and I suspect we'll see some things ironed out.

I wouldn't hang my hat completely on some demo for anything, ever.
 
What bothers me isn't that the GTX struggles. I can accept that because I've been gaming for a long time and it isn't the first time a high-end card has fallen under the sword of a next gen game.

What bothers me is the way the industry has misrepresented the functionality of DirectX 10 and Vista's suitability as a gaming platform. I think if you look at the total mix of information from statements by Microsoft, Crytek, and Nvidia about DX10 and Crysis in particular prior to the release of this demo, you would reasonably think that DX10 and the 8800 series marked the beginning of some kind of significant improvement in PC gaming.

In interviews with Crytek developers where they discuss the demonstration videos, I have seen them tout DX10 and Nvidia's hardware. The demo videos show outrageously good-looking scenes with almost zero "pop-in," at least 4x anti-aliasing, full shadows, and frame rates of at least 40fps.

The bottom line is that I think all those involved with publishing and promoting this game have either made misleading statements or omissions (by touting 8800 hardware while showing videos of the game using settings far beyond the 8800's capabilities) to push the Vista operating system and DX10 hardware.

The reality has been slower performance in Vista than in XP almost across the board and only a marginal image quality improvement. In Crysis, the image quality improvement from DX10 is offset by poor performance of the 8800 series to such an extent that I am guessing some people with Vista and DX10 cards will want to dual boot XP and run in DX9 mode to gain much needed frames.

So while the developers all this time have been plugging multicore CPUs (Intel?), Nvidia products, and Vista/DX10, they must have known the whole time that to get an acceptable frame rate on existing hardware the game would have to be scaled down to the point where it looks more like Far Cry than the demo videos they were showing. Yet they have been bullshitting us for months about how well it "scales."

I think it is obvious that Crytek, Intel, Nvidia, and Microsoft have some sort of arrangement to use Crysis as a mutual marketing vehicle. Yet, the Crysis demo only shows that all of their products are insufficient to produce something substantially better than Far Cry.

In other words, yes, Crysis will eventually look like the demo videos, but only through a brute-force improvement in GPU processing power. It obviously has very little to do with DX10 and Vista.

After all that, what we get is the same ugly trees and foliage we saw in Oblivion a year ago. That is a raw deal for consumers, even by PC gamer standards.

Other than this game, is there any other reason to even consider a graphics card faster than the GTX, adding 2 more gigs of memory, going to 64-bit, getting Vista, and buying a quad-core processor? Name one other game.

In conclusion, Crysis is the son of Satan, its mother is a jackal, and it is trying to establish its counterfeit kingdom here on earth using the power of Microsoft, Intel, and Nvidia.

Ok I'm done.

I'd give it a little while. Last I heard from Cevat Yerli, he was having a *lot* of trouble with Vista and the drivers especially. DX10 may end up working a lot better once he gets everything playing together nicely, but it may not happen until a patch or several driver releases. I agree that it's pretty lame that it's taken so much time and trouble to get the thing working, but it may be too soon to totally write it off.
 
Blackstone said:
What bothers me is the way the industry has misrepresented the functionality of DirectX 10 and Vista's suitability as a gaming platform. I think if you look at the total mix of information from statements by Microsoft, Crytek, and Nvidia about DX10 and Crysis in particular prior to the release of this demo, you would reasonably think that DX10 and the 8800 series marked the beginning of some kind of significant improvement in PC gaming.

Exactly right.

I'm starting to see certain outlets all but say outright not to bother with DX10 or Vista (still) right now.

Maybe this is the kind of thing they were thinking of.

Honestly, on a high-end rig, how much more does DX10 really give you vs. staying with XP and maxing out DX9 as much as you can, even on this thing?
 
I didn't laugh, but considering the expectations he mentioned, I'm not sure you can even do that with Far Cry yet...

I only have 7800GTX XXX 512MB in SLI, not exactly an 8800, true, but I don't think my gear will do what he mentioned in Far Cry...

My point is he's going to be waiting possibly a few generations of hardware... Just when everything's going to be looking like Crysis, anyway... So, it's no longer a demonstration of new technology, it's just a game, just like others around it...

Putting all of this together, with Crysis not being playable at settings we've been shown, I think it's going to hurt sales... Without the eye candy, it's just another game, so no rush to get it... If the new candy won't play on what we have now, again, no rush to get it...

I'll buy it just to support the devs, but not everyone's going to do that for them...

(I can still remember Far Cry being almost unplayable on my 9800pro, until I got an x800xt, or whatever it was... Which wasn't exactly all that much better, anyway...)

No. Your statement that my requirements are somehow too high, and that even Far Cry would not be playable at them, is nonsense. I think that's because you're a bit out of the loop, still using a 7800 GTX setup, which, even in SLI, is dated and, well, slow. In fact, my requirements as far as Far Cry is concerned aren't even close to "pushing it" on 8800 hardware; I'll just let the numbers speak for themselves. And your statement about needing to wait a few generations of hardware to satisfy my requirements is equally amusing, perhaps even more so given your obvious lack of knowledge about Far Cry's performance. As I hinted, based on the G92 architecture, I would not be surprised if a single-card solution able to play Crysis at my requirements is out soon.

http://www.xbitlabs.com/articles/video/display/gf8800-games_5.html
 
Just installed it... here are my thoughts...

Took forever to decompress and install...

The 64-bit game and GPU benchmark do not work AT ALL for me; I don't know if I'm the only one. I had to run the 32-bit versions. When I went into the system settings and clicked "optimal" settings, the game chose Very High (haha, yeah right).

On the benches, I got between 18.21 and 21.63 fps at 1920x1200 with 2xAA on the "High" setting (Vista64), and between 22.36 and 23.19 fps (XP).

At 1920x1200 with no AA on "High" I get between 25 and 26 fps (Vista64) and between 26.15 and 26.99 fps (XP).

I am running:

E6850 @ 3.67GHz
4GB G.Skill
EVGA 8800 Ultra
Vista Ultimate x64
169.01 drivers

To me the game STILL seems really unoptimized. I would have hoped a rig like mine would at least be able to run at native res with some eye candy @ 30 fps. Dropping to medium may get me to that level, but honestly, I'm a little put off that I have to do that.

One last note: the motion blur is used a little excessively, IMHO.
 