Crysis performance with 5870 Crossfire.....:(

Rogue71

[H]ard|Gawd
Joined: Aug 25, 2007
Messages: 1,311
I should say the LACK of performance. I checked my GPUs using Afterburner and they seem to be only at about 68-70% while playing Crysis 64-bit (Vista SP2 compatibility mode). I have to set my details at medium and 2 on low. I'm still only getting 34-42 FPS, mostly hanging around the 34-36 range. It's rather hard on the eyes. I am running it at 5760x1200... so I'm not expecting miracles. Is this just because the game needs Nvidia to work right, or is it just a lack of support on ATI's part? I really don't care about the game, but I wanted to use it as a means of testing out the CrossFire abilities. Is Warhead the same?
 
Warhead should run a bit better, as it's better optimized (i.e., they turned down the details).

At any rate, I think it's possible that your 5870's are simply running out of video memory.
 
You really need an Intel CPU to push CrossFire. Also, 1GB is not enough VRAM at 5760 in Crysis.
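For a sense of scale, here's the back-of-envelope math on why 1GB gets tight at 5760x1200. This is a hedged sketch: the number of render targets and the MSAA factor below are made-up assumptions, since the engine's real allocations aren't public.

```python
# Back-of-envelope VRAM arithmetic for a 5760x1200 framebuffer.
# The target count and MSAA factor are illustrative assumptions,
# not measurements of what Crysis actually allocates.

def buffer_mb(width, height, bytes_per_pixel=4):
    """Size of one 32-bit render target in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

w, h = 5760, 1200
one = buffer_mb(w, h)            # ~26.4 MiB per full-screen buffer
targets = 6                      # assumed number of full-screen targets
msaa = 4                         # assumed AA multiplier on those targets
print(f"one buffer: {one:.1f} MiB")
print(f"render targets alone: {one * targets * msaa:.0f} MiB")
```

Textures and geometry come on top of that, so spilling past 1GB at this resolution is plausible.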
 

I think you're asking a little much from those cards. They're no slouches, but at that resolution I wouldn't expect you to get 60+ FPS (even with some settings on medium and low).
 
I agree with what the others have said. You are asking too much of those cards. 1GB of VRAM is just not enough for that high a resolution.

Additionally,

Crysis has been shown not to make use of CrossFire very well (SLI is pretty bad too).

Normally two 5770s in CF will be near a single 5870 in performance, but you can see in Warhead that two 5770s in CF are only equal to a single 5850.

For the 5870, the difference between single and CF is a measly 11 FPS. CF gains in regular Crysis will be even worse, since it is less well optimized than Crysis Warhead.

[attached image: Crysis Warhead benchmark chart]
 
No, the problem is the game and its horrible optimization for CrossFire and SLI. It's pretty bad when a single 8800GT @ 860/1836/1050 outperforms two 8800GTs in SLI @ 650/1650/1050 (yes, I know the overclock is lower, but the second card did not have the 1.1v BIOS mod on it) in that game. In Crysis Warhead/Wars it's a completely different story, but even there SLI and CrossFire are still horribly optimized, just much better than in Crysis.

In the long run it's a lack of support on both sides. Nvidia optimized it as much as they possibly could, and ATI just didn't give a crap about the game anymore after Crytek threw it to the curb once Warhead was released. So in the end you can mostly blame Crytek, since it should have been their responsibility to make sure the engine worked correctly in SLI and CrossFire instead of half-ass fixing it and dumping the rest of the problem on Nvidia and ATI.

Also, that graph's a little misleading with the 11 FPS difference, since the drop-off isn't as drastic with the 5870 when adding AA as it is with the Nvidia cards. So at low settings CrossFire doesn't scale for shit in Warhead, but the more AA you add, the more you start to see the difference.
 
Crysis does scale well with CF...
No idea why some people claim otherwise.
Your CPU is bottlenecking it; my i7 at 4.2GHz barely maxes it out on a 5970.
Changing your CPU to an i7 will help a lot.

The only issue with multi-GPU in this game is that it stutters on the harbor map.

And the FPS gain for me over a single 5870 is more than 60% on average...
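That 60% average gain claim is simple arithmetic; the FPS numbers below are hypothetical, just to show the formula:

```python
# CrossFire benefit expressed as percentage gain over a single card.
def scaling_gain(single_fps, cf_fps):
    """Percent FPS gain of the CF setup over one card."""
    return round((cf_fps / single_fps - 1) * 100, 1)

# Hypothetical numbers: 25 FPS on one 5870, 40 FPS with CF enabled.
print(scaling_gain(25, 40))   # 60.0
```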
 
Thanks for the replies. I was also reading up on Crysis and the fact that it doesn't use either ATI cards or CF very well at all compared to Nvidia cards. I do realize that it doesn't matter what you throw at Crysis, it's not going to return high FPS... I was just wondering if there was some trick out there I hadn't heard of yet for this game. I noticed a big negative difference going from the GTX 275 to the 5870 in Crysis and didn't notice much of an improvement with CrossFire, so I figured it was more the game. That's OK, there are plenty of other games to be playing. And no, I have no plans on switching to Intel, thanks.
 
More MHz... why the 955 is only at 3.4GHz on an H50 is beyond me.

Volts to 1.45,
multi to 19 or 19.5,
then up the volts till it's stable or hits 80 degrees. 25 minutes' work tops, and you'll gain a shitload of CPU headroom.

As for that resolution, 1GB of VRAM just ain't enough, it's that simple.
 

It's your only chance to make your 5870 CF sing its song...

Otherwise you will be limited to 60% of its power...

I didn't know I was bottlenecked in Crysis until I switched from a Q6600 @ 3.6 to a Core i7 @ 4.3.

I thought it was a driver issue that GPU usage was always low... but it was really just the CPU that couldn't push it.
 
It's true. TechGuru or Guru3d? Something like that has an article about matching up an appropriate processor with your video card. It shows pretty well that overclocking a Core i7 920 doesn't improve FPS much with a single 5870, indicating it's more GPU-limited. With dual 5870s in CrossFire, though, overclocking the i7 920 gives a significant FPS increase up to 4.2GHz, suggesting more of a CPU bottleneck than a GPU bottleneck.

Really, the 6xxx series might require a Gulftown processor at the rate video cards are improving.
 
Considering there is just about a 15% usage rate on my i7 at 4.0 in Warhead, it's somewhat weird that so much MHz is necessary.
 


Don't know... I haven't seen a difference between my X2 6400+ @ 3.2GHz and my X4 940 @ 3.4GHz; frame rates all look the same on my GTX 260 216-core. But there is a difference when you add CrossFire or SLI to the mix; it just needs more CPU power to push both cards. It's generally thought that 3.6GHz is the safe zone for multi-card setups with the Phenom II, 45nm C2Qs, and i7s, and the numbers have backed that up.
 

Well, not really; I wouldn't pin that all on Crytek. The problem with Crysis's scaling is likely due to how good it looks. Crytek did some crazy things with shaders because that was the only practical way at the time to do effects like ambient occlusion. The tricks they used ended up making CF/SLI scaling a problem. The poor scaling in the game is due to inherent flaws in CF/SLI, not the engine.
 

15% is about the same as a single core maxed out. Remember, your i7 has HT, which makes it appear to Windows as having 8 cores. So even 50% CPU usage in Windows could very well mean that your CPU is effectively maxed out.
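The arithmetic behind that point, as a quick sketch (8 logical cores assumed for a hyper-threaded quad-core i7):

```python
# With Hyper-Threading, Windows reports overall usage across all
# logical cores, so one saturated core on an 8-thread i7 shows up
# as only ~12.5% total. Illustrative helper, not a profiler.

def cores_maxed(total_usage_pct, logical_cores=8):
    """Estimate how many logical cores a given overall usage equals."""
    return total_usage_pct / (100 / logical_cores)

print(cores_maxed(15))   # ~1.2 logical cores busy
print(cores_maxed(50))   # 4 logical cores: if the game only scales to
                         # 4 threads, that can already be the ceiling
```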
 

Because you have HT on, and Crysis takes two FULL cores for it...
 
Benches done with a Core i7 at 4095MHz, RAID 0 striped set.


It's Crysis. Although the ATI setups are faster with CrossFire than SLI (from my testing), there is a huge bottleneck at any resolution, be it the CPU or PCI Express 2.0 @ 16x.

With resolutions above 1920x1080 the game becomes a real monster to run, which we all know. Anyhow.

What's kinda messed up with my test is that at 1280x720 the trifire 5970+5850 is equal to a single 5850.

Here's a test I did a while ago with an ATI 5970+5850 in trifire, 1920x1080 vs 1280x720.

Crysis compared at 1080 vs 720: 1080 is a little faster.





So I ran the 5850 at the same settings, and a single card does not seem to have the same bottleneck with the CPU and PCI Express 2.0 @ 16x.
 

Beyond you? Then obviously you don't know that not all 955 chips OC the same. That being said, I had it OC'd to 3.7 on my old board, but it was a bit hot for my liking so I backed it down to 3.6. With my new board I picked up DDR3-1600, which is overclocked to begin with. The Crosshair III OCs better with the FSB, which means my RAM will be running slower or faster depending on how I configure it. I can OC the CPU to 3.8 in the auto-OC part of my BIOS, but it downclocks the RAM a lot (although I didn't run tests on it). I cannot simply up the voltage to get more MHz on this board; it's the first thing I tried. If I leave my RAM at 1600, 1.9v as it's supposed to be, I can't get anything stable past 3.4 @ 1.4v... which to me is too high to begin with for 3.4. I did incremental testing for all the multipliers up to 1.4625v. I don't wanna fry anything.

In the games that matter (DiRT 2, NFS: Shift, etc., and a bunch of older games I play) I notice a big FPS jump with my setup, so I'm not gonna screw with it too much if it's just for Crysis, as the gains probably won't be worth the hassle from what I've been reading.

I think I'm gonna try OC'ing with the FSB and see if I can at least get it to 3.6 or 3.7. I'll just have to downclock my memory.
 

My issue was the fact that both my 5870s were only at about 60% in Afterburner. So you're saying you assume my CPU would be at 100%, which is why it can't feed the cards anymore? I'll have to run another program to monitor that then.
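One way to check this without trusting a single overall number is to compare per-core CPU load against GPU utilization. The heuristic and the sample readings below are made up for illustration; a real run would pull per-core numbers from Task Manager or a monitoring tool such as psutil.

```python
# Heuristic for spotting a single-thread CPU bottleneck from monitoring
# data. The sample readings below are made-up, not measurements.

def bottleneck_hint(per_core_pct, gpu_pct, core_hot=90.0, gpu_low=75.0):
    """True if one core is pegged while the GPU sits underused."""
    return max(per_core_pct) >= core_hot and gpu_pct <= gpu_low

# A quad-core with the game's main thread maxing one core, while
# CrossFire utilization hovers around 60%: likely CPU-bound.
print(bottleneck_hint([97, 35, 22, 18], gpu_pct=60))   # True
# Evenly loaded CPU with the GPU near 100%: GPU-bound instead.
print(bottleneck_hint([55, 50, 48, 52], gpu_pct=99))   # False
```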
 

It could also be that Afterburner is simply wrong. That 60% is an approximation. It could be, for example, that there is a bottleneck on the card, hence the 60% utilization: 60% overall could mean that some part is working at 100% and limiting the performance of the rest of the card. I don't know how Afterburner measures GPU usage, but there are parts of the 5xxx series that go completely unused by Crysis, and I don't know if that affects the usage estimate (for example, the tessellation units).
 

Well, I could see it if the CPU was bottlenecked and couldn't feed the cards any more than 60%. I suppose Afterburner could be wrong, but in other games it reads 99% usage (DiRT 2), and it does register when the clocks bump down when they're not in 3D mode, or when one card isn't used, so I'm more inclined to think it's just the CPU vs. the game. Plus it's an ATI card, and Crysis was made for Nvidia cards with their enhancements. I know the CrossFire/SLI scaling sucks in that game though, which is the dumbest thing ever. It's one of the most power-hungry games around, and it would have made much more sense to make it lean on the GPU rather than the CPU, but oh well.
 


1. The max recommended voltage for the Phenom II is 1.55v.
2. Anything below 63C is safe on temps, and if you're using an H50 and getting anywhere near that temp then you have other problems.
3. Your voltages are all wrong.
4. The auto-overclocker is overclocking the FSB (HT), not the RAM; that's why it's lowering the RAM speed.

The average voltage up to 3.4GHz is between 1.4-1.44v; 3.6GHz is 1.44-1.475v; 3.8GHz is 1.475-1.5v; and if you have a decent processor, 4GHz up to 1.55v. That should give you an idea of where your voltages need to be. As long as you stay under 1.55v you are fine.

As far as the GPUs sitting at 60% while running Crysis, it sounds about right. When I had two 8800GTs in SLI, my GPU usage only hit 31% on both cards; memory usage was only at 205MB on each card.
 

DiRT 2 is also DX11 and thus could simply be using more of the fixed-function parts of the card than Crysis does (like tessellation). Also, the idea that a game is "made for" an Nvidia card or an ATI card is laughable at best. It isn't like there are sets of duplicate functions in DirectX, with one set working better on Nvidia and one on ATI. Are there some things that Nvidia cards do better than comparable ATI cards and vice versa? Sure, but if a game uses those things it's because it needs them. DirectX doesn't expose vendor-specific extensions the way OpenGL does, so I don't know what "enhancements" you are referring to.
 


Any game that uses PhysX heavily will run better on an Nvidia card.
 
40 FPS @ 5760x1200? You're doing just fine. It's Crysis at an insane rez.
There are no single Nvidia cards that can do this; even the upcoming Fermi card can't.

I'm sure if you use one monitor your FPS will be way above 60.
 

While I appreciate your input and will definitely look at some of those voltages more closely as I'm now testing for a higher OC, it is quite common knowledge that a lot of AMD 955 CPUs crap out in the mid to late 50s. Technically 62C is the max safe temp for the 955.

This has nothing to do with my H50 not being able to keep the CPU cool.

I've had two motherboards using the same 955BE chip: the M4N82 Deluxe and now the Crosshair III Formula (different RAM, obviously). Neither could get to a stable 3.7, although I was close on the M4N82. 3.6 seems to be the wall for this chip, although I didn't want to push it past 1.475v as it's just way too hot for the potential gain.

I've been playing with my settings again last night and today and was able to reach a stable 3.6 by just using an FSB of 225 and 1.4v on the CPU without upping the multi. The other option I found was a multi of 17 and an FSB of 212, which gave me about the same results with slightly lower RAM speeds. It would be nice if I could make a stable 3.7 without building a space heater, but I really don't want to kill my hardware by upping the voltage too high.
 

Yeah, I just set up another CCC profile to use with older games that uses my center screen @ 1920x1200, so I'll do some testing with Crysis later. I was just hoping to add to the list of games I can play in Eyefinity... and was curious what other people's experiences were using ATI cards with this game.

Hey, if I blink twice as fast I don't notice the lag.. lol
 
Crysis doesn't use PhysX at all, and only games that explicitly support PhysX GPU hardware acceleration (a whopping 16 games: http://www.nzone.com/object/nzone_physxgames_home.html) will see any benefit from an Nvidia card.

I know that. It was just the easiest example of the "enhancement" possibility you were asking about.

Saying that all games present an equal playing field just because they use DirectX is just not true. That would only be true if all games used only DirectX and nothing else to improve graphics and/or gameplay.
 

Well, actually, in the case of hardware-accelerated PhysX the Nvidia cards would get a lower FPS, as the GPU is now taking on work offloaded from the CPU ;)
 