F.E.A.R. SLI seems broken

Eva_Unit_0

I'm currently running the 81.85 drivers, and SLI seems to be *somewhat* functional in FEAR, but not entirely. By default it uses SFR, which for the most part seems to work, but whenever certain kinds of effects are in the scene (water, for example) the load-balancing bar just sits at the bottom of the screen and the load balancing obviously isn't working correctly. And it's ironic, because it goes haywire right at the times when you'd think it would benefit from SLI the most. Has anyone else seen this behavior with FEAR, or found a way to work around it?

On a side note, the first time I ran it after installing the 1.01 patch it used AFR mode for some reason (which appeared to work better), but now it always runs in SFR mode again, even if I create a custom profile for FEAR and manually select AFR mode with Coolbits.

Anyone have experience with SLI (6800U's or otherwise) in FEAR?
 
No one with SLI plays FEAR besides me? Come on, someone else with SLI and FEAR, go enable your load-balancing bars and walk into a room with water... see if it balances correctly for you. Everyone out there with SLI should care whether it's being used properly in FEAR or not ;)
 
Have you tried nvnews's forums to see if anyone is posting about your problem? Sometimes it's best to search several forums for an answer.
 
I had this problem on my rig too.

Simple fix: there is no profile for FEAR (or it doesn't work, or something) and the auto-select function doesn't work either. You can either create your own FEAR profile or modify the global driver profile. The key is that you have to set the SLI GPU rendering mode to Alternate Frame Rendering.

Which I found odd, but in my testing that's all that worked. In the beta test demo I had to use SFR, if I remember right.
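
If you'd rather poke at the profile file directly instead of using Coolbits, here's roughly the idea. This is only a sketch from memory: the file location, element names, and the AFR value are assumptions and may not match your driver version, so compare against an existing entry in your own nvapps.xml before trusting any of it (and back the file up first).

Code:
# Rough sketch: add a FEAR profile to the ForceWare application profile file.
# ASSUMPTIONS (from memory, not verified): the file lives at C:\Windows\nvapps.xml,
# profiles use PROFILE/APPLICATION/PROPERTY elements, and the SLI mode property
# is called "multichip_rendering_mode" with "AFR" as a valid value.
import xml.etree.ElementTree as ET

NVAPPS = r"C:\Windows\nvapps.xml"

tree = ET.parse(NVAPPS)
root = tree.getroot()

profile = ET.SubElement(root, "PROFILE", Label="FEAR")
ET.SubElement(profile, "APPLICATION", Label="fear.exe")
ET.SubElement(profile, "PROPERTY", Label="multichip_rendering_mode", Value="AFR")

tree.write(NVAPPS)  # overwrite only after backing up the original file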
 
Sir-Fragalot said:
I had this problem on my rig too.

Simple fix: there is no profile for FEAR (or it doesn't work, or something) and the auto-select function doesn't work either. You can either create your own FEAR profile or modify the global driver profile. The key is that you have to set the SLI GPU rendering mode to Alternate Frame Rendering.

Which I found odd, but in my testing that's all that worked. In the beta test demo I had to use SFR, if I remember right.
I get better fps when it's set to auto: 60 max, but it dips down to the low 20s, sometimes as low as 15. I thought I would get more consistent fps with two GTXs in SLI. Sir-Fragalot, what settings do you run F.E.A.R. at? My settings are 4xAA, 4xAF, everything maxed @ 1600x1200, no soft shadows, vsync on. What am I doing wrong? This guy is getting better fps with two GTs: http://www.hardforum.com/showthread.php?t=968244
:confused: :confused:
 
Sir-Fragalot said:
I had this problem on my rig too.

Simple fix: there is no profile for FEAR (or it doesn't work, or something) and the auto-select function doesn't work either. You can either create your own FEAR profile or modify the global driver profile. The key is that you have to set the SLI GPU rendering mode to Alternate Frame Rendering.

Which I found odd, but in my testing that's all that worked. In the beta test demo I had to use SFR, if I remember right.

Alright, I'll give that a shot. And yeah, the SFR/AFR difference is what really confused me. The demo worked nicely with SFR but the game is kinda flaky... and like I said, the very first time I ran the game after applying the patch it ran in AFR, but since then it's been using SFR. Weird. I'll get back in a few with how the custom profile works... I'll try it again, at least.

Edit: Alright, I tried making a custom profile and specifying AFR (both regular and alternate) and FEAR still uses SFR. Any idea why?

Edit again: I tried forcing AFR in the global driver settings, and FEAR still uses SFR. wtf?
 
Eva_Unit_0 said:
Alright, I'll give that a shot. And yeah, the SFR/AFR difference is what really confused me. The demo worked nicely with SFR but the game is kinda flaky... and like I said, the very first time I ran the game after applying the patch it ran in AFR, but since then it's been using SFR. Weird. I'll get back in a few with how the custom profile works... I'll try it again, at least.

Edit: Alright, I tried making a custom profile and specifying AFR (both regular and alternate) and FEAR still uses SFR. Any idea why?

Edit again: I tried forcing AFR in the global driver settings, and FEAR still uses SFR. wtf?

I'll look at mine again and see what the deal is, and what it's actually doing.

lee63 said:
I get better fps when it's set to auto: 60 max, but it dips down to the low 20s, sometimes as low as 15. I thought I would get more consistent fps with two GTXs in SLI. Sir-Fragalot, what settings do you run F.E.A.R. at? My settings are 4xAA, 4xAF, everything maxed @ 1600x1200, no soft shadows, vsync on. What am I doing wrong? This guy is getting better fps with two GTs: http://www.hardforum.com/showthread.php?t=968244
:confused: :confused:

I run it at 1280x1024, no AA and no AF. I had V-Sync set to on while playing games; for benchmark purposes I turned it off for the FEAR performance database thread. At any rate, I haven't tried messing with my settings enough, and I haven't messed with SLI profiles again since I found that setting AFR worked the best for me. I need to experiment a lot more with FEAR settings and SLI settings.
 
Could it have anything to do with my drivers? I'm using the 81.85's, but I got them a while before they were officially posted on the NVIDIA site... I got them a few weeks ago when they first surfaced on Guru3D. The version numbers are identical, so I don't think they would be any different... but who knows.
 
Eva_Unit_0 said:
Could it have anything to do with my drivers? I'm using the 81.85's, but I got them a while before they were officially posted on the NVIDIA site... I got them a few weeks ago when they first surfaced on Guru3D. The version numbers are identical, so I don't think they would be any different... but who knows.

I am using the official WHQL drivers (81.85) and I still have some issues with FEAR.
 
Take off VSync and watch your FPS fly... VSync is hard enough on frame rates in most games, even with good hardware. And FEAR (pulling a Far Cry, like back when that came out) is crushing frame rates and hardware.
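
If you want a feel for why VSync hurts so much, here's a simplified toy model of my own (assuming plain double buffering at a 60Hz refresh; triple buffering behaves differently):

Code:
# Toy model: with vsync and plain double buffering, a frame that misses a
# refresh has to wait for the next one, so the displayed rate snaps down to
# refresh/1, refresh/2, refresh/3, ...
import math

def vsynced_fps(raw_fps, refresh_hz=60):
    # number of refresh intervals each frame ends up occupying
    intervals = math.ceil(refresh_hz / raw_fps)
    return refresh_hz / intervals

for raw in (75, 59, 45, 22):
    print(raw, "->", vsynced_fps(raw))
# 75 -> 60.0, 59 -> 30.0, 45 -> 30.0, 22 -> 20.0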

-Proxy
 
Proxy said:
Take off VSync and watch your FPS fly... VSync is hard enough on frame rates in most games, even with good hardware. And FEAR (pulling a Far Cry, like back when that came out) is crushing frame rates and hardware.

-Proxy

I realize this; however, in tests at the settings I'm running, I get 40+ FPS 100% of the time. So I can afford to leave V-Sync on with my system, as long as AA and AF aren't enabled and soft shadows are off. The game ran fine and looked great.
 
Proxy said:
Take off VSync and watch your FPS fly... VSync is hard enough on frame rates in most games, even with good hardware. And FEAR (pulling a Far Cry, like back when that came out) is crushing frame rates and hardware.

-Proxy
Turned v-sync off and fps went up to 75 high and 20s low. Can't stand the tearing though; as long as I get 40 to 60 fps I'm cool, but when it dips down to 20 fps or lower it sucks. I figured I would be OK with two GTXs for the new batch of games coming out.
:mad: WTF????? :mad:
 
Sir-Fragalot said:
I realize this; however, in tests at the settings I'm running, I get 40+ FPS 100% of the time. So I can afford to leave V-Sync on with my system, as long as AA and AF aren't enabled and soft shadows are off. The game ran fine and looked great.
The game looks like crap when I turn AA off, too many jaggies. I have an LCD, by the way; could that be the problem? Oh yeah, and soft shadows don't even work for me, even with AA off; I just wanted to try them to see how they looked. Should I try reinstalling the game?
 
lee63 said:
Turned v-sync off and fps went up to 75 high and 20s low. Can't stand the tearing though; as long as I get 40 to 60 fps I'm cool, but when it dips down to 20 fps or lower it sucks. I figured I would be OK with two GTXs for the new batch of games coming out.
:mad: WTF????? :mad:

Try any other game and two 7800GTX's will eat it for breakfast. The problem is the way FEAR was designed and coded. While it looks good and pulls off some amazing effects, the sucker is far from efficient. I am not entirely sure what the deal is, but the developers always told us in the dev forum for the closed beta tests on FEAR (I was a part of those) that the game was designed to operate at 1024x768. That is the optimum resolution, according to them.
 
Sir-Fragalot said:
Try any other game and two 7800GTX's will eat it for breakfast. The problem is the way FEAR was designed and coded. While it looks good and pulls off some amazing effects, the sucker is far from efficient. I am not entirely sure what the deal is, but the developers always told us in the dev forum for the closed beta tests on FEAR (I was a part of those) that the game was designed to operate at 1024x768. That is the optimum resolution, according to them.
That sucks!!! You're right though, Q4 runs perfectly. Will they fix FEAR in a driver release or patch? Can it even be fixed? If so, I won't play it until then.
:D :confused:
 
lee63 said:
That sucks!!! You're right though, Q4 runs perfectly. Will they fix FEAR in a driver release or patch? Can it even be fixed? If so, I won't play it until then.
:D :confused:

The game is still highly enjoyable and you can get a good gaming experience out of it. No reason to hold off on playing it. Just understand that it is not going to allow most configurations to run at 1600x1200 with 4xAA and 16xAF with Soft Shadows on.
 
Sir-Fragalot said:
The game is still highly enjoyable and you can get a good gaming experience out of it. No reason to hold off on playing it. Just understand that it is not going to allow most configurations to run at 1600x1200 with 4xAA and 16xAF with Soft Shadows on.
Yeah, you're right. I hope COD2 runs better.

Thanx :D :D
 
There is no mystery here, folks. It's all in the shadows. If you disable the shadows, the game runs and looks a lot like SoF or some other outdated mediocre title. I think it's stupid to turn on shadows for multiplayer, though it's needed for atmosphere in single player.

Still, I think it's moronic to waste hundreds of dollars on a new video card or whatever (I own a 6800 GT) just to see some fucking shadows.
 
Matrices said:
There is no mystery here, folks. It's all in the shadows. If you disable the shadows, the game runs and looks a lot like SoF or some other outdated mediocre title. I think it's stupid to turn on shadows for multiplayer, though it's needed for atmosphere in single player.

Still, I think it's moronic to waste hundreds of dollars on a new video card or whatever (I own a 6800 GT) just to see some fucking shadows.

Didn't say there was a mystery. There are some performance issues, and some driver tweaking is needed to make this game run decently. That's all.
 
For those who are all upset about the soft shadows: don't be. In all honesty, they look like crap. They don't look nice like the soft shadows in Riddick did; it just looks like they render the shadow about 3 times and slightly offset each copy from the others. Overall I think the game looks amazing, and I don't think it runs TOO horribly slow given what it looks like. On my dual 6800U's I've been running it at 1280x1024, no AA, 4x AF, and maximum details (minus soft shadows); in the timedemo I got a 74 fps average, with 98% above 40 fps. I don't think that's too bad.
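
(Just to illustrate what I mean by the offset-and-blend look, here's a toy version of that idea. It's purely my guess at the effect, not FEAR's actual code; the offsets and weights are made up.)

Code:
# Toy illustration: average a few slightly shifted copies of a hard shadow
# mask so the edge pixels come out partially shadowed ("soft").
import numpy as np

def soften(hard_mask, offsets=((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1))):
    acc = np.zeros_like(hard_mask, dtype=float)
    for dx, dy in offsets:
        # shift the hard shadow a little and accumulate it
        acc += np.roll(hard_mask, shift=(dy, dx), axis=(0, 1))
    return acc / len(offsets)

hard = np.zeros((8, 8))
hard[2:6, 2:6] = 1.0          # a square "hard" shadow
print(soften(hard))           # edge pixels land between 0 and 1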

But what really does concern me is the flaky SLI. It runs all nice and smooth for me until a crazy shader shows up (like water, or heat ripples from flames), and then the SLI goes haywire and the performance drops a bit. The water doesn't seem TOO bad... it tends to be alright around water now. But heat ripples and flames still make it spaz out.

On an unrelated note, I find that this game gives me heat issues. This is the only game I've played (compared to Far Cry w/ HDR, Doom 3, HL2, WoW, etc.) that overheats my 6800U's, lol. I kept getting artifacts and throttling, so I had to remove my side panel... with the side panel off it's been fine. No other game has caused this before.
 
Eva_Unit_0 said:
For those who are all upset about the soft shadows: don't be. In all honesty, they look like crap. They don't look nice like the soft shadows in Riddick did; it just looks like they render the shadow about 3 times and slightly offset each copy from the others. Overall I think the game looks amazing, and I don't think it runs TOO horribly slow given what it looks like. On my dual 6800U's I've been running it at 1280x1024, no AA, 4x AF, and maximum details (minus soft shadows); in the timedemo I got a 74 fps average, with 98% above 40 fps. I don't think that's too bad.

But what really does concern me is the flaky SLI. It runs all nice and smooth for me until a crazy shader shows up (like water, or heat ripples from flames), and then the SLI goes haywire and the performance drops a bit. The water doesn't seem TOO bad... it tends to be alright around water now. But heat ripples and flames still make it spaz out.

On an unrelated note, I find that this game gives me heat issues. This is the only game I've played (compared to Far Cry w/ HDR, Doom 3, HL2, WoW, etc.) that overheats my 6800U's, lol. I kept getting artifacts and throttling, so I had to remove my side panel... with the side panel off it's been fine. No other game has caused this before.

I have the same problems with it. However, I don't have the heat problem.
 