Terrible SLI performance?

I just slapped in my second BFG 7900GTX, which I received yesterday. I followed the SLI Guide sticky in this forum: I cleaned out the NVIDIA drivers using Driver Cleaner Pro, installed the 84.43 drivers, enabled SLI in the display properties, and switched the SLI rendering mode to "SLI Multi-GPU rendering," and yet I still seem to be getting the same, or only very slightly higher, performance as when I had one card.

This is highly confusing. I don't see a gain anywhere. What gives? If anything, I got an extra 5-10 frames in Oblivion (getting about 30-40 outside), maybe an extra 10-20 frames in the Source games, I'm getting high 30s in Black & White 2, and weirdest of all, Battlefield 2 only gets into the 20s in one of the singleplayer levels - I believe it was the one where the Americans have to use boats to get onto the mainland, and it takes place at night.

In the "Mad Mod Mike" Demo from Nvidia, I'm getting low 20's and 30's. (this is at 1680x1050 fullscreen and 1280x800 windowed)

My performance shouldn't be this low, should it?

Help me out here guys. I would very much appreciate it.

ASUS A8N32-SLI
4800+ X2
2GB OCZ DDR500 PC4000 3-3-3-8
2x BFG GeForce 7900 GTX
Creative Soundblaster X-Fi XtremeMusic
Viewsonic VX2025wm (1680x1050 for all my games)
 
I find that trying to get SLI to work properly via the NVIDIA control panel is a hit-and-miss affair.
Sometimes it does as it's told, but most of the time it doesn't.

I use nHancer to make SLI game profiles (really good SLI program) and Coolbits to open up the different SLI modes in the NVIDIA control panel.

So when I make a profile, I do it via nHancer, then close nHancer, go to the NVIDIA CP, and set the rendering mode to match the game I'm about to play (AFR, AFR2, etc.). That's all I need to touch in the NVIDIA CP, as nHancer takes care of the rest and applies the profile automatically. But I have noticed that if the rendering mode in NVIDIA's CP isn't the same as the one for the game you're about to play, SLI might not kick in and the game will probably render on just one card.

It's a pain to get SLI all set up and working well. I just use nHancer for everything, and only go into my NVIDIA CP when I need to change the rendering mode or turn the GPU load-balancing indicator on or off (keep the load-balancing indicator running while making a profile, as some AA settings can make SLI default to single-GPU rendering).

Experimenting is the main thing. SLI is a pain in the ass to begin with, but once you've sussed it out, it's a piece of cake from then on.

The way I have it is: I just make sure the NVIDIA CP and the nHancer profile for the game I'm about to play both show the same rendering method, and that seems to keep most of the problems away at my end...
 
Yeah, SLI is disappointing. Besides benchmarks, I saw zero gain in my games. I even play at 1600x1200, but SLI is not useful. That's why I'm selling my dual 7900 GTs and getting one X1900 XT instead.
 
Geez, I forgot about the power supply.

Silverstone Zeus 650W. "SLI Certified."

I'm gonna try out what Evil-Scotsman said.
SLI a pain in the ass? Heh, I hear that. This is frustrating beyond all else. $550 for a few extra frames, and in most of my games, no gain to be seen at all.

NV CP settings (Global):
4x
16x
High Quality
Not Available
Off
None
On
Off
Single-Display mode
Off
Off
Off
Off
Off
Off
SLI Multi-GPU rendering
Clamp

"enable SLI Multi-GPU" checked on the next page. If those settings help any.

I'm gonna try the load balancing, to see if it's even USING the second card.

Dark Prodigy... a CPU bottleneck, with a 4800 X2?
 
I'll second the idea of getting and installing nHancer (check the forums for the beta 2.0 version). It will make creating profiles easier, as well as changing settings.
 
Hmm, SLI works fine for me in the games I play. Are you not using features like gamma-correct antialiasing or transparency antialiasing? What res are you playing at? SLI makes the difference of being able to use 16x AF and 4x AA with gamma-correct AA and supersampled transparency AA in games like Red Orchestra. In other games it makes the difference between being able to play at 1680x1050 vs 1920x1200. On average it gives at least a 40% boost, if not more (though not in all games, or in badly coded ones).

As for Oblivion, that game is not only incredibly poorly programmed on the PC, it is also heavily CPU-limited and doesn't take SLI into account at all. You need to download the beta patch to get proper SLI enhancements (which I can't, because Bethesda made the patch a beta and Direct2Drive are useless).
 
SixWingedFreak said:
Geez, I forgot about the power supply.

Silverstone Zeus 650W. "SLI Certified."

I'm gonna try out what Evil-Scotsman said.
SLI a pain in the ass? Heh, I hear that. This is frustrating beyond all else. $550 for a few extra frames, and in most of my games, no gain to be seen at all.

NV CP settings (Global):
4x
16x
High Quality
Not Available
Off
None
On
Off
Single-Display mode
Off
Off
Off
Off
Off
Off
SLI Multi-GPU rendering
Clamp

"enable SLI Multi-GPU" checked on the next page. If those settings help any.

I'm gonna try the load balancing, to see if it's even USING the second card.

Dark Prodigy... a CPU bottleneck, with a 4800 X2?


When you test with the settings above (say with Oblivion, for example), if there is an Oblivion game profile in the NVIDIA control panel, the options you listed won't be the ones the game uses. The driver automatically uses the built-in game profile if there is one; only if there isn't does it fall back to your global settings.

All the game profiles that come with the NVIDIA drivers are useless and should be deleted when you install a new video driver.

The only game profiles I have now are the ones I make with nHancer, as NVIDIA's own profiles are rubbish.

Basically, if there's a game profile, the driver will automatically use that and not what you have set under the global driver settings.
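
If you want to see what the driver actually has on disk, the ForceWare releases from around then keep the profiles in an nvapps.xml file. Here's a rough Python sketch just to list them; the path and the "Label" attribute are assumptions and vary by driver version, so treat it as a starting point only.

Code:
# Rough sketch: peek at the per-game profiles the NVIDIA driver keeps
# on disk. ASSUMPTIONS: the file is nvapps.xml under system32 and the
# profile entries carry a "Label" attribute -- both can differ between
# ForceWare versions.
import os
import xml.etree.ElementTree as ET

path = os.path.join(os.environ.get("WINDIR", r"C:\Windows"),
                    "system32", "nvapps.xml")

for elem in ET.parse(path).iter():
    label = elem.attrib.get("Label") or elem.attrib.get("label")
    if label:
        print(elem.tag, "->", label)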

You also NEED two pieces of software:

1. Coolbits = a must. It will allow you to select the different rendering methods, like alternate frame rendering (AFR) and AFR2, split frame rendering (SFR), SLI AA, and single-GPU rendering (there's a rough registry sketch for it after this list).

SLI antialiasing is really only useful for older games where there wouldn't be an FPS boost from normal AFR/SFR rendering; instead you can make the game look nicer, with a maximum of 32x AA for quad SLI and 16x AA for normal SLI rigs, making all the jagged edges disappear. It works a treat for Lock On, the flight sim.

These options will replace the multi-GPU rendering button you are using now.

2. nHancer = a must.
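
Since Coolbits keeps coming up: it's nothing more than a DWORD in the registry, so here's a rough Python sketch of the tweak. The key path is the commonly cited one and the exact value that unlocks the SLI modes varies by driver version (3 and 18 both get mentioned), so back up your registry and double-check before running anything like this.

Code:
# Rough sketch: set the CoolBits registry flag that unlocks the hidden
# rendering-mode panels in the NVIDIA control panel. Key path and value
# are the commonly cited ones, NOT verified against every driver version.
import winreg  # Windows only; run from an admin account

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"
COOLBITS = 18  # assumption -- adjust for your driver version

key = winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH)
winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, COOLBITS)
winreg.CloseKey(key)
print("CoolBits set to", COOLBITS, "- restart the driver/PC to see the new panels")

If you'd rather do it by hand, it's the same thing via regedit.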
 
Enable Coolbits.
In the NVIDIA panel: Performance panel > Advanced settings.
Enable the specific game profile, or under Global settings > Advanced > enable SLI rendering.

I use SLI with the NVIDIA drivers and run nearly everything at 1600x1200 with excellent framerates. I use profiles if available and let the in-game settings control AA.
Coolbits lets you select various rendering combinations under Global > Advanced that are otherwise locked.
You just have to fuss with it a while, but IMO SLI will give you nice benefits.

I'm assuming you attached the SLI bridge across the two cards and that you run the monitor from the top (master) card.
 
When I load up 3DMark06 and go to System Info, it says "Display Device 1/1 - NVIDIA GeForce 7900 GTX"

Shouldn't it say that I have two of them? What the hell is going on? I want this freakin' thing to work correctly!

It must see two of them and all, because I can use SLI AA in the Nvidia control panel, and it works fine.

nHancer severely blows my mind. I understand the options and junk, but it just loads up all the other original profiles once I delete them. Ugh...
 
Actually, 3DMark06 only reports one card in SLI.

Have you tried running 3DMark06 in SLI mode and then in single-card mode? You should be getting somewhere in the 9500 range with SLI and around 6500 without it. If you're hitting around those numbers, then your SLI setup is fine, I believe.

Your hardware is very similar to mine; you've got 7900 GTXs vs my pair of 7800 GTX 512s, and I notice the difference with SLI in most of my games.
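
If you want to put a number on it, the quick math on those ballpark scores is easy to script (the 9500/6500 figures are just the rough numbers above, nothing official):

Code:
# Quick sanity check: rough SLI scaling from two 3DMark06 runs.
# 9500 and 6500 are just the ballpark figures mentioned above.
sli_score = 9500.0
single_score = 6500.0

gain = (sli_score / single_score - 1.0) * 100.0
print("SLI gain: about %.0f%%" % gain)  # roughly +46%

That works out to roughly a 46% gain, which lines up with the ~40% figure mentioned earlier in the thread.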
 
Now I have another question, can I just disable SLI in the NV control panel, or do I have to physically take out the card?
 
All you have to do is uncheck the "Enable SLI multi-GPU" check box in the NV control panel on the "SLI multi-GPU" page.
 
Alright. I did that and ran 3DMark06. In the first test, Proxycon, there was a difference of only about 15 frames with SLI off. That's at 1680x1050, no AA or AF, pretty much default settings.

It freezes up at the CPU test. Not really freezes, but more that it runs at 0 FPS, and at 1024x768, every single time. I don't understand it.


Speaking of the tests, are those scores I'm supposed to get with AA and AF on? And which tests would I run to get those scores?
 
Run 3DMark06 with everything at default. Don't even enable or disable any options in its profile.


Report back your scores.

A difference of 15 frames in 3DMark06 isn't bad. SLI doesn't magically double your graphics performance. If anything it's like going to a next-gen card (a next-gen card upgrade typically nets about a 40% boost in performance).

SLI really is kind of an esoteric thing to have, though, IMO...

I know I purchased my second card for very esoteric reasons... wanting to play games with transparency AA enabled, primarily...
 
Not for nothing... did you check both cards individually to make sure they are both functional? I'm only being this simplistic because it happened to me.
Second, go into the system and make sure you see both cards registered in Device Manager, and that both cards show their drivers loaded.
I've had to load the driver individually for each card before as well.

Like I said, you have to fuss with this stuff a bit, but you'll love the IQ once you get comfortable with the performance panels.
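
If clicking through Device Manager gets old, you can also dump what Windows sees from a script. Rough sketch below using the wmic command-line tool (available on XP Pro; it just prints the raw text):

Code:
# Rough sketch: list every video controller Windows reports, plus its
# driver version and status, via the wmic command-line tool. With a
# healthy SLI setup both GTXs should show up with the same driver.
import subprocess

output = subprocess.check_output(
    ["wmic", "path", "Win32_VideoController",
     "get", "Name,DriverVersion,Status"])
print(output.decode(errors="replace"))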
 
disabling 'Peg Link mode" in your bios will also boost your frames.
 
Back home from work. Lots of replies, yay!

I disabled PEG Link Mode in the BIOS.

I ran 3DMark06; it takes a good five minutes or more to load :\ Is that normal? I ran it at default settings, and my computer locked up on me on "Deep Freeze." At frame 2099, specifically.

Sigh... I can't even find out my score.
 
Deep Freeze is the worst test... or should I say the hardest. That is the one where most of my problems surfaced.

Actually, that's exactly why I believe it is the best test.
 
I would try what was suggested earlier and test each card individually, to make sure one of them isn't going bad.
Are your cards at stock clocks, too?
 
I'm running everything stock, yes. I have no interest in overclocking for the time being -- this is bad enough as it is.

How can one be "going" bad? I mean, both come up in the system/hardware monitor, I don't get any artifacting, and SLI is apparently enabled because I can use SLI AA 8x and 16x... If one were bad, wouldn't my system in theory not work at all, or throw very bad graphical errors?

So... why did it freeze up on that test, then?
 
Oh yeah, another question I haven't seen an answer to yet.

Why does the CPU test in 3DMark06 run at 0-1 frames for me? I installed the AMD X2 driver to see if it makes a difference. Nada. The "game" part of that CPU test runs fine, however.

Why is this $4k computer riddled with problems? >_<;;
 
I think I get something like 0.75 fps in CPU test 1 and about 1.1 fps in CPU test 2.
That's the way it's supposed to be.

What power supply are you using?
And if you haven't done this already, I would download Memtest86 and make sure your RAM isn't bad. Some RAM can be totally stable in 2D stuff but get whacked out in 3D.
I would let memtest run for a few passes to make sure...
 
The power supply is a Silverstone Zeus 650W. I was told in another thread a few months ago that it would be more than sufficient for 2x 7900 GTXs, seeing as they use less wattage than the 7800s.

Just installed the 84.21s. Gonna give them a test run.

Doesn't memtest require it be run in DOS, at startup? ...I don't have a floppy drive. :p
 
SixWingedFreak said:
Doesn't memtest require it be run in DOS, at startup? ...I don't have a floppy drive. :p
Yeah, it requires booting off some media.
You can download a bootable ISO image from their website and just throw it on a CD-RW or DVD-RW.
Trust me, it would be worth it to rule out a memory problem.


I run two 7800 GTX 512MB cards with a 600W Enermax NoiseTaker, so your power supply has plenty...
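
One extra step that's worth it: if the site lists an MD5 for the ISO, check it before burning, so a corrupt download doesn't send you on another wild goose chase. Quick sketch, with a placeholder filename:

Code:
# Quick sketch: checksum the downloaded ISO before burning it, so a bad
# download doesn't get mistaken for bad RAM later. "memtest86.iso" is a
# placeholder filename -- use whatever you actually saved.
import hashlib

md5 = hashlib.md5()
with open("memtest86.iso", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        md5.update(chunk)

print("MD5:", md5.hexdigest())  # compare against the hash on the download page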
 
I appear to get the same performance with the stable drivers.

I'm just going to give up and say it works from here on. As to how well it works, that's subjective. I still say I only get about 10-20 extra frames overall in NFS:MW, Oblivion, HL2, DOD:S, BF2, 3DMark06, etc. Nothing spectacular about it at all, aside from the fact that the only graphical improvement I get is bumping my antialiasing up from 2x to 4x AA.

If this is what SLI is like -- aggravating to no end, a bitch to troubleshoot and set up, and $550 for at MOST 20 extra frames -- then man... I sure as hell am not looking forward to two G80s. :mad: :mad: :mad:
 
My ST65ZF is the flat black version. I don't recall saying I had stability issues? I'm having performance issues... The only stability issue I had is when 3dMark froze on the last test.
 
I was referring to your 3DMark freeze. An unexplained lock-up usually brings the stability of the system into question, but maybe it's not a big deal. I was just trying to make you aware of a common problem with those PSUs.
 
Basically, what you're getting is SLI. I had an SLI'd PC with 2x 6800 Ultras and sometimes got 25 fps in Doom 3 MP. Does that mean each card produced 12.5 fps? Pfft. It's overhyped; you're lucky to gain +30% from SLI -- at least from what I experienced.
 