[H] users SLI guide

Wow, sorry to hear that. I just sold my friend a brand-new Striker Extreme for $200.. if I had known your situation I would have done a trade with you, since he is not going to overclock with it... Anyhow, do you think tri-SLI is worth it, or is it more of a "wow, look what I have" thing??

3-Way SLI is only worth it if you are running your games at a resolution of 1920x1200 or 2560x1600 and want higher levels of AA and AF.
 
I have two 8800 Ultras in SLI mode. I have been reading in this thread that some people say RivaTuner will let me overclock both my cards... and some say it won't. nTune does not work for me; it just BSODs my system. I am running XP 64-bit. Is the general consensus that RivaTuner will work with SLI?
 
I am curious, but a friend of mine told me that if you overclock too high, nothing displays on the screen, and so even if you reboot, you can't go back and reset to the original settings. Is that true? Because I don't have an onboard video card to fall back on to reset my clocks...
 
Booting into Safe Mode bypasses RivaTuner if you attempt too high an overclock.
 
Thanks Kowan! I will keep that in mind. You have been a great help. I am assuming once in Safe Mode I can go ahead and run RivaTuner and bring back the stock clocks?
 
Exactly, or if needed, uninstall it and then reinstall it once you're back in normal Windows, since the card will be at its defaults again.

Edit: uninstall RivaTuner while in Safe Mode if needed.
I haven't tried starting RivaTuner while in Safe Mode. Not all programs will run there, but it will let you remove the problem software so you can boot normally again.
 
Overclocking one card with RivaTuner overclocks both cards.

There is an issue with RivaTuner v2.06 SLI overclocking in XP: you can't use independent shader overclocking and SLI together in XP, so only one card will be overclocked. This affects G80 and G92.
For SLI overclocking in XP with RT v2.06 (a rough sketch of what this changes follows the steps):
1.) Go to the Power User tab in RivaTuner
2.) Choose "RivaTuner\System"
3.) Scroll down to "NVAPIClockControl" and set the value to "0"
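For anyone who wants a mental model of what that setting changes, here is a rough conceptual sketch in Python - not RivaTuner's actual code or any real NVIDIA API, just an illustration of the behaviour described above; the 1:2.5 core-to-shader ratio is an assumed example value:

    # Conceptual sketch only - illustrates the behaviour described above,
    # not RivaTuner internals or any real NVIDIA API.

    def apply_clocks_nvapi(cards, core, shader, mem):
        # The RT v2.06 + XP bug: only the primary card receives the new clocks.
        cards[0].update(core=core, shader=shader, mem=mem)

    def apply_clocks_lowlevel(cards, core, shader, mem):
        # With NVAPIClockControl = 0, the same clocks go to every card, but
        # independent shader overclocking is lost: the shader clock is tied
        # to the core clock (the ratio here is an assumed example value).
        for card in cards:
            card.update(core=core, shader=int(core * 2.5), mem=mem)

    if __name__ == "__main__":
        sli = [{"name": "GPU0"}, {"name": "GPU1"}]
        apply_clocks_lowlevel(sli, core=650, shader=1625, mem=1000)
        for card in sli:
            print(card)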

A post I made on the issue:
http://www.rage3d.com/board/showpost.php?p=1335298695&postcount=5
and the answer to why overclocking SLI in XP wasn't clocking both cards:
http://forums.guru3d.com/showpost.php?p=2543408&postcount=180
This will hopefully be fixed in RivaTuner v2.07, according to Unwinder:
http://www.rage3d.com/board/showpost.php?p=1335305954&postcount=11

Both cards will now overclock properly, but you lose independent shader overclocking.
Alternatively, G80 users can roll back to RivaTuner v2.05. G92 users will have to wait, or use the newest nTune, which allows independent shader overclocking and clocks both cards correctly.
A little tutorial I made on how to set your clocks and have them applied at Windows startup with nTune 6.00 Beta:
http://www.rage3d.com/board/showpost.php?p=1335289748&postcount=17

There is also a temp monitoring issue in XP: RivaTuner's hardware monitor does not report correct SLI temps (it shows both cards at the same temperature).
There is an easy fix; here's what you do:
1.) Go to the Power User tab in RivaTuner
2.) Choose "RivaTuner\System"
3.) Scroll down to "NVAPIThermalControl" and set the value to "0"

You will now get the correct temps on both cards.
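If it helps to picture the monitoring bug, here's a tiny illustrative sketch (made-up temperatures, plain Python, no real NVAPI calls) of mirrored versus per-card readings:

    # Illustration only: hypothetical sensor values, not real NVAPI calls.

    def read_temps_mirrored(primary_c, secondary_c):
        # The XP bug: the primary card's temperature is reported for both GPUs.
        return [primary_c, primary_c]

    def read_temps_per_card(primary_c, secondary_c):
        # With NVAPIThermalControl = 0, each card's own sensor is read.
        return [primary_c, secondary_c]

    print("buggy:", read_temps_mirrored(71, 64))   # -> [71, 71]
    print("fixed:", read_temps_per_card(71, 64))   # -> [71, 64]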
 
Good post, purgatory. I'm using Vista and didn't test it on XP.
I'd seen either your tutorial on the nTune beta or someone else's and couldn't remember where I'd seen it. :)
 
I think that is what happened to me... because even though I had my clocks at 650/1700/1100 I noticed no change in 3DMark at all. Frustrated, I uninstalled it and removed both RivaTuner and 3DMark06 from my machine. Ah well, I will try it another time hehe.
 
I've also had to set NVAPIFanControl to 0 in order to control the fans in my setup. Not sure if both fans are being controlled or just the primary card's. I will be doing more testing tonight.
 
I'll have to check this out on my cards' OCs tonight.. I forget if I am on RT 2.05 or 2.06.. hrm.. regardless, great info, purgatory!

edit: well well well.. it did only OC the primary card. wow! time to fix!!

wow.. well, I followed the directions, but when I try to OC card #2 it says it wants to reboot etc.. and now the clocks for that card are totally screwed, like core @ 396 and such.. this happened before.. so I have to uninstall 2.06 and the drivers and start over.
 
When you set the "NVAPIClockControl" value to "0", just overclock as you always have, then check your clocks on both cards. You don't have to clock the cards separately; in fact you can't
(you will get exactly what just happened to you). Overclock your primary card and the clocks will apply to both cards. Make sure that after you set the "NVAPIClockControl" value to "0" you restart RivaTuner.
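The sanity check after restarting RivaTuner is simply to read the clocks back on both cards and confirm they match. A minimal sketch of that idea (read_clocks is a hypothetical placeholder - in practice you just compare the two cards in the hardware monitor):

    # Sketch of the verification idea only; read_clocks() is a hypothetical
    # stand-in for checking each card in RivaTuner's hardware monitor.

    def read_clocks(gpu_index):
        # Placeholder values - in practice, read these off the monitoring graphs.
        return {"core": 650, "shader": 1625, "mem": 1000}

    primary, secondary = read_clocks(0), read_clocks(1)
    if primary == secondary:
        print("Both cards picked up the overclock:", primary)
    else:
        print("Only the primary card was clocked - recheck NVAPIClockControl.")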
 
Yeah - I screwed up because I forgot the clocks are both set from the clock controls for card #1.. unlike the fan controls, where you have to select each card individually.. so after uninstalling, cleaning, and re-installing, it all works fine now. woot!
 
hoochiemamma!!!! nice... all over this when I go home. :) (updating the OP also)
 
Guys,

I'm new to the SLI scene and I was looking through the profiles list in the control panel. It seems that when you look at the SLI setting, almost half of the games default to the "NVIDIA recommends single GPU" setting.

How are these games 'SLI profile optimized' but not even set up to use SLI? Or am I misunderstanding it when it says 'recommends single GPU only'?
 
Don't worry about it. Trust me they do work properly in SLI mode. You can turn on the visual indicators and see it working in games. Not to mention you'll see the performance increase. (Depending on resolution and settings.)
 
I also checked the system configuration and it said 'graphics adapters: 1 of 1', but for memory it said 1 of 4, 2 of 4, etc...

I ran 3DMark06 (not that I care, just wanted to see for shits and giggles) and it had a green SLI bar on the left, which I suppose is the indicator that it's working.
 
Yep.. and the green bar should dance around a little, signaling the rendering tasks each card is getting. I never mess with those game profiles unless I need to force AA or AF, which is rarely these days.. the default config seems to work great 99.9% of the time for me..
 
Does SLI still only let you use the memory of one card like it used to? Say you have GTS 512s in SLI - do you only have 512MB of video RAM, or does it now let you use the full 1024MB?
 
Both cards use their own memory, but since the load is shared one card at a time, you effectively have the memory of one card, not both, in SLI.
Worth taking the time to read through Scalable Link Interface.
 
Data is replicated to each card in an SLI (or CrossFire, for that matter) setup, so one's total VRAM is that of a single card in the multi-GPU array.
 
Yep.. the cards share the rendering tasks, so they do use all the memory they each physically have, but only for the task that card has been given.. it's still not like one card with twice the memory, especially when using AFR rendering methods.. each card is responsible for rendering a whole frame, which in high-res cases is why two cards with 256MB of memory may struggle more than, say, a single 512MB card...
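A short sketch of that idea, assuming a simplified AFR model where each card renders alternate whole frames and each holds a full copy of the frame data (the numbers are just examples):

    # Simplified AFR model: each card renders alternate whole frames, but both
    # cards hold a full copy of the textures/geometry, so usable VRAM is only
    # one card's worth.

    CARD_VRAM_MB = 512
    NUM_CARDS = 2

    def effective_vram(card_vram_mb, num_cards):
        # Data is replicated, not pooled, so VRAM does not add up across cards.
        return card_vram_mb

    def assign_frames(num_frames, num_cards):
        # Alternate Frame Rendering: frame N goes to card N % num_cards.
        return {frame: frame % num_cards for frame in range(num_frames)}

    print("Effective VRAM:", effective_vram(CARD_VRAM_MB, NUM_CARDS), "MB")
    print("Frame -> card:", assign_frames(6, NUM_CARDS))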
 
So what you're saying is that SLI cards with 512MB of RAM are kind of a bottleneck, considering there are games that use more than 512MB of VRAM?

I did notice that the aperture of SLI cards is over 1GB, or something around that.
 
I wouldn't go as far as calling that a bottleneck.. I think that term gets tossed around too much. Saying something is "limited" by a marginally adequate resource might be more accurate. In this case, someone playing a resource-intensive game at 1920x1200 and up might see some aggressive texture fetching from system memory and/or disk... if your main system isn't too powerful, this may cause some performance issues. Aggressive texture fetching happens more in an SLI system anyway, since the system is managing both video cards rendering a lot of the same stuff.. SLI systems can for sure put a lot of strain on the main system, especially these new fast cards... I think the reason three-card and quad SLI don't yield a heck of a lot of performance gain over two-card SLI is primarily that.. they can tune the drivers all they want, but I think the system's ability to manage the cards becomes a much larger limiting factor when you go beyond two-card SLI. It's my theory. Sorry, little tangent there.. but sorta related.
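One rough way to put numbers on that theory is an Amdahl's-law style estimate, where the CPU/driver work of managing the cards is the part that doesn't scale with GPU count. The 80/20 split below is purely an assumed figure for illustration:

    # Back-of-the-envelope scaling estimate - illustrative numbers only.
    # Assume 80% of frame time scales with GPU count and 20% is CPU/driver
    # overhead that stays fixed no matter how many cards you add.

    def sli_speedup(gpu_count, gpu_fraction=0.8):
        serial = 1.0 - gpu_fraction
        return 1.0 / (serial + gpu_fraction / gpu_count)

    for n in (1, 2, 3, 4):
        print(f"{n} GPU(s): ~{sli_speedup(n):.2f}x")
    # 1 -> 1.00x, 2 -> 1.67x, 3 -> 2.14x, 4 -> 2.50x: each extra card helps less.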
 
Contrary to how SLI is supposed to have worked for like the last year, my 8800 GT SLI setup using the 174 beta makes me reboot to enable SLI. It will do this whether or not it says there are programs it needs to close, and whether or not I disable the second monitor before hitting the 'enable SLI' button.

Also weird: the display properties menu shows three monitors: 2407, Plug and Play, and Default Monitor. No idea what is going on here; I only have two monitors and two video cards.

Ideas?
 
Sounds like the driver might need a cleaning and re-install.. maybe.. when that type of stuff happens that's where I typically start. :)
 
Just shitty NVIDIA driver quality control, as per usual. I need to disable my second monitor, quit and restart the NVIDIA control panel, and THEN enable SLI. Also, this only worked if I exited the NVIDIA app in the taskbar. It didn't matter which drivers I used, or that I deleted them and ran Driver Cleaner before reinstalling. :(
 
I just got 9800 GTX+ SLI running in DirectX 10, 32-bit (64-bit doesn't want to run for some reason D:<)

It rapes Crysis with my Q6600 at 3.4GHz + RAID 0 + 4GB of RAM.

Everything at Very High EXCEPT for shaders and shadows (which are on Medium).

35-50fps average when looking at water and outside.

90+fps was my max.
 
Hate to break it to you but with shaders at "medium" you're basically playing Far Cry 1. You're missing out big time with those settings. Better off joining the rest of us in forced DX9 mode with the very high tweak.
 
lol why? Shaders/shadows to me are worthless; I can barely tell the difference and I barely notice.

But w/e, where's the link to these tweaks?
 
Google 'Crysis XP very high tweak'.

Shaders is the setting with the single most obvious visual impact, followed by post-processing.

I have Crysis, Warhead, and GTX 260 SLI, so I have plenty of experience dabbling around with this stuff.
 
With me and shaders, I just don't notice them.

Also, I can't seem to run Crysis in DirectX 9 mode.

I can only start the game when I put the disc in and click the start-game menu.

Also, this is on 64-bit Vista Enterprise.
 
You have to create a shortcut to the executable, right-click it and choose Properties, then type "-dx9" (without quotes) in the Target field after the regular file name (with a space in between).
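For example, the Target field would end up looking something like this (assuming a default install path - adjust it to wherever Crysis.exe actually lives on your machine):

    "C:\Program Files\Electronic Arts\Crytek\Crysis\Bin32\Crysis.exe" -dx9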

I don't know how you can't see the difference - it was very noticeable to me going from High to Very High - but whatever works for you, man.
 
This is probably a silly question, but if you want to avoid microstuttering with the 9800GX2 while playing certain games like Crysis, can you just turn off SLI in software?
 