GeForce GTX 690 Quad-SLI Owners' Experience

HydrasunGQ — [H]ard|Gawd, joined Mar 17, 2002, 1,049 messages
I'm starting this thread for the owners of GTX 690s running Quad-SLI. Please post your experiences, tweaks, etc.

Currently I'm running Quad-SLI but seem to have a scaling issue. In games such as BF3 and Batman: AC, I can't get performance any better than a single 690, and in some cases, such as the Street Fighter IV benchmark, I get worse performance.
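(One rough way to quantify the problem is to compare measured fps against what ideal scaling over a single 690 would give. A minimal sketch; the fps figures below are made up for illustration, not measurements:)

```python
# Rough quad-SLI scaling check (illustrative numbers, not real measurements).
def scaling_efficiency(fps_one_690: float, fps_two_690: float) -> float:
    """Efficiency vs. an ideal 2x jump from one GTX 690 to two."""
    return fps_two_690 / (2.0 * fps_one_690)

# Hypothetical example: 80 fps on one 690, 85 fps on two -> ~53% efficiency.
print(f"{scaling_efficiency(80, 85):.0%}")
```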

Please let me know if anyone has any suggestions to improve performance.

Also, the multi-monitor configuration suggested at http://www.geforce.com/hardware/technology/sli/system-requirements seems to be off. The configuration suggested there did not work for me. To get my three screens working I had to connect two DVI outputs on card 1 and one DVI output on card 2.
 
That's a sick setup, but if you're running quad SLI you might benefit from an Ivy Bridge chip and a Z77 mobo so that you can run PCIe 3.0. Other than that, I have set up Nvidia Surround before, and connecting monitors to both cards is completely fine (and often necessary).
 

Some games do not benefit well from SLI and CrossfireX scaling. There isn't much you can do about that other than making sure you have the latest drivers, which include profiles for your games. Otherwise you have to play with the SFR/AFR settings to see what the performance looks like. Beyond that, NV Surround requires that you hook up your monitors the way you described: two on the first card and one on the second.
 
I would say, in addition to Dan's comment, that the 2GB of VRAM per GPU could certainly be an issue affecting scaling, especially at large resolutions. You would probably need 3 or 4 of the 4GB 670s to get better scaling in many of the games that benefit from it.
 

It depends on the game, levels of AA, etc. But hell, even though Mass Effect 3 isn't usually demanding and only supports one monitor, with all the texture mods I'm running I'm seeing about 2.4GB of VRAM usage. So it adds up quickly.
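(If you want to watch how close you are getting to the 690's 2GB-per-GPU framebuffer, something like the following works, assuming nvidia-smi is installed and reports memory for your cards; GPU-Z or Afterburner will show the same numbers:)

```python
import subprocess

# Query per-GPU VRAM usage via nvidia-smi (CSV output, no header or units).
out = subprocess.check_output(
    [
        "nvidia-smi",
        "--query-gpu=index,name,memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ],
    text=True,
)

for line in out.strip().splitlines():
    idx, name, used, total = [field.strip() for field in line.split(",")]
    print(f"GPU {idx} ({name}): {used} MiB / {total} MiB used")
```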
 
Sorry for being off topic, but DAMN!!!! That quad setup is sick!!! I wish I had money to burn!
 

Possibly a CPU bottleneck. You have 4x the GPU power but only 3x the pixels (5760x1080), I'd guess. Additionally, the wider FOV eats some more CPU power because more objects are visible.
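(The arithmetic behind that, as a quick sanity check, assuming a single 1920x1080 screen and one GPU as the baseline:)

```python
# 3x1080p Surround vs. a single 1080p screen, and GPU count vs. pixel count.
single_screen = 1920 * 1080   # 2,073,600 pixels
surround = 5760 * 1080        # 6,220,800 pixels
print(f"Pixel ratio: {surround / single_screen:.1f}x")   # 3.0x
print("GPU ratio: 4 GPUs (two 690s) vs. 1 -> 4.0x")
```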
 
People throw the term bottleneck around so easily. He might be running into some PCIe bandwidth problems, but that would be about it. At 5.0GHz he's doing fine with Sandy Bridge. Even if it is technically a bottleneck, there aren't faster options out there. Ivy Bridge and Sandy Bridge-E won't do anything for him where gaming is concerned outside of PCI-Express Gen 3 compatibility.
 
With this much GPU power, a CPU bottleneck is not out of the question, even at 5 GHz. People dismiss bottlenecks so easily :p
PCIe bandwidth is not the problem - it might cost 10% in some scenarios, but it can't prevent proper scaling from one 690 to two.
People need to understand that GPUs have progressed much faster (performance-wise) than CPUs over the last decade. It is only logical that CPUs will increasingly bottleneck enthusiast setups in particular - unless devs put the GPUs to really good use. But with things like FXAA and deference to consoles, that is not gonna happen until the next console gen. And maybe not even then.

How to find out (two possibilities, sketched below):

1. Lower GPU-specific settings like MSAA or the resolution a little. If the fps don't change (much), it's a CPU bottleneck.
2. Lower the CPU clock by 10% to 4.5GHz. If the fps drop by 10% as well, it's a CPU bottleneck.
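A minimal sketch of that same reasoning; the thresholds (5%, and "roughly proportional") are illustrative assumptions, not hard rules:

```python
# Crude CPU-vs-GPU bottleneck inference from the two tests described above.
def test_gpu_settings(fps_baseline: float, fps_lower_gpu_load: float) -> str:
    """Test 1: drop MSAA/resolution a bit; if fps barely move, the CPU is the limit."""
    gain = (fps_lower_gpu_load - fps_baseline) / fps_baseline
    return "CPU bottleneck likely" if gain < 0.05 else "GPU-bound (fps responded)"

def test_cpu_clock(fps_baseline: float, fps_downclocked: float,
                   clock_drop: float = 0.10) -> str:
    """Test 2: drop the CPU clock ~10%; if fps fall roughly in proportion, the CPU is the limit."""
    fps_drop = (fps_baseline - fps_downclocked) / fps_baseline
    return "CPU bottleneck likely" if fps_drop >= 0.8 * clock_drop else "not CPU-limited at this clock"

# Hypothetical numbers:
print(test_gpu_settings(90, 92))   # CPU bottleneck likely
print(test_cpu_clock(90, 81))      # CPU bottleneck likely (10% fps drop)
```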
 

What you aren't understanding is that there really is nothing that can be done to extend the performance of a 24/7 system running around 5.0GHz on Sandy Bridge. Ivy Bridge's IPC improvements are minimal at best, and when it comes to gaming they are virtually non-existent. Sandy Bridge-E won't help either: it lacks Ivy Bridge's IPC improvements and has less clock-speed headroom than Sandy Bridge, clocking about as well as Ivy Bridge does. The extra cores of Sandy Bridge-E also won't be of use 99% of the time, as games won't leverage them. Performance-wise, memory speeds above DDR3-1600 reach a point of diminishing returns, and quad-channel bandwidth isn't likely to help either. And a 10% improvement from the extra PCIe bandwidth is being generous, but it's about all he'd get with an upgrade over Sandy Bridge.
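To put rough numbers on the PCIe part: with two dual-GPU cards on a mainstream Sandy/Ivy Bridge board, each slot runs x8, and the usual theoretical per-direction figures work out roughly as below (these are textbook link rates, not measurements):

```python
# Approximate theoretical per-direction PCIe bandwidth for an x8 slot.
per_lane_gbs = {"PCIe 2.0": 0.5, "PCIe 3.0": 0.985}  # GB/s per lane
lanes = 8
for gen, bw in per_lane_gbs.items():
    print(f"{gen} x{lanes}: ~{bw * lanes:.1f} GB/s per direction")
# PCIe 2.0 x8: ~4.0 GB/s, PCIe 3.0 x8: ~7.9 GB/s
```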

So what is the guy to do? I understand, and have stated in the past myself, that CPUs have always been the bottleneck of system performance. But there are two issues with this concept: 1.) At some point, the cost of an upgrade vs. the performance gained is marginal at best. 2.) Sometimes you are already at the cutting edge of what's available and simply have to wait for technology to catch up to your wants/needs as a consumer.

He might gain some performance from Ivy Bridge, but it will be marginal at best. It's a lot of money for little to nothing. That upgrade is still far more cost effective than going LGA 2011, and will probably provide just as much performance improvement, which is again minimal. I'd recommend waiting at this point. I also wouldn't have gone with GTX 690s. If they can still be returned, I'd wait for 4GB GTX 680s and get two or three of those. 2GB of VRAM just isn't enough for multi-monitor gaming and high levels of AA/AF.
 

Since when did Mass Effect 3 only support one monitor? Is that an Nvidia limitation? I played it with Eyefinity. The only limitation was that the cut scenes only displayed on one monitor.

As for the bottlenecks, while possible, I honestly don't see a CPU limitation beyond 2 GPUs. He stated that he wasn't getting scaling much better than just an SLI setup. I have seen people get somewhat better performance from tri 680s or tri CF. Heck, there are reviews here showing performance gains from Tri-SLI. However, tri SLI/CF at the current high end seems to have rapidly diminishing returns. I think there are more benefits for PhysX performance, though.
 

Mass Effect 3? It doesn't do more than one monitor without employing some kind of workaround, as far as I know. And from what I understand, doesn't it attempt to display all of the cut scenes across more than one monitor, without accounting for aspect ratio? The end result is parts of the cut scene (like people's heads) not showing.

EDIT: Nevermind, it does run on more than one monitor. I haven't checked to see how the cut scenes behave with it though.
 

I never said there was something he could do to improve performance, so your whole little essay was unnecessary ;)
The thing is, he could (and should with this setup - otherwise it is pointless) invest more in image quality in order to put more load on the GPUs: SSAA, AO, 3D, etc.

Another possibility is that the SLI profiles for some games are not suited for optimal performance with quad-SLI. In that case I would go to the Nvidia forums; there are some very competent guys there who have experience changing the compatibility bits to improve performance.
 

That wasn't my experience at all. I played ME3 all the way through on Eyefinity. It only displays the cut scenes on a single monitor, though. I have my Eyefinity setup down for the moment in expectation of my 690 arriving tonight, but I can bring it back up fairly quickly and see if I can get some screenshots of ME3 on Eyefinity with my settings. Unless there was some change in a recent update, I don't know where the information that it doesn't do multi-monitor came from. I will also test it on the 690 once I get it up, but that will take longer; I am going to rebuild the system from scratch with a fresh install of Windows, the 690 drivers, and a new SSD RAID setup as well.
 
Thought so; 4x 690 would be a waste unless you are building a specialized GPU processing server :)

I don't even think it's technically possible; SLI's limit is four GPUs, whether they are on four cards or two.
 

In those specialized servers, you aren't using SLI. You are sending various jobs to different GPUs and using their PhysX engines to process them rapidly.
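For what it's worth, that kind of box doesn't involve SLI at all; each GPU is addressed directly. A minimal sketch of the idea with one worker process per GPU; the work function is a placeholder, and pinning via the CUDA_VISIBLE_DEVICES environment variable is just one common way to do it:

```python
import os
from multiprocessing import Process

def worker(gpu_id: int, job: str) -> None:
    # Restrict this process to a single GPU; any GPU code run afterwards
    # in this process only sees that one device.
    os.environ["CUDA_VISIBLE_DEVICES"] = str(gpu_id)
    print(f"GPU {gpu_id} processing {job}")  # placeholder for the real workload

if __name__ == "__main__":
    jobs = ["job-a", "job-b", "job-c", "job-d"]
    procs = [Process(target=worker, args=(i, job)) for i, job in enumerate(jobs)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```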
 

It was something I read. I just never tried it because I heard it jacked up the cut scenes and tried to display them across the display array. I tested it, and it does indeed run on multiple monitors, though I didn't play with it; I just verified that it gives me the option.
 

Getting off topic a little, but I think what is happening to the individual who noticed that problem is that they are likely running one of the hack utilities that modifies settings for Eyefinity setups, much like the ones used to make Skyrim work. That may well interfere with how ME3 operates. I have had to redo my system quite a few times to try to fix Eyefinity issues with certain games, which is one of the reasons I am moving to the GTX 690 instead. What you should notice in ME3 with Eyefinity or Surround is that you get multi-monitor until a cut scene, and then it just shows the cut scene on your central display, causing the other displays to go black. But after the cut scene, you should get multi-monitor gaming again.
 
Hi, I've got 2 GTX 690s too. With my Asus P8P67 and i7-2600K I was having bad performance, so I changed to X79 and an i7-3930K.

Now I'm still having bad scaling.
I will change the PSU; for now I have an XFX 850W.

I also heard from a user with quad GTX 590s; he told me his cards don't scale well either.

Wtf!!!

Don't change your CPU, yours is fine.
I'm waiting for the new PSU; if it doesn't get better I want to RMA the two monster cards
and get 4-way 4GB GTX 680s.
 

Why would the PSU change anything? I wouldn't replace the PSU for this issue.

Another weird thing I'm noticing is that the WEI score I'm getting is 7.4 (not that WEI is a good benchmark).
 
I use four adapter cables from 4-pin to 6-pin and two adapter cables from 6-pin to 8-pin.
Depending on which card I connect those to, I see a little difference in performance.
It seems the main connector on the card is the one farthest toward the front of the PC; if I connect the 8-pin to the main connector I get better performance.
My PSU is 850 watts; EVGA support advises a 1200-watt PSU.

And if it doesn't work I want to request an RMA to swap for four 4GB GTX 680s.
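A rough power-budget estimate shows why EVGA points at 1200W; the wattages here are approximate TDP-class assumptions, not measurements:

```python
# Very rough DC load estimate for two GTX 690s plus the rest of the system.
gpu_w = 300    # GTX 690 board power (approx.), each
cpu_w = 180    # heavily overclocked quad-core (rough estimate)
rest_w = 100   # motherboard, drives, fans, pumps (rough estimate)

load_w = 2 * gpu_w + cpu_w + rest_w
print(f"Estimated peak load: ~{load_w} W")              # ~880 W
print(f"Headroom on an 850 W unit: {850 - load_w} W")   # negative -> undersized
```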
 
The GTX 690 has only one SLI connector,
so you can't do 4x 690.
But you can do 2x 690 + 2x 680.

I will try this with the new PSU,
but I don't think it will work because of the drivers.
 
To the people thinking their cards don't scale well:
Maybe try some SSAA and report back? :)
 

Well, I played some this weekend, and ME3's Eyefinity support sucks. It works great as long as you do not enable bezel compensation. The second you do, all hell breaks loose: cut scenes get cut off, you can't use menus properly, and the game is basically unplayable. I don't like not being able to use bezel compensation, but it's not the end of the world.
 

Yeah, honestly I stopped using bezel compensation altogether, as it seemed to break a lot of my Eyefinity gaming experiences. If you play without it, things work a lot better. That's just another reason why I am switching from the Eyefinity setup to the GTX 690. It seemed like every day I was having to redo my setup to get one game or another to work properly, and then it would break another game. It just wasn't worth the hassle. About two weeks ago I completely ditched my Eyefinity setup and just used CF on a single screen. Even that had some issues with certain games. To be fair, in the past using SLI I also experienced hiccups here and there, but nowhere near as many, and the community support for SLI has always been impressive. I will see how it goes when I venture back into that arena. I was supposed to redo my system with the new GTX 690 this weekend, but unfortunately RL had other plans.
 

I am pretty sure that is impossible; I have not seen the ability to SLI a GTX 690 with a GTX 680. Also, 2 GTX 690s and 2 GTX 680s would be more than the max for SLI, since that is 6 GPUs. The max for SLI that I am aware of is 4 GPUs, whether you use 2 690s or 4 680s. You cannot even use 3 690s, to the best of my knowledge.
 

You are correct. 4 GPUs is the max you can get with SLI or CrossfireX at present.
 
Yes, but the 4-way bridge has two double connectors (where I will attach the GTX 680s) and two single connectors (for the GTX 690s).
I don't believe it will work, but maybe just to run Heaven.
Anyway, I get much better gameplay with two 680s than with one, or worse still, two GTX 690s.
So I'm not hoping to use this setup for any game.
Every day I wait for my new PSU...
 
So is this a thread for you rich guys to gloat about having not one but two GTX 690s?






I kid, I kid! :LOL:
 
It's about how you spend your money, not about being rich.
My car is worth about as much as the two cards: a Mazda 323 with 198,000 km, and it's perfectly fine for me.
The cards not scaling is what makes this a waste.
 
Friends of mine used to tune cars; one spent two years saving up for a Peugeot 306,
to make it look better than the one in Fast and Furious.
 