GTX 670 4GB 4-Way SLI

Discussion in 'nVidia Flavor' started by fomoz, Jul 7, 2012.

  1. fomoz

    fomoz Limp Gawd

    Messages:
    394
    Joined:
    Jan 5, 2012
    Thanks! :)

    I will, as soon as I'm done with Pong and PacMan. So far, all I can say is that I've never seen Space Invaders run so smooth!
     
  2. fomoz

    fomoz Limp Gawd

    Messages:
    394
    Joined:
    Jan 5, 2012
    Hey guys, I have some updates for you. For some reason, now I get similar FPS at 1440p in Heaven with both "Span displays with Surround" and "Maximize 3D performance".

    I think I'm starting to see what's happening here and why nVidia hasn't answered my support ticket. First of all, I would like to say that I'm not an engineer. Also, I really hope that I'm wrong about this and that nVidia can fix this through a driver update. You guys can judge this for yourselves, I'll just leave it here.

    Intuitively, I would assume that if I increase the resolution by a factor of 3.15 (2560x1440 to 8064x1440, bezel corrected), then, if the cards scale well, I can approximate my FPS by taking the FPS at 2560x1440 and dividing it by 3.15. So if I get 106.8 FPS at 2560x1440, then I should get somewhere around 33.9 FPS at 8064x1440, maybe even more if I was hitting a CPU bottleneck at the lower resolution.

    Unfortunately, that's not the case. In reality, I get 23.9 FPS at 8064x1440.
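    The scaling estimate above can be written out as simple arithmetic (all numbers are from the post; this is just a sketch of the reasoning, not a benchmark):

    ```python
    # Ideal-scaling FPS estimate: FPS should fall roughly in proportion
    # to the number of pixels being rendered, if the GPUs scale well.
    low_res_pixels = 2560 * 1440        # single 1440p screen
    high_res_pixels = 8064 * 1440       # bezel-corrected 3x1440p Surround

    ratio = high_res_pixels / low_res_pixels   # 3.15x more pixels to shade

    fps_low = 106.8                     # measured in Heaven at 2560x1440
    fps_expected = fps_low / ratio      # ~33.9 FPS if scaling were ideal
    fps_actual = 23.9                   # what was actually measured

    shortfall = 1 - fps_actual / fps_expected  # ~30% below the ideal estimate
    print(f"expected {fps_expected:.1f} FPS, got {fps_actual}, "
          f"{shortfall:.0%} short")
    ```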

    Here are my results from 2560x1440 and 8064x1440 runs in Heaven 3.0. I'm using nVidia 304.79 drivers and the cards are running at stock clocks, except for a 122% power target.

    2560x1440
    [IMG]

    8064x1440
    [IMG]

    What's going on? The story becomes clearer when you look at the PrecisionX graphs. I spliced them together, since I can't fit the stats for 4 GPUs on one screen.

    2560x1440
    [IMG]

    8064x1440
    [IMG]

    Looking at GPU usage and GPU temps, it becomes clear that at 8064x1440 the GPUs aren't even close to being engaged to their full potential. It looks like GK104 doesn't have enough VRAM bandwidth for 4GB. I guess now it makes sense why nVidia originally released these cards with a 2GB frame buffer and 256-bit bus width. It also makes sense that the GK110 rumors say it will come with 4GB and a 512-bit bus.

    Overall, right now it looks like the 4GB GK104 is a scam by AIB partners. You simply can't benefit from the extra VRAM.
     
  3. atrance5

    atrance5 2[H]4U

    Messages:
    2,812
    Joined:
    Oct 3, 2008
    Dragon Age + 4x SLI = lol ( no matter what resolution )
     
  4. fomoz

    fomoz Limp Gawd

    Messages:
    394
    Joined:
    Jan 5, 2012
    Laugh all you want, I'm only getting 42 fps in Dragon Age 2 and 29 fps in Heroes VI at 8064x1440.
     
  5. yanmeng

    yanmeng [H]Lite

    Messages:
    121
    Joined:
    Jun 20, 2008
    Have you considered that you could be CPU limited at that resolution? Can you try 3-card SLI at that resolution and check what FPS you get? If you lose 20~25% FPS, you might be correct that video memory bandwidth is the limit, since 4GB is only an increase in memory size, not memory bandwidth. Theoretical bandwidth can be calculated by multiplying the memory interface width by the memory frequency and the number of SLI cards (e.g. 256-bit * 1GHz * 4). If the FPS drops by less than 20%, it's possible the system is CPU limited at that resolution in that game. It does take a lot of CPU power to drive 5376 CUDA cores (1344 CUDA cores * 4).
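    That back-of-the-envelope bandwidth figure can be sketched like this. Note the 6008 MT/s effective memory rate is my assumption for a stock GTX 670; it isn't stated in the post, and under AFR each card mostly works from its own VRAM, so the per-card number is the one that matters for a single frame:

    ```python
    # Theoretical per-card memory bandwidth: bus width (in bytes per
    # transfer) times the effective GDDR5 transfer rate.
    bus_bits = 256                    # GTX 670 memory interface width
    effective_rate_mtps = 6008        # effective GDDR5 rate, mega-transfers/s

    bytes_per_transfer = bus_bits / 8          # 32 bytes moved per transfer
    bandwidth_gbps = bytes_per_transfer * effective_rate_mtps / 1000
    print(f"{bandwidth_gbps:.1f} GB/s per card")   # ~192.3 GB/s
    ```

    Multiplying by 4 cards, as the post does, gives an aggregate figure, but it doesn't help any single frame go faster.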
     
    Last edited: Aug 9, 2012
  6. fomoz

    fomoz Limp Gawd

    Messages:
    394
    Joined:
    Jan 5, 2012
    It's not CPU limited, I tried overclocking the VRAM and I got a 10% increase in FPS from a 13.3% increase in VRAM clock (3005 to 3405). I can do the same test at 3.5GHz CPU clock when I get home, but I think that my FPS will be the same.
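    The overclock test above can be reduced to a ratio (numbers from the post): if FPS rises nearly in step with VRAM clock, memory bandwidth, not the CPU, is the likely limiter.

    ```python
    # Compare the relative VRAM clock increase against the relative FPS
    # increase. A scaling factor near 1.0 suggests a bandwidth bottleneck.
    clock_gain = 3405 / 3005 - 1      # ~13.3% higher VRAM clock
    fps_gain = 0.10                   # ~10% higher FPS (reported)

    scaling = fps_gain / clock_gain   # ~0.75: largely bandwidth-bound
    print(f"VRAM clock +{clock_gain:.1%} -> FPS +{fps_gain:.0%}, "
          f"scaling factor {scaling:.2f}")
    ```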
     
  7. cannondale06

    cannondale06 [H]ardForum Junkie

    Messages:
    16,180
    Joined:
    Nov 27, 2007
    We have known for months what the specs of GK110 will be, and that it will have a 384-bit bus, not 512-bit.
     
  8. fomoz

    fomoz Limp Gawd

    Messages:
    394
    Joined:
    Jan 5, 2012
    Oops, my bad :eek:
     
  9. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,810
    Joined:
    Jun 13, 2003
    Reposting from another thread fomoz is contributing to: he's using Z77, which limits his four cards to 4x PCIe 3.0 lanes each with added latency, both from the PLX switch and from the cards fighting for bandwidth from the CPU.

    I don't think this is a fair way to test this setup at all; he should make the effort to move to X79. No person in their right mind (I'm not picking on you bro!) should be using four GPUs on a 16-lane board, especially not top-end GPUs, and expect to get anything that could be confused with good scaling.
     
  10. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,810
    Joined:
    Jun 13, 2003
    Also, for the love of all things holy, could you please use preview thumbnails instead of the actual screenshot fomoz, and yanmeng, could you please not quote the whole picture post? Thanks!
     
  11. cannondale06

    cannondale06 [H]ardForum Junkie

    Messages:
    16,180
    Joined:
    Nov 27, 2007
    and could you use the edit button instead of making back to back posts 2 minutes apart? :p
     
  12. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,810
    Joined:
    Jun 13, 2003
    Thought about it, but I felt that a separate post was needed and made one. My prerogative :).
     
  13. fomoz

    fomoz Limp Gawd

    Messages:
    394
    Joined:
    Jan 5, 2012
    I agree with you 100%. I thought that I could save some money and get away with Z77 with a PLX switch, but I might be wrong. I'll try to find an X79 platform to test this on.

    Having said that, here are a few indications that the PLX switch might not be the bottleneck here:

    1. This setup scales well at 2560x1440. Doesn't hitting high FPS mean that I have enough bandwidth?

    2. As I mentioned earlier in this thread, increasing the VRAM clock by 13.3% (3005MHz to 3405MHz) resulted in a 10% performance increase in Heaven at 8064x1440.

    3. sk3tch has X79, GTX 680 4GB FTW 4-way SLI and 3x 1080p 120Hz. clayton006 has X79, GTX 670 4GB SC 4-way SLI and 3x 1600p. You can read clayton006's thread on EVGA forums here. sk3tch knows about this thread and I'll PM clayton006 about it. As far as I know, they're both having the same problems as me.

    4. sambatico has the same issues with GPU usage with GTX 670 4GB 4-way SLI on X79. You can see his post here.

    I'll try updating my BIOS and I'm also looking forward to trying out Virtu MVP.

    By the way, does anyone know an easy way to disable a GPU for testing? I don't feel like taking out the card and I don't think that my motherboard has a PCI-E slot disable switch.

    How do I do that? I'm linking them straight from my Dropbox.
     
  14. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,810
    Joined:
    Jun 13, 2003
    Manually, if you made them a smaller picture then linked them to the original; there may be a way to automate it.

    I read through the EVGA thread, and I never saw anyone claim to get PCIe 3.0 working right with four cards, so there's really no way to tell.

    Thing is, they did state repeatedly that you'd need 4GB cards for a >3x1200p setup. Given that a single GTX670 can reasonably drive a single 1600p monitor, I'd think that three cards would be enough for that setup, though I understand the desire to use four.
     
  15. flu!d

    flu!d Gawd

    Messages:
    562
    Joined:
    Nov 7, 2007
    Your GPU utilization is all over the place. You need to rule out a CPU bottleneck by loading up the GPUs and creating a GPU bottleneck; just go easy on the AA.
     
  16. fomoz

    fomoz Limp Gawd

    Messages:
    394
    Joined:
    Jan 5, 2012
    It's a Surround driver issue. Thanks though.
     
  17. Salamndar

    Salamndar n00b

    Messages:
    5
    Joined:
    Aug 7, 2012
    OK, for the people who own an EVGA GTX 680 FTW SLI setup and have issues with card temperatures:

    I found the solution for you.

    There seems to be a flaw in the design of the HSF bezel. nVidia usually recesses the bezel around the fan on all their cards.

    Check the fan on all these nVidia cards:

    The old 9800GTX
    [IMG]

    The trusty GTX 285
    [IMG]

    The hot, very hot GTX 480
    [IMG]

    The good old GTX 580
    [IMG]

    This recessed bezel around the fan leaves a pocket of air for the fan to breathe from when 2 or more cards are stacked together.

    With the EVGA GTX 680 FTW, EVGA made a huge mistake by making the bezel totally flat from the tip to the toe of the card.

    Take a look at the card:
    [IMG]

    Yes, totally flat, and that's totally wrong.

    I use 3 EVGA GTX 680 FTWs and have seen horrifying temperatures on the top and middle cards.

    Ranging up to 96C for the middle card and 84C for the upper card, in a room with an ambient temperature of 25C.

    I somehow managed to improve that by doing the following:

    Got myself some nice small felt pads like these:

    [IMG]

    I took 4 of the pads and used 2 on the lower card and 2 on the middle card to raise the fans of the upper cards a little and emulate the air pocket that nVidia designs in.

    [IMG]

    [IMG]

    " pardon me for the dust, I live in a city in the middle of a desert and I clean my PC once every month or so and it is still not the time to clean it. "

    Any way as you can see from the pictures I add an 800 RPM 14" Fan to insure a good stream of fresh air.

    Now the temperature never goes above 75 for the mid card and 71 for the upper card.
     
  18. clayton006

    clayton006 Gawd

    Messages:
    966
    Joined:
    Jan 4, 2005
    Hey fomoz, long time no see. My SR-2 bit the dust and now I'm back to my X79 setup, only this time on water (5.0GHz). I can now stay stable with PCI-E 3.0 enabled as long as I remove my sound card. I sold one of my GTX 670 4GB cards. If we wanted to do a slight apples-to-close-to-apples comparison, I'd be willing to coordinate with you. I don't have my 3 30" monitors hooked up together, but I'm sure I can do it, make my resolution the same as yours (try not to do bezel correction), and see if the X79 native lanes help us out here.
     
  19. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,810
    Joined:
    Jun 13, 2003
    Holy over-sized pictures Batman! Shrink those babies please!

    For the cooler design, they used 'beveled' ends back when motherboards only put one slot in between PCIe slots, which meant that they had to use a lower-profile fan as well.

    When motherboard makers started putting an extra slot in there, they added back the fan depth and made the ends square. Presumably, if you're using more than two cards, you have another solution in mind anyway (many apparently aren't that smart).

    If you're going to run two high-end cards back to back, you're going to need to either space them like you have, or water-cool them.

    This is also why case manufacturers have started putting an 8th slot position in; if only motherboard makers would put the first x16 slot in slot 1, then the second and third ones in slots 4 and 7, we'd be golden, but it seems you have to get a workstation board to get that setup.
     
  20. fomoz

    fomoz Limp Gawd

    Messages:
    394
    Joined:
    Jan 5, 2012
    By the way, 4-way SLI Surround in DX9 works fine most of the time. DX10 and DX11 have the GPU usage issue, I think it might be related to one of the features in their API.
     
    Last edited: Aug 9, 2012
  21. atrance5

    atrance5 2[H]4U

    Messages:
    2,812
    Joined:
    Oct 3, 2008
    Hope you didn't dump 2k for Dragon Age in particular.

    An RPG disgrace.
     
  22. Salamndar

    Salamndar n00b

    Messages:
    5
    Joined:
    Aug 7, 2012
    Here are some pics for the GTX 680 FTW 4GB naked before applying some new thermal compound.

    [IMG]

    [IMG]

    [IMG]

    [IMG]

    [IMG]

    [IMG]


    Overall, I didn't like the quality of the HSF at all!
     
  23. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    8,810
    Joined:
    Jun 13, 2003
    I'm confused- what's wrong with it?
     
  24. Salamndar

    Salamndar n00b

    Messages:
    5
    Joined:
    Aug 7, 2012
    The HSF base was full of scratches and not flat at all. The fan cables on one card were so short that they were cut as soon as the card was cracked open, so I have to solder them back tomorrow.

    Also, some screws were stripped and took some effort to remove, while others were so tight and stripped that I was not able to unscrew them at all.

    I am thinking of water cooling the cards with a universal GPU block later this year!
     
  25. Swolern

    Swolern Limp Gawd

    Messages:
    196
    Joined:
    Aug 24, 2012
    Any progress on this setup?
     
  26. balance101

    balance101 Limp Gawd

    Messages:
    411
    Joined:
    Jan 31, 2008
    how much power do you pull from the wall? :p
     
  27. fomoz

    fomoz Limp Gawd

    Messages:
    394
    Joined:
    Jan 5, 2012
    Supposed to be fixed in 310.xx.
     
  28. sk3tch

    sk3tch [H]ard|Gawd

    Messages:
    1,443
    Joined:
    Sep 5, 2008
    For real? Any add'l details on this? When is 310.xx's ETA? 2013??

    My 3960x (4.5 GHz, 1.48v) and 4x GTX 680 Classifieds (no OC) maxes out (thus far) at 855W per my EVGA NEX1500's SuperNova software (which, reportedly, is not very accurate - but close enough). Power efficient cards.
     
  29. Krazypoloc

    Krazypoloc Gawd

    Messages:
    759
    Joined:
    Jul 23, 2010
    I'm finally getting my 3rd card back from RMA this week (it came DOA). Excited to see if I hit any bottlenecks on the driver, CPU, memory bus, etc.
     
  30. clayton006

    clayton006 Gawd

    Messages:
    966
    Joined:
    Jan 4, 2005
    I split up my Nvidia Surround setup; right now I'm playing on one 120Hz 1080p monitor and love it. When Dead Space 3 comes out, though, I'll probably put them all back together again. Let us know how it works out for you.