NVIDIA Kepler GeForce GTX 680 SLI Video Card Review @ [H]

Great review as usual, guys! Any timeline on the overclocking article? That's the one I'm waiting for the most. Any plans for watercooling them as well? I have a feeling these cards may do very well with "custom cooling" that lowers temps and lets the dynamic overclocking push the card harder.
 
All I'm going to say is AMD desperately needs a turbo boost.

No, what they need is to buckle down on getting their drivers in order. That was the only reason I didn't get a 7970 when they came out. Now I'm glad I didn't. I'm sticking with Nvidia for the foreseeable future.
 
I won't lie, I felt that NV also provided slightly smoother gameplay at lower fps; I guess I didn't notice the micro-stutter as much. However, I really disliked having to close some programs just to modify my multi-monitor setup at all, as in adding a new custom resolution or changing SLI settings. With AMD there's no need for that.
 
Great review! Serious firepower... I know this is [H], but I think a serious argument can be made that we are running short of games that push the new gaming engines.

...just had to laugh when I read this part though, "Two GeForce GTX 680 cards will cost you $1000 and two Radeon HD 7970 cards will cost you $1100. When you are spending so much money on a graphics solution you want every dollar to count, and to save those dollars where possible."

I dunno, for those who need to game at 5760x1200 and up, not sure $100 is an issue for these folks.
 
Yeah, Witcher 2 and Shogun 2 are also pretty demanding.

Despite being way better looking and more demanding than most of the games they test, I don't think they'll ever use Witcher 2, since it doesn't support Eyefinity. There are rumors that the Enhanced Edition coming April 17 will have DX11 enhancements though, so we'll see.
 
...just had to laugh when I read this part though, "Two GeForce GTX 680 cards will cost you $1000 and two Radeon HD 7970 cards will cost you $1100. When you are spending so much money on a graphics solution you want every dollar to count, and to save those dollars where possible."

I dunno, for those who need to game at 5760x1200 and up, not sure $100 is an issue for these folks.

I dunno about you, but I've managed to keep fairly current on my hardware and $100 is a lot of money to me. Not everyone who has top-end hardware is necessarily rich and can just throw money away. A lot of people save up and that extra $100 is another $100 you'd have to spend rather than saving it for something else.
 
Despite being way better looking and more demanding than most of the games they test, I don't think they'll ever use Witcher 2, since it doesn't support Eyefinity. There are rumors that the Enhanced Edition coming April 17 will have DX11 enhancements though, so we'll see.

It doesn't officially support Eyefinity, but I play it in Eyefinity =p In portrait, though; not sure about landscape. I do know that portrait is 100% broken on NV cards.
 
Fantastic results really. Faster, uses less power, and is cheaper. I really have the upgrade itch but simply can't afford it at the moment. But if I could GTX 680 is what I'd get for sure. And absolutely incredible speed on Skyrim. I play that a lot, along with other Bethesda games so high performance on those is definitely a factor for me.
 
I dunno about you, but I've managed to keep fairly current on my hardware and $100 is a lot of money to me. Not everyone who has top-end hardware is necessarily rich and can just throw money away. A lot of people save up and that extra $100 is another $100 you'd have to spend rather than saving it for something else.

I'd argue "fairly current" vs. $1000 on just-released dual video cards is not the same thing... maybe I can agree if you're saying you're dumping your 570s for 680s, which is what folks are doing when they buy these in SLI.

At any rate, it's all perspective.
 
I'd argue "fairly current" vs. $1000 on just-released dual video cards is not the same thing... maybe I can agree if you're saying you're dumping your 570s for 680s, which is what folks are doing when they buy these in SLI.

At any rate, it's all perspective.

Well, let's put it this way also: I COULD afford to outright buy two GTX 680s right now. Am I going to? No, because that would be a foolish way to spend money I have in savings.

I agree it is all perspective, but just because someone CAN spend $1000 on video cards doesn't mean a difference of $100 (or even $50) means nothing to them.
 
Great review. I was very surprised by NVIDIA's lead in multi-monitor / high resolution configurations. My assumption of the 680's inferiority, based solely on the 2GB VRAM limitation, was completely incorrect.
 
Two 680s with my current build. Think my PSU can handle it, or am I cutting it a little close?
 
I think the 680 is the clear winner in this generation. Lower power consumption, better performance, higher efficiency, higher memory efficiency, and (for now) lower price. Three cheers for nVidia!
 
The German website www.pcgh.de has been reporting on the responsiveness issues with SLI/CFX for years; they call it "micro-stuttering". I believe they had a method for tracking the intervals between frames.

Nice review, btw. Sitting on a single 680 here with no intention to go multi-card, but a good read nonetheless. :)
 
Brent really said it: "Obviously AMD’s driver engineers need to figure out how to utilize the hardware more efficiently."

Seriously, drivers are the thing AMD needs to work on right now, and I don't know why they aren't more serious about it. AMD has been producing some great hardware as of late: the 4000 series was really solid and good for the price, the 5000 series beat nVidia to the DX11 punch by months, and the 7000 series seems to be extremely well designed with lots of cool features.

However, nVidia keeps killing it in the driver arena. They have better OpenGL drivers (not that there are a ton of GL games, but they are there), better features, better per-app control, and better multi-card scaling.

It isn't as though nVidia is the only company that can do this. It's not like there's one guy in the whole world who can write good drivers and nVidia has him or something. AMD needs to do whatever it takes. Maybe they need to bring more people in, maybe they need to replace some existing people, maybe they need to change their development style. I don't know, but they need to get on it.

The reason I am such an nV fan is the drivers. I just don't have the same problems with them.
 
The German website www.pcgh.de has been reporting on the responsiveness issues with SLI/CFX for years; they call it "micro-stuttering". I believe they had a method for tracking the intervals between frames.

Nice review, btw. Sitting on a single 680 here with no intention to go multi-card, but a good read nonetheless. :)

I think you can record it with Fraps; the data you're after is the "frametime", I believe.
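For anyone who wants to actually crunch those numbers: here's a rough sketch of how you might process a Fraps frametimes log to spot micro-stutter. I'm assuming the usual two-column "Frame, Time (ms)" CSV that Fraps dumps; adjust the parsing to whatever your log actually looks like.

```python
# Compute per-frame intervals from a Fraps-style frametimes CSV and flag
# micro-stutter as large swings between consecutive frame intervals.
import csv

def frame_intervals(path):
    """Read cumulative frame times (ms) and return deltas between frames."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header row
        times = [float(row[1]) for row in reader if len(row) >= 2]
    return [b - a for a, b in zip(times, times[1:])]

def stutter_report(intervals):
    """Average interval, plus how often back-to-back intervals swing a lot."""
    avg = sum(intervals) / len(intervals)
    swings = [abs(b - a) for a, b in zip(intervals, intervals[1:])]
    bad = sum(1 for s in swings if s > 0.5 * avg)  # arbitrary threshold
    return avg, 100.0 * bad / len(swings)

# Made-up numbers: alternating 10ms/30ms frames feel worse than a steady
# 20ms even though both average out to 50 fps.
print(stutter_report([20.0] * 10))      # → (20.0, 0.0)
print(stutter_report([10.0, 30.0] * 5)) # → (20.0, 100.0)
```

That's the whole point of the micro-stutter complaint: average fps can look identical while the interval-to-interval swing is completely different.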
 
Two 680s with my current build. Think my PSU can handle it, or am I cutting it a little close?

Depends on how old the PSU is. If it's new then it's fine. But if it's old you might have issues. I had a 750W PSU that could barely output 600W without shutting down after 5 years.
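If you want a rough sanity check, a quick back-of-envelope script like this works. Every wattage below is my own guess for illustration (195W is NVIDIA's stated TDP for a single 680), so plug in your actual parts, and the derate factor for an aging PSU is just an assumption:

```python
# Back-of-envelope PSU headroom check for a GTX 680 SLI build.
# All wattages are rough illustrative guesses; substitute your own parts.
parts = {
    "GTX 680 #1": 195,               # NVIDIA's TDP figure for one card
    "GTX 680 #2": 195,
    "CPU (overclocked quad)": 130,
    "board/RAM/drives/fans": 80,
}

def headroom(psu_watts, parts, derate=0.8):
    """Treat an aging PSU as only delivering `derate` of its label rating."""
    load = sum(parts.values())
    usable = psu_watts * derate
    return load, usable, usable - load

load, usable, margin = headroom(750, parts)
print(f"Estimated load: {load} W, usable: {usable:.0f} W, margin: {margin:.0f} W")
```

With those guesses a tired 750W unit has basically zero margin, which matches the "barely output 600W after 5 years" experience above.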
 
Nvidia has released, or will release soon, a tool that will help track down "micro-stutter". I'll look for the link.

I'd definitely be interested in this tool. I think 680s would be particularly prone to micro-stuttering, because my cards always run at different clocks in games due to GPU Boost.
 
Petersen told us Nvidia is considering creating an API that would permit third-party applications like Fraps to read display times from the GPU. We hope they do, and we'll lobby AMD to provide the same sort of hooks in its graphics drivers. Beyond that, high-speed cameras might prove useful in measuring what's happening onscreen with some precision. (Ahem. There's a statement that just cost us thousands of dollars and countless hours of work.)

that is from http://techreport.com/articles.x/21516/12

Seeing as this was written six months ago, I'm not hopeful that this has been done...
 
Well, that says they were "considering" it. So basically, who knows if it will ever exist or not.
 
Hurrr I NEEDZ MORE VRAMZZZ Durrrr...Once again, proves that this whole "I need 3000000000000gb of Vram" is just horseshit. (7680x1200 will probably need it)
 
Looks like AMD's driver development, or lack thereof, is getting spanked BIG time here... The hardware seems superior, but their software integration deficiencies are really getting highlighted with the 7970 cards. That's hard to overlook; when a company asks a premium price for its products, it had better deliver.

In this case Nvidia delivers, and AMD is handicapping itself. I now have to reconsider waiting for a 4GB 680 instead of a 6GB 7970.

Good review [H].
 
Hurrr I NEEDZ MORE VRAMZZZ Durrrr...Once again, proves that this whole "I need 3000000000000gb of Vram" is just horseshit. (7680x1200 will probably need it)

I know. Can we please put the VRAM debate to bed now - it is pretty clear that 2GB is enough.
 
Brent really said it: "Obviously AMD’s driver engineers need to figure out how to utilize the hardware more efficiently."

Seriously, drivers are the thing AMD needs to work on right now, and I don't know why they aren't more serious about it. AMD has been producing some great hardware as of late: the 4000 series was really solid and good for the price, the 5000 series beat nVidia to the DX11 punch by months, and the 7000 series seems to be extremely well designed with lots of cool features.
Remember that AMD laid off a lot of engineers, in both the hardware and software departments, sometime last year, right around the time the 7000 series was in its finalizing stage. Maybe that's had an effect.

A topnotch product with mediocre support means no performance delivered.


 
I know. Can we please put the VRAM debate to bed now - it is pretty clear that 2GB is enough.

I think I'm starting to see that it's deeper than that. It's probably that 2GB is enough for Nvidia. These guys are apples and oranges, and Nvidia seems to use VRAM better.

Nvidia: "The Way VRAM Was Meant to Be Played"
 
Well, multiplayer BF3 with 2x AA ran into VRAM issues :p

Yet still outperformed the 7970 with higher ingame settings.

H SLI Review said:
Once we did all that, then it was only playable at FXAA, just the same as GTX 680 SLI. When we increased to 2X and 4X MSAA, performance was choppy and unplayable. Even though 7970 CFX has more memory, and a higher memory bandwidth, that didn't do anything for us here in multiplayer over GTX 680 SLI. While both setups were at FXAA, GTX 680 SLI had the better experience since HBAO was enabled.
 
Hurrr I NEEDZ MORE VRAMZZZ Durrrr...Once again, proves that this whole "I need 3000000000000gb of Vram" is just horseshit. (7680x1200 will probably need it)

I know. Can we please put the VRAM debate to bed now - it is pretty clear that 2GB is enough.

It's good to have these debates, because frankly it's one of the most frustrating things when your card runs out of VRAM and you know it could go harder if it had more.

I remember when I bought my 320MB 8800 GTS: in all games, at any resolution I intended to run at, it didn't hit a VRAM wall compared to the 640MB version. Within a year, new games had been released that took advantage of the 512MB, 640MB, and 768MB of high-end cards (which a couple of years earlier had been 256MB), and the 320MB card just hit the wall at the same resolutions it had played everything maxed out at fine when I bought it.

So it's good to have a healthy skepticism when it comes to how much VRAM you need, and it's good to go a bit higher than you "need", as inevitably the games that come out a year from now will be more VRAM-hungry than the current ones.
I think I'm starting to see that it's deeper than that. It's probably that 2GB is enough for Nvidia. These guys are apples and oranges, and Nvidia seems to use VRAM better.

Nvidia: "The Way VRAM Was Meant to Be Played"
I've been suspicious of Nvidia using VRAM more efficiently for a couple of generations now.
 
Lol, "suspicious"...you make it sound like a bad thing. :p

LOL, well, just saying I don't have any evidence. I just remember my ATI card choking at high resolutions sooner than my mate's computer with an Nvidia card of much the same VRAM, so I felt the 1GB in my ATI wasn't worth as much as the 1GB in his Nvidia... could have been other issues though, hence just a suspicion ;)
 
Remember that AMD laid off a lot of engineers, in both the hardware and software departments, sometime last year, right around the time the 7000 series was in its finalizing stage. Maybe that's had an effect.

A topnotch product with mediocre support means no performance delivered.

Well, AMD drivers have always sucked, including back when they were ATi. Hell, there was a time I wouldn't touch ATi cards AT ALL because their drivers were such a disaster. They've gotten way better since then; I've owned two ATi cards so far (a 9800 and a 5870) and I haven't regretted the purchases. However, they still can't get it up to nVidia's standards. It seems like they did a good job improving for a few years and then kinda stopped, and quality has been the same since.

This isn't an area they should be doing layoffs in, and they need the GPU market. Also, if they got their shit together on this, maybe they could get more into the GPGPU market. Part of the problem there was, again, drivers. nVidia provided an extremely full-featured, well-supported way to use their GPGPU features, CUDA, very soon after the hardware was there to do it. ATi fucked around for a long time with nothing useful (Stream was awful) before finally settling on OpenCL, but then actually got beaten to the punch by nVidia, who had working CL drivers before AMD did.

They need to get their software game up to spec. I love their hardware but I find I buy nVidia most of the time because I want the better software.

That's why I have a 680. When the 7000 series came out I liked it. However I decided after the problems I'd had, best to just wait and see on nVidia's offering. So now I have a 680 and am happy as can be.

Beating nVidia to the hardware punch doesn't help if the software doesn't back it up.
 
Ehh... no. If anything, maybe 4x started to a tiny bit, but again, it was not really an issue.
No, 2x MSAA also went over the limit...
When we tried to turn on 2X MSAA it brought the FPS down to 40 FPS average and at 4X AA the game was completely choppy and unplayable. What you can see from the table is that at FXAA we are already tapping the full potential of the VRAM capacity at 2012 MB, going to 2X or 4X only takes us even higher over the capacity.

Yet still outperformed the 7970 with higher ingame settings.

Yes, it was able to have HBAO on, but no MSAA (we are talking about the 2GB VRAM limitation, are we not?)


So while currently VRAM issues are not widespread, there are a couple of instances where you can go over 2GB.

1 being BF3 multiplayer with MSAA. Also, BF3 is launching new map packs soon, with even "bigger" maps; I'm wondering if this will make a difference, since the 680s are already close to maxing out with just FXAA.

2 being Skyrim with mods, which will break 2GB and hit around 2400-2500MB of VRAM usage.

Either way, for 90% of scenarios 2GB still seems to be fine, just like it was with the 6970s. Just making the point that there are limitations you can run into.
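If anyone wants to check whether their own games actually bump the 2GB ceiling, something like this against a logged VRAM column works. I'm assuming a plain CSV log with a numeric "memory usage" column in MB (roughly what you can export from a hardware-monitor log, e.g. MSI Afterburner); adjust the column name to whatever your logger writes.

```python
# Report peak VRAM usage and the share of samples sitting near the ceiling,
# from a CSV log assumed to have a numeric "memory usage" column in MB.
import csv

def vram_stats(path, column="memory usage", ceiling_mb=2048):
    """Return (peak MB, % of samples within 5% of the ceiling)."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    samples = [float(r[column]) for r in rows if r.get(column)]
    peak = max(samples)
    near = sum(1 for s in samples if s >= 0.95 * ceiling_mb)
    return peak, 100.0 * near / len(samples)

# Hypothetical usage against your own log file:
# peak, pct = vram_stats("vram_log.csv")
# print(f"Peak VRAM: {peak:.0f} MB, near the 2GB ceiling {pct:.1f}% of the time")
```

A high "near ceiling" percentage is a better warning sign than the peak alone, since BF3-style engines will happily fill whatever VRAM is available.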
 
Playing devil's advocate here: if you were Nvidia, with less VRAM, how would you manage it when you were close to, or over, capacity? Turn down some things that use VRAM. Is it possible they are throttling the intensity of settings like SSAO/HBAO etc., the less obvious ones, particularly in faster-paced multiplayer games, to keep the frame rate high? By prioritizing AA/AF, etc.?
 
No, 2x MSAA also went over the limit...

Stop looking at the numbers and look at the play experience. It's been shown many times that BF3 uses as much VRAM as possible, so just because the VRAM usage was at 2GB doesn't mean it was at the limit. The frame rates dropped at 2x, but it didn't run out of VRAM (that was at 4x), and even with only 2GB it still offered a better play experience at all settings. 4x AA at those resolutions isn't playable anyway, so the VRAM limit makes no difference (unless you like playing BF3 at 30 fps).

Of course you can drive the card over the VRAM limit, just like you can the 7970 - but if the frame rates are already on the floor, does it really matter? The bottom line is that the 2GB of VRAM on the GTX 680 does not impact its performance compared to the 3GB on the 7970.
 