NVIDIA Kepler GeForce GTX 680 Video Card Review @ [H]ardOCP

Wow, that Adaptive VSync feature looks sweet! The question is how much the 4GB cards are gonna cost; could be a nice quad-SLI solution for surround 1440p :)
 
It already overclocks itself though

From HOCP
"GPU Boost is guaranteed to hit 1058MHz in most games. Typically, the GPU will be going much higher. We experienced clock speeds in demo sessions that would raise to 1.150GHz and even 1.2GHz in such games as Battlefield 3"

so they only got it 64 MHz above that... not a lot of room to move.
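For anyone curious how the auto-overclocking works, GPU Boost is basically a power-driven governor. Here's a toy sketch in C of that policy, not NVIDIA's actual firmware; the 13 MHz bin size, the 1200 MHz ceiling, and the power samples are illustrative assumptions, while the 1006 MHz base / 1058 MHz boost / 195W board power come from the spec sheet:

```c
/* Toy model of the GPU Boost policy: step the clock up in bins while
 * the card stays under its power target, step back down when it
 * doesn't. NOT NVIDIA's algorithm -- bin size and ceiling are made up. */
#include <stdio.h>

#define BASE_MHZ  1006   /* GTX 680 base clock */
#define BOOST_MHZ 1058   /* guaranteed boost clock */
#define STEP_MHZ  13     /* hypothetical boost bin */
#define MAX_MHZ   1200   /* hypothetical ceiling, per the clocks seen above */
#define POWER_TGT 195.0  /* board power target in watts */

static int next_clock(int mhz, double watts)
{
    if (watts < POWER_TGT && mhz + STEP_MHZ <= MAX_MHZ)
        return mhz + STEP_MHZ;              /* headroom: clock up */
    if (watts >= POWER_TGT && mhz - STEP_MHZ >= BASE_MHZ)
        return mhz - STEP_MHZ;              /* over target: back off */
    return mhz;
}

int main(void)
{
    double draw[] = { 150, 170, 180, 196, 199, 185 };  /* fake samples */
    int mhz = BOOST_MHZ;

    for (int i = 0; i < 6; i++) {
        mhz = next_clock(mhz, draw[i]);
        printf("power %3.0f W -> clock %d MHz\n", draw[i], mhz);
    }
    return 0;
}
```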

Check that xbit link that was posted; the 7970 OC kept up with the 680 very well IMO. What AMD needs to do now is release a driver that does some auto overclocking.


Right, just showing that it has the headroom there if you want to go at it manually. Personally I doubt I'll feel the need to do a manual overclock on a 680.
 
Very interesting review.

I'm looking forward to the 680 SLI review. Currently I'm running 580 3GB SLI and I'm curious to see what sort of performance advantage 680 SLI brings to the table.
 
Really depends on your situation. Running a 1680x1050 display? Stick to the ~$200 solutions. Running 2560x1600, you either run a couple of those $200 cards in SLI/CF and deal with multi-GPU issues, or you buy a $500 card.

Well the current standard for displays is 1920x1080 or 1920x1200. When you look at the benchmarks, none of the cards listed can run a game like Battlefield 3 maxed out at anywhere near 60fps with a resolution of 2560x1600. So people who really want to crank it up would have to spend even more ridiculous amounts of money on GPUs.

As far as 1920x1080 goes, I can smoothly play Battlefield 3 at that resolution completely maxed out, except for not using MSAA, on my 6950 (unlocked to 6970 specs). That card cost me about $230 when I bought it last summer.


All in all the value proposition of cards like this isn't really working out even if you're using a resolution of 2560x1600.
 
Great to see nVidia winning the efficiency battle this round. Plus, pricing their card lower.

Price cuts by AMD to follow.
 
Looking good nvidia -- I'll give you that.

Would I say that it was "worth the wait"? No. If people are really calling +4 FPS in certain games a "total smackdown" for nvidia, then they need to re-examine what 4 FPS means in the grand scheme of things.

The cards are for all intents and purposes equal -- a few percent here or there isn't a big deal. What nvidia has going for it is the price point. $499, while expensive, *is* more attractive than $549. And these days, the price is probably what's MOST important.

If AMD can afford to knock 50 dollars off the price then it becomes a VERY hard decision.

nvidia has better drivers
AMD has better overclocking headroom it seems
nvidia is great at folding
AMD is great at bitcoin.

Some people prefer green over red and vice versa. It's a damn close call no matter how you look at it. If AMD lowers the price 50 bucks then I say good for everyone; early adopters always get to pay the premiums, and 50 bucks isn't that big of a hit in the grand scheme of things.

Me? I'll be rocking my 6970+6950 for at least another year, as that's more than enough power @ 1920x1080.

Glad nvidia has a horse in the race now; late start, but it will be fun to see where it goes.
 
I know I'm several hours late to the discussion and it's probably been commented on several times, but I did enjoy reading Kyle's statement "Plenty in stock too." and then clicking the link to see it straight-up out of stock, with auto-notify buttons =).
 
With Adaptive VSYNC turned on your games will maximize their framerates to your monitor's refresh rate, so you won't experience tearing. However, if your framerate drops below the refresh rate, VSYNC will kick into real-time FPS mode and deliver the actual real-time FPS being rendered rather than instantly dropping to 30 FPS. You won't experience tearing below your refresh rate...
Are you certain of this? This is not how NVIDIA's swap tear extension worked. Output was not synchronized below the refresh rate, so tearing was still possible.
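For what it's worth, the policy as the review describes it boils down to something like this (a minimal sketch in C, not driver code; whether scanout really stays synchronized below the refresh rate is exactly the point in question):

```c
/* Sketch of the Adaptive VSync policy as described in the review:
 * wait for vblank while the renderer keeps up with the refresh rate,
 * stop waiting (instead of halving to 30 FPS) when it falls behind. */
#include <stdio.h>

#define REFRESH_HZ 60.0

/* 1 = swap waits for vblank this frame, 0 = swap immediately. */
static int wait_for_vblank(double current_fps)
{
    return current_fps >= REFRESH_HZ;
}

int main(void)
{
    double fps[] = { 75, 62, 58, 41, 60 };

    for (int i = 0; i < 5; i++)
        printf("%2.0f FPS -> vsync %s\n", fps[i],
               wait_for_vblank(fps[i]) ? "on" : "off");
    return 0;
}
```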
 
Couldn't help but notice that nVidia stock is barely up (almost hasn't changed) yet AMD stock has fallen from yesterday's $8.10 to about $7.95 :p.
 
As impressed as I am, I'll be sticking with my 6950 for now. I can't spend $500 on a GPU. I spend half as much on CPU which ends up lasting me twice as long. We might not see price reductions either if this is the only GPU that NVDA is releasing until the 700 series.
 
Well, I must say that I'm pleased with what I've seen of the 680. I really didn't expect them to price the 680 lower than the 7970, but they did. It seems to be faster and use less power as well.

With that said, I still won't pay that much for a video card. Hopefully this forces AMD to drop the prices of their entire lineup. I'll be waiting to see what Nvidia brings to compete with the 7800 cards.

My only concern is that AMD might try to convince themselves that the 3GB of VRAM justifies a higher price even though it apparently does not affect performance.
 
As impressed as I am, I'll be sticking with my 6950 for now. I can't spend $500 on a GPU. I spend half as much on CPU which ends up lasting me twice as long. We might not see price reductions either if this is the only GPU that NVDA is releasing until the 700 series.

There will be others coming in the next few months. The next chip or two down will be GK104s with a few SMs disabled; after that we'll see the GK114 (name?), which will be roughly half the size of the GK104. The GK100 (name?) will probably come out as a 700-series part and be roughly twice the die size of what we have today. It's nVidia's usual pattern, but mixed up a little to take advantage of AMD's current offerings.
 
This is kind of what I was thinking. And I guess the question is irrelevant for the 680 because performance is so high. But for the slightly lower end cards, where there are a lot of 1200p tests anyway, it might be interesting to get the reviewers' opinions on the benefits of having over 60 FPS.

I have no personal experience with a 120Hz display, but I see a lot of forum posts about how great games (and everything else) are at higher frame rates. I guess these people can figure it out on their own based on the FPS graphs, like you said.

And I agree about the FPS graphs; I don't even bother with the useless bar graphs they put up on every other site. Sure there are fewer comparisons here, but what they do have is far more useful than what everyone else does.

The human eye is both amazingly complex and amazingly simple, and how the mind handles the information it gives varies between users. A lot of it can also be mental. So the question is how high a refresh rate it takes for a given user to see no difference from one refresh rate to the next, and that depends on the user. Way back in the day with CRTs, users would flip out if they couldn't do 1600x1200 at 100Hz or higher; otherwise they would get dizzy (like people sensitive to a dying halogen light). While LCDs don't have a perceivable flicker difference across refresh rates, allowing higher FPS without tearing could have the same effect on those users.

Sometimes it comes off as audiophile-style wondering: do they really see the difference, or, because they know there is a technical difference, do they allow themselves to be tricked into believing they notice it? Sometimes I wonder if both of those are borderline cases of wine snobbery, where they are so dead set on the idea that their eyes, ears, or taste buds are so much better than everyone else's that they imagine the difference, backed up by the technical information.
 
eduncan911 said:
Could [H]ardforum post some graphs of the VRAM usage at 5760x1080?

I constantly exceeded 2.7GB (2.8GB at times) on my 3-way GTX 580s w/ 3GB of VRAM, which I just sold off in anticipation of the GTX 680s. But that 2GB VRAM limit really has me concerned, especially after seeing what happened with VRAM limits on my previous 2-way 470s with only 1.2GB of VRAM: the system would play nicely at 50 to 60 FPS, then drop into the 20s at times, then back up to the 50s, just like the graphs here show for the single-card GTX 680 at 5760x1080.

I'm wondering if it is bumping up against the VRAM limit with the Ultra texture settings.

This. I sold my two 470s because of this very issue. I now run a single 6970 2GB, and in BF3 multiplayer I often push 1.8GB of usage.
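(If anyone wants to log their own numbers, here's roughly how I'd script it: a minimal sketch against NVIDIA's NVML library, which is an assumption on my part about the cleanest route; it reads the same per-board counter the monitoring overlays report.)

```c
/* Minimal VRAM-usage probe via NVML (compile: gcc vram.c -lnvidia-ml).
 * Reads the same counter that monitoring tools report. */
#include <stdio.h>
#include <nvml.h>

int main(void)
{
    nvmlDevice_t dev;
    nvmlMemory_t mem;

    if (nvmlInit() != NVML_SUCCESS)
        return 1;
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS &&
        nvmlDeviceGetMemoryInfo(dev, &mem) == NVML_SUCCESS)
        printf("VRAM used: %.2f GB of %.2f GB\n",
               mem.used / 1073741824.0, mem.total / 1073741824.0);
    nvmlShutdown();
    return 0;
}
```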

Great review as always though guys! [H]ard is the only site I place much stock in when making purchasing decisions. Great reviewers and great community! Love the 'real world' gaming #'s.

Seconded.

Plus, could we have some compute benchmarks using the Blender/Cycles renderer?

Sounds like we have 3 votes! Time to ramp up the tests again [H]ard! ;)
 
As impressed as I am, I'll be sticking with my 6950 for now. I can't spend $500 on a GPU. I spend half as much on CPU which ends up lasting me twice as long. We might not see price reductions either if this is the only GPU that NVDA is releasing until the 700 series.

Yea, same here. I upgraded a month ago from a GTX 260 to a GTX 560 Ti 448 for $215 and don't regret it at all. No way I'm spending $500 on a GPU, ever. $250 is my limit, and I only upgrade when I can't play the games I want to any longer. I can't imagine upgrading from dual 580s to dual 680s. Seems like such a waste.
 
Yea, same here. I upgraded a month ago from a GTX 260 to a GTX 560 Ti 448 for $215 and don't regret it at all. No way I'm spending $500 on a GPU, ever. $250 is my limit, and I only upgrade when I can't play the games I want to any longer. I can't imagine upgrading from dual 580s to dual 680s. Seems like such a waste.

/perception
 
nVidia has done the impossible... they are making me think about jumping teams. I really have nothing against nVidia, except their BS marketing and slightly underhanded tactics dumping money into game studios to optimize for their cards, but then again who doesn't.

However, I do see a green card in my future now. Hoping the mid-range segment is just as good. Good thing I'm broke, so I can wait for both teams to flesh out these monsters.

Thanks for another great review [H]!! Look forward to the rest.
 
I'm definitely impressed by the performance in some spots and thank you for including 1080 (well, 1200) and 1600 resolutions.
I'm surprised that the 680 performs so much better at the lower resolution and, as you said in your conclusion, it brings up some interesting questions about both cards.

The big thing though, and I can't believe I'm going to say this: Nvidia finally delivered a card that performs great without having to harness the energy of a supernova. Now they need to bust ass and get lower-priced cards out there.
 
Please? Somebody tell me when, or with which driver update, single-card GeForce Surround became possible?

For instance, can you use a single regular (non-Galaxy special edition) GeForce 570 for Surround? Was this done with a driver update, or is it an entirely new feature for the 680?


I must be living under a rock. :confused:
When did nVidia start supporting Surround on a single video card?!

I thought you had to have two GeForce cards to do that?
 
I am thoroughly impressed with the GTX 680. I thought the performance would not be as great beyond 1080p due to the 2GB and 256-bit bus. Also, Kyle gave a vague comment in another thread that in 7970 vs 680, 1080p goes to the 680 but beyond that is a different story. The story doesn't look TOO different at higher resolutions.

Of course, all of this is comparing stock clocks. I'm very interested to see how OC 680 compares vs OC 7970. Either way, Nvidia is playing very smart by releasing this card at $500 to undercut AMD instead of $600.
 
Yea, same here. I upgraded a month ago from a GTX 260 to a GTX 560 Ti 448 for $215 and don't regret it at all. No way I'm spending $500 on a GPU, ever. $250 is my limit, and I only upgrade when I can't play the games I want to any longer. I can't imagine upgrading from dual 580s to dual 680s. Seems like such a waste.

Most feel the same, but not all.

Which is good, because those folks who have $1000 to blow on dual video cards push the tech so we can get faster single cards at the mid-range.

Win-Win.
 
Please? Somebody tell me when, or with which driver update, single-card GeForce Surround became possible?

For instance, can you use a single regular (non-Galaxy special edition) GeForce 570 for Surround? Was this done with a driver update, or is it an entirely new feature for the 680?

It wasn't mentioned at all in the article as a new feature, even though the benchmarks show results in surround for a single 680.

You'll need the 6x0 lineup for single-card Surround.
 
Yea, same here. I upgraded a month ago from a GTX 260 to a GTX 560 Ti 448 for $215 and don't regret it at all. No way I'm spending $500 on a GPU, ever. $250 is my limit, and I only upgrade when I can't play the games I want to any longer. I can't imagine upgrading from dual 580s to dual 680s. Seems like such a waste.

I feel the same in that I wouldn't pay that much for a GPU. However, I can understand why people do. I mean, it's not that big of an investment for the enjoyment you will derive over the next year or two. Plus, a lot of people have more income than I do.
 
I am seriously considering the 680 now over the 7970. The price changes for AMD over the next month or two will be the deciding factor.
 
I feel the same in that I wouldn't pay that much for a GPU. However, I can understand why people do. I mean, it's not that big of an investment for the enjoyment you will derive over the next year or two. Plus, a lot of people have more income than I do.

Yea, and I'm sure there are those that think spending $250 on a video card when you can get nearly an entire computer for that price is crazy.
 
I am seriously considering the 680 now over the 7970. The price changes for AMD over the next month or two will be the deciding factor.

I think paying an extra $50+ for a 7970 over a 680 makes pretty much zero sense unless you are using your card for something other than gaming. If the 7970 were priced about $100 cheaper, or even the same as the 680, I think you could go either way.
 
nVidia has done the impossible... they are making me think about jumping teams. I really have nothing against nVidia, except their BS marketing and slightly underhanded tactics dumping money into game studios to optimize for their cards, but then again who doesn't.

However, I do see a green card in my future now. Hoping the mid-range segment is just as good. Good thing I'm broke, so I can wait for both teams to flesh out these monsters.

Thanks for another great review [H]!! Look forward to the rest.

Honestly, if nVidia is willing to dump extra money into top-tier games to push their tech and they win out, I see that as being a good thing. It means they are willing to give money to developers so people on team green get a better experience. You can't hold that against them just because AMD doesn't do the same.
 
You play on our Metro server, Bussiness6!!!
I just realized that.
The 7970 can command the $490 price range due to the VRAM.
The 7950 should be $350-400.
 
Honestly, if nVidia is willing to dump extra money into top-tier games to push their tech and they win out, I see that as being a good thing. It means they are willing to give money to developers so people on team green get a better experience. You can't hold that against them just because AMD doesn't do the same.

Then there's the AMD track record (or lack thereof) with regard to driver support for A-list titles at launch: "Hey, here's a new beta driver that might make your game playable... we'll have a real release... sometime. Maybe."
 
Can Brent (or anyone) who gets one test out CUDA performance (or Folding@home, if the client supports it yet... probably won't)? They could run CUDA-Z or something. I want to see the GPGPU performance difference over the older generations before I commit. :D
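In the meantime, here's the sort of quick-and-dirty probe I mean: a rough host-to-device bandwidth check against the CUDA runtime API (just a sketch giving one crude data point, not a stand-in for CUDA-Z or a real compute benchmark):

```c
/* Crude host-to-device bandwidth probe using the CUDA runtime API.
 * Compile with: nvcc -o bw bw.cu */
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

int main(void)
{
    const size_t bytes = 256UL * 1024 * 1024;   /* one 256 MB transfer */
    void *host = malloc(bytes);
    void *dev = NULL;
    cudaEvent_t start, stop;
    float ms = 0.0f;

    if (host == NULL || cudaMalloc(&dev, bytes) != cudaSuccess)
        return 1;

    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start, 0);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);
    cudaEventRecord(stop, 0);
    cudaEventSynchronize(stop);

    cudaEventElapsedTime(&ms, start, stop);
    printf("H2D: %.2f GB/s\n", (bytes / 1e9) / (ms / 1000.0));

    cudaFree(dev);
    free(host);
    return 0;
}
```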
 