NVIDIA Kepler GeForce GTX 680 Video Card Review @ [H]ardOCP

Nice article.

For once, I have to say, good job Nvidia.

A price war is always good for us customers. Now AMD has to react.
 
Having only a 256-bit memory bus along with only 2GB of VRAM and a smaller die size will likely mean that nVidia will be able to beat AMD on pricing no matter how low AMD decides to drop their cards' MSRP.
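For anyone wondering how much that narrower bus actually gives up in raw numbers, here's some quick back-of-the-envelope math (my own sketch, assuming the published memory clocks are right, nothing from the review):

Code:
# Peak memory bandwidth = bus width in bytes x effective memory clock (MT/s).
# Specs assumed here: GTX 680 = 256-bit @ 6008 MT/s, HD 7970 = 384-bit @ 5500 MT/s.
def bandwidth_gb_s(bus_bits, effective_mts):
    return bus_bits / 8.0 * effective_mts / 1000.0  # GB/s

print("GTX 680: %.0f GB/s" % bandwidth_gb_s(256, 6008))   # ~192 GB/s
print("HD 7970: %.0f GB/s" % bandwidth_gb_s(384, 5500))   # ~264 GB/s

So even with the faster GDDR5, the 680 gives up roughly a quarter of the 7970's peak bandwidth and has to make it up elsewhere.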

I still can't get over nVidia having beaten AMD in efficiency... seriously, what the hell is going on? :p

The topic in the Shader discussion thread kind of led into this. Nvidia building a new architecture obviously targeted this type of environment from the beginning. It's not like it's a new thing for them; back when the 7800 GTX launched it was the same story. AMD then built the 4k-series architecture with efficiency in mind, while Nvidia went brute force.

This gen AMD made a huge change in SP design, which cost them some efficiency in the end this round, but it's not like it isn't a close fight. That should be good at least in terms of pricing for users. With AMD and Nvidia this close in initial-generation performance and efficiency, expect a closer fight and more pricing battles for the next several generations, instead of the sharp changes in leaders and efficiency.
 
Anand has compute benchmarks and they don't look good. OpenCL/DirectCompute performance looks to be beneath GTX 580 levels. This card doesn't improve upon nVidia's compute advantage so much as stall it, whereas GCN was a big bump in that respect. It's definitely going to get interesting in the HPC segment.

What card in the 5xx lineup had only 8 SMs? My guess is that that is the product-line equivalent of the 680, and that the GK110 will have the full 16 modules like the 580 does now. If so, and the 680 is matching 580 compute performance, how much better do you think the GK110 will be?
 
I must be living under a rock. :confused:
When did nVidia start supporting surround on a single video card?!

I thought you had to have 2 geforce cards to do that?

This is awesome. My last two video cards have been AMD; and even though I decided to sit a generation out, this is good competition. nVidia finally earned their crown back and isn't gouging.

I was fully expecting $600 MSRP.
 
And just reading other sites, there are definitely a few games where it looks like the 7970 kills the 680. Looks like Kyle and Brent have to play more games ;) Civ 5, Crysis, and Crysis 2 seem to be the common ones so far, but I'm curious if REAL-world testing will disprove the canned benchies.

And that xbit link someone posted sure does show the OC'd 7970 holding its own vs. the OC'd 680.

Can anyone confirm the rumor that the 680 was really meant as the replacement for the 560 Ti in the next gen of cards? If that's the case, they are REALLY price gouging.
 
Anand has compute benchmarks and they don't look good. openCL/directcompute performance looks to be beneath GTX580 levels. This card doesn't look to improve upon nVidia's compute advantage but rather a stall whereas GCN was a big bump in that respect. It's definitely going to get interesting in the HPC segment.

Yeah, saw those, cheers.

More interested in the specific case of Blender/Cycles as a real-world example of a compute application that we intend to use on a daily basis at work. :)
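If you just want to sanity-check what the card exposes to OpenCL before the Cycles numbers start showing up, something along these lines works (a rough sketch assuming pyopencl is installed; the values are whatever the driver reports):

Code:
# Quick OpenCL device dump -- assumes pyopencl is installed and the vendor
# driver exposes an OpenCL platform. Purely a sanity check, not a benchmark.
import pyopencl as cl

for platform in cl.get_platforms():
    print("Platform: %s" % platform.name)
    for dev in platform.get_devices():
        print("  Device: %s" % dev.name)
        print("    Compute units: %d" % dev.max_compute_units)
        print("    Global memory: %d MiB" % (dev.global_mem_size // 2**20))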
 
Could [H]ardforum post some graphs of the vram usage at 5760x1080?
This. I sold my two 470s because of this very issue. I now run a single 6970 2GB, and in BF3 multiplayer I often push 1.8GB of usage.
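Until someone posts proper graphs, here's a rough way to log it yourself (just a sketch, assuming you have the pynvml bindings installed and a driver that reports memory info on GeForce parts; not how [H] measures it):

Code:
# Minimal VRAM usage logger -- assumes the pynvml bindings are installed and
# the driver reports memory info for the card at index 0.
import time
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        mem = nvmlDeviceGetMemoryInfo(handle)
        print("VRAM used: %d / %d MiB" % (mem.used // 2**20, mem.total // 2**20))
        time.sleep(1)  # sample once a second while the game runs
finally:
    nvmlShutdown()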

Great review as always though guys! [H]ard is the only site I place much stock in when making purchasing decisions. Great reviewers and great community! Love the 'real world' gaming #'s.
 
What card in the 5xx lineup had only 8 SMs? My guess is that that is the product-line equivalent of the 680, and that the GK110 will have the full 16 modules like the 580 does now. If so, and the 680 is matching 580 compute performance, how much better do you think the GK110 will be?

I doubt we will ever see it. Figure it takes about three months to go from building up a stock of chips to getting them to card integrators, then to distributors, then to retail. Nvidia didn't call all of them up and tell them to stop preparing GK110 and repackage GK104 as the top-end card. No, Nvidia was only ever going to release one product at this time, and it was GK104. If I had to guess why the numbering for the top-end card is 104 instead of 110, it has something to do with what Nvidia learned in development. Chances are they really overshot their power and size estimates on the 110 and turned to the 104, beefing it up for a high-end launch. The 110 could still see the light of day in the future, once another process shrink happens, or they whittle it down for the next generation (much like AMD making do with a much larger die on the 6970 than the 5870 when the move to 32nm didn't happen).
 
I was gonna say bleesh to only 1xDP, but then I saw 2xDL-DVI instead of 1xDL-DVI and that makes up for it (mostly).

Pretty sweet.
 
I sure hope more stock appears on Newegg. They ran out awfully fast. I checked the second I got to work @ 7:30PST and everything was sold out. I recall the 7970 was a similar situation where they went out-of-stock really quick, but that stock started trickling in the next few days. Please let that be the case here...
 
The topic in the Shader discussion thread kind of led into this. Nvidia building a new architecture obviously targeted this type of environment from the beginning. It's not like it's a new thing for them; back when the 7800 GTX launched it was the same story. AMD then built the 4k-series architecture with efficiency in mind, while Nvidia went brute force.

This gen AMD made a huge change in SP design, which cost them some efficiency in the end this round, but it's not like it isn't a close fight. That should be good at least in terms of pricing for users. With AMD and Nvidia this close in initial-generation performance and efficiency, expect a closer fight and more pricing battles for the next several generations, instead of the sharp changes in leaders and efficiency.

Absolutely. The next-gen AMD cards will improve upon efficiency and it'll be a closer battle. Considering that it's so close now in efficiency (relatively), it wouldn't be out of the question to see both companies battling each other in this respect as well. About time really, nVidia couldn't afford to keep going brute force anymore.

Also important to note that AMD has something else riding heavily on the efficiency front in this race: their Fusion/HSA agenda. The push for efficiency there is of the utmost importance.

What card in the 5xx lineup had only 8 SMs? My guess is that that is the product-line equivalent of the 680, and that the GK110 will have the full 16 modules like the 580 does now. If so, and the 680 is matching 580 compute performance, how much better do you think the GK110 will be?

Who knows? The biggest questions are always perf-per-watt and perf-per-dollar, and any significant change like you're noting will obviously impact those two deciding factors. I think it may be quite different this time around considering AMD's new direction. They've abandoned high-end desktop CPU goals, so a strictly gaming discrete GPU (which is what nVidia built here) just isn't going to happen from them. I'd bet that it'll be AMD winning the computational/HPC benchmarks from now on. Their HSA/OpenCL push and future direction rely heavily on succeeding in both compute and gaming, but apparently nVidia may be looking to tackle the two markets separately. The GTX 680 is an amazing GPU, but if I had one flaw to pick out, that would be it.
 
Can anyone confirm the rumor that the 680 was really meant as the replacement for the 560 Ti in the next gen of cards? If that's the case, they are REALLY price gouging.

Based on their internal codes this does appear to be the case. 04 parts are typically mid range.


Rumor has it that they originally intended to have the GK100 be the top end, but chip partners couldn't promise high enough volumes on the larger chips, so they instead revamped the GK104 and binned it better to make it the high end card.

This strategy may hurt long term, with the bit width for the RAM being so low, but for right now it looks like it's working out fine.

Now, in order to hit these speeds with what they planned to make their mid range product, they are probably having to bin chips more aggressively, which means that their costs really aren't THAT much lower, but yes, they will be lower.


Either way, even if this board had been intended to be the replacement for the GT 520, it still wouldn't be price gouging.

The market determines the price, not the production costs. It performs on par with a 7970, and as such it should be priced on par with a 7970. This is how it works. Production and development costs are mostly irrelevant. They probably figured that once they launched it AMD would drop their prices a little, so they decided to undercut AMD by ~$50.

The board is priced exactly where it should be priced.
 
Solid card. Now to decide... 590, 680 OC'd, or 7970 OC'd.
 
I really want to know Battlefield 3 Ultra (4x MSAA) VRAM usage at 1080p, 1600p, and multi-display. This should settle the 2GB vs 3GB dilemma for a LARGE number of people.
The only reason I will have to upgrade my 460 1GB is that the usage at Ultra is 1015MB+ out of 1024MB, which leads to single-digit FPS crawling.

Kyle, please. Many others are also interested.
 
Those minimum frame rates are a nice improvement over 7970. Both benchmarks/price point are much better than I expected. This card is a winner! Now, only if they could be back in stock at Newegg...
 
Good. Some competition.
I won't be dropping my 7950 any time soon, but if nVidia can finally provide AMD some incentive to maintain competitive pricing and performance in future products, then it will be good for all of us as consumers.
 
I would say that you can extract 120Hz performance from the 2D results already posted for the resolution you're interested in. There aren't really any great examples here because they chose to go for the best gaming environment, where (I believe, and I know most will agree) playable resolution matters more than anything else. Even for the big believers in 120Hz, there is a severely sharp law of diminishing returns in IQ from pushing FPS past 60 rather than increasing playable resolution. For now and forever, resolution is king. So if your review is about getting a feel for the max resolution and settings that create the most enjoyable environment, 120Hz has to take a back seat to resolution.

But for reviews that run the more common 1080p tests, you should be able to see how close the card will come to 120 FPS in whatever games you play. Though I am disappointed that, after all the years HardOCP has been doing this, finding another site that does FPS graphs is nearly impossible.

This is kind of what I was thinking. And I guess the question is irrelevant for the 680 because performance is so high. But for the slightly lower end cards, where there are a lot of 1200p tests anyway, it might be interesting to get the reviewers' opinions on the benefits of having over 60 FPS.

I have no personal experience with a 120Hz display, but I see a lot of forum posts about how great games (and everything else) are at higher frame rates. I guess these people can figure it out on their own based on the FPS graphs, like you said.

And I agree about the FPS graphs, I don't even bother with those useless bar graphs they put up on every other site. Sure there are fewer comparisons here, but what they do have is far more useful than what everyone else does.
 
Zarathustra[H] said:
Based on their internal codes this does appear to be the case. 04 parts are typically mid range.


Rumor has it that they originally intended to have the GK100 be the top end, but chip partners couldn't promise high enough volumes on the larger chips, so they instead revamped the GK104 and binned it better to make it the high end card.

This strategy may hurt long term, with the bit width for the RAM being so low, but for right now it looks like it's working out fine.

Now, in order to hit these speeds with what they planned to make their mid range product, they are probably having to bin chips more aggressively, which means that their costs really aren't THAT much lower, but yes, they will be lower.


Either way, even if this board had been intended to be the replacement for the GT 520, it still wouldn't be price gouging.

The market determines the price, not the production costs. It performs on par with a 7970, and as such it should be priced on par with a 7970. This is how it works. Production and development costs are mostly irrelevant. They probably figured that once they launched it AMD would drop their prices a little, so they decided to undercut AMD by ~$50.

The board is priced exactly where it should be priced.


I actually agree the price is fair.

Just goes to show all the little fanboys that neither company is a saint; both will try to make money, and both could use it after their stock beatdowns.
 
Nvidia could have easily priced this card in the $550 range and they did not; I am very impressed/relieved. I have waited for this to come out to make a GPU purchase, and I'm glad I did. Just hope the price doesn't go up due to supply and demand. I need my tax return to come in first.
 
I have no personal experience with a 120Hz display, but I see a lot of forum posts about how great games (and everything else) are at higher frame rates.

I did not much care for BF3 until I got a Samsung S23A950D. The difference it makes is amazing. Just gotta lower the detail to get there! Surprisingly, stuff disappears, making it easier to spot players, lol. It really does a lot for aiming w/o losing tracking. I feel like I've already got half a clip into them before they even shoot at me.
 
Wow, I can't believe the performance of this card. Even at high resolutions and multi-monitor. Maybe people will shut the hell up about the 2 GB VRAM finally. :p

Great review, seems like a well-deserved Gold. I'm a little surprised that all cards released at $499 as well...just excellent all around.
 
You do know that AMD isn't going to sit back and chill, LULZ. Don't forget the new 7990 coming out, and I'm sure Nvidia has something else up their sleeve after that. They're both basically equal now, so choose your poison. Game on!
 
Looks like the 680 does exactly what I want for a single GPU setup: max settings at 2560x1600.

Nice price, too. :)
 
Wow! AMD is going to have to drop the 7970 to $200 for anyone to even bother with it. This is how you launch a new card. This is finally a next generation GPU.
Huge exaggeration.
Still, this Nvidia card is a pure fucking beast.
I was surprised to see it almost matching the 7970 at Eyefinity resolutions.
Still, I was a little confused, Brent. When the 680 was faster than the 7970 by a tiny number of frames you touted it as being faster (which is truthful), yet at Eyefinity resolutions when the 7970 was faster you just said it was equal...
That was rather odd.

Still, it's faster than the 7970 (stock 7970), cheaper, and more power efficient, and NV Surround is very easy to set up; I like that a lot. The Adaptive VSync feature looks amazing.

Well, Nvidia was late, but they gave us a lot to work with. Now if AMD drops the price on the 7970 to $490 and the 7950 to $400, we've got ourselves a shootout.
Also, OC benches with the 7970 and 680 are important. The 79x0s OC so well on air it's ridiculous. It would be good to see what the 680s reach and what the FPS difference between the two is when OC'd.
 
Months later, and it's the same performance once you get to the higher resolutions in the majority of new games. Something crazy is going on with Skyrim, or the 580 has some insane unlocked potential.

Price is nice though, gotta love that.

Must see OC results though. I'm guessing there won't be much OC headroom on the 680, but with the MSI 7970 they were able to up the AA level on the OC'd card in BF3.

And looking at some other reviews, any chance of getting Crysis and Civ 5 tossed in?

Actually it looks like they overclock pretty well. Guru3D was able to get the 680 to 1264/6634 without the ability to manually tweak voltage.

http://www.guru3d.com/article/geforce-gtx-680-review/25
 
Looks like a very impressive outing from Nvidia. Especially at 50 bucks cheaper than the 7970.
Still, $500 is way more than people should be spending on a GPU as far as I'm concerned. 200-250 seems to be much better value and I hope both Nvidia and AMD will have impressive products for that more reasonable price category as well.
 
A great review and an awesome card. I'm just not so sure that for the majority on a single monitor, 1920x1200 and below, there was any kind of real gameplay advantage outside of Skyrim (and that's with every possible control panel option turned up).
AA on a few select titles is not a $400-$500 option I'm willing to pay for, considering I can get it on about half the titles with my current 6950 at 1920x1200 at playable frame rates.
I blame the consoles. Again, awesome review, and Nvidia definitely surprised me. I just wish there was more than one title I play that could actually benefit from it.
 
Looks like a very impressive outing from Nvidia. Especially at 50 bucks cheaper than the 7970.
Still, $500 is way more than people should be spending on a GPU as far as I'm concerned. 200-250 seems to be much better value and I hope both Nvidia and AMD will have impressive products for that more reasonable price category as well.

Really depends on your situation. Running a 1680x1050 display? Stick to the ~$200 solutions. Running 2560x1600, you either run a couple of those $200 cards in SLI/CF & deal with multi-gpu issues or you buy a $500 card.
 
nV really stepped up to the plate here and gave us what we've been asking for. A smaller, quieter, more power efficient card. Features that actually matter, like Adaptive Vsync (this I'm very interested in) and more efficient transparency AA we can use in more games. Dynamic overclocking is a good idea, so long as it does not interfere with manual overclocking attempts.

The overclocked results will be interesting, with the 680 being short on memory bandwidth (how high will that vram clock go?) vs the 7970 which is known to have gobs of core clock headroom. I think a heavily overclocked 7970 has an excellent chance of winning that matchup, but the 680 will still win on power/heat and features. It does look like the 680 can't hold a lead at multidisplay resolutions - runs out of bandwidth I'm guessing - and I wonder if a clock boost on the memory will show there.

By the way, any noise testing done? Didn't see that ...

Also, how long until we see non-reference cards with 4GB? :eek::D
 
Actually it looks like they overclock pretty well. Guru3D was able to get the 680 to 1264/6634 without the ability to manually tweak voltage.

http://www.guru3d.com/article/geforce-gtx-680-review/25

It already overclocks itself though

From HOCP
"GPU Boost is guaranteed to hit 1058MHz in most games. Typically, the GPU will be going much higher. We experienced clock speeds in demo sessions that would raise to 1.150GHz and even 1.2GHz in such games as Battlefield 3"

So they got it 64MHz above... not a lot of room to move.

Check that xbit link that was posted; the 7970 OC kept up with the 680 very well IMO. What AMD needs to do now is release a driver that does some auto overclocking.
 
After digging around, apparently these cards do support bitstreaming audio.
I'm going to have to play around with using my GTX570 as a PhysX card. Overkill...sure, but I have to try it before trying to sell that card.
 