GeForce 680 & 670 vs. Radeon 7970 & 7950 Gaming Performance @ [H]

Interesting overclocking results

I see a 7970 in this review that is getting a moderate-to-high overclock vs. a GTX 680 that is getting a low-to-moderate overclock

Either way the #s stand as they are

My 2 GTX 680s get higher overclocks than the sample in this review; the better of the two gets over 1332MHz core / 6800MHz memory. However, that is the way the cookie crumbles in these types of reviews. Excellent work though.

I thoroughly enjoyed this review.
 
[H]'s latest round of testing is still very much a win for Nvidia's gaming-focused Kepler chips/cards, but it still feels like the reliance on FXAA, without comparing to 2x AA or 4x AA, muddies things a bit.

Your concerns are noted and understood.
 
Interesting overclocking results

I see a 7970 in this review that is getting a moderate-to-high overclock vs. a GTX 680 that is getting a low-to-moderate overclock

Either way the #s stand as they are

My 2 GTX 680s get higher overclocks than the sample in this review; the better of the two gets over 1332MHz core / 6800MHz memory. However, that is the way the cookie crumbles in these types of reviews. Excellent work though.

I thoroughly enjoyed this review.

The OC numbers look fine to me. Keep in mind they have to be stable OCs, not barely-stable OCs good for only one bench at 100% fan. OC is always a crapshoot anyway. Worst comes to worst, you can extrapolate the performance yourself if you don't agree with the clocks, since both GPUs scale pretty linearly.
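On that last point, the back-of-the-envelope extrapolation is straightforward. Here is a minimal sketch (illustrative only, my own helper name, and assuming the game is GPU-bound so framerate scales roughly linearly with core clock):

```python
def extrapolate_fps(measured_fps, measured_clock_mhz, target_clock_mhz):
    """Estimate FPS at a different core clock, assuming roughly
    linear scaling (a fair approximation when GPU-bound)."""
    return measured_fps * (target_clock_mhz / measured_clock_mhz)

# e.g. a card benched at 60 FPS at 1100 MHz, estimated at a 1260 MHz OC:
print(round(extrapolate_fps(60, 1100, 1260), 1))  # 68.7
```

Memory bandwidth, power limits, and boost behavior all bend the curve in practice, so treat it as an estimate, not a measurement.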
 
I understand that FXAA is a game feature when directly coded in, but it is more like a feature in the way Nvidia coded the 1080p 4x AA drivers for Batman Arkham City.

I opened the pull-down tab from your own 680 SLI vs. 7970 CFX review (yes, CFX drivers are truly horrible) and looked at both Batman: AC and BF3. In BF3, switching from 4x AA to FXAA resulted in a 21.6fps gain for the 680s versus an 18.1fps gain for the 7970s, and in that case those extra 3.5 frames seemed to matter.

Batman already heavily favors team green, but even there the performance boost of going from 4x AA to FXAA provided an extra benefit to Nvidia by the slimmest of margins.

Your other tests were either FXAA only or, like Skyrim, AA only. Skyrim looks more like an AMD driver failure than the coding tricks done on the Batman series.

[H]'s latest round of testing is still very much a win for Nvidia's gaming-focused Kepler chips/cards, but it still feels like the reliance on FXAA, without comparing to 2x AA or 4x AA, muddies things a bit.

I suggest we take this debate to another thread if you'd like to debate the meaningfulness of shader algorithms like FXAA. I will state that I am all for shader-based algorithms like FXAA/TXAA/MLAA/SMAA and whatever else comes down the pike. The future of AA lies not in traditional multisample AA, but in shader-based AA, IMO.

We will continue to use what allows the best image quality and performance in games and use features that are built into games.
 
Interesting overclocking results

I see a 7970 in this review that is getting a moderate-to-high overclock vs. a GTX 680 that is getting a low-to-moderate overclock

Either way the #s stand as they are

My 2 GTX 680s get higher overclocks than the sample in this review; the better of the two gets over 1332MHz core / 6800MHz memory. However, that is the way the cookie crumbles in these types of reviews. Excellent work though.

I thoroughly enjoyed this review.

Most of the reviews I have seen from other sites peg the max sustained boost clock for a 680 in the same general range, plus or minus 20MHz, of what [H]'s max boost clock was. For the 7970, 1260 is definitely above-average silicon for a full reference, first-release chip (though the newer GHz editions should hit it without too much trouble). I also considered it a positive that the review pointed out this above-average speed, in my first post in this thread (several pages back).

Even with AMD's self-inflicted driver problems, IPC seems nearly the same; imagine what AMD could do if it had a small army of competent coders taking advantage of its current silicon.
 
Interesting overclocking results

I see a 7970 in this review that is getting a moderate-to-high overclock vs. a GTX 680 that is getting a low-to-moderate overclock

My 680 won't even hit 1200 MHz. So I think the number they chose is just fine.

I'm glad they chose the numbers they did, not only because they represent average/high overclocks for each card, but because they are close to being clock-for-clock, which I kept hearing a bunch of BS about regarding 7970 testing. Now we can see that the 680, even with a slightly lower clock, is right up with the 7970.
 
No complaints from me. I enjoyed the review. I just wanted to mention that an overclocking review can vary with the sample used.

Excellent review. Solid results. They could just as easily have gotten a sample that doesn't do 1150MHz, as there are some out there in the official GTX 680 overclocking thread. I just think they got a sample from the middle of the pack on the green side and a really good sample on the red side. Some 7970s can come in right over 1300MHz too, though. (Very rare.)
 
This is the review I've waited for since the 680 got released. OC vs. OC.

:D
 
It doesn't dynamically overclock though, as Brent has said hundreds of times.

They will never understand. From now on Nvidia will always be "cheating" just because they are winning most of the time in performance, and if not there then in power consumption and price.
 
Amen, and that is unfortunately hard to show, as stuff like that doesn't show up in a raw fps graph. This is the trouble we face many times trying to relate that experience to you, I often use the descriptive word "smoothness" for it, which you'll find used a lot in regards to SLI.

Kyle - We are working with NVIDIA on a tool to show this easily to our readers. Should be soon. We certainly want to be able to better identify this and show it to our readers in an objective manner. But do know this, NVIDIA is taking active steps in its hardware and its drivers to deliver a smoother framerate over time and a better gaming experience.

"Smoothness" certainly may be subjective and if a title is pausing for a full count of one or better at some points when you're moving forward in it, I'd say that's fairly objective. Particularly if the behavior repeats each time you run a map and at the same points. The frequency of its occurrence matters too. I'm not talking about when a map or level is just loading.

That's meaningful when it's only happening on one card or card configuration and not another.

It would be nice if you could display this in a clear format for casual users or for visitors who aren't familiar with [H]ard. For some of us, your word is enough.

I care greatly about fluid game play, not about winning benchmarks. I understand winning benchmarks is important for nVidia and AMD because that's what most consumers currently focus upon.

There are an assload of sites I can visit if I just want to stare at average FPS bar graphs.

Keep up the good work guys, it's appreciated.
 
This should bring the 7970s down to $400 and the 7950s to $350. If the 7970s hit $350 this year, I'll grab a second for CrossFire.

Either way it's still a win for the gaming community.
 
That was a good review. Currently running a 7950 at an 1100/1500 OC but not happy with the drivers at the moment. The EVGA 670 just came in stock on Newegg, so I purchased one to replace it. Haven't had an Nvidia card since the GeForce GTS 250.
 
This is the review I've waited for since the 680 got released. OC vs. OC.

:D

Did you miss this page from the article they posted on 4/4? http://hardocp.com/article/2012/04/04/nvidia_kepler_geforce_gtx_680_overclocking_review/5

Personally, I find OC vs OC interesting... at best. It might teach us something about how the architecture scales, but there are too many variables IMO. You just can't guarantee an overclock. The numbers Brent hits won't be the numbers I hit. The numbers I hit won't be the numbers you hit.

Don't get me wrong, I like seeing the info, but it definitely belongs in the follow up, not the primary reviews.

UPS tells me I get my 670 tomorrow afternoon. I'm scheduled to telecommute so I can be home to get it. :p
 
Why would anyone debate FXAA? FXAA isn't Nvidia-biased; I can't believe anyone would suggest that. All post-process AA has a minimal performance impact and works on ALL hardware: MLAA works on Nvidia cards just like FXAA works on AMD cards.

The point I disagree with, however: FXAA was designed from the get-go as a cheap man's AA; that is not disputable. There was a great blog post from one of the creators of FXAA at Nvidia, and basically he regretted that people were getting the impression that FXAA is some type of improved AA. MSAA is still better; it just has more of a performance cost. He went on to explain that this is why they created TXAA, something that really pushes image quality (FXAA doesn't). The nice thing about FXAA is that it doesn't require native application support and has minimal performance impact... but in the end, MSAA is better. I do agree that 2x-8x MSAA during gaming benchmarks is meaningful.
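For the curious, the core idea behind post-process AA is simple enough to sketch. This toy Python version (purely illustrative; the function name and threshold are my own invention, not NVIDIA's actual FXAA shader) finds high-contrast luma edges and blends only there, which is why it is cheap on any vendor's hardware, and also why it can't recover sub-pixel geometry the way MSAA can:

```python
def luma(rgb):
    # Perceptual luma (Rec. 601 weights), as FXAA uses to find contrast
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def fxaa_like(pixels, threshold=0.1):
    """Blend only pixels sitting on a high-contrast luma edge; leave the
    rest untouched. `pixels` is a 2D grid of (r, g, b) tuples in 0..1."""
    h, w = len(pixels), len(pixels[0])
    out = [row[:] for row in pixels]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbors = [pixels[y - 1][x], pixels[y + 1][x],
                         pixels[y][x - 1], pixels[y][x + 1]]
            lumas = [luma(pixels[y][x])] + [luma(p) for p in neighbors]
            if max(lumas) - min(lumas) > threshold:
                # Edge found: average with neighbors (a box blur standing
                # in for FXAA's directional filtering)
                out[y][x] = tuple(sum(c) / 5 for c in
                                  zip(pixels[y][x], *neighbors))
    return out
```

Because it is a single full-screen pass over the final image, the cost is roughly the same on either vendor's hardware, which is the point being made above.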

Anyway, all of that rambling aside, great article. Nvidia really has won this round all around, kudos to them... AMD really needs to step it up with Sea Islands or whatever their next generation is.
 
Thank you for the benchmarks, very informative. Can you repeat the process at 5760x1200 with CrossFire and SLI, please? If you had 4GB 6XX parts to throw in, that would be nice to see as well.
 
Please take the generic overclocking and GPU Boost questions elsewhere.
 
The valley in that Deus Ex graph is really odd to me; it would almost seem driver-related. How can the Nvidia cards just seemingly drop off for that one section (316-436) so consistently while the 7970 maintains more or less the same average framerate from just prior to the Nvidia slump? The AMD results would suggest that the workload from the software side at that point remains roughly the same as the rest of the test. Are you guys able to shed light on what's happening at that point during your gameplay loop?
 
It doesn't dynamically overclock though, as Brent has said hundreds of times.

"Fair" was a poor choice of words on my part as it injects subjectivity to the comment. It's a matter of perspective and arguing over it is akin to arguing over the meaning of life...never ending and for all intents and purposes, completely pointless.

OC vs OC is relevant to me because I couldn't care less how they perform "out of the box." Thanks for the review, and I hope AMD's driver team gets its collective head out of its ass so that I can salvage whatever I can from this ill-timed upgrade.
 
Since Deus Ex seems to run so much better on AMD compared to the other games, what makes that game so different? Is that older tech, or is it an engine we will see more of in the future?
 
So if you are an overclocker, then for $50 less you get a 7970 that will beat an OC'd 680. If not, then get a 670 that has similar performance in general for the games tested. Of course a given OC is not set in stone either; you might get a lousy overclocking card from either company.

So which one is the better value then? The cheaper 670, the highly overclockable 7970, or the 680? My 7970 will go up to 1295/1600 (benchmarking), but the fan noise is just terrible and irritating to me. I have gamed with it at 1260/1575 without any noticeable issue other than fan noise. Drivers have been fine too, but I'm dealing with a single card, and I usually don't jump on recently released games, except maybe Max Payne 3 :). Maybe my point here is that it is more than just numbers in the overall experience; it looks like HardOCP is going green not because of the numbers but because of a better experience, be it smoothness, drivers, noise, and what not. I normally keep my 7970 at 1100/1575 since it just purrs there, little noise and very smooth, and I really don't see any stock 670 keeping up with it in the games I play. So far a very worthwhile and fun card.

I like the review showing out-of-box performance and then what the hardware can do when pushed. Most folks probably never OC, so having plain stock settings is a good indicator. There are plenty who do OC and want to know how a card performs when pushed.

Great hearing that HardOCP is pursuing a more objective benchmark on a frame-to-frame basis. Really, just the problem areas of a game may be all that need a close evaluation method.

Do wish more games were tested, but I understand that HardOCP probably puts more testing into one game in a review than others do with all their canned game benchmarks combined. For example, Metro 2033 would be appropriate with the Metro sequel upcoming, as would Crysis 2 with Crysis 3 also coming; both of which AMD seems to have down. I am not sure why Skyrim is used, being DX9 and playable on virtually any recent low-to-high-end card, except that Skyrim is one of the most popular titles now. More games if possible, but not at the cost of test quality, which I think HardOCP does very well at.

Thanks for the review and for keeping it straight to the point.
 
Biased review is biased. The end of the conclusion was retardedly over-the-top ass kissing. This is especially true given that the 7970, when "manually" auto-adjusted, outperformed the 680 in a similar mode.

The 6 series cards are overclocked stock. Even overclock-overclocked they get their ass handed to them when similar clocks are applied to the 680 and 7970. You never go on about this fact, but you'll go on endlessly about crossfire issues.

Re-read your article from a standpoint of someone looking for bias being presented.

Hell, you didn't even bother to correct the folks saying (erroneously) that the 680 is a better value, as it beats the 7970, even overclocked, and especially considering it uses so much less power and produces so much less heat. 7970 beats 680 when it gets the same clocks as 680. 7970 does NOT produce massively more heat or use significantly higher wattage.

I refer you to the following:

http://www.guru3d.com/article/geforce-gtx-680-review/9
7970:
System in IDLE = 163W
System Wattage with GPU in FULL Stress = 355W
Difference (GPU load) = 192W
Add average IDLE wattage ~3W
Subjectively obtained GPU power consumption = ~195 Watts

680:
System in IDLE = 144 W
System Wattage with GPU in FULL Stress = 307 W
Difference (GPU load) = 163 W
Add average IDLE wattage ~10 W
Subjectively obtained GPU power consumption = ~173 Watts
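Those guru3d figures follow a simple recipe (total system draw at load, minus system idle, plus the card's own idle draw). A quick sketch to reproduce the arithmetic (the helper name is mine, not guru3d's):

```python
def gpu_power_estimate(system_idle_w, system_load_w, gpu_idle_w):
    """guru3d-style estimate: GPU draw under load = (system load
    - system idle) + the card's own idle draw, all in watts."""
    return (system_load_w - system_idle_w) + gpu_idle_w

print(gpu_power_estimate(163, 355, 3))   # 7970: 195 W
print(gpu_power_estimate(144, 307, 10))  # GTX 680: 173 W
```

It is a rough at-the-wall method: PSU efficiency and any change in CPU load get folded into the "GPU" number.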

Maybe try to dig into every aspect of cards reviewed, instead of picking and choosing.

To the guy stating FXAA isn't NV: You might want to check your facts.
 
Thanks for this great review.

The only thing I wished to see was a single GTX 670 running Eyefinity / Surround resolutions, just like what you did for the GTX 680. Clearly the 680 is a great card for multi-monitor, but the 670 seems very capable for 25% less. I'm wondering what that extra $100 gains you at 5760x1200 by stepping up from 670 to 680.
 
Biased review is biased. The end of the conclusion was retardedly over-the-top ass kissing. This is especially true given that the 7970, when "manually" auto-adjusted, outperformed the 680 in a similar mode.

The 6 series cards are overclocked stock. Even overclock-overclocked they get their ass handed to them when similar clocks are applied to the 680 and 7970. You never go on about this fact, but you'll go on endlessly about crossfire issues.

Re-read your article from a standpoint of someone looking for bias being presented.

Hell, you didn't even bother to correct the folks saying (erroneously) that the 680 is a better value, as it beats the 7970, even overclocked, and especially considering it uses so much less power and produces so much less heat. 7970 beats 680 when it gets the same clocks as 680. 7970 does NOT produce massively more heat or use significantly higher wattage.

:rolleyes:

Did you even read the whole thing or just the conclusion and start spewing BS?

1) 680 cards are NOT "overclocked stock". GPU Boost is not an overclock and it has been covered by Kyle and Brent numerous times. Anyone thinking otherwise at this point is a moron.

2) Clock-for-clock, the 7970 and the 680 are pretty damn close. In this review they were within a few FPS of each other for the most part, and the 7970 even had slightly higher clocks, if you really want to talk "clock for clock". The 680 certainly does not get its "ass handed to it" by the 7970.

3) As for wattage/heat, 20-30W more is not insignificant.

4) How can it be "erroneous" to think the 680 is a better value? Obviously heat and power consumption don't matter to you but they do to some people. Plus, it appears the gaming experience is smoother on the 680 (again, subjective).
 
:rolleyes:

Did you even read the whole thing or just the conclusion and start spewing BS?

1) 680 cards are NOT "overclocked stock". GPU Boost is not an overclock and it has been covered by Kyle and Brent numerous times. Anyone thinking otherwise at this point is a moron.

No comment

2) Clock-for-clock, the 7970 and the 680 are pretty damn close. In this review they were within a few FPS of each other for the most part, and the 7970 even had slightly higher clocks, if you really want to talk "clock for clock". The 680 certainly does not get its "ass handed to it" by the 7970.

Agree here

3) As for wattage/heat, 20-30W more is not insignificant.

Disagree; it's insignificant, and it's hilarious to think even 10% of $450+ video card buyers give a rat's ass about a 20-30 watt difference. Get real.

4) How can it be "erroneous" to think the 680 is a better value? Obviously heat and power consumption don't matter to you but they do to some people. Plus, it appears the gaming experience is smoother on the 680 (again, subjective).

Disagree; you can easily make the argument that a 7970 is a better value as long as you don't plan on going multi-GPU (and those issues might not last beyond another driver release).

responded to points
 
responded to points

...Just like you can easily say the 680 is a better value because it performs better at stock, has a "smoother" gameplay experience and runs cooler and with less power.

Again, just because you don't care about power consumption and heat doesn't mean that no one does. Your "less than 10%" statistic is laughable and is clearly something you just pulled out of your ass.
 
For a lot of people, power and noise is more important than direct overclockability. Not so much the direct power usage delta, but the reduced noise that can be achieved.

I would love to see 5760x1200 tested, not just 2560x1600...
 
...Just like you can easily say the 680 is a better value because it performs better at stock, has a "smoother" gameplay experience and runs cooler and with less power.

Again, just because you don't care about power consumption and heat doesn't mean that no one does. Your "less than 10%" statistic is laughable and is clearly something you just pulled out of your ass.

Here is some real world testing for power consumption..

7970
[image: wall-meter power reading]

GTX 680
[image: wall-meter power reading]

Both at stock settings...

You get the idea...

BTW, it doesn't really run cooler when both are reference, though some 7970s run hotter than others due to horribly applied thermal paste. Both stay under 75C just fine.
 
:rolleyes:

Did you even read the whole thing or just the conclusion and start spewing BS?

1) 680 cards are NOT "overclocked stock". GPU Boost is not an overclock and it has been covered by Kyle and Brent numerous times. Anyone thinking otherwise at this point is a moron.

2) Clock-for-clock, the 7970 and the 680 are pretty damn close. In this review they were within a few FPS of each other for the most part, and the 7970 even had slightly higher clocks, if you really want to talk "clock for clock". The 680 certainly does not get its "ass handed to it" by the 7970.

3) As for wattage/heat, 20-30W more is not insignificant.

4) How can it be "erroneous" to think the 680 is a better value? Obviously heat and power consumption don't matter to you but they do to some people. Plus, it appears the gaming experience is smoother on the 680 (again, subjective).


Yes, he seems to be misled about Nvidia's GPU Boost technology. But he did make some valid points in my opinion. Like you said in #2, both top-tier cards were very close, and honestly I don't even care who had the edge, BUT the 7970 is $50 cheaper. So why is everyone trampling on AMD?

As for #3, well, I would also agree with whoever said it is pretty insignificant. And I think the 10% came from those stats he got from guru3d, where total system power was in the 300-watt range, so 10% of 300W = 30W; just my guess after reading those posts. Seriously, try to realistically use a 20-30 watt light bulb in a dark room, or make toast using 30 watts. I've posted about these power comments before and it just makes me laugh. This is supposed to be [H]ardOCP; people on here are running overclocks, multiple GPUs, multiple monitors, enthusiast-level hardware, to do what, mostly? Play video games! Please don't tell me you're going to use 20-30 watts to some kind of advantage in your argument.

For #4, now you're somehow going to equate the 20-30W heat and power advantage that the 680 has with the $50 cheaper price tag that the 7970 has? Wow, that's a stretch, I must say. And I'll leave the "smoothness" thing alone since that's a whole different dilemma.

For the bias that Solomutt was talking about, I do see where he was coming from. I perceive it ever so subtly; it is contained in the writer's voice. But that doesn't mean I don't like Kyle and Brent's reviews, because I do. I put a lot of stock in their testing methods and real-world application, but they obviously like Nvidia better, and they don't hide what they use in their personal rigs, and that's fine. I would be the same way, and because of their testing and the way the data has come out, I plan on picking up a 670 when supply hopefully ramps up in the near future.
 
They were using the 7970 in their personal rigs before this. It's not like they just don't even touch AMD at all. nVidia came out with a better product (in their mind) and so they switched.

I get your points and I somewhat agree about the power (it's not a big deal to me personally but to say it's not for anyone is ignorant) but I guess my point was that calling them out for being biased was way uncalled for. That's like saying because you like the gameplay experience of one card over the other suddenly you are "biased". They have numbers to back it up, it's not some completely subjective thing.

And no, if you re-read his post he is suggesting that less than 10% of buyers of these cards care at all about power consumption. That is just some made-up WAG statistic with no backing.
 
They were using the 7970 in their personal rigs before this. It's not like they just don't even touch AMD at all. nVidia came out with a better product (in their mind) and so they switched.

I get your points and I somewhat agree about the power (it's not a big deal to me personally but to say it's not for anyone is ignorant) but I guess my point was that calling them out for being biased was way uncalled for. That's like saying because you like the gameplay experience of one card over the other suddenly you are "biased". They have numbers to back it up, it's not some completely subjective thing.

And no, if you re-read his post he is suggesting that less than 10% of buyers of these cards care at all about power consumption. That is just some made-up WAG statistic with no backing.

I switched to a GTX 680 from a 7970 because I felt like it; I switch between hardware all the time just for the experience. I did not think Nvidia came out with a better product, and to be honest the 7970 is slightly faster when I OC both of them, since the 680 can't scale as well, especially with the temp restriction.

The power, as I showed previously, is only a 9W difference (without OC, of course).
I don't see how that can be a big deal at stock settings, to be honest.
 
They were using the 7970 in their personal rigs before this. It's not like they just don't even touch AMD at all. nVidia came out with a better product (in their mind) and so they switched.

I get your points and I somewhat agree about the power (it's not a big deal to me personally but to say it's not for anyone is ignorant) but I guess my point was that calling them out for being biased was way uncalled for. That's like saying because you like the gameplay experience of one card over the other suddenly you are "biased". They have numbers to back it up, it's not some completely subjective thing.

And no, if you re-read his post he is suggesting that less than 10% of buyers of these cards care at all about power consumption. That is just some made-up WAG statistic with no backing.

Just to clarify, I never said people who pay $450+ for a video card don't care about power usage. I said 10% might give a shit about 20-30 watts, but even THAT percentage is probably too high.
 
Alright, if you guys wanna see bias where there is none, go for it. I'm through arguing with you.

One last thing, though; running one test and saying the difference is 9W is disingenuous when the 680 has dynamic clocking that can very well result in lower usage depending on the application. [H] changed their power usage charts specifically because of that.
 
Alright, if you guys wanna see bias where there is none, go for it. I'm through arguing with you.

One last thing, though; running one test and saying the difference is 9W is disingenuous when the 680 has dynamic clocking that can very well result in lower usage depending on the application. [H] changed their power usage charts specifically because of that.

So, the 680 uses about as much power as the 7970...?

Just because it draws less power in games like CS 1.6 doesn't mean a person plays that game all day long, so that comparison is pointless.

Heaven reflects BF3 quite well in my experience; it jumps to the second boost bin if it's under 70C.

I am not arguing with you, I am just stating the facts without any bias. ;)
And as an owner of both pieces of hardware, that's my experience...
 