GeForce GTX 680 3-Way SLI & Radeon 7970 Tri-Fire Review @ [H]

I've skipped through this thread, so excuse me if this has been asked already, but... why not test Skyrim with a few of the most popular texture mods? I'm willing to wager that most [H] readers would be using these mods anyway, so that would be more relevant. Not to mention, mods like Skyrim HD 2K are well past the WIP stage and are unlikely to change significantly in the future. And texture and lighting mods really kill performance, so they could serve as a benchmark that isn't trivialized by high-end setups.
 
While I appreciate the response, and I do understand where you are coming from, the simple fact here is that the "you can't quantify it, but you can see and feel it" line can be, and is, used to bias the findings. It's exactly right out of the nvidia marketing book. It's like watching a 100 meter dash and saying the winner wasn't the winner because he/she didn't run "smooth enough".
You can quantify it, if you have a way of measuring frame time. They have said they are working on this. I believe Techreport has a few reviews which highlight this problem.
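For anyone wondering what "measuring frame time" buys you over a plain FPS counter, here's a minimal sketch -- not [H]'s or TechReport's actual tooling, and the one-number-per-line log format is just an assumption (a real FRAPS frametimes dump is laid out differently, so treat the parsing as illustrative):

```python
# Minimal sketch: summarize average FPS vs. frame-time consistency from a
# hypothetical log with one per-frame time (in milliseconds) per line.
def summarize(frametimes_ms):
    total_s = sum(frametimes_ms) / 1000.0
    avg_fps = len(frametimes_ms) / total_s
    p99 = sorted(frametimes_ms)[int(0.99 * (len(frametimes_ms) - 1))]
    return avg_fps, p99, max(frametimes_ms)

with open("frametimes.txt") as f:                      # hypothetical log file
    times = [float(line) for line in f if line.strip()]

avg_fps, p99, worst = summarize(times)
print(f"avg FPS: {avg_fps:.1f} | 99th pct frame time: {p99:.1f} ms | worst: {worst:.1f} ms")
```

Two runs can show identical average FPS while one has far worse 99th-percentile and worst-case frame times; that's the stutter people are describing.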
 
While I appreciate the response, and I do understand where you are coming from, the simple fact here is that the "you can't quantify it, but you can see and feel it" line can be, and is, used to bias the findings. It's exactly right out of the nvidia marketing book. It's like watching a 100 meter dash and saying the winner wasn't the winner because he/she didn't run "smooth enough".

The driver issue is important, but as I have said......it applies to one TINY application of Crossfire, one that isn't utilized by many folks. I think overall AMD has really stepped up their driver quality and response to new games since the 7900 series was released. They certainly have had their issues over the past year, I'll agree. The Triple Crossfire issue will be fixed, I'm sure. It would be interesting to re-do this when a proper driver is available.

I think you are completely missing the point of these videocards. You seem to be convinced that people spend ~$1500 so they can benchmark. HardOCP (and I agree with them) believe people spend $1500 on videocards to play games. So while "smoothness" may be subjective, if it makes a difference in gameplay I definitely want to know. My 100 hours in Skyrim and 80 hours in BF3 weren't spent running FRAPS and benchmarking the games, they were spent playing.

Too many people still have the old school mentality of benchmarking. I personally do not care how "fast" my card is, I care about the quality of gameplay it can deliver. [H] has observed on several occasions that the quality of gameplay is better on SLI due to the issues AMD has with stuttering.

If you want to argue AMD benchmarks better, sure, fine; Nvidia plays better. To use your sports analogy, think gymnastics, not running. While the technical part of the performance matters, contestants are also scored on the overall feel of their performance. Nvidia feels better.
 
Yeah, I spent $1,650 on my three 7970s to play any game I want maxed out with Eyefinity. So far it has been nothing but a huge disappointment.
 
I think you are completely missing the point of these videocards. You seem to be convinced that people spend ~$1500 so they can benchmark. HardOCP (and I agree with them) believe people spend $1500 on videocards to play games. So while "smoothness" may be subjective, if it makes a difference in gameplay I definitely want to know. My 100 hours in Skyrim and 80 hours in BF3 weren't spent running FRAPS and benchmarking the games, they were spent playing.

Too many people still have the old school mentality of benchmarking. I personally do not care how "fast" my card is, I care about the quality of gameplay it can deliver. [H] has observed on several occasions that the quality of gameplay is better on SLI due to the issues AMD has with stuttering.

If you want to argue AMD benchmarks better, sure, fine; Nvidia plays better. To use your sports analogy, think gymnastics, not running. While the technical part of the performance matters, contestants are also scored on the overall feel of their performance. Nvidia feels better.

No, I am not missing the point at all.
I fully realize the gameplay aspect of these reviews and they sure are helpful. I do not need a lecture on what I do or do not feel is important.
I don't benchmark shit.
I play games......I wish I had the time to put in as many hours on a game as you do.

I do think framerate contributes to how well a particular card can allow you to play a game.
I would venture BF3 at 26 FPS would suck pretty hard.

The whole premise of my argument is: to dismiss the performance of a GPU, despite its superiority, because it doesn't "feel right" to that individual reviewer is bias, pure and simple.

GoGo nvidia.:rolleyes:
 
I'm not sure what you're getting at magoo, I like the 7900 and I think it's killer for single-screen Crossfire. I still use that box a lot despite having owned both 680 SLI and 7970 CF...... I don't have any experience with 3D Surround, and many people seem to have issues mainly with that -- Surround. I can't speak for it, but I definitely don't get the warm fuzzies. If anything, someone needs to light a fire under AMD's software team; I'm anxious to see how/if they respond.
 
I'm still using the 69xx series, and TriFire scaling is beyond pathetic with a lot of games I've tried. It seems to be a driver juggling contest to get games to make use of it. With the new 12.4 drivers, TriFire in Eyefinity is still broken: BF3 loads up about 55% per GPU and that's it, Crysis 2 is the same deal, and I'm pretty sure TF2 for whatever reason only makes use of one GPU...

Really regretting even opting for 3 GPUs. At the time it seemed like a natural progression, as 6970 Crossfire worked really well; the reality is it's nothing more than a headache. Why even make it an option if support for it is near nonexistent?
 
The whole premise of my argument is: to dismiss the performance of a GPU, despite its superiority, because it doesn't "feel right" to that individual reviewer is bias, pure and simple.

GoGo nvidia.:rolleyes:

How is an experience superior if a benchmarking tool records higher frame rates yet the gaming experience isn't as good? You can't be saying that higher frame rates ALWAYS mean a better gaming experience, because any gamer with experience knows that's simply not true.

And again, this particular review as you have repeatedly said yourself is about a very specific and not widely used configuration. A configuration that you have said repeatedly is so rare that AMD shouldn't make it a priority. And it looks like they are taking your advice.

Multi-GPU beyond 2 cards and Eyefinity is simply not working particularly well on the 7970s currently. That doesn't mean that it is a bad card. The 680 has its weaknesses as well such as GPU compute power.

For whatever reason you think it's biased to point out this issue with the 7970s, but at the same time you claim that since this is a setup used by so few people, AMD shouldn't give it priority. Ok, fair enough, but that's as biased as anything in this review.
 
How is an experience superior if a benchmarking tool records higher frame rates yet the gaming experience isn't as good? You can't be saying that higher frame rates ALWAYS mean a better gaming experience, because any gamer with experience knows that's simply not true.

And again, this particular review as you have repeatedly said yourself is about a very specific and not widely used configuration. A configuration that you have said repeatedly is so rare that AMD shouldn't make it a priority. And it looks like they are taking your advice.

Multi-GPU beyond 2 cards and Eyefinity is simply not working particularly well on the 7970s currently. That doesn't mean that it is a bad card. The 680 has its weaknesses as well such as GPU compute power.

For whatever reason you think it's biased to point out this issue with the 7970s, but at the same time you claim that since this is a setup used by so few people, AMD shouldn't give it priority. Ok, fair enough, but that's as biased as anything in this review.

I think you missed my point there.
I have no problem with the article pointing out that AMD should up their game in software support, man I've been on that bandwagon too.
Yup, triple GPU is really a tiny niche of people, and I really doubt the utility of this sort of review, other than to demonstrate a whole table full of hardware that few people want or can afford. Yeah it's really nice to look at......but not very many people have it, thus the real need to jump AMD about this point in the article is suspect.....but not biased.....it's the truth. Does the use of three GPUs translate or trickle down to dual or single card performance? How can you know that when the three-monitor software on AMD's side is "ancient" and not relevant to dual or single card users?

The bias in the article comes from using an (up to this point in time) unmeasurable, SUBJECTIVE "feeling" about what you are measuring, comparing it to something you can measure, and then declaring the inferior product better because it "feels smoother". (This is not to say nvidia's GPUs were inferior in each game; they weren't, obviously.)
That would require a "Pepsi challenge" in my book.
I don't know, I just get this gut feeling that the nvidia product was destined to be better, even before the thing was done.

It's just my opinion. I don't have a horse in this race. I own both companies' products and have no preference either way.:D
 
You can quantify it, if you have a way of measuring frame time. They have said they are working on this. I believe Techreport has a few reviews which highlight this problem.

Which has been stated multiple times in multiple threads. I just think magoo is purposely trying to troll. I will believe the subjective opinion of [H] and the comments supporting their analysis from the users over his one flawed viewpoint.

Because this phenomenon is exactly why I like the adaptive v-sync tech in NV's new drivers. My games feel smoother at the same average framerates due to it.
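Roughly, the idea behind adaptive v-sync is: leave v-sync on while the GPU can hold the refresh rate (no tearing), and turn it off when the frame rate drops below refresh so you don't get locked down to half refresh and stutter. A toy sketch of that decision, just the concept and not NVIDIA's actual driver logic:

```python
# Toy illustration of the adaptive v-sync idea -- not NVIDIA's driver code:
# sync while we can sustain the refresh rate, otherwise present immediately
# instead of stalling down to refresh/2.
REFRESH_HZ = 60.0

def wait_for_vblank(recent_frame_times_ms):
    avg_ms = sum(recent_frame_times_ms) / len(recent_frame_times_ms)
    return (1000.0 / avg_ms) >= REFRESH_HZ  # True: v-sync on, False: tear rather than stutter

print(wait_for_vblank([14.0, 15.5, 16.0]))  # ~66 FPS -> True, stay synced
print(wait_for_vblank([22.0, 25.0, 21.0]))  # ~44 FPS -> False, don't force 30 FPS
```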
 
The whole premise of my argument is: to dismiss the performance of a GPU, despite its superiority, because it doesn't "feel right" to that individual reviewer is bias, pure and simple.

GoGo nvidia.:rolleyes:

Product reviews by nature are always subjective. When reviewing any product you can have stats, but there is always the subjective opinion of the reviewer. For example with cars, you can compare horsepower etc., but the reviewer will still give you an opinion on which model feels better to drive. This is not bias; bias implies that someone's opinion is predetermined by unrelated factors. Whether a game feels "smooth" is not an unrelated factor, it is pretty directly related to enjoyment of a game.

I think you either don't understand the meaning of the word bias, or don't understand that a review is simply the opinion of one person on a specific product, with information backing up why he came to that conclusion. Brent did a very good job explaining why he came to his conclusion. It could also be that [H] just isn't the place for you. You seem to prefer simple benchmarks instead of actual gameplay experiences; there are plenty of places which will offer benchmarks and never actually play anything with the cards in question.
 
Two questions Brent,

Does AMD take these criticisms to heart? I mean, are they completely fucking oblivious to what various hardware websites are saying? It's kind of disheartening to see this stuff, because I like rooting for AMD (I like pulling for the underdog, I guess). If they made any fucking attempt to expand their software team like nvidia has, I swear, they would be much better off. I guess AMD's upper management doesn't give two fucks about what websites are saying.
You answered your own question, no.
 
I think you missed my point there.
I have no problem with the article pointing out that AMD should up their game in software support, man I've been on that bandwagon too.
Yup, triple GPU is really a tiny niche of people, and I really doubt the utility of this sort of review, other than to demonstrate a whole table full of hardware that few people want or can afford. Yeah it's really nice to look at......but not very many people have it, thus the real need to jump AMD about this point in the article is suspect.....but not biased.....it's the truth. Does the use of three GPUs translate or trickle down to dual or single card performance? How can you know that when the three-monitor software on AMD's side is "ancient" and not relevant to dual or single card users?

The bias in the article comes from using an (up to this point in time) unmeasurable, SUBJECTIVE "feeling" about what you are measuring, comparing it to something you can measure, and then declaring the inferior product better because it "feels smoother". (This is not to say nvidia's GPUs were inferior in each game; they weren't, obviously.)
That would require a "Pepsi challenge" in my book.
I don't know, I just get this gut feeling that the nvidia product was destined to be better, even before the thing was done.

It's just my opinion. I don't have a horse in this race. I own both companies' products and have no preference either way.:D

I think you're calling personal opinion bias, and the two aren't synonymous. Brent presented his empirical data and simply noted his personal experience. As long as he was being honest in conveying that personal experience, that's not bias.

But I guess I can see why you might think there is bias here as the review does make note at the beginning of the less than fast response by AMD to get TriFire and Eyefinity working in their latest drivers.
 
You answered your own question, no.

Thanks for the worthless response :rolleyes: my question is, does AMD reply to the emails and feedback they get, since Brent & Kyle have more direct contact with AMD PR than we do? I just want to know if they're planning on fixing their shit or not. When Kyle fires off an email to AMD saying, "your shit is fucked up," do they plan on fixing it? The status quo is clearly not cutting it right now at AMD.

Do not reply to this please Dark, thanks.
 
While I'm not making any excuses, and if I had three GPUs I'd be screaming pretty loud, there aren't many people in that crowd.

btw, that's an excuse.


Product reviews by nature are always subjective. When reviewing any product you can have stats, but there is always the subjective opinion of the reviewer. For example with cars, you can compare horsepower etc., but the reviewer will still give you an opinion on which model feels better to drive. This is not bias; bias implies that someone's opinion is predetermined by unrelated factors. Whether a game feels "smooth" is not an unrelated factor, it is pretty directly related to enjoyment of a game.

I think you either don't understand the meaning of the word bias, or don't understand that a review is simply the opinion of one person on a specific product, with information backing up why he came to that conclusion. Brent did a very good job explaining why he came to his conclusion. It could also be that [H] just isn't the place for you. You seem to prefer simple benchmarks instead of actual gameplay experiences; there are plenty of places which will offer benchmarks and never actually play anything with the cards in question.

Well said!

Great review guys, glad you found a workaround to the BSOD issue. I am just lol'ing at all the bitching/conspiracy theories going on. How about we just call the contest a draw at 0 FPS for all configs: without a workaround the 7970s would BSOD = 0 FPS, and with the lack of availability, 3x 680s can't be placed at everyone's doorstep = 0 FPS.

The arguments just make me laugh.
 
The driver issue is important, but as I have said......it applies to one TINY application of Crossfire, one that isn't utilized by many folks. I think overall AMD has really stepped up their driver quality and response to new games since the 7900 series was released. They certainly have had their issues over the past year, I'll agree. The Triple Crossfire issue will be fixed, I'm sure. It would be interesting to re-do this when a proper driver is available.

The main benefit of TriFire/SLI is to run Eyefinity/Surround at high resolutions.

Only the initial driver released in January (3 months ago) works in this configuration.

The review is SPECIFICALLY focused on Tri-SLI vs. Tri-CrossFire

People who spent $1,500+ on video cards are unable to use current drivers; this is a show-stopper bug. Five months to fix it, while NV works right out of the box.

Why wouldn't they focus on this?

Also, awesome to see 3 top-of-the-line cards + an OC'd 2600K drawing only 650-800W from the wall. No need for 1500W PSUs here.
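Quick back-of-the-envelope on the PSU point (the ~88% efficiency figure is my own assumption for a decent 80 Plus unit at that load, not something measured in the review):

```python
# Rough PSU sizing from the wall-draw figures above. The efficiency number is
# an assumption, not a measurement from the review.
wall_watts = 800.0          # worst case seen at the wall
efficiency = 0.88           # assumed PSU efficiency at that load
dc_load = wall_watts * efficiency
print(f"Approx. DC load on the PSU: {dc_load:.0f} W")   # ~700 W
# A good 1000-1200 W unit already has plenty of headroom; a 1500 W PSU would
# spend its life well below its efficiency sweet spot.
```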
 
btw, that's an excuse.




Well said!

Great review guys, glad you found a workaround to the BSOD issue. I am just lol'ing at all the bitching/conspiracy theories going on. How about we just call the contest a draw at 0 FPS for all configs: without a workaround the 7970s would BSOD = 0 FPS, and with the lack of availability, 3x 680s can't be placed at everyone's doorstep = 0 FPS.

The arguments just make me laugh.

That's an interesting way to look at it. :D
 
I also question this logic: Yes, AMD *can* fix these driver issues, but considering that they have had microstuttering problems for the last 3 generations of cards (worse than NVIDIA's, at any rate) and they haven't fixed the Eyefinity bug with 3 cards in the last 5 months, what makes you think they are going to fix them going forward?

You're right that the track record is bad, but I'd wager that between AMD's new management and the fact that multi-card and multi-monitor gaming is finally beginning to gain some mainstream traction, AMD will put more emphasis on this. In any case, I'm going to go AMD for graphics because my mobo won't do SLI, and I'm not going to upgrade it for a while yet.
 
The whole premise of my argument is: to dismiss the performance of a GPU, despite its superiority, because it doesn't "feel right" to that individual reviewer is bias, pure and simple.

GoGo nvidia.:rolleyes:
No, it's not. It's clearly stated that this is their personal opinion. With every video card / CPU / other hardware release, the reviewers at [H] get accused of bias by some overzealous fanboy, and the really funny thing is that every time it happens, it's a different company than the previous time that they're being accused of bias towards. What it comes down to is that people are butthurt that "their" company isn't the favourite at any given time.

I've been reading [H] reviews for 10 or so years (even if I've been a member for less than that), and my experience over those ten years is that nobody who works for [H] is biased towards any particular company... they support whichever card provides the best experience at any given time, as they should.

I'd almost be willing to lay money on someone claiming they're biased towards AMD after reading this article which was just posted:

http://www.hardocp.com/article/2012/04/26/sapphire_hd_7870_oc_edition_video_card_review

Not that there's any bias in it; it's just the way of fanboys to complain and moan and accuse when "their" company isn't "winning".
 
I've been reading [H] reviews for 10 or so years (even if I've been a member for less than that), and my experience over those ten years is that nobody who works for [H] is biased towards any particular company... they support whichever card provides the best experience at any given time, as they should.

Agreed. I have been reading the [H] since 2000 and I have seen them being accused of favoring NV when NV was kicking ass, then they were accused of favoring ATI when ATI was kicking ass. The same goes for AMD/Intel. It's hilarious when you have the overall picture and these kids start making sweeping assumptions based on the particular article they read that day.

It all has happened before and will happen again. I honestly don't know how Kyle, Steve and company do it with all the misinformed criticism. Especially when the content is free.
 
Agreed. I have been reading the [H] since 2000 and I have seen them being accused of favoring NV when NV was kicking ass, then they were accused of favoring ATI when ATI was kicking ass. The same goes for AMD/Intel. It's hilarious when you have the overall picture and these kids start making sweeping assumptions based on the particular article they read that day.

It all has happened before and will happen again. I honestly don't know how Kyle, Steve and company do it with all the misinformed criticism. Especially when the content is free.

It's kinda weird how the number of AMD fanboys has increased A LOT since I joined ~1.5 years ago; it seems either a lot of the Nvidia fanboys got banned or have been warned of bans, or they just don't post as much anymore. In every goddamn thread, be it the Intel subforums or the Nvidia subforums, there is some AMD kid going in saying the company is crap and that AMD is the best in the world. Why can't anyone just be neutral :confused:
 
Do the additional PCIe lanes from an NF200 chip (like on the Asus WS Revolution mobo used for this article) perform identically to the additional (>16) PCIe lanes on an X58 or X79 platform (in PCIe 2.0 mode)? Is there any benefit to having more "native" PCIe lanes in the chipset (LGA1366) or on-die (LGA2011) when compared to adding an NF200 chip to an LGA1155 mobo?
 
Does anyone have Terry Makedon's email address? Please? Can we petition AMD to put him back on the fucking driver team along with 3-4 more capable bodies?

I'm really tired of this shit. People who spend $1,500 on GPUs shouldn't have a subpar experience. It's really frustrating, because I think the 7970s are great hardware. I've really enjoyed them in Crossfire, but I only do single screen. Seeing these bad stories about Eyefinity leaves a bad taste in my mouth.
 
Does anyone have Terry Makedon's email address? Please? Can we petition AMD to put him back on the fucking driver team along with 3-4 more capable bodies?

I'm really tired of this shit. People who spend $1,500 on GPUs shouldn't have a subpar experience. It's really frustrating, because I think the 7970s are great hardware. I've really enjoyed them in Crossfire, but I only do single screen. Seeing these bad stories about Eyefinity leaves a bad taste in my mouth.

First off, I will say that I wouldn't touch a 3x CF rig, but for more reasons than in this review.

But don't let reviews tell you what you want or need. It's good to have experienced advisors, and sites like [H] provide a great service. That said, you really have to try things for yourself. Not necessarily cheap or easy, but necessary, unfortunately.
 
First off, I will say that I wouldn't touch a 3x CF rig, but for more reasons than in this review.

But don't let reviews tell you what you want or need. It's good to have experienced advisors, and sites like [H] provide a great service. That said, you really have to try things for yourself. Not necessarily cheap or easy, but necessary, unfortunately.

Yeah, point taken, I have tried both sides and have purchased both 680s and 7970s. Honestly, with the 7970s overclocked they won a lot of games on single-screen 2560 resolution, so I ended up selling. Not because the 680s were bad, they were quieter and had some cool features, but they just weren't an upgrade at all and several games were slower on them (i.e. Crysis, Metro 2033, The Witcher 2, Alan Wake). The 7970s I used in CF overclocked like crazy, so it probably wasn't a fair comparison, though.

Anyway, it's just frustrating. My personal opinion is that the hardware is good, but the software team behind it is behind the curve in terms of supporting the people that spend the most on their products. Not supporting someone that spends $1,500 on an AMD TriFire setup? Ridiculous. I just feel like venting, because I like rooting for AMD since they're the underdog, yet they don't throw their resources where they're needed the most: in software development for their hardware. It wasn't like this during the 5000 series days that I remember; AMD had a lot of positive press, they had rolled out Eyefinity, and all was great. What went wrong? I don't know, but they need to throw more bodies at their software development, period.
 
Silly question time: is the driver restriction Trifire only, or does it affect Quadfire as well?
 
Yeah, point taken, I have tried both sides and have purchased both 680s and 7970s. Honestly, with the 7970s overclocked they won a lot of games on single-screen 2560 resolution, so I ended up selling. Not because the 680s were bad, they were quieter and had some cool features, but they just weren't an upgrade at all and several games were slower on them (i.e. Crysis, Metro 2033, The Witcher 2, Alan Wake). The 7970s I used in CF overclocked like crazy, so it probably wasn't a fair comparison, though.

Anyway, it's just frustrating. My personal opinion is that the hardware is good, but the software team behind it is behind the curve in terms of supporting the people that spend the most on their products. Not supporting someone that spends $1,500 on an AMD TriFire setup? Ridiculous. I just feel like venting, because I like rooting for AMD since they're the underdog, yet they don't throw their resources where they're needed the most: in software development for their hardware. It wasn't like this during the 5000 series days that I remember; AMD had a lot of positive press, they had rolled out Eyefinity, and all was great. What went wrong? I don't know, but they need to throw more bodies at their software development, period.

Basically, AMD is trying to reform (destroy) ATI. The following article is a hint: http://www.tomshardware.com/news/AMD-ATI-All-In-Wonder-Llano-Godfrey-Cheng,15189.html
 
so what impact would 4GB cards have on this?

ZERO. The GTX 680 is a bandwidth-limited card.

So no, even with 10GB of VRAM you won't get an increase, because of the 256-bit memory interface and its 192GB/s maximum transfer rate.
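For what it's worth, that 192GB/s number falls straight out of the bus width and the effective memory clock (a quick sanity check, assuming the GTX 680's stock 6Gbps GDDR5):

```python
# Sanity check of the quoted peak bandwidth figure for a stock GTX 680.
bus_width_bits = 256
effective_gbps_per_pin = 6.0   # 6 Gbps effective GDDR5 data rate per pin
peak_gb_per_s = bus_width_bits / 8 * effective_gbps_per_pin
print(f"Peak memory bandwidth: ~{peak_gb_per_s:.0f} GB/s")   # ~192 GB/s
```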
 
ZERO. The GTX 680 is a bandwidth-limited card.

So no, even with 10GB of VRAM you won't get an increase, because of the 256-bit memory interface and its 192GB/s maximum transfer rate.
You don't have to have more bandwidth just to utilize more VRAM.
 
I saw 2 reviews in which there were 0 to 2% gains from 2GB to 4GB GTX 680 cards (multi-display 5K res), but they tested with stock cards.
 
Hahah. Great response, although he's right.
No, he's not right. VRAM is (more or less) storage space, and you can certainly run out of VRAM in a situation where the actual scene being rendered isn't taxing, e.g. Skyrim with HQ texture mods. I have yet to see any indication that the 7970's memory bandwidth is of any performance benefit, since it loses to the 680 in the majority of benchmarks even at 5760x1200 on multi-GPU setups.
 
No, he's not right. VRAM is (more or less) storage space, and you can certainly run out of VRAM in a situation where the actual scene being rendered isn't taxing, e.g. Skyrim with HQ texture mods. I have yet to see any indication that the 7970's memory bandwidth is of any performance benefit, since it loses to the 680 in the majority of benchmarks even at 5760x1200 on multi-GPU setups.

http://www.overclock.net/t/1196856/official-amd-radeon-hd-7950-7970-owners-thread/9960#post_17097247

FYI this feedback is from xoleras (a user who is also on hardforum), who has run both 680 SLI and 7970 CF on single-screen 2560x1600, so enough with the lies that the "7970 loses to GTX 680 in the majority of benchmarks" :D

As for the GTX 680's bandwidth limitations being a factor, look at the following examples, where the perf scaling from GTX 580 to 680 is < 20%. The GTX 680 has plenty of shading power over the GTX 580 (1536 shaders at 1100 MHz vs 512 shaders at 1550 MHz).

http://www.hardware.fr/articles/857-12/benchmark-alan-wake.html (10% scaling at 1080p max)
http://www.guru3d.com/article/geforce-gtx-680-review/18 (18% scaling at 1080p)
http://www.anandtech.com/show/5699/n...x-680-review/7 (17% scaling at 1080p)

These are some of the most demanding games out right now. Looking to the future, I would say the GTX 680 will run into such scenarios even more often.
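To put rough numbers on the shading power vs. bandwidth point (peak theoretical throughput only, using the clocks quoted above and stock memory specs; treat it as a back-of-the-envelope, not a benchmark):

```python
# Peak theoretical FP32 throughput (2 FLOPs per shader per clock, FMA) vs.
# peak memory bandwidth, using the clocks quoted above. Stock memory assumed.
def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

gtx580 = tflops(512, 1550)    # ~1.6 TFLOPS, ~192 GB/s (384-bit @ 4 Gbps)
gtx680 = tflops(1536, 1100)   # ~3.4 TFLOPS, ~192 GB/s (256-bit @ 6 Gbps)
print(f"GTX 580: {gtx580:.2f} TFLOPS   GTX 680: {gtx680:.2f} TFLOPS")
# Roughly double the shading power feeding essentially the same memory
# bandwidth -- which is why <20% scaling in the games linked above points
# at a bandwidth bottleneck.
```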
 
http://www.overclock.net/t/1196856/official-amd-radeon-hd-7950-7970-owners-thread/9960#post_17097247

FYI this feedback is from xoleras (a user who is also on hardforum), who has run both 680 SLI and 7970 CF on single-screen 2560x1600, so enough with the lies that the "7970 loses to GTX 680 in the majority of benchmarks" :D

As for the GTX 680's bandwidth limitations being a factor, look at the following examples, where the perf scaling from GTX 580 to 680 is < 20%. The GTX 680 has plenty of shading power over the GTX 580 (1536 shaders at 1100 MHz vs 512 shaders at 1550 MHz).

http://www.hardware.fr/articles/857-12/benchmark-alan-wake.html (10% scaling at 1080p max)
http://www.guru3d.com/article/geforce-gtx-680-review/18 (18% scaling at 1080p)
http://www.anandtech.com/show/5699/n...x-680-review/7 (17% scaling at 1080p)

These are some of the most demanding games out right now. Looking to the future, I would say the GTX 680 will run into such scenarios even more often.

Good grief dude, you don't have to quote me on this stuff. Most people here are smart enough to make purchasing decisions. Anyway, like I said in that post, 680 SLI just wasn't an upgrade; they're about the same when the 7970 is OC'ed, and my 7970 CF box was OC'ed like crazy anyway. I do enjoy my 7970 CF, so there.

They're both good cards, but AMD needs to fix their Eyefinity/TriFire shit. Thank goodness I use a single screen. Do people at AMD even see these reviews and/or care about them? Seriously, WHAT THE F? Also, someone petition Terry Makedon and 10 more bodies to go to the AMD software dev team. Brent, I'm counting on you to stay on AMD's ass until they fix their Eyefinity.....I'm not using it, but I would be pretty pissed if I were and had to deal with this bullshit...
 
xoleras
I am using your feedback to reply to some inaccurate statements. If you feel I should not, I won't.
 