GeForce GTX 680 3-Way SLI & Radeon 7970 Tri-Fire Review @ [H]

"Thanks for the email – yes this is a known issue, and is certainly considered a critical issue to be resolved ASAP. Catalyst 12.4 will be releasing on Wednesday and will include a couple of Eyefinity fixes (noted below). This specific issue you refer to is actively being investigated and we are trying to get it resolved for Catalyst 12.5; as soon as we have a driver release with the fix I will send it your way to verify that it resolves the issue for you as well. -Andrew Dodd"

My name is Andrew Dodd!!! F'ckin A!!!

You're Andrew Dodd? So this is all your fault? Okay guys, LYNCH HIM! Just kidding :cool:
 
FINALLY. A review that says it's not all about frame rates, but about the overall smoothness of the video.
 
Why the fuck can no one ever just say the HD 7970 is not a better card than the GTX 680? Every fucking review here comparing the 7970 and 680 is said to have some kind of error that invalidates the entire review (you didn't say this in particular), just because the HD 7970 doesn't win. Just fucking accept the fact that the 7970 IS NOT the best card out right now, and that the whole "MOAR VRAM IS BETTER LOLZ" argument is complete horseshit, as seen in this review and others: when the details are cranked up, the card itself (the 7970) doesn't even have enough power to make real use of the extra VRAM.

You are correct, I didn't say the HD 7970 was the best, worst or anything in-between.
These two cards do some things equally well, some worse, some better.
That said, especially in this kind of a review, I'd wait until the problem is fixed.....no, the whole review isn't invalid; there IS a driver that works and the CAPs all work. It would just be nice to use a current driver......but it doesn't exist. It's like fighting with one arm behind your back.....unless you're Mike Tyson, you can bet your ass is gonna get kicked.:D

This makes no sense. The 7970 was launched three months before the 680, and cards like the 7970 and 680 are supposed to support these kinds of exotic setups. If AMD doesn't care about this market and months later still hasn't fixed its CF and Eyefinity issues in its latest drivers, then just tell us: how long should we wait? For the AMD 8000 series?

I know that there are a lot of people complaining about the fairness of this review. I won't comment on that, but the fact that in three months AMD hasn't bothered to fix their drivers to support users like myself tells me that they aren't interested in my business. Like you said, it's a small user base, and that's fine. This member of that small user base seems to have a better-supported product from nVidia. And in my case, being a 3D user, AMD isn't even an option anyway.

AMD just doesn't have the software support that nVidia does currently.

If you will recall, AMD started this entire "exotic" use of multi-gpus and multiple monitors.
It's currently used by many people, but not many triple GPU users and certainly not many 3D users.
While I'm not making any excuses, and if I had three GPUs I'd be screaming pretty loud, there aren't many people in that crowd. AMD is certainly focusing on other aspects of their GPU software.
I'd just like to see the review done when AMD has had a chance to fix their problems, instead of running a comparison of two VERY expensive systems with one having gimped software.

You are more than correct, nvidia has the software for this small segment and for your niche of using 3D Surround.

That said, as I said before, my Crossfire and Eyefinity are working just fine.:)
 
I just came from Crossfire 7970s to 680s in SLI, and I have had multiple setups from both AMD and Nvidia. SLI does normally "feel" better overall, and my switch was due to the driver issues AMD is not working on quickly enough for me. I only run a single 2560x1600 (in SLI) currently, but I have a 47" and two Dell U2410s on either side of it. I have tried Eyefinity on three 24" monitors but not Nvidia's Surround with SLI. The AMD solution worked decently, and I have always appreciated AMD's display setup, as it has worked well for me. I still have to rearrange my displays every time I get out of SLI, but the lag in the Nvidia control panel seems minimal these days compared to my 285s in SLI.

Great review. I'm always looking to move up someday to a nice 3-monitor setup, so thanks Kyle!
 

nVidia started multi-GPU and AMD started multi-monitor, at least at the consumer level. nVidia did have multi-monitor support in their professional GPUs, and AMD forced nVidia to bring it into the gaming market, so kudos to AMD for that.

When AMD gets out a new driver, then sure, I guess [H] should do this again. But this has been going on for almost 4 months now; sooner or later it's not going to matter with this generation of cards anyway.
 
Thanks for the review. I must say it's pretty weird reading it right after reading this review of 2-way, 3-way and 4-way SLI/Crossfire:

http://www.vortez.net/articles_pages/gtx680_quad_sli_vs_hd7970_quad_crossfirex,1.html

In their review, Crossfire consistently provides better scaling and FPS than SLI, but they don't actually seem to 'play' the games they're testing, so an evaluation of the 'feel' of the games is missing. That's the sort of thing I very much value here at [H].

Yes, I'm disappointed that AMD hasn't yet fixed some driver issues, but at least they CAN be fixed. Buying a 2GB card for my Eyefinity setup, however, is something that cannot be fixed with a driver update. I must say I'm probably going to grab a 7950, like the Gigabyte or Asus custom versions, and a second 7950 in a few months to Crossfire, as I'll be damned if I'm going to buy a $500 graphics card that's already running out of texture memory in some games the day I throw it into my system.
You won't run out of texture memory, at least on current games, on a 2GB card running 5040x1050. You will be performance limited by the 2 card setup at that resolution before you will run out of VRAM, except in very specific circumstances like Skyrim with heavy texture mod use. I also question this logic:
Yes, I'm disappointed that AMD hasn't yet fixed some driver issues, but at least they CAN be fixed. Buying a 2GB card for my Eyefinity setup, however, is something that cannot be fixed with a driver update.
Yes, AMD *can* fix these driver issues, but considering that they have had microstuttering problems for the last three generations of cards (worse than NVIDIA's, at any rate) and they haven't fixed the Eyefinity bug with 3 cards in the last 5 months, what makes you think they are going to fix them going forward?
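
For anyone curious where the VRAM actually goes at that resolution, here's a rough back-of-the-envelope sketch (the texture budget is an assumed number purely for illustration, not a measurement): the render targets themselves at 5040x1050 are only a few tens of MB, so it's textures, especially modded ones, that actually eat into a 2GB card.

```python
# Back-of-the-envelope VRAM estimate for a 5040x1050 Eyefinity setup.
# The texture budget below is an assumed, illustrative figure, not a measurement.

BYTES_PER_PIXEL = 4                      # 32-bit RGBA
width, height = 5040, 1050

framebuffer_mb = width * height * BYTES_PER_PIXEL / 1024**2
# A modern renderer keeps several full-screen targets around (back buffer,
# depth, G-buffer layers, post-processing); call it 6 as a ballpark.
render_targets_mb = 6 * framebuffer_mb

assumed_texture_budget_mb = 1500         # hypothetical heavy texture-mod scenario

total_mb = render_targets_mb + assumed_texture_budget_mb
print(f"One full-screen buffer:        {framebuffer_mb:.0f} MB")
print(f"~6 full-screen render targets: {render_targets_mb:.0f} MB")
print(f"With {assumed_texture_budget_mb} MB of textures:      {total_mb:.0f} MB of a 2048 MB card")
```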
 
It would be so cool if [H] was allowed to go into either the AMD or Nvidia offices and visit the staff in charge of writing the drivers. They could perhaps show us some of their testing setups and how they go about squashing bugs.

Kind of like your Global Foundries visit, just more up close and personal.

Think AMD/Nvidia would ever go for something like that?
 
biast review. I personally own both cards in trifire/sli and this reviewer is full of it.
 
I've been running at least two GPUs for the last 5 years. I guess my eyes are bad.
I've been using multiple GPUs since the 5870s, and I only noticed it in some games, and it was very frame rate dependent. But I couldn't play at 5760x1200 in most games because even though I was getting "playable" frame rates, e.g. 40-50fps in BC2, the game felt choppy and unplayable. Obviously this is a YMMV issue and not everyone is as sensitive to it, but you can't simply discount this, especially since most people don't have the luxury of "try before you buy" with these types of setups.

No I don't. Maybe to a handful of people, but not many.
I'm sorry but this makes no sense. You said "Let's be honest, it's a problem for about two dozen people......I mean really, all those running triple GPU solutions, please raise your hand.....So should AMD make this an emergency? or should they just take their time and fix it right?" - but this review is specifically for people who are running or are interested in running 3 GPUs. If you aren't going to run 3 GPUs, then no, this isn't important, but if you ARE, this is a huge show stopper issue, so it's totally relevant and important to this review.

It's not the GPUs, it's the software and the games. There's not a game out that needs three cards.
I guess "needs" is subjective, but there are definitely games out there that can take advantage of multiple video cards at multimonitor resolutions. BF3 gains noticeably from a 3rd card.

and you know this how? I have not had a single issue and I've had the 7970 pair I currently run since the series was released. I have not had a single issue with Crossfire+Eyefinity. But that's just me. MMV I guess.:D
I haven't personally used the 7000 series cards, but from the overall posting trends on the forum, it seems like a lot more users are experiencing problems with AMD's drivers in CFX on the 7000 series than are having problems with NVIDIA's 500/600 series. My personal experience over the history of using both companies' GPUs is that NVIDIA generally has a more reliable driver for multiple GPUs. I did recommend that my friends grab 6950 CFX back when those came out, though, and I think AMD makes excellent single-GPU solutions; I just don't think I would be comfortable with AMD's 7000 series driver situation right now if I were buying for myself.
 
Sorry to be critical, but someone should bring this up:

You use TrSS vs SSAA in apples-to-apples comparisons. What? That is most certainly NOT apples to apples. Transparency AA is not full scene; TrSS is NOT full-scene SSAA. If you want to use SSAA in these comparisons you have to force it in nvidia inspector. You compare TrSS vs SSAA in several comparisons, and that's kind of annoying because TrSS only covers transparent textures while SSAA covers everything. If you want SSAA, use SGSSAA in nvidia inspector and make the comparison; otherwise you're comparing two completely different things. TrSS is nowhere NEAR as good as SSAA and you pass the comparisons off as if they're the same thing.

If you really want apples to apples you should do these tests by choosing "override application setting" in nvidia inspector and selecting SGSSAA. That is the only means for an apples-to-apples comparison. TrSS is nowhere near as intensive or as good as full-screen SSAA. The other issue is verifying that the override happens; the nvidia inspector override often doesn't work, so it should be verified to be working. But you can rest assured that SGSSAA will be a lot slower than TrSS, and it will be a more valid comparison; as of right now it is not a valid comparison. This doesn't nullify your findings, nvidia is better for scaling beyond two GPUs for sure, but ultimately your SSAA vs TrSS comparison is flawed unless I'm really missing something.
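
To put very rough numbers on why transparency SSAA is so much cheaper than full-scene SSAA, here's a quick sketch; the alpha-tested coverage fraction is purely an assumed figure for illustration.

```python
# Rough shading-cost comparison: 4x full-scene SSAA vs. 4x transparency-only SSAA.
# The alpha-tested coverage fraction is an assumed, illustrative value.

width, height = 5760, 1200
base_samples = width * height          # one shaded sample per pixel with no SSAA

full_scene_4x = 4 * base_samples       # 4x full-scene SSAA shades every pixel 4 times

alpha_coverage = 0.10                  # assume ~10% of the screen is alpha-tested foliage/fences
trssaa_4x = base_samples + 3 * alpha_coverage * base_samples  # extra samples only on alpha textures

print(f"No SSAA:      {base_samples / 1e6:.1f} M shaded samples per frame")
print(f"4x full SSAA: {full_scene_4x / 1e6:.1f} M shaded samples per frame")
print(f"4x TrSSAA:    {trssaa_4x / 1e6:.1f} M shaded samples per frame "
      f"(assuming {alpha_coverage:.0%} alpha coverage)")
```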

I'm not sure which graph you are referring to, but I think it might be this one? - http://www.hardocp.com/image.html?image=MTMzNTI2MjQ0MHdIS1VwNnhNVldfN18zX2wuZ2lm

The point of that graph was to show two things: 1.) The dips in performance that occur when 4X TR SSAA is turned on with the 3-way SLI config, showing that gameplay suffers with that quality setting turned up. 2.) That the TriFire config is unplayable with Adaptive SSAA turned on, to a much greater degree than 3-way SLI, which makes it not very useful for this game, even in TriFire.

As for calling them SSAA, I'm calling them exactly what both NVIDIA and AMD call the features they have implemented in their drivers. On the NV side, it is called Transparency Supersample Antialiasing. On the AMD side, it is called Adaptive Supersample Antialiasing. The technologies are based on SSAA; in AMD's case it is an adaptive algorithm that determines what objects receive SSAA, and in NVIDIA's case it kinda just works on everything, and it works well. Turn it on in Skyrim and watch those jaggies disappear on alpha textures. So yes, it is SSAA. If you've got a problem with how NVIDIA and AMD do things, I suggest taking it up with them; it is their technology, and we just use what's available from both to further improve image quality in games beyond what a game offers.

We always go with what is officially supported in the driver and control panel presented to the gamer, with no third-party software or hacking to exploit features. NVIDIA and AMD offer these options in the control panel, and therefore we use them to improve image quality and compare image quality between them. We always use the default driver configs.
 
I can see your point about using what's in the control panel, but you don't need third-party software to use SGSSAA on Nvidia hardware; there is an official tool on Nvidia's website.

MSAA + Super Sampled Transparencies is not apples to apples with AMD's Adaptive SSAA (full scene SGSSAA).

Nvidia specifically states it is Transparency-SSAA in the control panel, btw, but I can see the confusion.
 

It's as close as you can get between the two with what's in the control panel. The end user/gamer is going to sit down, and if they want TR SSAA from NVIDIA they are going to use that option, which does reduce aliasing on alpha textures. If they want Adaptive SSAA from AMD, they are going to use the Adaptive SSAA slider, which does reduce aliasing on alpha textures. In this regard, they are comparable; those are the usable options that relate to SSAA from the control panel.
 

Thanks for the reply. Just one point: transparency supersampling only covers transparency textures and thus has a very minor performance hit compared to "true" fullscreen SSAA. Just because jaggies disappear doesn't mean they're doing the same thing; TrSS does *much* less work than full-screen SGSSAA. For true SSAA with nvidia hardware, again, you need to use nvidia inspector (this isn't 3rd party by the way, this is written by nvidia) and you should select SGSSAA as the AA mode. Or you can download the SSAA tool written by nvidia. This is the only 100% reliable way to enable true SSAA on nvidia hardware. As far as AMD's adaptive AA goes, I'm not too familiar with the recent iterations of it, but suffice to say something is wonky with the drivers (fuck you AMD driver team:mad:)

Anyway, I just see this misconception a lot (TrSS = SSAA?? no...), but your explanation of using what is there for all users makes sense. Thanks for the reply, sorry if it came across as being critical and such.
 
biast review. I personally own both cards in trifire/sli and this reviewer is full of it.
What exactly is "biast" about the article? Why is it "biast"? Give us examples or your opinion is worth less than zero.
 
We always go with what is officially supported in the driver and control panel presented to the gamer, with no third-party software or hacking to exploit features. NVIDIA and AMD offer these options in the control panel, and therefore we use them to improve image quality and compare image quality between them. We always use the default driver configs.
Out of curiosity, why do you take this approach with drivers yet use third-party configuration tools for overclocking? This is an enthusiast website, and NVIDIA offers tools to enable true SSAA with their products, which to me is the same as using a vendor's tools (e.g. Sapphire's TriXX or whatever) to overclock/tweak their cards.

I think this would add to your "Highest Playable Settings" comparisons, especially on games where we are already running 100+ fps maxed out through in-game options.
 
My issue with Nvidia, and I really do believe this is being downplayed, is that the 680 was basically paper-launched (unless you speak Russian). Setting aside a mini paragraph to make mention of it really isn't enough in my eyes. People can barely buy one, let alone three.

So far Newegg doesn't have them. You can get them on Amazon from a retailer you haven't heard of, but that's at $600+. Tiger has a few, but they are far above that $499 price. It's been about a month and availability... fuck, let's be honest, it's in the shitter. AMD has not pulled something of this caliber... even with the 5 series. You might have seen one or two marked up on Newegg, but they were far more available, in stark contrast to seeing none.

If we can write a mini novella on the XFire config using three cards you can buy today, surely we can see a little more than a sneeze at the fact that people will have problems buying even one 680.

When AMD didn't have enough of the 5 series in stock, there wasn't a single reviewer that didn't mention it... at great length. Here we have the 680 just as bad in terms of availability, if not worse. Nvidia should be held to the same standard.

There's no doubt that the 680 is the better product but it needs to be available.
 
I could have bought plenty of them from Newegg since launch if I wanted. You do have to act fast, but they do show up at Newegg and other places too. Heck, today alone I could have bought one at least 3 or 4 times.
 

Like I said, it was not this bad for AMD. I do check Newegg in particular at least once a day. Let's not pretend availability is anywhere near where it should be. You really don't have a choice of vendor; if I prefer EVGA or Gainward, it's going to be harder to find than a lesser-known brand.
 
Yeah for a paper launch my three seem to be real. Yes, they are in short supply, but it just takes a little effort to snag them right now. It took me 8 days, from launch day when I got two until the Friday of the following week when I got the third, and I paid an average of $520 a card not including the overnight shipping charges. A bit of a premium, but not bad.
 
But these seem to be selling better than the 7970 did, too. So it's probably a combination of fairly low stock and really high demand more than just low volume of cards. And supposedly the low availability is mainly just a US problem.
 
Yeah for a paper launch my three seem to be real.
Really? I can't believe you even said that.

Yes, they are in short supply, but it just takes a little effort to snag them right now. It took me 8 days, from launch day when I got two until the Friday of the following week when I got the third, and I paid an average of $520 a card not including the overnight shipping charges. A bit of a premium, but not bad.
That's just laughable, not from a "you didn't get three" perspective but from an "it took me 8 days" perspective. If you can't see the problem in that, then we just shouldn't even pretend to debate the topic.
 
This isn't an NVIDIA problem so much as it's a problem with the fabs. Not much NVIDIA can do, being fabless, same as AMD. Demand is high, supply is low. This isn't the same thing as a paper launch.
 

There's been a supply shortage that hit all manufacturers building products from TSMC on that node (even AMD). However, none of them are so short of supply that they can't keep a SKU in stock.

With regard to a paper launch, I don't view it as meaning zero availability. I view it as coming to market with so little supply that it's not even reasonable that it should have been launched on that date.
 

I've purchased a lot of high-demand, low-volume products over the years. 8 days is nothing; that's no longer than it takes to get a build-to-order laptop or desktop, really. It took me about the same amount of time to get just two 7800 GTX 512 cards back at the end of 2005.

Yes there needs to be a better supply of 680s but with a little effort they can be found.
 
Nice review guys. It'd be interesting to see how some GTX 680s with 3GB perform in the games where there was an apparent bandwidth issue. How soon till we might see one of those on the market?
 

I said they could be found but "better supply" is an understatement. They need to be available from every manufacturer. At least one model...one model in particular that I'm trying to get.
 
5 car payments :(

Who needs a car when you're unemployed, sitting at home (in your Mom's basement) playing games? I think I stuffed every cliche in there :D
But seriously, I agree.
This type of review appeals to the masses because of fanboi cheerleading only, but it is practical for less than 1% of [H] gamers. This type of article should come out when a new-gen card is released and many wonder whether adding one or two of the same old cards would pay off compared to going next-gen. Of course, drivers would be mature by then, and the proper review could be done with PCI-e 3.0 boards and such things as learning proper AA levels/quality (with visual blow-ups, please!). I would hope.
 
In the BF3 review, you mention

How can this be when the 7970 setup gets 63fps min compared to 52fps min for the 680 setup? Also the graph seems to show the 7970 TriFire has higher frame rates at all times.

Framerates lie, a fact that we've come to accept in our testing. What is shown on screen in fps is not always how a game feels. Despite a technically high framerate, things like choppiness can still occur, and these things take away from the smoothness and feel of a game.
 
It's called microstuttering, and it is inherent to all AFR rendering solutions (which CF and SLI are). Nvidia is employing something they call "frame metering" to combat this; AMD is apparently doing nothing. In this regard, SLI has always been better than CF; just look at the last 10 articles here on [H]. Every single one makes the same observation.

Yep. However NV's thing works, it helps smooth framerate in SLI, and it definitely does work; there is a noticeable difference in feel as you play a game with SLI vs. CFX.
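
For anyone wondering what "frame metering" might look like in principle, here's a toy sketch. NVIDIA hasn't published the actual algorithm, so this is purely an illustrative assumption: hold each finished AFR frame until roughly one average frame interval has passed since the previous one was shown.

```python
# Toy sketch of the frame-metering idea for AFR (NOT NVIDIA's actual, unpublished
# algorithm): present each frame no earlier than one average interval after the
# previous present, instead of the instant a GPU finishes it.

# Hypothetical completion times (seconds) from two GPUs alternating frames:
# short gap, long gap, short gap... -- the classic microstutter pattern.
completion = [0.000, 0.004, 0.033, 0.037, 0.066, 0.070, 0.099, 0.103]

avg_interval = (completion[-1] - completion[0]) / (len(completion) - 1)

presented, last = [], None
for t in completion:
    p = t if last is None else max(t, last + avg_interval)
    presented.append(p)
    last = p

def ms(a, b):
    return round((b - a) * 1000, 1)

print("raw intervals (ms):    ", [ms(a, b) for a, b in zip(completion, completion[1:])])
print("metered intervals (ms):", [ms(a, b) for a, b in zip(presented, presented[1:])])
# The average fps is unchanged, but the intervals go from 4/29/4/29... ms to a much
# more even spacing, at the cost of a little added display latency.
```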
 
You have to consider not only the raw frame rate, but also the frame intervals. I'll take it to the extreme just to give you an idea. What would feel better to you?

1) 50fps where each frame is drawn on the monitor every 1/50th of a second?
2) 60fps where 1 frame takes 1/10th of a second to come up followed by 59 frames that come in the remaining 9/10th of a second.

nVidia does a better job at making sure frame intervals are nearly equal. AMD (I suppose?) just blasts them out as fast as they can, and sometimes a frame takes longer to compute and we perceive that as jerkiness in the video.
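
To put rough numbers on that, here's a quick sketch with two made-up frame-time traces, both "high fps" on paper, showing why the second one feels worse:

```python
# Two hypothetical one-second frame-time traces; values are invented purely to
# illustrate the frame-pacing point, not measured from any real card.

even_50fps   = [1 / 50] * 50                 # 50 frames, each 20 ms apart
bursty_60fps = [0.1] + [0.9 / 59] * 59       # one 100 ms hitch, then 59 frames in 0.9 s

def describe(name, frame_times):
    fps = len(frame_times) / sum(frame_times)
    avg_ms = 1000 * sum(frame_times) / len(frame_times)
    worst_ms = 1000 * max(frame_times)
    print(f"{name}: {fps:.0f} fps average, {avg_ms:.1f} ms average frame, "
          f"{worst_ms:.1f} ms worst frame")

describe("Even 50 fps  ", even_50fps)
describe("Bursty 60 fps", bursty_60fps)
# The bursty trace wins on average fps, but its worst frame takes 100 ms --
# that single long interval is what you perceive as a stutter.
```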

Perfect explanation of the phenomenon.
 
Very nice review.

BUT...........I think the overall attitude and result is somewhat biased by the fact that you are still moaning over AMD's driver problems. Crossfire is as good or even better in some of your results, but it is quickly downplayed by the "smoothness factor," whatever that is.

Let's be honest, it's a problem for about two dozen people......I mean really, all those running triple GPU solutions, please raise your hand.....So should AMD make this an emergency? or should they just take their time and fix it right?

That and the fact that you can't even buy one GTX 680 anywhere, let alone three.

Face it, there are only a handful of games at present that require two GPUs, let alone three.......yeah, I've had three-way SLI and CrossfireX in the not-so-distant past.....and honestly I couldn't find a game that was worthy.

I'm not a big fan of either company; they both have their problems. Sure, I'd like to see better AMD driver support, but downplaying their success by consistently bashing their software support doesn't do anybody any good......who has cards for sale that you can actually buy?;)

Actually, it is a fact that NVIDIA employs a type of frame metering that deals with frametime to smooth framerate in SLI, while AMD does not; AMD renders each frame as fast as it can and spits them out. So it is incorrect to say that framerate is everything and smoothness doesn't count. It very much does, and it is a noticeable factor in gameplay.

And are we simply allowed to ignore AMD driver issues? Of course not. It is our responsibility to bring this information to our readers' attention, and hopefully help AMD improve upon issues like this.
 
That's fair enough, Brent, and I like that you guys have touched upon that in a few recent reviews. Having owned both Tri-CFX and Tri-SLI systems (as well as dual CFX/SLI), I can safely say I agree with your conclusion. I will say this though: capping the frame rate usually helps hide the issue.
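
For what it's worth, the reason a cap helps is that it forces roughly even frame intervals. A minimal sketch of that idea, not any driver's or game's actual limiter:

```python
import time

# Minimal frame-limiter sketch: sleep off the remainder of each frame's time budget
# so frames are presented at roughly even intervals instead of in bursts. This is
# just the basic idea behind why capping hides microstutter, not a real implementation.

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame():
    time.sleep(0.005)            # stand-in for the actual rendering work

last_present = time.perf_counter()
for _ in range(10):
    render_frame()
    elapsed = time.perf_counter() - last_present
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)   # wait out the rest of the ~16.7 ms budget
    now = time.perf_counter()
    print(f"frame interval: {(now - last_present) * 1000:.1f} ms")
    last_present = now
```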
 

Maybe it's a US issue, but in the UK there are definitely GTX 680s available at the major retailers. You'd have to pay slightly above RRP (£420-430 compared to an RRP of £400), and there's not one from every manufacturer, but if I wanted to I could get a GTX 680 delivered tomorrow.

At launch there were actually cards available at the RRP, but only for a day or so.
 
Excellent article. Certainly knocks AMD for six. Maybe I missed it, but I didn't spot any single-monitor benchmarks. I know that for Eyefinity you have to use a particular driver, but the article indicates that that's not the case for single monitor usage (I'd suggest a single 1440p monitor). This comparison would show any performance improvements in AMD's driver.

I also have a request: please include VRAM usage stats. This would show if a game was silently downgrading the graphics and if there are any driver efficiencies being used (when all the gewgaws are turned off).
 
Actually, it is a fact that NVIDIA employs a type of frame metering that deals with frametime to smooth framerate in SLI, while AMD does not; AMD renders each frame as fast as it can and spits them out. So it is incorrect to say that framerate is everything and smoothness doesn't count. It very much does, and it is a noticeable factor in gameplay.

And are we simply allowed to ignore AMD driver issues? Of course not. It is our responsibility to bring this information to our readers' attention, and hopefully help AMD improve upon issues like this.

While I appreciate the response, and I do understand where you are coming from, the simple fact here is that "you can't quantify it but you can see and feel it" can be, and is, used to bias the findings. It's exactly right out of the nvidia marketing book. It's like watching a 100-meter dash and saying the winner wasn't the winner because he/she didn't run "smooth enough".

The driver issue is important, but as I have said......it applies to one TINY application of Crossfire, one that isn't utilized by many folks. I think overall AMD has really stepped up their driver quality and response to new games since the 7900 series was released. They certainly have had their issues over the past year, I'll agree. The Triple Crossfire issue will be fixed, I'm sure. It would be interesting to re-do this when a proper driver is available.
 
Two questions, Brent:

Does AMD take these criticisms to heart? I mean, are they completely fucking oblivious to what various hardware websites are saying? It's kind of disheartening to see this stuff, because I like rooting for AMD (I like pulling for the underdog, I guess). If they made any fucking attempt to expand their software team like nvidia has, I swear, they would be much better off. I guess AMD's upper management doesn't give two fucks about what websites are saying. Second: does the new 12.4 WHQL fix any of the crashing issues you observed?

While I have been happy with 2x 7970 GPUs, I honestly have not seen microstutter at all - I think you have to use specific adapters that are Eyefinity-approved to avoid tearing and such. I've compared OC'd 7970 CF to 680 SLI in my own system, and I'll say that I prefer the former most of the time - mostly because I've overclocked the living shit out of my 7970s....but anyway, I haven't tried surround at all. This article makes it sound like I shouldn't try it on the AMD setup.

Thanks,
 