> While I appreciate the response, and I do understand where you are coming from, the simple fact here is that "you can't quantify it but you can see and feel it" can be, and is, used to bias the findings. It's straight out of the nvidia marketing book. It's like watching a 100 meter dash and saying the winner wasn't the winner because he/she didn't run "smooth enough".

You can quantify it, if you have a way of measuring frame time. They have said they are working on this. I believe Techreport has a few reviews which highlight this problem.
The driver issue is important, but as I have said... it applies to one TINY application of Crossfire, one that isn't utilized by many folks. I think overall AMD has really stepped up their driver quality and response to new games since the 7900 series was released. They certainly have had their issues over the past year, I'll agree. The Triple Crossfire issue will be fixed, I'm sure. It would be interesting to re-do this when a proper driver is available.
Ya, I spent $1650 on my three 7970s to play any game I want maxed out with Eyefinity. So far it has been nothing but a huge disappointment.
I think you are completely missing the point of these videocards. You seem to be convinced that people spend ~$1500 so they can benchmark. HardOCP (and I agree with them) believe people spend $1500 on videocards to play games. So while "smoothness" may be subjective, if it makes a difference in gameplay I definitely want to know. My 100 hours in Skyrim and 80 hours in BF3 weren't spent running fraps and benchmarking the games, they were spent playing.
Too many people still have the old school mentality of benchmarking. I personally do not care how "fast" my card is, I care about the quality of gameplay it can deliver. [H] has observed on several occasions that the quality of gameplay is better on SLI due to the issues AMD has with stuttering.
If you want to argue AMD benchmarks better, sure, fine; Nvidia plays better. To use your sports analogy, think gymnastics, not running. While the technical part of the performance matters, contestants are also scored on the overall feel of their performance. Nvidia feels better.
The whole premise of my argument is: to dismiss the performance of a GPU despite its superiority because it doesn't "feel right" to that individual reviewer is bias, pure and simple.
GoGo nvidia.
How is an experience superior if a benchmarking tool records higher frame rates yet the gaming experience isn't as good? You can't be saying that higher frame rates ALWAYS mean a better gaming experience, because any gamer with experience knows that's simply not true.
And again, this particular review as you have repeatedly said yourself is about a very specific and not widely used configuration. A configuration that you have said repeatedly is so rare that AMD shouldn't make it a priority. And it looks like they are taking your advice.
Multi-GPU beyond 2 cards and Eyefinity is simply not working particularly well on the 7970s currently. That doesn't mean that it is a bad card. The 680 has its weaknesses as well such as GPU compute power.
For whatever reason you think it's biased to point out this issue with the 7970s, but at the same time you claim that since this is a setup used by so few people, AMD shouldn't give it priority. OK, fair enough, but that's as biased as anything in this review.
> The whole premise of my argument is: to dismiss the performance of a GPU despite its superiority because it doesn't "feel right" to that individual reviewer is bias, pure and simple.
> GoGo nvidia.

You can quantify it, if you have a way of measuring frame time. They have said they are working on this. I believe Techreport has a few reviews which highlight this problem.
> Two questions, Brent: Does AMD take these criticisms to heart? I mean, are they completely fucking oblivious to what various hardware websites are saying? It's kind of disheartening to see this stuff, because I like rooting for AMD (I like pulling for the underdog, I guess). If they made any fucking attempt to expand their software team like nvidia has, I swear, they would be much better off. I guess AMD's upper management doesn't give two fucks about what websites are saying.

You answered your own question, no.
I think you missed my point there.
I have no problem with the article pointing out that AMD should up their game in software support, man I've been on that bandwagon too.
Yup, triple GPU is really a tiny niche, and I really doubt the utility of this sort of review, other than to demonstrate a whole table full of hardware that few people want or can afford. Yeah, it's really nice to look at... but not very many people have it, so the need to jump on AMD about this point in the article is suspect. Not biased, though... it's the truth. Does the use of three GPUs translate or trickle down to dual or single card performance? How can you know that when the three monitor software on AMD's side is "ancient" and not relevant to dual or single card users?
The bias in the article comes from using an (up to this point in time) unmeasurable, SUBJECTIVE "feeling" about what you are measuring, comparing it to something you can measure, and then declaring the inferior product better because it "feels smoother". (This is not to say nvidia's GPUs were inferior in each game; they weren't, obviously.)
That would require a "pepsi challenge" in my book.
I don't know, I just get this gut feeling that the nvidia product was destined to be better, even before the thing was done.
It's just my opinion. I don't have a horse in this race. I own both company's products and have no preference either way.
> You answered your own question, no.
While I'm not making any excuses, and if I had three GPUs I'd be screaming pretty loud, there aren't many people in that crowd.
Product reviews by nature are always subjective. When reviewing any product you can have stats, but there is always the subjective opinion of the reviewer. For example with cars, you can compare horsepower etc, but the reviewer will still give you an opinion on which model feels better to drive. This is not bias, bias implies that someone's opinion is predetermined by unrelated factors. If a game feels "smooth" is not an unrelated factor, it is pretty directly related to enjoyment of a game.
I think you either don't understand the meaning of the word bias, or don't understand that a review is simply the opinion of one person on a specific product, with information backing up why he came to that conclusion. Brent did a very good job explaining why he came to his conclusion. It could also be that [H] just isn't the place for you. You seem to prefer simple benchmarks instead of actual gameplay experiences; there are plenty of places which will offer benchmarks and never actually play anything with the cards in question.
> The driver issue is important, but as I have said... it applies to one TINY application of Crossfire, one that isn't utilized by many folks. I think overall AMD has really stepped up their driver quality and response to new games since the 7900 series was released. They certainly have had their issues over the past year, I'll agree. The Triple Crossfire issue will be fixed, I'm sure. It would be interesting to re-do this when a proper driver is available.

btw, that's an excuse.
Well said!
Great review guys, glad you found a workaround to the BSOD issue. I am just lol'ing at all the bitching/conspiracy theories going on. How about we just call the contest a draw at 0 FPS for all configs, since without a workaround the 7970s would BSOD (= 0 FPS), and with the lack of availability of 3x 680s to be placed at everyone's doorstep (= 0 FPS).
The arguments just make me laugh.
I also question this logic: yes, AMD *can* fix these driver issues, but considering that they have had microstuttering problems for the last 3 generations of cards (worse than NVIDIA's, at any rate), and they haven't fixed the Eyefinity bug with 3 cards in the last 5 months, what makes you think they are going to fix them going forward?
> The whole premise of my argument is: to dismiss the performance of a GPU despite its superiority because it doesn't "feel right" to that individual reviewer is bias, pure and simple.
> GoGo nvidia.

No, it's not. It's clearly stated that this is their personal opinion. With every video card / CPU / other hardware release, the reviewers at [H] get accused of bias by some overzealous fanboy, and the really funny thing is that every time it happens, it's the other company than the previous time that they're being accused of bias towards. What it comes down to is that people are butthurt that "their" company isn't the favourite at any given time.
I've been reading [H] reviews for 10 or so years, (even if I've been a member for less than that) and my experience over those ten years is that nobody who works for [H] is biased towards any particular company... they support whichever card provides the best experience at any given time, as they should.
Agreed. I have been reading the [H] since 2000 and I have seen them being accused of favoring NV when NV was kicking ass, then they were accused of favoring ATI when ATI was kicking ass. The same as well for AMD/Intel. It's hilarious when you have the overall picture and these kids start making sweeping assumptions based on the particular article they read that day.
It all has happened before and will happen again. I honestly don't know how Kyle, Steve and company do it with all the misinformed criticism. Especially when the content is free.
Does anyone have Terry Makedon's email address? Please? Can we petition AMD to put him back on the fucking driver team along with 3-4 more capable bodies?
I'm really tired of this shit. People who spend $1500 on GPUs shouldn't have a subpar experience; it's really frustrating because I think the 7970s are great hardware. I've really enjoyed them in Crossfire, but I only do single screen. Seeing these bad stories about Eyefinity leaves a bad taste in my mouth.
First off, I will say that I wouldn't touch a 3x CF rig, but for more reasons than in this review.
But don't let reviews tell you what you want or need. It's good to have experienced advisors, and sites like [H] provide a great service. That said, you really have to try things for yourself. Not necessarily cheap or easy, but necessary, unfortunately.
Yeah, point taken. I have tried both sides and have purchased both 680s and 7970s. Honestly, with the 7970s overclocked they won a lot of games on single screen 2560 resolution, so I ended up selling. Not because the 680s were bad; they were quieter and had some cool features, but they just weren't an upgrade at all, and several games were slower on them (i.e. Crysis, Metro 2033, The Witcher 2, Alan Wake). The 7970s I used in CF overclocked like crazy, so it probably wasn't a fair comparison, though.
Anyway, it's just frustrating. My personal opinion is that the hardware is good, but the software team behind it is behind the curve in terms of supporting the people that spend the most on their products. Not supporting someone that spends $1500 on an AMD Tri-Fire setup? Ridiculous. I just feel like venting, because I like rooting for AMD since they're the underdog, yet they don't throw their resources where they're needed the most: software development for their hardware. It wasn't like this during the 5000 series days that I remember; AMD had a lot of positive press, they had rolled out Eyefinity, and all was great. What went wrong? I don't know, but they need to throw more bodies at their software development, period.
so what impact would 4GB cards have on this?
> ZERO, the GTX 680 is a bandwidth-limited card.
> So no, even with 10GB of VRAM you won't get an increase, because of the 256-bit memory interface and its 192GB/s maximum transfer rate.

You don't have to have more bandwidth just to utilize more VRAM.
> ZERO, the GTX 680 is a bandwidth-limited card.
> So no, even with 10GB of VRAM you won't get an increase, because of the 256-bit memory interface and its 192GB/s maximum transfer rate.

Wrong.
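For reference, the 192GB/s figure comes straight from bus width times memory data rate. A quick back-of-envelope check using the published specs for both cards:

```python
def peak_bandwidth_gbs(bus_bits, data_rate_mtps):
    """Peak memory bandwidth in GB/s: bytes per transfer * transfers per second."""
    return (bus_bits / 8) * data_rate_mtps / 1000

gtx680 = peak_bandwidth_gbs(256, 6008)  # 256-bit bus, 6008 MT/s effective GDDR5
hd7970 = peak_bandwidth_gbs(384, 5500)  # 384-bit bus, 5500 MT/s effective GDDR5
print(f"GTX 680: {gtx680:.1f} GB/s")  # ~192 GB/s
print(f"HD 7970: {hd7970:.1f} GB/s")  # ~264 GB/s
```

So the 192GB/s ceiling is real, but whether it is the *limiting* factor in a given game is a separate question from how much VRAM is on the card.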
> Hahah. Great response, although he's right.

No, he's not right. VRAM is (more or less) storage space, and you can certainly run out of VRAM in a situation where the actual scene being rendered isn't taxing, e.g. Skyrim with HQ texture mods. I have yet to see any indication that the 7970's memory bandwidth is of any performance benefit, since it loses to the 680 in the majority of benchmarks even at 5760x1200 on multi-GPU setups.
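To illustrate the capacity-vs-bandwidth distinction, here is a rough, hypothetical estimate (the texture count, sizes, and compression figures below are invented for illustration, not taken from any real game) of how a surround-resolution framebuffer plus an HQ texture mod can fill 2GB of VRAM regardless of how taxing the scene is to render:

```python
def mb(n_bytes):
    """Convert a byte count to mebibytes."""
    return n_bytes / (1024 ** 2)

w, h = 5760, 1200
bpp = 4  # RGBA8: 4 bytes per pixel

# Render targets: back buffer + depth + a couple of post-process targets,
# with 4x MSAA on the main two targets (heavily simplified).
render_targets_mb = mb(w * h * bpp) * (2 * 4 + 2)

# A hypothetical HQ texture mod: 500 textures at 2048x2048, DXT-compressed
# (call it roughly 1 byte per texel including mips).
textures_mb = mb(500 * 2048 * 2048 * 1)

total_gb = (render_targets_mb + textures_mb) / 1024
print(f"~{total_gb:.1f} GB of VRAM, independent of scene complexity")
```

The point is that the texture pool alone can blow past a 2GB card even when the GPU itself is nowhere near bandwidth- or shader-limited.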
http://www.overclock.net/t/1196856/official-amd-radeon-hd-7950-7970-owners-thread/9960#post_17097247
FYI, this feedback is from xoleras (a user who is also on hardforum), who has run both 680 SLI and 7970 CF on a single screen at 2560x1600. So enough with the lies that "the 7970 loses to the GTX 680 in the majority of benchmarks".
For the bandwidth limitations of the GTX 680 being a factor, look at the following examples where the performance scaling from GTX 580 to GTX 680 is < 20%. The GTX 680 has plenty of shading power over the GTX 580 (1536 shaders at 1100 MHz vs 512 shaders at 1550 MHz).
http://www.hardware.fr/articles/857-12/benchmark-alan-wake.html (10% scaling at 1080p max)
http://www.guru3d.com/article/geforce-gtx-680-review/18 (18% scaling at 1080p)
http://www.anandtech.com/show/5699/n...x-680-review/7 (17% scaling at 1080p)
These are some of the most demanding games out right now. Looking ahead, I would say the GTX 680 will run into such scenarios even more.
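For clarity, the scaling percentages cited in those links are just relative average FPS. A minimal sketch with hypothetical numbers:

```python
def scaling_pct(old_fps, new_fps):
    """Generational scaling as a percentage gain in average FPS."""
    return (new_fps / old_fps - 1) * 100

# Hypothetical example: if a GTX 580 averages 40 FPS in a game and a GTX 680
# averages 44 FPS, that's only 10% scaling, despite the 680 having over twice
# the raw shader throughput (1536 x 1100 MHz vs 512 x 1550 MHz).
print(f"{scaling_pct(40, 44):.0f}% scaling")
```

When scaling lands that far below the raw shader advantage, something other than shading power, such as memory bandwidth, is the plausible bottleneck, which is the argument being made above.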