NVIDIA 3-Way SLI and AMD Tri-Fire Redux @ [H]

The conclusion of the article recommends getting the AMD setup. How on earth can you conclude that [H] is NVIDIA-biased?
It's a matter of observing trends... not just looking at this one article and its conclusions.
 
The conclusion of the article recommends getting the AMD setup. How on earth can you conclude that [H] is NVIDIA-biased?

Indeed. The reason I read [H] reviews is the low-BS factor.

My takeaway: NV has a noticeable edge on bleeding-edge overclocked rigs that you'll pay dearly for. AMD has the value advantage in spades and will yield better perf on rigs people are most likely to actually own.

Not all gamers run overclocked monsters. Some of us build rigs right behind the bleeding edge and aim for the value:perf sweet spot. Right now that is dominated by AMD GPUs.
 
It's a matter of observing trends... not just looking at this one article and its conclusions.

That's fine and dandy, but I disagree with the way you're interpreting the trends. To me it's obvious that they revisited this out of their own curiosity and to satisfy their readers. I see no bias in [H]'s reviews. I'm not stating this because I happen to be posting on their forums; it's just the way I see it. Frankly, I can't see how anyone finds bias in their reviews, because their recommendations fluctuate with each generation of GPUs, based on real results.
 
Thanks for taking the time to do the review again; it was well done.

In your conclusion relating to value, I don't feel the extra performance is all you are paying $400-$500 more for; you left out features and IQ enhancements like SSAA, TrMSAA, AO, 3D, 3D Surround, etc., and conversely MLAA and EQAA on AMD. These all play a factor in perceived value.

And the 6900s' clocks were at 830MHz, which is understandable, but it will sure be interesting to see three separate cards at 880MHz. And I know the CPU helped a lot, but I can't help but wonder if the 3rd GPU not running at x4 made much difference in Tri-Fire + Surround.

Anyway, cheers for the review.

If they are driver-level extras, then no, it's still not worth it.
 
Very nicely done article......as always.
I applaud the effort and looking into what people were saying about the first article.

Now I am going to be looking to build a new system I guess.....

Isn't that X79 board coming out soon? :eek:
 
Finally! I've been waiting for something like this for some time. Props to [H] for this very well-done follow-up.
 
Very nicely done article......as always.
I applaud the effort and looking into what people were saying about the first article.

Now I am going to be looking to build a new system I guess.....

Isn't that X79 board coming out soon? :eek:

Q4, likely. I'm itching to upgrade and have money set aside for a biggie. Debating whether to go P67 or wait for X79.
 
Love the follow-up. But I must say, I'm absolutely flabbergasted that people would think there's no difference between a 3.6GHz Core i7 920 and a 4.8GHz Core i7 2600K. The CPU is, all in all, faster in every single way, so if it can get info to the cards quicker, why wouldn't we see a performance increase? Especially when we're benching 3x video cards. I'm honestly surprised AMD performed about the same or worse (in one game), but they might be at their limits (or it's the NF200 controller, or some CAP issue, as others have stated... *shrugs*).

Anyway, it only cements my decision to sell the GTX 465 I have and get a 6950 or three. ;)

Great job Kyle and Brent. Also, can't wait for the Quads review.
 
I am impressed by your honesty and willingness to investigate readers' concerns. I had not revisited the original thread after the first few pages, so I had no idea another article was coming.

I also voiced the CPU limitation concern for the single video card reviews, and my posts went ignored. With the affordability of Sandy Bridge systems, I will again suggest that you use a Sandy Bridge chip for all video card reviews. The i7 920 at 3.6GHz has shown its usefulness, but a new 2500K + motherboard is only $400, which is more than affordable for your enthusiast readers.
 
Kyle and Brent, that was a great follow-up to the article. I hope you find out soon from ATI why F1 2010 saw such negative scaling when jumping to the new testing platform. Out of curiosity, how are you guys planning on testing the quad vs. quad? I can just see Brent knocking out the power within an 8-block radius as soon as he flips the switch on the quad Fermi rig :p

Indeed. The reason I read [H] reviews is the low-BS factor.

My takeaway: NV has a noticeable edge on bleeding-edge overclocked rigs that you'll pay dearly for. AMD has the value advantage in spades and will yield better perf on rigs people are most likely to actually own.

Not all gamers run overclocked monsters. Some of us build rigs right behind the bleeding edge and aim for the value:perf sweet spot. Right now that is dominated by AMD GPUs.

You are definitely damn right about that. Not all of us are comfortable overclocking CPUs to the bleeding edge; many of us just want a nice performance boost without worrying about all the instability issues you could run into. I usually only overclock as far as a processor will let me go on stock voltage.
 
Thanks for your hard work, Kyle and Brent; listening to your readers and the feedback you give is why I keep coming back.
 
Am I the only one who gets the impression that on HARDOCP, when an NVIDIA configuration beats an AMD configuration, the entire subject is dropped until the next generation, BUT when an AMD configuration beats an NVIDIA configuration, the topic gets revisited over and over and over again until NVIDIA takes the lead again?!

TBH... I would expect a 3x 6970 Tri-Fire configuration to at least close the gap with the 3x 580 configuration where the CPU is less of a limitation. This series of articles has been comparing 3x NVIDIA cards versus 2x AMD cards (not exactly an apples-to-apples comparison, ladies... the AMD configuration may have 3 GPUs, but if history is any indication, 3 actual 6970s in Tri-Fire would outperform that config by a good margin).


The only thing we are biased toward is value. But you can take our suggestions or leave them; all the data is there for you to draw your own conclusions, should they be different than ours.

Keep in mind that in this thread alone we have been called out for being biased toward NVIDIA and toward AMD. That just makes me think we are getting it right. :)
 
Great set of articles. Do I have a [H]igh-end system or the budget for one? No, but I enjoy seeing what these top-tier rigs can do. Plus, I get to learn something too; in this instance, so does [H], and likely even AMD and NVIDIA.

The candor and humility from Kyle and Brent is also impressive. Surprising? No, but impressive.
 
Kyle and Brent, that was a great follow-up to the article. I hope you find out soon from ATI why F1 2010 saw such negative scaling when jumping to the new testing platform. Out of curiosity, how are you guys planning on testing the quad vs. quad? I can just see Brent knocking out the power within an 8-block radius as soon as he flips the switch on the quad Fermi rig :p



You are definitely damn right about that. Not all of us are comfortable overclocking CPUs to the bleeding edge; many of us just want a nice performance boost without worrying about all the instability issues you could run into. I usually only overclock as far as a processor will let me go on stock voltage.

http://www.hardwareheaven.com/revie...-590-quad-sli-review-power-temp-noise-oc.html
http://www.guru3d.com/article/his-radeon-hd-6990-crossfire-review/9
http://www.guru3d.com/article/geforce-gtx-590-sli-review/3

Lies? Stupid? Can we please express ourselves a bit better? This language is flame bait at best. Be better than that. Thanks. - Kyle

On another note, great review and thanks for revisiting it. It shows that AMD is the best choice for most of us who do no to mild OCs. It also shows that AMD's cards are at their max with the current CPUs and therefore more practical, while NVIDIA's still seem to be limited even by current CPUs at really high overclocks, which makes them less practical.
 
Who in the hell is spreading lies? I want to know, because I'm just joking around given the image a certain piece of silicon has.

Also, who said anything about buying a budget CPU? I only commented on the fact that not all of us buy a processor and overclock it to within an inch of its life. Notice my original post made no mention of what kind of CPU, budget or not. Read the post before drawing paranoid conclusions like some conspiracy theorist.


http://www.hardwareheaven.com/revie...-590-quad-sli-review-power-temp-noise-oc.html
http://www.guru3d.com/article/his-radeon-hd-6990-crossfire-review/9
http://www.guru3d.com/article/geforce-gtx-590-sli-review/3

Sorry to burst your bubble, but maybe you should do some research before spreading lies.

And if you are spending $1500 on 3 GTX 580s and don't OC at all, you're just completely stupid. Why spend so much money and then buy a "budget" CPU? It is clear that this setup is not for you, which is fine, but don't say it's crap just because it needs a higher-speed CPU.

On another note, great review and thanks for revisiting it. It shows that AMD is the best choice for most of us who do no to mild OCs. It also shows that AMD's cards are at their max with the current CPUs and therefore more practical, while NVIDIA's still seem to be limited even by current CPUs at really high overclocks, which makes them less practical.
 
Thanks for the review. It's a great timely investigation into a factor which many believed was a total non-issue. Now we know better, and performance in this era seems to be even more nuanced than we thought.
 
Thanks Kyle, I knew I had seen these somewhere before. I know some people were mentioning about the difference in PCIe lanes in the forum before you did the redux, so I thought I would ask.



I am sorry, but exactly what bias are you referring to, and how exactly are you coming to that conclusion? Notice they did this after a ton of readers commented on different things which might be skewing the results. HardOCP did another review because the users asked for it, not because they wanted to prove one thing or another. Also note that Kyle was pretty adamant that his first (err, Brent's) findings were correct and another review wouldn't change them, thus the whole "eating crow" comment. And also notice that his conclusion really did stay the same: that the AMD setup is still the far more economical one.

On another note, I remember a long period of time when HardOCP was all about Eyefinity, ran Eyefinity-sponsored events, and you couldn't go more than a few days without seeing yet more articles and information about Eyefinity. Quite frankly it made me a little sick, just because it seemed like over-advertising. However, the fact remains that Kyle is going to follow the technology and is going to recommend whatever he feels is the best representation of the latest technology for gaming and performance.

I never said anything about HARDOCP fudging results... the numbers are numbers, and people will come to their own conclusions based on them. I take some issue with the fact that this entire line of articles is not an apples-to-apples comparison (2x AMD cards versus 3x NVIDIA cards)*** and that there seems to be an ongoing trend where, when NVIDIA takes the lead, the topic is dropped (as if that were the preconceived conclusion), and when AMD takes the lead, the topic gets a "Redux" over and over again. Whether that is because the majority of readers are NVIDIA fans (and as such beg HARDOCP to revisit the conclusions) or because certain staff members are the biased ones, the result is the same... an NVIDIA bias. *shrug*

I could not really care less who is on top this week... it's just a pattern I've noticed and I don't expect everyone to have noticed (and I am certainly NOT making any assumptions about HARDOCP staff being intentionally malicious in any way).

***Traditionally, 3x physical cards will outperform 1x dual-GPU card + 1x single-GPU card...
 
Very interesting article and a HUGE thumbs up for listening to the community and revisiting results.

I am pretty happy with my 4.4GHz i7 / GTX 580 SLI combo, but I have on occasion debated whether picking up a 3rd 580 might be worth it. Your first article really put the brakes on that - but it looks like it may be a worthwhile upgrade after all.
 
I think the bias is to expect a setup that costs $500 more to outperform the cheaper setup; further, the three separate cards should, as you say, beat a dual + 1 setup. It isn't that NVIDIA won and thus it is the right conclusion; it is the logical conclusion based on the two points above.
 
Kyle, I noticed that you guys have had some holes in the schedule, since you are now working on a quad vs. quad article along with planning to revisit 3x vs. 3x afterwards. Have you reviewers been given any information as to when you "might" start receiving ES or preview samples of a certain Sunnyvale-based institution's upcoming piece of silicon that might plug into a motherboard with a certain socket with a + on it? :)

No comment.
 
And still it's AMD that's the choice and earns the award. Things have changed, putting the 580s in a better light, but they're not all that bright if you ask me; they're 500 bucks more and consume 200W more!
 
You can't get info like that out of serious hardware sites; go ask some Chinese ones.

Kyle should have it by the 20th; that's at least when a friend of mine is getting his.

I was just hoping for a yes or no answer.
 
So if you are going to build a high-end system.....make sure it has a high-end CPU?

Sounds reasonable.

I honestly could not imagine shelling out money for three high-end video cards and not matching a top-end CPU with them.
 
Glad to see you guys finally start upping the CPU frequencies.

I wouldn't mind seeing some CPU usage graphs along with this. Is this more along the lines of SuperPi, where it's not about computation, and it's just pure GHz that gives better results?
 
This is what I initially expected. The world is how I thought it was. NVIDIA has always been more CPU-dependent, but since most people with uber-high-end video cards have an uber-high-end setup, this was never really proven until now. AMD should lose in multi-card setups until the VRAM runs out on NVIDIA hardware. Value and bang/buck still favor AMD.

I knew beforehand that NVIDIA shines best on faster CPUs and can truly spread its wings. I remember commenting way back on a GTX 560 Ti review about how it fared much better against the 6870 with a faster CPU, as many other websites had shown. But anyhow...

This leaves me with several known facts and some unanswered questions:

Known Facts
AMD still has the best value, hands down, in price/performance
Three 6970s could have been more competitive here (or three unlocked 6950s)
AMD still shines in terms of power consumption
AMD can now be considered more CPU-efficient as well!! :)

Unanswered
Did NVIDIA's boost come from the faster CPU alone, or did the 3rd card no longer being limited to x4 play a role here? If so, how much of a role did the PCIe lanes play vs. the increase in processing power? :confused:
 
Upping the frequencies is good for raw power... but I think we all know that we should do reviews at lower frequencies, right? I mean... overclockers are a niche...

Most users will just play with everything at stock... or at low overclocks...

So... it's like painting everything pink to look so wonderful... but in the end... when someone gets the same hardware and tests it on a lower-performance system... they go really "WTF?" :)

So... I think [H] does a nice job reviewing with the 3.6GHz system...
 
Can't wait for the 3x 6970 tests. It will be interesting to see if I should pick up another 6970! Great review, by the way.
 
Just a thought, but have you guys redone any of the tests with the 2600K @ 3.6GHz? I'm a little curious if the regression analysis would reveal even more interesting results. :)
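
Just to illustrate the kind of regression analysis being suggested, here's a minimal sketch (Python with numpy; the FPS numbers are purely hypothetical placeholders, not data from the article) of fitting average FPS against CPU clock to estimate how strongly each setup scales with CPU speed:

[CODE]
# Minimal sketch: regress average FPS against CPU clock for each GPU setup.
# All numbers are hypothetical placeholders, NOT results from the review.
import numpy as np

cpu_ghz = np.array([3.6, 4.0, 4.4, 4.8])      # hypothetical CPU test clocks
fps_sli = np.array([70.0, 76.0, 82.0, 88.0])  # hypothetical 3-way SLI averages
fps_cfx = np.array([74.0, 75.0, 75.5, 76.0])  # hypothetical Tri-Fire averages

for name, fps in (("3-way SLI", fps_sli), ("Tri-Fire", fps_cfx)):
    slope, intercept = np.polyfit(cpu_ghz, fps, 1)  # least-squares linear fit
    print(f"{name}: ~{slope:.1f} FPS gained per extra GHz of CPU clock")
[/CODE]

A steep slope would mean the setup is still CPU-bound across the tested range; a flat one would mean it has hit its GPU limit.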
 
Upping the frequencies is good for raw power... but I think we all know that we should do reviews at lower frequencies, right? I mean... overclockers are a niche...

Most users will just play with everything at stock... or at low overclocks...

So... it's like painting everything pink to look so wonderful... but in the end... when someone gets the same hardware and tests it on a lower-performance system... they go really "WTF?" :)

So... I think [H] does a nice job reviewing with the 3.6GHz system...

I don't think anyone building the setups mentioned in this review is your average Joe user.

These setups are for extreme gamers who also happen to be overclocking enthusiasts.

Like really, who's going to get 3x 580s and run their CPU at stock?
 
Kyle/Brent, any feedback on performance at default clock speeds on the 2600K system? Guess I'm trying to dig at how much the newer P67/2600K platform played into the differences you've seen. Thanx.
 
Kyle/Brent, any feedback on performance at default clock speeds on the 2600K system? Guess I'm trying to dig at how much the newer P67/2600K platform played into the differences you've seen. Thanx.

From the prior review thread, it looks like Kyle OC'ed the computer prior to sending it to Brent, so the reviewed cards never touched the machine at stock speeds.
 
I wonder if AMD tests their CrossFire on AMD CPUs more than NVIDIA does?

Could explain why AMD has better CPU usage, considering how silly it would look if an AMD CPU weren't fast enough to power their own GPUs.
 
Zarathustra[H] said:
Mind blown...

I had been completely convinced that there was no such thing as CPU limiting on any Core i5 or Core i7 or better, at least not at any resolution and settings that anyone would want to play at with modern video cards.

That notion was disproven a while ago, though. Guru3D did a 580 3-way SLI review a few months back at 25x16, and its graphs clearly illustrated a CPU bottleneck in some games (using a highly OC'd i7, IIRC).
 
Would be interesting to know at what point, CPU-wise, a GTX 580 3-way SLI solution unleashes its potential, or whether it scales linearly with increased MHz. Guess I might just have to do some of my own testing :) Also would be easy to test at default clocks, although that's not very [H].
 
Very interesting article and a HUGE thumbs up for listening to the community and revisiting results.

I am pretty happy with my 4.4GHz i7 / GTX 580 SLI combo, but I have on occasion debated whether picking up a 3rd 580 might be worth it. Your first article really put the brakes on that - but it looks like it may be a worthwhile upgrade after all.

No, that would very much be a mistaken upgrade. For resolutions above 1600p, you'll want 2GB of VRAM, especially with the kind of cash you'd be spending, where you'd expect to enable copious AA at will.

So an AMD solution or the 580 3GB.
 
Who in the hell is spreading lies? I want to know, because I'm just joking around given the image a certain piece of silicon has.

Also, who said anything about buying a budget CPU? I only commented on the fact that not all of us buy a processor and overclock it to within an inch of its life. Notice my original post made no mention of what kind of CPU, budget or not. Read the post before drawing paranoid conclusions like some conspiracy theorist.

My bad, I didn't know you were joking. Your username made it a bit more difficult to tell whether you were joking or not, my bad :(

It's kinda interesting that some people are still not content with the tests. C'mon guys, it's been redone with a faster CPU just like you requested and you're still not happy; I guess it's hard to please people these days.
As I said before... it shows that AMD is the best choice for most of us who do no to mild OCs. It also shows that AMD's cards are at their max with the current CPUs and therefore more practical, while NVIDIA's still seem to be limited even by current CPUs at really high overclocks, which makes them less practical.
 
Don't worry, you're okay. I put a :p at the end of the post to signify that I was joking, considering that Brent joked in the previous thread about how his new 4.8GHz test bed was going to cause a power outage or something.

My bad, I didn't know you were joking. Your username made it a bit more difficult to tell whether you were joking or not, my bad :(

It's kinda interesting that some people are still not content with the tests. C'mon guys, it's been redone with a faster CPU just like you requested and you're still not happy; I guess it's hard to please people these days.
As I said before... it shows that AMD is the best choice for most of us who do no to mild OCs. It also shows that AMD's cards are at their max with the current CPUs and therefore more practical, while NVIDIA's still seem to be limited even by current CPUs at really high overclocks, which makes them less practical.
 
As I said before... it shows that AMD is the best choice for most of us who do no to mild OCs. It also shows that AMD's cards are at their max with the current CPUs and therefore more practical, while NVIDIA's still seem to be limited even by current CPUs at really high overclocks, which makes them less practical.

Even if you're OC'ing to the bleeding edge, unless you want to spend $500 more for an average of ~10% on the five games tested here, the AMD solution is the better choice. Keep in mind too that if you went 3x 6970 it would narrow that gap even further, plus give you the extra 500MB per card for more IQ.

But really, for about $100 less than the price of 3x 580s, you can get 4x 6970s, each with the larger framebuffer. It's almost $300 cheaper if you compare against the 3GB 580s.

Prices using Newegg's lowest price for each card.

3x 580: $1470
3x 580 (3GB): $1620
4x 6970: $1360
4x 6950: $1020 (HOLY CRAP, now there is value!)

So really, unless money is no object and you plan on going quad-SLI for the highest-end rig possible, why do 3x 580? It doesn't make any sense to me from either a cost or a performance perspective; either do Tri-Fire or quad CrossFire 6970s for less. Shoot, if you go unlocked 6950s... it's just ridiculous the extra power you get for the $ from AMD this generation.
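
For anyone who wants to check the math, here's a quick sketch (Python) that rebuilds the totals above from the per-card prices they imply (the Newegg lows quoted in this post, obviously subject to change):

[CODE]
# Rebuild the setup totals from the per-card prices implied by the post.
# Prices are the quoted Newegg lows at the time of the thread.
cards = {
    "3x 580":       (3, 490),  # 3 * $490 = $1470
    "3x 580 (3GB)": (3, 540),  # 3 * $540 = $1620
    "4x 6970":      (4, 340),  # 4 * $340 = $1360
    "4x 6950":      (4, 255),  # 4 * $255 = $1020
}
baseline = 3 * 490  # the 3x 580 setup everything is compared against
for setup, (count, price) in cards.items():
    total = count * price
    print(f"{setup}: ${total} ({total - baseline:+d} vs 3x 580)")
[/CODE]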
 
I still think something is really wrong here.

Why are AMD's cards not getting faster (and sometimes even getting slower!) with a faster CPU, like NVIDIA's cards do? It's not logical at all. I would really like someone to come out with a logical answer to that.

Why is F1 getting SLOWER with a faster CPU? It's not logical at all.

My Quad-Fire set-up is not behaving like that on my Maximus IV P67.

I hope AMD will comment on those strange results.
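
One way to think about the "faster CPU, same AMD numbers" half of this (a toy mental model of my own, not anything from the article, and it does not explain the negative F1 2010 scaling): each frame costs some CPU time and some GPU time, and whichever stage is slower caps the frame rate. A setup that is already GPU-bound gains nothing from a faster CPU:

[CODE]
# Toy bottleneck model: FPS is capped by whichever stage takes longer per frame.
# Illustrative numbers only; not measurements from the article.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when each frame needs cpu_ms of CPU work
    and gpu_ms of GPU work, and the slower stage sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=14.0, gpu_ms=12.0))  # CPU-bound: ~71 FPS
print(fps(cpu_ms=10.0, gpu_ms=12.0))  # faster CPU: now GPU-bound, ~83 FPS
print(fps(cpu_ms=8.0,  gpu_ms=12.0))  # even faster CPU: still ~83 FPS, no gain
[/CODE]

Negative scaling doesn't fall out of a model this simple, though, which is why the driver/CAP angle people have raised seems like the real open question.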
 