The conclusion of the article recommends getting the AMD setup. How on earth can you conclude that [H] is NVIDIA biased?
It's a matter of observing trends... not just looking at this one article and its conclusions.
Thanks for taking the time to do the review again; it was well done.
In your conclusion relating to value, I don't feel the extra performance is all you are paying $400-$500 more for; you left out features and IQ enhancements like SSAA, TrMSAA, AO, 3D, 3D Surround, etc., and conversely MLAA and EQAA on AMD. These all play a factor in perceived value.
And the 6900s' clocks were at 830, which is understandable, but it will sure be interesting to see three separate cards at 880. And I know the CPU helped a lot, but I can't help but wonder if the third GPU not running at 4x made much difference in Tri + Surround.
Anyway cheers for the review.
Very nicely done article... as always.
I applaud the effort, and the follow-up on what people were saying about the first article.
Now I am going to be looking to build a new system, I guess...
Isn't that X79 board coming out soon?
Indeed. The reason I read [H] reviews is the low-BS factor.
My takeaway: NV has a noticeable edge on bleeding-edge overclocked rigs that you'll pay dearly for. AMD has the value advantage in spades and will yield better performance on rigs people are most likely to actually own.
Not all gamers run overclocked monsters. Some of us build rigs right behind the bleeding edge and aim for the value/performance sweet spot. Right now that is dominated by AMD GPUs.
Am I the only one who gets the impression that on HardOCP, when an NVIDIA configuration beats an AMD configuration the entire subject is dropped until the next generation, but when an AMD configuration beats an NVIDIA configuration the topic gets revisited over and over and over again until NVIDIA takes the lead again?!
TBH... I would expect a 3x 6970 Tri-Fire configuration to at least close the gap with the 3x 580 configuration where the CPU is less of a limitation. This series of articles has all been comparing 3x NVIDIA cards versus 2x AMD cards (not exactly an apples-to-apples comparison, ladies... the AMD configuration may have three GPUs, but if history is any indication, three actual 6970s in Tri-Fire would outperform that config by a good margin).
Kyle and Brent, that was a great follow-up to the article. I hope you find out soon from ATI why F1 2010 saw such negative scaling when jumping to the new testing platform. Out of curiosity, how are you guys planning on testing the quad vs. quad? I can just see Brent knocking out the power within an eight-block radius as soon as he flips the switch on the quad Fermi rig.
You are definitely damn right about that. Not all of us are comfortable overclocking CPUs to the bleeding edge; many of us just want a nice performance boost without worrying about all the instability issues you could run into. I usually only overclock as far as a processor will let me go on stock voltage.
http://www.hardwareheaven.com/revie...-590-quad-sli-review-power-temp-noise-oc.html
http://www.guru3d.com/article/his-radeon-hd-6990-crossfire-review/9
http://www.guru3d.com/article/geforce-gtx-590-sli-review/3
Sorry to burst your bubble, but maybe you should do some research before spreading lies.
And if you are spending $1500 on three GTX 580s and don't OC at all, you're just completely stupid. Why spend so much money and then buy a "budget" CPU? It is clear that this setup is not for you, which is fine, but don't say it's crap just because it needs a higher-speed CPU.
On another note, great review and thanks for revisiting it. It shows that AMD is the best choice for most of us who do no or mild OCs. It also shows that AMD's cards are at their max with current CPUs and therefore more practical, while NVIDIA's still seem to be limited by current CPUs even with really high overclocks, which makes them less practical.
I never said anything about HardOCP fudging results... the numbers are the numbers, and people will come to their own conclusions based on them. I take some issue with the fact that this entire line of articles is not an apples-to-apples comparison (2x AMD cards versus 3x NVIDIA cards)*** and that there seems to be an ongoing trend: when NVIDIA takes the lead the topic is dropped (as if that was the preconceived conclusion), and when AMD takes the lead the topic gets a "Redux" over and over again. Whether that is because the majority of readers are NVIDIA fans (and as such beg HardOCP to revisit the conclusions) or because certain staff members are the biased ones, the result is the same... an NVIDIA bias. *shrug*

Thanks Kyle, I knew I had seen these somewhere before. I know some people were mentioning the difference in PCIe lanes in the forum before you did the redux, so I thought I would ask.
I am sorry, but exactly what bias are you referring to, and how exactly are you coming to that conclusion? Notice they did this after a ton of readers commented on different things that might have been skewing the results. HardOCP did another review because the users asked for it, not because they wanted to prove one thing or another. Also note that Kyle was pretty adamant that his first (err, Brent's) findings were correct and another review wouldn't change them, thus the whole "eating crow" comment. And also notice that his conclusion really did stay the same: the AMD setup is still the far more economical one.
On another note, I remember a long period of time when HardOCP was all about Eyefinity, ran Eyefinity-sponsored events, and you couldn't go more than a few days without seeing yet more articles and information about Eyefinity. Quite frankly it made me a little sick, just because it seemed like over-advertising. However, the fact remains that Kyle is going to follow the technology and is going to recommend whatever he feels is the best representation of the latest technology for gaming and performance.
Kyle, I noticed that you guys have had some holes in the schedule since you are now working on a quad vs. quad article along with planning to revisit 3x vs. 3x afterwards. Have you reviewers been given any information as to when you "might" start receiving ES or preview samples of a certain Sunnyvale-based institution's upcoming piece of silicon that might plug into a motherboard with a certain socket with a + on it?
You can't get info like that out of serious hardware sites; go ask some Chinese ones.
Kyle should have it by the 20th; that's at least when a friend of mine is getting his.
Upping the frequencies is good for raw power... but I think we all know that we should do reviews at lower frequencies, right? I mean... overclockers are a niche...
Most users will just play with everything at stock... or at low overclocks...
So... it's like painting everything pink so it looks wonderful... but in the end, when someone gets the same hardware and tests it on a lower-performance system, they get a real "WTF?"
So... I think [H] does a nice job reviewing with the 3.6GHz system...
Kyle/Brent: Any feedback on performance at default clock speeds on the 2600K system? Guess I'm trying to dig at how much the newer P67/2600K platform played into the differences you've seen. Thanks.
Zarathustra[H] said: Mind blown...
I had been completely convinced that there was no such thing as CPU limiting on any Core i5 or Core i7 or better, at least not at any resolution and settings anyone would want to play at with modern video cards.
Very interesting article, and a HUGE thumbs up for listening to the community and revisiting the results.
I am pretty happy with my 4.4GHz i7 / GTX 580 SLI combo, but I have on occasion debated whether picking up a third 580 might be worth it. Your first article really put the brakes on that, but it looks like it may be a worthwhile upgrade after all.
Who the hell is spreading lies? I want to know, because I'm just joking around given the image a certain piece of silicon has.
Also, who said anything about buying a budget CPU? I only commented on the fact that not all of us buy a processor and overclock it to within an inch of its life. Notice my original post made no mention of what kind of CPU, budget or not. Read the post before drawing paranoid conclusions like some conspiracy theorist.
My bad, didn't know you were joking. Your username made it a bit more difficult to tell whether you were joking or not.
It's kinda interesting that some people are still not content with the tests. C'mon guys, it's been redone with a faster CPU just like you requested and you're still not happy. I guess it's hard to please people these days.
As I said before... it shows that AMD is the best choice for most of us who do no or mild OCs. It also shows that AMD's cards are at their max with current CPUs and therefore more practical, while NVIDIA's still seem to be limited by current CPUs even with really high overclocks, which makes them less practical.