NVIDIA GeForce 3-Way SLI and Radeon Tri-Fire Review @ [H]

Status
Not open for further replies.
Civ 5, Metro, BC2, Crysis (surely you mean the crappy-looking Warhead, not the original Crysis)
running better on NV does not mean favoring Nvidia.
At least not in an unfair sense, because these are not TWIMTBP-exclusive games where AMD was cut out of the development process.
(Dunno what people who claim otherwise have been smoking.)

Unlike F1 2010 and Dragon Age 2, which are exclusively AMD-sponsored games, developed with AMD's involvement.

So if you think that having 2 AMD partnership games out of 5,
and not a single exclusively Nvidia-sponsored TWIMTBP title,
is either fair or representative of the gaming industry, oh well...

Also...you say you need good games.

How about S.T.A.L.K.E.R.: Call of Pripyat, one of the first DX11 games, a game with phenomenal gameplay and replay value, loved by many on this forum, yet never part of your benchmark set?
But a cartoonish-looking corridor RPG with one cave and DX11 features only on paper enters your bench set, no problem.

And no, we cannot take this to the developers, because you are the ones who choose which games to bench, and frankly this set could be larger.

How about widening it?
Because I don't think you can really say "X gives a better gameplay experience than Y" with a straight face after benching four games.

How's that for a rant :cool:

Crysis Warhead and Metro 2033 are both TWIMTBP. Metro 2033 especially was a totally nVidia-based game.

I am not so sure what you've been smoking...


---------------

I find it funny to see some people get butt-hurt when something they lean toward isn't doing as well as the alternative.

Not trying to name-call anyone here... ;)
 
Crysis Warhead and Metro 2033 are both TWIMTBP. Metro 2033 especially was a totally nVidia-based game.
I am not so sure what you've been smoking...

TBH I could have been wrong there, because I have intros disabled. I should have checked better.

Still, these games do not magically perform better on Nvidia, or kill fps for no apparent reason whatsoever like Dragon Age 2 does with SSAO, for example.

Both DA2 and F1 are outliers, similar to HAWX(2)/Lost Planet 2.

Metro and Warhead are not.
 
TBH I could have been wrong there, because I have intros disabled. I should have checked better.

Still, these games do not magically perform better on Nvidia, or kill fps for no apparent reason whatsoever like Dragon Age 2 does with SSAO, for example.

Crysis's performance was superior on nVidia when it came out, but with ATi's 8.12HF driver the tables turned entirely. It's a pure TWIMTBP game; it's ATi who worked hard enough to push the driver that far.

For BC2, same thing. It was superior on ATi, but nVidia turned the tables with new driver optimizations.

So your argument is not valid in general.
 
Which argument are you referring to? Yes, I was wrong about them not being TWIMTBP. I thought we were past this.
 
Which argument are you referring to? Yes, I was wrong about them not being TWIMTBP. I thought we were past this.

What I mean is, even if they are TWIMTBP or ATi titles, a driver update can easily turn the tables. It's pointless to specifically include certain games just to make a certain card's average come out ahead.

Civ 5 is a heavily nVidia-leaning game, and like you said it's something like DA2/LP2, etc.

But sometimes these kinds of games can flip sides. Look at Crysis, Metro 2033, BC2, DiRT 2... these games ALL ended up favoring the opposite side in the end.
 
BTW, Civ5 is heavily leaning towards NV because of a driver feature NV currently supports: they have enabled some of the DX11 multi-threading routines the game supports, and thus get a good boost in that game. AMD needs to catch up with their DX11 multi-threaded driver support. Civ5 is a unique example though, because it is currently the only game that supports that DX11 feature. Note that BC3 will also support DX11 driver multi-threading, so that will be two games supporting the feature once it is released. Also, I have to disagree with you on your DA2 comments, HohnyF. DA2 makes good use of standard modern DX11 graphical features.
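For anyone curious what "DX11 multi-threading routines" means in practice: it's multi-threaded command-list recording via deferred contexts, where worker threads record draw commands in parallel and the main thread only submits the pre-recorded lists. The sketch below is a toy model in Python, not actual D3D11 code; all the names (record_command_list, render_frame) are made up for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy stand-in for a D3D11 "command list": an ordered batch of
# pre-encoded draw commands. In a real driver, recording involves
# expensive state validation, which is the work worth parallelizing.
def record_command_list(draw_calls):
    return [f"DRAW({d})" for d in draw_calls]  # pretend encoding work

def render_frame(scene, workers=4):
    # Split the scene's draw calls into chunks, one per "deferred context".
    chunks = [scene[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Each worker records its chunk in parallel.
        command_lists = list(pool.map(record_command_list, chunks))
    # The "immediate context" then executes the finished lists in
    # sequence, which is cheap compared to recording them.
    return [cmd for cl in command_lists for cmd in cl]

frame = render_frame(list(range(8)), workers=4)
```

A driver without this support serializes all of the recording on one thread, which is why a game built around the feature (like Civ5) gains so much when the driver enables it.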
 
Point is, it doesn't matter who sponsors the games; it's that games known to magically run far faster on certain hardware shouldn't be included in a limited 5-game test, with those results then used to claim one system is superior to the other.
 
Well, no amount of tweaking and hard work will help AMD in LP2 or HAWX(2), or Nvidia in DA2.

Civ 5 you say xD

http://sites.amd.com/us/game/games/Pages/CivilizationV.aspx

Nvidia has a working set of multithreading features for this one. That's why. Not because it was magically enhanced like HAWX/DA2.

EDIT: @Brent_Justice yeah xD (Civ5)

DA2... Well, it does have tessellation, which works in an 8m circle around your character, and frankly it's mostly good for producing artifacts when your character moves. And slightly softer shadows.

That blur/DoF stuff is the same as the tessellation: it's better turned off.
 
I think it's funny that people are complaining about the games that [H] uses, when Kyle has specifically created threads asking what games people want to see in the reviews. If you don't like their choices, I hope you posted feedback in those threads.
 
I think it's funny that people are complaining about the games that [H] uses, when Kyle has specifically created threads asking what games people want to see in the reviews. If you don't like their choices, I hope you posted feedback in those threads.

It doesn't take a genius to work out that you don't bench DA2, F1, LP2, HAWX 2, etc. in a 5-game review.
 
It doesn't take a genius to work out that you don't bench DA2, F1, LP2, HAWX 2, etc. in a 5-game review.

Why not?

LP2 and HAWX 2 are understandable, since NO ONE plays them.

DA2 and F1 2010 have a huge player base; why not bench the games that people actually play? Because it makes you butt-hurt?
 
Because you cannot claim "X gives better performance than Y" based on a 5-game sample with
2 very atypically performing members in it.
Which you kind of could, if you had chosen your sample with leverage in mind.

Your point about popularity is valid, though.

But there's no good way out of this other than including more games.
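The small-sample complaint is easy to put numbers on: with only five games, two outlier titles dominate the average. A quick illustration with made-up fps ratios (hypothetical card A vs. card B, not real benchmark data):

```python
# Hypothetical per-game fps ratios (card A / card B) in a 5-game sample.
typical  = [1.02, 0.98, 1.01]   # three games: essentially a wash
outliers = [1.40, 1.35]         # two sponsor-optimized titles

sample = typical + outliers
mean = sum(sample) / len(sample)
print(round(mean, 3))  # 1.152 -> "A is ~15% faster", driven by 2 games
```

With three "normal" games the two cards are within a couple of percent of each other, yet the five-game average reports a 15% lead; that's the distortion a larger game set would dilute.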
 
Because you cannot claim "X gives better performance than Y" based on a 5-game sample with
2 very atypically performing members in it.
Which you kind of could, if you had chosen your sample with leverage in mind.

Your point about popularity is valid, though.

But there's no good way out of this other than including more games.

Aren't Metro 2033 and Civilization another 2 to average out? ;)
 
The games list is fine. I just don't know what some of you are smoking. Now on to the new SB review system, and then we can all make a final conclusion (increased CPU clock speed/IPC + 3rd card running at higher than PCIe 4X).
 
The games list is fine. I just don't know what some of you are smoking. Now on to the new SB review system, and then we can all make a final conclusion (increased CPU clock speed/IPC + 3rd card running at higher than PCIe 4X).

I think so too. It would be good to have the same games as before, plus maybe a couple more: one AMD-enhanced or co-developed and one Nvidia-enhanced or co-developed. The games tested should yield useful information about the hardware being tested: games that are played more, and that would actually need this kind of hardware (otherwise sims would be benched :eek:). I have to say the in-depth, played-through game testing at HardOCP is superior to other sites' testing, which runs automated benchmarks on numerous games. The five or so games tested tell us way more than the ten to twenty used in the hands-off/brain-off testing others do.

Now, for a 3-GPU setup, AMD seems to have a much more versatile solution: on motherboards with only two PCIe slots, or with limited 16x/8x bandwidth slots, you can still get the ultimate performance with 3 AMD GPUs. With Nvidia the choice of motherboards drops dramatically, which would probably also increase the cost of that solution.

Configuration-type testing might be useful as well, as in what is the best way to set up a TriFire rig: 6990 first or 6970 first, monitor hooked up to which card and why, if it really makes a difference. Also, the previous rig could be reused with the Radeon 6970 in the 4x slot and retested; it would be interesting to see if this dramatically changes the results. The 6990 would be in a 16x slot, and two of the 580s would be in 16x slots, so each setup would have one card in the 4x slot. Just a thought on the last.
 
I'm willing to accept the results from the Sandy build as the final result. After reading the last page I now definitely have reasonable doubt about the current results. Having the 3 cards in a build that should offer no PCIe or CPU bottleneck will put this issue to rest for me.

You will. But do you really believe the Nvidia loyalists will? Brent is fighting a losing battle, sadly.

First it was the VRAM limitation, and now they've found the ''BIG'' 4X limitation on the 3rd PCIe slot! My God! :eek: Call the army! The police! LOL. It's not Brent's or AMD's fault that Nvidia can't do 3 GPUs on 2 PCIe slots like AMD can. BLAME Nvidia for this.

After the next review they will find something else: the choice of games was not adequate, or too many games where AMD is always ''winning'', or the P67 motherboard's 8x/8x PCIe speed, the choice of heatsink, the weather outside, the elections, a butterfly in China, etc. :rolleyes:

I even posted some results where my ''crippled'' P67 motherboard beats some 980/990X 580 quad-SLI systems in GPU scores. Systems with those ''awesome'' 16X PCIe slots, and my ''crippled'' PCIe lanes are winning!

I have an i7 920 at home with an ''awesome'' X58 motherboard, but the kids are using it now, since my P67 system is faster. :)

But remember my post. Vega, HonyF, etc. will nit-pick everything, and they will find something to discredit Brent's hard work and the hours spent on those long testing sessions. The poor guy is losing HOURS testing this, and those loyalists will go on a rampage again to discredit everything.

I've found at least 5 or 6 different internet forums where Vega is posting that HardOCP reviews are totally ''invalid'', discrediting HardOCP and Brent's work everywhere he can. The guy is raging! And then he's using the HardOCP forums to post countless build threads, etc.! Bitching about HardOCP everywhere, and then using HardOCP's bandwidth for free to post. Sigh.

Remember my words. they will find something, because they can't accept the fact that all the money they spent on Nvidia is NOT giving them a faster system then AMD.

Remember my words, and the names of those attacking Brent and HardOCP. You will see.

Those guys should really start their own forums and review sites if they are sooooo goooood and awesome and know-it-all. They are so good at discrediting others' hard work while trying to make themselves look like ''professionals''. So why don't they start their own review sites, show us how to properly do reviews, and show us their awesomeness? Using their real names instead of pseudonyms. No. It's so much easier to discredit others' hard work and nit-pick every detail while hiding behind a keyboard.

It's not them losing all those hours doing the testing.
 
So yeah, this review proves that Nvidia can't even compete against AMD. Even if Nvidia had made 3GB a standard feature, they charge too much. And as of now AMD has better drivers and compatibility; you don't need a 990X at 4.8GHz to achieve these results. There are so many instances where Nvidia just can't do it because of an inferior design. Apples-to-apples testing should be done, and if one side can't produce results, then that's what to draw a conclusion from. I see fanboys here suggesting tests be run in favor of one company... NO, get over it. If you pay Nvidia's inflated prices you should try to be humble about all the features you're getting: CUDA, PhysX, 3D Surround and TWIMTBP.
 
Keep in mind that the 3 580s get a total of 36 PCIe lanes (16+16+4) while the 6970+6990 get only 32 (16+16). For a totally fair setup on that mobo, the 6970 would have to be placed in the x4 slot, while the 580s were restricted to x8/x8/x4.

But that is all moot, as we hopefully see the results from the new rig soon. :D
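The lane arithmetic is worth spelling out; here is a back-of-the-envelope sketch, assuming PCIe 2.0's roughly 500 MB/s per lane per direction (the slot layout is taken from the posts above; the aggregate-bandwidth figures are just lanes times per-lane rate, ignoring protocol overhead):

```python
PCIE2_MB_PER_LANE = 500  # PCIe 2.0: ~500 MB/s per lane, per direction

# Lane allocation on the review motherboard, per the thread.
nv_lanes = 16 + 16 + 4   # three GTX 580s: x16 / x16 / x4 slots
amd_lanes = 16 + 16      # 6990 (dual GPU in one x16 slot) + 6970 (x16)

print(nv_lanes, amd_lanes)                    # 36 32
print(nv_lanes * PCIE2_MB_PER_LANE / 1000)    # aggregate GB/s, Nvidia
print(amd_lanes * PCIE2_MB_PER_LANE / 1000)   # aggregate GB/s, AMD
```

So despite the "3rd card at 4X" complaint, the three-card Nvidia setup actually sees more total host bandwidth than the two-card AMD setup, because the 6990 squeezes two GPUs through a single x16 link.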
 
Keep in mind that the 3 580s get a total of 36 PCIe lanes (16+16+4) while the 6970+6990 get only 32 (16+16). For a totally fair setup on that mobo, the 6970 would have to be placed in the x4 slot, while the 580s were restricted to x8/x8/x4.

Oops. I'm wondering if the local Nvidia idol will post that little detail on 10 different forums as well...

So much for ''the 3rd Nvidia card is only at 4X! Stop the presses!''. :rolleyes:
 
Are people suggesting Nvidia requires more PCIe bandwidth to achieve the same results AMD achieves with less bandwidth? How does that make Nvidia better?
 
Point is, it doesn't matter who sponsors the games; it's that games known to magically run far faster on certain hardware shouldn't be included in a limited 5-game test, with those results then used to claim one system is superior to the other.

Goodness, guy, you really need to give this up. Every game out there will lean one way or the other depending on what features it utilizes. If they don't use the latest and greatest games that WE play, WTF are they supposed to use? In case you haven't noticed, good PC games have become very slim pickings lately.

I suppose you probably won't be happy, then, until Brent starts adding games like Hello Kitty Adventure, Sesame Street: Elmo's A to Zoo, or Jumpstart Advanced Preschool Fundamentals to his testing suite.
 
If you're going to eliminate all possible bottlenecks for the GTXs, you should clock the 6990+6970 at the full 880MHz, and it would still be better with 3 6970s.
 
All this bickering, and going around every internet forum discrediting Brent's and HardOCP's hard work... and all this for a couple of fps... Sigh.

Crazy world. And 3 of my patients came to see me crying today after getting cancer diagnoses... :(

A couple of fps. That's all.
 
It doesn't take a genius to work out that you don't bench DA2, F1, LP2, HAWX 2, etc. in a 5-game review.
Cool, except they didn't bench LP2 or HAWX 2, so what the fuck is your point? DA2 and F1 2010 are popular DX11 titles, so I don't see why you would exclude them. Just because a game doesn't favor your particular card, it's not a good choice? :rolleyes:
 
Are people suggesting Nvidia requires more PCIe bandwidth to achieve the same results AMD achieves with less bandwidth? How does that make Nvidia better?

Whatever it takes to make Nvidia look good is all they want.
 
Whatever it takes to make Nvidia look good is all they want.

That would be roughly 10 Nvidia-sponsored titles in a ten-title test bench, using custom 3GB 580s against stock AMD cards. No custom AMD cards may be used, since somehow that would again be a poor testing premise.

I think the most fair comparison Brent could do is 2 6990s in CrossFire vs 4 3GB 580s in quad SLI. Of course one solution would cost almost twice as much as the other, but fair's fair. :rolleyes:
What will these poor souls do when aftermarket 6990s start to roll out?
 
That would be roughly 10 Nvidia-sponsored titles in a ten-title test bench, using custom 3GB 580s against stock AMD cards. No custom AMD cards may be used, since somehow that would again be a poor testing premise.

I think the most fair comparison Brent could do is 2 6990s in CrossFire vs 4 3GB 580s in quad SLI. Of course one solution would cost almost twice as much as the other, but fair's fair. :rolleyes:
What will these poor souls do when aftermarket 6990s start to roll out?

Lol, pretty funny too. The whiny fanboys want Brent and Kyle to spend X amount of money on the 3GB versions, and they would still complain about something like Lost Planet or Batman not being tested.
 
I don't see why people are so butthurt about this.

No offense but I would estimate 95% of the people reading this website can't afford two 580s, let alone three (or the 3GB models).

I'm an NVIDIA owner at the moment for my primary system and I'm glad ATI is so competitive, it's better for the industry.
 
I don't see why people are so butthurt about this.

No offense but I would estimate 95% of the people reading this website can't afford two 580s, let alone three (or the 3GB models).

I'm an NVIDIA owner at the moment for my primary system and I'm glad ATI is so competitive, it's better for the industry.

Best post I have read so far IMHO.

I guess it is normal human nature, when you invest time/money/effort into something the last thing you want to hear is that you may have made a bad choice. Especially when the seemingly better option costs much less financially. Some people go to any lengths to discredit any results they don't like. It is then that their true biased agenda can be seen by any rational person.
 
Best post I have read so far IMHO.

I guess it is normal human nature, when you invest time/money/effort into something the last thing you want to hear is that you may have made a bad choice. Especially when the seemingly better option costs much less financially. Some people go to any lengths to discredit any results they don't like. It is then that their true biased agenda can be seen by any rational person.

I am one of the people who actually has a 3x SLI 580 setup, and to be honest these numbers don't mean that much to me; I don't have any of these games besides Warhead, and I played that 3 years ago. As Rizen said, it's good to have competition to hold prices down. The fact is that there are other factors besides pure performance, and I went with the 580s for S3D support as well as performance. Bottom line, like Brent said: fast is fast. I'm simply not having performance issues in games in 2D Surround with my sig rig, and the little bit of extra performance I might see with Tri-Fire doesn't make up for the lack of 3D support, to me.

I've been very happy with the 580s thus far, and I'd still buy them over the 6900s because of S3D, but I'd probably get 3GB 580 cards if I were buying today.
 
How would this perform on a Sandy Bridge platform, with its somewhat limited PCIe bandwidth?
 
How would this perform on a Sandy Bridge platform, with its somewhat limited PCIe bandwidth?
Did you read this thread at all? They are re-testing with a 4.7GHz 2600k right now.
 
Interesting results. Going to be eating at least a little bit of crow. :D
 
How would this perform on a Sandy Bridge platform, with its somewhat limited PCIe bandwidth?

Using an ASUS P8P67 Work Station Revolution board with NF200 chipset....
 
All you AMD fanboys make me laugh. If these benches had been run with the 3rd AMD GPU at 4X, and LP2 and HAWX 2 had been used in a 5-game test, you would be crying bloody murder. Seriously, WTF is wrong with some people..

More games, on a mobo that actually supports tri-GPU properly, is all I ask for.

I still think AMD's Eyefinity solution is better; however, the testing was flawed.
 
All you AMD fanboys make me laugh. If these benches had been run with the 3rd AMD GPU at 4X, and LP2 and HAWX 2 had been used in a 5-game test, you would be crying bloody murder. Seriously, WTF is wrong with some people..

More games, on a mobo that actually supports tri-GPU properly, is all I ask for.

I still think AMD's Eyefinity solution is better; however, the testing was flawed.

All you Nvidia fanboys make me laugh. You are the exact same way, crying bloody murder that the tri-SLI lost because Nvidia can't make a 2-slot solution. Seriously, WTF is wrong with people..
 
You make no sense. Who is crying because Nvidia doesn't make a tri solution for 2 slots? I thought we were comparing tri vs. tri performance?
 