NVIDIA GeForce 3-Way SLI and Radeon Tri-Fire Review @ [H]

Like I said, if the 3rd AMD GPU was running at 4x and 2 of the 5 games were LP2 and Hawx2, I'm sure your tone would be different.
 
I don't see why people are so butthurt about this.

No offense but I would estimate 95% of the people reading this website can't afford two 580s, let alone three (or the 3GB models).

I'm an NVIDIA owner at the moment for my primary system and I'm glad ATI is so competitive, it's better for the industry.

So those who can't afford two 580s should refrain from calling a failed benchmark what it is,

but in your case, being the owner of two measly $500 cards makes you an expert on the whole fucking industry?

What experimental school of logic did you attend? :D
 
Color me impressed.

I wonder if you could achieve a minimum of 40 FPS if you dropped down to x1080 panels.

5760 x 1200 = 6 912 000 pixels
5760 x 1080 = 6 220 800 pixels
5040 x 1050 = 5 292 000 pixels
4800 x 900 = 4 320 000 pixels

The age of multi-monitor gaming is here! We have cheap TN panels; now all we need is to remove the bezels, or make them as small as possible.
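A quick back-of-the-envelope check of that 40 FPS question (a sketch only, assuming frame rate scales inversely with pixel count, which ignores CPU limits and VRAM pressure; the 35 FPS baseline is a made-up figure, not from the review):

```python
# Naive model: if a setup is purely fill-rate bound, FPS scales inversely
# with pixel count. Real games are also CPU- and VRAM-limited, so treat
# these as optimistic upper bounds.

def estimated_fps(base_fps, base_pixels, target_pixels):
    """Scale a measured FPS by the ratio of pixel counts."""
    return base_fps * base_pixels / target_pixels

base = 5760 * 1200  # 6,912,000 pixels
for w, h in [(5760, 1080), (5040, 1050), (4800, 900)]:
    print(f"{w}x{h}: ~{estimated_fps(35.0, base, w * h):.1f} FPS")
```

By this rough model, dropping from 5760x1200 to 5760x1080 only buys about 11% more FPS, so a setup struggling to hold a 40 FPS minimum would still be close to the line.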
 
I have made some 2560x1600 resolution comparisons for those who will find value in that lower-resolution comparison if they are concerned about VRAM.
 
Color me impressed.

I wonder if you could achieve a minimum of 40 FPS if you dropped down to x1080 panels.

5760 x 1200 = 6 912 000 pixels
5760 x 1080 = 6 220 800 pixels
5040 x 1050 = 5 292 000 pixels
4800 x 900 = 4 320 000 pixels


4800x2560 = 12 288 000 pixels!!!

This is about as many pixels as you can get these days.
[email protected], 3GB 580s, full x16 lanes on the UD9. Watch the scaling:

SLIScaling.jpg


The age of multi-monitor gaming is here! We have cheap TN panels; now all we need is to remove the bezels, or make them as small as possible.

I might buy TN again. When I go blind one day :D
 
Personally, I think Nvidia Tri-SLI plus 3 monitors will have trouble scaling. It's nothing to do with resolution, but more to do with their implementation of one card per screen vs. AMD's primary card driving all 3 screens. I'm willing to bet AMD has a better method for multi-monitor, but again, this has nothing to do with the resolution per se.

Also, I find it odd that some people think tri-GPU users must use 3 screens and that a single monitor is not considered enough. Has no one heard of 120 FPS on a 120Hz screen, or 3D?
 
Also, I find it odd that some people think tri-GPU users must use 3 screens and that a single monitor is not considered enough. Has no one heard of 120 FPS on a 120Hz screen, or 3D?

I have, as I have both, and I think that 3 580s to power a single 1080p screen just isn't worth it.
 
Like I said, if the 3rd AMD GPU was running at 4x and 2 of the 5 games were LP2 and Hawx2, I'm sure your tone would be different.

With AMD you don't have to use 3 PCI-E slots, and that's the point. AMD gets Tri-Fire while taking up only 2 slots, making it far more within reach, easier to use, and far cheaper, while having better performance.
 
I have, as I have both, and I think that 3 580s to power a single 1080p screen just isn't worth it.

Well, then you will know that it takes 3x 580s to reach a solid 120 FPS in many games on a 120Hz screen, and it takes 3x 580s to run many games at a solid 60 FPS in 3D. I guess it all depends on whether you want quality or quantity.

With AMD you don't have to use 3 PCI-E slots, and that's the point. AMD gets Tri-Fire while taking up only 2 slots, making it far more within reach, easier to use, and far cheaper, while having better performance.

lol, there are plenty of cheap boards that support Tri-SLI, but yes, one bonus of dual-GPU graphics cards is that fewer expansion slots are required.
 
I would be interested to see how Quad-CF scaling works compared to Triple-CF.

For 1920x1200 (2M pixels) it does not do much, and can actually make things worse, but from some comparisons I've seen it already scales at 2560x1600 (4M pixels). I wonder how it would do with 5760x1200 (7M pixels).

I have a water-cooled Rampage III with an i7 970 and 3x 5870s; I got the third really cheap second-hand, but there's not a big difference between 2x and 3x CF. 5870s just don't scale that well, but the 6970 is giving me hope ^_^

And I am actually thinking of going triple 6970s, and if it scales, going for a quad setup. I've got a 7x140 rad (560x140 + 280x140 + 140x140) in my 800D, so cooling should not be a big problem ;)

Just really curious how quad CF would work out at this resolution.
 
I am one of the people who actually has a 3x SLI 580 setup, and to be honest these numbers don't mean that much to me; I don't have any of these games besides Warhead, and I played that 3 years ago. As Rizen said, it's good to have competition to hold prices down. The fact is that there are other factors besides pure performance, and I went with the 580s for S3D support as well as performance. Bottom line, like Brent said: fast is fast, and I'm simply not having performance issues in games in 2D Surround with my sig rig. The little bit of performance I might gain with Tri-Fire doesn't make up for the lack of 3D support to me.

I've been very happy with the 580s thus far, and I'd still buy them over the 6900s because of S3D, but I'd probably get 3GB 580 cards if I were buying today.

Exactly. When people are spending close to $1000 on a couple of GPUs, they WILL RESEARCH according to their needs. I knew 2x 580s in SLI would be faster than 2 HD 6970s, but the HD 6970s fit my needs better: less cost for similar performance, and I would get more frame buffer for Eyefinity resolutions.

I cared not one iota that 2x 580s are 20-30% faster; they were also 40% more expensive, and I would have had to purchase a new PSU, pushing the cost up by 50%. The truth is that Tri-SLI 580s are about 30% more expensive than Tri-Fire 6970s, and if you can afford to buy them, you can afford a new CPU and motherboard to run them at 16x/16x/16x. They might be faster, but for the price difference they bloody well better be.
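The trade-off described above can be sketched as a performance-per-dollar comparison (all figures here are the poster's ballpark claims or my assumptions, not measured data; the ~$370 6970 price in particular is assumed):

```python
# Compare rough performance-per-dollar for the two triple-GPU setups.
# Relative performance of 1.25 reflects the quoted "20-30% faster" claim.

def perf_per_dollar(relative_perf, total_cost):
    return relative_perf / total_cost

trifire_6970 = perf_per_dollar(1.00, 3 * 370)  # assumed ~$370 per 6970
trisli_580 = perf_per_dollar(1.25, 3 * 500)    # ~$500 per 580, ~25% faster

print(f"Tri-Fire 6970: {trifire_6970:.6f} perf/$")
print(f"Tri-SLI 580:   {trisli_580:.6f} perf/$")
```

Under those assumptions, the 6970 setup comes out ahead on value even before counting a new PSU or a 16x/16x/16x motherboard.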
 
Thanks for the high-quality work and discussion, Brent and Kyle. I hope this type of intense competition will make the performance leap in the next generation bigger than usual.
 
The only real problem with the linked review is that they don't compare 3x GTX 580s in Tri-SLI to the 6990 + 6970 combo. Considering the drop in clock speeds on the 590s in comparison to the 580s, I personally wouldn't be at all surprised if 3x 580s were faster than 2x 590s....

....Then again, maybe that's why 3x 580s aren't in the review?
 
The only real problem with the linked review is that they don't compare 3x GTX 580s in Tri-SLI to the 6990 + 6970 combo. Considering the drop in clock speeds on the 590s in comparison to the 580s, I personally wouldn't be at all surprised if 3x 580s were faster than 2x 590s....

....Then again, maybe that's why 3x 580s aren't in the review?

Quad SLI has more shader power, but the SLI scaling will be lower.
I think it probably IS about the same, but who knows. The 590 is almost as quick as the 6990, but with 3 vs. 4 GPUs it doesn't catch up, which says one thing: it's damn bad scaling, even at low resolutions where it can't be limited. But I don't trust Fudzilla, so I take it with a grain of salt as evidence that 6990 + 6970 would beat Tri-SLI 580s.
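Scaling arguments like this are easier to pin down as an efficiency number, i.e. measured FPS divided by the ideal linear FPS of n GPUs (a common convention, not something from the linked review; the sample figures below are hypothetical):

```python
# Multi-GPU scaling efficiency: 1.0 means perfect linear scaling,
# lower values mean the extra GPUs are partly wasted.

def scaling_efficiency(fps_multi, fps_single, n_gpus):
    return fps_multi / (n_gpus * fps_single)

# Hypothetical: 3 GPUs hitting 90 FPS where a single GPU hits 40 FPS.
print(scaling_efficiency(90.0, 40.0, 3))  # prints 0.75
```

An efficiency that barely rises when the third or fourth GPU is added is exactly the "damn bad scaling" described above.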

Since this is about CrossFire and SLI: I have 5850 CF, and all drivers AFTER 11.2 make every game in CF flicker, a LOT, and I don't like the Firefox bug, so I want 10.4. I have built two 6950 CF rigs and they have no issues like the 5000 series; the HD 6000 really is perfection.
 
Seems pretty consistent, except for Vega, who gets very high scores with Tri-SLI, with 3GB cards.

Yep, but no one else but him is using an $11K sub-zero system with 4 inches of silicone everywhere. So the results V gets are totally irrelevant to 99.99% of the people out there. HardOCP reviews are more down-to-earth, and their results are useful to that 99.99%.

Except for Nvidia loyalists who like to post, repost, and repost, and repost, and repost (ad nauseam, like Hohny F just did) the little table he made in his basement, with the suspect numbers he got (that's what he said) using in-game benchmarks, without any professional or proper method, and doing it totally differently than HardOCP does.

Don't you find it strange that V gets different results and SLI scaling than almost every reviewer out there? Hmm. He's the only one on the planet with Nvidia cards scaling like that. He can put any numbers he wants in that table... Using a pseudonym, you can say and post anything you want.

So who cares about the results he gets, since he's getting them using a different "method" than HardOCP? You can't compare them. Irrelevant. And it's useful to 0.01% of the people out there, and only if those numbers are legit. But we don't know that... Nvidia "infiltrator"? "Sponsored" by Nvidia? Working for Nvidia? We don't know. We have no way to know the truth. Nvidia pays a lot of people to infiltrate internet forums, to sell more cards and make more money.

So we have to accept all the results he gets as gospel without knowing anything about him? The only guy on the planet getting perfect scaling with SLI?

Everyone on the internet is also married to a top model. We all know that. :)

Brent uses a proper, professional method. And he stayed up until 3 AM doing that testing last night for us. To me, those results are a lot more useful, and down to earth.

Who cares about 990X CPUs that can reach 6.4GHz on LN2 for 5 minutes? :rolleyes:
 
I think some of you are confusing in-game and out-of-game benchmarks. In-game benchmarks are a good thing. For example, the F1 2010 in-game benchmark is perfect, as all it is doing is replaying a real recorded race using the in-game engine and showing the FPS. This is perfect because every Joe Snuffy in the world can test the same benchmark of real in-game play against each other, using the same settings. If [H] uses some custom run-through of their own devising, unless they release what they did, it is of little use to other gamers for reference.

Now, out-of-game benchmarks that don't take actual gameplay into account, like Heaven 2.5, Metro 2033, etc., are the ones that can be "tweaked" by different drivers/companies to get the maximum possible score. Those are the ones to worry about.

In my SLI scaling thread, A-10C, Crysis 2, Batman, and Metro 2033 were all in-game FPS measurements and not "benchmarks".

In-game benchmarks can still be optimized for in the driver if the demo run remains constant. And that does happen. Look at the [H] review of the 3870 X2: Crysis run-through versus benchmark results.
 
uhhhhh

dude, those tests are at 720p... no wonder they are CPU-limited, don't ya think?

Screen Width: 1280
Screen Height: 720

Why did he pick 3DMark, FFS?

Don't pick benchmark programs to make points, please, not on the [H] of all places.
 
The next review after that one should be $750 of 3x 6950s (unlocked to 6970s) against $1,500 of 580 Tri-SLI, at 5760x1200.

Imagine, for one second, if the $750 setup were able to keep up with the $1,500 setup at that resolution.

There would be a riot. :)
 
All this arguing is over how many angels can dance on the head of a pin. [H] is focused on the reality of what is possible with current hardware, with games we actually play. Feel free to code a game, completely free of any custom coding for either AMD or nVidia, that people actually want to play and that [H] can use in their testing.

I just find it interesting that those with the shortest time on these forums are those with the strongest opinion of how [H] should be run. :rolleyes:
 
Really well written article! Thanks for giving me a good 15 minute break at work :)
 
I just find it interesting that those with the shortest time on these forums are those with the strongest opinion of how [H] should be run. :rolleyes:

My guess...

People with more years on these forums - on average - have been on this earth a little bit longer.

With age comes maturity, and the realization that you can't (and probably even shouldn't) always have things your way.

Those of us who have been around a bit longer know how Kyle and the gang go to the greatest practical lengths to tell the story as well as possible. We also know that they cannot please everyone all the time, and we are OK with that, because nobody can, and it is unreasonable to expect them to.

That doesn't mean we don't throw out a suggestion or ask a question every now and then however :p
 
Zarathustra[H];1037199340 said:
My guess...

People with more years on these forums - on average - have been on this earth a little bit longer.

With age comes maturity, and the realization that you can't (and probably even shouldn't) always have things your way.

Those of us who have been around a bit longer know how Kyle and the gang go to the greatest practical lengths to tell the story as well as possible. We also know that they cannot please everyone all the time, and we are OK with that, because nobody can, and it is unreasonable to expect them to.

That doesn't mean we don't throw out a suggestion or ask a question every now and then however :p

I wouldn't have the [H] any other way.
 
Those who can't afford two 580s should refrain from calling a failed benchmark for what it is,

but in your case, being an owner of two measly $500 cards makes you an expert on the whole fucking industry?

What experimental school of logic did you attend :D
I've been on this forum for 10 years, I used to run a hardware review site, and I've probably run more benchmarks in my life than most people on this forum. That said, I don't think that makes me any more or less qualified to comment.

What I do think is that people who are just posting and bitching and whining and shitting up this thread should refrain from posting unless it's actually constructive. Pointing out to Kyle and Brent where they might have misstepped in their evaluation is one thing, and a number of good points have been brought up in this thread, such as the PCI-E lanes and CPU limitations. But the people who are taking this personally, on both sides, need to calm down and act rationally. They're video cards. There is no need for the flame-bait posts and the whining over the game selection and all of that nonsense. If you feel you have a legitimate concern, post it, but at least try to remain semi-courteous and professional; otherwise you come off like an idiot and no one is going to take you seriously. Especially since they actually agreed with the consensus and re-ran all the benchmarks! I guess some people just aren't going to be satisfied unless the benchmarks validate their own purchases.
 
Spewing nonsense is just spewing nonsense, so I don't think you can shit up a thread that easily.

I'd rather see a bitching/whining post; there may be some substance in it, and I might learn a thing or two, or maybe laugh.

Posts which piss me off are those without any substance, like "Good job AMD (NV)".
Then there are the "Excellent review, great job Brent" posts, which I guess are a form of feedback, but again do little for me.

So the first kind of post ranks kinda highest on my list :D
 
So when is the update scheduled to go up?

Being worked on now. Tremendously great update; I am glad we took cues from you guys on this, for sure. Very interesting. This is one of the most interesting outcomes I have seen using real-world gameplay testing....
 
I think the only crown that matters is single-screen, single-GPU card performance. That is really where most people game. Also, perhaps single-screen multi-GPU could be a good scenario for them.

3DVision. This is an amazing technology, based on the time I spent with it last weekend on my 3D laptop, and it's the reason I'm switching back to Nvidia on my desktop, despite my frustration with Nvidia's vRAM stupidity. So much better than (and nothing like) the 3D movie gimmick. Not that I'd blindly recommend it - some people's eyes can't cope with it, maybe 20%.
 