Nice GPU roundup here; includes CrossFire and SLI (TechReport)

Thanks for the link! I've been waiting for something like this, especially since I'm planning on CrossFiring my 5770s soon.
 
http://techreport.com/articles.x/19404/1

I quite enjoyed it :)
It even has the newer 460s in there as well.

Interesting, but unfortunate in their NV bias. Why was an overclocked and super-cooled GTX 480 AMP! card used, for instance, instead of stock? And Metro and Borderlands are very NV-biased (for crissake, Metro is even fully 3D compliant; it's the epitome of a game heavily sponsored by TWIMTBP, i.e., NVIDIA). I also wish they had tested more games, especially popular and good ones like MW2, StarCraft 2, STALKER: CoP, etc. rather than trash like Borderlands, so that one game doesn't skew the average results so much (Metro at ONE resolution, 1680x1050, and ONE setting for AA/AF, wtf? Maybe it's to make the 460s look artificially good, because they slow down at higher resolutions more than the 58xx cards do?). If you take out the Metro or Borderlands results, or both, that undoes the GTX's lead in the averages.

Also, they failed to consistently test at 2560x1600 or higher; hell, they don't even consistently test 1080p. I would be very interested in seeing 460 SLI tested more often at 5040x1050 to see how well it does at high res, because I know 460s start to slow down more than 58xx cards do in single-GPU at higher resolutions.
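To illustrate the skew point with a toy calculation (all numbers below are invented for illustration, NOT TechReport's actual results): one or two outlier titles can single-handedly flip an overall average.

```python
# Toy illustration of how outlier titles can flip an average.
# Every value here is hypothetical: a made-up "GTX 460 SLI as % of
# 5850 CF performance" score, NOT measured data.
scores = {
    "Metro 2033":  135,  # hypothetical NV-leaning outlier
    "Borderlands": 120,  # hypothetical NV-leaning outlier
    "Game C":       95,
    "Game D":       98,
    "Game E":      100,
}

def avg(vals):
    vals = list(vals)
    return sum(vals) / len(vals)

overall = avg(scores.values())  # 109.6 -> SLI "wins" on average
trimmed = avg(v for title, v in scores.items()
              if title not in ("Metro 2033", "Borderlands"))  # ~97.7 -> CF ahead

print(f"all five games: {overall:.1f}")    # above 100
print(f"outliers removed: {trimmed:.1f}")  # below 100
```

With these made-up numbers, dropping the two outliers moves the average from above 100 to below it, which is exactly the "one game skews the results" complaint.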

CrossFire is apparently solid for the 57xx series, so what's up with it not working as well for the 58xx series?

Despite the flawed methodology TR used, the 460 is looking impressive. If it weren't for potential microstutter issues and the 460 possibly slowing down too much at 5040x1050, I might trade my 5850 for a theoretical GTX 495 (460 SLI) card (my mobo doesn't do SLI). As it stands, though, the GTX 495 probably won't release before fall/winter, and I'm happy with 5850 performance in the games I currently play and unwilling to give up multi-monitor capability, so I'll wait for the HD 6xxx series this fall/winter to see if the GTX 495 beats an HD 6870.
 

About the reference cards, I have no idea why they did that. Probably received them as a promotional thing, or just wanted to compare the different flavors. Bias is apparent here.

Bias in benching? Sure, but Metro 2033 is the most demanding game to date, so not benching it would be silly. That's like saying it was NVIDIA bias to bench Crysis when it came out (which it may have been), but not benching it would be unfair to the ATI people who want to see how their card performs on the toughest game evar. And although I partially agree with you that 1680x1050 was a bad call for Metro, they probably did it because most of the cards in their roundup would lock up and fail the benchmark at a higher resolution, so for the purpose of comparing all of them in one test, it makes sense.

And Unreal Engine 3 is probably the most-used engine for last year's games, so not benching that would be biased as well.
Also, I don't see much of a point in benching COD these days, because anything over the 8800 GTX can run it at full settings. Not a demanding engine in the least, and it's only used by COD games.

And the reason CrossFire sucks on the 58xx series is the drivers. The older drivers work better, but changing them would be biasing the results, wouldn't it?

To me it looks like TechReport offered benchmarks of games in which ALL the cards could be compared, without dropping some for "fail" ratings. This comparison wasn't even supposed to revolve around the single cards anyway, which is where all of your points stand; it was meant, for all intents and purposes, to show the differences between CrossFire and SLI while giving the reader a baseline to compare against (i.e., the single cards). The article is, after all, called "SLI vs. CrossFireX: The DX11 generation".
 

Interesting theory; not really sure where you're getting your info.
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/34422-evga-gtx-460-1gb-superclocked-ee-external-exhaust-review-12.html
This review shows the 460 (both the 1GB and 768MB versions) being pretty much equal at high resolution (2560x1600) with the 5830, which is its direct competitor.

If you had been reading the forums as of late, especially the 5850 CF vs. 460 SLI comparison that was done, you would see that the 10.7 drivers pretty much broke CrossFire, and that with the current drivers 460 SLI smashes the more expensive 5850 CF option. Obviously you can use older drivers to fix the issue.
 
http://www.techpowerup.com/reviews/Axle/GeForce_GTX_460_768_MB/31.html

They also have the 1GB version in the summary.

 

Metro 2033 is demanding, and I get what you're saying about treating it like Crysis, but testing it at a mainstream resolution at one setting and generalizing from that is b.s. How many people have 480s or 5870s or GTX 460 SLI and play at 1680x1050? Why not bench at 1080p and 2560x1600 and test only a subset of the cards, since we all know the weaker ones can't handle it anyway? Otherwise forget 2033 entirely; you can't have it both ways.

Borderlands is among the most NV-biased of the UE2.5/UE3 games out there. They could have benched anything from Mass Effect 2 (a way better and more popular game) to DA:O to BioShock 2 instead. (Also, Borderlands is maxed by lower cards, like you say MW2 is... though I agree with you that a UE3 game must be tested simply because the engine is in so many games. But then why did TR not also include a Source game, since so many good games are based on Source?)

As for SLI vs. CF, I think enough people have tested it by now to say that 58xx CF = fail, 57xx CF = win, and NV is more consistent throughout its lineup. I'd still rather have one superfast GPU than two mainstream ones in SLI, for power/heat/noise/microstutter reasons, but if the price is right...
 
That review you linked still puts both models of the 460 above the 5830 on all of the graphs.

I was comparing the 460 to the 58xx in terms of relative slowdown, not absolute performance. The point is that 58xx cards don't slow down as fast when you crank up the resolution, which is why I'm curious to see 460 SLI performance at Eyefinity resolutions like 3x22" and 3x24" arrays. Does that make my point clearer? I think you read what I wrote to mean that I was saying 5830s were faster than 460s, which is not what I said.
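For context on why those resolutions stress cards so differently, here are the raw pixel counts (this is plain arithmetic, not benchmark data; how FPS actually scales with pixel count varies by card and game, e.g. VRAM limits on the 768MB 460):

```python
# Pixel counts for the resolutions discussed in the thread.
resolutions = {
    "1680x1050": 1680 * 1050,   # the setting TR used for Metro
    "1920x1080": 1920 * 1080,   # 1080p
    "2560x1600": 2560 * 1600,   # 30-inch panel
    "5040x1050": 5040 * 1050,   # 3x 1680x1050 Eyefinity/Surround array
}

base = resolutions["1680x1050"]
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.2f} MP ({px / base:.2f}x 1680x1050)")
```

So a 5040x1050 array pushes exactly three times the pixels of the resolution TR benched Metro at, and about 29% more than 2560x1600, which is why single-resolution results say so little about high-res multi-monitor behavior.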
 

I get what you're saying about Metro now. But I have an inkling that they kept it like that because the 5830 CF setup would have failed at higher resolutions, and they wanted to keep it for the whole CF vs. SLI thing.

I did not know that about Borderlands; good to know, man, thanks.

Source, though, is older than Unreal 3: 2004 vs. 2006 (RoboBlitz + Gears of War), or more like 2007/2008 when Unreal 3 went mainstream. I think everybody can run it at full these days :p. PS: holy crap, I'm reading the list of Unreal games on Wikipedia, and there are a shitton, WAY more than I thought. Check this, man: http://en.wikipedia.org/wiki/List_of_Unreal_Engine_games#Unreal_Engine_3.

And yeah, I agree with you about one big card vs. two smaller cards. This review isn't for us, eh? :)
Though it's nice to see some numbers if I decide to go 480 SLI... you know... for ridiculousness' sake.
 
Yeah, sorry, I shoulda been clearer. I shoulda said something more like: "if you're going to include 2033 despite its huge NV bias, on the logic that people want to see the most demanding game getting benched, then bench it for real and not at a sissy resolution like 1680x1050. And bench more games so that one or two titles don't skew the overall average so much. Moreover, why the wild variation in resolutions tested and the lack of variation in settings (AA/AF) tested? It damages the credibility of Damage's review."

Similarly, I shoulda said re: UE3: "if you're going to use a UE3 game on the logic that so many games use UE3 that you have to bench it, even though UE3 is old and most high-end cards can max such games, then why not skip one of the most NV-biased UE3 games out there (Borderlands) and instead include a better-rated, more popular, or newer one like DA:O or Mass Effect 2? And by the same logic, why not include a Source game, another old engine that has nevertheless had some recent bestselling hits?"

BTW, thanks for the UE3 link. It bears mentioning that Source has been updated a lot since 2004, so the 2004 Source is not the same engine as what was used in newer games like TF2 or L4D2... so it's not really accurate to call Source a 2004 engine. Valve could have called the sum of all the upgrades "Source 2.0" or something if they really wanted to. http://en.wikipedia.org/wiki/Source_(game_engine)

Yeah I'm with you on 1 vs 2 cards unless the bang for the buck gets to be ridiculous... which is why I do like single GPUs to be included in SLI vs. Crossfire matches. It gives some perspective to things.

 

I get what you mean, and I would be very interested to see Surround vs. Eyefinity with 460 SLI vs. 5830 CF.
 