Crysis Warhead and Stalker CS - Multi-GPU Gameplay @ [H]

FrgMstr

Crysis Warhead and Stalker CS - Multi-GPU Gameplay - Want to know how AMD’s CrossFire and NVIDIA’s SLI perform in Crysis: Warhead and Stalker: Clear Sky in DX10 at HD resolutions? We put new-gen CrossFire and SLI head-to-head in six setup configurations to show you what they can deliver.


NVIDIA has squeaked its way back into the best value category as long as you do not find yourself having to buy an NVIDIA SLI motherboard. And in the case of the high end SLI configuration using two GTX 280 cards, you can afford to buy an SLI motherboard and still come in less expensive than CrossFireX.
 
Good review overall; it seems the GTX 280 SLI setup and the 4870 X2 setup trade blows.
 
I'll assume there would be no discernible difference in gameplay experience if the newer 216-shader GTX 260 were swapped in instead.

Nice scores from everything tested, based on their current cost in the marketplace. You really can't lose at any price point right now.
 
I'll assume there would be no discernible difference in gameplay experience if the newer 216-shader GTX 260 were swapped in instead.

Nice scores from everything tested, based on their current cost in the marketplace. You really can't lose at any price point right now.

From page one....

We decided to use the original 192 stream processor GeForce GTX 260 rather than the new 216 core card for a couple of reasons. Firstly, our evaluations have proven there aren't any tangible gameplay differences between the two. Secondly, the 192 core card can be found for a very low $230, which is $45 less than the least expensive 216 core card. Putting together an original GTX 260 setup will currently cost less than a Radeon HD 4870 X2.
 
Nice review! nVidia and ATi are trading blows here when it comes to the high end. I can't wait for Far Cry 2 and Fallout 3. :)
 
A good article of the kind more sites should go for, instead of 30 pages of random benchmarks. I'm happy to see Stalker: CS performing well and looking great.

And that Crytek keeps making engines for Nvidia hardware, so if you're only going to play that, you know which cards to go for.

Hope to see a little something about FO3 and perhaps GTA4 PC when they've arrived.
 
Not that it changes much, but TG has had the 4870 X2 for $450 AR for some time now. That puts it at $50 more than GTX 260 SLI, with no SLI board needed. At that level I would have to say that the 4870 X2 takes the GTX 260 SLI here.

Too bad it scales like crap in CrossFireX. It's almost enough to make me consider SLI again; a second GTX 280 looks tempting.
 
After running the evaluation of the 4850 CrossFire, do you guys think 4850 1GB cards in CrossFire would have been better competition in these newer games? The 1GB CrossFire setup would be under $400, so I don't know how that would place it against the rest of the competition, but it seems a little more viable. What do you guys think?
 
Brent and Kyle,

Thanks for the great review.

I think this test shows that right now you cannot go wrong with any setup.

About the AA benchmarks in Crysis Warhead: I stumbled on this thread:
http://www.ngohq.com/graphic-cards/14519-ati-struggle-crysis.html
Basically it is claimed that Crysis uses a less demanding, lower quality type of AA for Nvidia than for ATI. For all I know it could be bogus, but do you know if it is true, and if so, whether this also applies to Warhead?

I guess this would mean the AA tests are not fully representative?
 
Nice article. Does the ATI hardware still suffer from the erratic frame rate drops mentioned in the original 'Clear Sky - Gameplay Perf and IQ' review, or have drivers and/or game patches alleviated that problem?
 
I have to offer an alternate opinion, and it's just that. :D

I have found my CrossFireX gaming in Warhead much improved over Crysis, unlike your review, which didn't see much of a difference at all.

Also, I can run Warhead with my CrossFireX set to all "enthusiast" and the game is very, very playable... just as you have said about GTX 280 SLI: a much more enjoyable game versus the first Crysis.

I enjoyed this article very much.
 
I'll assume there would be no discernible difference in gameplay experience if the newer 216-shader GTX 260 were swapped in instead.

Nice scores from everything tested, based on their current cost in the marketplace. You really can't lose at any price point right now.

You are correct. The Core 216 doesn't show any kind of noticeable performance increase in games... I can tell you first hand, as I've played with both cards. On a lot of review sites, the 216-shader card performs the same as the 192-shader GTX 260; sometimes it will have a 1-3 fps advantage. Best to go with a 192-shader card, as that is where the good deal is right now. A lot of idiot review sites compare overclocked versions of 216-shader cards to stock-clocked 192-shader cards; I think this is why some people think the 216-shader card is faster.
 
Looks like I made the right move in my most recent upgrade: two PNY GTX 260s ($420), an EVGA 750i FTW ($149), and a Corsair 750W ($119).

Getting it all tomorrow! Tee hee hee, can't wait!
 
Looks like I made the right move in my most recent upgrade: two PNY GTX 260s ($420), an EVGA 750i FTW ($149), and a Corsair 750W ($119).

Getting it all tomorrow! Tee hee hee, can't wait!
Wow, you sure have been active on the forums today. lol :p

Yeah, that sounds like a lot of bang for the buck you got there. I hope you have a good CPU, 4 gigs of RAM, Vista 64-bit, and a nice monitor to go along with all your new toys. ;)
 
Wow, you sure have been active on the forums today. lol :p

Yeah, that sounds like a lot of bang for the buck you got there. I hope you have a good CPU, 4 gigs of RAM, Vista 64-bit, and a nice monitor to go along with all your new toys. ;)

With my current setup, I've got Vista 64 Ult., 4 gigs of OCZ Platinum, an E8400 sitting at 3.8, and a Dell... ah, what the hell is it again... the 24-inch Dell that is 1080p.

Yeah, I've got a lot to do at work, but each project is stalled because I'm waiting on responses from third parties. :)
 
I guess this would mean the AA tests are not fully representative?

I would suggest that it has been years since they actually have been 1:1. Welcome to HardOCP 2004. ;)
 
Two questions:
1: Even if there's no difference when running solo, might there have been a difference with two of the new 260s vs. the old 260s? SLI being the point, of course.
2: On the two GTX 280s, how was performance (and the picture quality) with enthusiast shaders and no AA, instead of what you picked in the article?

And one comment: changing driver versions annoys me. The 4870 X2 gained 4 fps @ 1920x1200 in Crysis on exactly the same settings. So the results make it impossible to compare two GTX 260s vs. a single GTX 280. :(
 
I have GTX 260 SLI in 64-bit Windows. I found the DX10 performance atrocious in this game. I had to force DX9 mode.

Then again, I also went with different settings: mainstream shadows and blur, enthusiast shaders and post-processing, gamer everything else.

I really feel like the highest shaders and post-processing are worth the hit. The more vivid sun, light shafts, and detailed foliage really enhance the visuals.
 
I would think it would be worth it to go with CrossFire simply because you wouldn't need an nVidia chipset.
 
For the cost-conscious out there, there is no denying the value of GeForce GTX 260 SLI.

I wouldn't call $500 an option for a lot of cost-conscious upgraders out there, but I suppose within the context of an SLI/CrossFire article it works.

Otherwise I enjoyed the review, thanks :)
 
Great review, but I wish we could have seen a 216 SLI test or two, as was mentioned.

But again, this answered every question I could have asked but one, so GREAT review, like always.
 
And I was so happy with my GTX 280 :-(

Now I want to get a second one. ;-)

If Big Bang would only support triple monitor SLI then I would consider it.
 
I just went back and looked at the Crysis Warhead and Stalker CS Gameplay and IQ articles.

From that it looks like a single GTX 280 equals or outperforms GTX 260 SLI. Is that possible?
 
If Big Bang would only support triple monitor SLI then I would consider it.

Funny you mention that. My eventual goal is to end up with a triple-monitor setup using SoftTH. I was going to comment that it was a great read, but I wouldn't have minded seeing comparisons at 2560x1600, just to get an idea of how the results scale at higher resolutions. And that is indeed how I feel.

Brent, Kyle, thanks again for another great read.
 
From that it looks like a single GTX 280 equals or outperforms GTX 260 SLI. Is that possible?

No. The GTX 260 SLI runs 5 FPS faster than a single GTX 280 at the same settings at 19x12.

The Crytek engine is not really that scalable with SLI. And Crysis is more scalable than Warhead.
 
You made me go back and look.

I stand corrected ;)

I did not notice that the GTX 280 Stalker test was played in DX9.
 
Great article as always, and I'm glad you mentioned more than once that the value of a GTX 260 SLI setup can really only be factored in when you already have an SLI motherboard. Many of us don't like Nvidia chipset motherboards; to put it bluntly, they mostly stink. I have read some nice things about the newer one used in this article, but it costs almost as much as the two GTX 260 cards that would be used for SLI. That really removes much of its value.

You can buy a P45 chipset motherboard for $140 and under, overclock the hell out of it, and still run CrossFire if you like, or throw a 4870 X2 on it, much cheaper than it would be to buy the motherboard used in this review plus two GTX 260 cards.

I am glad that starting with X58 this SLI limitation to Nvidia chipsets is going away. It's about time.
 
Great article as always, and I'm glad you mentioned more than once that the value of a GTX 260 SLI setup can really only be factored in when you already have an SLI motherboard. Many of us don't like Nvidia chipset motherboards; to put it bluntly, they mostly stink. I have read some nice things about the newer one used in this article, but it costs almost as much as the two GTX 260 cards that would be used for SLI. That really removes much of its value.

You can buy a P45 chipset motherboard for $140 and under, overclock the hell out of it, and still run CrossFire if you like, or throw a 4870 X2 on it, much cheaper than it would be to buy the motherboard used in this review plus two GTX 260 cards.

I am glad that starting with X58 this SLI limitation to Nvidia chipsets is going away. It's about time.

I've had zero problems with my 780i, which is now around $200. It OC'ed my E8400 to 3.8 without much trouble. The 780i is NOT the 680i; it's a shame more people can't understand that. :(
 
I've had zero problems with my 780i, which is now around $200. It OC'ed my E8400 to 3.8 without much trouble. The 780i is NOT the 680i; it's a shame more people can't understand that. :(

Actually, the 780i IS a 680i with only one difference, PCIe 2.0. The fact that mobos based around the 780i might have gotten some fixes/upgrades does not change the fact that the chipset is almost exactly the same.

That's all I have to add; I'm not gonna say what I THINK about the 680i/780i mobos/chipsets.
 
Actually, the 780i IS a 680i with only one difference, PCIe 2.0. The fact that mobos based around the 780i might have gotten some fixes/upgrades does not change the fact that the chipset is almost exactly the same.

That's all I have to add; I'm not gonna say what I THINK about the 680i/780i mobos/chipsets.

Guess I must be lucky then.
 
Guess I must be lucky then.

I suppose I'm a lucky one as well.

I had the 680i and never once had a problem.

Upgraded to the 790i Ultra and had a hardware problem after two months that made me and the support staff at EVGA think it was a Bios problem. Turned out to be a bad USB header on my case.

Upgraded again to the 790i FTW w/ digital PWM and it works flawlessly.

Some people just hate everyone else no matter what:rolleyes:
 
I suppose I'm a lucky one as well.

I had the 680i and never once had a problem.

Upgraded to the 790i Ultra and had a hardware problem after two months that made me and the support staff at EVGA think it was a Bios problem. Turned out to be a bad USB header on my case.

Upgraded again to the 790i FTW w/ digital PWM and it works flawlessly.

Some people just hate everyone else no matter what:rolleyes:

Yeah, sounds about right.
 
So are we going to redo the review with Nvidia 180 Series Drivers and ATI 8.10 Hotfix/8.11 beta?
 