CJ leaks benchmarks of 5870/5850 vs. 285/295

And besides, I remember Crysis is far more optimized for nVidia cards than ATi.

As in:
- I have looked over the code and it is built to favour NVIDIA architecture?
- It's a TWIMTBP game, so it must run better on NVIDIA architecture?
- I read a lot of FUD about Crysis being "unoptimized" and this must apply to AMD's architecture?
- I don't have a clue, but it sounded cool?
 
No, every benchmark I have seen, even this one, shows an advantage for nVidia cards.

When you optimize game code (I don't do game programming, not on that scale, anyway), you have to lean in one direction or the other.
 
It's more apparent in Warhead, which the developers stated was better 'optimized.'
Crysis is just a big game requiring a lot from any card.

Warhead obviously demands less from nVidia cards (as the benchmark results show).
 
As in:
- I have looked over the code and it is built to favour NVIDIA architecture?
- It's a TWIMTBP game, so it must run better on NVIDIA architecture?
- I read a lot of FUD about Crysis being "unoptimized" and this must apply to AMD's architecture?
- I don't have a clue, but it sounded cool?


That... was a really stupid comment.

1. Really... who actually works for Crytek here? You (should, but don't) know better than that.
2. No clue about that fail of an acronym.
3. What is 'FUD'? I just know the Crysis demo runs a lot better on my 9800GT than it did on a 4830 I ditched...
That AND even the newest and best of ATi's new cards cannot beat an older nVidia card at this...
4. Are you sure you are not projecting?
 
That... was a really stupid comment.

1. Really... who actually works for Crytek here? You (should, but don't) know better than that.
2. No clue about that fail of an acronym.
3. What is 'FUD'? I just know the Crysis demo runs a lot better on my 9800GT than it did on a 4830 I ditched...
That AND even the newest and best of ATi's new cards cannot beat an older nVidia card at this...
4. Are you sure you are not projecting?

The Crysis demo is shit: a shitload of bugs and very un-optimized.

Try out the real Crysis and you will see ATI actually outperform nVidia on Very High settings (excluding the benchmark tool).
 
The Crysis demo is shit: a shitload of bugs and very un-optimized.

Try out the real Crysis and you will see ATI actually outperform nVidia on Very High settings (excluding the benchmark tool).

I'll do that. Thank you!
 
As in:
- I have looked over the code and it is built to favour NVIDIA architecture?
- It's a TWIMTBP game, so it must run better on NVIDIA architecture?
- I read a lot of FUD about Crysis being "unoptimized" and this must apply to AMD's architecture?
- I don't have a clue, but it sounded cool?

I suppose looking over the numbers for about a dozen games, seeing a certain pattern forming and noticing one game that goes against said pattern is a concept lost on some people.
 
So I guess the whole AMD/ATI purchase is actually going to make them money now. AMD is getting slaughtered in the processor market.
 
I suppose looking over the numbers for about a dozen games, seeing a certain pattern forming and noticing one game that goes against said pattern is a concept lost on some people.

Well that and the fact that Nvidia and Crytek basically talked about their immense work together back before Crysis was released ;)

Of course, things change all the time. Heck, CryEngine 3 was shown at the ATI event, so who knows if the next round it becomes ATI optimized
 
[attached image: 58it1.png]


cf vs sli
 
I'd expect the Crysis numbers to improve drastically as newer drivers are released... we all saw how much the 4870 and 4890 improved in Crysis with every driver release ATI came out with... but the CF numbers look very good compared to the GTX 295. Now come on ATI, bring out some drivers that are worth a damn for once!
 
ATi lost my business with their failure of driver support for 4850 CF, which took until about 6 months after release to sort out. I have no reason to believe the 58xx series will be any different. Maybe by January these cards will actually work the way ATi promises. Maybe.
 
Yep, the 5870 rocks in Crossfire! Personally I think the drop in performance is a driver issue, not actual horsepower or anything. That being said, I'm expecting at least a 15% boost in Crysis performance in the next months, and a lower drop in performance too. I remember when I bought my 295 and installed it with the drivers from the CD; the FPS difference in Crysis between that driver and the latest was like 25%.
 
Sweet... hope they have some nice watercooled 5870 parts. Now bring on some bloody games that are like WoW and pretty like Crysis.
 
I look forward to the possibility of switching to the red side later this month!
 
With these types of numbers, would a Q6600 @ 3.2GHz bottleneck a 5870x2?
 
Well, I'm fucked for Crossfire, but a 5870 would be a nice upgrade from my 9800GX2 POS.

Ciao nvidia!
 
Oh... I thought the evga board was exclusive to Nvidia cards (what with the missing Crossfire bridges, the branding on the box, and the mention of support in the manual)... this is good news indeed!

They do mention it... in very small writing... separate from the SLI part.
 
With these types of numbers, would a Q6600 @ 3.2GHz bottleneck a 5870x2?

At 19xx x 1200 resolution and up, vs. say an i7 at 4.0GHz, sure, it will cost a few frames and then some, but not enough to rush out and get a new i7/i5. The higher the resolution, the more the GPU becomes the bottleneck. The Q6600 still has some miles left in it, even for a multi-GPU solution.

I think a good time to upgrade the CPU for a 5870x2 will be around the Gulftown release, if you really feel the need to upgrade; that's what I'm waiting for.
 
Oh... I thought the evga board was exclusive to Nvidia cards (what with the missing Crossfire bridges, the branding on the box, and the mention of support in the manual)... this is good news indeed!

The Crossfire bridges always come with the cards themselves. Actually, if you go look at the evga forums you can find some pics of people using Crossfire. Since evga doesn't make ATI cards, they don't advertise it too much... ;)

That was the whole reason I went 1366 i7. The choice of either one!
 
With these types of numbers, would a Q6600 @ 3.2GHz bottleneck a 5870x2?

Yes it will, because the power of a 5870x2 is like 2x a GTX 295, maybe even more, so you can imagine how much CPU these babies need to run decently ;)

IMO, try to OC your proc to at least 3.6.
 
I'm interested in seeing the 5870x2; I just hope they don't throw in the "Sideport" chip again, boast about what it can do, and then never enable it.

What also sucks is that I will have to invest in another waterblock for the card. Maybe I will hold out a little longer with my 4870x2.
 
Until they are here and [H] has done a review... they might as well not exist. Hoping for great results though. My GTX 280 has served me well, but newer and better FPS ftw!
 
Good numbers, if the price is right. But I'm still gonna hang on to my 4870 Crossfire setup for quite a while, as the titles I'm really looking forward to (Diablo III and StarCraft II) will most likely be playable on my current setup without any effort.

Anyway, it seems the question will finally be answered: "Can it play Crysis?" lol
 
My 4850 is being stretched thin in a few games at 19x12, so the 5870 will be a worthwhile upgrade for me. I do plan on waiting for prices and supply to settle a bit before jumping on it though.
 
I think I might wait for prices to drop a bit and get the 5870. I think my 4850 Crossfire setup will hold me over for MW2 and OFP: DR.
 
Please elaborate.

I can answer that. I can't find the post on Beyond3D, but it was basically one of the ATI guys on there (probably Dave Baumann) who wrote about how games are optimized with respect to drivers.

There are basically seven shades of optimization. At the extreme ends, you are fully optimized for either Nvidia or ATI hardware, so performance on that vendor's hardware is great while the other side's is awful.

Dead center, you code with respect to the drivers of both sides, but this means the game isn't as efficient as it could be if it were built entirely around one hardware architecture.

And then there are the different shades in between.

An example of the extremes would be Lost Planet and Call of Juarez. Lost Planet ran significantly faster on Nvidia hardware; Call of Juarez ran significantly faster on ATI hardware (to the point that the 2900XT even beat the G80s at it). Nvidia worked closely with LP, and COJ was developed closely with ATI, so it's no surprise what happened.
 
I have been looking on Tom's at how the 285 and other cards perform at various settings.

Here's Fallout 3 at 1920x1200, 4xAA/8xAF, running at 78 FPS:
http://www.tomshardware.com/charts/...-charts-2009-high-quality/Fallout-3,1317.html

The chart posted on the first page shows 79. Same numbers.

The Left 4 Dead numbers don't match up as well (the 285 is listed at 80 when the chart says 100).
Has anyone else checked how various games match what the chart says they should be?
 
I have been looking on Tom's at how the 285 and other cards perform at various settings.

Here's Fallout 3 at 1920x1200, 4xAA/8xAF, running at 78 FPS:
http://www.tomshardware.com/charts/...-charts-2009-high-quality/Fallout-3,1317.html

The chart posted on the first page shows 79. Same numbers.

The Left 4 Dead numbers don't match up as well (the 285 is listed at 80 when the chart says 100).
Has anyone else checked how various games match what the chart says they should be?

Don't look at Tom's; their data isn't legit at all...
 
I have been looking on Tom's at how the 285 and other cards perform at various settings.

Here's Fallout 3 at 1920x1200, 4xAA/8xAF, running at 78 FPS:
http://www.tomshardware.com/charts/...-charts-2009-high-quality/Fallout-3,1317.html

The chart posted on the first page shows 79. Same numbers.

The Left 4 Dead numbers don't match up as well (the 285 is listed at 80 when the chart says 100).
Has anyone else checked how various games match what the chart says they should be?

Driver version, platform, CPU muscle and just plain variance are additional factors to take into consideration. Performance numbers rarely match up across different reviews for these reasons.
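As a quick sanity check on cross-review comparisons like the ones above, you can express the gap between two reported FPS figures as a relative difference; this is just a hypothetical helper (the threshold and the idea of "run-to-run noise" are my assumptions, not from any review's methodology), using the numbers quoted in this thread:

```python
def fps_discrepancy(fps_a: float, fps_b: float) -> float:
    """Relative difference between two reported FPS figures,
    expressed as a fraction of the larger one."""
    hi, lo = max(fps_a, fps_b), min(fps_a, fps_b)
    return (hi - lo) / hi

# Fallout 3 @ 1920x1200: 78 vs. 79 FPS -- about 1%, plausibly just variance.
print(round(fps_discrepancy(78, 79), 3))   # 0.013
# Left 4 Dead: 80 vs. 100 FPS -- a 20% gap, too big for variance alone;
# driver version, platform, or settings likely differ between the tests.
print(round(fps_discrepancy(80, 100), 3))  # 0.2
```

A difference of a few percent is normal between runs; a 20% gap is a sign the two tests weren't really comparable in the first place.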
 