CJ leaks benchmarks of 5870/5850 vs. 285/295

Maybe Nvidia will stop re-releasing all their old cards under new names? ...Maybe not.

I'm not excited to see the next version of the 8800GT Nvidia will poop out.
 
Maybe Nvidia will stop re-releasing all their old cards under new names? ...Maybe not.

I'm not excited to see the next version of the 8800GT Nvidia will poop out.

Fat chance. It seems to have worked well for them so far. :rolleyes:

While the Green team faithful rush in to defend their actions as "saving on R&D costs," IMO Nvidia needs a good ass-kicking to get back into gear, because they're certainly better than these name-changing shenanigans. ;)
 
51fps at 1680 in Crysis Warhead for the 5870 is a little underwhelming, guys. That is hardly next-gen performance.

Honestly I am agnostic about which brand I go with, but frankly ATi supporters tend to cherry-pick benchmark scores, and usually the benchmarks they focus on are the least important. For example, in the comparison between the 5870 and GTX 295, the 5870 blows the doors off of consolized crap like Wolfenstein, but then loses in Crysis and only slightly wins in Warhead -- the only two REALLY shader-intense games we have. So what conclusion do we draw from that? We don't need better performance in Wolfenstein and Street Fighter, we need Crysis performance, and the 5870 just does not impress.
 
^


That's why I don't put too much faith in these numbers. They may be an indication, but I can't wait to see official reviews, where the newest drivers are also used. So far, these numbers are nice to know, but I won't judge the cards by them.


@ Blackstone

Crysis has always seemed to be AMD's problem child. Still, AMD cards can perform very well in Crysis, like the HD 4890 does. They may just need more time to optimize their drivers to reach that level of performance. It's not fair to judge the HD 5800 cards on these numbers, IMHO.
 
I like how I need to run the most powerful consumer GPU available in CrossFire to break 30 FPS at 1920x1200 in Stalker: Clear Sky.

No wonder I think it's unplayable now.
 
^


That's why I don't put too much faith in these numbers. They may be an indication, but I can't wait to see official reviews, where the newest drivers are also used. So far, these numbers are nice to know, but I won't judge the cards by them.


@ Blackstone

Crysis has always seemed to be AMD's problem child. Still, AMD cards can perform very well in Crysis, like the HD 4890 does. They may just need more time to optimize their drivers to reach that level of performance. It's not fair to judge the HD 5800 cards on these numbers, IMHO.

I would bet money the official reviews turn out much like these numbers. They are very plausible. My point is that it is one thing to bring out a DX11 card, and another thing to bring out a single card that really handles Crysis and Warhead with some authority. All they had to do to win me over was hit 60fps at 1680 in Crysis and Warhead with a single GPU, and they missed by a mile, it seems. Instead I get an extra two frames over the 295, and I give up PhysX and 3D Vision support?
 
51fps at 1680 in Crysis Warhead for the 5870 is a little underwhelming, guys. That is hardly next-gen performance.

Honestly I am agnostic about which brand I go with, but frankly ATi supporters tend to cherry-pick benchmark scores, and usually the benchmarks they focus on are the least important. For example, in the comparison between the 5870 and GTX 295, the 5870 blows the doors off of consolized crap like Wolfenstein, but then loses in Crysis and only slightly wins in Warhead -- the only two REALLY shader-intense games we have. So what conclusion do we draw from that? We don't need better performance in Wolfenstein and Street Fighter, we need Crysis performance, and the 5870 just does not impress.

The 5870 beats the GTX 295 in nearly every game at 2560x1600 8xAA. Even if those Crysis numbers end up being representative of the overall performance, the fact that it's pretty much equal to two GTX 275s is impressive in itself.
 
I guess Blackstone thought that the GT200 wasn't next-gen either, because it made Crysis go from unplayable to a little playable. And something tells me that if PhysX and 3D Vision are things to be brought up, it doesn't matter how great the 5870 performs, he wasn't switching anyways!

Anyways I'm waiting for the actual reviews. The 4870 and 4850 slides from ATI showed them just a bit faster than the 9800GTX and 8800GT respectively, and as we all know, they were more comparable to the GT200s than the G92s.
 
The 5870 beats the GTX 295 in nearly every game at 2560x1600 8xAA. Even if those Crysis numbers end up being representative of the overall performance, the fact that it's pretty much equal to two GTX 275s is impressive in itself.

Not to mention the price difference, DX11, and new features like Eyefinity all being in the HD 5870's favor. I guess some people will never be impressed.
 
Watch out now... lol

When Nvidia releases their GT300 series and it beats the 5800s, the Nvidia fans are going to believe the leaked info and buy two at launch.

I'm not saying believe everything on the Internet, but the upcoming gen of GPUs looks promising.

The Nvidia fans will go a step further and write five-egg reviews on Newegg before even taking delivery of the card.
 
Or some people just can't admit that their favorite company's rival can actually do something right.

Reminds me of AMD K8 vs. Intel P4 and people defending the P4... or those denying that Core 2 was amazing.
 
It isn't that I am not impressed. The higher resolutions and anti-aliasing would be quite useful to me. Those are crucial features. I am just wondering if the design is compromised in some way that limits its performance in Crysis.
 
Since no one has pointed this out:

Battleforge
2560x1600 - 36.6
2560x1600 4xAA 8xAF - 27.1
-26%

Crysis Warhead
1920x1200 - 43.9
1920x1200 4xAA 8xAF - 37.0
-16%

Fallout 3
2560x1600 4xAA 8xAF - 65.7
2560x1600 8xAA 16xAF - 60.7
-8%

Resident Evil 5
2560x1600 4xAA 8xAF - 61.6
2560x1600 8xAA 16xAF - 60.2
-2%

Stormrise
2560x1600 4xAA 8xAF - 33.9
2560x1600 8xAA 16xAF - 32.4
-4%

Wolfenstein
2560x1600 4xAA 8xAF - 56.1
2560x1600 8xAA 16xAF - 52.8
-6%

(Yes, these are some of the better-case scenarios.)
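
For anyone who wants to check the math, the deltas above are just the percentage change in average FPS once the heavier settings are enabled. A minimal sketch in Python (the FPS pairs are copied straight from the leaked numbers above, nothing from an official review):

# Percentage performance drop when enabling the heavier AA/AF settings.
# FPS pairs copied from the leaked numbers quoted above.
pairs = {
    "Battleforge":     (36.6, 27.1),
    "Crysis Warhead":  (43.9, 37.0),
    "Fallout 3":       (65.7, 60.7),
    "Resident Evil 5": (61.6, 60.2),
    "Stormrise":       (33.9, 32.4),
    "Wolfenstein":     (56.1, 52.8),
}

for game, (before, after) in pairs.items():
    drop = (after - before) / before * 100  # negative means FPS lost
    print(f"{game}: {drop:.0f}%")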
 
I have my doubts they would be. Nvidia usually isn't one for silence this close to release, and with a month out, the rumors would usually be much more prevalent. Look at the 58xx one month before release... the RV770 and GT200 as well.

Of course, Nvidia didn't state *what* they were releasing to coincide with Windows 7... they were, after all, coming out with some GT21x cards soon.

Remember the G80 launch?
Usually when NVIDIA is quiet, it's good.
 
Well, this topic has turned into cheering for Nvidia: go go Nvidia, c'mon, you can make the GT300 in time, lol. Calm down, no need for fanaticism; we are talking about the 5870 now.
 
Nice performance boost.
The OEMs have to love the load/idle power of these.
 
I looked at the first image... it's impressive.

Then the other image just blew my socks off!
 
Nice performance boost.
The OEMs have to love the load/idle power of these.

I think this is one of the major factors people are overlooking. The power draw of these cards is extremely lovely.
 
It isn't that I am not impressed. The higher resolutions and anti-aliasing would be quite useful to me. Those are crucial features. I am just wondering if the design is compromised in some way that limits its performance in Crysis.

There are too many variables to suggest that the design is compromised. For one, how much influence did Nvidia have on the design of Crysis? That could mean the game was more optimized for Nvidia hardware, which isn't any indication of ATI's prowess. As an example, look at when the R600 and G80 battled... Lost Planet was decidedly Nvidia-favored, while Call of Juarez was decidedly ATI-favored.

As far as Crysis being the most shader-intensive game... sure. But does that indicate where the gaming industry will go? Keep in mind that CryEngine 2 was designed with some assumptions about where technology would go - it just happened to be at a time when technology was in flux. DX9 -> DX10 promised to be a big jump (it turned out to be a performance dud), the move to unified shader architecture, etc.

Benchmarks suggest that DX10.1 offers performance improvements over DX10, and that DX11 will make sure there is no repeat of the DX9 -> DX10 performance debacle. Who's to say that Crysis wouldn't run extremely well if DX10.1 had been around from the start, rather than DX10? The few games that run on DX10.1 certainly show that there are performance improvements to be had.

And I'll give you this piece: CryEngine 3 was demo'd at the AMD Eyefinity event, running across three monitors. You can see the YouTube video yourself. The engine already looks amazing (granted, we don't know its details, etc.), and it was being powered across three monitors (and since supposedly CrossFire doesn't work with Eyefinity yet, it was one card powering it). So again, who's to say that DX11 doesn't improve performance of an even more intense engine? Or who's to say that Crytek isn't working with ATI this round and the engine won't be more optimized for ATI? Surely, demo'ing it at an ATI event is a big statement in itself, which at the very least says "hey, this runs across 3 monitors in DX11 on this new card of theirs!"

So again, there are too many variables to suggest that the design is "compromised" in any way, shape, or form...

Remember the G80 launch?
Usually when NVIDIA is quiet, it's good.

G80 was just one time. Are we forgetting that Nvidia had the GT21x cards on the roadmap, and we haven't heard a damn thing... well wait, the G210 (GT218) card was just reviewed, and it gets outperformed by a 55nm card that draws less power and has been out for nearly a year. Or how about the NV30 release?

Anyways, you're forgetting that Nvidia came out with the G80 first, and a month before its release was the only time we really got a glimpse of its specs. Likewise, the 58xx's details were unknown until a month prior. Same for the RV770 (remember the 480 SPs vs. 800 SPs argument? Oh, how wrong some people were...).

Of course, this only suggests that the GT300 isn't a month out from release yet ;)

Anyways, while we're on the topic of history... history hasn't been kind to the guys who come out late. When one IHV has a lead to market on a new generation, the first to market typically outperforms (or at least matches) the card that comes out later.

G80 vs. R600. R300 vs. NV30. NV10 vs. R100. NV20 vs. R200, etc. Obviously, there are always exceptions to the rule, but that's how I see it.
 
I think this is one of the major factors people are overlooking. The power draw of these cards is extremely lovely.
Increased performance and less power usage. Hats off to AMD.
This will make it an easier choice for those who won't have to upgrade their PSU. :)
 
Eh... for that performance, if it's true, a $400 starting price isn't bad at all.

But if it IS $300 or close to it like it shows on that chart... I'm getting two.
 
Honestly, I wish they would sell some versions of the cards for less without it. Yeah, I have three monitors, but I have no intention of hooking them all to one computer. It costs too much trying to keep up with the GPU game at normal resolutions; no way could I afford to keep up at triple resolutions. 42" is enough for me.

I highly doubt that removing the third DisplayPort output and dropping it down to two outputs would save much money at all. You'd probably be able to get the card for only like $2 cheaper without that third DisplayPort, lol.
 
I knew there would be some people disappointed. Funny how a single card with GTX 295 performance is not fast enough... please!!
 
Very impressive. However, I do not have any games that need this type of power. I want some new games that are more demanding so I can rationalize buying one.
 
Hmm, pretty nice. I'll definitely be interested in the official benchmarks soon, but this leaked info is pretty promising. I'm especially surprised at the 5850 pretty much matching or slightly beating the GTX 285, sort of crazy that my GTX 280 is about to be beaten by a mid-range part already. The 5870 is hanging with the GTX 295 too, but not exactly beating it (very competitive though.)

Hmm, sell off my current card or get another one on the cheap to do SLI, hmm.
 
I knew there would be some people disappointed. Funny how a single card with GTX 295 performance is not fast enough... please!!
Leave them to their sorrow :p Just joking. Seriously, people are high; this card is actually faster than the GTX 295 at like half the power usage, with DX11 for the future and that Eyefinity stuff on top, and still people add some weird comments :rolleyes:
 
Looking promising; sign me up for a 5870 or maybe even the X2.

Nvidia was caught with their pants down this generation; I've heard they are telling OEMs just that. The rest of the year is AMD's playground.
 
I can't figure out which one I want, the 70 or the 50... I wonder if my system justifies getting the higher-end card.
 
I can't figure out which one I want, the 70 or the 50... I wonder if my system justifies getting the higher-end card.

I'm getting the 70, methinks, just to avoid having to CrossFire down the road. I have an i7, but I can't justify multi-GPU setups quite yet in my life (still in college... etc etc...)

Edit: Avoid having to CrossFire *shortly* down the road. :) Graduation isn't far off ;)
 
The benchmarks look good. Will have to wait for more benchmarks and more information. I wonder what NVIDIA's got behind their curtains...



ATI is more cost effective than NVIDIA.

* ATI delivers more bang per nanometer and per buck than NVIDIA.

* ATI draws less wattage (idle and load) than NVIDIA.

* ATI can match or exceed NVIDIA's latest card, even if only by a small amount.

Don't these three points go to show inefficiency on NVIDIA's side? Out of laziness and marketing strategy, ATI has poorer drivers -- but even then, their cards are competitive against NVIDIA. Just imagine the performance ATI's cards could produce if ATI matched NVIDIA's driver quality.

NVIDIA burns more gas just to get a little bit ahead, while ATI puts in the effort to match that extra distance without burning as much gas unnecessarily.
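
To put "more bang per watt and per buck" in concrete terms, this is the kind of back-of-the-envelope math I mean. A minimal sketch; the card names and every number below are made up for illustration, not taken from any review:

# Performance-per-watt and performance-per-dollar, the two efficiency
# metrics argued above. All figures are hypothetical placeholders;
# substitute real average-FPS, load-power, and price numbers yourself.
cards = {
    # name: (average FPS, load watts, price in USD)
    "Card A": (60.0, 190.0, 400.0),
    "Card B": (62.0, 290.0, 500.0),
}

for name, (fps, watts, price) in cards.items():
    print(f"{name}: {fps / watts:.3f} fps/W, {fps / price:.3f} fps/$")

By that measure, a card that is only slightly slower but draws far less power and costs less comes out ahead on both ratios, which is the whole point.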



I do not write this post from a "fanboy" perspective. I will choose whichever card is most cost-effective between ATI and NVIDIA; efficiency is a plus. Money doesn't grow on trees, ya know.
 