GTX 280 vs 5870: Confusing Results

Xorphious · n00b · Joined: Aug 21, 2009 · Messages: 17
I've a couple of questions and some comparison test results, and would love responses from anyone on the [H] team regarding the following.

This is a little long, and not meant to "knock" the 5870, but instead to compare results that I personally find a little confusing with the supposed performance of the 5870.

Keep something in mind: I have never gotten the same results with any hardware as others either boast or complain about over the years; all the "hype" about what people can/cannot run on their set-ups. I could never understand why, but it's something to keep in mind when reading this.

First, the questions:

- How is the noise with the 5870?

I think it was [H]'s Steve I saw state that the last few gens of ATi's GPUs are almost intolerably loud.

- Regarding temps/heat... doesn't 88C under load seem a bit hot, which could affect the longevity of the GPU?

Under load, my GTX 280 never hits above 80C (generally around 60-65C), and that's on Crysis: Warhead maxed (VRM hits 80C), on Enthusiast with 4xAA.

Now, in terms of performance...

It would be assumed that the 5870 would pound the GTX 280 into dust, but there are some confusing numbers I've gotten in benching, almost comparable to the 5870, that put me "on the fence" about moving from the GTX 280 to the 5870.

Keep in mind, these results are the types that I've heard countless people many times state they cannot get and/or are "impossible", but I've never gotten the same "problems"/results as others, so I'm not sure what the deal is.

I'm a graphics whore, so I run everything maxed, all settings on every game.

Rig: GTX 280, E8600, 2GB RAM, ASUS Rampage Formula

Crysis: Warhead (Train Level - start to finish)
Settings: Enthusiast, 4xAA, 16xAF, 1920x1200 res

Min: 8
Max: 23
Avg: 17.9

(5870 - [H] results)
Min: 12
Max: 37
Avg: 26.6

Only a 4 fps difference min, and 8 fps difference avg?

These results are close to the 5870, and that's with me running 4xAA... [H] tested @ 2xAA... does not make sense.

The min, max and average are, on average, only around 4-14 fps more with the 5870 over the GTX 280, at best. I'm not seeing a huge difference here, though I would have expected to.

Even in the places where the fps drops really low in Crysis: Warhead, it's not unplayable; a bit choppy, which I'm sure would be a little better with the 5870, but not unplayable.

Sure, the 5870 is faster at the same high settings (except I run at 4xAA, which is higher than [H]'s testing), but not always by a margin that's enough to make a huge difference, comparing the numbers.

Here's an example of a couple other games (again, all in-game settings maxed):

L4D
Settings: 8xAA, 16xAF, 1920x1200 res
Min: 42
Max: 63
Avg: 57.350

TF2
Settings: 8xAA, 16xAF, 1920x1200 res
Min: 50
Max: 62
Avg: 59.343

Fallout 3
Settings: 8xAA, 16xAF, 1920x1200 res
Min: 50
Max: 61
Avg: 59.183

I mean, how much better can you really get than this, that would make any difference in gameplay or visuals? I avg almost 60 fps in every game, with everything maxed.
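For context, here's a minimal sketch (with hypothetical frame-time numbers, not my actual capture) of how min/max/avg figures like the ones above are typically derived from per-frame timings:

```python
# Hypothetical per-frame render times in milliseconds (made-up, not real benchmark data).
frame_times_ms = [16.7, 15.9, 18.2, 20.1, 16.4, 17.0, 23.8, 16.1]

# Instantaneous FPS for each frame.
fps_samples = [1000.0 / ft for ft in frame_times_ms]

# Average FPS is usually total frames / total time,
# not the mean of the instantaneous FPS values.
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

print(f"Min: {min(fps_samples):.1f}")
print(f"Max: {max(fps_samples):.1f}")
print(f"Avg: {avg_fps:.1f}")
```

The min is driven by the single slowest frame, which is why one brief hitch can tank the "Min" number even when the average looks fine.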

I'm really just trying to get a feel for the situation, because I've always found that there's a lot more "hype" behind people stating what they can/cannot run on their systems, which I've always found to be different for me. Somehow, I'm always just "magically" able to run faster? I don't know... maybe people exaggerate things too much, too often... but these are accurate and true results on my end.

So, the question is: looking at these results, is it really worth the move from the GTX 280 to the 5870, based on what performance I get with the GTX 280 and my rig?

Note: there is no "flaming" here. I'm not "for" or "against" either brand. I run whatever runs best on the market at any given time. I'm simply trying to figure out why my results show something that seems drastically different from what would be expected between the GTX 280 and the 5870.
 
My results with HD4770s in Xfire are much higher than any reviews I have seen.

It will take some time and tweaking by many to figure out how great the HD5000 series perform.

It's still early.


When the HD4890 first came out I stated that if 2 of them in Xfire couldn't score 30,000+ in 3dmark06 within 30 days that they were a failure. Sure enough within about 30 days there were hard core tweakers working on them and scoring 30,000 pts in 3dmark06 with 2 HD4890s in Xfire.


I think a single HD5870 tweaked will hit 26,000+ pts in 3dMark06 soon and 2 HD5870s in Xfire will hit 37,000+ points in 3dmark06 soon as well.

Once the Intel 32nm Gulftowns are available, these cards are really going to fly.
 
Are you saying that you've always experienced the same thing, getting higher results with your GPUs than others state/suggest is possible? Just not clear on the wording, but it could be my tired brain at the moment.

I never really use 3D Mark as a test. I prefer real-world gameplay, but I get what you're saying.

The confusing thing here is that there is such a drastic difference... meaning, the 5870 even without tweaking is showing to basically equal a GTX 295, which should crush a single GTX 280. However, my benches show something completely different, and this is far from the first time I've seen this happen.

When I used to run the x1900xtx, there were results that people said would never be possible in terms of performance that I managed to meet and/or surpass. I don't OC anything either.

From all my benching, I'm basically running at what is supposed to be the new uber-end GPU, such as the 5870, and that just does not make sense.

How could I be running roughly the same as the 5870?

Whether it's still early for the 5870 or not, if it's supposed to be on-par with the GTX 295, yet I'm running close to it already, there's definitely some massive confusion here, lol.

How could I get truly any better results than already being able to max-out every game I play, if I'm not running as far behind the 5870 as would be expected?

Man, I don't know... I find this confusing, as I always have, since I always seem to get different results than many.
 
The fact is that with so many different possible hardware and software configurations, you can't compare your results directly to a published review. Even small things such as driver settings (mip-map quality, high quality vs. standard AF, and many more) will throw off the numbers.
 
The fact is that with so many different possible hardware and software configurations, you can't compare your results directly to a published review. Even small things such as driver settings (mip-map quality, high quality vs. standard AF, and many more) will throw off the numbers.

That's very true... except for the fact that, such as with [H], they're running all top-end hardware. I don't think my E8600 or 2 GB of RAM is equaling any more than what they're running, lol.

Driver settings I don't think would have that much of an impact in comparison between a supposedly much higher-end GPU than even the GTX 280, and everything I run is always at the highest quality I can manage, which is probably comparable (easily) to what someone such as [H] runs/tests on.

Either way, driver/other settings or not, when running high-end (on my end) vs high-end (on their end) hardware, the 5870 should blow the GTX 280 out of the water, because it's all-around a higher-end GPU. I'm not seeing that, and I find that very confusing.
 
It very well could be that your CPU is not allowing the 5870 to fully stretch its legs. Games are now starting to take advantage of the extra cores in quads, so it may be making a difference. I'm not going to pretend that I know that's the situation for sure, but it's just a possibility to explain your confusing results. Also, would V-Sync be on? Limiting the highs on the FPS will limit the AVG.
 
Crysis: Warhead (Train Level - start to finish)
Settings: Enthusiast, 4xAA, 16xAF, 1920x1200 res

Min: 8
Max: 23
Avg: 17.9

(5870 - [H] results)
Min: 12
Max: 37
Avg: 26.6

Only a 4 fps difference min, and 8 fps difference avg?

You say "only", but 8 fps difference in average is 50% improvement. It is also the difference between playable and not. Look at the bottom of the [H] review and you'll see Apples to Apples for Crysis Warhead, with the settings being the same that you run. The 5870 commands a pretty solid lead over the GTX 280.

Here's an example of a couple other games (again, all in-game settings maxed):

L4D
Settings: 8xAA, 16xAF, 1920x1200 res
Min: 42
Max: 63
Avg: 57.350

TF2
Settings: 8xAA, 16xAF, 1920x1200 res
Min: 50
Max: 62
Avg: 59.343

Fallout 3
Settings: 8xAA, 16xAF, 1920x1200 res
Min: 50
Max: 61
Avg: 59.183

I mean, how much better can you really get than this, that would make any difference in gameplay or visuals? I avg almost 60 fps in every game, with everything maxed.

Not to burst your bubble, but a 4850 can max all those games and still be playable. If those are the games you play, then continue to be happy with your GTX 280. If it does what you need, don't bother upgrading.
 
I also have a GTX 280 and don't think I will be upgrading to these new ATI cards. It seems the 5870 is faster than the GTX 280 by around the same amount the 280 was faster than the 4890, and it's not really that noticeable a performance difference for me.

As an aside, how are you not smashing into the limits of your 2GB of RAM? I would think you would see a hefty performance increase with at least 4GB, given that some of these games can take 2GB on their own.
 
I also have a GTX 280 and don't think I will be upgrading to these new ATI cards. It seems the 5870 is faster than the GTX 280 by around the same amount the 280 was faster than the 4890, and it's not really that noticeable a performance difference for me.

As an aside, how are you not smashing into the limits of your 2GB of RAM? I would think you would see a hefty performance increase with at least 4GB, given that some of these games can take 2GB on their own.

+1. I use around 2.6GB with simple multitasking.
 
You say "only", but 8 fps difference in average is 50% improvement. It is also the difference between playable and not. Look at the bottom of the [H] review and you'll see Apples to Apples for Crysis Warhead, with the settings being the same that you run. The 5870 commands a pretty solid lead over the GTX 280.

Not to burst your bubble, but a 4850 can max all those games and still be playable. If those are the games you play, then continue to be happy with your GTX 280. If it does what you need, don't bother upgrading.

Well, I would not say that "bursts any bubble", since I'm not into comparing in that manner. I have no "envy" that another GPU can run the same numbers, lol ;)

Yes, the GTX 280 is amazing in its performance and, for the most part, performs perfectly for me.

I suppose the reason for consideration of switching is for a couple of reasons:
- it never hurts to have a bit more fps at the highest settings possible
- ATi's color has always looked a bit better to me over the years in comparison to Nvidia
- ATi's AA/AF has always been better than Nvidia's
...these things learned/noticed from many years of switching back-and-forth between brands.

In terms of "50% more performance", it's not, really. Just because the min fps is double, it's still only 8 fps, and not every other number is doubled, not even close. The max and average are pretty close to what I'm getting with the GTX 280, which does not make much sense to me.

In the apples-to-apples section (which I looked over, so thank you) I'm actually getting the same exact performance as the 5870 (which beat-out the 285 by far?)... should that not be the case, if the 5870 is equal/close to the GTX 295?

Apples-to-apples results:
me: max 35 fps, average 24.9
[H]: max 37 fps, average 25.9

Yet, they state that the 5870 basically "trounces" a GTX 280...?
I'm very confused by that statement (I even asked in the main review thread, and that was the response I received).

I also have a GTX 280 and don't think I will be upgrading to these new ATI cards. It seems the 5870 is faster than the gtx280 around the same amount the 280 was faster than the 4890, and its not really that noticable a performance difference for me.

As an aside, how are you not smashing into the limits of your 2gb of ram?? I would think you would see a hefty performance increase with at least 4 given that some of these games can take 2 on their own

I have no idea, but I've never had any problems running at only 2GB. Not many games take huge advantage of more RAM, so perhaps that is why. I'm sure some is used, but just like quad-cores, there are not many games that really utilize it all.

Either way, up until today I've been moving on the assumption that the 5870 did indeed "trounce" the GTX 280, but in comparing my benches from last night and today, I was shocked, and am certainly not clear on how much more of a performance boost the 5870 is going to offer over what I'm already running.

Perhaps the only advantage being better AA/AF than Nvidia, as I stated above, but if this thing is only performing slightly faster than a GTX 280, then better AA/AF or not, it's not worth the $400 to move from one GPU to the other... unless somehow my numbers are way off.

P.S.
I'm not "nitpicking" here or "busting chops", I'm just totally confused as to these results.
 
I promise I will drop it after this, but being so shocked at your RAM, just let me say: "think about how much closer the cards would perform if you eliminated the RAM handicap with $30 of extra RAM!" I remember reading guidance about an ideal minimum 4-to-1 system-to-GPU RAM ratio, and I hate to see your card held back.
 
If I were you I would upgrade to 4GB of RAM. That would make sure you have plenty of RAM for any game you play.
 
Yes, Crysis still is a challenge even for the latest video card on the market. I won't dispute that. But so what, it challenges even dual GPU cards like the GTX 295 and the 4870x2. The 5870 is the best single GPU card at this game. But Crysis is less than 1% of the games out there on the market.

Let's go ahead and compare to some of the games you mentioned.

Here's an example of a couple other games (again, all in-game settings maxed):

L4D
Settings: 8xAA, 16xAF, 1920x1200 res
Min: 42
Max: 63
Avg: 57.350

5870 1GB
113.59 AVG
71 Min
Avg Margin + 56 FPS

Source: Hardwarecanucks benchmarks
http://www.hardwarecanucks.com/foru...sapphire-radeon-hd-5870-1gb-gddr5-review.html

Fallout 3
Settings: 8xAA, 16xAF, 1920x1200 res
Min: 50
Max: 61
Avg: 59.183

5870 1GB
77.21 AVG
54 Min
Avg Margin +18 FPS

Source: Hardware Canucks benchmarks
http://www.hardwarecanucks.com/foru...sapphire-radeon-hd-5870-1gb-gddr5-review.html

I couldn't find a benchmark for TF2, but considering that it uses the dated Source Engine, I'd predict that the 5870 probably does it at over 100FPS as well. So are 18 FPS, 40 FPS and 56 FPS a considerable advantage over what your GTX 280 can do? I'd say yes. Do you need these huge framerates that go over 60? Only you can say. But your premise is rather flawed. Aside from Crysis, there are lots of instances where the 5870 1GB flat-out beats the GTX 285 by 20+ FPS. Ultimately, though, it's up to the consumer to decide if they want to spend money on the extra performance.

If you're satisfied with your 280, good for you. It'll be fine for DX9 and DX10.1 games, but it won't do DX11 tessellation, DirectCompute, Eyefinity, or any of those other features that the 5870 brings to the table, besides the increase in framerates.

Handy list of 5870 reviews/benchmarks:
http://www.tomshardware.com/forum/270690-33-hd5870-hd5850-reviews-discussion
 
I understand your point, but I've not seen anyone "handicapped" by using 2GB of RAM with a high-end rig in gaming. I've compared my system directly to that of personal friends running GTX 280's and 4GB+ of RAM, and somehow my performance was still the same. Couldn't tell ya...

I don't run a 64-bit OS due to too many issues with other apps I use, so it's a moot point for me anyway.

Either way, I don't want this to turn into a discussion regarding RAM ;)
 
Yes, Crysis still is a challenge even for the latest video card on the market. I won't dispute that. But so what, it challenges even dual GPU cards like the GTX 295 and the 4870x2. The 5870 is the best single GPU card at this game. But Crysis is less than 1% of the games out there on the market.

Let's go ahead and compare to some of the games you mentioned.



5870 1GB
113.59 AVG
71 Min
Avg Margin + 56 FPS

Source: Hardwarecanucks benchmarks
http://www.hardwarecanucks.com/foru...sapphire-radeon-hd-5870-1gb-gddr5-review.html



5870 1GB
77.21 AVG
54 Min
Avg Margin +18 FPS

Source: Hardware Canucks benchmarks
http://www.hardwarecanucks.com/foru...sapphire-radeon-hd-5870-1gb-gddr5-review.html

I couldn't find a benchmark for TF2, but considering that it uses the dated Source Engine, I'd predict that the 5870 probably does it at over 100FPS as well. So are 18 FPS, 40 FPS and 56 FPS a considerable advantage over what your GTX 280 can do? I'd say yes. Do you need these huge framerates that go over 60? Only you can say. But your premise is rather flawed. Aside from Crysis, there are lots of instances where the 5870 1GB flat-out beats the GTX 285 by 20+ FPS. Ultimately, though, it's up to the consumer to decide if they want to spend money on the extra performance.

Absolutely, but I should probably have mentioned... I always use vsync because tearing drives me nuts. So, of course there are going to be some numbers that are a huge leap over what I posted.

There's no need for anything over 60 fps. The human eye can't even make out the difference, so I'm not concerned about that.

If there's a solid 20+ fps difference between the 280 and 5870 in most games with everything maxed, considering I feel ATi's AA/AF and colors are generally better, then yeah, I'd say it's worth the move.

I'm assuming that [H] has vsync on in their testing, since their numbers are so close to mine, but if that's the case, then again, we're seeing an accurate difference between the 280 and 5870 which, in what I'm pointing out, is not that big a difference.

These other reviews/benches obviously are not tested with vsync on, so I can't really make a comparison when they're hitting 100+ fps, lol. So, I'm sticking with comparisons between my results and [H]'s results.
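To illustrate the v-sync point with made-up numbers (just a sketch, not measurements from either card): capping frames at 60, as v-sync on a 60Hz display does, clips the highs and drags the average down, which can hide how much faster an uncapped card really is.

```python
# Hypothetical uncapped frame rates a faster card might render (not real data).
uncapped = [45, 72, 88, 95, 60, 110, 53, 81]

# With v-sync on a 60Hz display, nothing above 60 FPS gets through.
capped = [min(f, 60) for f in uncapped]

avg = lambda xs: sum(xs) / len(xs)
print(f"Uncapped avg: {avg(uncapped):.2f}")
print(f"Capped avg:   {avg(capped):.2f}")
```

Note that the minimums are untouched by the cap; only the max and average get compressed, so two cards of very different speed can post similar v-synced averages.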

Either way, even in some of those other test results, there's only a small fps difference in all the games tested (around 10 fps) in both min and average.
For example:

Fallout 3
5870
avg: 84
min: 61

GTX 285
avg: 75
min: 46

http://www.hardwarecanucks.com/foru...phire-radeon-hd-5870-1gb-gddr5-review-14.html

Roughly an average of 15 fps difference across the board. Good, but not a heck of a lot.
The min difference though... have to say that's where it counts, and that certainly seems a lot higher.

But again, my comparison with [H]'s results are different... the 280 and 5870 seem extremely close by their results.
 
Rig: GTX 280, E8600, 2GB RAM, ASUS Rampage Formula

Crysis: Warhead (Train Level - start to finish)
Settings: Enthusiast, 4xAA, 16xAF, 1920x1200 res

Min: 8
Max: 23
Avg: 17.9

(5870 - [H] results)
Min: 12
Max: 37
Avg: 26.6

Only a 4 fps difference min, and 8 fps difference avg?

These results are close to the 5870, and that's with me running 4xAA... [H] tested @ 2xAA... does not make sense.

TWIMTBP?
 

TWIMTBP means nothing but a logo...

Anyway, this data actually looks pretty good; I have no idea why you're still confused about the results...

BTW, the Source Engine never gives me screen tearing. I always have VSYNC off; don't know why you have it on.
 
TWIMTBP means nothing but a logo...
Heh, what an unfounded statement. This phenomenon has become almost routine for hardware reviewers, and has been mentioned in many reviews, and on many occasions (ATI and NVIDIA have both been the beneficiaries). I would appreciate at least some support for such a flippant claim.

If I light a match and toss it on 10 straw houses, and 9 of them burn down, do you question the matches or the single remaining house?
 
Heh, what an unfounded statement. This phenomenon has become almost routine for hardware reviewers, and has been mentioned in many reviews, and on many occasions (ATI and NVIDIA have both been the beneficiaries). I would appreciate at least some support for such a flippant claim.

If I light a match and toss it on 10 straw houses, and 9 of them burn down, do you question the matches or the single remaining house?

What has TWIMTBP ever brought you?

I would like to hear some comment on that :rolleyes:
 
What has TWIMTBP ever brought you?

I would like to hear some comment on that :rolleyes:

I'm not clear on your question, but I take it you are asking what TWIMTBP means to me. To me, it is an indication that NVIDIA hardware was used for testing gaming performance. In a sense, you might say it is a software company's guarantee that it will perform well with NVIDIA parts. It is a sort of agreement, for one to make the other look good. I don't think it is a stretch to suggest that the product is tweaked in the process to optimize performance.

In many cases (evidently), this optimization puts ATI hardware (even superior hardware) at a significant disadvantage. The same seems to be true of games tested with and advertised for ATI hardware. The anomalies seem to speak for themselves, in my opinion.
 
4 things:

1.) Actual temperature readings for GPUs need to be taken in context. These GPUs are designed to handle that kind of heat, and their cooling systems are designed to keep it in check within levels the GPU can tolerate. Thermal throttling usually occurs in most modern GPUs at around 100C, so I wouldn't even be concerned with that at all. You can't say GPU B runs too hot because GPU A runs 8C cooler. Many people look at GPU temperatures and think something is wrong because they compare their GPU temperatures to their CPU temperatures. Again, you can't do that, because you are looking at results in different contexts.

2.) I believe my post on the noise subject is what you are referring to. It was me that said modern high end AMD graphics cards were intolerably loud. (Steve may or may not have said it too, but I know I said it in one of these 5870 threads a couple of times.) This is my opinion of most of ATI's more recent GPUs. Granted I skipped the 2900XT and 3800 series GPUs entirely. I don't know that I've ever seen one out of the packaging in person much less actually saw a system running one. However the 4870 was a bit loud and the 4870 X2's I've had quite a lot of experience with and they are too damn loud.

3.) Obviously if you are happy with the performance you are getting from your current GPU in every game you play there isn't much of a reason to upgrade unless you decide you want something that your current GPU can't deliver. That's really the bottom line. Your post makes it sound like you are trying to talk yourself out of purchasing the 5870. There is nothing wrong with that. After all it is your money.

4.) Finally you can't compare [H]ard|OCP's testing of the 5870 to any test results you have come up with because I can guarantee your testing methodology isn't close enough to be able to accurately compare results. At most you'll see a ball park correlation between the [H] numbers and yours.
 
I'm not clear on your question, but I take it you are asking what TWIMTBP means to me. To me, it is an indication that NVIDIA hardware was used for testing gaming performance. In a sense, you might say it is a software company's guarantee that it will perform well with NVIDIA parts. It is a sort of agreement, for one to make the other look good. I don't think it is a stretch to suggest that the product is tweaked in the process to optimize performance.

In many cases (evidently), this optimization puts ATI hardware (even superior hardware) at a significant disadvantage. The same seems to be true of games tested with and advertised for ATI hardware. The anomalies seem to speak for themselves, in my opinion.

well, that is certainly not true..

Crysis is a great example: everyone knows this is a TWIMTBP game and thinks it gives nVidia the advantage.
But in reality, the game on Very High DX10 actually runs better on ATI cards
(aside from the benchmark program, where ATI somehow does not perform that well and does not even scale in 64-bit mode).

There are lots of games out there under TWIMTBP, and only a very few of them where ATI performs worse than nVidia. At the same time, there are games out there that favor ATI but do not have the ATI logo on them.

TWIMTBP is pretty much a marketing strategy if you ask me...
 
In terms of "50% more performance", it's not, really. Just because the min fps is double, it's still only 8 fps, and not every other number is doubled, not even close. The max and average are pretty close to what I'm getting with the GTX 280, which does not make much sense to me.

I don't think you understand how percentages work.

(both average FPS, according to your first post)
GTX 280: 17.9
5870: 26.6

17.9 * 1.5 (+50%) == 26.85 ~= 26.6

So yes, it *IS* a 50% improvement despite being "only" 8 fps. Double the performance is a 100% improvement ;)
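To spell that out, a quick check using the averages quoted earlier in the thread:

```python
# Average FPS figures quoted earlier in the thread.
gtx280_avg = 17.9
hd5870_avg = 26.6

# Relative improvement: (new - old) / old.
improvement_pct = (hd5870_avg - gtx280_avg) / gtx280_avg * 100
print(f"{improvement_pct:.1f}% improvement")  # close to 50%, despite being "only" 8.7 fps
```

The absolute fps gap looks small precisely because the baseline is small; at these low framerates, a ~49% relative gain is a big deal for playability.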

In the apples-to-apples section (which I looked over, so thank you) I'm actually getting the same exact performance as the 5870 (which beat-out the 285 by far?)... should that not be the case, if the 5870 is equal/close to the GTX 295?

Apples-to-apples results:
me: max 35 fps, average 24.9
[H]: max 37 fps, average 25.9

Yet, they state that the 5870 basically "trounces" a GTX 280...?
I'm very confused by that statement (I even asked in the main review thread, and that was the response I received).

According to your own first post the GTX 280 had 17.9 fps average, a min of 8 fps, and a max of 23 fps. So where did you get the 24.9 from?
 
Crysis actually runs better on NVIDIA hardware. I've never seen any tests that indicate otherwise. I've seen it myself, Crysis runs like shit on AMD hardware comparatively.

Here is just one example of that. The 4870 gets beat by the Geforce GTX 260 and the 4890 ties with the Geforce GTX 260. The Geforce GTX 280/285 would have crushed those cards in that game if they were tested I'm sure. I've also personally tested the 4870 X2 vs. the Geforce GTX 280 and found the latter to be better in Crysis alone. Every other game was better on the 4870 X2, but Crysis was the one exception. Going beyond that the 4870 X2 CrossfireX setup got leveled by my Geforce GTX 280 3-Way SLI setup as well. That's just Crysis but since you brought it up specifically I thought it needed to be addressed.

Crysis only runs better on AMD hardware when the performance gap between NVIDIA and ATI hardware is large enough for that to happen. (Basically comparing a 5870 to a Geforce GTX 285 which we know is considerably faster in everything, and naturally Crysis falls under that.)

In any case there have always been games where AMD/ATI have had better performance and the reverse has always been true as well. There are times where one company or the other will have a larger technological lead than the other in which case that company's product will sweep all the benchmarks save for one or two. That's what we are seeing now. The 5870 is a damn fine and powerful card and it's kicking ass right now.
 
Crysis actually runs better on NVIDIA hardware. I've never seen any tests that indicate otherwise. I've seen it myself, Crysis runs like shit on AMD hardware comparatively.

Here is just one example of that. The 4870 gets beat by the Geforce GTX 260 and the 4890 ties with the Geforce GTX 260. The Geforce GTX 280/285 would have crushed those cards in that game if they were tested I'm sure. I've also personally tested the 4870 X2 vs. the Geforce GTX 280 and found the latter to be better in Crysis alone. Every other game was better on the 4870 X2, but Crysis was the one exception. Going beyond that the 4870 X2 CrossfireX setup got leveled by my Geforce GTX 280 3-Way SLI setup as well. That's just Crysis but since you brought it up specifically I thought it needed to be addressed.

Crysis only runs better on AMD hardware when the performance gap between NVIDIA and ATI hardware is large enough for that to happen. (Basically comparing a 5870 to a Geforce GTX 285 which we know is considerably faster in everything, and naturally Crysis falls under that.)

In any case there have always been games where AMD/ATI have had better performance and the reverse has always been true as well. There are times where one company or the other will have a larger technological lead than the other in which case that company's product will sweep all the benchmarks save for one or two. That's what we are seeing now. The 5870 is a damn fine and powerful card and it's kicking ass right now.

I am talking about Crysis, not Warhead... they scale very differently.

Also, that test used the 9.4 driver, which was a brand-new driver for the 4890; try 9.8 and you will see.

PS: I am talking about the Very High setting, not "High".
I have run both before: nVidia has the advantage on High, but drops down like crap on Very High, while ATI is the opposite.
 
well, that is certainly not true..

Crysis is a great example: everyone knows this is a TWIMTBP game and thinks it gives nVidia the advantage.
But in reality, the game on Very High DX10 actually runs better on ATI cards
(aside from the benchmark program, where ATI somehow does not perform that well and does not even scale in 64-bit mode).

There are lots of games out there under TWIMTBP, and only a very few of them where ATI performs worse than nVidia. At the same time, there are games out there that favor ATI but do not have the ATI logo on them.

TWIMTBP is pretty much a marketing strategy if you ask me...
I hope you will agree that there are very unusual results for particular games that do not fall in line with accepted truths about card performance (i.e. in most games, a GTX 285 is generally faster than 4890, a 4890 is generally faster than a GTX 260, etc.). When a GTX 260 (for example) outperforms a 4890 in just one game, shouldn't eyebrows be raised?

If TWIMTBP is a marketing strategy, it is a strange one, indeed.
 
I am talking about Crysis, not Warhead... they scale very differently.

Also, that test used the 9.4 driver, which was a brand-new driver for the 4890; try 9.8 and you will see.

PS: I am talking about the Very High setting, not "High".
I have run both before: nVidia has the advantage on High, but drops down like crap on Very High, while ATI is the opposite.

I don't think they scale "differently" it's just that Warhead scales better with hardware.
 
Crysis actually runs better on NVIDIA hardware. I've never seen any tests that indicate otherwise. I've seen it myself, Crysis runs like shit on AMD hardware comparatively.

Here is just one example of that. The 4870 gets beat by the Geforce GTX 260 and the 4890 ties with the Geforce GTX 260.
Heh, funny you should use this example . . . :)
 
I don't think they scale "differently" it's just that Warhead scales better with hardware.

NO, Warhead does not SCALE BETTER... NO WAY.

I've been playing both, and Warhead is clearly not better on the highest setting, but it does run better on the "High" setting, which is the "Gamer" setting.

Also, I forgot to point out that the GTX 260 in that test was heavily overclocked compared to the 4870, so the result is irrelevant for determining which is better.

And trust me, I play Crysis and Warhead (and mods) almost every day. I know what I am talking about.
 
NO, Warhead does not SCALE BETTER... NO WAY.

I've been playing both, and Warhead is clearly not better on the highest setting, but it does run better on the "High" setting, which is the "Gamer" setting.

Also, I forgot to point out that the GTX 260 in that test was heavily overclocked compared to the 4870, so the result is irrelevant for determining which is better.

And trust me, I play Crysis and Warhead (and mods) almost every day. I know what I am talking about.
Okay, congrats. Goodnight.
 
If TWIMTBP is a marketing strategy, it is a strange one, indeed.

Here's how this marketing strategy is supposed to work.

When a gamer decides to upgrade their video card it's typically because they are running a game - let's call it Super Turbo Turkey Puncher - that isn't performing very well. The gamer remembers that every time he starts the Super Turbo Turkey Puncher there's a video telling him that Nvidia/ATI hardware is the preferred hardware for the game and purchases an upgrade with matching branding.
 
NO, Warhead does not SCALE BETTER... NO WAY..

I've been playing both, and Warhead is clearly not better on the highest setting, but it does run better on the "High" setting, which is the "Gamer" setting.

Also, I forgot to point out that the GTX 260 in that test was heavily overclocked compared to the 4870, so the result is irrelevant for determining which card is better...

And trust me, I play Crysis and Warhead (and mods) almost every day; I know what I am talking about...

No. Crysis Warhead runs better than Crysis does on the same hardware given similar conditions. As for the GTX 260 in that review, the factory overclocking doesn't count for much in the real world. It never has. The 4870 was also faster than the GTX 260 in just about everything else and when it was released the 4870 was closer to the performance of the Geforce GTX 280. Also let's not forget that the 4870 in that evaluation was overclocked as well. (There goes that excuse huh? :rolleyes: ) If Crysis ran "better" on AMD hardware then the GTX 260 shouldn't have stood a chance against it. Yet it did.

TWIMTBP is just marketing bullshit most of the time. Though to be fair, differences between cards and various choices developers make during a game's design can obscure that. Sometimes a developer may implement a feature that runs better on NVIDIA cards, or another feature that runs better on AMD hardware. Most of the time, if an AMD card is generally slower than an NVIDIA card, it will be slower than that NVIDIA card in a TWIMTBP game. If an AMD card is faster than a given NVIDIA card, then it normally dominates in TWIMTBP games as well. Crysis is actually a different animal. I remember seeing something about Crysis actually being developed on the NVIDIA hardware that was around at the time, specifically in regard to handling textures, if memory serves.

The idea that AMD/ATI hardware runs Crysis or Crysis Warhead better than NVIDIA hardware does is ludicrous. Just about everyone else on the internet would agree with me on that point, too. Get on Google and you will find tons of data, personal experiences, and even benchmarks that prove it. The only reason Crysis and Crysis Warhead run so well on the 5870 is that the 5870 is considerably more powerful than the Geforce GTX 285. (The Geforce GTX 295 still beats it most of the time, ESPECIALLY WHEN RUNNING CRYSIS!!)

What is with all the people that can't face reality today? :rolleyes:
 
I've a couple of questions... and some comparison test results, and would love responses from anyone on the [H] team regarding the following.

This is a little long, and not meant to "knock" the 5870, but instead to compare results that I personally find a little confusing with the supposed performance of the 5870.

Keep something in mind: over the years, I have never gotten the same results with any hardware as others either boast or complain about; all the "hype" about what people can/cannot run on their setups. I could never understand why, but it's something to keep in mind when reading this.

First, the questions:

- How is the noise with the 5870?

I think it was [H]'s Steve I saw state that the last few gens of ATi's GPUs are almost intolerably loud.

- Regarding temps/heat... doesn't 88C under load seem a bit hot, which could affect the longevity of the GPU?

Under load, my GTX 280 never hits above 80C (generally around 60-65C), and that's on Crysis: Warhead maxed (VRM hits 80C), on Enthusiast with 4xAA.

Now, in terms of performance...

It would be assumed that the 5870 would pound the GTX 280 into dust, but I've gotten some confusing numbers in benchmarking that are almost comparable to the 5870's, and they put me "on the fence" about moving from the GTX 280 to the 5870.

Keep in mind, these results are the type that I've heard countless people state they cannot get and/or are "impossible", but I've never run into the same "problems"/results as others, so I'm not sure what the deal is.

I'm a graphics whore, so I run everything maxed, all settings on every game.

Rig: GTX 280, E8600, 2GB RAM, ASUS Rampage Formula

Crysis: Warhead (Train Level - start to finish)
Settings: Enthusiast, 4xAA, 16xAF, 1920x1200 res

Min: 8
Max: 23
Avg: 17.9

(5870 - [H] results)
Min: 12
Max: 37
Avg: 26.6

Only a 4 fps difference min, and 8 fps difference avg?

These results are close to the 5870, and that's with me running 4xAA... [H] tested @ 2xAA... does not make sense.

The min, max, and average are only around 4-14 fps higher with the 5870 than with the GTX 280, at best. I'm not seeing a huge difference here, though I would have expected to.

Even in the places where the fps drops really low in Crysis: Warhead, it's not unplayable; it's a bit choppy, which I'm sure would be a little better with the 5870, but not unplayable.

Sure, the 5870 is faster at the same high settings (except I run 4xAA, which is higher than [H]'s testing), but comparing the numbers, not always by a margin that makes a huge difference.

Here's an example of a couple other games (again, all in-game settings maxed):

L4D
Settings: 8xAA, 16xAF, 1920x1200 res
Min: 42
Max: 63
Avg: 57.350

TF2
Settings: 8xAA, 16xAF, 1920x1200 res
Min: 50
Max: 62
Avg: 59.343

Fallout 3
Settings: 8xAA, 16xAF, 1920x1200 res
Min: 50
Max: 61
Avg: 59.183

I mean, how much better can you really get than this, that would make any difference in gameplay or visuals? I avg almost 60 fps in every game, with everything maxed.

I'm really just trying to get a feel for the situation, because I've always found that there's a lot of "hype" behind people stating what they can/cannot run on their systems, which has always been different for me. Somehow, I'm always just "magically" able to run faster? I don't know... maybe people exaggerate things too much, too often... but these are accurate and true results on my end.

Well, the 5870 isn't THAT much faster than a 285. Averaged across all games, it's probably 30-40% faster, I imagine.

And I disagree with those stating Crysis favors Nvidia cards... a 4890 runs Crysis almost as well as a GTX 285 for half the price. Like a 2 FPS difference. So there's no reason to buy a 280 or 285, imo. And a 4890 will get over 60 FPS in every game today, so again, no reason to ever buy a GTX 280 or GTX 285, imo. Even when the GTX 280 was new, no game needed it.

So I'm really not seeing why you paid so much for a GTX 280 when my 4890 is as fast or faster for so much less. What am I missing?

The 5870 is DX11... that's probably the main reason to get it over a 285... and it is faster, just not 100% faster. More like 40% at best.

Anyway, in the Anandtech Warhead benchmarks (picked at random) at 2560 with 4xAA at Gamer settings, the 5870 got 24.9 FPS and the 285 (not 280) got 16.9. That is 47% faster. That's a lot faster; you need to look at the percentage, not say "it's just a few FPS faster".

As to why your 280 supposedly runs so much better for you... who knows. Most likely you're running a fake timedemo or something that doesn't reflect real performance. Anyway, I've had many people online lie to me about Crysis performance; I've had many say that a 4850 can run Crysis all maxed easily for them, and other things that are just flat-out not true, but they swear up and down. The bottom line is people make a lot of things up.

Still don't know why on earth you own a 280, though, when a 4890 is much better for less... or even a 260 or 4870... and those cards play every game at 60 FPS, so why do you need the 280? That must have cost a pretty penny, and no games need it! I bet you paid far more for that 280 than a 5870 costs!

Also, on noise/temps... my 9800GTX was SO hot; it idled at 72C! My 4890 idles at 54C and is way faster. So it just depends on the card; even within a generation, the manufacturers change the cooling around a lot.

But yeah, one of the main mistakes you seem to be making is not looking at the percentage; you seem to think 16 and 24 are close in your first example because it's only 8 FPS. But you do realize the latter is half again, or 50%, faster, right? It's the same as one card running a game at 100 FPS and another at 150 FPS.
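For what it's worth, the percent-vs-raw-fps arithmetic is easy to sanity-check with a few lines of Python (the `speedup_pct` helper here is just mine, for illustration; the fps values are the ones quoted in this thread):

```python
def speedup_pct(baseline_fps: float, new_fps: float) -> float:
    """Relative speedup of new_fps over baseline_fps, in percent."""
    return (new_fps / baseline_fps - 1.0) * 100.0

# 16 vs 24 fps: only 8 fps apart, but a 50% speedup...
print(speedup_pct(16, 24))                 # 50.0
# ...exactly the same relative gap as 100 vs 150 fps.
print(speedup_pct(100, 150))               # 50.0
# Anandtech's Warhead numbers: 285 at 16.9 fps, 5870 at 24.9 fps.
print(round(speedup_pct(16.9, 24.9), 1))   # 47.3
```

The point being: at low frame rates, a small absolute fps delta can still be a very large relative gap.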

As for why your 280 magically runs faster than all the independent benchmarks, well, I just don't know. Like I said, I've had people tell me their 4850 runs Crysis all maxed at high res, so I just don't believe such things; those people are not telling the truth, but you can't argue with them.
 
Here's how this marketing strategy is supposed to work.

When a gamer decides to upgrade their video card it's typically because they are running a game - let's call it Super Turbo Turkey Puncher - that isn't performing very well. The gamer remembers that every time he starts the Super Turbo Turkey Puncher there's a video telling him that Nvidia/ATI hardware is the preferred hardware for the game and purchases an upgrade with matching branding.

That's how it should work, but most developers don't actually do much optimization for one card vs. the other. They choose which DirectX features they want to implement, and it's up to the driver engineers to make card brand A perform better than card brand N. It's as simple as that most of the time.

Again, even in most TWIMTBP games, whichever company has the faster video cards at the time will dominate in those games, just as in any non-TWIMTBP title.
 
Here's how this marketing strategy is supposed to work.

When a gamer decides to upgrade their video card it's typically because they are running a game - let's call it Super Turbo Turkey Puncher - that isn't performing very well. The gamer remembers that every time he starts the Super Turbo Turkey Puncher there's a video telling him that Nvidia/ATI hardware is the preferred hardware for the game and purchases an upgrade with matching branding.

Well, I get that, and I see how it benefits NVIDIA (or whichever card company). But how does the game company benefit from picking a side? Isn't it somewhat like the NFL selling the exclusive license to Madden? Now all those 2K fans will be pissy for the next 10 years.
 
I hope you will agree that there are very unusual results for particular games that do not fall in line with accepted truths about card performance (i.e., in most games a GTX 285 is generally faster than a 4890, a 4890 is generally faster than a GTX 260, etc.). When a GTX 260 (for example) outperforms a 4890 in just one game, shouldn't eyebrows be raised?

If TWIMTBP is a marketing strategy, it is a strange one, indeed.

The GTX 285 is clearly the fastest GPU, but it is not faster than the 4890 across the board.

Like I said, there are a few rare cases where a game favors one vendor over the other,
but it does not mean TWIMTBP is pulling the trigger.

Well, I get that, and I see how it benefits NVIDIA (or whichever card company). But how does the game company benefit from picking a side? Isn't it somewhat like the NFL selling the exclusive license to Madden? Now all those 2K fans will be pissy for the next 10 years.

Not quite sure about the logo thing, but clearly ATI is not advertising enough compared to nVidia.

Also, TWIMTBP does not mean ATI did not help the developer throughout the game's development.
Since I am a Crysis fan, I have pretty much read through all of their developer interviews, and they did mention that ATI helped them during development. Why it doesn't have the ATI logo in there, I do not know...
 
And I disagree with those stating Crysis favors Nvidia cards... a 4890 runs Crysis almost as well as a GTX 285 for half the price. Like a 2 FPS difference. So there's no reason to buy a 280 or 285, imo. And a 4890 will get over 60 FPS in every game today, so again, no reason to ever buy a GTX 280 or GTX 285, imo. Even when the GTX 280 was new, no game needed it.

Crysis does run better on NVIDIA cards, GENERALLY speaking. Again, 3 Geforce GTX 280's in 3-Way SLI handed my 4870 X2 CrossFireX setup its ass in that game. As for there being no reason to buy a Geforce GTX 280 or 285, I don't think that's quite so clear cut. I know that in my case, when I bought my Geforce GTX 280's, the 4870 hadn't even come out yet, much less the 4890. So while the 4890 may be a great choice today or the day it was released, the Geforce GTX 280 has been out there for more than a year now.

So I'm really not seeing why you paid so much for a GTX 280 when my 4890 is as fast or faster for so much less. What am I missing?

I don't know about the OP, but again, when I bought mine, the 4870 wasn't even out yet. The Geforce GTX 280 was a huge improvement in Age of Conan and Crysis for me. (It was better in Call of Duty 4 thanks to how badly the 9800GX2's handled AA and AF at 2560x1600.)

Still don't know why on earth you own a 280, though, when a 4890 is much better for less... or even a 260 or 4870... and those cards play every game at 60 FPS, so why do you need the 280? That must have cost a pretty penny, and no games need it! I bet you paid far more for that 280 than a 5870 costs!

Well, not everyone has the same setup or is in the same situation as you. I run a 30" LCD and I need all the power I can get. For me, that's usually CrossFireX or 3-Way/Quad-SLI or bust. I've never found any single card that was able to give me all the max details combined with high levels of AA and AF in every game at 2560x1600. Even if one did, within a short time more demanding titles would be released that changed that.
 
I disagree that Crysis runs better on Nvidia in general; I think you guys are going off old info. ATI increased their Crysis performance a lot in recent drivers. A 4890 will usually run neck and neck with a 285 in Crysis (a tad slower), whereas in some other games it won't. Last I saw, if anything, Nvidia had a bit more of an edge in Warhead, not in regular Crysis, but it still wasn't anything huge.

On the 5870 benches, I remember thinking ATI needed some work on the 5870's drivers in Crysis in some reviews I looked at, but then again, the Anand bench I just looked up showed the 5870 50% faster, which I'm sure is in line with, or even above, other games.
 
Crysis does run better on NVIDIA cards, GENERALLY speaking. Again, 3 Geforce GTX 280's in 3-Way SLI handed my 4870 X2 CrossFireX setup its ass in that game. As for there being no reason to buy a Geforce GTX 280 or 285, I don't think that's quite so clear cut. I know that in my case, when I bought my Geforce GTX 280's, the 4870 hadn't even come out yet, much less the 4890. So while the 4890 may be a great choice today or the day it was released, the Geforce GTX 280 has been out there for more than a year now.

I have to admit one part there: 4870X2 CF BLEW in Crysis the last time I used it...

I have no idea how some people got a 50 fps average while I was getting 30-40 jumping around.

maybe we both fucked up some part :(

Anyway, check the GTX 295 Crysis review at [H]; I believe you will see the 4870X2 outperform the GTX 295 there... and that is what I experienced, but a bit better.

Then again, nVidia does not run Crysis better; maybe at first launch, but not anymore...

Like I said, I have been playing Crysis almost every day, and I have been testing how to get the best performance in this game, and that is my result...

If you still think nVidia runs better, perhaps it's the 4870X2 CF phantom that's haunting you; I was in the same boat, but not anymore.
 