First test of 5870?

We have to wait until Q1 2010 for an HD-5K series low-profile card (Redwood)? That is at least 3.5 months away!!

Rats.

That's the way things have been rolling for many years. ATI is actually fast, all things considered. Think about how long people have been waiting for mid-range GT200s.
 
A new chapter will be added now, though:
"AMD GPU users will suddenly praise GPU physics..."

Mark my words.

As long as it isn't PhysX.

Besides, the CPU can do physics as well, and nobody (seriously... don't do this just to gib me) will buy a 4850 and stick it in a Celeron 430 system.
 
[Image: 58it1.png - dual GPU results]
 
This COULD very well be the series of cards to make me go back to ATI. Either way, I won't have the money for one till long after Nvidia comes out with something new, so we shall see!
 
WOW, impressive. Two of these cards destroy a quad GTX 295 setup?? Niceeeee. The Crysis result is over 25% faster, amazing! :eek:
 
If we believe those graphs then we can assume that the 5870 is on average 20-30% faster (whatever... I'm not going back to interpret it with a more accurate number) than a GTX 285.

Even with that significant lead, we don't know for certain that the minimum framerates maintain the same lead. The gameplay experience is not entirely based upon the average framerate, as the [H]-reviews so clearly demonstrate.

I'm anxious to see further testing, although I can't justifiably get a new video card unless I get another LCD.

On a side note... I was just bitten (a few hours ago) by a baby black widow. It's the 3rd black widow bite of my life, but at least it was a baby. I took some calcium, but my muscles are still acting up and hurting... it sucks :( and I can't sleep... where is the mother? :(
 
On a side note... I was just bitten (a few hours ago) by a baby black widow. It's the 3rd black widow bite of my life, but at least it was a baby. I took some calcium, but my muscles are still acting up and hurting... it sucks :( and I can't sleep... where is the mother? :(

Nuke it from orbit :D

 
The current rumour puts the 5850 core at 1440 SPs, so it's a different die, though.

I'm guessing $380 for the 5870. If it were $349, the slide leaked earlier would have said <$350, IMO.

No, it's the exact same chip. It's just a salvage part: an RV870 that didn't make the cut as an HD 5870.
 
A new chapter will be added now, though:
"AMD GPU users will suddenly praise GPU physics..."

Mark my words.

We already knew that... and the replies to this post of yours certainly prove the point in other threads: "as long as it's not PhysX"

They don't care about the tech, they care about the color!

Suddenly "curtains" will be a God send :)
 
Well, even if these cards are slower than the GTX 295, that will change as soon as you change the settings to Very High, bump up AA/AF or change the resolution.

Remember GTX 280 vs 8800 GT SLI? Or 4870 vs 3870 X2? The minimum fps never drops as low on the "newer" tech. Also: smooth gameplay, no micro-stuttering and no scaling issues! 100% powah almost all the time ;)

IMHO this should also be taken into account, not just the fps in the tests.
 
I know, but it is useless... we have all seen it before... every single launch.
"6 FPS is 100% more than 3 FPS!!!1!!!!" :rolleyes:

Without resolution, settings and actual FPS NUMBERS, those graphs are not worth the pixels they are made of.

But you know roughly where a GTX 285 performs, so at least you can get a ballpark idea of where the 5870 results may lie. And this, along with the other leaked data, shows that ATI has improved over last gen.
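For what it's worth, here's a tiny sketch (with made-up framerates, nothing from the leaked graphs) of why a percentage lead means little on its own without the absolute numbers behind it:

```python
# Made-up framerates, purely for illustration -- not from any leaked slide.
def relative_lead(fps_new, fps_old):
    """Percentage by which fps_new exceeds fps_old."""
    return (fps_new - fps_old) / fps_old * 100.0

# Both pairs show the exact same "100% faster" lead...
print(relative_lead(6, 3))    # 100.0 -> but 3 vs 6 fps is still a slideshow
print(relative_lead(60, 30))  # 100.0 -> while 30 vs 60 fps actually matters
```

Same bar height on a relative graph, completely different gameplay experience.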
 
I will wait for the [H]ardOCP review. I would love for this card to kick all sorts of ass. But I have to say I will not read a video card review unless it is from here.
 
I will wait for the [H]ardOCP review. I would love for this card to kick all sorts of ass. But I have to say I will not read a video card review unless it is from here.

Good point; there's no sense in reading canned video card reviews. Only an [H] review tells you what it's really like.
 
Good point; there's no sense in reading canned video card reviews. Only an [H] review tells you what it's really like.

"Canned" = raw data for you to interpret (mins, avgs, highs, every resolution)

"Real-world" = data interpretted for you

A matter of preference, I suppose... I like to see apples-to-apples and reduce the number of variables as much as possible in any scientific analysis such as a video card review.
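To make the "mins, avgs, highs" point concrete, here's a rough sketch with invented frame times (not real benchmark data for any of these cards) showing how those summary numbers fall out of per-frame timings, and why a better average doesn't guarantee a better minimum:

```python
# Invented frame times in milliseconds for two hypothetical cards over the
# same run -- just numbers to show the idea, not measurements.
card_a = [8, 8, 8, 8, 8, 8, 8, 8, 8, 50]   # fast overall, but one big hitch
card_b = [15] * 10                          # slower, but perfectly steady

def summarize(frame_times_ms):
    """Boil per-frame timings down to the usual review numbers."""
    total_ms = sum(frame_times_ms)
    return {
        "avg_fps": 1000.0 * len(frame_times_ms) / total_ms,  # frames over total time
        "min_fps": 1000.0 / max(frame_times_ms),             # worst single frame
        "max_fps": 1000.0 / min(frame_times_ms),             # best single frame
    }

print("Card A:", summarize(card_a))  # ~82 avg fps, but only 20 fps minimum
print("Card B:", summarize(card_b))  # ~67 avg fps, and a 67 fps minimum
```

Which is exactly why a table with minimums included is worth more than a bar chart of averages.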
 
No single review can answer every question.

To say otherwise sounds religious in outlook, something which has no place in reviewing.
 
We already knew that... and the replies to this post of yours certainly prove the point in other threads: "as long as it's not PhysX"

They don't care about the tech, they care about the color!

Suddenly "curtains" will be a God send :)


No, it's about who controls it. Nvidia has already shown that they are more than happy to deliberately lock PhysX processing out of any PC that has an ATI video card in it. PhysX is never going to become an industry-wide standard with restrictions like that. Developers are not going to bother spending resources on physics coding if it can only be used on a fraction of their entire potential customer base.

What everybody is waiting for is a hardware agnostic package that will work on every GPU equally well and is not wholly controlled by any one video card manufacturer. Personally, I hope Microsoft will step up to the plate and include DirectPhysics as part of the next update to DX11. They have no axe to grind, nor video card manufacturer to favor.
 
Hasn't Microsoft already ruled itself out, at least for the moment?

I agree, though, that some common base that can be fallen back on would at least be desirable.
 
No, it's about who controls it. Nvidia has already shown that they are more than happy to deliberately lock PhysX processing out of any PC that has an ATI video card in it. PhysX is never going to become an industry-wide standard with restrictions like that. Developers are not going to bother spending resources on physics coding if it can only be used on a fraction of their entire potential customer base.

And do you have any idea of the reasons why that happens, or do you (like most of the naysayers) assume NVIDIA is just doing it to hurt you?

GPU physics has been pushed by NVIDIA and NVIDIA alone. Anyone who wants to license their tech is welcome to. With a license comes support: support that will be included in NVIDIA's QA for drivers and everything else. If they don't license it, tough luck for them. NVIDIA isn't in the "business" of charity. No company is, really.

Creig said:
What everybody is waiting for is a hardware agnostic package that will work on every GPU equally well and is not wholly controlled by any one video card manufacturer. Personally, I hope Microsoft will step up to the plate and include DirectPhysics as part of the next update to DX11. They have no axe to grind, nor video card manufacturer to favor.

Microsoft doesn't do middleware, so your "hope" dies right there. Developers choose what's best for their games, and PhysX adoption of late (and in the future) is a good sign for it.

And it can work on any GPU. It's just a matter of licensing the tech. Do you know how a license works? I'm guessing not, so here goes. As soon as a license agreement is "forged" between two parties, it must be respected by both during the license's period. If it's not, the party that "broke" the agreement can, and most likely will, be sued to high heaven for that. "One video card manufacturer" won't be controlling everything. They are just the tech holders and thus receive royalties for the tech, which they obviously deserve, since with a license agreement they are obligated to support rivals' products that use that tech.

Anyway, this is way off topic for this thread; let's take it to the physics subforum.
 
Help me out here, I'm sure I'm missing something, and I'm certainly not trying to be a jackass, but I really don't get why there is so much drooling going on over this information.

I mean, we are comparing next-gen hardware to current-gen hardware, and to be honest, I'm not blown away. If the 5870 were sitting at $299, as speculated, then of course, but it isn't.

I know that with driver maturity there will be better numbers, but still.

I'm certainly not trying to label anyone, but... how do I put this... I'm almost embarrassed for the people who are ranting and raving like these new cards are the second coming.

Is there something I'm missing? I'd gladly change my stance given factual data. I mean, I know there is still a lot of speculation, but I expected a tad more out of the cards.

As far as Eyefinity goes, yes, it is awesome. It makes me want to get two more monitors. But really, how many people have, or would be willing to get, two more monitors? Like .0001% of the population?

Again, I'm totally open here; it just seems to me that people are too eager to give out leet status to something that, IMO, should be given "OK, we'll see where the drivers take us" status.
 
^

Well, IMHO, I think the point is that the HD 5870 is equal to, faster than, or at least close to a GTX 295, which is more expensive, is a dual-GPU card, uses more power and does not have DX11. A single GPU is now capable of keeping up with a top-of-the-line GPU from the last generation.


Drivers and overclocking will definitely improve this card's performance even more with newer driver updates. (Look at Crysis performance; it improved quite noticeably on the HD 4870 and HD 4890 over time.)


Myself, I'm especially excited about the HD 5850. So much power for a card which isn't even top of the line, yet surpasses a GTX 285.


Still, it would be best to wait for official reviews before judging these cards. But for now, things are looking good.
 
^

Well, IMHO, I think the point is that the HD 5870 is equal to, faster than, or at least close to a GTX 295, which is more expensive, is a dual-GPU card, uses more power and does not have DX11. A single GPU is now capable of keeping up with a top-of-the-line GPU from the last generation.


Drivers and overclocking will definitely improve this card's performance even more with newer driver updates. (Look at Crysis performance on the HD 4870 and HD 4890 over time.)


Myself, I'm especially excited about the HD 5850. So much power for a card which isn't even top of the line, yet surpasses a GTX 285.

Yes, I agree with you, the 5850 looks very promising. I wonder how it'll OC :D
 
A single GPU is now capable of keeping up with a top-of-the-line GPU from the last generation.

Which is what almost always happens anyway (in both single chip and dual chip cards)...so that's not that big of a point...

8800 GTX > 7950 GX2
GTX 280 > 9800 GX2
HD 4870 > HD 3870 X2
 
Strictly speaking, the 9800GX2 did put out better fps than the GTX280 for the first few months. At least when the GX2 wasn't memory-limited and the game had decent SLI support.
 
Which is what almost always happens anyway (in both single chip and dual chip cards)...so that's not that big of a point...

8800 GTX > 7950 GX2
GTX 280 > 9800 GX2
HD 4870 > HD 3870 X2



Hmm, good point. Maybe I'm more excited because I didn't really expect to see these (rumored) numbers of the new AMD cards. Also, I view the GTX 295 very highly, more so than I ever did the other Dual GPU cards in the past, so for me personally it seemed like a bigger advancement.



If AMD were able to improve Crysis performance, the 5800 series would be perfect. I'm still hoping Nvidia will announce something soon.
 
Strictly speaking, the 9800GX2 did put out better fps than the GTX280 for the first few months. At least when the GX2 wasn't memory-limited and the game had decent SLI support.

False. It had a higher MAXIMUM fps and "sometimes" a better average, but the framerate would fluctuate a lot. The GTX 280 had a higher minimum fps in ALL instances. And as drivers improved, this got even better.
 
Hmm, good point. Maybe I'm more excited because I didn't really expect to see these (rumored) numbers of the new AMD cards. Also, I view the GTX 295 very highly, more so than I ever did the other Dual GPU cards in the past, so for me personally it seemed like a bigger advancement.



If AMD were able to improve Crysis performance, the 5800 series would be perfect. I'm still hoping Nvidia will announce something soon.

While I'm running a 260 right now, I'm not firmly rooted in either camp. I want Nvidia to get something on the table just so I can see a direct comparison. I'm wondering if we'll see something similar: better performance, but not as much as hoped, with a more "refined" performance and nice extras (like Eyefinity).
 
I don't see a problem with Nvidia being a bit behind; most purchasers won't be early adopters anyway, so it's fine as long as Nvidia can get something out before the X2 version comes out. If they can, and it is competitive, then people will most likely accept that. However, if they come out after the X2, people will most likely compare it to the best there is and, like R600 vs G80, be disappointed; no matter how good it is, it will not be good enough.
 
Strictly speaking, the 9800GX2 did put out better fps than the GTX280 for the first few months. At least when the GX2 wasn't memory-limited and the game had decent SLI support.

Exactly, that's why everybody was disappointed in the GTX 280 when it first came out. Then I think Nvidia dropped support for the 9800 GX2 and it began to get some weird numbers, lol.
 
Which is what almost always happens anyway (in both single chip and dual chip cards)...so that's not that big of a point...

8800 GTX > 7950 GX2
GTX 280 > 9800 GX2
HD 4870 > HD 3870 X2

It is coming from AMD, though. AMD couldn't compete for the top spot this go-around. They said they had changed their strategy to mid-range cards. If their card were just faster than a 4870 X2, it would have been impressive but the norm; having it faster than a 295 is what is impressive.
 
The only thing I'm truly amazed by, besides Eyefinity, is that AMD/ATI has finally beaten Nvidia with a new generation of cards. The last time ATI was able to do that was the 9800 series.
 
And do you have any idea of the reasons why that happens, or do you (like most of the naysayers) assume NVIDIA is just doing it to hurt you?

GPU physics has been pushed by NVIDIA and NVIDIA alone. Anyone who wants to license their tech is welcome to. With a license comes support: support that will be included in NVIDIA's QA for drivers and everything else. If they don't license it, tough luck for them. NVIDIA isn't in the "business" of charity. No company is, really.

What I doubt is that Nvidia would put forth as much effort into optimizing PhysX on ATI's behalf as their own if ATI were to license PhysX. Somehow I have the feeling that ATI's performance would always be one step behind that of Nvidia's, even should ATI's hardware be shown to be capable of better performance. And apparently ATI feels the same.


Microsoft doesn't do middleware, so your "hope" dies right there. Developers choose what's best for their games, and PhysX adoption of late (and in the future) is a good sign for it.

Apparently you and I see things differently as I don't think PhysX acceptance is increasing much at all. And now with the ATI lockout, I don't doubt developers will be spending even less time and money for PhysX coding. After all, why spend valuable resources on effects that will only be seen by a small percentage of your potential customer base?


And it can work on any GPU. It's just a matter of licensing the tech. Do you know how a license works? I'm guessing not, so here goes. As soon as a license agreement is "forged" between two parties, it must be respected by both during the license's period. If it's not, the party that "broke" the agreement can, and most likely will, be sued to high heaven for that. "One video card manufacturer" won't be controlling everything. They are just the tech holders and thus receive royalties for the tech, which they obviously deserve, since with a license agreement they are obligated to support rivals' products that use that tech.

Please see above regarding my opinion of Nvidia's potential commitment levels in ensuring PhysX would run at peak performance levels on ATI products.


Anyway, this is way off topic for this thread; let's take it to the physics subforum.

So you responded to what you perceive as an OT comment with another OT comment? :confused:
 
Apparently you and I see things differently as I don't think PhysX acceptance is increasing much at all. And now with the ATI lockout, I don't doubt developers will be spending even less time and money for PhysX coding. After all, why spend valuable resources on effects that will only be seen by a small percentage of your potential customer base?

Welcome to the real world:
http://hardforum.com/showthread.php?t=1451856
 
Umm, let's see. I just picked up Darkest of Days, Batman: Arkham Asylum and NFS Shift this week, and all of them use PhysX. 3 games in one week. Show me when 3 games that all supported DX10 or 10.1 were ever released in one week??

So for the people clueless about the tech: it's out there and picking up steam. I'd miss my PhysX for sure going back to ATI. I use a dedicated 9600 GT and it runs great for me in all the games that support it.

I'm not really excited about the ATI cards either. 1. I prefer Nvidia's drivers. 2. Nvidia cards OC like crazy; I've got a GTX 280 @ 756 MHz core, 1566 shader, 1260 RAM. 3. I'd lose CUDA and PhysX. 4. I had two HD 4870s die on me.
 
You do realise that nothing in any game released actually needs PhysX, just like no game actually needed the Ageia PPU.

When you have to manipulate a user's experience as much as they have to in order to make it seem worth it, you know it's bad.
 
Interesting little article. I like how they point out that the three-year-old PhysX has overtaken the nine-year-old Havok.

One thing, though, that I think people are ignoring is the fact that Intel has owned Havok since 2007, and that Intel and AMD have both been working together on getting Havok more widely supported.

There's one reason why I wouldn't be surprised if Havok ultimately makes a comeback and wins out: Intel.

Remember, PhysX only really started to gain some level of mainstream acceptance when Nvidia bought Ageia and began to work on integrating PhysX into its software line. Thanks to the clout that Nvidia can pull with developers, it's a no-brainer that you'd begin to see more and more games supporting it.

However, as everyone also knows, Intel is working on its own discrete graphics, Larrabee. With Intel's ownership of Havok, and Intel even working with a company (AMD) that is a rival in the CPU market and soon to be a rival in the discrete graphics market, I think Havok could come back with a vengeance in terms of developer support, as Intel could throw their full weight behind getting it to be the top-accepted standard amongst developers.

Some people laugh and scoff at the idea of Intel having any sway on the GPU market, but it's only a matter of time.
 