First test of 5870?

So there will be no GPU physics on the R800 series?

oops GPU physics?

here is the problem

nVidia card -> PhysX enable -> Game initialize -> Lag out the game -> ignore the lag -> ALL hail the physX!!!
 
***CAUTION: ignore extremely long post at your own discretion***





wow, this thread is way past the point of no return, so i will add my 2 cents anyway (and yes, the physics-related "discussion" should be taken to its own forum as suggested):

to summarize - 5870 >= 4870x2/gtx295; gt300 speculated to be at least equal to or faster than a 5870, at a higher price, at a later date

ati has eyefinity, which is cool

nvidia has 3dvision and gpu physx-physics, which is also cool

physics acceleration is good progress for gaming

cpu physics is capable of decent physics simulations, even though gpus are still far faster at accelerating them - the same situation as 10+ years ago with 3d acceleration (would anyone want graphics handled by the cpu today?)

certain types of physics effects (not all) are certainly capable of running on the cpu (some are very easy for it to handle while others will cause it to choke), but the gpu will still handle it faster, assuming a balanced workload

the heavier the physics processing used, the more difficult it becomes for a cpu to keep up with a gpu (20+ fps is preferable to 2 fps); likewise, the heavier the physics processing, the more power is needed to achieve acceptable performance, and vice versa for less
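
(side note for anyone curious why the gap gets so big: the kinds of effects being accelerated - particles, debris, cloth - are embarrassingly parallel, so every element can be updated independently on its own gpu thread. here's a minimal cuda sketch of that idea - purely illustrative, not physx code, and names like integrateParticles are made up for the example - advancing a million independent particles by one 60 fps frame; compile with nvcc:)

#include <cuda_runtime.h>
#include <cstdio>

// one particle = position + velocity
struct Particle { float3 pos; float3 vel; };

// one thread per particle: a simple explicit euler step with no
// dependencies between threads, which is exactly the kind of work a gpu eats up
__global__ void integrateParticles(Particle* p, int n, float dt, float3 g)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    p[i].vel.x += g.x * dt;
    p[i].vel.y += g.y * dt;
    p[i].vel.z += g.z * dt;
    p[i].pos.x += p[i].vel.x * dt;
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;
}

int main()
{
    const int n = 1 << 20;                        // ~1 million particles
    Particle* d_particles = nullptr;
    cudaMalloc(&d_particles, n * sizeof(Particle));
    cudaMemset(d_particles, 0, n * sizeof(Particle));

    const float dt = 1.0f / 60.0f;                // one 60 fps frame
    const float3 gravity = make_float3(0.0f, -9.81f, 0.0f);

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    integrateParticles<<<blocks, threads>>>(d_particles, n, dt, gravity);
    cudaDeviceSynchronize();

    printf("advanced %d particles by one frame\n", n);
    cudaFree(d_particles);
    return 0;
}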

assuming accelerated physics that actually changes/improves gameplay is the next evolution in gaming, don't expect the cpu to be the best solution (considering some games already max out existing cores)

according to steam, there are currently more systems with gpu-physics-capable vid cards than systems with even quad core cpus

cpus with lots of cores may eventually be a good baseline for advanced physics, but when the time comes, future gpus will probably still be far faster

not all games need fancy physics to be enjoyable

most games use simple physics; however, with accelerated physics, the potential exists for devs to make games with highly advanced/dynamic physics environments

most games still cater to the lowest common denominator to have the widest support (which is to be expected from a business standpoint), while a few push the limits like crysis

eventually devs will push the limits of what is possible with physics on current hardware in the same way crysis did for graphics

if all current and future gpus had gpu physics, devs would have a higher baseline to implement more advanced physics in games (not just pretty physics "effects")

if devs have the hardware to make more immersive and dynamic games, they should use it

physx is currently the #1 physics api, with havok right behind it

1 out of every 4 games uses physx and 1 out of every 2 games uses physx or havok

which means one shouldn't expect the physx api to die, with such widespread support on the pc, all consoles, and even the iphone (cpu + gpu support on pc, cpu-only elsewhere), and with use in games like batman: aa, nfs: shift, dragon age origins, mass effect 2, mafia 2, etc.

physx is currently the only physics api with gpu acceleration

physx offers gpu acceleration for physics, but still supports the cpu, just like havok

havok may eventually have actual gpu acceleration supported in games

games can still have nice looking physics effects running off the cpu in software physx/havok/etc.

some people complaining about lower performance due to the higher physics load of gpu physx may fail to realize that it would be lower still in software (and the option remains to turn off what amount to accelerated physics effects so far), while probably having no qualms about enabling performance-degrading graphics options and high levels of aa in games

some games like ut3 can have boosted performance with gpu physx assuming the game is only using low level software physx

some people obsess way too much over fps (if one wants high fps, just play at 640x480 low settings - problem solved)

some people hate on gpu physx for not "changing the gameplay" in games, even though they shouldn't expect it to when only "half" the gpu market supports it

hypothetically speaking, the same situation would be decried in an alternate reality where nvidia had no gpu physx and ati had a stranglehold on gpu physics

some people hating on the lackluster implementation of said gpu physx "effects" in a game should blame the devs, or just ignore the game like they would any other game they don't happen to like for whatever reason; the same people may also go gaga over other "pointless" visual effects, for example dx10-exclusive graphics options, even though both are just meant to make a game that much better looking/immersive (hypocrisy much)

gpu physics has only been around for a year or so - to expect anything more from devs than tacking on some optional physics effects in such a short amount of time is ridiculous. multiplatform games like batman: aa are built to run on consoles too, so the devs probably already had a basic framework in place for the physics they wanted to run acceptably on consoles. they probably tacked on the gpu physx effects in a short amount of time.

cool physics effects (or pretty graphics) will not magically turn a crappy game into a good one

some people hating on gpu physx because it isn't supported by both gpu companies can keep on hating until things change in this current, not-so-ideal situation

some people may not even be arguing over the matter if nvidia hadn't purchased ageia and enabled gpu physics via cuda; it would have been better to leave physics acceleration languishing away for several more years and let the potential die, huh (sarcasm much)

some people would probably change their tune if gpu physics was supported by both vendors if it meant the possibility for more advanced games

even if that was the case, console gaming would still be an impediment for widespread support in games, due to most being multiplatform

on the other hand, it would provide a platform for <gasp> pc exclusives over consoles if games were designed from the ground up for gpu physics

more advanced physics and a.i. is the next logical step in gaming outside of the constant progression of improved graphics

fanboys will always be fanboys, whether green or red, console or pc, xbox360 or ps3, etc.





in conclusion, the new ati hardware performance numbers look good so far! 2 thumbs up.

(...and now back to your regularly scheduled programming)
 
Like I stated before:
AMD GPU users will start praising GPU physics...as long as it isn't PhysX :D

Most GPU users want GPU physics...preferably open source or at least based on a common independent standard.

Your statement is just a twisted subset of a basic fact. You are kicking in a wide open door. You seem to think that most AMD GPU users have some grudge against Nvidia or PhysX. That's just ridiculous.
 
The thing about GPU physics is that currently it's not a compelling enough reason to choose one brand over another. Physics processing support needs to become universal across most PC gaming platforms. If we ever want to see a majority of new titles take advantage of the technology, customers will have to have some reason to care more about the ability of their new card to handle physics processing. The best way for that to happen is for a company like Microsoft to step in and develop an open standard for physics acceleration in the same vein as Direct3D or OpenGL.
 
The thing about GPU physics is that currently it's not a compelling enough reason to choose one brand over another. Physics processing support needs to become universal across most PC gaming platforms. If we ever want to see a majority of new titles take advantage of the technology, customers will have to have some reason to care more about the ability of their new card to handle physics processing. The best way for that to happen is for a company like Microsoft to step in and develop an open standard for physics acceleration in the same vein as Direct3D or OpenGL.

Except that technology and business are interrelated. Technology-wise, I'm sure it was possible for Rocksteady to put smoke and fumes and other theatrics into Batman without hardware Physx. But because of whatever business deal they had with Nvidia to pimp Physx, that didn't happen. Having now played for a couple hours with Physx enabled, I'm simply not going to switch to a card that I know is giving me less eyecandy than what I know is available.

On the other hand, ATI has DX11. If Nvidia releases a DX 11 part by December, I'd go with them. Otherwise, it's an impossible decision to choose between DX11 eyecandy versus Physx eyecandy, and I don't know what I'd end up with.
 
I can't say I'm mad at Atech. Without him or reliable benchmarks, this thread would have been a bunch of video card enthusiasts sitting around and contemplating their navels.
 
On the other hand, ATI has DX11. If Nvidia releases a DX 11 part by December, I'd go with them. Otherwise, it's an impossible decision to choose between DX11 eyecandy versus PhysX eyecandy, and I don't know what I'd end up with.
No offense, but that basically sounds like: "If nVidia releases a DX11 part by December, I'd go with them. Otherwise, I'll probably go with them anyway."

Personally, I think Havok will see some very interesting changes when Intel finally begins showing off Larrabee. Remember, Havok was working on a GPU-based implementation around the time Intel bought them up. With AMD also working with Intel in support of Havok, we could very well see some form of Havok FX re-appear, supporting AMD and Intel GPUs.

That's why PhysX barely holds my interest currently - I'd rather just go with the top performance card in general, until such issues as physics rendering have been determined. And if it does in fact some day come down to nvidia with PhysX vs. Intel/AMD with Havok, I think I'll put my money on the big blue. :p
 
No offense, but that basically sounds like: "If nVidia releases a DX11 part by December, I'd go with them. Otherwise, I'll probably go with them anyway."

Personally, I think Havok will see some very interesting changes when Intel finally begins showing off Larrabee. Remember, Havok was working on a GPU-based implementation around the time Intel bought them up. With AMD also working with Intel in support of Havok, we could very well see some form of Havok FX re-appear, supporting AMD and Intel GPUs.

That's why PhysX barely holds my interest currently - I'd rather just go with the top performance card in general, until such issues as physics rendering have been determined. And if it does in fact some day come down to nvidia with PhysX vs. Intel/AMD with Havok, I think I'll put my money on the big blue. :p

But since Intel is going with an x86 GPU, wanna bet that Havok is going to be tailored towards that?
 
But since Intel is going with an x86 GPU, wanna bet that Havok is going to be tailored towards that?
Havok can support more than just one tailored solution... it can be optimized for both Larrabee (with its P54C CPUs) and AMD's GPUs. Havok FX was meant to support (at the time) both GeForce 6/7 and the X1000+ Radeons, and while obviously a GPU is different from a stripped-down CPU, I have no doubt they could get it to work.

It seems like people tend to view AMD and Intel as having a lot of animosity between each other, and while they are competitors in the CPU market and, to a lesser extent currently, the GPU market (with that increasing once Larrabee is released), Intel has a lot to gain by having AMD use Havok. They get licensing fees, a better chance that developers will use it (why support an API that only represents one of the three companies selling discrete cards, when you can use a physics solution supported by the majority, one of whom is Intel), etc.

That's one of the reasons why Havok is so appealing - it's so open, whereas PhysX is controlled by wonderfully-draconian nvidia, lol.
 
Havok can support more than just one tailored solution... it can be optimized for both Larrabee (with its P54C CPUs) and AMD's GPUs. Havok FX was meant to support (at the time) both GeForce 6/7 and the X1000+ Radeons, and while obviously a GPU is different from a stripped-down CPU, I have no doubt they could get it to work.

It seems like people tend to view AMD and Intel as having a lot of animosity between each other, and while they are competitors in the CPU market and, to a lesser extent currently, the GPU market (with that increasing once Larrabee is released), Intel has a lot to gain by having AMD use Havok. They get licensing fees, a better chance that developers will use it (why support an API that only represents one of the three companies selling discrete cards, when you can use a physics solution supported by the majority, one of whom is Intel), etc.

That's one of the reasons why Havok is so appealing - it's so open, whereas PhysX is controlled by wonderfully-draconian nvidia, lol.

Havok isn't "open", it's controlled by Intel.
You think Intel is going to play nice with AMD?

Take a look at the CPU market...Intel doesn't play nice with AMD...and they won't here either.
 
I am pretty sure that there are no license fees involved with Havok, except for the game companies. Unless that has changed with any proposed GPU-based physics, Havok is not like PhysX - correct me if I am wrong.
 
Havok isn't "open", it's controlled by Intel.
You think Intel is going to play nice with AMD?

Take a look at the CPU market...Intel doesn't play nice with AMD...and they won't here either.
Do you really think that if Intel didn't want AMD to fully take advantage of Havok, they would even have let a deal between AMD and Havok be done? Like I said before, Intel has a lot to gain from AMD using Havok.

And what about the CPU marketplace? Do Intel and AMD compete? Yes. But Intel is hardly "playing dirty" in the CPU market. Guess who AMD has to license the rights to make x86 processors from? Yep, Intel.

AMD and Intel also exchange instruction sets at times or work to make sure that there's compatibility between them.

If Intel truly was "not playing nice", AMD's CPU business would be in much worse shape. The fact that Intel's processors have been out-performing AMD's has more to do with the design decisions each has made and the R&D resources each has available.
 
I am pretty sure that there are no license fees involved with Havok, except for the game companies. Unless that has changed with any proposed GPU-based physics, Havok is not like PhysX - correct me if I am wrong.
There might not be. I'm not aware of the specific terms of the arrangement AMD and Havok have, so yeah, it may be completely free for AMD to use.
 
I know they are developing for the GPU and all that, but Havok has always been hardware agnostic. I think it is a nasty, nasty byproduct of PhysX, under both Ageia and nvidia, that people now assume the hardware manufacturer matters.

There is no reason for them to even be a consideration; it should only be between game devs and Havok as to which games use it, not which consumers get to use it.
 
No offense, but that basically sounds like: "If nVidia releases a DX11 part by December, I'd go with them. Otherwise, I'll probably go with them anyway."

Personally, I think Havok will see some very interesting changes when Intel finally begins showing off Larrabee. Remember, Havok was working on a GPU-based implementation around the time Intel bought them up. With AMD also working with Intel in support of Havok, we could very well see some form of Havok FX re-appear, supporting AMD and Intel GPUs.

That's why PhysX barely holds my interest currently - I'd rather just go with the top performance card in general, until such issues as physics rendering have been determined. And if it does in fact some day come down to nvidia with PhysX vs. Intel/AMD with Havok, I think I'll put my money on the big blue. :p

Betting on IBM? :D
 
Extrapolated some of the comparable data to see what a GTX 285 looks like next to the new cards, remember this is all information from the leaked performance results that hit the net and not my own data.

57344827.jpg
 
Extrapolated some of the comparable data to see what a GTX 285 looks like next to the new cards, remember this is all information from the leaked performance results that hit the net and not my own data.

57344827.jpg

nice chart, great work man!
 
Havok can support more than just one tailored solution... it can be optimized for both Larrabee (with its P54C CPUs) and AMD's GPUs. Havok FX was meant to support (at the time) both GeForce 6/7 and the X1000+ Radeons, and while obviously a GPU is different from a stripped-down CPU, I have no doubt they could get it to work.

It seems like people tend to view AMD and Intel as having a lot of animosity between each other, and while they are competitors in the CPU market and, to a lesser extent currently, the GPU market (with that increasing once Larrabee is released), Intel has a lot to gain by having AMD use Havok. They get licensing fees, a better chance that developers will use it (why support an API that only represents one of the three companies selling discrete cards, when you can use a physics solution supported by the majority, one of whom is Intel), etc.

That's one of the reasons why Havok is so appealing - it's so open, whereas PhysX is controlled by wonderfully-draconian nvidia, lol.

Think of it like WW2: Havok is the USA and PhysX is Germany. AMD represents "the third guy", Russia, who can tip the outcome of the war.

I said it before, Intel has a vested interest in using AMD to help knock NV off the #1 spot first. Only after that should they go USA vs USSR on each other.
 
Extrapolated some of the comparable data to see what a GTX 285 looks like next to the new cards, remember this is all information from the leaked performance results that hit the net and not my own data.

57344827.jpg

I don't see how this chart can be legit for Crysis...

41 fps on a GTX 285? I think something is smoking...
 
Crysis has always favored NVIDIA GPUs.

I had a GTX 295; the FPS never got that high on max settings, even with 0xAA..

it's probably a Medium setting that needs to be mentioned

PS: Warhead seems to favor nVidia a bit, while the original Crysis favors ATI....
 
I had a GTX 295; the FPS never got that high on max settings, even with 0xAA..

it's probably a Medium setting that needs to be mentioned

PS: Warhead seems to favor nVidia a bit, while the original Crysis favors ATI....
Ah, that's what you meant. Well they're only playing at 1680x1050, and the numbers are probably from the GPU benchmark, not actual in-game play (that would be my guess anyway).
 
Betting on IBM? :D
Yeah, I realized after I posted it that it could mean IBM also. Eh, you never know, IBM is always cooking up stuff in their labs... :p

Think of it like WW2: Havok is the USA and PhysX is Germany. AMD represents "the third guy", Russia, who can tip the outcome of the war.

I said it before, Intel has a vested interest in using AMD to help knock NV off the #1 spot first. Only after that should they go USA vs USSR on each other.
Well, in that case, I'll think of it like US vs. Germany in Day of Defeat: roughly similar weapons/performance between each ;)

In all honesty, I could see Intel one day trying to more closely integrate Havok with its own discrete graphics cards and leaving AMD the "odd man out", but I don't think that would be for quite some time, because first, as you said, Intel wants to have AMD assist them in knocking nvidia down some. Even then, I think that now that Havok is possibly being provided for free, AMD could make their own enhancements to benefit from it. It'll be interesting to see what does ultimately happen from the AMD/Intel Havok partnership. I just think it's kinda stupid for nvidia fans to already declare that PhysX is the next big thing.
 
Yeah, I realized after I posted it that it could mean IBM also. Eh, you never know, IBM is always cooking up stuff in their labs... :p


Well, in that case, I'll think of it like US vs. Germany in Day of Defeat: roughly similar weapons/performance between each ;)

In all honesty, I could see Intel one day trying to more closely integrate Havok with its own discrete graphics cards and leaving AMD the "odd man out", but I don't think that would be for quite some time, because first, as you said, Intel wants to have AMD assist them in knocking nvidia down some. Even then, I think that now that Havok is possibly being provided for free, AMD could make their own enhancements to benefit from it. It'll be interesting to see what does ultimately happen from the AMD/Intel Havok partnership. I just think it's kinda stupid for nvidia fans to already declare that PhysX is the next big thing.

Well, it is:
http://www.hardforum.com/showthread.php?t=1451856

It's also the only physics API currently running on a PPU, a CPU, a Cell SPE, and a GPU...
 
Achievement Unlocked: Ride on the PhysX Fail Train

lol, dude what are you trying to say? the only thing i get from that link is that the physx api is now used more than the havok api by well established developers on all major platforms. if you hate physx that much, i suggest you boycott the middleware by not playing games that use it, including new & upcoming titles like batman aa, mass effect 2, mafia 2, dragon age origins, nfs shift, shattered horizon, dark void, borderlands, aliens colonial marines, etc. if you don't care for gpu physx, that is a different story, but that won't change the fact that the physx middleware itself is used in a lot of games.

i don't see the problem with physx. i only see that some people like to complain about the gpu-accelerated aspect of it (which is still optional, mind you), regardless of the benefits. i also don't see how havok will change anything; it would still more than likely be the same situation. the only legitimate complaint is that there isn't one gpu-accelerated physics solution that works on both green and red hardware (and anything else); if that eventually turns out not to be physx, then great, i'll still take it as long as it works well. however, i imagine if/when that day comes, people will still find something to complain about.
 
lol, dude what are you trying to say? the only thing i get from that link is that the physx api is now used more than the havok api by well established developers on all major platforms. if you hate physx that much, i suggest you boycott the middleware by not playing games that use it, including new & upcoming titles like batman aa, mass effect 2, mafia 2, dragon age origins, nfs shift, shattered horizon, dark void, etc. if you don't care for gpu physx, that is a different story, but that won't change the fact that the physx middleware itself is used in a lot of games.

or are those the only few games that have "physics", with paper flying around, or maybe glass breaking... :p
http://www.havok.com/index.php?mact=News,cntnt01,detail,0&cntnt01articleid=253&cntnt01returnid=82
oh no... the CPU can't even handle glass breaking (mirror's edge effect)

PS: I am not a fan of either one, and yes, I do hate PhysX when it drags the fps down below my acceptable range, even with the GTX 295 that I sold a few days ago, and it is falsely advertised in many ways. Havok, on the other hand, has proven itself to be as powerful as PhysX without a massive fps drop in games, and offers the same or better effects.

-----------------------------
How in the world did a PhysX discussion get dragged into a discussion of "First test of 5870?" and some other ATI card threads... and the same smoker is there and there again.... :p
 