PhysX on nVidia cards won't work if an ATI card is used

The hardware certainly isn't. We never got a good look at what's in the PPU.

Well, some insight was put forward:
http://www.blachford.info/computer/articles/PhysX1.html

If NV wanted to push PhysX they could very well have released a stripped-down version of the GPU driver, similar to how Tesla is run. They could even make Windows not realise that it's a video card, since Windows can't really tell for sure what a device is: as long as there is a suitable driver for it, Windows will identify a device as whatever the driver claims.

Doesn't make much sense if you look at NVIDIA's upcoming architecture...from SP to DP and a lot of focus on GPGPU (which physics is)...one card with all the features, not two cards.

My only speculation now is when they will (again) incorporate a "sound card" on their GPUs...and promote their cards as "the only card needed for gaming".
 
Well, some insight was put forward:
http://www.blachford.info/computer/articles/PhysX1.html

Doesn't make much sense if you look at NVIDIA's upcoming architecture...from SP to DP and a lot of focus on GPGPU (which physics is)...one card with all the features, not two cards.

My only speculation now is when they will (again) incorporate a "sound card" on their GPUs...and promote their cards as "the only card needed for gaming".

They never incorporated one (a sound card); it was a motherboard-integrated product.
 
Well, some insight was put forward:
http://www.blachford.info/computer/articles/PhysX1.html



Doesn't make much sense if you look at NVIDIA's upcoming architecture...from SP to DP and a lot of focus on GPGPU (which physics is)...one card with all the features, not two cards.

My only speculation now is when they will (again) incorporate a "sound card" on their GPUs...and promote their cards as "the only card needed for gaming".

I meant for people who don't want NV as their display, and that includes Intel GMA people. Just a simple CUDA-only driver. The point is that people who want it only have to concern themselves with what 'extra' card to buy, not with what is in their existing system. By locking out ATI users they also denied themselves potential sales from them.

They never incorporated one (a sound card); it was a motherboard-integrated product.
Technically they did, check the link he provided.
 
Technically they did, check the link he provided.

One potential use would be in “physical modelling” synthesisers, these use physics processing to simulate the individual parts of musical instruments and can create strikingly realistic sounds. Physics is also used in engineering and of course scientists could also potentially find the processor useful.
^^ only reference to sound processing.

Yeah... why not use a microphone and record real sounds? (Hypernova, this is NOT against you).
 
I meant for people who don't want NV as their display, and that includes Intel GMA people. Just a simple CUDA-only driver. The point is that people who want it only have to concern themselves with what 'extra' card to buy, not with what is in their existing system. By locking out ATI users they also denied themselves potential sales from them.

I have an Intel GMA965 in the craptop I am writing from...but I see no use for a CUDA PhysX thingy in my craptop..it can't play games for shit.


Technically they did, check the link he provided.

He is just trolling every post I make, don't bother ;)
 
And how does that translate to having PhysX translated to what ATI GPUs understand ?
Are you seriously suggesting that NVIDIA should be doing ATI's job ?? Really ??...

I'm not. I am however questioning Nvidia's "help/support" when Eran Badit at NGOHQ tried to run PhysX on Radeons, as per the original post you answered. Bad support from Nvidia, considering that ATI's SDK is free for download.

It's not ATI's job to port and run Nvidia's "middleware" (more Nvidiaware, though).

LOL, why would NVIDIA need to go to them ? It's open for license. AMD just needs to license it, if they want to. They don't seem to want to. Is that NVIDIA's fault ?

As per the original post you answered (please read that one again, so as not to ask pointless questions like this one), Nvidia only "offered" ATI PhysX through the media. They never offered ATI PhysX directly. Of course ATI doesn't want to license PhysX. Considering the OP, why should they?


The difference with that "analogy" is that you don't lose physics in a PhysX-powered game when you can't enable GPU physics. PhysX defaults to the CPU when the requirements for GPU physics are not met, while with the "CPU disabling" you wouldn't even be able to play the game.

You do lose PhysX effects. If you have an Nvidia card, bought and paid for, and want to use it as a PPU, you can't, since it's been disabled as a PPU as long as the main renderer card isn't from Nvidia.

We can change the analogy a bit if you'd like. What if Intel and AMD processors would simply stop working if they detected that hardware-accelerated physics was enabled (just as Nvidia disables hardware-accelerated physics if it detects another card as the main renderer)?

You can still play the game, right? And, it should be all fine within Nvidia's business ethics... ;)

And PhysX can be used by anyone that licenses the tech. Developers do it, why wouldn't AMD need to do it ? Should they get it entirely free ? Why isn't AMD sharing their OpenCL efforts to have GPU accelerated physics through Havok, with NVIDIA ?
That's right, because 1) NVIDIA doesn't have a Havok license from Intel and 2) why would AMD share tech they've developed, to work with the license they have for Havok, with a competitor ?

PhysX is free, right? AMD is "sharing" their OpenCL efforts on Havok with Nvidia. This is the whole point. Nvidia supports OpenCL and will get support for Havok by default.

The BIG difference here (and the reason why I support OpenCL Havok and not CUDA PhysX) is that the first one is middleware as it should be, with hardware-agnostic support, while the last one is Nvidiaware used to push Nvidia hardware.

CUDA PhysX is a closed standard, even though it can be run on a CPU. Batman showed us this. It doesn't matter for me, since it doesn't offer anything more than what Ageia did (some extra eye candy, but nothing worth buying a card for). The worst part is that the PhysX effects are just disabled, instead of scaled according to hardware support. This means that developers are crapping on gamers just to showcase Nvidia:
http://www.hardforum.com/showpost.php?p=1034631365&postcount=66

I'll take Havok's Hydracore CPU support any day over anything I've seen from PhysX up till now.

Edit: Ah, and Batman's cape too hard to run on a CPU? Check out Havok cloth (and the Roy preview):
http://www.havok.com/index.php?page=showcase

Same goes for the banners in Mirror's Edge. Check out Flags...

Come to think of it, Nvidia PhysX brings less to the table than Ageia PhysX: under Ageia, the CPU physics were better, while the GPU PhysX is no better now under Nvidia. I didn't buy an Ageia PhysX card then, and PhysX won't be a reason if I buy an Nvidia card next gen.
 
Come to think of it, Nvidia PhysX brings less to the table than Ageia PhysX: under Ageia, the CPU physics were better, while the GPU PhysX is no better now under Nvidia. I didn't buy an Ageia PhysX card then, and PhysX won't be a reason if I buy an Nvidia card next gen.

You are clueless....CPU PhysX works just fine, just like when AGEIA was around :rolleyes:
 
You are clueless....CPU PhysX works just fine, just like when AGEIA was around :rolleyes:

When Ageia was around, they tried to optimize it a bit more for the CPU. With Nvidia...:

Because a Core i7 is more than capable enough of handling PhysX by itself (this is what the tweak does). It allows ALL Cores to be used for PhysX (nVIDIA.. in an attempt to try and claim CPUs can't run PhysX... generally relegate it to a single core).
http://www.hardforum.com/showpost.php?p=1034626406&postcount=8

PhysX being a pawn in marketing instead of proper middleware hurts gamers. Can't say I like that.
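For what it's worth, the "all cores vs. one core" claim quoted above is easy to picture: bodies that don't interact can be integrated in parallel chunks, one chunk per worker. A minimal sketch of that partitioning in Python (all names invented; a real engine would use native threads that actually run one per core, not Python's GIL-bound ones):

```python
from concurrent.futures import ThreadPoolExecutor

def integrate_chunk(positions, velocities, dt):
    # Advance one chunk of independent bodies by a single Euler step.
    return [p + v * dt for p, v in zip(positions, velocities)]

def step_all(positions, velocities, dt, workers=4):
    # Partition the bodies into one slice per worker, mirroring how an
    # engine can farm independent work out to every core instead of
    # pinning the whole simulation to a single thread.
    n = len(positions)
    size = (n + workers - 1) // workers
    slices = [(positions[i:i + size], velocities[i:i + size])
              for i in range(0, n, size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(lambda s: integrate_chunk(s[0], s[1], dt), slices)
    out = []
    for chunk in results:
        out.extend(chunk)
    return out

print(step_all([float(i) for i in range(8)], [1.0] * 8, 0.5))
```

The tweak being argued about is exactly this split: if the engine only ever calls the integration step on one thread, the remaining cores sit idle.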
 
Don't post crap that has already been debunked...in that same thread:
http://www.hardforum.com/showpost.php?p=1034639068&postcount=13

Can't see that you've "debunked" anything there. He's claiming that he's running it on more than one core, vs. a single core before the hack. You've proven nothing to the contrary.

In addition, though he claims full PhysX, you claim reduced PhysX; both prove my point that games are not created to run physics optimally on the CPU with Nvidia PhysX. A simple hack improves the physics on the CPU, doesn't it... ;)

Isn't it a bit ironic that you are proving what you call crap?:rolleyes:

Just like Havok, PhysX is scalable. However, when Nvidia uses PhysX as a showcase to sell hardware and actually limits the PhysX effects on CPUs to make GPU PhysX look better, instead of scaling them the way Havok does in Far Cry 2, then PhysX is bad for gamers.
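Scaling instead of disabling, as argued above, doesn't need anything exotic: pick an effect budget from the hardware tier rather than flipping effects off entirely. A hypothetical sketch (all numbers and names are invented for illustration, not anything from an actual engine):

```python
def particle_budget(has_gpu_physics, cpu_cores, full=10000, per_core=500):
    # An all-or-nothing switch would return either `full` or 0.
    # Scaling instead gives slower hardware a smaller, non-zero budget,
    # the way Havok-based titles scale effect density to the CPU.
    if has_gpu_physics:
        return full
    return min(full, per_core * cpu_cores)

print(particle_budget(True, 4))    # GPU path: full effect density
print(particle_budget(False, 4))   # CPU path: reduced, but not disabled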
 
Just not a fan of Nvidia, even though all my cards happen to be Nvidia at this point. The problem is Nvidia intentionally disabling it on an Nvidia card. It is not like we are hacking it to run on ATI. Anyway, with the 5xxx series I think I'll go ATI again.
 
I'm not. I am however questioning Nvidia's "help/support" when Eran Badit at NGOHQ tried to run PhysX on Radeons, as per the original post you answered. Bad support from Nvidia, considering that ATI's SDK is free for download.

It's not ATI's job to port and run Nvidia's "middleware" (more Nvidiaware, though).

Er...it is ? Your arguments are becoming so far-fetched that it's quite boring to even reply to your posts...so I'm just going to cover a couple of points.

It IS ATI's job. PhysX is OWNED by NVIDIA. If they want to use it, they NEED to work with NVIDIA so that PhysX is ported to Brook+/OpenCL. NVIDIA will provide support as per a license agreement, but they are not going to do ATI's job for them. ATI doesn't seem to want to license PhysX, so that's entirely their problem.

Tamlin_WSGF said:
As per the original post you answered (please read that one again, so as not to ask pointless questions like this one), Nvidia only "offered" ATI PhysX through the media. They never offered ATI PhysX directly. Of course ATI doesn't want to license PhysX. Considering the OP, why should they?

They did nothing through the "media". They just answered questions from journalists, where they said that PhysX is open for license and ANYONE can use it. They don't need to offer anything directly to ATI. If ATI wants it, they form a license agreement with NVIDIA and that's it. You seem to have a very dumb idea of how a license agreement is formed. Somehow you think that the owner of the tech NEEDS to go to everyone and ask if they want to use it...I don't think I need to explain (again) how wrong that is ?

Tamlin_WSGF said:
PhysX is free, right? AMD is "sharing" their OpenCL efforts on Havok with Nvidia. This is the whole point. Nvidia supports OpenCL and will get support for Havok by default.

No it's not entirely free, since it is subject to a license, just like Havok.

As for AMD sharing their OpenCL efforts. They are ? That's great :rolleyes:....The problem is...that they are not (I don't know why you try to make something up...that really doesn't help your point).
But even if they were, for it to work, NVIDIA would need to license Havok as well (they need access to the API), so that they can do the proper translation of OpenCL calls to CUDA, much like ATI did with their "now non-existent" GPU physics effects from OpenCL to Brook+.
 
Er...it is ? Your arguments are becoming so far-fetched that it's quite boring to even reply to your posts...so I'm just going to cover a couple of points.

It IS ATI's job. PhysX is OWNED by NVIDIA. If they want to use it, they NEED to work with NVIDIA so that PhysX is ported to Brook+/OpenCL. NVIDIA will provide support as per a license agreement, but they are not going to do ATI's job for them. ATI doesn't seem to want to license PhysX, so that's entirely their problem.

You should only be happy that your posts are answered, and again, that is not done so much for your sake. Regardless of what we discuss, you argue as if the subject doesn't matter, only Nvidia itself as a company. Like some sort of Nvidia's answer to Comical Ali.

PhysX is owned by Nvidia, here we agree. However, nobody has access, and most likely nobody will be given access, to port PhysX to run on APIs other than CUDA. It's not ATI's job, and not even ATI's choice, to port middleware. Where Nvidia wants to go with PhysX is entirely up to them. Calling it ATI's job, now THAT'S really far-fetched... :rolleyes:

They did nothing through the "media". They just answered questions from journalists, where they said that PhysX is open for license and ANYONE can use it. They don't need to offer anything directly to ATI. If ATI wants it, they form a license agreement with NVIDIA and that's it. You seem to have a very dumb idea of how a license agreement is formed. Somehow you think that the owner of the tech NEEDS to go to everyone and ask if they want to use it...I don't think I need to explain (again) how wrong that is ?

I'll not comment on the first part beyond this, since you are basically agreeing that Nvidia didn't offer PhysX to ATI.

It's funny that you are talking about license agreements and others' understanding of them. ATI can't just form a license agreement with Nvidia to port their whole PhysX API as if it were open for full re-engineering. PhysX is accessed through CUDA, and they sell no licenses to ANYONE that reveal the code needed for porting the libraries, nor the rights.



No it's not entirely free, since it is subject to a license, just like Havok.

As for AMD sharing their OpenCL efforts. They are ? That's great :rolleyes:....The problem is...that they are not (I don't know why you try to make something up...that really doesn't help your point).
But even if they were, for it to work, NVIDIA would need to license Havok as well (they need access to the API), so that they can do the proper translation of OpenCL calls to CUDA, much like ATI did with their "now non-existent" GPU physics effects from OpenCL to Brook+.

On some points I agree here. If you need PhysX, you get a lot of commercial baggage with it (when you read the EULA), so I would hardly call it free as long as you need to give something as "payment". However, it's advertised as free:
http://developer.nvidia.com/object/physx_downloads.html

As for ATI sharing their OpenCL efforts?:

This API is simple to use and program to deliver results for game developers. Based on OpenCL it will run on any OpenCL compliant hardware, not just AMD's.
http://www.rage3d.com/articles/vision_eyefinity/index.php?p=6

And Nvidia doesn't need to license Havok in OpenCL. It works by default as long as Nvidia supports OpenCL. There won't be a need for ATI to license PhysX either, if Nvidia were to port it to OpenCL.
 
I find it highly annoying that they have disabled it. I was just about to get myself a 9800GT or GTX260 just for PhysX, alongside my 4870 and possibly, in the near future, a 5870, but obviously I won't bother now.

One thing I did see today that was quite funny was an Nvidia PhysX logo on the back of the Xbox 360 version of Batman: Arkham Asylum. Wonder what the PhysX effects are on the Xbox 360? The X360, from what I remember, has an ATi graphics core; will there be more effects than on a PC with an ATi card?
 
You should only be happy that your posts are answered, and again, that is not done so much for your sake. Regardless of what we discuss, you argue as if the subject doesn't matter, only Nvidia itself as a company. Like some sort of Nvidia's answer to Comical Ali.

PhysX is owned by Nvidia, here we agree. However, nobody has access, and most likely nobody will be given access, to port PhysX to run on APIs other than CUDA. It's not ATI's job, and not even ATI's choice, to port middleware. Where Nvidia wants to go with PhysX is entirely up to them. Calling it ATI's job, now THAT'S really far-fetched... :rolleyes:



I'll not comment on the first part beyond this, since you are basically agreeing that Nvidia didn't offer PhysX to ATI.

It's funny that you are talking about license agreements and others' understanding of them. ATI can't just form a license agreement with Nvidia to port their whole PhysX API as if it were open for full re-engineering. PhysX is accessed through CUDA, and they sell no licenses to ANYONE that reveal the code needed for porting the libraries, nor the rights.





On some points I agree here. If you need PhysX, you get a lot of commercial baggage with it (when you read the EULA), so I would hardly call it free as long as you need to give something as "payment". However, it's advertised as free:
http://developer.nvidia.com/object/physx_downloads.html

As for ATI sharing their OpenCL efforts?:


http://www.rage3d.com/articles/vision_eyefinity/index.php?p=6

And Nvidia doesn't need to license Havok in OpenCL. It works by default as long as Nvidia supports OpenCL. There won't be a need for ATI to license PhysX either, if Nvidia were to port it to OpenCL.

Sure, OpenCL has the power and flexibility to cure all these problems with licenses and lockouts and things of that nature. Unfortunately, OpenCL has been standardized for less than a year. Havok isn't available in OpenCL, and AMD has nothing for OpenCL. Seems kind of ridiculous to frown upon Nvidia for not porting PhysX to OpenCL; it's not like anybody else has done it yet. Sure, AMD has put out demos (one for Bullet and one for Havok), but a demo hardly makes a complete API.
 
I find it highly annoying that they have disabled it. I was just about to get myself a 9800GT or GTX260 just for PhysX, alongside my 4870 and possibly, in the near future, a 5870, but obviously I won't bother now.

One thing I did see today that was quite funny was an Nvidia PhysX logo on the back of the Xbox 360 version of Batman: Arkham Asylum. Wonder what the PhysX effects are on the Xbox 360? The X360, from what I remember, has an ATi graphics core; will there be more effects than on a PC with an ATi card?

The console versions, like the Xbox 360's, use software PhysX, just like the PC version; hence the PhysX logo. It is the optional hardware-accelerated PhysX effects that require a compatible GPU.
 
You should only be happy that your posts are answered, and again, that is not done so much for your sake. Regardless of what we discuss, you argue as if the subject doesn't matter, only Nvidia itself as a company. Like some sort of Nvidia's answer to Comical Ali.

PhysX is owned by Nvidia, here we agree. However, nobody has access, and most likely nobody will be given access, to port PhysX to run on APIs other than CUDA. It's not ATI's job, and not even ATI's choice, to port middleware. Where Nvidia wants to go with PhysX is entirely up to them. Calling it ATI's job, now THAT'S really far-fetched... :rolleyes:



I'll not comment on the first part beyond this, since you are basically agreeing that Nvidia didn't offer PhysX to ATI.

It's funny that you are talking about license agreements and others' understanding of them. ATI can't just form a license agreement with Nvidia to port their whole PhysX API as if it were open for full re-engineering. PhysX is accessed through CUDA, and they sell no licenses to ANYONE that reveal the code needed for porting the libraries, nor the rights.





On some points I agree here. If you need PhysX, you get a lot of commercial baggage with it (when you read the EULA), so I would hardly call it free as long as you need to give something as "payment". However, it's advertised as free:
http://developer.nvidia.com/object/physx_downloads.html

As for ATI sharing their OpenCL efforts?:


http://www.rage3d.com/articles/vision_eyefinity/index.php?p=6

And Nvidia doesn't need to license Havok in OpenCL. It works by default as long as Nvidia supports OpenCL. There won't be a need for ATI to license PhysX either, if Nvidia were to port it to OpenCL.

Perhaps these examples will help you out. Asus did their own porting of EAX to work on their brand of sound cards. Of course it's ATI's freakin' job to port the API to their hardware, just like every other sound manufacturer has to find a way to make EAX work on their sound cards. Jesus, it's not complicated. All the licensor does in a license agreement is give you permission, and perhaps a little tech support. Use your brain for a change. As for why Nvidia blocks Windows, it's basic economics; it's called the free-rider problem, the destroyer of public lands and public works. Seriously, Nvidia would earn an F in basic economics if they had not blocked the practice.
 
PhysX is owned by Nvidia, here we agree. However, nobody has access, and most likely nobody will be given access, to port PhysX to run on APIs other than CUDA. It's not ATI's job, and not even ATI's choice, to port middleware. Where Nvidia wants to go with PhysX is entirely up to them. Calling it ATI's job, now THAT'S really far-fetched... :rolleyes:

Are you actually trying to argue that it's Nvidia's job to port PhysX to run on ATI's hardware??? Seriously??? :rolleyes::rolleyes::rolleyes::rolleyes:

I use and love both vendors' systems. I still have a working 9500-9700 card, my work PC has a 9800 in it, and I have 7800GTs and 8800GTs. Neither company is superior, as each takes their own approach, and whoever is king one day is usually the joker the next. Just get what you think is the best bang at the time, and stop the madness!

Dedicated fanbois can be funny/insane with their drivel!

EDIT - I should add that I completely DISAGREE with what NV did here; this is just bad customer service. After all, if you own the hardware, who cares if it's the primary GPU/rendering device or not..
 
The Velocity physics engine can be seen in action in Ghostbusters: The Videogame, which runs on the Infernal Engine.

No separate card required. Runs great on your CPU.

I would rather buy a game that uses my CPU for physics rather than some company's graphics card to do the same thing. Lower cost with better performance is always a good thing.

Not again :rolleyes:
Simple rigid bodies that disappear after 10 seconds are nowhere near e.g. tearable cloth or interactive fog.

I guess we need a new sticky for all the physics inepts...:rolleyes:
 
Seriously, Nvidia would earn an F in basic economics if they had not blocked the practice.

It's a shame that economics and customer satisfaction don't always go hand in hand...

First of all, this worked without any modification; it still was using Nvidia's hardware. Nothing that the ATi cards were handling took control of PhysX or removed Nvidia's card from the picture.

Now, if it hadn't worked out of the box, then I would have been concerned about ATi or the community wanting support added for free. This wasn't the case, and if Nvidia didn't want it this way they should have had it disabled in such a configuration from the start; they had to have tested a dual-card config of this sort before releasing PhysX drivers. Instead, they proved that it works, and then they took the time and put forth the effort to disable it.

I was a loyal customer (not a fanboy; I was always waiting for ATi to step up their game) of Nvidia since the 6 series. I sold my 5 series card and replaced it with an ATi 9800 PRO. I loved that card until it fried not long after the 6 series came out (it was an error on my part with the heatsink fitting, not ATi's)...

I'm glad ATi said something, and I'm glad the [H] has linked this thread. Why? I don't care if it was started by one of the bigger AMD fanboys on this forum; for once this is a consumer feature being ripped away from them for no good reason at all.

The consumer always loses in the end if they just sit there and take it up the ass, and I am sorry, but this shit is so wrong you know they aren't including lubricant. :p

EDIT: I had a crappy 4550 in my system with a native HDMI port; it was there to run my second monitor and give me LPCM sound to my home theatre receiver. My GeForce 280 was the main card for both my main monitor and for graphics. I couldn't even turn on PhysX with that config, a config where it is so obvious that I am using the Nvidia card as my only gaming video card.

This affects those of us who don't even want to have an ATi card as anything but an extra convenience, non-gaming card. Forget this; I'm not fanboying it up for ATi either. I sit on the fence, but right now Nvidia is seriously down below shaking the fence with all their strength, hoping I fall on their side alone... Well, I've got news for you: if the G300 doesn't completely rape the 5870, or this PhysX shit isn't corrected, then Nvidia has lost a long-term faithful customer.
 
The biggest issue with this isn't even WHY they did it. It's internal politics at its finest. There's no way to change that.
For the people that BOUGHT the card, shouldn't they get to use it for what it claims to be capable of?
So what happens if I have an AMD chipset on my next motherboard with an nVidia graphics card? Will they disable hardware-accelerated graphics because they can't guarantee compatibility? This whole thing seems more than a little silly for the only people REALLY caught in the middle: THEIR CUSTOMERS. It's not like this is a software hack allowing unlicensed software to be run on a competitor's hardware. It's THEIR equipment running THEIR software. Why on EARTH should I have to deal with this BS?

I have owned several nVidia graphics cards, and now I use ATI primarily. I should certainly get to use my hardware for its capabilities, regardless of what other hardware is in my system.
 
This affects those of us who don't even want to have an ATi card as anything but an extra convenience, non-gaming card. Forget this; I'm not fanboying it up for ATi either. I sit on the fence, but right now Nvidia is seriously down below shaking the fence with all their strength, hoping I fall on their side alone... Well, I've got news for you: if the G300 doesn't completely rape the 5870, or this PhysX shit isn't corrected, then Nvidia has lost a long-term faithful customer.

Hear, hear....let's hope the press on this BAD decision by NV causes them to rethink this approach of intentionally disabling a feature for marketing purposes because you mixed an NV product with an ATI product.
 
Nvidia is a bunch of hypocrites. They, just like Intel and AMD, are part of http://www.pcgamingalliance.org/ which basically promotes PC standards, and yet they are making all of their shit proprietary. Congratulations, Nvidia, on your selfish ambitions, while AMD is promoting open standards.
 
I just want to point out this used to function perfectly (AMD card + dedicated NV card for physics). There is no technical reason why this can't work (immediately).

As for why AMD chooses not to support PhysX, I think most of you are missing the point. It's not whether or not AMD can implement PhysX (I doubt NV would block them) or whether or not AMD has the ability to implement PhysX (I have faith their software engineers could accomplish this feat), it's about AMD making sound business decisions. Supporting an API that your direct competitor has complete control over is not a sound business decision. That's not a long term strategy. Nvidia can always make changes/additions to the API that suit their type of hardware better than other types of hardware. There's no benefit for AMD to support PhysX other than to gain a "check mark" next to a feature. It's in AMD's best business interest (and some would argue in everyone's best interest) that a physics middleware (or any middleware) is independent of IHV's control.
 
Wow, yesterday it was the Q & A, today it's this. That's two days in a row that show that, at a young age, Nvidia's top execs were beaten by their parents and had odd photographs taken of them by their uncles.
 
I just want to point out this used to function perfectly (AMD card + dedicated NV card for physics). There is no technical reason why this can't work (immediately).

As for why AMD chooses not to support PhysX, I think most of you are missing the point. It's not whether or not AMD can implement PhysX (I doubt NV would block them) or whether or not AMD has the ability to implement PhysX (I have faith their software engineers could accomplish this feat), it's about AMD making sound business decisions. Supporting an API that your direct competitor has complete control over is not a sound business decision. That's not a long term strategy. Nvidia can always make changes/additions to the API that suit their type of hardware better than other types of hardware. There's no benefit for AMD to support PhysX other than to gain a "check mark" next to a feature. It's in AMD's best business interest (and some would argue in everyone's best interest) that a physics middleware (or any middleware) is independent of IHV's control.


I have always disagreed with this position. I think that if ATI were a separate entity they would have been on the PhysX bandwagon; however, ATI no longer exists.

I agree, however, that it is foolhardy for AMD to support CUDA, because CUDA presents a bigger threat to AMD than Intel does. So it is in the best interest of AMD to ensure the preeminence of CPUs, which CUDA does threaten.

So Intel, Nvidia and AMD are all acting in their corporations' best interest, which is not always aligned with that of the consumer, as in this case.
 
NVIDIA NEEDS TO REVERSE COURSE ON THIS and make PPU-specific cards :p like the old ones. This way, oh hai, you have some other GPU doing the rendering; well, here is a $50-100 add-on board that will give you PhysX.
 
Not again :rolleyes:
Simple rigid bodies that disappear after 10 seconds are nowhere near e.g. tearable cloth or interactive fog.

I guess we need a new sticky for all the physics inepts...:rolleyes:

Simple?

If you know how physics engines calculate things, you will notice your argument is entirely irrelevant.

Tearable cloth? Interactive fog? Or unrealistic rolling-potato water?

Cloth simulation is basically pieces joined together in a chain to simulate cloth, which is the basic principle of how physics engines work. The same goes for fog.

If Velocity can handle loads of colliding objects, they can easily finish the cloth part.
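The "piece by piece in a chain" description above is roughly right: cloth solvers link point masses with springs, a 1-D chain for a rope and a 2-D grid of the same links for cloth. A toy sketch in Python (all constants invented; real engines use fancier integrators and constraints):

```python
def simulate_chain(n=5, steps=50, dt=0.05):
    # Point masses linked to their neighbours by springs: a rope is a
    # 1-D chain of these links; cloth is the same idea on a 2-D grid.
    rest, k, damping = 1.0, 20.0, 0.98
    x = [float(i) * rest for i in range(n)]   # start at rest spacing
    v = [0.0] * n
    x[-1] += 0.5                              # stretch the last link
    for _ in range(steps):
        f = [0.0] * n
        for i in range(n - 1):                # Hooke's law per link
            stretch = (x[i + 1] - x[i]) - rest
            f[i] += k * stretch
            f[i + 1] -= k * stretch
        for i in range(1, n):                 # point 0 stays pinned
            v[i] = (v[i] + f[i] * dt) * damping
            x[i] += v[i] * dt
    return x

print(simulate_chain())
```

Tearing is just a rule on top of this: when `stretch` exceeds some threshold, delete that link. Which is the point being made: an engine that already handles piles of colliding bodies is not far from cloth.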
 
I just want to point out this used to function perfectly (AMD card + dedicated NV card for physics). There is no technical reason why this can't work (immediately).

As for why AMD chooses not to support PhysX, I think most of you are missing the point. It's not whether or not AMD can implement PhysX (I doubt NV would block them) or whether or not AMD has the ability to implement PhysX (I have faith their software engineers could accomplish this feat), it's about AMD making sound business decisions. Supporting an API that your direct competitor has complete control over is not a sound business decision. That's not a long term strategy. Nvidia can always make changes/additions to the API that suit their type of hardware better than other types of hardware. There's no benefit for AMD to support PhysX other than to gain a "check mark" next to a feature. It's in AMD's best business interest (and some would argue in everyone's best interest) that a physics middleware (or any middleware) is independent of IHV's control.

This is mentioned in another post.

ATI Stream is an open standard.
PhysX is not.
ATI would have to use CUDA.
AMD doesn't like that.
 
Sigh. Too bad they're enforcing their second-place status. "Hey, you have to use only our stuff to use PhysX!" Meanwhile, the AMD cards have no such restrictions. I'd like to see an industry-standard API that can be run anywhere, but I don't think PhysX is it, and even if it was I don't think this is the way to push it in that direction.

G300 had better be good or NV is in trouble. My die-hard NVidia user roommate is buying an ATI card this generation.
 
I think this is a piss-poor business decision on Nvidia's part because they are turning away potential customers of PhysX-capable cards. Right now, if someone is looking to build a moderate gaming machine and the games they play support PhysX, they may be willing to spend $50-100 to get an Nvidia card that can handle PhysX. Since ATI's cards currently offer better value for the money in the midrange, they are likely to purchase an ATI card. Nvidia is basically telling that customer that if you want PhysX, you have to buy an inferior product or pay more for a competing product.

Why Nvidia doesn't want the money from the sales of Physx capable cards is beyond me. On its own Physx just isn't there yet and there isn't a killer, must have game (less yet multiple must have games) that supports it. If Physx was that awesome I would understand the logic, but as it is they are just pissing away potential sales and giving themselves loads of unneeded bad publicity.
 

I have to disagree. First, because the cards would most likely be low-margin parts, and once you factor in the opportunity cost, it's a no-brainer for them to do this:
1. They would be held responsible for any driver interactions caused by ATI's driver.
2. ATI would get a free pass on GPU development, since they could cede PhysX acceleration to Nvidia.
3. ATI would never have any reason to support PhysX themselves. If PhysX gets popular enough (a big if, I know), ATI would probably have to support it.

4. Most people would be reusing their old Nvidia card. Why? Because Nvidia has 70% market share, so by far the most likely situation would be someone buying a new ATI card to pair with an old Nvidia card.

So in sum, the losses for Nvidia easily dwarf any material gains from allowing the practice.
It's just good business practice by Nvidia; no company worth its salt allows free riding when it doesn't have to.
 
Some of you may know that Creative not only failed to properly support their older cards in Vista (they claimed those cards couldn't do this or that under Vista), but also threatened to sue a modder who was providing fully functional Vista drivers for those older cards!
There was a very severe backlash from the community, and Creative backed down and let that modder do his thing. Moreover, I just saw that they released Windows 7 drivers for one of those older cards that I have!

nVidia should learn from this and let its customers use the hardware they paid for, instead of intentionally crippling it through drivers.

I'd say that it's a very fine line nVidia is walking!
 
In terms of business practices, it almost makes sense if it's proprietary technology, though it does seem extreme.

But, does it really matter to that many people? No, I'm really asking...

I don't see much of a "fuss" to be made over something such as PhysX. It's barely been utilized, and is a completely extraneous thing. It's always optional whether or not to use it, and it's not necessary for quality physics or particle effects.

It's not been implemented to any degree, or in any way, such that its absence has ever "hindered" or "lessened" any gaming experience.

I've never used it, and never missed it, and I've been running nVidia for years now (though I'm about to switch back to the ATi camp).

Glass still shatters and breaks... walls still smash and crumble... explosions still generate fire, sparks, dust, dirt and debris... so what's the point? PhysX is not required to manage these effects at high-quality level.

Tear-able cloth... interactive smoke... sure, it looks cool, but does it really matter? Two small details that I don't feel really add much, if any, immersion to a game.

Not here to argue, but I just don't see the "importance" of PhysX whatsoever.
 
Proprietary standards suck smelly fart juice ass. I hope PhysX dies quickly and DirectX Compute gets used for physics calculations in future DX11 games. From a business standpoint, it makes sense for Nvidia to do this: they are making you buy their regular mid/high-end card, or SLI plus a lower-end card for PhysX. Why would Nvidia want to only sell you 9800GTs to go along with your new ATI 5870 or 5870 X2? PhysX is overrated anyway as far as software support goes. There are a lot more software-driven physics engines, like Havok. Only three good games make real use of the technology:

Cryostasis (which is more atmosphere then gameplay)
Batman AA
Mirror's Edge

And PhysX doesn't add anything to the gameplay, just atmosphere, unlike software-based solutions such as Havok and other engines like the one in Ghostbusters, etc. Flying paper, broken glass, shattered rocks and tiles, real-time smoke, water particles dripping down and collecting in puddles: all neat, but used strictly for visual presentation, with nothing affecting gameplay.
 
...and this is why the GTX275 will be my last Nvidia purchase... Too many compelling reasons to go ATI these days.
 