Works fine with the latest WHQL drivers (191.07). Batman close to 80 fps average, FTW.
You gotta love when the world changes from "PhysX is useless" to "Hey AMD users, we can have this PhysX too"
*ROFL*
Everyone needs to get on board - http://www.ipetitions.com/petition/physx/
I will be sure this gets to Nvidia, but help us get some support behind this. There is no reason why Nvidia should remove this ability; as a current Nvidia GPU customer, I deserve every feature they sold me the card based on. I know some of you guys hate petitions, but we need as many signatures as possible, so please, just sign it!
> Is PhysX really worth it? I can't find any game that is interesting that uses hardware PhysX. I could be wrong.
> Usually proprietary things like this fall off the earth after a while as well; considering probably half the market has ATI cards, I can't see gaming companies making any more big-name games with PhysX. Am I wrong?

Methinks you are wrong. More titles are due out next year sporting PhysX technology.
> It seems pointless unless you are some nerd who wants the e-penis gratification of a high 3DMark score.

That's ridiculous. If you don't find any value in having a little extra eye candy, then fine, but leave it at that. I personally don't give a shit about 3DMark scores, but I do care about having the best gaming experience possible.
> I'm running a GTS250. They can be had for under $100 if you look. There are a lot of people claiming the 9600GT is good enough as a secondary card.

Sounds about right. Didn't the 9600GT have most of the GPU guts of the 9800 but gimped memory bandwidth?
I know I have PhysX on "high" in Batman and I'm still hitting 60 fps (the max possible) with vsync on.
> I'm using the new ATI/Nvidia PhysX hack, and while it's nice to see the little effects in Batman, they don't really add much... and they're also somewhat sparse.

I think they could have done more with the game. They could have made more of the objects dynamic, added more stuff to bump into, maybe given you an opportunity to throw things (whether in battle or not) and have them react realistically to the environment. Puddles should have reactions, and they don't.
> It's definitely not worth buying any form of add-on card for that game alone. I think only time will tell if hardware PhysX is worth anything.

We're all still waiting for the killer app. The point is, Batman has one of the better, if not the best, implementations THUS FAR. I wouldn't spend the money on a PhysX card just for Batman either (it's definitely not that killer app), but if I already own the hardware to handle the computations, then I want to be able to use it.
I thought this thread was about activating PhysX on your Nvidia card when you run an ATI card in the system as well, and folks reporting their successes with the instructions provided. There are a shitload of other threads for debating the worthiness of PhysX. Neither side will convince the other in that debate; it's worthless, really. Some will find value in it, others won't. There isn't much more to it. Developers have been moving forward with it in quite a few titles, so there must be some value in it.
> You gotta love when the world changes from "PhysX is useless" to "Hey AMD users, we can have this PhysX too"
> *ROFL*
I clicked on this thread knowing you would post a comment somewhere. You didn't disappoint.
So does nVIDIA pay you hourly?
P.S. I've had this patch running since October 6th and it's worked nicely: http://www.ozone3d.net/benchmarks/score.php?id=80ed7494d7edda1bf89760f940cfd585
It just occurred to me when reading this:
linky
Because it seems to me that AFTER Nvidia locked PhysX, it suddenly got more attention...
PhysX was pretty useless until Batman came along.
Then, if this is all fluff, there sure as hell are a lot of people who want it. People complain PhysX is a POS and no games support it, then complain when it's taken away.
Two questions:
What's the cheapest Nvidia card that supports this, and:
Do x16 PCIe cards work in an open-back x1 slot?
If you have CrossFire 5870s and a third PCIe slot, like some motherboards that support 3-way CrossFire/SLI do,
can you go 2 x 5870 + 1 Nvidia card for PhysX?
Or does this workaround/setup not work with CrossFire?
And in that motherboard setup described above, would it potentially work with two 5870 X2s plus an Nvidia video card combined? I think that would be awesome if it did work out successfully.