Hawken PhysX Trailer

Yeah, but then on the other hand PhysX is the perfect example of tech being held back because it's only in nvidia's hot little hands.

How so? If the other company, who suddenly decided (after their own "closed" standards got nowhere) that they're some "open" white knight, won't take it (for free), there's not much you can really do (other than pay them)...

"Competing" against a grand total of nothing (regardless to what people believe OpenCL isn't a physics API, there is no "Open" standard that would be an easy replacement, or one which wouldn't require a boatload of time+money) doesn't really help, but it has improved steadily over the years.
 
Thought AMD had their own physics called Havok?
Did that die out or something?

At one point (before nvidia bought Ageia) both AMD and nvidia were working on hardware-accelerated Havok physics. Then Intel bought Havok, so AMD was left out in the cold. Nvidia had already bought Ageia by then.
 
nVidia has no real interest in which physics engine is better. They are just using theirs, rather poorly I might add, to lock you into a single brand of GPU. It's going to take more than a few overused particle effects to convince me that it's worth buying another graphics card for.

Just another example of how desperate nVidia is to get their middleware used. At least they know their target market: big crazy explosions are a big seller to many folks.
 
A bunch of clutter that will be turned off by anyone that wants to be competitive at the game.

I think it looks cool but also agree that it will get in the way for the highest level of competitive game play.
 
Most of those effects make no sense. Someone should make a ghost game if they just want spirits flowing out of robots.
 
That just looks terrible imo...

I've been considering going back to an Nvidia card at my next upgrade for PhysX, but if that's all it brings to games, I'll keep the cash and get the AMD....
 
Looks like it's a case of "ooooohhh look, we've come up with a nifty particle effect, let's use it everywhere we can! That won't get boring quick!"

Yep. I actually ended up preferring the PhysX-off side in some scenes. Less distracting and easier to see what's actually happening.
 
PhysX needs to die. It's always been complete bullshit. Some stupid effect that "would not be possible without it" ha ha ha. Come the fuck on
 
The new features were vastly overdone to maximize the "ooo.... awww.... shiny" factor, and like someone said, in competitive gameplay you would turn that off.
 
Really wish ATI took this on so it was more than a graphical effect and had actual consequences on gameplay. Not cool, bro, denying yourselves PhysX.

The only reason I can think of for their asshattery is that they want every frame they can get to look good in benchmarks, and/or want to keep a superficial edge by having nvidia lose frames in PhysX-enabled games.
 
open standards > closed standards

Many developers love PhysX because it works well for them. Borderlands 2 is a great example of how it added to gameplay. I turned it off one time and ended up turning it back on because I missed the effects. This trailer actually looks really nice, and the particles, from smoke to debris, really add to the immersion.
 
PhysX needs to die. It's always been complete bullshit. Some stupid effect that "would not be possible without it" ha ha ha. Come the fuck on

Please cite a physics simulation solution that works as well. Why hasn't AMD released a solution or worked with developers?
 
No amount of PhysX is going to stop the mech from looking like a microwave that had a threesome with a TV and a trashcan.
 
A bunch of clutter that will be turned off by anyone that wants to be competitive at the game.
Pretty much my first thought. You sacrifice a lot of visibility just to have some pretty (and unnatural-looking) effects.
If everyone had the same experience, it wouldn't matter, but...they won't.
 
Really wish ATI took this on so it was more than a graphical effect and had actual consequences on gameplay. Not cool, bro, denying yourselves PhysX.

The only reason I can think of for their asshattery is that they want every frame they can get to look good in benchmarks, and/or want to keep a superficial edge by having nvidia lose frames in PhysX-enabled games.

No, it's corporate ego and arrogance. They don't want to kneel and kiss the green ring.

Anyway, I never paid much mind to PhysX until I started playing Borderlands 2. It's extremely fun when it's enabled, especially with Torgue weapons. So all the closet AMD owners here bashing it should watch the gameplay videos for themselves before passing judgment.
 
I get people are pissed at it being locked to nvidia. Even as an nvidia user I am pissed that AMD owners can't enjoy it.
However, I am all for the progression of pretty graphics,
even if it means certain features being developed exclusively with nvidia.
These physics features would not be present in the game without PhysX (unless the dev is Crytek or someone).
I am NOT saying PhysX technology is the only way to do this sort of physics; however, it does provide the funding and incentive for its development.
Which I can get behind (sort of...).
 
If they don't want to take it, then at least both companies should make having both cards installed a breeze to run, driver-wise. It still wouldn't really make it worthwhile for developers to make a game that requires PhysX, though.

Is it so wrong to want a game where I can have the room flooding and blow a hole in the wall to make the water drain, or move some water along a path of pipes to fill a room to get to the top? You can't do that shit on the CPU; it can get away with a bit of fabric and simple things, but not water.

Just try running that nvidia liquid physics demo on the CPU and try to get a decent frame rate if you think it's all silly. All it can be at the moment is flashy after-effects, unless you want to cut out a lot of your audience. It's been modded and has some extra scenes:

http://physxinfo.com/news/4214/modding-of-physx-fluid-demo/
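
To make it concrete why water in particular murders the CPU: fluid particles have to sample their neighbors every step, unlike sparks or debris, which update independently. Here's a rough C++ sketch (not the demo's actual code; the poly6 kernel is just the standard SPH textbook choice) of the density pass that dominates the cost, in its naive O(n^2) form:

#include <cmath>
#include <vector>

struct Particle { float x, y, z, density; };

// Naive O(n^2) SPH density pass: every particle samples every other
// particle within the smoothing radius h. This pairwise inner loop is
// what makes particle fluids so much heavier than independent debris.
void computeDensities(std::vector<Particle>& ps, float h) {
    const float h2 = h * h;
    // Normalization constant for the standard poly6 smoothing kernel.
    const float poly6 = 315.0f / (64.0f * 3.14159265f * std::pow(h, 9.0f));
    for (auto& pi : ps) {
        pi.density = 0.0f;
        for (const auto& pj : ps) {
            const float dx = pi.x - pj.x;
            const float dy = pi.y - pj.y;
            const float dz = pi.z - pj.z;
            const float r2 = dx * dx + dy * dy + dz * dz;
            if (r2 < h2) {
                const float d = h2 - r2;
                pi.density += d * d * d;  // per-particle mass assumed to be 1
            }
        }
        pi.density *= poly6;
    }
}

int main() {
    std::vector<Particle> ps(5000);  // even 5k particles = 25M pair tests per step
    computeDensities(ps, 0.1f);
}

Real engines cut the n^2 down with a spatial grid, but you're still doing roughly 30-40 neighbor evaluations per particle per step, plus pressure and viscosity forces on top. That's exactly the wide, uniform workload GPUs are built for.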
 
interesting, i think it's maybe a step forward. i like the particle effects they used for explosion debris... but the wispy glowing effects? no thanks. reminds me of when valve went overboard with the HDR lighting. that really lasted, didn't it.
 
I've got a 6950 right now and after seeing Borderlands 2 with full PhysX, fuck ATI. The difference in price between an ATI card and an Nvidia card isn't that much so this is that tiny difference that pushes me to one brand over another.

As far as the "PhysX is going to fail because it's closed" arguments go... if Nvidia is right now essentially paying devs to learn their system, and they all become familiar with it... won't it inevitably become the industry standard?

What if they made it free and made it work on ATI cards, and just made the splash screen say "Nvidia PhysX" every single time you turn on a game? Isn't that free advertising? What happens when everyone gets used to PhysX and then Nvidia says "PhysX 2.0, coming out for Nvidia cards only"? They can still have the old PhysX, but the new shit looks better. Or maybe it won't even be backwards compatible; maybe it's either PhysX 2.0 or nothing. How many people would switch at that point?

Nvidia paid almost $30m for Ageia; why should they share that for free unless it's in their best interest? Obviously we can argue about what is in their best interest, but for now, PhysX is definitely causing at least a few people to pick Nvidia over ATI.
 
If they don't want to take it, then at least both companies should make having both cards installed a breeze to run, driver-wise. It still wouldn't really make it worthwhile for developers to make a game that requires PhysX, though.

Is it so wrong to want a game where I can have the room flooding and blow a hole in the wall to make the water drain, or move some water along a path of pipes to fill a room to get to the top? You can't do that shit on the CPU; it can get away with a bit of fabric and simple things, but not water.

Just try running that nvidia liquid physics demo on the CPU and try to get a decent frame rate if you think it's all silly. All it can be at the moment is flashy after-effects, unless you want to cut out a lot of your audience. It's been modded and has some extra scenes:

http://physxinfo.com/news/4214/modding-of-physx-fluid-demo/

I don't know about this one.

It has been shown time and time again that PhysX for the CPU uses x87 instructions or something... basically, Nvidia has crippled PhysX on the CPU and restricted it to ONE core. A processor can NOT do multi-core/threaded PhysX. I think a lot of the PhysX we've seen, even good stuff like in Borderlands 2, could easily be done on a CPU.

You can't tell me that my quad-core 4.5GHz 2500K (EIGHTEEN GIGAHERTZ TOTAL) couldn't handle particles... even thousands of them, which is all water in PhysX is... just a bunch of particles with shaders to blend them together so they look solid, with different physical properties. No different really from any other PhysX-based effect, except for maybe the number of particles.

So while I am an Nvidia owner, and honestly hold off on AMD because of PhysX, I am not so blind as to say PhysX is something that NEEDS a GPU to do what it does. At least with nothing I've seen so far. Yes, PhysX is a bit of a gimmick, and yes, it doesn't really add anything gameplay-wise... but MAN does it add to the atmosphere and realism of a game. I mean, look at Batman: Arkham City... MUCH better with PhysX, even if the game is still technically the same. Cloth, paper, fog and smoke that moves around, etc... just simply gorgeous!
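
To back up the "a modern quad core could handle thousands of particles" point: debris and spark particles don't interact with each other, so the update is embarrassingly parallel. A minimal sketch (plain C++ threads; this is NOT the actual PhysX code, just the shape of the workload) of spreading independent particles across cores:

#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

struct Particle { float px, py, pz, vx, vy, vz; };

// Independent particles (sparks, debris): no particle reads another's
// state, so the work splits cleanly into one contiguous chunk per thread.
void step(std::vector<Particle>& ps, float dt, unsigned nThreads) {
    std::vector<std::thread> workers;
    const std::size_t chunk = (ps.size() + nThreads - 1) / nThreads;
    for (unsigned t = 0; t < nThreads; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = std::min(ps.size(), begin + chunk);
        workers.emplace_back([&ps, begin, end, dt] {
            for (std::size_t i = begin; i < end; ++i) {
                ps[i].vy -= 9.81f * dt;     // gravity
                ps[i].px += ps[i].vx * dt;  // integrate position
                ps[i].py += ps[i].vy * dt;
                ps[i].pz += ps[i].vz * dt;
            }
        });
    }
    for (auto& w : workers) w.join();
}

int main() {
    std::vector<Particle> ps(100000);   // 100k particles is trivial at this level
    for (int frame = 0; frame < 60; ++frame)
        step(ps, 1.0f / 60.0f, 4);      // e.g. a quad core
}

Of course, the moment the particles have to collide with the world or with each other (water, cloth), the cheap version of this argument stops working, which is where the GPU case comes in.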
 
Honestly, I'm not liking the added FX; I prefer the game without them. It just wasn't convincing; it looked more like someone stuck a screensaver in the game.
 
Honestly, I'm not liking the added FX; I prefer the game without them. It just wasn't convincing; it looked more like someone stuck a screensaver in the game.

I'm looking forward to it, if only to see it in action... I will agree that, other than wanting to play with it for a bit, it is WAY too much and doesn't make sense... especially when they blow up. The health orbs, power-ups, and shield look good and make much more sense.
 
I don't know about this one.

It has been shown time and time again that PhysX for the CPU uses x87 instructions or something... basically, Nvidia has crippled PhysX on the CPU and restricted it to ONE core. A processor can NOT do multi-core/threaded PhysX.

That's no longer true.

As of PhysX 3.0, CPU PhysX is multicore-enabled. Not sure about SSE support, though.
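
For reference, this is roughly what the multicore path looks like in the PhysX 3.x SDK: the scene gets a CPU dispatcher with a worker-thread count, and the solver tasks spread across those threads. A sketch based on the public 3.x docs (exact names and defaults vary between 3.x point releases, so treat this as illustrative, not gospel):

#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity      = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.filterShader = PxDefaultSimulationFilterShader;
    // The CPU dispatcher is where the multicore support lives: you hand
    // the scene a pool of worker threads for the simulation to use.
    PxDefaultCpuDispatcher* dispatcher = PxDefaultCpuDispatcherCreate(4);
    sceneDesc.cpuDispatcher = dispatcher;

    PxScene* scene = physics->createScene(sceneDesc);
    scene->simulate(1.0f / 60.0f);  // kicks off tasks on the worker threads
    scene->fetchResults(true);      // block until the step completes

    scene->release();
    dispatcher->release();
    physics->release();
    foundation->release();
    return 0;
}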
 
"wouldn't be possible without nvidia's phys-EX"

Bullshit. Devs could have easily coded this to run on the CPU, or on any GPU with open standards.

How much does nvidia pay these developers?


disclosure: I own $20k in nvidia stock :p
 
Nvidia paid almost $30m for Ageia; why should they share that for free unless it's in their best interest? Obviously we can argue about what is in their best interest, but for now, PhysX is definitely causing at least a few people to pick Nvidia over ATI.

It's called consumer choice, and Nvidia is not giving us that.

I have an AMD Radeon card. My choice. But I'd also like to see physics-based effects in games that don't require me to spend money on a new Nvidia card or use hacked drivers.

What's so hard about game developers switching to an open platform that works on both Nvidia and AMD video cards, or even on the CPU?

Honestly, it shouldn't be hard at all. Why isn't it happening then?

It's fucking stupid in this industry. Not only are PC gamers getting shafted by console game developers, but also by companies like Nvidia that like to lock down their products instead of opening them up.

This is why an open platform is better for the consumer in the end. It gives us a choice of what video card we want to spend our OWN money on, regardless of the software that is going to be using it. And, yes, OpenCL is capable of powering real-time physics engines such as Bullet: http://www.geek.com/articles/games/...cl-for-open-real-time-physics-system-2009102/

http://bulletphysics.org/wordpress/

Why aren't developers using it? Who knows? It's honestly stupid to go the proprietary route.
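
For what it's worth, Bullet's core is plain CPU code and runs on anything; the OpenCL work linked above is about moving that onto whichever GPU you happen to own. As a sketch of how small the setup is (standard Bullet classes per its public API; a 1 kg sphere in free fall, stepped for one second):

#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main() {
    // Standard Bullet setup: broadphase, collision config, dispatcher, solver.
    btDbvtBroadphase broadphase;
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    // A dynamic 1 kg sphere dropped from 10 m.
    btSphereShape sphere(0.5f);
    btVector3 inertia;
    sphere.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState motion(
        btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
    btRigidBody body(
        btRigidBody::btRigidBodyConstructionInfo(1.0f, &motion, &sphere, inertia));
    world.addRigidBody(&body);

    for (int i = 0; i < 60; ++i)
        world.stepSimulation(1.0f / 60.0f);

    std::printf("height after 1 s: %f\n",
                body.getWorldTransform().getOrigin().getY());
    world.removeRigidBody(&body);
    return 0;
}

Nothing in there cares what video card is installed, which is exactly the point.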
 
I get people are pissed at it being locked to nvidia. Even as an nvidia user I am pissed that AMD owners can't enjoy it.
However, I am all for the progression of pretty graphics,
even if it means certain features being developed exclusively with nvidia.
These physics features would not be present in the game without PhysX (unless the dev is Crytek or someone).
I am NOT saying PhysX technology is the only way to do this sort of physics; however, it does provide the funding and incentive for its development.
Which I can get behind (sort of...).

Except when it's exclusive, it's never going to be more than a gimmicky feature trying to sell nvidia cards. In all the years PhysX has been around, we've only had a few games that use it well, and none that I'm aware of that use it for actual gameplay elements. Then, with PhysX off, they have practically no effects at all, so that they can do these "PhysX on vs PhysX off" promotional videos like we see here, even though the "PhysX off" option has fewer effects than other non-PhysX games.

Whether it be AMD's fault or nvidia's fault (honestly, I haven't been following closely enough; I thought nvidia was to blame, but perhaps it's AMD), PhysX as an exclusive system is NOT good for gaming because it can never be used as more than a gimmick. In the more general sense, GPU-driven physics COULD be good for gaming; PhysX is not.
 
I've got a 6950 right now and after seeing Borderlands 2 with full PhysX, fuck ATI. The difference in price between an ATI card and an Nvidia card isn't that much so this is that tiny difference that pushes me to one brand over another.

As far as the "PhysX is going to fail because it's closed" arguments go... if Nvidia is right now essentially paying devs to learn their system, and they all become familiar with it... won't it inevitably become the industry standard?

What if they made it free and made it work on ATI cards, and just made the splash screen say "Nvidia PhysX" every single time you turn on a game? Isn't that free advertising? What happens when everyone gets used to PhysX and then Nvidia says "PhysX 2.0, coming out for Nvidia cards only"? They can still have the old PhysX, but the new shit looks better. Or maybe it won't even be backwards compatible; maybe it's either PhysX 2.0 or nothing. How many people would switch at that point?

Nvidia paid almost $30m for Ageia; why should they share that for free unless it's in their best interest? Obviously we can argue about what is in their best interest, but for now, PhysX is definitely causing at least a few people to pick Nvidia over ATI.

I run an ATI HD 5970 on one of my rigs and a GTX 670 on the other, and both are able to play BL2 with full PhysX. It's just that the 670 does it in hardware while the 5970 is forced to do it in software, but the effects are the same.
 
The real reason is that NVIDIA's GPU PhysX engine is written entirely in CUDA, and AMD would have to license CUDA from NVIDIA to run it. Now, I have used both Team Green (a 480) and Team Red (a 5970 and 7970M x2). I was really unsure about the last one until the 12.11 and 13.1 drivers came out and pushed the cards past the 680s.

While the PhysX support is neat, there is a problem: many companies are supporting OpenCL, such as Adobe and the modelling software vendors. It has been shown that current 7970 cards have nearly 8-10x the OpenCL performance of their NVIDIA counterparts. If there were a PhysX-style standard that used OpenCL, it would be great. I think you will see AMD physics sooner rather than later for one big reason, well, two big reasons... the Xbox 720 and PlayStation 4. There has to be a really good reason for both to choose AMD. Cost could be one, but again, when it comes to cost, the 7970M is much cheaper and faster than the much more expensive 680M. So if AMD can use that awesome GCN architecture, I'm sure that a great physics library is in the works... some are... here are some basic ones I have found...

Now, let's think... how could NVIDIA use this to their advantage? License PhysX? No... since they seem to have plenty of performance on most of their cards, why couldn't they release a generic CUDA card that just does CUDA-based Ageia PhysX, in a 1x, 4x, 8x or 16x PCI-E slot? If they did this, they could get an NVIDIA card into every system, and that could help them in the long run: just by having the name in the computer, they'd open up a large market to their product and have another route to sell upgrades to cards that need upgrading. It seems like maybe they missed a ship... a ship lined in gold that was already made and ready to sail. Just seems like a very, very poor choice to forgo that. It could have been AMD graphics with an NVIDIA physics card in the same box, and NVIDIA's name in every system. It could have been a few hundred million dollars... I would have jumped to oversee that division!

http://www.youtube.com/watch?v=bR15Xnt_I5E

http://www.amd.com/us/press-releases/Pages/amd-plug-in-maya-gd-conf-2011mar02.aspx

http://www.xbitlabs.com/news/cpu/di...ion_of_GPU_Physics_into_Games_Using_Maya.html

http://www.bit-tech.net/hardware/graphics/2011/02/17/amd-manju-hegde-gaming-physics/1
 
All the people supporting an "OpenCL whatever" don't seem to realize, still, that OpenCL isn't a replacement for PhysX; it would be a replacement for CUDA.

You cannot just drop "OpenCL" into a game and make it do stuff; you'd need to write a whole new physics API, which would take lots and lots of time and money, and would likely end up worse than the free alternative.

OpenCL, at the moment, is a pipe dream; it's only mildly promoted because AMD's own "proprietary" software failed. It's the same reason why Microsoft won't support Blu-ray: because they supported HD DVD, which fell on its face. It's time for both of them to just stop being bitches and accept that they "lost". :p
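
To show what that actually means in practice, this is the level OpenCL operates at. Below is a sketch in C-style OpenCL host code (error handling omitted for brevity): the "physics" is one hand-written kernel, and all the hard parts of a physics engine (collision detection, broadphase, constraint solving) don't exist until someone writes them:

#include <CL/cl.h>
#include <cstdio>

// This kernel is ALL that OpenCL hands you: a raw data-parallel function.
// A physics *engine* would be tens of thousands of lines on top of this.
static const char* src =
    "__kernel void integrate(__global float* y, __global float* vy, float dt) {\n"
    "    size_t i = get_global_id(0);\n"
    "    vy[i] -= 9.81f * dt;\n"
    "    y[i] += vy[i] * dt;\n"
    "}\n";

int main() {
    enum { N = 4096 };
    static float y[N], vy[N];
    cl_int err;

    cl_platform_id plat; clGetPlatformIDs(1, &plat, NULL);
    cl_device_id dev;    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);
    cl_context ctx     = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);
    cl_program prog    = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "integrate", &err);

    cl_mem by  = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof y, y, &err);
    cl_mem bvy = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                sizeof vy, vy, &err);
    float dt = 1.0f / 60.0f;
    clSetKernelArg(k, 0, sizeof by,  &by);
    clSetKernelArg(k, 1, sizeof bvy, &bvy);
    clSetKernelArg(k, 2, sizeof dt,  &dt);

    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, by, CL_TRUE, 0, sizeof y, y, 0, NULL, NULL);
    std::printf("y[0] after one step: %f\n", y[0]);
    return 0;
}

It runs on nvidia, AMD, or even the CPU, but it's a compute API, not a physics one, which is why "just use OpenCL" isn't the drop-in answer people treat it as.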
 
I could see it being useful on the shield, if it were toned down slightly and modified more to show where damage was coming from.
 
Why aren't developers using it? Who knows? It's honestly stupid to go the proprietary route.

I'm guessing that it takes man-hours to put in all these special effects, and Nvidia is willing to help pay part of that cost, because when consumers see cool effects vs. no effects, some of those consumers will pay for the cool effects.
 
Please cite a physics simulation solution that works as well. Why hasn't AMD released a solution or worked with developers?

Havok. It is done entirely in software, with great results and no vendor lock-in.

PhysX may be better overall, but so far every single tech demo or actual game implementation is filled with pointless effects that suck GPU performance for shiny lights and other cheap tricks such as waving flags, volumetric smoke, etc.
 