Borderlands 2 PhysX Demo

When am I going to see a game where PhysX is an integral part of the gameplay and not just some ugly-looking afterthought?

Maybe next-gen consoles will be powerful enough to make cross-platform particle physics engines feasible.
 
I remember reading that some people have been able to use their main ATI GPU (in my case, my 6970) for graphics and a separate nVidia GPU for PhysX. Is this easy and reliable to pull off?
 
I remember reading that some people have been able to use their main ATI GPU (in my case, my 6970) for graphics and a separate nVidia GPU for PhysX. Is this easy and reliable to pull off?

Yeah, there was a driver hack for that; it's how I used to play Batman: AA with PhysX on my 5870 (my previous 8800 GTS 512 ran the calculations). Here's a link detailing the process.

Then there was a malware/virus scare in relation to that driver hack (Norton threw up a false positive, I think) and people went apeshit. Not to mention Nvidia doing everything it could to hobble any effort to let AMD and Nvidia cards work together this way.

I don't know the current situation, though. I'm unsure whether it ended with Nvidia caving and letting us run mixed setups to get advanced PhysX with AMD cards.
 
Honestly, they've been showing the same stuff since ~2008: flowy banners, ludicrous amounts of debris, etc. It's even less impressive now.

I was expecting to see some next-level stuff in the video. Instead it's exactly like most other PhysX implementations: 99% of the game is the same, then you run into a cloth or debris situation so the game can be like "OMG LOOK HERE, PHYSX."
 
When am I going to see a game where PhysX is an integral part of the gameplay and not just some ugly-looking afterthought?

Probably won't, because if you think the bitching about PhysX from the AMD side is bad now, imagine what would happen if it were an integral part of the game...
 
Two things.
So, like, why can't my 12-core, 8 GHz CPU do this?

Because of fundamental differences between the design of a general-purpose processor and a GPU. A general-purpose processor sacrifices parallelism for complexity: you only have 12 cores.

A GPU, on the other hand, is significantly less complex per core, so you can cram hundreds or even thousands of cores onto a single chip.

So compare your 12-core CPU with a GPU that has more than a thousand "cores". Physics processing doesn't need all that advanced branch prediction and scheduling, so spending a general-purpose processor's complexity on it is extremely inefficient.
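
To make that concrete, here's a rough sketch (illustrative only, made-up names, plain CUDA) of the same particle-position update written both ways, as a serial CPU loop and as a one-thread-per-particle GPU kernel:

[code]
// Illustrative sketch only: a simple Euler position update, serial vs. parallel.

// CPU version: one core walks every particle in turn
// (or ~12 cores, if you split the loop across threads).
void update_cpu(float3* pos, const float3* vel, int n, float dt) {
    for (int i = 0; i < n; ++i) {
        pos[i].x += vel[i].x * dt;
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }
}

// GPU version: one thread per particle, so thousands of these
// updates run at once across the card's many simple cores.
__global__ void update_gpu(float3* pos, const float3* vel, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        pos[i].x += vel[i].x * dt;
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }
}

// Launch with one thread per particle:
// update_gpu<<<(n + 255) / 256, 256>>>(d_pos, d_vel, n, dt);
[/code]

The math is identical; the difference is purely how many of those per-particle updates the chip can run at the same time.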
 
Proprietary standard is proprietary. I will not get this game, because I don't know what brand my next GPU will be.
 
Fuck, much better (looking and performing) physics engines have been made without the need for PhysX bullshit.

It has, but coders are too lazy to make their own. I mean, the only CPU-generated physics made from scratch for a game was that Ghostbusters remake... the game sucked though... barely 10 hours of play, but the CPU physics are top notch. You can license a physics engine, I forget its name, but Fallout, Oblivion, and Skyrim use it... sorry, the name escapes me...
 
It has, but coders are too lazy to make their own. I mean, the only CPU-generated physics made from scratch for a game was that Ghostbusters remake... the game sucked though... barely 10 hours of play, but the CPU physics are top notch. You can license a physics engine, I forget its name, but Fallout, Oblivion, and Skyrim use it... sorry, the name escapes me...

Havok?
 
You guys do realize that you don't need an nVidia GPU to run this game, right? PhysX is just extra fancy-looking stuff.

And btw, you can actually set PhysX in the driver to run on the CPU, and I believe that's what it will default to if you don't have the appropriate GPU in your system.

What's the big deal? The game will still be great with or without it anyway.
 
You guys do realize that you don't need an nVidia GPU to run this game, right? PhysX is just extra fancy-looking stuff.

And btw, you can actually set PhysX in the driver to run on the CPU, and I believe that's what it will default to if you don't have the appropriate GPU in your system.

What's the big deal? The game will still be great with or without it anyway.

I don't think the game is going to live up to the first, but that's my opinion. My biggest problem is Nvidia trying to snuff out more competition by paying a company. I want the fancy stuff without having to buy a different GPU from one particular company.

Anyway, I hope they don't have that much wub-wub in the game, or if they do, that there's a native way to put your own soundtracks into the game. I honestly don't like 99% of electronica.
 
Most of the people in this thread are idiots. Unlike AMD, Nvidia actually tries to help its customers and wants to gain new ones, so it spends time and money helping developers use its products. This is what competent companies do. Nvidia is one; AMD is not. That simple.

That's not how Linus Torvalds feels about it.
 
I was poised to buy stock in Nvidia the second he made his comments about a driver release, expecting a huge overnight slump. It never happened. I was pissed. :(

He is not all that important in the world of gaming video cards, and investors, as likely as not, do not even know who he is.
 
Because of fundamental differences between the design of a general-purpose processor and a GPU. A general-purpose processor sacrifices parallelism for complexity: you only have 12 cores.

A GPU, on the other hand, is significantly less complex per core, so you can cram hundreds or even thousands of cores onto a single chip.

So compare your 12-core CPU with a GPU that has more than a thousand "cores". Physics processing doesn't need all that advanced branch prediction and scheduling, so spending a general-purpose processor's complexity on it is extremely inefficient.

I get your point about GPUs being more efficient at specialized tasks, but I still stand by the point I was making (though in hindsight I clearly shouldn't have exaggerated my CPU).

There are physics engines that run great on CPUs, everybody's CPUs. There are also physics engines that run great on GPUs, everybody's GPUs.

Nvidia pushing proprietary tech that only works on their hardware is something I don't like and never will. It doesn't matter that they have every right to do everything possible to move their product, and if this strategy works for them, great for them, but nobody should like it, is all I'm saying.

I wouldn't normally bring it up, but there's an advertisement video flaunting it on the front page of one of my regularly viewed websites, and there's a dedicated forum thread right here to discuss it. Proprietary is bad; Nvidia isn't the only one that does it, but they're the only ones doing it in this instance, so they'll get harped on.
 
Does anyone actually have proof that Nvidia is paying them off to use PhysX? I see a lot of accusations but no proof to back them up. If anyone has some real evidence, I'd like to see it.

It has, but coders are too lazy to make their own. I mean, the only CPU-generated physics made from scratch for a game was that Ghostbusters remake...

Trespasser had CPU physics. Stuff bounced, rolled, and fell pretty realistically for a game made in 1998. Of course, that's a few object interactions here and there in a fairly slow-paced single-player environment, not hundreds or even thousands of pieces of debris flying all around and bouncing into each other in a very busy environment with up to four players in action.

I think the GPU offloading helps tremendously, and that's what they're showing in the video. What I see in the videos is more akin to "This is how our game looks, but we can add even more effects like this by offloading this stuff to the GPU that's running PhysX." Now, it may be possible to do physics with Havok or OpenCL, but perhaps PhysX is just easier for the programmers to work with for what they're trying to do. Anyone ever stop to think of this?

Support is a factor as well, and something a lot of armchair critics don't take into account. If they're getting implementation and development support from Nvidia, that's gold to a dev team, and if implementation is easy, that's even better. If someone said to me, "Hey, you can do this stuff, but you have to write it in assembly," I'd not be too keen on doing it. If someone else said, "Hey, I have this toolkit ready to go, you can make this stuff happen and code it all in C, and I'll even give you support along the way if you run into snags," I'd jump on it. Any programmer with a shred of sense would.

Now, that may not seem fair to the AMD crowd, but what has AMD brought to the table to compete with PhysX? If people are mad, they shouldn't be mad at Nvidia for simply offering something AMD is not, nor at the devs for making use of it. Whining that the "other guy" has something you don't is not going to accomplish anything: he's not going to give it up no matter how much you bitch about it, and you're not going to get what you want either. If you don't like devs taking advantage of PhysX, tell AMD to get off their butts and come up with a competitive alternative.
 
NVIDIA does work closely with developers and encourages them to use their PhysX API. NVIDIA has even invested money with game developers in order to "help" them out, and those titles often have PhysX in them.

It seems pretty clear what they are doing: they are trying to get their API to take off. At present, it really hasn't; at least, not the way they want it to. Many games use software PhysX and nothing hardware-accelerated. Mass Effect 3 is a perfect example of that.
 
Proprietary is bad; Nvidia isn't the only one that does it, but they're the only ones doing it in this instance, so they'll get harped on.

In principle I agree that proprietary is bad, but the reality is it's out there and it's being used, and I don't see that changing any time soon. AMD needs to do something to compete with PhysX, even if it means introducing their own proprietary stuff. It is a step backward, true, but some devs are more than willing to code "direct to metal," as it's been said, if it gives them control and speed advantages. Universal APIs can be bottlenecks (DirectX comes to mind), and a hardware manufacturer writing its own API knows exactly how its own stuff works. Some devs would be willing to code to both Nvidia's and AMD's physics stuff if it made their games look great and sell more copies. Not all, but it would open the door to the devs putting pressure on AMD and Nvidia to stop with the proprietary stuff, so they wouldn't have to choose one, the other, or both.
 
NVIDIA does work closely with developers and encourages them to use their PhysX API. NVIDIA has even invested money with game developers in order to "help" them out, and those titles often have PhysX in them.

Thanks for that tidbit, Dan_D. That answers my request for evidence. Any statement from the [H] staff is good enough for me.
 
Thanks for that tidbit, Dan_D. That answers my request for evidence. Any statement from the [H] staff is good enough for me.

There was an article about it a few years back. Many game developers' making-of videos even talk about working with NVIDIA. In fact, people from the PhysX team at NVIDIA are often listed in the credits of PhysX-enabled games; Batman: Arkham Asylum is one of them. In contrast, AMD does have a similar program, but I've not seen nearly the same attention, listings in game credits, etc.
 
Many games use software PhysX and nothing hardware-accelerated. Mass Effect 3 is a perfect example of that.

True, but we don't see the really cool particle/fluid dynamics with software PhysX.
 
To add: if only nVidia would release some sort of official driver to allow a PhysX-only nVidia card to work alongside any other card. They would still get money from people buying cards for PhysX, and AMD users wouldn't have much reason to whine incessantly.
 
To add: if only nVidia would release some sort of official driver to allow a PhysX-only nVidia card to work alongside any other card. They would still get money from people buying cards for PhysX, and AMD users wouldn't have much reason to whine incessantly.

There was talk about NVIDIA releasing cards that were basically GPUs without any monitor connections, for both PhysX and SLI use. But that seemed to die out the second NVSurround showed up.
 
Talk about a useless waste of resources. PhysX is crap. Why some people tend to praise that stupid shit is beyond me.
 
Talk about a useless waste of resources. PhysX is crap. Why some people tend to praise that stupid shit is beyond me.

While that's somewhat subjective, how are better visuals a waste of resources? Do you consider AA, AF, or high-quality texture detail a waste of resources?
 
While that looks cool and all, I don't understand exactly what the point of PhysX is. Why couldn't they just code that in like any other object/texture? How is a moving banner or flowing water any different from a moving character or vehicle?

Sorry, I'm kind of ignorant; I haven't coded many graphical apps. But from the little playing around I've done with OpenGL and such, I just don't understand why these things require a video-card-specific feature and can't be coded as normal scenery objects.
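
From the little I've done, I'd have pictured a flapping banner as something like this (made-up names, just to show what I mean), the same as any other animated prop:

[code]
// Made-up sketch of what I'm picturing: the banner just plays back vertex
// positions an animator baked out in advance, like any other scenery object.
struct Vec3 { float x, y, z; };

void play_baked_banner(Vec3* verts, const Vec3* baked_frames,
                       int n_verts, int frame) {
    // Same authored motion every playthrough; nothing is being simulated.
    for (int i = 0; i < n_verts; ++i)
        verts[i] = baked_frames[frame * n_verts + i];
}
[/code]

So what does the PhysX version actually do differently from playing back motion like that?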
 
It has, but coders are too lazy to make their own. I mean, the only CPU-generated physics made from scratch for a game was that Ghostbusters remake... the game sucked though... barely 10 hours of play, but the CPU physics are top notch. You can license a physics engine, I forget its name, but Fallout, Oblivion, and Skyrim use it... sorry, the name escapes me...

Please tell me you're not talking about Ghostbusters: The Video Game. Here's the wiki (http://en.wikipedia.org/wiki/Ghostbusters:_The_Video_Game). The game is fine and fun; I think it fit the Ghostbusters films nicely. Then again, I'm a huge fan of the movies, the first one in particular. My sister bought me the Ghostbusters DVD set for Christmas. Now, the original Ghostbusters game on the NES blew monkey balls hard.
 
Why no edit button!

Back on topic: While Nvidia has generally had the fastest hardware the last few generations, this video really doesn't show that much of a difference (thankfully) and doesn't show that I made a mistake picking up my factory-overclocked ATI 6850 for $129, versus the $180 to $200 I would have paid for the equivalent Nvidia card. The special effect of the water getting sucked up into that "micro black hole" gun looked pretty nifty, though.
 
It's going to be interesting to see what kind of performance hit there is with PhysX enabled in B2. Hopefully I'll still be able to play the game @ 2560x1600 with a single GTX 680.

I've noticed a lot of folks have expressed frustration toward Gearbox and nVidia for implementing a proprietary system in the game, some claiming it was underhanded and poor business ethics. Personally, I don't think it is. There's something in the business world called a "point of difference": a company offering a product or service that no other company does.

Besides, nVidia even went on record back in 2008 saying, "We are committed to an open PhysX platform that encourages innovation and participation," and added that Nvidia would be "open to talking with any GPU vendor about support for their architecture"... AMD/ATI did not pursue this offer.

Bit-Tech Article
 
PS - It's going to be hilarious playing B2 multiplayer, "Spot the nVidia user": people jumping around, splish-splashing in the water, shooting random walls and (to ATI users) "invisible objects" :D

All the ATI peeps will be like...

 
It's going to be interesting to see what kind of performance hit there is with PhysX enabled in B2. Hopefully I'll still be able to play the game @ 2560x1600 with a single GTX 680.

I've noticed a lot of folks have expressed frustration toward Gearbox and nVidia for implementing a proprietary system in the game, some claiming it was underhanded and poor business ethics. Personally, I don't think it is. There's something in the business world called a "point of difference": a company offering a product or service that no other company does.

Besides, nVidia even went on record back in 2008 saying, "We are committed to an open PhysX platform that encourages innovation and participation," and added that Nvidia would be "open to talking with any GPU vendor about support for their architecture"... AMD/ATI did not pursue this offer.

Bit-Tech Article

Yeah, because AMD/ATI really wants to pay Nvidia, a competitor, a presumably absurd sum of money. And if Nvidia is so committed to an open platform, why isn't there one? :rolleyes:

The whole thing reminds me of a pair of GPU manufacturers a few years back that did something like this, and now one is no longer around. Hmm, what were their names?
 
although that's not really analogous

Developers have to pay to license Havok, whereas developers get assistance from nVidia for PhysX. So clearly PhysX is the better option for developers who don't already have Havok licenses or the money to pay for them.

AMD could license PhysX from nVidia, and it'd be a lot less expensive than it cost nVidia to buy AGEIA and continue developing PhysX. Then all the developers could keep using it without having to pay individual licenses, and AMD customers would be able to run physics on their GPUs... where it belongs.

So it's odd to hear all this bellyaching about proprietary this or that and then hear those same people championing Havok.
 
although that's not really analogous

Developers have to pay to license Havok, whereas developers get assistance from nVidia for PhysX. So clearly PhysX is the better option for developers who don't already have Havok licenses or the money to pay for them.

AMD could license PhysX from nVidia, and it'd be a lot less expensive than it cost nVidia to buy AGEIA and continue developing PhysX. Then all the developers could keep using it without having to pay individual licenses, and AMD customers would be able to run physics on their GPUs... where it belongs.

So it's odd to hear all this bellyaching about proprietary this or that and then hear those same people championing Havok.

I didn't say Havok, did I? The person I quoted said Nvidia was committed to an open engine. They are not. And at least if AMD paid Havok, they wouldn't be giving money to their rival, doubling the loss. Nvidia "helps" develop this so people will buy their cards if they want the extras.
 
While the PhysX engine did add some nice effects to the explosions, I don't see why it's such a big deal. It's not like this is the first TWIMTBP title with PhysX, nor will it be the last.

On a more important note, are those terrible cartoon graphics legit? If so, are they the same as in the first game?

I was really thinking about getting the first one while waiting for the 2nd to drop, given all the gossip and people here eagerly awaiting the release. But those graphics are terrible. They're worse than TF2, IMO.
 