NVIDIA Shows PhysX Comparison

My problem is this:
Mirror's Edge is THE game I have been looking forward to. I like the difference. It may mean I will be getting a PhysX card... :-(
 
It seems that they added banners and cloth things, for the sake of adding banners and cloth things. I can't say that I am impressed.
 
PhysX is great, and I have an old NVIDIA card here I can use for it too. Now if only games actually used it...
 
Considering PhysX, how does, say, an 8800GT card compare to one or two CPU cores on a Core2Quad CPU at 3-ish GHz?

I guess my point is, since most games today can barely use two cores, let alone four, wouldn't physics be better left to the CPU, considering the quickly increasing number of CPU cores that are left idle in games?
 
Considering PhysX, how does, say, an 8800GT card compare to one or two CPU cores on a Core2Quad CPU at 3-ish GHz?

I guess my point is, since most games today can barely use two cores, let alone four, wouldn't physics be better left to the CPU, considering the quickly increasing number of CPU cores that are left idle in games?

Actually, no. GPUs are much better suited to handling a lot of small parallel calculations, while CPUs are not.
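To put something concrete behind that: a GPU runs one lightweight thread per physics object, so a per-particle update that would be a serial loop on one CPU core becomes thousands of threads executing at once. A minimal CUDA sketch of the pattern (hypothetical names, an illustration only, not PhysX's actual code):

#include <cuda_runtime.h>

struct Particle { float3 pos; float3 vel; };

// One thread per particle: each does a tiny, independent calculation,
// which is exactly the kind of workload a GPU is built for.
__global__ void integrate(Particle* p, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    p[i].vel.y -= 9.81f * dt;          // gravity
    p[i].pos.x += p[i].vel.x * dt;
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;
}

// Host side: cover, say, 100k particles per frame in a single launch:
// integrate<<<(n + 255) / 256, 256>>>(d_particles, n, dt);

A CPU walks those 100k particles a few at a time per core; the GPU spreads them across hundreds of stream processors in one sweep.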
 
Actually, they have agreed to work together on this.

Not really. AMD partnered with Intel to use Havok, which is CPU-only thus far. There's nothing about Intel letting AMD port Havok to Brook.
 
One has to separate the potential of how PhysX can be used from the quality of Mirror's Edge.

From what I've seen, the game has (for this day and age) poor graphics quality and a very bizarre perspective from the player's viewpoint. The arm movement, in particular, looks very primitive and unnatural.

Point being: if you have a game that does not look well made and polished in its own right, adding the PhysX eye candy will not improve it.

If, on the other hand, you take the PhysX capabilities and apply them to whatever game you currently favor, you will definitely have a more immersive experience.
 
Not really. AMD partnered with Intel to use Havok, which is CPU-only thus far. There's nothing about Intel letting AMD port Havok to Brook.

Not sure, but I can't see Intel not porting it to theirs, so I would imagine that ATI could do the same. But I think it's almost a moot point; with quad-cores or better becoming the standard, I think putting it on the graphics card may become unnecessary. Havok is working pretty well as it is.
 
Hmmm. The one on the right seems to slow down a lot more when the physics objects are interacted with, e.g. shooting the flag banners.

Overall, the physics didn't blow me away... nowhere near.

:eek: :p

That's just to highlight the differences.
You've never seen a slo-mo video before? :confused:

Anyhow, I thought it looked great, and it wasn't too overdone.
Kudos NV.
 
Basically we have a chicken-and-egg effect going on with games that require hardware physics acceleration that affects the gameplay. Until the physics acceleration is more widespread, it doesn't make sense for developers to use it as a requirement for the game if a large number of the user base does not have it yet. For that reason, making a game that solely relies on the acceleration is probably not a good investment at this time.

If you compare the two videos, it is easy to see that the PhysX video has more eye candy and looks better overall.

I fail to understand how anyone can watch the two videos side-by-side and say "Yeah, I don't need those effects.", "It's just eye candy", or "not impressed". Yet we can see video card reviews and forum "discussions" where sometimes the only difference between two cards is a very slight image quality difference that is only noticeable if zoomed in excessively... However, no one usually has any problem declaring the card that can turn on more options and keep a good frame rate the winner.

I like the looks of a game that has nice PhysX effects. Here we have two side-by-side shots where one has something the other does not, yet people still complain that the additional items could be better or are worthless. I remember these same arguments with 3D graphics, 3D audio, dual-core CPUs, etc. To me it's as simple as asking "Does it look better?" and then answering "Yes".

It seems to me that those without the hardware are just rationalizing why they don't need it.
 
Why should they have? It's a proprietary technology from a competitor. It made more sense to go with Intel and Havok than to support a stillborn product by a competing company.

All I am saying is people shouldn't blame Nvidia for ATI not supporting PhysX when Nvidia tried to get PhysX onto ATI cards.
 
The real problem with this one is that Mirror's Edge wasn't built around enough physics dependence for PhysX to truly shine. If Mirror's Edge with PhysX could have, somehow, approximated what the gravity gun did for Half-Life 2, then there would be more reason for excitement.

Basically, the point about PhysX in games is the same as with multi-touch: right now, it sucks. Period. But if the concept were fleshed out, with games and applications built with this technology in mind from the ground up, you would see something truly awesome happening. I'm still waiting for a game to let me chip away at a wall and make my own sniper hole, or a sandbox-type game where I scope out a location where a dignitary is going to be, decide how soon before the event I want to get there and, interacting with the environment, security, the crowds, and various materials, set up a Rube Goldberg type of device to assassinate him (preferably Bush :p) - that would pwn.
 
The real problem with this one is that Mirror's Edge wasn't built around enough physics dependence for PhysX to truly shine. If Mirror's Edge with PhysX could have, somehow, approximated what the gravity gun did for Half-Life 2, then there would be more reason for excitement.

Basically, the point about PhysX in games is the same as with multi-touch: right now, it sucks. Period. But if the concept were fleshed out, with games and applications built with this technology in mind from the ground up, you would see something truly awesome happening. I'm still waiting for a game to let me chip away at a wall and make my own sniper hole, or a sandbox-type game where I scope out a location where a dignitary is going to be, decide how soon before the event I want to get there and, interacting with the environment, security, the crowds, and various materials, set up a Rube Goldberg type of device to assassinate him (preferably Bush :p) - that would pwn.

Hello Red Faction 2 ;) Hehe, well, you could make holes in the walls that went on very, very long. Ah, good times making huge cave systems.
 
You guys crack me up, especially Kyle. How can you not like what you can CLEARLY see on the right with it on? Everything looked, moved, and reacted way more realistically than on the left. This technology is not about eye candy in the traditional sense. It is about things behaving and reacting in a more realistic fashion. At least this is what it means to me. It is the one thing that needs to be added to video games to make them more realistic!
 
The video isn't showing PhysX vs software physics at all. The physics objects on the right are simply absent on the left. The entire presentation is artificial and meaningless. "Hey look, our physics is better than no physics at all". Yeah, no shit.
 
Basically we have a chicken-and-egg effect going on with games that require hardware physics acceleration that affects the gameplay. Until the physics acceleration is more widespread, it doesn't make sense for developers to use it as a requirement for the game if a large number of the user base does not have it yet. For that reason, making a game that solely relies on the acceleration is probably not a good investment at this time.

If you compare the two videos, it is easy to see that the PhysX video has more eye candy and looks better overall.

I fail to understand how anyone can watch the two videos side-by-side and say "Yeah, I don't need those effects.", "It's just eye candy", or "not impressed". Yet we can see video card reviews and forum "discussions" where sometimes the only difference between two cards is a very slight image quality difference that is only noticeable if zoomed in excessively... However, no one usually has any problem declaring the card that can turn on more options and keep a good frame rate the winner.

I like the looks of a game that has nice PhysX effects. Here we have two side-by-side shots where one has something the other does not, yet people still complain that the additional items could be better or are worthless. I remember these same arguments with 3D graphics, 3D audio, dual-core CPUs, etc. To me it's as simple as asking "Does it look better?" and then answering "Yes".

It seems to me that those without the hardware are just rationalizing why they don't need it.

Agree with you here 100%. Thing is, PhysX got bad press from the get-go. Nobody really wanted it to succeed, at least not from what I could tell from reading forums, though it makes absolutely no sense why not. The days of just looking at frame rates are over, or they should be by now.
 
Aside from the wash from the helicopter, really, really gimmicky.
 
I like the look of the PhysX, but all the mainstream people with midrange cards will either be running PhysX on the CPU or turning it off to get better FPS or higher shader settings. For top-of-the-line systems it should add a nice bit of immersion down the road.

Obviously NVIDIA can see that if PhysX is done right it could drive sales of SLI setups in the future, but the only way for that to happen is for PhysX to be widely supported. And that's exactly why NVIDIA is opening PhysX up now that they've gotten ahead with their CUDA implementation.
 
The video isn't showing PhysX vs software physics at all. The physics objects on the right are simply absent on the left. The entire presentation is artificial and meaningless. "Hey look, our physics is better than no physics at all". Yeah, no shit.

I think the point of the video is to show the visual difference between enabling hardware PhysX on a compatible NVIDIA GPU and disabling it on the same GPU. Obviously, the experience is visually enhanced on the right with effect physics. Otherwise, they could have shown something akin to CellFactor, where running PhysX off the CPU would result in a slideshow, versus having it run smoothly on a PPU or PhysX-capable GPU. Improved gameplay physics will come in due time, with dev support continuing to grow and NVIDIA aggressively marketing PhysX. Just like with the advent of 3D accelerator cards over a decade ago, it all has to start somewhere.
 
Not sure, but I can't see Intel not porting it to theirs, so I would imagine that ATI could do the same. But I think it's almost a moot point; with quad-cores or better becoming the standard, I think putting it on the graphics card may become unnecessary. Havok is working pretty well as it is.

Intel owns Havok and they'll continue to run it on their CPUs. AMD will only be able to use it on their CPUs as well. Nothing has been done to port it to AMD's GPUs and I doubt it ever will be, since Larrabee will be a set of x86 cores working in parallel, so Intel gains nothing by letting AMD do it on their GPUs.

It's irrelevant how many cores there are on a CPU when games don't really take advantage of them. Graphics cards, however, are perfect for this sort of calculation, and having a low-end graphics card dedicated to physics is a great idea, because even that low-end graphics card can outperform the most powerful quad-core CPU, which costs five times more, in these types of calculations.
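Rough, back-of-envelope numbers to illustrate (my own estimates, not from the video): an 8800GT has 112 stream processors at a ~1.5 GHz shader clock, which works out to roughly 300-500 GFLOPS of theoretical single-precision throughput depending on how you count the multiply-add units. A 3 GHz Core 2 Quad peaks at about 96 GFLOPS (4 cores x 8 single-precision FLOPs per cycle with SSE), and a game would only spare one or two of those cores for physics anyway. Neither chip hits its peak in practice, but for thousands of independent particle and cloth calculations the GPU's advantage is several-fold before the CPU even starts paying for cache misses and branches.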
 
I love PhysX and think it will be used by many developers in the future... :)

Keep up the good work, NVIDIA!!
 
Why should they have? It's a proprietary technology from a competitor. It made more sense to go with Intel and Havok than to support a stillborn product by a competing company.

I don't think you understand that Intel is also an AMD competitor and that Havok is also proprietary tech (now owned by Intel)?

Not to mention that AMD chose the worse of the two evils to partner with, since Intel is a much larger "threat" than NVIDIA. NVIDIA only competes with AMD in the GPU market. Intel competes in AMD's largest market (CPUs) and will be competing in the GPU one as well.
 
Seems like they artificially slowed down the videos to match each other, so as far as performance goes it's impossible to tell which is better. I did watch the video on the right standalone before, and isn't that kind of physics work already available without PhysX? And without slowdowns, of course. It just doesn't seem like something you're not allowed to have simply because you don't have the NVIDIA brand.
 
I don't think you understand that Intel is also an AMD competitor and that Havok is also proprietary tech (now owned by Intel)?

Not to mention that AMD chose the worse of the two evils to partner with, since Intel is a much larger "threat" than NVIDIA. NVIDIA only competes with AMD in the GPU market. Intel competes in AMD's largest market (CPUs) and will be competing in the GPU one as well.

I understand perfectly. That doesn't mean that it does not sometimes make good business sense to work with your competitor. I need to check, but I don't think Havok is just Intel's.
 
I understand perfectly. That doesn't mean that it does not sometimes make good business sense to work with your competitor. I need to check, but I don't think Havok is just Intel's.

Doing good business with the competitor is exactly why AMD should've partnered with NVIDIA instead, since PhysX runs on both CPUs and GPUs and AMD has both CPUs and GPUs. Havok is CPU-only. With Larrabee in Intel's future, AMD has no chance of getting Havok support for their GPUs, unless Intel grows a "heart".

As far as Havok goes:

http://www.intel.com/pressroom/archive/releases/20070914corp.htm

The company will be a wholly owned Intel subsidiary and continue to operate as an independent business working with its customers in developing digital media content.
 
Some people are knocking the curtains?! I keep thinking back to Splinter Cell: Chaos Theory, which made great use of curtain physics. Not only do the curtains react to your touch, your actions, and the weather, they also cast shadows, so any curtain movement can potentially compromise or hide Sam.
 
PhysX aside, how's Mirror's Edge as a game? I've read mixed reviews and don't want to pick up a copy for some gimmicky physics effects.
 
What I find hilarious is that, with the exception of the cloth physics, the console version of the game had all of the extra effects that the PC version apparently doesn't support without PhysX.

Thank you DICE for gimping the game to help sell nVidia cards =\
 
I have seen PhysX demos in the past on Ageia's site that are no longer there, and they were a sight to behold. One was for a Tom Clancy game and the other for City of Villains. Both looked amazing. It wasn't like Mirror's Edge, where certain graphics simply aren't there without PhysX. They both had the graphics; it's just that with the PhysX card you got to see more, and it all looked so natural. Much better immersion.

I have an old 8500GT here I was using in my HTPC and I also have a 9800GTX, so I'm set. However, I have heard that you can still use an NVIDIA card for PhysX even if your main card is an ATI.
 
So what cards can I use for PhysX support? Something like a cheap 9400/9500 card, or do you need something more sophisticated?

And do you just stick them into a PCIe slot and install drivers?
 
So what cards can I use for PhysX support? Something like a cheap 9400/9500 card, or do you need something more sophisticated?

And do you just stick them into a PCIe slot and install drivers?

Any 8000 or 9000 series card. The drivers will give you an option of setting one card to PhysX processing.
 
So what cards can I use for PhysX support? Something like a cheap 9400/9500 card, or do you need something more sophisticated?

And do you just stick them into a PCIe slot and install drivers?

Actually, that only works in XP. Vista does not allow installing two video drivers, if I recall correctly.
 
Actually, that only works in XP. Vista does not allow installing two video drivers, if I recall correctly.

It works in Vista. You don't install two drivers. You install your standard NVIDIA driver and set one card to PhysX.
 
Not impressed.

The non-PhysX portion looks dull as hell, agreed.

Yay, now we can all have curtains in games.

Better that than... NOTHING... which seems to be your "flavour" :rolleyes:

Some particle effects here, some cloth effects there, not very impressive IMO.

Try running those on your CPU then...

PhysX is not the only option for this type of 3D animation... it can also be processed on the CPU...

Another good example is CryEngine 2, and Havok, and it's even more fantastic...

So given what PhysX can do, and its hardware limit, it doesn't impress me at all...

It's PHYSICS, not 3D... but try the other options... and report your FPS back to us *cough*

But I would love to see your fantastic Havok physics... I dare you! :)

Is it me... or is that video a bit off? Seems like all the PhysX items (banners, cloth, etc.) were missing from the "off" video.

For a proper comparison, shouldn't the "off" video show the items with standard CPU physics?


Because tearable cloth makes your FPS go into single digits (if you're even that lucky) on a CPU-based system... slideshow vs. video would make the anti-physics crowd cry at night. (See the sketch at the end of this post for why cloth is so expensive.)

It seems that they added banners and cloth things, for the sake of adding banners and cloth things. I can't say that I am impressed.

I guess you are impressed by nothing, then... or are speaking out of ignorance.

Updated video of the PhysX comparison in Mirror's Edge, without the intentional slowdown:

http://firingsquad.com/news/newsarticle.asp?searchid=21079

Good find... all the people that can't read and think "ARGGHHH, slowdowns" can be eased... until they try running the game on their non-PhysX PCs :D
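On the cloth point above: the reason tearable cloth buries a CPU is that a cloth sheet is thousands of point masses tied together by distance constraints, and the solver has to relax every constraint several times per frame. A minimal CUDA sketch of one Jacobi-style relaxation pass, one thread per particle (hypothetical names and a fixed neighbour count per particle; this is a sketch of the general technique, not PhysX's actual solver):

#include <cuda_runtime.h>

// One relaxation pass over the cloth: each thread nudges its particle
// toward the rest length of every spring attached to it. Reading from
// posIn and writing to posOut keeps threads from racing on shared particles.
__global__ void relaxCloth(const float3* posIn, float3* posOut,
                           const int* nbr, const float* restLen,
                           int nbrsPerParticle, int numParticles)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numParticles) return;

    float3 p = posIn[i];
    float3 corr = make_float3(0.0f, 0.0f, 0.0f);

    for (int k = 0; k < nbrsPerParticle; ++k) {
        int j = nbr[i * nbrsPerParticle + k];
        if (j < 0) continue;                          // torn or unused spring
        float3 q = posIn[j];
        float dx = q.x - p.x, dy = q.y - p.y, dz = q.z - p.z;
        float len = sqrtf(dx * dx + dy * dy + dz * dz) + 1e-6f;
        float w = 0.5f * (len - restLen[i * nbrsPerParticle + k]) / len;
        corr.x += dx * w; corr.y += dy * w; corr.z += dz * w;
    }
    posOut[i] = make_float3(p.x + corr.x, p.y + corr.y, p.z + corr.z);
}

A frame might run a pass like that 8-16 times over ~10,000 particles (tearing just means dropping springs whose stretch exceeds a threshold). The GPU does each pass as one parallel sweep; a CPU core grinds through every particle of every pass in turn, which is where the single-digit FPS comes from.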
 
Non-PhysX physics game:
http://store.steampowered.com/app/900692/

Actually, it uses PhysX without the card and does just fine; download the demo.

If you define that as "fine", you just set the bar for physics so low that HL2 must be a game from the future?
Hell, members on this board have made better videos demonstrating physics, but when you talk about PhysX outside the PPU forum you get "IT Joes" posting who wouldn't know physics from AA if it bit them in the *beeep*.

As a side note, I look forward to "Larrabee", as Intel entering the gaming market is only good... and will mean MORE physics, not less. Sorry to disappoint [Team]ANTI-PhysX :p
 
So what cards can I use for PhysX support? Something like a cheap 9400/9500 card, or do you need something more sophisticated?

And do you just stick them into a PCIe slot and install drivers?

8800 series or higher. 9400/9500 cards might work, but they are weak enough cards as it is that I don't know if you would be doing your frame rate any favors.
 