CryEngine 3 to Use In-House Physics Engine, No Havok Physics or PhysX Middleware

You know, I wouldn't mind this (and other titles) supporting nVidia's PhysX hardware acceleration on their cards if they hadn't disabled it at the driver level for folks using an ATI video card as the primary GPU. They point to a few problems as a reason for "wanting to ensure a quality experience," but that was largely hogwash; a LOT of people got it working (and still use it!) quite successfully. nVidia locking it out really pissed me off, and made me feel kinda bad for supporting them with so much of my money in my current build (a GTX 295 plus a dedicated PhysX card!). That kind of business tactic makes me mad and makes NO SENSE, because it sends two messages:

Message 1: If people own our competition's GPUs, we will block our GPUs from working alongside them for dedicated PhysX processing, which essentially means "please don't buy our cards for dedicated PhysX if you run ATI!" And they're deluded if they think people are gonna swap out their whole rig's video cards just to support a few PhysX titles. So basically this message will make them sell FEWER cards.

Message 2: We are intentionally making life HARDER for people who buy our competition's hardware and making sure NONE of our hardware works with theirs, because they are the ENEMY. And that kind of tactic is such a low blow that it pisses people off and looks really, really bad, cheap, and devious - AGAIN, making people want to buy FEWER video cards from nVidia.

That's why even though I run nVidia in my primary rig, I also enjoyed the F U and other anti-nVidia comments on this thread.

Right now they deserve it.
 
I would prolly see one core at 90 to 100% usage, another around 50%, and the other two under 20%. Is your point that it still looks good without CPU-crushing physics effects? If so, you're preaching to the choir. Or are you trying to point out that it makes only mediocre use of multi-core CPUs? Again, you'd be preaching to the choir.

What does that have to do with what I typed in my post regarding it and GPUs?

I've been playing Dark Void with PhysX set to high. I've got 2x GTX 260s; I'm using one for PhysX and the other to render. The one being used for PhysX averages around 15% usage. On a GPU with about 850 GFLOPS of compute power (my shader cores are clocked at 1512 MHz), 15% says it takes about 130 GFLOPS on average to compute the physics in Dark Void. Care to point me to where I can buy a CPU setup for under 500 bucks that has that kind of computing power? Hell, under 100 bucks?
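
A quick back-of-the-envelope sketch of that math in Python, treating the ~850 GFLOPS peak and the 15% average load as given (they're the figures quoted in the post, not anything measured here):

# Rough estimate of the compute the Dark Void physics workload represents.
# Both inputs are the numbers quoted above, not measurements.
peak_gflops = 850        # approximate single-precision peak of one GTX 260
avg_utilisation = 0.15   # reported average load on the dedicated PhysX card

physics_gflops = peak_gflops * avg_utilisation
print(f"Estimated physics workload: ~{physics_gflops:.0f} GFLOPS")
# prints ~128 GFLOPS, i.e. the "about 130" figure quoted above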
 
I've been playing Dark Void with PhysX set to high. I've got 2x GTX 260s; I'm using one for PhysX and the other to render. The one being used for PhysX averages around 15% usage. On a GPU with about 850 GFLOPS of compute power (my shader cores are clocked at 1512 MHz), 15% says it takes about 130 GFLOPS on average to compute the physics in Dark Void. Care to point me to where I can buy a CPU setup for under 500 bucks that has that kind of computing power? Hell, under 100 bucks?

WTF are you talking about? Did you quote the wrong post or something?
 
My question is, can it handle making boobies giggle?

I think you mean jiggle. lmao

Reminds me of that "Hey Ma" song from a couple of years ago, where one of the rappers says, "I see a boobie grinnin'."
 
WTF are you talking about? Did you quote the wrong post or something?

No, that makes quite a bit of sense. The kind of computing power a GPU puts out still can't be efficiently matched by a CPU.

However, if that output can only be utilised if you have a full nVidia set-up, then screw it. It's not worth the money to be able to play niche games.
 
No, that makes quite a bit of sense. The kind of computing power a GPU puts out still can't be efficiently matched by a CPU.

However, if that output can only be utilised if you have a full nVidia set-up, then screw it. It's not worth the money to be able to play niche games.

Whether his post makes sense or not on its own has what to do with the post he quoted? Pretty much nothing, which is why I was asking if he quoted the wrong post.
 
WTF are you talking about? Did you quote the wrong post or something?

Sorry for the delay in response. Think of my post as more of a confirmation of yours, to a point.

No, that makes quite a bit of sense. The kind of computing power a GPU puts out still can't be efficiently matched by a CPU.

However, if that output can only be utilised if you have a full nVidia set-up, then screw it. It's not worth the money to be able to play niche games.

Sorta my point. ATI GPUs can do even more in terms of GFLOPS; sadly, though, they have done little to nothing other than show a single demo of GPU-based physics over 18 months ago.

As to your other point, meh is all I can say. I'm on the side of gamers in terms of the PhysX thing - ATI owners shouldn't be locked out - but I do understand why Nvidia has done it. Do I think it is right? No, but I understand.
 
Just hoping Diablo 3 will not use Nvidia PhysX... oh wait, it's Blizzard, they're richer than Nvidia...
 
Diablo 3 will probably use Havok, I would imagine.

If we are all lucky, it won't use shit. No PhysX to keep the whiners quiet, and nothing else to keep the game from becoming an ad nauseam of grotesque preprogrammed crap. Or, God forbid, they actually put forth an effort and actually USE the PhysX library like the devs of Metro 2033 have done, and design a game that uses PhysX to its fullest. But I don't see that happening, so Metro will have to be the best physics game out for some time.
 
If we are all lucky, it won't use shit. No PhysX to keep the whiners quiet, and nothing else to keep the game from becoming an ad nauseam of grotesque preprogrammed crap. Or, God forbid, they actually put forth an effort and actually USE the PhysX library like the devs of Metro 2033 have done, and design a game that uses PhysX to its fullest. But I don't see that happening, so Metro will have to be the best physics game out for some time.

What? First of all, have you actually played Metro? The videos I've seen haven't shown much in the way of groundbreaking CPU-side physics. As far as I know, the CPU client is still single-threaded, but maybe that's changed by now. If it has changed, that has almost nothing to do with the game studio; it's nVidia's call whether their software client supports more than one core, and at least up until the last PhysX release, it hasn't (except on consoles). I'd expect you to know that, but

oh yeah, it's you :D

Havok is platform-agnostic and, judging by its preponderance over in-house physics engines (and over PhysX), I'd assume quite efficient. Why the hell wouldn't we want to see that used in Diablo III?
 
What? First of all, have you actually played Metro? The videos I've seen haven't shown much in the way of groundbreaking CPU-side physics. As far as I know, the CPU client is still single-threaded, but maybe that's changed by now. If it has changed, that has almost nothing to do with the game studio; it's nVidia's call whether their software client supports more than one core, and at least up until the last PhysX release, it hasn't (except on consoles). I'd expect you to know that, but

oh yeah, it's you :D

Havok is platform-agnostic and, judging by its preponderance over in-house physics engines (and over PhysX), I'd assume quite efficient. Why the hell wouldn't we want to see that used in Diablo III?

Still haven't read the interview with the dev behind Metro 2033, I see. You failed to see how he states very clearly that he has been using the PhysX SDK since back when it was Novodex, not nVidia's. He clearly states it works with multicore/multithread-capable setups - that includes PCs, for the one-brain-celled organisms out there - and outlines what the CPU will handle for physics and what the GPU (should one be installed that can do it) will take care of, and that they WILL work in tandem.

As to why not use Havok: ad nauseam, preprogrammed, rail-based, self-repeating, boring-as-hell, half-assed physics - you can have it. If a dev is going to do physics in a game, do it; don't half-ass it. Sadly, every game I've played that uses Havok is half-assed. The same boring routine every single time you see something.
 
I’m so excited :D I hope it’ll meet the expectations

 
Those guys made a pretty good engine. Not the most efficient, but the best looking, and it makes sense they would code their own physics instead of paying other companies to use proprietary software with possibly limited customization.
 
Still haven't read the interview with the dev behind Metro 2033, I see. You failed to see how he states very clearly that he has been using the PhysX SDK since back when it was Novodex, not nVidia's. He clearly states it works with multicore/multithread-capable setups - that includes PCs, for the one-brain-celled organisms out there - and outlines what the CPU will handle for physics and what the GPU (should one be installed that can do it) will take care of, and that they WILL work in tandem.

As to why not use Havok: ad nauseam, preprogrammed, rail-based, self-repeating, boring-as-hell, half-assed physics - you can have it. If a dev is going to do physics in a game, do it; don't half-ass it. Sadly, every game I've played that uses Havok is half-assed. The same boring routine every single time you see something.

Xman, it's a PR piece. He doesn't give any technical exposition; it's just fluff. Yes, we know they implemented a multi-threaded PhysX engine on the CPU. That doesn't mean it's any good. Novodex/PhysX was never known for its efficiency compared to in-house stuff, never mind Havok, and like I said, there is a good reason why console devs constantly choose the expensive Havok libraries over the free PhysX ones.

I mean, you can believe whatever you want, I don't really care :D It's the same software that costs 30% of your frames for a hundred debris particles.
 
Xman, it's a PR piece. He doesn't give any technical exposition; it's just fluff. Yes, we know they implemented a multi-threaded PhysX engine on the CPU. That doesn't mean it's any good. Novodex/PhysX was never known for its efficiency compared to in-house stuff, never mind Havok, and like I said, there is a good reason why console devs constantly choose the expensive Havok libraries over the free PhysX ones.

I mean, you can believe whatever you want, I don't really care :D It's the same software that costs 30% of your frames for a hundred debris particles.

Havok is good if you want prerendered, repeating crap, which is what you get in every game that uses it. I'll find the link again and post it here, but the install base for PhysX and Havok is about equal across all platforms, and the number of games being released each year is higher for PhysX-enabled games than for Havok ones.

http://physxinfo.com/articles/?page_id=154
 
Havok is good if you want prerendered, repeating crap, which is what you get in every game that uses it. I'll find the link again and post it here, but the install base for PhysX and Havok is about equal across all platforms, and the number of games being released each year is higher for PhysX-enabled games than for Havok ones.

http://physxinfo.com/articles/?page_id=154

I wouldn't post corroborating evidence before you read it to make sure it supports your argument. Take a look at the "quality title" breakdown. The only category in which PhysX has more users is "budget" titles, and that is 100% because PhysX SDK is free for commercial use on consoles.

Titles that score >85 are 22 vs 8 in favor of Havok. AAA devs are not using PhysX. Why? Because it isn't that great of a package. When cost isn't an issue, devs consistently choose Havok.

And I'm not sure what "same repeating crap" means. Havok doesn't use any pre-animated libraries.
 
I wouldn't post corroborating evidence before you read it to make sure it supports your argument. Take a look at the "quality title" breakdown. The only category in which PhysX has more users is "budget" titles, and that is 100% because PhysX SDK is free for commercial use on consoles.

Titles that score >85 are 22 vs 8 in favor of Havok. AAA devs are not using PhysX. Why? Because it isn't that great of a package. When cost isn't an issue, devs consistently choose Havok.

And I'm not sure what "same repeating crap" means. Havok doesn't use any pre-animated libraries.

And you should really try to read an article once in a while.

http://physxinfo.com/articles/wp-content/uploads/2009/12/titles_release_dynamics_graph_year.png
http://physxinfo.com/articles/wp-content/uploads/2009/12/titles_release_dynamics_graph.jpg
http://physxinfo.com/articles/wp-content/uploads/2009/12/platform_distribution_graph.jpg

The most important part of the article is above - all three graphs. Pay CLOSE attention to the PC releases; the graph you point to covers all platforms, not just PC. And Havok does reuse the same thing over and over again. That is why it got used: you could code something once and reuse the code. With PhysX, because most of the PhysX in games is interaction-based, the calculations can't be reused - they have to happen on the fly as things occur, and they can't be preprogrammed like Havok.
 
And you should really try to read an article once in a while.

http://physxinfo.com/articles/wp-content/uploads/2009/12/titles_release_dynamics_graph_year.png
http://physxinfo.com/articles/wp-content/uploads/2009/12/titles_release_dynamics_graph.jpg
http://physxinfo.com/articles/wp-content/uploads/2009/12/platform_distribution_graph.jpg

The most important part of the article is above - all three graphs. Pay CLOSE attention to the PC releases; the graph you point to covers all platforms, not just PC. And Havok does reuse the same thing over and over again. That is why it got used: you could code something once and reuse the code. With PhysX, because most of the PhysX in games is interaction-based, the calculations can't be reused - they have to happen on the fly as things occur, and they can't be preprogrammed like Havok.

This is my last post to you. My brain finds you toxic.

And I don't know why that PC statistic is important, considering almost every one of the PC titles is double-counted in the console figures.
 
This is my last post to you. My brain finds you toxic.

And I don't know why that PC statistic is important, considering almost every one of the PhysX titles is double-counted in the console figures.

And the Havok titles aren't, by your same standard? Give me a break. These are facts you cannot refute, so you try to make up some excuse for the good PhysX numbers.

1. Fact: more PhysX titles have been released than Havok ones.
2. More PC titles are shipping with PhysX than with Havok.
3. PhysX-based titles are on the rise; Havok has been on a slow but steady decline.
4. PhysX gets used on more PC game titles than Havok and the other options combined.
 
With hexacore CPUs around the bend, the more simulation engines the merrier!
 
That's right nvidia, fuck you.

+1 to this. It's awesome what a bunch of smart programmers can do. Mad props to them and their industry. Can't wait to see the eye-candy and play the game.
 
LOL @ "news"

Many (I would say most) games use developer-made physics. If you have AAA blinders on you may not agree, but there are hundreds of PC games released a year. And yeah, CryEngine is a pig.

4. PhysX gets used on more PC game titles than Havok and the other options combined.
No.
 
The person who stated that there are more Havok titles is misinformed. If I'm not mistaken, PhysX surpassed Havok a while ago and has been steadily growing its market share. I continue to blame ATI for the stunted development of physics, much like I blame Creative Labs for sound. The PhysX license was offered to them for free, and they turned it down. I don't for a second believe that BS that nVidia didn't offer it to them officially. Everyone knew the offer was on the table.
 
LOL @ "news"

Many (I would say most) games use developer-made physics. If you have AAA blinders on you may not agree, but there are hundreds of PC games released a year. And yeah, CryEngine is a pig.

No.

Actually, he is quite right. Havok has steadily been losing market share since nVidia gave PhysX away for free. We should be in a boom time for physics were it not for AMD's pride and stubbornness.
 
Actually, he is quite right.
LOL, read plox.

"PhysX Gets used more on PC game titles than Havok and other options combined"

Most games do not use either Havok or PhysX, and yet very often incorporate some form of physics.

AAA blinders, remove them.
 
LOL, read plox.

"PhysX Gets used more on PC game titles than Havok and other options combined"

Most games do not use either Havok or PhysX, and yet very often incorporate some form of physics.

AAA blinders, remove them.

Please tell me you are kidding?

Most games that have physics do use either Havok or PhysX; very few developers use in-house physics, which is why Crytek has twice made the news by using in-house physics in both Crysis and Crysis 2.
 
LOL, read plox.

"PhysX Gets used more on PC game titles than Havok and other options combined"

Most games do not use either Havok or PhysX, and yet very often incorporate some form of physics.

AAA blinders, remove them.

Given NO ONE WILL USE CryEngine 1 or 2, that leaves just SHAMOK, Bullet, OCL or DC (and OCL and DC have yet to do much of anything in gaming). PhysX is killing them all as far as adoption rate goes. Do ALL vendors put in the work needed? No! Should they? Yes. Do we pay for it? Yes. Is there anything better that is widely used? NO!
 
Given NO ONE WILL USE CryEngine 1 or 2, that leaves just SHAMOK, Bullet, OCL or DC (and OCL and DC have yet to do much of anything in gaming). PhysX is killing them all as far as adoption rate goes. Do ALL vendors put in the work needed? No! Should they? Yes. Do we pay for it? Yes. Is there anything better that is widely used? NO!

I agree.
 
The idea of being able to offload physics to the GPU is awesome, imo.

Just need a good open physics engine

People give Crysis's low CPU usage as an example of why physics doesn't need to come off the CPU, which is stupid.

What about a strategy game? They can be CPU-intensive (I think Anno 1404 fully utilises 4 cores? Can't remember). Then your CPU won't be able to cope with extra physics and you'll wish you could offload it to a GPU :p

Shouldn't be compulsory though. But imo, if they're making their own physics engine, they should also make it GPU-capable (although maybe that's not possible with current GPU physics engines).
 
There are about 5 people on this whole forum who actually know the meaning of the word monopoly. The rest just use it every time they hear the words nVidia or Intel.

Don't lie, we all know Monopoly is a board game :p
 
My biggest problem with PhysX is that it is proprietary; if they made it an open standard like OpenGL that all card manufacturers could get behind, we wouldn't be badmouthing NVIDIA so much.
 
My biggest problem with PhysX is that it is proprietary; if they made it an open standard like OpenGL that all card manufacturers could get behind, we wouldn't be badmouthing NVIDIA so much.

It's only proprietary because ATI refuses to license it. If you think that, had ATI bought Ageia back then and put in all the work to make PhysX the force it is today, they wouldn't charge other GPU vendors to use it, you are severely kidding yourself.
 
It's only proprietary because ATI refuses to license it. If you think that, had ATI bought Ageia back then and put in all the work to make PhysX the force it is today, they wouldn't charge other GPU vendors to use it, you are severely kidding yourself.

You've got that backwards - ATI won't license it because it's proprietary. It could be "universally" supported if ATI licensed it, but that wouldn't change the fact that it's still proprietary. It would actually reinforce that it's proprietary if ATI were to license it.

As others have said, ATI would be paying their competitor to get something which could very well be inferior to the version that Nvidia uses internally. Assuming they make more money off selling video cards than from licensing PhysX, it's in Nvidia's best interest for ATI to not have a card that's good at PhysX (if ATI has a good PhysX card, it would hurt Nvidia's sales). Since Nvidia said it would cost pennies per GPU for ATI to license PhysX, I'm sure they'd much rather sell a GTX480 than get the PhysX licensing fees from ATI selling an HD5870. Would you buy and rely upon supplies from your main competitor if you knew that it was in their best interest to give you crappy supplies?

I'm not saying that Nvidia shouldn't be able to license PhysX however they want. I don't think they should be forced to give it away to everyone for free. However, I do think it's in the best interests of both consumers and devs for there to be an open standard for GPU-accelerated physics. It's good for consumers because it means that any video card has equal opportunity to run it, so you're not locked in to one vendor and the vendors will be competing for the best support of it. Likewise, it means devs can use it without the customer needing to have a certain brand of card. This means more potential customers, meaning more potential profit. The only one who loses with an open physics standard is Nvidia, and they'd probably pretty much break even rather than actually losing anything (they can still support both PhysX and the new open standard, they just lose their exclusivity).

Look at the Steam Hardware Survey. Of the top 15 cards, 5 simply can't do PhysX. Of the remaining 10, 5 are generally considered weak for a standalone PhysX card, much less for doing both graphics and PhysX. Going further down the list, it doesn't get any better. Even in gamers' PCs, a lot can't do PhysX at all, and many can't do it very well. It's a chicken & egg situation - devs aren't willing to rely on PhysX because it's not universally available (the same reason games don't require SLI 480s), and people aren't willing to fully support it because very few games actually make use of it (would you spend an extra $400 just to see some neat fog or breaking glass in a game?).

We're at the tough point right now. Your options are basically PhysX, which is a vote for closed, proprietary systems without a whole lot of support, or the awesome open standard that's widely supported in both hardware and software but doesn't actually exist yet. The easy answer is to use PhysX for now until the good standard gets here. Unfortunately, that encourages people to continue (or begin) to use PhysX, which draws efforts away from developing the standardized option and makes it less likely to arrive quickly or even at all.
 
You've got that backwards - ATI won't license it because it's proprietary. It could be "universally" supported if ATI licensed it, but that wouldn't change the fact that it's still proprietary. It would actually reinforce that it's proprietary if ATI were to license it.

As others have said, ATI would be paying their competitor to get something which could very well be inferior to the version that Nvidia uses internally. Assuming they make more money off selling video cards than from licensing PhysX, it's in Nvidia's best interest for ATI to not have a card that's good at PhysX (if ATI has a good PhysX card, it would hurt Nvidia's sales). Since Nvidia said it would cost pennies per GPU for ATI to license PhysX, I'm sure they'd much rather sell a GTX480 than get the PhysX licensing fees from ATI selling an HD5870. Would you buy and rely upon supplies from your main competitor if you knew that it was in their best interest to give you crappy supplies?

I'm not saying that Nvidia shouldn't be able to license PhysX however they want. I don't think they should be forced to give it away to everyone for free. However, I do think it's in the best interests of both consumers and devs for there to be an open standard for GPU-accelerated physics. It's good for consumers because it means that any video card has equal opportunity to run it, so you're not locked in to one vendor and the vendors will be competing for the best support of it. Likewise, it means devs can use it without the customer needing to have a certain brand of card. This means more potential customers, meaning more potential profit. The only one who loses with an open physics standard is Nvidia, and they'd probably pretty much break even rather than actually losing anything (they can still support both PhysX and the new open standard, they just lose their exclusivity).

Look at the Steam Hardware Survey. Of the top 15 cards, 5 simply can't do PhysX. Of the remaining 10, 5 are generally considered weak for a standalone PhysX card, much less for doing both graphics and PhysX. Going further down the list, it doesn't get any better. Even in gamers' PCs, a lot can't do PhysX at all, and many can't do it very well. It's a chicken & egg situation - devs aren't willing to rely on PhysX because it's not universally available (the same reason games don't require SLI 480s), and people aren't willing to fully support it because very few games actually make use of it (would you spend an extra $400 just to see some neat fog or breaking glass in a game?).

We're at the tough point right now. Your options are basically PhysX, which is a vote for closed, proprietary systems without a whole lot of support, or the awesome open standard that's widely supported in both hardware and software but doesn't actually exist yet. The easy answer is to use PhysX for now until the good standard gets here. Unfortunately, that encourages people to continue (or begin) to use PhysX, which draws efforts away from developing the standardized option and makes it less likely to arrive quickly or even at all.

Actually, Nvidia has said that ATI could license PhysX from them and develop their own driver code and paths for it, or, if they wanted, Nvidia would help with pointers and optimizations to make it work better. ATI has refused and instead touted OCL, DC and Havok. Hey ATI, the world is still waiting for the first game to use GPU-based physics built on any of those options you helped develop. Four years and still just a single simple demo - way to go, ATI.
 
Actually, Nvidia has said that ATI could license PhysX from them and develop their own driver code and paths for it, or, if they wanted, Nvidia would help with pointers and optimizations to make it work better. ATI has refused and instead touted OCL, DC and Havok. Hey ATI, the world is still waiting for the first game to use GPU-based physics built on any of those options you helped develop. Four years and still just a single simple demo - way to go, ATI.

Instead of 4 years and only games where Nv actually codes it in themselves?

GPU PhysX has pretty much utterly failed. CPU PhysX is well established because it is free, but it seems Nv has to code it in themselves if they want a game to use GPU PhysX.
GPU PhysX's lack of success is prolly a fair part of the reason AMD does not seem in a hurry to come up with another solution to compete with PhysX.

You can say "they are the same API", "it will be easy for devs to switch to GPU PhysX from CPU PhysX", and "they are gaining mindshare" all you want. In fact, it is true to varying degrees, but the devs are still not bothering with GPU PhysX.

The future is prolly going to look like this:
Other vendor-agnostic GPU physics solutions running on OpenCL eventually launch.
At that point:
PhysX will follow suit and Nv may begin to charge for it; or
Nv will continue to use it as a marketing tool and we will get our one or two triple-A games per year that use it in any meaningful way; or
Nv will refuse to support, or will actively block, the other solutions from running on their GPUs, which would essentially kill GPU physics for the foreseeable future; or
it will out-and-out just fade away.
Like it or not, Nv will never get Intel or AMD to license it, so it will remain a tack-on in the occasional game that uses it, and without their support it will never become the standard.
 