Game Devs Only Use PhysX For The Cash

HardOCP News

According to this article, AMD’s senior manager of developer relations says that game developers only use PhysX for the cash. Thanks to Edward Cabarles for the link.

“They’re not doing it because they want it; they’re doing it because they’re paid to do it. So we have a rather artificial situation at the moment where you see PhysX in games, but it isn’t because the game developer wants it in there.” In fact, Huddy reckons that no developers outside Epic genuinely wanted to implement GPU-accelerated PhysX in their game.
 
This is all based on something said by an AMD developer relations guy. Furthermore, part of his job description is to downplay things like PhysX. I'm not saying he's lying, though. I'm just saying I don't know how you could possibly put much faith in anything he has to say on the issue. I'll reserve judgement till we have some studios chime in on this.
 
It's great that they are pushing it. Just imagine if they didn't; there would be even less of a reason to buy their cards.
 
Sorry, but right now we (PC gamers) need all the help we can get. If that means an OEM is going to support developers to add something for PC gaming, then go for it.

AMD/ATI, Intel, Microsoft and anyone else trying to sell PC hardware better get on the bandwagon and start helping these developers push PC gaming again.

Microsoft is the number one detriment to PC gaming, and it's because they would rather sell their games on their closed system, where they can sell DLC, XBL subscriptions, and so forth. It's interesting that they forget how many people buy their OS to game on...
 
AMD could have done the same thing Nvidia is doing, so wtf are they whining for? Better yet, make Bullet physics free and help (i.e. pay) developers to implement it. It's easy to take the high ground when you're just talking, but let's see some action from you, AMD. (I'm aware AMD is pushing Bullet physics now, but why did they wait so long and then whine? Just stupidity...)
 
AMD could have done the same thing Nvidia is doing, so wtf are they whining for? Better yet, make Bullet physics free and help (i.e. pay) developers to implement it. It's easy to take the high ground when you're just talking, but let's see some action from you, AMD. (I'm aware AMD is pushing Bullet physics now, but why did they wait so long and then whine? Just stupidity...)

Isn't Bullet physics a free, open-source platform...? I think AMD is pushing it now because there is finally wider availability of DX11 / DirectCompute / OpenCL in terms of both hardware and software. If AMD were to release its own proprietary vendor-locked solution, that would be exactly the opposite of what they're endorsing...
 
According to this article, AMD’s senior manager of developer relations says that game developers only use PhysX for the cash. Thanks to Edward Cabarles for the link.

This sounds... about right, actually, from what various developers have said about GPU PhysX. Nvidia's continued repositioning of GPU PhysX has burned quite a few developers and publishers as it is. The big question is whether or not developers who want GPU-accelerated physics will implement it with OpenCL.
 
I think that PhysX and technologies like it are cool, but most developers probably don't want to use them for several reasons:

1.) It creates a feature set that a large portion of their customer base can't use, and thus alienates those customers to some degree.

2.) It is difficult to implement in games themselves.

3.) Multi-core CPU-based physics processing can be done just as easily, and it benefits everyone regardless of hardware configuration.

An open standard that works on both AMD and NVIDIA hardware is probably the solution most developers truly want before making GPU physics processing a standard part of the development process. Even then, many developers would probably rather develop CPU-based physics processing simply to leverage multi-core CPUs, which largely go underutilized as it is. I think most gamers have GPUs that could be used for physics processing, but not while handling a graphics load as well.
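Just to illustrate what "open standard" means in practice, here's a tiny sketch of my own (not from any shipping engine) of a physics integration step written in OpenCL C. The same kernel source gets compiled at runtime by whatever driver is installed, so it runs on AMD and NVIDIA cards alike:

Code:
// Hypothetical illustration: a vendor-neutral physics step in OpenCL C.
// clBuildProgram() compiles this at runtime for whatever GPU is present.
const char* kIntegrateKernel =
    "__kernel void integrate(__global float4* pos,            \n"
    "                        __global float4* vel,            \n"
    "                        const float dt)                  \n"
    "{                                                        \n"
    "    const float4 g = (float4)(0.0f, -9.81f, 0.0f, 0.0f); \n"
    "    size_t i = get_global_id(0);                         \n"
    "    vel[i] += g * dt;        /* accumulate gravity  */   \n"
    "    pos[i] += vel[i] * dt;   /* explicit Euler step */   \n"
    "}                                                        \n";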
 
AMD could have done the same thing Nvidia is doing, so wtf are they whining for? Better yet, make Bullet physics free and help (i.e. pay) developers to implement it. It's easy to take the high ground when you're just talking, but let's see some action from you, AMD. (I'm aware AMD is pushing Bullet physics now, but why did they wait so long and then whine? Just stupidity...)

You mean like this? http://www.thinq.co.uk/features/2010/3/8/amd-gives-away-gpu-physics-tools/

"We’re arranging with Pixelux, who do DMM, that the GPU-accelerated version will be available to at least some game developers without charge." AMD’s senior manager of developer relations, Richard Huddy, explained to THINQ. Huddy also says that AMD is covering the cost of the physics engine in order to "get the process kick-started.” The engine supports both OpenCL and Microsoft’s DirectCompute standard.
 
I think that PhysX and technologies like it are cool, but most developers probably don't want to use them for several reasons:

1.) It creates a feature set that a large portion of their customer base can't use, and thus alienates those customers to some degree.

2.) It is difficult to implement in games themselves.

3.) Multi-core CPU-based physics processing can be done just as easily, and it benefits everyone regardless of hardware configuration.

An open standard that works on both AMD and NVIDIA hardware is probably the solution most developers truly want before making GPU physics processing a standard part of the development process. Even then, many developers would probably rather develop CPU-based physics processing simply to leverage multi-core CPUs, which largely go underutilized as it is. I think most gamers have GPUs that could be used for physics processing, but not while handling a graphics load as well.

I had a question about this that no one wants to answer for me. The Infernal engine they built for Ghostbusters implemented cross-platform PhysX with Nvidia, right? Can't developers just use that method?
 
I had a question about this that no one wants to answer for me. The Infernal engine they built for Ghostbusters implemented cross-platform PhysX with Nvidia, right? Can't developers just use that method?

I meant without Nvidia.
 
So basically an AMD guy is whining because NVIDIA is paying developers more? How does he think business works?
 
I think that PhysX and technologies like it are cool, but most developers probably don't want to use them for several reasons:

1.) It creates a feature set that a large portion of their customer base can't use, and thus alienates those customers to some degree.

2.) It is difficult to implement in games themselves.

3.) Multi-core CPU-based physics processing can be done just as easily, and it benefits everyone regardless of hardware configuration.

An open standard that works on both AMD and NVIDIA hardware is probably the solution most developers truly want before making GPU physics processing a standard part of the development process. Even then, many developers would probably rather develop CPU-based physics processing simply to leverage multi-core CPUs, which largely go underutilized as it is. I think most gamers have GPUs that could be used for physics processing, but not while handling a graphics load as well.

Having worked on a project that used PhysX, I can say that implementing PhysX simply as our physics engine was no different from rolling our own, or using Havok, or using Bullet, etc. In the end, it only took us two days to go from having no physics to having PhysX fully integrated into the project, running both multithreaded for CPU-only setups and GPU-accelerated where available.

Doing the integration of the engine itself is not a big hurdle, and setting up PhysX to properly take advantage of whatever multithreading is available is not a big hurdle. In the end, the GPU-only limitations are game-side artificial limitations. However, if Nvidia is going to come in and offer a couple million dollars for a development studio to implement stuff like that, then that's money in the bank that the studio can use to fund development of other titles. The thing that most people don't realize is that, simply due to costs, most development studios will never see any money from royalties (or at least won't see money till long after development on the title has ceased), so things like this can often be the only money that keeps independent studios afloat for more projects.
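For anyone wondering what that integration actually involves, here's a rough sketch from memory against the PhysX 2.x SDK of the era (treat the specifics as approximate; error handling and actor creation are omitted): you create the SDK object, then request a hardware scene if a PPU/GPU is present and fall back to the software solver otherwise.

Code:
// Rough sketch of PhysX 2.x setup: hardware-accelerated where available,
// multithreaded software simulation on the CPU otherwise.
#include <NxPhysics.h>

NxPhysicsSDK* gSDK   = NULL;
NxScene*      gScene = NULL;

bool initPhysics()
{
    gSDK = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!gSDK) return false;

    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);

    // Hardware simulation if an Ageia PPU / CUDA GPU is present, else CPU.
    sceneDesc.simType = (gSDK->getHWVersion() != NX_HW_VERSION_NONE)
                            ? NX_SIMULATION_HW
                            : NX_SIMULATION_SW;

    gScene = gSDK->createScene(sceneDesc);
    return gScene != NULL;
}

The game-side code stays the same either way, which is exactly why the GPU-only effect limitations are artificial.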
 
I had a question about this that no one wants to answer for me. The Infernal engine they built for Ghostbusters implemented cross-platform PhysX with Nvidia, right? Can't developers just use that method?

The Infernal engine doesn't use PhysX; it uses its own VELOCITY engine.
 
I believe this is 100% accurate. There are, what, maybe 12 games that support PhysX, and it's been on the market for 4 years or so? Come on... see the light, man. On top of that, look at how poorly implemented it is. No PhysX games do anything that can't already be done on the CPU.

Read the sig!
 
I believe this is 100% accurate. There are, what, maybe 12 games that support PhysX, and it's been on the market for 4 years or so?
I want to clear this up: I mean hardware physics. I am aware of many titles that use software PhysX, as they do Havok or others.
 
I believe this is 100% accurate. There are, what, maybe 12 games that support PhysX, and it's been on the market for 4 years or so? Come on... see the light, man. On top of that, look at how poorly implemented it is. No PhysX games do anything that can't already be done on the CPU.

Read the sig!

Actually, only 6 current games utilize Nvidia's brand of PhysX.

Nvidia's site
http://www.nvidia.com/object/physx_new.html

As you can see, only two of those are even major titles (Batman and Mirror's Edge).
 
No PhysX games do anything that can't already be done on the CPU.

:rolleyes:

Sorry, but you're talking out of your ass. Until CPUs close the order-of-magnitude gap in computational power between CPUs and GPUs on multithreaded problems, GPUs will be much better suited for physics.
 
Actually, only 6 current games utilize Nvidia's brand of PhysX.

Nvidia's site
http://www.nvidia.com/object/physx_new.html

As you can see, only two of those are even major titles (Batman and Mirror's Edge).

Those are the 6 featured games; here is a better list from the wiki:
The following games feature PhysX support (list may be incomplete):[19]

2 Days to Vegas
Adrenalin 2: Rush Hour
Age of Empires III (Only on the Mac version)
Alpha Prime
APB
Army of Two
Auto Assault
Batman: Arkham Asylum
Backbreaker
B.A.S.E. Jumping
Bet on Soldier: Blackout Saigon
Bet on Soldier: Blood of Sahara
Bet on Soldier: Blood Sport
Beowulf: The Game
Bladestorm: The Hundred Years' War
Borderlands
Brothers in Arms: Hell's Highway
Captain Blood
CellFactor: Combat Training
CellFactor: Revolution
City of Villains
Crazy Machines 2
Cryostasis: Sleep of Reason
Dark Physics
Dark Void
Darkest of Days
Desert Diner
Dragon Age: Origins[20]
Dragonshard
Dusk 12
Empire Above All
Empire Earth III
Entropia Universe
Fallen Earth
Fat Princess
Frontlines: Fuel of War
Fury
Gears of War
Ghost Recon: Advanced Warfighter
Race Driver: Grid
Global Agenda
Gluk'Oza: Action
GooBall
Gothic 3
Gunship Apocalypse
Heavy Rain
Helldorado: Conspiracy
Hero's Journey
Hour of Victory
Huxley
iFluid
Infernal
Inhabited island: Prisoner of Power
Joint Task Force
Kran Simulator 2009[21]
Kuma\War
Aura of Wisdom
Mafia 2
Magic Ball 3
Mass Effect
Mass Effect 2
Medal of Honor: Airborne
Metro 2033
Mirror's Edge
Mobile Suit Gundam: Crossfire
Monster Madness: Battle for Suburbia
Monster Truck Maniax
Myst Online: Uru Live
Need for Speed: Shift
Nights: Journey of Dreams
Nurien
Odd Blox
Open Fire (and its successor, Open Fire Gold)
Overlord 2
Parabellum
Paragraph 78
Pirates of the Burning Sea
Prince of Persia
Point Blank
PT Boats: Knights of the Sea
Rail Simulator
Red Steel
Rise of Nations: Rise of Legends
Risen
Robert Ludlum's The Bourne Conspiracy
Roboblitz
Sacred 2
Shadowgrounds (Only on the Linux version)
Shadowgrounds: Survivor
Sherlock Holmes: The Awakened
Showdown: Scorpion
Silverfall
Sovereign Symphony
Sonic and the Black Knight
Sonic and the Secret Rings
Speedball 2
Stoked
Stoked Rider: Alaska Alien
Switchball
Trine
The Hunt
The Stalin Subway
The Void
Tom Clancy's Ghost Recon Advanced Warfighter
Tom Clancy's Ghost Recon Advanced Warfighter 2
Tom Clancy's Rainbow Six: Vegas
Tom Clancy's Splinter Cell: Double Agent
Tortuga: Two Treasures
Turok
Two Worlds
Ultra Tubes
Unreal Tournament 3
Unreal Tournament 3: Extreme Physics Mod
Valkyria Chronicles
Velvet Assassin
Warfare
Warmonger: Operation Downtown Destruction
W.E.L.L. Online
Winterheart's Guild
Wolverine Origins
WorldShift
Zombie Driver
 
I was using my 8800GTX (my backup GPU) for PhysX for a few months, but I took it out two nights ago because there just aren't enough games out there that use the feature. It would be nice if more games started using physics on a dedicated GPU.
 
Those are the 6 featured games; here is a better list from the wiki:

Actually, that list is based on the list of games that could offload the CPU-driven calculation to a PPU device. That list was established prior to the acquisition of Ageia by Nvidia. The point is that, yes, those titles can use an Nvidia GPU or Ageia PPU to carry the physics load, but they can also use the CPU to perform the same calculation.

As far as Nvidia's proprietary PhysX effects are concerned, though, the only titles that utilize Nvidia's brand of PhysX (that actually add significant detail to the game, not just offload work to the GPU from the CPU) are detailed in the list provided prior. That is not a featured-game list; that is a complete list of titles that require an Nvidia GPU to run the features in game.

http://www.nvidia.com/object/physx_new.html
This is the list of current titles that add eye candy to the game by use of Nvidia PhysX with an Nvidia GPU.

http://www.nzone.com/object/nzone_physxgames_home.html
Here is a COMPLETE list of games that are PhysX-ready. Some of these require a mod to add the PhysX eye candy, while other titles have not been released yet.
 
I think that PhysX and technologies like it are cool, but most developers probably don't want to use them for several reasons:

1.) It creates a feature set that a large portion of their customer base can't use, and thus alienates those customers to some degree.

I think this is the major reason why advanced physics in games hasn't caught on yet. Currently, game developers can only use physics for "eye candy" that doesn't affect gameplay, since anything more would produce different gameplay results in a multiplayer game for somebody with or without an accelerated PhysX rig.

Until CPUs and/or GPUs in the $100 range become so powerful that they can run games at full res and high settings with processing power to spare for physics, true integration of physics into gameplay won't happen.
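To make that concrete, here's a hypothetical sketch (all names are mine, for illustration only) of how engines keep accelerated effects out of the multiplayer-critical path: only the deterministic CPU simulation that every client runs may touch gameplay state, while the GPU effects are fire-and-forget.

Code:
// Hypothetical sketch: keeping GPU physics purely cosmetic in multiplayer.
struct Simulation { void step(float dt) { /* ... */ } };

struct GameWorld {
    Simulation cpuSim;         // deterministic; every client runs this
    Simulation gpuEffects;     // optional eye candy: debris, cloth, particles
    bool gpuEffectsAvailable;

    void tick(float dt)
    {
        cpuSim.step(dt);       // authoritative: identical results everywhere
        // ...movement, collisions, hit detection read ONLY from cpuSim...

        if (gpuEffectsAvailable)
            gpuEffects.step(dt);  // fire-and-forget; never read back
    }
};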

Yep, the "cloth" in Mirror's Edge looks way cool when I'm standing still, but I would completely ignore it when in motion. Plus the "kickable paper" in Batman... meh.

Currently GPU assisted Physics is just a "battle of egos" between AMD and Nvidia for bragging rights. (Which only matters to the fanbois.) Your average PC gamer could care less.
 
While most AMD devs and engineers sound like really nice people, Richard Huddy seems interested in being their version of JHH.

Open standards? Paying developers for PhysX? That's hypocrisy, considering AMD paid Codemasters to implement DX11 (a closed standard) in Dirt 2. I'm all for OpenCL-based physics, but unless I see "ATI Radeon Premium Graphics" on every other game's startup videos like TWIMTBP, I can't see them mounting a challenge.
 
I think this is the major reason why advanced physics in games hasn't caught on yet. Currently, game developers can only use physics for "eye candy" that doesn't affect gameplay, since anything more would produce different gameplay results in a multiplayer game for somebody with or without an accelerated PhysX rig.

Until CPUs and/or GPUs in the $100 range become so powerful that they can run games at full res and high settings with processing power to spare for physics, true integration of physics into gameplay won't happen.

Yep, the "cloth" in Mirror's Edge looks way cool when I'm standing still, but I would completely ignore it when in motion. Plus the "kickable paper" in Batman... meh.

Currently, GPU-assisted physics is just a "battle of egos" between AMD and Nvidia for bragging rights. (Which only matters to the fanbois.) Your average PC gamer couldn't care less.

Yep, physics in multiplayer is certainly not practical atm.

But for me, I'm all for implementing GPU physics in single-player games, and I like the idea of being able to use my spare GPU for PhysX. Going to a new system, I have my current 9800GTX I can use for PhysX, which would otherwise be left collecting dust.

With modern boards now mostly supporting multiple graphics cards, I think this scenario will become more common as people upgrade to newer systems and find themselves with a spare graphics card from their old system which could be utilized for physics.
 
PhysX... I love it for what it does; I hate it because NVIDIA decided to keep it for themselves.

But where does AMD get off trolling NVIDIA for giving more money to devs? This isn't a bad thing. If NVIDIA wants to pay developers to add PhysX, a COMPLETELY OPTIONAL feature, to its games, then I'm completely cool with that. (And you should be, too.)

I hope NVIDIA / ATI get on the fraking ball and support a technology that works for both companies' cards. Maybe this is something DirectX 12 will take care of? We need GPU-accelerated physics for just about all of our games.
 
Yep, physics in multiplayer is certainly not practical atm.
I'm going to assume you have never played Battlefield: Bad Company 2 on PC. The physics in that game completely change the environment and gameplay. One minute you're sniping from a building; the next, someone hits it with an RPG and the walls and roof crumble, leaving you exposed. After a few hits, the building falls down completely. By the end of the round, the entire battlefield looks different, and the cover is completely different.
 
I'm sure NVIDIA has invested a lot in PhysX. If they give a shit about keeping it afloat, they should cut ATi a deal and UNLOCK THE FRAKING TECHNOLOGY for ATi cards. Didn't PhysX use to work on ATi cards? C'mon, NVIDIA, even if you sold it to individual ATi owners for five bucks a pop. Stop acting like Apple.
 
I'm sure NVIDIA has invested a lot in PhysX. If they give a shit about keeping it afloat, they should cut ATi a deal and UNLOCK THE FRAKING TECHNOLOGY for ATi cards. Didn't PhysX use to work on ATi cards? C'mon, NVIDIA, even if you sold it to individual ATi owners for five bucks a pop. Stop acting like Apple.

No, but it did work with an Nvidia card dedicated to PhysX and an ATI card rendering the graphics.

It still works with a crack; that's what I am using.
 
It's good to see a company pushing a product, rather than introducing it to the market and saying "here it is," like ATI did with Eyefinity. Who cares if Nvidia pays devs to use it? Complaining about a competitor pushing a rival product makes AMD look like they're simply butthurt that they don't have anything to compete with.

Though in all honesty, Nvidia should just license PhysX out; that would allow wide adoption across ATI platforms, put Nvidia in a great position to gain market share, and force ATI's hand on the subject.

But this statement just makes AMD look like a bunch of butthurt complainers. It really doesn't hurt AMD for Nvidia to pay devs to implement PhysX, so why even care?
 
I'm sure NVIDIA has invested a lot in PhysX. If they give a shit about keeping it afloat, they should cut ATi a deal and UNLOCK THE FRAKING TECHNOLOGY for ATi cards. Didn't PhysX use to work on ATi cards? C'mon, NVIDIA, even if you sold it to individual ATi owners for five bucks a pop. Stop acting like Apple.

The reason they locked it is that they didn't want to have to deal with people complaining about problems with PhysX on an ATI platform; they don't want to have to support a competitor's product with drivers for free.

It makes perfect sense. They did try to go to ATI and get them on board for some money, but ATI was too dumb to cut a deal.
 
What do you expect an ATI/AMD PR guy to say? Lol...

Nvidia actually works with devs A LOT more (or at least a hell of a lot more quickly) than AMD/ATI in optimizing games to run well on Nvidia cards; part of that is getting game devs to implement PhysX. ATI needs to shit or get off the pot. (I just like saying that.)

Ultimately, I believe that Nvidia GPUs will support both PhysX and OpenCL, ATI's will only support OpenCL, and games will build in OpenCL but may offer the PhysX choice as well if it offers some sort of extra functionality.

That's probably years away. Physics in games is just starting to take hold, and it's still only for eye candy; there aren't many examples of gameplay impact. But that is hopefully coming, because it will bring with it a lot of possibilities.
 
I think that PhysX and technologies like it are cool, but most developers probably don't want to use them for several reasons:

1.) It creates a feature set that a large portion of their customer base can't use, and thus alienates those customers to some degree.

2.) It is difficult to implement in games themselves.

3.) Multi-core CPU-based physics processing can be done just as easily, and it benefits everyone regardless of hardware configuration.

An open standard that works on both AMD and NVIDIA hardware is probably the solution most developers truly want before making GPU physics processing a standard part of the development process. Even then, many developers would probably rather develop CPU-based physics processing simply to leverage multi-core CPUs, which largely go underutilized as it is. I think most gamers have GPUs that could be used for physics processing, but not while handling a graphics load as well.

I totally agree. This technology was much more interesting when there was a third vendor with dedicated hardware devoted to it. The whole point of the tech in the first place was that the physics would be handled by a dedicated processor that had been made to function very efficiently at this task. If you're going to shoehorn it into a GPU, you might as well let one of the four cores on my i5 CPU do it.

What should we expect anyway? Carmack has moved on to building lunar landers. I can't blame him. It's certainly way more interesting than making computer games. It probably pays better too.
 
The reason they locked it is that they didn't want to have to deal with people complaining about problems with PhysX on an ATI platform; they don't want to have to support a competitor's product with drivers for free.

It makes perfect sense. They did try to go to ATI and get them on board for some money, but ATI was too dumb to cut a deal.

That is complete bullshit. It is 100% irrelevant what card is rendering the game. If there are problems with PhysX, then it is Nvidia's problem - period. Nvidia wouldn't have to support ATI at all - they just have to support their own damn hardware, but apparently that's too hard for them. They'd rather give developers hand jobs than actually support their customers.

If that ridiculous claim were even remotely true, then we'd all be too busy trying to get all of our drivers playing nice to actually do anything. Does your sound card ever corrupt your graphics? No? How about your NIC? Is your RAID card causing missing shadows? No? Then why the fuck would your physics card cause graphical glitches? Answer: it fucking doesn't.

/rant

Sorry, that isn't necessarily directed at you (it's definitely not meant as a personal attack); I'm just sick of people defending Nvidia with that utter bullshit. I swear, if any other company just up and *disabled* people's hardware, people would be going apeshit over it. But since it's Nvidia, somehow it's not only not wrong, but we should thank them for avoiding conflicts! :rolleyes:

Open standards? Paying developers for PhysX? That's hypocrisy, considering AMD paid Codemasters to implement DX11 (a closed standard) in Dirt 2. I'm all for OpenCL-based physics, but unless I see "ATI Radeon Premium Graphics" on every other game's startup videos like TWIMTBP, I can't see them mounting a challenge.

DX11 is not at all like PhysX. PhysX's direction is entirely determined by Nvidia. DirectX's direction is guided by Microsoft, but both ATI and Nvidia (as well as others) help hammer out the standard.
 
Ultimately, I believe that Nvidia GPUs will support both PhysX and OpenCL, ATI's will only support OpenCL, and games will build in OpenCL but may offer the PhysX choice as well if it offers some sort of extra functionality.

That's probably years away. Physics in games is just starting to take hold, and it's still only for eye candy; there aren't many examples of gameplay impact. But that is hopefully coming, because it will bring with it a lot of possibilities.

Just so you know, that will never happen. OpenCL is more or less an open version of CUDA. PhysX runs on top of CUDA. If you have a physics library that runs on OpenCL, games will either use that *OR* PhysX - they won't use both. Having two entire physics engines in a single game is something developers absolutely won't do.
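In other words, a game picks exactly one engine behind a common interface at startup, something like this hypothetical sketch (the names are mine, not any real API):

Code:
#include <memory>

// Hypothetical sketch: one physics backend behind a common interface,
// chosen once at startup; the game never runs two engines at once.
struct IPhysics {
    virtual ~IPhysics() {}
    virtual void step(float dt) = 0;
};

struct PhysXBackend  : IPhysics { void step(float) { /* CUDA path */ } };
struct OpenCLBackend : IPhysics { void step(float) { /* vendor-neutral path */ } };

std::unique_ptr<IPhysics> makePhysics(bool hasNvidiaGpu)
{
    if (hasNvidiaGpu)
        return std::unique_ptr<IPhysics>(new PhysXBackend);  // extra eye candy
    return std::unique_ptr<IPhysics>(new OpenCLBackend);     // any vendor
}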
 
It also doesn’t help that Nvidia has openly offered to share its PhysX technology with AMD, but AMD hasn’t taken up the offer.

AMD does a lot of talk about "open," but PhysX is as open as their much-touted DirectX 11. Although Microsoft isn't a direct competitor.
 
The reason they locked it is that they didn't want to have to deal with people complaining about problems with PhysX on an ATI platform; they don't want to have to support a competitor's product with drivers for free.

This argument is a bit of a sad point to make, since any of us with an ATI card for rendering graphics still have to use an Nvidia card for the PhysX.

So I'm not trying to call you out and say you're wrong, but what makes me any less of an Nvidia customer? I'm still using their hardware, just not as my primary GPU.

Not one person since it has been locked out has had any problems running PhysX on an Nvidia card paired with an ATI card once they successfully installed the hack. I understand Nvidia doesn't want to support a scenario like this if a problem did come up, but they purposefully spent the time and resources to lock out PhysX on such a platform, which is more effort than just refusing support altogether. That is just plain wrong, and they know it.
 