NVIDIA’s Pierre Terdiman Responds to “Weird, Confused” Comments concerning PhysX

Megalith

NVIDIA PhysX team member Pierre Terdiman has taken to his blog to set the record straight on a variety of “weird, confused” comments that have circulated after the RTX announcement regarding the middleware SDK and ray tracing in general. These include the notion that “nobody uses PhysX,” “PhysX is crippled,” “developers only use PhysX because NVIDIA pays them,” “ray tracing is the new PhysX,” and “ray tracing is a gimmick until it’s supported by consoles.”

PhysX is the default physics engine in both Unity and Unreal, which means it is used in tons of games, on a lot of different platforms (PC, Xbox, PS4, Switch, mobile phones, you name it). “PhysX” is not just the GPU effects you once saw in Borderlands. It has also always been a regular CPU-based physics engine (similar to Bullet or Havok). When your character does not fall through the ground in Fortnite, it’s PhysX. When you shoot a bullet in PayDay 2, it’s PhysX. Ragdolls? Vehicles? AI? PhysX does all that in a lot of games. It is used everywhere, and it is not going away.
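For readers who only ever associated PhysX with the GPU particle effects, the CPU side is just an ordinary rigid-body engine that any game can drive on any hardware. As a rough, hedged sketch (not taken from Terdiman's post; it assumes the stock PhysX SDK headers and the PxDefault helpers), a minimal CPU-only scene with a ground plane and a falling box looks something like this:

// Minimal CPU-only PhysX scene: a box dropping onto a static ground plane.
// Illustrative sketch only; assumes the standard PhysX SDK and PxDefault* helpers.
#include <PxPhysicsAPI.h>
using namespace physx;

int main()
{
    static PxDefaultAllocator allocator;
    static PxDefaultErrorCallback errorCallback;

    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, allocator, errorCallback);
    PxPhysics* physics = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // Everything runs on the CPU: a default dispatcher with two worker threads.
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.filterShader = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // Static ground plane plus one dynamic box dropped from above.
    PxMaterial* material = physics->createMaterial(0.5f, 0.5f, 0.6f);
    scene->addActor(*PxCreatePlane(*physics, PxPlane(0, 1, 0, 0), *material));
    PxRigidDynamic* box = PxCreateDynamic(*physics, PxTransform(PxVec3(0, 10, 0)),
                                          PxBoxGeometry(1, 1, 1), *material, 10.0f);
    scene->addActor(*box);

    // Step at 60 Hz; the box settles on the plane instead of falling through.
    for (int i = 0; i < 300; ++i)
    {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);
    }

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}

Nothing in that sketch touches the GPU, which is Terdiman's point: this kind of rigid-body work is what PhysX quietly does in most shipping games.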
 
Arkham Knight's PhysX is the only reason I went back to nVidia and most likely will stay with them for another generation, ugh.
 
Never addressed HairWorks™ in The Witcher 3.

I seriously doubt CD Projekt PAID nVidia to implement that.

nVidia eating the PC development costs for CDPR to include ShillWorks™ is the same thing as nVidia giving them cash to implement that stuff.

nVidia MADE PhysX™ and other ShillWorks™ a bit of a joke. It's their fault, no one else's.

I would say HBAO+ is their most successful middleware... PhysX on the CPU is meh. Hairworks is nice, but the performance hit is ridiculous.

With the prices of the RTX cards, they need a LubeWorks™ middle-ware.
 
Ah, the same guy that wrote a lengthy post debunking the "myth" of PhysX 2 being crippled on CPUs... by admitting it was crippled, just not on purpose, but because of a lack of time/effort/resources.
 
8 years ago called and they want their website back.

And I'm not even joking, it's Wordpress 2.6 which is literally from 2008. Turning off stylesheets actually improves its readability 10x.
 
It's almost too easy for me to bash NV these days but I do have to give this guy some credit. My only real complaint is how GPU accelerated PhysX disappeared. I put effort into getting games that supported it and they in turn performed well ahead of their time. Seriously had 2 G1 OC'd 970's SLI'd and a dedicated EVGA SC780 physX card before I abandoned it. That config rocked 4k/50-60 fps with Metro Last Light maxed! Same for Arkham City. Changing features to pipe thru the CPU is a waste only done to make coding easier.

Either way not a bad read and you can tell the guy does have some valid points.
 
"Hi and welcome to Nvidia. We are at defcon level 3 damage control. Our CEO Jensen was too busy Ray Tracing with his 1950's bomber jacket to give the customers what they wanted. We are now releasing fake numbers you will never see in home use unless you have a two thousand dollar CPU/Mobo/memory system to go with it. Also, we forgot to mention Physx and our past lighting effects are still in use in today's games, the games you actually play and care about. But instead of showing you those games, look at these pretty effects in Battlefield 5 that cost an EA head honcho his job because there is very little interest in this game. When the other team is trying to blow your head off at high speed gaming, remember to look at your reflection in the cars, windows, and other players eyes! Here at Nvidia, we and we mean Jensen, couldn't fuck up a launch any [H]arder if we here at Nvidia tried. NOW GEFORCE LEMMINGS, RUN OF THAT CLIFF FOR $600, $800+, and $1200+! Nvidia appreciates you supporting our future Auto Autonomous endeavors as gaming is now a second thought as we move forward in this ever changing industry!
Thank you again from everyone at NVIDIA!"

~At no point was this directed at Kyle Bennett, Brent Justice, and the crew of [H]ardOCP. Strictly satire and sarcasm.~
 
Physx itself isn't the problem. The PROBLEM, if he needs it spelled out for him, is that nVidia BOUGHT Physx technology, and then leveraged it to be an EXCLUSIVE feature of supported games IF you happened to be using nVidia hardware. Yes, CPU Physx is still alive and well, and can run on any hardware, but GPU ACCELERATED PHYSX was a cheap grab to lock people into nVidia hardware, and fortunately, was not adopted by the majority of game developers. Only a complete idiot would miss the implication of how it was used, or why RTX and nVidia's new PROPRIETARY ray tracing hardware is being compared to Physx.
 
Question:

Could we see Raytracing-specific GPUs used like PhysX cards?

For instance, use a 1080ti + a Raytracing RTX add-in card?

Not thinking Nvidia would do it even if it were possible, but I do wonder... Maybe with NVLink?
 
This reads as if Nvidia is abandoning support for GPU-accelerated physics and apologizing for it by pointing out the programming APIs are still in use for CPU-accelerated use cases.
 
Question:

Could we see Raytracing-specific GPUs used like PhysX cards?

For instance, use a 1080ti + a Raytracing RTX add-in card?

Not thinking Nvidia would do it even if it were possible, but I do wonder... Maybe with NVLink?
Based on 3D modeling and rendering you would think so; hell, even mixing and matching GPUs would in theory be easily possible. It's funny how Lucid Hydra and DX12 both held high promise for such things, but both more or less fizzled in actual practice. That said, Blender rendering is an example of where multiple GPUs, mixed or otherwise, give appreciable gains with no big fuss involved. I don't know why something akin to alternate-frame-rendering frame skipping isn't possible, where you render a scene like a movie at a set frame cap and, like a movie's black borders, the edges are rendered at a lower frame rate. Obviously both would be rendering the same whole scene, but the black-border edge portion would have some minor frame skipping to make the whole render process more manageable and easier on the GPU. Oh no, a bit less fluidity where my eyes aren't heavily focused? Seems like a fair trade-off.
 
Man I wish someone had a copy of the Wicked3D drivers and stuff they used back in the day to mix and match 3dfx and nVidia hardware. I forget what happened to the Wicked3D guys, I think some of them actually ended up at Mad Onion (3dmark) or something like that.

edit

I think they actually went to Alienware. There's an article out there on it somewhere if you dig around but it was a lot of years ago that I read it. Would be fun to play with that in a "legacy" rig just to screw with.
 
money is absolutely necessary in pushing your innovations.

NVIDIA has enough guts to back their push for PhysX with lots of money.
AMD = nope. Not their priority. Their ROI and risk analysis probably dissuaded them.

will it be best for gamers and consumers as a whole if AMD was willing to back their efforts with money too? = OF COURSE

same applies for raytracing.
 
Wrong! It is based on a DX12 standard.

Are you 100% sure about that? Nvidia's own press is treating it as proprietary... "based on" doesn't give me much faith. Either it adheres to DX12 standards or it does not...
 
money is absolutely necessary in pushing your innovations.

NVIDIA has enough guts to back their push for PhysX with lots of money.
AMD = nope. Not their priority. Their ROI and risk analysis probably dissuaded them.

will it be best for gamers and consumers as a whole if AMD was willing to back their efforts with money too? = OF COURSE

same applies for raytracing.

Um. Bullshit.
 
It's almost too easy for me to bash NV these days but I do have to give this guy some credit. My only real complaint is how GPU accelerated PhysX disappeared. I put effort into getting games that supported it and they in turn performed well ahead of their time. Seriously had 2 G1 OC'd 970's SLI'd and a dedicated EVGA SC780 physX card before I abandoned it. That config rocked 4k/50-60 fps with Metro Last Light maxed! Same for Arkham City. Changing features to pipe thru the CPU is a waste only done to make coding easier.

Either way not a bad read and you can tell the guy does have some valid points.

GPU PhysX failed because it was proprietary and limited to NVIDIA hardware. They even went so far as to break it on systems that had both Radeon and GeForce cards installed, so you couldn't even use a spare GeForce as a PhysX-only card. Couple that with the fact that both major consoles (and I consider the Switch to be a handheld and not a console) use AMD GPUs, and it is not surprising that developers don't want to spend money and programmer time to implement support for it.
 
Are you 100% sure about that? Nvidia's own press is treating it as proprietary... "based on" doesn't give me much faith. Either it adheres to DX12 standards or it does not...
RTX is NVIDIA's hardware pipeline and middleware. It uses DXR, which is built into the DX12 API.
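As a loose illustration (my own hedged sketch, not something from the post): because DXR is exposed through plain D3D12 rather than a separate NVIDIA API, an engine can ask any device whether raytracing is available with a standard feature check.

// Sketch: querying DXR (DirectX Raytracing) support through plain D3D12.
// Assumes a Windows 10 SDK with DXR support; error handling kept minimal.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

bool DeviceSupportsDXR()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return false;

    // OPTIONS5 carries the RaytracingTier capability introduced alongside DXR.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

int main()
{
    std::printf("DXR supported: %s\n", DeviceSupportsDXR() ? "yes" : "no");
    return 0;
}

Whether the driver satisfies that check with dedicated RT hardware (as RTX does) or with some fallback path is up to the vendor; the API itself is the DX12 one.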
 
The fact that it is a standard doesn't mean Nvidia doesn't pay devs to lock RTX to Nvidia-only hardware. I still remember when Nvidia locked antialiasing in Batman: AA to their hardware.

AMD does the same crap. Lichdom had NVidia users locked out of TressFX; AMD went out of their way blocking users, and there were still idiots calling it an open standard. They did it in quite a few games: in alpha/beta you could use it (and yes, it worked), then at full release it was GPU-locked.

Get over the delusion that any of them are "better than the others" or that they don't all try to do the same crap, because they do.
 
But if you have money to throw around... I mean, just look at Google/Alphabet and all the things they throw money at without seeing any ROI.

yeah, that just proves the point... you don't see AMD doing it. That's what the context is all about...

if you want good tech innovations and quick refreshes in the market, you cannot just accept AMD sitting around and letting NVIDIA do all the PR, marketing, franchising/branding work AND complain that NVIDIA is making things proprietary
 
When your character does not fall through the ground in Fortnite, it’s PhysX. When you shoot a bullet in PayDay 2, it’s PhysX. Ragdolls? Vehicles? AI? PhysX does all that in a lot of games.
Wut?

I don't remember falling through the floor (much) in games before PhysX. When I did it was usually due to a glitch in the level, cheat codes, and/or alcohol.

Sure, ragdolls and other objects tipping over are nice, but a lot of the examples he gave are trivial. And even the good stuff has been done in other games without PhysX.
 
AMD does the same crap. Lichdom had NVidia users locked out of TressFX; AMD went out of their way blocking users, and there were still idiots calling it an open standard. They did it in quite a few games: in alpha/beta you could use it (and yes, it worked), then at full release it was GPU-locked.

Get over the delusion that any of them are "better than the others" or that they don't all try to do the same crap, because they do.
From what I can gather, it was Lichdom, not AMD, that blocked it. If it was AMD, why wasn't it blocked in other games, especially since TressFX is practically open source, unlike the DLL black boxes from Nvidia?
 
Wut?

I don't remember falling through the floor (much) in games before PhysX. When I did it was usually due to a glitch in the level, cheat codes, and/or alcohol.

Sure, ragdolls and other objects tipping over are nice, but a lot of the examples he gave are trivial. And even the good stuff has been done in other games without PhysX.
He's saying that a lot of games people play today are using PhysX, not that these things didn't happen prior.
 
I only see AMD doing what can be considered 'beta' behavior. In a two-dog race, how is that good for us consumers?

Take yer fanboi glasses off. AMD has better support for Linux, OPEN drivers, and in general, open will beat closed EVERY SINGLE TIME.
 
8 years ago called and they want their website back.

And I'm not even joking, it's Wordpress 2.6 which is literally from 2008. Turning off stylesheets actually improves its readability 10x.
I actually didn't even bother to start reading once I saw how bad the layout was.
 
Wut?

I don't remember falling through the floor (much) in games before PhysX. When I did it was usually due to a glitch in the level, cheat codes, and/or alcohol.

Sure, ragdolls and other objects tipping over are nice, but a lot of the examples he gave are trivial. And even the good stuff has been done in other games without PhysX.
That and ragdoll physics are completely misused. They should be used to simulate female body part movements in eye-catching situations.
 
LOL, Borderlands 2. The Nvidia guys are right, 'cause they are probably head over heels a lot smarter than most of us.

 
I am one of the few that have two desktops with real Ageia cards in them. The effects in the 12 or so games that use the PPU look better than the GeForce card effects. It was a shame that Nvidia bought Ageia and killed a new technology, but then again Nvidia buys out and then destroys any potential tech competition it can...
 
money is absolutely necessary in pushing your innovations.

NVIDIA has enough guts to back their push for PhysX with lots of money.
AMD = nope. Not their priority. Their ROI and risk analysis probably dissuaded them.

will it be best for gamers and consumers as a whole if AMD was willing to back their efforts with money too? = OF COURSE

same applies for raytracing.
Yep. You've gotta put sugar on the end of your you know what for developers to do anything. Publish some new APIs and say "here ya go" and devs will say "cool maybe we'll use that some day". Never happens.

This is why you can't take the toddler meltdowns about "proprietary Ngreedia shit tecks, u suck!" seriously. These people would rather not see the new tech even exist than try to wrap their peabrains around the reality that somebody - usually Nvidia but sometimes AMD - has to pay the developer to make it happen.
 