NVIDIA PhysX strikes again: Cryostasis tech demo

Firstly, I think the majority of the market actually DOES have a PhysX accelerator. Over 50% of the discrete GPUs sold are nVidia.
The majority of the market currently isn't comprised of PhysX-capable NVIDIA GPUs. There's a difference between the marketplace being flooded with these cards and the current install base of them.
 
How would anyone not expect a similar future scenario for hardware physics?

I think you're not seeing my point.
I'm saying that PhysX is like Glide. You need to walk before you can run.
If it wasn't for Glide, there would probably never have been a Direct3D either. It went something like this:
1) 3DFX introduces 3d acceleration hardware and Glide library
2) Games start using Glide as a special feature for the small group of people owning the hardware.
3) Other vendors introduce their own 3d accelerators with their own proprietary libraries.
4) Most games continue to support Glide, and Glide only.
5) Most other vendors fail to gain any marketshare and drop out of the 3d accelerator business.
6) The few vendors that are left start to standardize their APIs bit by bit. The first step is MiniGL, a subset of OpenGL... Later, Microsoft introduces Direct3D, and MiniGL is eventually replaced by full-blown OpenGL support.

With physics it's something like this:
1) Ageia introduces physics acceleration hardware and PhysX SDK.
2) Some games start using PhysX as a special feature for the small group of people owning the hardware.
3) nVidia sees the potential of physics acceleration and how well it can run on their G80 architecture... so they acquire Ageia and their PhysX SDK, and make it work on their product line.
4) More and more games start using PhysX as a special feature for the now not-so-small group of people owning the hardware.
5) ???

These scenarios actually *are* incredibly similar.
Who knows what will happen next. They will probably standardize at some point, but it doesn't look like it so far. It took years in the graphics market as well, and we went through four 'de facto standard' APIs, more or less: Glide -> MiniGL -> OpenGL -> Direct3D.
In the meantime it was hell for all customers who didn't have hardware that supported the API du jour properly, and that also resulted in many big graphics names dropping out of the consumer market (Trident, Cirrus Logic, Tseng Labs, S3, Matrox...).

I'm not suggesting PhysX can't be successful (in one sense), but what I am suggesting is that it's an unacceptable solution for consumers, just as Glide was an unacceptable solution for consumers.

I think you misunderstand me.
I fully agree that for consumers it's hell, and it's unacceptable that your hardware doesn't work the way it could because you don't have the right drivers.
I'm just saying that this hasn't stopped game developers before, so why should it stop them now?

In one sense, PhysX is good because it's paving a certain road for the adoption of hardware physics, but that isn't necessarily the road developers and consumers want to travel.

Neither was Glide, but would we be where we are today if 3DFX/Voodoo/Glide never happened? I don't think so.
In fact, you might actually *need* proprietary technology at first, in order to let the concept mature without having to worry about what the competition is doing. Going into a new technology with open standards is far more risky: companies will be much more cautious, because they have no idea whether their investment is going to pay off. A proprietary API pretty much ensures that software developers will support you, and only you... and they will support you well, because you made sure the API is optimal for your particular product.

In some cases open solutions never quite replace the proprietary ones... How about Microsoft Windows/Office for example?
 
The majority of the market currently isn't comprised of PhysX-capable NVIDIA GPUs.

'The market' is the market for cutting-edge games... after all, old games or games not aiming for the cutting edge won't be using heavy physics anyway.
And in that market, yes, I think it's pretty safe to assume that most people will have a GPU less than 2 years old (or will be willing to upgrade by the time a big new game is released... raise your hand if you bought a new videocard to play Crysis or such *raises hand*), which in the case of nVidia is pretty much a guarantee of PhysX capability.
 
Is there any point in activating Ageia if you don't have SLI?

Yes.
I have an E6600 and 8800GTS320, and the demo gets much better framerates with PhysX acceleration enabled. That's the whole point.
 
I thought the whole point was to get a dedicated PhysX accelerator and not waste GPU performance on something that generally isn't that noticeable :). The calculations spent on this could go toward more advanced shader effects and other things. Whereas if you have SLI, you are generally maxing games out anyway :).

But otherwise I have had my eyes on this game for a very long time, and it should showcase the tech much better than Mirror's Edge... Having played that demo, I can't really see the reason to market PhysX with that game.

Still, the water spray looks awful, but maybe that is intentional, to show exactly how it's done. I sure hope so, because as much as some effects impressed me, others just distracted me by not looking right :)
 
I thought the whole point was to get a dedicated PhysX accelerator and not waste GPU performance on something that generally isn't that noticeable :).

No, for most people the point will be that an nVidia card can give much higher framerates with heavy physics than any other card can (which is testament to the great design of the G80 architecture, which is far more than just a videocard).
So they will buy nVidia cards.
Most people don't use SLI/CrossFire, and most people didn't buy a PPU card either.
They just want one CPU and one videocard, and whatever is best bang-for-the-buck.
In any game taking advantage of the hardware-accelerated power of PhysX, nVidia will beat AMD (and Intel) hands-down in bang-for-the-buck. And that is the point.
 
If you think people are going to run physx AND graphics on one card only, you are sadly mistaken. That is the point I think a lot of people are trying to make. Sure a lot of people might have a "physx capable" card, but not a lot of people have TWO to be able to use one for physx.
 
Most of the people arguing against Nvidia's approach with PhysX have not brought forward a better alternative. As Scali pointed out, Microsoft is not a middleware company, so it's doubtful that they will produce a standard physics library. Some have tried to use Direct3D as an example of a standard, but that is a flawed comparison: Direct3D is more comparable to CUDA. You still have to go in and code the things that you want to do. Even if Microsoft did develop a middleware solution, it would certainly be more proprietary than PhysX, as it would probably be based on DirectX 11 Compute and therefore only available on Windows platforms. Compare that to the ubiquitous support of PhysX across several major platforms/APIs - Windows, Linux, PS3, Xbox and, of course, GeForce via CUDA. I expect an OpenCL implementation soon enough, as they will eventually have to support competitors' GPUs to really get developers behind it.
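To make that Direct3D/CUDA comparison concrete, here's a rough sketch (made-up code, not the actual SDKs): middleware hands you a working simulation, while a low-level compute API only hands you the means to write one yourself.

```cpp
// Hypothetical code for illustration only - neither the real PhysX nor CUDA API.

// With physics middleware, the whole simulation is a library call:
//   scene->addActor(crateDesc);     // describe the world...
//   scene->simulate(1.0f / 60.0f);  // ...and the library integrates, collides, solves

// With a low-level API (Direct3D/CUDA style), you get raw compute and write it all:
struct Particle { float px, py, pz, vx, vy, vz; };

void integrate(Particle* p, int count, float dt)
{
    for (int i = 0; i < count; ++i) {
        p[i].vy -= 9.81f * dt;   // apply gravity
        p[i].px += p[i].vx * dt; // naive Euler integration
        p[i].py += p[i].vy * dt;
        p[i].pz += p[i].vz * dt;
    }
    // ...and you still owe yourself collision detection, a broad phase,
    // constraint solving, sleeping, etc. That's what the middleware is for.
}
```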

AMD is trying to do the exact same thing Nvidia has done with PhysX, except that they didn't have the foresight (or money) to purchase Havok. Now they have to try to work with Intel to get something out the door, and they're obviously far, far behind. Intel probably isn't the biggest proponent of GPU-accelerated physics either, as it's one more thing that makes CPUs less important. So neither AMD nor Intel is trying to be any more open than Nvidia has been... they're simply hating on something that they can't compete with at the moment.

So if we can't rely on Microsoft to develop and maintain a free physics library, then where is it gonna come from? Somebody has to step up and make the investment to push the technology forward. We have 2K, EA and now THQ getting on board. If Nvidia succeeds in getting enough people behind PhysX, it will become the standard physics middleware library, just like Havok was a few years ago (and arguably still is, depending on who you ask). Of course Nvidia will try to profit from its efforts, and who can blame them? But sitting on the sidelines and whining without offering a better alternative isn't a very productive use of anybody's time...
 
If you think people are going to run physx AND graphics on one card only, you are sadly mistaken. That is the point I think a lot of people are trying to make. Sure a lot of people might have a "physx capable" card, but not a lot of people have TWO to be able to use one for physx.

That doesn't matter if you're physics bound - http://www.firingsquad.com/hardware/cryostasis_techdemo_performance/page4.asp

It's going to take a while for people's mentality to change, but physics is just another workload for the GPU, just like rasterization, shading, shadowing, AA and all the other bits that come together to produce the final image. If you stop thinking about physics as some tacked-on effect, it'll be a lot easier to appreciate that.

A popular argument is that physics MUST impact gameplay for it to be worthwhile. That's funny considering none of the DirectX versions that Microsoft has put out over the years has done anything at all in that regard. All of this time everybody has been focused on making prettier images. Now all of a sudden pretty isn't good enough. I can't wait for AMD to get on board the physics train so we can move forward with the tech and get past the naysaying and bickering that always accompanies these evolutionary developments.
 
They can't use PhysX for anything groundbreaking because the majority of the market doesn't have a PhysX accelerator.

You're absolutely right. It's a classic chicken and egg scenario. At least Nvidia is trying to provide both the chicken and the eggs at the same time. What are the other guys doing?
 
That doesn't matter if you're physics bound - http://www.firingsquad.com/hardware/cryostasis_techdemo_performance/page4.asp

It's going to take a while for people's mentality to change, but physics is just another workload for the GPU, just like rasterization, shading, shadowing, AA and all the other bits that come together to produce the final image. If you stop thinking about physics as some tacked-on effect, it'll be a lot easier to appreciate that.

A popular argument is that physics MUST impact gameplay for it to be worthwhile. That's funny considering none of the DirectX versions that Microsoft has put out over the years has done anything at all in that regard. All of this time everybody has been focused on making prettier images. Now all of a sudden pretty isn't good enough. I can't wait for AMD to get on board the physics train so we can move forward with the tech and get past the naysaying and bickering that always accompanies these evolutionary developments.

But even those framerates aren't playable at 1600 res. Don't you think most people will want to turn the physics off or down and get better graphics? I think that will be the case in most games, and that's what most people will do, especially when they have older cards, like most people do. Developers will know that.

Unless nVidia does something cool, like putting an extra 8900 chip onto their future cards JUST for physics, I think that trend will continue. If nVidia did that, I think they would have a winner.
 
I'd buy a separate card from nvidia that does physx if it was compatible with my hd4870x2. From what I hear the old PPU is not a good enough solution + no driver support?
 
Don't you think most people will want to turn the physics off or down and get better graphics?

Considering physics effects result in better graphics, I don't get the distinction you're trying to make. That's like asking someone to choose between more realistic smoke and better graphics. They're one and the same thing.

How is turning off a graphical effect that's animated by physics computations going to give you better graphics? More realistic smoke, fire, hair, water and cloth all contribute to graphical fidelity. Those things in combination with more advanced shadowing and shader techniques, more detailed textures and better animation will bring us more immersive games. It's a bit naive to say one thing will bring more gains over another. If anything it's physics that has been neglected for far longer than the other aspects of our gaming experience and can benefit the most from a renewed focus.
 
I'd buy a separate card from nvidia that does physx if it was compatible with my hd4870x2. From what I hear the old PPU is not a good enough solution + no driver support?

As far as the PPU not being a good enough solution - probably not for the future. For all current games, yes. And as for driver support, it's updated regularly by nVidia. I agree, though - I would buy one from nVidia to go along with my 4870x2 if it was compatible.


Considering physics effects result in better graphics, I don't get the distinction you're trying to make. That's like asking someone to choose between more realistic smoke and better graphics. They're one and the same thing.

How is turning off a graphical effect that's animated by physics computations going to give you better graphics? More realistic smoke, fire, hair, water and cloth all contribute to graphical fidelity. Those things in combination with more advanced shadowing and shader techniques, more detailed textures and better animation will bring us more immersive games. It's a bit naive to say one thing will bring more gains over another. If anything it's physics that has been neglected for far longer than the other aspects of our gaming experience and can benefit the most from a renewed focus.

I see what you're saying and agree that it does affect graphics, but I think the real benefit of physics is not graphics but gameplay. I don't think we're at a point yet where the other graphics are so photo-realistic that GPUs are just sitting around doing nothing and could take on the additional physics load (on the same card).

No one wants realistic physics more than I do. I love Ghost Recon and realistic games like that, and I long for the day of bullet drop, penetration, grenade fragments, etc. - all the stuff you don't SEE. That's why I bought a PPU - for only a few games. But until there is a true physics solution that developers can COUNT on being there on most gamers' machines, I don't think it's viable. They will just put it toward things that don't affect gameplay (just graphics), and that is a shame. And I think that solution will eventually be CPU cores. Not the best solution as far as performance goes, but you have to go with what is out there and work up.
 
to me, the future of physics is one of 3 scenarios:
1. nvidia physx
2. dx11 physics
3. 128-core cpus and well-optimized software physics

if nvidia physx succeeds soon and becomes well established, the other options will be deemed unnecessary and aborted.

all these new boards have multiple pcie x16 slots, so buying a 9600gt and having it be a dedicated ppu shouldn't be too difficult a proposition in the future...
 
And I think that solution will eventually be CPU cores. Not the best solution as far as performance goes, but you have to go with what is out there and work up.

I don't think so. If you look at the Steam Hardware Survey for example, you'll see that there are more people with a GeForce 8 or higher card than there are people with a CPU with more than 2 cores.
So as for what is 'out there', it's not CPU cores, it's the nVidia GPUs.
 
Whether DirectX gets physics support or not is becoming more and more a moot question. With the install base of Windows dropping more and more (OS X is a viable contender...), being cross-platform (PC, Mac and consoles!) is becoming more and more relevant. This means that OpenGL cannot be counted out. DirectSound has already been killed off in Vista, and games now use OpenAL, which is also supported on current consoles.

PhysX will remain what it is now: a separate library/API, just like OpenGL, OpenAL and OpenCL.

That said, my company uses OAL, OGL and PhysX for its game engine, and we intend to make use of PhysX wherever possible to increase realism and improve graphics. It's actually a really cheap way to improve graphics, as almost no effort is required on the side of the developer. Hence, as an independent developer, we love this technology.

Of course, we always give you the option to scale it down to match your hardware :)
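Roughly what that scaling looks like in practice (a simplified sketch with made-up names, not our actual engine code): pick a detail level, and only use hardware acceleration when it's actually there.

```cpp
// Hypothetical sketch of scalable physics settings - illustration only.
enum PhysicsDetail { PHYS_OFF, PHYS_LOW, PHYS_MEDIUM, PHYS_HIGH };

struct EffectBudget {
    int  debrisParticles;    // particles a destruction effect may spawn
    bool simulateCloth;      // cloth simulation on/off
    bool useHardwarePhysics; // offload to GPU/PPU only if one is present
};

EffectBudget selectBudget(PhysicsDetail level, bool hwAvailable)
{
    EffectBudget b = { 0, false, false };
    switch (level) {
    case PHYS_LOW:
        b.debrisParticles = 200;
        break;
    case PHYS_MEDIUM:
        b.debrisParticles = 2000;
        b.simulateCloth = true;
        b.useHardwarePhysics = hwAvailable;
        break;
    case PHYS_HIGH:
        b.debrisParticles = 20000;
        b.simulateCloth = true;
        b.useHardwarePhysics = hwAvailable;
        break;
    case PHYS_OFF:
    default:
        break;
    }
    return b;
}
```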
 
Whether DirectX gets physics support or not is becoming more and more a moot question. With the install base of Windows dropping more and more (OS X is a viable contender...), being cross-platform (PC, Mac and consoles!) is becoming more and more relevant. This means that OpenGL cannot be counted out. DirectSound has already been killed off in Vista, and games now use OpenAL, which is also supported on current consoles.

I think it's a bit early for that. Windows is still at about 90% marketshare. It could take years before it gets to a point where other OSes actually matter (if that ever happens... the media tried very hard to make Windows Vista a failure, which might be part of the reason for the current 'dip', if you can even call it that... For all we know, Windows 7 is going to be a huge success and Windows will regain its marketshare, perhaps even expand on it further).
Aside from that, I think Windows still has some things going for it. I don't think we'll see a tour-de-force game like Crysis on any non-Windows platforms anytime soon, because of various technical difficulties. Windows just has the best hardware-support and the OS and drivers are designed/optimized to handle gaming. So you can always use the latest graphics cards and features on Windows, and there never really are any performance problems.
I guess a good example of that was the Commodore Amiga. Although the PC had the largest marketshare, the Amiga was a more popular gaming platform, simply because there were more games, and if you had the same game on PC and Amiga, the Amiga version was usually better.
 
OS X is a viable contender



Hahahahahahahahahhahahahahahahah.


hahahahhahahhahahahahahahahahahahahahahahahahaha.

I am so getting banned for this, but seriously, I find it hard to listen to your warped view of the tech world.
 
A popular argument is that physics MUST impact gameplay for it to be worthwhile. That's funny considering none of the DirectX versions that Microsoft has put out over the years has done anything at all in that regard. All of this time everybody has been focused on making prettier images. Now all of a sudden pretty isn't good enough. I can't wait for AMD to get on board the physics train so we can move forward with the tech and get past the naysaying and bickering that always accompanies these evolutionary developments.

Well, most of the naysayers have a history of just bashing anything that isn't their "color", if you know what I mean. These people only see black and white. There's no gray in between, and those shades of gray represent the potential of the tech and how it can improve our gaming experience, which they obviously don't want to see.

Realistic physics no doubt improve the eye candy, and that's mainly what we'll see at first. But as this tech is adopted by more and more developers (which is already happening with the latest announcements by THQ, 2K and EA, which own many known franchises), the realistic physics that basically influence eye candy now will gradually evolve into gameplay-impacting physics.
 
I think it's a bit early for that. Windows is still at about 90% marketshare. It could take years before it gets to a point where other OSes actually matter (if that ever happens... the media tried very hard to make Windows Vista a failure, which might be part of the reason for the current 'dip', if you can even call it that... For all we know, Windows 7 is going to be a huge success and Windows will regain its marketshare, perhaps even expand on it further).
Aside from that, I think Windows still has some things going for it. I don't think we'll see a tour-de-force game like Crysis on any non-Windows platforms anytime soon, because of various technical difficulties. Windows just has the best hardware-support and the OS and drivers are designed/optimized to handle gaming. So you can always use the latest graphics cards and features on Windows, and there never really are any performance problems.
I guess a good example of that was the Commodore Amiga. Although the PC had the largest marketshare, the Amiga was a more popular gaming platform, simply because there were more games, and if you had the same game on PC and Amiga, the Amiga version was usually better.

Oh, I don't contend that Windows isn't superior in terms of hardware support. After all, Macs are still a relatively closed platform (although the walls have been crumbling for a while since the shift to Intel CPUs). Don't forget, however, that more Mac systems and laptops are being sold than ever before, and that the marketshare of OS X is now at a point where more and more companies are paying attention to it.

I would like to state that the drivers for the little hardware OS X _does_ support are generally excellent. Macs have their issues, but being a relatively closed platform, the amount of quality control is relatively high compared to the PC platform.

At any rate, don't forget that out of all the current-gen consoles only the X360 supports (a modified version of) DirectX, while the others went with technologies like OGL and OAL. Consoles are also the primary gaming platform(s) for most people. So physics support in DX will ultimately be only a small piece of the puzzle :)

I am so getting banned for this, but seriously, I find it hard to listen to your warped view of the tech world.

What, you missed the reports over the past few years of skyrocketing sales of Apple systems? Also, you missed people gawking over Windows dropping far below 95% marketshare for the first time? Don't tell me you also missed Internet Explorer sliding toward obscurity (<50%)? I think there's a rock you may want to say 'hi' to :)
 
I see what you're saying and agree that it does affect graphics, but I think the real benefit of physics is not graphics but gameplay. I don't think we're at a point yet where the other graphics are so photo-realistic that GPUs are just sitting around doing nothing and could take on the additional physics load (on the same card).

Well, you touched on two things there. In terms of gameplay physics, there probably is enough room to grow on CPUs; it's up to developers to be more creative with their gameplay physics integration, the way Valve did with HL2. However, the vast majority of physics effects are too compute intensive for even the fastest CPUs, especially those involving particle systems or cloth simulation.
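To put some numbers on "too compute intensive" (an illustrative sketch; the mesh sizes are hypothetical): a mass-spring cloth relaxes every spring several times per frame, so the cost is springs x iterations x framerate.

```cpp
#include <cmath>

// Hypothetical cloth relaxation inner loop - illustration only.
struct Spring { int a, b; float restLength; };

void relaxCloth(float* px, float* py, float* pz,
                const Spring* springs, int springCount, int iterations)
{
    for (int it = 0; it < iterations; ++it) {
        for (int s = 0; s < springCount; ++s) {
            const Spring& sp = springs[s];
            float dx = px[sp.b] - px[sp.a];
            float dy = py[sp.b] - py[sp.a];
            float dz = pz[sp.b] - pz[sp.a];
            float len = std::sqrt(dx*dx + dy*dy + dz*dz);
            if (len == 0.0f) continue;                    // coincident points
            float k = 0.5f * (len - sp.restLength) / len; // split the correction
            px[sp.a] += dx * k;  py[sp.a] += dy * k;  pz[sp.a] += dz * k;
            px[sp.b] -= dx * k;  py[sp.b] -= dy * k;  pz[sp.b] -= dz * k;
        }
    }
}
// A 256x256 cloth has on the order of 260,000 springs; at 8 iterations and
// 60 fps that's roughly 125 million spring updates per second - embarrassingly
// parallel work that suits a GPU's many cores far better than a dual-core CPU.
```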

The other point with regard to "other graphics" is what I was referring to earlier. Why is it that "other graphics" must be maxed out before "physics graphics" gets a turn? Is one more important than the other? Would better HDR have a greater impact on your experience than realistic water? What about a game like Bioshock, where water was everywhere? It's not a black and white decision; there are games where physics will have a greater impact than the "other graphics" effects.
 
I would like to state that the drivers for the little hardware OS X _does_ support are generally excellent. Macs have their issues, but being a relatively closed platform, the amount of quality control is relatively high compared to the PC platform.

No question about it, but it took quite a while for Apple to support the newer DX10-generation videocards (and even today I don't think you can run the latest models in Apple machines)... and aside from that, OpenGL didn't (doesn't?) support all hardware features in OS X.

At any rate, don't forget that out of all the current-gen consoles only the X360 supports (a modified version of) DirectX, while the others went with technologies like OGL and OAL. Consoles are also the primary gaming platform(s) for most people. So physics support in DX will ultimately be only a small piece of the puzzle :)

You shouldn't think of consoles that way. Firstly, the version of DirectX that Microsoft uses on the Xboxes isn't a carbon copy of the Windows version. The APIs have a similar layout, but that's about it. There's no direct source-code compatibility, so it isn't portable.

Secondly, while various consoles may support OpenGL, it is generally not used, because it is a poor fit for the specific hardware of that console.
For example, the strength of the PS3 is its Cell processor. A standard OpenGL API cannot take advantage of any of that, and will just bottleneck on the system core of the Cell (which isn't very fast).
Most game developers will roll a lot of their own graphics code with optimized routines, and pretty much bypass OpenGL altogether. As such, the code isn't portable anyway.

Portability is instead solved at the engine level. Game developers will design their engine in such a way that it abstracts the underlying hardware. The engine itself is then implemented and optimized for all the target platforms, and then (most of) the actual game code and content can run on top of that without changes. This is still a very costly job, which is why most smaller game studios will just license a multi-platform engine from Epic, id, Valve or such.
In fact, probably the biggest OpenGL game engine, the Doom 3 engine, might use OpenGL on Windows, OS X and Linux, but it was also ported to the Xbox, where it uses Direct3D.
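Roughly what that engine-level abstraction looks like (a hypothetical sketch, not any shipping engine's code): game code talks to an interface, and each platform gets its own optimized backend.

```cpp
// Hypothetical sketch of portability at the engine level - illustration only.
struct Mesh; // engine-defined resource handle

class Renderer {
public:
    virtual ~Renderer() {}
    virtual void drawMesh(const Mesh& mesh) = 0;
    virtual void present() = 0;
};

class D3DRenderer : public Renderer {      // Windows / Xbox backend
public:
    void drawMesh(const Mesh& mesh) { (void)mesh; /* Direct3D draw calls here */ }
    void present() { /* Present() */ }
};

class GLRenderer : public Renderer {       // OS X / Linux backend
public:
    void drawMesh(const Mesh& mesh) { (void)mesh; /* OpenGL draw calls here */ }
    void present() { /* SwapBuffers() */ }
};

// Game code never touches D3D or GL directly, so it ports unchanged:
void renderFrame(Renderer& r, const Mesh& world)
{
    r.drawMesh(world);
    r.present();
}
```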
 
Physics is the way to go for greater games/more realism.
Look around you...everything you see is PHYSICS.
The current status is that most games take place in a dead, static world, where "kinda-sorta" solutions are used to dupe the user.

But with a better physics API(supported by hardware) we will get living, dynamic worlds...

Raytracing & physics are the next big things... with the physics solution gaining momentum now, and Intel saying that they would like to go the raytracing route in the future.

I am looking forward to the clash between NVIDIA's PhysX and Intel's Havok when the x86 "Larrabee" arrives.

There is a limit to how pretty you can make a picture... without interaction.
 
Well, most of the naysayers have a history of just bashing anything that isn't their "color", if you know what I mean. These people only see black and white. There's no gray in between and these shades of gray, are represented by the potential of the tech and how it can improve our gaming experience, which they are obviously not wanting to see.

Well, yeah, that's obviously what's happening with most of the anti-PhysX posters. They're rallying against something that the competition has, and that's understandable. What's annoying is that they're not offering a better solution - it's as if they prefer we game in bland static environments forever. I'm sorta surprised at Kyle's stance - his proposal of Microsoft-sponsored physics libraries seems like a copout. He's just fueling the anti-PhysX noise instead of trying to educate his constituents on exactly what's in play here. And if he really believes Microsoft will be our savior, he should explain why.

Tim Sweeney seemed happy to use PhysX for certain effects in UE3.0, but there hasn't been much more out of him with regard to where he thinks physics will live in the future. Newell and Valve seem focused on CPUs, and given what they did with HL2 years ago, they can probably do some impressive things with the new chips that are out there now. Carmack seems ambivalent. But let's see what the guys at THQ, EA and 2K come up with... if they fail, then PhysX fails, and all the naysayers will be proven right.
 
Elledan, unfortunately the people who are most likely buying Apple are buying the brand, and are not the sort of knowledgeable people this topic concerns.

Another fact is that PC gamers want performance, and are in no way buying any pre-built system, let alone an Apple.

Internet Explorer has no relevance here, nor is it indicative of Microsoft as a whole.

Again, you display a naive, or just ignorantly warped, view of things. I am surprised you do not use the M$ moniker.
 
But let's see what the guys at THQ, EA and 2K come up with...

EA now includes Crytek as well. Imagine what they could do if their next CryEngine used hardware-accelerated PhysX instead of their own CryPhysics.
 
I don't think the fact that these publishers have licensed PhysX means that they're going to force it down the throats of their developers. Especially not guys like Crytek.

This was Crytek's stance on it last year. I doubt that view has changed much.

Crytek’s Cevat Yerli, executive producer and director of Crysis, points out another problem with supporting a physics processor: the need to create a gaming experience that’s different enough from what’s possible on mainstream hardware to justify the gamer’s investment in the hardware, without making an entirely different game for the majority of gamers who don’t have the hardware.

In an interview with Maximum PC’s editor-in-chief Will Smith, for an October issue cover story, Yerli says “We are not supporting GPU or dedicated physics processors for a variety of reasons. The main one is that we did not want to change the core gameplay physics for our min-spec configurations. We have been optimizing our dynamics code for many years now, so it can run robust and as optimally as it can on CPUs of previous generations while also taking advantage of newer multi-core architectures. So you are best equipped with a quad core—if you have the budget—but Crysis will do great on dual-core configurations as well.”
 
This was Crytek's stance on it last year. I doubt that view has changed much.

To be honest, that statement doesn't make a lot of sense to me (not the first time with Mr. Yerli, either).
Crysis is one of the most scalable games on the market today, as far as physics go. They have 4 different levels of physics.
That kind of scalability is EXACTLY what you'd want for accelerated physics... just like it's what you want for handling single-, dual- and quad-core CPUs.
Since they are also part of the TWIMTBP program, there's even more reason to support nVidia/PhysX...

So when was this statement made? Probably when it was still Ageia with their PPU card? Or when Havok FX was just a rumour?
I'm quite sure that they have a completely different view on the situation today.
 
Well, yeah, that's obviously what's happening with most of the anti-PhysX posters. They're rallying against something that the competition has, and that's understandable. What's annoying is that they're not offering a better solution - it's as if they prefer we game in bland static environments forever. I'm sorta surprised at Kyle's stance - his proposal of Microsoft-sponsored physics libraries seems like a copout. He's just fueling the anti-PhysX noise instead of trying to educate his constituents on exactly what's in play here. And if he really believes Microsoft will be our savior, he should explain why.

I don't understand it, because GPU physics effects were said to be the next big thing (even by people at ATI, who showed them running on their X1900 cards), but that's all they did. Now that someone is actually doing something about it, AMD and ATI downplay it, because they have nothing better and because it's NVIDIA doing it. And obviously their fanbase follows the same path...

trinibwoy said:
Tim Sweeney seemed happy to use PhysX for certain effects in UE3.0, but there hasn't been much more out of him with regard to where he thinks physics will live in the future. Newell and Valve seem focused on CPUs, and given what they did with HL2 years ago, they can probably do some impressive things with the new chips that are out there now. Carmack seems ambivalent. But let's see what the guys at THQ, EA and 2K come up with...

They just can't. The physics used in HL2 are not complex in any way, mainly because the number of objects with which we and the world can interact is small. As you increase the object/particle count, i.e. coming closer to reality, there's just not a single 'affordable' CPU option that the regular Joe can buy to get those effects. Increasing the number of CPU cores as we know them is not a solution, in terms of either price or performance.
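Some back-of-the-envelope arithmetic on why object count kills CPUs (naive pairwise collision testing; real engines use broad phases, but the trend holds):

```cpp
#include <cstdio>

// Naive collision detection tests every pair: n*(n-1)/2 tests per step.
int main()
{
    const long long counts[] = { 100, 1000, 10000, 100000 };
    for (int i = 0; i < 4; ++i) {
        long long n = counts[i];
        long long pairs = n * (n - 1) / 2;
        std::printf("%6lld objects -> %12lld pair tests/step, %lld/sec at 60 fps\n",
                    n, pairs, pairs * 60);
    }
    return 0;
}
// 10,000 objects is ~50 million pair tests per step - around three billion per
// second at 60 fps. That's why more cores alone doesn't get the regular Joe there.
```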
And I'm not surprised that Newell downplayed GPU physics effects a while back, because there's a big conflict of interest there. They are still using the aging Source engine for their latest games, which uses Havok's API, so they gain nothing by adopting or praising PhysX at this point.

trinibwoy said:
if they fail, then PhysX fails, and all the naysayers will be proven right.

If they fail, that certainly doesn't make the naysayers right, because some of them are against a color, not the technology. But given that they only see in black and white, they can't tell the difference between the two.
 
Do you at least acknowledge the potential of this technology? I believe Kyle does, although he's still pinning idle hopes on CPU cores.

What potential? I never said hardware-accelerated physics is bad. What I mean is that PhysX won't be the main physics engine of the future, not until more games support it.
 