Nvidia PhysX strikes again: Cryostasis tech demo

kumquatsrus

[H]ard|Gawd
Joined
Sep 15, 2007
Messages
1,326
Just found some performance previews on this latest title being developed with PhysX technology in mind. Looks quite promising, more so than Mirror's Edge. Granted, it's still not gameplay-changing physics, but it certainly makes the game more immersive and atmospheric, IMO... which is a step in the right direction. Hopefully the tech will eventually be adopted to the point where physics can completely change the way we play games. That's the idea, at least. See the links below and offer your thoughts/comments.

http://www.pcgameshardware.com/aid,...X_10_tech_demo_of_the_Physx_shooter_reviewed/

http://www.firingsquad.com/hardware/cryostasis_techdemo_performance/
 
The water looks cool, imo. The water ripples could use some work, though. I liked how the water physics worked in Resistance 2.
 
Thanks OP! Didn't know about this one. Will try it out tonight!
 
Yeah, I would take the ATI scores with a grain of salt. I was "smart" and bought a PPU with my new system since I knew Nvidia was going to push Physx.

My test system:

E8500 (Stock Speeds)
4 GB DDR2-800
4870X2 (Stock Speeds - driver 8.12)
BFG Ageia PPU

Running at 1280x1024 Default Medium settings, No AA

First off, I had to go back one revision of the PhysX software (to 8.09) to get any kind of performance. Second, there were artifacts with the lights and some reflections (highlights on walls from the light) that showed up as "rainbows".

That said, without the PPU I got:

Avg: 10 fps
Min: 3 fps
Max: 115.1 fps

With PPU:

Avg: 21.7 fps
Min: 8.3 fps
Max: 105.9 fps

All other PhysX demos and games ran as normal.
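As a quick sanity check on those numbers (this snippet is mine, not from any benchmark tool), the PPU roughly doubles the average frame rate and nearly triples the minimum, while the maximum actually drops a little:

```python
# Compare the posted Cryostasis tech demo results with and without
# the Ageia PPU, and compute the speedup factor for each metric.
without_ppu = {"avg": 10.0, "min": 3.0, "max": 115.1}  # fps
with_ppu = {"avg": 21.7, "min": 8.3, "max": 105.9}     # fps

speedup = {k: with_ppu[k] / without_ppu[k] for k in without_ppu}

for metric, factor in speedup.items():
    print(f"{metric}: {without_ppu[metric]} -> {with_ppu[metric]} fps ({factor:.2f}x)")
# avg comes out around 2.17x, min around 2.77x, max around 0.92x
```

The minimum fps improving the most is what you'd hope for: the PPU offloads the physics spikes that cause the worst stutters, while the max fps (light-physics scenes) barely depends on it.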
 
Basically, NVidia is pushing Physx down everyone's throat and bribing enough people to make it so popular that it will be necessary to play any games. So ATi will be screwed unless they make their own comparable tech, or force everyone to buy a PPU.... Am I getting this right?
 
Keep 'em coming; it seems there is finally something to make good comparisons with for PhysX hardware. :)

Only if you're going to compare Nvidia CPU PhysX to Nvidia with GPU PhysX to Nvidia with PPU PhysX. I would invalidate any ATI results, as it's obviously so borked that we're not sure what's holding it back and how it's affecting even PPU performance. I had the feeling my PPU was waiting on the video card.
 
Basically, NVidia is pushing Physx down everyone's throat and bribing enough people to make it so popular that it will be necessary to play any games. So ATi will be screwed unless they make their own comparable tech, or force everyone to buy a PPU.... Am I getting this right?

Well, for a game or two maybe. But until physics is in a standard, open format, I don't think you'll see it used much except for cloth, particles, etc. i.e. stuff that doesn't really affect the gameplay.

I also really think it'll move to the CPU eventually, with the CPU manufacturers putting core after core on chips now. Think about it: it will be easier for developers to just require a faster CPU or more cores as a standard, knowing everyone will have it, than to require specialized hardware.
 
So ATi will be screwed unless they make their own comparable tech

They could still be screwed even if they DO make their own comparable tech... The problem is that physics libraries tend to be mutually exclusive.
So if a developer decides to use PhysX for a game, they can't easily switch to Havok or some other library down the line.
So if PhysX is already a common option, with many developers that are familiar with it, it will be very hard for ATi to make them switch to another library... Besides, that would only reverse the problem, because then nVidia users will be left out.

So PhysX could be playing a key role in the GPU market in the next few years. If it becomes a widely adopted standard, it will be hard for either AMD/ATi or Intel to get a hold on the graphics/physics market.

All I know is that I have a computer that's getting a bit long in the tooth now... but my 8800GTS320 ran this techdemo a lot better than my E6600 did. So even on my old hardware it makes quite a difference. Makes you wonder whether you should upgrade your CPU or your GPU first.
 
What is a good card to use as a PPU? When is a card too weak? For example, would a 9500GT be as good a PPU as a 9800GTS?
Do we now need to review cards as PPUs too?
 
Tech demos so far using PhysX have all been rigged; seeing as Nvidia is even more crooked than Ageia (maybe equal), you can't really trust the performance figures.

Also, IMO Intel may be the key player here, alongside Microsoft. While we focus on the battle between Nvidia and ATI for graphics, don't rule out Intel muscling in: if they want to, they will, and with force. ATI and even AMD are one thing, but Nvidia is small fry compared to Intel, and not even fanboys can alter that.

Yes, I am completely biased against PhysX.
 
Basically, NVidia is advancing Physx because Ageia couldn't make it so popular that it will be necessary to play any games. So ATi will be screwed since they refused the tech when Nvidia offered it to them..... Am I getting this right?

Fixed..;)
 
I think physics will have to be a heterogeneous model that is part of DX before it ever becomes truly a part of gaming.
 
I think Havok will continue to be "it" for a good long while. Everyone has a CPU that can use the technology, and more and more cores are sitting idle while gaming, whereas GPUs tend to already be pushed to 100% utilization. Hmmmm. Now start adding game designs that leverage PhysX, and half of your target market can't even begin to realize those game effects because they have an AMD or Intel (looking forward here) video card. That's not good ROI.

OpenCL is coming closer, but do not expect to see it rolled into Windows 7 or DX11; in the world of PC gaming, it is really a moot point for quite a while.

Like it or not, I think Microsoft has to take the lead on this to make it happen.
 
"Basically, NVidia is advancing Physx because Ageia couldn't make it so popular that it will be necessary to play any games. So ATi will be screwed since they refused the tech when Nvidia offered it to them..... Am I getting this right?"

That would be correct, and they gave it to us for free... :cool:

I am pro PhysX, love the effects, and think it will be nothing less than a major win for Nvidia.

Good games will be out soon, and people, just by their nature, hate lacking the ability to see on their system any feature built into a game...

I know to some, GPU PhysX is just Smoke and Mirrors... ;)

But to me it's the way gaming will be going in the near future!!
 
I know to some, GPU PhysX is just Smoke and Mirrors... ;)

What part of the advancements in gaming over the past 10 years or so *wasn't* just smoke and mirrors?
Basically games have mainly advanced in terms of eyecandy. The only two other improvements have been AI and physics, but physics has been held back by a lack of processing power. That is what inspired Ageia to develop a PPU, and now we have GPUs to take over that task.
So finally the groundwork has been laid for a breakthrough in gaming.
 
Yup, and as I say, games are in the business of pushing new hardware.

And this is why I made the statement about business above, as you are not looking at that in any of your thoughts. In a "Utopian" hardware world SOME of what you say falls in line.

And above is where you are 100% WRONG. Game companies are in the business of selling more games to as many people as they can. NVIDIA is in the business of pushing game companies to use feature sets that leverage NVIDIA GPU power and features. AMD and Intel are not going to use PhysX. That is why PhysX is doomed, IMO. I don't think we will see a "real" GPU physics solution that is acceptable until 2012, if by then. Then it will be an open standard fully accepted and supported by Microsoft.
 
But don't the game companies think all the Nvidia customers are just waiting to buy a new gee-whiz PhysX game? If money and sales are the motivation for developers when deciding which features to include in their game, I think they would want to include PhysX just to move more units...
 
You see, this is the thing: games are what drive hardware, but in this case it is the hardware that is trying to drive the games.

That is not the way it works. We do not need realistic physics in games at the price Nvidia is asking; the market needs to evolve at its own rate, making all the mistakes along the way. This is like genetic engineering: it will never be as well done as the real thing. (Anyone who nitpicks that analogy is clutching at straws.)
 
PhysX already is a real and acceptable GPU physics solution.
Real? Yes. Acceptable? Hardly. Any closed, proprietary solution tied to vendor-specific hardware is unacceptable for PC gaming, if you ask me. That's just not the direction we need to be moving.

The advent of hardware-accelerated physics simulation is a potential boon to the PC gaming market. PhysX is just not the solution we need to achieve that.

 
we do not need realistic physics in games at the price nvidia are asking

It depends on whether you ask an Nvidia or an ATI guy.

Nvidia boys got PhysX for free, so most would say bring on the realistic PhysX! It just might be the way it was meant to be played. :D

If they make us PhysX games, we will buy them.

http://www.bjorn3d.com/read.php?cID=1448
"We mentioned earlier the ability to use one GPU as the primary GPU and another GPU as the PhysX card. We've been privvy to some things that we can't tell you about just yet, but take our word for it if you plan on some of the hottest titles up and coming you might want to consider keeping the old GPU and using it for a dedicated PhysX card. This will work on non-SLI boards as well as SLI boards with no SLI bridge required. So you Crossfire board owners, dust off that old GPU and fire up your PhysX engines".
 
Real? Yes. Acceptable? Hardly. Any closed, proprietary solution tied to vendor-specific hardware is unacceptable for PC gaming, if you ask me. That's just not the direction we need to be moving.

That's some arbitrary limit that you try to impose.
The reality is that you don't have a say in this. There are plenty of game studios who use PhysX, so apparently they think it is acceptable, and they don't care that it's closed, proprietary or vendor-specific.

You won't have a choice... There are games that support GPU-accelerated PhysX, and there will only be more in the future.
You can only choose whether or not you want to buy and play these games... but at some point GPU-accelerated PhysX might be too compelling an option to miss out on. Just like soundcards or 3d acceleration... And they too started with proprietary standards (Adlib/SB-compatible and 3DFX/Glide). And you bought a soundcard, a 3d accelerator and the games.
So I have empirical evidence that it doesn't work like you hope it would.
 
How is this different from the first 3d accelerators?
Because the success of hardware physics is at least partially dependent on heightening gameplay interaction, which hardware-accelerated graphics wasn't (and still isn't, really). If we assume that the mid-range and high-end discrete graphics market is a 70/30 split between NVIDIA and AMD (for the sake of the argument), and if that split maintains for the next three years, when can developers begin to rely on hardware-accelerated physics for essential interactions? When does hardware acceleration become mandatory so developers can have a fixed target for physics interactions? When does hardware physics become less of a visual effect and more of a staple for innovative gameplay?

There are plenty of game studios who use PhysX, so apparently they think it is acceptable, and they don't care that it's closed, proprietary or vendor-specific.
You don't believe that developers want to reach the widest target audience possible? Believe me, given the choice between two solutions, one proprietary and tied to vendor-specific hardware, the other open to a wide range of hardware, which would a developer choose?

They may not have a choice at this point, and they find it acceptable because it's the only solution currently available, but they certainly do care about spending resources on features that only NVIDIA owners can experience.

Just like soundcards or 3d acceleration... And they too started with proprietary standards (Adlib/SB-compatible and 3DFX/Glide). And you bought a soundcard, a 3d accelerator and the games. So I have empirical evidence that it doesn't work like you hope it would.
You have empirical evidence that, at one time, things were a certain way, and that as time progressed, they're now entirely different. How exactly does that make any kind of case for PhysX?

I don't subscribe to the "suck it up and buy into the bullshit" mentality anymore. As a consumer, I have every choice, and I've chosen not to support a mere annoyance of a blip on what will be the whole of the physics processing radar that is PhysX. The future -- and it is absolutely a non-vendor-locked future -- is so much brighter.
 
Because the success of hardware physics is at least partially dependent on heightening gameplay interaction, which hardware-accelerated graphics wasn't (and still isn't, really).

I don't see your point?
You obviously didn't see mine: hardware-accelerated graphics prove that people are willing to invest in new hardware for no other reason than higher framerates and more eyecandy.
So why should physics be different?
What makes you say it's dependent on heightening gameplay interaction? Perhaps that's what you (and most of us) are *hoping* that accelerated physics will bring to the table. But I really don't see why that would be a requirement for hardware physics to gain marketshare.

If we assume that the mid-range and high-end discrete graphics market is a 70/30 split between NVIDIA and AMD (for the sake of the argument), and if that split maintains for the next three years, when can developers begin to rely on hardware-accelerated physics for essential interactions? When does hardware acceleration become mandatory so developers can have a fixed target for physics interactions? When does hardware physics become less of a visual effect and more of a staple for innovative gameplay?

You have to be able to walk before you can run. We're entering the walking stage now with PhysX.

You don't believe that developers want to reach the widest target audience possible? Believe me, given the choice between two solutions: one proprietary solution tied to vendor-specific hardware, and the other, open to a wide range of hardware, which would a developer choose?

That is not the question here.

They may not have a choice at this point, and they find it acceptable because it's the only solution currently available, but they certainly do care about spending resources on features that only NVIDIA owners can experience.

So you agree, developers find it acceptable.
And they apparently don't care that only nVidia-users can make use of those features. Not to the point that they won't be using those features at least.

You have empirical evidence that, at one times, things were a certain way, and as time progressed, they're now entirely different? How exactly does that make any kind of case for PhysX?

PhysX is the 'Glide' of physics acceleration. It is the first of its kind.
Glide was a huge success for the first few years. PhysX could become a success as well, although it is obviously inevitable that at some point a common standard will be required. Unless of course ATi goes bankrupt during the PhysX phase and Larrabee turns out to be a failure.

As a consumer, I have every choice

The choice to either buy the games and the hardware or miss out. Not much of a choice really, more of a delusion.

The future -- and it is absolutely a non-vendor-locked future -- is so much brighter.

What future? Can you tell me that? Who is writing a non-vendor-locked hardware-accelerated physics library? And when will it be available?
PhysX is here, now... it could take years until an alternative surfaces... if it ever does. By that time it might no longer be relevant for all we know.
 
Problem is, AMD is gaining market share in the discrete GPU market. Game developers won't create a game that can't be marketed to a huge part of the market. Scali2, I know you are a great programmer, but I don't think you would do as well if you started a business.
 
It depends on whether you ask an Nvidia or an ATI guy.

Nvidia boys got PhysX for free, so most would say bring on the realistic PhysX! It just might be the way it was meant to be played. :D

If they make us PhysX games, we will buy them.

http://www.bjorn3d.com/read.php?cID=1448
"We mentioned earlier the ability to use one GPU as the primary GPU and another GPU as the PhysX card. We've been privvy to some things that we can't tell you about just yet, but take our word for it if you plan on some of the hottest titles up and coming you might want to consider keeping the old GPU and using it for a dedicated PhysX card. This will work on non-SLI boards as well as SLI boards with no SLI bridge required. So you Crossfire board owners, dust off that old GPU and fire up your PhysX engines".

Vista/DX10 does not allow you to load more than one video driver, so you have to still have an NVIDIA main display card with your "PhysX" card.
 
Problem is, AMD is gaining market share in the discrete GPU market. Game developers won't create a game that can't be marketed to a huge part of the market. Scali2, I know you are a great programmer, but I don't think you would do as well if you started a business.

I fail to see your logic, really.
If you choose to use PhysX, you still support CPUs, so people with AMD cards can still run your games. So it's not a 'problem' that AMD is gaining marketshare.

You just offer something extra for the nVidia-crowd, with not much extra effort.
This could actually give you more sales... and it could also give nVidia more sales.
You can try to insult my sense of business, but many large and successful game studios are doing it. So you look a bit silly really.
 
I fail to see your logic, really.

I realized earlier that you were not going to see anyone's "logic" when it came to the business side of the deal; that is why I stopped addressing your arguments. This is not about logic, it is about business, market share, penetration, footprint, and being able to generate revenue with all that.

His point is dead on, for what it's worth, in how I see things.
 
They can't use PhysX for anything ground breaking because the majority of the market doesn't have a PhysX accelerator.
 
You obviously didn't see [my point]: Hardware-accelerated graphics prove that people are willing to invest in new hardware for no other reason that higher framerates and more eyecandy. So why should physics be different? I really don't see why that would be a requirement for hardware physics to gain marketshare.
True. At this point, physics isn't any different than graphics acceleration in that the benefit of allocating hardware to it provides nothing more than a visual boost, but that's certainly not the long-term goal of hardware physics (to be that and only that), and that's not an obstacle that can be easily overcome given a limited market and vendor-specific hardware.

As 3D acceleration evolved, it eventually became mandatory (in Quake 3, for instance), as developers no longer had any interest in providing software rendering with severely diminished visual fidelity. They hoped for a clearer target: a certain level of visual detail that could be scaled down to some degree to accommodate lower-class 3D hardware; at that time, a certain level of geometric complexity, mainly. And by then it had become an acceptable concept to mandate 3D acceleration, as non-vendor-specific standards existed that supported a variety of installed hardware. The success of 3D cards was, to a certain extent, dependent on the games that specifically required them, like Quake 3. How would anyone not expect a similar future scenario for hardware physics?

I'm not suggesting PhysX can't be successful (in one sense), but what I am suggesting is that it's an unacceptable solution for consumers, just as Glide was an unacceptable solution for consumers. I don't gauge the acceptability of new technologies based wholly on whether or not they "work" or if they serve a specific purpose despite them being tragically bad for consumers, and for the industry as a whole, in the long run. That's the reason why I couldn't possibly promote PhysX.

In one sense, PhysX is good because it's paving a certain road for the adoption of hardware physics, but that isn't necessarily the road developers and consumers want to travel.
 
They can't use PhysX for anything ground breaking because the majority of the market doesn't have a PhysX accelerator.

Firstly, I think the majority of the market actually DOES have a PhysX accelerator. Over 50% of the discrete GPUs sold are nVidia.

Secondly... Who said anything about groundbreaking? That doesn't have anything to do with it.
As I already said, we're in the 'walking' phase now. As long as games use PhysX and give nVidia owners a few extra fps here and a bit of extra eyecandy there, PhysX/nVidia will be getting exposure and gaining marketshare.
At some point they might reach critical mass, and then we can 'run', and bring on the groundbreaking accelerator-requiring physics.
But even if that never happens... who cares? How many top-selling games are really groundbreaking? Very few. Apparently that's not where the money is.
 