Any new PPUs?

I'm curious: What exactly do you feel you have won if they won't?

A certain useless e-rating will feel inflated :p

I think the reason why PhysX has more potential than Havok has been explained often enough already... it wouldn't surprise me either if PhysX games turned out to outnumber Havok ones 2:1 by the end of next year. Especially if EA churns out a few hundred games as usual ;)
 
I think that everyone will have lost if hardware-accelerated physics doesn't take off and we remain stuck on slow CPUs.
 
I think that everyone will have lost if hardware-accelerated physics doesn't take off and we remain stuck on slow CPUs.

Well, the thing people don't seem to 'get' is that everything around us IRL _is_ physics. Photons, matter, energy... it's all physics happening while we're observing it. To then say that we don't really need physics in games, or make equally ridiculous claims, just makes me want to give up on humanity.

With just CPUs we won't get proper physics, and thus realism, any time soon, as you said. We need the massive vector-processing capabilities of GPUs: most physics calculations are (surprisingly enough) vector calculations, or can be sped up if treated as such, and it'd be silly to let such an opportunity go to waste.
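
To put some substance behind that rant, here's a toy sketch (my own, not code from any real engine): integrating a big batch of particles is the same arithmetic repeated per element, which is exactly the shape of work that wide vector units eat for breakfast.

[code]
// Toy example: structure-of-arrays particle integration. Every iteration is
// independent of the others, so the loop maps directly onto a wide vector
// processor (GPU) or SSE on a CPU.
#include <cstddef>
#include <vector>

struct Particles {
    std::vector<float> px, py, pz;  // positions
    std::vector<float> vx, vy, vz;  // velocities
};

void integrate(Particles& p, float dt, float gravityY)
{
    const std::size_t n = p.px.size();
    for (std::size_t i = 0; i < n; ++i) {  // data-parallel: no cross-element dependencies
        p.vy[i] += gravityY * dt;
        p.px[i] += p.vx[i] * dt;
        p.py[i] += p.vy[i] * dt;
        p.pz[i] += p.vz[i] * dt;
    }
}

int main()
{
    Particles p;
    p.px = {0}; p.py = {10}; p.pz = {0};   // one particle, 10 units up
    p.vx = {1}; p.vy = {0};  p.vz = {0};
    integrate(p, 1.0f / 60.0f, -9.81f);    // one 60Hz step of gravity
}
[/code]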

</rant>
 
Well, the thing people don't seem to 'get' is that everything around us IRL _is_ money. PhysX can be all that great but it won't take off if publishers don't think that it is profitable.
 
PhysX can be all that great but it won't take off if publishers don't think that it is profitable.

But apparently they do. EA supports it, and EA is by far the biggest publisher on the market. And EA is not the only one.
 
We will see if they really support it; you did say we'd see more games in a few (12) months, right? ;)
 
We will see if they really support it; you did say we'd see more games in a few (12) months, right? ;)

Guaranteed, since more companies (EA, 2K and THQ) have signed on in the last week than in the whole of PhysX's existence.
 
Guaranteed, since more companies (EA, 2K and THQ) have signed on in the last week than in the whole of PhysX's existence.

Less than one year to develop a game? Don't worry, I'll be here next year and we'll see then how it turns out.
 
Less than one year to develop a game? Don't worry, I'll be here next year and we'll see then how it turns out.

For EA? Quite often, yes...
They have quite a few sports games that get a new version every year. And then there are things like Need for Speed and such.
They might, for example, spin off a bunch of games using the Mirror's Edge engine, which supports PhysX (and the game will be released in just a few weeks, so there's one already).
 
Yay, EA signed a deal on PhysX and they release a new PhysX game within weeks!!! :rolleyes: The best part is they released the console version before they even signed the deal with nVidia!!!

The deal is obviously a PR stunt on nVidia's part, because EA had actually released a few PhysX games before they even signed the deal with nVidia. EA doesn't need to sign a company-wide deal with nVidia if they are interested in implementing PhysX in their games, and signing the deal does not mean that they will now use the tech in every new game they publish.
 
What is your point exactly?

Really, it's starting to get very annoying that you try to rag on PhysX in every way possible.
I know you must feel like an idiot for buying an ATi card and missing out on all the new PhysX glory... but just get over yourself and stop expressing your frustration in every thread here.
 
My point is, signing a deal does not mean that there will be a lot more PhysX games by the developer. Why don't we just wait a year and see whether there are more games or not before saying that PhysX will be the next big thing after 3D hardware acceleration?

Why did nVidia make a huge fuss about now signing with more developers? Maybe because they can't make a big fuss over the "huge" PhysX game library. Well, maybe not yet ;)
 
My point is, signing a deal does not mean that there will be a lot more PhysX games by the developer.

Mirror's Edge was already under development before they signed the deal.
So the deal in itself is irrelevant; without the deal they would have published PhysX titles as well.
If anything, the deal makes sure that PhysX gets more press, and as such the brand becomes stronger, which in turn might lead to more PhysX games.

Why did nVidia make a huge fuss about now signing with more developers? Maybe because they can't make a big fuss over the "huge" PhysX game library. Well, maybe not yet ;)

They're just promoting the PhysX technology. It's just marketing; you shouldn't try to look for more behind it. Especially you, who claims to focus on business and other aspects rather than just technology.

You cannot deny that there's a whole lot more talk about PhysX now... In fact, only a few months ago most people had no clue about physics at all, probably didn't even know what Havok was, or which games were using it.
 
What is your point exactly?

Really, it's starting to get very annoying that you try to rag on PhysX in every way possible.
I know you must feel like an idiot for buying an ATi card and missing out on all the new PhysX glory... but just get over yourself and stop expressing your frustration in every thread here.

I was about to ask the same thing... Really, why such a hard line against a tech that is obviously getting a shot in the arm?

Mirror's Edge comes out in a few weeks, Cryostasis in February. But I'm sure you've already convinced yourself that those two games suck, since your ATI card won't run all the goodies or support the tech. I'm thinking PhysX may even pop up in sports titles (EA, 2K), as I'm sure it could be put to awesome use there. I'm not saying PhysX is reinventing the wheel, but what has ATI done lately?
 
What has ATI done lately?

Let's see... they've been talking about how they'd accelerate physics on the GPU since the Radeon X1800/1900?
Then when nVidia showed the first working games with GPU physics, they've been talking about how much PhysX sucks, and how much better their vapourware is?
Oh, and then they tried to copy the Badaboom GPU-accelerated transcoder, and failed horribly :)
 
They created a competitive GPU that forced nVidia to drop the price of the GTX 200 series? They created the fastest single card and are selling it at a very competitive price?
 
Everyone should try to relax over this. It's alright to be a skeptic or a staunch supporter when it comes to unprecedented technology such as this. Time will tell whether the PhysX backers, like those of us evangelizing it on this forum, will be vindicated. If not, then I guess it's back to the stone age for me, lol.
 
EA, lol, in any other debate they would be getting slammed for all their heavy-handed and evil tactics.

Bottom line, if nVidia thinks people are going to run out and buy PhysX-enabled GPUs and then be happy with 35fps, they are high.
 
EA, lol, in any other debate they would be getting slammed for all their heavy-handed and evil tactics.

Bottom line, if nVidia thinks people are going to run out and buy PhysX-enabled GPUs and then be happy with 35fps, they are high.

I fail to see any documentation for your claims?
 
Which claims, the well-known DRM scheme they use on all new games, or the numerous previews that show Cryostasis running at 35fps on a GTX 260? If you honestly aren't aware of either of these then you aren't qualified to use the Internet any longer; please close your browser and go outside.
 
Okay, who let in the trolls?

*opens a window for some fresh air*
 
Which claims, the well-known DRM scheme they use on all new games, or the numerous previews that show Cryostasis running at 35fps on a GTX 260? If you honestly aren't aware of either of these then you aren't qualified to use the Internet any longer; please close your browser and go outside.


Your post contains no valid information; post reported.
Look, this is valid information:
http://www.hardforum.com/showthread.php?t=1376657

Okay, who let in the trolls?

*opens a window for some fresh air*

Not me... must have been someone with hurt feelings.
 
This has nothing to do with feelings, but play it how you feel you must.
 
Which claims, the well-known DRM scheme they use on all new games, or the numerous previews that show Cryostasis running at 35fps on a GTX 260? If you honestly aren't aware of either of these then you aren't qualified to use the Internet any longer; please close your browser and go outside.

While the DRM scheme is real, it has no place in this discussion other than to paint EA in a bad light. Stay on topic please, which btw is a dead topic.
 
Since several people in this debate have stated that support from EA is relevant to the success of PhysX, I think it has a place. If the game has PhysX support but no one buys it due to DRM it kind of cancels out the positive, doesn't it?
 
Since several people in this debate have stated that support from EA is relevant to the success of PhysX, I think it has a place. If the game has PhysX support but no one buys it due to DRM it kind of cancels out the positive, doesn't it?

For the informed gamer it does. The informed gamer also knows how to get around these things if they want to. Some people will also still buy the games and want to get the most out of them... very few people, I would imagine. And for the uninformed gamer it doesn't, because they probably don't even know what PhysX or SecuROM is; they are going to be quite shocked when they learn about SecuROM, though. :eek:

Probably the only effect these DRM schemes have is deterring honest people from paying money for the games and pushing them to download them instead. The games are still being played, though, and that is the only thing nVidia cares about.
 
Lol yea, who can argue with that kind of logic!
I'll give you one better: In the end it's all 0s and 1s!

Thing is, it's VERY hard to write a good physics API. Why do you think we only have two big players in the market (PhysX and Havok)? Anyone could easily have written a physics library for CPUs at any time... Very few people did... and even fewer actually succeeded.
Why is it so hard? Please provide anecdotal evidence besides your personal opinion, or otherwise indicate that it's your personal opinion. The behavior of 3D objects in space is something that already has to be done in any 3D game.

I don't see your logic though... It will be done at engine level? What? Physics? Well yes, more or less... But what does that have to do with OpenCL or DirectX? Neither is an engine.

Okay, I'll spell it out for you. Physics calculations will be done at the game-engine level. You know, the part of the game typically responsible for rendering scenes, object positioning and behavior, and collision detection (which, tada!!!, surprisingly requires a lot of floating-point mathematics, hopefully clearing up your earlier confusion about my statement about floating-point math).

Still comes back to the same problem: very few developers are capable of writing a physics library. Crytek could pull it off...

Is it that very few developers are capable or very few developers are willing? Since there are far more failures in the game space than there are successes, most companies wouldn't be bending over backwards to add any more expense to making games than is already there. With even the most successful companies laying people off at a time when games are getting more complex, not less, there has to be proper motivation for them to incorporate something brand new into their product.

AMD now pays Intel for Havok... and it doesn't even give them GPU acceleration, unlike PhysX. So there goes that argument out of the window.

Oh, so in addition to Havok, you're suggesting that they're now going to pay even more money to Nvidia to support a proprietary API that will likely be supplanted by DirectX 11, which will likely support physics, right? The only thing that has "gone out the window" is your objectivity.

As for Intel... they have no interest in PhysX yet, because they don't have a GPU yet... and when they do, they have Havok, so they're going to try and compete. At some point either one will have to give up and support the competing technology... or a third alternative will have to come out and support all architectures.

Congratulations Einstein, you just reiterated my original point.


As it stands, however, PhysX is the only solution supporting game consoles, multicore PCs and GPUs. That, together with the fact that the SDK is free to use, makes it a very attractive option for developers.

So where is the dedicated hardware on a console that supports PhysX? Yeah, that's what I thought. And you'll probably answer this by telling me how consoles can achieve PhysX acceleration by offloading the floating point to one of the other processing cores...

nVidia has been working very hard to get PhysX in the hands of developers, and I think in a few months the PhysX games will outnumber the Havok games.
Then it's up to Intel to try and regain the market with a completely new GPU architecture and the first try at a GPU-accelerated version of Havok.

PhysX is the "Glide" of today. It will get some market acceptance, but once Physics processing comes in Directx 11, PhysX will have no reason to exist.

This does not mean hardware-accelerated physics is dead. Far from it; it is in fact getting stronger. I do see some early wins for PhysX... it is responsible for creating the hardware physics market, after all (much like Glide created the 3D gaming API market).
 
DX will never include a physics API. MSFT doesn't do middleware. GPU vendors will provide the APIs in their drivers, and writers of game engines, simulations and other software will use those APIs to quickly add physics functionality to their software.

Also, DX is the most proprietary API you can imagine that is used with games at the moment. I fail to see how your objection to PhysX being proprietary is logical in that regard. If you care about open standards, you should be all about OpenGL, OpenAL and, soon, OpenCL.

So where is the dedicated hardware on a console that supports PhysX? Yeah, that's what I thought. And you'll probably answer this by telling me how consoles can achieve PhysX acceleration by offloading the floating point to one of the other processing cores...
There is no fundamental difference between a PPU and a GPU: both are massively parallel vector processors. The PPUs released so far were more like the GPUs of a few generations ago in processing power.

... most companies wouldn't be bending over backwards to add any more expense to making games than is already there. With even the most successful companies laying people off at a time when games are getting more complex, not less, there has to be proper motivation for them to incorporate something brand new into their product.
Look at Mirror's Edge for example. Enabling physics in the game adds completely new game elements. Yes, they are not crucial to the game, but they do add a new level of realism which is not achievable without PhysX. But frankly, you could replace all textures, mip-maps, shadow maps and everything else with flat shading and you'd still have the same gameplay; you'd just have removed non-crucial elements.

Gamers want to see pretty games, want to see things they haven't seen before. Generic Shooter #23 isn't going to get high marks no matter how inventive the ways to kill people in it are; in the end, gameplay is closely followed by eye candy. Just look at how a hyped-up game like Doom 3 (and Quake 4...) got killed in reviews and by gamers because it failed on both counts.

What's the first thing you hear about Crysis? Not its gameplay, heck, I wouldn't even know what it's about other than that it's another FPS. All I have heard is that it's a very pretty looking game and the screenshots seem to agree with that.

You can be most assured that my company will be adding lots of PhysX and other eyecandy to our games :)
 
Why is it so hard? Please provide anecdotal evidence besides your personal opinion, or otherwise indicate that it's your personal opinion. The behavior of 3D objects in space is something that already has to be done in any 3D game.

The key words here are stability and performance.
Physics algorithms always use iterative approximations in order to keep performance at an acceptable level. A problem with such solutions is that they don't always deliver a stable result. I guess we've all seen it happen in games: objects vibrate and bounce like crazy, even intersecting each other, when they should be lying still.
It happens even with the best of physics libraries. Imagine what the less physics-savvy developers would come up with :)
That's the main reason why physics libraries are so popular, really: most developers are neither capable of reinventing the wheel themselves nor willing to spend the resources on it.
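
For anyone curious what that instability actually looks like, here's a minimal demo I knocked up (not taken from any real library): explicit Euler on a stiff spring. The object gains energy every step instead of settling.

[code]
// Minimal demo of the "vibrating objects" problem: explicit (forward) Euler
// on an undamped stiff spring. Forward Euler pumps energy into an oscillator
// at any step size, and with a stiff spring it explodes within a few steps.
#include <cstdio>

int main()
{
    float x = 1.0f, v = 0.0f;           // displacement, velocity
    const float k = 1000.0f, m = 1.0f;  // stiff spring, unit mass
    const float dt = 0.1f;              // a typical-looking timestep

    for (int step = 0; step < 10; ++step) {
        const float a = -(k / m) * x;   // F = -kx, so a = -kx/m
        x += v * dt;                    // both updates use the OLD state;
        v += a * dt;                    // that is what makes it unstable
        std::printf("step %2d: x = %g\n", step + 1, x);
    }
    // The amplitude explodes instead of x settling toward 0: the same
    // vibrate-and-bounce behaviour described above, in one dimension.
}
[/code]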

Okay, I'll spell it out for you. Physics calculations will be done at the game-engine level. You know, the part of the game typically responsible for rendering scenes, object positioning and behavior, and collision detection (which, tada!!!, surprisingly requires a lot of floating-point mathematics, hopefully clearing up your earlier confusion about my statement about floating-point math).

It's a whole different area of expertise. Most 3D calculations in a game are very simple linear algebra... just some basic matrix/vector operations. High school stuff.
Physics requires numerical integration, differential equations and the like. Advanced mathematics at university level. You need math specialists for that... worse: you need math specialists who are ALSO expert programmers/optimizers. These people are rare.
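
To make that concrete, a follow-up to my sketch above: merely swapping the update order (velocity first, then position using the new velocity) turns it into semi-implicit, a.k.a. symplectic, Euler, which stays bounded for the same spring as long as dt < 2*sqrt(m/k). Knowing that trick exists, and why it works, is precisely the numerical-methods background I'm talking about.

[code]
// Same stiff spring, but semi-implicit (symplectic) Euler: update velocity
// first, then position with the NEW velocity. Stable for dt < 2*sqrt(m/k)
// (about 0.063 here), where plain forward Euler blew up immediately.
#include <cstdio>

int main()
{
    float x = 1.0f, v = 0.0f;
    const float k = 1000.0f, m = 1.0f;
    const float dt = 0.01f;             // inside the stability bound

    for (int step = 0; step < 1000; ++step) {
        v += -(k / m) * x * dt;         // velocity first...
        x += v * dt;                    // ...then position with updated v
    }
    std::printf("x after 1000 steps: %g\n", x);  // bounded, still oscillating
}
[/code]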

Is it that very few developers are capable or very few developers are willing?

No, I specifically said capable.

Oh, so in addition to Havok, you're suggesting that they're now going to pay even more money to Nvidia to support a proprietary API that will likely be supplanted by DirectX 11, which will likely support physics, right? The only thing that has "gone out the window" is your objectivity.

I never said anything about what it would cost them, let alone that it would be "even more money".
Aside from that, people keep coming back to DX11 supporting physics, which simply isn't true.

Congratulations Einstein, you just reiterated my original point.

Not quite.

So where is the dedicated hardware on a console that supports PhysX? Yeah, that's what I thought. And you'll probably answer this by telling me how consoles can achieve PhysX acceleration by offloading the floating point to one of the other processing cores...

I have no idea what you're on about with dedicated hardware.
The point is just that developers can use the same API on the various platforms, and have the extra option of using acceleration on PC if they so choose.

PhysX is the "Glide" of today. It will get some market acceptance, but once Physics processing comes in Directx 11, PhysX will have no reason to exist.

Except that physics processing won't come to DirectX 11.
 
DX will never include a physics API. MSFT doesn't do middleware. GPU vendors will provide the APIs in their drivers, and writers of game engines, simulations and other software will use those APIs to quickly add physics functionality to their software.

Also, DX is the most proprietary API you can imagine that is used with games at the moment. I fail to see how your objection to PhysX being proprietary is logical in that regard. If you care about open standards, you should be all about OpenGL, OpenAL and, soon, OpenCL.

Time will tell if Microsoft supports it or not; they haven't said publicly whether they will or won't. My objection isn't against PhysX; I own a PhysX board so I'm covered either way. I just see every sign that it's following a similar path to Glide. I think the fact that Nvidia bought the technology is a good thing, because it's getting people to put physics support into games. The natural evolution of that is for there to be a standard that doesn't require hardware vendors to pay royalties to use it.

As for DirectX being proprietary and what I should personally support, the distinction I would make is that no particular hardware vendor controls DirectX (and charges for the implementation/use). You mention OpenGL; does SGI charge a royalty to anyone who wants to implement OpenGL in hardware? As far as I know it doesn't, though honestly I've never worked at a hardware company trying to implement OpenGL support.

There is no fundamental difference between a PPU and a GPU: both are massively parallel vector processors. The PPUs released so far were more like the GPUs of a few generations ago in processing power.
Right, I understand this and agree with it. But in the case of a console, you're taking a core away from something else.

Look at Mirror's Edge for example. Enabling physics in the game adds completely new game elements. <snip>

Well, I agree with most of that commentary. But the point I was making was that game companies would rather have it handled transparently by the game engine or some other API as much as possible. There was an earlier assertion that game companies couldn't implement their own physics API; my point is that they probably could, but would rather someone else do it.

As for the balance of eye candy versus gameplay, while I mostly agree with what you said, neither BioShock nor Fallout 3 is exceptional in the eye-candy department, yet both are highly regarded (well, DRM and bug issues in both notwithstanding) and sold very well (yes, I'm aware that BioShock uses the Unreal engine and thus technically includes PhysX, but I honestly think anyone would be hard-pressed to find that it makes a difference, visually or otherwise).

Until a game requires hardware physics acceleration to play, and that acceleration affects the gameplay substantially, you're not going to see it take off. And no software company in their right mind is going to require hardware physics until the majority of platforms out there support it. You're unlikely to approach even the 50 percent mark of hardware acceleration within the next four years. This basically means that it's going to remain a novelty for a while.

That being said, Havok and PhysX are good first steps. I still maintain that it won't really take off until it gets blessed by Microsoft.
 
Time will tell if Microsoft supports it or not; they haven't said publicly whether they will or won't.

There's DX11 documentation and a tech preview in the November DirectX SDK:
http://www.microsoft.com/downloads/...6A-6D37-478D-BA17-28B1CCA4865A&displaylang=en

This version of the Direct3D 11 technical preview includes support for the following:


Tessellation
Compute Shaders
Multithreaded Rendering
Dynamic Shader Linkage
Windows Advanced Rasterizer (WARP)
Direct3D 10 and Direct3D 11 on Direct3D 9 Hardware (D3D10 Level 9)
Runtime Binaries
D3DX11
Completely Updated HLSL and Direct3D Compiler
D3D11 Reference Rasterizer
D3D11 SDK Layers
In addition, there are four new samples that highlight tessellation, compute shaders, multithreaded rendering, and dynamic shader linkage.

No mention of physics.

The natural evolution of that is for there to be a standard that doesn't require hardware vendors to pay royalties to use it.

Is this even about royalties in the first place? I never heard either nVidia or AMD say anything about money. I think it's more about politics. AMD doesn't want to support PhysX because nVidia controls it and can therefore put AMD at a disadvantage.

does SGI charge a royalty to anyone who wants to implement OpenGL in hardware? As far as I know it doesn't, though honestly I've never worked at a hardware company trying to implement OpenGL support.

Actually, they do. Or they did, anyway. SGI doesn't control the OpenGL name anymore, as far as I know. But OpenGL is/was a registered trademark, which could only be used by licensed manufacturers. This is the reason why the open-source implementation for Linux and similar OSes is called MesaGL: it's fully OpenGL-compatible, but it isn't/wasn't allowed to use the OpenGL brand without paying royalties. It's much like the Unix name in that respect.
 
My objection isn't against PhysX; I own a PhysX board so I'm covered either way. I just see every sign that it's following a similar path to Glide.
Glide didn't fail. 3dfx failed due to stupid marketing people and retarded management running the company into the ground. If MSFT were to go under, DX would go with it as well. Such is the fate of proprietary tech when the company which owns it is run by monkeys.

As for DirectX being proprietary and what I should personally support, the distinction I would make is that no particular hardware vendor controls DirectX (and charges for the implementation/use). You mention OpenGL; does SGI charge a royalty to anyone who wants to implement OpenGL in hardware? As far as I know it doesn't, though honestly I've never worked at a hardware company trying to implement OpenGL support.
OpenGL is an open standard. It is royalty-free. As pointed out in another post, SGI hasn't controlled OGL in years. The same is true for OpenAL.

I'm not sure whether MSFT charges royalties for a DX hardware implementation, but the fact is that they could do so at any time. OGL is open and free and will always remain so.


Right, I understand this and agree with it. But in the case of a console, you're taking a core away from something else.
Oh sure, but a GPU is meant for eye candy, right? :) So using it for PhysX as well is akin to running an advanced custom shader routine to spruce things up. Not a big issue, I'd say.


Well, I agree with most of that commentary. But the point I was making was that game companies would rather have it handled transparently by the game engine or some other API as much as possible. There was an earlier assertion that game companies couldn't implement their own physics API; my point is that they probably could, but would rather someone else do it.
PhysX handles all the heavy lifting. All a game engine developer has to do is wire things up between the game engine and the PhysX API. If a game developer then uses that game engine, the process is transparent.
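
Roughly like this (a hypothetical sketch, all names invented by me; this is not the actual PhysX SDK API): the engine exposes one thin interface, wires it to the physics SDK once, and gameplay code never touches the SDK directly.

[code]
// Hypothetical engine-side abstraction. A real implementation of this
// interface would forward to the PhysX SDK (scene setup, actor creation,
// per-frame simulate/fetch); gameplay code only ever sees the interface.
#include <cstddef>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

class IPhysicsBackend {
public:
    virtual ~IPhysicsBackend() {}
    virtual int  addBody(Vec3 pos, float mass) = 0;
    virtual void step(float dt) = 0;            // advance one frame
    virtual Vec3 position(int body) const = 0;
};

// Stand-in backend so the sketch runs; swap in a PhysX-backed class and
// nothing in the game code has to change.
class StubBackend : public IPhysicsBackend {
    std::vector<Vec3>  pos_;
    std::vector<float> velY_;
public:
    int addBody(Vec3 p, float) { pos_.push_back(p); velY_.push_back(0); return (int)pos_.size() - 1; }
    void step(float dt) {
        for (std::size_t i = 0; i < pos_.size(); ++i) {
            velY_[i] += -9.81f * dt;            // gravity only, for brevity
            pos_[i].y += velY_[i] * dt;
        }
    }
    Vec3 position(int body) const { return pos_[body]; }
};

int main()
{
    StubBackend physics;
    const int crate = physics.addBody({0, 10, 0}, 5.0f);  // drop a 5kg crate
    for (int frame = 0; frame < 60; ++frame)
        physics.step(1.0f / 60.0f);                       // one second at 60Hz
    const Vec3 p = physics.position(crate);
    std::printf("crate after 1s: (%g, %g, %g)\n", p.x, p.y, p.z);
}
[/code]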

As for the balance of eye candy versus gameplay, while I mostly agree with what you said, neither BioShock nor Fallout 3 is exceptional in the eye-candy department, yet both are highly regarded (well, DRM and bug issues in both notwithstanding) and sold very well (yes, I'm aware that BioShock uses the Unreal engine and thus technically includes PhysX, but I honestly think anyone would be hard-pressed to find that it makes a difference, visually or otherwise).
I recall plenty of reviews drooling over the sights in BioShock and Fallout 3, including the Ars Technica ones. This while the gameplay in Fallout 3 has been called 'limited' and people have been whining about the ending(s) in BioShock. Apparently graphics are convincing :)

Until a game requires hardware physics acceleration to play, and that acceleration affects the gameplay substantially, you're not going to see it take off. And no software company in their right mind is going to require hardware physics until the majority of platforms out there support it. You're unlikely to approach even the 50 percent mark of hardware acceleration within the next four years. This basically means that it's going to remain a novelty for a while.
If enough games start using PhysX to add things which run fine on a CPU alone, plus more stuff which works fine with a GPU, you'll see piles of gamers start using PhysX on a GPU, especially since it's so darn easy. If you like AMD cards, you can get a cheap 8800 GTX card and use that for PhysX. Easy as pie. I think AMD will realize at some point that if they don't have PhysX on their GPUs, they might as well shuffle off this mortal coil right away.

That being said, Havok and PhysX are good first steps. I still maintain that it won't really take off until it gets blessed by Microsoft.

So you say. I say you're wrong. Of course, I'm merely a biased game developer :)
 
OK, a question from a non-technical person. Will new games developed with PhysX work if you have an old PPU installed, or will they now require the newer GPUs to run with PhysX active?

LouP
 
OK, a question from a non-technical person. Will new games developed with PhysX work if you have an old PPU installed, or will they now require the newer GPUs to run with PhysX active?

LouP

New games will work with an existing PPU. Performance is likely to be significantly higher with a new GPU, or a GPU (8800 or better) dedicated to PhysX, however.
 