Hawken PhysX Trailer

HAVOK. It is done entirely in software with great results and no vendor lock-in.

PhysX may be better overall, but so far every single tech demo or actual game implementation is filled with pointless effects that suck GPU performance for shiny lights and other cheap tricks such as waving flags, volumetric smoke, etc.

CHEAP tricks? Tricks maybe, but cheap? No.

Go play something like Borderlands 2 or Arkham City with PhysX on. Sure, the effects aren't needed, and sure, they don't affect gameplay, but it ADDS to the realism and overall beauty and atmosphere of the game.
 
Well, yeah. I guess there's nothing cheap about sacrificing 30% of your GPU performance for some shiny "particle simulation" that could just as well be implemented without PhysX :D Even in Batman.
 
Well, yeah. I guess there's nothing cheap about sacrificing 30% of your GPU performance for some shiny "particle simulation" that could just as well be implemented without PhysX :D Even in Batman.

True, but when you have a badass GPU like mine that handles it perfectly fine, then it starts to not matter...
 
CHEAP tricks? Tricks maybe, but cheap? No.

Go play something like Borderlands 2 or Arkham City with PhysX on. Sure, the effects aren't needed, and sure, they don't affect gameplay, but it ADDS to the realism and overall beauty and atmosphere of the game.

I can't speak for Batman, but PhysX was really irritating in Borderlands 2. I watched the videos and thought, oh, cool! But then when I actually played the game with PhysX on, I didn't like it at all.

In Hawken it looks overdone. I like some extra debris from weapons or when a mech blows up, but not as much as they have implemented.
 
A bunch of clutter that will be turned off by anyone who wants to be competitive at the game.

Agreed

It's not an ATI/NVIDIA argument. In BF2 I'd turn all that distracting stuff off. You score points for kills, like tagging that guy in the dark, back-shadowed corner no one else saw.
 
open standards > closed standards
Because avoiding DirectX and sticking by OpenGL while it languished for YEARS was a good plan. With physics, you have an open standard that is not getting used, and a closed but easily accessible standard (PhysX) that is. We have also reached a point where using it for something that doesn't directly affect gameplay actually looks pretty cool.

AMD is failing to get things out the door that use their solution, and their theoretical advocacy rings pretty hollow when you don't see them trying to get it into games. I think the only thing that could affect the balance right now is if the rumor mill is correct and AMD is in both the next Xbox and PlayStation, with enough overhead left over, after making stuff look closer to current-gen PC quality, for physics effects to be viable. That could lead to massive market share on the dev side, and NVIDIA choosing between fighting a losing battle and pissing off customers, or supporting physics well on both OpenCL and CUDA.

I've pretty much toggled back and forth between team green and team red, but based on the last several years of GPU purchases, I think I'll buy at the tock end of the cycle and stick with higher-end single-card NVIDIA solutions.
 
All the people advocating OpenCL still don't seem to realize that OpenCL isn't a replacement for PhysX; it would be a replacement for CUDA.

MS looks as if it's going AMD with Blu-ray for the 720, and Sony looks to be going Blu-ray and AMD on the PS4...

PhysX is cool, don't get me wrong, but in my opinion NVIDIA should release the ability to run one of its cards where an AMD card is primary and the NVIDIA card is just for PhysX. This would expand their base, introduce many people to their hardware, and expand their profits. It seems nuts to disable that functionality; you have to go to hacked drivers to get it, and that is crappy. If NVIDIA made a driver that supported the use of PhysX only, I might buy one of their cards, and I know others who might as well. Like I said, it would be great if they offered a standalone card at either 1x, 4x, 8x, or 16x.

Charge 199 or 299 and I bet they would sell great. I would get one for my box, as long as I can choose my graphics separate from my PhysX.

http://www.youtube.com/watch?v=8jGZv1YYe2c


It appears that NVIDIA cards really choke on this one...
 
Damn shame NVIDIA killed the standalone PhysX cards; an even bigger shame they can't use them (PhysX or OpenCL) for things more important than drunken multi-colored fireflies.
 
MS looks as if it's going AMD with Blu-ray for the 720, and Sony looks to be going Blu-ray and AMD on the PS4...

People keep bringing this up - what does it have to do with anything? You do realize the GPU silicon that goes in the consoles is a custom, made-to-order part built to Sony's and MS's specs, and has little to do with their consumer retail offerings.
 
So all the closet AMD owners here bashing it should watch the gameplay videos for themselves before passing judgment

If you're going to expect that much from 'closet AMD owners'... then maybe you should read the thread for yourself before passing judgment.

Plenty of NVIDIA users saying they'd rather have the effects disabled. Myself included.
 
So all the closet AMD owners here bashing it should watch the gameplay videos for themselves before passing judgment

Actually, I've played the beta-- closed and open. I have a Radeon card. I like the game because I'm a fan of games like MechWarrior (non-online) and mechs and robots in general.

Not really bashing the video, but more like bashing the general attitude Nvidia has towards non-Nvidia owners.

All computer users should have equal access to the same gameplay, including the enhancements to it, depending on whether our hardware can handle it or not. We spend our hard-earned money on our gaming computers, so we damn well expect to play games that take advantage of it and/or push it, regardless of the hardware vendors in our computers.

If you're going to expect that much from 'closet AMD owners'... then maybe you should read the thread for yourself before passing judgment.

Plenty of NVIDIA users saying they'd rather have the effects disabled. Myself included.

Yeah, that's the thing, and it's been said before in this thread: in the majority of games with physics-based effects, the effects seem tacked on. Only a few have properly used them to enhance the realism of the game environment. Other games, though rare, take advantage of a physics engine such as PhysX and build the game around it. But when the physics effects distract the gamer from the game itself, detract from the experience, or cause lag, the extra effects aren't necessary and should be disabled.
 
Game makers should just use all those idle cores in so many PCs these days for PhysX.

But this is NVIDIA's way of driving up sales. I think AMD owns Havok, don't they? It's been a while since I looked into it. I really don't care; the majority of PhysX is extra fluff.
 
Not really bashing the video, but more like bashing the general attitude Nvidia has towards non-Nvidia owners.

All computer users should have equal access to the same gameplay, including the enhancements to it, depending on whether our hardware can handle it or not. We spend our hard-earned money on our gaming computers, so we damn well expect to play games that take advantage of it and/or push it, regardless of the hardware vendors in our computers.

And all rainbows should have lucky pots of gold at the end of them, but we know that's also not going to happen. GPU makers are businesses that by nature exist to make money. NVIDIA had some foresight and paid a lot of money to acquire AGEIA. AMD bet on a different horse and lost.

Essentially you're complaining that NVIDIA should now make it public domain and let AMD - their main competitor - copy their homework. I understand, from the consumer point of view, that everything should be equal, but that's not a business reality.
 
Well, yeah. I guess there's nothing cheap about sacrificing 30% of your GPU performance for some shiny "particle simulation" that could just as well be implemented without PhysX :D Even in Batman.

Not sure what you are running, but on my 560 Ti it runs like butter even in heavy firefights. I set it to medium because I like the effects. I think it adds to the game.
 
And all rainbows should have lucky pots of gold at the end of them, but we know that's also not going to happen. GPU makers are businesses that by nature exist to make money. NVIDIA had some foresight and paid a lot of money to acquire AGEIA. AMD bet on a different horse and lost.

Essentially you're complaining that NVIDIA should now make it public domain and let AMD - their main competitor - copy their homework. I understand, from the consumer point of view, that everything should be equal, but that's not a business reality.

Personally, two of my three most recent video card purchases were NVIDIA, and my current gaming card is NVIDIA... I still think PhysX is a steaming turd. I'm not going to get into the whole NVIDIA vs. AMD thing because frankly I don't give a shit.

At the end of the day, what matters to me is that because only NVIDIA can use PhysX, it's limited in its scope to a few flashy effects here and there that can't actually affect gameplay, because that would leave out a large portion of gamers. What matters to me is that PhysX games with PhysX off often look worse in the effects department than non-PhysX games. What matters to me is PhysX being used as a selling point for video cards instead of actually being used to improve a game. What matters to me is that the physics effects in this tech demo look "meh" at best (IMO).
 
Game makers should just use all those idle cores in so many PCs these days for PhysX.

This.

The CPU is usually running at just 50% while the GPU is running at 100%, WITHOUT physics.

Marketing... Sigh...
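For what it's worth, the idle-cores idea above can be sketched in a few lines. This is a toy illustration only, with made-up names and numbers; a real engine would do this in native code with a job scheduler and SIMD, and pure-Python threads serialize on the GIL anyway, but the structure (split the particle set into per-core chunks and step them in parallel) is the point:

```python
# Toy sketch: spreading a physics step across CPU worker threads.
# Illustrative only -- not how PhysX, Havok, or any shipping title does it.
from concurrent.futures import ThreadPoolExecutor

GRAVITY = -9.81
DT = 1.0 / 60.0  # one 60 Hz frame

def step_chunk(chunk):
    """Semi-implicit Euler integration for one slice of the particle list."""
    for p in chunk:
        p["vy"] += GRAVITY * DT
        p["x"] += p["vx"] * DT
        p["y"] += p["vy"] * DT
    return chunk

def step_particles(particles, workers=4):
    """Split the particles into per-core chunks and step them in parallel.
    (In Python the GIL limits the speedup; a native job system would get
    real parallelism from the exact same split.)"""
    size = max(1, len(particles) // workers)
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return [p for chunk in pool.map(step_chunk, chunks) for p in chunk]

particles = [{"x": 0.0, "y": 10.0, "vx": 1.0, "vy": 0.0} for _ in range(1000)]
particles = step_particles(particles)
```

The catch, as others noted earlier in the thread, is that effects-grade particle counts are exactly the workload GPUs are built for, which is why vendors push GPU physics even when cores sit idle.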
 