Dark Void PhysX Gameplay Performance and IQ @ [H]

Oh goodie I love these!


You guys are both using personal attacks. Nobody's a saint here.
What personal attacks did I use?


What? His point was that we don't NEED PhysX to see a cup of water being spilled over a breaking building. Let's look at the last 3 games I've played.

No, his point was that physics can be done on the CPU, so why aren't they producing better code to run PhysX on the CPU? He was contending that they were purposely sabotaging PhysX performance on the CPU.

You can argue all you want that effects like the ones seen in PhysX aren't reproducible, but playing through those 3 games I've seen plenty of technical merit that did not need any help from PhysX to do its job properly. As seen in the game reviewed in the first post, PhysX takes a large toll on the GPU even when looking at the flagship 295.

It's slow and inefficient.

My argument has never been about whether you 'need' PhysX. I think it makes the game better. My argument is: why are they crying about not being able to use PhysX when they have AMD hardware, when they can hack it and still use it? And even more so, why cry about not being able to use PhysX on the CPU? That is even more ridiculous. And the comment about slow and inefficient is just ridiculous. Why have AA? It is practically pointless, extremely slow, and extremely inefficient... Why have detailed 3D models and effects at all? Many of them are slow and inefficient.

News at 11: Open Source definition changed to one entity.

OpenCL wouldn't be the only project out there, and how can you justify saying that it "might be faulty"? Did nVidia ship you a GTX Co-op edition Crystal Ball?

Also, I think something that is "efficient" and "not faulty" would run on the hardware it's designed for without needing $100 worth of additional purchases (inb4 that guy who said you can get an 8800 GT for 5 cents and a hot dog).

I'd love some choice! Hmm, should I go for the nVidia GTX 671.9 hydro copper co op XTREME edition or the nVidia 710 GTX dual slot 8gb ram SPECIAL:$2,000 edition?

So hard to choose. Oh wait, I should get the 671.9 because I'll probably need another $200 just to play it The Way It's Meant To Be Paid.
What are you even getting at here? This makes no sense. You need to go back and read the whole discussion before trying to barge in like this on my post. I was talking about Open Platform versus Open Source. They are 2 totally different entities. Open Source is even more ridiculous to think of as a viable option. There are tons of Open Source programs that are nowhere near as good as their Closed Source counterparts. Open Platform is fine, but it is still Closed Source code. Microsoft develops the DirectX code and then everyone has to conform to what it says is right for gaming. Where is the advancement in that? The people making the hardware should know far better what works well on the hardware. And the other comments about choices don't make sense either.

Durr. Let's see how PhysX might affect someone trying to interact with a market!

"Hi we're Company A developing physics for about a year and a half do you mind if we come in"

"Sure but can you pay us money to use your physics"

"No lol"

"No you cannot come in Company A father nVidia will be mad"

Plus, nVidia could simply go "nope" and not include support for those physics in their drivers. You can argue that if it became big then they would "have" to, but how would open source projects get support like that if it wasn't a universal standard? How would they enter the market with PhysX still getting devs' attention?

Umm what?! This makes no sense.


This has absolutely nothing to do with CUDA. We're talking about PhysX.

Also, why would anyone work on PhysX aside from nVidia when nVidia acts the way it does? Disabling PhysX when an ATI card is present? These are clearly market tactics, and nVidia's happy-go-lucky "anybody can license this derp" attitude is clearly not the case when it favors the idea of blocking the cooperation of hardware.

This has everything to do with CUDA. Didn't you know that Nvidia has been moving its physics processing over to CUDA? That is the plan for the future; have you been under a rock lately? Again, about locking out a feature when a rival card is present that doesn't license their technology: cry me a river. It is absurd to believe they will hand a competitor an advantage when they offered to license the technology to said competitor in the first place. No amount of crying foul makes any sense in this regard. Locking out AA is something else entirely.


I think that the Nazis incinerating the Jews was a sad ordeal. This is an argument of opinion. Now, according to your theory, this means I'm automatically wrong.

Those Nazis sound better now that you've phrased it that way. Thanks.

Yes, if you make the statement that it was a sad ordeal for EVERYONE, then that argument would be wrong. You are stating an opinion and saying it is EVERYONE's opinion. That is not a valid way to argue. If you want to say that the Nazis incinerating the Jews was wrong because it violates Human Rights, then that is a valid argument. And trying to use the analogy of the Nazis' treatment of the Jews to further your argument is a tired tactic. Trying to make me out to be a Nazi sympathizer doesn't make it true, or make your arguments any better. In fact, it makes your arguments even weaker to try and bring a completely unrelated topic in to help argue about "opinion vs fact".

Maybe they can have a building made of water fall down. Oh man. Also, stop mentioning CUDA in a PhysX argument. It's not used for games.

CUDA is used for games; I have been using it for 3 years now for games. So stop trying to say it isn't used for games when it clearly is. CUDA is what enables many of the cards today to process the PhysX on the card while also computing the other graphics.
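Just to illustrate what I mean, here is a rough sketch of my own (a made-up integrateParticles kernel, not NVIDIA's actual PhysX code): effects physics is basically one lightweight thread per particle, which is exactly the kind of work a GPU chews through while the rest of the card keeps rendering.

Code:
// Illustrative sketch only -- not NVIDIA's PhysX code. It shows the kind of
// data-parallel work (one thread per particle) that maps well to a GPU.
#include <cuda_runtime.h>
#include <cstdio>

// Advance every particle by one time step: v += g*dt, then x += v*dt.
__global__ void integrateParticles(float3* pos, float3* vel, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    vel[i].y += -9.81f * dt;          // gravity
    pos[i].x += vel[i].x * dt;
    pos[i].y += vel[i].y * dt;
    pos[i].z += vel[i].z * dt;
}

int main()
{
    const int n = 1 << 20;            // ~1 million debris/cloth/fluid particles
    float3 *pos, *vel;
    cudaMalloc(&pos, n * sizeof(float3));
    cudaMalloc(&vel, n * sizeof(float3));
    cudaMemset(pos, 0, n * sizeof(float3));
    cudaMemset(vel, 0, n * sizeof(float3));

    // One thread per particle; thousands of threads run in parallel each frame.
    integrateParticles<<<(n + 255) / 256, 256>>>(pos, vel, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();
    printf("stepped %d particles\n", n);

    cudaFree(pos);
    cudaFree(vel);
    return 0;
}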

If it were so great, then there would be applications that use it exclusively. I'm sure nVidia's wet dream is to have every corporation in the world running a PhysX OS using PhysX to play pong.

PhysX is not some magic thing. They're not limiting it.


That's the entire point of PhysX, as well as what they're doing in the market.

If they wanted people to buy an nVidia card just for PhysX, they wouldn't have disabled it when an ATI card is present. It's there to deter people from using ATI.

This is even more laughable. There are games that use PhysX exclusively since it is an exclusive technology at the moment. So where exactly are you getting this argument from? Just like Havok is an exclusive product. The problem isn't whether or not PhysX is great, but if the game the developer is making requires that kind of physics processing and if they really want to take time to program a game that uses an exclusive technology.

If anything, Dark Void should serve as a reminder that nVidia will have devs force as much PhysX into a game as humanly possible until it's barely playable on their highest-spec'd hardware on the ancient Unreal Engine.

Nvidia doesn't force any developer to do anything. That is up to the developer.

Gee, I wonder why Intel and AMD don't want to license something that inefficient. I wonder why every company doesn't bend to nVidia's will and pay a corporation for the rights to develop something that they may or may not use.

Well, for one, Intel acquired Havok to compete with PhysX, and Havok is not as mature as PhysX. For another, AMD already licensed Havok initially, and they don't want to support a competitor. It has nothing to do with the efficiency of the physics. Besides, this whole efficiency talk is debunked in the first place. Efficiency has everything to do with coding, development, and what they want to do. Anytime you add more cutting-edge technology to games it will cause performance hits; that is the whole point of pushing the envelope. If we went with your "efficiency" model, we would still be stuck with 16-bit graphics or worse.

If AMD licensed PhysX, all nVidia would have to do is let PhysX run absolutely god awful on ATI GPUs, like they did with CPU utilization, and it would be another marketing ploy to make AMD's lineup look inferior to the space heater that is Fermi.

That makes absolutely no sense. Nvidia would lose money by making PhysX look bad on a product licensed and designed to use it. You are grasping at straws because of an obvious hatred of Nvidia.

It would be basically open source, but with someone controlling the whole thing while profiting off of it. Sounds great until you think about it objectively for more than eight seconds.

Say what? This is just utterly ridiculous. How many technologies, programs, games, etc out there are designed not to make a profit? If you think about that statement objectively, it completely falls apart.

My opinions > your opinions. Also, you're implying that you can't do anything other than pre-rendered effects on non-PhysX platforms. Which is not only lame, but entirely untrue.

I don't see your point. Dynamic effects still respond to stimuli, even on the most basic level. Also, you're implying that you can't do anything other than pre-rendered effects on non-PhysX platforms. Which is not only lame, but entirely untrue.

Ahh, see, now we see your points. Everyone else is wrong because you said so. Thank you for pointing out your obvious objectiveness. I also never implied you couldn't do effects. I implied you couldn't do dynamic physics without an engine like PhysX. Also, it is nice that you just repeat yourself here without offering any insight or evidence. Come to think of it, you haven't offered any actual counter-argument or evidence; you've just been trying to flame me the entire time.

This guy seems legit. Are you a game developer? When is the Star Wars MMO coming out?

Good to know that there are [H]ardcore people that understand the intimacy between a game and multiple cores. I get that it is better to use 1 core at 50% now.

Also, your argument is flawed. Watch me do it.

If it was simply a matter of PhysX being anything other than a huge pile of shit, a lot more companies, like AMD and Intel, would use it.

Thank you for proving to me you have absolutely no understanding of the real world in terms of business competition and game design. I have mentioned several reasons in previous posts why AMD would not license it. It is even MORE ridiculous when you bring Intel into it, since they own Havok.

Utilizing the CPU is NOT better? You're right. I hope my i5's 3 other cores never see the light of day. I agree that is objectively better. Also, nice personal attack at the beginning. Really makes the rest of the meal go down smooth.

CPUs are designed entirely differently than GPUs. I never said using multi-core programming was bad, just difficult. And no matter how many cores you use, it doesn't make the CPU any better at producing physics. I.e., if it was that much better, where is the long-fabled Intel video card to do this?
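To put the design difference in concrete terms, here is a rough host-side sketch of the same hypothetical particle update from the earlier kernel, this time chunked across 4 CPU worker threads. It's only meant to illustrate a few fat cores versus thousands of lightweight GPU threads, not how PhysX actually schedules its CPU work.

Code:
// Sketch only: the same per-particle update, split across a handful of CPU
// threads. Contrast this with the GPU kernel earlier, which launches one
// lightweight thread per particle.
#include <algorithm>
#include <thread>
#include <vector>

static void cpuIntegrate(float* posY, float* velY, int begin, int end, float dt)
{
    // Each CPU worker walks a large chunk of particles serially.
    for (int i = begin; i < end; ++i) {
        velY[i] += -9.81f * dt;
        posY[i] += velY[i] * dt;
    }
}

void stepOnCpu(float* posY, float* velY, int n, float dt, int cores = 4)
{
    std::vector<std::thread> workers;
    int chunk = (n + cores - 1) / cores;
    for (int c = 0; c < cores; ++c) {
        int begin = c * chunk;
        int end   = std::min(n, begin + chunk);
        workers.emplace_back(cpuIntegrate, posY, velY, begin, end, dt);
    }
    for (auto& w : workers) w.join();   // 4 threads total vs. thousands on the GPU
}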

CUDA CUDA CUDA CUDA CUDA CUDA CUDA CUDA CUDA CUDA. This topic is about PhysX.

Try and do some research and your homework before making childish statements like this.

Logical Fallacies! My favorite.

AMD is losing in the GPGPU market therefore PhysX is a wonderful closed standard.
compare to
My grilled cheese sandwich is great therefore gay marriage should be banned in all states.

My statement wasn't that AMD is losing the GPGPU market because of PhysX, but because of Nvidia's advancement in the physics processing market. Again, read the discussion before making assumptions.

Man, you guys are so easy to pick apart. It's hard justifying something that runs like shit and limits the market, isn't it?

Stop trying to justify your purchases. :)

It is easy to pick apart when you offer no real arguments or points. Your argument is basically "You're wrong," and you don't even know anything about the field. I am glad you keep trying to apply the "I'm rubber, you're glue" process to your arguments; it just shows that you don't really have any evidence, facts, or counterpoints to make.

Oh, and I fully expect a response picking and choosing which arguments you want to contend with, and using opinions and personal attacks as justification all the while.

Don't disappoint me!

I chose them all. Now why don't you try to do a little research and come back and argue when you know something about the computer graphics and physics processing field and how CPUs work versus how GPUs work. Also do a little research into business competition and real-world applications of competition. Basically, when you get a clue, try to use some valid points next time.
 