Dishonest PhysX comparisons

Wouldn't it be nice to see ATI and Nvidia work together and give Intel a hard time?
 
I really don't see how PhysX will work unless you have a dedicated GPU for it...

which is a waste of money, and not many people have 3 PCIe slots.

and those PhysX effects besides the water effects can be done on the CPU easily and already exist in many other games out there...
 
Forgive my ignorance, but I fail to see what is dishonest about this?
Dishonest isn't exactly the right word. Look at this analogy: there are TONS of different hardware manufacturers which make and release DVD players. Sony, Toshiba, Panasonic, Samsung, etc., too many to count.

Now let's say Toshiba pays a movie director a bunch of money so that all the extra features on his movie's DVD only work with Toshiba players. That would be frowned upon, and the backlash would be insane. This is exactly why you have standards to follow: everyone meets the standards (APIs), not one company's requests.

In the high-performance video card industry, since there are only 2 major companies atm battling it out, they sometimes get away with things they shouldn't. If there were many more companies making high-performance cards, you would see a lot more "optimizing to meet the API" (DX and OpenGL) instead of the poorer technique of "optimizing to meet a specific game".

Standardization is important for progress, and nothing is wrong with wanting an awesome physics pipeline - nVidia is just going about it the wrong way by paying developers to implement features in their games which work only on their cards. PhysX would be a lot more accepted if it were a collaboration between many of these companies instead of just nV alone.
 
I really don't see how PhysX will work unless you have a dedicated GPU for it...

which is a waste of money, and not many people have 3 PCIe slots.

and those PhysX effects besides the water effects can be done on the CPU easily and already exist in many other games out there...

Try it; it works fine with my single GTX 260.
What are you basing your comment on?

Your point about water has already been addressed earlier in the thread.
A CPU cannot easily do the same thing; you are misinterpreting or not noticing what has been achieved.
You can make water look fairly realistic with a CPU when there are no external influences on it.
If your character throws something into the water, for example, the CPU version may show a splash; the GPU version has enough power to show a more detailed splash and model the ripples across the water while not disturbing other motion.
This will improve as techniques and hardware improve.

The idea behind GPU physics is to provide a lot more horsepower for better modeling of complex systems/behaviour.
Fluid dynamics cannot be done easily on a CPU while running the rest of the game and keeping the framerate up.
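To give a flavour of why this maps so well to a GPU, here is a minimal sketch of the classic grid-ripple technique (this is not PhysX's actual solver; the grid size, names and constants are invented): every cell's new height depends only on its immediate neighbours, so a quarter of a million cells can update in parallel, one thread each.

```cuda
// Minimal ripple sketch: discrete 2D wave propagation over a height grid.
// Illustrative only; real water solvers are considerably more involved.
#include <cuda_runtime.h>
#include <cstdio>

#define W 512
#define H 512

// One thread per water-surface cell.
__global__ void rippleStep(const float* prev, const float* curr, float* next,
                           float damping)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x <= 0 || y <= 0 || x >= W - 1 || y >= H - 1) return;

    int i = y * W + x;
    // The average of the four neighbours drives the wave outward; subtracting
    // the previous height makes it oscillate, and damping kills it off.
    float sum = curr[i - 1] + curr[i + 1] + curr[i - W] + curr[i + W];
    next[i] = (sum * 0.5f - prev[i]) * damping;
}

int main()
{
    size_t bytes = W * H * sizeof(float);
    float *prev, *curr, *next;
    cudaMalloc(&prev, bytes); cudaMalloc(&curr, bytes); cudaMalloc(&next, bytes);
    cudaMemset(prev, 0, bytes); cudaMemset(curr, 0, bytes);

    // "Throw something into the water": disturb one cell in the middle,
    // then let the simulation ripple outward across all 262,144 cells.
    float splash = 1.0f;
    cudaMemcpy(curr + (H / 2) * W + W / 2, &splash, sizeof(float),
               cudaMemcpyHostToDevice);

    dim3 block(16, 16), grid(W / 16, H / 16);
    for (int step = 0; step < 100; ++step) {
        rippleStep<<<grid, block>>>(prev, curr, next, 0.99f);
        float* t = prev; prev = curr; curr = next; next = t; // rotate buffers
    }
    cudaDeviceSynchronize();
    printf("100 steps simulated\n");
    cudaFree(prev); cudaFree(curr); cudaFree(next);
    return 0;
}
```

A CPU has to walk those cells a handful at a time; a GPU dispatches them all at once, which is the whole point.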

Whether you think PhysX is worth it is your opinion, but that doesn't change the fact that better physical modeling is badly needed, and that requires fast, highly parallel processing, something current CPUs are not good at.
Some effects aren't much to look at, but others are really good.
It's very early days for decent physics in games, so expect huge improvements as faster hardware and software techniques are developed.
I'm glad NVidia has the balls to push this.
 
nvidia needs to let ATI users have PhysX in some way or another, even if that means buying a cheap nvidia card to run it on; otherwise it will never take off, because only half of the people playing the game can use it.

If everyone had PhysX, more developers could use it, and it would probably be a lot better than it is now.

Nvidia is screwing everyone, including themselves, by doing it the way they are now.
 
Wouldn't it be nice to see ATI and Nvidia work together and give Intel a hard time?

like cats and dogs agreeing to a truce to unite against their true common foe - birds, lol.

I really don't see how PhysX will work unless you have a dedicated GPU for it...

which is a waste of money, and not many people have 3 PCIe slots.

and those PhysX effects besides the water effects can be done on the CPU easily and already exist in many other games out there...

software physx runs just fine on the cpu; hardware physx runs just fine on a single 8800 card or higher. you only need a 2nd card/dedicated gpu for better performance, the same as if you wanted to play at higher resolutions or enable higher graphics options, aa, etc. (quality vs. performance). otherwise, just turn it off and be satisfied with the base software physics in the game. there are no additional costs if you already have the hardware to run it, so it's just an added bonus. 3 slots (2 slots even) are only needed if one plans on gaming on a very high-end system in the first place, which would become necessary for someone pushing 4 MP resolutions anyway, assuming the highest graphics options are enabled. just like nenu said, it can work very well with even a single 260 for currently supported games.

as for the effects, yes, certain effects can be done on the cpu, but not dynamically and in realtime without seeing a slideshow. even if they were just basic effects, the gpu is still exponentially faster at rendering them, so why not use it? if that weren't the case, then why have a gpu in the first place? we might as well be stuck playing quake running off the cpu. if the gpu can handle physics, graphics, encoding, and other highly parallel processes much better than the cpu, then there's no reason why it shouldn't be taken advantage of.
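to illustrate the point (just a sketch; these names are invented and have nothing to do with physx's real internals), here's the same debris update written serially for the cpu and as a one-thread-per-particle kernel. the math is identical; the difference is how many particles get touched at once.

```cuda
// Illustrative comparison: identical debris integration, serial vs. parallel.
#include <cuda_runtime.h>

struct Particle { float3 pos, vel; };

// CPU: 50,000 particles stepped one after another, every frame.
void updateCpu(Particle* p, int n, float dt)
{
    for (int i = 0; i < n; ++i) {
        p[i].vel.y -= 9.8f * dt;              // gravity
        p[i].pos.x += p[i].vel.x * dt;
        p[i].pos.y += p[i].vel.y * dt;
        p[i].pos.z += p[i].vel.z * dt;
        if (p[i].pos.y < 0.0f) {              // bounce off the ground
            p[i].pos.y = 0.0f;
            p[i].vel.y *= -0.5f;
        }
    }
}

// GPU: the same step, but one lightweight thread per particle.
__global__ void updateGpu(Particle* p, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    p[i].vel.y -= 9.8f * dt;
    p[i].pos.x += p[i].vel.x * dt;
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;
    if (p[i].pos.y < 0.0f) { p[i].pos.y = 0.0f; p[i].vel.y *= -0.5f; }
}

int main()
{
    const int n = 50000;
    const float dt = 1.0f / 60.0f;

    Particle* h = new Particle[n]();          // zero-initialized host copy
    updateCpu(h, n, dt);                      // serial path

    Particle* d;
    cudaMalloc(&d, n * sizeof(Particle));
    cudaMemcpy(d, h, n * sizeof(Particle), cudaMemcpyHostToDevice);
    updateGpu<<<(n + 255) / 256, 256>>>(d, n, dt);  // parallel path
    cudaDeviceSynchronize();

    cudaFree(d);
    delete[] h;
    return 0;
}
```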

if one doesn't care for the effects, then that's a matter of opinion. the same applies to dx10/10.1 visual effects in some games. eventually there will be gameplay-changing accelerated physics in future games, and i'd venture to surmise that future cpus will still be a poorer and slower alternative to future gpus in terms of physics acceleration. the only legitimate complaint i can see is that ati cards don't have the option for gpu physx. i'm sure that will no longer be a valid concern in the near future, so until then, patience is a virtue that unfortunately most do not possess.
 
nvidia needs to let ATI users have PhysX in some way or another, even if that means buying a cheap nvidia card to run it on; otherwise it will never take off, because only half of the people playing the game can use it.

If everyone had PhysX, more developers could use it, and it would probably be a lot better than it is now.

Nvidia is screwing everyone, including themselves, by doing it the way they are now.

physx doesn't have to "take off" since it's just middleware like havok. plenty of games use it on consoles, pc, and even the iphone. now, if you mean more specifically that gameplay-changing physx will never take off due to a split market, then you might be more accurate in that conjecture. as it stands right now, the hardware-accelerated physx effects being implemented in games as optional are just that: optional. and as far as that is concerned, more and more titles are integrating that approach.

i agree that if gpu physx worked on ati cards then none of this would be an issue, and there would be a solid foundation from which to develop games with physics that changed the way we play. perhaps nvidia is screwing everyone as you say. practically all corporations try to screw you over one way or another - either bend over and like it, or don't do business with them like some have pledged. i don't pretend to know all the details as far as the talks between both companies as they relate to gpu physx. it's all hearsay to me. offers were discussed and ended. someone supposedly got it to work on ati cards, but it got stopped by ati, or was it nvidia? i read a post where someone commented that perhaps ati didn't agree to utilize gpu physx due to nvidia potentially making its competitors' cards lag in performance. even if that were to become true, i don't see how that would be any different from intel probably doing the same thing with havok and larrabee. regardless, we'll just have to wait and see how everything pans out in a few years. until then, grab your popcorn and soda.

p.s. - as far as win7 and the latest nvidia drivers are concerned, i will agree that it was a pretty bad move to cut off support for mixed ati/nvidia dual-gpu setups, but apparently the green team decided they would rather retain a competitive advantage at all costs (even if it is potentially short-term) than attempt to boost their hardware sales.
 
p.s. - as far as win7 and the latest nvidia drivers are concerned, i will agree that it was a pretty bad move to cut off support for mixed ati/nvidia dual-gpu setups, but apparently the green team decided they would rather retain a competitive advantage at all costs (even if it is potentially short-term) than attempt to boost their hardware sales.

yes, this is the part that confuses me; I would think they'd want to sell their hardware, since they are a hardware company....
 
physx doesn't have to "take off" since it's just middleware like havok. plenty of games use it on consoles, pc, and even the iphone. now, if you mean more specifically that gameplay-changing physx will never take off due to a split market, then you might be more accurate in that conjecture. as it stands right now, the hardware-accelerated physx effects being implemented in games as optional are just that: optional. and as far as that is concerned, more and more titles are integrating that approach.

i agree that if gpu physx worked on ati cards then none of this would be an issue, and there would be a solid foundation from which to develop games with physics that changed the way we play. perhaps nvidia is screwing everyone as you say. practically all corporations try to screw you over one way or another - either bend over and like it, or don't do business with them like some have pledged. i don't pretend to know all the details as far as the talks between both companies as they relate to gpu physx. it's all hearsay to me. offers were discussed and ended. someone supposedly got it to work on ati cards, but it got stopped by ati, or was it nvidia? i read a post where someone commented that perhaps ati didn't agree to utilize gpu physx due to nvidia potentially making its competitors' cards lag in performance. even if that were to become true, i don't see how that would be any different from intel probably doing the same thing with havok and larrabee. regardless, we'll just have to wait and see how everything pans out in a few years. until then, grab your popcorn and soda.

p.s. - as far as win7 and the latest nvidia drivers are concerned, i will agree that it was a pretty bad move to cut off support for mixed ati/nvidia dual-gpu setups, but apparently the green team decided they would rather retain a competitive advantage at all costs (even if it is potentially short-term) than attempt to boost their hardware sales.

:) couldn't have said it any better.
 
Or, Nvidia could just repackage one of their cards with a few tweaks, call it a "Physics card" and sell it. And let it work with ATI cards.
 
Developers love APIs. PhysX is a top choice right now for developers because it runs without a gfx card, and a gfx card can accelerate it.

The Havok physics engine just uses the CPU, so PhysX wins the choice of which to use.

Why don't you try to write your own physics engine and see how well it works.
 
Developers love APIs. PhysX is a top choice right now for developers because it runs without a gfx card, and a gfx card can accelerate it.

The Havok physics engine just uses the CPU, so PhysX wins the choice of which to use.

Why don't you try to write your own physics engine and see how well it works.

The point is that only half or so of the people buying the game can use it.
 
:) couldn't have said it any better.

thanks

Or, Nvidia could just repackage one of their cards with a few tweaks, call it a "Physics card" and sell it. And let it work with ATI cards.

but they have! it's called a ppu, lol. seriously though, i don't know how that would work, since ageia already failed to create the mindshare necessary to get people to buy their hardware in droves. i don't see how nvidia could do the same in creating this new "physics" brand, even if it was just a repackaged gpu in disguise. and tbh, it wouldn't make much sense, considering all they have to do is change the drivers back to the way they were. assuming one has xp/7, an ati user is then back to having a lot of options to choose from for a dedicated gpu physx card vs. just one prepackaged card. and there are scalable differences in hardware physx performance between cards due to the number of stream processors, clocks, etc.
 
ya lol, the better option would be to change the drivers back, I was just throwing that out there.
 
The point is that only half or so of the people buying the game can use it.

ati is fully capable of offering developers free support to make sure their users are not left in the dark.

people buying the games can talk with their wallets. if ati users have issues with the lack of physx support, don't buy the games.
 
ati is fully capable of offering developers free support to make sure their users are not left in the dark.

people buying the games can talk with their wallets. if ati users have issues with the lack of physx support, don't buy the games.

I don't really think you're getting it...

The lack of PhysX support is nobody's fault except nvidia's. ATI can't do anything about it, and game developers can't do anything about it.

If nvidia would simply revert the change and let ATI users use nvidia cards for PhysX, everyone would be better off.
 
I don't really think you're getting it...

The lack of PhysX support is nobody's fault except nvidia's. ATI can't do anything about it, and game developers can't do anything about it.

If nvidia would simply revert the change and let ATI users use nvidia cards for PhysX, everyone would be better off.

Has there been any written proof that NVidia deliberately disabled running an ATI card as primary with PhysX on a second NVidia card?
 
Has there been any written proof that NVidia deliberately disabled running an ATI card as primary with PhysX on a second NVidia card?

Can I use an NVIDIA GPU as a PhysX processor and a non-NVIDIA GPU for regular display graphics?

No. There are multiple technical connections between PhysX processing and graphics that require tight collaboration between the two technologies. To deliver a good experience for users, NVIDIA PhysX technology has been fully verified and enabled using only NVIDIA GPUs for graphics.

That's what their website says.
 
Interesting and not good.
I wonder if the backlash can bring a change of mind.

Judging by this post, it looks very permanent.
http://forums.nvidia.com/index.php?showtopic=104314&pid=579021&st=0&#entry579021

Hello JC,

Ill explain why this function was disabled.

Physx is an open software standard any company can freely develop hardware or software that supports it. Nvidia supports GPU accelerated Physx on NVIDIA GPUs while using NVIDIA GPUs for graphics. NVIDIA performs extensive Engineering, Development, and QA work that makes Physx a great experience for customers. For a variety of reasons - some development expense some quality assurance and some business reasons NVIDIA will not support GPU accelerated Physx with NVIDIA GPUs while GPU rendering is happening on non- NVIDIA GPUs. I'm sorry for any inconvenience caused but I hope you can understand.

Best Regards,
Troy
NVIDIA Customer Care
 
Yep, business reasons. The funny thing is, it will cost them my business, because I can't buy an nvidia card to use for physx.
 
The few comparison videos I've seen show little but eye candy.

For example, the GRAW comparison video earlier in the thread showed someone shooting chips out of trees and cinderblocks. The chips look fabulous... yet the tree and wall remain untouched. How about using some of that hardware to allow deformation along with physics? Shoot out some key blocks and collapse a doorframe (or a building), or explode trees on a hill and have the shrapnel cut the enemy to pieces. Spray blood and gore into someone's eyes and blind them. Play tricks with sound reflection and lure guards away from their posts.

I'd like a little strategy with my physics, please, and I hope both ATI and nVidia can offer something compatible. Choice is good!
 
If ATI cards did do physics, then we would have 2 different, proprietary systems... somehow that doesn't seem like a very good idea either...
 
We've had software Havok and software PhysX for a long time; AGEIA were the first to push the hardware route, and NVidia made it work.
ATI have had a long time to finish writing Havok on the GPU and have made significant progress.
They have demoed many different types of physics already, but the APIs haven't been released yet.
I'm not sure if they have decided not to progress with it, or maybe they are finalising code and training staff to support it...
 
Ya, software physics is one thing, because everyone can run it, but when you need certain hardware to run it, things get complicated...

If ATI releases hardware physics support, then we're still going to see some games only supporting PhysX and some games supporting Havok "GPU edition", because ATI and NVIDIA will be competing (with money, probably) to get developers to use their solution.

I really just want there to be ONE system that will work on ALL hardware so physics can really advance, but I don't know if that's going to happen.
 
yeah, it will be interesting to see how the "physics" revolution in gaming unfolds. will physx and havok continue to coexist as two options for devs in the future? will nvidia and amd/intel both port their respective physics apis and allow their competitors' products to function as well? or will gpu havok be blocked on nvidia and gpu physx on ati/intel, thus commencing the gpu physics wars? (which obviously wouldn't end too well, i imagine.) will one overtake and dominate the other? will something else emerge and completely destroy them both? or will an utterly different scenario take place? will nick and jessica ever get back together? lol, it's fun to speculate.

i looked over the ngohq thread about the win7 drivers, and it appears that they are trying to mod the drivers to get physx working again with ati/nvidia setups. the plot thickens...
 
DirectX support is the most logical route.
DX11 adds a compute shader stage on upcoming cards, which brings fast stream processing to a vendor-neutral API.
Hopefully it will be extended soon to support older hardware.
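As a rough analogue (sketched in CUDA rather than actual DX11 HLSL, and with invented names), "stream processing" just means running one small program across a huge array of elements at once:

```cuda
// The compute-shader idea in miniature: one tiny program, a million elements.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];   // same op applied to every element
}

int main()
{
    const int n = 1 << 20;               // ~1 million elements
    float *x, *y;
    cudaMalloc(&x, n * sizeof(float));
    cudaMalloc(&y, n * sizeof(float));
    cudaMemset(x, 0, n * sizeof(float)); // stand-in data; fill as needed
    cudaMemset(y, 0, n * sizeof(float));

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();
    printf("processed %d elements\n", n);

    cudaFree(x); cudaFree(y);
    return 0;
}
```

DX11's compute shader exposes the same model through D3D, which is why people hope it becomes the vendor-neutral path for physics-style workloads.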
 
The few comparison videos I've seen show little but eye candy.

For example, the GRAW comparison video earlier in the thread showed someone shooting chips out of trees and cinderblocks. The chips look fabulous... yet the tree and wall remain untouched. How about using some of that hardware to allow deformation along with physics? Shoot out some key blocks and collapse a doorframe (or a building), or explode trees on a hill and have the shrapnel cut the enemy to pieces. Spray blood and gore into someone's eyes and blind them. Play tricks with sound reflection and lure guards away from their posts.

I'd like a little strategy with my physics, please, and I hope both ATI and nVidia can offer something compatible. Choice is good!

actually, in graw2 this does happen, albeit in the included bonus level entitled graw2 ageia island, which has hardware-accelerated physics. not quite to the extent that it seems you would like, so i guess you will have to wait for a future game where you can spray blood into someone's face to blind them, lol. in the main game, the physics were scaled down tremendously in comparison. this was done so that people wouldn't need dedicated physx hardware to play through the main game. otherwise, i think the devs could have made something even more dynamic and exciting.
 
If ATI cards did do physics, then we would have 2 different, proprietary systems... somehow that doesn't seem like a very good idea either...

It's not, and I wouldn't support that either. What needs to happen is for a third party with no investment in the physics hardware to manage the standard; then there is no conflict of interest. If Microsoft scooped up hardware physics and made it part of DirectX, that would probably be the best solution in the current state of affairs.

If Nvidia owns the rights to PhysX and manages to make it an industry standard, then they become a monopoly in that market. Businesses trying to do this sort of thing often face antitrust charges, like Microsoft in the EU over Internet Explorer and Windows Media Player shipping with Windows... this is done for exactly the same reason: you can't let one party take complete ownership of a technology like that, as it allows them to direct the market and eventually take control over it.

Whatever short-term benefits PhysX offers us, they would pale in comparison to the problems we'd have if Nvidia were the sole owner of a technology which became widely adopted with no competing standard. It basically means they can do whatever they want, charge ATI whatever they want; all very bad.

Kill the fanboy bullshit for a second, please. Doing all that work for PhysX takes a hell of a lot more time and money than just doing CPU-based stuff. Developers are never lazy. If you think they are, go watch them work. Saying that is a goddamn insult to every single developer in the industry.

It was a conditional statement. I never claimed they were lazy; I said they would be lazy if... [condition]

A developer will choose to use PhysX because they like the extra options it gives them. nVidia isn't going to force them to do anything.

Er... well, I'm sure Nvidia has some say in the matter; it's a contract, and they get stuff out of it, like their logo splashed in our faces on games that work with the TWIMTBP program. While that's nice, it's obviously a marketing ploy from Nvidia: they spend some money on engineers to work with developers, and in return they get advertising at the start of games which shows the developers endorsing one brand as "the way it's meant to be played".

TWIMTBP is there to provide FREE developer support. So yeah, maybe things do get a little nVidia-biased, but really it's not like ATI gives a damn about helping developers.

I'd be interested to know the exact terms of the TWIMTBP program; does anyone actually know for sure?

AMD is between a rock and a hard place when it comes to both Havok and PhysX. They decided to go with the worst possible choice and support Intel. AMD could deal if nVidia tried to screw them over with a PhysX licence.

No, they couldn't; that's the whole point. AMD aren't stupid: the PhysX API can run on their cards and Nvidia invites them to use it, but they don't, and while that's a minor annoyance to us AMD card owners, ultimately it's of benefit, because letting Nvidia entirely own and then popularize a standard like this is such a bad move.

I'm not a fanboi, and I'm not saying one company is more or less evil than the other; they're both fuckers for this sort of thing, but that doesn't mean we should shut up and not call them on it. I've been a long-standing fan of Nvidia, owning a top-end card from them since their GeForce 4 range (Ti4600, FX5900, 6800 GTX, 7950 GX2, 8800GTX), and this is my very first AMD setup (2 4870s in Crossfire).

Intel, on the other hand, is going to do everything possible to use Havok to their own advantage and screw both AMD and nVidia.

Yes, yes, Intel will be no better than the other two.

the difference is that nvidia is paying developers to leave out features that can be done on ATI cards, so they can put them in using physx. they aren't just paying them to put logos in; they're paying them to screw over ati owners and call it physx.

and look at all the good it's done: now the batman game is delayed because they haven't finished the physx yet. so we can all sit around and wait longer for the game because of it, even if we can't use it.

I don't think this is entirely true; I think they've made the API open and accessible to AMD, except AMD won't bite. What they have done is decrease the quality of the effects that PhysX is being compared against, to make PhysX appear more attractive than it really is.

Again, SOME effects unique to PhysX on GPUs/PPUs cannot be done on current CPUs, and that's fine; if they want to make a comparison on those effects, such as pseudo cloth and pseudo water, then great, go ahead and boast about the technology, because it's really nice. However, don't compare your 50-spark ricochet effect to a 10-spark one when we know full well that a modern CPU can handle that easily.

...and devs wouldn't turn down the chance to give us some very nice effects in games; it's part of their own dream to make it look as good as possible.

On what configuration? On all configurations?

It's called fallback rendering: when a technology isn't supported, you make a fallback effect which looks similar but not as good, so that people without the hardware can still enjoy a game that looks good.

The problem is that the fallback effects in these comparisons are weak, not just weak compared to their PhysX counterparts but weak compared to any modern standard.

It's a dishonest comparison, and we'd be stupid to let stunts like this slide. If something is dishonest, we should call them on it, take a look at it in depth, and get a bit of drama going to knock them back into line, so we can see clearly what the ACTUAL benefits are, and not the massaged benefits.
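To show what I mean by an honest fallback path, here's a minimal sketch (the function names and numbers are invented for illustration, and a real engine would key off far more than device count): detect the hardware and scale the effect budget, rather than nerfing the CPU path down to 10 sparks.

```cuda
// Hypothetical fallback selection: scale the spark budget with the hardware
// instead of using an all-or-nothing switch. Numbers are illustrative only.
#include <cuda_runtime.h>
#include <cstdio>

int pickSparkBudget()
{
    int devices = 0;
    // No CUDA-capable GPU (or no driver): fall back to a CPU budget that a
    // modern processor can still handle comfortably.
    if (cudaGetDeviceCount(&devices) != cudaSuccess || devices == 0)
        return 40;

    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);
    // Scale the effect with the GPU's parallel capacity.
    return 50 * prop.multiProcessorCount;
}

int main()
{
    printf("spark budget this frame: %d\n", pickSparkBudget());
    return 0;
}
```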
 
That was a really good post, Princess. I do wish that hardware physics would be integrated into DX11 and work on both ATI and Nvidia cards.

If not, one side will end up completely controlling the market, and that side would probably be nvidia.
 
The next time someone mentions 'integrating' physics into DX11 I'm going to hurt that person. DX has nothing to do with middleware like physics APIs. Originally it was a low-level API integrating graphics rendering (D3D), networking (gone now), sound (crippled) and input (gone). The shader programming in DX11 isn't interesting, BTW, as it's about the same as OpenGL's GLSL, which has been around since the dawn of time. OpenGL has always been ahead of D3D, and will always remain that way, simply because it's open and is the most widely used graphics API.

Anyway, at this point we have two physics APIs which matter: Havok and PhysX. The rest are in-house or smaller jobs which aren't relevant here. Differences between Havok and PhysX include their pricing strategies: Havok costs money to use in a game, plus royalties per copy of the game sold. PhysX is free to integrate, free to sell.

Both work equally well on CPUs. PhysX offers GPU acceleration on top of this; Havok may at some distant point in the future (no confirmation of commercialization, despite fancy demos). NVidia's business practices are hare-brained. AMD's business practices are hare-brained and not practical. What we can observe here is a repeat of the graphics API wars of the 90s.

Back then we had an open API, OpenGL, and a few closed ones, D3D and Glide, and a whole lot of fighting about which to use, if any. The early 90s solidified the use of vertices for 3D graphics, reducing the number of APIs, but that still left three APIs fighting it out. We all know that D3D rules on Windows, whereas OpenGL has got virtually every other platform, from the iPhone to consoles and non-Windows platforms in general. This mostly happened due to Microsoft's tough (monopolistic) policies. The truth is that D3D until version 9 really wasn't that good; D3D 6 is universally loathed by devs subjected to it.

This time around, the two companies most prominently behind the two physics APIs Which Matter (tm), nVidia and AMD, use their hardware as a bargaining chip in the physics API wars by limiting which APIs are implemented for their hardware, and what hardware can be in the same system (i.e. not the competitor's cards). This is much like the situation with D3D and Glide, where 3rd parties wrote wrappers to work around such limitations, allowing one, for example, to use Glide apps on a D3D card. This is all due to company politics, not technical reasons.

What I'm also implying, I guess, is that GPU physics will become as ingrained as hardware rendering is now. Few people would still want a software renderer in the games they're playing, as its performance would be abysmal, at perhaps 10% of the performance of the GPU-accelerated version. The same is true for physics: even a GF8800-level GPU will run (many) circles around even a high-end i7 quad-core CPU, simply because CPUs don't and never will have the vector processing capabilities of GPUs. Use the right tool for the job. Software renderers were used because GPUs weren't really around until the late 90s; the hardware didn't exist, or wasn't prevalent enough. We see the same situation now.

Plenty of people have GF8-level hardware now, and those who don't can get a card for free or cheap. This fulfills the 'hardware present' requirement. Now we just need to wrestle through the 'company politics' part of the war. As I said before, nVidia and AMD are both being bullies, ignoring what is practical and beneficial to everyone in order to protect their bottom line. PhysX is the most advanced API at this point. It's also the cheapest for developers, and therefore should be the winner (price and performance win). Unfortunately, bone-headed decisions on both sides of the war are keeping anyone from 'winning'.

My company uses PhysX in our game engine due to its price and performance. We're keeping things on the CPU as much as possible for now. We think that nVidia is bone-headed for not allowing their GPUs to be used as PPUs alongside an AMD/other GPU. We take good care in implementing PhysX, even if we have to scale things down to accommodate systems without GPU physics. We would love to see PhysX become universal, together with GPU physics.

Don't bash PhysX because some developers don't know how, or don't want, to properly implement the scaling of physics. Don't bash PhysX because so few games use it (almost everybody is to blame here). Don't fawn over Havok, because it's just as proprietary, even more limited, and much more expensive than PhysX. Bash the petty company politics preventing both PhysX and Havok from running on the same hardware without any limitations.
 
You can't use hardware physics to good effect in a game unless the entire install base can run it. Effective use of physics isn't something you can just turn off.
 
The next time someone mentions 'integrating' physics into DX11 I'm going to hurt that person.

Physics should be integrated into DX11.

LULZ

But anyway, allowing ATI users to use PhysX with a secondary card would be good in the short term, but unless Nvidia lets other companies' hardware run PhysX as well, we're still going to have a monopoly on our hands, and that's no bueno, senor...
 
But anyway, allowing ATI users to use PhysX with a secondary card would be good in the short term, but unless Nvidia lets other companies' hardware run PhysX as well, we're still going to have a monopoly on our hands, and that's no bueno, senor...

PhysX was and still is the underdog in terms of market base. It's only since 2008, after Ageia got bought by nVidia, that it has seen significantly more use. If anything, you should be cheering nVidia on for bringing a (the) competitor to Havok to market.

Also, it was AMD who refused to let PhysX run on its hardware. Re: bone-headed company politics. AMD deserves at least as much blame as nVidia for prolonging this stalemate.
 
I don't like how either company is handling this, but I still have to buy a video card at the end of the day. Both care more about their pocketbooks (and their pride) than about the consumer.
 
well, I'm just sort of upset that we could have had some crazy physics in games by now, but these 2 companies can't play nice, so we all suffer.

since when do competitors ever play nice? i know i don't in games or sports, lol. no mercy!
 
Could we blame consoles?

yes

 
I don't like how either company is handling this, but I still have to buy a video card at the end of the day. Both care more about their pocketbooks (and their pride) than about the consumer.

True enough.

For me, CUDA and PhysX are important, and AMD cards offer nothing of relevance (their OpenGL support is lacking, Stream is uninteresting compared to CUDA, and Havok doesn't run on AMD cards), ergo I always seem to end up buying nVidia cards. If AMD would actually listen to its entire (potential) market, this might change.
 