Nvidia for Better Or Worse

If they want more companies to use PhysX, making their market smaller by cutting off ATI completely is ludicrous.
It's a good point. From this perspective, NVIDIA's position on this is totally self-defeating. That is unless, of course, they actually think people are going to buy two or three NVIDIA cards, rather than AMD cards and an NVIDIA board, just for the opportunity to use PhysX in the one or two games currently out in which hardware PhysX has almost no usefulness whatsoever.

It doesn't quite add up.
 
Correct me if I'm wrong, but couldn't people just use an older PhysX driver (from before the disabling) to run PhysX if they have an ATI video card?
 
Correct me if I'm wrong, but couldn't people just use an older PhysX driver (from before the disabling) to run PhysX if they have an ATI video card?

Maybe, but hacking the driver is a better idea in case a new game would otherwise automatically update your PhysX driver. I'm not saying they do, but plenty of games automatically update various drivers, and I know some games refuse to install unless you are at a certain patch revision.
 
Troll much? Explain to me how it's OK for Nvidia to disable functions in a card you've paid for. The ATI that built the 9700? You mean the ATI that just handed Nvidia its ass with the 5870? All at a lower price point? Yeah, they are clearly sitting on their thumbs. Crawl back to your bridge.

Let's take your assertions one at a time.
First, Nvidia disabled it for quite a few good reasons, as I see it:
1. The way the laws are written, they would be responsible for any driver interactions with ATI drivers.
2. Second, it's not disabled if you run the card by itself; it's only disabled if it detects an ATI card, and that is a distinction that matters. Try and see if you can get past summary judgment.

3. You are joking about the 5870, right? All it is is faster. Guess what: it's supposed to be faster, it's a new generation.

4. However, it's faster in a pointless way. I can already run any game I want at 1920x1200 at fast to acceptable frame rates. Who cares if it runs games at 80 fps where Nvidia can do 60? It's not 1996, it's 2009.

5. It's the whole 3dfx vs. Nvidia thing again. Speed only goes so far. I'm far more intrigued by a video card that can usher in a new paradigm, a la Fermi, than one that can run already fine-performing games 30-50% faster.

ATI has not done anything innovative since the 9700, period, much as AMD has not done anything great since the Athlon/Opteron era.

If anything, that should have taught them that merely being efficient is not enough; innovation counts.
 
The way the laws are written, they would be responsible for any driver interactions with ATI drivers.
I had no idea there were laws that revolve around Windows drivers. In what bizarre country do such laws exist?
 
The disabling thing is pretty silly. It just makes them look bad and accomplishes nothing. Someone somewhere might actually buy a secondary Nvidia card just to do PhysX because their ATI card won't. BUT HEY. Now they can't.

Let's take your assertions one at a time.
First, Nvidia disabled it for quite a few good reasons, as I see it:
1. The way the laws are written, they would be responsible for any driver interactions with ATI drivers.
2. Second, it's not disabled if you run the card by itself; it's only disabled if it detects an ATI card, and that is a distinction that matters. Try and see if you can get past summary judgment.

3. You are joking about the 5870, right? All it is is faster. Guess what: it's supposed to be faster, it's a new generation.

4. However, it's faster in a pointless way. I can already run any game I want at 1920x1200 at fast to acceptable frame rates. Who cares if it runs games at 80 fps where Nvidia can do 60? It's not 1996, it's 2009.

5. It's the whole 3dfx vs. Nvidia thing again. Speed only goes so far. I'm far more intrigued by a video card that can usher in a new paradigm, a la Fermi, than one that can run already fine-performing games 30-50% faster.

ATI has not done anything innovative since the 9700, period, much as AMD has not done anything great since the Athlon/Opteron era.

If anything, that should have taught them that merely being efficient is not enough; innovation counts.

Judging by your previous posts on any subject related to Nvidia/ATI, you are hereby marked as an Nvidia fanboy. Stop it. Fanboys on either side are pretty annoying.

Also, you like to say "it's not ********, it's 2009" as a justification for things.
 
Let's take your assertions one at a time.
First, Nvidia disabled it for quite a few good reasons, as I see it:
1. The way the laws are written, they would be responsible for any driver interactions with ATI drivers.
2. Second, it's not disabled if you run the card by itself; it's only disabled if it detects an ATI card, and that is a distinction that matters. Try and see if you can get past summary judgment.

3. You are joking about the 5870, right? All it is is faster. Guess what: it's supposed to be faster, it's a new generation.

4. However, it's faster in a pointless way. I can already run any game I want at 1920x1200 at fast to acceptable frame rates. Who cares if it runs games at 80 fps where Nvidia can do 60? It's not 1996, it's 2009.

5. It's the whole 3dfx vs. Nvidia thing again. Speed only goes so far. I'm far more intrigued by a video card that can usher in a new paradigm, a la Fermi, than one that can run already fine-performing games 30-50% faster.

ATI has not done anything innovative since the 9700, period, much as AMD has not done anything great since the Athlon/Opteron era.

If anything, that should have taught them that merely being efficient is not enough; innovation counts.

1. The way laws are written? Tell me you're joking, please. Maybe you can cite some case law about driver interaction. Good god. Lord knows we've never had to put up with crap drivers before, you know, due to the pending Driver lawsuit. I'll give you a real law Nvidia should be more worried about: how about antitrust laws? I wonder how long Intel would get away with making an X25 only partially functional if it detected an AMD CPU, or Windows shutting off the browser because you had a Linux partition on your drive. See how stupid this is? Cite driver "laws" and I guess you're off the hook, though.
2. You're right, and that's the whole point of the outrage. Have you read a single post in this thread? So your point is exactly what? You're saying if I have an ATI card to render, I can't use an Nvidia card to do PhysX? Bingo, read the thread. But that's obviously cool with you, since you can't see past the green spot in your head.

3/4. Faster. Umm, clearly. It also uses less power, is a single-GPU solution, and has Eyefinity. Glad your games at the resolution you play work for you. A summary judgment would be assuming that holds for all cases, and assuming no one else could use the speed. Let's see if you can move past that, eh tiger? And what exactly is your point with 1996 vs. 2009? Some new catchphrase to grasp at? A rallying cry to be shouted in the face of facts? You say it again and again, yet it means nothing.
5. A new paradigm? This should be rich. While everyone is moving to open standards like OpenCL et al., what new paradigm is Nvidia set to deliver? Moving to GPU computing? That'll be a laugh. With AMD/ATI trying to beat them, and Intel about to join the fray, I can see that working out wonderfully for them.

So on top of screwing their own customers over (which you defend due to driver "laws"; sorry, that is just never going to get old), being late to market with their next-gen product, having their chipset division crushed by the competition, faking a product (personally I couldn't care less, but it's a misstep at the least), and attempting to push non-industry-standard APIs (you know, instead of DX, OpenCL, etc.), exactly what has Nvidia done for the enthusiast market?

Ion is stillborn, Fermi is more or less vaporware, and all it's going to take is for Intel to say no thanks to SLI licensing and Nvidia is plowed hard, my friend. Here they sit, teetering on relevance, and their idea to help gamers is to bork their drivers (you know, it's the law and all)? I'm sure this makes the guys at Redmond ecstatic too, after they had to work on the code in Win7 to get multi-vendor GPUs working, since it wasn't possible in Vista. I could rail on the asshats that run Nvidia all day, truly I could, but it's pointless really. Some people can't see past their allegiances. What's really funny is that every single discrete graphics card in my house is an Nvidia product, but that's clearly going to change now. Some may be perfectly willing to bend over and lube up for Nvidia, or any company. Then again, some of us know not to eat lead paint chips, too.
 
Why do people assume Nvidia is obligated to provide PhysX support for their competitor's cards? They paid to acquire it as a selling point for their cards, they own the technology, and they have every right to make it exclusive. From a business viewpoint, it's only common sense.

Read it again. No one is saying Nvidia should support PhysX on a competitor's card; they're disabling PhysX on their OWN card when there is an ATI/AMD card in the system. It's a retarded business move, especially with DX11 around the corner, which will support GPU physics.
 
The disabling thing is pretty silly. It just makes them look bad and accomplishes nothing. Someone somewhere might actually buy a secondary Nvidia card just to do PhysX because their ATI card won't. BUT HEY. Now they can't.



Judging by your previous posts on any subject related to Nvidia/ATI, you are hereby marked as an Nvidia fanboy. Stop it. Fanboys on either side are pretty annoying.

Also, you like to say "it's not ********, it's 2009" as a justification for things.

Yes, I'm one of those fanboys who has owned:

Matrox G200

ATI 9700 Pro, X1800 XT

3dfx Voodoo2, Voodoo, and Voodoo 5

Nvidia Ti 4600, 6800, 8800 GT, and GTX 280

Sorry, but ATI's latest is a relic from an era when a speed bump gave you qualitative gains. That is no longer the case.
 
Wow? Nvidia, suddenly following the release of ATI's 5800 series, burps out a new architecture to the media, and Nvidia fanboys gain "lack of innovation" for their arsenal. :eek:
 
Judging by your previous posts on any subject related to Nvidia/ATI, you are hereby marked as an Nvidia fanboy. Stop it. Fanboys on either side are pretty annoying.

Fanboys from any camp are indeed pretty annoying, but Nvidia's PR strategy of "dude, where's my pwnz?" has created a whole qualitatively new breed of fanboy: the nvidiot.
 
Once again we end up on the same topic... GPU PhysX vs. CPU... haven't you realized how much faster a GPU is at this compared to a traditional CPU?
Maybe we should go back to doing ALL processing on the CPU and forget video cards and everything else...

Have you seen the hack to run PhysX on an ATI card in Batman AA? http://forum.beyond3d.com/showthread.php?p=1332461
http://www.youtube.com/watch?v=AUOr4cFWY-s&feature=player_embedded

Not that much faster, at least not when you're trying to render graphics and PhysX at the same time.
 
Please use your brain before opening your mouth. It is not about Nvidia providing support for ATI cards; it is about Nvidia disabling support on the user's OWN Nvidia card.

Please use your brain before flaming. They're disabling support if an ATI card is present. It may not be a popular move, it may not be right, but it's their technology to do with as they choose. Did anyone really expect Nvidia to play fair? That's just not the nature of the beast; in today's economy, business is about as cut-throat as it gets.
 
Please use your brain before flaming. They're disabling support if an ATI card is present. It may not be a popular move, it may not be right, but it's their technology to do with as they choose. Did anyone really expect Nvidia to play fair? That's just not the nature of the beast; in today's economy, business is about as cut-throat as it gets.

That doesn't cancel out the fact that we already paid for the card. It's not like music, where IP is just a weird mess.

It's a feature that was advertised as a major buying point, and now it doesn't work on our machines the way we want it to.

It's not a service, it's a product that we already bought. Some of us may have bought it for PhysX. Solely for PhysX.


Yeah, you can draw the comparison to how Nvidia doesn't need to provide continuing support, but still... they're our cards... and Nvidia has a stranglehold on PhysX.

They have gone out of their way to disable PhysX for ATI, and only ATI really competes with Nvidia in any area. ONLY ATI, not VIA, not S3, not Intel...

Now all I am waiting for is for Intel to disable support for Nvidia SLI in their next chipset design. Save the 5 USD.
 
It's interesting what's happened in the last couple of years.

Two years ago Nvidia were proclaimed dead men walking. AMD had bought ATI, Intel had announced Larrabee. The future was CPUs with GPUs integrated onto them. Nvidia didn't make CPUs, had no x86 licence, and was doomed.

It seems both AMD/ATI and Intel then took their foot off the gas. Since then neither company has really succeeded in producing an integrated CPU/GPU that mattered, and Intel still seems as far from releasing Larrabee as it was then. Neither has really innovated, sticking to what they know and kind of relying on the assumption that the future was an x86 CPU with a GPU attached, so they were safe.

Nvidia, on the other hand, has been in fight-or-die mode: when you are running out of time you work twice as hard. They have been pushing a weak CPU (not necessarily x86; ARM is fine) and an Nvidia compute engine.

At one end we have Tegra, which is revolutionary: take an ARM CPU, add everything you need for compute, music, games, and HD video, and fit it all in a 1 W power envelope. It just blows everything else away, and it's just the start; Tegra 2 will blow Tegra 1 away. This is starting to move up, so you will get ARM netbooks with Nvidia GPU graphics (accelerating Flash, video, games, etc.) and a Google OS eating into the low-end notebook market (traditionally dominated by Intel with its integrated graphics). If Microsoft starts supporting Windows on ARM CPUs to compete with the Google OS, then the whole notebook/desktop market is blown open. It's not just Nvidia: with the iPhone and iPod touch, the world was introduced to the fact that you could do most traditional computer tasks (email, web surfing, watching YouTube) on something much smaller and cooler.

At the other end we have supercomputers powered by Nvidia Tesla cards. These are equally revolutionary, requiring a small fraction of the space and power to do the same work as an AMD- or Intel-CPU-equipped supercomputer. With Fermi they have the best supercomputer engine out there by a long way. This is working its way down into the desktop market, with every PC containing an Nvidia card essentially being a mini supercomputer (or, for Folding, part of a bigger one). Apple are aware of this, hence pushing hard to get OpenCL into their OS and moving to Nvidia graphics because they are the best for GPU compute.

For the desktop and higher-end notebook market the world hasn't changed so much: it's still an AMD/Intel CPU and an AMD/Intel/Nvidia graphics card on MS Windows. But it's a shrinking market: non-gamers are replacing their desktops with notebooks, then netbooks, and even smartphones do much of what the desktop used to. Gamers have been forced to accept console ports; with the exception of MMORPGs, consoles now drive the games world, and PCs sit twiddling their high-powered thumbs as they are asked to display yet another DX9c port. Nvidia are still trying hardest to innovate here, introducing hardware physics acceleration and 3D to the console ports to at least make the PC version stand out from what's available on the consoles.

I don't know what the future may bring, but what's quite clear is that the world is changing. Of the major players, it's the one in the worst position (Nvidia) that has embraced the change the most in an attempt to survive. Intel and in particular AMD had better get going: stop reacting too late, and go for these new markets. They are both feeling safe at the top of their x86 towers, not realising the foundations are slowly being scraped away. If it all comes crashing down, then AMD in particular, with their huge debts, don't have the money to climb back out of the hole they will end up in.
 

Interesting post, to say the least.
 
It seems both AMD/ATI and Intel then took their foot off the gas.

If you lived in reality instead of computer fantasy land, you'd be able to deduce that the slowdown was mainly due to the U.S. economy nearly collapsing and that it was an industry-wide slowdown. But don't let facts get in the way of your fantasy.

Even then, there has hardly been any slowdown from Intel or AMD. In fact, they both actually spent more on building new fabs: Intel in China, Boston, and Arizona for a total of about $7 billion, and AMD's GlobalFoundries spinoff started building a $4.2 billion plant in upstate NY. Remember, Intel and AMD do both CPUs and GPUs.
All Nvidia managed to do is piss off pretty much all the OEMs it does business with by selling them defective chips and lying about it.

Since then neither company has really succeeded in producing an integrated CPU/GPU that mattered, and Intel still seems as far from releasing Larrabee as it was then. Neither has really innovated, sticking to what they know and kind of relying on the assumption that the future was an x86 CPU with a GPU attached, so they were safe.

This is rich. While ATI developed a full line of DirectX 10.1 GPUs and has already released two DirectX 11 chips, Nvidia "innovated" by renaming DirectX 10.0 chips and selling them to people who didn't know any better.
Intel will bring out Larrabee when it is ready, because they can afford to take their time. That should give you a good idea of how commanding their position in the market is. If you somehow think that, just because Nvidia announced some paper specs at a press conference, the market is instantaneously going to change, I have a bridge I'd like to sell you :)

Nvidia, on the other hand, has been in fight-or-die mode: when you are running out of time you work twice as hard. They have been pushing a weak CPU (not necessarily x86; ARM is fine) and an Nvidia compute engine.

Their strategy right now is more like retreat and stall the market.
They are retreating from being a pure graphics accelerator company because that is a dead-end market. They would simply be squeezed out of the air between Intel and AMD (both of whom have their own fabs, by the way, while Nvidia doesn't and depends entirely on TSMC).

They're also stalling the market by pushing proprietary standards while they still have a decent-sized share of the market to make an impact. This will hopefully buy them time to implement their exit strategy and not bleed as much cash in the process.

At one end we have Tegra, which is revolutionary: take an ARM CPU, add everything you need for compute, music, games, and HD video, and fit it all in a 1 W power envelope. It just blows everything else away, and it's just the start; Tegra 2 will blow Tegra 1 away. This is starting to move up, so you will get ARM netbooks with Nvidia GPU graphics (accelerating Flash, video, games, etc.) and a Google OS eating into the low-end notebook market (traditionally dominated by Intel with its integrated graphics).

I'll give you that point, but in the end Intel can simply destroy Nvidia's margins by lowering Atom and integrated-graphics prices until they have Larrabee variants out in force.

If Microsoft starts supporting Windows on ARM CPUs to compete with the Google OS, then the whole notebook/desktop market is blown open. It's not just Nvidia: with the iPhone and iPod touch, the world was introduced to the fact that you could do most traditional computer tasks (email, web surfing, watching YouTube) on something much smaller and cooler.

It is nice to daydream about the future, but investors do not get paid by wishful thinking.

At the other end we have supercomputers powered by Nvidia Tesla cards. These are equally revolutionary, requiring a small fraction of the space and power to do the same work as an AMD- or Intel-CPU-equipped supercomputer.

The HPC market only makes up 1.3% of Nvidia's total revenue. Also, Intel and AMD are not the only players in the supercomputer market; IBM and Sun are big players there as well.

With Fermi they have the best supercomputer engine out there by a long way.
That's just an assumption; the product doesn't even exist yet. You also assume that there is nothing similar coming out to compete with it. But it is nice to daydream.

This is working its way down into the desktop market, with every PC containing an Nvidia card essentially being a mini supercomputer (or, for Folding, part of a bigger one). Apple are aware of this, hence pushing hard to get OpenCL into their OS and moving to Nvidia graphics because they are the best for GPU compute.

Apple is a secondary market at best. Barely anyone does HPC on Macs.


Gamers have been forced to accept console ports; with the exception of MMORPGs, consoles now drive the games world, and PCs sit twiddling their high-powered thumbs as they are asked to display yet another DX9c port.
Let me remind you of the Assassin's Creed DirectX 10.1 patch that Nvidia forced Ubisoft to pull.

Nvidia are still trying hardest to innovate here, introducing hardware physics acceleration and 3D to the console ports to at least make the PC version stand out from what's available on the consoles.

Proprietary hardware physics acceleration that amounts to nothing more than a gimmick, that can be (and already is being) done on CPUs, and that will be supplanted by versions done in OpenCL. I guess they need to justify to the investors the cash spent buying Ageia somehow. I'll quote John Carmack, who said, "I hope that Nvidia didn't pay a whole lot of money for it (Ageia)." 3D glasses, while nice, are a niche market that has never caught on.
At the same time they lag behind in DirectX, rename and resell old chips, etc. Great way to innovate.
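To make the "already being done on CPUs" point concrete, here is a minimal sketch in plain C++ (my own illustration; it is not taken from any game, engine, or the PhysX SDK) of the kind of debris/particle effect being argued about: explicit Euler integration with gravity and a simple floor bounce, stepped once per frame. Effects at roughly this level of complexity are exactly what CPUs already handle in most games.

[code]
// Hypothetical CPU-only particle "physics" sketch, for illustration only.
#include <cstdio>
#include <vector>

struct Particle {
    float x, y, z;    // position in meters
    float vx, vy, vz; // velocity in meters per second
};

// One explicit Euler step with gravity and a crude floor bounce at y = 0.
void step(std::vector<Particle>& particles, float dt)
{
    const float gravity = -9.81f;     // m/s^2, applied on the y axis
    const float restitution = 0.4f;   // fraction of speed kept after a bounce
    for (Particle& p : particles) {
        p.vy += gravity * dt;
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
        if (p.y < 0.0f) {             // hit the floor: clamp and reflect
            p.y = 0.0f;
            p.vy = -p.vy * restitution;
        }
    }
}

int main()
{
    // 10,000 bits of debris launched from 2 m up with some sideways speed.
    std::vector<Particle> debris(10000, Particle{0.0f, 2.0f, 0.0f, 1.0f, 5.0f, 0.0f});
    for (int frame = 0; frame < 600; ++frame)  // simulate ~10 seconds at 60 fps
        step(debris, 1.0f / 60.0f);
    std::printf("first particle settled at y = %.3f\n", debris[0].y);
    return 0;
}
[/code]

Scaling this up (particle-particle collisions, cloth, fluids) is where a GPU starts to pay off; the simple stuff does not need one.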

I don't know what the future may bring, but what's quite clear is that the world is changing.
The first sensible thing you've said.

Of the major players, it's the one in the worst position (Nvidia) that has embraced the change the most in an attempt to survive.
Embracing and having no choice are quite different things. Their actions as of late seem a little irrational and smell more of desperation than anything. They have even started attacking their own customers by not letting them use their cards in combination with an ATI card. That is simply absurd.

Intel and in particular AMD had better get going: stop reacting too late, and go for these new markets.
Yeah, they're reacting slowly by launching new products while Nvidia launches paper specs and fake cards at a press conference.
They are both feeling safe at the top of their x86 towers, not realising the foundations are slowly being scraped away.
You again assume they don't already have plans for the future. Again, I want to borrow your magical crystal ball.

If it all comes crashing down, then AMD in particular, with their huge debts, don't have the money to climb back out of the hole they will end up in.
With a few pieces of paper and a fake card, Nvidia will snuff AMD from the face of the earth. I'm sure less-informed people will find your post insightful, but I find it rather lacking. Bye now.
 
If you lived in reality instead of computer fantasy land, you'd be able to deduce that the slowdown was mainly due to the U.S. economy nearly collapsing and that it was an industry-wide slowdown. But don't let facts get in the way of your fantasy.

Even then, there has hardly been any slowdown from Intel or AMD. In fact, they both actually spent more on building new fabs: Intel in China, Boston, and Arizona for a total of about $7 billion, and AMD's GlobalFoundries spinoff started building a $4.2 billion plant in upstate NY. Remember, Intel and AMD do both CPUs and GPUs.

I don't think you know what GlobalFoundries is to AMD... You seem to live in your own fantasy world as well, one in which AMD owns GlobalFoundries, which is hilarious.
Actually, I'm sure you'll easily find the info that shows AMD had to lower their stake in GF :)

I'm going to skip over some of the points, because they consist of the typical rhetoric: NVIDIA's defective chips, the renamings and how people were deceived by them, PhysX sucks, etc., etc. Nothing new among the anti-NVIDIA crowd.

Lorien said:
The HPC market only makes up 1.3% of Nvidia's total revenue. Also, Intel and AMD are not the only players in the supercomputer market; IBM and Sun are big players there as well.

Which is why they are trying to increase their presence in it. And if you actually read anything about GF100 (Fermi), you would understand how many features it has to make NVIDIA's position in this market far stronger.

Lorien said:
That's just an assumption; the product doesn't even exist yet. You also assume that there is nothing similar coming out to compete with it. But it is nice to daydream.

There is. It's called Larrabee, and it's quite far from release. AMD has Fusion, but that's even farther away. Assuming they are anywhere near release is a nice daydream indeed :)

Lorien said:
Let me remind you of the Assassin's Creed DirectX 10.1 patch that Nvidia forced Ubisoft to pull.

So the developer confirmed that they removed 10.1 support because of problems with their 10.1 path, and I quote:

Ubisoft said:
In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin’s Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly.

http://techreport.com/discussions.x/14707

And you conclude it was NVIDIA that forced Ubisoft to do it? That would be serious indeed. Care to prove it, or is this your own conspiracy theory?

Lorien said:
Proprietary hardware physics acceleration that amounts to nothing more than a gimmick, that can be (and already is being) done on CPUs, and that will be supplanted by versions done in OpenCL. I guess they need to justify to the investors the cash spent buying Ageia somehow. I'll quote John Carmack, who said, "I hope that Nvidia didn't pay a whole lot of money for it (Ageia)." 3D glasses, while nice, are a niche market that has never caught on.
At the same time they lag behind in DirectX, rename and resell old chips, etc. Great way to innovate.

Supplanted? So a standard for using GPU resources directly will supplant a physics API? You don't know what OpenCL and PhysX are, do you? Also common among the anti-NVIDIA crowd...
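For anyone following along, here is roughly why that distinction matters. The sketch below is hypothetical and self-contained (standard OpenCL C API called from C++; the kernel and every name in it are my own, invented for illustration): it performs one trivial particle-integration step on whatever GPU it finds. Note how much plumbing is needed before any "physics" happens at all. Rigid bodies, broad-phase collision, contact solving, joints, and cloth are what an engine such as PhysX layers on top of a compute API like this, so OpenCL by itself does not "supplant" a physics API; at best it is something a future physics library could be written against.

[code]
// Hypothetical OpenCL example, for illustration only (error handling trimmed).
#include <CL/cl.h>
#include <cstdio>
#include <vector>

static const char* kSource =
    "__kernel void integrate(__global float4* pos,                  \n"
    "                        __global const float4* vel,            \n"
    "                        const float dt)                        \n"
    "{                                                              \n"
    "    int i = get_global_id(0);                                  \n"
    "    pos[i] += vel[i] * dt;   /* one Euler step, nothing more */\n"
    "}                                                              \n";

int main()
{
    const size_t n = 4096;
    std::vector<cl_float4> pos(n), vel(n);     // host-side particle data

    cl_platform_id platform; cl_device_id device;
    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) return 1;
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS) return 1;

    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, NULL);

    cl_program program = clCreateProgramWithSource(ctx, 1, &kSource, NULL, NULL);
    clBuildProgram(program, 1, &device, NULL, NULL, NULL);
    cl_kernel kernel = clCreateKernel(program, "integrate", NULL);

    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(cl_float4), pos.data(), NULL);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                 n * sizeof(cl_float4), vel.data(), NULL);
    float dt = 1.0f / 60.0f;

    clSetKernelArg(kernel, 0, sizeof(cl_mem), &dPos);
    clSetKernelArg(kernel, 1, sizeof(cl_mem), &dVel);
    clSetKernelArg(kernel, 2, sizeof(float), &dt);

    clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &n, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(queue, dPos, CL_TRUE, 0, n * sizeof(cl_float4),
                        pos.data(), 0, NULL, NULL);

    std::printf("integrated %u particles for one step\n", (unsigned)n);
    // Everything above is plumbing for ONE trivial update. Collision detection,
    // solvers, joints, cloth, etc. are what a physics engine provides on top.
    return 0;
}
[/code]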

Lorien said:
Yeah, they're reacting slowly by launching new products while Nvidia launches paper specs and fake cards at a press conference.

You again assume they don't already have plans for the future. Again, I want to borrow your magical crystal ball.

No need for a crystal ball. It's called Fusion, and it is quite far from release, as AMD's own roadmaps show.
 
It's interesting what's happened in the last couple of years.

Two years ago Nvidia were proclaimed dead men walking. AMD had bought ATI, Intel had announced Larrabee. The future was CPUs with GPUs integrated onto them. Nvidia didn't make CPUs, had no x86 licence, and was doomed.

It seems both AMD/ATI and Intel then took their foot off the gas. Since then neither company has really succeeded in producing an integrated CPU/GPU that mattered, and Intel still seems as far from releasing Larrabee as it was then. Neither has really innovated, sticking to what they know and kind of relying on the assumption that the future was an x86 CPU with a GPU attached, so they were safe.

Nvidia, on the other hand, has been in fight-or-die mode: when you are running out of time you work twice as hard. They have been pushing a weak CPU (not necessarily x86; ARM is fine) and an Nvidia compute engine.

At one end we have Tegra, which is revolutionary: take an ARM CPU, add everything you need for compute, music, games, and HD video, and fit it all in a 1 W power envelope. It just blows everything else away, and it's just the start; Tegra 2 will blow Tegra 1 away. This is starting to move up, so you will get ARM netbooks with Nvidia GPU graphics (accelerating Flash, video, games, etc.) and a Google OS eating into the low-end notebook market (traditionally dominated by Intel with its integrated graphics). If Microsoft starts supporting Windows on ARM CPUs to compete with the Google OS, then the whole notebook/desktop market is blown open. It's not just Nvidia: with the iPhone and iPod touch, the world was introduced to the fact that you could do most traditional computer tasks (email, web surfing, watching YouTube) on something much smaller and cooler.

At the other end we have supercomputers powered by Nvidia Tesla cards. These are equally revolutionary, requiring a small fraction of the space and power to do the same work as an AMD- or Intel-CPU-equipped supercomputer. With Fermi they have the best supercomputer engine out there by a long way. This is working its way down into the desktop market, with every PC containing an Nvidia card essentially being a mini supercomputer (or, for Folding, part of a bigger one). Apple are aware of this, hence pushing hard to get OpenCL into their OS and moving to Nvidia graphics because they are the best for GPU compute.

For the desktop and higher-end notebook market the world hasn't changed so much: it's still an AMD/Intel CPU and an AMD/Intel/Nvidia graphics card on MS Windows. But it's a shrinking market: non-gamers are replacing their desktops with notebooks, then netbooks, and even smartphones do much of what the desktop used to. Gamers have been forced to accept console ports; with the exception of MMORPGs, consoles now drive the games world, and PCs sit twiddling their high-powered thumbs as they are asked to display yet another DX9c port. Nvidia are still trying hardest to innovate here, introducing hardware physics acceleration and 3D to the console ports to at least make the PC version stand out from what's available on the consoles.

I don't know what the future may bring, but what's quite clear is that the world is changing. Of the major players, it's the one in the worst position (Nvidia) that has embraced the change the most in an attempt to survive. Intel and in particular AMD had better get going: stop reacting too late, and go for these new markets. They are both feeling safe at the top of their x86 towers, not realising the foundations are slowly being scraped away. If it all comes crashing down, then AMD in particular, with their huge debts, don't have the money to climb back out of the hole they will end up in.

HPC is a niche market that in no way can compete with the quantity of sales NV currently garners from the retail and OEM markets. And even if inefficient in comparison, x86 hardware is cheap and, perhaps more importantly, more compatible with today's existing enterprises. What you're suggesting by so heavily trumpeting the GPGPU side of Fermi is that NV management is actively seeking to become a niche player (shades of Silicon Graphics) by willingly forgoing the very markets that have made the company what it is today. Weirder things have certainly happened to other companies, but I don't think Jensen has ever demonstrated being that incompetent.

ikarinokami said:
Yes, I'm one of those fanboys who has owned:

Matrox G200

ATI 9700 Pro, X1800 XT

3dfx Voodoo2, Voodoo, and Voodoo 5

Nvidia Ti 4600, 6800, 8800 GT, and GTX 280

Sorry, but ATI's latest is a relic from an era when a speed bump gave you qualitative gains. That is no longer the case.

You're not even being consistent here. What was the 9700 Pro upon launch but a faster rasterizer that had no supporting software available for months and months to show off its DX9 support? Yet it was widely praised for being the fastest GPU on the market while still being future-proofed, a la the latest iteration of D3D. Sound familiar at all? <None of this is meant to suggest Fermi will be another NV30.>
 
ATI doesn't do PhysX because it would then have to pay a licensing fee to Nvidia for each video card it sold that did PhysX, the same way any motherboard manufacturer that supports SLI must pay a licensing fee to Nvidia. It doesn't make sense for ATI to give dollars to a competitor for every card they sell.

Kind of like AMD paying Intel for an x86 license? It's Nvidia's property. If it's important enough to ATI to have it, let them cough up the money for it. I have no issue with Nvidia trying to prevent users from circumventing this by using an ATI card as primary graphics and a hand-me-down Nvidia card as what is more or less an Ageia PhysX card. That's not what their purchase of that company and subsequent integration of PhysX into their GPUs was about.
 
I don't care for one company over the other, but Nvidia is a giant, stupid asshole for doing this.
 
Kind of like AMD paying Intel for an x86 license? It's Nvidia's property. If it's important enough to ATI to have it, let them cough up the money for it. I have no issue with Nvidia trying to prevent users from circumventing this by using an ATI card as primary graphics and a hand-me-down Nvidia card as what is more or less an Ageia PhysX card. That's not what their purchase of that company and subsequent integration of PhysX into their GPUs was about.

Circumventing what exactly?

No one who purchased an Nvidia card agreed to a license that explicitly states they may only use some features of that hardware according to when and how Nvidia dictates. It doesn't matter in the slightest to an end user what Nvidia's purchase of Ageia was about. It's also not the responsibility of consumers to prop up a company's poor business decisions, like that acquisition.

It boils down to this: Some people paid good money for an Nvidia card that has functioned more or less as advertised during its use. Now though, some people are purchasing newer products that just so happen to be from a different company. Because of this, Nvidia has taken it upon itself to break the functionality of an otherwise good piece of hardware that can still be used in some capacity. What makes it worse is that they're doing it seemingly out of spite to the end user for not being brand loyal.

There really is no defensible position Nvidia can take on this. They had to go out of their way to break PhysX intentionally by writing it into the drivers. Trying to claim it was a support issue was just laughable and they knew as much.
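As a purely hypothetical illustration of how little "writing it into the drivers" actually takes (this is my own sketch, not NVIDIA's code): on Windows, any component can enumerate the installed display adapters through DXGI and compare the standard PCI vendor IDs, 0x10DE for NVIDIA and 0x1002 for ATI/AMD. Something conceptually like the following is all a vendor check needs, which is part of why the "support burden" explanation rings hollow.

[code]
// Hypothetical mixed-vendor detection sketch (Windows + MSVC), illustration only.
#include <windows.h>
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory* factory = NULL;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    bool hasNvidia = false, hasAti = false;
    IDXGIAdapter* adapter = NULL;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        if (desc.VendorId == 0x10DE) hasNvidia = true;  // NVIDIA's PCI vendor ID
        if (desc.VendorId == 0x1002) hasAti    = true;  // ATI/AMD's PCI vendor ID
        adapter->Release();
    }
    factory->Release();

    if (hasNvidia && hasAti)
        std::printf("Mixed NVIDIA + ATI/AMD adapters detected.\n");
    else
        std::printf("No mixed-vendor configuration found.\n");
    return 0;
}
[/code]

Whatever mechanism the driver actually uses, the point stands: the check is a deliberate few lines of code, not an unavoidable side effect.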

Also, stop trying to make this thread about ATI. Nvidia will be doing the same thing soon enough when people are using graphics systems from Intel or even Matrox.
 
It wasn't long ago that no one cared about PhysX or thought it was a big deal. Now, with Batman: Arkham Asylum being so popular, all of a sudden many users are upset at Nvidia for locking out ATI users. It sounds like PhysX is more important than we realized now that a good AAA title uses it. Being able to offload PhysX to a separate GPU is actually a cool thing, and I can't see Nvidia giving it up now that they have an AAA title.
 
I think 99% of the people out there don't know what PhysX is, and really don't care... as 99% of the people out there don't need it or use it.

You would be in the first group.
 
It wasn't long ago that no one cared about PhysX or thought it was a big deal. Now, with Batman: Arkham Asylum being so popular, all of a sudden many users are upset at Nvidia for locking out ATI users. It sounds like PhysX is more important than we realized now that a good AAA title uses it. Being able to offload PhysX to a separate GPU is actually a cool thing, and I can't see Nvidia giving it up now that they have an AAA title.
I don't think most of us have just realized anything. We've been aware of the possibilities, but we never saw proof that it was really worth the extra expense, not to mention the performance wasn't all that great. We're still waiting for that killer app, but what they did with Batman AA was pretty cool. I played Batman on a GTX 280 doing graphics and PhysX and it looked great. When I ran it without PhysX on my 5870, I really felt that I was missing something. It just looks better with the added effects. It's disappointing not to be able to get the effects back when I have a product that's capable of doing the processing (the GTX 280), just because a competitor's product is present in the system.

The bottom line is that the restriction is a recent occurrence; it wasn't always like that, so I find it very reasonable to be upset about it.
 
I don't think most of us have just realized anything. We've been aware of the possibilities, but we never saw proof that it was really worth the extra expense, not to mention the performance wasn't all that great. We're still waiting for that killer app, but what they did with Batman AA was pretty cool. I played Batman on a GTX 280 doing graphics and PhysX and it looked great. When I ran it without PhysX on my 5870, I really felt that I was missing something. It just looks better with the added effects. It's disappointing not to be able to get the effects back when I have a product that's capable of doing the processing (the GTX 280), just because a competitor's product is present in the system.

The bottom line is that the restriction is a recent occurrence; it wasn't always like that, so I find it very reasonable to be upset about it.

You make a good point; it did use to be possible to combine ATI for graphics and Nvidia for PhysX until recently, so I can understand how you feel. I guess I was being too pragmatic; I just think Nvidia needs to do this for competitive reasons, since they are at least a few months off with a competing product. My point was that PhysX seems to be more popular than even I expected, and it only took the new Batman game to raise it to a new level. I don't think the decision will hurt Nvidia at all.
 
Kind of like AMD paying Intel for an x86 license? It's Nvidia's property. If it's important enough to ATI to have it, let them cough up the money for it. I have no issue with Nvidia trying to prevent users from circumventing this by using an ATI card as primary graphics and a hand-me-down Nvidia card as what is more or less an Ageia PhysX card. That's not what their purchase of that company and subsequent integration of PhysX into their GPUs was about.

That analogy doesn't work, because at that time the majority of users were stuck with x86 Windows as their operating system. So you HAD to have x86 support to be able to offer a viable CPU that worked with the DOMINANT OS on the market. And if you don't have an OS, you can't run anything, well, unless you buy an overpriced Mac. ;) [Excellent machines, but still overpriced, so don't flame me, Mac users!]

PhysX, on the other hand, doesn't dominate anything. It's just for extra gee-whiz physics effects in a limited number of games. ATI video cards are perfectly capable of rendering all versions of Windows and both current and future DX11 games AS IS.

It still makes no sense for ATI to pay a license for an 'extra' feature and pay money to their direct competitor because PhysX IS NOT NECESSARY for basic function or compatibility, unlike the case with Windows + x86 processors. To argue otherwise does not compute from a logical or business point of view.

And this still does NOTHING to change the fact that Nvidia acted in an underhanded manner by shafting any of its consumers who DARED to put any video card into their machine besides an Nvidia GPU. "How dare they buy another product; we'll show them, we'll DISABLE functions in the previously completely working product that they bought from us!" Petty? Yes. Consumer friendly? No. Fair business practice? Absolutely not.
 
Am I the only one who finds it EXTREMELY hilarious that first the "red team" declares PhysX "a useless gimmick"... only to suddenly whine and cry that they can't use it? Keep 'em coming... this is comedy gold! :D
 
Am I the only one who finds it EXTREMELY hilarious that first the "red team" declares PhysX "a useless gimmick"... only to suddenly whine and cry that they can't use it? Keep 'em coming... this is comedy gold! :D

Most of these people you are calling the "red team" are just consumers who happen to have both cards and want to have PhysX. Stop being a stupid fanboy.
 
Actually, it's people who already OWN Nvidia cards who are complaining that they can't use the full functionality of the video cards they have already bought. This so-called 'red team/green team' thing is just made-up fanboi stuff.

What matters, bottom line, is that when people PAY money for a product, they don't want the product's features disabled just because they ALSO choose to buy other products. That would be like Ford building a limiter into your engine that didn't allow you to go into 5th gear if it detected that you also had a Toyota parked in your garage. Ford would never get away with doing that without being torched publicly, and neither should Nvidia.
 
Most of these people you are calling the "red team" are just consumers who happen to have both cards and want to have PhysX. Stop being a stupid fanboy.

So now PhysX suddenly matters?
What happened? :p
 
So now PhysX suddenly matters?
What happened? :p
So because people don't worship at an NVIDIA shrine every morning, they're not worthy of using its technologies? :rolleyes: Give me a break. If you bought the product, you should be able to use it for its lifetime, along with all the features advertised for it. I wonder if a class-action suit will come out of this.
 
So because people don't worship at an NVIDIA shrine every morning, they're not worthy of using its technologies? :rolleyes: Give me a break. If you bought the product, you should be able to use it for its lifetime, along with all the features advertised for it. I wonder if a class-action suit will come out of this.


Nice try :)
But seriously, what happened?
http://www.hardforum.com/showthread.php?t=1450843&highlight=physx&page=2
There is nothing in that demo that is sweet, never mind awe-inspiring. Almost all of those effects can be (and have been) done in other games without the use of PhysX. This is just like the Arkham Asylum PhysX trailers: nothing was at all impressive about the hardware physics. They just ended up stripping out these "unique" effects instead of rendering them another way, to give the illusion that PhysX added "so much more." Back to Darkest of Days: the game's graphics look like crap (worse than a present-day console port), and the fact that the game appears to be shit (reading IGN's review) just points to the fact that NVIDIA is releasing another tech demo with the facade that it is a "game." Whoever is in their marketing department needs to be fired ASAP; people are not this stupid, unless NVIDIA really is this desperate. What I'm curious to see is how deep NVIDIA's pockets are; how long can they keep this exclusive stuff up? Furthermore, if I extrapolate, why are they trying so hard? Have the 58xx cards really gotten them that nervous, and are they that desperate?

Change of heart?
 
If you buy a car that has a 6-speaker stereo, and you buy an in-dash CD player made by a competing brand, should the stereo default to 2-speaker mode to punish you? I'm surprised no one has already filed a lawsuit; despite all the (albeit justified) whining and nerdraging, no one wants to put their money where their mouth is, apparently.

Nowhere does Nvidia advertise its products as "PhysX capable, but not if used with a competing GPU." And even if it did, it's still not clear how that is at all legal. There is honestly no correct historical analogy here, where a product disables one of its own features if it detects that you also have a product from another company.
 
Atech, for the 5 millionth time, THIS IS NOT an ATI vs. Nvidia issue. This is strictly about the way Nvidia is acting towards their customers. My gaming machine uses an Nvidia 9800 GX2, and this whole issue has me kind of ticked off. I'm looking to upgrade, and it would be nice if the card that I paid $600 for would still be able to give me a little bit of use. Just because I'm looking at buying the best card out right now doesn't mean I should be punished because it doesn't use an Nvidia GPU. Granted, I'm not a big fan of PhysX; it's still very much an afterthought for me. But if Nvidia ever wanted to change my mind about that, they sure are going about it the completely wrong way.

It's no different than if Nvidia had arbitrarily put out a driver update that forced my current card to display every screen font in Comic Sans or something, just because they don't like the fact that I have a Western Digital hard drive. It was a stupid, unwarranted, and downright dickish move on their part.
 
Unfortunately, that argument becomes quite flimsy when you're able to run a simple patch and, voila, you have ATI doing the rendering and your Nvidia card of choice doing the PhysX. I just don't buy it. If I patched PhysX and it ran with bugs or weird performance issues, then yes, I would agree with your statement, but that isn't the case.

Really? EVERY game that supports PhysX works FLAWLESSLY with this patch? Not saying that it doesn't, but without proof I'd be inclined to think that isn't the case, since that's not even true with an Nvidia-only setup.
 
If you buy a car that has a 6-speaker stereo, and you buy an in-dash CD player made by a competing brand, should the stereo default to 2-speaker mode to punish you? I'm surprised no one has already filed a lawsuit; despite all the (albeit justified) whining and nerdraging, no one wants to put their money where their mouth is, apparently.

Nowhere does Nvidia advertise its products as "PhysX capable, but not if used with a competing GPU." And even if it did, it's still not clear how that is at all legal. There is honestly no correct historical analogy here, where a product disables one of its own features if it detects that you also have a product from another company.

If it can be proven that this is TOTALLY artificial you might have a point. But I really doubt that's the case.
 