GeForce 8 cards to get PhysX support shortly

You really think they are going to let you SLI an 8800 GT or GTX with an 8400 GS?


It's just another gimmick to get you to SLI...just like Crysis is a gimmick to get people to SLI and buy quad cores.


phys-x never increased framerates by much anyway....it's all a scam.


FAIL. :rolleyes:

Parallel processing is a much better option for physics processing than the serial processing done by current x86 CPUs.

These kinds of "gimmicks" are for games to push the limits and appear more how we want them to be. If you don't like them then don't buy them and stop complaining. No one's forcing you or anyone else to purchase anything.
 
Actually, SLI and Crossfire are not inconsistent... developers are inconsistent in how well they design their games to take advantage of more than one GPU. So it really comes down to them, as NVIDIA and ATI cannot recode and run their game on the fly for them :)

SLI is the entire package of hardware, drivers, and game support. The cause of the inconsistency is irrelevant to the customer.

 
Why does everyone feel that writing games to support PhysX in GeForce8's won't happen?

Games have support for plenty of optional components or capabilities already.

EAX 2,3,4,5 is a hardware option, and is enabled or disabled in the game. It plays fine without it, but better with it. The game writers have written the game to run with or without it.

IIRC, Doom was written to use software rendering, 3dfx or OpenGL. That sounds kind of like writing critical game components differently to support optional hardware - no?

I remember reading in the Bioshock tweak guide about how different hardware rendering features were enabled when low/med/high was set on various options. Don't you think that also represents coding for hardware support of optional features?

If pretty much everyone with nVidia 8 hardware has PhysX when the dust settles, that's probably worth writing optional code for in future games, or upcoming patches of current games. I don't see why we wouldn't have a "PhysX support - YES/NO" or "Physics - software/PhysX/??" option.

Hopefully the drivers or the game software will give us some knobs to tweak to control how many resources we want to make available for physics at the expense of other rendering.

On, say, HL2 or COD4 I bet my G92 GTS has some cycles to spare at 1600x1200 so I may allow significant resources for physics instead of cranking AA/AF to the max. On Crysis, however...

Just to comment on OpenGL and Doom.
OpenGL ran on everything; it was a programming API. You didn't have to have a special accelerator for it to work. 3dfx started dedicated OpenGL accelerator production and managed to sell quite well. Remember that OpenGL is "similar" to DirectX, nothing more, nothing less.
Here it is more like "do we implement something that we can't be sure anyone actually will be able to use? What is the added cost/benefit for us?"

I'm not saying that a discrete PPU or using the GPU is a bad idea, it might be good, but no one is going to make something that alters the game significantly (sure, eye candy, but not the "real" physics most of us would like) since the vast majority won't be able to take advantage of it. The software companies do not make games for "us" here, they make them for the "average Joe" with his basic Dell computer, and that computer usually has more CPU time free than GPU time.
 
I am pretty sure that an nVidia GPU fabbed at .065u running CUDA will be able to perform in the same ballpark as a PhysX PPU fabbed at .13u.

Yes because the manufacturing process is the perfect indicator of speed, right? Oh wait, no it isn't :rolleyes:

The Phys-X will destroy an onboard GPU and is most likely significantly faster than just using a part of the 8800's GPU. Fortunately for nVidia and 8xxx owners, no game has really stressed the Phys-X card, so it doesn't really matter how much faster a Phys-X card is compared to nVidia's solution :D
 
Yes because the manufacturing process is the perfect indicator of speed, right? Oh wait, no it isn't :rolleyes:

The Phys-X will destroy an onboard GPU and is most likely significantly faster than just using a part of the 8800's GPU. Fortunately for nVidia and 8xxx owners, no game has really stressed the Phys-X card, so it doesn't really matter how much faster a Phys-X card is compared to nVidia's solution :D

Manufacturing process is an excellent indicator of potential transistor budget and potential transistor speed.

The PhysX chip has 125m transistors fabbed on a .13u process, much the same as an nVidia NV30 GPU, so I'd be willing to guess that the clock speed is very similar, i.e. somewhere between 400MHz and 500MHz. While the process may improve over time, remember that the PPU was half the starting price of the 5800, so costs have been cut and the clocks won't be much better, if at all.

The Geforce 8200/9200 is probably very similar to an 8400GS which has 210m transistors and is clocked at 450MHz and 900MHz for the shaders (the important part), but only fabbed at .08u. This includes 16 unified shader units that probably occupy half the available die (i.e. near enough 125m).

If we presume that 8200/9200 clock speeds will bear greater similarity to the 9500GT, which runs at 650MHz and 1625MHz, due to them both sharing a similar fab process, then it is not unreasonable to split the difference and say that the 8200 (the lowest end) will have a core clock of 550MHz and a shader clock of 1200MHz.

Is it totally unreasonable to expect 16 unified shader units belting along at 1200MHz to perform in the same ballpark as an equivalent amount of dedicated PhysX silicon at a mere 400-500MHz?

I don't think it is.
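To put rough numbers on that "same ballpark" claim, here is a trivial back-of-the-envelope sketch. The per-clock throughput (one multiply-add per unit per cycle) and the equal unit counts are pure assumptions for illustration, not published specs for either chip:

```cpp
// Back-of-the-envelope comparison, NOT a benchmark.
// Assumptions: one multiply-add (2 flops) per unit per clock, and an
// equal number of "units" on the PPU side - both are guesses.
#include <cstdio>

int main() {
    const double flops_per_clock = 2.0;   // assumed MAD per unit per cycle

    const double gpu_units = 16;          // 8200-class unified shaders (guess above)
    const double gpu_clock = 1.2e9;       // 1200MHz shader clock (guess above)

    const double ppu_units = 16;          // assume an equivalent amount of PPU silicon
    const double ppu_clock = 0.45e9;      // ~400-500MHz, split the difference

    std::printf("GPU shaders: %.1f GFLOPS\n", gpu_units * gpu_clock * flops_per_clock / 1e9);
    std::printf("PPU silicon: %.1f GFLOPS\n", ppu_units * ppu_clock * flops_per_clock / 1e9);
    return 0;
}
```

On those (very hand-wavy) assumptions the 16 shaders at 1200MHz come out well ahead, which is the point being argued.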
 
With a simple software download I get PhysX on my dualies! Good job on the part of nVidia, awesome.
 
Manufacturing process is an excellent indicator of potential transistor budget and potential transistor speed.

The PhysX chip has 125m transistors fabbed on a .13u process, much the same as an nVidia NV30 GPU, so I'd be willing to guess that the clock speed is very similar, i.e. somewhere between 400MHz and 500MHz.

The Geforce 8200/9200 is probably very similar to an 8400GS which has 210m transistors and is clocked at 450MHz and 900MHz for the shaders (the important part), but only fabbed at .08u. This includes 16 unified shader units that probably occupy half the available die (i.e. near enough 125m).

If we presume that 8200/9200 clock speeds will bear greater similarity to the 9500GT, which runs at 650MHz and 1625MHz, due to them both sharing a similar fab process, then it is not unreasonable to split the difference and say that the 8200 (the lowest end) will have a core clock of 550MHz and a shader clock of 1200MHz.

Is it totally unreasonable to expect 16 unified shader units belting along at 1200MHz to perform in the same ballpark as an equivalent amount of dedicated PhysX silicon at a mere 400-500MHz?

I don't think it is.

You know how the tech world works... when something is new, you will always have those people convinced it will suck or not work, simply because they enjoy being naysayers.

You make a convincing argument.
 
Just to comment on OpenGL and Doom.
OpenGL ran on everything; it was a programming API. You didn't have to have a special accelerator for it to work. 3dfx started dedicated OpenGL accelerator production and managed to sell quite well. Remember that OpenGL is "similar" to DirectX, nothing more, nothing less.
Here it is more like "do we implement something that we can't be sure anyone actually will be able to use? What is the added cost/benefit for us?"

I'm not saying that a discrete PPU or using the GPU is a bad idea, it might be good, but no one is going to make something that alters the game significantly (sure, eye candy, but not the "real" physics most of us would like) since the vast majority won't be able to take advantage of it. The software companies do not make games for "us" here, they make them for the "average Joe" with his basic Dell computer, and that computer usually has more CPU time free than GPU time.

Example: New game - super shooters

Super Shooters has lots of physics; it's been designed to have the physics processed by the CPU, or, if you have PhysX support, processed by the GPU.

The game has multiple levels of physics quality: Low, Medium, High, Ultra High. Assuming PhysX is as good as they say it is, you might have a medium system that only lets you run at Medium settings, but if you buy an nVidia card instead you suddenly can run physics at Ultra High settings instead of Medium.

That's the ideal case, although I'm not sure how realistic it is.
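For what it's worth, here's a minimal sketch of how the "Super Shooters" idea might look in code. Every name in it is hypothetical (no real engine or SDK is being quoted); it's only meant to show the physics detail level being capped by whatever the machine can actually accelerate:

```cpp
// Hypothetical sketch only - none of these names come from a real SDK.
// Idea: cap the physics detail level by what the machine can accelerate.
#include <iostream>

enum class PhysicsDetail { Low, Medium, High, UltraHigh };

// Stand-ins for queries the engine would make against its physics runtime.
bool GpuPhysicsAvailable() { return true; }   // pretend an 8-series card + driver is present
int  CpuCoreCount()        { return 2; }      // pretend "average Joe" dual core

PhysicsDetail PickPhysicsDetail() {
    if (GpuPhysicsAvailable()) return PhysicsDetail::UltraHigh; // offload to the GPU
    if (CpuCoreCount() >= 4)   return PhysicsDetail::High;      // spare CPU cores
    if (CpuCoreCount() >= 2)   return PhysicsDetail::Medium;
    return PhysicsDetail::Low;
}

int main() {
    const char* names[] = { "Low", "Medium", "High", "Ultra High" };
    std::cout << "Physics detail: "
              << names[static_cast<int>(PickPhysicsDetail())] << "\n";
}
```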

Now also for Crysis: what if recoding all the physics under the PhysX engine were possible? I see that being extremely difficult to accomplish (maybe impossible), but assuming it was for some reason really easy and came as a patch, much like SLI support did, I could see PhysX being very useful. Licensing issues with Havok would probably cause problems as well.

If Havok was smart and wanted to completely shut down PhysX, they could probably make their system very compatible with GPU processors.
 
For all the insulting you're doing you seem to be pretty clueless yourself.

The point of physics technologies is not to improve framerates... it's not a faster CPU or GPU. The point is to add to the IMMERSION/experience of the game, by adding an element that hasn't really been there up until now, without necessarily DROPPING your framerate.

If you could take your system and add massive physics calculation power without dropping graphics performance you've only improved your game and opened up tons of new gameplay possibilities.


And that technology is so wonderful and popular that everyone had a phys-x card, right?

Everyone spent that 300 dollars for those sparse extra effects in GRAW, right?

It was a nice idea....but it never worked for practical purposes. Now nvidia is using the name to sell more video cards.

If I took a pint of chocolate ice cream, and put it in freezers at local 7-11s, I couldn't get 99 cents for it.


If I take the same pint of ice cream, put "Ben and Jerry's" on the label, they'll sell for 5 bucks a pop.


It's a scam I say..... a scam.
 
WTF? If I ALREADY HAVE an 8xxx series card and I download a driver that allows PhysX on it, how is that a scam?
 
And that technology is so wonderful and popular that everyone had a phys-x card, right?

Everyone spent that 300 dollars for those sparse extra effects in GRAW, right?

It was a nice idea....but it never worked for practical purposes. Now nvidia is using the name to sell more video cards.

If I took a pint of chocolate ice cream, and put it in freezers at local 7-11s, I couldn't get 99 cents for it.


If I take the same pint of ice cream, put "Ben and Jerry's" on the label, they'll sell for 5 bucks a pop.


It's a scam I say..... a scam.

Blimey, the ramblings of a closed mind eh!
PhysX previously was too expensive for the return the consumer got.
That will undoubtedly change with NVidia holding the reins.

You don't need to tell us the bleeding obvious either; of course NVidia bought up AGEIA to sell more cards, they didn't buy it to sell less.
 
I don't question that a GPU is more efficient than a CPU when doing physics.

But the thing that seems to be happening when playing games is that you are already using 100% of the GPU - why burden it with additional calculations?

If you buy a quad- or octo-core CPU and start playing games on it, I'd be willing to say you have a good 2 to 6 cores that are just sitting there twiddling their collective thumbs, while the GPU is stressed to the max.
 
Did I miss something? Everyone's complaining about SLI, but it looked like I didn't need SLI to use PhysX :confused:
 
I think a lot of you are a bit confused about this.

NVIDIA is not putting physics processing units in their card (not yet anyway). PhysX is NOT the PPU. PhysX is the physics SOFTWARE, just like Havok, for example. Many games already use PhysX for their physics so now we'll simply be able to process those physics using the video card instead of only the CPU. This can only help, since GPUs are faster than CPUs.

Now obviously the problem still lies with developers, not with the software, and I see that every time I look at my quad-core CPU usage and see it at 30-50% at best. We have a TON of processing power to spare, but most games and apps simply don't use it. They're just very inefficient, and I wish developers would spend more time learning to use that power instead of just waiting for faster hardware... that they can use in a half-assed way.
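To illustrate the point that PhysX is software and the processor it runs on is a deployment detail, here is a hedged sketch. These are NOT the real PhysX SDK calls, just hypothetical stand-ins showing why the game code itself doesn't have to change when the physics step moves from the CPU to the GPU:

```cpp
// Hypothetical illustration only - these are not real PhysX SDK calls.
// The game talks to an abstract physics backend; where the simulation
// actually runs is decided once, at startup, by the driver/runtime.
#include <iostream>
#include <memory>

struct PhysicsBackend {
    virtual ~PhysicsBackend() = default;
    virtual void Simulate(float dt) = 0;   // advance the simulation by dt seconds
};

struct CpuBackend : PhysicsBackend {
    void Simulate(float dt) override { std::cout << "step " << dt << "s on the CPU\n"; }
};

struct GpuBackend : PhysicsBackend {
    void Simulate(float dt) override { std::cout << "step " << dt << "s on the GPU\n"; }
};

std::unique_ptr<PhysicsBackend> CreateBackend(bool gpuDriverPresent) {
    if (gpuDriverPresent) return std::make_unique<GpuBackend>();
    return std::make_unique<CpuBackend>();   // fall back to the CPU path
}

int main() {
    auto physics = CreateBackend(/*gpuDriverPresent=*/true);
    physics->Simulate(1.0f / 60.0f);   // identical game-side call either way
}
```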
 
I think a lot of you are a bit confused about this.

NVIDIA is not putting physics processing units in their card (not yet anyway). PhysX is NOT the PPU. PhysX is the physics SOFTWARE, just like Havok, for example. Many games already use PhysX for their physics so now we'll simply be able to process those physics using the video card instead of only the CPU. This can only help, since GPUs are faster than CPUs.

Now obviously the problem still lies with developers, not with the software, and I see that every time I look at my quad-core CPU usage and see it at 30-50% at best. We have a TON of processing power to spare, but most games and apps simply don't use it. They're just very inefficient, and I wish developers would spend more time learning to use that power instead of just waiting for faster hardware... that they can use in a half-assed way.

That new game you just got started development at least two years ago, when even dual core wasn't the norm. I agree we have processing power to spare, but it takes a while for the developers to catch up. That doesn't happen overnight, but it will.
 
Blimey, the ramblings of a closed mind eh!
PhysX previously was too expensive for the return the consumer got.
That will undoubtedly change with NVidia holding the reins.

You don't need to tell us the bleeding obvious either; of course NVidia bought up AGEIA to sell more cards, they didn't buy it to sell less.


They are taking chocolate flavored 8800 GTs, putting "Ben and Jerrys" on the label, getting people to buy more to SLI with...but in the end, you're still getting the same ol' 8800 GT.


You only think you're getting something better. It's bollocks, mate. Bloody, buggery bollocks.


It's amazing how you all cream your panties over this.
 
If Havok was smart and wanted to completely shut down PhysX, they could probably make their system very compatible with GPU processors.

They did, it was called Havok FX (until Havok was bought and that project was scrapped) - nVidia was (surprise surprise) a close partner with Havok in developing GPU accelerated physics: http://techreport.com/discussions.x/9610

Seems nvidia just took the code they already had (from Havok FX) and slapped a new API on it :D
 
They are taking chocolate flavored 8800 GTs, putting "Ben and Jerrys" on the label, getting people to buy more to SLI with...but in the end, you're still getting the same ol' 8800 GT.

While obviously NVIDIA hopes people will buy a card especially for PhysX, it didn't say anywhere that you'd need to. Look at the 8800 GT and especially the 8800 GTS 512MB. One of the complaints is that they don't have enough memory bandwidth to saturate all the shaders... hmm, what to do with the unused stream processors. ;)

Keep in mind that NVIDIA doesn't just want to sell additional GeForce 8-series cards to those who already own one...they want people to upgrade from previous generations, and those who would instead buy an AMD card. So there are solid business reasons to enable it in all cards, not just secondary cards (which very few people are going to buy, just as very few people bought standalone PhysX cards).
 
Yaa, because I was so wanting to buy an Ageia PhysX card already, but instead, I can buy a video card on top of the one I already own. :rolleyes:

It's funny how just yesterday, only a fool would buy a PHYSX card, but today, it's perfectly sane to buy an additional video card to run PHYSX on.

Also, yesterday, Ageia was a company whose product held no value, PHYSX was a joke... but today, this is going to put ATI in the dust.

Get real... PHYSX won't do squat. Never has, never will.
 
They are taking chocolate flavored 8800 GTs, putting "Ben and Jerrys" on the label, getting people to buy more to SLI with...but in the end, you're still getting the same ol' 8800 GT.


You only think you're getting something better. It's bollocks, mate. Bloody, buggery bollocks.


It's amazing how you all cream your panties over this.

It's possible through careful wording to make anything sound bad.
That seems to be your forte.
And why the bad attitude? You come across as simply an angry person.

It's a shame for you that you can't see the potential; perhaps you should leave that to people who understand the situation with the clarity it needs.
 
It's possible through careful wording to make anything sound bad.
That seems to be your forte.
And why the bad attitude? You come across as simply an angry person.

It's a shame for you that you can't see the potential; perhaps you should leave that to people who understand the situation with the clarity it needs.


I am an angry person.


I always have a bad attitude when I'm being bullshitted.


I would love for phys-x, Crysis in SLI, Crossfire, multi-core etc. to work as they were advertised and hyped. But these companies oftentimes pitch a load of rubbish to you for sales.

You'd be a fool not to be skeptical. Show me the real, tangible, reasonably affordable benefits of this stuff and my attitude towards it will do a 180.

As for now, what I see is what I saw before Crysis came out:

"Just wait....a new driver....a new patch and Crysis will run twice as fast! It's not optimized yet....once the game comes out, it will be soooo much better."


And to this day, Crysis with all the drivers and patches doesn't run any better than the demo that came out before the game.


In the meantime, all the people who bought quad-cores and sli's and crossfire systems just for Crysis got scammed.


Even tri-sli barely runs it well on high settings.


So why do I care about phys-x? Why do I read these message boards? Because I wanna buy this stuff just like you guys...but I want to know fact from fiction first.
 
A little news for you that might help.
Nearly all companies BS to keep up with the BS that the competition gives.
We read through the BS and see the real picture, but we don't get angry about that.
We get pissed if we are being stiffed, but that isn't the case here; in fact this is an extra for almost free.
The people who should be pissed are those that adopted AGEIA's card early and didn't get good use from it.

Leave the thinking on this to those that can look at the situation objectively to work out what it will become in the future.
No one is forcing you to buy any new hardware, and there is no implication that you will need to use PhysX support if you don't like it.

You have over-sensationalised PhysX use in a negative light.
Rather than trying to predict a very unlikely grim outcome, you'll be better off waiting to see what actually happens.
If you don't like it then, fair enough, it's something you can use to get mad again :)


Rather than getting mad about games not making use of quad cores, think of the uses quad cores can be put to.
If you need one, get one; if you don't, then don't.
Its easy to find out if one will be useful to you as you can ask here and we will tell you.
No secrets really.

You don't have to have hardware on the bleeding edge; that is your choice.
If you can't handle it though, it's better to stay a few months behind.
 
Bound to get a lot of pessimism when Physics acceleration isn't very tangible. The difference between software and hardware accelerated graphics was huge visually, but the same can't be said about physics.

Hardware acceleration may just give us more of what is already being done in games like HL2. Not much wow factor unless you are going to use tons of physics effects, but adding physics just for the point of having lots of it isn't as impressive as amazing visuals.

So what if I can throw around 2 billion boxes with no slowdown? Amazing physics is already being demoed in "The Force Unleashed" on a machine that doesn't have dedicated physics hardware. So why do I need it?

IMHO, impressive physics is going to come from impressive algorithms, not a driver or hardware.
 
Yaa, because I was so wanting to buy an Ageia PhysX card already, but instead, I can buy a video card on top of the one I already own. :rolleyes:

It's funny how just yesterday, only a fool would buy a PHYSX card, but today, it's perfectly sane to buy an additional video card to run PHYSX on.

Also, yesterday, Ageia was a company whose product held no value, PHYSX was a joke... but today, this is going to put ATI in the dust.

Get real... PHYSX won't do squat. Never has, never will.

I still wouldn't buy a graphics card for dedicated physics; it's still a waste of space and resources in my opinion. But if a portion of my GPU can be used for physics (at the cost of some overall graphics power) then I'm in for that. The question now is how well a GPU can do physics: will we need a whole separate GPU for the same physics processing power as the PhysX cards, or do we need only a portion of the gargantuan G80 cores (and the rest of the revisions) for the same power? Efficiency is what I'm worried about.
 
A little news for you that might help.
Nearly all companies BS to keep up with the BS that the competition gives.
We read through the BS and see the real picture, but we don't get angry about that.
We get pissed if we are being stiffed, but that isn't the case here; in fact this is an extra for almost free.
The people who should be pissed are those that adopted AGEIA's card early and didn't get good use from it.

Leave the thinking on this to those that can look at the situation objectively to work out what it will become in the future.
No one is forcing you to buy any new hardware, and there is no implication that you will need to use PhysX support if you don't like it.

You have over-sensationalised PhysX use in a negative light.
Rather than trying to predict a very unlikely grim outcome, you'll be better off waiting to see what actually happens.
If you don't like it then, fair enough, it's something you can use to get mad again :)


Rather than getting mad about games not making use of quad cores, think of the uses quad cores can be put to.
If you need one, get one; if you don't, then don't.
Its easy to find out if one will be useful to you as you can ask here and we will tell you.
No secrets really.

You don't have to have hardware on the bleeding edge; that is your choice.
If you can't handle it though, it's better to stay a few months behind.


Oh I can handle it, baby. As long as it works as advertised.


I'd never ask any of you for advice about computer stuff. I read these forums to keep up to date about what's coming out, prices, etc. I bought my horrible, power-consuming, planet-killing, noise-polluting, 60-percent-slower-than-an-8800-GT Sapphire HD 2900 Pro 512-bit because of these forums.

I took the majority of what most of you had to say about it, considered it, studied it, concluded that it was all bullshit, then bought the best bang-for-the-buck video card I've ever seen.

It's funny... in the months that I've had it, it's no louder than an HD 3870, runs cooler than a stock 8800 GT, my electric bills haven't gone up, and it's just as fast in my games as the five 8800 GTs I've owned.
Oh, and surprisingly, the planet Earth is still here, too. Al Gore is just gonna have to wait a little while longer...

I just hope it doesn't mysteriously break (because you guys think computer hardware breaks all the time without any overclocking or abuse)....according to these forums, Sapphire is Satan Incarnate and doesn't honor their warranty.
 
Oh I can handle it, baby. As long as it works as advertised.


I'd never ask any of you for advice about computer stuff. I read these forums to keep up to date about what's coming out, prices, etc. I bought my horrible, power-consuming, planet-killing, noise-polluting, 60-percent-slower-than-an-8800-GT Sapphire HD 2900 Pro 512-bit because of these forums.

I took the majority of what most of you had to say about it, considered it, studied it, concluded that it was all bullshit, then bought the best bang-for-the-buck video card I've ever seen.

It's funny... in the months that I've had it, it's no louder than an HD 3870, my electric bills haven't gone up, and it's just as fast in my games as the five 8800 GTs I've owned.
Oh, and surprisingly, the planet Earth is still here, too. Al Gore is just gonna have to wait a little while longer...

I just hope it doesn't mysteriously break (because you guys think computer hardware breaks all the time without any overclocking or abuse)....because according to these forums, Sapphire is Satan incarnate and doesn't honor their warranty.

I buy my hardware to do a job I need it to do; I don't go by the advertisements.
If you want to believe those you are going to have a hard time.

lol, I think this is the wrong forum for you.
No matter how good a forum is, it's not going to be good enough for you by the looks of it.
Sorry we can't babysit you better.
Oh, and remember the world isn't perfect, but we all manage it OK.
 
I buy my hardware to do a job I need it to do; I don't go by the advertisements.
If you want to believe those you are going to have a hard time.

lol, I think this is the wrong forum for you.
No matter how good a forum is, it's not going to be good enough for you by the looks of it.
Sorry we can't babysit you better.
Oh, and remember the world isn't perfect, but we all manage it OK.


I love this forum. I am thoroughly entertained and I often patronize the sponsors of it.

I also often patronize other members.

So you see, I am a model citizen of [H]ard Forum.
 
I love this forum. I am thoroughly entertained and I often patronize the sponsors of it.

I also often patronize other members.

So you see, I am a model citizen of [H]ard Forum.

Not really, you came here all guns blazing and mad at everyone for something that isn't our fault.
It has taken quite a lot of posts to get you to respond in a reasonable fashion.
I'm happy to leave this if you are, I have a lot more to offer if not :)
 
So what if I can throw around 2 billion boxes with no slowdown? Amazing physics is already being demoed in "The Force Unleashed" on a machine that doesn't have dedicated physics hardware. So why do I need it?
Don't mention it's partly Havok physics in the PPU forum or they might attack you. :p

---

The more I think about it today, the less great this idea seems. For all its other faults, the discrete Ageia PPU was a one-size-fits-all solution, and developer support (no matter how lazy) for the hardware targeted a fixed-speed device. With CUDA PhysX, the speed and even the presence are variables, leading back to the same situation as before. If nVidia makes a low-cost headless GPU into a physics processor, the potential for growth is still good. If it is relying on just G8x and later cards to make developers support GPU physics, it's doomed... until compute shaders come to the rescue eventually as a generalized and universal solution.
 
Not really, you came here all guns blazing and mad at everyone for something that isn't our fault.
It has taken quite a lot of posts to get you to respond in a reasonable fashion.
I'm happy to leave this if you are, I have a lot more to offer if not :)


So what are you saying? It's not your fault that phys-x is a load of bull? Or that it's not your fault that you are buying into this? It's not your fault that companies fabricate things to get you to buy?


I've been reasonable the whole time. If anything, it's people who drool over this kinda stuff who aren't reasonable.


I could see it if phys-x were a proven technology, but no one used it anyway because it sucked. If it was any good at all, all the people who could afford one would have it.

All of a sudden, because 8-series owners think they are somehow getting something extra for free... it's now some monumental jump forward for the future of computer gaming.

"ATi is going out of business, Al-Qaeda declares peace on the western world and Jesus is returning to rebuild the Temple. Shiny happy people rejoice!"


Even the slightest hint of some new computer tech sends people's common sense right out the window.
 
So what are you saying? It's not your fault that phys-x is a load of bull? Or that it's not your fault that you are buying into this? It's not your fault that companies fabricate things to get you to buy?


I've been reasonable the whole time. If anything, it's people who drool over this kinda stuff who aren't reasonable.


I could see it if phys-x were a proven technology, but no one used it anyway because it sucked. If it was any good at all, all the people who could afford one would have it.

All of a sudden, because 8-series owners think they are somehow getting something extra for free... it's now some monumental jump forward for the future of computer gaming.

"ATi is going out of business, Al-Qaeda declares peace on the western world and Jesus is returning to rebuild the Temple. Shiny happy people rejoice!"


Even the slightest hint of some new computer tech sends people's common sense right out the window.

No, you are saying PhysX is bull, not me.
People didn't buy it before because hardly any games supported it, most of those that did weren't impressive, and the PhysX card cost too much.
There's nothing bad about the idea, though, but it has to present good value.
NVidia can make sure of that; AGEIA couldn't.

You make up a right load of BS for someone who is complaining that companies' BS pisses you off.
Everyone else who has posted here has shown more common sense than you by the way :)
 
There's nothing bad about the idea, though, but it has to present good value.



Everyone else who has posted here has shown more common sense than you by the way :)



And an idea is exactly what you're being sold. Just like the Killer NIC... a three hundred dollar idea. Ideas that have become famous for their excess and what they promised to give consumers.

Now those ideas are being bought, repackaged and resold to you.

You're buying into a name...a brand. But not anything of any practical value.



And you're right, I do lack common sense. I don't have sense enough to be quiet and accept the fact that many people are flat-out stupid.
 
Yeah, I get my legacy drivers from Ageia now.

Ok..
 
And an idea is exactly what you're being sold. Just like the Killer NIC... a three hundred dollar idea. Ideas that have become famous for their excess and what they promised to give consumers.

Now those ideas are being bought, repackaged and resold to you.

You're buying into a name...a brand. But not anything of any practical value.



And you're right, I do lack common sense. I don't have sense enough to be quiet and accept the fact that many people are flat-out stupid.

lol
Games have had physics for a long time, but as you may have noticed it isn't very good.
It's not a new idea; it is something that is needed in games and desperately needs more processing power.

Implementing Physics in hardware allows much greater fidelity to be achieved.
The processing requirements for physics grow very rapidly the more realistic the desired effect is (naive object-vs-object interaction checks grow roughly quadratically with object count), and standard CPU designs are not optimised to do the major legwork of physics processing.
However, graphics card stream processors are ideally suited to the task, so it makes sense to incorporate some of the processing on a graphics card.
This also eliminates much of the traffic/latency on the slower PCI-E and very much slower PCI bus when the physics hardware needs to talk to the graphics hardware, a double extra bonus.
The PhysX card was very powerful but communication across the buses killed some of its strengths.
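As a toy illustration of how fast that workload climbs, here is a quick sketch assuming a naive all-pairs interaction check (real engines use broad-phase culling, but the underlying growth is what drives the appetite for parallel hardware):

```cpp
// Quick illustration of why physics cost climbs so fast: with naive
// pairwise checks, doubling the object count roughly quadruples the work.
#include <cstdio>

long long PairCount(long long objects) {
    return objects * (objects - 1) / 2;   // every object against every other
}

int main() {
    for (long long n : {100, 200, 400, 800, 1600})
        std::printf("%5lld objects -> %10lld potential interaction pairs\n", n, PairCount(n));
    return 0;
}
```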

What NVidia are doing isn't new; it's a way of doing the same thing AGEIA started, but bringing it to the masses cheaply and with much enhanced support and performance.
 