So there will be no GPU physics on the R800 series?
oops GPU physics?
here is the problem
nVidia card -> PhysX enabled -> game initializes -> game lags out -> ignore the lag -> ALL hail the PhysX!!!
This is not 2006 anymore...not even 2007:
http://www.anandtech.com/video/showdoc.aspx?i=3171&p=4
Do you ever come up with a valid claim backed by reality?
Like I stated before:
AMD GPU users will start praising GPU physics... as long as it isn't PhysX.
So there will be no GPU physics on the R800 series?
The thing about GPU physics is that currently it's not a compelling enough reason to choose one brand over another. Physics processing support needs to become universal across most PC gaming platforms. If we ever want to see a majority of new titles take advantage of the technology, customers will have to have some reason to care more about the ability of their new card to handle physics processing. The best way for that to happen is for a company like Microsoft to step in and develop an open standard for physics acceleration, in the same vein as Direct3D or OpenGL.
On the other hand, ATI has DX11. If Nvidia releases a DX11 part by December, I'd go with them. Otherwise, it's an impossible decision between DX11 eye candy and PhysX eye candy, and I don't know what I'd end up with.
No offense, but that basically sounds like: "If nVidia releases a DX11 part by December, I'd go with them. Otherwise, I'll probably go with them anyway."
Personally, I think Havok will see some very interesting changes when Intel finally begins showing off Larrabee. Remember, Havok was working on a GPU-based implementation around the time Intel bought them up. With AMD also working with Intel in support of Havok, we could very well see some form of Havok FX re-appear, supporting AMD and Intel GPUs.
That's why PhysX barely holds my interest currently - I'd rather just go with the top performance card in general, until such issues as physics rendering have been determined. And if it does in fact some day come down to nvidia with PhysX vs. Intel/AMD with Havok, I think I'll put my money on the big blue.
But since Intel is going with an x86 GPU, want to bet that Havok is going to be tailored towards that?
Havok can support more than just one tailored solution... it can be optimized for both Larrabee (with its P54C CPUs) and AMD's GPUs. Havok FX was meant to support (at the time) both GeForce 6/7 and the X1000+ Radeons, and while obviously a GPU is different from a stripped-down CPU, I have no doubt they could get it to work.
It seems like people tend to view AMD and Intel as having a lot of animosity between each other, and while they are competitors in the CPU market and, to a lesser extent currently, the GPU market (with that increasing once Larrabee is released), Intel has a lot to gain by having AMD use Havok. They get licensing fees, a better chance that developers will use it (why support an API that only represents one of the three companies selling discrete cards, when you can use a physics solution supported by the majority, one of whom is Intel), etc.
That's one of the reasons why Havok is so appealing - it's so open, whereas PhysX is controlled by wonderfully-draconian nvidia, lol.
Havok isn't "open"; it's controlled by Intel.
Do you really think that if Intel didn't want AMD to take full advantage of Havok, they would even have let a deal between AMD and Havok go through? Like I said before, Intel has a lot to gain from AMD using Havok.
You think Intel is going to play nice with AMD?
Take a look at the CPU market... Intel doesn't play nice with AMD, and they won't here either.
I am pretty sure that there are no license fees involved with Havok except for the game companies. Unless that has changed with any proposed GPU-based physics — but Havok is not like PhysX. Correct me if I am wrong.
There might not be. I'm not aware of the specific terms of the arrangement AMD and Havok have, so yeah, it may be completely free for AMD to use.
I extrapolated some of the comparable data to see what a GTX 285 looks like next to the new cards. Remember, this is all from the leaked performance results that hit the net, not my own data.
It's only 3 letters... some people can't even place them in the right order...
I extrapolated some of the comparable data to see what a GTX 285 looks like next to the new cards. Remember, this is all from the leaked performance results that hit the net, not my own data.
I don't see how this chart can be legit for Crysis...
41 fps on GTX 285? I think something is smoking...
I don't see how this chart can be legit for Crysis...
41 fps on GTX 285? I think something is smoking...
Crysis has always favored NVIDIA GPUs.
I had a GTX 295; the FPS never went that high on max settings, even with 0xAA.
It's probably the Medium setting that needs to be mentioned.
PS: Warhead seems to favor nVidia a bit, while the original Crysis favors ATI...
Ah, that's what you meant. Well, they're only playing at 1680x1050, and the numbers are probably from the GPU benchmark, not actual in-game play (that would be my guess, anyway).
Betting on IBM?
Think of it like WW2: Havok is the USA and PhysX is Germany. AMD represents "the third guy," Russia, who can tip the outcome of the war.
I said it before: Intel has a vested interest in using AMD to help knock NV off the #1 spot first. Only after that should they go USA vs. USSR on each other.
Yeah, I realized after I posted it that it could mean IBM also. Eh, you never know, IBM is always cooking up stuff in their labs...
Well, in that case, I'll think of it like US vs. Germany in Day of Defeat: roughly similar weapons/performance between the two.
In all honesty, I could see Intel one day trying to more closely integrate Havok with its own discrete graphics cards and leaving AMD the "odd man out," but I don't think that would happen for quite some time, because first, as you said, Intel wants AMD to assist them in knocking nvidia down some. Even then, now that Havok is possibly being provided for free, AMD could make their own enhancements to benefit from it. It'll be interesting to see what ultimately happens with the AMD/Intel Havok partnership. I just think it's kind of stupid for nvidia fans to already declare that PhysX is the next big thing.
Well, it is:
http://www.hardforum.com/showthread.php?t=1451856
It's also the only API currently running on a PPU, a CPU, a Cell SPE, and a GPU...
Achievement Unlocked: Ride on the PhysX Fail Train
Achievement Unlocked: Ride on the PhysX Fail Train
lol, dude, what are you trying to say? The only thing I get from that link is that the PhysX API is now used more than the Havok API by well-established developers on all major platforms. If you hate PhysX that much, I suggest you boycott the middleware by not playing games that use it, including new and upcoming titles like Batman: AA, Mass Effect 2, Mafia 2, Dragon Age: Origins, NFS Shift, Shattered Horizon, Dark Void, etc. If you don't care for GPU PhysX, that is a different story, but that won't change the fact that the PhysX middleware itself is used in a lot of games.