AMD's ATI Radeon HD 5870 Video Card Review @ [H]

So wait, you mean nobody cared about physics in games before something named AGEIA? And you start off this sentence with FUD? Is that a warning or something..?

No one cared about hardware physics...prove me wrong.

And why can't they do the physics in my unused processor cycles on my unused processor cores? Brent says that DX11 will allow this to happen, and there is only one gen of cards right now that supports it... AMD's. But you want me to have to waste my precious GPU flops to run physics?

There is no physics in DX11...prove me wrong.

And what was I supposed to see in that Batman video? Batman's cape? I should waste my GPU flops so Batman's cape can flutter? How about I use my GPU to get more AA or better effects and game developers use up my completely unused CPU cycles for physics?

Not just the cape. I have a feeling that you are not really looking, so try again...take notice of the side-by-side parts:
http://www.youtube.com/watch?v=6GyKCM-Bpuw

Going to get my girlfriend, who is a non-IT person and doesn't game, to watch the video too (she hasn't seen it yet)...and ask her if she can spot the difference...as a control.

I really don't understand the reason to have physics done in my GPU.

It's really this simple:
For SIMD (the kind of math/calculation physics needs), GPU > CPU.
That you don't understand it isn't a valid argument.
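To make the SIMD point concrete, here is a toy sketch of my own (not from any game or engine in this thread): a physics step is the same arithmetic applied to every particle at once, which is exactly the data-parallel shape that wide SIMD hardware like a GPU eats for breakfast. NumPy stands in for the vector hardware here.

```python
import numpy as np

def step(pos, vel, dt=0.01, g=np.array([0.0, -9.81, 0.0])):
    """One Euler integration step for ALL particles at once.

    No per-particle branching, no loops: the same multiply-add
    hits every row. That is why this workload maps onto SIMD
    lanes (GPU shader cores) so much better than onto a handful
    of scalar CPU cores.
    """
    vel = vel + g * dt   # identical op on every particle
    pos = pos + vel * dt
    return pos, vel

# four particles starting at rest at the origin
pos = np.zeros((4, 3))
vel = np.zeros((4, 3))
pos, vel = step(pos, vel)
# after one step every particle has fallen by the same tiny amount
```

The per-particle independence is the whole argument: thousands of identical updates per frame, no data dependencies between them.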
 
Prove you wrong? I didn't know that was my job. I've got better things to do than help you correct your illogic; maybe you should try to prove yourself right.
 
Prove you wrong? I didn't know that was my job. I've got better things to do than help you correct your illogic; maybe you should try to prove yourself right.

Why?
You admitted to not understanding the basic concepts...why should I waste any more time on you?
*waves*
 
You're sounding very religious there Atech with the prove-me-wrong thing, in other words borderline psychotic. Remember, it is only tech, and tech in its early stages.

Oh, and having a layman/woman evaluate something is not a control. Again, that is very religious in nature: backing away from proper testing and putting on a show to persuade the uneducated.

Maybe take a holiday, clear your head.
 
So wait, you mean nobody cared about physics in games before something named AGEIA? And you start off this sentence with FUD? Is that a warning or something..?

And why can't they do the physics in my unused processor cycles on my unused processor cores? Brent says that DX11 will allow this to happen, and there is only one gen of cards right now that supports it... AMD's. But you want me to have to waste my precious GPU flops to run physics?

And what was I supposed to see in that Batman video? Batman's cape? I should waste my GPU flops so Batman's cape can flutter? How about I use my GPU to get more AA or better effects and game developers use up my completely unused CPU cycles for physics?

I really don't understand the reason to have physics done in my GPU.

Ghostbusters makes very good use of the CPU for physics, and the physics in that game are pretty good. The [H] even has a video showing it, which I posted earlier.

Does that mean a GPU should never be used for physics? No, of course not because it can do more. CPU physics and GPU physics will both have their place once a standard that any vendor can use is set and companies aren't paying off developers to use proprietary bullshit.

All the PhysX crusaders should be saying, "Hey, PhysX looks great, but get the standards rolling so everybody can enjoy physics on the GPU," because proprietary crap will only hinder us.
 
Directcompute doesn't have widespread support yet. But it's coming. And the 5870 fully supports it.

PS the 5870 really takes the Nvidia cards to the shed when running Directcompute effects on the wave program. Seen here.

http://www.anandtech.com/video/showdoc.aspx?i=3643&p=8

[attached graph: 20136.png]
 
I find that graph odd. That is an Nvidia demo, yet the 295 does not seem to be using SLI.
 
All the PhysX crusaders should be saying, "Hey, PhysX looks great, but get the standards rolling so everybody can enjoy physics on the GPU," because proprietary crap will only hinder us.

Quoted for truth. I wouldn't mind Physx if Nvidia hadn't made it so proprietary. The way Nvidia drivers are set up now, although I am an Nvidia card owner, I will be 'locked out' of using physx effects the moment I plug in a non-nvidia card. That is irritating. If I paid my money for the card, why can't I use it the way that I want to use it?

If Nvidia hadn't decided to 'lock out' physx support on their video cards when detecting a non-nvidia card in the system, I would have found a use for their cards as a physx dedicated accelerator. But that option is out the window now and by Nvidia's own choosing.

I find that graph odd. That is an Nvidia demo, yet the 295 does not seem to be using SLI.

It probably just means that Nvidia's Directcompute support is less than perfect on the 295. They might get it fixed in their next generation of cards, but who knows if that's even a priority for them.
 
It probably just means that Nvidia's Directcompute support is less than perfect on the 295. They might get it fixed in their next generation of cards, but who knows if that's even a priority for them.

The GT300 is going to be a MIMD part, not a SIMD part...so that is really a BIG step in adding Directcompute performance.
 
GF just watched the video...when I told her a poster couldn't see any difference, she gave me a strange look...not of the good kind :D

She said one funny thing...she would play the game first with PhysX OFF, as she had an easier time spotting enemies.
But without PhysX the world looked dull and static...she used the words "like the difference between an SD and HD movie", as she could see more details and the world looked more real.
 
GF just watched the video...when I told her a poster couldn't see any difference, she gave me a strange look...not of the good kind :D

She said one funny thing...she would play the game first with PhysX OFF, as she had an easier time spotting enemies.
But without PhysX the world looked dull and static...she used the words "like the difference between an SD and HD movie", as she could see more details and the world looked more real.

Know what I noticed in the video? A shitload of effects that could have been left in the game in a static form when you aren't using PhysX (smoke, flags, etc.). I also noticed A LOT of really simple physics effects that were left out for someone not using PhysX.

A lot of PhysX games overdo it too. It's like their goal is to put as much random shit on the screen as possible and make it fly around in the most unrealistic way possible.
 
Know what I noticed in the video? A shitload of effects that could have been left in the game in a static form when you aren't using PhysX (smoke, flags, etc.). I also noticed A LOT of really simple physics effects that were left out for someone not using PhysX.

A lot of PhysX games overdo it too. It's like their goal is to put as much random shit on the screen as possible and make it fly around in the most unrealistic way possible.

We are not big fans of the flying papers; we can hardly see the point of them...
 
Know what I noticed in the video? A shitload of effects that could have been left in the game in a static form when you aren't using PhysX (smoke, flags, etc.). I also noticed A LOT of really simple physics effects that were left out for someone not using PhysX.

A lot of PhysX games overdo it too. It's like their goal is to put as much random shit on the screen as possible and make it fly around in the most unrealistic way possible.

Exactly, and that's what makes the Batman PhysX demo so pathetic. Instead of comparing scripted effects vs. PhysX GPU effects, they just show it with effects on vs. effects off.

It's like we have never seen destructible objects before :rolleyes:
Moving capes :rolleyes:
Flying papers :rolleyes:
Smoke that comes out of crates :rolleyes:
Or debris that moves in the air :rolleyes:
No game before has been capable of having sparks come out of broken equipment :rolleyes:

No, you need PHYSX!! :D
 
Exactly, and that's what makes the Batman PhysX demo so pathetic. Instead of comparing scripted effects vs. PhysX GPU effects, they just show it with effects on vs. effects off.

It's like we have never seen destructible objects before :rolleyes:
Moving capes :rolleyes:
Flying papers :rolleyes:
Smoke that comes out of crates :rolleyes:
Or debris that moves in the air :rolleyes:
No game before has been capable of having sparks come out of broken equipment :rolleyes:

No, you need PHYSX!! :D

Gee...sounds like the Ghostbusters game minus PhysX. ;)
 
Gee...sounds like the Ghostbusters game minus PhysX. ;)

You shouldn't say that Ghostbusters has those effects. It might break the hearts of some, and it would be like telling them there is no Santa either. :D

To be serious though, PhysX is not a bad thing, but when they leave out effects that are pretty standard in games today (granted that some are scripted) just to show off PhysX, it only hurts games. If PhysX becomes an open standard some day, perhaps we'll see some physics that are worth more than a side note.

Again, I want to show the ingame physics done on CPU:
http://www.youtube.com/watch?v=cfSBHJRU9_Q

Weather physics where the wind direction determines how fire spreads, and much more. I doubt we'll ever see in-game GPU-accelerated PhysX as long as it's a closed standard.

There are many more exciting techs coming out now than PhysX as well:
http://www.pcper.com/article.php?aid=783

What I am really waiting for now, is two things:
[H]'s review of Eyefinity (I want to spice up my old games and buy 2 cheap 25.5" screens to go next to my 2690wuxi), and perhaps the 5870X2.

Should Nvidia bring out something revolutionary with the GT300, I will either buy a 5800 series card now and perhaps a GT300 later, or wait for the GT300. I really hope Nvidia puts in something like Eyefinity as well.
 
You shouldn't say that Ghostbusters has those effects. It might break the hearts of some, and it would be like telling them there is no Santa either. :D

Well, I've mentioned it three times in the thread now and no PhysX crusader has commented, so I must have hit pretty close to home. ;)

Now, somebody poke Brent with a hot [H] cattle brand because we demand 5850 benchmarks!
 
5850 review will be published 11PM Central Time on September 29.
 
Any update on when the Eyefinity review comes? I'm REALLY looking forward to that one!
Did Kyle get any more news about the bezel management?

I am still working on it, hopefully up tomorrow. I lost both my cameras we shoot with around here, so the video end is suckage right now. :(

No new software from AMD at this time.
 
Oh dear lord. You ATI fanboys make me laugh, really. Is PhysX cool? Yeah, it is. Is it the be-all end-all? No. Is it totally necessary at this point? No. A neat feature on Nvidia GPUs with a hopefully bright future? Yes. OK, so the common-sense facts are in. Wow, no news there, except for one small thing.

If PhysX were a proprietary ATI technology, the ATI fans would be all over it like flies on shit and it would be akin to the second coming. Go ahead, deny it, I dare ya. The double standards here really make me laugh. No really, they do. /sigh
 
How does bezel management work? Does it edit out the area that would have been obscured by the monitor bezels? I'm assuming you'd have to measure your bezels and input the values then?

PS: I don't care who owns PhysX, but I do care that Nvidia is "locking me out" of having PhysX ability when I plug a non-Nvidia card into my system.
 
Oh dear lord. You ATI fanboys make me laugh, really. Is PhysX cool? Yeah, it is. Is it the be-all end-all? No. Is it totally necessary at this point? No. A neat feature on Nvidia GPUs with a hopefully bright future? Yes. OK, so the common-sense facts are in. Wow, no news there, except for one small thing.

If PhysX were a proprietary ATI technology, the ATI fans would be all over it like flies on shit and it would be akin to the second coming. Go ahead, deny it, I dare ya. The double standards here really make me laugh. No really, they do. /sigh

Yes, that is exactly why we all dislike hardware physx. ATi FOR LIFE DAWG
 
Thinking about this, we will hold off till Monday or late Sunday to publish most likely. And I am still very unhappy with the video and sound quality. Could not get the connected mic to work on the damn camera.
 
Sounds good. I really want the most out of your review, so I am happy that you postpone it a bit rather than rushing it out. :)

Besides dealing with a shitty camera for audio and video, which is leaving me a bit pissed, we are still having issues getting 3 dual-link DVI monitors working with an active DP-to-DVI adapter, and I really want to be able to show that working since we have been told it is possible.
 