When Nvidia brings its GT300 to the market, I bet it will generate a lot of interest there as well. At least for me, and [H] has a history of welcoming new Nvidia cards as well.

After reading a lot of reviews around the net, I'm simply not all that impressed by this card. I really was expecting more. The whole time I spent reading, I kept wondering: if Nvidia had been the one to come to market with its next-gen card first, would it have received the same reception? Or is Nvidia held to a higher standard for some reason? I can't help but feel people would have been bashing Nvidia for such a mediocre jump in performance after all the hype. Am I wrong? Sadly, I doubt it.
QFT. Crysis is an awesome game. Nothing wrong with the coding; it was designed to push hardware, and honestly it's the only game benchmark worth a damn, since almost every other game is a 360 port that presents almost no challenge to last generation's hardware.
Thanks for the helpful response.

On AMD, 2X, 4X and 8X AA are MSAA modes; 12X and 24X are CFAA modes. CFAA uses the shaders to perform edge antialiasing, so no extra memory bandwidth is required, and it does not affect textures or make them blurry; it only works on polygon edges.
The 256-bit bus does not hurt it, since it uses high-speed GDDR5 to keep bandwidth high.
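As a quick back-of-the-envelope check (using the commonly cited HD 5870 figures, a 256-bit bus and 4.8 Gbps effective GDDR5 per pin; the variable names are mine):

```python
# Rough memory-bandwidth math: bus width in bytes times effective data rate.
# Figures below are the commonly cited HD 5870 specs, assumed here.
bus_width_bits = 256
effective_rate_gbps = 4.8          # Gbps per pin (effective, quad-pumped GDDR5)
bandwidth_gb_s = bus_width_bits / 8 * effective_rate_gbps
print(round(bandwidth_gb_s, 1))    # 153.6 GB/s
```

So despite the narrower bus, the fast GDDR5 keeps total bandwidth well above a slower-clocked 512-bit GDDR3 setup would suggest.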
It handles MSAA well, just like the 4800 series did. I saw virtually no drop in performance between 2X and 4X AA in NFS: Shift. 8X AA, as usual, was a lot faster than NV's 8xQ MSAA.
The 5870 trounced the GTX 285 in our tests, so yes, it trounces a GTX 280, which is clocked slower.
I ordered one from the Egg today. Keep looking online; they tend to come in and out of stock at the various e-tailers. If you have a Micro Center or something like that near you, call and check whether they have a card in stock.

Hmm... I think Nvidia might be turning red just about now.
This is the first time in about 6 years that I can actually say that I REALLY want a new video card!
The 4800 series already had my interest, but the 5800 series has totally won me over. When are we gonna see these available?
WHOA!!! Finally 64 posts after 5 years of noobness
Check out the numbers for yourself:

So, I suppose that moving from the GTX 280 to the 5870 at 1920 res would be a relatively huge boost in both visuals and performance then? Is that a safe assumption to make? Or would that be overstating things?
Well yeah, if they came out at the same time. That's all those damn dual-GPU cards are: another way to milk the system. No thanks, I'm not paying for the same piece of video real estate twice.

I am always impressed when a single-GPU card equals the performance of a dual-GPU card.
That makes no sense. If the dual-GPU card and the single-GPU card came out at the same time with the same performance, what would be the point of having a dual-GPU card?

Well yeah, if they came out at the same time. That's all those damn dual-GPU cards are: another way to milk the system. No thanks, I'm not paying for the same piece of video real estate twice.
I've got a 4870X2 that I run games on at 1920x1200, and I'm still considering an upgrade. The reason is simple: Eyefinity. Old games can be played with more immersion, and though TH is out with a DP edition, this implementation seems to be less fuss (though some initial bugs and missing Eyefinity features, like bezel management, need to be dealt with; TH has already fixed those). I have enough performance for the games on the market now, but I don't have Eyefinity.
DX11 hardware support for some features I want, like:
DirectCompute on SM5 (I only have 4.1 support now)
New feature support in a future OpenCL 1.1 version
Here's a list:
I'm considering buying some cheap 26" (25.5") TN panel just for added immersion in existing games, plus a new 5800 series card (preferably an X2, since I've grown fond of them, and I hope Gainward comes out with a custom-cooling edition like the one I have).
Looking forward to [H]'s Eyefinity review!
PhysX's little eye candy was underwhelming, but this:
Now that's adding to the game, even while Eyefinity is still in its early days!
Edit: Nice review as always from [H], btw. Thanks!
Show me scripted PhysX GPU physics?

When Nvidia brings its GT300 to the market, I bet it will generate a lot of interest there as well. At least for me, and [H] has a history of welcoming new Nvidia cards as well.
Sadly, I don't think you have given the 5800 any welcome. You've complained about the lack of a PhysX evaluation and about how little the card impressed you. Perhaps you'd prefer that [H] talk about Nvidia when evaluating an ATI card?
PhysX itself is not very impressive in its current state. It has potential, but it hasn't shown it yet. We've already seen cloth in UFC 2009: Undisputed, which is not scripted and runs only on the CPU. Flying papers and leaves might appeal to some, but I must say I'm more impressed with Far Cry 2's in-game physics, with weather that impacts gameplay: wind direction affects your flamethrower, grass burning, etc.:
Eyefinity, however, seems to give more to gameplay than any of these physics effects, also in Batman:
You might argue about cost, but that's up to everyone's wallet. If you buy a cheap GFX card, you'd probably turn off PhysX anyway to run the game decently.
Current PhysX is sadly only a side note, in my opinion... If the same effects shown with GPU PhysX can be done scripted, it doesn't leave much to desire. Some might be a bit more impressed when the effects are simply turned off instead.
As long as PhysX isn't an open standard on the GPU, you'll probably never see anything major from it anyway. Perhaps when games with either Bullet on OpenCL or Havok on OpenCL arrive, we'll see some GPU-accelerated physics that's worth a second thought.
There are also screenshots showing the difference between default and adjusted LOD.

In comparison to the internal oversampling, which can be forced on GeForce graphics cards with the tool Nhancer, the texture LoD is not adjusted with ATI's SSAA. Put bluntly, this means that the amount of AF is not increased. With a third-party application like the ATI Tray Tools you can adjust the LoD by hand to get the best texture sharpness. For 2x SSAA we don't recommend any changes yet, but for 4x SSAA you should set the LoD to -1, and for 8x SSAA -2 or -3 is flicker-free, depending on the texture content.
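For what it's worth, the -1 recommendation for 4x SSAA matches the textbook rule of thumb that N-sample supersampling raises the effective resolution by sqrt(N) per axis, so the LoD can be shifted by about -0.5 * log2(N); the -2/-3 values for 8x go beyond that for flicker-free results. A minimal sketch of the rule (the function name is mine, not from any tool):

```python
import math

def ssaa_lod_bias(samples: int) -> float:
    """Rule-of-thumb negative LoD bias for N-sample supersampling:
    effective resolution rises by sqrt(N) per axis, so the mipmap
    level can drop by log2(sqrt(N)) = 0.5 * log2(N)."""
    return -0.5 * math.log2(samples)

print(ssaa_lod_bias(4))   # -1.0, matching the 4x SSAA recommendation above
```

By this formula 8x SSAA only justifies -1.5; pushing to -2 or -3 sharpens further at the risk of shimmer, which is why the article ties it to texture content.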
And if that's your preference, that's fine. That doesn't make PhysX better, though. You and Silus have this holy crusade going on, trying to prove PhysX is the best shit ever.
Their Linux support sucks, and this is precisely why I've been avoiding GPUs from the red team in my rigs.

In regards to open-source support for the Radeon HD 5800 series, we would expect initial mode-setting support to appear within the xf86-video-ati driver within weeks, and hopefully we will also see kernel mode-setting support shortly thereafter, around the time of the Linux 2.6.33 kernel. We would expect it to be at least a couple of months before there is any open-source 3D acceleration support for these new graphics cards.
That is only a demonstration of how little PhysX gives. They just took away effects that you can get in games that are several years old. Take a look at this:
What are you talking about? I'm talking about having GPU PhysX effects in games vs. scripted physics effects showing the same.

Show me scripted PhysX GPU physics?
Games are getting better and better at this. I agree that there is a lot of room left in quad-core CPUs for physics, and they are very underutilized. One thing DX11 has is multithreading: the driver, the DX runtime, and the application can run in separate threads, which should naturally make DX11 games more multithreaded.

Why don't game companies just grab a full core on my quad core and use that for physics? Or maybe two full cores? Oh yeah, 'cause Nvidia is paying them moneys... /slap
Rigid bodies at that small scale don't impress me... this isn't 2006 anymore.
Ad hominem is not a valid argument... it only wastes your time and mine.

Calling that Batman video a demonstration of GPU-accelerated physics? I bet if they removed the cape and called the cape a PhysX effect, you'd be all over it saying that "PhysX gives Batman a cape."
I countered your link; your premise is false.

What you see in that video is effects that have been in games for years. They added these effects and ran them as real-time physics instead of scripted ones (the YouTube link I gave you is real-time physics as well, on the CPU).
No, what is "pathetic" is people trying to sell unstable, 1/5th-accurate physics as the same as stable, 100% physics... and people buying into that lie and even promoting it...

PhysX gives nothing special to games, and they have to remove effects that other games have without PhysX to try to sell it. Pathetic.
FUD... they didn't before PhysX; it's only after AGEIA that the focus really came onto physics, and I love how you try to ignore this... borderline desperate:

What are you talking about? I'm talking about having GPU PhysX effects in games vs. scripted physics effects showing the same.
Like Mirror's Edge's banners. The PhysX version there had to be "banners on the GPU" or no banners at all, even though the most clueless game developers are able to add banners without PhysX.
The cost of more monitors, fisheye vision, zoomed-in gameplay... or nothing at all... all documented.

Eyefinity, on the other hand, actually adds something new to the game that's worthwhile.
So wait, you mean nobody cared about physics in games before something named AGEIA? And you start off this sentence with FUD? Is that a warning or something..?

FUD... they didn't before PhysX; it's only after AGEIA that the focus really came onto physics, and I love how you try to ignore this... borderline desperate:
Lots of the effects cannot be scripted... as the objects are INTERACTIVE *cough*
The cost of more monitors, fisheye vision, zoomed-in gameplay... or nothing at all... all documented.