The DX10 numbers are in!

What may be overlooked here is how terrible the drop-off between the 8800 series and the 8600 series is... like a 50% performance hit, and we're not talking high res here either. The 8600 series is a joke of a DX10 solution, apparently.

I'll leave the 2900xt flamebomb to someone else.
 
Yeah, they attribute the 8600 GTS's performance to its needlessly crippled memory bus. I'm glad I didn't snap one of those up at launch like I was planning; it looks like my X1950 Pro has some life left in it yet if DX10 performance is looking this bad.
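
For anyone wondering just how gimped that bus is, here's a quick back-of-the-envelope bandwidth calc (using the commonly quoted reference clocks, so treat the exact figures as ballpark):

```python
# Rough peak memory bandwidth = (bus width in bytes) x (effective memory clock).
# Bus widths and clocks below are the commonly quoted reference specs, so treat them as approximate.
def bandwidth_gb_s(bus_width_bits, effective_clock_ghz):
    return bus_width_bits / 8 * effective_clock_ghz  # GB/s

cards = {
    "8600 GTS (128-bit, 2.0 GHz effective GDDR3)": bandwidth_gb_s(128, 2.0),  # ~32 GB/s
    "8800 GTS 640 (320-bit, 1.6 GHz effective)":   bandwidth_gb_s(320, 1.6),  # ~64 GB/s
    "8800 GTX (384-bit, 1.8 GHz effective)":       bandwidth_gb_s(384, 1.8),  # ~86.4 GB/s
}
for name, bw in cards.items():
    print(f"{name}: {bw:.1f} GB/s")
```

If I've got the reference clocks right, that's half the bandwidth of a GTS 640, and actually less than an X1950 Pro's 256-bit bus, which goes a long way toward explaining those numbers once AA comes into play.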
 
I saw this coming, and I'm pretty sure most people expected this as well. Drivers may improve performance, but as of right now my predictions have come true. G9x/R700 here I come.
 
Anandtech said:
Both NVIDIA and AMD were very upset over how little we thought of their DX10 class mainstream hardware.

m'kay.

I feel like I'm in Alice in Wonderland. They think they came out with something better when it doesn't perform that well?

Please, someone wake me up from this weird dream.

EDIT: Another quote: "By choosing to design their hardware without a significant, consistent performance advantage over the X1600 and 7600 class of parts, developers have even less incentive (not to mention ability) to push next generation features only possible with DX10 into their games. These cards are just not powerful enough to enable widespread use of any features that reach beyond the capability of DirectX 9."
Uh, they're saying here that with the low-end cards they came out with, game developers aren't able to include a lot of DX10 effects in their games. To help game developers, they should have made the mid-range cards higher end.
On top of that, I think MS deliberately LIED about DX10 and how it would be so cool for gamers. Yeah right! It's CRAP that's being rammed down our throats!
Gee, feels wonderful to be a consumer doesn't it?
Is this their answer to Linux?
 
But these games are DX9 games PATCHED to DX10, or with DX10 added after. I believe games actually created and coded for DX10 will be better; otherwise, how does Crysis expect to sell if there isn't a single video card out three months before release that can play it?
 
Oh, Crysis can be played, just don't expect it maxed out for a while.
 
Anandtech said:
NVIDIA also tells us that some code was altered in Call of Juarez's parallax occlusion mapping shader that does nothing but degrade the performance of this shader on NVIDIA hardware. Again, we are unable to verify this claim ourselves.
This is very troubling. It's one thing to do executable detection (Half-Life 2, Quake 3), but this is going beyond the concept of improving performance for end users into purposefully damaging the product for a particular group of end users. I hope these claims can be substantiated at some point, but these things typically have a habit of just fading into the background.

HDR-correct anti-aliasing seems like an important feature (we all know how bad aliasing can be without some form of gamma correction/blending), but Anandtech doesn't state whether it truly improves IQ in this case or not.
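
For anyone who hasn't seen why gamma-correct blending matters, here's a toy resolve example (just a sketch using the simple gamma-2.2 approximation rather than the exact sRGB curve):

```python
# Toy example: resolving one AA edge pixel that covers half a white and half a black surface.
# Uses a plain gamma-2.2 approximation instead of the exact sRGB transfer function.
GAMMA = 2.2

def to_linear(c):
    return c ** GAMMA

def to_gamma(c):
    return c ** (1.0 / GAMMA)

samples = [1.0, 0.0]  # one fully lit sample, one fully dark sample (gamma-space values)

naive = sum(samples) / len(samples)  # blending in gamma space -> 0.50
correct = to_gamma(sum(to_linear(s) for s in samples) / len(samples))  # blending in linear space -> ~0.73

print(f"naive blend: {naive:.2f}, gamma-correct blend: {correct:.2f}")
```

The naive result comes out visibly too dark, which is why edges look ropey when the resolve isn't gamma-correct; whether the DX10 path in these games actually does this better is the part Anandtech doesn't say.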

None of this stuff is really surprising, nor very telling. All of the games tested run poorly considering what's being rendered (Oblivion-level graphics, or worse, for the most part), and Anandtech has even dubbed one of them the "worst console port ever". We see some trends: R600 excelling without multisampling, though taking quite a hit with it, and the GTS 640 starting to lose its hold a bit. But this is DX10 and Vista, and this is to be expected at this point.

I'm waiting for Sweeney's baby to give us a firmer idea of DX10 performance. All of the titles tested are amateurish hackjobs, in my opinion, so just call me when a real developer releases a DX10 title.
 
Hey, I'm a long time reader and finally got the nerve to go register :p .

Anyway, I think these "DX10 benchmarks" are still very premature. Like phide said, these are nothing more than "hack jobs", just DX10 patches. I think time (driver development), and especially Crysis, will paint a much more accurate picture. Yes, I'm sure some know that Crysis has been designed with NV in mind, but still. Another thing I wish Anand would do, like the [H], is to freaking include the game settings they run these benchies at.

Again, going back to these "patch games", performance is really not as clear cut as it seems. I can only attest to my own rig, which is the same as their test setup, and I pull better numbers with more candy. I'm sure I'm not the only one here with a rig close to that and a GTS 640MB; at 1280 x 1024 with the 148.45 drivers (latest Vista drivers) in the LP demo, the higher the AA/AF, the better it seems to do. Anand has always had this history of capping their trials at 4xAA even at low resolutions.

Just for kicks, I get 48 average in Snow and 53 in Caves (16x CSAA & 16x AF, everything maxed except Shadow Quality at Medium and HDR at Medium; more HDR = headache). And it never dropped below 30 fps.

Still, it is too early to tell, and I think Crysis, from what I've seen, does a much better job of utilizing DX10 more intelligently (keyword). Plus, I don't see where Anand got 162.18 drivers with Vista support (they were 162.15s and were taken down); unless they only specified the XP drivers, we have to assume they used 148.15s for Vista.
 
Wow, in every DX9 vs. DX10 pic there's practically no difference in quality whatsoever. I really can't see it, unless you really want to nitpick about how the grass is a slightly different shade of green. *Shrug* I'm thinking an X1950 XT is a hell of a better buy than an 8600 GTS if the DX10 doesn't even make a difference.
 
Yeah, I assume this is what you get with a patch, and a console port at that. Even the Crysis DX9 vs. DX10 demos don't show a huge difference. Basically it just shows a much better "picture", or environment, more dynamic. I think it will be like that for a while, but I do enjoy that increased dynamic feel; DX9 in that game just looks like a next-gen FarCry (look at the Jungle demo comparison). I still think people are looking to DX10 (DX9.5) with hopes that are too high.
 
Don't get me wrong, I'm sure there's a ton of potential for it, but right now I don't think any of these "mid range cards" would even make these games play that great. An X1950 XT or a 7900 GT would make them much more playable, from looking at those numbers. I mean, look what an 8800 can do; maybe DX11 is where we're gonna see some huge differences, 'cause these new high-end cards are pretty mean.
 
Wow, in every DX9 vs. DX10 pic there's practically no difference in quality whatsoever. I really can't see it, unless you really want to nitpick about how the grass is a slightly different shade of green. *Shrug* I'm thinking an X1950 XT is a hell of a better buy than an 8600 GTS if the DX10 doesn't even make a difference.

The lighting in the DX10 CoH screens is a lot better and more realistic than in the DX9 screens.

I think overall, though, Nvidia and ATI have done a bad job with their "mid range" cards, and I like that Anandtech has elaborated on their disgust for them as well. Unlike here at HardOCP, where they gloat about them, Anandtech lays down the law, and rightfully so.

Nvidia/ATI have to get the message.
 
It's a tiny bit brighter; not sure how exactly that would define it as more realistic. I could turn up the brightness with DX9. Better go run and buy a $400 card for that!
 
Saw a curious thread title from the AT forums alongside the review, and as I read the results I kept it in mind. Based on the benches, this crazy idea made sense--that it may actually be worth it to buy an 8800 Ultra now. Regardless of debate about GTS vs. XT or GTX, the Ultra absolutely ruled. And it can now be had for right at or just under $600. If you really wanted to play DX10 right now and had the money, that would be the way to go.
 
There is definitely a difference between DX9 and DX10 in CoH. If you actually read the article, it says CoH was designed for DX10 from the beginning, but they had to scale it back because of limitations with the current DX10 cards.

Anandtech said:
While Company of Heroes was first out of the gate with a DirectX 10 version, Relic didn't simply recompile their DX9 code for DX10; Company of Heroes was planned for DX10 from the start before there was any hardware available to test with. We are told that it's quite difficult to develop a game when going only by the specifications of the API. Apparently Relic was very aggressive in their use of DX10 specific features and had to scale back their effort to better fit the actual hardware that ended up hitting the street.

If any of you have had a chance to see the CoJ DX10 bench at decent frame rates and high resolution, it looks ridiculous. The screenshots don't do CoH or CoJ justice for the DX10 versions; they look much better than the DX9 versions.
 
While that may be true, frame rates are very bad on all the cards.

I'm glad John Carmack is still developing with OpenGL :) ;)
 
It's a tiny bit brighter; not sure how exactly that would define it as more realistic. I could turn up the brightness with DX9. Better go run and buy a $400 card for that!

Um, you notice in the DX9 shot all the shadows are going towards the lower left corner, while in the DX10 pic the shadows all radiate out from the light source. Then if you look closely at some of the shadows of the soldiers, you can see two shadows (which might be made by two light sources; can't really tell since you can't see the whole picture).
 
Hmmm, this whole situation looks incredibly familiar. Oh yeah, that's right: same situation as with the 9700 Pro and DirectX 9 games. It took ATI a while to get decent performance out of the drivers for the card.

Give ATI and NVIDIA time and DirectX 10 performance will improve, although it's going to take at least a refresh before the performance is decent, just like it took the 9800 Pro for DirectX 9.
 
Um, you notice in the DX9 shot all the shadows are going towards the lower left corner, while in the DX10 pic the shadows all radiate out from the light source. Then if you look closely at some of the shadows of the soldiers, you can see two shadows (which might be made by two light sources; can't really tell since you can't see the whole picture).

Exactly. It seems as though the moonlight is the only light source in the DX9 version, whereas in DX10 that light post is also giving off its own light, making for realistic shadow placement. So there are two light sources casting two distinct shadows on the objects and soldiers.

They have two shadow shades: one heavy and dark from the light post, one lighter and "softer" from the moonlight, because the moonlight shadow appears within the light post's light. Very well done effect, I must say.
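
If anyone's curious how a renderer ends up with those two shade levels, it basically comes down to each light carrying its own shadow term. Here's a rough sketch with made-up values (definitely not the actual CoH shader):

```python
# Each light contributes its own diffuse term scaled by its own shadow test, so a point
# blocked from the lamp post but still lit by the moon keeps a lighter, softer shadow.
# Light intensities/colors here are invented purely for illustration.
LIGHTS = {
    "moon":      {"intensity": 0.25, "color": (0.6, 0.7, 1.0)},  # dim, bluish
    "lamp_post": {"intensity": 1.00, "color": (1.0, 0.9, 0.6)},  # bright, warm
}

def shade(visible_from):
    """visible_from: set of light names whose shadow test passes for this point."""
    r = g = b = 0.0
    for name, light in LIGHTS.items():
        shadow = 1.0 if name in visible_from else 0.0  # stand-in for a per-light shadow-map lookup
        r += light["intensity"] * light["color"][0] * shadow
        g += light["intensity"] * light["color"][1] * shadow
        b += light["intensity"] * light["color"][2] * shadow
    return (round(r, 2), round(g, 2), round(b, 2))

print(shade({"moon", "lamp_post"}))  # fully lit ground
print(shade({"moon"}))               # in the lamp's shadow: dim but not black (the "soft" shadow)
print(shade(set()))                  # blocked from both lights: the darkest shadow
```

With only the moon as a shadow caster (the DX9 shot), you only ever get the one shadow direction and shade.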
 
This is a total surprise, DX10 performance sucks on every card, did not see this happening :eek: /sarcasm

Honestly, seeing the early benches, I wonder if it can be improved with drivers alone. Glad I stuck with my 1900 XTX; gladly waiting for the second-gen DX10 hardware. What's funny is that for all the touting M$ has done about DX10, every review I read on the matter states that almost everything you can do with DX10 can also be done with DX9.

edit:

Must add, about my driver comment: of course there can be driver improvement in current-gen DX10 performance, but by the time those drivers are out, the 8800/2900 will be outdated. Even if I say outdated, they can still make a decent buy, but that all depends on where the DX10 performance lands on these first-gen DX10 cards. Notice I am talking about the 8800/2900; I would be awestruck if the 2600/8600 can get double-digit FPS in upcoming DX10 games.
 
Current-gen video cards are not really prepared for the future... My X1950 will suffice until DX10 runs properly.
 
Yeah, I'll wait till the 9 series and HD 3000 series to get a DX10 card and OS.
 
And we thought the DX9 performance for ATI was bad...:eek:

The 8800 series may have good performance in DX9, but they are premature cards in the DX10 department.

There isn't, like, a bug or anything.
They were just the first in the series to support DX10.

In theory, over time, the DX9 performance we have now will become the DX10 performance we will have.

Just takes time.

As of right now, DX10 is a brand new spiel.
It will become more widely supported, and perform better, simply over time.
:)
 
It seems Vista has something to do with this horrible performance as well. Vista is new, DX10 is new. Currently we have a hack job with no optimization.

These #'s are awful.
 
All I can say is f#ck. My upgrade to an 8800 GTS 640 is done and I really don't want to shell out more for a new card 6 months later; I don't have the money. This sucks hardcore. I guess I will be catching up on my older games for a long while now.
 
The CoJ benchmark is a DX9 game "upgraded" to use DX10 effects. It doesn't show good relative performance because they added DX10 effects on top of what's already being rendered, so slowdowns are to be expected.

Lost Planet is, quite frankly, the joke of the century. It's a console port and it just runs like crap anyhow; it's a very bad example of DX10 usage.

Let's wait for games like Crysis, which we know for a fact are being built by proven, competent development teams and aimed at DX10 from the ground up. They claim max settings are possible with an 8800 GTX, so that seems quite reasonable to me.
 
It's an immature technology with a poorly thought-out hardware upgrade path. It will get better eventually.

If nothing else, perhaps people will learn a little lesson regarding all the "must-have DX10" hype and marketing BS we've had since before Christmas... but probably not.

Now quick, go buy a DDR3 board, that will fix it.
 