Call of Juarez DX10 Performance and Image Quality @ [H]

FrgMstr

Call of Juarez DX10 Performance and Image Quality - Call of Juarez enjoys a very large and noticeable visual difference in DirectX 10 versus DirectX 9. This is the first game to make us say “Wow!” at the visual quality improvements using DX10. We will explore GPU performance scaling and image quality comparisons using the 8800 Ultra, GTS, & 2900 XT.

Call of Juarez is a visually impressive and entertaining title. In DX10 mode it places the gamer into some of the most immersive outdoor environments we have seen, even rivaling and sometimes surpassing Oblivion. The HDR, lighting, and high-resolution textures in DX10 are what set this game apart. The fact of the matter is that the creative team's use of DX10 features has made Call of Juarez a better game in that it places the gamer in a more believable environment. Those DX10 features combined with already engrossing gameplay simply make it a better game. Hopefully we will continue to see DX10 and its extensive feature set making such a positive impact on future titles.
 
great review kyle. I've been waiting for something to show the difference between dx9 and dx10. :cool:
 
The water shows a marked difference, as do the HDR and some of the lighting effects. Let's hope more game devs will think of the PC as a first choice instead of porting more games from consoles. APIs are nice, but devs need to take advantage of them. Here's to hoping! :)
 
great review kyle. I've been waiting for something to show the difference between dx9 and dx10. :cool:

Yeah, nice review Kyle. I was surprised to see that the game was actually playable, because the demo sucked big time. I wasn't surprised with the 2900XT's performance; most people already knew CoJ and the 2900XT were "made for each other". I was surprised by the 8800 line though; I thought their overall performance would've been lower rather than higher, as was the case in the CoJ demo (2900 > 8800). I also wonder how much the 163.44 drivers would've improved performance.

Crysis will be the "end all, be all" for quite some time though. Bio' and CoJ will be forgotten next month when the Crysis single-player demo is released (confirmed for September 25th) :eek: . Likely, Crysis will scale much better and provide increased visuals and performance at the same time :) .
 
As everyone else said, great review. It's good to see that DX10 is finally making a difference. Even though my 8600GT is technically DX10, it's clearly not fast enough for this. Hopefully in the near future I will be able to upgrade to a GeForce 9 series (or Radeon x3k series) card.

Vista has taken so much crap, but the bottom line is that it's really a pretty damn solid OS. Once you throw 2GB of DDR2 at it and a dual-core proc, it's pretty fast too - and that kind of hardware is cheap today ($60 65nm Athlon 64 X2s and $90 2GB DDR2 kits).

Even my (by today's standards) low-end system (Athlon 64 X2 3600+ @ 2.8GHz, 2GB DDR2, GeForce 8600GT @ 550/950) can run a less-demanding game (WoW for example), Media Center (which is also 3D), and Google Earth (OpenGL) at the same time. That doesn't even work in XP.
 
Great job you turds! Now I have to get that game. :D
Curious if the 1GB 2900 cards would make use of the added memory?

Glad to finally see no difference in IQ between AMD and Nvidia users. :)

"In the screenshot above you will see that in DX10 HDR is much brighter, almost washing out some of the textures. While immediately in this static screenshot it may look way overblown,
when you are in the game moving around you will notice the entire environment is much more vibrant. There is also an iris like effect that dims and brightens the HDR as you move around."

When I first looked at those shots I thought it was overdone until I read your comment and that's what sold me on getting the game.
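
For anyone curious how that "iris" effect works under the hood: it's basically auto-exposure in the tone mapping pass. Here's a minimal C++ sketch of the general technique; the names and constants are illustrative guesses, not Techland's actual code.

Code:
#include <algorithm>
#include <cmath>

// Illustrative auto-exposure ("iris") adaptation: each frame the exposure
// drifts toward a target derived from the scene's average luminance, so the
// picture dims in bright areas and brightens in dark ones over time.
struct EyeAdaptation {
    float exposure = 1.0f;   // current exposure multiplier
    float adaptRate = 1.5f;  // 1/seconds; higher = faster iris response

    // avgLuminance: average scene luminance this frame (e.g. from a
    // downsampled luminance buffer); dt: frame time in seconds.
    void update(float avgLuminance, float dt) {
        // Target exposure that maps the average luminance to middle grey.
        float target = 0.18f / std::max(avgLuminance, 1e-4f);
        // Exponential smoothing so the adjustment is gradual, like an iris.
        float blend = 1.0f - std::exp(-adaptRate * dt);
        exposure += (target - exposure) * blend;
    }

    // Tone-map a linear HDR value to [0,1] for display (simple Reinhard).
    float toneMap(float hdrValue) const {
        float v = hdrValue * exposure;
        return v / (1.0f + v);
    }
};

Walk into a bright area and avgLuminance spikes, so the target exposure drops and the scene gradually dims; step into shade and it brightens back up, which matches what the review describes seeing in motion.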
 
Now I'm really starting to regret buying that 8800GTS 320MB. *sigh* Time to sell it off and get a new card.
 
Even my (by today's standards) low-end system (Athlon 64 X2 3600+ @ 2.8GHz, 2GB DDR2, GeForce 8600GT @ 550/950) can run a less-demanding game (WoW for example), Media Center (which is also 3D), and Google Earth (OpenGL) at the same time. That doesn't even work in XP.

Sure, just don't try to play an MP3 and use your LAN at the same time.

:p

Okay, I kid, I kid. Vista is definitely a move in the right direction, and I look forward to SP1 with a great deal of eagerness.

And this review is fantastic - it is great, indeed, to see DX10 finally 'live up to the hype'!
 
I currently have an 8800GTS 320MB card, and am wondering if it would be worthwhile to simply get another 320MB and SLI them, or if I should just sell the card I have and upgrade to a single 640MB?

(In other words, I think an SLI/Crossfire followup would be useful) :)
 
I currently have an 8800GTS 320MB card, and am wondering if it would be worthwhile to simply get another 320MB and SLI them, or if I should just sell the card I have and upgrade to a single 640MB?

(In other words, I think an SLI/Crossfire followup would be useful) :)

Depends on the game; for CoJ the higher memory capacity would probably make the larger difference. Keep in mind that adding a second video card will not increase the usable local memory capacity, since each card in SLI mirrors the same data. That means the setup will still only be able to use 320 MB of video card memory, not 640.

We will have to keep testing more DX10 games to see if this trend of memory capacity affecting performance holds. So far we've seen it be true for Lost Planet and Call of Juarez. Next week we'll know if it is true for Bioshock.
 
How many of those changes were because of DX10 and how many were simply programming changes to textures and settings? I'm not blown away by any DX10 display so far and I doubt I will be.

Marketing hype at its best.
 
Gotta be honest, some of what you showed in that review was extremely impressive (great review, btw) but...

...honestly, you confirmed what I've been saying: some of what you see could indeed be done with DX9 if they'd just push it a bit more.

Still not enough to make me spend $200 on a new OS, despite some of the visuals.

Especially because there's about that much $ or more in games coming out in November alone! LOL. :D

I'm sure Crysis will show some big differences as well with DX10, but still... it looks incredible even with DX9, so good enough for me for now.

A little tempting, though, to test out Vista/DX10 on another drive if I were currently able, gotta admit :rolleyes:
 
Once again, very authoritative review, Brent (and Kyle). I was especially interested in the relative parity between the 2900XT and 8800GTS, which beyond having been 'claimed' by myself and others around here (to much mocking, I might take occasion to remind), importantly highlights the need for developers to code with DX10 as their primary goal; not a console, DX9, or one particular brand of GPU. Here's to parity coding as the servant of the gaming consumer!
 
Nice review, good and useful info.
For the first time I feel my graphics card / OS combo is getting properly left behind (X1800PE / XP).
Dammit, where's the smaller-die GTX? I'm getting Vista when I have one.

Maybe the higher-res textures can be put into DX9, but unless someone actually does it and it performs OK, it doesn't matter.
With the improved detail in lighting and better colour contrast in DX10, the textures alone won't swing it for DX9.
I want to replay CoJ in DX10.

Roll on NVidia's die shrink and Vista SP1!
 
Nice review, and one which I myself have been waiting on for some time now.

There is one thing though: how much memory does this game use in Task Manager while being played? That is, in both DX9 and DX10.

I have tried this with Lost Planet and it uses around 200MB for DX10 and 400MB for DX9.

Is it similar with Call of Juarez?
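
If you want to check that number yourself rather than eyeballing Task Manager, here is a minimal Win32 sketch using PSAPI (link against psapi.lib) that reads a process's working set, which is roughly what Task Manager's memory column shows. Keep in mind this reports system RAM usage, not video card memory.

Code:
#include <windows.h>
#include <psapi.h>
#include <stdio.h>
#include <stdlib.h>

// Print the working set of a process (roughly Task Manager's "Memory"
// column). Pass a PID on the command line, or it reports on itself.
int main(int argc, char** argv) {
    DWORD pid = (argc > 1) ? (DWORD)atoi(argv[1]) : GetCurrentProcessId();
    HANDLE proc = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_VM_READ,
                              FALSE, pid);
    if (!proc) {
        printf("OpenProcess failed (error %lu)\n", GetLastError());
        return 1;
    }
    PROCESS_MEMORY_COUNTERS pmc = {};
    pmc.cb = sizeof(pmc);
    if (GetProcessMemoryInfo(proc, &pmc, sizeof(pmc))) {
        printf("Working set: %.1f MB\n",
               pmc.WorkingSetSize / (1024.0 * 1024.0));
    }
    CloseHandle(proc);
    return 0;
}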
 
How many of those changes were because of DX10 and how many were simply programming changes to textures and settings? I'm not blown away by any DX10 display so far and I doubt I will be.

Marketing hype at its best.

Yes, it can be done in DX9, but I think it is best summarized in the review: "DX10 is much more efficient at reducing per-object state changing overhead compared to DX9, which allows developers to add more detail in their games. While some of the things do seem like they could be done in DX9 it may very well be the case that DX10 provides performance optimizations to allow all of these effects to work together with greater efficiency. Besides, the geometry shader accelerated particles can only be done in DX10 on DX10 GPUs"

Give it time and DX10 should mature nicely beyond just "marketing hype".
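
For anyone wondering what "reducing per-object state changing overhead" actually looks like in code, here is a rough sketch assuming the stock D3D9 and D3D10 blend-state APIs (error handling omitted, so treat it as illustration rather than production code). DX9 re-validates each render state on every call, while DX10 bakes states into immutable objects up front and binds them with a single cheap call per object.

Code:
#include <d3d9.h>
#include <d3d10.h>

// DX9: render states are set individually, per draw, and the runtime must
// validate each change -- lots of small, per-object CPU overhead.
void SetAlphaBlendingDX9(IDirect3DDevice9* dev) {
    dev->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    dev->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
    dev->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
}

// DX10: the same states are baked once into an immutable state object,
// validated at creation time instead of at draw time.
ID3D10BlendState* CreateAlphaBlendDX10(ID3D10Device* dev) {
    D3D10_BLEND_DESC desc = {};
    desc.BlendEnable[0]           = TRUE;
    desc.SrcBlend                 = D3D10_BLEND_SRC_ALPHA;
    desc.DestBlend                = D3D10_BLEND_INV_SRC_ALPHA;
    desc.BlendOp                  = D3D10_BLEND_OP_ADD;
    desc.SrcBlendAlpha            = D3D10_BLEND_ONE;
    desc.DestBlendAlpha           = D3D10_BLEND_ZERO;
    desc.BlendOpAlpha             = D3D10_BLEND_OP_ADD;
    desc.RenderTargetWriteMask[0] = D3D10_COLOR_WRITE_ENABLE_ALL;
    ID3D10BlendState* state = NULL;
    dev->CreateBlendState(&desc, &state);
    return state;
}

// Binding the prebuilt object is one cheap call per object.
void BindAlphaBlendDX10(ID3D10Device* dev, ID3D10BlendState* state) {
    float blendFactor[4] = {0, 0, 0, 0};
    dev->OMSetBlendState(state, blendFactor, 0xffffffff);
}

Multiply that by thousands of objects per frame and the saved validation work becomes the headroom that can be spent on extra detail instead.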
 
Great article. I like the image quality comparisons between DX9 and DX10, and the apples-to-apples testing.

Why they can't have the same texture settings in DX9 as in DX10, I don't know.
 
Now I'm really starting to regret buying that 8800GTS 320MB. *sigh* Time to sell it off and get a new card.

I saved myself from that regret with my 640MB pick. I felt the extra memory would be useful with new games, and I think I was right...
 
1. With the memory being a dominant factor in this game, I think it would make the review much more "well rounded" to add in the 1GB version of the 2900XT, even if it is just a footnote at the end. I think a great many people want and/or need to see this. You did such a good job showing how nVidia scaled...how about ATI?

2. As for the comments of "well, DX9 could look better", I think they are right... but missing a key factor. If you brought the quality of the DX9 version up to match DX10 and then looked at the FPS, I think it would be correct to say that the DX9 version would take a fairly large FPS hit in order to match that IQ. This review definitely shows what DX10 was "meant" to do for the same FPS.
 
I saved myself from that regret with my 640MB pick. I felt the extra memory would be useful with new games, and I think I was right...

QFT. Sometimes I kick myself for not biting the bullet and going with a GTX, but this review showed me that it was a good decision to go with the 640MB GTS instead of the 320MB version. It also gives me hope that my GTS will be able to handle DX10 games for the near future when I eventually make the move to Vista/DX10.

Very good review though. It's nice to see DX10 "come into its own" and start showing us what its potential really is for gaming and image quality.
 
QFT. Sometimes I kick myself for not biting the bullet and going with a GTX, but this review showed me that it was a good decision to go with the 640MB GTS instead of the 320MB version. It also gives me hope that my GTS will be able to handle DX10 games for the near future when I eventually make the move to Vista/DX10.

Very good review though. It's nice to see DX10 "come into its own" and start showing us what its potential really is for gaming and image quality.
Yeah, the 320 vs 640MB decision was a tough one for me also, but I went 320 since I figure I'm going to buy a >512MB version of the next gen. The 8800GTS is only a temporary arrangement.

Let's hope to see more advancements from dx10 in the future also.

Good article H
 
I'm anxious to see how these games improve with the second generation of DX10 cards due out in November.

It's nice to see that the more memory you have, the more it gets used in these newer games.
 
This looks a lot more like they purposefully downplayed DX9 capabilities. Most other titles that I have seen with DX9/DX10 look so close you can hardly tell the difference anywhere; minor smoke and lighting effects are about the extent of the difference. It is not that other titles are underutilizing DX10 and look worse than CoJ in DX10, but that CoJ is underutilizing DX9 and looks much worse in DX9.

Though I had been expecting this behavior as the ultimate stick to move people to DX10. DX9 will purposefully be underutilized to make DX10 look better. Yippee!
 
Sure, just don't try to play an MP3 and use your LAN at the same time.

:p

Okay, I kid, I kid. Vista is definitely a move in the right direction, and I look forward to SP1 with a great deal of eagerness.

And this review is fantastic - it is great, indeed, to see DX10 finally 'live up to the hype'!

Yeah, the MP3 bug is... odd. The problem is that it throttles network bandwidth to free up CPU time for media decoding - but it's designed for video as well as audio. Playing H.264 1080p movies and copying a file on a single-core XP system does result in glitching, while it doesn't on Vista. Of course, playing an MP3 works fine on both - yet Vista still throttles the network.

Hopefully they can fix this one with a hotfix, like the file copy issue. Of course, I've never really run into either problem.
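
For reference, the throttling knob lives in the registry: the NetworkThrottlingIndex value under the SystemProfile key is the documented MMCSS setting, and 0xFFFFFFFF is the commonly cited value to disable throttling. A hedged sketch of flipping it off with plain Win32 calls (run elevated, reboot to apply, and obviously at your own risk):

Code:
#include <windows.h>
#include <stdio.h>

// Vista's MMCSS throttles network traffic while multimedia is playing.
// Setting NetworkThrottlingIndex to 0xFFFFFFFF disables the throttling.
int main(void) {
    const char* path = "SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\"
                       "Multimedia\\SystemProfile";
    HKEY key;
    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, path, 0, KEY_SET_VALUE, &key)
            != ERROR_SUCCESS) {
        printf("Could not open key (are you running as admin?)\n");
        return 1;
    }
    DWORD off = 0xFFFFFFFF;  // disable network throttling entirely
    LONG rc = RegSetValueExA(key, "NetworkThrottlingIndex", 0, REG_DWORD,
                             (const BYTE*)&off, sizeof(off));
    printf(rc == ERROR_SUCCESS ? "Set. Reboot to apply.\n"
                               : "Failed to set value.\n");
    RegCloseKey(key);
    return 0;
}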
 
I know a lot of those textures can be done with DX9; the extra stuff you can download for Oblivion shows a similar difference. The lighting and sky, though, I'm not sure about.

nice article either way
 
Yeah, don't get overly happy because you can see "differences". While this is true for CoJ, Crysis in DX9 looks better IMO. DX9 can be pushed quite a bit; compare Bioshock's DX9 visuals with CoJ in DX10. If Crysis in DX9 is any indication, the game in DX10 mode should mop the floor with these titles. I love the review (as always); I just don't care for the game personally, just saying :).

I hope Crysis and CoJ are at the opposite ends of the "optimization"/performance spectrum (likely), I don't want to upgrade again just to play Crysis at 1280x1024. Bioshock's DX10 performance gives me hope so it's all good.


+ Crysis...........Bio....LP/CoJ -

Efficiency-Performance and Visual Rating ftw!
 
I find it interesting that these smaller dev houses (Techland, etc.) seem to be the ones on the cutting edge of graphics stuff.
 
I ordered an X2 3800+ from Newegg earlier this month for my HTPC, and they threw in Call of Juarez for free. Having read the piss-poor reviews this game was getting, I didn't think of installing it on my rig, but after reading this review I'm chomping at the bit to see it run in DX10.
 
This looks a lot more like they purposefully downplayed DX9 capabilities. Most other titles that I have seen with DX9/DX10 look so close you can hardly tell the difference anywhere; minor smoke and lighting effects are about the extent of the difference. It is not that other titles are underutilizing DX10 and look worse than CoJ in DX10, but that CoJ is underutilizing DX9 and looks much worse in DX9.

Though I had been expecting this behavior as the ultimate stick to move people to DX10. DX9 will purposefully be underutilized to make DX10 look better. Yippee!


Seriously, would you want to play CoJ at the performance levels it would be at if they implemented all those features in DX9? I mean, I switched back to my Viewsonic P220F after getting the GTS because the 12x10 rez of my old 19" LCD was holding me back in everything, even Oblivion. Now I'm back to 12x10 for CoJ. Yippee!!!

With my GTS 640 I'm already playing at settings below what I would like in DX9. As far as I'm concerned the only cards here with decent DX10 performance are the GTX/Ultra. Without AA this game needs 16x12 rez, and even at 16x12 AA makes it look so much better.
 
This looks a lot more like they purposefully downplayed DX9 capabilities. Most other titles that I have seen with DX9/DX10 look so close you can hardly tell the difference anywhere; minor smoke and lighting effects are about the extent of the difference. It is not that other titles are underutilizing DX10 and look worse than CoJ in DX10, but that CoJ is underutilizing DX9 and looks much worse in DX9.

Though I had been expecting this behavior as the ultimate stick to move people to DX10. DX9 will purposefully be underutilized to make DX10 look better. Yippee!

It's not like Microsoft downplays the quality either:

http://www.gamesforwindows.com/en-US/AboutGFW/Pages/DirectX10.aspx

Crysis (DX10 (oh ya and DX9)) vs Halo

Can't wait to see DX10 actually being faster and adding in the newer features w/o trying to make DX9 look bad.
 
Seriously, would you want to play CoJ at the performance levels it would be at if they implemented all those features in DX9?

Have you looked at other games that didn't sandbag DX9, like World in Conflict? The versions are nearly identical and DX9 is faster. The biggest difference in CoJ is the poor textures they used in DX9.
 
Can't compare two different games.

With its current level of performance in DX9, would you really want CoJ to run any slower? Sure, it's not the best-optimized game ever, but it's a nice example of how you can get a lot more IQ at similar performance just by using the inherent efficiency of DX10. Sure, they could have just bumped everything up to insane levels in DX9, but who wants another EQ2? I mean, the level of detail in CoJ is pretty crazy if you just sit back and look at it.

WiC is a completely different game, even a different genre. Comparing the DX9/10 scaling between it and CoJ is just silly. Wait...

I'll modify that...

Saying that WiC is more representative of how much better (or worse) DX10 is than DX9 in this case is about as valid as using Lost Planet as an example. I'd say the similarity in WiC (honestly, I haven't messed with it in DX10 yet) between the APIs is more telling of their lack of DX10 execution than a testament to the great way they did DX9.
 
WiC is a completely different game, even a different genre. Comparing the DX9/10 scaling between it and CoJ is just silly.

What is silly is being convinced that one poorly programmed game, which looks worse than any other current game in DX9, is somehow evidence of the superiority of DX10.

It is not just WiC, but Bioshock and probably Crysis as well that look nearly identical in DX9. You are ignoring the trend and believing the exception.

Also note that this game runs slower in DX10 as well; there was headroom to at least enable the higher-quality textures.
 
I'll admit up front to owning a 320meg card, but I do have to wonder if the DX10 version is so bad on the 320meg card due to poor optimization. Not just the game, but the Nvidia drivers also. Double hit.

It just seems a little odd that I can run Oblivion maxed out with 8x AA at 1680x1050 (mins in the 20s I think), but this game brings the card to its knees with no AA.
 
I find it interesting that these smaller dev houses (Techland, etc.) seem to be the ones on the cutting edge of graphics stuff.

Perhaps they have less to lose by forging ahead into the present day's DX10? The big dogs will let loose within three months of the time Nvidia refreshes its line. It is an industry, after all.
 