Call of Juarez DX10 Benchmark out!

Uh, well, actually I did see that post, but since it's in the games section I figured it was more for talking about the demo itself. My post here is to drum up some interest in benchmarking video cards and posting results.
 
Comp specs in sig, but swap the X1800 XT for an 8800 GTX @ 621 core / 1512 shader / 1026 memory.

1680x1050 - Full screen - High details - Normal shadows - 4x MSAA

11.9 min FPS
40.0 max FPS
21.9 avg FPS

There seems to be a lot of hitching during the benchmark, most notably when the camera zooms into the barn area - it kind of rubberbands slightly :-/
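(Side note for anyone comparing numbers: here's a minimal sketch of how min/max/avg FPS fall out of per-frame render times; the frame times below are made up for illustration. One gotcha: the average should be total frames over total time, not the mean of the instantaneous FPS values.)

[code]
# Minimal sketch: min/max/avg FPS from per-frame render times.
# frame_times is hypothetical (seconds per frame, as a tool like
# FRAPS would log); the values are made up for illustration.
frame_times = [0.025, 0.084, 0.031, 0.046, 0.052]

instantaneous_fps = [1.0 / t for t in frame_times]
min_fps = min(instantaneous_fps)
max_fps = max(instantaneous_fps)
# Average FPS = total frames / total time, NOT the mean of the
# per-frame FPS values (that would overweight the fast frames).
avg_fps = len(frame_times) / sum(frame_times)

print(f"min {min_fps:.1f} / max {max_fps:.1f} / avg {avg_fps:.1f}")
[/code]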
 
Holy shit I might as well have integrated graphics.


1280x1024 (stock 540/1400)
2x msaa
low details
avg 2.6 fps :eek:


1024x768 (stock 540/1400)
no aa
low details
avg 5.1 fps :(


1024x768 (oc 667/1776)
no aa
low details
avg 5.4 fps :mad:
 
Yeah.
8600 + DX10 = Not so sexy.

I am downloading right now.

EDIT: WOW LOL YEAH I SOOOOOOOOOO HAVE DX10.
Makes me want to go download Vista.
 
It's not like the 8800 GTS and GTX are tearing through the game either. This DX10 shit is worthless for 95% of people since it won't be playable.
 
My Rig:

X2 3800+ @ 2.4GHz
2 gigs of PC3200
8800 GTS 320MB OC'd to 600/900
Vista Home Premium

Default Settings:
1280x1024 res
2048x2048 textures
2x AA
low detail

Min 8.8 / Max 82.5 (???) / Avg 21.9

I only bought the 8800 GTS 320 as an interim solution (for DX9) till we get G92, 'cause blowing my wad on an 8800 GTX is a waste of money. The 8800s are just not going to cut it for DX10. You will be seeing my GTS in the FS forums in about 6 months.
 
Yeah, but think about this... you have an overclocked 8800 GTS and are only getting 22 fps at 1280 with low details. So even if the G92 is twice as fast, you still won't be cracking 30 to 35 fps at 1600 with higher details (rough math below). So many people with 8800 GTS and GTX cards have monitors that run at 1600 or higher, and this DX10 stuff is way too taxing. Also, CoJ doesn't even look that good.
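(A sketch of that back-of-the-envelope, assuming FPS scales linearly with GPU speed and inversely with pixel count; a crude fill-rate-only model that ignores CPU limits and detail settings, not a real benchmark:)

[code]
# Crude fill-rate-only model: FPS scales with GPU speed and
# inversely with rendered pixel count. Numbers from the post above;
# the 2x G92 speedup is hypothetical.
fps_now = 22.0                    # OC'd 8800 GTS at 1280x1024, low detail
speedup = 2.0                     # "G92 is twice as fast"
pixels_now = 1280 * 1024
pixels_next = 1600 * 1200

fps_next = fps_now * speedup * pixels_now / pixels_next
print(f"estimated: {fps_next:.0f} fps")   # ~30 fps, before raising details
[/code]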
 

What is that?
A theoretical linear increase?

Hopefully it will be twice as fast in the same sense that the 8800GTX was amazingly fast compared to its competition and predecessors in DX9, except this time for DX10.

The 8800s were out quite some time before any DX10 application.

Maybe the G92 will have it all optimized, fixed, and sexy.

Or it could suck.
Let's pray it doesn't suck.

Of course. No card is extra sexy for DX10 purposes. But this is all just breaking the ice of DX10. Give it some time, optimization, new cards, it will all be worth it. It isn't like games can DEPROVE in terms of quality and performance.

... I hope.
 
It's really sad that at 1024, no AA, and low settings an 8600 GT can't even get out of the single digits in the CoJ and Lost Planet benchmarks. Those games don't even look that good to me, so what about even more graphically demanding DX10 games? :eek:
 

Lost Planet looks REALLY good in DX9. In my opinion.

I personally said screw it and bought an Xbox 360.
The hardware standard is the same for everyone, so it doesn't lag.
No multiplayer issues (yet).

If I ever need a PC upgrade, I'm personally considering buying a PS3, putting Linux on that bad boy, and saving myself some money. It's pretty powerful, operating-system-wise.
Not for gaming, though.

Let's just say that the 8000s are the testing stage for both nVIDIA and game developers.
We are the guinea pigs.
Developers need to optimize, and nVIDIA needs to see what the developers are doing. :)
 
Machine in sig:
Min: 11
Max: 35
Average: 20

Chugged the whole way... choking on memory somewhere? Started with about 800MB of RAM used; I don't kill apps to play games or benchmark, but otherwise pretty...

Oh: shadows on normal, 1600x1200, 2x MSAA. Think that's it.
 
Someone else remarked, "oh yes, we've broken the 30 fps mark in DX10," in regards to LP. This (CoJ DX10) is pathetic, but considering how poorly the DX9 version ran in XP, I'm not surprised in the slightest. Luckily Crysis appears to run at a playable framerate, so things should pick up.
 

LP ran poorly on XP?
It may be a STRAIGHT console port, but it ran well for me?
Sure, I had to get some new drivers, turn concurrent operations off, and set concurrent rendering to 1, but after that I was hitting 45 FPS average?
More than playable.
 
No, I meant CoJ in DX9/XP (sorry about the way I wrote it). Depends on whether that's 45FPS as minimum, average, or maximum, but compared to a lot of games recently, I would agree that is a playable framerate, provided it doesn't dip much. My point is that the three main DX10 apps ATM (LP, CoH, now CoJ) are not impressive in the performance category, but I didn't expect much else from tacked on DX10 features.
 
I get OK frame rates for most of it, but there is a lot of hitching.

I see the main issues as being these:

1 - Drivers aren't mature enough to get all the performance that the 8800 series can provide

2 - Drivers aren't yet optimal for DirectX 10, and it'll be some time before they are

3 - The demo isn't using the video card's full memory. Hit ESC while it's running and you'll see it's only using a portion of what's really on the card. In my case it's using 248MB of my 320MB card. (One way to check actual VRAM use from outside the game is sketched after this list.)

4 - Games haven't yet been optimized for dual core, huge swags of memory, 64-bit CPUs, 64-bit OSes, or Vista. When THAT happens, we'll see some HUGE performance boosts.
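(On point 3: a minimal sketch of one way to check actual VRAM usage from outside the game, assuming NVIDIA's NVML Python bindings are installed; the nvidia-ml-py package and GPU index 0 are assumptions here:)

[code]
# Minimal sketch: query actual VRAM usage via NVIDIA's NVML.
# Assumes the bindings are installed (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU (assumption)
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM: {info.used / 2**20:.0f}MB used of {info.total / 2**20:.0f}MB")
pynvml.nvmlShutdown()
[/code]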
 
Since this is a TWIMTBFUNV game, MSAA and SSAA are the only anti-aliasing settings available. I'm not really sure what those settings are forcing the AA to, maybe someone knows? Anyway, CCC would not take control over the in-game settings. Renaming the .exe to oblivion.exe or whiteoutd3d10.exe made no noticeable difference in FPS or image quality, and neither rename would let CCC control the AA or AF settings.

ATI 2900XT
CoJ DX10 demo
Full screen - Details High - Shadow mapping 2048x2048 - Shadows High

[screenshot: cojdx10na2.jpg]
 
I think motion blur, when used, is going to be one of the most taxing features a game can have. I also think that in the near future we will not be able to live without it; it just looks too good, like in the Lost Planet demo. Otherwise, like in the new DiRT demo, you see wheels as if they were steady when they're really spinning. Motion blur is one of those big cinematic effects that really adds realism, right next to HDR, which itself is still far from perfect.
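(For the curious: the cheapest form of the effect is just blending the current frame with the previous one; real engines use per-pixel velocity buffers in shaders and look far better. A toy NumPy sketch of that frame-blend version, with made-up frames:)

[code]
import numpy as np

def frame_blend_blur(current, previous, alpha=0.7):
    """Cheapest motion blur: blend the new frame with the last output.

    current, previous: float arrays of shape (height, width, 3) in [0, 1].
    Lower alpha = longer trails. Real engines use per-pixel velocity
    buffers in shaders; this accumulation trick is just the toy version.
    """
    return alpha * current + (1.0 - alpha) * previous

# Toy usage with random "frames":
prev_frame = np.random.rand(4, 4, 3)
curr_frame = np.random.rand(4, 4, 3)
blurred = frame_blend_blur(curr_frame, prev_frame)
[/code]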
 