Assassin's Creed Gameplay Performance and IQ @ [H]

How much did AF affect performance on the 9600GT? The apples-to-apples test has the 9600GT with AF vs. the 3870 without AF; would that make any difference in the comparison?
 
IPC is also very important. My single-core CPU greatly outperforms the Pentium D, a dual-core CPU, in games. Even the YouGamers web tool, which is not 100% reliable, states that my CPU is slightly less powerful in raw performance than the Pentium D @ 3.2GHz, yet in WinRAR I was able to smoke a Pentium D at 2.8GHz, which scored less than my CPU (534 KB/s vs. 643 KB/s).

But as I stated before, 3DMark0X and YouGamers are not 100% reliable. I compared three systems in Futuremark's database, all with the same Radeon HD 3850 at stock: one with a Pentium M at 2.7GHz and the AGP version of the HD 3850, the other two with a Pentium 4 at 4.1GHz and a Pentium D at 3GHz and the PCI-E version. The Pentium D's and Pentium 4's CPU scores were higher than the Pentium M's, but the overall score favored the Pentium M. Even the single-core Pentium 4 managed a much higher overall score than the Pentium D, even though the latter had a slightly higher CPU score and is dual-core.

That shows that multi-core CPUs are important, but the architecture is even more important: the Pentium M's higher IPC stretched the graphics test scores considerably further than the Pentium D's and Pentium 4's did, even though in raw power the Pentium M is slightly less powerful. NetBurst also needs Hyper-Threading to remain competitive because of its long pipeline, which sits idle most of the time. The only Pentium 4-based dual-core CPUs with performance impressive enough to rival many X2s and some Core 2 Duos were the Pentium EE 965 and 955, even though they need a nuclear plant to feed them and an iceberg to keep them cool.
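Just to put the IPC point in numbers: effective throughput is roughly IPC x clock, which is why a higher-IPC chip at a lower clock can come out ahead. A back-of-the-envelope sketch (the IPC figures below are made-up illustrative values, not measurements of any real chip):

Code:
// Hypothetical illustration: throughput ~ IPC x clock.
// The IPC values are invented for the example, not measured.
#include <cstdio>

int main() {
    struct Cpu { const char* name; double ipc; double ghz; };
    const Cpu cpus[] = {
        {"high-IPC chip (Pentium M-like)",     1.8, 2.7},  // assumed IPC
        {"long-pipeline chip (NetBurst-like)", 1.0, 3.2},  // assumed IPC
    };
    for (const Cpu& c : cpus)
        std::printf("%-38s ~%.2f G instructions/s\n", c.name, c.ipc * c.ghz);
    return 0;
}

So 1.8 x 2.7 = 4.86 vs. 1.0 x 3.2 = 3.2: the lower-clocked chip wins on work done per second.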

Luckily, the current dual-core implementations based on the Core 2 Duo and Phenom architectures are more robust than the venerable Netcrust/Netbust/NetPOS marchitecture, whose design was never suited to multi-core implementation, but rather to exceptional single-core performance, especially in heavy workstation workloads. Sorry for my long blah blah blah :D
 
Thanks for the single-core test. But why use 1600x1200? That is an insane res to play at, period. My biggest worry was buying a game that says dual-core CPU only and the game not letting me install it since I have a single-core CPU. Us single-core losers can still play it, just not as fast lol
 
Some of you guys will drop $600+ for a POFS console that is obsolete in a year, but you won't spend $100 for a dual-core CPU? Unbelievable.
 
I wonder how an old P4 with Hyper-Threading would hold up to AC's dual-core requirements. Anyone want to try it, with & without HT at like 3.0GHz, so we can compare it to a Core 2 at 3GHz?
 
I think it would be interesting to speculate how future Intel parts with Hyper-Threading and an IMC will perform.
 
I am the one who said Assassin's Creed is playable on a 2.2GHz AMD64 3500+ OC'd to 2.4GHz.

Let me clarify.

I failed the install test. I installed it anyway.

The cinematics are noticeably slower than they should be. The dialogue is about half a second ahead of the characters' motion. This really isn't a problem because it's just a cinematic.

The gameplay itself is 100 percent playable and responsive. I press a button, the character responds. No delay. I can jump across rooftops, climb buildings, run full speed through the city, and fight 6+ guards at once without any lag.

I've been playing video games for 25 years. I was on a Return to Castle Wolfenstein team that had a 2 year streak at #1 on Team Warfare ladders (a record for any game). I am not that girl.

If you require further proof I will happily run any benchmark or test, but I assure you, it is perfectly playable.
 
Any chance you can post a FRAPS run or something? And thanks for speaking up.
 
That is an insane res to play at, period.
Insane resolution? I am sad if I have to drop to only 1600x1200 resolution :(
 
The AMD 3500+ OC'd to 2.4GHz is still packing a good punch for gaming performance. Many people try to simulate a single-core CPU by disabling the other core(s) of a dual-core CPU, but that's not the same because of the overhead; a native single-core CPU like that one still has power left. Of course, it would be seriously bottlenecked with CrossFire/SLI cards, or slightly bottlenecked with a GeForce 8800GT or Radeon HD 38X0 series...
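For what it's worth, "disabling" a core for a test like this is usually done by pinning the process to one logical CPU; a minimal Win32 sketch (bit 0 of the affinity mask selects the first logical processor):

Code:
// Minimal sketch: restrict the current process to logical CPU 0, which is
// roughly how people simulate a single core for game tests on Windows.
#include <windows.h>
#include <cstdio>

int main() {
    if (!SetProcessAffinityMask(GetCurrentProcess(), 1)) {  // mask bit 0 = CPU 0
        std::printf("SetProcessAffinityMask failed: %lu\n", GetLastError());
        return 1;
    }
    std::printf("Process pinned to CPU 0; run the workload from here.\n");
    return 0;
}

The pinned process still runs on a dual-core die with its shared cache/coherency machinery, which is the kind of overhead being argued about here.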
 
What overhead? Why do you think there is some "overhead" if you only use one core of a dual-core setup? Gawd, I'm not going to explain how a processor works, how it communicates with main memory, the PCI bridge, etc... What you need to worry about is a bottleneck. It's quite apparent that AC is bottlenecked by its FP calculations, so a dual-core or a much faster single core is the way to go. If this weren't a console port I'm sure they would have off-loaded more of the FP to the GPU.

Go read something, get smarter and stop playing so many video games.
 
2008-04-30 16:55:13 - AssassinsCreed_Dx9
Frames: 1247 - Time: 60000ms - Avg: 20.783 - Min: 16 - Max: 31

1600x1200 85hz, 1/3 shadows, 2/4 detail/graphic quality

2.2ghz 3500+ @2.4ghz
2gb corsair
74gb raptor
6800GT OC @ (391/1050)

Well, I can say I am surprised. It feels better than the stats look.

I am sure if I turned down the detail it would run better.

edit:

2008-04-30 17:19:03 - AssassinsCreed_Dx9
Frames: 1619 - Time: 60000ms - Avg: 26.983 - Min: 18 - Max: 42

1280x1024 85hz, 1/3 shadows, 2/4 detail/graphic quality


Also I did it in Acre, ran along the crowds in the rich districts like the review did while killing guards.
 
Well, that's not a nice way to talk to somebody who is older than you. The juvenile, insulting attitude in your reply suggests you're hovering around the low teens in age; it seems your parents didn't educate you enough. I guess you're the one who needs to read more and grow up. Once you accomplish that and stop the trolling and the personal attacks, I will explain to you specifically why the overhead exists; otherwise I will just ignore you and keep posting with the mature ones.
 
Unfortunately I'm much older than most people here. I didn't sleep well and I've just grown tired of listening to people make stuff up to type so they can feel smart. Also, I have to post occasionally or my login will be erased, again.

What you said was crap, and everything I can think of to respond with I know is going to go over your head. At least I can let everyone else know that what you said is crap and that you need a dual-core processor these days.
 
Well, if you don't feel capable of replying without getting emotional, you shouldn't reply until your mind is clear. Have you ever asked yourself why, when Intel launched the Core Duo, it had higher cache latencies than the Pentium M? Because of the cache coherency mechanism and the dynamically sized cache used to save power. Intel uses an inclusive cache while AMD uses an exclusive one, and Intel's approach generally results in higher latencies because data cached for core 1 can spill into core 2's cache and add latency, etc. A single core is highly optimized around its own cache, execution units, its own HT link (on AMD CPUs), and so on; disabling a core on a dual-core CPU is not going to give you the same performance as a native single core. In many gaming tests that weren't optimized for multithreading, the more powerful Core Duo lagged behind the Pentium M.

Dothan @ 2.13GHz vs. Yonah @ 2.0GHz (Yonah has a considerably improved FPU):

         L1 cache latency    L2 cache latency
Dothan   3 cycles            10 cycles
Yonah    3 cycles            14 cycles

         DivX                Doom 3
Dothan   39.7 fps            95.5 fps
Yonah    57.5 fps            93.8 fps

Source of those tests.
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2627&p=3

More to read:

http://www.madshrimps.be/?action=getarticle&number=5&artpage=1908&articID=434
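For anyone curious where latency numbers like those come from: they're typically measured with a pointer-chasing loop, where every load depends on the previous one, so the loop time approximates the chain latency. A rough, self-contained sketch (array size, iteration count, and RNG seed are arbitrary choices; prefetchers and timer resolution make this approximate at best):

Code:
// Rough pointer-chasing microbenchmark for average load-to-use latency.
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

int main() {
    const std::size_t n = 1 << 16;  // 64K entries (~512KB); vary to sweep L1/L2/RAM
    std::vector<std::size_t> next(n);
    std::iota(next.begin(), next.end(), std::size_t{0});

    // Sattolo's algorithm: permute the identity into one big cycle so the
    // chase visits every slot before repeating.
    std::mt19937 rng(42);
    for (std::size_t k = n - 1; k > 0; --k) {
        std::uniform_int_distribution<std::size_t> pick(0, k - 1);
        std::swap(next[k], next[pick(rng)]);
    }

    const std::size_t steps = 10000000;
    std::size_t i = 0;
    const auto t0 = std::chrono::steady_clock::now();
    for (std::size_t s = 0; s < steps; ++s)
        i = next[i];  // dependent load: cannot overlap with the next one
    const auto t1 = std::chrono::steady_clock::now();

    const double ns = std::chrono::duration<double, std::nano>(t1 - t0).count();
    std::printf("~%.3f ns per dependent load (sink: %zu)\n", ns / steps, i);
    return 0;
}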
 
Or, you know, the <2% performance difference might have something to do with the ~6% clockspeed difference between the two chips. Maybe. Or perhaps it was the preproduction motherboard with the unfinalized BIOS. I'm just throwing that out there.

But then I may not be looking at this with a clear mind.

And now that the CPU architecture lesson that nobody asked for or needed is over, let's move on:

Soooo... unsynchronized audio in cutscenes, barely playable framerates in the city, and you say they're worse when you leave the city walls. All this under DX9 on an XP machine. Sounds like you may not meet the minimum system requirements.
 
Hmm, [H]ard said 25 fps was playable on this game.
 
On my single-core CPU it is playable with everything maxed; lowering the quality settings barely makes a difference. It stays in the mid 30s, goes all the way to 65fps indoors, and drops to the low 20s when there are a lot of people around, which is barely playable, but still playable. Clock speed has barely anything to do with cache latency; otherwise the Pentium 4 would have the lowest latency ever. But this CPU architecture lesson that nobody asked for or needed is as unnecessary as your speculation about the preproduction motherboard and unfinalized BIOS, so let's move on ;)
 
Maybe I missed it, but why no mention of all the artifacts in the AMD screens? In the conclusions, anyway, which gave the impression that image quality was basically equal. From what I saw, the AMD screens ALL had artifacts, and of course poor filtering.
 
Well, it could be a driver issue, because its DX9 performance is a bit on the low side. I'm currently playing it and I haven't found any artifacts with either my X1950XT or my HD 3850. AF must be forced in the control panel, because the game doesn't have an option to enable it.
 
It is likely that Nvidia wants to make their 9900 cards seem even more superior to the 8 series, and since they aren't ready to release them, they have deemed the world not ready for DX10.1.
 
How about DX9 vs. DX10 performance and screenshots? Or is that for a future article?

The game runs fine on an 8800 GTS 512MB OC in DX10, but it has glitchy shadows/graphics on some foliage, and frankly I do think ATI's HDR has more pop and looks more true to life. It would've been better if they had just toned it down a bit; sunlight is pretty strong in real life.
 
Is it just me or are the shadows in the game still jagged? I am playing at 2560x1600 with max AF and max MSAA enabled by tweaking the ini file. It definitely improved the overall IQ, but the shadows still bug me. Just wondering if anybody else sees the same.
 
The game does support anti-aliasing, but the shadows look jagged no matter whether they're set to high or low. Bad shadow implementation.
 
The only thing fishy about this is that they want to remove support for DX10.1. If it has a problem, fix it. There's no justification for removing it completely.
 
There would be no reason for a program to only run on dual-core CPUs, because from the program's point of view all it's doing is creating multiple threads, and the OS decides how to load-balance those threads across however many cores there are. That doesn't mean a program designed for dual-core CPUs won't run badly on a single-core CPU, though, and that largely appears to be the case.
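To illustrate, a minimal sketch (not AC's actual code, obviously): the program just spawns a fixed number of threads, and the OS schedules them onto whatever cores it finds, whether that's one or four:

Code:
// Minimal sketch: threads are created the same way regardless of core count;
// on a single-core CPU the OS simply time-slices them.
#include <cstdio>
#include <thread>
#include <vector>

void worker(int id) {
    std::printf("thread %d doing its share of the work\n", id);
}

int main() {
    // May report 1 on a single-core machine; the code below runs either way.
    std::printf("hardware threads reported: %u\n",
                std::thread::hardware_concurrency());

    std::vector<std::thread> pool;
    for (int id = 0; id < 4; ++id)   // four threads no matter how many cores
        pool.emplace_back(worker, id);
    for (auto& t : pool)
        t.join();
    return 0;
}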

I do find the story somewhat suspicious myself; the DX10.1 path looks practically the same, so this costly rendering path can't be doing much that's very obvious. There are a few artifacts in the 3870 output, which could possibly be it. Still, I'd have expected significant visual changes from a render path that takes up 20% of the GPU's time, not no effect beyond possibly reducing a few artifacts.
 