Assassin's Creed Gameplay Performance and IQ @ [H]

... Wow, single-core gaming really is on the way out for new games, it looks like... or maybe if games were made for PC and not consoles and then ported, we would have better overall performance anyway. :p

It's a little bit of both. There is almost certainly some further optimization that can be done to just about any software to enhance performance, but the return on the investment of time and manpower to execute those optimizations is increasingly small.

Overall, I feel that games have just about exhausted the quality experience that is deliverable from a single CPU or single-core CPU. Going forward, games are only going to get more demanding, and if you refuse to upgrade your processing engine accordingly, you are going to be left behind.
 
OK, I have a funny feeling about this: since the removal message came from Ubisoft via an NVIDIA email, it's raising red flags all over the place, and hopefully the "big boys" will pick this up and run with it.

If there turns out to be a real problem down the line, then the OPTION to turn it on/off should be available.

NVIDIA keeps cutting 10.1 support out of their cards (why? what is the problem?), so until you see a huge announcement I would wonder whether they have REAL 10.1 parts in the pipeline.
 
When Rage3D published their review of AC and mentioned DirectX 10.1, I was playing the game on DX9/XP with all settings maxed using a heavily overclocked 8800 GT. I visited the Ubisoft site that same day and viewed a not-quite-complete blurb about them removing DirectX 10.1 support in the next patch. That was over two weeks ago, mind you, and the patch has yet to arrive. But when I visited yesterday the wording had changed, claiming DirectX 10.1 will be reinstated in a subsequent patch.
 
So, upon Kyle's request, I ran the test again in an apples-to-apples (A2A) fashion with the game's affinity set to Core 0 only. I chose 1600x1200, with max in-game settings and AA disabled.

Here is the result:

[Attached graph: cpu_core_graph.gif]


While this isn't exactly the same as a pure Single Core test, it's as close as my testing system will get me.
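
For anyone who wants to reproduce that kind of affinity pinning, here is a minimal sketch, assuming Windows and the Win32 API. The SetProcessAffinityMask call is the real mechanism; the program itself is illustrative and not from the test setup (in practice you can also just set affinity externally via Task Manager):

    // Minimal sketch: pin the current process to logical core 0 on Windows,
    // approximating the single-core test described above. Illustrative only.
    #include <windows.h>
    #include <cstdio>

    int main() {
        DWORD_PTR mask = 1; // bit 0 set = logical processor 0 only
        if (!SetProcessAffinityMask(GetCurrentProcess(), mask)) {
            std::fprintf(stderr, "SetProcessAffinityMask failed: %lu\n",
                         static_cast<unsigned long>(GetLastError()));
            return 1;
        }
        std::puts("Pinned to core 0; every thread now competes for one core.");
        return 0;
    }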
So, do any of you know which of these three multithreading techniques/modes they use?

Synchronous Function Parallel.
Asynchronous Function Parallel.
Data Parallel.

http://www.gamasutra.com/features/20060906/monkkonen_01.shtml

Everything I've found on the net seems to discourage the actual use of multicore for gaming.
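
For reference, here is a rough, self-contained sketch of what the function-parallel and data-parallel models from that article can look like in practice. The subsystems are invented stand-ins, nothing here is from Ubisoft's engine, and a truly asynchronous function-parallel design would avoid the per-frame joins shown here:

    // Illustrative sketch of two of the article's multithreading models.
    // Function parallel: independent subsystems run on their own threads
    // (joined at a sync point, i.e. the synchronous flavor of the model).
    // Data parallel: one workload is split across threads doing the same work.
    #include <cstddef>
    #include <cstdio>
    #include <functional>
    #include <numeric>
    #include <thread>
    #include <vector>

    void physics_step(std::vector<float>& bodies) {  // stand-in subsystem
        for (auto& b : bodies) b += 0.016f;
    }
    void ai_step(std::vector<int>& agents) {         // stand-in subsystem
        for (auto& a : agents) a = (a + 1) % 4;
    }

    int main() {
        std::vector<float> bodies(1000, 0.0f);
        std::vector<int>   agents(1000, 0);

        // Function parallel: unrelated jobs on separate threads.
        std::thread t1(physics_step, std::ref(bodies));
        std::thread t2(ai_step, std::ref(agents));
        t1.join(); t2.join();

        // Data parallel: split one array between two threads running the same code.
        auto half = [](std::vector<float>& v, std::size_t lo, std::size_t hi) {
            for (std::size_t i = lo; i < hi; ++i) v[i] *= 2.0f;
        };
        std::thread t3(half, std::ref(bodies), std::size_t{0}, bodies.size() / 2);
        std::thread t4(half, std::ref(bodies), bodies.size() / 2, bodies.size());
        t3.join(); t4.join();

        std::printf("checksum: %f\n",
                    std::accumulate(bodies.begin(), bodies.end(), 0.0f));
        return 0;
    }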
 
Wow, so DX 10.1 really makes a difference; I remember Crytek and NVIDIA saying the opposite. :D Nice review, by the way.
 
I still have to say that the house is burning, Ubisoft. They released a game with a poor DX 10.1 implementation that needs a patch to fix it, but before that patch they are just going to remove it; isn't that cutting off a limb to fix a little cut? I bet this patch will come when NV's next-gen cards that support DX10.1 show up, removing any advantage AMD has. :rolleyes:

Inherently, this is what's wrong in the PC gaming industry. I'm sure you and I both want a game that works as flawlessly as it does on a console. Again, still waiting for the PC Gaming Alliance to do anything to help us PC gamers receive quality games. :rolleyes:
 
So, do any of you know which of these three multithreading techniques/modes they use?

Synchronous Function Parallel.
Asynchronous Function Parallel.
Data Parallel.

http://www.gamasutra.com/features/20060906/monkkonen_01.shtml

Everything I've found on the net seems to discourage the actual use of multicore for gaming.
Wat?

Why would multicore possibly be discouraged for gaming? Unless you're referring to the general tradeoff between clock speed and core count, in which case clock speed is generally superior, but with more games becoming multi-core aware, that advantage is diminishing.
 
Yeah, I agree with the others; the fact that this announcement comes from NVIDIA makes this seem way too fishy...
 
So, do any of you know which of these three multithreading techniques/modes they use?

Synchronous Function Parallel.
Asynchronous Function Parallel.
Data Parallel.

http://www.gamasutra.com/features/20060906/monkkonen_01.shtml

Everything I've found on the net seems to discourage the actual use of multicore for gaming.

Just as I've been saying in another thread... with all the physics going on in the game, a whole city full of people and their reactions, wildlife, a sandbox atmosphere... there's no way a single-core CPU could crunch all those numbers efficiently.
 
Has anyone played both a console version (especially PS3) and the PC version? I am about to get a PS3 and was wondering which version of the game I should purchase. I'm sure I could get better graphics on the PC, but how does the immersion feel, since the game was originally created for a console controller and not a keyboard and mouse?
 
Exactly how I understood it. They said that their implementation of DX 10.1 removed a render pass, hence the performance boost. However, removing the render pass should actually decrease IQ compared with not removing it. Therefore in their current implementation, DX 10.1 has inferior IQ to DX 10. It's basically a bug and they've decided to remove support altogether until they can figure out how to implement DX 10.1 without the performance penalty. I'm surprised that the authors of this article didn't interpret this correctly. The article does *not* in fact show that DX 10.1 improves performance for AMD 3800 series cards. It's a bug in Assassin's Creed, that's all.

Also, the final apples-to-apples comparison (9600 GT vs. 3870) is NOT apples to apples. Anisotropic filtering is enabled on the 9600 GT, resulting in a performance penalty and invalidating the comparison. I understand that the 3870 had problems with AF; therefore, the comparison should have been done with AF disabled on both cards.

Totally agree with everything Axel said right there: the 3870 has an advantage in DX 10.1 because somehow they can't do the AF correctly in 10.1.

So obviously we must wait again till they make DX10.1 work with AF; then we can claim X% improvement for AMD/ATI cards...
Until that patch, please redo the apples-to-apples test with the same level of AF!! :p

Show me some updated benchmarks!!! :D
 
Just as I've been saying in another thread... with all the physics going on in the game, a whole city full of people and their reactions, wildlife, a sandbox atmosphere... there's no way a single-core CPU could crunch all those numbers efficiently.

If you have a single core and a dual core that can do the same number of total calculations per second, how is being dual-core going to be any different? :rolleyes:
 
Sounds like I'm going to have to finally upgrade from my FX-55.

RIP, she lived a good life.
 
Well, the game is clearly written from the ground up to be multi-core, rather than having multi-core support added as an afterthought. In theory, the game should be only half as fast with one core vs. two, so there is also an obvious lack of single-core optimizations. That makes sense, since even low-end CPUs are at least dual-core these days.

How does the game scale with 4 cores, 8 cores (two quad-core CPUs), etc.?
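
As a rough yardstick for that scaling question (a textbook Amdahl's law sketch with an assumed workload split, not measurements from this game): if a fraction $p$ of each frame's work can run in parallel across $n$ cores, the best-case speedup is

$$ S(n) = \frac{1}{(1 - p) + p/n} $$

With $p = 0.9$, that gives $S(2) \approx 1.82$, $S(4) \approx 3.08$, and $S(8) \approx 4.71$. So "half as fast on one core vs. two" would imply nearly all of the frame's work is parallelized, while the returns from 4 or 8 cores still diminish unless $p$ is very close to 1.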
 
That CPU scoring seems fishy to me. I was able to play AC at 1024x768 with everything maxed except shadows at 2/3 and without anti-aliasing (for testing purposes; I will try later with maxed shadows and anti-aliasing to see), and the game was perfectly playable. For some reason FRAPS doesn't work well with this game, though; it shows some weird numbers, so I will try again later. Multi-core in games is important, but a more powerful GPU will generally give you more gains, unless your CPU is ridiculously slow (a Celeron).
 
I noticed this too. I understand that NVIDIA can do AF fine and ATI can't; it's fine for the highest-settings tests to be done like this, but not the apples-to-apples, because it simply isn't apples to apples.

The exact same in-game settings are selected for both video cards in the ATI vs. NV graphs. AF is broken on ATI cards, and if we had disabled AF for the NV cards, we would have had to change the visual quality options, making the in-game settings very different. So while AF is broken on the ATI cards, all the in-game settings are set at the same values.
 
I tested the game with anti-aliasing, and it seems that the frame rate drop is minimal. Even if I lower the in-game settings to the lowest, the performance gains aren't that great (about 5 to 13 fps). I'm both CPU and GPU limited, but it seems that I'm more CPU limited than GPU limited in this game. And the overall performance of the game isn't that great, considering that Crysis looks better and runs faster on my system than this game. Strange.
 
How about DX9 vs. DX10 performance and screenshots? Or is that for a future article?

Gosh, enough with the DX9 already... [H] stopped taking note of the 8800 GTX in testing and I'm not complaining... DX9 is so far removed from [H] it's not funny... ;)

Why don't they bench DX8 as well, for those people with Intel integrated graphics? Who cares if the game can't run in DX8; just pull FPS from a blank screen. Or perhaps WineDX9...

Seconded. I'm still not willing to install Vista.
Ugh... there's just no reasoning with some people... I can't believe this is still an issue...

*yawn*

There are already enough results to sift through, and you want to double them... I for one have no interest in DX9, Windows XP, or outdated VGA cards. Be happy that games are still compatible with DX9... I'm sure that will stop before too long...

In a pansy voice: "Oh no, I can't run useless 3DMark Vantage 'cause I'm too scared to run Vista, please protect me, big boy... ooooh, ooooh."
 
Wat?

Why would multicore possibly be discouraged for gaming? Unless you're referring to the general tradeoff between clock speed and core count, in which case clock speed is generally superior, but with more games becoming multi-core aware, that advantage is diminishing.

Well, why don't you research gaming and multithreading yourself? Now, if this game is proof of anything, my question is how they pulled it off. Especially for this type of game.
 
gaspah, you're a funny man/woman!

...but I do agree with you. The Luddites are still with us.
 
I would like to see a CPU comparison with AMD's duals vs. tri-cores vs. quads, and then toss in Intel's duals and its quad for a bit of spice. It would be interesting to see if AMD's tri-core actually delivers on the "more cores is better" front when compared to Intel's duals.
 
Well, why don't you research gaming and multithreading yourself? Now, if this game is proof of anything, my question is how they pulled it off. Especially for this type of game.

It's true that we cannot discourage multi-core for gaming, but the game doesn't look that great compared to other games that are multi-core aware (like Crysis), and it runs even slower. Giving more work to the CPU instead of the much more massively parallel GPU is not a wise move; even though the GPU lacks intelligent collision detection, Crysis proved through sys_physics_cpu=0 that GPUs are better at physics than CPUs. And even though I don't know what kind of work Assassin's Creed is making the CPU do, no CPU is powerful enough for massive draw calls, physics, or graphic effects.

DX10 considerably reduced the overhead of draw calls and driver requests, and moved many graphic effects that relied heavily on the CPU off of it.
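
To make the draw-call point concrete, here is a toy sketch. The "driver" is a made-up stub that just counts submissions, modeling the fixed per-call CPU/driver cost that DX10-era batching and instancing are meant to amortize; nothing here is a real graphics API:

    // Invented stub for illustration: each submit() stands in for the fixed
    // CPU-side cost of one draw call crossing into the driver.
    #include <cstdio>

    struct FakeDriver {
        long calls = 0;
        void submit(int instances) { ++calls; (void)instances; }
    };

    int main() {
        const int crowd = 5000;         // e.g. one draw per NPC in a city scene
        FakeDriver perObject, instanced;
        for (int i = 0; i < crowd; ++i)
            perObject.submit(1);        // 5000 calls: 5000x the fixed overhead
        instanced.submit(crowd);        // 1 instanced call covers the whole crowd
        std::printf("per-object: %ld calls, instanced: %ld call(s)\n",
                    perObject.calls, instanced.calls);
        return 0;
    }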
 
In response to the person who wanted to know whether to buy AC for his PS3 or his PC: go with the PC version if you can run it! I beat it on my friend's Xbox, then I bought it and am playing it on my PC. It is much better on the PC, but then I use a USB Xbox controller to play it...

Also using an E4300 dual-core @ 3GHz w/ an 8800 GTX @ 640MHz core / 1040MHz memory, running under WinXP on DX9. I have no plans to upgrade to Vista. Maybe when I can finally upgrade my video card...

No problems running the game with every setting maxed at 1920x1200.

The mouse/keyboard controls work flawlessly, I just prefer to lounge back with my controller.
 
Very interesting testing; I never thought there'd be such a huge gap between dual/single cores... What kind of CPU is in the Xbox 360, anyway? I know the original Xbox's wasn't a multi-core proc, was it? I do know the PS3's Cell processor is highly efficient at multi-threading, though...
 
If you have a single core and a dual core that can do the same number of total calculations per second, how is being dual-core going to be any different? :rolleyes:

No matter how fast one hand is, you'll never be able to grab as many apples as two hands.

You saw the graph; how obvious can it be?

You're STILL splitting one CPU between all/any background processes AND trying to crunch an insane amount of physics at the same time. Not going to happen, at least not efficiently.

Besides, when the DEVELOPERS state on the box that you need a minimum of a 3800 X2 in order to play the game, I think I'll listen to them over everyone else. :)
 
I am glad I read this article. Quite frankly, I am shocked. NVIDIA just released a refresh of their line of cards knowing full well that Vista SP1, and therefore DX10.1, was coming (and is now here), yet they completely ignored implementing that support? Are they serious? It's not like they didn't have enough time. Before reading the article I was under the assumption that NV released 10.1 support with the 9x series. They should have had support *before* SP1 even came out. Why, oh why, are so many end-users, hardware manufacturers, and software vendors taking SO DAMN LONG to move on from XP? Vista SP1 is out already, for crying out loud (and it runs great)! By the time they all get their $#!^ together, Windows 7 will be out and the problem will repeat itself.

I'm still using an older 7800 GT and was going to upgrade when we get our tax stimulus checks. Knowing now that NV hardware doesn't support it, what should I do? Should I wait for their next series? Is DX10.1 hardware support even worth holding out for, or should I buy now? When does NV plan on implementing support? Does AMD have any spectacular new graphics boards on the way soon, worthy as an alternate choice? (I'm all for AMD, but sorry, quite frankly I don't see a reason to buy with their current releases.)

/rant
 
The exact same in-game settings are selected for both video cards in the ATI vs. NV graphs. AF is broken on ATI cards, and if we had disabled AF for the NV cards, we would have had to change the visual quality options, making the in-game settings very different. So while AF is broken on the ATI cards, all the in-game settings are set at the same values.

Not sure I follow your second sentence. Isn't there an option in the NV control panel to simply disable AF while leaving everything else alone, i.e. not touching other visual quality options or in-game settings?

Regardless, the bottom line is that the chart is in the "apples to apples" section but it's not apples to apples. So the chart should either be removed (with an appropriate explanation provided) or fixed.

Apart from this and the flawed interpretation of the DX 10.1 issue, good review.
 
No matter how fast one hand is, you'll never be able to grab as many apples as two hands.

You saw the graph; how obvious can it be?

You're STILL splitting one CPU between all/any background processes AND trying to crunch an insane amount of physics at the same time. Not going to happen, at least not efficiently.

Besides, when the DEVELOPERS state on the box that you need a minimum of a 3800 X2 in order to play the game, I think I'll listen to them over everyone else. :)


So you're saying that if One-Armed Joe can pick 10 apples per second with his one hand, but Bob can pick 5 apples per second with each hand, Bob will be able to pick more apples?

I specifically said they could do the same amount of calculations per second. If anything, two cores will lose to overhead.

It's all a moot point though, because people have posted in this VERY ????ing thread about running it on an AMD 64 3200-ish (OC'ed) at playable FPS.

Get that through your head.

I'm not going to argue with that graph, but that is only one core of a dual-core, which isn't exactly the same.

Let me clarify something: I think dual cores are great. I think quad cores are neato. I hope gaming moves towards more support for dual cores. BUT, no matter how much they said this thing requires a dual core to run, people are playing it at playable FPS without the "required" dual core.
 
...Pats his 939 Opteron... My precious...

I knew it wouldn't take long. I made the move to dual core as soon as it was economically feasible. Hell, I think my grandfather's hearing aid has more than one core!
 
NVIDIA just released a refresh of their line of cards knowing full well that Vista SP1, and therefore DX10.1, was coming (and is now here), yet they completely ignored implementing that support?

Nah, they probably have the DX 10.1 tech ready, but they're holding on to it because of the lack of competition.

They've done it before: instead of releasing the much faster GeForce 4, which was ready, they re-released the GeForce 3 as "Ti" to milk some more money from NV20 before releasing NV25.
 
I am very disappointed with the performance on my system. I am currently seeking help through Ubisoft's technical support. There are serious bugs with the game that are not resolved even by a re-install, so the game is likely incompatible with my video card. I have already spoken about this in http://www.hardforum.com/showthread.php?t=1294846

Be warned: if anyone has an X1900XT 512MB running the latest Catalyst 8.4 drivers, you may not be able to run the game without serious problems. I have some interesting screenshots of it.
http://img212.imageshack.us/my.php?image=acbuggh0.jpg (very obvious that the lighting is wrong)
http://img140.imageshack.us/my.php?image=acbugrectol0.jpg

And the game runs kind of choppy even on low settings.
5000+ X2 BE @ 3.0GHz
X1900XT 512MB
2GB DDR2-826ish 4-4-4-12
 
So you're saying that if One-Armed Joe can pick 10 apples per second with his one hand, but Bob can pick 5 apples per second with each hand, Bob will be able to pick more apples?

I specifically said they could do the same amount of calculations per second. If anything, two cores will lose to overhead.

It's all a moot point though, because people have posted in this VERY ????ing thread about running it on an AMD 64 3200-ish (OC'ed) at playable FPS.

Get that through your head.

I'm not going to argue with that graph, but that is only one core of a dual-core, which isn't exactly the same.

Let me clarify something: I think dual cores are great. I think quad cores are neato. I hope gaming moves towards more support for dual cores. BUT, no matter how much they said this thing requires a dual core to run, people are playing it at playable FPS without the "required" dual core.

1) Lose

2) Moot

3) Actually, when it comes to processors, two 1GHz chips working together IS better than one 2GHz chip in a multi-threaded app. Don't argue, because you're wrong.

Also, you can't really say for sure how well it's running on those single core systems because the only information we have is one guy saying it's "playable".

Playable is very subjective. There's a girl at my work who installed Oblivion on a system consisting of a Socket A Sempron, a Radeon 9550 clocked at 450MHz, and 768MB of DDR333 RAM. She says it's playable. What do you think?
 
Playable is very subjective. There's a girl at my work who installed Oblivion on a system consisting of a Socket A Sempron, a Radeon 9550 clocked at 450MHz, and 768MB of DDR333 RAM. She says it's playable. What do you think?

That's actually kind of funny you'd say that, because when I bought Oblivion the day it came out, I installed it on a very similar system: a Socket A 2700+ and a 9700 Pro with 512MB of DDR333. I will say that it was only "playable" at the lowest possible settings; I even had the lighting turned off, so everything was just bright white. Oblivion still struggles on my current system; sometimes even Morrowind does, lol.

But I think this whole argument about DC vs. SC is pointless, because eventually the market will shift to DC, so no one will have to worry about it. Also, if the computer can't handle the game, there is a console version that, I hate to admit, looks good and can be had for a lot cheaper (a 360 is $300-400 vs. my gaming rig at $700-800).

I think it's vital for producers to start this kind of push, so it forces the competition to actually write code with parallelism in mind.
 
1) Lose

2) Moot

3) Actually, when it comes to processors, two 1GHz chips working together IS better than one 2GHz chip in a multi-threaded app. Don't argue, because you're wrong.

Also, you can't really say for sure how well it's running on those single core systems because the only information we have is one guy saying it's "playable".

Playable is very subjective. There's a girl at my work who installed Oblivion on a system consisting of a Socket A Sempron, a Radeon 9550 clocked at 450MHz, and 768MB of DDR333 RAM. She says it's playable. What do you think?

You're using GHz as a measure of speed, which is what's loose... :rolleyes:

Nothing has changed since the day this was released, and that nothing is that games are still GPU limited, not CPU limited. If you've got a really fast single core, like say an FX-62 or so, you'll easily be able to beat the performance of the "required" 3800 X2, even with some added multithread overhead.

Oh, and we're talking about people in this thread saying it was playable. Members of the [H]ard community. But go back to reading the specs on the box, 'cause those are always 100% right... :rolleyes:
 
You're using GHz as a measure of speed, which is what's loose... :rolleyes:

I'm sorry, I didn't feel the need to delve into IPC for a general statement.

Nothing has changed since the day this was released, and that nothing is that games are still GPU limited, not CPU limited. If you've got a really fast single core, like say an FX-62 or so, you'll easily be able to beat the performance of the "required" 3800 X2, even with some added multithread overhead.

FX-62 is a dual-core chip.

Oh, and we're talking about people in this thread saying it was playable. Members of the [H]ard community. But go back to reading the specs on the box, 'cause those are always 100% right... :rolleyes:

Ah, but you're in this community too, and forgive me, but I don't think I'll be taking your word on this.
 
To update on my own performance issues: doing a complete wipe of my video drivers and reinstalling them fixed the problems I was having with lighting, but there are still errors, like the rectangle shown in my pictures. Somehow Ubi's tech guys could see that my drivers were "out of date" even though CCC was, and still is, showing the same driver info. :confused:
 