What's faster for games:
2 cores at 2.5GHz each on the same chip,
or 1 core at 5.0GHz?
I know there is more to processors than just clock speed in megahertz... but what if everything else about the two chips were equal?
I am the one who said Assassin's Creed is playable on a 2.2ghz AMD64 3500+ OC to 2.4ghz.
Let me clarify.
I failed the install test. I installed it anyway.
The cinematics are noticeably slower than they should be. The dialogue is about half a second ahead of the characters' motion. This really isn't a problem because it's just a cinematic.
The gameplay itself is 100 percent playable and responsive. I press a button and the character responds, no delay. I can jump across rooftops, climb buildings, run full speed through the city, and fight 6+ guards at once without any lag.
I've been playing video games for 25 years. I was on a Return to Castle Wolfenstein team that had a 2 year streak at #1 on Team Warfare ladders (a record for any game). I am not that girl.
If you require further proof I will happily run any benchmark or test, but I assure you, it is perfectly playable.
Insane resolution? I am sad if I have to drop to only 1600x1200 resolution.

Thanks for the single core test. But why use 1600x1200? That is an insane res to play in, period. My biggest worry was buying a game that says dual core CPU only and the game not allowing me to install since I have a single core CPU. Us single core losers can still play it, just not as fast lol
What overhead? Why do you think there is some "overhead" if you only use 1 core in a dual-core setup? Gawd, I'm not going to explain how a processor works, how it communicates with main memory, the PCI bridge, etc... What you need to worry about is a bottleneck. It's quite apparent that AC is bottlenecked on its FP calculations, so a dual-core or a much faster single core is the way to go. If this weren't a console port I'm sure they would have offloaded more of the FP work to the GPU.
Go read something, get smarter and stop playing so many video games.
Unfortunately I'm much older than most people here. I didn't sleep well, and I just grew tired of listening to people make stuff up to type so they can feel smart. Also, I have to post occasionally or my login will be erased, again.
What you said was crap, and everything I can think to respond back with I know is going to go over your head. At least I can let everyone else know that what you said is crap and you need a dual-core processor these days.
Well, if you don't feel capable of replying without involving your emotions, you shouldn't reply until your mind is clear.

Have you ever asked why, when Intel launched the Core Duo CPU, it had higher cache latencies than the Pentium M? Because of the cache coherency mechanism and the dynamically sized cache used to save power. Intel uses an inclusive cache while AMD uses an exclusive cache, and Intel's approach generally results in higher latencies because data cached for core 1 can sometimes spill into core 2's cache and add latency.

A single core is highly optimized with its own cache, its own execution units, its own HT link (on AMD CPUs), etc. Disabling a core on a dual-core CPU is not going to give you the same performance as a native single core. In many gaming tests that weren't optimized for multithreading, the more powerful Core Duo lagged behind the Pentium M.
Dothan @ 2.13GHz vs Yonah @ 2.0GHz (Yonah has a considerably improved FPU):

         L1 Cache Latency    L2 Cache Latency
Dothan   3 cycles            10 cycles
Yonah    3 cycles            14 cycles

         DivX        Doom 3
Dothan   39.7 fps    95.5 fps
Yonah    57.5 fps    93.8 fps
Source of those tests.
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2627&p=3
More to read:
http://www.madshrimps.be/?action=getarticle&number=5&artpage=1908&articID=434
2008-04-30 16:55:13 - AssassinsCreed_Dx9
Frames: 1247 - Time: 60000ms - Avg: 20.783 - Min: 16 - Max: 31
1600x1200 85hz, 1/3 shadows, 2/4 detail/graphic quality
2.2ghz 3500+ @2.4ghz
2gb corsair
74gb raptor
6800GT OC @ (391/1050)
Well, I can say I am surprised. It feels better than the stats look.
I am sure if I turned down the detail it would run better.
edit:
2008-04-30 17:19:03 - AssassinsCreed_Dx9
Frames: 1619 - Time: 60000ms - Avg: 26.983 - Min: 18 - Max: 42
1280x1024 85hz, 1/3 shadows, 2/4 detail/graphic quality
Also I did it in Acre, ran along the crowds in the rich districts like the review did while killing guards.
Or, you know, the <2% performance difference might have something to do with the ~6% clockspeed difference between the two chips. Maybe. Or perhaps it was the preproduction motherboard with the unfinalized BIOS. I'm just throwing that out there.
But then I may not be looking at this with a clear mind.
And now that the CPU architecture lesson that nobody asked for or needed is over, let's move on:
Soooo....unsynchronized audio in cutscenes, barely playable framerates in the city and you say they're worse when you leave the city walls. All this under DX9 on an XP machine. Sounds like you may not meet the minimum system requirements.
how about dx9 vs dx10 performance and screenshots? or is that for a future article?
Hmm, [H]ard said 25 fps was playable on this game.
The game does support anti-aliasing, but the shadows look jaggified no matter if they're set to high or low; bad shadow implementation.
meh i wouldn't take anything less than 30