- Jan 28, 2014
1680x1050 here with a Q6600 and, soon after, 8800 GT SLI. Dell 2005FPW IPS monitor. I remember posting on this forum back in 2007 during the release of the original Crysis. All of us "enthusiasts" posting pics of our new Q6600s and 8800 GTXs and Ultras, all in an attempt to run Crysis maxed out at 1280x1024. Good times
To this day, Crysis doesn't really scale with more cores the way you'd expect from a modern CPU. In the words of the Devolver Digital engineer... "It's a really nice trick!"
View attachment 470393
I remember when the game first came out it was extremely sensitive to CPU clocks and core counts. It took me the better part of 2 days to get it running with the fewest issues. It absolutely refused to run on my QX6700 when it was overclocked to 2.97 GHz. It ran best with 2 cores disabled at stock clock speed, until a Windows Vista patch fixed an issue with the newer quad-core processors.
To be fair, multicore only became a thing on consumer desktops around 2004-2005, and I imagine it wasn't easy for game developers to adapt to the parallelization required to run well in a multicore environment. One of the hardest things to do in games is to synchronize all your threads and make sure they don't deviate every single frame. It was an extreme paradigm shift.

Re: Crysis and CPUs, I remember a few games around that time making a big deal about "multicore support!!!1" - Supreme Commander and Bioshock also come to mind. Thing is, the hype evaporated as soon as actual performance reviews came out showing 4C failing to come out ahead again and again. Sure, 2C was way better than 1C and 1C/2T, and technically 4C/4T could be utilized to some extent, but it wasn't until the 2010s that I saw games really utilizing 4C in a way that could make or break performance.
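To illustrate the per-frame synchronization point: a common pattern is to put a barrier at the end of each frame so no thread races ahead into the next one. This is just a minimal sketch of that idea in Python (not how CryEngine or any specific engine actually does it; the thread count, frame count, and "work" are all made up):

```python
import threading

# Hypothetical sketch: worker threads must all reach the barrier before
# any of them may start the next frame. One slow thread therefore stalls
# the whole frame -- the synchronization cost described above.
NUM_THREADS = 4
NUM_FRAMES = 3

frame_log = []               # (frame, thread_id) records, for demonstration
log_lock = threading.Lock()
barrier = threading.Barrier(NUM_THREADS)

def worker(tid):
    for frame in range(NUM_FRAMES):
        # ... per-thread work for this frame would go here (physics, AI, etc.) ...
        with log_lock:
            frame_log.append((frame, tid))
        barrier.wait()       # no thread begins frame N+1 until all finish frame N

threads = [threading.Thread(target=worker, args=(i,)) for i in range(NUM_THREADS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Thanks to the barrier, every frame-N entry precedes every frame-(N+1) entry.
print(all(frame_log[i][0] <= frame_log[i + 1][0] for i in range(len(frame_log) - 1)))
```

The barrier makes frame boundaries deterministic even though work within a frame runs in any order, which is exactly why scaling past a couple of cores was hard: the frame only finishes when the slowest thread does.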
I imagine the ideal CPU for running games from the late DX9 / early DX10 era would be something like a 2C/4T part with a massive clock speed and a big L3 cache.