Your 9900K still can't handle Crysis

Lol, that makes no sense. The game does NOT even hold 60 fps on hardware that is orders of magnitude faster than anything available back then. If it were optimized worth a damn, it would have scaled with faster CPU IPC. A modern Intel CPU at 5.0 GHz dropping into the 50s is a joke for a game made when Pentium CPUs were the norm and Core 2 was cutting edge. And even from a GPU standpoint it is insanely demanding for how dated it looks: at 4K with 2x MSAA it can't even hold 60 fps on an overclocked 1080 Ti, while there are modern games that run as well or better and make it look like shit from a graphics standpoint. IMO anyone saying this game is well optimized is fooling themselves. There is a reason this game is a popular meme, you know. Anyway, I'll stop there, since there is no way in hell we are going to agree with each other.

Did you even watch the video I posted? It's the very first post of this thread. Until then, I'll assume you are just speaking from opinion. That's fine, we all do that. Watch it, then come back with a new outlook.

Anyway, you don't have to leave the thread; just stop being so angry and combative over a discussion. Crysis was the benchmark of its time. Every review site used it as a standard against which to measure almost everything else.
 
Did you even watch the video I posted? It's the very first post of this thread. Until then, I'll assume you are just speaking from opinion. That's fine, we all do that. Watch it, then come back with a new outlook.
Yes, I watched the video nearly a year ago...
 
The game is very well optimized for the time it was made. Dual cores were the norm, and even then people were saying, "Nah, we'll never need two cores."

It was ahead of its time technologically. That doesn't mean it's unoptimized.

The reason it still didn't run great on PCs years later wasn't because it needed massive GPU power, but because the game mostly uses one CPU core even after the multicore patch. As you say, dual cores were the norm when it was released, and quad cores had also been out for nearly a year, so it was pretty obvious that things were going multicore. Yes, maybe nobody thought we would need more than two cores for games back then, but why make Crysis so CPU-bound by limiting it to one core?

You could argue that it didn't even make use of the tech of the time to its full potential. Does that mean it's badly optimised or badly designed?
 
why make Crysis so CPU-bound by limiting it to one core?
It's 12 years since the release of Crysis, and a lot of games even today aren't very well optimized for more than 2-4 cores, and that's with over a decade of development in which 4+ cores are common and the tools and engines to support them are much more widely available. I don't think anyone intentionally limited how Crysis scales across multiple cores; it's that it's hard to do, especially when no one has done it before. Remember how hard everyone found PS3 development at the time?
 
It's 12 years since the release of Crysis, and a lot of games even today aren't very well optimized for more than 2-4 cores, and that's with over a decade of development in which 4+ cores are common and the tools and engines to support them are much more widely available. I don't think anyone intentionally limited how Crysis scales across multiple cores; it's that it's hard to do, especially when no one has done it before. Remember how hard everyone found PS3 development at the time?

Four months prior to the release of Crysis, Capcom released Lost Planet on PC, a game that scaled well up to 8 cores/threads, and that from a Japanese developer not known for PC ports or even PC development.

It's a simpler game on a different engine and all that, but it shows that it was possible at release to have something loading more than a single core.
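For anyone curious what "loading more than a single core" means in practice, here is a minimal sketch of the idea being described: coarse job splitting across however many cores the machine has. This is only an illustration using Python's multiprocessing module, not how Lost Planet's MT Framework or CryEngine actually schedule work; the per-chunk function is a made-up stand-in for an independent slice of frame work.

    import multiprocessing as mp

    def simulate_chunk(chunk_id):
        # Made-up stand-in for an independent slice of per-frame work
        # (particles, physics islands, etc.); purely illustrative.
        total = 0
        for i in range(2_000_000):
            total += (i * (chunk_id + 1)) % 7
        return total

    if __name__ == "__main__":
        cores = mp.cpu_count()
        # One worker process per logical CPU, so each chunk can land on its own core.
        with mp.Pool(processes=cores) as pool:
            results = pool.map(simulate_chunk, range(cores))
        print(f"Ran {cores} chunks across {cores} workers, checksum {sum(results)}")

Watching Task Manager while something like that runs shows every core loaded, which is the behaviour people were pointing at in Lost Planet's benchmark.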
 
Did you not even look at the video? Your CPU would not be able to hold 50 fps in spots. Your CPU would do even worse than my 4770K at 4.3 GHz in this game, as you have nearly identical IPC and are clocked lower.

I only played the first few levels, also only at 1080p.
 
Four months prior to the release of Crysis, Capcom released Lost Planet on PC, a game that scaled well up to 8 cores/threads, and that from a Japanese developer not known for PC ports or even PC development.

It's a simpler game on a different engine and all that, but it shows that it was possible at release to have something loading more than a single core.
The first Lost Planet is an amazing game. It's still used as a gaming benchmark for CPUs to this day. Capcom still uses MT-Framework in their games.
I only played the first few levels, also only at 1080p.
When you get to the mountain where the alien ship is, with all that snow, ice and fire, that part still brings the hardiest of CPUs to their knees. How did the cutscene at the beginning, when you come across the beached destroyer, go?
 
I'm gonna install it tonight and make some videos of it. I'll use Process Lasso to pin the threads to specific logical CPUs on my Threadripper so that video recording isn't sharing any of the same individual threads as Crysis.

I'm really curious to see what happens.

I'll be playing it at 1080p on my 240 Hz panel, then at 3440x1440.

I'm genuinely curious. I have a 2080 Ti, so I'm essentially offering the monster that is Crysis one of the top 1% of GPUs in the world.
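Side note: for anyone who doesn't want to buy Process Lasso, the same kind of pinning can be done by hand. Below is a small sketch using Python's psutil library; the executable names are placeholders, so substitute whatever the game and recorder processes are actually called on your system, and adjust the CPU ranges to your core count.

    import psutil

    # Placeholder process names; swap in the real executable names.
    GAME_EXE = "Crysis64.exe"
    RECORDER_EXE = "obs64.exe"

    def pin(exe_name, cpus):
        """Pin every running process matching exe_name to the given logical CPUs."""
        for proc in psutil.process_iter(["name"]):
            if proc.info["name"] and proc.info["name"].lower() == exe_name.lower():
                proc.cpu_affinity(cpus)
                print(f"{exe_name} (pid {proc.pid}) -> logical CPUs {cpus}")

    if __name__ == "__main__":
        # Example split: game on the first 16 logical CPUs,
        # recorder on the next 16, so they never share a thread.
        pin(GAME_EXE, list(range(0, 16)))
        pin(RECORDER_EXE, list(range(16, 32)))

Run it after both programs are started (with admin rights if needed), since the affinity is applied to already-running processes.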
 
The first Lost Planet is an amazing game. It's still used as a gaming benchmark for CPUs to this day. Capcom still uses MT-Framework in their games.

They are moving away from it, though. The last few major releases have been on the RE Engine. Monster Hunter World was still on MT Framework since the engine had custom tools from previous Monster Hunter games.

Panta Rhei was supposed to be the replacement, but I don't think they've mentioned it in years.
 
The first Lost Planet is an amazing game. It's still used as a gaming benchmark for CPUs to this day. Capcom still uses MT-Framework in their games.

When you get to the mountain where the alien ship is, with all that snow, ice and fire, that part still brings the hardiest of CPUs to their knees. How did the cutscene at the beginning, when you come across the beached destroyer, go?

There were some slowdowns there, dips to 60-70 fps, not noticeable thanks to FreeSync, but I was still floored that 10+ years later, with good hardware, you still can't max Crysis. Also annoyed that the 24 Hz bug is still there; that drove me nuts for the first 10 minutes until I figured it out.
 
It was ahead of its time technologically. That doesn't mean it's unoptimized.
https://www.dsogaming.com/news/cryt...ysis-2-didnt-implement-tessellation-properly/

Crysis on PC was an unoptimized mess, and CryEngine 2 was a huge waste of resources. Everything from the assets to the post-process pipeline wasn't given much thought, as Crytek was purely PC-focused at the time, and the move to consoles really opened their eyes to it.

Crysis may have been "ahead of its time", but it was certainly not 10 years ahead of its time. The game still plays like shit for a reason.
 
It's 12 years since the release of Crysis, and a lot of games even today aren't very well optimized for more than 2-4 cores, and that's with over a decade of development in which 4+ cores are common and the tools and engines to support them are much more widely available. I don't think anyone intentionally limited how Crysis scales across multiple cores; it's that it's hard to do, especially when no one has done it before. Remember how hard everyone found PS3 development at the time?

You sort of made the point for me: those games aren't well optimized, just like Crysis wasn't. There were plenty of games back then that did scale well across cores. You also have to remember that Crysis wasn't just a game; it was also a tech demo to sell CryEngine. Of course it was intentional: they made the choice to design the game to mainly use a single core.

The Cell processor in the PS3 was radically different from any other multicore processor; that's why it was so hard to develop for.
 
The title should be 'Crysis is so poorly coded it can't take advantage of powerful hardware properly'.
 
The title should be 'Crysis is so poorly coded it can't take advantage of powerful hardware properly'.

I was at a solid 60 fps without a glitch, maxed out, on my 2600X and 1070 Ti. I have no idea what the problem is.

The game ran maxed out. 60 fps is the engine's hard cap, to my understanding.
 
I was at a solid 60 fps without a glitch, maxed out, on my 2600X and 1070 Ti. I have no idea what the problem is.

The game ran maxed out. 60 fps is the engine's hard cap, to my understanding.
Didn't read the whole thread, just responded to the clickbait title. But I have a feeling Crysis at 60 fps should no longer be a problem with any decent hardware nowadays.
 
Didn't read the whole thread, just responded to the clickbait title. But I have a feeling Crysis at 60 fps should no longer be a problem with any decent hardware nowadays.

What is clickbait? Seriously, how does one title anything anymore? Everything is "clickbait" now, sigh.
 
What is clickbait? Seriously, how does one title anything anymore? Everything is "clickbait" now, sigh.
I would say "still can't handle" is clickbait. Modern processors can handle the game just fine. Clock speeds and IPC have never been higher.
 
I would say "still can't handle" is clickbait. Modern processors can handle the game just fine. Clock speeds and IPC have never been higher.


Not bait by intention.

And as far as clickbait goes, I am not paid when you click on something I post here, so there is no baiting anyone for clicks.
 
https://www.dsogaming.com/news/cryt...ysis-2-didnt-implement-tessellation-properly/



Crysis may have been "ahead of its time", but it was certainly not 10 years ahead of its time. The game still plays like shit for a reason.

It was actually rather well optimized (for the time).
It scaled REALLY well... most of the FUD about it being unoptimized comes from people whose e-peen the game broke.

Trying to compare it to modern games is intellectual dishonesty... where did the game hurt your e-peen?
 
It was actually rather well optimized (for the time).
It scaled REALLY well... most of the FUD about it being unoptimized comes from people whose e-peen the game broke.

Trying to compare it to modern games is intellectual dishonesty... where did the game hurt your e-peen?
I've already mentioned it once in this thread, but let's again compare it to a game released months before it: Lost Planet, a game which scales across 8 cores.
 
It was actually rather well optimized (for the time).
It scaled REALLY well... most of the FUD about it being unoptimized comes from people whose e-peen the game broke.

Trying to compare it to modern games is intellectual dishonesty... where did the game hurt your e-peen?


I've read this a few times and I still can't understand what you're trying to say. Seems like you just wanted to use the word "e-peen" in a smartass way but failed miserably.
 
Crysis + ray-tracing enabled....

A new extension is currently in development for the powerful post-process injection tool Reshade, created by modding veteran and Nvidia Ansel contributor Pascal Gilcher. The Reshade RT filter is still deep in development (though Reshade Patreon supporters can access the latest alphas right now) and there are a number of limitations to factor in. The biggest is that Reshade's access to in-game data is limited to screen space, meaning that anything you don't see on-screen won't be ray traced; the video reveals a range of scenarios where this can break the effect. Another profound limitation is that because Reshade only has access to depth and colour information, it can only make educated guesses about where light is coming from and how it should be traced.

 
I've read this a few times and I still can't understand what you're trying to say. Seems like you just wanted to use the word "e-peen" in a smartass way but failed miserably.


No one knows what it means, but it's provocative....
 