Crysis does suck, considering that even two generations later you still cannot run it at acceptable frame rates...
I wonder how we have determined that Crysis is not properly "optimized," when by nearly all accounts, we have never seen a game like it. To what do we compare its optimization? I don't deny that Warhead seemed to show performance improvements over the original Crysis, but both games still seem to give "average" systems a lot of trouble.
Regarding frame rates, it has been my experience that only minimum frame rates really affect the experience. 60+ fps makes no difference to me if I am holding a 30 fps minimum. As such, two 5870s in CF seem to meet the standard requirements (Enthusiast, 1920x1200, 8X AA, 30 fps minimum). Is this not acceptable?
Thanks. Yeah, that is pretty much in line with what I was driving at, I think. I believe the perceived lack of optimization was largely just graphically intensive software capability far exceeding current (at the time) hardware capability. Enthusiasts should stand up and cheer, in my opinion, because why else should hardware companies continue to advance? Why else should we get excited about new tech, if there is nothing out there that will ever need it?

Crysis has better graphics than Crysis Warhead... hence the "added" performance in the latter game.
WTF are you smoking? Limited user base? lol lol lol
Heh, you know, I've heard prices on X360s are getting quite reasonable...

No, because HIS rig can't run it at 60+ FPS... so the game must be badly coded.
You sir are the reason why PC gaming is stuck with crap console ports...thanks for proving the point.
Being ahead of its time makes it "suck" ? Ok...
That's precisely what makes PC gaming much more appealing. New games would always be ahead of the hardware, which would make hardware manufacturers much more committed to providing better and faster hardware, to beat the games that "ridiculed" the last generation. Games like FEAR, Oblivion, Supreme Commander, Lost Planet... (among other very intensive games when they came out). Now, with mostly console ports, only Crysis remains to be "beaten," while the majority of other games run great on current hardware.
lol I play most of my games on a PC, but if a game cannot run properly on a mainstream rig, why would I waste my money on it?
Heh, sorry for basically quoting this entire post in a later post of mine. Didn't see this one!
Thanks again for proving the point.
QFT...
PC gaming was never about what you could run with what your "weekly allowance" could afford. It was going broke while trying to have the most intense gaming experience available, and if your money couldn't provide you with that, then you would step up your game and start OCing your shit...
Seriously, when did such a [H]ard place become home to so many softies?
Gotta call BS on that one, man. Very few people upgrade to top-of-the-line hardware just to play a game. At least when Doom 3 came out, the next-gen hardware could play it at acceptable frames.
Agreed. I'm not sure when we changed from [H]ard|OCP to [F]rugal|OCP. Attitudes on this site shifted dramatically in the last few years. It used to be that if someone had purchased some super high-end hardware, many people would have said, "Nice, I wish I could afford that." Now they say, "Why did you waste your money on that?"
Hmm... You may have a hardware problem. Crysis ran fine on my old 8800GTS 320 and I played it on a PC with an 8400GS. Make sure your drivers are up to date, nothing's underclocked and perhaps...
turn down the settings a bit
Crysis will RUN on damn near anything. Whether your ego permits you to enjoy it is another matter.
Well, per Charlie at SemiAccurate, Nvidia is EOLing all the GT200-based boards...
Rumor site, so take it with a grain of salt, of course.
http://www.semiaccurate.com/2009/10...x275-gtx260-abandons-mid-and-high-end-market/
I say that, but am having an issue with it. They have been selling at near loss (or outright loss, depending on the source) for a while now. Why quit four months from your next gen?
If this is true, though, Charlie must have creamed his jeans over it.
Well, if they have been selling at a loss for a while, and the new generation from ATI means they would have to lower prices even more, it may make more sense just to EOL the chips and wait until Fermi is ready after New Year's, although it will leave a huge hole in the lineup between Fermi and the low end.
It would basically equate to leaving the GPU market. I just don't see it. Also, getting rid of the mid-range would be a death knell for their GPGPU strategy. With all their effort put into this, why now? They knew this was coming months ago. They just had their first real success with PhysX and are just now getting some decent CUDA projects going (an antivirus scan that doesn't kill your boot or loading times would be awesome).
Also, I don't understand why they don't just do a die shrink of the GTX 285, and how hard would it be to drop it down to a 256-bit bus with GDDR5? At 40nm the chip would be cheaper to produce, and so would the AIB partners' boards. Maybe I am missing something, but it seems that would have been a card that would still have been competitive with ATI and fit their GPGPU strategy.
There was a planned die shrink of the GT200 chip (I believe it was the GT212), but it was canned. That was their plan, however not much came of it. They are making some 40nm chips on the low end, the G210/G220, but those chips mostly only show up in OEM machines.
If there had been a successful shrink of the GT200, it would have been released a few months ago. Of course, TSMC's 40nm process has also been sucking, so that hasn't helped things.
I never did hear what happened exactly. What was the issue (if you've heard, please share)? Especially with the 40nm issues now corrected.