How long until we can play Crysis Warhead at a constant 60fps at 1920x1200?

Is it true that once you reach higher resolutions, AA and AF become less and less of an issue?
They're still a major factor in image quality, but the weight of their effects changes: AA becomes less noticeable, while AF becomes MORE noticeable. At 2560x1600, I literally cannot tell a difference between 4xAA and 16xAA in game. Even in screenshots it's very difficult. However, AF becomes much more noticeable because it's easier to pick out fine texture detail at a higher resolution. I run 16x AF in every game.
 
I really don't get the obsession with 2x/4x/8x AA in Crysis when EdgeAA does a much better job of improving the IQ because of all the foliage (and the fact that you spend 80% of the game outdoors in the jungle). From my own personal experience comparing 2x EdgeAA to 4xAA, I prefer the EdgeAA image. For those of you who still haven't played Crysis yet, EdgeAA is only activated when you turn off regular AA. ;)
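If anyone wants to experiment with it, EdgeAA is toggled through a console variable rather than the menu; going from memory, the CVar is r_UseEdgeAA (treat the exact name and accepted values as an assumption and double-check in your own console), so something like this in an autoexec.cfg should do it:

Code:
r_UseEdgeAA = 1

And leave the in-game AA setting off, since enabling regular MSAA is exactly what switches EdgeAA back off.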
 
I just read HardOCP's sweet review.

The 5870 seems like a great card, but I'm eager to see Nvidia's offerings soon. I remember when the first F.E.A.R. game came out, it took a while before a system could run it at a constant 60+.

I guess we will just have to see what the 5870 X2 can do...or perhaps what the GT300 series can do.
 
My question is similar but different.

How long until we can at least run Crysis at 30 FPS+, at 2560x1600 with all Enthusiast settings?

A lot of people claim Crysis has been tamed already, but even the AnandTech article with 5870 CF used all Gamer settings. That, to me, isn't tamed, that's a concession.

Remember, Nvidia claimed that the GTX 280 would finally make Crysis playable at 2560x1600, and here we are again.

Honestly, why don't these reviews compare max settings? I wouldn't care if it's 2 FPS, but I would like to know: what is Crysis's FPS at 2560x1600, with and without AA and AF? Max everything out, even if it's a slideshow.

What am I realistically going to have to run to get there on my 30''? Is it quad CrossFire, or quad GT300 SLI? Hopefully we'll get these answers soon.

And Eyefinity turns everything upside down, because then the question becomes: when can we run Crysis at 8000x4000? X_X
 
[Attached screenshots: Crysis2009-04-2808-08-22-13.jpg, Crysis2009-04-2808-08-37-65.jpg, Crysis2009-04-2808-07-59-07.jpg]
 


What are your specs, mike, and are your frames consistent?

And how come none of the reviews match your claim yet?

I mean, honestly, I am looking to drop 1k to 2k on graphics cards; I want to make an educated decision.

Thanks!
 

I'm guessing quad GTX 295 or tri-SLI 285. If you need to drop money, I think it would be wise to go with newer tech. A 5870 X2 + 5870, or waiting for GT300 SLI, would destroy any current setup.
 
No, of course, that's why I am waiting. I am almost certainly going to wait until the GT300 hits, unless I can get a good deal on current tech. I just want to buy something I won't regret and that I will use for the next 2 years.
i7 975, 3x BFG GTX 280 OC2

I see. So, any ideas why reviews have different numbers? Because clearly, if everyone got a consistent 38 FPS, there wouldn't be people saying that 2x 295s, 5870 CF, etc. can't run it on max.

Any tweaks done etc?
 
Do the trees not get anti-aliasing in Crysis and Warhead, or only in Crysis, if you enable regular AA through the Crysis game options? And why the hell would that be the case?

I'm playing with 2xAA enabled from the game options and the tree leaves look fine to me. Although, honestly, tree leaves in real life are "aliased" to begin with, so maybe that's why it doesn't bother me. However, when all the buildings and shacks and trucks, which are all rectangles, have aliasing, that definitely bothers me. It doesn't help that, artistically, they look pretty ugly to begin with. So I definitely prefer regular AA, leaves or no leaves.

And while I don't get 60 FPS at 1920x1200, I want to note that on my second sig system the gameplay experience feels absolutely smooth and spectacular throughout the game at 1080p (all Enthusiast, 2x AA), with the usual exception of one or two crazy scripted sequences. I average around 40-45 FPS, dipping to 30 when things go nuts. Maybe it's the motion blur, but as others have said, this game just feels smoother at lower frame rates, and I think you're wasting your time if you hold off on enjoying it until you can squeeze 60 FPS out of it.

I'm at the point where I won't play through the game again just for the graphics/smoothness because I'm pretty satisfied with both. Of course...25x16 and Eyefinity would be a different matter!
 
Well, we are still waiting to play this game at 1920x1200, 4xAA, 16xAF, all Enthusiast settings @ 60 FPS CONSTANT. The 5870 by itself got a minimum FPS of 12 at that resolution...and it was only using 2xAA!

I know H is supposed to do a 5870 CF review, so that should hopefully bring the minimum FPS up to maybe 20...but I would guess we are still a good...gasp...3 generations away from a constant 60fps, since these new-generation cards have a minimum FPS of 12...

The key word is constant BTW...meaning fps cannot drop below 60 at all, at any point in the game - it almost seems like a decade would have to pass to achieve this.
 
Yeah ^^^^^

I am guessing that Crysis will only be fully maxed in 2012. The year the apocalypse happens!!!


MUAHAHAHAHAHAHAHAHA ;)
 
Do the trees not get anti-aliasing in Crysis and Warhead, or only in Crysis, if you enable regular AA through the Crysis game options? And why the hell would that be the case?

As I understand it, transparency AA is broken in Crysis.
 
I really don't see the appeal of a game that doesn't run on pretty much the average computer.
 

I'm on the opposite end of the stick. After playing on PCs for years and years, I don't get the point of games that only run on consoles. I think, if I spent a good few years of my life making a game, I'd want to play it, showcase it to friends, showcase it to family members, etc on the best machine possible which obviously is not a console. Keyboard+mouse+higher end graphics+higher end sound+higher end physics(in some cases) ftw.
 
I agree, I love it when a game pushes extreme machines and makes you dream about one day maxing out the settings. Whilst I can't go 16xAA on it, it's only now that I can run it comfortably at 8x maxed out....and look how long the game's been out! And if I went to a 24", the frame rate would take a hit and I'd no doubt have to play at 4x AA.
 
Given that you have a pretty damn average gaming PC (and I'm being very generous here...), that's not terribly surprising :p

Put a fergie in that PC and see what happens. Anyhow, 5970s couldn't run Crysis at a constant 60fps @ 1920x1200 4xAA maxed out; maybe a Fermi or two will do it. :cool:
 
Turn the AA off and it'll run Crysis fine. The edge AA turns off when full screen AA is turned on.

There's big-ass nVidia "The Way It's Meant to Be Played" and Intel branding all over the damn game...dunno why people are surprised when their ATI/AMD platforms don't perform as well as nVidia/Intel platforms :\.
 

TWIMTBP shit normally does hold back ATI cards but AMD processors are just plain slower than Intel's offerings unless you want to compare a Phenom 2 965 to an E5200 or something stupid. It's been that way for a few years now.
 
Argument is moot now that MW:LL is out anyway :)

If you guys can't get the game going without slide-showing, then go stand on a mountain and TAG enemy mechs for those of us who can handle the game :p
 
Just a wee bit of an exaggeration there on the CPU comment. The Phenom 2 is right there with the i7 in most games when run at realistic settings that people actually use. In fact, the Phenom 2 is faster in some rare cases.
 

http://www.xbitlabs.com/articles/video/display/radeon-hd5870-cpu-scaling_8.html

Not sure what bizarro world you live in, but the i7 leaves the Phenom 2 in the dust once you aren't GPU limited; the Phenom even gets beaten by the dual-core E8400 sometimes. The Phenom 2 isn't even as fast as a Core 2 clock for clock and gets smoked by the i7. They are all pretty much the same only if you are GPU limited. I'm no Intel fanboy, it's just facts.
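Just to make the GPU-limited point concrete, here's a toy back-of-the-envelope sketch (the millisecond numbers are made up purely for illustration, not measured from any of these chips): a frame can't finish faster than the slower of the CPU work and the GPU work, so once the GPU is the bottleneck, swapping in a faster CPU barely changes the FPS.

Code:
# Toy bottleneck model: frame time is roughly the slower of CPU and GPU work.
# All millisecond figures are invented for illustration only.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate FPS when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-limited case (think 2560x1600 with 8xAA): the GPU dominates.
print(fps(cpu_ms=8.0, gpu_ms=40.0))  # slower CPU -> 25.0 FPS
print(fps(cpu_ms=5.0, gpu_ms=40.0))  # faster CPU -> 25.0 FPS (no visible gain)

# CPU-limited case (low resolution, or a much faster future GPU).
print(fps(cpu_ms=8.0, gpu_ms=4.0))   # slower CPU -> 125.0 FPS
print(fps(cpu_ms=5.0, gpu_ms=4.0))   # faster CPU -> 200.0 FPS

That's also why the gap only shows up again once you upgrade the GPU and stop being GPU limited.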
 
Depends on what review and game you look at. The Phenom 2 certainly isn't getting smoked here and actually beats the i7, even on a clock-for-clock basis, in some rare spots. Yes, it's at 2560, but they are using a 5970. http://www.legionhardware.com/document.php?id=869&p=20

The Phenom 2 looks really good here too, as it's beating the i7 in Far Cry 2. http://www.bit-tech.net/hardware/cpus/2009/04/23/amd-phenom-ii-x4-955-black-edition-cpu-am3/7

Overall I would certainly pick the i7/i5 line of CPUs, but the Phenom 2 isn't that bad and is more competitive than most people think.
 
Well, this thread is about Crysis and Warhead, so Far Cry 2 benchmarks are irrelevant. Besides, it's a console game like nearly all of this generation's games, so what's the point of asking where the ceiling is if a lowly AMD X2 can get sufficient frames :\

I'm being slightly tongue-in-cheek here, but on a serious note, we're talking about a dying breed of game development that pushes the envelope, and there the Intel/nVidia architecture reigns...at least in my experience.
 

Well, you can pretty much completely ignore the bit-tech review because it's running a GTX 280 at 1680x1050 with no AA or AF. In the Legion Hardware review they have become GPU limited by running at 2560x1600 8xAA, and they do not have minimum FPS numbers on any of their graphs; where the Phenom wins it is normally by 1-3 fps, and without minimum FPS graphs we cannot see the full effect the CPUs are having. 1920x1200 and 2560x1600 are two completely different beasts, and as I said, once you become GPU limited your CPU becomes less of a factor. Future upgrades will weigh more heavily on Phenom systems than on Core2/i5/i7 systems, because even when they get a new GPU and stop being GPU limited, they will not see numbers as high as their Intel counterparts. I'm not saying the Phenom systems are bad or uncompetitive, I built one for a friend a few weeks ago, but saying the Phenom is faster than a Core2 or i5/i7 is preposterous.
 
You can always ignore benchmarks that don't agree with your opinions. So we disregard bit-tech because they are testing at just 1680, and then we disregard the tests at Legion Hardware because they are testing at 2560x1600. lol. I already said the i5/i7 is better overall, but to claim it only competes with the E5200 like you said earlier is beyond ridiculous. I do 100% agree that going forward the i5/i7 will have more staying power overall.
 

Where did I say to disregard the Legion Hardware review? It was a meh review; all it proved is that there is no tangible difference when you are GPU limited, and it barely even did that with no minimum FPS numbers. All I said is that they were GPU limited in that review. Of course you won't see much of a difference between the CPUs when you are GPU limited; maybe 4x 5870 would be a more appropriate setup for testing CPUs at 2560x1600. As for the bit-tech review, no one plays at 1680x1050 with no AA on a GTX 280, so it was a pointless test. I also think you misread my E5200 comment.
 
It's not about the programming, the problem is all the 'stuff' in the game; there's far more flora than in any other game released, so when all that stuff needs lighting/shadowing and all the other special effects, it simply becomes very demanding. The huge view distance doesn't help any either.

This made me laugh.
 
This is my setup, and I run 1920x1200 with everything maxed along with a graphics enhancer mod.

Processor: Intel Core 2 Quad Q9650 @ 3.0GHz
Memory: 4GB Crucial Ballistix 2000MHz DDR3/PC3-16000
Hard Drive: 2TB Western Digital (SATA2) + 750GB Seagate 32MB (SATA2) + 500GB Western Digital (SATA2) + 500GB Western Digital (SATA2)
Video Card: 2x EVGA GTX 285 in SLI
Monitor: Samsung SyncMaster T240HD 24-inch (1920x1200)
Sound Card: Creative Sound Blaster X-Fi Titanium PCI
Speakers/Headphones: Logitech X-540 5.1 Surround Sound + X-Fi XtremeGamer Fatal1ty Pro Series Headset
Keyboard: Wireless Logitech
Mouse: OCZ Dominatrix Blue/Black 7-Button Tilt Wheel Laser 2000 dpi Gaming Mouse
Motherboard: EVGA Nvidia nForce 790i SLI
Computer Case: Thermaltake Spedo Advance Full Tower
 
Put a fergie in that PC and see what happens. Anyhow, 5970s couldn't run Crysis at a constant 60fps @ 1920x1200 4xAA maxed out; maybe a Fermi or two will do it. :cool:

From what I've seen, Fermi is going to be a huge steaming bucket of FAIL. :p
 