Hard Reset Gameplay Performance Review @ [H]

I have no trouble holding around 60 fps at 5760x1200, so the performance is now scaling correctly; however, the game stutters worse than King George VI. That said, this behaviour isn't really isolated to Hard Reset and is more a general problem with two-or-more-card CrossFire triple-head setups. I have seen no similar issues on SLI/Surround setups.

I'd be curious to see a follow-up editorial from the staff on the performance of multi-monitor + multi-GPU setups.

Ah, perhaps you're talking about microstutter, a phenomenon that really does exist but that many people can't feel? It does exist, but I'm not particularly sensitive to it. For me the game feels smooth now. The mouse cursor felt strange, so I went into the options, ticked the smooth mouse setting, and that was the end of the strange mouse.
 
The stuttering could be down to other causes. If the AMD multi-GPU fix is out and people are reporting that the game runs smoothly on their systems, it might be an isolated problem on your end, meaning further troubleshooting is probably a good idea.

I noticed stuttering with Crysis Warhead when I OC'd my i7 920 to 4.4 GHz. When I dropped it down to 4.2 GHz, it went away. This was with OC'd GTX 460s.
 
I'm fairly certain it's not just limited to me; I've got a friend who runs 3-way CrossFire with 6970s in Eyefinity and another with CrossFire 5870 1GB cards, and they both have identical problems in pretty much all titles. It's definitely more than microstutter (that doesn't really bother me too much); this problem is significantly worse, and the perceived framerate for the end user seems to be about half, if not a third, of what is actually being reported.

http://hardforum.com/showthread.php?t=1506951
http://widescreengamingforum.com/node/13198
http://wsgf.cdn.crackerjackmack.com:81/node/14385

The above are just a couple of examples from the sizeable chunk of users who have exactly the same problem.

In fact, the only instance in which I've seen the problem go away is in older titles where I'm able to pull 120 fps, at which point everything smooths out. That does seem to indicate some kind of problem with how AMD's drivers handle the secondary adapter (e.g. they don't pass the same display timing information to the secondary card).
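For anyone wondering how a counter can report ~60 fps while the game feels like it's running at half that, here is a minimal sketch of the arithmetic behind uneven frame pacing. The frame times are invented for illustration, not measured from Hard Reset or any particular driver:

```python
# Invented frame-delivery times (ms) for a dual-GPU setup with uneven pacing:
# the cards alternate between a short and a long interval between frames.
frame_times_ms = [8, 25] * 30   # 60 frames

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
reported_fps = 1000 / avg_ms    # what an average-based FPS counter shows

# One rough way to reason about "felt" smoothness: motion is paced by the
# longer interval in each pair, since the short frame adds little new motion.
perceived_fps = 1000 / max(frame_times_ms)

print(f"reported: {reported_fps:.0f} fps, perceived: roughly {perceived_fps:.0f} fps")
# reported: 61 fps, perceived: roughly 40 fps
```

The wider the pacing gap between the two cards, the further the felt framerate drops below the reported one, which lines up with the "half to a third" impression described above.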
 
My experience with the demo: I had everything turned up to maximum except for AA, which was turned off. This was at 1920x1200 with no Vsync. The FPS was almost constantly at 60 or above, and when it did dip it didn't go under 40. I'm happy with that, seeing as my video card is almost four years old and I'm on a dual-core CPU. The game wasn't any mind-blowing graphical experience, but it was no slouch either.
 
Notwithstanding the fact that I liked the demo, I fail to see why being "PC exclusive" is itself a virtue when the demo itself indicates the game is a far cry from a system buster. What do I care if they don't release it on another platform if they aren't going to go all out with graphics and features? The way this game had been hyped in certain circles, I was almost led to believe this would be the next Crysis or something.
 
The GeForce 285.27 drivers are very bad and cause a lot of problems, so use the regular drivers if you want a better look at this game on Nvidia cards!
 
I fail to see why being "PC exclusive" is itself a virtue when the demo itself indicates the game is a far cry from a system buster.

You clearly haven't played many console ports :p
 

This game is PC-ready right out of the box.
You can fiddle with almost any setting, including the vertical FOV.
Not to mention the game looks fabulous and runs smooth as silk.

It's more than just a pretty face; it's about having the control over the game that you'd expect on a PC, whereas these shit console ports make you dig deep into the files, hex editing included, just to find the setting you need.
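On the vertical FOV point, the reason that setting matters so much on wide setups is plain perspective trigonometry. A quick sketch (standard math, not anything pulled from Hard Reset's own files; the resolutions are just examples):

```python
import math

def horizontal_fov(vertical_fov_deg: float, width: int, height: int) -> float:
    """Horizontal FOV implied by a vertical FOV at a given aspect ratio."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * (width / height)))

# Same vertical FOV on a single 16:10 screen vs. a 3x1 Eyefinity/Surround span:
print(round(horizontal_fov(60, 1920, 1200), 1))   # ~85.5 degrees
print(round(horizontal_fov(60, 5760, 1200), 1))   # ~140.3 degrees
```

Because the vertical angle stays fixed, the horizontal view simply widens with the aspect ratio instead of stretching or cropping the image, which is exactly what you want on triple-head.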

I draw your attention to the just-released Dead Island if you want to discuss a bad one.

To reply about CrossFire: I have triple CrossFire and see no stutter. I had CAP4 right out of the box, and using every top setting I'm getting 80+ FPS throughout at 6056x1200. :D
 
Man oh man, you guys and your megapixels.

Any developers out there? What are your thoughts on DX9 engines looking subjectively better than the DX11/DX10 ones? What do my fellow gamers think?

I'm seeing DX9 games that look just stunning and not seeing much wow from DX11.
 
I think it's probably a placebo. You expect a lot from DX10 and DX11, and when the difference is mild, you don't really see the benefit. That's not to say DX9 games inherently look better, but of course some DX9 games look better than some DX10/11 games; that's a given.
 
Some of the recent DX9 stuff, like this game and Dead Space 2 (which I think is DX9), is fabulous and runs like a rocket. :D

And it's more impressive than most DX11 titles, but then that's because DX11 isn't on a console. In three years DX11 will look good too.
 
So these guys used a power button logo in the title of the game. Does that mean Best Buy is going to send them a letter telling them to stop, like they did with Newegg? :rolleyes:
 
Haha, I too think DX11 is being held back by the consoles, but I have no proof; it's just my subjective opinion. I guess it'll be nice when we get Xbox 720s and graphics get bumped, but then they'll be held back for the life of the Xbox 720... ugh, consoles.
 
I noticed that they added FXAA in the 1.01 patch: http://forums.steampowered.com/forums/showthread.php?t=2134283 I hadn't yet tried the community fix method (defaultluser provided this link on the second page: http://hardforum.com/showthread.php?t=1626373), but now it's officially in the game. My framerate was already very good on my GTX 570, but now it's much, much better. I'm pushing over 100 fps at 1920x1200 with all in-game settings maxed out. I also ran the benchmark in the Extras menu, and it's averaging around 77 fps. FXAA looks just as good as (if not better than) 4xAA in this game, and the performance is much better. Glad the devs decided to add it in.
 
Wow, FXAA, nice!! Hopefully FWH does well with this game. It's really short and seems like more of an experiment to see if they can be successful on the PC making an exclusive title. Hopefully they go a little harder and make a more robust expansion pack with a deeper story. I don't need a deep story to enjoy a game like this, but it wouldn't hurt if we knew just a little more and cared enough to know what's going on and what the characters' names are. Currently it's all kind of bland and no names are worth remembering.
 
I'm not normally one to praise technology that Nvidia has had a hand in, usually because it's proprietary and ultimately near-useless for that reason. However, I have yet to see an instance of FXAA vs MLAA where I actually prefer the latter image. To me, FXAA seems like a less-demanding version of MSAA, which I very much welcome.
 
Wait a minute, they didn't just add FXAA as an option but actually replaced MLAA completely? That's an interesting move.
 
I'm excited to see FXAA in action. My OC'd 6850 should produce some good speeds hopefully.

@magoo, your Q6600 and Core i7 systems have RAM running at 1600 Hz ;-)
 
To me, FXAA seems like a less-demanding version of MSAA, which I very much welcome.

FXAA isn't inherently less demanding than MSAA; it's just that MSAA in deferred-rendering engines is very taxing. In Crysis 1.2 and DiRT 2, 4x MSAA actually takes a slightly smaller performance hit than FXAA, at roughly 10% compared to 15%. In deferred-rendering games like Bad Company 2 or Metro 2033, MSAA takes a big hit, and FXAA will help a lot in those titles.

BTW, Hard Reset doesn't use MSAA. It uses FSAA, which is actually supersampling, which is why the performance hit with that option is huge.
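To put a rough number on that: supersampling renders the scene at a higher internal resolution and downsamples, so fragment work scales with the extra pixels. A back-of-the-envelope sketch with illustrative numbers (not measured from the game):

```python
width, height = 1920, 1200
native = width * height

for factor in (2, 4):
    # Supersampling shades 'factor' times as many pixels before downsampling.
    shaded = native * factor
    print(f"{factor}x supersampling: {shaded:,} shaded pixels vs {native:,} native "
          f"({factor}x the fragment work)")

# FXAA, by contrast, is a single post-process pass over the final frame,
# so its cost stays roughly constant at a given resolution.
```

That's also why the FSAA option hits so much harder than the newly added FXAA one.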
 
True, I haven't really seen FXAA used much outside deferred-rendering games, so I was generalising a bit from those cases.
 