Stalker Clear Sky: ATI obliterates Nvidia at its own game (again)

I can confirm that an OC'ed 9800GX2 is not enough to run the game in DX10 mode with everything maxed, even with no AA. I'm getting 10-15 fps less than I do in Crysis (Very High settings). The game just puts a crazy load on the GPU. Supposedly it's playable with a 4870X2 though, even when using 2-4xAA.

Turn off the DX10 effects and the game's fps goes sky-high, even though it doesn't look *too* different.

One interesting thing is that the game seems to be practically single-core only. When playing, only one of my cores would be at 100% load; the rest were at around 5%.

However, the game does appear to make good use of dual GPUs. Judging by the core temperatures, both my GPUs were working at 100%. In the console, the game also properly detects and activates dual-GPU mode, which would explain why the 4870X2 and the 9800GX2 take the lead over the GTX 280.
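If anyone wants to double-check the single-core behaviour described above, here's a minimal sketch (assuming Python with the third-party psutil package installed, not anything the game itself provides) that logs per-core load once a second while the game runs; one core pinned near 100% with the rest idling would match what I'm seeing:

```python
# Minimal sketch: sample per-core CPU utilisation once per second.
# Requires the third-party psutil package (pip install psutil).
import psutil

for _ in range(30):  # sample for roughly 30 seconds while the game is running
    loads = psutil.cpu_percent(interval=1, percpu=True)
    print("  ".join(f"core{i}: {load:5.1f}%" for i, load in enumerate(loads)))
```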

Interesting. Good scaling for multiple GPUs but poor CPU scaling. So it plays better in DX9 as well; I wonder if that is a bug in their engine or what.
 
You're referring to Clear Sky? In the first sequence in the camp, I was pegged at 60fps the whole time with every available option on max and AA enabled in game (It didn't look like more than 2xAA though). That was at 1920x1200.

Out in the bog there were several instances where the game dropped to ~10fps for no reason. When I looked up above the grass it jumped to about 90fps. After exiting to the menu, changing any option, and then changing it back to max, the problem would go away and I'd be at 40-60fps again. Aside from that (presumably optimization) problem it was smooth as butter, completely maxed out at 1920x1200 with AA enabled.
 
You serious? You're 100% sure you're running in DX10 mode? If so, then that's just amazing.

Basically, right after you get to control your character and move around in the camp, I would get about 15-25 fps (30-40 fps if I moved a bit away from the camp and looked 'outward'), and this with everything maxed and 2xAA, at only 1280x1024 resolution.

And this with a [email protected].
 
You're referring to Clear Sky? In the first sequence in the camp, I was pegged at 60fps the whole time with every available option on max and AA enabled in game (It didn't look like more than 2xAA though). That was at 1920x1200.

Out in the bog there were several instances where the game dropped to ~10fps for no reason. When I looked up above the grass it jumped to about 90fps. After exiting to the menu, changing any option, and then changing it back to max, the problem would go away and I'd be at 40-60fps again. Aside from that (presumably optimization) problem it was smooth as butter, completely maxed out at 1920x1200 with AA enabled.

Ahhhh man...

SOC does this to me on my current system. I'll be running around and for no reason it drops to ~10 fps... I have to pull down the console and hit vid_restart, or tab out of the game, and boom, back up to normal fps. It never did this on my 680i system using 7800 through 8800 video cards, but my big new X48 with 4870s (which I built to play Clear Sky) has these random slowdowns in SOC. No other game, just SOC. I was really hoping they'd be gone with CS.
|sigh|
 
You serious? You're 100% sure you're running in DX10 mode? If so, then that's just amazing.

Basically, right after you get to control your character and move around in the camp, I would get about 15-25 fps (30-40 fps if I moved a bit away from the camp and looked 'outward'), and this with everything maxed and 2xAA, at only 1280x1024 resolution.

And this with a [email protected].

xrEngine2008-09-0612-06-27-19.jpg


Image has been reduced for Photobucket. Link me to a hosting site that will post the full 1920x1200 and I'll post it using that. Yeah, it's maxed on everything. I'll take a screenshot of the options menus.

EDIT: Went back in and checked. DX10 lighting wasn't enabled. :confused:

Here are some screenshots with DX10 enabled and everything on maximum. 1920x1200, max in-game AA/AF:

xrEngine2008-09-1012-23-01-29.jpg

xrEngine2008-09-1012-21-04-99.jpg

xrEngine2008-09-1012-16-56-66.jpg

xrEngine2008-09-1012-06-52-34.jpg


Fraps captured each screen at about 4-5 fps lower than in game :confused:. The visual difference is negligible, but performance dropped from a 60 fps average to a 30 fps minimum and about a 45 fps average. Interestingly, though, the FPS drops have disappeared in DX10. I might try running the CPU at 3.6GHz and testing it out; Core 1 goes to 100% the entire time I'm in game, so I might see a bit of an increase there. Right now I'm at 3.2GHz because my room is getting into the 90s F while the rest of my house sits at 76F. My watercooling setup can't cope with those high ambient temps without the CPU loading above 60C.
 
PS: I tried the game with the exact settings in the review and they are pretty much spot-on if they used the camp at the beginning of the game. Out in the Bog the card stays above 60fps at 1680x1050. This 4870x2 has just become my favorite card of all time!:D I wonder, with these new games, if the 4870x2 will begin distinguishing itself from the GTX280 even more?
 
PS: I tried the game with the exact settings in the review and they are pretty much spot-on if they used the camp at the beginning of the game. Out in the Bog the card stays above 60fps at 1680x1050. This 4870x2 has just become my favorite card of all time!:D I wonder, with these new games, if the 4870x2 will begin distinguishing itself from the GTX280 even more?

If games continue to be shader intensive, then the 4870X2's huge shader advantage will really begin to pull it away from the rest.
 
Dethred, in the screenshot where you have 38 fps with DX10 mode enabled, I get, with everything maxed including AA and lighting distance/shadow quality (which are unlocked in the latest patch)... 11 fps...

I need a 4870X2.
 
wtf
The GTX 260 is behind the 8800GTX and the 9800GX2; how does that make sense?

Did anyone else notice this? I loved that totally off-topic argument about Assassin's Creed and DX10.1 support, while people are still taking these obviously flawed benchmarks seriously. The GTX 260 is behind the 8800GTX in performance, therefore the test is wrong, and by the time the final version ships the GTX series will be looking a lot better.
 
Is the game out yet? Where are you guys buying it? I thought it wasn't released until the 15th?
 
Is the game out yet? Where are you guys buying it? I thought it wasn't released until the 15th?
It came out in American stores, but some of the copies were missing CD keys and were recalled. However, before the recall many users were able to purchase the game.
 
Dethred, in the screenshot where you have 38 fps with DX10 mode enabled, I get, with everything maxed including AA and lighting distance/shadow quality (which are unlocked in the latest patch)... 11 fps...

I need a 4870X2.

I haven't tried the patch, so the FPS might be lower. This card eats AA for breakfast, though. What card do you have now? That screen at 38 fps was actually around 42 fps, except Fraps lowered the number whenever I hit F12 (the screenshot button).
 
I don't really think the fps is lower with the patch.

So you basically have 4 times my performance, and you're running a much higher resolution. And I'm running the card that gets the #2 spot in the OP's article, the 9800GX2. I don't even see how this is possible; how on earth can the 4870X2 be that much faster? Is it purely a driver issue, or does the X2 just have that much more raw power? Is this the GDDR5 flexing its muscles?

Going from full AA to 0 AA gives me a 50% performance boost, giving me a whopping 15 fps in the scene where you had 38/42 fps.
 
I don't really think the fps is lower with the patch.

So you basically have 4 times my performance, and you're running a much higher resolution. And I'm running the card that gets the #2 spot in the OP's article, the 9800GX2. I don't even see how this is possible; how on earth can the 4870X2 be that much faster? Is it purely a driver issue, or does the X2 just have that much more raw power? Is this the GDDR5 flexing its muscles?

Going from full AA to 0 AA gives me a 50% performance boost, giving me a whopping 15 fps in the scene where you had 38/42 fps.

Could be the driver, and the game may be very shader dependent (if it scales well: there are 800 shaders on a 4870 and 1600 on the 4870X2, though the way the actual hardware is arranged it doesn't work quite like that). But lowering settings does not always increase performance. In the Call of Juarez benchmark I actually saw much better performance with the GTX 280 using the high setting than the low one (not AA, though).

Then again, who knows?
 
Also, the RV770 GPU doesn't suffer at higher resolutions and AA settings the way the G92 did, so his card is less likely to take hits than yours is.
 
I don't really think the fps is lower with the patch.

So you basically have 4 times my performance, and you're running a much higher resolution. And I'm running the card that gets the #2 spot in the OP's article, the 9800GX2. I don't even see how this is possible; how on earth can the 4870X2 be that much faster? Is it purely a driver issue, or does the X2 just have that much more raw power? Is this the GDDR5 flexing its muscles?

Going from full AA to 0 AA gives me a 50% performance boost, giving me a whopping 15 fps in the scene where you had 38/42 fps.

Let me see if I can get this game patched and running with lighting distance at max. I am not sure if my AA in game is working fully. Some things are anti-aliased, and some things still have jaggies.
 
Let me see if I can get this game patched and running with lighting distance at max. I am not sure if my AA in game is working fully. Some things are anti-aliased, and some things still have jaggies.

Consider me confused. I put everything in the game at 100%, including AA (I too have noticed AA doesn't seem to work on all objects), plus lighting distance and shadow quality, the ones unlocked in the patch. And now, in the scene we were talking about, I'm suddenly getting 50 fps where yesterday I had around 10 fps? I started a new game too, so it's not the in-game weather (which has a huge impact on fps). If I'm on the ground looking towards the bonfire and the guys sitting there, I'm now getting around 30 fps.

I'm starting to wonder if I'm actually experiencing driver problems which cause slowdowns in the game until I restart my computer, because the game is actually playable now with everything maxed (fps gets higher outside the starting camp, as you've noticed). The only difference since yesterday is the freshly restarted computer, 100MHz higher shader clocks, and 50MHz more on the video memory.

I'm gonna lower the clocks again and see if that's what's causing the major difference.

Edit: Lowering the clocks made no difference whatsoever. Guess I just have to restart the computer if the game performance starts acting oddly, then.
 
Try ALT-TAB to the desktop when a slowdown hits and then go back into the game. There might be a memory leak in the game. BTW, what resolution were you using each time?
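If you suspect a leak, one rough way to check is to log the game's memory use while you play and see whether it only ever climbs. A minimal sketch, assuming Python with the third-party psutil package installed and a process name of xrEngine.exe (that's what the screenshot filenames suggest; adjust it to whatever Task Manager shows):

```python
# Rough sketch: log the game's resident memory every 10 seconds.
# Requires the third-party psutil package (pip install psutil).
import time
import psutil

def find_game(name="xrEngine.exe"):  # assumed process name; change if needed
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == name.lower():
            return proc
    return None

game = find_game()
while game is not None and game.is_running():
    rss_mb = game.memory_info().rss / (1024 * 1024)
    print(f"{time.strftime('%H:%M:%S')}  working set: {rss_mb:.0f} MB")
    time.sleep(10)  # numbers that climb steadily and never come back down would suggest a leak
```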
 
Consider me confused. I put everything in the game at 100%, including AA (I too have noticed AA doesn't seem to work on all objects), plus lighting distance and shadow quality, the ones unlocked in the patch. And now, in the scene we were talking about, I'm suddenly getting 50 fps where yesterday I had around 10 fps? I started a new game too, so it's not the in-game weather (which has a huge impact on fps). If I'm on the ground looking towards the bonfire and the guys sitting there, I'm now getting around 30 fps.

I'm starting to wonder if I'm actually experiencing driver problems which cause slowdowns in the game until I restart my computer, because the game is actually playable now with everything maxed (fps gets higher outside the starting camp, as you've noticed). The only difference since yesterday is the freshly restarted computer, 100MHz higher shader clocks, and 50MHz more on the video memory.

I'm gonna lower the clocks again and see if that's what's causing the major difference.

Edit: Lowering the clocks made no difference whatsoever. Guess I just have to restart the computer if the game performance starts acting oddly, then.

That is not that uncommon, bro. I see the same thing in COJ: set everything to low and I max out at 30 FPS, turn everything up high and it's 60. Especially with new/unoptimized games.
 
I'm at a loss here. When I try playing the game again now, I suddenly get the craptastic performance again, even after a fresh restart. Must be something odd with my computer then I guess.
 
I like how the site's benches use 1024x768, 1280x1024 and 1680x1050, all with no AA/AF, 'cause that is how I play my games.
 
The original S.T.A.L.K.E.R. was a great game. Mainly because it was an FPS and more. It was like Oblivion with guns. :cool:
 
I'm at a loss here. When I try playing the game again now, I suddenly get the craptastic performance again, even after a fresh restart. Must be something odd with my computer then I guess.

What resolution were you using?
 
I played through some of STALKER and I have to say it is one of my least favorite FPS's in recent memory.
 
I'm running it at only 1280x1024.

I tried it at 1440x900 and I was getting about 70-90fps with everything maxed during a nice gunfight early in the game. At 1920x1200 I was getting 20fps with the new patch. I almost want to go back to the unpatched game, as it runs better. Also, the game just crashed after I accidentally shot a friendly, and he ran over to another stalker and appeared to ram his head up the other's ass, as a 3rd spun around in place. Really freaking weird.
 
xrengine200809130033565cs6.jpg

xrengine200809130035136kj6.jpg


I suspect I know what was wrong now. When I was getting high fps, the game must not have actually set DX10 mode (even though the menu said so).

Above I posted screenshots, one with the game set to Enhanced Full Dynamic Lighting (DX10) and one with just the regular Enhanced Full Dynamic Lighting. In both cases all other settings, including AA, were maxed.

As you can see, the DX10 version does indeed look better; less 'washed-out'. However, this comes at a cost of *41* frames per second in my case. So deactivating DX10 actually gives me a 400% increase in fps in this specific scenario.
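For what it's worth, those two figures are consistent with each other. A quick back-of-the-envelope check, using assumed round numbers rather than exact measurements:

```python
# Back-of-the-envelope check of the DX10 cost quoted above (assumed round values).
dx9_fps = 51   # roughly what I see in that scene with DX10 lighting off
dx10_fps = 10  # roughly what I see with DX10 lighting on

cost = dx9_fps - dx10_fps               # 41 fps lost to DX10 mode
gain = (dx9_fps - dx10_fps) / dx10_fps  # ~4.1x extra, i.e. roughly a 400% increase

print(f"DX10 costs {cost} fps; turning it off is about a {gain:.0%} gain")
```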
 
I tried it at 1440x900 and I was getting about 70-90fps with everything maxed during a nice gunfight early in the game. At 1920x1200 I was getting 20fps with the new patch. I almost want to go back to the unpatched game, as it runs better. Also, the game just crashed after I accidentally shot a friendly, and he ran over to another stalker and appeared to ram his head up the other's ass, as a 3rd spun around in place. Really freaking weird.

Dude, you just made me lolz all over the place, haha.

Maybe you accidentally loaded the "gay" bullets?

Note to self: don't install the patch. Enjoy.
 
You say this:

Like I said, they might be leaving the logos, but ATI has paid them a ton of money to put out a *special* patch for their game. Trust me :)


And then say this:


Oh, I presume NVIDIA stormed into Ubisoft's HQ and removed the code themselves eh? lawl... :rolleyes:

It's probably pretty easy to play victim when you own a 4850 huh? ;)

Am I the only one who sees the irony? :rolleyes:
 
You say this:




And then say this:




Am I the only one who sees the irony? :rolleyes:

Not at all, I just didn't think it worthwhile; the discussion was interesting if you read around the fanboy posts. This is one of the first new games to challenge the new cards, and I'd like to see the results. Not that you're not right about it, though. I was just ignoring it.
 
How are you guys playing a game that's not out yet? PIRATES!
In Russia the game came out August 22nd. In America the game came out as well, but was recalled partly due to a misprint of CD keys. It's not out on Steam yet, due to the publisher being a dick about retail sales.
 
Any word out yet about what kind of performance those of us with HD 3870s will experience (particularly in DX9)? I've been holding off on upgrading until after this winter, and am hoping to get some decent frames in this game when I pick it up on Monday.
 
Stalker Clear Sky is another TWIMTBP game.
From the bench, the 4870X2 is clearly the winner.
Notice that the 4870 is just ONE freaking frame behind the GTX 280,
and
the 4850 beats the GTX 260 at high resolution.

So much for the notion that the X2 / CF needs a good profile to work properly.
So much for "a single-GPU card performs consistently".
:D

It looks like ATI cards are better at newer games, while Nvidia cards are, umm, still good at Crysis.


http://www.pcgameshardware.com/aid,659033/Reviews/Stalker_Clear_Sky_DX10_Benchmark_Review/?page=2

Lovely generalizations. I don't get this fanboi mentality that thinks you have to get on a soapbox and "try" to run down other brands/options to justify your purchase. The current generation from both companies plays games extremely well. One side isn't seriously trumping the other at any price point. It's all good.

Also, you can glean reviews all over the net using different systems/games/benchmarks that will show one card winning over the other. It really doesn't matter much; they're all fine. There's no doubt the HD 4870 X2 is faster in most games - it should be, with two chips and $130+ more. And sooner or later Nvidia will come out with a GTX 280 GX2 and it'll be back to the HD 3870 X2/9800 GX2 situation. Back and forth.
 
Lovely generalizations. I don't get this fanboi mentality that thinks you have to get on a soapbox and "try" to run down other brands/options to justify your purchase. The current generation from both companies plays games extremely well. One side isn't seriously trumping the other at any price point. It's all good.

Also, you can glean reviews all over the net using different systems/games/benchmarks that will show one card winning over the other. It really doesn't matter much; they're all fine. There's no doubt the HD 4870 X2 is faster in most games - it should be, with two chips and $130+ more. And sooner or later Nvidia will come out with a GTX 280 GX2 and it'll be back to the HD 3870 X2/9800 GX2 situation. Back and forth.
keep on dreaming :D
 
We are not going to see a GTX 200 X2 anytime soon; that will not happen. Nvidia will be doing something, but there is no way: the 4870X2 is already a hair dryer, and a dual-GPU GTX 200 series card would be worse.

A simple solution for Nvidia is to enable SLI on any PCIe motherboard. At some point they are going to have to decide if they want to sell more cards or keep selling crappy motherboards. They have to realize how much in sales their proprietary motherboards are costing them.
 
You say this:




And then say this:




Am I the only one who sees the irony? :rolleyes:

There is no irony. I know for a fact money changed hands in one case to push the technology ATI is claiming is so essential (which is why it required $?), and I know for a fact money did not change hands in the other. Ubisoft removed the code, not NVIDIA. I'm not going to give up my sources, though.

I apologize that you can't be inside the same circles I am, though <3
 