Is it just me, or are last-generation cards not keeping up in FEAR and COD2?

USMC2Hard4U said:
That's why they are "Last Gen" cards....
.... I bet FEAR and COD2 will run great on next-gen cards. I believe that's their purpose.


So their purpose is to be OK but not great, or even very "playable," until 6-12 months from now when a better video card comes along. Kind of like buying a car that can only go 25 mph until better gas is available....
 
dderidex said:
Did you READ the article?

Seems like not.

They *used* the 1.01 patch. AND beta release drivers from ATI and nVidia (both said to improve FEAR performance).

The numbers they got were still completely unacceptable. I have an LCD, so gaming at the native res is the only option. And if you think 'no' or '2x' AA is acceptable in a shooter, you are crazy.

At 1600x1200, 4xAA, 8xAniso, game settings maxed, there isn't a card out today you can buy that would provide acceptable framerate in that game.

Heck, even with NO AA or Aniso applied, at 1600x1200, with soft shadows, the game is unplayable on any card at all.

What kind of crackhead releases a game that's playable on NO CARDS AT ALL?? Or, rather, only playable to people with lower standards.


Yup, I can't even run 1024x768 with max everything and get consistently playable FPS in FEAR. In fact, the game auto-detects my system and video card as 'medium'.
 
The COD2 demo runs fine on my system at 1024x768 on my 5700 Ultra 128MB. The original FEAR demo ran like crap, but that's to be expected. So did the Serious Sam 2 demo. You just gotta keep up with the times, and if you want those 16x12 4xAA options, get a 2nd job or go back to your Xbox/PS2.


My rig:
P4 3Ghz 800 HT
1gb Mushkin DDR400
evga 5700Ultra 128mb 544/1.04 (stock 300/906) on air
WD 120gb SE/ WD 40gb non-SE
Sony DVD-R/W, Samsung CD-R/W, Pioneer DVD
 
Netrat33 said:
I don't understand why people always HAVE to play at 1600x1200 and up.

You know lower resolutions often look just as good. I mean...aren't you playing games for...the game itself?

On LCDs they don't. Mine does a good job dropping down to 1280x1024 in games, but I can still notice the change to a non-native resolution. 1024x768 looks like ass on my monitor.
 
PC gaming is all about evolution: games push the boundaries, and we upgrade. Games are really the only thing pushing consumer 3D graphics forward. Just think of how much farther we have to go to render a perfectly realistic image in realtime 3D. I mean, just look at how games are made now: only the surfaces of objects are rendered. There is no actual depth to a 3D model; only the surfaces of the polygons have textures. We have a long way to go, and for that, computer hardware must be pushed forward. Games are a big driving force.
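To picture what "only the surface" means, here's a minimal sketch (my own illustration, not code from any real engine) of how a game model is typically stored: a hollow shell of textured triangles, with nothing at all describing the interior.

Code:
#include <vector>

// A point on the model's surface, plus where it samples its texture.
struct Vertex {
    float x, y, z;   // position on the surface
    float u, v;      // texture coordinates into a 2D image
};

// The whole "3D" model: surface triangles only.
struct Mesh {
    std::vector<Vertex> vertices;    // surface points
    std::vector<unsigned> indices;   // every 3 indices form one triangle
    // Note what's missing: nothing here represents the inside of the
    // object. Slice the mesh open and there is nothing to render.
};

int main() {
    Mesh tri;
    tri.vertices = { {0, 0, 0, 0, 0}, {1, 0, 0, 1, 0}, {0, 1, 0, 0, 1} };
    tri.indices  = { 0, 1, 2 };   // one surface triangle, zero volume
}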
 
Re: FEAR: at 12x10 with everything on max except 2xAA and 8xAF, I get something like 80% of frames above 40fps in that demo, and at 16x12 with the same settings I get 60% between 25 and 40 and only 1% below 25. If I bump the AA and AF up to max, then I get the same fps at 12x10 as I did at 16x12 with AA and AF turned down a notch. I can't remember another time when the fastest hardware on the market wasn't enough to max out the settings in a game, but that's not necessarily a bad thing. And at 16x12 you don't need AF and AA anyway.
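For anyone wondering where those percentages come from: the built-in test just buckets the frames it captures into below-25, 25-40, and above-40 ranges and reports each bucket's share. A rough sketch of the arithmetic (my own reconstruction with made-up samples, not the game's actual code):

Code:
#include <cstdio>
#include <vector>

int main() {
    // Hypothetical per-frame fps samples logged during a test run.
    std::vector<double> fps = {55, 62, 38, 24, 41, 47, 33, 70, 29, 52};

    int below = 0, mid = 0, above = 0;
    for (double f : fps) {
        if (f < 25)       ++below;   // choppy
        else if (f <= 40) ++mid;     // borderline
        else              ++above;   // smooth
    }
    double n = fps.size();
    std::printf("below 25: %.0f%%   25-40: %.0f%%   above 40: %.0f%%\n",
                100 * below / n, 100 * mid / n, 100 * above / n);
}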
 
GregP24 said:
Yup, I can't even run 1024x768 with max everything and get consistently playable FPS in FEAR. In fact, the game auto-detects my system and video card as 'medium'.


Damn, and you have the best card out...

HydroSqueegee said:
If you can't enjoy a game just because it doesn't look pretty, then why are you playing the game?


Well, it has to at least run smooth, come on now.


Well guys, I think since BF2 set the 2-gig standard... pretty soon (possibly now) it will become a 2-gig and two-video-card standard. Don't you think?
 
Does the game really look THAT much better than HL2 and Doom 3? Seriously...

I haven't played FEAR yet. Now if it does indeed look 2x better than those two games, OK, I can see the poor performance. But if it doesn't look that much better, then it is probably just a poorly coded game. Like Halo was.
 
Netrat33 said:
I don't understand why people always HAVE to play at 1600x1200 and up.

You know lower resolutions often look just as good. I mean...aren't you playing games for...the game itself?

Some of us have LCDs where the graphics look like ass when you play at resolutions other than the LCD's native res.

My A64 3000+, 9800 Pro, 1GB RAM system would barely even run CoD2 at my Dell 2005FPW's native 1680x1050 resolution. I ended up running it at some gawd-awful 800x480 resolution (or something like that) and it looked terrible.

Edit: in comparison, BF2 and HL2 run pretty nicely on my system.

That is why.
 
Riptide_NVN said:
Does the game really look THAT much better than HL2 and Doom 3? Seriously...

I haven't played FEAR yet. Now if it does indeed look 2x better than those two games, OK, I can see the poor performance. But if it doesn't look that much better, then it is probably just a poorly coded game. Like Halo was.


HL2 isn't even a current game engine; it looks good because of the years of art development that went into it.

Far Cry also ran pretty terribly on the video cards that were out at the time. It wasn't until the 6800 and X800 came out that it was feasible to play with everything maxed out.
 
tvdang7 said:
I'm with this guy... I think they could have made it more optimized and consumer-friendly :confused: I would hate to think of the numbers for the AGP-generation cards such as the 9800 Pro and 5900 XT. They probably SUCK.

I have an XP 3000+/1 gig/9800 Pro system, and although I bought F.E.A.R. on launch day, I haven't gotten past the 3rd level. It was immediately obvious I couldn't put off upgrading any longer. A shame to play a game like that at such low settings. Highest playable for me was 8x6, mostly medium, and it still lagged occasionally. And the textures look like crap compared to the high setting. So next week when I build the new rig I'll have F.E.A.R. and Q4 to play. :)

Oh, and I installed it on my brother's system to test it. At 640x480 with everything low it's basically unplayable (XP 1700+/512MB/9600 Pro). So he gets my old rig and doesn't get the eye candy, but he's not [H], so it doesn't bother him as long as it plays.
 
2GB of RAM helps a lot... if you have 1GB, I recommend getting 1 more GB before you bash this game, cuz it seems that the people with 2GB of RAM have better frame rates.
 
These guys who are saying they get 60+ fps at all these crazy settings in FEAR and CoD2 are lying to make themselves feel better. We are all feeling the pinch.
 
Cabezone said:
HL2 isn't even a current game engine, it looks good because of the years of art development that went into it.
That didn't really address my question. HL2 not being a "current" game engine (debatable) is completely beside the point.

Yes or no: does FEAR look so much better than HL2 and Doom 3 that the performance issues should be expected? Or... not? To me, if it runs half as fast as those games, then it had better look twice as good. If not, then it's probably poorly coded.

I haven't played the game yet.
Far Cry also ran pretty terribly on the video cards that were out at the time. It wasn't until the 6800 and X800 came out that it was feasible to play with everything maxed out.
Oh, I agree, and even now I don't play Far Cry w/4xAA even with my X850. Too slow at 1920x1200. But again... this is not really answering my question. Far Cry was noticeably much nicer than anything else out at the time, so those performance issues were to be expected. Is that the case this time around? That is what I'm getting at here.
 
Riptide_NVN said:
That didn't really address my question. HL2 not being a "current" game engine (debatable) is completely beside the point.

Yes or no: does FEAR look so much better than HL2 and Doom 3 that the performance issues should be expected? Or... not? To me, if it runs half as fast as those games, then it had better look twice as good. If not, then it's probably poorly coded.

I haven't played the game yet.
Oh, I agree, and even now I don't play Far Cry w/4xAA even with my X850. Too slow at 1920x1200. But again... this is not really answering my question. Far Cry was noticeably much nicer than anything else out at the time, so those performance issues were to be expected. Is that the case this time around? That is what I'm getting at here.

A major chunk of the performance hit is from the realtime shadows. Every shadow in the game is rendered in realtime, and there's really no option to tone the shadows down. It's either off or on. That doesn't make it look a ton better until you're using a flashlight or moving lights around. So whether or not that makes it look "twice as good" is up to the person.

It doesn't mean that the game is poorly coded.
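Here's a loose sketch of why that's an all-or-nothing cost (my guess at the idea, not actual FEAR/Monolith code): every light re-processes the scene geometry each frame to build its shadows, so the only cheap "quality" knob is skipping the pass entirely.

Code:
#include <cstdio>
#include <vector>

struct Light { int id; };
struct Scene { int triangleCount; };

// Stand-in for the expensive part: rebuilding shadow geometry per light.
void renderShadowPass(const Light& l, const Scene& s) {
    std::printf("light %d: shadow pass over %d triangles\n", l.id, s.triangleCount);
}

void renderLitGeometry(const Light& l, const Scene&) {
    std::printf("light %d: lit geometry\n", l.id);
}

void renderFrame(const Scene& scene, const std::vector<Light>& lights,
                 bool shadowsEnabled) {
    for (const Light& light : lights) {
        if (shadowsEnabled)
            renderShadowPass(light, scene);  // full cost, every light, every frame
        renderLitGeometry(light, scene);
    }
    // No middle setting: either every shadow is recomputed each frame or
    // none are, which matches the off/on switch described above.
}

int main() {
    Scene scene{50000};
    std::vector<Light> lights{{0}, {1}, {2}};
    renderFrame(scene, lights, true);   // shadows on: double the passes
    renderFrame(scene, lights, false);  // shadows off
}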
 
Look, uh, we just happen to be living in this part of history where computational machines are beginning to touch the ability to render fantasy close to reality. Because of the world economic system, generations of enthusiast boards permeate the market, creating incentive for companies and corporations to push the envelope. One day there will be a card that can do it all, or no card at all. Right now we have to accept the fact that we are making history.


In short, stop bitching.
 
Mister E said:
These guys who are saying they get 60+ fps at all these crazy settings in FEAR and CoD2 are lying to make themselves feel better. We are all feeling the pinch.
What is it you're saying, exactly? FEAR is a resource-hogging bitch, and those with enough resources to hog are obviously going to be the ones with playable (and above) framerates. At 1280x1024 with 2xAA and 8xAF I get WAY above playable framerates; so fast, in fact, that I think I'll go back to 4xAA.
COD doesn't interest me, so I have no input there. But I get a STEADY 70fps average in BF2 at 16x12 with everything maxed, and that makes me feel like more of a man, so :rolleyes: on you.
P.S. Soft shadows kick ass.
P.P.S. That little girl in FEAR scares the shit out of me.
 
I laugh at those who bought LCDs. :p

CRTs all the way -- they can go down to 640 x 480 and it still looks good. :)

-J.
 
GeForceX said:
I laugh at those who bought LCDs. :p

CRTs all the way -- they can go down to 640 x 480 and it still looks good. :)

-J.

640x480 looks bad on anything except a 14" CRT.

Anyone gaming at 640x480 has my deepest sympathies.
 
tranCendenZ said:
The 1.01 patch for FEAR helps performance a lot.

Not for me. I have seen zero proof of this, yet you've posted it several times.

If you guys want to talk about how the PC affects the relationship, take it to General Mayhem; this isn't the place. - p[H]
 
Soul.Survivor said:
640x480 looks bad on anything except a 14" CRT.

Anyone gaming at 640x480 has my deepest sympathies.

Not even a 21" CRT? Sure it looks good. Perhaps you haven't realized you could resize the screen...

-J.
 
GeForceX said:
Not even a 21" CRT? Sure it looks good. Perhaps you haven't realized you could resize the screen...

-J.

Are you joking? The point of having a 21" screen is, oh yeah, a large viewable area. IDK about you, but it seems kinda ridiculous to make the actual screen smaller. Thus the 14" comment.
 
Project_2501 said:
Are you joking? The point of having a 21" screen is, oh yeah, a large viewable area. IDK about you, but it seems kinda ridiculous to make the actual screen smaller. Thus the 14" comment.

Well, playing console games on a big TV is pretty much the same thing, and people do that all the time.
 
Depends on the console, though. An Xbox supports the various HD resolutions, so it doesn't resize anything, and it looks like it should.

I honestly can't see playing SNES on a 57" plasma HDTV... :rolleyes:
 
USMC2Hard4U said:
That's why they are "Last Gen" cards.

Of course they are not gonna do that well. These games coming out now aren't even meant to play well on current hardware. These games are stressing the best of the best, thus pushing hardware makers to build new and better hardware.

I bet FEAR and COD2 will run great on next-gen cards. I believe that's their purpose.
Great, like I'll care about playing F.E.A.R., Quake 4, and Call of Duty 2 in 6 months after I've already beaten them. I think the only reason I'd care is if they had multiplayer equal to Battlefield.
 
Riptide_NVN said:
Does the game really look THAT much better than HL2 and Doom 3? Seriously...

I haven't played FEAR yet. Now if it does indeed look 2x better than those two games, OK, I can see the poor performance. But if it doesn't look that much better, then it is probably just a poorly coded game. Like Halo was.
Agreed. F.E.A.R. is fun, but I'm more impressed visually with Quake 4. Call me crazy, but the environments get dull after miles & miles of catwalks, pipes, toolboxes, and the same damn hammer with a blue handle. Quake 4 has great visuals, while FEAR has more tactical fun.
 
It'd be worth the performance hit if it looked any better than HL2 or Doom 3, which it doesn't :/.
 
Smith said:
It'd be worth the performance hit if it looked any better than HL2 or Doom 3, which it doesn't :/.

At 1024x768 I'd agree. But when you put the res above 1280x1024 with max details and shadows and manage to stay above 40fps, FEAR looks really, really good to me. Better than Doom 3, and as good if not a tad better than HL2.
 
Let's put it this way: with soft shadows, everything looks very pretty, much better than anything out there. Unfortunately, that is a totally unfeasible option for 95% of gamers. On my 6800 GT, there is lag between firing a weapon and the bullet discharging, just because of soft shadows. It's like Riddick if you enable the 2.0+++ specification, which must be soft shadows... totally unplayable.
 
Doing some testing of a 7800 GTX OC, I found 1280x960 with 2x TrSSAA/16xAF and no soft shadows to be the highest playable setting in FEAR. It seems enabling 4xAA (no TrSSAA) really brings the framerate down, and going to 1600x1200 brings it down just as much.

It seems you have to choose: do you want soft shadows in FEAR, or AA? Even with a 7800 GTX.

Tough game, for sure.

I guess it's really built to stand the test of time. It'll look great on next-gen hardware, I bet.
 
tvdang7 said:
I'm with this guy... I think they could have made it more optimized and consumer-friendly :confused: I would hate to think of the numbers for the AGP-generation cards such as the 9800 Pro and 5900 XT. They probably SUCK.

Well, you don't have to have all the settings on "maximum" or "high" to determine what is "playable" or "not playable." It's the next generation, and instead of calling the mode "Ultra" like in Doom 3, F.E.A.R. is preparing for the future by calling it "high." Sure, they could have just moved everything up a notch, so normal was high, high was ultra, and some crazy setting above that was super-ultra, but why bother?

I'm actually glad they even gave us that option. What if there were just normal, low, and lower? Then sure, we could say we can play it at max settings with great fps, but we'd never know just how good it could be once we do have cards to play it.

I play with everything on normal and it looks great, on par with Doom 3. When I play it with things higher, it looks even better, like Doom 3 in Ultra mode. But just as I can't play Doom 3 on Ultra, I can't play this on maximum. I'll be happy in a couple of months when I can with a GeForce 8-series graphics card.
 
kubebot said:
Great like i'll care about playing F.E.A.R.,Quake 4, and Call of Duty 2 in 6 months after I've already beaten them. I think the only reason i'd care is if they had multiplayer equal to Battlefield.

People still play Quake 3 and Call of Duty 1, if that counts... and both of those are more than 6 months old.
 
Smith said:
It'd be worth the performance if it looked any better than HL2 or Doom 3, which it doesn't :/.
Well, that's at least one person that doesn't think this game is all that and a chicken wing. lol

I'll buy it once I get my 7800GTX installed and see how I feel about it. If it doesn't look much better than HL2 and it runs like shat, I'll be pretty unimpressed.
 
"I play with everything on normal and it looks great, on par with Doom 3. When I play it with things higher, it looks even better, like Doom 3 on Ultra mode. But just as I can't play Doom 3 on Ultra, I can't play this on maximum. I'll be happy in a couple months when I can with a geforce 8 series graphics card."

Dude, your post is ridiculous because you make it sound like money grows on fucking trees. "Well, gee, who cares if the game runs like shit for most people NOW; a year from now it will run great on yet ANOTHER $500 video card."

Wtf? Where is the logic here? For you, maybe buying expensive cards is some kind of addiction or money drain, but most gamers want an investment, not a tease.
 
I've been playing FEAR at 1280x800 with everything at max settings and 2xAA/4xAF on the rig in my sig (yes, that low of a res on my 2405), and it looks way better to me than 1680x1050 with no AA. Plays better too, with a 59fps average in that built-in test. Scaling lower resolutions on this monitor is no big deal IMO. Way better looking, as well, than trying to salvage some shred of hope playing 1920x1200 with medium settings at like 15fps.
 
Charles said:
People still play Quake 3 and Call of Duty 1, if that counts... and both of those are more than 6 months old.
I was referring to the single-player aspect. I doubt there will be a lot of people playing FEAR or Quake 4 multiplayer as opposed to BF2 & CS:S. My point is I don't upgrade so that my old games look better; I've played & beaten them. I upgraded for BF2 because I play it a lot and have ever since it came out.
 