i hope it can play far cry 2 though.
Trust me. A 9800GX2 WILL play FarCry 2.
dammit it sucks that i have a 9800gx2 which is like the ultimate buyer's remorse item right now because of eol and all that. oh well, considering i got it for only $250 it's not bad. i hope it can play far cry 2 though.

Good point, it is kind of "the ultimate buyer's remorse item" right now!
If that rumoured price of $500 for the GeForce GTX 280 holds true, then what will be the price for the GTX 260, which also sounds pretty interesting?
time to upgrade says the person with the 9800GX2... you make meh sick sir!! im off to a corner to cry...

If you already paid the "entry fee" for it, why not "step up" for little or no money besides a small shipping fee? That is, if I understand "step-up" correctly, and both the GX2 is $599 and the GTX 280 will be $599.
personally my pockets are still burnt from my 8800GTX!!!

Best buy I ever made, in Nov 2006, so if you bought it then you should not feel any pain in the wallet, even for giving it away or killing it. If you bought during late 2007/2008 then I feel sorry for you, especially if you bought an Ultra for way too much compared to a GTX.
That's why you buy EVGA (or BFG for that matter), which allows Step-Up for a small fee, and their cards are not any higher in price either.
Should be tackling the $400-450 range.
Don't expect prices that low for GT200.
Expect something like this, since the top card has always been around $600 USD:
GeForce 9900GTX 1GB GDDR3: $599
GeForce 9900GTS 896MB GDDR3: $549
GeForce 9800GTS 512MB (aka G92) GDDR3: $249
That's not what several rumors point to. The GTX 280, aka 9900 GTX, is rumored to cost no more than $499. The "lower" high-end model, the GTX 260 aka 9900 GTS, should cost something around $400-450.

That only points to the GTX 280 not being as good as rumored if it can be sold for $499, or a higher $599 card will be out later, perhaps a 9900GX4 for "a bit more".
Trust me. A 9800GX2 WILL play FarCry 2.
How do you know that?
I don't think it's coded as efficiently as it could be. The framerate is poor even in sections with crappy graphics. STALKER blows Crysis away in the indoor environments and can be maxed out at 1280 with just an 8600GT and a modern dual-core CPU. Most of the indoor scenes in Crysis look very poor, to say the least, yet the framerate doesn't reflect that. DX10 has little to do with it, since even on DX9 Crysis gets basically the same framerates.
STALKER looks like utter shit compared to Crysis, anywhere. If you wanna talk about a poorly coded game, STALKER is it.
I do hope the cards are bringing a big performance increase. I have grown quite tired of my 8800.
Yes, they changed the naming.

Thanks.
Ok, this is all I can take.
STALKER was not poorly coded to the point that it affected performance. It had bugs that would mess up certain missions/story/etc., but the overall engine of the game was nearly perfect.
I can't say that about Crysis. It had such poor coding that the performance sucked complete ass. Unless a person was willing to pay over $2000 on decent computer parts, prepare to run Crysis like sh!t. It should not take 2 (or 3) OC'ed 8800GTXs just to get Crysis to run on DX10 at NEAR 60 fps. Try playing it with an 8800GTS 320/640. At 1280x1024, all settings medium (or low), it never even reached 60 fps, more like 30 fps at best. Try playing ANY other game on an 8800GTS 320/640 at the same resolution. They ALL run at or above 60 fps with all settings at full.
As for Crysis' graphics, they weren't that impressive. STALKER, Jericho, COD4, and UT3 all looked way better than Crysis did, and those didn't take an 8800GTS/X/Ultra to run at 30 fps at max settings.
Crysis had shitty programming. End of story.
EDIT: and btw, I do look forward to seeing the new cards. I wonder how well they will be able to run Crysis. GTX 200 playing Crysis with max settings at 20 fps, ftw.
I have seen nothing in Crysis that indicated poor coding. Crysis was simply too ambitious in its graphical quality and released a bit too soon. That said, your entire point boils down to "It can't do X FPS at Y settings while game Z can, so clearly it is poorly programmed", which, frankly, is complete and utter bullshit and reeks of childish behavior. There are also numerous hidden costs in some of the choices the devs made in Crysis, such as the day/night cycle which basically means that the lighting for the entire level must be dynamic (which will greatly increase the poly count for the terrain just to maintain lighting quality). Toss in the near-fully destructible environments, and there just aren't many major optimizations you can do.
Wow, you guys made some good points that I didn't take into consideration. Thanks!
I suppose I was simply comparing the base graphics but not counting things like destructible environments and the dynamic lighting, which the other games I had listed didn't really have.
Aw man... That's like a week after my Step-Up expires.
Is eVGA lenient about this sort of thing? Could I talk them into letting me step up?
All of that is assuming it does actually launch 6/18.
finally a real 1gb card. nice

QFT. I was getting tired of seeing those 1GB cards with 256-bit and lower bus widths. At least with the 8800GTS and GTX (pre-G92) we had 320-bit (and 384-bit on the GTX). The only card that really took advantage of 1GB was the HD 2900 series with its 512-bit bus.
Well... no. The HD 2900 XT 1024MB didn't perform much better than the HD 2900 XT 512MB; the performance increase was minimal.
Yes, STALKER is buggy and the outside scenes are nothing great. Crysis is awesome outside, but you're out of your mind if you don't think the inside environments of STALKER are much better looking than Crysis. STALKER is the BEST looking game inside and it's not even a contest. Also, STALKER can show those awesome inside environments even on a mid-range card, and that was my point.
Fortunately, as games continue to use higher resolution textures and high-res monitors continue to get cheaper, having 1GB of memory with reasonable bandwidth will become justified.
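For anyone curious why bus width keeps coming up in the 1GB debate: theoretical peak memory bandwidth is just bus width times effective memory clock. A minimal sketch of the arithmetic; the clock figures below are the commonly quoted reference specs for these cards, so treat them as assumptions:

```python
# Theoretical peak memory bandwidth in GB/s:
# (bus width in bytes) * (effective memory clock in GHz).
def mem_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    return (bus_width_bits / 8) * (effective_clock_mhz / 1000)

# Reference-spec effective clocks (assumed figures, double the base GDDR3 clock):
cards = [
    ("8800 GTS 640 (320-bit, 1600 MHz eff.)", 320, 1600),
    ("8800 GTX (384-bit, 1800 MHz eff.)", 384, 1800),
    ("HD 2900 XT (512-bit, 1650 MHz eff.)", 512, 1650),
]
for name, bus, clk in cards:
    print(f"{name}: {mem_bandwidth_gb_s(bus, clk):.1f} GB/s")
```

This is why the 512-bit HD 2900 XT tops the bandwidth chart even at a lower memory clock, and why a 256-bit 1GB card can end up with less bandwidth than a 320-bit 640MB one.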
If you're comparing the inside environments of STALKER to the inside environments of Crysis... then maybe... how many times did you have to go inside a building in Crysis? 4? If you're insinuating the inside environments of STALKER are better than Crysis in general, well, ok, that's your opinion, but consider the environment... huge levels with every attention to detail and dense foliage VS... the inside of a building...

It's not an opinion that STALKER looked better inside, it's a fact, and anybody with a set of eyes knows this to be true. If you think differently then you, at the very least, need glasses.