Geforce GTX 200 launch on June 18th

Status
Not open for further replies.
Dammit, it sucks that I have a 9800GX2, which is like the ultimate buyer's remorse item right now because of EOL and all that. Oh well, considering I got it for only $250 it's not bad. I hope it can play Far Cry 2, though.
Good point, it is kind of "the ultimate buyer's remorse item" right now!
$250? Where did you buy it, in a bar? :D
 
Let's hope the dates are somewhat accurate. Although, I won't be running out to buy the latest and greatest this time around. My GX2 will have to choke on the next round of games before I spend any more money on a video card. It will be interesting to see how this card performs on a 30" monitor with the latest games at the time of release.
 
If that rumoured price of $500 for the GeForce GTX 280 holds true, then what will be the price for the GTX 260, which also sounds pretty interesting?

Expect around $550, no less, for the cheapest GT200; the card is not as cheap to make as the G92s, partly due to its 512-bit bus.

time to upgrade says the person with the 9800GX2... you make meh sick sir!! ;)

I'm off to a corner to cry...
If you already paid the "entry fee" for it, why not "step up" for little or no money besides a small shipping fee? That is, if I understand the step-up program correctly, given that the GX2 was $599 and the GTX 280 will also be $599.

Personally, my pockets are still burnt from my 8800GTX!!! :D:D:D
Best buy I ever made, back in Nov 2006. If you bought it then, you shouldn't feel any pain in the wallet even if you gave it away or killed it. If you bought during late 2007/2008, then I feel sorry for you, especially if you paid way too much for an Ultra compared to a GTX.

Dammit, it sucks that I have a 9800GX2, which is like the ultimate buyer's remorse item right now because of EOL and all that. Oh well, considering I got it for only $250 it's not bad. I hope it can play Far Cry 2, though.
That's why you buy EVGA (or BFG, for that matter), which allows Step-Up for a small fee, and their cards aren't any higher in price either.
Buying the cheapest card around is just fooling yourself and will burn you, just like with this card.
I have no remorse at all about the GX2; the brand and timing of the purchase feel like a bull's eye, just like the 8800GTX was, with monster performance in DX9 compared to the DX9 cards that were out at the time. I don't understand the people who waited (too long) for DX10 games before buying it and lost out on what made the 8800GTX the best value-over-time buy ever possible, with 16 months on the throne and still kicking some ass.

The GTX 280 for only the shipping cost feels just about right, and choosing EVGA even more so, as it did not come with a premium price worth talking about.

Even if the GTX 280 is "only" 10-15% faster than the GX2, the step-up feels just right, as it doesn't cost anything.
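For what it's worth, the step-up math above can be sketched like this (a rough sketch only: the $15 shipping fee is an assumed placeholder, not a confirmed EVGA/BFG figure, and the no-refund-on-a-cheaper-card behavior is my reading of how these programs work):

```python
def step_up_cost(original_price, new_price, shipping=15.0):
    # Step-up programs charge the difference in list price plus shipping;
    # the difference is floored at zero (no refund for stepping "down").
    return max(new_price - original_price, 0) + shipping

# GX2 bought at $599, GTX 280 listed at $599: you pay only shipping.
print(step_up_cost(599, 599))  # -> 15.0
```

So as long as the GTX 280 lists at the same $599 the GX2 did, the upgrade really is just the shipping fee.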

Should be tackling the $400-450 range.

Don't expect prices that low for GT200.

Expect something like this, as the top card has always been around $600:

GeForce 9900GTX 1GB GDDR3: $599
GeForce 9900GTS 896MB GDDR3: $549
GeForce 9800GTS 512MB (aka G92) GDDR3: $249
 
Don't expect prices that low for GT200.

Expect something like this, as the top card has always been around $600:

GeForce 9900GTX 1GB GDDR3: $599
GeForce 9900GTS 896MB GDDR3: $549
GeForce 9800GTS 512MB (aka G92) GDDR3: $249

That's not what several rumors point to. The GTX 280, aka 9900 GTX, is rumored to cost no more than $499. The "lower" high-end model, the GTX 260, aka 9900 GTS, should cost around $400-450.
 
My BFG step-up ends June 27th... it better come out the 18th!!!!
 
That's not what several rumors point to. The GTX 280, aka 9900 GTX, is rumored to cost no more than $499. The "lower" high-end model, the GTX 260, aka 9900 GTS, should cost around $400-450.
That only suggests the GTX 280 is not as good as rumored if it can be sold below $599, or that a higher-end $599 card will come out later, perhaps a 9900GX4 for "a bit more".
 
I'd be tempted by the shinies to get two of these. I'm not sure what to do about my 780i, though. If this thing requires a PCI-E 2.0 slot, that leaves me looking at upgrading to a 790i if I want to go tri-SLI. But I just don't think there will be a need for that if this card is everything they say it is.
 
Aw man... That's like a week after my Step-Up expires. :(

Is eVGA lenient about this sort of thing, could I talk them into letting me step up?

All of that is assuming it does actually launch 6/18. :)
 
I don't think it's coded as efficiently as it could be. The framerate is poor even in sections with crappy graphics. STALKER blows Crysis away in the indoor environments and can be maxed out at 1280 with just an 8600GT and a modern dual-core CPU. Most of the indoor scenes in Crysis are very poor-looking, to say the least, yet the framerate doesn't reflect that. DX10 has little to do with it, since even in DX9 Crysis gets basically the same framerates.

STALKER looks like utter shit compared to Crysis, anywhere. If you wanna talk about a poorly coded game, STALKER is it.
 
How do you know that?

From the rumors I've read, Far Cry 2 runs fine on current hardware. Remember, it's also on the consoles, and it wasn't developed by Crytek and doesn't use CryEngine 2.

Truth be told, who really NEEDS more GPU power for games? The only reason I want a GTX 280 or a pair is to play Crysis in all of its glory.
 
STALKER looks like utter shit compared to Crysis, anywhere. If you wanna talk about a poorly coded game, STALKER is it.

Ok, this is all I can take.

STALKER was not poorly coded to the point that it affected performance. It had bugs that would mess up certain missions/story/etc., but the overall engine of the game was nearly perfect.

I can't say that about Crysis. It had such poor coding that the performance sucked complete ass. Unless a person was willing to pay over $2000 for decent computer parts, they could expect Crysis to run like sh!t. It should not take two (or three) overclocked 8800GTXs just to get Crysis to run in DX10 at NEAR 60 fps. Try playing it with an 8800GTS 320/640: at 1280x1024 with all settings at medium (or low), it never even reached 60 fps, more like 30 fps at best. Try playing ANY other game on an 8800GTS 320/640 at the same resolution; they ALL run at or above 60 fps with all settings maxed.

As for Crysis' graphics, they weren't that impressive. STALKER, Jericho, COD4, and UT3 all looked way better than Crysis did, and those didn't take an 8800GTS/GTX/Ultra to run at 30 fps at max settings.

Crysis had shitty programming. End of story.


EDIT: and btw, I do look forward to seeing the new cards. I wonder how well they will be able to run Crysis. GTX 200 playing Crysis with max settings at 20 fps, ftw. ;)
 
I do hope the cards are bringing a big performance increase. I have grown quite tired of my 8800.

Same; there have been very few parts significantly enough better than the 8800GTX to justify the upgrade. If these are 2x the performance of the 8800GTX or better, then I'll be getting one sharpish; there are a lot of games I want to step up to 2560x1600 while playing. Crysis is one of them :D
 
STALKER, Jericho, COD4, and UT3 all looked way better than Crysis did.

I would say that STALKER and Jericho both have great graphics, but not quite on par with Crysis. COD4 and UT3 are simplistic in comparison and shouldn't even be mentioned in the same sentence as Crysis's graphics. Running Crysis on my rig at 1680x1050 using the Ultima Cuban Doom whatever config at Level 3, I average about 40 FPS and it looks significantly better, literally stunning compared to any UT3-based game.
 
Wait, I am confused: is the GTX 280 the official name for what we have been calling the 9900GTX? Or is it a different card, and the 9900GTX is coming out later?
 
As much as I want to keep upgrading my computer, I think I am stuck with my 8800GTXs for a while. I don't want the newer video cards becoming a bottleneck with my FX-60. I already have my FX-60 running @ 2.7 on air and my 8800s running a few steps faster than they should be. I can't keep pumping out $500 whenever a new video card comes out. I work full-time, but also attend school for my master's degree as well... plus the SF Bay Area is not cheap!!!! I think my next computer purchase will be a console, since most of the games I play are already out for those as well.
 
I hope this hits before June 27, because that is when my 8800GTX EVGA step-up ends. I bought my 8800 in hopes I could step up and have not even used it yet.
 
I will wait for ATI to put out their competitive product (hopefully it's competitive) and then wait a few months. I purchased my 8800GTS back in early January, and besides Crysis there really isn't any need to shell out another $500 (which would be about 75% of the cost of my current build). I seriously doubt Far Cry 2 will be any more intensive than Crysis.

Ok, this is all I can take.

STALKER was not poorly coded to the point that it affected performance. It had bugs that would mess up certain missions/story/etc., but the overall engine of the game was nearly perfect.

That's putting it nicely. My copy had several show-stopping bugs, to the point that I was never able to finish the game.

I can't say that about Crysis. It had such poor coding that the performance sucked complete ass. Unless a person was willing to pay over $2000 for decent computer parts, they could expect Crysis to run like sh!t. It should not take two (or three) overclocked 8800GTXs just to get Crysis to run in DX10 at NEAR 60 fps. Try playing it with an 8800GTS 320/640: at 1280x1024 with all settings at medium (or low), it never even reached 60 fps, more like 30 fps at best. Try playing ANY other game on an 8800GTS 320/640 at the same resolution; they ALL run at or above 60 fps with all settings maxed.

As for Crysis' graphics, they weren't that impressive. STALKER, Jericho, COD4, and UT3 all looked way better than Crysis did, and those didn't take an 8800GTS/GTX/Ultra to run at 30 fps at max settings.

Crysis had shitty programming. End of story.


EDIT: and btw, I do look forward to seeing the new cards. I wonder how well they will be able to run Crysis. GTX 200 playing Crysis with max settings at 20 fps, ftw. ;)

Crysis doesn't need 60 fps to be enjoyed at good settings. In fact, I've never played any other game that was as playable at low fps. I played through the whole game once without checking the framerate, at roughly 27 fps, without noticing.

STALKER's exterior environments would have passed as impressive back in 2005, when the game was originally slated for release. Unfortunately it slipped a couple of years and technology left it behind. The other games you mentioned don't have the expansive environments that Crysis has. The indoor environments in STALKER looked good, to be sure; I felt like I was playing two separate games.
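To put that "roughly 27 fps" in perspective, a quick back-of-the-envelope conversion from frame rate to frame time (just arithmetic, nothing game-specific):

```python
def frame_time_ms(fps):
    # Average milliseconds spent on each frame at a given frame rate.
    return 1000.0 / fps

# ~27 fps means each frame takes about 37 ms, vs. ~16.7 ms at 60 fps,
# so "playable at 27 fps" is roughly 2x the per-frame latency of 60 fps.
print(round(frame_time_ms(27), 1))
print(round(frame_time_ms(60), 1))
```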
 
Psht. I knew it all along, down to the eventual release date as well :). Wicked news though.

I wonder if the $499 price tag I've been quoted from a few sources will turn out to be true as well...

Heh, having made a few dollars selling PC parts so far this year, I'll have enough to buy one, and a waterblock, and give the 8800GTS to the GF to play The Sims and Oblivion :eek:.
 
I can't say that about Crysis. It had such poor coding that the performance sucked complete ass. Unless a person was willing to pay over $2000 for decent computer parts, they could expect Crysis to run like sh!t. It should not take two (or three) overclocked 8800GTXs just to get Crysis to run in DX10 at NEAR 60 fps. Try playing it with an 8800GTS 320/640: at 1280x1024 with all settings at medium (or low), it never even reached 60 fps, more like 30 fps at best. Try playing ANY other game on an 8800GTS 320/640 at the same resolution; they ALL run at or above 60 fps with all settings maxed.

As for Crysis' graphics, they weren't that impressive. STALKER, Jericho, COD4, and UT3 all looked way better than Crysis did, and those didn't take an 8800GTS/GTX/Ultra to run at 30 fps at max settings.

Crysis had shitty programming. End of story.

I have seen nothing in Crysis that indicated poor coding. Crysis was simply too ambitious in its graphical quality and released a bit too soon. That said, your entire point boils down to "It can't do X FPS at Y settings while game Z can, so clearly it is poorly programmed", which, frankly, is complete and utter bullshit and reeks of childish behavior. There are also numerous hidden costs in some of the choices the devs made in Crysis, such as the day/night cycle which basically means that the lighting for the entire level must be dynamic (which will greatly increase the poly count for the terrain just to maintain lighting quality). Toss in the near-fully destructible environments, and there just aren't many major optimizations you can do.
 
I have seen nothing in Crysis that indicated poor coding. Crysis was simply too ambitious in its graphical quality and released a bit too soon. That said, your entire point boils down to "It can't do X FPS at Y settings while game Z can, so clearly it is poorly programmed", which, frankly, is complete and utter bullshit and reeks of childish behavior. There are also numerous hidden costs in some of the choices the devs made in Crysis, such as the day/night cycle which basically means that the lighting for the entire level must be dynamic (which will greatly increase the poly count for the terrain just to maintain lighting quality). Toss in the near-fully destructible environments, and there just aren't many major optimizations you can do.

Wow, you guys made some good points that I didn't take into consideration. Thanks!

I suppose I was simply comparing the base graphics but not counting things like destructible environments and the dynamic lighting, which the other games I had listed didn't really have.
 
Ok, this is all I can take.

STALKER was not poorly coded to the point that it affected performance. It had bugs that would mess up certain missions/story/etc., but the overall engine of the game was nearly perfect.

I can't say that about Crysis. It had such poor coding that the performance sucked complete ass. Unless a person was willing to pay over $2000 for decent computer parts, they could expect Crysis to run like sh!t. It should not take two (or three) overclocked 8800GTXs just to get Crysis to run in DX10 at NEAR 60 fps. Try playing it with an 8800GTS 320/640: at 1280x1024 with all settings at medium (or low), it never even reached 60 fps, more like 30 fps at best. Try playing ANY other game on an 8800GTS 320/640 at the same resolution; they ALL run at or above 60 fps with all settings maxed.

As for Crysis' graphics, they weren't that impressive. STALKER, Jericho, COD4, and UT3 all looked way better than Crysis did, and those didn't take an 8800GTS/GTX/Ultra to run at 30 fps at max settings.

Crysis had shitty programming. End of story.


EDIT: and btw, I do look forward to seeing the new cards. I wonder how well they will be able to run Crysis. GTX 200 playing Crysis with max settings at 20 fps, ftw. ;)


The STALKER engine is mediocre at best. Doom 3 is a good engine. Source is a good engine. UE3 is a good engine. There isn't a single game out there that can touch Crysis in graphical quality, even at medium-to-high settings, which the now-$150 8800 GT can handle. BTW, just because it doesn't reach 60 fps doesn't mean it's not playable; Crysis is incredibly playable even at around 20-30 fps. I had a 7900GTX that I played Crysis on at 1680x1050 with a mix of medium and high (shaders had to be set to medium). It ran at around 25-30 fps for most of the game, though I will point out I had to turn more settings down for the Assault level.

Wow, you guys made some good points that I didn't take into consideration. Thanks!

I suppose I was simply comparing the base graphics but not counting things like destructible environments and the dynamic lighting, which the other games I had listed didn't really have.

Nor did they have half the shader quality that Crysis has. The Crysis gameplay was nothing amazing, but the graphical capabilities of CryEngine 2 have seriously raised the standard.
 
Aw man... That's like a week after my Step-Up expires. :(

Is eVGA lenient about this sort of thing, could I talk them into letting me step up?

All of that is assuming it does actually launch 6/18. :)

Imagine how many others are within plus or minus a few days... :(
 
Finally, a real 1GB card. Nice.
QFT. I was getting tired of seeing those 1GB cards with 256-bit and narrower bus widths. At least with the 8800GTS and GTX (pre-G92) we had 320-bit. The only cards that really took advantage of 1GB were the HD 2900 series, with their 512-bit bus.
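The reason bus width matters so much here is that theoretical memory bandwidth is just the bus width in bytes times the effective transfer rate. A rough sketch (the 2200 MT/s effective GDDR3 rate below is an assumed round number for illustration, not a confirmed spec for any of these cards):

```python
def mem_bandwidth_gbs(bus_width_bits, transfer_rate_mtps):
    # Bytes moved per transfer across the full bus, times
    # millions of transfers per second, scaled to GB/s.
    return bus_width_bits / 8 * transfer_rate_mtps / 1000

# Same memory speed, different bus widths: a 512-bit bus moves
# twice the data per clock of a 256-bit one.
print(mem_bandwidth_gbs(512, 2200))  # -> 140.8 (GB/s)
print(mem_bandwidth_gbs(256, 2200))  # -> 70.4 (GB/s)
```

That doubling is why a 1GB card hanging off a narrow 256-bit bus can struggle to actually feed all that memory to the GPU.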
 
QFT. I was getting tired of seeing those 1GB cards with 256-bit and narrower bus widths. At least with the 8800GTS and GTX (pre-G92) we had 320-bit. The only cards that really took advantage of 1GB were the HD 2900 series, with their 512-bit bus.
Well... no. The HD 2900 XT 1024MB didn't perform much better than the HD 2900 XT 512MB; the performance increase was minimal.
 
QFT. I was getting tired of seeing those 1GB cards with 256-bit and narrower bus widths. At least with the 8800GTS and GTX (pre-G92) we had 320-bit. The only cards that really took advantage of 1GB were the HD 2900 series, with their 512-bit bus.

Actually, the extra RAM did next to nothing.
 
STALKER looks like utter shit compared to Crysis, anywhere. If you wanna talk about a poorly coded game, STALKER is it.
Yes, STALKER is buggy and the outdoor scenes are nothing great. Crysis is awesome outdoors, but you're out of your mind if you don't think the indoor environments of STALKER are much better looking than Crysis's. STALKER is the BEST-looking game indoors, and it's not even a contest. Also, STALKER can show those awesome indoor environments even on a mid-range card, and that was my point.
 
Fortunately, as games continue to use higher resolution textures and high-res monitors continue to get cheaper, having 1GB of memory with reasonable bandwidth will become justified.

For myself, I've been using well over 512MB of vram in Oblivion for a while and I'm excited to play that game without stuttering.
 
Fortunately, as games continue to use higher resolution textures and high-res monitors continue to get cheaper, having 1GB of memory with reasonable bandwidth will become justified.

Only if you have a GPU to push the pixels...which the 2900 series did not have.
 
Yes, STALKER is buggy and the outdoor scenes are nothing great. Crysis is awesome outdoors, but you're out of your mind if you don't think the indoor environments of STALKER are much better looking than Crysis's. STALKER is the BEST-looking game indoors, and it's not even a contest. Also, STALKER can show those awesome indoor environments even on a mid-range card, and that was my point.

If you're comparing the indoor environments of STALKER to the indoor environments of Crysis... then maybe. How many times did you have to go inside a building in Crysis? Four? If you're insinuating that the indoor environments of STALKER are better than Crysis's in general, well, OK, that's your opinion, but consider the environments: huge levels with attention to every single detail and dense foliage versus... the inside of a building...
 
If you're comparing the indoor environments of STALKER to the indoor environments of Crysis... then maybe. How many times did you have to go inside a building in Crysis? Four? If you're insinuating that the indoor environments of STALKER are better than Crysis's in general, well, OK, that's your opinion, but consider the environments: huge levels with attention to every single detail and dense foliage versus... the inside of a building...
It's not an opinion that STALKER looked better indoors; it's a fact, and anybody with a set of eyes knows it to be true. If you think differently, then you, at the very least, need glasses.

Let me explain my point one more time: STALKER had extremely good graphics in the indoor environments but didn't slow framerates down to a crawl. Crysis looked like a game from 2004 inside the huts and other indoor environments, yet the framerate was still almost as shitty as it was outside, where the graphics were good. ;)
 