GTS 250 Reviews

It's kind of amazing to see how far ahead Nvidia was with their 8800 series. Still, this name change is ridiculous. It makes less sense than the move from the 9800 GTX to the 9800 GTX+; at least there we got a die shrink.

For me, it makes sense. If the lineup covers low-end, mid-range and high-end, the names should reflect that as well. The only thing that differs is the performance. Using the 9800 GTX name would be more misleading for new users and for those who don't read reviews as much. It's easy then to think they're buying an "old tech" GeForce card, while it has the same features as the other GT2xx cards. There are no hardware features that separate the generations, so nobody gets ripped off here. The name indicates it's a lower-performing card as well.

Those who are more active on forums and reading reviews won't be affected by this anyway.

Can't see anything wrong with changing the name.
 
For me, it makes sense. If the lineup covers low-end, mid-range and high-end, the names should reflect that as well. The only thing that differs is the performance. Using the 9800 GTX name would be more misleading for new users and for those who don't read reviews as much. It's easy then to think they're buying an "old tech" GeForce card, while it has the same features as the other GT2xx cards. There are no hardware features that separate the generations, so nobody gets ripped off here. The name indicates it's a lower-performing card as well.

Those who are more active on forums and reading reviews won't be affected by this anyway.

Can't see anything wrong with changing the name.

Agreed.

A mod needs to lock this thread; it's ridiculous that you guys need to bitch so much about a name change. Take it to the other threads bitching about the name change; this thread is about the reviews and performance of the GTS 250, regardless of what it actually is.
 
It's kind of amazing to see how far ahead Nvidia was with their 8800 series. Still, this name change is ridiculous. It makes less sense than the move from the 9800 GTX to the 9800 GTX+; at least there we got a die shrink.
Well, now we get lower power consumption and a smaller card. That's something.
 
I'm so glad ATI finally got cards out that hurt Nvidia (forced Nvidia to drop their pants on pricing right after release), or else we'd be paying $300 for a shiny new 8800... I mean GTS 250!
 
I'm so glad ATI finally got cards out that hurt Nvidia (forced Nvidia to drop their pants on pricing right after release), or else we'd be paying $300 for a shiny new 8800... I mean GTS 250!

I think you're forgetting about AMD and Nvidia being under investigation for price fixing. I'm sure that had something to do with it too.
 
Agreed.

A mod needs to lock this thread; it's ridiculous that you guys need to bitch so much about a name change. Take it to the other threads bitching about the name change; this thread is about the reviews and performance of the GTS 250, regardless of what it actually is.

I think it's relevant to the thread. Did you read the AnandTech review? It's the only one I read, and I had a good chuckle.
 
You seriously need me to prove the concept of a bottleneck? It's not like you have any proof either; otherwise you would have posted it already.
 
@nissanztt90: You're arguing about another topic from another thread in this one? Take it to PMs or go back to that thread.
 
Regardless of the renaming fiasco, what is there really to write home about? It still needs a price drop to stay competitive with the recently reduced 4870. It isn't a terrible card for the price, but ATI seemed on the ball when it came to undercutting the announcement.

If this drops to 4850 prices, it becomes much more attractive.
 
'NVIDIA's take on this is also flawed in that it treats customers like idiots and underlines the fundamental issue we have. Do I need a card with a new name on it to believe that it is worthy of my purchase, or can I go read reviews comparing the hardware and learn for myself whether or not any card (regardless of the name) fills my need? Maybe this name change is for people who don't know anything about graphics hardware then. In that case the thing that "sells" the card is the simple fact that NVIDIA has convinced someone that this part is an affordable version of a card from their latest line of products. Saying they need a name change to maintain current naming is essentially admitting that the only reason the name needs to be changed is to mislead uninformed people.'

Anand's censored part :)
 
@nissanztt90: You're arguing about another topic from another thread in this one? Take it to PMs or go back to that thread.

Lol, please. This thread was off track six seconds after it was made, with all the whiners bitching about the name change instead of the reviews and performance of the card. There are only 17 other threads they could have done that in. :rolleyes:

Proof?

Edit: Stop trolling, and if you want to debate that subject, at least post it in the correct thread.

Nice, you just couldn't resist making the troll comment, huh? I said I thought you were wrong; the naming is not relevant. If you can't form a decent counter-argument, you'd do better to just not say anything than pull out the old ad hominem.
 
This is crap. Why rename an old card to confuse consumers? Get off your ass and make something new.
 
This is crap. Why rename an old card to confuse consumers? Get off your ass and make something new.


GT200s are just too expensive for them to make affordable GPUs in the low-to-mid-range segment, so they did what they had to do this time: call on their cheaper G92s to fill the part of the market where GT200 can't compete. I personally don't like it either, but from a business point of view it makes sense. I see G92s sticking around a while longer, now that the news is out that Nvidia is having problems manufacturing GT200s on 40nm cheaply enough to cover that segment with the GT200 architecture. Good luck, Nvidia, you're gonna need it.
 
Why can't this thread just die? Actually, on second thought... everyone should get as mad as they can over the renaming. :rolleyes:
 
Pretty sure I'm the only one in this thread who hasn't complained about the renaming.

You're the one constantly complaining about the people who do, though... you're the one getting all worked up, lmao... :D and even bumping the thread you want to die... oh, the irony. :p
 
GT200s are just too expensive for them to make affordable GPUs in the low-to-mid-range segment, so they did what they had to do this time: call on their cheaper G92s to fill the part of the market where GT200 can't compete. I personally don't like it either, but from a business point of view it makes sense. I see G92s sticking around a while longer, now that the news is out that Nvidia is having problems manufacturing GT200s on 40nm cheaply enough to cover that segment with the GT200 architecture. Good luck, Nvidia, you're gonna need it.

There is no such news. What we know is that TSMC's 40 nm process is not ready for prime time yet, since it suffers from leakage problems.

Also, G92 is used because it still competes with ATI's offerings in the same price range. Had ATI made something better than the HD 4850 or HD 4830, you can bet that NVIDIA would have a GT200-based chip to compete with it, but the fact is ATI hasn't, so G92 stays a bit longer, which means lower R&D costs.
 
There is no such news...

http://www.dailytech.com/article.aspx?newsid=14480


"One solution was to begin manufacturing on the 40nm bulk process like ATI has done. According to our sources, NVIDIA's attempts to produce a die-shrunk 40nm GT200 chip were "disastrous at best". Design problems became evident, since the GT200 was originally designed for the 65nm node. Two shrinks in a row without a major redesign was just too much for NVIDIA, and our most recent information from Taiwan is that the first 40nm chips from NVIDIA will be in the GeForce 300 series."
 
http://www.dailytech.com/article.aspx?newsid=14480


"One solution was to begin manufacturing on the 40nm bulk process like ATI has done. According to our sources, NVIDIA's attempts to produce a die-shrunk 40nm GT200 chip were "disastrous at best". Design problems became evident, since the GT200 was originally designed for the 65nm node. Two shrinks in a row without a major redesign was just too much for NVIDIA, and our most recent information from Taiwan is that the first 40nm chips from NVIDIA will be in the GeForce 300 series."

This is the first time I've ever read that NVIDIA was even trying to make the GT200 at 40 nm, and there's no other site reporting that besides DailyTech, so those "sources" are kind of sketchy. But DailyTech is usually "reliable," so I'll buy it for now...

Thanks for the link.
 
There is no "new" technology from 8800 to GTX2XX. Just more of everything to make it faster (and smaller die to make it cooler and more power efficient). Nothing bad that Nvidia rename their cards to make it more understandable that they are sort of "the same generation". Buying GTX250/260/285 is only a matter of performance, not features. Its not like they put an obsolete card into their range, just more options which is understandable also for those that doesnt read reviews every day. Thats a good thing. :)

Well, the move from the 8800 GT/9800 GTX to the GTX 280 wasn't really smaller; the die actually got bigger, pretty much in line with the increase in transistors, stream processors and other stuff. But you are right, both cards are identical at the architecture level, so it will scale nicely with their performance-numbering scheme.

GT200s are just too expensive for them to make affordable GPUs in the low-to-mid-range segment, so they did what they had to do this time: call on their cheaper G92s to fill the part of the market where GT200 can't compete. I personally don't like it either, but from a business point of view it makes sense. I see G92s sticking around a while longer, now that the news is out that Nvidia is having problems manufacturing GT200s on 40nm cheaply enough to cover that segment with the GT200 architecture. Good luck, Nvidia, you're gonna need it.

Yeah, strangely enough, they haven't been able to die-shrink the GT200 GPU to 40nm and create some 160-stream-processor card, or something strong enough to be named GTX 240, instead of the G92-based GTS 250. But since they're identical at the architecture level, meh. They seem to have huge lots of G92 chips everywhere, even in their muffins!!
 
This is the first time I've ever read that NVIDIA was even trying to make the GT200 at 40 nm, and there's no other site reporting that besides DailyTech, so those "sources" are kind of sketchy. But DailyTech is usually "reliable," so I'll buy it for now...

Thanks for the link.


You're welcome, and yes, you are right, they are the only ones I've seen saying this, so I guess we'll have to take it with a grain of salt for now.



It's gonna get better from now on; they'll throw us a bone pretty soon, hopefully.
 
It's in stock at zipzoomfly.com.

You couldn't pay me to use ZipZoomFly. They ship the card at the bottom of a box with some paper thrown on top of it, and their customer service is horrific.

Any word on how these overclock with that single 6-pin and all? I know the 65nm 9800 GTX was very power hungry.
 
Huh, I guess it's just the 1 GB models that use a single 6-pin. That's odd; I know they draw a lot, so that's got to be kind of risky to OC.
 
Die shrink + smaller PCB should mean much less power draw, so it may be OK...
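For a rough sense of the headroom being debated, here is a back-of-the-envelope sketch. The 75 W slot and 6-pin limits come from the PCIe spec; the stock board-power figure is an approximate assumption of mine, not a number taken from the reviews in this thread.

```python
# Rough power-budget check for a single-6-pin card.
# Slot/connector limits are per the PCIe spec; the stock draw is an
# assumed ballpark figure, not a measured value from the reviews.
SLOT_LIMIT_W = 75        # PCIe x16 slot can deliver up to ~75 W
SIX_PIN_LIMIT_W = 75     # one 6-pin PCIe connector adds up to ~75 W

budget_w = SLOT_LIMIT_W + SIX_PIN_LIMIT_W   # ~150 W total for a one-connector board

STOCK_DRAW_W = 145       # assumed ballpark board power for the 55 nm part

headroom_w = budget_w - STOCK_DRAW_W
print(f"Power budget: {budget_w} W, assumed stock draw: {STOCK_DRAW_W} W, "
      f"overclocking headroom: ~{headroom_w} W")
```

So even with the shrink, the margin on a single-connector board looks fairly thin, which is why the overclocking question is a fair one.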
 
I thought the 40nm GT200 was the GT212; I heard plenty of rumors about it a few months ago. Oh well, I wasn't looking forward to that anyway... I'll most likely be upgrading to GT300.
 
I currently have a vanilla 8800 GTS 640. For the price and the resolution I game at (1920x1080), this may be a decent upgrade. I just know I'm not dropping $430 on a card again like I did back when my card first came out.
 
I guess I was referring to the original 9800 GTX. Either way...
Once these 250s come down in price a bit (sub-$100), I might get one for PhysX...
 