GTX 295 disappointment

By all means, base a card's overall performance on WoW. It's CPU dependent in crowded areas.

I have no idea why you would lay out the cash you did for GPUs when all you play is WoW. An HD 4870 1GB or GTX 260 216 is all you need, even at the resolution you play. Performance wouldn't be that much different.
 
My 295 is the fastest single card I've ever used in everything I've tossed at it so far. I've seen no such issues, but I also don't play WoW. I wanted a way to reduce heat and power usage (I was running SLI'd 280s), and this performs nearly as well, and my case is totally cool now. (I have good airflow, but I don't like noise, so I run the fans slowly.) Anyway, I haven't found a single bad thing about this card yet. I say use whatever works for what you're doing, but I thought I'd throw a different perspective out there.
 
Blizzard needs to update their engine to better utilize these high-end graphics cards. That game is so CPU bound it's not even funny.
 
HD 4870 1GB ~ GTX 260 Core 216.
HD 4870 X2 ~ 2x HD 4870 1GB
GTX 295 ~ 2x GTX 260 + more shaders

Why blame Blizzard when the HD 4870 X2 can perform in the game while the GTX 295 can't?
 
My 295 is the fastest single card I've ever used in everything I've tossed at it so far. I've seen no such issues, but I also don't play WoW. I wanted a way to reduce heat and power usage (I was running SLI'd 280s), and this performs nearly as well, and my case is totally cool now. (I have good airflow, but I don't like noise, so I run the fans slowly.) Anyway, I haven't found a single bad thing about this card yet. I say use whatever works for what you're doing, but I thought I'd throw a different perspective out there.
I'll agree with you 100%. This is the best damn card I have owned; I'm absolutely amazed by it every time I open a game.

That said, OP, it seems like you have a driver issue. The card is new, and I wouldn't be surprised if this was the case.
 
I'll agree with you 100%. This is the best damn card I have owned; I'm absolutely amazed by it every time I open a game.

That said, OP, it seems like you have a driver issue. The card is new, and I wouldn't be surprised if this was the case.

That can't be it; I got the impression that there are no driver issues with nVidia cards ;)
 
That can't be it; I got the impression that there are no driver issues with nVidia cards ;)
Try playing Warhead on this baby :p (it has issues)

Every company has its issues. I had a few with my 4870, and I have a few with my GTX 295. No one's perfect :cool:.
 
Try playing Warhead on this baby :p (it has issues)

Every company has its issues. I had a few with my 4870, and I have a few with my GTX 295. No one's perfect :cool:.

He was just trying to get my attention...pretty much just trolling about drivers like he usually does.

Or just trolling over nVidia in general...

It couldn't be louder than your Xbox 360, could it?

And let's not forget this little gem of flamebait...

To the OP: lesson learned. The grass looks greener on the green side, but in the end, you prefer your reddish grass after all. :p

And his next post:

You better watch your back, nVidia fanboys will jump on you for this :p

Just further proves my point...
 
Try playing Warhead on this baby :p (it has issues)

Every company has its issues. I had a few with my 4870, and I have a few with my GTX 295. No one's perfect :cool:.

You better watch your back, nVidia fanboys will jump on you for this :p
 
Try playing Warhead on this baby :p (it has issues)

Every company has its issues. I had a few with my 4870, and I have a few with my GTX 295. No one's perfect :cool:.

QFT. I hate when people start that FUD; it's a matter of picking your issues.
 
Wow. All of this acrimony over a user's dissatisfaction with one product in one game. You no likey, you go back to whatever worked. Problem solved.
 
I'd say it depends on the games as to which card you pick.

For example if you're an AoC addict the HD 4870 X2 is a better card. I've tested HD 4870 1Gb Crossfire vs GTX 260 Core 216 SLi and Tri-SLi in that game. Crossfire will clown it in heavy grass situations. nVidia setups I mention can't hit 40FPS while Crossfire does 60 in the same situation. Most of the time the nVidia cards are around 35FPS in heavy grass. That's 1920x, all high, SM3, bloom, 4xAA|16xAF and full ground quality radius.

If you're a Crysis fan nVidia is a better choice. They also are better in FC2 and you can do things like set 4xAA in game then "enhance" it to 8x-16x CSAA from the nVidia CP. Looks good and is perfectly playable.
 
I'd say it depends on the games as to which card you pick.

For example if you're an AoC addict the HD 4870 X2 is a better card. I've tested HD 4870 1Gb Crossfire vs GTX 260 Core 216 SLi and Tri-SLi in that game. Crossfire will clown it in heavy grass situations. nVidia setups I mention can't hit 40FPS while Crossfire does 60 in the same situation. Most of the time the nVidia cards are around 35FPS in heavy grass. That's 1920x, all high, SM3, bloom, 4xAA|16xAF and full ground quality radius.

If you're a Crysis fan nVidia is a better choice. They also are better in FC2 and you can do things like set 4xAA in game then "enhance" it to 8x-16x CSAA from the nVidia CP. Looks good and is perfectly playable.

Meh, I manage to run Crysis a lot better on my 4870 CF than on GTX 260 SLI in DX10, all Very High...
 
I'd say it depends on the games as to which card you pick.

For example if you're an AoC addict the HD 4870 X2 is a better card. I've tested HD 4870 1Gb Crossfire vs GTX 260 Core 216 SLi and Tri-SLi in that game. Crossfire will clown it in heavy grass situations. nVidia setups I mention can't hit 40FPS while Crossfire does 60 in the same situation. Most of the time the nVidia cards are around 35FPS in heavy grass. That's 1920x, all high, SM3, bloom, 4xAA|16xAF and full ground quality radius.

If you're a Crysis fan nVidia is a better choice. They also are better in FC2 and you can do things like set 4xAA in game then "enhance" it to 8x-16x CSAA from the nVidia CP. Looks good and is perfectly playable.

Actually, while that is true, Crysis is no longer a good example of this (at least Warhead). I wonder what they did. I have been told that Far Cry 2 is a lot better, though.
 
Are you interested in selling this GTX 295? I'm looking for a Folding boost in addition to overall gaming, etc. (I don't play WoW).
 
Is the default SLI performance mode setting for World of Warcraft in the Nvidia drivers still "disabled/off"? That's what it used to be when I was running SLI. I never had much luck getting SLI to work well in WoW when trying to force it on.
 
What's funny is the blatant misinformation people like you are allowed to spread around.

Lol...sure.

How about we link to the original 4800 review here on [H] and see what's up? This is the 4870 512MB:

http://www.hardocp.com/article.html?art=MTUyNCw0LCxoZW50aHVzaWFzdA==

Crysis - GTX260 wins
Assassin's Creed - 4870 wins
Age of Conan - GTX260 wins

Then we can look at the most recent review of the old GTX260 to get a clearer picture, with newer drivers (still the 512MB 4870):

http://www.hardocp.com/article.html?art=MTU1MSwzLCxoZW50aHVzaWFzdA==

Crysis - still GTX260
AoC - still GTX260, although settings on 4870 improved
Call of Duty 4 - tie, although GTX260 gets higher avg. FPS

So there you go, evidence for everyone to peruse; take it as you will. Unfortunately for readers of [H] in this topic, in my quick search I did not find a 1GB 4870 vs. old GTX260 comparison.
 
If you compare the reviews of the Core 216 to the original 192 version, you'll see there isn't much, if any, improvement at all. Maybe 1-2 FPS. Anyone not biased can fairly easily and correctly extrapolate the data and compare the old GTX260 to the 4870.

Further, you can look at the game reviews done by [H]. They review the old GTX260 and the Core 216, as well as the 1GB 4870, at the same resolution.
 
Let's put it this way, why would NVIDIA release a GTX 260 216 if it didn't have to? For shits and giggles? I think not. Instead of being able to disable two shader blocks (or whatever they're called) on each core, they now could only disable one for the 216 core, which probably cut into their yields. They wouldn't have done it if they didn't need to.
 
Let's put it this way, why would NVIDIA release a GTX 260 216 if it didn't have to? For shits and giggles? I think not. Instead of being able to disable two shader blocks (or whatever they're called) on each core, they now could only disable one for the 216 core, which probably cut into their yields. They wouldn't have done it if they didn't need to.

Let's put it this way... what you said makes absolutely no sense. Based on what you said, if they are already killing off two shader clusters, what the hell do they care if they now only have to kill off one? Do you think they would save money by killing off two? Or even one? You really think they are artificially inhibiting the chip with the shitty yields they got? How could that have possibly cut into their yields anyway?

What you said is basically ass backwards... Not all G200s produced come with all 10 clusters working properly, and those chips end up getting sold as GTX260s as long as they have 8 of the 10 working properly. Over time, yields improve as the process matures, and you get more GPUs with 9 or 10 working clusters. Sure, if the yields are great, they might disable some to sell more GPUs coupled with a narrower memory bus and less memory to cater to another portion of the market... but would you call G200 yields good for the first few months?

Realize... they did the exact same thing with the G80 8800 GTS. First it was 96 shaders... then right before G92 there was a 112-shader G80 8800GTS. You honestly think they did that to compete with the 2900? It sure as hell wasn't to compete with the 3870... that's what G92 was.
 
Just my 2 cents.
I like competition. I like it A LOT.

The fact is, both new NVIDIA cards are great. And if there were no competition, we would all throw money at NVIDIA.
But with the aggressive price of the 4870X2, it is very, very hard for most of us to go out and pick a GTX285 or GTX295.

When you start checking all the reviews out there, you pretty much end up with one conclusion.
The 4870X2 beats the GTX285 and trades punches with the GTX295. The 4870X2 is priced quite close to the GTX285, while the GTX295 is at least $50 more expensive than the 4870X2.

We may argue about ATI drivers not being as good as NVIDIA drivers. We may even talk about the advantages of NVIDIA cards with CUDA support. Or talk about the 4870X2 heating up more or even drawing more power.
But the fact is, if you want a cost-effective, very powerful graphics card that will run pretty much everything great, you pick the 4870X2.

There will obviously be those who, even after reading this thread, will pick the NVIDIA card.
But this doesn't mean there isn't a problem for NVIDIA.
And the problem is this: they have released a couple of graphics cards to take the crown from ATI and become the must-buy.
So far they have failed.
ATI isn't doing anything special. It is just repeating what it did last year, and that was offering slightly less powerful graphics cards at a much better price than the competition.

IF NVIDIA doesn't slash prices quickly, ATI keeps the performance/price crown. Keep in mind DX11 cards aren't so far off. We're already entering February, and if NVIDIA wants to get some kind of edge in this generation, their time is running out.

To give you guys an idea: my mind was completely set on buying an NVIDIA graphics card. Now, that isn't so.
I'm waiting for some kind of news of a price cut from NVIDIA. But if that news doesn't come out in a couple of weeks, I won't be able to wait anymore and will have to go with whatever is best at the time.

Just taking a look at some of the reviews online, across all sorts of games, it's easy to see the 4870X2 coming out as a great choice.
 
Let's put it this way, why would NVIDIA release a GTX 260 216 if it didn't have to? For shits and giggles? I think not. Instead of being able to disable two shader blocks (or whatever they're called) on each core, they now could only disable one for the 216 core, which probably cut into their yields. They wouldn't have done it if they didn't need to.

Yeah, so the Core 216 was necessary to reduce the GTX260 192's high failure rate? Makes sense, I guess.
 
Please, nissan, don't try to hide your fanboyism; it's as clear as it can get. The G92 is an optical shrink of the original G80; being smaller, it allowed yields to improve with fewer defective chips, and hence they were able to increase the shader count. While it's true that yields improve over time as manufacturing matures, the main reason for nVidia to release the GTX 260 Core 216 was to remain competitive against the HD 4870 512MB, which did very well and is actually faster by a small margin; that's why ATI created the 1GB version to remain competitive. They both trade blows, but overall the ATi card is slightly faster by a hair. If it was a planned SKU, they would at least have changed its name, like they did with the six video cards based on the same old and tired G92 die: 8800GT, 8800GTS 512, 9800GT, 9800GTX, 9600GSO, 8800GS, etc. So there's no GTX 265 or GTX 270. You should thank ATi that your favorite company lowered its prices; otherwise, nVidia's customers would be paying $650.00 for the GTX 280 and a little less for the GTX 260.
 
Please, nissan, don't try to hide your fanboyism; it's as clear as it can get. The G92 is an optical shrink of the original G80; being smaller, it allowed yields to improve with fewer defective chips, and hence they were able to increase the shader count. While it's true that yields improve over time as manufacturing matures, the main reason for nVidia to release the GTX 260 Core 216 was to remain competitive against the HD 4870 512MB, which did very well and is actually faster by a small margin; that's why ATI created the 1GB version to remain competitive. They both trade blows, but overall the ATi card is slightly faster by a hair. If it was a planned SKU, they would at least have changed its name, like they did with the six video cards based on the same old and tired G92 die: 8800GT, 8800GTS 512, 9800GT, 9800GTX, 9600GSO, 8800GS, etc. So there's no GTX 265 or GTX 270. You should thank ATi that your favorite company lowered its prices; otherwise, nVidia's customers would be paying $650.00 for the GTX 280 and a little less for the GTX 260.

You are terribly confused and are having a hard time comprehending this. I suggest you sit on your hands. Here, I'll quote myself...

Realize... they did the exact same thing with the G80 8800 GTS. First it was 96 shaders... then right before G92 there was a 112-shader G80 8800GTS. You honestly think they did that to compete with the 2900? It sure as hell wasn't to compete with the 3870... that's what G92 was.

Also G92 was a process shrink of G80.

And again, as I and others have referenced, the 192 version of the GTX 260 has always been faster than the 4870 512MB in the majority of games, and trades blows with the 1GB.
 
Let's put it this way... what you said makes absolutely no sense. Based on what you said, if they are already killing off two shader clusters, what the hell do they care if they now only have to kill off one? Do you think they would save money by killing off two? Or even one? You really think they are artificially inhibiting the chip with the shitty yields they got? How could that have possibly cut into their yields anyway?

What you said is basically ass backwards... Not all G200s produced come with all 10 clusters working properly, and those chips end up getting sold as GTX260s as long as they have 8 of the 10 working properly. Over time, yields improve as the process matures, and you get more GPUs with 9 or 10 working clusters. Sure, if the yields are great, they might disable some to sell more GPUs coupled with a narrower memory bus and less memory to cater to another portion of the market... but would you call G200 yields good for the first few months?
You just disagreed and agreed with me in the same reply, pick one or the other. If you have the possibility to disable two clusters per card (i.e., the 192 shader GTX 260), it's easier to salvage more working chips from each wafer than if you can only disable one cluster per card (as in the case of a GTX 260 216). The process will mature and yields should get better, but why cut into your yields at all if you don't have to?
Yeah, so the Core 216 was necessary to reduce the GTX260 192's high failure rate? Makes sense, I guess.
You got it backwards. The 192-shader GTX 260 allowed NVIDIA to produce more working chips from each wafer. Think of it as a "strike" system. With the original 192-shader GTX 260, each chip on a wafer could become a GTX 280 (all 10 clusters work) or a GTX 260 (at least 8 of 10 clusters work). You could still produce a working chip even if two clusters were bad (two strikes): a chip with only 9 good clusters had one more disabled and became a GTX 260, and a chip with only 8 good clusters could still be used as a GTX 260.

Once they shifted over to the GTX 260 216, that same chip with only 8 of 10 working clusters is now wasted, since you only have one bad cluster (one "strike") before the chip must be discarded. NVIDIA probably made the move to 216 cores only because the process was mature enough that the losses were acceptable. However, if they were comfortable with their performance margins and sales in the first place, why even risk it? If they could make chips that only needed 8 of 10 clusters working, why not keep yields as high as possible? They definitely felt some pressure from ATI, no doubt.
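To make the "strike" argument concrete, here is a minimal, purely illustrative sketch. It assumes each of the 10 shader clusters on a die fails independently with some probability p; the values of p are invented (the real defect distribution isn't public), but the shape of the result shows why tolerating two bad clusters salvages noticeably more dies than tolerating one.

```python
# Toy binning model: fraction of dies usable per bin under a binomial defect model.
# The per-cluster failure probability p is an assumption, not real yield data.
from math import comb

def bin_fractions(p, clusters=10):
    """Return fractions of dies with all, >= clusters-1, and >= clusters-2 good clusters."""
    def pk(k):
        # P(exactly k good clusters) for independent per-cluster failure probability p
        return comb(clusters, k) * (1 - p) ** k * p ** (clusters - k)
    full = pk(clusters)                        # all clusters good  -> GTX 280 material
    one_strike = full + pk(clusters - 1)       # at most one bad    -> usable as a Core 216
    two_strikes = one_strike + pk(clusters - 2)  # at most two bad  -> usable as a 192 part
    return full, one_strike, two_strikes

for p in (0.05, 0.10, 0.20):
    full, one, two = bin_fractions(p)
    print(f"p={p:.2f}: all good {full:.1%}, >=9/10 {one:.1%}, >=8/10 {two:.1%}")
```

With p = 0.10, for example, roughly 93% of dies have at least 8 good clusters versus about 74% with at least 9, which is exactly the gap being argued about here.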
 
And again, as I and others have referenced, the 216 version of the GTX 260 has always been faster than the 4870 512MB in the majority of games, and trades blows with the 1GB.

Fixed!!

Look at all the benchmark tables here:

http://www.anandtech.com/video/showdoc.aspx?i=3501&p=2

http://www.firingsquad.com/hardware/evga_geforce_gtx_285_ssc_performance_review/page4.asp

http://hothardware.com/Articles/NVIDIA-GeForce-GTX-285-Unveiled/?page=4

http://www.bit-tech.net/hardware/graphics/2009/01/16/nvidia-zotac-geforce-gtx-285-1gb/4

The additional 512MB didn't do miracles for the HD 4870 1GB like you want to paint it. The HD 4870 1GB is simply faster in scenarios where there are lots of textures, at high resolutions with anti-aliasing. There are a lot of games in which the additional VRAM doesn't help at all. So unlike the HD 4870, the GTX 260 received a small bump in performance, which simply means the GTX 260 192 wasn't competitive against the HD 4870 512MB, which is as fast as its 1GB version (except at higher resolutions, with graphics engines that love a lot of VRAM and anti-aliasing); hence the GTX 260 Core 216.
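As a rough illustration of why resolution and anti-aliasing drive up VRAM use, here is some back-of-the-envelope arithmetic. It assumes 32-bit color and 32-bit depth/stencil per sample and ignores driver-side compression and the game's other buffers, so the numbers are ballpark only.

```python
# Ballpark framebuffer size at a given resolution and MSAA level.
# Assumes 4 bytes of color + 4 bytes of depth/stencil per sample, plus a
# resolved 4-byte-per-pixel back buffer; real drivers compress and allocate
# additional buffers, so treat these figures as illustrative, not exact.
def framebuffer_mb(width, height, msaa_samples):
    per_sample_bytes = 4 + 4                                  # color + depth/stencil
    multisampled = width * height * msaa_samples * per_sample_bytes
    resolved = width * height * 4                             # resolved back buffer
    return (multisampled + resolved) / 1024 ** 2

for aa in (1, 4, 8):
    print(f"1920x1200 at {aa}xAA: ~{framebuffer_mb(1920, 1200, aa):.0f} MB")
```

Going from no AA to 8xAA at 1920x1200 takes the render targets alone from roughly 26 MB to roughly 150 MB under these assumptions, which leaves noticeably less of a 512MB card free for textures; that's consistent with the 1GB card only pulling ahead at high resolution with AA.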
 
You just disagreed and agreed with me in the same reply, pick one or the other. If you have the possibility to disable two clusters per card (i.e., the 192 shader GTX 260), it's easier to salvage more working chips from each wafer than if you can only disable one cluster per card (as in the case of a GTX 260 216). The process will mature and yields should get better, but why cut into your yields at all if you don't have to?

They only disable them if they are defective. If they are getting more GPUs with 9 working clusters as opposed to 8... why would they disable one cluster? They don't... they release it as a new version, which makes it more enticing to buy. As I said, yields improve, and instead of getting GPUs with only 8 working clusters, they get more with 9 clusters working properly.

Why would they do it if they didn't have to? To make the card more appealing, or to be able to charge a bit more money. That's not rocket science; that's simple business.

I think we are saying the same thing, but misunderstanding each other.

The additional 512MB didn't do miracles with the HD 4870 1GB like you want to paint. The HD 4870 1GB is simply faster in scenarios were there are lots of textures, at high resolutions with Anti Aliasing. There are a lot of games which the additional VRAM doesn't help at all. So unlike the HD 4870, the GTX 260 received a small bump in performance, so that mean simply that the GTX 260 192 wasn't competitive against the HD 4870 512MB which is as fast as it's 1GB version (except in higher resolutions, with graphic engines which loves much VRAM and Anti Aliasing), so hence the GTX 260 Core 216.

Lol. Benchmarks. I'll give you an A for effort, though. Kinda like how the 2900 benchmarked better than the G80, right? Or how the 9800GX2 benchmarks better than the GTX280? Lol.

BTW, what you "fixed" wasn't broken. Just a little tidbit for you there.

Did you not see this post here? Scroll up, it's just above!
 
Someone on Newegg posted something about this card crashing WoW... just wait for some new drivers. Patience, man.
 
Just my 2 cents.
I like competition. I like it A LOT.

The fact is, both new NVIDIA cards are great. And if there were no competition, we would all throw money at NVIDIA.
But with the aggressive price of the 4870X2, it is very, very hard for most of us to go out and pick a GTX285 or GTX295.

When you start checking all the reviews out there, you pretty much end up with one conclusion.
The 4870X2 beats the GTX285 and trades punches with the GTX295. The 4870X2 is priced quite close to the GTX285, while the GTX295 is at least $50 more expensive than the 4870X2.

We may argue about ATI drivers not being as good as NVIDIA drivers. We may even talk about the advantages of NVIDIA cards with CUDA support. Or talk about the 4870X2 heating up more or even drawing more power.
But the fact is, if you want a cost-effective, very powerful graphics card that will run pretty much everything great, you pick the 4870X2.

There will obviously be those who, even after reading this thread, will pick the NVIDIA card.
But this doesn't mean there isn't a problem for NVIDIA.
And the problem is this: they have released a couple of graphics cards to take the crown from ATI and become the must-buy.
So far they have failed.
ATI isn't doing anything special. It is just repeating what it did last year, and that was offering slightly less powerful graphics cards at a much better price than the competition.

IF NVIDIA doesn't slash prices quickly, ATI keeps the performance/price crown. Keep in mind DX11 cards aren't so far off. We're already entering February, and if NVIDIA wants to get some kind of edge in this generation, their time is running out.

To give you guys an idea: my mind was completely set on buying an NVIDIA graphics card. Now, that isn't so.
I'm waiting for some kind of news of a price cut from NVIDIA. But if that news doesn't come out in a couple of weeks, I won't be able to wait anymore and will have to go with whatever is best at the time.

Just taking a look at some of the reviews online, across all sorts of games, it's easy to see the 4870X2 coming out as a great choice.
 
HD 4870 1GB ~ GTX 260 Core 216.
HD 4870 X2 ~ 2x HD 4870 1GB
GTX 295 ~ 2x GTX 260 + more shaders

Why blame Blizzard when the HD 4870 X2 can perform in the game while the GTX 295 can't?

Because cheerleaders can't root for the other team. :p
 
Why would they do it if they didn't have to? To make the card more appealing, or to be able to charge a bit more money. That's not rocket science; that's simple business.

I think we are saying the same thing, but misunderstanding each other.

Lol. Benchmarks. I'll give you an A for effort, though. Kinda like how the 2900 benchmarked better than the G80, right? Or how the 9800GX2 benchmarks better than the GTX280? Lol.

You seem to forget that, over time, driver development brings optimizations for these cards, which get faster as a result, especially for the ATi HD architecture, which is a superscalar design that loves and needs optimizations, a type of design nVidia never knew how to make work, as with their FX architecture. So it isn't fine to use an old review with old drivers; that's why I posted those GTX 285 reviews. And those are real games, not 3DMark Vantage-only junk.

And about simple business, there's more to it than that. I heard that the HD 4800 architecture comes with 840 stream processors, which are used for redundancy to make sure the GPU will have 800 usable stream processors; they could just enable them, but they won't. AFAIK the GT200 architecture comes with 240 stream processors and that's it, which is why, if a block is defective, they disable it and sell the chip as a lower-end part. But since yields improve over time, maybe they managed to reduce the number of broken stream processors; who knows, maybe most GT200s made now have all 240 stream processors fully working. But why do they still choose to disable one block? Because the money isn't in the high end; it's in the midrange, where you can buy the GTX 260 for less than $250.00, not an overpriced GTX 280 which is barely faster than the much cheaper GTX 260.
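Whether or not the 840-vs-800 figure is accurate, the general redundancy idea in that post is easy to illustrate with the same kind of toy binomial model as above; the per-unit defect probability below is invented purely for illustration.

```python
# Illustrative-only model of built-in redundancy: a die with `physical`
# fine-grained units still ships as a full part as long as at least
# `required` of them work. The defect rate p is an assumption, not data.
from math import comb

def full_part_yield(physical, required, p):
    """P(at least `required` of `physical` units are defect-free)."""
    return sum(comb(physical, k) * (1 - p) ** k * p ** (physical - k)
               for k in range(required, physical + 1))

p = 0.002  # hypothetical per-unit defect probability
print(f"no spares (need 800 of 800): {full_part_yield(800, 800, p):.1%}")
print(f"40 spares (need 800 of 840): {full_part_yield(840, 800, p):.1%}")
```

Under that made-up defect rate, only about a fifth of dies would have all 800 units working, while practically every die has at least 800 good units out of 840, which is why a handful of spare units is such a cheap way to protect yields.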
 