I'll agree with you 100%. This is the best damn card I have owned, absolutely amazed by it every time I open a game.
My 295 is the fastest single card I've ever used in everything I've tossed at it so far. I've seen no such issues, but I also don't play WoW. I wanted a way to reduce heat and power usage (I was running SLI'd 280s), and this performs nearly as well, and my case is totally cool now. (I have good airflow, but I don't like noise, so I run the fans slowly.) Anyway, I haven't found a single bad thing about this card yet. I say use whatever works for what you're doing, but I thought I'd throw a different perspective out there.
That said, OP, it seems like you have a driver issue. The card is new, and I wouldn't be surprised if this was the case.
Try playing Warhead on this baby (it has issues)
That can't be it; I got the impression that there are no driver issues with nVidia cards.
Try playing Warhead on this baby (it has issues)
Every company has its issues. I had a few with my 4870, and I have a few with my GTX 295. No one's perfect.
It couldn't be louder than your Xbox 360, could it?
To the OP: lesson learned. The grass looks greener on the green side, but in the end you prefer your reddish grass after all.
You better watch your back; nVidia fanboys will jump on you for this.
Try playing Warhead on this baby (it has issues)
Every company has its issues. I had a few with my 4870, and I have a few with my GTX 295. No one's perfect.
QFT. I hate when people start that FUD; it's a matter of picking your issues.
I'd say which card to pick depends on the games you play.
For example, if you're an AoC addict the HD 4870 X2 is a better card. I've tested HD 4870 1GB CrossFire vs GTX 260 Core 216 SLI and Tri-SLI in that game. CrossFire will clown it in heavy grass situations. The nVidia setups I mentioned can't hit 40 FPS while CrossFire does 60 in the same situation. Most of the time the nVidia cards are around 35 FPS in heavy grass. That's 1920x, all high, SM3, bloom, 4xAA|16xAF and full ground quality radius.
If you're a Crysis fan, nVidia is a better choice. They're also better in FC2, and you can do things like set 4xAA in game then "enhance" it to 8x-16x CSAA from the nVidia CP. Looks good and is perfectly playable.
He was just trying to get my attention
Nvidia fan.
+1, see his post history around; it isn't even funny anymore.
What's funny is the blatant misinformation people like you are allowed to spread around.
Lol...sure.
Let's put it this way: why would NVIDIA release a GTX 260 216 if it didn't have to? For shits and giggles? I think not. Instead of being able to disable two shader blocks (or whatever they're called) on each core, they could now only disable one for the 216 core, which probably cut into their yields. They wouldn't have done it if they didn't need to.
Please, nissan, don't try to hide your fanboyism; it's as clear as it can get. The G92 is an optical shrink of the original G80, which, by being smaller, allowed yields to improve with fewer defective chips, and hence they were able to increase the shader count. While it's true that yields improve over time as manufacturing matures, the main reason for nVidia to release the GTX 260 Core 216 was to remain competitive against the HD 4870 512MB, which did very well and is actually faster by a small margin; that's why ATi created the 1GB version to remain competitive. They both trade blows, but overall the ATi card is slightly faster by a hair. If it had been a planned SKU, they would at least have changed its name like they did with the six video cards based on the same old and tired G92 die: 8800GT, 8800GTS 512, 9800GT, 9800GTX, 9600GSO, 8800GS, etc. So there's no GTX 265 or GTX 270. You should thank ATi that your favorite company lowered its SKU prices; otherwise nVidia's customers would be paying $650.00 for the GTX 280 and a little less for the GTX 260.
Realize...they did the exact same thing with the G80 8800 GTS. First it was 96 shaders...then right before G92 there was a 112-shader G80 8800GTS. You honestly think they did that to compete with the 2900? It sure as hell wasn't to compete with the 3870...that's what G92 was for.
You just disagreed and agreed with me in the same reply, pick one or the other. If you have the possibility to disable two clusters per card (i.e., the 192-shader GTX 260), it's easier to salvage more working chips from each wafer than if you can only disable one cluster per card (as in the case of a GTX 260 216). The process will mature and yields should get better, but why cut into your yields at all if you don't have to?
Let's put it this way...what you said makes absolutely no sense. Based on what you said, if they are already killing off two shader clusters, what the hell do they care if they now only have to kill off one? Do you think they would save money by killing off two? Or even one? You really think they are artificially inhibiting the chip with the shitty yields they got? How could that have possibly cut into their yields anyway?
What you said is basically ass backwards... Not all G200s produced come with all 10 clusters working properly, and those chips end up getting sold as GTX 260s as long as they have 8 of the 10 working properly. Over time, as the process matures, yields improve and you get more GPUs with 9 or 10 working clusters. Sure, if the yields are great they might disable some clusters to sell more GPUs, coupled with a smaller ring bus and less memory, to cater to another portion of the market...but would you call G200 yields good for the first few months?
You got it backwards. The GTX 260 192 shader allowed NVIDIA to produce more working chips from each wafer. Think of it as a "strike" system. With the original GTX 260 192 shader, each chip on a wafer could be a GTX 280 (all 10 clusters work) or a GTX 260 (8 of 10 clusters work). You could still produce a working chip even if two clusters were bad (two strikes). If you had a chip where only 9 of 10 clusters worked, you could disable one cluster and have a GTX 260; if you had a chip where only 8 of 10 clusters worked, you could still use it in a GTX 260. Once they shifted over to the GTX 260 216, that same chip with only 8 of 10 clusters working is now wasted, as you only have one bad cluster (one "strike") before the chip must be discarded. NVIDIA probably made the move to 216 cores only because the process was mature enough that the losses were acceptable. However, if they were comfortable with their performance margins and sales in the first place, why even risk it? If they could create chips where they only needed 8 of 10 clusters working, why not keep yields as high as possible? They definitely felt some pressure from ATI, no doubt.
Yah, so the Core 216 was necessary to reduce the GTX 260 192's high failure rate? Makes sense, I guess.
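To put rough numbers on the "strike system" argument above, here's a quick binomial sketch. The per-cluster yield figure is invented purely for illustration (it is not an NVIDIA number), and real yields depend on defect clustering, die area, and so on, but it shows why a bin that tolerates two bad clusters recovers more dies per wafer than one that tolerates only one.

# Back-of-the-envelope model of the GT200 "strike system" described above.
# Assumptions are mine, for illustration only: 10 shader clusters per die,
# each independently good with probability P_GOOD (not an NVIDIA figure).

from math import comb

P_GOOD = 0.93      # assumed per-cluster yield
CLUSTERS = 10      # GT200 shader clusters (TPCs)

def at_least(k, p=P_GOOD, n=CLUSTERS):
    """Probability that at least k of n clusters are defect-free (binomial)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(f"Sellable as GTX 280 (10/10 clusters):        {at_least(10):.1%}")
print(f"Sellable as GTX 260 216 (>=9/10, 1 strike):  {at_least(9):.1%}")
print(f"Sellable as GTX 260 192 (>=8/10, 2 strikes): {at_least(8):.1%}")

With the assumed 93% per-cluster yield, roughly 97% of dies qualify for the two-strike bin versus about 85% for the one-strike bin, which is exactly the gap the posts above are arguing about.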
And again, as I and others have referenced, the 216 version of the GTX 260 has always been faster than the 4870 512MB in the majority of games, and trades blows with the 1GB.
The additional 512MB didn't do miracles for the HD 4870 1GB like you want to paint it. The HD 4870 1GB is simply faster in scenarios where there are lots of textures, at high resolutions with anti-aliasing. There are a lot of games in which the additional VRAM doesn't help at all. So unlike the HD 4870, the GTX 260 received a small bump in performance; that simply means the GTX 260 192 wasn't competitive against the HD 4870 512MB, which is as fast as its 1GB version (except at higher resolutions, with graphics engines that love lots of VRAM, and anti-aliasing), hence the GTX 260 Core 216.
HD 4870 1GB ~ GTX 260 Core 216.
HD 4870 X2 ~ 2x HD 4870 1GB
GTX 295 ~ 2x GTX 260 + more shaders
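To put some rough numbers on the 512MB-vs-1GB point above, here's a back-of-the-envelope framebuffer calculation. It's a deliberate simplification (it ignores textures, which usually dominate VRAM use, as well as compression and the engine's extra render targets), and the resolutions and AA levels are just example values, but it shows how quickly resolution plus MSAA eats into a 512MB card before any texture data is loaded.

# Rough framebuffer arithmetic for the VRAM discussion above.
# Simplified model: MSAA color + depth buffers plus the resolved back buffer.

def framebuffer_mb(width, height, msaa, bytes_color=4, bytes_depth=4):
    """Approximate MB used by multisampled color/depth buffers plus the resolved back buffer."""
    pixels = width * height
    multisampled = pixels * msaa * (bytes_color + bytes_depth)
    resolved = pixels * bytes_color
    return (multisampled + resolved) / (1024 ** 2)

for (width, height), aa in [((1280, 1024), 4), ((1920, 1200), 4), ((2560, 1600), 8)]:
    print(f"{width}x{height} {aa}xAA -> ~{framebuffer_mb(width, height, aa):.0f} MB for buffers alone")

At 2560x1600 with 8xAA the buffers alone come to a few hundred megabytes, which is why the extra VRAM only really shows up at high resolutions with AA, as described above.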
Why blame Blizzard when the HD 4870 X2 can perform in the game while the GTX 295 can't?
Pick-up groups.
It's when you start a raid with a bunch of random people. In other words, it's a waste of time usually.
Why would they do it if they didn't have to...? To make the card more appealing, or to be able to charge a bit more money. That's not rocket science; that's simple business.
I think we are saying the same thing, but misunderstanding each other.
Lol. Benchmarks. I'll give you an A for effort, though. Kinda like how the 2900 benchmarked better than the G80, right? Or how the 9800GX2 benchmarks better than the GTX 280? Lol.