What was your least favorite graphics card?

I had a Radeon 9 series (9600 maybe) back around 2004-2005 that I never got along with. It "worked," but had all sorts of odd quirks, like it wouldn't let the system post without a VGA monitor plugged in, and the linux drivers were a nightmare, and on and on.

If you did all the things it wanted you to do, it worked fine for the purpose of playing games in Windows, but it always seemed like kind of a struggle.
 
Probably a GTX 470. What a power hungry POS.

Followed by the HD 5870 Eyefinity 6 edition. This card was actually a decent performer. My problem is that I sold it before I found out about BTC mining, and I thought I had made out like a bandit when I sold it to a miner for more than I paid for it. If I had put it to use mining back then, I probably would have had a few extra thousand bucks in my pocket. Oh well.
 
I remember I traded a Wii I got for Christmas for an 8800 GTS. I think I was upgrading from an X1950 AIW on a custom water loop? It wasn't worth it, and I wish I had kept the Wii.
 
ATI 9800 whatever. I remember just starting high school and saving all my money. First big computer part that I ever bought and then like a month later the X800 comes out and crushes it. Sigh.
 
ATI 9800 whatever. I remember just starting high school and saving all my money. First big computer part that I ever bought and then like a month later the X800 comes out and crushes it. Sigh.

That's just bad timing. The 9800 Pro was EASILY the best graphics card around when it was released. It was my longest lasting (between upgrades) and probably my favorite card, though my 1080 Ti is giving it a run for its money currently. I remember when BF2 came out, the 9800 Pro was pretty old by then, but I was still able to run it at 1280x1024 (the most popular gaming resolution at the time) at nearly maxed settings.
 
I had two EVGA 8800 GTX cards (for SLI). They were expensive and one of them was always broken. RMA'd each of them once or twice.
 
That's just bad timing. The 9800 Pro was EASILY the best graphics card around when it was released. It was my longest lasting (between upgrades) and probably my favorite card, though my 1080 Ti is giving it a run for its money currently. I remember when BF2 came out, the 9800 Pro was pretty old by then, but I was still able to run it at 1280x1024 (the most popular gaming resolution at the time) at nearly maxed settings.

Yeah I had a 9800 Pro and it was pretty sweet
 
It was my NV FX 5600 standard and my ATI 9600 standard.

I hated having to buy the FX 5600 for about the same price because it was inferior to the 9600... but the 9600 wouldn't run the game I was upgrading the GPU for -- Warcraft 3
 
I had a EVGA 7800 GT that I bought and then used the step-up program at the time to upgrade to the 7900 GT. That 7900 GT was constantly giving me issues, artifacts, overheating. Regretted upgrading to that card.
 
Yeah I had a 9800 Pro and it was pretty sweet

Yeah, it was an amazing card all around. Not downplaying that at all. Just for me, the timing was terribad, like RamonGTP alluded to, which affected how I looked at it. I bought it for CS 1.6 and I played the crap out of that game. It lasted forever. It just felt bad when you're buying the best product out for that amount of money and another card comes out shortly afterwards. I was young & naive.
 
It puzzles me that some people list the S3 ViRGE. I got one when it came out (sometime in '95) to replace my S3 Trio (32 or 64, I don't remember) and it had blazing fast Windows performance, was great at DOS games, plus some very early rudimentary 3D. There were no other significant "players" in the 3D world until almost a year later. This was before the first Voodoo or the first Rage 3D. So for the time it was great. Now, if you got it later (VX etc. versions) it's on you, because you should have known it was not really a 3D performer...
I was never disappointed by a card I chose. And I even chose a Savage4 for my gf at the time: she only played 2D adventure games and wanted DVD playback. The Savage4 was the best value for DVD playback at the time and it did manage 2D games quite nicely :D
 
It was my NV FX 5600 standard and my ATI 9600 standard.

I hated having to buy the FX 5600 for about the same price because it was inferior to the 9600... but the 9600 wouldn't run the game I was upgrading the GPU for -- Warcraft 3
Huh, weird. I ran Warcraft 3 on my 9700 Pro. Played with my cousin when he had the GeForce Ti 4200. I would think the 9600 should run it fine.
 
Not sure the Diamond Stealth 3D 2000 should be included there. S3TC texture compression on this card was a gamechanger with Unreal at the time. That game alone made the card worthwhile!

When your textures looked like this circa 2000 or 2001 (with 2048x2048 textures), it was amazing compared to mainstream 3dfx cards' flat 256x256 texture limit.

S3TC
The negative was that S3TC wasn't more broadly adopted.


Sorry, you have no clue what you are talking about. You are thinking of the Savage 3D, while I'm talking about the Diamond Stealth 3D 2000, an S3 ViRGE based graphics card very popular with OEMs in 1997 (because they were selling at a massive discount due to being outperformed by the ATI Rage Pro and Nvidia Riva 128). This was the reason everyone thought S3 gaming was dead, as the Savage 3D wasn't released until mid 1998.

Just because one company purchased the other doesn't mean you should mix these things up - before the S3 purchase, Diamond used every graphics chip on the planet!

And S3TC didn't look very good in its first iteration: because the Savage 3D had a 64-bit memory bus, they used real-time S3TC to make performance acceptable. Long-term, it became a necessary improvement in memory throughput, but the first iteration was a joke.
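For anyone curious about the numbers behind the memory-throughput argument, here's a back-of-the-envelope sketch (plain Python, not from the thread). DXT1, the most common S3TC format, packs each 4x4 pixel block into 8 bytes, which is why it helped so much on a narrow 64-bit memory bus:

```python
# Rough footprint comparison: why S3TC (DXT1) mattered on narrow memory buses.
# DXT1 stores each 4x4 pixel block in 8 bytes: two RGB565 endpoint colors
# plus sixteen 2-bit indices selecting between endpoints/interpolated colors.

def dxt1_bytes(width, height):
    """Size of a DXT1-compressed texture (4x4 blocks, 8 bytes per block)."""
    blocks_x = (width + 3) // 4
    blocks_y = (height + 3) // 4
    return blocks_x * blocks_y * 8

def uncompressed_bytes(width, height, bytes_per_pixel=4):
    """Size of an uncompressed texture (default 32-bit color)."""
    return width * height * bytes_per_pixel

# The 2048x2048 textures mentioned in the thread:
raw = uncompressed_bytes(2048, 2048)   # 16 MiB at 32-bit color
dxt = dxt1_bytes(2048, 2048)           # 2 MiB
print(raw // (1024 * 1024), dxt // (1024 * 1024))  # prints: 16 2
```

So a 2048x2048 DXT1 texture costs about the same memory as a 512x512 uncompressed one, an 8:1 saving in both storage and the bandwidth needed to sample it.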
 
FX5200. I was poor and scraped up enough money to get a new GPU and it was a total disappointment. Couldn't afford a better GPU for a couple more years.
 
Huh, weird. I ran Warcraft 3 on my 9700 Pro. Played with my cousin when he had the GeForce Ti 4200. I would think the 9600 should run it fine.

Not sure either.

My GeForce 3 Ti200 played it well, then it died.

I do recall the 9600 would not display colors properly in that game. New/old drivers would not help and a replacement did not fix it.

I'm also not in the US where it's relatively easier to get replacements. Could have been a faulty card but it played all my other games fine -- it was just WC3 with that problem.

Got stuck with the FX5600 which was inferior but was the best card that budget allowed at that time.
 
Sorry, you have no clue what you are talking about. You are thinking of the Savage 3D, while I'm talking about the Diamond Stealth 3D 2000, an S3 ViRGE based graphics card very popular with OEMs in 1997 (because they were selling at a massive discount due to being outperformed by the ATI Rage Pro and Nvidia Riva 128). This was the reason everyone thought S3 gaming was dead, as the Savage 3D wasn't released until mid 1998.

Just because one company purchased the other doesn't mean you should mix these things up - before the S3 purchase, Diamond used every graphics chip on the planet!

And S3TC didn't look very good in its first iteration: because the Savage 3D had a 64-bit memory bus, they used real-time S3TC to make performance acceptable. Long-term, it became a necessary improvement in memory throughput, but the first iteration was a joke.
I do know what I'm talking about. I owned both the S3 ViRGE for Descent game acceleration (1997ish) and the Diamond Viper S2000 for Unreal engine game acceleration, and S3TC was fantastic visually at the time (1999ish). The only thing we are fighting about is that apparently I owned the II version of the Diamond Viper S2000, released in 1999. IIRC I pre-ordered this card and had it basically at launch... but it wasn't ready to go with S3TC on Unreal at launch. When it did get the patch, though, it was amazing.
Both graphics cards were basically targeting a single popular game with significant enhancements, and both made incredible technology jumps for the time in those respective games. I owned them both, and I saw their improvements first hand! I also owned an Orchid Righteous 3D 3dfx card during this same general time period. S3TC looked amazing compared to my 3dfx card's visuals on the Unreal engine.
https://www.tomshardware.com/reviews/s3-diamond-viper-ii-review,154.html
 
8800 GTS - Terrible price/performance ratio. I should have bitten the bullet and bought the 8800 GTX.
 
I was also unfortunate enough to have a Rendition V1000 (Sierra Screamin' 3D) as my first card.

I think the worst thing about it was the abysmal 2D performance. I remember literally having to pull it out and use the onboard video on my Acer Aspire to play a Sega Genesis emulator properly.
 
Can't remember the ATI card at the time, but that was my worst card I have ever owned out of many. Ended up returning it to Newegg because it had a lot of issues. I ended up with the Nvidia GTX 480 which was a great card. It's driving me nuts that I can't remember the ATI/AMD card though.
 
I was also unfortunate enough to have a Rendition V1000 (Sierra Screamin' 3D) as my first card.

I think the worst thing about it was the abysmal 2D performance. I remember literally having to pull it out and use the onboard video on my Acer Aspire to play a Sega Genesis emulator properly.
I had one of those.
 
My reference GTX 480, so hot and loud. AMD put out an absolutely hilarious video making fun of Nvidia's Fermi at the time. Great watch if you've never seen it.
 
Well, I had a few video cards in my PC career, but the worst was the Daytona 64, a 4MB card that was so bad I complained on day one. Unfortunately, I bought it with my allowance back then and had to wait about a year to upgrade it to a Voodoo3 2000 PCI.

I had a Radeon 9600 SE to try and replace the 9000 Pro I was using, but the performance was worse despite it having a faster GPU. I stuck with my 9000 Pro for a few years until I was able to get my next upgrade, a Radeon X800XL. Before the X800XL, there was a side-grade, which proved to be a downgrade. The GeForce MX 200 was the worst! The tiny heatsink was really thick aluminium and only covered the GPU, but it would get hot enough to artifact and literally burnt my hands several times. It became great eWaste when the X800XL got in.

The X800XL was amazing and I didn't replace it for several generations (at the time I played EverQuest and World of Warcraft) as it worked well. When I did, I got an XFX HD 4850 1GB version. That one was hell with drivers, but I got it going and it lasted me a couple of generations, until I got my Radeon 6870. It was a hot, awful mess. I went back to my 4850 until the R9 series came out (the 6870 was a gen old when I got it).

I haven't had problems with video cards since. My old R9 270X 4GB card is still running well today and can do 2K at medium settings just fine, and my RX 480 can still game above 100 FPS in most titles at max settings. I am now on a green card, an RTX 2070 Super. I hadn't considered Nvidia for many years, since the MX 200 left such a bad taste in my mouth. I was just tired of driver issues with AMD cards and wanted something to just work.
 
I haven't seen anyone mention the NV1 (Diamond Edge 3D), which I got later when it was basically obsolete. I was curious to see what the deal was with those advertised Sega Saturn ports. For those that remember, it used quads instead of triangle primitives, which Nvidia gave up on after the NV1 and the aborted NV2. The NV3 was the Riva 128, which I also had. Start of the GeForce empire...
 
GeForce2 MX 400, I think it was. Also my Kyro II; it wasn't really an upgrade from my TNT2.
 
My least favorite was my Radeon HD 6990 + HD 6970 triple-GPU setup. I still have a picture of this on my phone I can dig up. It produced the fastest 3DMark scores ever, but that's pretty much it. It was buggy as hell, and "hot" was not a good description; it was way more. I needed a box fan on the side to really make it work. I also had dual Radeon R9 390Xs that were extremely hot, my 2nd least favorite config. And right behind those is my Vega 64. I still have it in my closet; the drivers are not good. It's going to be hard to get me away from Nvidia these days.
 
I had a Radeon 9 series (9600 maybe) back around 2004-2005 that I never got along with. It "worked," but had all sorts of odd quirks, like it wouldn't let the system post without a VGA monitor plugged in, and the linux drivers were a nightmare, and on and on.

I remember these same quirks. I thought it was my system all along.
 
ATI 9800 and 9800XT. For whatever reason one fried itself, and the other fried my whole computer
 
Radeon HD3850 512MB. I had it for a short while and had some driver issues. Then I got an 8800 GS: cheaper, almost as fast, and absolutely zero issues with any game on it.
 
AMD 6950. Even flashed to 6970, this card felt like it ran out of gas far too quickly in its life cycle
 
I had this AGP EVGA card from like 2006. Plugged into a CRT monitor using the VGA connection, the eyestrain was so bad; something to do with the convergence.
 
S3 ViRGE... lord... one of my old childhood computers had one of these on the mainboard with NO PCI SLOT... what a nightmare.
 
Back in 2000 or so I scrounged up my allowance to buy an S3 Savage4 and it was awful. I remember a lot of games having bizarre issues rendering transparency textures properly, leaving black squares around particle effects and such. I replaced it with a GeForce2 MX, which was much better, even if it was a middling card at the time. I also had a GeForce FX 5600XT a few years later, and that card was awful, just a crap performer all around. I almost got a Radeon 9700 instead, but I bought into the FX hype and ended up disappointed.
 
The only GPU purchase that I've made that disappointed me was my Radeon VII. Totally self-induced, though; I read the reviews, saw the complaints (loud, hot, buggy, spotty performance), but plopped the money down for one to replace my 1080 (a hybrid unit, too) - total AMD fanboy move, I know. Sold that GPU only a few months later (picked up a 2080 Super).

Still, every other GPU purchase of mine has been well thought out and planned - so I'm rarely disappointed.

This describes my experience and feelings towards the Radeon VII, as well.

The only cards I have been truly disappointed in were the FX-5500 I had and my current Radeon VII. I wasn't really disappointed in the FX-5500 (like others here, I got it because I was poor) until I upgraded to a 6800 Ultra. Damn, what a difference!

As for my Radeon VII, I built a new Ryzen 2700x system a couple years ago and got an RX 580 to go with it. I was not happy with that pairing so I upgraded to an RX 590; still not happy. Then the Radeon VII came out and I was all excited. I have never been happy with this Ryzen 2700x system regardless of what video card was in it. The Radeon VII gets the "most disappointing" title based on its price and that the performance just never seemed to be there despite its specs. Overall, I would say the Radeon VII is the biggest let-down in my GPU history.
 
Radeon HD3850 512MB. I had it for a short while and had some driver issues. Then I got an 8800 GS: cheaper, almost as fast, and absolutely zero issues with any game on it.

The HD3850's memory was too small to play GTA4 with the distance slider turned up, I think. Didn't take long to ditch it.
 
For me it was the 6800 GT. Drove me away from any Nvidia-based cards for more than 10 years.

It was the first card with an onboard VPU, heavily marketed as the best video encoding card that could also excel in the gaming world.

After spending over 300 dollars (in 2004) on their top-of-the-line card, you find out weeks later that the "onboard VPU" is broken and that $75.00 video cards can actually encode movies faster.

Nvidia's solution to a card they released with a hardware problem?

We know it's broken, we don't care, we will not replace them with any other card, BUT we will SELL you some software for $20.00 to help speed up our broken $300.00 card so it can get up to the level of other $75.00 cards on the market.

Last Nvidia-based card I bought up until the 1080 Ti.
 
Speaking of solder/joint issues, Apple's least favorite GPU must have been the G86M, or 8600M GT, which was in a lot of MacBooks that would fail because of "bumpgate," where shifting/bad connection points between the GPU chip and the board caused bad video and failures, dinging Apple's reputation. As many of you may remember, that also affected the Nvidia integrated chipsets sold for Intel platforms.
Apple must have really hated it, because they never sold another Mac with an Nvidia GPU, even when AMD graphics were far behind the curve at various points in time, years down the road.
 
Two PITA GPUs
1) AMD 4870 X2.
Just awful driver support.
2) GTX 480.....
Nuclear heater until watercooled.
 