What was your least favorite graphics card?

Way back when I first had my own PC in the early '90s... I had a 486SX/20 with 2... yes, TWO megabytes of RAM. It had a little 512KB video solution built in. At the time I bought a 1-megabyte video card that was way faster.

That was the worst video card I ever bought, and it eventually let the smoke out too.
 
GeForce2 MX400, hands down. Anything over 640x480 became blurrier and blurrier - the worst DACs I've ever seen, which made the card useless at the time. Second worst would be the Rage 128 and its Windows 2000 driver hell. The one card I loved and hated at the same time was the 8800 GTX: performance was phenomenal at the time, but Windows 8 driver hell virtually ruined that card for me.
 
I bought an Albatron Ti4400, which was physically a beautiful-looking card, and reviewers reported it was extremely overclockable... the card I got barely ran stable at stock clocks. I ended up returning it and somehow trading it for the 9700 Pro I ran for years afterward.
 
For me it was a Powercolor Geforce 2 MX. That thing didn't even run the stock GeForce 2 MX clocks and it would overheat at the drop of a hat. I was in HS at the time and that was all that I could afford. I later sold it and bought another GeForce 2 MX (I forget the brand, I think it's defunct but they were top tier at the time) and I got like a 25% speed increase after overclocking with perfect stability. It cemented in my mind that Powercolor is to be avoided. I hear that they've fixed their reputation, what with their Red Devil line, but even after all these years I feel like I can't trust them again.
 
Awesome thread idea!

Most of my GPU purchases were great, except the Zotac 1080 Mini. It didn't fit in ITX cases because it was too tall - why was it branded that way? Sadly, I found out after purchase.
 
I don't know.

I've never had a video card I thought was terrible.

Let me explain. I do my research first, and I buy the appropriate video card for the appropriate purpose. The only way I would get a video card I thought sucked would be if it didn't live up to expectations.

I've owned many low-end budget video cards that could barely run a game if you wanted them to, but I bought them for non-gaming purposes.

If anything I'd have to mention my Gigabyte Radeon 7970 here, only because it had visual artefacts at stock settings (apparently this was a common problem).
 
GeForce 8500 GT, which was my first PCI-E video card. I could have bought a 7900 for just a little bit more, but I didn't understand the numbering scheme at the time. I thought it would be enough for GTA IV, but it was a slideshow... I cranked that sucker up to whatever the overclocking slider would allow, and I could play for 15 minutes or so at a less-than-slideshow framerate until it crashed... I replaced it with a GTS 250, which of course was night and day. I never checked the framerate, but it was playable.

The GTS 250 is long gone, as I donated it to my brother, but I still have the 8500 GT and it still works, although it's got obvious heat damage.
 
An ASUS 780 Ti with bad delivery problems. I've stayed clear of ASUS since then because of the terrible warranty/RMA issues I had with them and that card.
 
I have a whole bunch of cards I don't like much.

Chips & Technologies 655xx - A really cheap, junky line of video chips commonly found in laptops in the mid-'90s. The drivers sucked and the 2D image quality was really terrible.

NeoMagic MagicGraph - Another cheap junk chip used in low-budget laptops in the mid-'90s.

S3 Virge - S3's first venture into 3D support. While the cards had good 2D and driver support, 3D was awful and in many cases slower than software rendering on the host CPU.

The entire ATI RAGE series - ATI basically took a good 2D chip (the Mach 64) and bolted on a hideously broken and poor 3D core to make the RAGE. The drivers were always awful and the cards never got proper OpenGL support, which most games in the mid-to-late '90s that didn't use Glide relied on. Subsequent revisions like the Rage II, Rage Pro, Rage 128, etc. were no better, other than having good enough Direct3D support to run DirectX titles.

The entire X300-X1000 line - This was ATI's first foray into PCIe, and man, was it a dumpster fire. I had an X800 GTO and it was slower than my previous Radeon 9800 Pro despite having double the pixel and vertex pipelines. I later came into possession of other cards in the line, like the X300, X1300 and X1600; they were all junk. The drivers were super unstable and games ran terribly.

If anything I'd have to mention my Gigabyte Radeon 7970 here, only because it had visual artefacts at stock settings (apparently this was a common problem).

You had it mild compared to my card. In any game, vertices would separate from objects and fly around the screen, making everything look like polygon spike balls constantly flailing about. On the desktop, I'd have diagonal trapezoids of grey streaks moving around the screen. I got rid of that piece of junk and got something else.
 
Let's see... the entire Radeon X1000 lineup. I tried the X1500, X1600, X1650 Pro, X1800 and X1950 Pro; they all sucked. I hated them so much that I didn't buy another Radeon card until the Vega 56 I bought in early 2019.

The only one I had from that group was the X1950 Pro, as that was one of the fastest and last AGP cards you could get. I upgraded from an EVGA 7800GS.
 
The only one I had from that group was the X1950 Pro, as that was one of the fastest and last AGP cards you could get. I upgraded from an EVGA 7800GS.
I had an X1950 Pro for over a year; there was nothing wrong with it. Switched from the garbage 7600GS too.
 
The only one I had from that group was the X1950 Pro, as that was one of the fastest and last AGP cards you could get.

The HD3850 was the fastest AGP card you could get, with the HD4650/70 being a close second.
 
S3 Virge 1MB... Half-Life at 320x240 was pushing it... 640x480 was a slideshow.

The 6970s were POSes too, especially the first Gigabyte Windforce models, which loved heating one of the ICs to insta-burn-your-finger temperatures by blowing hot VRM air onto it along with hot air from the heatpipe fins.

Ahhh, the S3 Virge!

Infamously shit. :D

I remember the very first LAN with my buddies way back when; we were all playing Battlezone on our 3dfx Voodoos, and there's 'that guy' with the S3 Virge. "Yeah, it's really smooth, amazing". We look on his screen and it's a fucking slideshow. :LOL::ROFLMAO:
 
Ahhh, the S3 Virge!

Infamously shit. :D

I remember the very first LAN with my buddies way back when; we were all playing Battlezone on our 3dfx Voodoos, and there's 'that guy' with the S3 Virge. "Yeah, it's really smooth, amazing". We look on his screen and it's a fucking slideshow. :LOL::ROFLMAO:

Several of the earliest 2D/3D solutions essentially took an existing core and crammed a 3D component into the die space freed up after a process improvement. The Virge was built on the Trio32/64; the ATi Rage series was built on mach64. The earliest accelerators were intended to provide a superior software-quality experience at higher resolutions than 486es and Pentiums could manage, but then RAM prices dropped and it was actually possible to buy a 3Dfx part originally intended for arcade machines, and everything went nuts.
 
I had a standard 9600 that wouldn't run Warcraft III properly. Had to downgrade to an FX 5600 :/
 
S3 Virge 1MB... Half-Life at 320x240 was pushing it... 640x480 was a slideshow.

I second the S3 Virge cards; the only thing they had going for them was the S3TC version of Unreal Tournament. It looked really good, but the framerate was a total joke for a first-person shooter. Shortly after the release of S3TC, Microsoft added the texture compression to DirectX, which meant all cards could use it, leading me to smack myself vigorously with the box the card was packaged in. The S3 Virge should have been used in the PC build by The Verge.

3870X2... where to start with this one? Hmm, maybe the damn stuttering they were never able to fix with drivers, or perhaps the hotter-than-summer-piss-on-pavement temperatures and the resulting fan noise. If anyone has a GPU museum, I have a new refurb-replaced-while-in-transit 3870X2 in a box. I'd do electronics recycling on it, but where I live is very anti-recycling, so it lingers in a storage container along with many other old cards.
 
Bought a GeForce 7300GT because I wanted to play a game and my older Radeon didn't support that level of DirectX. I thought it would be good enough. That's when I learned not to cheap out on GPUs.
 
The entire X300-X1000 line - This was ATI's first foray into PCIe, and man, was it a dumpster fire. I had an X800 GTO and it was slower than my previous Radeon 9800 Pro despite having double the pixel and vertex pipelines. I later came into possession of other cards in the line, like the X300, X1300 and X1600; they were all junk. The drivers were super unstable and games ran terribly.

I had an X800 XT on launch (AGP) and it was great. The Omega drivers made an amazing difference too. It kept a 6800 Ultra honest and surpassed it with the Omegas... I had no issues with it other than the 256MB of VRAM being a limitation in Doom 3 for Ultra quality.

Ahhh, the S3 Virge!

Infamously shit. :D

I remember the very first LAN with my buddies way back when; we were all playing Battlezone on our 3dfx Voodoos, and there's 'that guy' with the S3 Virge. "Yeah, it's really smooth, amazing". We look on his screen and it's a fucking slideshow. :LOL::ROFLMAO:

LOL! I'm glad I wasn't 'that guy', haha. They were crap then, to the point of almost being redundant. Mine came in a school laptop, so I had no choice; software rendering on a K6-2 386MHz was about as fast as the damn card...
One of the OpenGL modes made for interesting cheating in TFC though, as it made characters visible through walls (which all went white) if they touched them. 2fort became quite the interesting snipefest with this...

One last mention: I was given a GT 730. It can't even run Aero at 1440p... enough said. That thing is a horrible 'last resort' card, but OK for troubleshooting.
 
My purchase of 8800 GTs in SLI was my worst purchase ever. The stuttering and problems I had with SLI were a nightmare and sucked the fun out of things more often than it worked like it was supposed to. Reviews sucked me into that purchase, and I was never really happy with it.
 
My least favorite card was probably a Kyro II I had. It wasn't terrible, but it wasn't well supported, which led to some odd issues with some games.

I had one too. I think I ended up returning it because it was so bad, and later on I got a Radeon 9000 that worked waaaay better.

[Image: Hercules 3D Prophet 4000XT 64MB PCI]
 
I might catch some flak for this, but... GeForce4 Ti 4200.

It had the WORST image quality. Omega Drivers helped, but man, what a muddy image. I also had an ATI Radeon 9000 Pro at the same time. Even though it was slower, the image quality was so much better. And DX 8.1 meant extra visual effects in Max Payne 2, so hah!
 
I might catch some flak for this, but... GeForce4 Ti 4200.

It had the WORST image quality. Omega Drivers helped, but man, what a muddy image. I also had an ATI Radeon 9000 Pro at the same time. Even though it was slower, the image quality was so much better. And DX 8.1 meant extra visual effects in Max Payne 2, so hah!
That's interesting. I remember I had a 9000 Pro and I was so jealous of my friends' Ti 4200s.

Then again, we mostly cared about fps.
 
I second the S3 Virge cards; the only thing they had going for them was the S3TC version of Unreal Tournament. It looked really good, but the framerate was a total joke for a first-person shooter. Shortly after the release of S3TC, Microsoft added the texture compression to DirectX, which meant all cards could use it, leading me to smack myself vigorously with the box the card was packaged in. The S3 Virge should have been used in the PC build by The Verge.

3870X2... where to start with this one? Hmm, maybe the damn stuttering they were never able to fix with drivers, or perhaps the hotter-than-summer-piss-on-pavement temperatures and the resulting fan noise. If anyone has a GPU museum, I have a new refurb-replaced-while-in-transit 3870X2 in a box. I'd do electronics recycling on it, but where I live is very anti-recycling, so it lingers in a storage container along with many other old cards.

Unfortunately S3TC didn't debut until the S3 Savage3D. The Virge went through multiple revisions, and late-model Virges were around twice as fast as the earliest, but it was like turbocharging a Mercury Topaz. Even the Savage3D was pretty broken; the closest thing S3 had to a successful 3D accelerator was the subsequent Savage4, which had cheap memory and dodgy drivers and was consigned to the bargain segment by its 64-bit memory bus. But at least it had S3TC...
 
Unfortunately S3TC didn't debut until the S3 Savage3D. The Virge went through multiple revisions, and late-model Virges were around twice as fast as the earliest, but it was like turbocharging a Mercury Topaz. Even the Savage3D was pretty broken; the closest thing S3 had to a successful 3D accelerator was the subsequent Savage4, which had cheap memory and dodgy drivers and was consigned to the bargain segment by its 64-bit memory bus. But at least it had S3TC...


Yeah, they never really figured out how to make a performance graphics chip, or make bug-free drivers. The Savage 2000 was too late and too broken to save them from bankruptcy.

The Savage3D saved texture memory and bandwidth by automatically compressing all textures with S3TC, but that meant reduced image quality on all textures (real-time encoding is always less optimized than offline tools).
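For anyone curious what that trade-off looks like, here's a minimal sketch of DXT1/S3TC-style block compression in Python. This is an illustrative toy encoder, not S3's actual hardware algorithm: the naive pick-the-extremes endpoint selection is exactly the kind of shortcut a real-time encoder makes that an offline tool wouldn't. Each 4x4 block of 24-bit RGB texels (48 bytes) packs into 8 bytes, a fixed 6:1 ratio.

```python
# Toy DXT1/S3TC-style block encoder (illustrative only, not S3's real algorithm).
# A 4x4 block of RGB texels is stored as two RGB565 endpoint colors plus a
# 2-bit palette index per texel: 2 + 2 + 4 = 8 bytes for 48 bytes of input.

def pack_rgb565(r, g, b):
    """Quantize 8-bit-per-channel RGB down to a 16-bit RGB565 value."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def encode_dxt1_block(texels):
    """texels: 16 (r, g, b) tuples, row-major, for one 4x4 block."""
    # Naive endpoint choice: darkest and brightest texels by luminance.
    # Real encoders fit a line through the block's color cloud instead,
    # which is where offline tools beat real-time encoding on quality.
    by_luma = sorted(texels, key=lambda c: 0.299*c[0] + 0.587*c[1] + 0.114*c[2])
    c0, c1 = by_luma[-1], by_luma[0]
    # The 4-entry palette: the two endpoints plus two interpolated colors.
    # (Real DXT1 also has a 3-color + transparent mode, keyed off the numeric
    # order of the packed endpoints; omitted here for brevity.)
    palette = [c0, c1,
               tuple((2*a + b) // 3 for a, b in zip(c0, c1)),
               tuple((a + 2*b) // 3 for a, b in zip(c0, c1))]
    indices = 0
    for i, t in enumerate(texels):
        # Map each texel to its nearest palette entry (squared RGB distance).
        best = min(range(4), key=lambda j: sum((x - y) ** 2
                                               for x, y in zip(t, palette[j])))
        indices |= best << (2 * i)
    return (pack_rgb565(*c0).to_bytes(2, "little")
            + pack_rgb565(*c1).to_bytes(2, "little")
            + indices.to_bytes(4, "little"))

# 16 texels of a red gradient: 48 bytes of RGB in, 8 bytes out (6:1).
block = [(i * 16, 0, 0) for i in range(16)]
assert len(encode_dxt1_block(block)) == 8
```

Every block comes out the same size regardless of content, which is why it's hardware-friendly, and also why forcing it onto every texture in real time produced the quality loss described above.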
 
I am really upset that nVidia and Apple can't work things out so we can use nVidia in the Apple ecosystem again!
 
I am really upset that nVidia and Apple can't work things out so we can use nVidia in the Apple ecosystem again!

You'd think that since Apple's external GPU functionality was fenced off to only newer Macs, they'd open the doors and support whatever GPU you need to get the job done.

Oh wait, it's Apple; 'Pro' is just a label they throw around.

Luckily, I have an equipment allowance and Linux works fine on a ThinkPad. I don't do any native mobile dev, so I'm not locked into anything.
 
I might catch some flak for this, but... GeForce4 Ti 4200.

It had the WORST image quality. Omega Drivers helped, but man, what a muddy image. I also had an ATI Radeon 9000 Pro at the same time. Even though it was slower, the image quality was so much better. And DX 8.1 meant extra visual effects in Max Payne 2, so hah!

I saw tons of them burn up due to bad VRMs. They were shit hardware-wise, and the image quality did seem a bit worse than on my 9600 Pro.
 
Good thread! One card out of the insane number of video cards I've owned really stands out. I bought an EVGA FX5900 128MB card when they first appeared on the market. I was all excited until I got the card and discovered it had smeared image quality, reminding me of a crappy GeForce2 (many of those had poor output quality). I ended up RMAing the card twice with EVGA before I got one that wasn't a headache-inducing blurry mess on my 17" Dell Trinitron monitor (1024x768 resolution). I damn near just returned it and went back to the GeForce3 I had moved to my secondary rig.

Then the 'good' card ran about a year until the memory failed and it started artifacting. Good times! The FX series really was garbage. The used 9800 Pro I replaced it with was a far superior product and ran F.E.A.R. very well!
 
I saw tons of them burn up due to bad VRMs. They were shit hardware-wise, and the image quality did seem a bit worse than on my 9600 Pro.
Doom 3 killed mine, actually!

I never got to find out why, exactly. I scheduled an RMA with BFG, but I left the card on a bookshelf and a younger stepbrother snuck into my bedroom and dismantled the card. >_>
 
I second the S3 Virge cards; the only thing they had going for them was the S3TC version of Unreal Tournament. It looked really good, but the framerate was a total joke for a first-person shooter. Shortly after the release of S3TC, Microsoft added the texture compression to DirectX, which meant all cards could use it, leading me to smack myself vigorously with the box the card was packaged in. The S3 Virge should have been used in the PC build by The Verge.

3870X2... where to start with this one? Hmm, maybe the damn stuttering they were never able to fix with drivers, or perhaps the hotter-than-summer-piss-on-pavement temperatures and the resulting fan noise. If anyone has a GPU museum, I have a new refurb-replaced-while-in-transit 3870X2 in a box. I'd do electronics recycling on it, but where I live is very anti-recycling, so it lingers in a storage container along with many other old cards.
S3TC was introduced with the Savage3D years later. The Virge couldn't do bilinear filtering properly, let alone texture compression.
 
Sierra Screamin' 3D with the Rendition V1000. It never really lived up to expectations as an alternative.
 
S3 Virge. 'nuff said

Powercolor Radeon 7500. The chip reported itself as a Radeon 7500 but would only work with customized manufacturer (Powercolor) drivers. It turns out it was running SDR memory instead of the DDR the 7500 was supposed to have, and the chip and memory were underclocked compared to ATI's specs for a 7500. In other words, Powercolor shipped a Radeon 7200 that was BIOS-modded to report itself as a 7500.
 
Sierra Screamin' 3D with the Rendition V1000. It never really lived up to expectations as an alternative.


Yeah, that was quite the cluster. It was such a pain in the ass to work with that Carmack swore off natively rendered versions of Quake afterwards. Quake was instead ported to OpenGL, and everyone had to create a MiniGL driver just to get in on it.

It always amazed me that the V1000 was so hobbled; the VQuake port didn't run much better on the second generation, either.

https://www.vogonswiki.com/index.php/VQuake

r_surfacelookup = 0 made things a lot faster :D

I only played VQuake in single-player (edge AA basically for free), but the rest of the world abandoned it for GLQuake/QuakeWorld.
 