How long since AMD was on top or within 5% of the best card?

GhostCow

The last non-Nvidia card I owned was the 9700 pro back when it was still ATI. I don't upgrade every generation but it seems strange to me that they've never been at or near the top during a time that I was in the market. Am I just black pilled on AMD or have they really been failing for this long?
 
Disclaimer: No personal or professional ties to AMD nor am I holding any stock :p , only interested in what's best for my money when it comes to GPU...

There have been a number of times since the 9700 when ATi/AMD had major successes, primarily in the '00s before Tesla came around and mopped the floor with everything. In the last decade, really only the 7970 stomped the 580. The issue is that the 580 was already a year old by then, and the 680 dropped a few short months later and stole the show. Same story with the 290X.

I wouldn't really say they were failing prior to 2015; they always managed to be right on Nvidia's heels for quite a bit less money, which is where the whole "AMD best bang for buck" thing came from. That reputation ended up hurting them.

I WILL SAY they completely dropped the ball after the 290X, which was a monster of a card at launch. GCN went totally stale, HBM didn't provide a boost in gaming, etc. There has been almost no reason to buy an AMD card since (many will cite the 580, but I'd rather have a used/B-stock 980 Ti, which will eat the 580 for breakfast). Navi is the first card I would consider buying from AMD since 2013.

On the bright side, Ryzen has seriously rejuvenated AMD. Navi is a good start but not there yet. Maybe next gen.
 
Akshully Ur aLL ronG it wuz d FuRy itZ da oVrclokAs dr3am!!!11

Having owned a few of AMD's best cards, the 290X is probably it. It beat the Titan and held the crown for six months.
The 7970 was also about two years ahead of the competition if you flashed the right BIOS, had decent silicon, and hit 1.3GHz on the core. It was barely slower than my 290X DCUII, and most of the gap was down to VRAM at 1440p.

The 680 certainly wasn't faster than the 7970 OC to OC. Good-clocking cards were absolute beasts.
 
Having owned a few of AMD's best cards, the 290X is probably it. It beat the Titan and held the crown for six months.
The 7970 was also about two years ahead of the competition if you flashed the right BIOS, had decent silicon, and hit 1.3GHz on the core. It was barely slower than my 290X DCUII, and most of the gap was down to VRAM at 1440p.

The 680 certainly wasn't faster than the 7970 OC to OC. Good-clocking cards were absolute beasts.


Both of these fell under "if the drivers worked, you had the top-end cooler available, and your card was a top overclocker, you could run as fast as Nvidia's previous-generation best for maybe six months while producing twice the heat and noise".

I recall that's why I skipped them after coming from an AMD GPU at the time.
 
Both of these fell under "if the drivers worked, you had the top-end cooler available, and your card was a top overclocker, you could run as fast as Nvidia's previous-generation best for maybe six months while producing twice the heat and noise".

I recall that's why I skipped them after coming from an AMD GPU at the time.
I did it on a reference cooler with the 7970. Brutal [H]ardness. Come at me bro!

Drivers were mostly trouble-free for both, too; I think the platform and the games make the difference.
And no, the 7970 with a 40% OC was still ahead of Nvidia's next gen.
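For what it's worth, that 40% figure roughly checks out against the commonly quoted 925MHz reference clock; a quick back-of-the-envelope in Python (clock numbers are the usual reference/OC figures, not anything measured on my card):

# Rough sanity check on the "40% OC" claim, assuming the 925MHz reference core clock
stock_mhz = 925
oc_mhz = 1300
print(f"OC headroom: {(oc_mhz / stock_mhz - 1) * 100:.0f}%")  # -> OC headroom: 41%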
 
The 290X I still have does a 15,000 GPU score in Fire Strike today at 1.1GHz with the factory-locked BIOS. I would really like to see Hawaii running at higher speeds; 1500MHz on 14 or 12nm would wipe out most of the low end with that 512-bit memory bus on GDDR6.

No matter, though; gaming hasn't moved on much since Surround/3D, with 1080p still being the main baseline. AMD should reboot GCN as a "Vintage Series" for all those GlobalFoundries dies, since the RX 580 compares very well to the RX 5500 and is cheap to build.
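For a rough sense of why a 512-bit bus on GDDR6 would be such a big deal, here's a back-of-the-envelope sketch; the 14Gbps chip speed is just my assumption, since no such Hawaii refresh actually exists:

# Hypothetical Hawaii refresh: 512-bit bus paired with assumed 14Gbps GDDR6
bus_width_bits = 512
gddr6_gbps = 14   # assumed per-pin data rate, not an actual product
gddr5_gbps = 5    # the original 290X shipped with 5Gbps GDDR5
print(bus_width_bits / 8 * gddr6_gbps)  # 896.0 GB/s, hypothetical
print(bus_width_bits / 8 * gddr5_gbps)  # 320.0 GB/s on the original card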
 
Well, I like to think my Fury Nano is pretty competitive considering its size, lol. Seriously though, at the top end, the 290X/Fury was it, although Fury has lost ground as time has passed; it wasn't too far off at the time... Performance-wise not too bad, but $$$- or power-wise not so much, so I don't know if I'd really consider it competitive.
 
The 290X I still have does a 15,000 GPU score in Fire Strike today at 1.1GHz with the factory-locked BIOS. I would really like to see Hawaii running at higher speeds; 1500MHz on 14 or 12nm would wipe out most of the low end with that 512-bit memory bus on GDDR6.

I had almost 20 Hawaii cards and none broke 1000MHz.
 
I had almost 20 Hawaii cards and none broke 1000MHz.
You sure you had Hawaii cards? That was the 290, 290X, and 295X2... The 290X and 295X2 had a 1GHz clock speed, and the 290 had a 947MHz clock... If you couldn't hit 1GHz on a single one, I think you must have owned all 290s and never overclocked them, or you're confusing another product for Hawaii. (Or I am incorrect and there is another product that I missed.)

Or you're talking about the FirePro professional series, which most people don't get to overclock, but that's still a possibility.
 
You sure you had Hawaii cards? That was the 290, 290X, and 295X2... The 290X and 295X2 had a 1GHz clock speed, and the 290 had a 947MHz clock... If you couldn't hit 1GHz on a single one, I think you must have owned all 290s and never overclocked them, or you're confusing another product for Hawaii. (Or I am incorrect and there is another product that I missed.)

Or you're talking about the FirePro professional series, which most people don't get to overclock, but that's still a possibility.
I had 290 and 290X cards, all on reference cooling. None would game or mine stable at 1000MHz.
 
R9 290X, but it sucked because no non-reference cooler models launched for several months after, and Maxwell ate its lunch as a cool and quiet solution.
Yeah, but that was and still is pretty standard for how AMD releases Radeon cards. There's almost always a three-month delay after the reference cards release before partner models do (which I wish wasn't a thing anymore).
 
Wow, nice to see the usual anti-AMD suspects are still alive and well.

The 5700 series is very good price/performance; it reminds me of the HD 38xx cards, to be honest. Hopefully big Navi is reminiscent of the HD 4870. At stock it's ~1080 Ti performance for ~$350, and it can overclock fairly well. Drivers have been a major weakness, though.

Vega was pretty meh, but the Vega 56 had its moment in the spotlight since it can overclock REALLY well (if you have the power supply for it). IMO, AMD drivers were just as solid as Nvidia's up until Vega; I think Vega brought about a decline in driver quality.

Fury was probably the start of the mediocrity, but it was competitive with the 980 and 980 Ti once drivers matured, just more expensive, and the 980 Ti had a ton of headroom whereas Fury had none. Prices quickly dropped. The VRAM buffer size also did it no favors.

290/290X was competitive and still plays games well (still on par with the RX 580/GTX 1060/GTX 1650).

7970 was a monster that aged very well.

5870 and 6970 were very good cards

4870/4890 were fantastic cards for the $$$.

38** series was okay.

29xx was trash

X1900/1950 was very strong

X1800 was okay, quickly fixed by x1900

X850 was very strong

That's all since your 9700 pro.

On the Nvidia side of things, they haven't really messed up since the disappointment that was Fermi (the 480; the 580 was good).

My video card history: 5700 XT, 1080 Ti, Vega 64, 1070, RX 580, HD 7870, GTX 460, HD 4870, 8800 GT, 1950 Pro, 9800, Ti4600.
 
I had 290 and 290X cards, all on reference cooling. None would game or mine stable at 1000MHz.
The stock clock on the reference-model 290X was 1GHz... So you're saying you had to underclock to get stability? Seems odd, as I don't recall any mass hysteria at the time over cards failing to hit those speeds. My 280X had no issues at 1GHz+ undervolted, both mining and gaming, but I didn't have a 290 or 290X, just the 280X and then a Fury X and Fury Nano. I just haven't ever heard of 290Xs not hitting their advertised speeds, and with 20 of them, odds are at least a few would work as advertised.
 
The stock clock on the reference-model 290X was 1GHz... So you're saying you had to underclock to get stability? Seems odd, as I don't recall any mass hysteria at the time over cards failing to hit those speeds. My 280X had no issues at 1GHz+ undervolted, both mining and gaming, but I didn't have a 290 or 290X, just the 280X and then a Fury X and Fury Nano. I just haven't ever heard of 290Xs not hitting their advertised speeds, and with 20 of them, odds are at least a few would work as advertised.
I don’t think you’re correct on the stock clock of the 290x being 1000. Do you have a link?
 
The last time AMD had a clear performance advantage over Nvidia was in the 9700/9800 era (more so the former). No OC needed, no voltage tweaking, no memory tweaking. Ever since, they have traded blows a few times, but for the most part Nvidia has dominated.
 
I don’t think you’re correct on the stock clock of the 290x being 1000. Do you have a link?
As linked by sabrewolf732, the base clock was 1GHz, and aftermarket cards went up to 1.5GHz! I can't imagine you could not be getting 1GHz; you're probably confusing it with another card or misremembering the specific numbers (happens to me, which is why I try not to get specific about my past items, because I don't want to mislead with wrong info).
 
Failing is a strong word. AMD GPUs have been just fine: a couple of recent lackluster cards, plus a mining boom that raised prices because the cards could compute better than they could game. The high end saw healthy competition back and forth basically the whole time. The Fury/Vega era was not so good.
I've always bought what I considered the best value, which has been a steady train of ATI/AMD cards. Not high end, though.
 
Failing is a strong word. AMD GPUs have been just fine: a couple of recent lackluster cards, plus a mining boom that raised prices because the cards could compute better than they could game. The high end saw healthy competition back and forth basically the whole time. The Fury/Vega era was not so good.
I've always bought what I considered the best value, which has been a steady train of ATI/AMD cards. Not high end, though.

Major exception being the 8800gt

God, that thing was so good.
 
Both of these fell under "if the drivers worked, you had the top-end cooler available, and your card was a top overclocker, you could run as fast as Nvidia's previous-generation best for maybe six months while producing twice the heat and noise".

I recall that's why I skipped them after coming from an AMD GPU at the time.

This is utterly false for the 7970.
Basically all 7970s could do 1125MHz on the core (stock was 925MHz, I'm pretty sure).
The 680 came out like six months later, just barely beat a stock 7970, and had a third less VRAM.
Whoever owned a 7970 didn't buy a 680 because it was faster, at all.
The 7970 OC'd like a boss, and man, them legs...

Who here today would take a 680 over a 7970 for gaming???
................
That's what I thought.

Back on topic, I agree with the 290X, although the Fury X was pretty close to the 980 Ti (stock) as long as you never got close to the 4GB VRAM limit, lol. OC is a WHOLE nutha story though.

I still have a 290X PCS+ that runs stock at 1040 or 1050MHz, I can't remember which. To the guy that said he had 20 290Xs and none were stable @1000MHz, I say: you're doing it wrong, lol! Your problem is not the video cards.
 
Who here today would take a 680 over a 7970 for gaming???

No one would take either on purpose -- but if I had to make a choice, I would take the GTX 680, for the same reason I bought a pair of GTX 670s: they ran significantly cooler and quieter, single-GPU drivers were solid, and multi-GPU support for Nvidia was ubiquitous, while AMD had just been shown to have utterly terrible multi-GPU frametimes with regularly negative scaling, something I experienced with the pair of HD 6950s I ran previously.

If I can get a GPU with the same performance with significantly less heat or noise, well, that's a win. AMD has approached efficiency parity with Nvidia on the same node exactly... once. They've approached performance parity exactly... once.

They haven't ever done both at the same time.

That doesn't mean that I wouldn't buy their products or recommend them where they make sense, because I do both, but I'm also not going to praise them because they're the underdog or because I own their stock or whatever else. They can earn their praise like everyone else, or not. So far their GPU products following their acquisition of ATI have been a shadow of what ATI was able to release relative to the competition.
 
No one would take either on purpose -- but if I had to make a choice, I would take the GTX 680, for the same reason I bought a pair of GTX 670s: they ran significantly cooler and quieter, single-GPU drivers were solid, and multi-GPU support for Nvidia was ubiquitous, while AMD had just been shown to have utterly terrible multi-GPU frametimes with regularly negative scaling, something I experienced with the pair of HD 6950s I ran previously.

If I can get a GPU with the same performance with significantly less heat or noise, well, that's a win. AMD has approached efficiency parity with Nvidia on the same node exactly... once. They've approached performance parity exactly... once.

They haven't ever done both at the same time.

That doesn't mean that I wouldn't buy their products or recommend them where they make sense, because I do both, but I'm also not going to praise them because they're the underdog or because I own their stock or whatever else. They can earn their praise like everyone else, or not. So far their GPU products following their acquisition of ATI have been a shadow of what ATI was able to release relative to the competition.

No freaking duh, I was just creating a hypothetical scenario.
In my case, I have had a 280X (an OC'd, mature 7970) as a backup card for a long freaking time. Every time I would break it out and use it, I was always impressed by how well it handled games considering its age. Many non-AAA games need more than 2GB of VRAM at 1080p; I would know, since I also have a 2GB 7850 that chokes if textures are too high. So I'm glad I didn't go with a 680.

So you would take a 680 because you think the drivers are more solid? Dude, it's been 8 years; I think they've ironed out any issues that were present. I still have a 280X in use and have never once had a single issue with it, and that's because it's just an OC'd 7970 with mature drivers.

Your comment just reeks of fanboyism. But if you say that you prioritise cool and quiet over performance, it makes more sense, because around here most people want performance first and everything else second. You would rather take a GPU with less VRAM because it's cooler and quieter. There's a reason it was cooler and quieter, and that reason is that it's slower.

Just think, they also made variants of the 7970 with 6GB.
 
IIC says he skipped the 7970. So clearly he is an authority on it.

I had 7970's and 290x's in Crossfire. Even with all the supposed driver "issues", they were great. And ahead of the 680 my friend has/had.
 
7970 > 680 and 290 > 780. Nvidia took the crown back with the OG Titan and has held it since.
BF3 and BF4 were extremely well optimized for Crossfire on the 290s as well; in fact, I think BF4 might be the last great dual-card experience. The R9 290 brought the no-bridge-needed Crossfire experience along with frame pacing.
 
There's not a lot of reason to have the fastest card at the top when you have the fastest card in the segment that the majority of customers buy.
 
The first game in the newest Tomb Raider reboot ran really well at 4K60 on 7970 GHz cards in Crossfire, too.
 
No freaking duh, I was just creating a hypothetical scenario.

Yes.

In my case, I have had a 280X (an OC'd, mature 7970) as a backup card for a long freaking time. Every time I would break it out and use it, I was always impressed by how well it handled games considering its age. Many non-AAA games need more than 2GB of VRAM at 1080p; I would know, since I also have a 2GB 7850 that chokes if textures are too high. So I'm glad I didn't go with a 680.

2GB vs. 3GB may mean turning a setting or two up or down, but you're still quite limited today -- then, it didn't matter. Not a decision point either way.

So you would take a 680 because you think the drivers are more solid? Dude, it's been 8 years; I think they've ironed out any issues that were present. I still have a 280X in use and have never once had a single issue with it, and that's because it's just an OC'd 7970 with mature drivers.

Today, I don't really care about the drivers. Multi-GPU support has been pushed to the back-burner, and single-GPU AMD is more than good enough; I don't play enough newly-released AAA titles to care whether they lag on support.

The point is, their drivers had major issues at the time, and back then there was zero indication that AMD would get their act together. It was a tremendous weakness, and as forum discourse shows, it's still a perceived relative weakness despite AMD making great strides across the board.

Your comment just reeks of fanboyism.

You can read what you want into it -- my main reason was the heat and noise, and, well, the average quality of an AMD GPU cooler (then and unfortunately now). Given that I knew I'd eventually need two of whatever I bought, two 7970s were simply not enough faster in single-card use (where they may have been faster) to warrant the additional heat and noise.

That's still the case today. Yes, they aged well, and yes, if they were equivalent in terms of heat and noise as well as performing well and coming with more VRAM I'd certainly choose the 7970 -- but not as they actually were.

Just think, they also made variants of the 7970 with 6GB.

And they made GTX600-series with 4GB of VRAM, which wasn't useful at the time either.


I'm looking for the best solution. That's holistic, not just FPS charts (and hopefully frametime plots, because we know AMD's history with frame pacing; quick sketch of why those matter below). A little more power draw, a little more heat, imperceptibly more noise? Sure.

GTX680 vs. 7970 wasn't a little.
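Since frametime plots came up: here's a minimal sketch of why they matter beyond an FPS chart, with made-up sample numbers rather than data from any real card:

# Why frametime plots matter: average FPS hides stutter (sample numbers are invented)
frametimes_ms = [16.7] * 95 + [50.0] * 5           # mostly smooth, a few big spikes
avg_fps = 1000 / (sum(frametimes_ms) / len(frametimes_ms))
worst_frame_fps = 1000 / max(frametimes_ms)        # rate implied by the single worst frame
print(f"average: {avg_fps:.0f} fps, worst frame: {worst_frame_fps:.0f} fps")
# -> average: 54 fps, worst frame: 20 fps -- the spikes are what you actually feel in-game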
 
This is utterly false for the 7970.
Basically all 7970s could do 1125MHz on the core (stock was 925MHz, I'm pretty sure).
The 680 came out like six months later, just barely beat a stock 7970, and had a third less VRAM.
Whoever owned a 7970 didn't buy a 680 because it was faster, at all.
The 7970 OC'd like a boss, and man, them legs...

Who here today would take a 680 over a 7970 for gaming???
................
That's what I thought.

Back on topic, I agree with the 290X, although the Fury X was pretty close to the 980 Ti (stock) as long as you never got close to the 4GB VRAM limit, lol. OC is a WHOLE nutha story though.

I still have a 290X PCS+ that runs stock at 1040 or 1050MHz, I can't remember which. To the guy that said he had 20 290Xs and none were stable @1000MHz, I say: you're doing it wrong, lol! Your problem is not the video cards.

I sold my 7970s and purchased 2x GTX 680s because xfire was a POS at that time. The 680s in SLI were much smoother.

The good news was that xfire improved with the 290s, and I sold the 680s to get a pair of those.
 