Should the 5700xt be considered one of the greatest of all time?

Fair enough. I was considering throwing the 2080 Ti into my list for introducing RT in a real way, but 1) the GTX 1080 Ti has had longer staying power thus far, and 2) RT and DLSS have not brought real impact to gaming yet (whereas unified shaders gave an immediate, massive bump in performance and affected all subsequent APIs). There's also the notable price increase on the 2080 Ti which, disregarding my prior criteria, left a bad taste in many people's mouths and set a precedent for ever-increasing prices.
The 1080 Ti and the whole Pascal range did not bring unified shaders or any other features we didn't already have (at least no major ones - there might be some minor features related to VR gaming, which was the hot thing at the time - who cares XD). Just another good high-end GPU. The GTX 980 Ti before it was very similar in terms of generational improvement, as were other high-end GPUs before it.

The Turing generation, on the other hand, not only introduced more new features than anything in forever but literally forced an evolution in computer graphics hardware. Given AMD's current features, their implementation, and their overall opinion on RT, I bet that if it weren't for Nvidia, the PlayStation 5 and Xbox Series consoles would still have the same DX12 feature set that the previous generation of consoles had.

Cyberpunk 2077 is the first game that truly shows why RTX is indeed revolutionary hardware, so adoption is rather slow, but for what it is worth it is the first generation of something genuinely new, whereas everything after the GTX 680 and Radeon 7970 was just an iteration of the same. Besides, Turing cards will stay relevant far longer than any Pascal card when it comes to the ability to run games at the expected quality, or to run them at all. Mark my words 🙂
 
lol

I agree the 5700 XT is not in the running for the greatest of all time, but I'm sorry, this is not accurate at all. A 1080 is 75% of a 5700 XT, whereas a 5700 XT is 82% of a 3060 Ti. It's certainly closer to a 2080 than to a 2060 Super. For $399 it provided very good value, arguably one of the better cards from the Turing era (2018-2020) in terms of price/perf, especially considering they were often available in the ~$350 range.
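Not from the original post - just to make the quoted ratios concrete: relative-performance percentages chain multiplicatively, so the two figures above imply roughly where the GTX 1080 lands against the 3060 Ti. The numbers are the ones quoted in the post, not fresh benchmarks.

```python
# Quick sanity check of how relative-performance percentages chain together.
gtx1080_vs_5700xt = 0.75    # GTX 1080 ~= 75% of an RX 5700 XT (figure quoted above)
rx5700xt_vs_3060ti = 0.82   # RX 5700 XT ~= 82% of an RTX 3060 Ti (figure quoted above)

# Chaining the two ratios gives the GTX 1080 relative to the RTX 3060 Ti.
gtx1080_vs_3060ti = gtx1080_vs_5700xt * rx5700xt_vs_3060ti
print(f"GTX 1080 ~= {gtx1080_vs_3060ti:.0%} of an RTX 3060 Ti")  # roughly 62%
```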
Which card it is closer to depends heavily on the games used for testing. I used TechPowerUp reviews for my numbers:
[TechPowerUp relative performance chart, 2560x1440]

Besides, whether it's 75% or 85%, it's a minor difference and pretty much the same class of performance. If you cannot play a game at a given resolution and given settings on a 1080, then you won't be able to do it on a 5700 XT either. There is certainly little to no reason for 1080 owners to switch to a 5700 XT.

Imho the 2060 Super is the superior card to get, even if it was 50 bucks more expensive and slower. It will stay relevant for far longer and will be fully usable as a budget second-hand GPU.

Personally I hate using a GPU which doesn't run all games at max settings, even if the games in question are not my pick and the effects are marginal. The RTX card I even pre-ordered. I wanted a 2080 but downgraded to a 2070, as my pre-order was delayed and it didn't seem like enough of an improvement to spend 200 bucks more, so I ordered the 2070. It is the card I am most happy with since getting an 8800 GS for almost half the price of an 8800 GT, or a 9550 which had a 9600 XT board. I was pretty happy with the HD 7950 also; that card OCed like hell and ran everything maxed out for years until it... died, probably because it overheated or something 🤣
 
My vote for the GOAT in video cards is... the 3dfx Voodoo 1. At the time, it was a revelation, showing actually playable arcade-quality 3D graphics on your PC, and it completely changed the way games were designed on PC forever after. Despite this, it was actually affordable, being ~$200 in 1990s money as well.

Nothing before nor since has had quite the same impact.
 
The 290X in Uber Mode took on Titan-class cards, which was way above its pay grade, and it was DX12 ready.
 
Voodoo 2 12 MB
Radeon 9700/9800 Pro
Radeon 4870
GeForce 8800 GT/GTS/GTX/Ultra

Those are GOAT-type cards. The 5700 XT? Maybe you enjoyed yours, but it's not like it shook up the video card scene in any major way. Heck, I'd even suggest that a Ti 4200 is a potential GOAT card.
 
Voodoo 2 12 MB
Radeon 9700/9800 Pro
Radeon 4870
GeForce 8800 GT/GTS/GTX/Ultra

Those are GOAT-type cards. The 5700 XT? Maybe you enjoyed yours, but it's not like it shook up the video card scene in any major way. Heck, I'd even suggest that a Ti 4200 is a potential GOAT card.

I was 7 years old playing Pokémon Blue/Silver on a Game Boy Color when 3dfx Voodoo 2 was the best GPU, so what did it do to make it the best of all time? And the Radeon 4870 was a good GPU, but why did you choose that one as the best?

For the Radeon 9700 and Nvidia 8800 I know the story behind those already.
 
I was 7 years old playing Pokémon Blue/Silver on a Game Boy Color when 3dfx Voodoo 2 was the best GPU, so what did it do to make it the best of all time? And the Radeon 4870 was a good GPU, but why did you choose that one as the best?

For the Radeon 9700 and Nvidia 8800 I know the story behind those already.

All of this is in the context of its time.

It was the first “fully featured” GPU of its time that also ran games fast. You could combine two of them to get 1024x768. That's like getting 8K today. They weren't overpriced, and they had very good driver support. Glide was the go-to API for years. Graphically, games on Voodoo 2s could look MUCH better than those on other cards while running faster.

I had a Dell P2-300 back then. 64 MB of RAM. AWE 64 sound card. RIVA 128 GPU. As soon as I put that Voodoo 2 in there, the RIVA collected dust.

Years later I had a Falcon Northwest machine that was my dream machine. Even though it had a GeForce 256 DDR, I still had two Voodoo 2s in SLI just for Glide games.

Hell, I STILL HAVE A PAIR OF VOODOO 2s new and sealed in the box with original mail in rebate stickers on them on my display shelf!

But let me put this into another context: most people have played games on Voodoo 2s and never known it. Remember Hydro Thunder? It's a boat racer that is inexplicably still in every Dave and Buster's. Well, that's a Celeron 333 with a 10 MB Voodoo 2.
 
when 3dfx Voodoo 2 was the best GPU
It had a more than 100% jump in performance over the Voodoo 1 in many scenarios, which made that release hyper-hyped back in the day, with SLI extending its relevance a bit:

[Voodoo 1 vs Voodoo 2 benchmark chart]

That said, outside of titles that had a solid Glide version, I am not sure how much of it was the card being legendary versus the hold 3dfx's marketing had over the minds of gamers, as opposed to it being that much superior to the Nvidia TNT. There was an aura about how those Glide games felt, lighting-wise, that was maybe subjectively created and hyped by that 3dfx logo.

It is clearly in the list of the most legendary cards too.
 
It was the first “fully featured” GPU of its time that also ran games fast.
I am not sure what you mean by that. From my memory, we still had to have a regular 2D card installed and the Voodoo 2 only ran 3D games, or was keeping the regular card just an option?
 
The HD 4870 was so unexpectedly good that Nvidia was forced to immediately lower the MSRP of the GTX 280 and 260. AIBs even went as far as issuing reimbursements to customers who bought them at MSRP. The HD 4870 represented ATI finally getting back on their competitive feet after Nvidia's domination since G80. Sounds pretty legendary, huh?

https://www.extremetech.com/computing/82582-nvidia-drops-prices-on-geforce-gtx-280-and-260.

There are many other articles on it.

The Voodoo 2 was a great chip all by itself; what made it even better? You could run two in SLI. Not SLI as in how Nvidia borrowed the term after they purchased 3dfx, but scan-line interleave: the video cards took turns drawing lines of the same frame, giving true 2x performance with little to no latency hit. This is very different from Nvidia's Scalable Link Interface, which relied on AFR (alternate frame rendering), driver profiles, and increased frame latency. Running two cards together also allowed you to game at 1024x768, completely unheard of at the time.
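Not part of the original post - just a toy sketch (with made-up function names) of the difference being described: scan-line interleave splits the lines of the same frame between two cards, while alternate frame rendering hands whole frames to each card in turn, which raises throughput but also frame latency.

```python
# Toy illustration (not real driver code) of how work is divided in
# 3dfx-style scan-line interleave (SLI) versus alternate frame rendering (AFR).
# All names here are made up for illustration.

def render_line(card, frame, y):
    # Stand-in for "card draws scanline y of this frame".
    return f"card {card} drew line {y} of frame {frame}"

def scanline_interleave(frame, height=768):
    # Both cards work on the SAME frame: card 0 takes even lines,
    # card 1 takes odd lines, so each frame finishes roughly twice as fast.
    return [render_line(y % 2, frame, y) for y in range(height)]

def alternate_frame_rendering(frame):
    # Each card renders a WHOLE frame, and frames alternate between cards.
    # Throughput goes up, but a single frame is no faster, and the extra
    # frame in flight adds latency.
    card = frame % 2
    return f"card {card} drew all of frame {frame}"

if __name__ == "__main__":
    print(scanline_interleave(frame=0)[:2])  # frame 0 split line by line
    print(alternate_frame_rendering(0))      # frame 0 entirely on card 0
    print(alternate_frame_rendering(1))      # frame 1 entirely on card 1
```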

just my opinion btw.
 
I am not sure what you mean by that. From my memory, we still had to have a regular 2D card installed and the Voodoo 2 only ran 3D games, or was keeping the regular card just an option?


By fully featured I mean supporting all the features games used at the time. GPU is common vernacular now, but back then we just said 2D card and 3D card. A lot of people preferred this because most early combined chips were terrible at one or both. Duke, Doom, etc still used your 2D card for the most part.

Actually, I recall having a 2D card, 3D card, 2 separate sound cards for music and effects, and an MPEG decoder card to watch movies. Lots of dongles on the back of that PC.

Now all of that happens in the GPU.
 
By fully featured I mean supporting all the features games used at the time. GPU is common vernacular now, but back then we just said 2D card and 3D card. A lot of people preferred this because most early combined chips were terrible at one or both. Duke, Doom, etc still used your 2D card for the most part.
I could be remembering wrong (I was young), but from memory the Voodoo 2 was a 3D card and you still needed a 2D card with it.
 
Ehhh...it was not well received when it came out. Slower than a Voodoo2, and all sorts of driver issues with 2D.
What driver issues?
I had a Banshee, and actually had it after having a Voodoo2 and even after a Voodoo3, and my experience tells me 3dfx cards were among the best when it comes to 2D: all GDI functions were hardware accelerated, making the desktop feel snappier than on other cards, and even DOS compatibility/performance was among the best of all time. I definitely did not notice any driver issues on either the V3 or the Banshee.

Performance-wise the Banshee was faster than the Voodoo2 in most games. Some that heavily used multi-texturing might have run faster on the Voodoo2, but most games did not. At one time I had both a Banshee and a V2, and I remember doing some tests.

Image quality of the V3 and Banshee was pretty much the best:
- analog output (basically CRT image clarity)
- 3D image quality in 16-bit - these cards even had a fancy filter to remove dithering artifacts and put out roughly 20-bit-quality color output (see the toy sketch below)
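Not from the original post - just a toy sketch of the idea being described: RGB565 keeps only 5-6 bits per channel, ordered dithering trades the resulting banding for noise, and a simple neighbour-averaging output filter blends that noise back into intermediate shades. This is a heavily simplified stand-in, not 3dfx's actual filter, and all function names are made up for illustration.

```python
# Toy sketch: quantize an 8-bit channel to 5 bits (as in RGB565), with and
# without ordered dithering, then apply a crude neighbour-averaging output
# filter. Only meant to show the idea, not 3dfx's real hardware filter.

BAYER_2X2 = [[0, 2], [3, 1]]  # classic 2x2 ordered-dither matrix

def quantize5(value):
    # Straight truncation of an 8-bit value to 5 bits and back (causes banding).
    return (value >> 3) << 3

def dither5(value, x, y):
    # Add a position-dependent offset before truncating, so neighbouring
    # pixels alternate between the two nearest 5-bit levels.
    offset = BAYER_2X2[y % 2][x % 2] * 2  # offsets 0..6, most of one 5-bit step (8)
    return quantize5(min(255, value + offset))

def output_filter(row):
    # Average each pixel with its right-hand neighbour: the dither noise
    # blends back into intermediate shades, approximating extra bit depth.
    return [(a + b) // 2 for a, b in zip(row, row[1:] + row[-1:])]

if __name__ == "__main__":
    gradient = list(range(96, 104))            # a subtle 8-bit gradient
    print([quantize5(v) for v in gradient])    # collapses into a single band
    dithered = [dither5(v, x, 0) for x, v in enumerate(gradient)]
    print(dithered)                            # some pixels flip up to the next level
    print(output_filter(dithered))             # intermediate shades come back
```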

The Riva TNT/TNT2 had 32-bit color, which looked better in games that took advantage of it, but performance in this mode was not very good to say the least, especially if you compared the budget option. Almost everyone who did not use some kind of 3dfx card had the Riva TNT2 M64, and compared to it there really was nothing to complain about on the Banshee. The M64 was a tiny bit faster, but its 16-bit output was terrible and it did not have enough memory bandwidth for 32-bit, so it ran like a three-legged dog. Also, most games of this era of PC gaming had Glide, which often looked better and ran better, so imho if you had a Riva you were missing out on how these games should really work and look compared to having a proper 3D accelerator.

V2 vs Banshee: the Banshee definitely had nicer output in games (better dithering algorithms and a special output filter to remove dithering artifacts) and on the desktop too, as it had excellent video DACs/filters; most 2D cards you used with a V2 did not, and pass-through through the V2 did not help.
There were registry (or rather autoexec.bat...) switches to disable these fancy filters, but still the Banshee/V3 were superior in image quality. That said, for the most part it did not matter much, and all 3dfx cards had better image quality than most of the competition.

Overall I would not put even the Voodoo3 as "greatest of all time", and definitely not the Banshee. 3dfx screwed up big time with the V3 by making it a 16-bit-only card. Their next card, the Voodoo4 4500, had 32-bit color and performed exactly like a Voodoo3 in 16-bit, and if it had been the GPU that competed with Nvidia's Riva TNT2 Ultra, then who knows, maybe we would all use 3dfx cards today. Especially since the VSA-100 chip could do SLI, and the Voodoo5 5500 would have just rolled over all the competition. Unfortunately, by the time they did release the VSA-100 cards it was too late.
 
Everyone voting for the 8800s have not run into the BGA solder issue...
 
What driver issues?
I had a Banshee, and actually had it after having a Voodoo2 and even after a Voodoo3, and my experience tells me 3dfx cards were among the best when it comes to 2D: all GDI functions were hardware accelerated, making the desktop feel snappier than on other cards, and even DOS compatibility/performance was among the best of all time. I definitely did not notice any driver issues on either the V3 or the Banshee.

Performance-wise the Banshee was faster than the Voodoo2 in most games. Some that heavily used multi-texturing might have run faster on the Voodoo2, but most games did not. At one time I had both a Banshee and a V2, and I remember doing some tests.

Image quality of the V3 and Banshee was pretty much the best:
- analog output (basically CRT image clarity)
- 3D image quality in 16-bit - these cards even had a fancy filter to remove dithering artifacts and put out roughly 20-bit-quality color output

The Riva TNT/TNT2 had 32-bit color, which looked better in games that took advantage of it, but performance in this mode was not very good to say the least, especially if you compared the budget option. Almost everyone who did not use some kind of 3dfx card had the Riva TNT2 M64, and compared to it there really was nothing to complain about on the Banshee. The M64 was a tiny bit faster, but its 16-bit output was terrible and it did not have enough memory bandwidth for 32-bit, so it ran like a three-legged dog. Also, most games of this era of PC gaming had Glide, which often looked better and ran better, so imho if you had a Riva you were missing out on how these games should really work and look compared to having a proper 3D accelerator.

V2 vs Banshee: the Banshee definitely had nicer output in games (better dithering algorithms and a special output filter to remove dithering artifacts) and on the desktop too, as it had excellent video DACs/filters; most 2D cards you used with a V2 did not, and pass-through through the V2 did not help.
There were registry (or rather autoexec.bat...) switches to disable these fancy filters, but still the Banshee/V3 were superior in image quality. That said, for the most part it did not matter much, and all 3dfx cards had better image quality than most of the competition.

Overall I would not put even the Voodoo3 as "greatest of all time", and definitely not the Banshee. 3dfx screwed up big time with the V3 by making it a 16-bit-only card. Their next card, the Voodoo4 4500, had 32-bit color and performed exactly like a Voodoo3 in 16-bit, and if it had been the GPU that competed with Nvidia's Riva TNT2 Ultra, then who knows, maybe we would all use 3dfx cards today. Especially since the VSA-100 chip could do SLI, and the Voodoo5 5500 would have just rolled over all the competition. Unfortunately, by the time they did release the VSA-100 cards it was too late.

I confused the driver issues with Voodoo Rush.

Performance of the Banshee, though, was lower than the V2's due to the lack of a second texture unit. Image quality on both was considered equal.


https://www.anandtech.com/show/197/3
 
Everyone voting for the 8800s have not run into the BGA solder issue...

I didn't have the issue with the 8800 GT (3 years is how long they were in use - 2 of them in separate machines). I thought the bigger issue was with the 8600 mobile series, though.

*Note* I'm not saying the issue didn't exist, it's just that I don't remember it if it did.
 

The original GeForce Titan was released in February 2013. AMD's Radeon R9 290X released in October 2013. The R9 290X was more than six months late to the party and was generally beaten by the Titan anyway. Even worse, it had to compete with NVIDIA's GeForce GTX 780 Ti, which was generally faster than the Titan in gaming due to higher clock speeds. Being late to the party and not being decisively faster than its competition makes the card a pretty underwhelming offering and disqualifies it from being considered one of the greatest GPUs of all time.

Listen, if you guys want to debate the greatest GPUs of all time, it really needs to be a GPU that stood the test of time better than the others or was a technological marvel when it came out. A card that set the bar high and brought us new technologies that actually went somewhere. The 8800 GTX, the 9700 Pro, the GeForce 256, the Riva TNT2 Ultra, the 3dfx Voodoo, etc. These are the cards that actually had an impact on the market. Cards like the R9 290X may have been great for you, but ultimately weren't long-lasting enthusiast favorites. They didn't make a substantial mark on the industry.
 
What it is closer to what depends heavily on games used for testing. I used Techpoweup reviews for my numbers:
I trust TechPowerUp as well, but when was that chart made? I suggest checking out HWUnboxed's video revisiting the 5700 XT. The numbers have changed.
 
Everyone voting for the 8800s have not run into the BGA solder issue...
My 8800 GTX died after a couple of years of use, and I think this was the issue. However, when I sent it back for an RMA, eVGA sent me back a 9800 GTX+, which was about 10% faster than the 8800 GTX.

That being said, the 8800 series of cards was the first GPU line to introduce unified shaders. This, combined with the 8800 GTX's massive 2x performance bump over the previous gen, is what made it such an amazing GPU.
 
Looking at all the graphs posted here.. this thread should be renamed to:

Should the 1080 Ti be considered one of the greatest of all time?

 
Looking at all the graphs posted here.. this thread should be renamed to:

Should the 1080 Ti be considered one of the greatest of all time?


The only remarkable thing about the 1080 Ti was its value at the high end; it was both the fastest non-Titan GPU and priced appropriately compared to the rest of the lineup. Beyond that, it wasn't a super remarkable GPU, as it didn't introduce any new features. It was just a good all-around GPU. Unfortunately, the thing that stops it from being the "GOAT" is that its successor, Turing, introduced hardware-accelerated ray tracing and AI-based rendering. The 1080 Ti could do ray tracing, but it was dreadfully slow at it, and it had no AI acceleration.
 
Well, my 5700 XT is good for what it is (a decent bang-for-buck placeholder that's roughly equivalent to an RTX 2070), but it was hardly a revolutionary card, even upon release. It is still doing a decent job "placeholding" as I wait for more inventory from both NVIDIA and AMD (I don't really have the time to jump on every Discord announcement, or wait outside my local Micro Center).
 
My 1080Ti is still going way stronger than I ever expected it to.

It'll likely be lasting me till the RTX 4000 or AMD 7000 series at this point given the current GPU buying situation.

(Edit: I will personally find it really fucking funny if I end up with another AMD 7800 GPU...)
 
Everyone voting for the 8800s have not run into the BGA solder issue...

No, I didn't. I had three of them in 3-way SLI and ran them hard. I never had that issue. Nor did I experience it with the other 8000-series NVIDIA GPUs I had in other machines at the time.
 
The only remarkable thing about the 1080 Ti was its value at the high end; it was both the fastest non-Titan GPU and priced appropriately compared to the rest of the lineup. Beyond that, it wasn't a super remarkable GPU, as it didn't introduce any new features. It was just a good all-around GPU. Unfortunately, the thing that stops it from being the "GOAT" is that its successor, Turing, introduced hardware-accelerated ray tracing and AI-based rendering. The 1080 Ti could do ray tracing, but it was dreadfully slow at it, and it had no AI acceleration.
While I like your post, the useless shit storm of Turding is what made the 1080 Ti look sooo good in comparison. Useless features with the worst bump in performance ever!
But hey, let's give a slow clap to first-gen RT :ROFLMAO: :ROFLMAO: :ROFLMAO:
 
If we were just talking about a mid-range king that dropped and punched way above its weight, I actually would throw my weight behind the RX 580 over the 5700 XT. It lived for a lot longer (mostly because, honestly, AMD couldn't keep improving that aging GCN design) and did wonders at a number of different things. It definitely was a champion at compute, especially for its price point, and could play everything reasonably at 1080p as well as accelerate photo and video tasks well.

The 5700 XT is just a stopgap card at best, sad to say. It was a product made in order to keep up in the midrange while ("big") Navi was still being developed. And while it's the basis for Navi 2 and a very important stepping stone that kept AMD alive in the GPU space, it wasn't a knockout card.
 
If we were just talking about a mid-range king that dropped and punched way above its weight, I actually would throw my weight behind the RX 580 over the 5700 XT. It lived for a lot longer (mostly because, honestly, AMD couldn't keep improving that aging GCN design) and did wonders at a number of different things. It definitely was a champion at compute, especially for its price point, and could play everything reasonably at 1080p as well as accelerate photo and video tasks well.

The 5700 XT is just a stopgap card at best, sad to say. It was a product made in order to keep up in the midrange while ("big") Navi was still being developed. And while it's the basis for Navi 2 and a very important stepping stone that kept AMD alive in the GPU space, it wasn't a knockout card.
I agree. If it wasn't for Nvidia overpricing the market, there was no way an overpriced 5700 XT itself would even be mentionable. Nvidia was making so much $$ that they literally let AMD have their own cash cow.
 
While I like your post, the useless shit storm of Turding is what made the 1080 Ti look sooo good in comparison. Useless features with the worst bump in performance ever!
But hey, let's give a slow clap to first-gen RT :ROFLMAO: :ROFLMAO: :ROFLMAO:
To be honest, the performance bump from Pascal to Turing was pretty good. About 30-40% if you compare GP102 -> TU102. (much more if you factor in RT and DLSS)

We got a slightly bigger performance bump going from TU102 -> GA102, but power consumption went up considerably as well.

The thing that was terrible for Turing was its price. If Nvidia had kept the price the same as previous generations, Turing would have been a hit.
 
To be honest, the performance bump from Pascal to Turing was pretty good. About 30-40% if you compare GP102 -> TU102. (much more if you factor in RT and DLSS)

We got a slightly bigger performance bump going from TU102 -> GA102, but power consumption went up considerably as well.

The thing that was terrible for Turing was its price. If Nvidia had kept the price the same as previous generations, Turing would have been a hit.

By that same thinking, if AMD had dropped the 5700 XT in the $200 range, as rumors before release had suggested it would be, this would be a VERY different conversation.
 
It didn't fail to compete against high-end Nvidia cards; that's like saying the 1060 failed to compete at the high end. That's a misleading argument: it was never intended to compete against the high end, and its price point supports this.

That said, I do agree this isn't a 1080 Ti moment from AMD. It was a decent card with a price to match its performance, not a bad deal, but as mentioned, more noted for its mining ability (hashes/watt) than its gaming ability.

Don't kid yourself. AMD wasn't trying to make a midrange card with the 5700XT. It was shooting for the high end but the architecture's inefficiency reared its ugly head and it just wasn't able to compete. The idea is always to build an ultra-high end card and then scale it back for different price points as that's generally the best way to go from both a manufacturing and a technical standpoint. AMD also talked about scaling etc. which given its horrendous power consumption was never in the cards. Pun intended. There was never going to be a scaled up version of this. This was the high end, it just failed to measure up to NVIDIA's midrange at the time. This was also the end of the line for that architecture as the architecture behind the 6800/6800XT and 6900XT is almost entirely new. There is a good reason for that.
 
Don't kid yourself. AMD wasn't trying to make a midrange card with the 5700XT. It was shooting for the high end but the architecture's inefficiency reared its ugly head and it just wasn't able to compete. The idea is always to build an ultra-high end card and then scale it back for different price points as that's generally the best way to go from both a manufacturing and a technical standpoint. AMD also talked about scaling etc. which given its horrendous power consumption was never in the cards. Pun intended. There was never going to be a scaled up version of this. This was the high end, it just failed to measure up to NVIDIA's midrange at the time. This was also the end of the line for that architecture as the architecture behind the 6800/6800XT and 6900XT is almost entirely new. There is a good reason for that.
So they made a 40 CU part with a small die to compete at the high end? Yeah... no. Whether or not they tried to scale it up and it didn't work is not an argument about whether the 5700 XT was a good card or not. The 5700 XT was not meant to compete with a 2080; they may have hoped they could scale up/down or whatever, but those imaginary parts are not the 5700 XT. The question is about the 5700 XT, not some mythical part that was never made and that couldn't compete. It's more like they were focused on the CPU side and didn't want huge dies taking up space at TSMC with questionable (at the time) defect rates. The card that was actually released was the 5700 XT, and it competed well at what it was intended for, which is the discussion. Not other cards that were never produced from this architecture, but the actual 5700 XT that was produced.
My point still stands: the 5700 XT wasn't meant to compete at the high end (the 5700 XT itself, not the architecture). The argument isn't about how or why AMD didn't compete, it's about the 5700 XT and its position. It offered decent $/perf and that's about it. It aged pretty well for what it was but, in my opinion, is not as long-lived a card as others.
 