Old AMD GPU destroys Nvidia in Doom Eternal

A rather interesting read.
It's probably a better idea to choose AMD if one wants to keep a card for longer than 3 years.

https://www.techradar.com/amp/news/ancient-amd-gpus-destroy-nvidias-when-it-comes-to-doom-eternal
lol it's nice to know my stored-away backup GPU (7970/280X) from 7-8 years ago can still manage 60+ fps in a brand new game. I wonder how many of them were sold off for 30 bucks?
 
The thing is, AMD has for multiple generations built for the future rather than the present, and that has bitten them in the ass. GCN was built for DX12; the problem was/is that DX11 is predominantly what people are using in the market.
That's basically the same thing you're illustrating here. Hopefully with Navi they can keep that forward-looking attitude while still having current-gen raster performance. It's essentially how Nvidia goes about building their graphics cards.
 
Doesn't surprise me... I was testing an RX 570 4GB card in Doom Eternal at Ultra Nightmare settings (textures on Medium/High) at both 1080p and 1440p. During intense action scenes it was well over 55-60 fps at 1080p, and at 1440p it kept above 35-40. Pretty impressive for Ultra Nightmare settings on a card that can be found for $100.
 
So in reality, :D, AMD had the superior hardware for a number of generations... Actually, for those buying used, AMD is probably the best route to take in most cases. Also of note: GCN's long lifespan meant it kept getting driver updates for newer games, unlike the many different architectures Nvidia went through in the last decade. Used buyers -> look at AMD; new buyers -> ?
 
I knew something was up - the GTX Titan X (Maxwell) I've been using in my main box was regularly nudging the resolution scale down when scenes got busy or big at 1440p60, while the Vega 56 in the family PC was rock-solid at 60 fps with no scaling. I had no compunction about swapping cards between the systems for Eternal, and the experience is much nicer in general. That said, the Titan's still awesome at 1200p.

My initial suspicion is that it's not down to a failure to optimize drivers - Vulkan's low-level nature jibes with GCN functionality that isn't well exposed through older rendering APIs. Nvidia's cards have been designed to offer more performance with less work on the part of developers, and I'm less inclined to think Nvidia's ignoring Maxwell-era cards than I am to accept that this preposterous GTX-980-and-a-half with 12 GB of RAM is hitting a wall trying to deal with Doom Eternal at 1440p.
 
I still have my R9 280 just as a memento of days gone by... Maybe another game that's murder at Badass settings would be Borderlands 3 - the RX 5700 feels like an RX 570 @ 1080p, but smooth.

 
No surprise here. Anticipated something like this, since the game is exclusively Vulkan and well optimized for it. We already knew that AMD cards excelled in that API. What did Kyle's review of the Gigabyte RX 480 show - roughly 15% improvement in Doom 2016 bench scores just by switching to Vulkan? And now I own the exact card used in that review. From the looks of things it may serve adequately for this title at high settings at 1080p. May end up buying DE after all.
 
I recently tossed mine; seeing as I have only one AGP computer left (and it hasn't been powered on in years), it felt like it was just taking up space.
 
With Doom 2016 I remember the Kepler cards performing poorly with Vulkan but doing alright with OpenGL. Makes me wonder if they would do better in Doom Eternal if it had an OpenGL option.
 
With Doom 2016 I remember the Kepler cards performing poorly with Vulkan but doing alright with OpenGL. Makes me wonder if they would do better in Doom Eternal if it had an OpenGL option.

I seem to remember the situation with Kepler improving under Vulkan over time, at least in games other than Doom Eternal. An OpenGL renderer would not help with Eternal - Vulkan allows a huge number of draw calls to be avoided, or batched more efficiently than OpenGL could manage, which is a big part of why Eternal is more visually complex than Doom 2016. OpenGL's inability to distribute rendering work across multiple CPU cores, combined with its struggle with the greater number of draw calls, would make the problem both crippling and single-thread limited. You can get an idea of how this would play out with Wolfenstein II: The New Colossus, where the console command r_renderapi 0 runs the game under OpenGL. It is MUCH slower due to the issues above, and not officially supported besides.
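To make the draw-call point concrete, here's a toy back-of-the-envelope model - plain Python, nothing to do with either game's actual code, and the call counts and per-call costs are made-up assumptions - showing how per-call CPU overhead on a single submission thread caps frame rate, versus cheaper calls recorded across several cores:

```python
# Toy model (illustrative numbers only, not measurements of any real driver).

def frame_cpu_ms(draw_calls, us_per_call, threads=1):
    """CPU time to submit one frame, assuming calls split evenly across threads."""
    calls_per_thread = draw_calls / threads
    return calls_per_thread * us_per_call / 1000.0  # microseconds -> milliseconds

DRAWS = 20_000          # a busy modern frame (assumed figure)
GL_US_PER_CALL = 5.0    # GL driver does validation/state work per call (assumed)
VK_US_PER_CALL = 1.0    # Vulkan command recording is far thinner (assumed)

gl_ms = frame_cpu_ms(DRAWS, GL_US_PER_CALL, threads=1)  # GL: one submission thread
vk_ms = frame_cpu_ms(DRAWS, VK_US_PER_CALL, threads=4)  # Vulkan: record on 4 cores

print(f"OpenGL-style: {gl_ms:.0f} ms/frame of CPU work, caps near {1000/gl_ms:.0f} fps")
print(f"Vulkan-style: {vk_ms:.0f} ms/frame of CPU work, caps near {1000/vk_ms:.0f} fps")
```

Even with generous numbers, the single-threaded path is limited by whichever one core submits the calls, which is exactly the "crippling and single-thread limited" situation described above.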
 
So AMD wins the battle of 6-8 year old GPUs on one game. Hurrah!
No, more than one game. Those who pass their AMD cards down to their kids or to a second/third computer are still enjoying smooth, high-quality gameplay. Nvidia, well, not so much. :D
 
So AMD wins the battle of 6-8 year old GPUs on one game. Hurrah!

It's actually a consistent trend across a lot of titles. Nvidia just seems to stop optimizing for previous-generation cards. Mind you, AMD has been using derivatives of the same GCN architecture since Tahiti, so obviously older generations benefit from newer drivers, but there is no denying that running modern titles with decent visuals at playable FPS on cards that are several years old represents tremendous value for a buyer. Not everyone is a hardware enthusiast with the money, or the desire to commit money, to a fresh upgrade every year or two. The legs that cards like the 7970 have had make me wonder just how long an RX 5700 will deliver playable framerates going forward. If I'm going to put $500 down on a card, and one will get me 6 years of optimum play while the other gets me 3, it's not hard to see which gives the better value. I have no idea how anyone can dismiss how big a deal this has been for AMD owners.
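The value argument is simple division; here's the arithmetic spelled out with the post's own hypothetical numbers ($500 price, 6- vs 3-year useful life - assumptions from the post, not benchmarks):

```python
# Back-of-the-envelope cost-per-year comparison using the post's hypothetical
# numbers. Prices and lifespans are assumptions, not measured data.
PRICE = 500  # dollars, same sticker price for both hypothetical cards

def cost_per_year(price, useful_years):
    return price / useful_years

long_lived = cost_per_year(PRICE, 6)   # card that stays playable for 6 years
short_lived = cost_per_year(PRICE, 3)  # card that falls off after 3 years

print(f"6-year card: ${long_lived:.0f} per year of optimum play")
print(f"3-year card: ${short_lived:.0f} per year of optimum play")
```

At the same purchase price, doubling the useful life halves the effective yearly cost, which is the whole "fine wine" value case in one line of arithmetic.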
 
No, more than one game. Those who pass their AMD cards down to their kids or to a second/third computer are still enjoying smooth, high-quality gameplay. Nvidia, well, not so much. :D

It's amazing that I still see the RX 580 recommended almost universally on hardware forums to people asking for an upgrade that won't break the bank but will still deliver enjoyable graphics and frame rates. That card launched in April 2017, it's still performing well in new titles, and you'll pay very little for it.
 
It's actually a consistent trend across a lot of titles. Nvidia just seems to stop optimizing for previous-generation cards. Mind you, AMD has been using derivatives of the same GCN architecture since Tahiti, so obviously older generations benefit from newer drivers, but there is no denying that running modern titles with decent visuals at playable FPS on cards that are several years old represents tremendous value for a buyer. Not everyone is a hardware enthusiast with the money, or the desire to commit money, to a fresh upgrade every year or two. The legs that cards like the 7970 have had make me wonder just how long an RX 5700 will deliver playable framerates going forward. If I'm going to put $500 down on a card, and one will get me 6 years of optimum play while the other gets me 3, it's not hard to see which gives the better value. I have no idea how anyone can dismiss how big a deal this has been for AMD owners.
Exactly, I've seen this over and over again with my own machines - hell, my Radeon 290 is still usable at 1080p. Sold off my two 1070s because they had no long-term value; kept my Vegas because, well, they do. Now even the 1080 Ti is scraping 2060 Super levels of performance - very pitiful.
 
I've always wondered if Nvidia just isn't optimizing for their older generation video cards or if they weren't designed well for newer APIs. Could be a little bit of both. The Turing cards seem to do well with Vulkan and have hardware RT so maybe they'll age better when compared to previous generation Nvidia cards and RDNA1 cards. Hard to tell at this point.
 
I have to believe it has very little to do with a lack of ongoing optimization by Nvidia, and far more to do with the fact that their cards are very near peak utilization when first purchased. Sure, their driver team undoubtedly moves on, but if the hardware was being utilized well to start with, why keep trying to wring blood from a stone?

AMD has always had strong hardware, and their driver teams have traditionally struggled to get all that power to the rear wheels out of the gate. It either takes a long time ("fine wine") or happens on the game developer side (DX12 / Vulkan).

I would personally balk at describing the 1080 Ti's current performance as "very pitiful" simply because a recent 2060 Super can fight in its space in some titles three years later.
 
Exactly, I've seen this over and over again with my own machines - hell, my Radeon 290 is still usable at 1080p. Sold off my two 1070s because they had no long-term value; kept my Vegas because, well, they do. Now even the 1080 Ti is scraping 2060 Super levels of performance - very pitiful.

Part of me regrets upgrading to Nvidia this time around after seeing how fast performance drops in new titles every generation. I got a reasonably good price on a G-Sync monitor and, at least at the time I bought, the GTX 1070 Ti offered the best price/performance to upgrade from my R9 280X.

I recall that when the RTX 2070 came out, dollar for dollar, performance was roughly equivalent to the GTX 1070 Ti, so I didn't worry too much about it. Now I'm checking benchmarks and there is a rather large delta between the two cards. It's crazy. It makes me feel, as a consumer, that I'm not getting a lot of long-term value for the money I'm spending, compared to the 6 years I got out of my R9 280X - but I have a closed-source G-Sync monitor, so I need to own Nvidia to get the benefit of it. It's pretty frustrating. If I could do it again, I'd probably get a FreeSync monitor and an RX 5700, mod the BIOS, and call it a day for a few years. As much as I like new hardware for a new project, I also like saving money and not feeling like I have to upgrade on a regular basis. Granted, this might change when AMD does a total architectural redesign, but there is no question that AMD owners have gotten plenty of long-term value out of their cards.
 
I would personally balk at describing the 1080 Ti's current performance as "very pitiful" simply because a recent 2060 Super can fight in its space in some titles three years later.

"Very pitiful" might be an extreme description, but it is kind of insane to see where the GTX 1080 Ti vs RTX 2060 performance delta was at launch compared to now.
 
No, more than one game. Those who pass their AMD cards down to their kids or to a second/third computer are still enjoying smooth, high-quality gameplay. Nvidia, well, not so much. :D

My GTX 970 is aging better than my old R9 280X - the difference is huge in this very same title, when in the past the difference between the GTX 970 and R9 280X wasn't that big...
 
My GTX 970 is aging better than my old R9 280X - the difference is huge in this very same title, when in the past the difference between the GTX 970 and R9 280X wasn't that big...

It also has 1 GB more VRAM... errr, sorry, 500 MB more VRAM ;).

Joking aside, the 970 wasn't the primary competitor to the R9 280X; that would have been the 770. The 970 is a newer card, so I would expect it to be better, which is kind of the point of this discussion.
 
When Vulkan came out for Doom 2016, the Pascal cards were performing the same as or worse than under OpenGL while AMD took a big leap. My Radeon Nano was smoking my 1070, whereas before, under OpenGL, the 1070 was much better. Nvidia then optimized Doom 2016 for Vulkan and turned that right around. In many DX12 and some Vulkan titles the 1080/1070 performed better than the equivalent Vegas. Looking at newer titles now, the Vegas are really improving while the Pascal cards look anemic, and that is in the newest titles. It is just not in Nvidia's interest to optimize newer games for older hardware when it's significantly different. AMD's Navi and Navi 2 GPUs and drivers will probably go the same way once AMD is only producing Navi GPUs. This is just an observation on my part - anecdotal evidence.
 
When Vulkan came out for Doom 2016, the Pascal cards were performing the same as or worse than under OpenGL while AMD took a big leap. My Radeon Nano was smoking my 1070, whereas before, under OpenGL, the 1070 was much better. Nvidia then optimized Doom 2016 for Vulkan and turned that right around. In many DX12 and some Vulkan titles the 1080/1070 performed better than the equivalent Vegas. Looking at newer titles now, the Vegas are really improving while the Pascal cards look anemic, and that is in the newest titles. It is just not in Nvidia's interest to optimize newer games for older hardware when it's significantly different. AMD's Navi and Navi 2 GPUs and drivers will probably go the same way once AMD is only producing Navi GPUs. This is just an observation on my part - anecdotal evidence.
Yeah, unfortunately I suspect the same thing driver-wise... just a matter of time before GCN cards stop getting the same driver attention.
 
A rather interesting read.
It's probably a better idea to choose AMD if one wants to keep a card for longer than 3 years.

https://www.techradar.com/amp/news/ancient-amd-gpus-destroy-nvidias-when-it-comes-to-doom-eternal
This was a unique case involving an inferior architecture, Kepler, which has not kept up with the times. Using it as an all-encompassing example for your argument is really misleading. Try comparing other Nvidia vs AMD cards, e.g. 980 Ti vs Fury X or GTX 1080 vs Vega 64 (performance at release vs now).
 