The 970 is still a capable card? Heresy

DarkSideA8 - Gawd - Joined: Apr 13, 2005 - Messages: 988
Flipping through GPU articles - I stumbled across this: https://www.google.com/amp/s/www.techspot.com/amp/review/1822-geforce-gtx-970-vs-radeon-r9-290/

A 2019 article comparing 970 performance 5 years later.

Considered within the limitations of playing on a 24-inch 1080p screen at 60Hz (as far as I know, the most common screen out there), it's still getting solidly playable frames. In other words, a player with a good CPU and RAM and a 970 can probably keep the visuals up and stay competitive in all but the most demanding 2020 games at 1080p.

Anyway... While most of us here are enthusiasts who like the bleeding edge of tech in our rigs, I'm kinda surprised by the 'long legs' of this very legacy card. Anyone still rocking one certainly got their money's worth.
 
Yeah I've still got one in the wife's PC. Can't really do 60 FPS in most stuff at 1440p, but still a decent little card.
 
I think this has been the biggest takeaway for me during this 3080 debacle. Unless you are determined to game at 4K, and have the screen, mobo, CPU, and RAM to support it - why buy one? **

I think people are so enthused by the potential of the card, and so blown away by the numbers, that they're losing perspective on the context.

From what I can tell, the greatest benefit of the 3000-series cards is making 4K gaming a realistic goal - where you are not sacrificing playable/competitive frames for high pixel counts. The real crux of this is that to really need 4K, you should also be running screens larger than 27" (where 1440p is probably perfect). The irony is that 32-inch monitors are not all that great at the moment, and anything larger is probably a TV.

So the 3080 (etc.) and the next-gen consoles are kicking open the door for RT and for large, high-resolution, high-refresh-rate monitors - but the display industry hasn't yet shown up with products that can really show off the tech.

Hopefully that changes soon - because if these forums are any indication, the demand is there (presuming people don't get suckered into buying a 24-inch 4K, which to me sounds like playing at 1080p on a 15-inch laptop!)
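If anyone wants the rough math behind that comparison: pixel density is just the diagonal pixel count divided by the diagonal inches. A quick sketch (the panel sizes below are illustrative picks, not specific monitors):

Code:
# Rough pixel-density (PPI) comparison behind the "24-inch 4K" remark.
# PPI = diagonal pixels / diagonal inches; sizes here are illustrative.
from math import hypot

panels = [
    ("24-inch 1080p",          1920, 1080, 24.0),
    ("15.6-inch 1080p laptop", 1920, 1080, 15.6),
    ("27-inch 1440p",          2560, 1440, 27.0),
    ("32-inch 4K",             3840, 2160, 32.0),
    ("24-inch 4K",             3840, 2160, 24.0),
]

for name, w, h, diag in panels:
    ppi = hypot(w, h) / diag  # diagonal pixel count over diagonal size
    print(f"{name:24s} {ppi:6.1f} PPI")

By that math a 24-inch 4K panel lands around 184 PPI - even denser than a 15.6-inch 1080p laptop screen at roughly 141 PPI - which is why it feels like wasted pixels to me at desktop viewing distances.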

** 1440p players will benefit for sure - but 1080p monitor owners should take a pass.
 
My roommate is still rocking a 970, and it still seems fine for what he's playing - mostly Warzone and LoL lately. I was gonna try and sell him my 580, but they seem pretty close so I didn't bother.
 
I doubt there is a game out there that a 970 can't run at some setting. It's had a good run. The 1060 6GB is in the same class - a card that will seemingly last forever and still works for basic, no-frills gaming.
 
The 970 is still a decent card. Its biggest weakness is VRAM. Not only does it only have 4GB, but only the first 3.5GB of that VRAM actually runs at full speed. The last 512MB runs considerably slower. There was even a class-action lawsuit about it.

At 1080p, VRAM is going to be much less of an issue than at a higher resolution, but just keep in mind that once you exceed your VRAM, performance will tank. You might keep a monitoring app open on a 2nd monitor or something to see just how much of your VRAM you are actually using in games. Try to not go over 3.5GB, perhaps adjusting some settings to compensate. That is what I do with my backup computer. My backup computer is still running 3x GTX680 in SLI. In any game with decent SLI support, that gives me more combined GPU power than a 980ti, but still limited to 2GB VRAM, so I have to be very careful with game settings to not exceed that.
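If you'd rather script it than keep a full monitoring overlay running, something as simple as polling nvidia-smi works too. A rough sketch (assumes an NVIDIA card and that nvidia-smi, which ships with the driver, is on your PATH):

Code:
# Minimal VRAM watcher: polls nvidia-smi every couple of seconds and flags
# when usage crosses the GTX 970's fast 3.5GB segment.
import subprocess
import time

FAST_VRAM_MB = 3584  # only the first 3.5GB of a 970 runs at full speed

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    used_mb = int(out.stdout.strip().splitlines()[0])  # first GPU only
    note = "over the fast 3.5GB segment!" if used_mb > FAST_VRAM_MB else "ok"
    print(f"VRAM used: {used_mb} MiB ({note})")
    time.sleep(2)

Whatever tool you use, the point is the same: notice when a game creeps past that fast segment and dial textures back before the stuttering starts.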
 
I agree. As I wrote, for the 24-inch 1080p 60Hz gamer... the 970 works. For the 1440p guy who wants better performance, the card is a liability, and don't even try 4K.

The thing that absolutely astounds me - as a guy who is looking hard for a 32-inch 4K IPS with a high refresh rate - is that the monitors that will show off Ampere's true ability are not even available. The January CES showed some promising panels... but they've not yet been seen in the wild. There's some indication that they might (might) be available Q1 2021, and that's about all that's kept me from losing my mind over the 3080 BS. I've been so spoiled in the past with every upgrade - where I researched what I wanted, ordered, and built my computer within a week - that I can't believe all the promise and hype can't be effectively realized within a reasonable timeframe.

Everything I'm reading is solidifying my opinions though: for a 24-inch monitor, Ampere (and likely Big Navi) is wasted. It's a good choice for a 27-inch 1440p or 4K, and the required choice for a 32-inch 4K.
 
I didn't catch which version of Hawaii they had for testing, as it got an update in 2015 with faster Samsung memory and better clocks than the early R9 290s. But with a 512-bit memory bus it was made for Eyefinity, not 1080p - I used my 290X for iRacing sims with 3 displays. The RX 5500 XT 8GB is the card you want for a budget 1080p build, for its eco-like power usage.
 
I just retired my Titan X (Maxwell) card in the girlfriend's gaming rig. It's been able to max out everything at 2560x1440 up until Ghost Recon Breakpoint. It was doing pretty badly in that game so I finally swapped in a GTX 1080 Ti.
 
Ha, I'm still on a 1060 3GB, which is probably similar to a 970 (if perhaps a bit better), and I run everything just fine at 1080p as long as I don't overwhelm the VRAM with high textures. Looking forward to upgrading in the spring with either AMD or Nvidia, but so far I'm not too lacking in power!
 
The GTX 970 will have two issues: its small VRAM and its driver optimization.

That said, it's fine for a 60 FPS average at 1080p medium settings, and it can even do 30 FPS at ultra settings in some newer games.
 
As soon as I got a Vive, the 970 was no longer sufficient for me, though it was still a decent card for regular 1080p gaming.

With the Vive Pro, the 1070 I have now is pushing it.
And the 3080 that I'm wanting to get will probably have a rough time with the HP Reverb G2 that I have on preorder.
 
The 970 should be fine for 1920 x 1080 gaming, even with a good amount of eye candy turned up.

Just as a reference, one of my PCs is still using that same old AMD FX-4100, in combination with a GTX 960 (not 970), and it handles that kind of gaming just fine. I suspect the 970 would be a good bit better.
 
It’s not a fair comparison to the 970, but my 5-year old 980TI is still performing so well that it’s proving impossible to replace at the moment.
 
The 970 should be fine for 1920 x 1080 gaming, even with a good amount of eye candy turned up.

Just as a reference, one of my PCs is still using that same old AMD FX-4100, in combination with a GTX 960 (not 970), and it handles that kind of gaming just fine. I suspect the 970 would be a good bit better.
The 960 isn't great but it's still perfectly usable. Some had 4GB as well, which helps today. I think some titles might push it a little harder than you'd like, but if you don't play the latest games, it doesn't matter at all.
 
I agree. As I wrote, for the 24-inch 1080p 60Hz gamer... the 970 works. For the 1440p guy who wants better performance, the card is a liability, and don't even try 4K.

The thing that absolutely astounds me - as a guy who is looking hard for a 32-inch 4K IPS with a high refresh rate - is that the monitors that will show off Ampere's true ability are not even available. The January CES showed some promising panels... but they've not yet been seen in the wild. There's some indication that they might (might) be available Q1 2021, and that's about all that's kept me from losing my mind over the 3080 BS. I've been so spoiled in the past with every upgrade - where I researched what I wanted, ordered, and built my computer within a week - that I can't believe all the promise and hype can't be effectively realized within a reasonable timeframe.

Everything I'm reading is solidifying my opinions though: for a 24-inch monitor, Ampere (and likely Big Navi) is wasted. It's a good choice for a 27-inch 1440p or 4K, and the required choice for a 32-inch 4K.

This is a very good point. If you don't have the right screen, the card doesn't matter as much. I was freaking out because I couldn't get one, but I'm on 120Hz 1440p - and the 2080 Ti I have does just ~fine~ for that outside of RT... which I don't use much.
 
Have a 970 as a backup and it cuts the mustard, but "just" - at 1440p it really shows its age.
 
It's not too shocking. My wife's on a 1080p monitor with a similar card in her machine. It pulls FPS in the 50s most of the time (with newish games) and she doesn't care about cranking the details all the way up or never having her FPS dip.
The upgrade bug is mainly for the current "holy grail" of 4K/60+ with details all the way up. If that isn't you, you can be happy with hardware from a while ago. Especially on the CPU front.
 
Flipping through GPU articles - I stumbled across this: https://www.google.com/amp/s/www.techspot.com/amp/review/1822-geforce-gtx-970-vs-radeon-r9-290/

A 2019 article comparing 970 performance 5 years later.

Considered within the limitations of playing on a 24-inch 1080p screen at 60Hz (as far as I know, the most common screen out there), it's still getting solidly playable frames. In other words, a player with a good CPU and RAM and a 970 can probably keep the visuals up and stay competitive in all but the most demanding 2020 games at 1080p.

Anyway... While most of us here are enthusiasts who like the bleeding edge of tech in our rigs, I'm kinda surprised by the 'long legs' of this very legacy card. Anyone still rocking one certainly got their money's worth.

Why are you surprised? The reality of the current games publishing world is that 90%+ of games have their core performance dictated by console capabilities. The PS4/Xbone came out at the end of 2013. The 980 came out in 2014. I would expect the current gen, with 8 cores and enough graphics ability to at least upscale to 4K with reasonable image quality, to shorten the legs of some older systems, at least if you want to play a NEW game. With everyone's Steam/EGS/etc. backlog, this pain could probably be delayed for some time.
 
I have a 970 and got bit by the upgrade bug once I tried some interpolation, deepfaking, and AI upscaling software and suddenly felt hamstrung by my GPU performance. Upgraded to the 2060 Super a week before the 3000 line was announced; played around with it for a few more days before getting a refund. Now I'm twiddling my thumbs wondering how much I really need the upgrade. For work I could definitely use a better CPU (Adobe video editing), but I don't do much in the way of GPU utilization otherwise, especially with the fuss of getting one. I'm also waiting for a ~30-inch HFR 4K monitor, as I've been rocking a 40" 4K TV for a few years and it's just a hair too big imo.
 
I would still be perfectly happy with my GTX 980 if I hadn't upgraded my 24" 1080p monitor to a 32" 1440p monitor last year. For 1080p, the 970/980 are perfectly fine even today. At 1440p though? Nope.
 
I have a 970 and got bit by the upgrade bug once I tried some interpolation, deepfaking, and AI upscaling software and suddenly felt hamstrung by my GPU performance. Upgraded to the 2060 Super a week before the 3000 line was announced; played around with it for a few more days before getting a refund. Now I'm twiddling my thumbs wondering how much I really need the upgrade. For work I could definitely use a better CPU (Adobe video editing), but I don't do much in the way of GPU utilization otherwise, especially with the fuss of getting one. I'm also waiting for a ~30-inch HFR 4K monitor, as I've been rocking a 40" 4K TV for a few years and it's just a hair too big imo.
What does the rest of your system look like?
 
Say what? Like 80 percent of my gaming in the last four years has been VR.
I won't quibble - some like VR, some like encouraging emerging tech, and some just like the new shiny.

VR does not work for me at the moment, because I'm one of those people who can see the spaces between the pixels and can't ignore it. Neither, with a screen that close to my eyes, can I stomach the tearing/lag that occasionally happens. Maybe an 8K fast-refresh rig would work for me - but nothing I've seen tells me that VR is ready for mainstream use.
 
The GTX 970 was one of the best purchases I have ever made in my life (and one of the greatest deals, after factoring in the money back from the partial refund and the class-action lawsuit over the segmented VRAM). I finally replaced it in 2019, but the main reason I replaced it was the VRAM amount, not the GPU horsepower. I felt the GPU was still pretty dang capable. Worked great for 60Hz 1920x1200 max-settings gaming for years. Now I'm doing 144Hz 1440p on a 1080 Ti, and I gave the 970 to my bro, who is still at 1200p 60Hz.
 
My EVGA 970 FTW was howling like a banshee trying to play Division 1 (at 85% res scale) at 1440p after I had to upgrade my monitor because my Dell 2209wa died. Other games too - it was managing, but screaming in pain almost literally with how loud the fans got. I replaced it with a 2070 Ultra Gaming in 2018 when they were released. Got 4 years out of the card, though. Was a champ.
 
Say what? Like 80 percent of my gaming in the last four years has been VR.

Same here. I mainly play VR games.

I won't quibble - some like VR, some like encouraging emerging tech, and some just like the new shiny.

VR does not work for me at the moment, because I'm one of those people who can see the spaces between the pixels and can't ignore it. Neither, with a screen that close to my eyes, can I stomach the tearing/lag that occasionally happens. Maybe an 8K fast-refresh rig would work for me - but nothing I've seen tells me that VR is ready for mainstream use.

VR is definitely going mainstream. The Valve Index, Quest 2 and Reverb G2 are leading the way. The Reverb G2 has no SDE (screen-door effect).

And if you only tried VR on your GTX 970, that would explain the tearing/lag that was happening.
 
Idk, I feel like VR lost its steam again. Alyx helped push it big time, but there hasn't been anything worthwhile since. Not to mention all the BS with Oculus and FB.
 