Upgrading 970, help appreciated.

Joined
Aug 31, 2012
Messages
28
Looking for best value for the money. Right now I am considering:

New 1660TI
Used 1070Ti
Used 1080

Budget ~$300

What do you guys think would be the best?
 

kirbyrj

[H]ard as it Gets
Joined
Feb 1, 2005
Messages
25,216
Best value is a used 1080 if you can buy it right (harder to do recently). Another alternative is a 1660 Ti from EVGA B-stock, which is around $230 shipped with a 12-month warranty. A 1070 Ti is around $230-260 and is 95% of a 1080.

The best bang for the buck though at $300 is a vanilla Radeon 5700 flashed to the XT.
 

Jandor

Gawd
Joined
Dec 30, 2018
Messages
520
I wouldn't buy a used graphics card. Gaming cards usually come with the GPU, and most of the time the VRAM, already overclocked by the manufacturer, so they're running near their limit out of the box. On top of that, many people push them past the safe limit and then sell them once they start showing stability problems. If you personally know the seller and you're confident in them, buying used is fine. The 1660 Ti is quite close in performance, and brand new is better. Whatever you buy now, you'll need to buy again in about 6 months to keep up with what's coming (which will be a huge improvement) and will become the new standard in gaming.
 

defaultluser

[H]ardForum Junkie
Joined
Jan 14, 2006
Messages
12,952
If you insist on staying with Nvidia, the value of the 1660 Super is hard to beat at $230 (it's basically a 1660 Ti).

[Chart: relative performance at 3840x2160]


https://www.amazon.com/ASUS-Overclocked-Dual-Fan-Graphics-TUF-GTX1660-O6G-Gaming/dp/B081SPGMBD?th=1

Your 970 is slightly slower than the RX 570, so the 1660 Super would be about 75% faster.

The RX 5700 is hard to beat, but it means dropping $300 on a new card.

https://www.microcenter.com/product/609054/xfx-radeon-rx-5700-single-fan-8gb-gddr6-pcie-video-card

I would avoid Pascal, as new games are starting to be tuned for Turing/Navi. The performance gap will only grow as more demanding games are released.
 

Mode13

Gawd
Joined
Jun 11, 2018
Messages
746
My vote is to massively cheap out: buy a used 980 Ti for roughly $130-140 via the forums, overclock it to 1400 MHz or even 1500 MHz depending on the model, enjoy performance comparable to a 1070 Ti, then upgrade again to a 1080 Ti or better later this year or next year when Big Navi and RTX 3000 are around.


I'm biased, of course, since that's how I played it... I don't think 1080s are going to be worth the $300 or so people are asking for much longer once sales start hitting the RX 5700, and then RTX 3000 shows its face... Probably won't lose too much on the resale of the already-cheap 980 Ti when it's time to swap in the upgrade later.

[YMMV on power consumption, and the 6 GB frame buffer is still totally adequate for 1080p/1440p in my experience...]
 

Westwood Arrakis

[H]ard|Gawd
Joined
Nov 17, 2012
Messages
1,891
I went from a 970 to a 1060 6GB and it's been flawless. Paid like $150 for it from someone here.
 

Ricky T

Limp Gawd
Joined
Nov 7, 2019
Messages
228
I went from a 970 to a 1060 6GB and it's been flawless. Paid like $150 for it from someone here.
That is not even a decent upgrade, as the performance difference is under 20%. And the 970 was a better overclocker, so comparing overclocked to overclocked it's only a little over a 10% difference. The only real advantage, of course, is the extra VRAM.
 

Westwood Arrakis

[H]ard|Gawd
Joined
Nov 17, 2012
Messages
1,891
That is not even a decent upgrade, as the performance difference is under 20%. And the 970 was a better overclocker, so comparing overclocked to overclocked it's only a little over a 10% difference. The only real advantage, of course, is the extra VRAM.
Hrm. Then perhaps I'm mistaken. I thought it was a 970...
 

cyclone3d

[H]ardForum Junkie
Joined
Aug 16, 2004
Messages
13,256
Finance a 2080ti for 4 months ($300/month). This is the way.
LOL

What resolution is the OP going to be playing at? If it's only 1080p, or even 1440p, then a 2080 Ti is kinda overkill.

I upgraded to a 2080 from a 1080 and only game at 1080p at the moment. (Bought a 1080Ti that was still under warranty and it ended up being faulty and EVGA sent me a 2080 as a replacement).

Everything I have run on it thus far maintains 75fps (max my freesync monitor goes) with no more than about 25% GPU usage with settings maxed out. I watch the GPU usage on my second screen. I am overclocking the 2080 with +100 for the GPU and +700 for the RAM which is what a lot of the different reviews said was the normal overclock they were seeing.
I haven't tried to push it any further.

Games I have been playing:
Destiny 2
The Division 2
Warframe

Granted, there are more demanding games out there, but I still find it hilarious how low the GPU usage is.
 

kdh

Gawd
Joined
Mar 16, 2005
Messages
799
I went from a 970 to a 1660 Ti. It was a monster upgrade for me. My 970 was a great card and lived in a bunch of machines, and I definitely got my money's worth out of it. I feel the 1660 Ti will be in the same boat and will easily last as long as my 970 lasted me. I use it more than my 1080 and 1080 Ti.
 

DejaWiz

Oracle of Unfortunate Truths
Joined
Apr 15, 2005
Messages
19,521
Budget of $300 for nVidia...go with a 1070Ti, 1080, or 2060 for used; 1660Ti/Super for new.
 

Ricky T

Limp Gawd
Joined
Nov 7, 2019
Messages
228
LOL

What resolution is the OP going to be playing at? If it's only 1080p, or even 1440p, then a 2080 Ti is kinda overkill.

I upgraded to a 2080 from a 1080 and only game at 1080p at the moment. (Bought a 1080Ti that was still under warranty and it ended up being faulty and EVGA sent me a 2080 as a replacement).

Everything I have run on it thus far maintains 75fps (max my freesync monitor goes) with no more than about 25% GPU usage with settings maxed out. I watch the GPU usage on my second screen. I am overclocking the 2080 with +100 for the GPU and +700 for the RAM which is what a lot of the different reviews said was the normal overclock they were seeing.
I haven't tried to push it any further.

Games I have been playing:
Destiny 2
The Division 2
Warframe

Granted, there are more demanding games out there, but I still find it hilarious how low the GPU usage is.
There is zero chance you are only at 25% gpu usage on max settings 75 fps at 1080p in a game like Division 2.
 

cyclone3d

[H]ardForum Junkie
Joined
Aug 16, 2004
Messages
13,256
There is zero chance you are only at 25% gpu usage on max settings 75 fps at 1080p in a game like Division 2.
I'll test again tonight, but I'm pretty sure that's what I was seeing according to the load monitoring (not power monitoring) in EVGA X1.
 

Ricky T

Limp Gawd
Joined
Nov 7, 2019
Messages
228
I'll test again tonight, but I'm pretty sure that's what I was seeing according to the load monitoring (not power monitoring) in EVGA X1.
If you are only seeing 25% GPU usage, then it's absolutely being reported wrong. That would mean a 2080 Super is capable of 300 FPS in Division 2, which is not even remotely possible. Plus, when GPU usage is that low, the clock speed is usually below base clocks, which would mean even more headroom, implying the GPU was really only at about 1/6 of its actual capability, which again is simply not possible.
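
To make the back-of-the-envelope math explicit, here's the scaling that claim implies, as a rough sketch using the numbers from the posts above (the clock fraction is just an illustrative assumption to reproduce the "1/6" figure; real frame rates don't scale perfectly linearly with reported usage):

```python
# Rough sanity check: if the GPU were truly the bottleneck, frame rate should
# scale roughly with the fraction of the GPU actually being used.
observed_fps = 75        # FreeSync cap reported in the post above
reported_usage = 0.25    # ~25% GPU usage claimed

implied_fps_at_full_load = observed_fps / reported_usage
print(implied_fps_at_full_load)  # 300.0 -> what a fully loaded 2080 would have to deliver

# Folding in clocks sitting below base (2/3 of boost is an illustrative assumption):
clock_fraction = 2 / 3
fraction_of_capability_used = reported_usage * clock_fraction
print(round(1 / fraction_of_capability_used, 1))  # 6.0 -> "about 1/6 of its capability"
```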
 

cyclone3d

[H]ardForum Junkie
Joined
Aug 16, 2004
Messages
13,256
If you are only seeing 25% GPU usage, then it's absolutely being reported wrong. That would mean a 2080 Super is capable of 300 FPS in Division 2, which is not even remotely possible. Plus, when GPU usage is that low, the clock speed is usually below base clocks, which would mean even more headroom, implying the GPU was really only at about 1/6 of its actual capability, which again is simply not possible.
So testing again, if I am running around and even in fights, the GPU usage that Windows is showing is less than 20%. If I zoom in with my scope, it jumps up to 28-50% usage for a couple seconds and then settles back down to less than 10% when not in a battle. Around 10% when in a battle.

Going to test with vsync disabled to see what happens.

Maybe Windows is reporting it wrong.

Edit... Well, turning vsync off jumped the fps to around 115-200 depending on where I am looking (outside and in fights).

I am completely CPU limited when doing that though as my CPU usage jumps up from around 45%+ to 85-100%.

GPU usage is being reported at around 18% when doing this. Turbo settles at 2010 for the GPU clocks with it warming up to a whopping 67C. Both fans bump up to 85% speed per my custom fan curve... maybe I should raise them to 100%. Maybe it would boost a bit more.. but I don't play with vsync disabled so what is the point?

Edit 2: With MSI Afterburner, the GPU usage being reported is way higher, up to around 97%. If that is really the case, then what in the world is the Windows 10 GPU monitor reporting?

Or is MSI Afterburner reporting the percentage of the Power Limit % setting?
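
For anyone who wants to cross-check what Task Manager and Afterburner are each showing, here's a minimal sketch that polls the driver directly through NVML. It assumes an Nvidia card with recent drivers and the pynvml (nvidia-ml-py) Python package installed; nothing here is specific to the 2080.

```python
# Minimal sketch: read GPU utilization and core clock straight from the Nvidia
# driver via NVML, independent of Task Manager or Afterburner overlays.
import time
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetUtilizationRates, nvmlDeviceGetClockInfo,
                    NVML_CLOCK_GRAPHICS)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    for _ in range(10):  # sample once a second for ten seconds
        util = nvmlDeviceGetUtilizationRates(handle)  # .gpu / .memory are percentages
        core_mhz = nvmlDeviceGetClockInfo(handle, NVML_CLOCK_GRAPHICS)
        print(f"GPU {util.gpu}% | memory controller {util.memory}% | core {core_mhz} MHz")
        time.sleep(1)
finally:
    nvmlShutdown()
```

Run it on the second screen while the game is up; if these numbers track Afterburner rather than Task Manager, the Windows graph is probably showing a per-engine view (3D, Copy, Video, etc.) rather than the overall utilization NVML reports.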
 

Ricky T

Limp Gawd
Joined
Nov 7, 2019
Messages
228
So testing again, if I am running around and even in fights, the GPU usage that Windows is showing is less than 20%. If I zoom in with my scope, it jumps up to 28-50% usage for a couple seconds and then settles back down to less than 10% when not in a battle. Around 10% when in a battle.

Going to test with vsync disabled to see what happens.

Maybe Windows is reporting it wrong.

Edit... Well, turning vsync off jumped the fps to around 115-200 depending on where I am looking (outside and in fights).

I am completely CPU limited when doing that though as my CPU usage jumps up from around 45%+ to 85-100%.

GPU usage is being reported at around 18% when doing this. Turbo settles at 2010 for the GPU clocks with it warming up to a whopping 67C. Both fans bump up to 85% speed per my custom fan curve... maybe I should raise them to 100%. Maybe it would boost a bit more.. but I don't play with vsync disabled so what is the point?

Edit 2: With MSI Afterburner, the GPU usage being reported is way higher, up to around 97%. If that is really the case, then what in the world is the Windows 10 GPU monitor reporting?

Or is MSI Afterburner reporting the percentage of the Power Limit % setting?
Windows' GPU usage reporting is all jacked up. In Afterburner I am seeing around 60 to 70% average GPU usage at 1080p maxed out at 75 FPS with my 2080 Super.
 

AIM9x

[H]Lite
Joined
Feb 8, 2011
Messages
72
Radeon 5700 non-XTs occasionally dip down to $300. I have a 970 as well and just put in my order last week at $300. It was backordered, but I should be seeing it fairly soon.
 

DooKey

[H]ardForum Junkie
Joined
Apr 25, 2001
Messages
8,286
Financing a GPU. notsureifserious.jpeg
I finance my GPU purchases all the time. Can't beat PayPal six month no interest. Why should I pay out all the money at once? I love using other people's free money! I always use free money if it's offered.
 

Furious_Styles

[H]ard|Gawd
Joined
Jan 16, 2013
Messages
1,565
I finance my GPU purchases all the time. Can't beat PayPal six month no interest. Why should I pay out all the money at once? I love using other people's free money! I always use free money if it's offered.
Yeah, I got the Newegg store card and they offer 12 months no interest on stuff. If you are buying an expensive card, it's sorta like, why not?
 

Borgschulze

2[H]4U
Joined
Feb 28, 2005
Messages
3,575
I went from a 4690K at 4.5 GHz and a 970 to a 1660 Super. Very good upgrade, very happy.

That GPU jump was massive; going from the 4690K at 4.5 GHz to a Ryzen 5 3600X made very little difference in gaming.
 

kdh

Gawd
Joined
Mar 16, 2005
Messages
799
Thanks for all the recommendations everyone. I was able to pick up a very lightly used 1080 for $250, hopefully I notice a big upgrade :D
That's a hell of an upgrade. I picked up my 1080 when I got my Vive and it did VR things really well for about 2 years. From a 970 to a 1080? That's a sweet jump. You'll be very happy with that for at least another 2 or 3 years.
 