4770k and I just bought a new GPU

david_ · 2[H]4U · Joined: Jan 30, 2012 · Messages: 2,506
I've had a 4770K (2013) and a GTX 970 (2015) for a bit. I don't game much anymore, but I do some 3D modeling/printing and have had the upgrade itch lately, but couldn't justify a new CPU/mobo/OS/PSU etc etc etc.

So instead I'll be replacing my old B-stock GTX 970 with a B-stock RTX 3060. Is this a terrible idea for my workstation / occasional gaming on my 4K monitor and VR once in a while?

I'm just hoping my old power supply is up to the task.
 
I've had a 4770K (2013) and a GTX 970 (2015) for a bit. I don't game much anymore, but I do some 3D modeling/printing and have had the upgrade itch lately, but couldn't justify a new CPU/mobo/OS/PSU etc etc etc.

So instead I'll be replacing my old B-stock GTX 970 with a B-stock RTX 3060. Is this a terrible idea for my workstation / occasional gaming on my 4K monitor and VR once in a while?

I'm just hoping my old power supply is up to the task.
If your PSU is in good shape then it should be fine. Doesn't sound like you are pushing it too hard.

4K is the CPU equalizer, generally. You should see a decent increase in performance upgrading to the 3060. Lots more VRAM than you've had, so you can turn up the extras. If you decide to upgrade the CPU later, the 3060 will carry over nicely.
 
Sounds a lot like my old system, only that one had a GTX 980.

An RTX 3060 12 GB is about as far as you can go without being seriously CPU-bottlenecked, so that would make the most sense if you don't plan on changing platforms just yet.

For serious 4K and VR gaming, your wallet's gonna hate you - I moved to a 12700K/RTX 4080 build recently, and it really strikes me as more of a minimum requirement for those things, particularly if your idea of VR gaming is stuff like DCS and No Man's Sky that just isn't very well optimized to begin with.

But that RTX 3060 might do well enough to tide you over a bit, enough to save up for a modern CPU, motherboard, and matching DDR4 or DDR5. Running at 4K native may very well call for DLSS Performance in recent titles, but if push comes to shove and your monitor can handle integer scaling properly, just run 1080p if the game's too demanding.
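On the integer-scaling point: 1080p happens to divide evenly into 4K on both axes, which is why it stays sharp instead of turning into a blurry upscale. A quick sanity check (just the arithmetic, nothing monitor-specific):

```python
# 1080p on a 4K panel: both axes scale by an exact integer factor,
# so each rendered pixel maps to a clean 2x2 block of physical pixels.
native = (3840, 2160)    # 4K UHD panel
render = (1920, 1080)    # 1080p fallback resolution

factors = (native[0] / render[0], native[1] / render[1])
is_integer_scale = all(f.is_integer() for f in factors)
print(factors, is_integer_scale)  # (2.0, 2.0) True
```

Contrast that with 1440p on a 4K panel, where the factor is 1.5x and the scaler has to interpolate.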
 
A 3060 is a nothingburger, and would be around a 1080 performance-wise, which is perfectly appropriate to pair with a 4770K
My trusty old 4770K @ 4.6GHz w/ 1080Ti is still working perfectly fine as a secondary rig (upgraded to a 7900X / 3060 Ti last Nov)
 
I see nothing wrong with that GPU for your system. As others have said, it's not the greatest 4K card, but it will be a nice upgrade over the 970, and it doesn't really make sense to go much higher than a 3060 unless you're going to upgrade the rest of the system too.

That power supply should have enough wattage based on the rating, but the fact that this PSU tier list seems to have it listed as a model to avoid is a little concerning; at least it's not on their "replace immediately" tier. I'm also not sure how accurate the list is on the lower tiers, but the higher-end tiers seemed to match reviews when I was shopping for a new one last week.
 
That CPU will be a bottleneck in gaming. A 4770K has about the same IPC as a Ryzen 2700. Look at benchmarks with a 2700 and you will see massive performance loss at 1080p.
 
A 3060 is a nothingburger, and would be around a 1080 performance-wise, which is perfectly appropriate to pair with a 4770K
My trusty old 4770K @ 4.6GHz w/ 1080Ti is still working perfectly fine as a secondary rig (upgraded to a 7900X / 3060 Ti last Nov)
A 4770K is not really appropriate at all for modern games unless you think super low minimums are fun. Even 4 years ago my 4770K was shitting itself in some games, dropping below 60 fps. A 4770K will not meet recommended CPU requirements in modern games, and in some cases not even the minimum.
 
That CPU will be a bottleneck in gaming. A 4770K has about the same IPC as a Ryzen 2700. Look at benchmarks with a 2700 and you will see massive performance loss at 1080p.
Even worse: according to PassMark, the 2700X has 12% higher single-thread performance than the 4770K and murders it in multithread, being 2.5x faster.
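The relative figures above can be sanity-checked with a quick script. Note the scores here are illustrative ballpark numbers I picked to match the quoted percentages, not exact values from PassMark's database:

```python
# Illustrative PassMark-style scores only (rough ballpark, not real data),
# chosen to reproduce the relative figures quoted above.
scores = {
    "i7-4770K":      {"single": 2200, "multi": 7000},
    "Ryzen 7 2700X": {"single": 2460, "multi": 17500},
}

# Single-thread: ratio minus one gives the percentage advantage.
st_gain = scores["Ryzen 7 2700X"]["single"] / scores["i7-4770K"]["single"] - 1
# Multi-thread: a plain ratio is the "X times faster" figure.
mt_ratio = scores["Ryzen 7 2700X"]["multi"] / scores["i7-4770K"]["multi"]

print(f"single-thread: +{st_gain:.0%}, multi-thread: {mt_ratio:.1f}x")
# single-thread: +12%, multi-thread: 2.5x
```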
 
A 4770K is not really appropriate at all for modern games unless you think super low minimums are fun. Even 4 years ago my 4770K was shitting itself in some games, dropping below 60 fps. A 4770K will not meet recommended CPU requirements in modern games, and in some cases not even the minimum.
Whether it's appropriate really depends on their expectations. It would be a bad setup for playing the newest, most demanding games, but it would allow them to play quite a few more games than their 970 can handle, especially at 4K.
 
I've had a 4770K (2013) and a GTX 970 (2015) for a bit. I don't game much anymore, but I do some 3D modeling/printing and have had the upgrade itch lately, but couldn't justify a new CPU/mobo/OS/PSU etc etc etc.

So instead I'll be replacing my old B-stock GTX 970 with a B-stock RTX 3060. Is this a terrible idea for my workstation / occasional gaming on my 4K monitor and VR once in a while?

I'm just hoping my old power supply is up to the task.
You may want to turn down the resolution of the monitor while gaming, for performance's sake
 
Whether it's appropriate really depends on their expectations. It would be a bad setup for playing the newest, most demanding games, but it would allow them to play quite a few more games than their 970 can handle, especially at 4K.
No one is making the claim that a 4770K is the equivalent of Rocket Lake or Zen 4.
However, it is also not exactly like trying to run Crysis on a Z80 (maybe a 68060?)

Definitely agree that something like a 5600X would be more appropriate, but beyond the money, it is also the time/trouble of tearing apart the rig and rebuilding it. Young(er) me would jump at the chance to fiddle with hardware, but old(er) me just says "meh, too lazy"
 
A 3060 is a nothingburger, and would be around a 1080 performance-wise, which is perfectly appropriate to pair with a 4770K
My trusty old 4770K @ 4.6GHz w/ 1080Ti is still working perfectly fine as a secondary rig (upgraded to a 7900X / 3060 Ti last Nov)
A 3060 is at a 1080Ti performance level. The 2060 is comparable to the 1080.
I would not call a 3060 "perfectly appropriate" to pair with a 4770k when a 2060 is a bottleneck for it.



As for the OP: since he isn't gaming much at all, he will probably never notice how much performance he's leaving on the table with that 4770K, especially at 4K.
 
A 3060 is at a 1080Ti performance level. The 2060 is comparable to the 1080.
I would not call a 3060 "perfectly appropriate" to pair with a 4770k when a 2060 is a bottleneck for it.



As for the OP: since he isn't gaming much at all, he will probably never notice how much performance he's leaving on the table with that 4770K, especially at 4K.

I'll probably upgrade the CPU again once the DDR5 platform matures. It just seems like an awkward time to build right now, straddling older vs. newer tech with something like Alder Lake.
 
I'll probably upgrade the CPU again once the DDR5 platform matures. It just seems like an awkward time to build right now, straddling older vs. newer tech with something like Alder Lake.
I'm waiting to see how the new 3D V-Cache 7000-series CPUs look when they launch later this month and in April. I already picked up a new GPU, and I could put a 5800X3D in my current motherboard, but I want to see if the new 3D parts are worth jumping to a new platform; if I had to build from scratch right now, though, there's not really anything that compelling. I agree that DDR5 tech still has a ways to mature, which is more of a factor if you keep systems for a long time.
 
I'm waiting to see how the new 3D V-Cache 7000-series CPUs look when they launch later this month and in April. I already picked up a new GPU, and I could put a 5800X3D in my current motherboard, but I want to see if the new 3D parts are worth jumping to a new platform; if I had to build from scratch right now, though, there's not really anything that compelling. I agree that DDR5 tech still has a ways to mature, which is more of a factor if you keep systems for a long time.
I would wait until the AGESA improves for the new 7xxx platform... boot times are still in the order of crazy to stupid.
 
My old setup was a 4790K with a 1080Ti and some games at 3440x1440 would max out the CPU while the GPU was just sitting at 50% or less usage.
When I got my 5800X paired with the 1080Ti, the CPU was now at 30-50% usage and the GPU was maxed out.


 
I went from an i7 4770K to a 3800X; huge difference. My buddy couldn't even get his 4770 and 3080 to work. You will enjoy your next upgrade even if you do a budget 6-core.
 
I went from an i7 4770K to a 3800X; huge difference. My buddy couldn't even get his 4770 and 3080 to work. You will enjoy your next upgrade even if you do a budget 6-core.
Good grief, a 3080 is a laughable waste on a 4770K anyway. Sadly there are idiots out there even running a 4090 on a 4770K.
 
To a point, the higher resolution will offload the CPU. I used to use DSR to upscale my 1080p resolution for some games when I was running a W3690 and a 1070. Frames improved quite a bit
The cpu will bottleneck at 1080p. Just bump the resolution and the 3060 will handle it
 
To a point, the higher resolution will offload the CPU. I used to use DSR to upscale my 1080p resolution for some games when I was running a W3690 and a 1070. Frames improved quite a bit
The cpu will bottleneck at 1080p. Just bump the resolution and the 3060 will handle it
OP has a 4k monitor.
 
I think some people just don't get it. Running a game at a higher resolution does not magically make up for your CPU being old and slow. There are different kinds of CPU bottlenecks, and the issue here is not that he wouldn't get full use of a video card upgrade; it's that a 4770K is literally a playability limitation in some games. Now, if he has low standards and doesn't care about dropping into the 40s and 50s all over the place, then that's fine.

But this "oh, I'm running a high resolution so the CPU doesn't matter" nonsense really needs to end. Running at graphical settings and resolutions that bring your GPU to its knees and run like shit is no better than being CPU-limited and running like shit. You're just trading one shitty experience for another. Again, a 4770K is not going to meet the recommended requirements of probably any modern game, and is not even going to meet the minimum requirements of several games.
 
With all the above said, there are some bad assumptions here. Not all games out or nearly out require the highest-end CPU or GPU imaginable. In fact, I'd say the vast majority do not. Why? You sell more copies if it can play anywhere. Not saying there aren't cases where you need the very latest, just saying it's not the majority case. With that said, there are some cases on the GPU front where a feature of, let's say, DX12.x is required (not optional), and you will have to look at that as well. So while a game might not require the top-tier GPU, it may require a newer GPU because otherwise it just won't run at all. Anyway, I think some people here are needlessly being over the top (likely because they're now poor from upgrading 5 times over the past 5 years).
 
I think some people just don't get it. Running a game at a higher resolution does not magically make up for your CPU being old and slow. There are different kinds of CPU bottlenecks, and the issue here is not that he wouldn't get full use of a video card upgrade; it's that a 4770K is literally a playability limitation in some games. Now, if he has low standards and doesn't care about dropping into the 40s and 50s all over the place, then that's fine.

But this "oh, I'm running a high resolution so the CPU doesn't matter" nonsense really needs to end. Running at graphical settings and resolutions that bring your GPU to its knees and run like shit is no better than being CPU-limited and running like shit. You're just trading one shitty experience for another. Again, a 4770K is not going to meet the recommended requirements of probably any modern game, and is not even going to meet the minimum requirements of several games.
The OP isn't really gaming anymore, so the 3060 will be a fine upgrade over the 970 for him. It will be a massive uplift with modern rendering ability. He should be able to power it easily if he was driving the 970, and even though his CPU is bullshit by modern standards, it will be good enough for his use case.

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3060-vs-Nvidia-GTX-970/4105vs2577

I agree, though, there is no substitute for a modern CPU in SOME games that leverage more cores. Depends on what he's doing. If the OP has been using 4K and a 970 this long, the 3060 will be like the second coming of Christ to his PC.
 
No one is being over the top. I used the 4770K for years and upgraded 4 years ago because it wasn't even cutting it for me anymore in some games. I was already dropping into the 50s and 40s for minimums in some games, which for me was not enjoyable, and a 9900K literally doubled my minimum frame rates in some CPU-intensive games. And games have become even more CPU-intensive since then. A 3060 is not a 4K card, so you're just going to trade one crappy experience for another, because if you play at the appropriate resolution you're going to be CPU-limited the vast majority of the time. And again, it's not about getting full use of the card but about playability, so depending on the game you could be in for a horrible experience unless you have bottom-of-the-barrel standards. Go play Crysis Remastered on that 4770K and you'll be dropping into the upper 30s at times, whereas a modern CPU like the 12700K would have you around 120 FPS in those same spots. Of course, use the 3060 at 4K and you're going to be in the 30s and 40s anyway, so it's just one shitty experience over another; pick one.
 
No one is being over the top. I used the 4770K for years and upgraded over 4 years ago because it wasn't even cutting it for me anymore in some games. I was already dropping into the 50s and 40s for minimums in some games, which for me was not enjoyable, and a 9900K literally doubled my minimum frame rates in some CPU-intensive games. And games have become even more CPU-intensive since then. A 3060 is not a 4K card, so you're just going to trade one crappy experience for another, because if you play at the appropriate resolution you're going to be CPU-limited the vast majority of the time. And again, it's not about getting full use of the card but about playability, so depending on the game you could be in for a horrible experience unless you have bottom-of-the-barrel standards. Go play Crysis Remastered on that 4770K and you'll be dropping into the upper 30s at times, whereas a modern CPU like the 12700K would have you around 120 FPS in those same spots. Of course, use the 3060 at 4K and you're going to be in the 30s and 40s anyway, so it's just one shitty experience over another; pick one.
I ran 4K with a 970 and an Intel 8600K and it was fine. You couldn't run maximum settings; you had to dial down your graphical goodies. But it was doable.

I have been on 4K displays since they became "affordable" (around 700 bucks), with bullshit hardware, trying to make it work. It wasn't until I had a 1080Ti (still about 23% faster than the 3060) that I was getting 40 FPS with everything maxed out. Paired 970s worked if the games supported SLI.

You're always losing performance until you're not.

But for general use, which is what it seems the OP is using their PC for, they will be fine.
 
OP has a 4k monitor.
Exactly. The 3060 is generally a 1080p/1440p card, but bumping the resolution can offload the CPU. I used 1080p as an example because the 4770K will bottleneck the 3060 at that resolution. Most people wouldn't run a 3060 above 1440p, but it probably has enough power with some settings lowered.
 
I'll probably upgrade the CPU again once the DDR5 platform matures. It just seems like an awkward time to build right now, straddling older vs. newer tech with something like Alder Lake.
It might seem awkward given the transitions at hand, but Alder/Raptor Lake being very performant with mature DDR4 actually gives it a leg up in the bang-for-your-buck department, since DDR4 is cheap right now. Now factor in those Micro Center bundles, and you have a pretty solid foundation for around $325-375.

That said, DDR4 pricing has already bottomed out, and is arguably going back up if you absolutely insist on one of those Samsung B-die kits. DDR5 will only get better from here on out, but you're still paying the early adopter tax for the 6000+ MT/s stuff that would actually provide some benefit over older DDR4 right now, which may also be too much for the IMC on current CPUs. (Alder/Raptor Lake can't even run Gear 1 with DDR5!)

That 4770K should be running @ 4.7 GHz, how silly
If it was a 4790K, that might be believable, but my 4770K struggled to hit 4.6 GHz as it was, and I had to back it off even further to 4.5 GHz later to ensure stability.

This is with custom liquid cooling, but no delid + liquid metal TIM.

Maybe I just don't have any luck with my CPUs. Had to leave a 7700K at 4.9 GHz even though they're supposed to hit 5 GHz easy, but I just couldn't quite get the damn thing Cinebench stable at the 5 GHz mark.
 
If it was a 4790K, that might be believable, but my 4770K struggled to hit 4.6 GHz as it was, and I had to back it off even further to 4.5 GHz later to ensure stability.
It's called variance among samples. It isn't a matter of belief.
 
It's called variance among samples. It isn't a matter of belief.
I'm just saying, 4.7 GHz on Haswell seems awfully optimistic given my own experiences with overclocking a 4770K. 4.4-4.5 GHz seems a bit more believable without feeling like you lost the silicon lottery hard.

The 4670K it replaced was actually far worse, needed way too much voltage by the time it hit 4.2 GHz just to remain stable - so bad that I justified upgrading to the 4770K because of it. (That, and the $200 Micro Center sale at the time which they price-matched.)
 
I'm just saying, 4.7 GHz on Haswell seems awfully optimistic given my own experiences with overclocking a 4770K. 4.4-4.5 GHz seems a bit more believable without feeling like you lost the silicon lottery hard.

The 4670K it replaced was actually far worse, needed way too much voltage by the time it hit 4.2 GHz just to remain stable - so bad that I justified upgrading to the 4770K because of it. (That, and the $200 Micro Center sale at the time which they price-matched.)
Yeah, my 4770K was only fully stable at 4.4, and looking at forums at the time, that was about the average for most.
 
Yeah, my 4770K was only fully stable at 4.4, and looking at forums at the time, that was about the average for most.

That's as far as my 4790K is 100% stable at sustained full load. I could do a few hundred MHz higher when it's cool/cold, but I experienced crashes once every few weeks when my indoor temperature got into the high 70s/low 80s.
 
Just for $hits & giggles... I was hitting up YouTube for some 4770K videos... and it's crazy how good this CPU is, still...
 
Just for $hits & giggles... I was hitting up YouTube for some 4770K videos... and it's crazy how good this CPU is, still...
It does fine in a lot of games if all you look at is the average FPS. I can tell you from first-hand experience it feels like choppy sluggish shit in some games though. As I said earlier I upgraded to a 9900k 4 years ago because the 4770k was absolutely not cutting it in some games then with absolutely terrible 1% lows. And if you're the kind of person that likes to leave stuff running in the background while you're playing games then it's an even worse experience in the real world.
 
And if you're the kind of person that likes to leave stuff running in the background while you're playing games then it's an even worse experience in the real world.
That's where the extra cores come in. While many might assume that "base/entry" is now 8 cores, it's still not for many, and perhaps doesn't need to be if all their PC does is pure gaming. YMMV.
 
Yes, even if overclocked to 4.5-4.7 GHz, certain games that are CPU-hungry can definitely make it choppy with low minimum fps.

Also it's a bit annoying having your CPU fan go full throttle most of the time : D
 
It depends on the game/app, but in most cases a Haswell or Broadwell is a bottleneck for a 3060. My 9700K is a bottleneck in many cases. Some games just don't feed the GPU efficiently, and need a more modern CPU to not choke.

I am waiting for some sales to go with 13600K or 13700K.
 
TDP-wise, it's pretty close: 145 W vs. 170 W, 970 vs. 3060. Your PSU, with 48 A on the 12 V rail, should definitely have no issue covering the extra 25 W the 3060 brings. Your unit has a 6+2 pin connector, so you're also good there.
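The headroom math works out like this (a quick sketch; the 48 A rail figure comes from this thread, and the TDPs are Nvidia's reference board-power numbers):

```python
# Back-of-the-napkin PSU headroom check for swapping a GTX 970 for an RTX 3060.
RAIL_AMPS = 48       # +12 V rail rating quoted in the thread
RAIL_VOLTS = 12
GTX_970_TDP = 145    # watts, Nvidia reference spec
RTX_3060_TDP = 170   # watts, Nvidia reference spec

rail_capacity_w = RAIL_AMPS * RAIL_VOLTS     # total +12 V capacity in watts
extra_draw_w = RTX_3060_TDP - GTX_970_TDP    # additional load vs. the old card

print(f"+12V rail capacity: {rail_capacity_w} W")  # +12V rail capacity: 576 W
print(f"extra GPU draw:     {extra_draw_w} W")     # extra GPU draw:     25 W
```

576 W on the rail against a 25 W increase in GPU load is why the swap is a non-event for the PSU, assuming the unit is still healthy.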

In regards to Antec, one of my family members still apparently uses a Phenom II X6 1055T build I donated to them years ago. The Antec unit was a 550 W model, is at least closing in on 20 years old, and apparently hasn't burnt their house down. :ROFLMAO:
 