How Many CPU Cores Do You Need For Great PC Gaming?

I have a 12C/24T. From the chart it seems that 6C/12T or 8C/16T is the best, but I do more than play games. You should see how well 12C will load an EVGA 970 SSC. In dire need of more GPU.
 
The more modern the engine, the more cores it will be able to use. It's not really rocket science; it's more along the lines of 'What are you really going to do with your computer?' Web use, retro games, go cheap on the computer. State of the art / VR / 4K 60+ fps? Build or buy a good system.

The engine using more cores does not always translate into higher performance. If the main thread is bottlenecked, the rest of the cores will be underutilized.

[Image: Doom (Vulkan) per-core CPU load]
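To put rough numbers on that, here's a minimal Amdahl's-law sketch in Python (the 40% serial fraction is a made-up assumption for illustration, not a measurement from any particular game):

def amdahl_speedup(serial_fraction, cores):
    # Amdahl's law: the serial part never speeds up; only the parallel part scales.
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Assume 40% of each frame's CPU work is stuck on the main/render thread.
serial = 0.40
for cores in (4, 6, 8, 12, 16):
    print(f"{cores:2d} cores -> {amdahl_speedup(serial, cores):.2f}x speedup")

With a 40% serial portion, going from 8 to 16 cores only moves the theoretical speedup from about 2.11x to 2.29x; the extra cores mostly sit idle.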
 
Apparently Cyberpunk players need 24-thread CPUs :)

No, it needs more CPU frequency, because there is a difference between the Ryzen 5000-series parts.
The 5600X boosts around 300MHz lower than the 5900X by default, and that is not a small gap.

Lock them all to a 5GHz boost and show me the difference in games ;)
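For scale, here's the quick arithmetic on the gap being argued about (hypothetical boost clocks roughly 300MHz apart, as cited above; real boost behavior varies with the chip, load, and cooling):

f_high = 4.9e9  # assumed boost clock of the faster chip, Hz
f_low = 4.6e9   # assumed boost clock ~300MHz lower, per the post above
print(f"Per-core frequency gap: {(f_high - f_low) / f_high * 100:.1f}%")
# ~6.1% - roughly the best-case difference in a purely frequency-bound game,
# before any IPC, cache, or boost-duration differences.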
 
I'm playing at 4K on a 3080 and see absolutely no reason to upgrade my Zen 2 3600 @ 4.2GHz now, or for the next year at least. I'd want to upgrade to a 6/12 5600 for the IPC boost before going to 8/16 cores, but even that benefit isn't worth the cost.
 
Next year, when the MCM “GPU chiplet” designs arrive, we "might" start seeing a CPU bottleneck in 4K gaming.
 
Next year, when the MCM “GPU chiplet” designs arrive, we "might" start seeing a CPU bottleneck in 4K gaming.
The CPU is already a bottleneck for 1080p high-refresh gaming, for example in Far Cry 5. Cyberpunk too; I can't really get above 120 fps even with the lowest settings.
 
The CPU is already a bottleneck for 1080p high-refresh gaming, for example in Far Cry 5. Cyberpunk too; I can't really get above 120 fps even with the lowest settings.
Yep. Once you get into high refresh rates, you start to feel the limitations of your CPU. You need a fast GPU too, but most of the time it's much easier to tune a game to use less GPU than it is to tune a game to use less CPU.
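One way to see why high refresh shifts the pain to the CPU is the per-frame time budget; a trivial sketch (plain arithmetic, no game-specific assumptions):

for hz in (60, 120, 144, 240, 360):
    print(f"{hz:3d} Hz -> {1000.0 / hz:5.2f} ms per frame")
# 60 Hz leaves 16.67 ms per frame; 144 Hz leaves 6.94 ms; 360 Hz only 2.78 ms.
# Lowering graphics settings mostly shrinks the GPU's share of that budget,
# while the game/driver thread work on the CPU side barely changes.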
 
"Need" = 6/12 Cores/Threads
Want = 8/16 Cores/Threads

Anything more than that won't see much (if any) benefit in the vast majority of games, and the extra budget would be better put towards a faster GPU, more RAM, and probably better cooling too.
 
The CPU is already a bottleneck for 1080p high-refresh gaming, for example in Far Cry 5. Cyberpunk too; I can't really get above 120 fps even with the lowest settings.

Far Cry 5 (or maybe it was Far Cry 4, can't remember now) is the only title in recent memory where I've actually seen CPU bottlenecking at 4K, but that was on my old 6C/12T Core i7-3930K. Even overclocked to 4.8GHz, it was not able to keep pace with modern CPUs.

Frames would occasionally drop down into the 40s, but it was not due to the GPU.
 
I'm playing at 4K on a 3080 and see absolutely no reason to upgrade my Zen 2 3600 @ 4.2GHz now, or for the next year at least. I'd want to upgrade to a 6/12 5600 for the IPC boost before going to 8/16 cores, but even that benefit isn't worth the cost.

Anyone playing at 4K can basically ignore anything about gaming CPUs unless they're on something 4+ years old. If your CPU isn't bottlenecking you, it isn't a concern at 4K.
 
Anyone playing at 4K can basically ignore anything about gaming CPUs unless they're on something 4+ years old. If your CPU isn't bottlenecking you, it isn't a concern at 4K.

That depends on the refresh rate too. If it's 60-75Hz, then anything remotely modern will do. If it's 120Hz or more, you'll need a monster CPU if you don't want dips.
 
That depends on the refresh rate too. If it's 60-75Hz, then anything remotely modern will do. If it's 120Hz or more, you'll need a monster CPU if you don't want dips.
Running a 4K monitor OC'ed to 70Hz, so not worried about the bottleneck shifting back to the CPU.
 
Running a 4K monitor OC'ed to 70Hz, so not worried about the bottleneck shifting back to the CPU.
Digital Foundry showed that Metro Exodus Enhanced Edition could not maintain 60 FPS on a 3600X in the CPU-heavy areas.
 
If you play at full HD you won't need more than a basic 6C/12T for a looong time.
 
If you play at full HD you won't need more than a basic 6C/12T for a looong time.
You know, I remember people saying this about dual cores back when quad cores came around. It wasn't long before that was shown to be a lie. Not only that, people with quad cores were able to go longer before needing an upgrade.
 
64 threadripping cores. To run Crysis. Without a video card. Which is how many of us are now or will be gaming in the future. ;-)
 
Threadripper has no iGPU; you'd need one built into the motherboard somewhere, like boards had before on-die GPUs, to get any display out of the thing.
 
Threadripper has no iGPU; you'd need one built into the motherboard somewhere, like boards had before on-die GPUs, to get any display out of the thing.
Many x86-64 server boards do have onboard graphics, such as the ASPEED AST series (the basic display controller built into the BMC).
Most individuals won't be gaming on 32MB of VRAM, but they're good enough for 2D video output or non-accelerated 3D graphics.
 
You know, I remember people saying this about dual cores back when quad cores came around. It wasn't long before that was shown to be a lie. Not only that, people with quad cores were able to go longer before needing an upgrade.
When I was building my X58 system, a lot of people were still building Core 2 Duo systems with E8400s, and there were already games that ran poorly on a dual core (like GTA IV). That quickly became the norm because the Xbox 360 had three cores.

It also wasn't that long ago that people said a quad core was plenty for gaming and would be for a while, but from what I've seen, the Ubisoft and Rockstar games from the last few years, along with a few others, love to load up all 16 threads on my system and do so fairly evenly.
 
When I was building my X58 system, a lot of people were still building Core 2 Duo systems with E8400s, and there were already games that ran poorly on a dual core (like GTA IV). That quickly became the norm because the Xbox 360 had three cores.

It also wasn't that long ago that people said a quad core was plenty for gaming and would be for a while, but from what I've seen, the Ubisoft and Rockstar games from the last few years, along with a few others, love to load up all 16 threads on my system and do so fairly evenly.
I believe there's also the example of the 2500K vs. the 2600K: four cores/four threads vs. four cores/eight threads. The 4/4 CPU was hitting bottlenecks that the 4/8 CPU wasn't hitting until a good bit later.

Unless you're on a tight budget there's no point in going for anything less than 8/16 for a gaming system now. Even if the games don't use up all of the cores/threads it still leaves you some processing power for anything you have running in the background. I don't know about anyone else but I always have several things running in the background. Some of them use practically nothing from the CPU most of the time but that isn't the same as all of the time. I do not want sudden slowdowns or stutters happening because I tried to go with the minimum possible. This is only for a pure gaming machine and doesn't take into account people who do more.
 
When I was building my X58 system, a lot of people were still building Core 2 Duo systems with E8400s, and there were already games that ran poorly on a dual core (like GTA IV). That quickly became the norm because the Xbox 360 had three cores.

It also wasn't that long ago that people said a quad core was plenty for gaming and would be for a while, but from what I've seen, the Ubisoft and Rockstar games from the last few years, along with a few others, love to load up all 16 threads on my system and do so fairly evenly.

Keep in mind that "loading up" and "actually benefitting from" are not always the same thing.

A quad core is probably not going to be the best solution today, but if you go 6C/12T it's probably enough for now as long as you don't run too much shit in the background or want to stream (and thus need to compress video in real time)

8C/16T is good for a little future proofing.

As many cores as you can get for streaming, especially if you want to increase quality settings on the stream.
 
Keep in mind that "loading up" and "actually benefitting from" are not always the same thing.

A quad core is probably not going to be the best solution today, but if you go 6C/12T it's probably enough for now as long as you don't run too much shit in the background or want to stream (and thus need to compress video in real time)

8C/16T is good for a little future proofing.

As many cores as you can get for streaming, especially if you want to increase quality settings on the stream.
Agreed. Not only that, but...

I'd personally just set aside "streaming" as a separate use case honestly. It comes up all the time, but it's actually a very small percentage of actual people. I think?
Or are you all making [H]ot tub streams?
 
Keep in mind that "loading up" and "actually benefitting from" are not always the same thing.
There's a point of diminishing returns, but I don't think any developers are burning extra clock cycles just because they're available. Those games would likely run just as well on a faster 6-core, but newer games are properly utilizing more threads, which does make more cores an advantage as long as you're not sacrificing clocks or IPC.

I do think that a decent 6-core is good enough right now, but I doubt that will stay the case for long, especially at higher refresh rates. Any 8C/16T better than the ones in the consoles will likely be good enough until the next generation of consoles, but even when games were using the tri-core 360 as a baseline, good PC ports often benefited from a fourth core, and developers have gotten much better at multithreading since then.

I don't know why, but there always seems to be resistance to more cores and more memory than is currently required. The funniest part is the inevitable comments about supposed hard technical limits on multithreading games. I've seen essay-length posts about why games would never benefit from an increase in core counts beyond the current standard, and every time they've been dead wrong. They all had good points on why it would be difficult, but the hurdles were never insurmountable.
 
You know, I remember people saying this about dual cores back when quad cores came around. It wasn't long before that was shown to be a lie. Not only that, people with quad cores were able to go longer before needing an upgrade.

The same was true for dual-core versus single-core, but there absolutely was a hiccup going up from quad-core. AMD's flirtations with triple-core and Intel's general lack of progression are culprits there.

I expect hexa-core to take a relatively quick bow. The 4c/8t standard is on its very good eight-year way out. It's not going to be replaced with a hexa-core standard, though; that will be an adequate stopgap while 8c takes over.

But I don't think more and more cores will be the next big thing, either. That's been touted this whole time, for over 20 years. There still need to be IPC and clock gains on a mainstream standard, and I think that will be 8c/16t.

If I was building now I'd go for the highest-clocked 8c/16t processor (all else being equal, in this example) over a lower-clocked 10-plus-core processor, but if there are greater-than-8c/16t options with the same clocks going forward, they're not gonna hurt.

A 6c/12t processor makes a lot of sense if you're interested in a mainstream build and are on a quick upgrade cycle.

Which currently doesn't seem to exist, so I'd skip it entirely if possible.
 
This is [H]. 16/32. MINIMUM. 64GB RAM, 12GB VRAM minimum.

We don't fuck around. There is no overkill. 6 or 8 core? Newb.

I'm still on 4/8, but need to move up. Not in any big rush as I need a GPU upgrade (so, realistically looking to a 3-4 year wait on that upgrade...). By the time I do get to upgrade, I'll need more cores, 8K monitor, etc...
How many cores? All of them. That's how many. This is the [H] after all. 14/28 here. Need more cores.
 
I wish they had added a 3090 to the testing. In multiple tests the 3070 is maxed out and is the bottleneck, not the CPU, so those tests tell us nothing except to owners of 3070 hardware.
 
That's already not true. Again, high refresh exists; you can buy 1080p 360Hz monitors today, and the CPU can be a bottleneck.
Yeah, agreed. It seems like the high-refresh thing gets forgotten a lot, and its impact on CPU requirements is pretty stark. Same with GPU discussion... people laugh at the notion of using a high-end card for anything less than 4K, when even a 6900XT/3090 struggles to do 120/144fps at 1080p on some new games at max settings, and 240fps is still out of the question without lowering quality. I know this is big talk from someone (me) who games at 30fps, but it's something I notice a lot in these conversations.
 
The engine using more cores does not always translate into higher performance. If the main thread is bottlenecked, the rest of the cores will be underutilized.

[Image: Doom (Vulkan) per-core CPU load]
Yep, and this is what makes Far Cry 5 such a nightmare to run at a high framerate IME. It's just multithreaded enough to drive boost clocks down, but still single-thread-bound enough that it really needs high clocks and IPC to run well, so those extra threads are generally wasted. Metro: Exodus seems to exhibit somewhat similar behavior; I have a feeling a lot of games do. Then on the extremes there's the likes of Cyberpunk on one end, which benefits from multiple cores so well that adding more threads can (to a point, in my testing) completely offset a clock speed deficit, and on the other there's the likes of Control, which is so single-threaded that I see boost clocks wayyy above the all-core max, and the game even benefits from doing silly things like disabling lots of cores in the BIOS to goose higher real-world turbo speeds out of a locked CPU. But neither of those is actually typical IME.
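A toy model of that "more threads can offset a clock deficit, up to a point" behavior (the frame-time split and clock figures below are invented for illustration, not measurements of Cyberpunk or Control):

def fps(clock_ghz, cores, serial_ms_at_4ghz=6.0, parallel_ms_at_4ghz=18.0):
    # Scale a hypothetical frame's CPU work with clock speed, then spread the
    # parallel portion across cores; the serial (main-thread) part never shrinks
    # with core count.
    scale = 4.0 / clock_ghz
    frame_ms = serial_ms_at_4ghz * scale + (parallel_ms_at_4ghz * scale) / cores
    return 1000.0 / frame_ms

print(f" 6 cores @ 4.6 GHz: {fps(4.6, 6):.0f} fps")
print(f"12 cores @ 4.0 GHz: {fps(4.0, 12):.0f} fps")
print(f"24 cores @ 4.0 GHz: {fps(4.0, 24):.0f} fps")
# ~128 vs ~133 fps: the slower 12-core roughly matches the faster 6-core,
# but doubling again to 24 cores only adds ~11% because the serial portion
# dominates - that's the "to a point" part.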
 
Yeah, agreed. It seems like the high-refresh thing gets forgotten a lot, and its impact on CPU requirements is pretty stark. Same with GPU discussion... people laugh at the notion of using a high-end card for anything less than 4K, when even a 6900XT/3090 struggles to do 120/144fps at 1080p on some new games at max settings, and 240fps is still out of the question without lowering quality. I know this is big talk from someone (me) who games at 30fps, but it's something I notice a lot in these conversations.
I think hitting those higher refresh rates is more dependent on IPC and clock speed than on more cores, though, so a 6C/12T CPU could still perform just as well at those higher refresh rates, given that it's running at about the same clock speed as the higher-core-count CPU from the same generation.

Also, this is why I'm still comfy with my 3440x1440 120Hz monitor for a while, since it's a bit harder to hit that performance level than it would be gaming at 4K/60. But as soon as LG or whoever makes a newer OLED TV/panel at 42" or smaller (the 48" CX is still a bit too big for my desk and my preference as a monitor), I'll be jumping on that, since they're 4K/120 with VRR.
 
I think hitting those higher refresh rates is more dependent on IPC and clock speed than on more cores, though, so a 6C/12T CPU could still perform just as well at those higher refresh rates, given that it's running at about the same clock speed as the higher-core-count CPU from the same generation.
It's both, really. Most modern games use multiple threads, requiring a decent core count, but they tend to hammer one or two threads much harder than the rest, requiring fast single-core performance. For example, if I fire up Crysis 3, which is older but fairly well threaded, the game actually uses 85 threads according to procexp64, but most of them are below 1% utilization. What that translates into on my 8C/16T chip is two logical cores that are near 100% utilization, seven that are hovering around 50%, two that are hovering around 25%, and four that are around 10%. What that tells me is that this game would probably run better if I had 6 faster cores instead of the 8 I have, but would probably run shittier on 4 cores, even if they were faster.
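If you want to eyeball that same per-core spread without Process Explorer, a quick psutil loop works too (assumes the third-party psutil package; run it while the game is loaded into a level):

import psutil  # pip install psutil

# Print per-logical-core utilization once a second for ten seconds.
for _ in range(10):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    pegged = sum(1 for c in per_core if c > 80)
    print(f"cores >80%: {pegged:2d} | " + " ".join(f"{c:5.1f}" for c in per_core))

A couple of pegged cores with the rest idling around 10-50% is the classic "well threaded, but still main-thread bound" pattern described above.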
 
My Acer laptop has an i7-9750H (6C/12T) with a 2060 6GB and a 1080p 144Hz screen, and runs almost every game buttery smooth.

I think my next system will have an 8C/16T and a 3080 with a high-resolution 120Hz or higher display.
 
Far Cry 5 (or maybe it was Far Cry 4, can't remember now) is the only title in recent memory where I've actually seen CPU bottlenecking at 4K, but that was on my old 6C/12T Core i7-3930K. Even overclocked to 4.8GHz, it was not able to keep pace with modern CPUs.

Frames would occasionally drop down into the 40s, but it was not due to the GPU.

Based on some recent 1440p testing in a very heavy game-load scenario showing performance gains from 12(24) vs 8(16) cores, I suspect Path of Exile would still bottleneck at 4K. That said, PoE's homebrew engine is very CPU intensive, and extreme min/maxing can stress the game much harder than normal play would. The case tested was unusual in being readily achievable by normal players at the time. (It's since been replaced with something less stressful in the current league.)

https://www.reddit.com/r/pathofexile/comments/moz1hx/performance_benchmarking_of_valdos_harbinger/
 
[H] people often have some sort of chronic disconnect from reality. There are many normal mates out there playing just fine with less powerful systems. Not everyone is playing at 4K 144fps or 1440p 300fps. Most people are quite happy with 60fps. And 1080p. Or 720p. Sometimes.

Pussy.

:D Yeah, I'm still gaming at 1080p and 60fps is a nice goal. Still rocking the old GTX 1070 (which broke, then came back to life...). I'm happy with how things are, but they could be better. It's not a disconnect from reality. A lot of us got that attitude from the old days of overclocking, editing config files, whatever, trying to get 1-2 fps more out of a game. Where some little tweak could make a visible, noticeable difference. Or where we could squeeze out enough to take the graphics settings from medium to high.

Now, we just go from 80fps to 85fps at 4k. It's more of a flex than anything. But, we just have that same attitude. "It's good, but we can do better!". We see it in benchmarks, not an actual noticeable difference (for the most part, sometimes it's very noticeable). That's just how it goes with tweaking and playing around. It's not a necessity, it's just part of the hobby. Like with cars. We put that NOS sticker on there for that extra 2 HP. We do all sorts of things to our cars that really don't make much difference, but it's just fun and you get that extra little whatever out of it (better sound, better responsiveness, whatever... we may or may not notice a difference, but it's fun).

For a lot of those things (no one NEEDS a xxx core), I replace it with different other hobbies. No one needs more than 150 horsepower. No one needs more than 100 rounds of ammo. No one needs more than a kilo of coke. You don't need another 6TB drive for more porn. Say that to the people that enjoy those hobbies, and it's fightin' time. :) You're right. For someone not into the hobby, you may not need any of that (except, you always need more ammo. Always.). For someone that is in it, though, yes. They need it all.
 
Far Cry 5 (or maybe it was Far Cry 4, can't remember now) is the only title in recent memory where I've actually seen CPU bottlenecking at 4K, but that was on my old 6C/12T Core i7-3930K. Even overclocked to 4.8GHz, it was not able to keep pace with modern CPUs.

Frames would occasionally drop down into the 40s, but it was not due to the GPU.
Those 3930Ks held up forever. Slight overclock. GPU upgrade every couple years. Best platform I ever owned though, as far as longevity goes anyways.

Wanted NVMe, PCIe 4.0, and better USB controllers though. It’s just a little too long in the tooth.
 