The more modern the engine, the more cores it will be able to use. It's not really rocket science; it's more along the lines of 'what are you really going to do with your computer?' Web use, retro games: go cheap on the computer. State of the art / VR / 4K 60+ fps? Build or buy a good system.
Apparently Cyberpunk players need 24-thread CPUs!
CPU is already a bottleneck for 1080p high refresh gaming. For example in Far Cry 5. Cyberpunk too, I can't really get above 120 fps even with lowest settings.

Next year, when the MCM “GPU chiplet” designs arrive, we "might" start seeing a CPU bottleneck in 4K gaming too.
CPU is already a bottleneck for 1080p high refresh gaming. For example in Far Cry 5. Cyberpunk too, I can't really get above 120 fps even with lowest settings.

Yep. Once you get into high refresh rates, you start to feel the limitations of your CPU. You need a fast GPU too, but most of the time it's much easier to tune a game to use less GPU than it is to tune a game to use less CPU.
CPU is already a bottleneck for 1080p high refresh gaming. For example in Far Cry 5. Cyberpunk too, I can't really get above 120 fps even with lowest settings.
I'm playing at 4K on a 3080 and see absolutely no reason to upgrade my Zen 2 3600 @ 4.2GHz at present, or for the next year at least. I'd want to upgrade to a 6/12 5600 for the IPC boost before going to 8/16 cores, but even that benefit isn't worth the cost.
Anyone playing at 4K can basically ignore anything about gaming CPUs unless they're on something 4+ years old. If your CPU isn't bottlenecking you, it isn't a concern at 4K.
Running an OC'ed 4K monitor to 70Hz, so not worried about the bottleneck shifting back to CPU.

That depends on the refresh rate too. If it's 60-75Hz, then anything remotely modern will do. If it's 120Hz or more, you'll need a monster CPU if you don't want dips.
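The refresh-rate point here can be made concrete: the CPU has to finish its share of every frame within the frame budget, which shrinks fast as refresh rate goes up. A minimal stdlib-only sketch of that arithmetic:

```python
# Per-frame time budget at common refresh rates. If the CPU-side work
# (game logic, draw-call submission, etc.) exceeds this budget, the
# frame rate drops no matter how fast the GPU is.
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available per frame at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 120, 144, 240, 360):
    print(f"{hz:>3} Hz -> {frame_budget_ms(hz):5.2f} ms per frame")
```

At 60-75Hz the CPU has 13-17ms per frame; at 360Hz it has under 3ms, which is why "anything remotely modern" stops being true at high refresh.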
What version of Doom is that? The id Tech 7 engine should scale much better than that!!!

The engine using more cores does not always translate into higher performance. If the main thread is bottlenecked, the rest of the cores will be underutilized.
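The main-thread bottleneck described above is Amdahl's law in action: if some fraction of each frame's work has to run serially on one thread, extra cores stop helping quickly. A rough illustrative model (the 40% serial fraction is an assumption, not a measurement of any real game):

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Best-case speedup when `serial_fraction` of the work
    cannot be parallelized (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# A hypothetical game whose main thread holds 40% of the frame's work:
for n in (2, 4, 6, 8, 16):
    print(f"{n:>2} cores -> {amdahl_speedup(0.4, n):.2f}x speedup")
```

With a 40% serial main thread, going from 8 to 16 cores buys under 10% more performance, which is the "rest of the cores will be underutilized" effect in numbers.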
Running an OC'ed 4K monitor to 70Hz, so not worried about the bottleneck shifting back to CPU.

Digital Foundry showed Metro Exodus Enhanced could not maintain 60 FPS on a 3600X in the CPU-heavy areas.
If you play on full HD you won't need more than a basic 6c/12t for a looong time.

You know, I remember people saying this about dual cores back when quad cores came around. It wasn't long before that was shown to be a lie. Not only that, people with quad cores were able to go longer before needing an upgrade.
Threadripper has no iGPU; you'd need one built into the motherboard somewhere, like people did before on-die GPUs, to get any display out of the thing.

Many x86-64 server boards do have iGPUs, such as the Aspeed AST series.
You know, I remember people saying this about dual cores back when quad cores came around. It wasn't long before that was shown to be a lie. Not only that, people with quad cores were able to go longer before needing an upgrade.

When I was building my X58 system, a lot of people were still building C2D systems with E8400s, and there were already games that ran poorly on a dual core (like GTA IV); that quickly became the norm because the Xbox 360 had three cores.
When I was building my X58 system, a lot of people were still building C2D systems with E8400s, and there were already games that ran poorly on a dual core (like GTA IV); that quickly became the norm because the Xbox 360 had three cores.

I believe there's also the example of the 2500K vs the 2600K: four cores/four threads vs four cores/eight threads. The 4/4 CPU was hitting bottlenecks that the 4/8 CPU wasn't hitting until a good bit later.
It also wasn't that long ago that people said a quad core was plenty for gaming and would be for a while, but from what I've seen, all the Ubisoft and Rockstar games from the last few years (along with a few others) love to load up all 16 threads on my system, and do so fairly evenly.
When I was building my X58 system, a lot of people were still building C2D systems with E8400s, and there were already games that ran poorly on a dual core (like GTA IV); that quickly became the norm because the Xbox 360 had three cores.
It also wasn't that long ago that people said a quad core was plenty for gaming and would be for a while, but from what I've seen, all the Ubisoft and Rockstar games from the last few years (along with a few others) love to load up all 16 threads on my system, and do so fairly evenly.
Agreed. Not only that, but keep in mind that "loading up" and "actually benefitting from" are not always the same thing.
A quad core is probably not going to be the best solution today, but if you go 6C/12T it's probably enough for now, as long as you don't run too much shit in the background or want to stream (and thus need to compress video in real time).
8C/16T is good for a little future proofing.
As many cores as you can get for streaming, especially if you want to increase quality settings on the stream.
Keep in mind that "loading up" and "actually benefitting from" are not always the same thing.

There's a point of diminishing returns, but I don't think there are any developers using extra clock cycles just because they're available. Those games would likely run just as well on a faster 6 core, but newer games are properly utilizing more threads, which does make more cores an advantage as long as you're not sacrificing clocks or IPC.
How many cores? All of them. That's how many. This is the [H] after all. 14/28 here. Need more cores.

This is [H]. 16/32. MINIMUM. 64GB RAM, 12GB VRAM minimum.
We don't fuck around. There is no overkill. 6 or 8 core? Newb.
I'm still on 4/8, but need to move up. Not in any big rush as I need a GPU upgrade (so, realistically looking to a 3-4 year wait on that upgrade...). By the time I do get to upgrade, I'll need more cores, 8K monitor, etc...
If you play on full HD you won't need more than a basic 6c/12t for a looong time.

That's already not true. Again, high refresh exists; you can buy 1080p 360Hz monitors today, and the CPU can be a bottleneck.
That's already not true. Again, high refresh exists; you can buy 1080p 360Hz monitors today, and the CPU can be a bottleneck.

Yeah, agreed. It seems like the high-refresh thing gets forgotten a lot, and its impact on CPU requirements is pretty stark. With GPU discussion too... people laugh at the notion of using a high-end card for anything less than 4K, when even a 6900XT/3090 struggles to do 120/144fps at 1080p on some new games at max settings, and 240fps is still out of the question without lowering quality. I know this is big talk from someone (me) who games at 30fps, but it's something I notice a lot in these conversations.
The engine using more cores does not always translate into higher performance. If the main thread is bottlenecked, the rest of the cores will be underutilized.

Yep, and this is what makes Far Cry 5 such a nightmare to run at high framerate IME. It's just multithreaded enough to drive boost clocks down, but still single-thread-bound enough that it really needs high clocks and IPC to run well, and those extra threads are generally wasted. Metro: Exodus seems to exhibit somewhat similar behavior; I have a feeling a lot of games do. Then on the extremes there's the likes of Cyberpunk on one end, which benefits from multiple cores so well that adding more threads can (to a point, in my testing) completely offset a clock speed deficit, and on the other there's the likes of Control, which is so single-threaded that I see boost clocks wayyy above the all-core max, and the game even benefits from doing silly things like disabling lots of cores in BIOS to goose higher real-world turbo speeds out of a locked CPU. But both of those aren't actually typical IME.
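The clocks-versus-threads tradeoff described above can be sketched as a toy model: loading more cores raises parallel throughput but drags down the all-core boost clock, so the sweet spot depends on how parallel the game actually is. Every number here is illustrative, not a measurement of any real CPU or game:

```python
def throughput(cores: int, serial_fraction: float,
               base_clock_ghz: float, boost_drop_per_core: float) -> float:
    """Toy model: Amdahl-style thread scaling multiplied by an
    all-core clock that sags as more cores are loaded."""
    clock = base_clock_ghz - boost_drop_per_core * (cores - 1)
    scaling = 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)
    return clock * scaling

# A mostly-serial game (Control-like) vs a well-threaded one (Cyberpunk-like),
# assuming a 5.0GHz single-core boost that drops 100MHz per extra loaded core:
for sf, label in ((0.8, "mostly serial "), (0.1, "well threaded ")):
    best = max(range(1, 17), key=lambda n: throughput(n, sf, 5.0, 0.1))
    print(f"{label}: fastest with {best} active cores")
```

Under these made-up numbers, the mostly-serial game peaks with only a few active cores (the "disable cores in BIOS to goose turbo" trick), while the well-threaded game keeps gaining well into double-digit core counts despite the clock penalty.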
Yeah, agreed. It seems like the high-refresh thing gets forgotten a lot, and its impact on CPU requirements is pretty stark. With GPU discussion too... people laugh at the notion of using a high-end card for anything less than 4K, when even a 6900XT/3090 struggles to do 120/144fps at 1080p on some new games at max settings, and 240fps is still out of the question without lowering quality. I know this is big talk from someone (me) who games at 30fps, but it's something I notice a lot in these conversations.

I think hitting those higher refresh rates is more dependent on IPC and clock speed, though, rather than more cores, so a 6/12 CPU could still perform as well at those higher refresh rates, given it's running at about the same clock speed as the higher-core CPU from the same generation.
I think hitting those higher refresh rates is more dependent on IPC and clock speed, though, rather than more cores, so a 6/12 CPU could still perform as well at those higher refresh rates, given it's running at about the same clock speed as the higher-core CPU from the same generation.

It's both, really. Most modern games use multiple threads, requiring a decent core count, but they tend to hammer one or two threads much harder than the rest, requiring fast single-core performance. For example, if I fire up Crysis 3, which is older but fairly well threaded, the game actually uses 85 threads according to procexp64, but most of them are below 1% utilization. What that translates into on my 8C/16T chip is two logical cores near 100% utilization, seven hovering around 50%, two hovering around 25%, and four around 10%. What that tells me is that this game would probably run better if I had 6 faster cores instead of the 8 I have, but would probably run shittier on 4 cores, even if they were faster.
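The per-core numbers in that Crysis 3 example can be collapsed into a single "effective cores" figure: sum the utilizations and you get how many fully busy cores the game is really keeping fed. Using the approximate figures from the post (which lists 15 of the 16 logical cores):

```python
# Approximate per-logical-core utilizations from the Crysis 3
# observation above: 2 near 100%, 7 around 50%, 2 around 25%,
# 4 around 10% (illustrative figures quoted from the post).
utilizations = [100, 100] + [50] * 7 + [25] * 2 + [10] * 4

# Total busy time expressed as a count of fully loaded cores.
effective_cores = sum(utilizations) / 100.0
print(f"effective fully-busy cores: {effective_cores:.1f} of {len(utilizations)}")
```

That works out to roughly 6.4 fully busy cores, which lines up with the poster's hunch that 6 faster cores would do about as well as the 8 they have.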
It's always interesting reading these threads and knowing EXACTLY what types of games people play.

Also seeing who thinks monitors only go up to 60Hz.
Far Cry 5 (or maybe it was Far Cry 4, can't remember now) is the only title in recent memory I've actually seen CPU bottlenecking in at 4K, but that was on my old 6C/12T Core i7-3930K. Even overclocked to 4.8GHz, it was not able to keep pace with modern CPUs.

Frames would occasionally drop into the 40s, but it was not due to the GPU.
[H] people often have some sort of chronic disconnect from reality. There are many normal mates out there playing just fine with less powerful systems. Not everyone is playing at 4K 144fps or 1440p 300fps. Most people are quite happy with 60fps. And 1080p. Or 720p. Sometimes.
Far Cry 5 (or maybe it was Far Cry 4, can't remember now) is the only title in recent memory I've actually seen CPU bottlenecking in at 4K, but that was on my old 6C/12T Core i7-3930K. Even overclocked to 4.8GHz, it was not able to keep pace with modern CPUs. Frames would occasionally drop into the 40s, but it was not due to the GPU.

Those 3930Ks held up forever. Slight overclock, GPU upgrade every couple of years. Best platform I ever owned, as far as longevity goes anyway.