5930K @ 4.2GHz a bottleneck for 2080 Ti?

Will it bottleneck?

  • Yes: 9 votes (16.7%)
  • No: 45 votes (83.3%)
  • Total voters: 54
Not really, no. While a faster CPU is always better to some extent, we are primarily GPU limited in games. The difference between various CPUs in games is going to be incredibly small. That's why I've stayed with my 5960X for so damn long. I want a new CPU, but everything out there on the HEDT platforms is going to be a lateral move in terms of gaming performance. AMD's Threadripper has similar IPC to what we've got. Even if it had slightly more, 5930Ks and 5960Xs should be able to clock a little higher. You would honestly have to grab an 8700K or 9900K to get an "upgrade" and even then it would be so small I wouldn't feel like it justified the cost in the slightest. On the HEDT front, there isn't much that will break 4.5GHz and therefore, very little that would be an upgrade for playing games.

CPUs have advanced more than people realize in the last few years. In the realm of performance per watt, virtualization, and parallelism, today's CPUs are considerably faster than what was on the market 7 or 8 years ago. This is also true of the mobile market. However, the problem is that the advancements have little to no impact on gaming. IPC has barely gone anywhere by itself in the last 8 years. Clock speeds haven't gone up much either, and when they have, it's come at the cost of IPC in most cases. We aren't at a point where gaming benefits from additional threads.
 
At 4K, definitely not. At 1440p, possibly depending on the refresh rate.

At 144Hz, you can get to CPU limited scenarios in some games (for example Far Cry 5 or Watch Dogs 2).

But even in those games, a new CPU will not drastically change the situation.
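To make the resolution and refresh rate point concrete, here is a minimal sketch of the usual mental model, with made-up frame times rather than measurements: the slower of the CPU and GPU stages sets the delivered frame rate, so a fixed CPU cost only starts to matter once the GPU cost drops below it.

```python
# Toy bottleneck model: each frame has a CPU cost and a GPU cost, and the
# slower of the two stages sets the frame rate (capped by the display).
# All numbers are invented for illustration, not benchmark data.

def delivered_fps(cpu_ms_per_frame, gpu_ms_per_frame, refresh_hz):
    frame_ms = max(cpu_ms_per_frame, gpu_ms_per_frame)
    return min(1000.0 / frame_ms, refresh_hz)

cpu_ms = 7.0  # hypothetical per-frame CPU cost for a 5930K-class chip in a demanding game

# Hypothetical per-frame GPU cost for a 2080 Ti-class card at each setting
for label, gpu_ms, hz in [("1440p @ 144Hz", 5.5, 144),
                          ("4K @ 60Hz", 17.0, 60)]:
    limited_by = "CPU" if cpu_ms > gpu_ms else "GPU"
    print(f"{label}: ~{delivered_fps(cpu_ms, gpu_ms, hz):.0f} FPS, {limited_by} limited")
```

With those invented numbers, the 4K case never touches the CPU ceiling, while the high-refresh 1440p case runs straight into it, which matches the point above.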
 
Not really, no. While a faster CPU is always better to some extent, we are primarily GPU limited in games. ...

Thank you so much for this answer Dan. Very informative!

OK, in that case I won't upgrade my CPU/mobo because I am really only interested in gaming at 1440p-4K @ 60Hz
 
At 1440p, the cost of a new CPU and motherboard would in no way justify the very small gain in FPS... from everything I see, you will only get the FPS gains from a high-dollar new CPU at 1080p.
 
Does anyone have any 4K/1440p gaming benchmarks from reviews which show 5930K amongst the latest CPUs?
 
At 1440p, the cost of a new CPU and motherboard would in no way justify the very small gain in FPS... from everything I see, you will only get the FPS gains from a high-dollar new CPU at 1080p.

This is because 1920x1080 is a CPU limited resolution for today's systems. We tend to think of it as being "HD" and we don't think of it as being a low resolution given that it's probably the most common resolution for today's displays. However, it's an old resolution. We had monitors that surpassed it before 1920x1080 was even close to mainstream. 1920x1200 and 2560x1600 professional monitors preceded the popularity of such displays. 1080p caught on because of the TV market and the "HD" marketing campaigns. Lower quality panels and economies of scale, etc. all contributed to the adoption of that resolution. We have had GPUs that were designed to push multiple displays at combined resolutions far in excess of 1080p for many years. 1920x1080 just isn't that hard for high end graphics cards and is basically the new 1024x768, which was super popular and the dominant resolution up through the early 2000s. You could even go so far as to call it today's 640x480 for those of us who are old enough to remember displays limited to resolutions that low. At one point, it was the standard.
 
Your question is impossible to answer because what the bottleneck is depends on your system load, i.e., what resources your software is actually using.

Your CPU will be the bottleneck when running 7-Zip compression, but not in a modern game running at 8K resolution with 16xMSAA and 8xAF.

You simply can't boil the question down to hardware alone.
 
Your question is impossible to answer because what the bottleneck is depends on your system load, i.e., what resources your software is actually using.

We know what the system will be used for. He's asking about gaming.

Your CPU will be the bottleneck when running 7-Zip compression

Yes, the CPU is a bigger concern when compressing files as the GPU is irrelevant in such a task.

but not in a modern game running at 8K resolution with 16xMSAA and 8xAF

Where is this 8K resolution with 16xMSAA and 8xAF stuff coming from? (I'm not sure why you didn't kick things up to 16xAF.) He specifically mentioned 2560x1440 and 3840x2160 as the resolutions he would be playing games at.

You simply can't boil the question down to hardware alone.

Nonsense. You can certainly boil the question down to hardware alone when we know what kind of software is being used. He didn't ask whether it was worth getting a 2080 Ti over a 1080 Ti while playing 30-year-old DOS games in a virtual machine, or mention anything about rendering video or zipping files. His question was about CPUs with a given GPU and gaming performance.
 
Not really, no. While a faster CPU is always better to some extent, we are primarily GPU limited in games. ...
The only thing making me want to upgrade is my lust for hardware. Basically agreeing with you.
 
I don't think there has been one with the 2080 Ti, but a stock and overclocked 5820K has been tested @ 1440p with a 1080 Ti. It's not that far off from the 8700K, which is still one of the best gaming CPUs. Would you see better FPS with an 8700K or 9900K? Yes; but unless you're shooting for the absolute best FPS, I wouldn't worry too much about it. I'm willing to bet you'd still have an excellent gaming experience with your current CPU ;)

https://www.techspot.com/review/1558-core-i7-5820k-test-in-2018/page4.html

[Chart: average FPS results from the linked TechSpot review]
 
Not really, no. While a faster CPU is always better to some extent, we are primarily GPU limited in games. ...

So specifically talking about 1440p and referencing the chart above, wouldn’t a 5930 technically be a bottleneck if better performance can be had by installing a faster 8700K/9900K? I mean the gains probably won’t be giant, but if there are still gains to be had when moving to a faster CPU, then I would technically consider that a bottleneck.
 
So specifically talking about 1440p and referencing the chart above, wouldn’t a 5930 technically be a bottleneck if better performance can be had by installing a faster 8700K/9900K? I mean the gains probably won’t be giant, but if there are still gains to be had when moving to a faster CPU, then I would technically consider that a bottleneck.

We are talking about buying a new motherboard, processor and RAM for what probably amounts to less than 10FPS when you are already above 100FPS using the examples above. At 4K, the difference will probably be even smaller. If you think that's worth it, then by all means go right ahead. I've looked at this many times myself. I'm a hardware guy and will justify all kinds of upgrades with little real benefits just so I can pull the trigger on new hardware. The gains are so pathetic at this point I've seen very little reason to upgrade my motherboard and CPU right now. Especially considering the cost / benefit ratio for doing so.
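To put rough numbers on that cost/benefit point (the FPS figures echo the ones above; the platform price is a made-up assumption):

```python
# Hypothetical upgrade math: ~10 FPS gained on top of ~100 FPS.
old_fps, new_fps = 100.0, 110.0

gain_pct = (new_fps - old_fps) / old_fps * 100        # 10% more frames per second
old_ms, new_ms = 1000.0 / old_fps, 1000.0 / new_fps   # per-frame render times
saved_ms = old_ms - new_ms                            # ~0.9 ms shaved off each frame

# Assumed cost of a new motherboard + CPU + RAM, purely illustrative:
platform_cost = 800.0
cost_per_fps = platform_cost / (new_fps - old_fps)

print(f"{gain_pct:.0f}% higher FPS, {saved_ms:.2f} ms saved per {old_ms:.0f} ms frame")
print(f"roughly ${cost_per_fps:.0f} of new hardware per extra frame per second")
```

Framed that way, the gain is under a millisecond per frame, which is why it is so hard to feel in practice.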
 
We are talking about buying a new motherboard, processor and RAM for what probably amounts to less than 10FPS when you are already above 100FPS using the examples above. At 4K, the difference will probably be even smaller. If you think that's worth it, then by all means go right ahead. I've looked at this many times myself. I'm a hardware guy and will justify all kinds of upgrades with little real benefits just so I can pull the trigger on new hardware. The gains are so pathetic at this point I've seen very little reason to upgrade my motherboard and CPU right now. Especially considering the cost / benefit ratio for doing so.

Correct, I'm in no way trying to justify the cost of the hardware for such a small gain. I'm just saying, there would technically be a bottleneck if there is more performance to be had just by changing out the CPU. Meaning the 5930 would hold the 2080 Ti back from performing at its best. We could even say the 9900K might also be a bottleneck to an extent, but since there is nothing faster on the market, it's hard to say at this point.
 
I have a 5820K and it doesn't even break a sweat running anything, including proprietary XRD engineering software.
My only limitation is 28 PCIe lanes. I've never once loaded down all the cores unless running prime benchmarks.
Pricing is insane on new gen products from Intel; even the 9th gen 1151 shit is overblown in price. I paid 279 USD for mine when Microcenter ran a tax-free sale.
 
Correct, I'm in no way trying to justify the cost of the hardware for such a small gain. I'm just saying, there would technically be a bottleneck if there is more performance to be had just by changing out the CPU. Meaning the 5930 would hold the 2080 Ti back from performing at its best. We could even say the 9900K might also be a bottleneck to an extent, but since there is nothing faster on the market, it's hard to say at this point.

Right. If you want to get technical, every CPU is a bottleneck to some extent.
 
Amazing that the 2600K can still hang. Might be the greatest CPU of all time, past, present, and future.

Couldn't agree more. I've been gaming on my 2600K (stock speeds) for 8+ years... recently paired with a 1080SC card @ 4K. BF5 was definitely hitting the system hard, and I struggled over whether to upgrade the CPU or pick up a 2080 based on some of these charts. I went the 8700K route with supporting hardware and gained 30 FPS in BF5. Runs butter smooth!
 
Amazing that the 2600K can still hang. Might be the greatest CPU of all time, past, present, and future.

You have to place things like that in the proper context. There is no doubt that the 2600K was a leap forward at the time of release. The only reason it still hangs today is because Intel isn't in the business of building processors with an emphasis on desktop performance. New architectures have come and gone but brought very little to the table as far as desktops were concerned. However, Intel's focus shifted away from the desktop more and more after the 2600K's release as it concentrated on mobile and server sales. In a sense, that's to be expected, as getting higher IPC and maintaining clock speeds is a tall order while making CPUs more power efficient. In a different market, the 2600K might have been surpassed long ago.
 
Couldn't agree more. I've been gaming on my 2600K (stock speeds) for 8+ years... recently paired with a 1080SC card @ 4K. BF5 was definitely hitting the system hard, and I struggled over whether to upgrade the CPU or pick up a 2080 based on some of these charts. I went the 8700K route with supporting hardware and gained 30 FPS in BF5. Runs butter smooth!

This is an important example: while the 2600K may look alright in benchmarks, actual playability is lower. That's just a consequence of how hard it is to distill benchmark data into useful information, and of the fact that benchmarks cannot cover every real-world use case. Every end user, and thus every end user's system, is different. Benchmarks show the 'best case'.

With respect to the OP, there's no doubt that the 5930K is holding the 2080 Ti 'back', but the question of 'how much' and whether an upgrade is 'worth it' is much harder to answer.

Here I'd agree with the consensus: it's not held back 'that much' and the upgrade is generally not worth it. Beyond that, the difference versus a top-flight 9900K is going to revolve around workload, and not just gaming in general, but specific games, specific settings, and specific sections or sequences of games.

Generally, a 'slower' CPU in terms of clock speed and/or IPC is going to cut your top end off, limiting the highest framerates/fastest frametimes, but as we've seen with Ryzen, the more important low end, that being the lowest framerates/slowest frametimes, will be largely unaffected so long as there are enough free thread resources. This is where the i7s ranging from the 2600K up to the 7700K start to show their limitations, despite otherwise being highly competent.
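Since averages hide exactly the low end being described here, a quick sketch of how a frametime capture is usually boiled down to an average and a '1% low' figure (the sample data is invented, and the exact '1% low' convention varies between reviewers):

```python
# Summarize a frametime capture into average FPS and a "1% low" figure.
# Frametimes are in milliseconds; the sample list below is invented.
frametimes_ms = [8.3] * 950 + [12.0] * 40 + [35.0] * 10  # mostly smooth, a few long stalls

avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

# One common convention: the average frame rate over the slowest 1% of frames.
slowest_1pct = sorted(frametimes_ms, reverse=True)[:max(1, len(frametimes_ms) // 100)]
low_1pct_fps = 1000.0 * len(slowest_1pct) / sum(slowest_1pct)

print(f"average: {avg_fps:.0f} FPS, 1% low: {low_1pct_fps:.0f} FPS")
```

With that made-up capture the average looks great while the 1% low is a fraction of it, which is exactly the kind of gap that can open up even when the average chart still looks fine.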
 
When I purchased my 2080 Ti, I temporarily installed it in my 4930K system, and @ 4K all games ran just as well as they do on my 7700K. There was a small difference in FPS, but it was basically the same experience at both 1440p and 2160p.
 
What games? I can name many old games where even a 9900K at 5.5GHz would be the bottleneck, even for its integrated graphics. For newer games, the refresh rate is really important to know, since at 60Hz it will be hard to show a bottleneck.

I am always CPU limited at a high refresh rate.
 
At 1440p, the CPU still matters slightly.

Once you game at 4K though, you can underclock your CPU and it's still going to be GPU limited.
 
At 1440p, the CPU still matters slightly.

Once you game at 4K though, you can underclock your CPU and it's still going to be GPU limited.

For me it is the opposite. Every game I play at 4K is CPU limited. ARMA 3, Fallout 4, Supreme Commander, 7 Days to Die, Kerbal Space Program, and GTA V are all CPU bottlenecked. For games that are not CPU intensive, I use a higher refresh rate display.
But yes, in general you are absolutely correct.
 
For me it is the opposite. Every game I play at 4K is CPU limited. ARMA 3, Fallout 4, Supreme Commander, 7 Days to Die, Kerbal Space Program, and GTA V are all CPU bottlenecked. For games that are not CPU intensive, I use a higher refresh rate display.
But yes, in general you are absolutely correct.

Sure, if you are playing extremely old games that aren't well multithreaded, or multithreaded at all, then everything is CPU bound no matter the resolution. Shoot, the older you go, the more you are basically doing a form of emulation; that's why emulators are all so CPU bound. The one that doesn't fit in your list is GTA V, because the 2080 Ti is about 23% faster than the 1080 Ti there, which is relatively close to the average 27% increase in DX11 between the cards, so it's still more GPU bound than CPU bound.
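That 23% vs. 27% comparison is essentially a scaling check, and it can be written out explicitly (the percentages come from the post above; the thresholds below are arbitrary rules of thumb, not an established standard):

```python
# Rough "is it GPU bound?" check from card-to-card scaling.
observed_gain = 0.23  # FPS uplift seen in the game when swapping 1080 Ti -> 2080 Ti
typical_gain = 0.27   # average uplift the faster card shows in clearly GPU bound DX11 titles

scaling_ratio = observed_gain / typical_gain  # fraction of the expected uplift realized

# Arbitrary illustrative cutoffs for interpreting the ratio:
if scaling_ratio > 0.8:
    verdict = "mostly GPU bound (FPS scales almost fully with the faster card)"
elif scaling_ratio > 0.3:
    verdict = "partially CPU bound"
else:
    verdict = "mostly CPU bound (a faster GPU barely helps)"

print(f"realized {scaling_ratio:.0%} of the typical uplift: {verdict}")
```

If the game were truly CPU capped, swapping in a faster card would realize little of that typical uplift, which is the point being made about GTA V.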
 
Sure, if you are playing extremely old games that aren't well multithreaded, or multithreaded at all, then everything is CPU bound no matter the resolution. Shoot, the older you go, the more you are basically doing a form of emulation; that's why emulators are all so CPU bound. The one that doesn't fit in your list is GTA V, because the 2080 Ti is about 23% faster than the 1080 Ti there, which is relatively close to the average 27% increase in DX11 between the cards, so it's still more GPU bound than CPU bound.

I only play GTA V with traffic mods, so anything past a 1070 is kind of a waste. I don't consider ARMA 3, KSP and F4 too old!
 
Arma 3, 7 Days to Die, and GTA V are 5 years old, and Supreme Commander is over 10 years old. Yeah, I think just about everyone would call those old.
 