RTX 4090 CPU Bottleneck

Wow, so several games show a 20% performance gap using a 12900k over a 5800x even at 4K. And to think we have people on here buying a 4090 to pair with a CPU like a 7700k or even a really old 4960x. Not only does that level of GPU power require a very high-end CPU, but Nvidia's poor hardware scheduling makes the issue even worse. If you don't have a top-of-the-line CPU, you can literally end up in a situation where a mid-range AMD card will give you better performance than a 4090 will.
 
And that's with a 5800x, which was nearly the fastest gaming CPU not so long ago. When Raptor Lake releases, it will be interesting to see the gap between the 5800X3D/13900K/7950X and the 9900Ks and 3600Xs of the world; even at 4K it could be quite massive.
 
And that's with a 5800x, which was nearly the fastest gaming CPU not so long ago. When Raptor Lake releases, it will be interesting to see the gap between the 5800X3D/13900K/7950X and the 9900Ks and 3600Xs of the world; even at 4K it could be quite massive.
I did not expect the gap at 4K to be that large; it's making me re-think the viability of my 5800x. I installed the RTX 4090 yesterday, and now I am questioning whether I should move to a 13900K in the near future for 4K gaming. First I bought an external Sound Blaster G6 to replace my internal ZxR so the 4090 has all the free slots it needs, then I got a 1200W PSU to replace my 850W to be on the safe side, and now I am thinking about a mobo/CPU replacement. This card is quite something...
 
Nvidia needs to go back to the drawing board on their hardware scheduler before they release their next GPU architecture in 2 years. Best case scenario, we'll have a CPU that's 20 to 25% faster in gaming than what we have now, and the vast majority of people buying a 5090 won't even have that level of CPU. Even on this generation, people with slower CPUs are going to end up in scenarios where AMD beats the shit out of a 4090 for high-refresh gaming in some cases. There have already been some reviews with a 5800x that show the 6900xt nearly matching the 4090 in some games at 1440p.
 
I did not expect the gap at 4K to be that large; it's making me re-think the viability of my 5800x. I installed the RTX 4090 yesterday, and now I am questioning whether I should move to a 13900K in the near future for 4K gaming. First I bought an external Sound Blaster G6 to replace my internal ZxR so the 4090 has all the free slots it needs, then I got a 1200W PSU to replace my 850W to be on the safe side, and now I am thinking about a mobo/CPU replacement. This card is quite something...
I think one variable would be at what FPS those gaps become large, versus your monitor's refresh rate (and the FPS you actually want). I could see a situation where a large gap exists at 4K, but only at very high FPS.

The gap between the 5800x and the 12900k calms down quite a bit on Cyberpunk-type titles.

Hitman 3 has a large gap, yes, but it runs at over 200 fps at 4K on a 4090, which can make the gap irrelevant.

That kind of analysis should show the raw FPS and not just a percentage gap imo, to help people decide.
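To make that concrete, here is a minimal sketch of the reasoning with made-up placeholder numbers (not benchmark data): the frame rate you actually see is roughly the minimum of what the CPU can feed, what the GPU can render, and what your monitor can display, so a big percentage gap between CPUs only matters once the slower CPU's ceiling drops below the GPU ceiling or your refresh rate.

Code:
# Hypothetical illustration: delivered FPS is capped by the slowest limiter.
# None of these numbers are real benchmark results.

def delivered_fps(cpu_cap, gpu_cap, refresh_hz):
    """Frame rate you actually see, assuming you cap at the monitor's refresh rate."""
    return min(cpu_cap, gpu_cap, refresh_hz)

scenarios = {
    # game type: (slower CPU ceiling, faster CPU ceiling, 4090 ceiling at 4K) -- placeholders
    "Hitman 3-like (very high FPS)": (210, 260, 240),
    "Cyberpunk-like (GPU heavy)":    (110, 130, 75),
}

for game, (cpu_slow, cpu_fast, gpu) in scenarios.items():
    for hz in (144, 240):
        slow = delivered_fps(cpu_slow, gpu, hz)
        fast = delivered_fps(cpu_fast, gpu, hz)
        gap = (fast - slow) / slow * 100
        print(f"{game} @ {hz} Hz monitor: {slow} vs {fast} fps -> real-world gap {gap:.0f}%")

With these placeholders, the CPU gap only shows up on the 240 Hz monitor in the high-FPS title; on a 144 Hz display, or in a GPU-heavy title, both CPUs deliver the same frame rate.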
 
I think one variable would be at what FPS those gaps become large, versus your monitor's refresh rate (and the FPS you actually want). I could see a situation where a large gap exists at 4K, but only at very high FPS.

The gap between the 5800x and the 12900k calms down quite a bit on Cyberpunk-type titles.

Hitman 3 has a large gap, yes, but it runs at over 200 fps at 4K on a 4090, which can make the gap irrelevant.

That kind of analysis should show the raw FPS and not just a percentage gap imo, to help people decide.

Exactly, FPS figures would be nice. Also, I suspect true next-gen titles will be heavier on the GPU. Some of the more graphically advanced titles such as Cyberpunk, Dying Light 2, and Metro Exodus show under a 3% difference. But then on the other hand, you have graphically intense titles such as RDR2 at 17% (which, to be fair, is raster only with no ray tracing, based on my understanding).
 
Nvidia needs to go back to the drawing board on their hardware scheduler before they release their next GPU architecture in 2 years. Best case scenario, we'll have a CPU that's 20 to 25% faster in gaming than what we have now, and the vast majority of people buying a 5090 won't even have that level of CPU. Even on this generation, people with slower CPUs are going to end up in scenarios where AMD beats the shit out of a 4090 for high-refresh gaming in some cases. There have already been some reviews with a 5800x that show the 6900xt nearly matching the 4090 in some games at 1440p.

Yea, I totally agree with this. I've watched a YouTube video where they discuss it too: Nvidia really needs to work on that. I think AMD did that after RDNA1 and it really showed in how they maximized performance. I wouldn't be surprised if the 4090 looks like shit at lower resolutions with older CPUs lmao.
 
Exactly, FPS figures would be nice. Also, I suspect true next-gen titles will be heavier on the GPU. Some of the more graphically advanced titles such as Cyberpunk, Dying Light 2, and Metro Exodus show under a 3% difference. But then on the other hand, you have graphically intense titles such as RDR2 at 17% (which, to be fair, is raster only with no ray tracing, based on my understanding).
True next-gen titles could also be harder on the CPU. The two Spider-Man games make my 9900k look like a joke. Even The Ascent has CPU-limited areas where my 9900k can't even get 60 FPS with ray tracing on. I can be standing in a spot with the frame rate hovering just below 60 while GPU usage just plummets, so lowering the resolution to 720p still results in the same performance. And with more and more people trying to push high-refresh-rate monitors, the CPU will be even more important.
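That resolution-drop check is a handy rule of thumb, and it is easy to formalize. Here is a minimal sketch with invented frame-time numbers rather than real measurements: if average FPS barely moves when you cut the render resolution, the CPU (or engine) is the limiter; if it scales up, the GPU was.

Code:
# Rough bottleneck check: compare average FPS at native vs. reduced resolution.
# The frame-time samples below are invented for illustration.

def average_fps(frame_times_ms):
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def likely_bottleneck(fps_native, fps_low_res, tolerance=0.10):
    """If dropping resolution barely changes FPS, the CPU/engine is the limiter."""
    return "CPU/engine limited" if fps_low_res <= fps_native * (1 + tolerance) else "GPU limited"

native_4k = [17.2, 17.5, 17.1, 17.8]   # ms per frame at 4K (made up)
low_720p  = [17.0, 17.3, 17.2, 17.6]   # ms per frame at 720p (made up)

fps_4k, fps_720 = average_fps(native_4k), average_fps(low_720p)
print(f"4K: {fps_4k:.0f} fps | 720p: {fps_720:.0f} fps -> {likely_bottleneck(fps_4k, fps_720)}")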
 
The RAM choice for the tests is kind of odd. Ryzen benefits from tighter timings more so than faster clocks, so using CL20 DDR4 RAM is probably not ideal. It's also kind of strange that a 5800X3D was not used or included in the results, since that would be closer in price to the Intel CPU.

And why the hell are they testing with Windows 10 instead of Windows 11?
 
The RAM choice for the tests is kind of odd. Ryzen benefits from tighter timings more so than faster clocks, so using CL20 DDR4 RAM is probably not ideal. It's also kind of strange that a 5800X3D was not used or included in the results, since that would be closer in price to the Intel CPU.
He was comparing the 12900k to the 5800x because the 5800x is what they normally use in their testing machine. Plenty of people had said he was CPU limited so he tested the 12900k to see if that was the case.
 
He was comparing the 12900k to the 5800x because the 5800x is what they normally use in their testing machine. Plenty of people had said he was CPU limited so he tested the 12900k to see if that was the case.
Yea, but think about it for a second. It seemed like an Intel advertisement campaign. A guy who reviews everything and has tested the 7700x and the 12900k didn't already know the 5800x would be slower? That kinda doesn't make him a good reviewer lmao. Just a thought.
 
He was comparing the 12900k to the 5800x because the 5800x is what they normally use in their testing machine. Plenty of people had said he was CPU limited so he tested the 12900k to see if that was the case.

It seems like it wouldn't have taken much effort to swap the 5800X for a 5800X3D compared to building an Intel system from scratch.
 
Yea, but think about it for a second. It seemed like an Intel advertisement campaign. A guy who reviews everything and has tested the 7700x and the 12900k didn't already know the 5800x would be slower? That kinda doesn't make him a good reviewer lmao. Just a thought.
Well again that was the testing machine that he's been using so maybe he didn't want to change it just for one GPU review at the time. The absolute vast majority of people would have laughed if you said the 5800x would hold back a 4090 anyway.
 
I did not expect the gap at 4K to be that large; it's making me re-think the viability of my 5800x. I installed the RTX 4090 yesterday, and now I am questioning whether I should move to a 13900K in the near future for 4K gaming. First I bought an external Sound Blaster G6 to replace my internal ZxR so the 4090 has all the free slots it needs, then I got a 1200W PSU to replace my 850W to be on the safe side, and now I am thinking about a mobo/CPU replacement. This card is quite something...
Getting a 4090 means a system rebuild. I wouldn't think of running it on my current hardware; that's 1200 hp on used all-season tires.
I plan to get a 4090Ti whenever it's released. By then I will have the "best CPU available" for it.
 
Getting a 4090 means a system rebuild. I wouldn't think of running it on my current hardware; that's 1200 hp on used all-season tires.
I plan to get a 4090Ti whenever it's released. By then I will have the "best CPU available" for it.
You have a point, and I already have an order in for a 4090 at Best Buy; picking it up this weekend. Got a 7700x, downgraded from a 7950x to prioritize gaming. A 4090Ti would be nice, but as always I wanna use it now, and I'm not sure I care too much about another 10% performance and waiting another 8+ months.

But I think it's still beneficial to have a scheduler that can take best advantage of lower-res, higher-refresh gaming for those that need it.
 
You have a point, and I already have an order in for a 4090 at Best Buy; picking it up this weekend. Got a 7700x, downgraded from a 7950x to prioritize gaming. A 4090Ti would be nice, but as always I wanna use it now, and I'm not sure I care too much about another 10% performance and waiting another 8+ months.

But I think it's still beneficial to have a scheduler that can take best advantage of lower-res, higher-refresh gaming for those that need it.
The 4090 was on my buy-now list, but I'm not ready for the rebuild, so I decided to get a $700 3090 (after rewards) from Amazon Warehouse.
By the time everything shakes out between AMD and Intel, Nvidia will release the 4090Ti. Hopefully price drops will follow, with it replacing the 4090 at the same price.

I agree with the lower-res users, but I hope they have 240Hz+ monitors so it makes sense. But they should just buy a 3090.
 
Getting a 4090 means a system rebuild. I wouldn't think of running it on my current hardware; that's 1200 hp on used all-season tires.
Yeah this is what I am thinking too.

My current 9900K is a fine CPU but certainly not king of the hill anymore. And the motherboard is a lowly Asus Tuf model, the PSU will probably not cut it etc etc.
 
The RAM choice for the tests is kind of odd. Ryzen benefits from tighter timings more so than faster clocks, so using CL20 DDR4 RAM is probably not ideal. It's also kind of strange that a 5800X3D was not used or included in the results, since that would be closer in price to the Intel CPU.

And why the hell are they testing with Windows 10 instead of Windows 11?
Agreed. They needed to find the lowest latency, not the highest number. I would question that group's skills.
 
Yeah this is what I am thinking too.

My current 9900K is a fine CPU but certainly not king of the hill anymore. And the motherboard is a lowly Asus Tuf model, the PSU will probably not cut it etc etc.
I'm also considering a monitor upgrade, going from 3440x1440 to 3840x1600 144Hz+, but I may still have a bottleneck.
Hoping for a 5120x2160 144-165Hz monitor to be released at some point.

The 4090 has too many extra costs associated with it, so I'd rather put it on hold for now.
 
I'm also considering a monitor upgrade, going from 3440x1440 to 3840x1600 144Hz+, but I may still have a bottleneck.
Hoping for a 5120x2160 144-165Hz monitor to be released at some point.

The 4090 has too many extra costs associated with it, so I'd rather put it on hold for now.
Agree 100% with the monitor goal; that resolution and refresh rate are what I see as ideal. Except the 4090's lack of DP 2.0+ makes it not suitable if something like that comes about.
 
Agree 100% with the monitor goal; that resolution and refresh rate are what I see as ideal. Except the 4090's lack of DP 2.0+ makes it not suitable if something like that comes about.
Yep. DP 1.4a is another bottleneck for the GPU itself. We need 5120x2160 at 120-165Hz+ to get the most from the card.
That's why I'm waiting for a DP 2.0 4090Ti, or to see what the red team does. Then we need the monitor to be announced at CES.
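For a sense of why DP 1.4a is the bottleneck, here is a rough back-of-the-envelope sketch; the blanking overhead and bit depth are simplifying assumptions, not spec-exact figures. Uncompressed 5120x2160 at 120-165 Hz needs far more bandwidth than DP 1.4a's ~25.9 Gbit/s payload, which is why DSC or DP 2.x has to enter the picture.

Code:
# Rough, simplified bandwidth estimate. Real link budgets depend on exact
# blanking timings, bit depth, and DSC, so treat this as a sketch only.

def required_gbps(width, height, refresh_hz, bits_per_pixel=30, blanking_overhead=1.05):
    """Approximate uncompressed video data rate in Gbit/s (10-bit RGB assumed)."""
    return width * height * refresh_hz * blanking_overhead * bits_per_pixel / 1e9

DP14A_PAYLOAD_GBPS = 25.92   # HBR3 x4 lanes after 8b/10b coding
DP20_UHBR20_GBPS   = 77.37   # UHBR20 x4 lanes after 128b/132b coding

for hz in (120, 144, 165):
    need = required_gbps(5120, 2160, hz)
    print(f"5120x2160 @ {hz} Hz: ~{need:.0f} Gbit/s needed | "
          f"DP 1.4a: {DP14A_PAYLOAD_GBPS} | DP 2.0 UHBR20: {DP20_UHBR20_GBPS}")

Even at 120 Hz the uncompressed stream blows past what DP 1.4a can carry, while a DP 2.0-class link handles it without compression.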
 
Yep. DP 1.4a is another bottleneck for the GPU itself. We need 5120x2160 at 120-165Hz+ to get the most from the card.
That's why I'm waiting for the 4090Ti, or to see what the red team does. Then we need the monitor to be announced at CES.
Exactly, and for now I may get the 42" OLED as a stop-gap until something like that is available. I see the monitor/display device as more important than the gaming GPU in the end. If the monitor cannot display what the GPU can dish out, one is wasting money on the gaming GPU.
 
Exactly, and for now I may get the 42" OLED as a stop-gap until something like that is available. I see the monitor/display device as more important than the gaming GPU in the end. If the monitor cannot display what the GPU can dish out, one is wasting money on the gaming GPU.
Bingo! The monitor is the most important part of the build.
Next are the keyboard, mouse, audio, a proper desk for the new monitor, and the chair.

We haven't even started building the PC yet.
I think we can all wait for better tech to come along to alleviate the bottlenecks, i.e. the 7XXX-X3D parts.
 
"Sadly" the new series of reviews were quite often on no 4090 or not a 4k ultra, so not sure how much we did learn, but some it look like GPU bound scenarios (when a 7600x-perform the same has the 12900k) and would have seem to really be without the DDR-5 version of the 13900K platform here seem to unlock:
[Attached chart: Spider_1440p-p.webp]

Which gives some hope; it will be interesting to see if the 7200 or 8000 DDR5 kits change anything in the near future.
 
The RAM choice for the tests is kind of odd. Ryzen benefits from tighter timings more so than faster clocks, so using CL20 DDR4 RAM is probably not ideal. It's also kind of strange that a 5800X3D was not used or included in the results, since that would be closer in price to the Intel CPU.

And why the hell are they testing with Windows 10 instead of Windows 11?
They're not trying to run a head to head.

"The idea here is to get a feel for how much graphics performance is lost by a weaker processor."
 
They're not trying to run a head to head.

"The idea here is to get a feel for how much graphics performance is lost by a weaker processor."

It doesn't really matter what their intention was. 90%+ of the people who are reading that article are taking it as a "head to head" anyway.

And my point about the CL20 RAM stands, as there was no reason for them to use that in their Ryzen test system regardless of what CPU it was or wasn't being compared to.

If all they wanted to do was compare CPU performance, then it would have actually made a LOT more sense to simply swap the 5800X for a 5800X3D so that all other variables could have remained the same.
 
It doesn't really matter what their intention was. 90%+ of the people who are reading that article are taking it as a "head to head" anyway.

And my point about the CL20 RAM stands, as there was no reason for them to use that in their Ryzen test system regardless of what CPU it was or wasn't being compared to.

If all they wanted to do was compare CPU performance, then it would have actually made a LOT more sense to simply swap the 5800X for a 5800X3D so that all other variables could have remained the same.

I think a lot of us have a 5800x; it was (and is) a super popular processor, and for good reason. I for one am super glad they included the 5800x.
 
I think a lot of us have a 5800x; it was (and is) a super popular processor, and for good reason. I for one am super glad they included the 5800x.

I wasn't saying that they should have omitted the 5800X. I'm saying that they should have compared 5800X vs 5800X3D instead of 5800X vs 12900k, if the only variable they wanted to test was CPU performance. Or better yet include all 3.
 
It doesn't really matter what their intention was. 90%+ of the people who are reading that article are taking it as a "head to head" anyway.

And my point about the CL20 RAM stands, as there was no reason for them to use that in their Ryzen test system regardless of what CPU it was or wasn't being compared to.

If all they wanted to do was compare CPU performance, then it would have actually made a LOT more sense to simply swap the 5800X for a 5800X3D so that all other variables could have remained the same.
If you read the comments section, I think you'll find your concerns are covered within the first page ;)
 
Exactly, and for now I may get the 42" OLED as a stop-gap until something like that is available. I see the monitor/display device as more important than the gaming GPU in the end. If the monitor cannot display what the GPU can dish out, one is wasting money on the gaming GPU.
You'll like it. I use a 48" LG C1 as my gaming monitor with the 4090 and it's amazing!
 
Has anybody seen any benchmarks showing how much uplift I can expect going from a 9900K to Raptor Lake or Zen 4? I can't seem to find any using a 9900K or equivalent in the latest benchmarks for those new CPUs. :oops:
 
Has anybody seen any benchmarks showing how much uplift I can expect going from a 9900K to Raptor Lake or Zen 4? I can't seem to find any using a 9900K or equivalent in the latest benchmarks for those new CPUs. :oops:
FWIW, I noticed a consistent 15 fps increase going from 8700K to 12900k with my 3090 in World of Warcraft during one of the raid encounters.
 
Worth upgrading from 5900X to 5800X3D to remove CPU bottleneck at 1440P 240 Hz?

Depends on the games you play and how many cores you need. For example, if you are a World of Warcraft player, nothing beats the 5800X3D. Even Intel shows it decimating both their 12-series and 13-series CPUs. But World of Warcraft relies heavily upon single-threaded performance and almost never uses more than 4 cores. That would still leave you 4+ cores for other tasks. But some games do manage to make reasonable use of 8 cores, and depending on what else you are running on your computer, that might result in reduced performance. 8 cores have become the standard due to the current generation of consoles, but people aren't running other programs in the background when they game on consoles...

[Attached chart: Intel13seriesWoW.png]
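If you are curious how many cores a given game actually keeps busy, one rough way to check is to sample per-core utilization with the third-party psutil package while the game is running. This is a sketch, not a rigorous profiler, and it measures system-wide load rather than just the game's process:

Code:
# Sample per-core CPU utilization while a game runs to estimate how many
# cores it keeps busy. Measures system-wide load, so close background apps first.
import psutil

SAMPLES = 30           # number of 1-second samples
BUSY_THRESHOLD = 50.0  # percent load to count a core as "busy" (arbitrary cutoff)

busy_counts = [0] * psutil.cpu_count(logical=True)

for _ in range(SAMPLES):
    for core, load in enumerate(psutil.cpu_percent(interval=1.0, percpu=True)):
        if load >= BUSY_THRESHOLD:
            busy_counts[core] += 1

for core, count in enumerate(busy_counts):
    print(f"core {core:2d}: busy in {count}/{SAMPLES} samples")

print("cores busy more than half the time:",
      sum(1 for c in busy_counts if c > SAMPLES / 2))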
 
Wow, so several games show a 20% performance gap using a 12900k over a 5800x even at 4K. And to think we have people on here buying a 4090 to pair with a CPU like a 7700k or even a really old 4960x. Not only does that level of GPU power require a very high-end CPU, but Nvidia's poor hardware scheduling makes the issue even worse. If you don't have a top-of-the-line CPU, you can literally end up in a situation where a mid-range AMD card will give you better performance than a 4090 will.
This is why I am not going for the 4090 and am keeping my 3080Ti: I know my 12700kf has already met its match with what I have now, and anything more would be mismatched.
 
I bet 90% of 4090 owners are system/CPU limited at 1440p and 4K in a lot of games. CPU improvements just have not been keeping up with GPU improvements, now that DLSS can render at 1080p, upscale, and insert new frames.
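To put a rough number on that: upscalers render internally at a fraction of the output resolution, so the CPU has to keep up as if the game were running at a much lower resolution. A quick sketch using the commonly cited DLSS per-axis scale factors (treat the exact percentages as approximate):

Code:
# Approximate internal render resolutions when DLSS upscales to 4K output.
# Scale factors are the commonly cited per-axis ratios; treat them as approximate.

OUTPUT_W, OUTPUT_H = 3840, 2160

dlss_modes = {
    "Quality":           2 / 3,   # ~66.7% per axis
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

for mode, scale in dlss_modes.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    pixel_share = (w * h) / (OUTPUT_W * OUTPUT_H) * 100
    print(f"{mode:>17}: renders ~{w}x{h} ({pixel_share:.0f}% of the 4K pixel count)")

At Performance mode the GPU is effectively rendering 1080p, which is exactly the regime where the CPU, not the GPU, sets the frame rate.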
 
I bet 90% of 4090 owners are system/CPU limited at 1440p and 4K in a lot of games. CPU improvements just have not been keeping up with GPU improvements, now that DLSS can render at 1080p, upscale, and insert new frames.
The fact is that a 4090 + any CPU from the last few years is producing frame rates that are insanely high in most situations. We are very much in a period where cutting edge hardware has outrun advancements in game tech. It’s awesome.
 
It doesn't really matter what their intention was. 90%+ of the people who are reading that article are taking it as a "head to head" anyway.

And my point about the CL20 RAM stands, as there was no reason for them to use that in their Ryzen test system regardless of what CPU it was or wasn't being compared to.

If all they wanted to do was compare CPU performance, then it would have actually made a LOT more sense to simply swap the 5800X for a 5800X3D so that all other variables could have remained the same.
That only says something about how biased people themselves are. Things are approached too sentimentally these days. Just take it as it is and stop being brand loyal; brand loyalty either way is just plain naive, since all companies are out to maximize profits.
 