Thoughts on using RTX 4090 FE with i7-8086k

Neon01

Hey all, just wanted to get some opinions from the experts here on a potential CPU/mobo upgrade. I saw the recent post about CPU bottlenecks for the 4090, and it looks like the bottleneck can be fairly significant with an AMD 5800X compared to newer hardware - and the gap will only widen when I consider that the 5800X is stronger than my i7-8086k. However, even with a Microcenter around the corner, a new 13600k and mobo will likely run me at least $550-600, and that assumes my Alphacool Eisblock XPX water block, designed for my LGA 1151 socket, is compatible with LGA 1700 (everything I've read indicates it is). If not, I'm looking at a new CPU cooler too. Either way, that's a nice chunk of change.

The other part of my hesitation is that the absolute frame rates aren't shown in that article. An 18% frame rate advantage doesn't really matter to me (I game only at 4k or 1600p UW resolutions) if we're talking the difference between 150fps and 177fps. I'm not super sensitive - anything over about 90fps feels buttery smooth to me. On the more challenging games like The Witcher 3 enhanced (with ray tracing), CP2077, or God of War, the bottleneck appeared to be smaller. Those are the games I'm upgrading from my 3090 for anyway - the 3090 already runs less demanding titles very well, so I'm not as interested if I'm only being held back there.

So, thoughts? In all other respects my current processor is working just fine for me, so I'm really on the fence.
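The frame-rate reasoning above is easy to sanity-check with back-of-the-envelope math; here's a quick sketch (the 150 fps, 18%, and 90 fps numbers are just the hypothetical figures from the post, not benchmarks):

```python
# Back-of-the-envelope check: does an X% CPU uplift matter if all you
# care about is staying above a smoothness floor? Numbers are the
# hypothetical ones from the post above, not measurements.

def uplifted_fps(base_fps, uplift_pct):
    """Frame rate after applying a percentage uplift."""
    return base_fps * (1 + uplift_pct / 100)

base = 150   # fps on the older CPU (hypothetical)
floor = 90   # "buttery smooth" threshold from the post

new = uplifted_fps(base, 18)
print(f"{base} fps -> {new:.0f} fps")  # 150 fps -> 177 fps
print("both already above the floor:", base >= floor and new >= floor)
```

The point of the exercise: if both the before and after numbers clear your personal smoothness floor, the percentage uplift is academic.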
 
It will bottleneck you pretty hard. The Witcher 3 RTX patch is incredibly CPU heavy (apparently ray tracing increasing CPU usage happens in other games too, but The Witcher 3 is pretty much single threaded, so it's really bad there); it doesn't even run well with a 13900k, so an 8700k will fare terribly. CP77 actually makes pretty good use of the CPU too.

With an 8700k you might not even notice any difference upgrading to a 4090 in many games, because that CPU can already bottleneck you with a 3090.

Of course you can push your resolution and video settings to crazy levels and hammer the GPU anyway, but it won't be that great an experience, and your 1% and 0.1% lows will be terrible.
 
Thank you for the insights. I did a bit more digging online yesterday and came to much the same conclusion. Now I'm trying to decide what to get. I was leaning toward the old "go big or go home" approach with a 13900k, but Microcenter is running an amazing deal right now on a 12900k with an Asus mobo for $465 (or a 12700k for $360); a 13900k with mobo is more like $750-800. Still, some of the benchmark videos I've seen have absolutely stunned me with how many frames just one generation's difference can give you, even at 4k. Like this one: (although that's with a 3090ti)
 
There are always more frames you can squeeze somewhere, but personally I was even considering the 13600k to go with a 4090 and my old DDR4-4000 kit. All three choices you listed are fine too, as would be a 5800X3D. Normally I'd go 13900k, but I like the idea of making my next build a bit more focused on price and power efficiency.

I haven't really come to a decision yet, probably going to wait for more info on Zen4 with 3D cache.
 
Thank you for the insights. I did a bit more digging online yesterday and came to much the same conclusion. Now I'm trying to decide what to get. I was leaning toward the old "go big or go home" approach with a 13900k, but Microcenter is running an amazing deal right now on a 12900k with an Asus mobo for $465 (or a 12700k for $360); a 13900k with mobo is more like $750-800. Still, some of the benchmark videos I've seen have absolutely stunned me with how many frames just one generation's difference can give you, even at 4k. Like this one: (although that's with a 3090ti)


I don't know where that review came from, but virtually every review I've seen shows nearly identical gaming performance between the 13900k and 13700k, not the 10%+ difference in that video. And the 13700k is essentially a repackaged 12900k with slight improvements.
 
Doesn't the 13900K have 400MHz higher boost clocks? Of course I imagine that only matters if you have sufficient cooling to maximize the 13900K.
 
Doesn't the 13900K have 400MHz higher boost clocks? Of course I imagine that only matters if you have sufficient cooling to maximize the 13900K.
Raptor Lake across the board also has more cache, and if I had to guess, most of the improvements over Alder Lake boil down to having more cache rather than extra clock speed and/or more E-cores, though those help too.

The Ryzen 5800X3D proves that certain games love extra cache like you wouldn't believe, mostly simulations with heavy single-thread reliance that struggle noticeably on older CPUs. (In fact, certain titles like MSFS2020 have it handing the 13900K its ass on a silver platter by a significant margin despite losing out in others.)

With all that said, my recommendation would be the Micro Center 12700K bundle; the OP can carry forward the DDR4 from the 8086K build, and no other CPU/platform can really beat that at $320-350 - not the 5800X3D or the 13600K when you still have to buy a separate mobo for those, adding significantly to the cost.

If you were going to go for the 13900K, money was never an object to begin with, especially considering you want to pair it with an RTX 4090.
 
Raptor Lake across the board also has more cache, and if I had to guess, most of the improvements over Alder Lake boil down to having more cache rather than extra clock speed and/or more E-cores, though those help too.

The Ryzen 5800X3D proves that certain games love extra cache like you wouldn't believe, mostly simulations with heavy single-thread reliance that struggle noticeably on older CPUs. (In fact, certain titles like MSFS2020 have it handing the 13900K its ass on a silver platter by a significant margin despite losing out in others.)

With all that said, my recommendation would be the Micro Center 12700K bundle; the OP can carry forward the DDR4 from the 8086K build, and no other CPU/platform can really beat that at $320-350 - not the 5800X3D or the 13600K when you still have to buy a separate mobo for those, adding significantly to the cost.

If you were going to go for the 13900K, money was never an object to begin with, especially considering you want to pair it with an RTX 4090.

I guess the extra 6 MB of L3 can help in memory-limited scenarios, but it's not a significant enough amount to make a huge difference like with the X3D chips. From what I have seen, when clock speed is controlled for, the difference between the two seems to be within margin of error.
 
With an 8700k you might not even notice any difference upgrading to a 4090 in many games, because that CPU can already bottleneck you with a 3090.
This is probably the case and what I was thinking. PC upgrades could get expensive if you want high end. Have your ducks in order. Get a 13th gen core system, then get a 4090 if you play at 4k.
 
I guess the extra 6 MB of L3 can help in memory-limited scenarios, but it's not a significant enough amount to make a huge difference like with the X3D chips. From what I have seen, when clock speed is controlled for, the difference between the two seems to be within margin of error.
I haven't seen any clockspeed-normalized benchmarks of the various CPUs, but I sure wouldn't mind coming across them.

If that's true, though, I'll just disregard any upgrades to Raptor Lake down the line, especially after looking into more benchmarks like how Star Citizen still runs better on a 5800X3D than a 13700K.

Intel better be prepared to scale up their cache in future generations, because the moment AMD plops that V-Cache on Zen 4, they're gonna retake the gaming crown handily in anything that isn't RPCS3 (which heavily favors Intel, and more specifically, Alder Lake with AVX-512 enabled).
 
The system will perform well. The 8086K won't extract the 4090's full performance, but it will give you time to think about your next upgrade. I wouldn't worry about it beyond knowing which future CPU you'll get and having time to save up for it.
 
OP, I'm in the same boat coming from an 8086k, and it was simply time for me to build a new system anyway, so a current-gen CPU is what I'll be getting regardless.

Personally, I couldn't fathom spending that much on a 4090 and then turning around and pairing it with a 5-year-old CPU, but that's me, and everyone's situation/financials are different.
 
I ended up going overboard (probably) with a 13900k, and I have zero regrets. Thanks for the great advice from all who posted.
That's not overboard, you did good. I would do the same. Now you can enjoy it for years. Did you get the 4090 also?
 
Thank you for the insights. I did a bit more digging online yesterday and came to much the same conclusion. Now I'm trying to decide what to get. I was leaning toward the old "go big or go home" approach with a 13900k, but Microcenter is running an amazing deal right now on a 12900k with an Asus mobo for $465 (or a 12700k for $360); a 13900k with mobo is more like $750-800. Still, some of the benchmark videos I've seen have absolutely stunned me with how many frames just one generation's difference can give you, even at 4k. Like this one: (although that's with a 3090ti)

I am calling BS on that video. There is no way a 13900k is 10-25% faster than a 13700k at 4k with a 3090 ti. For instance, look at Cyberpunk 2077, where the 13900k is at 73 fps while the 13700k is at 57 - a 28% difference in that scene. There is NO way there would be that much CPU difference at 4k, PLUS a 13700k is sure as hell not CPU limited at sub-60 FPS in that game. And that channel has posted other sketchy videos that do not line up with any trusted reviews.

Go look at the TechSpot / Hardware Unboxed 13700k review: even with a much faster 4090, and at only 1080p, no game showed a gap that big between a 13900k and 13700k. In fact, at 1080p there was only a 5% difference overall. That clearly shows the YouTube comparison is pure BS, showing large differences with a much slower card at four times the resolution.
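The 28% figure in the post is straightforward to verify: percent difference is (faster − slower) / slower. A one-liner confirms it (the 73/57 fps values are the ones quoted from the video, not independent measurements):

```python
# Percent difference between the two quoted Cyberpunk 2077 results.
# The fps numbers are the ones cited from the video, not my own data.

def pct_diff(faster, slower):
    """Percentage by which `faster` exceeds `slower`."""
    return (faster - slower) / slower * 100

print(f"{pct_diff(73, 57):.0f}%")  # -> 28%
```

Note the asymmetry of the metric: 73 over 57 is +28%, while 57 relative to 73 is only −22%, which is one reason headline percentages in comparison videos are easy to inflate.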
 
If you are on a budget, go with a 5800X3D and a 4090. I have that build and it runs all games just fine. Also, you can reuse most of your existing hardware, like RAM, etc.

I would wait for Intel 14th gen or a Ryzen 7000 series X3D processor to upgrade. Right now the 5800X3D performs so close that it is a viable choice especially for 4K gaming where you are more GPU bound than CPU bound.
 
I am calling BS on that video. There is no way a 13900k is 10-25% faster than a 13700k at 4k with a 3090 ti.
I don't place much faith in YouTube in general anymore when it comes to reviews or comparison videos by the so-called "experts". I have the 13900K, upgraded from the 12900K I previously had, and can verify that while some games run smoother at 4K, it is not 25% faster - and that is with a 4090.
 
That's not overboard, you did good. I would do the same. Now you can enjoy it for years. Did you get the 4090 also?
Sure did. Wow, talk about gaming Nirvana. For the first time in my life I feel I can play pretty much anything at 100-120 fps with all the bells and whistles. The Witcher 3 next-gen remake with all settings maxed out in 4k HDR, running on my OLED42C2 at 100-120 fps? <chef's kiss> The 4090 is simply mind blowing.
 
I'm in a similar boat, coming from a 5930k at 4.4 GHz with a 3090 FTW3.

Which mobo did you go with? I was meaning AM5.
 