3900X Good Upgrade over 5820K?

Current system:

i7 5820K
2 x 8GB Vengeance LPX DDR4 2666
EVGA RTX 2080
1TB HP EX920 M.2

I've upgraded the storage and GFX card over the last year, and now I'm starting to consider a CPU upgrade to a 3900X along with 32GB of RAM.

I game on a 4K monitor at resolutions from 1440p to 4K depending on the game. Playing Destiny 2, No Man's Sky, Witcher 3 currently.

It doesn't look like I would see a huge boost in single-threaded performance, so especially at higher resolutions, gaming performance might not change too much.

I do a decent amount of CAD design and rendering, a little bit of work in Lightroom, and quite a bit of multitasking. I think I may see a pretty big improvement in these scenarios?

Thoughts? I like the idea of an upgrade, I'm just not too sure what the value proposition is. I don't plan to overclock much, and would likely go for a B450 chipset motherboard, saving me ~$100 over X570 models.
 
Comparing a few benchmarks from AnandTech...

Power (Package), Full Load
3900X: 142.09 watts
5820K: 132.07 watts

Rendering: CineBench 15 SingleThreaded (Higher is Better)
3900X: 209
5820K: 139

Rendering: CineBench 15 MultiThreaded (Higher is Better)
3900X: 3102
5820K: 980

Civilization 6 (GTX 1080), 4K Ultra, Average FPS
3900X: 207
5820K: 88
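
For a rough sense of scale, those Cinebench numbers work out to about 1.5x the single-threaded and roughly 3.2x the multi-threaded performance. A quick throwaway Python sketch, using only the scores above:

Code:
# Rough performance ratios from the Cinebench R15 scores quoted above.
cb15_st = {"3900X": 209, "5820K": 139}    # single-threaded scores
cb15_mt = {"3900X": 3102, "5820K": 980}   # multi-threaded scores

st_gain = cb15_st["3900X"] / cb15_st["5820K"]
mt_gain = cb15_mt["3900X"] / cb15_mt["5820K"]

print(f"Single-threaded: {st_gain:.2f}x")  # ~1.50x
print(f"Multi-threaded:  {mt_gain:.2f}x")  # ~3.17x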
 
Current system:

i7 5820K
2 x 8GB Vengeance LPX DDR4 2666
EVGA RTX 2080
1TB HP EX920 M.2

I've upgraded the storage and GFX card over the last year, and now I'm starting to consider a CPU upgrade to a 3900X along with 32GB of RAM.

I game on a 4K monitor at resolutions from 1440p to 4K depending on the game. Playing Destiny 2, No Man's Sky, Witcher 3 currently.

It doesn't look like I would see a huge boost in single-threaded performance, so especially at higher resolutions, gaming performance might not change too much.

I do a decent amount of CAD design and rendering, a little bit of work in Lightroom, and quite a bit of multitasking. I think I may see a pretty big improvement in these scenarios?

Thoughts? I like the idea of an upgrade, I'm just not too sure what the value proposition is. I don't plan to overclock much, and would likely go for a B450 chipset motherboard, saving me ~$100 over X570 models.

Make sure to flash the latest BIOS with AGESA 1.0.0.3 ABA if you decide to go with Ryzen, since you play Destiny 2.
 
Comparing a few benchmarks from AnandTech...

Power (Package), Full Load
3900X: 142.09 watts
5820K: 132.07 watts

Rendering: CineBench 15 SingleThreaded (Higher is Better)
3900X: 209
5820K: 139

Rendering: CineBench 15 MultiThreaded (Higher is Better)
3900X: 3102
5820K: 980

Civilization 6 (GTX 1080), 4K Ultra, Average FPS
3900X: 207
5820K: 88

Damn that power efficiency increase is insane.
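
To put a rough number on it: treating Cinebench R15 multi-threaded points per package watt as the efficiency metric (my own simplification, using only the figures quoted above), the 3900X does close to three times the work per watt:

Code:
# Cinebench R15 multi-threaded points per package watt, from the figures quoted above.
scores  = {"3900X": 3102, "5820K": 980}
power_w = {"3900X": 142.09, "5820K": 132.07}

for cpu in scores:
    print(cpu, round(scores[cpu] / power_w[cpu], 2), "points/W")
# 3900X ~21.8 points/W vs 5820K ~7.4 points/W, i.e. roughly 2.9x the work per watt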
 
I recently went from a 7700K to a 3900X. Gaming performance was pretty much identical, but I expected that. I game at 1440p or through my Oculus Rift on a 1080ti, so I stay GPU-bound for the most part.

My driving force was the thread count. My machine would lag at times when running multiple applications, which was common for me: I'd be doing some low-intensity content like crafting or gathering in Final Fantasy 14 on one monitor while 3D modeling or slicing for my 3D printer on the other monitor. Since I basically shoehorned two more 7700Ks into my system (single threaded performance is much the same between Kaby Lake and Matisse), I don't have those slowdowns anymore. :D
 
Look at your workflow for the answer.

Ask what the least troublesome, best-performing platform is for your CAD suite; whatever you consider "multitasking" should benefit from 32GB+ of RAM; and Lightroom likes higher-frequency CPUs plus fast NVMe storage.
 
Make sure to flash the latest BIOS with AGESA 1.0.0.3 ABA if you decide to go with Ryzen, since you play Destiny 2.

Yeah, sounds like there are some wrinkles still being ironed out. BIOS updates look to be an issue with B450. I don't have an old AMD processor lying around that I could use to update with, so going with B450 or X470 could be a hassle. I saw AMD has a program that will ship you a CPU to use, but I think you have to get the new CPU first and document contact with the motherboard manufacturer to confirm you need the older CPU to flash.

I'm looking at ITX options, and I'm not seeing any that would let me flash the BIOS from a USB stick without a CPU installed.
 
Yeah, sounds like there are some wrinkles still being ironed out. BIOS updates look to be an issue with B450. I don't have an old AMD processor lying around that I could use to update with, so going with B450 or X470 could be a hassle. I saw AMD has a program that will ship you a CPU to use, but I think you have to get the new CPU first and document contact with the motherboard manufacturer to confirm you need the older CPU to flash.

I'm looking at ITX options, and I'm not seeing any that would let me flash the BIOS from a USB stick without a CPU installed.
The AMD "chip loan" always seemed like such a customer-focused nicety to me. If I had settled on an X470 board, I think I might have participated in spite of the hassle, just out of appreciation for a company willing to do that.
 
I recently went from a 7700K to a 3900X. Gaming performance was pretty much identical, but I expected that. I game at 1440p or through my Oculus Rift on a 1080ti, so I stay GPU-bound for the most part.

My driving force was the thread count. My machine would lag at times when running multiple applications, which was common for me: I'd be doing some low-intensity content like crafting or gathering in Final Fantasy 14 on one monitor while 3D modeling or slicing for my 3D printer on the other monitor. Since I basically shoehorned two more 7700Ks into my system (single threaded performance is much the same between Kaby Lake and Matisse), I don't have those slowdowns anymore. :D

Thanks, that's a good comparison as the 7700K and 5820K are remarkably close in a lot of benchmarks.

https://www.anandtech.com/bench/product/2409?vs=2255
 
The AMD "chip loan" always seemed like such a customer-focused nicety to me. If I had settled on an X470 board, I think I might have participated in spite of the hassle, just out of appreciation for a company willing to do that.

I agree it's great that they have it. Given the requirements, it's probably designed more for existing motherboard owners who want to upgrade than for prospective new buyers.
 
AnandTech looks to have updated their bench section, and there is now a more direct comparison between the 5820K and 3900X:

https://www.anandtech.com/bench/product/2409?vs=2519

The 5820K actually pulls ahead slightly in the majority of 1440p and 4K benchmarks. The difference is small enough to be negligible, but that's surprising for a 5-year-old processor with a significantly lower base clock. I suppose it's just more evidence that the CPU really doesn't matter much at higher resolutions.
 
AnandTech looks to have updated their bench section, and there is now a more direct comparison between the 5820K and 3900X:

https://www.anandtech.com/bench/product/2409?vs=2519

The 5820K actually pulls ahead slightly in the majority of 1440p and 4K benchmarks. The difference is small enough to be negligible, but that's surprising for a 5-year-old processor with a significantly lower base clock. I suppose it's just more evidence that the CPU really doesn't matter much at higher resolutions.
The funny thing about that is that the 9900K also pulls ahead of the 3900X by negligible amounts in high resolution benches.
 
The funny thing about that is that the 9900K also pulls ahead of the 3900X by negligible amounts in high resolution benches.

That's true. Perhaps higher resolutions just favor Intel ever so slightly.

I wonder if the 3900x would improve 1% lows vs the 5820K. That's one thing that doesn't show up in AnandTech's benches.
 
Only 10 watts increase for going from 2,600 million to 2.09 billion transistors

Must be a typo here. I thought the 3900X was closer to 10 billion transistors once you include the IOD and the two CCDs? Not to mention 2,600 million = 2.6 billion > 2.09bn.

It would be interesting to hear the actual transistor count in that little chip.
 
Must be a typo here. I thought the 3900X was closer to 10 billion transistors once you include the IOD and the two CCDs? Not to mention 2,600 million = 2.6 billion > 2.09bn.

It would be interesting to hear the actual transistor count in that little chip.

You're essentially right with ~10 billion:

3.9bn per chiplet + 2.09bn for the I/O die,
so 9.89bn for the 3900X/3950X and 5.99bn for the 3600/3600X/3700X/3800X.
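
For anyone who wants to check the arithmetic, it adds up like this (a trivial Python sketch; the 3.9bn-per-chiplet and 2.09bn I/O-die figures are the ones quoted above):

Code:
# Matisse transistor counts in billions, from the per-die figures quoted above.
CCD = 3.90   # one 7nm core chiplet
IOD = 2.09   # 12nm I/O die

dual_ccd   = 2 * CCD + IOD   # 3900X / 3950X
single_ccd = 1 * CCD + IOD   # 3600 / 3600X / 3700X / 3800X

print(f"Dual-CCD parts:   {dual_ccd:.2f}bn")    # 9.89bn
print(f"Single-CCD parts: {single_ccd:.2f}bn")  # 5.99bn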
 
My problem is more about building new workstations with the old Autodesk suites I already bought in mind. The problem is Windows 10.
So the answer may be new, non-Autodesk CAD suites, but on Linux.
As far as I can tell, AMD confirmed that Ryzen 3000 supports 128GB of RAM on X570 motherboards. I'm not sure about other chipsets, but there's no reason they shouldn't as far as firmware support goes: all the hardware for controlling the RAM is built into the CPU, and the motherboard hardware doesn't care about the size of the RAM, though the firmware could limit the amount.
It would be interesting for someone to test that on a B450 or X470 board, but here in France I can't find any 32GB unbuffered ECC DDR4 modules, if they even exist. 32GB unbuffered non-ECC modules have been tested on a dual-slot mini-ITX X470 Gigabyte board and worked, but that only makes 64GB, which is the previously known limit.
 
This is actually the same jump I just did, going from a 5820K to a 3900X (after a month of waiting to get it on backorder).

I'm running a 2080 Ti, which is the main hardware difference. Also, I'm running 1440p ultrawide.

In gaming, I didn't really notice a change, with the exception of Shadow of the Tomb Raider, where I had a CPU bottleneck in certain parts of the game, all of which are gone now. I also do a fair bit of After Effects/Premiere work, and here I noticed a massive difference in performance, both using the programs and in render times afterwards. That jump in threads can really make a difference here.
 
I want to upgrade to a 3900X, but I'm a gamer only, and games don't take advantage of that many cores at all, maybe a few select titles. So I guess spending that $1,000 USD on a 2080 Ti would be better for me; it actually makes more sense to get a 9900K if anything. It's an upgrade itch for sure, but I think holding off for the next Ti card and Intel 10nm would make more sense, sadly... but I do want to upgrade.

This 5820K is such a damn good piece of hardware. Never in the past did I pay for a CPU that lasted this long and performed this well, and it was cheap as hell when I got it; I never thought it would last this long and this well. Mine isn't a great overclocker, though: it's stress-test stable at 4.3GHz, and while 4.4GHz never gave me issues in anything else, I'm just running it at 4.3 now.
 
Same setup here, 5820K @ 4.3GHz and a 2080 Ti, 4K gaming.

Since I mostly use it for gaming and everyday tasks, I guess upgrading isn't really worth it yet.
 
Using a 5930K here and considering the upgrade to a 3950X or the next entry-level Threadripper.
IMHO the 3900X is a big jump from the good old Haswell-E.
 
I think it's wise to wait for Intel, however slow they are; I suspect a crushing blow once they do ship 10nm. But even an old CPU like the 5820K is still solid today. Graphics cards are way behind, really; it's hard to justify an upgrade if you're just a gamer, I think. It still surprises me how well my 5820K holds up, and it was cheap when I got it, and the RAM too, since it was in such low demand at the time. Likewise, DDR4 has saturated the market now and prices are low.
 
I think it's wise to wait for Intel, however slow they are; I suspect a crushing blow once they do ship 10nm. But even an old CPU like the 5820K is still solid today. Graphics cards are way behind, really; it's hard to justify an upgrade if you're just a gamer, I think. It still surprises me how well my 5820K holds up, and it was cheap when I got it, and the RAM too, since it was in such low demand at the time. Likewise, DDR4 has saturated the market now and prices are low.

A 10nm desktop processor is still far away; the next HEDT CPU is Cascade Lake-X, based on the same old 14nm++++++++.
Current mobile processors on 10nm show that the node isn't letting Intel maximize frequency: with 10nm, Intel wasn't able to increase clock speed much, nor IPC by much.

So I won't bet much on 10nm.
 
I don't see any reason to upgrade at all for gaming at 4K. Just find a nice sale on the 2080Ti.

For CAD, etc. it definitely makes sense to upgrade.
 
A 10nm desktop processor is still far away; the next HEDT CPU is Cascade Lake-X, based on the same old 14nm++++++++.
Current mobile processors on 10nm show that the node isn't letting Intel maximize frequency: with 10nm, Intel wasn't able to increase clock speed much, nor IPC by much.

So I won't bet much on 10nm.
Interesting. I thought 10nm was going to be huge for Intel, but I didn't really know. Logically, when someone who manufactures the best CPUs does a node shrink, you'd think they would see massive gains. The extreme platform has been a joke past X99, really; AMD has done far more justice in that regard. Intel's extreme platform is dead in my eyes, and has been since X99. Those X299 shenanigans they pulled a year or so ago are hardly competitive with AMD: a new, expensive platform on top of CPUs twice the price of anything else, and just after most of their previous extreme platform was pretty much rendered useless. In fact, they even did that to themselves with the 9900K in some ways. It's hard to justify Intel extreme nowadays; only idiots who want the best for no reason, or people who actually need that amount of CPU performance, would get it, and AMD seems the clear-cut choice there anyway. Now we just need better GPUs; CPUs are far ahead still. We just need stronger GPUs to match these new multicore powerhouses. Thank god AMD changed that up.
 
I don't see any reason to upgrade at all for gaming at 4K. Just find a nice sale on the 2080Ti.

For CAD, etc. it definitely makes sense to upgrade.

There is a nice video from Hardware Unboxed showing that with a 2080 Ti, a 5820K can reduce performance by up to 20%, even at 4K.

Now with RTX the situation can be even worse.
 
I'll probably cave in for a 3900X lol. But I've never bought a CPU that was as good as this 5820K :p When I was younger, a CPU was dated after like 2 years, pretty badly. Anyway, there's not a single game where I've struggled to hit 60fps at 1440p+ with my Ti, only Ghost Recon Wildlands pretty much.
 
There is a nice video from Hardware Unboxed showing that with a 2080 Ti, a 5820K can reduce performance by up to 20%, even at 4K.

Now with RTX the situation can be even worse.


Did you even watch that video? He says clearly at 13:10 that if you have a 5820k it isn't worth upgrading for gaming.
 
Well I ordered a 3900X from BHPhoto a few weeks ago, but it's still backordered. I noticed today Amazon isn't expecting more until October.

Could be a long wait.
 
Well I ordered a 3900X from BHPhoto a few weeks ago, but it's still backordered. I noticed today Amazon isn't expecting more until October.

Could be a long wait.
I just made the jump from a 6850K to a 3700X. I haven't done a ton of benchmarking, but it's a noticeable upgrade; you should be happy with it. Once the 3950X comes out and people sell off their 3900Xs for it, I'll probably pick up a used one.
 
Inasmuch as it's an upgrade over an 8700K or a 3600. In short, not really.
 
Inasmuch as it's an upgrade over an 8700K or a 3600. In short, not really.


Did you read the OP's post? It destroys it in multitasking and even in single-threaded workloads. For his purposes it's absolutely worth it.
 
Did you even watch that video? He says clearly at 13:10 that if you have a 5820k it isn't worth upgrading for gaming.

If a 20% performance loss is acceptable to you.
Whether it's worth it or not depends on you: is 20% worth the upgrade?

Who knows; for me, yes.
 
If a 20% performance loss is acceptable to you.
Whether it's worth it or not depends on you: is 20% worth the upgrade?

Who knows; for me, yes.
Didn't he say that for 4K gaming it really doesn't make that much of a difference?
 
If a 20% performance loss is acceptable to you.
Whether it's worth it or not depends on you: is 20% worth the upgrade?

Who knows; for me, yes.

Where are you getting that 20% number? It wasn't from that video. Because he said it is not worth it to upgrade. His number at 4K was in the neighborhood of 3%. Unless you're talking about a different video and you linked the wrong one.
 
The only time this makes sense is in Paradox strategy games... you could have a 10GHz CPU in most of their games and still get late-game slowdown lol.
 
Latest word from AMD in Canada is Sept 15th... It's been a long-ass wait for the 3900X.
 