3900X Good Upgrade over 5820K?

Did you even watch that video? He clearly says at 13:10 that if you have a 5820K, it isn't worth upgrading for gaming.


I like their stuff for the most part, but he misses the 0.1% lows and in-the-moment drops that Digital Foundry catches. At times, for example, the 7700K can drop to only 2/3 the performance of an 8700K, but those momentary drops are lost in the averages. That's why, even though the 2600K's averages are still good, when you actually play on a 9900K or 3900X it feels much smoother.

That said, I have a 5820K at 4 GHz next to my 8086K at 5 GHz. When I used my 980 Ti in both of them, I didn't notice much of a difference. When I switched to the RTX 2080, I did notice the 5820K was slower, with more in-the-moment drops. It really depends a lot on your GPU and resolution.
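For anyone wondering what those 0.1% lows actually measure, here's a rough sketch. The frame-time numbers are made up and this is just one common way the metric gets computed, so treat it as illustrative, not Digital Foundry's exact method:

```python
# Illustrative only: made-up frame times, one common convention for "lows".
import numpy as np

def avg_fps(frame_times_ms):
    # Average FPS over the whole run
    return 1000.0 / np.mean(frame_times_ms)

def low_fps(frame_times_ms, fraction):
    # FPS of the slowest `fraction` of frames (0.01 = 1% low, 0.001 = 0.1% low)
    ft = np.sort(np.asarray(frame_times_ms, dtype=float))
    worst = ft[-max(1, int(len(ft) * fraction)):]
    return 1000.0 / worst.mean()

# ~143 FPS steady (7 ms frames) with two 50 ms hitches thrown in
frames = [7.0] * 998 + [50.0] * 2
print(f"average : {avg_fps(frames):6.1f} FPS")         # ~141 FPS, looks fine
print(f"1% low  : {low_fps(frames, 0.01):6.1f} FPS")   # ~64 FPS
print(f"0.1% low: {low_fps(frames, 0.001):6.1f} FPS")  # 20 FPS, the stutter shows
```

The average barely moves, but the 0.1% low collapses to 20 FPS, which is exactly the kind of in-the-moment stutter that gets lost in an averages-only chart.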
 
I use a 3970X at 4.4 GHz with an undervolted and overclocked Vega 64, and I play at 4K/1440p. No complaints here with a budget setup. Whether a performance increase is justified is always relative.
 
You're essentially right with ~10 billion:

~3.9bn per chiplet + ~2.09bn for the IO die,
so ~9.89bn for the two-chiplet 3900X/3950X and ~5.99bn for the single-chiplet 3600/3600X/3700X/3800X.
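Quick sanity check on that arithmetic, just plugging in the quoted figures:

```python
# Quoted figures: ~3.9bn transistors per Zen 2 chiplet, ~2.09bn for the client IO die
CCD = 3.9e9    # transistors per 8-core chiplet
IOD = 2.09e9   # transistors in the IO die

one_chiplet  = 1 * CCD + IOD   # 3600 / 3600X / 3700X / 3800X
two_chiplets = 2 * CCD + IOD   # 3900X / 3950X

print(f"{one_chiplet / 1e9:.2f}bn")   # 5.99bn
print(f"{two_chiplets / 1e9:.2f}bn")  # 9.89bn
```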

All debate aside, does that not literally boggle the mind? I mean, we have something the size of an infant's little fingernail packed with billions of transistors, capable of crunching data at unbelievable rates. Look how far technology has come, man. Wowsers!!!

I mean, teams of human engineers sat for thousands of hours, used our carbon-framework brains to contemplate and design the exact layout of these microchips, and they actually work, and they work damn well indeed!
 

What? When you switched to your 2080, did you change resolution and/or image quality settings by any chance? If so, the comparison isn't worth much, but I can accept your opinion and experience. Now, if you used the same settings and got noticeable drops with a better and newer GPU... what the hell? If the latter, I would re-image Windows (yeah, I may get flamed for saying that, don't care... a fresh install has always been a boon over a patched five-year-old Windows install for me). I mean that, with the same settings, the newer GPU should handle the same workload better than the older card, hands down; no way an older CPU would limit it enough to make it perform worse than a weaker card.
 

I took it that before, he had a GPU bottleneck. Now that the GPU is faster, he sees more of a difference between the processors.

As for the OP, it looks like he already posted that he preordered a 3900X.

If you do productivity work that needs lots of cores and you game at roughly 90 Hz or below, then the 3900X is the easy choice, hands down.

If you game and try to hit high refresh rates, then the 9900K can pull significantly ahead in the lows (up to 20% when above 90 Hz). Tom's Hardware has a good 3900X review with (essentially) OC vs. OC comparisons and minimums. The 9900K or 9900KF (no iGPU) will lag behind in productivity.

The gist I got was that the 3900X makes sense for you.
 


The 980 Ti was a GPU bottleneck compared to the RTX 2080. The speed difference between the two CPUs only became apparent when I got the faster GPU.

Both systems had the same-speed RAM, the same model SSD (Samsung), and fresh copies of Windows. Both were hooked up to the same two monitors at the time, running 1080p and then 1440p with G-Sync (I aim for high-refresh-rate gaming).

This video really shows how you can see the in-the-moment 0.1% drops.

 



Sorry, I read your original post the wrong way :eek: I blame it on the whisky, but that's just an excuse ;)
 