6700K: Itching to upgrade (I'm using a 2080Ti @ 1440p and do some 4K video editing) - 3900X or keep on rocking my 6700K?

Emig5m

Weaksauce · Joined: Dec 6, 2005 · Messages: 106
My 6700K only does 4.4GHz, so that's the baseline I'm looking to improve from (I feel like I've been using this CPU forever - definitely my longest-lasting CPU).
I game mainly at 1440p and like to crank the eye candy as high as possible on a 144Hz G-Sync monitor, but sometimes play on my 4K HDR 60Hz TV, using a ROG RTX 2080Ti. I will never play or worry about 1080p again, so if the only difference would be at 1080p with graphics turned down, I'll definitely skip upgrading.

I'm considering AMD because the Intel K series is sold out everywhere I shop.

I also do 4K video editing in Movie Studio Platinum (thinking about upgrading to Vegas Pro) - but gaming is the first priority and what's most important to see a boost in; the rest would be icing on the cake.

The motherboard I'm looking at is the ASUS X570 TUF Gaming Plus (Wi-Fi) AMD AM4 ATX motherboard - is this a good choice?

I haven't used AMD since the Athlon64, so I really don't know what's more important with RAM on these new AMDs - MHz or latency? Would it be best to get the fastest RAM (say, DDR4-4400) and run it at a lower speed (DDR4-3600) but with tighter memory timings? Recommend me some RAM, preferably something Microcenter carries in stock so I could pick it up this weekend.
 
Since your CPU only does 4.4GHz, I think you'd get a slight boost from a 3900x even in gaming, although you're starting to get to a resolution where the CPU doesn't matter as much. I would think you'd also get a pretty big boost if you have to render or convert video while doing your editing.

That motherboard is OK. Pretty much all of the X570 boards have a decent VRM configuration because they were designed with the higher core count CPUs in mind. That particular board has a Realtek NIC and the lower quality onboard sound (if you use it). I stepped up to the Prime X570-Pro as it has an Intel NIC and the better onboard sound (on sale it was maybe $20 more than that, but it was a good sale; retail is $70 more).

I don't think it would be too terribly hard to get a "K" Intel chip in the next month or so if you are diligent about watching for them. Whether or not you'd get price gouged is another story. Another option might be to pick up a used 8700k setup that gets to 5GHz - you might have better luck finding that than a 10600k or better for gaming. In that $400-420 price bracket, the 3900x is still a compelling option competing against the 10700k/9900k.
 
Since your CPU only does 4.4GHz, I think you'd get a slight boost from a 3900x even in gaming, although you're starting to get to a resolution where the CPU doesn't matter as much.

The 6700K is slow compared to a stock 3900x, and that's just single core with the 6700K at 4.6GHz: at 4.6GHz a 6700K nets around 475 in R20 single core while a 3900x nets around 505. In multi... not even worth mentioning. I gave my old 6700K at 4.8GHz to my daughter.
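
Just to put that single-core gap in percentage terms, here's a quick back-of-the-envelope check using the R20 figures quoted above (rough numbers from this post, not fresh benchmark runs):

```python
# Quick check on the single-core gap using the rough R20 scores quoted above
# (approximate figures from this post, not fresh benchmark runs).
r20_single_6700k_46 = 475   # 6700K @ 4.6GHz, approximate
r20_single_3900x = 505      # 3900X stock, approximate

uplift_pct = (r20_single_3900x / r20_single_6700k_46 - 1) * 100
print(f"Single-core uplift: {uplift_pct:.1f}%")   # roughly 6%
```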
 
Ok.
  • Intel and AMD are both viable options for upgrades
  • You will get more bang for buck in productivity with AMD
  • Suitable AMD upgrades are the 3700x, 3800x, 3900x, 3950x
  • Suitable Intel upgrades are the 9900k, 10700k, 10900k
  • AMD is more picky about memory/memory speeds
  • Intel is more picky about motherboard VRMs if overclocking, and its CPUs don't come with heatsinks
  • Either way, you will need to spend more in one area (memory, or motherboard/heatsink)
  • More cores are not necessary for gaming
  • More speed helps gaming
  • Intel's AVX2 implementation is better than AMD's at this time
  • AMD has more cores at this time, which helps for video
You need to decide which is better based on what you do. If it is 60/40 gaming/video I would lean AMD; if it is 90/10 or 80/20 I would lean Intel. At 70/30 it is a bit of a wash.
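
If it helps to make that gaming/video split concrete, here's a rough weighting sketch. The relative scores are illustrative placeholders only (not benchmark data), so swap in numbers from reviews you trust:

```python
# Rough gaming/productivity weighting sketch.
# The relative scores below are illustrative placeholders, NOT benchmark data;
# replace them with figures from reviews you trust.
candidates = {
    "3900x":  {"gaming": 94,  "productivity": 100},  # placeholder
    "10700k": {"gaming": 100, "productivity": 80},   # placeholder
}

def weighted_score(scores, gaming_share):
    """Blend gaming and productivity scores by how you split your time."""
    return scores["gaming"] * gaming_share + scores["productivity"] * (1 - gaming_share)

for gaming_share in (0.9, 0.8, 0.7, 0.6):   # 90/10, 80/20, 70/30, 60/40 splits
    scores = {c: round(weighted_score(s, gaming_share), 1) for c, s in candidates.items()}
    split = f"{round(gaming_share * 100)}/{round((1 - gaming_share) * 100)}"
    print(split, "gaming/video ->", scores)
```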
 
Last edited:
My 6700K only does 4.4GHz, so that's the baseline I'm looking to improve from (I feel like I've been using this CPU forever - definitely my longest-lasting CPU).
I game mainly at 1440p and like to crank the eye candy as high as possible on a 144Hz G-Sync monitor, but sometimes play on my 4K HDR 60Hz TV, using a ROG RTX 2080Ti. I will never play or worry about 1080p again, so if the only difference would be at 1080p with graphics turned down, I'll definitely skip upgrading.

I'm considering AMD because the Intel K series is sold out everywhere I shop.

I also do 4K video editing in Movie Studio Platinum (thinking about upgrading to Vegas Pro) - but gaming is the first priority and what's most important to see a boost in; the rest would be icing on the cake.

The motherboard I'm looking at is the ASUS X570 TUF Gaming Plus (Wi-Fi) AMD AM4 ATX motherboard - is this a good choice?

I haven't used AMD since the Athlon64, so I really don't know what's more important with RAM on these new AMDs - MHz or latency? Would it be best to get the fastest RAM (say, DDR4-4400) and run it at a lower speed (DDR4-3600) but with tighter memory timings? Recommend me some RAM, preferably something Microcenter carries in stock so I could pick it up this weekend.

4K editing... If you are doing CPU rendering then you'll want more cores to throw at it. Gaming at 4K, any of the mid-to-upper-end CPUs will generally speaking be in the same ballpark, with some small exceptions like Destiny 2, which loves the 9900K/10900K chips overclocked. That said, the Intels are way behind in multi-threaded workloads, so it comes down to what your true priorities are. Outside of the anomalies like Destiny, is the few percent difference in gaming worth more than multi-threaded performance?

That board is decent. And about the VRMs, it really doesn't matter a whole lot since these CPUs don't draw much wattage anyway. That said, I run a Strix-E, which is for all intents and purposes 98% of a Crosshair for a bit less, and obviously it is overkill since the chip doesn't draw more than 140W at max. This whole VRM thing has kind of gotten out of hand. Instead look at the feature set, NVMe placement, audio chip, network chip, BIOS design/layout, etc. I'm partial to the Asus ROG layout, so I generally lean towards ROG boards.

Now for RAM: on the AMD side, maximum speed is not that important. The sweet spot is 3600MHz (which is easy to attain) with tight timings; most Hynix CJR and B-die kits will do great. You could even try 3800MHz RAM with a 1900MHz fabric clock, which is about the max attainable. You could also get faster RAM like you wrote and run it slower - that would leave your future options open.
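
To put rough numbers on the OP's MHz-vs-timings question: first-word latency is the CAS latency divided by half the transfer rate, so tighter timings at 3600 land in the same ballpark as faster kits run looser. A minimal sketch (the CL values are assumed typical kit ratings, not specific SKUs):

```python
# First-word latency: ns = CL / (MT/s / 2) * 1000 = 2000 * CL / MT_per_s
# The CAS latencies below are assumed typical kit ratings, not specific SKUs.
def first_word_latency_ns(mt_per_s, cas_latency):
    return 2000 * cas_latency / mt_per_s

kits = [
    ("DDR4-3600 CL16", 3600, 16),
    ("DDR4-3600 CL14 (B-die-class)", 3600, 14),
    ("DDR4-4400 CL19 at rated speed", 4400, 19),
]
for name, speed, cl in kits:
    print(f"{name}: {first_word_latency_ns(speed, cl):.2f} ns")
# ~8.89 ns, ~7.78 ns and ~8.64 ns respectively: 3600 with tight timings keeps
# latency competitive while staying 1:1 with the 1800MHz fabric clock.
```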
 
I got rid of a 5GHz 6700K system for a 3900x system and could not be happier with the results for gaming. I play Battlefield 4, 1 & 5; all were CPU limited with a 1080Ti & 2080Ti, so it depends on the games you play. My average FPS went up 20-30 FPS with the 2080Ti, which I consider substantial and well worth the upgrade. For productivity it's a no-brainer.
 
I got rid of a 5GHz 6700K system for a 3900x system and could not be happier with the results for gaming. I play Battlefield 4, 1 & 5; all were CPU limited with a 1080Ti & 2080Ti, so it depends on the games you play. My average FPS went up 20-30 FPS with the 2080Ti, which I consider substantial and well worth the upgrade. For productivity it's a no-brainer.
I find these posts from actual users very informative - thanks for sharing your experience.

To OP: some slightly higher clocked Ryzens should hopefully be coming out soon, like the 3800XT/3900XT, which might be a great fit for your usage (depending on price and availability though). Slightly higher clocks will help close up the difference in games (Intel will still be faster in some though). Kind of a crap shoot what will be in stock first though. If gaming is your first priority, you might want to just wait on 10600k/10700k availability - who knows when prices will stabilize for anything at this point.
 
Well yes, going from a 6700k to any of the mentioned processors means a doubling of cores at least; in the case of the 3900x it is 3x the cores.

A reminder though, most games don’t make full use of 6 cores/12 threads, so it is highly dependent on what you play.

I maintain my earlier statements: Intel and AMD are both valid choices, it just depends on what you use the box for.

Be realistic, how much 4k editing/rendering are you actually doing vs gaming? Any 6 core+ chip can do 4k reasonably well.
 
Last edited:
I got rid of a 5GHz 6700K system for a 3900x system and could not be happier with the results for gaming. I play Battlefield 4, 1 & 5; all were CPU limited with a 1080Ti & 2080Ti, so it depends on the games you play. My average FPS went up 20-30 FPS with the 2080Ti, which I consider substantial and well worth the upgrade. For productivity it's a no-brainer.

Battlefield 4 is likely not the recipient of that improvement (adding more than 4 threads results in lower performance), but I'll buy it for the rest of the games in the list.
 
The thing you will notice most is speedshift (or the AMD equivalent), which will mean your computer seems more responsive.
 
If you can get your head around DaVinci Resolve, it will do your 4K stuff and rely on your graphics card more than your CPU.
 
Yea, gaming is the top priority and why I bought a 2080Ti. Video editing I don't really care too much about, as the latest version of Movie Studio Platinum 17 claims full GPU acceleration, and it does seem to run very smooth - to the point I don't even have to create smaller cache files of clips anymore, and realtime transitions and effects run as smooth as if you were playing back a final fully rendered video. So it seems like my video editing workload has been shifted to my mighty 2080Ti, and gaming performance is my main concern with the CPU upgrade; anything else would just be icing on the cake I guess. I suppose I could wait a bit longer since the system does run pretty good, but I hate the idea that I'm potentially holding back a $1300 graphics card with an old CPU - and how much longer until the new Ryzens are released? This fall/winter? I'm pretty sure I've never kept a CPU this long before... the 6700k has been a workhorse. It's a shame it only does 4.4GHz, and even upping the voltage to 1.45v doesn't make a difference - I must have gotten the worst 6700k on the planet... lol. My last chip, a 2500k, overclocked right out of the box by 1000MHz without even touching the voltage. It was the only time in my life that I won the silicon lottery lol.
 
Based on what you’re suggesting, I have no hesitation recommending the 10900k.
 
Yea, gaming is the top priority and why I bought a 2080Ti. Video editing I don't really care too much about, as the latest version of Movie Studio Platinum 17 claims full GPU acceleration, and it does seem to run very smooth - to the point I don't even have to create smaller cache files of clips anymore, and realtime transitions and effects run as smooth as if you were playing back a final fully rendered video. So it seems like my video editing workload has been shifted to my mighty 2080Ti, and gaming performance is my main concern with the CPU upgrade; anything else would just be icing on the cake I guess. I suppose I could wait a bit longer since the system does run pretty good, but I hate the idea that I'm potentially holding back a $1300 graphics card with an old CPU - and how much longer until the new Ryzens are released? This fall/winter? I'm pretty sure I've never kept a CPU this long before... the 6700k has been a workhorse. It's a shame it only does 4.4GHz, and even upping the voltage to 1.45v doesn't make a difference - I must have gotten the worst 6700k on the planet... lol. My last chip, a 2500k, overclocked right out of the box by 1000MHz without even touching the voltage. It was the only time in my life that I won the silicon lottery lol.

Thing is, the new Ryzen should match or maybe even exceed Intel's latest CPUs in single-thread performance while offering more cores and much lower thermals. These Intel CPUs are pigs: they run super hot and the IHS is such trash that you can't dissipate the heat even with a custom WC setup unless you delid it. That's why I would never consider Intel at this point, no matter what, as a consumer. I have a 9900k@5GHz w/a custom WC setup, so I'm telling you from experience. Just hold out a bit longer and go AMD - I plan to get the new Ryzen and toss this Intel trash on eBay ASAP. Already got the mobo + CPU money in hand, so just playing the waiting game at this point.
 
Thing is, the new Ryzen should match or maybe even exceed Intel's latest CPUs in single-thread performance while offering more cores and much lower thermals. These Intel CPUs are pigs: they run super hot and the IHS is such trash that you can't dissipate the heat even with a custom WC setup unless you delid it.

Yeah, if you’re running all threads full out all the time and are overclocked without speedshift and adaptive voltage. But, you typically are not.

As mentioned many times my 9900k uses 50w peak while playing games, at 5ghz all cores, with cache at 4.7. Temps are barely above 50C and fans are barely ticking over.

Unless I got super lucky and won the silicon lottery (unlikely), you must be doing something wrong to always hit high temps.
 
Last edited:
Yeah, if you’re running all threads full out all the time and are overclocked without speedshift and adaptive voltage. But, you typically are not.

As mentioned many times my 9900k uses 50w peak while playing games, at 5ghz all cores, with cache at 4.7. Temps are barely above 50C and fans are barely ticking over.

Unless I got super lucky and won the silicon lottery (unlikely), you must be doing something wrong to always hit high temps.
50W?
 
Yeah, if you’re running all threads full out all the time and are overclocked without speedshift and adaptive voltage. But, you typically are not.

As mentioned many times my 9900k uses 50w peak while playing games, at 5ghz all cores, with cache at 4.7. Temps are barely above 50C and fans are barely ticking over.

Unless I got super lucky and won the silicon lottery (unlikely), you must be doing something wrong to always hit high temps.

What AVX offset are you running? 5GHz all-core means nothing if AVX is offset. Also, what voltage? You'd be the first guy I've ever heard say the 9900K@5GHz uses little peak power and doesn't get overly hot. In fact, here's what Tom's had to say: https://www.tomshardware.com/reviews/intel-core-i9-9900k-9th-gen-cpu,5847-11.html

But power becomes more of an issue in some productivity applications because a constant load on all cores at high clock rates is almost too much. And to be clear, the Core i9-9900K gets super hot faced with Prime95 and AVX instructions (205W stock, 250W overclocked), exceeding the specified TDP.

We measured 137W (232W) during the Cinebench test, and we topped 145W (241W overclocked) under the larger Blender workload.

And before you say "oh that's with constant workload and AVX" well guess what? Most new games utilize AVX and even Apex Legends now pushes the CPU hard with AVX on. I find it very deceptive when people sometimes try to claim, "oh my 9900k@5/5.1 ghz runs so nice and cool" when they're running non-AVX games and/or benchmarks and then also fail to disclose they have AVX offsets w/their so-called golden silicon chips.
 
Last edited:
What AVX offset are you running? 5GHz all-core means nothing if AVX is offset. Also, what voltage? You'd be the first guy I've ever heard say the 9900K@5GHz uses little peak power and doesn't get overly hot.

Avx 0 offset. As in AVX 5ghz. I'm running 1.24v (-75mV offset). It does get a bit hot when under load on all cores, but not overly so, remember I am running a D15, and have speedshift enabled, as well as core parking etc. (see my thread on it)
 
Last edited:
Avx 0 offset. As in AVX 5ghz. I'm running 1.199/1.2v (-75mV offset). It does get a bit hot when under load on all cores, but not overly so, remember I am running a D15, and have speedshift enabled, as well as core parking etc. (see my thread on it)

1.19v is basically a golden sample CPU if it can hold that at constant workload with AVX on. So your sample doesn't represent the average 9900K. Nor does it counter my claim that the IHS is pure trash. If they had Ryzen 3000 series back when I built this I'd have jumped on it but right now I can't wait to ditch this garbage and go Ryzen w/the upcoming release. As far as I'm concerned, there's zero use case for Intel CPUs anymore since AMD has them easily beat.
 
1.19v is basically a golden sample CPU if it can hold that at constant workload with AVX on. So your sample doesn't represent the average 9900K. Nor does it counter my claim that the IHS is pure trash.

I didn't claim anything re the IHS etc. I can only speak for what I have in front of me.
 
I didn't claim anything re the IHS etc. I can only speak for what I have in front of me.

That's fine, but in your initial reply you tried to make it sound like your 9900k was somehow the norm and not the exception, when it clearly isn't and every hardware review contradicts that.
 
Yeah, if you’re running all threads full out all the time and are overclocked without speedshift and adaptive voltage. But, you typically are not.

As mentioned many times my 9900k uses 50w peak while playing games, at 5ghz all cores, with cache at 4.7. Temps are barely above 50C and fans are barely ticking over.

Unless I got super lucky and won the silicon lottery (unlikely), you must be doing something wrong to always hit high temps.
50W @ 5GHz?
 
And before you say "oh that's with constant workload and AVX" well guess what? Most new games utilize AVX and even Apex Legends now pushes the CPU hard with AVX on. I find it very deceptive when people sometimes try to claim, "oh my 9900k@5/5.1 ghz runs so nice and cool" when they're running non-AVX games and/or benchmarks and then also fail to disclose they have AVX offsets w/their so-called golden silicon chips.

There's a huge difference between running Prime/Cinebench/etc. and running a game. I'm specifically referencing Doom Eternal and AC Origins, and yes, they both use AVX.

Urm... no need to get aggressive about it. As said, just presenting the facts as I see them with my hardware; as always, YMMV!
 
Last edited:
50W @ 5GHz?

Yes, when gaming. I'm 100% positive it's not 50W consistently, and 100% positive it's not running at 5GHz consistently either because, as mentioned, I have speedshift and adaptive voltage enabled. And I'm sure that even when I'm video encoding it's not running 5GHz constantly either. Read my thread here: https://hardforum.com/threads/quick...6xxx-7xxx-8xxx-9xxx-10xxx-processors.1996926/

I daresay many if not all reviews don't have HDC or Core parking enabled, and some won't have speedshift or all C states enabled either.
 
Last edited:
Clearly I was mistaken - during busy scenes it's 50-60W (benchmark screenshot attached), and I was off on the voltage front too (1.243v AVX peak, so not a special golden snowflake). Still far from 150-220W.
 

Attachment: Untitled.jpg (benchmark screenshot, 219.1 KB)
Last edited:
And before you say "oh that's with constant workload and AVX" well guess what? Most new games utilize AVX and even Apex Legends now pushes the CPU hard with AVX on.

I don't have Apex Legends; I'll grab it and test (if it is free). I have Shadow of the Tomb Raider if you want me to give that a whirl.
 
Yes, when gaming. I'm 100% positive it's not 50W consistently, and 100% positive it's not running at 5GHz consistently either because, as mentioned, I have speedshift and adaptive voltage enabled. And I'm sure that even when I'm video encoding it's not running 5GHz constantly either. Read my thread here: https://hardforum.com/threads/quick...6xxx-7xxx-8xxx-9xxx-10xxx-processors.1996926/

I daresay many if not all reviews don't have HDC or Core parking enabled, and some won't have speedshift or all C states enabled either.

Then your cpu never sustains 5 ghz all core with avx because you have c states, speedshift enabled and even turbo boost limit not maxed out. Download hwinfo64 and run an encoder that uses the cpu (eg davinci resolve) for an mp4 and show your cpu core speeds and temps. I also want to see Apex on all cores with temps shown. On mine with 1.38v it averages 68C but there are spikes when the game loads where the cpu hits 82C. If I encode a large mp4 in davinci resolve using the cpu it can hit 90C with a big file and all cores pegged at 5 GHz.
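
If it helps, here's a minimal logging sketch using psutil as a stand-in for HWiNFO64 (my assumption, not what anyone here is actually running; temperature readouts are only exposed on some platforms, mainly Linux) to capture per-core clocks and load while an encode or a game is running:

```python
# Minimal sketch: sample per-core load, frequency and a package temperature while
# an encode or game is running. psutil here is a stand-in for HWiNFO64 (an
# assumption, not what anyone in the thread used); sensors_temperatures() is
# only available on some platforms, mainly Linux.
import time
import psutil

def sample():
    loads = psutil.cpu_percent(percpu=True)
    freqs = psutil.cpu_freq(percpu=True) or []
    temps = psutil.sensors_temperatures() if hasattr(psutil, "sensors_temperatures") else {}
    pkg = next((t.current for readings in temps.values() for t in readings), None)
    freq_str = ", ".join(f"{f.current:.0f}" for f in freqs)
    print(f"load%={loads} MHz=[{freq_str}] temp={pkg}")

if __name__ == "__main__":
    for _ in range(60):     # ~60 seconds of one-second samples
        sample()
        time.sleep(1)
```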
 
Last edited:
I would look into how a new CPU would compare to using your 2080ti for video editing. If the 2080ti is going to be faster than any of the CPUs for editing you will have no use for so many cores.
 
Then your cpu never sustains 5 ghz all core with avx because you have c states, speedshift enabled and even turbo boost limit not maxed out. Download hwinfo64 and run an encoder that uses the cpu (eg davinci resolve) for an mp4 and show your cpu core speeds and temps. I also want to see Apex on all cores with temps shown. On mine with 1.38v it averages 68C but there are spikes when the game loads where the cpu hits 82C. If I encode a large mp4 in davinci resolve using the cpu it can hit 90C with a big file and all cores pegged at 5 GHz.


Going to move discussion to the other thread - downloading benchmarks and stuff now
 
Pick your board after checking out the Ryzen 3/4/5 series chipset AM4 Vcore Ratings spreadsheet, Emig5m.

Some boards are very much up to the task, some very much not.

I went with an X570 Aorus Ultra, and she's been fine w/ a 3950X thus far.
 
Ok, first off... frequency isn't everything.

The Ryzen 3900x has GOBS more cache than a 6700k. It's got a far newer architecture, improved in every way over the old 6700k.

We're talking a night and day performance difference in all metrics, not just clock speed.

Ditch the 6700k and get a 3900x or 3950x and your productivity will increase DRASTICALLY.
 
I would look into how a new CPU would compare to using your 2080ti for video editing. If the 2080ti is going to be faster than any of the CPUs for editing you will have no use for so many cores.

Nope... the 2080ti is good at one thing in editing, and that is outputting a final video with shittier quality than a CPU.

My 3960x is just as fast as my 2080ti quality-wise in DaVinci Resolve Studio. If I crank the quality up in DaVinci my CPU is slower than the 2080ti, but the quality shames NVENC.

NVENC uses virtually no CPU at all unless you're using AviSynth for CPU decode. Why would you? Just use CUDA for both.

You can get the same NVENC performance on a Sandy Bridge from 2012 or a 3950x from 2019. Lanes don't mean shit; the CPU just needs to run storage fast enough to feed the GPU video data.

NVENC and CUDA are like coin mining: 1x PCIe lane is all you need and the CPU is a lightweight brain.

The CPU is going to make a huge difference in editing tasks, and RAM is even more important than that in the big picture.

This is a general statement, but video editing is not Handbrake lol.
Adobe Premiere and DaVinci Resolve are editing.

If all you are going to do is Handbrake, then a GPU is all you need. But for real editing you need gobs of CPU and RAM, and a fast GPU is highly recommended for offloading tasks.
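
For anyone wanting to compare for themselves, here's a minimal sketch that shells out to ffmpeg to produce an NVENC encode and a CPU (libx265) encode of the same clip - assuming ffmpeg is on your PATH and built with NVENC support; the file name and bitrate are placeholders:

```python
# Minimal sketch: encode the same clip once with NVENC (GPU) and once with
# libx265 (CPU) so the two can be compared. Assumes ffmpeg is on PATH and was
# built with NVENC support; "input.mp4" and the bitrate are placeholders.
import subprocess

SRC = "input.mp4"  # placeholder source clip

# GPU encode: fast and light on the CPU.
subprocess.run([
    "ffmpeg", "-y", "-i", SRC,
    "-c:v", "hevc_nvenc", "-preset", "slow", "-b:v", "40M",
    "-c:a", "copy", "out_nvenc.mp4",
], check=True)

# CPU encode: slower, but typically better quality per bit.
subprocess.run([
    "ffmpeg", "-y", "-i", SRC,
    "-c:v", "libx265", "-preset", "medium", "-crf", "18",
    "-c:a", "copy", "out_x265.mp4",
], check=True)
```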
 
Last edited:
Yeap, GPU encoding still sucks compared to CPU. But for the OP, making videos for YouTube is hardly difficult. You don't wanna be like Tech Jesus and build a Threadripper rig just for YouTube lolz.
 
Are you guys sure GPU encoding still sucks? I find that as long as I keep the bitrate close to the original file, it looks nearly identical. It's when I use a pre-configured profile that the video quality degrades. YouTube degrades video quality more than anything, and Facebook drops 4K down to 720p! But my locally rendered files look pretty much as good as the originals using GPU encoding and sticking to a similar bitrate. Maybe that was a problem on older GPUs and software? Anyway, real-time smooth high-quality 4K/x265 previews on a quality video editor under $100 (I pay around $50 to upgrade from one version to the next) without having to make proxies is a game changer for me!
 
Are you guys sure GPU encoding still sucks? I find that as long as I keep the bitrate close to the original file, it looks nearly identical. It's when I use a pre-configured profile that the video quality degrades. YouTube degrades video quality more than anything, and Facebook drops 4K down to 720p! But my locally rendered files look pretty much as good as the originals using GPU encoding and sticking to a similar bitrate. Maybe that was a problem on older GPUs and software? Anyway, real-time smooth high-quality 4K/x265 previews on a quality video editor under $100 (I pay around $50 to upgrade from one version to the next) without having to make proxies is a game changer for me!

That right there shows the lower efficiency of GPU encoding. GPU encoding is fast for sure, but it doesn't look as good at the same bitrate, so if you are going for quality you lose out on size reduction.
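
If you'd rather not eyeball it, ffmpeg's ssim filter gives an objective score against the original. A sketch (file names follow the hypothetical encode example above):

```python
# Sketch: score each encode against the source clip with ffmpeg's ssim filter
# (values closer to 1.0 are closer to the original). File names follow the
# hypothetical encode sketch earlier in the thread; adjust to your own files.
import subprocess

def ssim_summary(encoded, source="input.mp4"):
    result = subprocess.run(
        ["ffmpeg", "-i", encoded, "-i", source, "-lavfi", "ssim", "-f", "null", "-"],
        capture_output=True, text=True,
    )
    # ffmpeg prints the SSIM summary on stderr, e.g. "... All:0.987654 (19.1)"
    return [line for line in result.stderr.splitlines() if "SSIM" in line]

for clip in ("out_nvenc.mp4", "out_x265.mp4"):
    print(clip, ssim_summary(clip))
```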
 