6700k: itching to upgrade (I'm using a 2080 Ti @ 1440p and do some 4K video editing). 3900X, or keep on rocking my 6700k?

Are you guys sure GPU encoding still sucks? I find that as long as I keep the bitrate close to the original file's, the output looks nearly identical. It's when I use a pre-configured profile that the video quality degrades. YouTube degrades video quality more than anything, and Facebook drops 4K down to 720p! But my locally rendered files look pretty much as good as the originals when I use GPU encoding and stick to a similar bitrate. Maybe that was a problem on older GPUs and software? Anyway, real-time, smooth, high-quality 4K/x265 previews in a good video editor under $100 (I pay around $50 to upgrade from one version to the next), without having to make proxies, is a game changer for me!
Sucks is a relative term. The quality is better than it was way back when, but you still need higher bitrates on GPU to maintain quality as good as CPU. The gap has closed a lot, though, so the speed benefit is nice in situations where you don't mind slightly larger files or slightly lower image quality. For video editing in real time, GPU is best. For a final encode, it's probably still best to use CPU, depending on whether you value time, drive space, or video quality the most. It's not as black and white as it used to be. I'm looking to switch my Plex server over to real-time GPU transcoding for serving video, but my original video source will still come from a CPU encode. Right now a real-time transcode takes a bit of oomph from my CPU that would be nice to offload (not that I do it often or even really notice any difference).
 
That right there shows the lower efficiency of GPU encoding. GPU encoding is fast for sure, but it doesn't look as good, so if you're going for quality, you lose on size reduction.

I agree: NVENC is faster, but software encoding provides better quality and smaller files. That's to be expected; in general, hardware encoders shoot for speed at the cost of some quality and bitrate by design.
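If anyone wants to test this themselves, here's a minimal sketch (Python driving ffmpeg; it assumes an ffmpeg build with both hevc_nvenc and libx265 on PATH, and the file name and bitrate are placeholders, not recommendations) that encodes the same clip at the same bitrate with both encoders so you can compare quality directly:

    # Encode one clip with NVENC (hardware) and x265 (software) at a matched bitrate.
    # Assumes ffmpeg is on PATH and was built with hevc_nvenc + libx265 support.
    import subprocess

    SRC = "source.mp4"  # placeholder input clip

    # Hardware encode: very fast.
    subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "hevc_nvenc",
                    "-b:v", "40M", "nvenc_out.mp4"], check=True)

    # Software encode at the same bitrate: much slower, usually better quality per bit.
    subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "libx265",
                    "-b:v", "40M", "-preset", "medium", "x265_out.mp4"], check=True)

Play both outputs next to the original and judge for yourself; at matched bitrates the software encode usually holds up better in fine detail.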
 
I gotta thank the OP for this topic. I am literally in the same boat, but with a 6800k. Based on what I've read here, I think I have more than enough juice in the tank to wait for the AMD 4000 series vs. Intel Ice Lake (or whatever) juicy battle that will be unfolding by the end of the year or in 2021.
 

Frankly, you'd want to wait to see how the 8-core consoles affect game development. Will devs use the extra cores for non-gaming features like better streaming/recording, or will we see ray tracing and 6+ physical cores actually being pushed?

On the work side it's a wash. If you're making money, you buy the tools that do the job best and increase invoice velocity. Luckily, in my line of work they're opening up more toward a gear allowance rather than handing me a MacBook Pro and a YubiKey.
 
If it's any consolation, I just ordered a B550 + 3700X for my son's desktop. He's currently on a 6600k. I'm expecting a pretty big difference in CPU-limited workloads (Blender, Photoshop) and a bit higher frame rates (and lower CPU usage) at 1080p (just got him an MSI 27" curved 144Hz QLED FreeSync gaming monitor, Samsung panel, 1080p, 1ms), but with his RX 570 he'll probably hit a GPU limit pretty quickly. I was going to put my 1600 in it and take the 3700X, but realized B550 only supports Zen 2 and up, and the 3300X is still MIA. Guess he'll have a better PC than me for a bit.
 
My son has a 3600 and a 5600 XT. Smokes 1080p with ease.

When you say video editing, are we talking Filmora-level stuff, or big-boy Blackmagic stuff? If DaVinci Resolve, then the fastest everything you can afford is necessary.
 

I know this is an old thread, but I wanted to chime in: most of what I've been reading and watching on YouTube about NVENC on Turing cards says it's vastly improved over its predecessors, to the point that it's perfectly good enough for the hobbyist or YouTuber.
 
What I would do is try running your editing workload on a friend's newer CPU/PC to see if the upgrade is worth it time- and cost-wise.
 
I would imagine that your GPU alone cost more than the entire rest of your current system combined, so I say go all out on a 3950X/10900K.
 
If the main function is gaming, one might as well just stick with the 6700k. I upgraded from a 7700k to a 3700X with a 2080 Ti and it was a sidegrade at best.
 

I challenge everyone who keeps saying this to load up Satisfactory and download ImKibitz's save map.
 

I don't think anyone would deny that under certain cherry-picked conditions there won't be any difference at all; it's just that the majority of the time you probably won't notice anything worthwhile. On the Steam forums I saw someone complaining that their 9900k with a 2080 Ti was dipping down to 30 fps in Horizon Zero Dawn, and here I am with an old and lowly 6700k sitting steady at around 100 fps, blowing their doors off.

I could go pick up a brand-new 10850k today (which I might actually do for nothing other than to ease my curiosity, and you only live once, hehe), but I know a new mobo with possibly flaky, un-optimized drivers, firmware, etc. would be all it would take to make faster, newer hardware perform slower than my old stuff. I was watching a benchmark video on YouTube and I was getting the same exact frame rates as a system with a 9900k in the new Jedi game. So if you have to cherry-pick a select few situations where it shows an improvement, is it really worth the money and time to dink around with clean installs, setting everything back up, etc.?
 
OK, so I got a 10850k, an MSI Z490 mobo, and 32GB of DDR4-3600, and what a joke: not even one frame per second more at 1440p! Definitely still GPU-bound, even with a 2080 Ti! I have 15 days to return it all at Micro Center, so I'll probably return everything and get an RTX 3090. Anyone out there with a 6700k-era CPU: just hold onto it a little longer and let the GPUs play catch-up (unless you play at 1080p with all the graphics turned down).
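By the way, an easy way to confirm you're GPU-bound is to watch GPU utilization while you play: if it sits near 100% while the CPU has headroom, a faster CPU won't add frames. A quick sketch that logs it once per second (assumes the NVIDIA driver's nvidia-smi tool is on PATH):

    # Poll GPU utilization once per second via nvidia-smi.
    # A near-constant 99-100% while gaming means the GPU is the bottleneck.
    import subprocess
    import time

    while True:
        util = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True).stdout.strip()
        print(f"GPU utilization: {util}%")
        time.sleep(1)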
 

Not very surprised. The difference in many games is absolutely nothing. What games did you test it with?
 

Horizon Zero Dawn and Monster Energy Supercross 3. Even more sadly, I don't really see any perceivable boost in video or photo editing either; my video editor will max out all 10 cores, but it is also GPU-accelerated. I am keeping the system/parts, though, for other reasons (more stable; the old system would crash in Horizon Zero Dawn and Vegas Video). I reckon the CPU will be better utilized with the RTX 3090, but I have no expectations. The 6700k is still one hell of a processor, IMO.
 
Man, thanks for posting this and doing the legwork for me. I'm in an identical situation: a 6700k (down at 4.4GHz due to recent stability issues) and a 2080 Ti that I'm using for 1440p @ 144Hz. I've been waiting to see what the next Ryzen chips do, but now you've got me second-guessing myself...
 
My 6700k only does 4.4GHz, so that's the baseline I'm looking to improve on (I feel like I've been using this CPU forever; it's definitely my longest-lasting CPU).
I game mainly at 1440p and like to crank the eye candy as high as possible on a 144Hz G-Sync monitor, but I sometimes play on my 4K HDR 60Hz TV, all on a ROG RTX 2080 Ti. I will never play at or worry about 1080p again, so if the only difference would be at 1080p with the graphics turned down, I'll definitely skip upgrading.

I'm considering AMD because the Intel K series is sold out everywhere I shop.

I also do 4K video editing in Movie Studio Platinum (thinking about upgrading to Vegas Pro), but gaming is the first priority and what matters most to see a boost in; the rest would be icing on the cake.

The motherboard I'm looking at is the ASUS TUF Gaming X570-Plus (Wi-Fi), an AMD AM4 ATX board. Is this a good choice?

I haven't used AMD since the Athlon 64, so I really don't know what's more important with RAM on these new AMD chips: MHz or latency? Would it be best to get the fastest RAM (say, DDR4-4400) and run it at a lower speed (DDR4-3600) but with tighter timings? Recommend me some RAM, preferably something Microcenter carries in stock so I can pick it up this weekend.
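From what I've read so far, the rough way to compare kits is first-word latency: nanoseconds = 2000 × CL / transfer rate (MT/s), so a higher-clocked kit with looser timings can still come out slightly ahead. A quick sketch of the math, with hypothetical example kits (not recommendations):

    # First-word latency in ns = 2000 * CAS latency / transfer rate in MT/s.
    # The kits below are hypothetical examples, just to illustrate the math.
    def latency_ns(cas_latency, transfer_rate_mts):
        return 2000 * cas_latency / transfer_rate_mts

    print(round(latency_ns(16, 3600), 2))  # DDR4-3600 CL16 -> 8.89 ns
    print(round(latency_ns(19, 4400), 2))  # DDR4-4400 CL19 -> 8.64 ns

On Zen 2 specifically, DDR4-3600 is usually cited as the sweet spot, since the Infinity Fabric clock runs 1:1 with the memory clock up to around 1800MHz and falls back to a divider beyond that.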

Bro!

Wait a bit for the AMD 4000 series; they're coming soon! I'm using an i7 6700k as well, with a GTX 1080. I'm still loving this older tech, and I believe 4K is still overrated for gaming right now, so I'm waiting a year or so to get an RTX 3060 or 3070. Bottlenecks, smottlenecks... I'm not at all worried about that right now. Ray tracing isn't even that improved with the Nvidia 3000 series over the last generation, so I can wait and squeeze any remaining fun out of my 1000-series GPU. Games still play and look great in 1080p right now on this new panel I just bought!


Dude
 
4K isn't overrated if you have a large screen. On my 75" HDR 4K screen, games will still show some aliasing at 4K without anti-aliasing, but otherwise it looks breathtaking. On smaller screens, such as 27", I find 1440p the sweet spot. 1080p really looks chunky to me now.
 
I actually upgraded from a 6700K to a 3900X. Keep the 6700K for now. Unless you are rendering or doing video editing, it's not worth the upgrade for gaming. Wait for Zen 3; rumor is a 20% improvement.
 