6700k paired with 3080 (bottleneck?)

legcramp

[H]F Junkie
Joined
Aug 16, 2004
Messages
11,414
Forget 6700k, even 4790k is still decent for gaming and shouldn't hold back a 3080 much at 4k.

No it's not. I just swapped my i7 4770 for a Ryzen 3600 in my living room PC. It was holding back my RX 570 4GB at 1080p in many games; it was a stuttery mess.
 

dpoverlord

[H]ard|Gawd
Joined
Nov 18, 2004
Messages
1,810
I have a 5930k at 4.45ghz

Everything is fine at 4K except my 4K modded Skyrim.

I want a 3090 and am debating between a 144Hz big screen or the new LG for FPS games.

Oddly, neither TV will still do everything I want coming from my Samsung 4K.

Lesson: there's always something better. If you want to spend $1k+ for max return and have the cash, go 3090/3080.

Want a few benchmark fps? Get the chip.

If you'd come here saying you were on X58 we could say yeah, upgrade, but even X99 is still rocking hard today.

I'd definitely wait for something more exciting in the mobo/CPU realm... but hey, that's why Intel is now behind (kinda).
 

dewi_imut

Weaksauce
Joined
Jan 16, 2007
Messages
72
Hello everyone,

I usually upgrade my GPU every other generation. I currently have a 6700K (overclocked to 4.6 GHz) with a GTX 1080 and plan to game on my 4K TV (120 Hz, G-Sync) via HDMI 2.1. Do you think an RTX 3080 would be bottlenecked by my older Skylake CPU at 4K resolution? My current specs are:

i7 6700K OC'd to 4.6 GHz
GTX 1080
16 GB DDR4-3000 RAM
Asus Hero mobo (PCI Express 3.0)
750 W PSU

I would prefer to keep the rest of my PC and replace just the GPU unless bottlenecks are inevitable. Thanks in advance.

I think you're already at the minimum comfortable spot with that card.
If you plan on 4K, even an RTX 3080 will struggle to reach >60 fps in games at high settings; you'll need an RTX 3090 for 4K 120Hz.
I also doubt the RTX 3090 can average 120 fps at high settings in 4K.
 
Joined
Jan 5, 2018
Messages
2
Using a Gigabyte RTX 3080 Gaming OC here with an overclocked 6700K at 4.6 GHz. Overall performance in benchmarks matches the average. Serious bottlenecking happens only in very crowded parts of Night City, where frames can dip to 40 with the CPU at 99% and the GPU at 70%. I'm playing at 3440x1440. In other parts of the game I get 60-75 FPS, and the outskirts are 80-90 FPS. All maxed out, RT on Ultra, DLSS on Balanced.
Apart from that, no game is showing this. The performance boost over the previous card (Aorus GTX 1080 Ti) is insane.
Deus Ex MD, TW3, Control, XCOM 2, Hellblade, PUBG, GTA5 are all maxed out and most are locked at 100 FPS or higher (if G-Sync is turned off). In GTA5 I cranked MSAA to some insane settings and it's smooth as butter at all times.
Glad about the decision to grab the card 3 days after release. The good old Skylake can pull it quite well. It will be replaced, but it delivers better than expected.
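(Side note on reading those overlay numbers: a CPU pegged near 99% while the GPU sits around 70% is the classic signature of a CPU limit, and the reverse pattern points at the GPU. The little Python sketch below is purely illustrative, not anything the poster ran; the 95% threshold and the sample values are arbitrary assumptions, and keep in mind an averaged CPU percentage can still hide one maxed-out thread.)

```python
# Toy classifier for frame-overlay samples: whichever unit is pegged while the
# other has headroom is the likely limiter. The threshold and the sample
# numbers below are invented for illustration only.

def bottleneck(cpu_util: float, gpu_util: float, busy: float = 95.0) -> str:
    """Very rough call based on average utilisation percentages."""
    if gpu_util >= busy:
        return "GPU-bound"
    if cpu_util >= busy:
        return "CPU-bound"
    # Caveat: an average CPU % well under 100 can still hide one saturated thread.
    return "neither pegged (frame cap, engine limit, or a single busy thread)"

samples = [
    {"scene": "crowded city street", "cpu": 99, "gpu": 70},  # hypothetical
    {"scene": "outskirts",           "cpu": 60, "gpu": 98},  # hypothetical
]

for s in samples:
    print(f'{s["scene"]}: {bottleneck(s["cpu"], s["gpu"])}')
```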
 

SOAREVERSOR

Limp Gawd
Joined
Apr 10, 2017
Messages
367
I moved from a 7700k to a 10850k and it's night and day. But I keep a lot of shit running for a long time. I had a bunch of PCIe 3.0 SSDs based on U.2 and Optane, so AMD wasn't an option, and the same goes for my cooling solutions. I'm happy with it, but it's an EOL system at this point (2080 Super) and I'll camp on it till PCIe 5.0 and new things.
 

CAD4466HK

[H]ard|Gawd
Joined
Jul 24, 2008
Messages
1,232
Yeah, CPU bottlenecks are real; I found out the hard way when I upgraded from a 1060 to a 2060. My 2600K @ 4.6GHz pushed the 1060 for all it was worth at 1080p/120Hz.

But as soon as I got the 2060, I never got more than 76% usage out of it unless I ran it at 1440p. I was lucky to see more than 10-15 frames gained over the 1060. Dropped a few bones on a 10700K and was blown the fuck away by the performance at stock vs. the 2600K with the same card.

I still can't get over the staying power 4c/8t CPUs have retained all these years though. Who would have thought we could still game on a 10-year-old CPU?
 

Geezus

Limp Gawd
Joined
Apr 9, 2018
Messages
314
I went from a 2500k to 6700k and saw a huge difference on my 1080 ti at the time.

Now I've gone from the 6700k (4.6GHz) to a 5600x with a 3080. Other than Windows feeling a little snappier, I don't see any FPS gains at 1440p or 4K (maybe 5%?).

Maybe when more multi-threaded games come out next year (because of the new consoles) it'll make a difference, but for right now it wasn't worth it IMO.
 

III_Slyflyer_III

Limp Gawd
Joined
Sep 17, 2019
Messages
316
After gaming at 4K with my 3090 for over a month now on my overclocked 5960x, I am 100% waiting till LGA1700 and DDR5. No reason to sink money into a new build right now with the crazy prices and somewhat dead-end platforms (after Intel 11th gen anyway). Can't say I'm familiar enough with AMD these days to know whether their next series will be a new socket as well.

I figure jumping on DDR4 when it first came out with X99 turned out to be an amazingly long-lasting move for me; maybe jumping to LGA1700 with DDR5 will prove to be the same kind of long-lasting jump.
 
Joined
May 20, 2016
Messages
853
It's only a problem if your goal is to game above 60 fps, I would say, in most cases. If you plan to upgrade to an HDMI 2.1 monitor in the near future, then you should do a CPU upgrade to an 8-core or higher at that point.
 

Archaea

[H]F Junkie
Joined
Oct 19, 2004
Messages
10,905
You might consider getting a 6950X off eBay to breathe more oomph into your X99 board. 10 cores/20 threads.
6950Xs are beastly CPUs and only <$300 on eBay. That'll tide you over till AM5 easily (or until Intel releases something that actually makes waves). Or frankly, more objectively, until AMD or Nvidia can release a graphics card that isn't the bottleneck at 4K, because the 5th and 6th gen Intel CPUs being discussed here in this X99 thread are not the bottleneck in high-resolution gaming.

I'm pretty convinced I have zero reason to upgrade right now from my 6950X gaming at 3440x1440, except boredom. Every time I look at game benchmarks at 4K with various CPUs and a top-tier video card, there's just nothing to gain.
 

undertaker2k8

[H]ard|Gawd
Joined
Jul 25, 2012
Messages
1,031
Don't forget that min FPS is massively impacted by the CPU; I saw a big jump in CP2077 when moving from a 5960x to a 5800x.
From the 5800x to the 5900x, nada, but I knew that already and wanted it anyway lol. And yes, ironic naming overlap between Haswell and Vermeer...
 

Archaea

[H]F Junkie
Joined
Oct 19, 2004
Messages
10,905
Don't forget that min FPS is massively impacted by the CPU; I saw a big jump in CP2077 when moving from a 5960x to a 5800x.
From the 5800x to the 5900x, nada, but I knew that already and wanted it anyway lol. And yes, ironic naming overlap between Haswell and Vermeer...
What games, what mins?

I recently put 110 hours into Cyberpunk 2077 (arguably the most hardware-punishing game there is) with my Intel 6950x. I didn't see any behavior on minimums that was alarming, or even visible, and I played with the Windows 10 Xbox overlay FPS counter up the whole time. The game ran surprisingly well with all settings maxed out and ray tracing on Psycho at 3440x1440, about mid-50s FPS average (with G-Sync). I'm not sure frame times should be significantly different between a Skylake or Broadwell 8- or 10-core and a modern 8- or 12-core CPU at 4K resolution.

[Chart: 3090 and various CPUs at 4K]


[Chart: Vega 64 and various CPUs at 1080p]
 

undertaker2k8

[H]ard|Gawd
Joined
Jul 25, 2012
Messages
1,031
Those canned tests don't capture the whole game. My frame rate on the 5960x at 4 GHz would drop down to the mid-30s in certain sections with lots of NPCs; with the same 3080 it stayed around 50 in those exact same areas with Zen 3. Overall FPS per level probably didn't change too much. Running 3840x1600 UW maxed out...
 

Archaea

[H]F Junkie
Joined
Oct 19, 2004
Messages
10,905
Those canned tests don't capture the whole game. My frame rate on the 5960x at 4 GHz would drop down to the mid-30s in certain sections with lots of NPCs; with the same 3080 it stayed around 50 in those exact same areas with Zen 3. Overall FPS per level probably didn't change too much. Running 3840x1600 UW maxed out...
The only place I saw the 30 FPS range was in the forested city park area of the Glen. You aren't getting 30 FPS in that area now with all max settings on your Ryzen 5900x CPU?

Post 2280
https://hardforum.com/threads/cyber...ieres-june-25.1998205/page-57#post-1044895126
 

undertaker2k8

[H]ard|Gawd
Joined
Jul 25, 2012
Messages
1,031
I'll have to test, but the 5800x was definitely about 10-15 fps better for mins in that section; I expect the 5900x to be the same. I never replay a game... currently on medium, Ghostrunner was just too frustrating lol.
 

Archaea

[H]F Junkie
Joined
Oct 19, 2004
Messages
10,905
I'll have to test, but the 5800x was definitely about 10-15 fps better for mins in that section; I expect the 5900x to be the same. I never replay a game... currently on medium, Ghostrunner was just too frustrating lol.
Do test for me please, because that's the only section of the game that reliably dipped below the 50 FPS range and into the 30s for me. If it really makes a difference there, that'll be interesting. You can see in my screenshot that that area used LESS CPU. I think it was getting hammered by the Psycho ray tracing setting (all max settings) and all the leaf/foliage the GPU had to render, which seems to free up the CPU load. My GPU in Cyberpunk was normally at 90-97% and my CPU at 60-80%, but in that scene (the lowest FPS in the game by far) my GPU was at 100% and my CPU was at 45%.
So I'm confused about why a faster CPU would help.
[Screenshot: that scene with the CPU/GPU usage overlay]
 

III_Slyflyer_III

Limp Gawd
Joined
Sep 17, 2019
Messages
316
One thing most "benchmarks" on sites do not compare is overclocks on the CPUs they test when comparing across a line. If you can run your 5960x at 4.5GHz+ or your 6950x at 4.3GHz+, you can still play ball with most modern CPUs, very much so for gaming performance at 2K and 4K... even 1080p honestly, if you are pushing more frames than you can see anyway. When I compared my heavily OC'd 5960x to other CPUs in Cinebench, I was up there with a 3700X and 8700K, which are very much more "modern" CPUs. The X99 platform had some weaker IMCs, but later batches of Haswell-E and Broadwell-E had some good IMCs that could do 3200MHz DDR4 with good timings pretty easily (which is a great price/performance range for DDR4 on Intel).

I have also put in a ton of hours on CP2077 at 4K, everything maxed, Psycho RT, balanced DLSS, and Slow HDD Mode = On (removes all stuttering on SSDs and NVMe drives too); the game is smooth as butter (generally 50+ FPS). Like the post above, it chokes only in that park within the city, and that's likely the Psycho RT at work there. Other games I play, like BFV at 4K with RT on, I am almost always pegged at 120+ FPS, with dips into the low 100s / high 90s on heavy RT maps like Rotterdam and Solomon Islands.

Want proof of how "non-limiting" some older high end platforms can be? Check out our very own thread in these forums...

https://hardforum.com/threads/rtx-3090-h-owners-official-3dmark-time-spy-leaderboard.2003111/

I am still within the top 10 for graphics score in Time Spy with my 3090, and that's actually a 2K (1440p) test, not 4K! Up there with 10-series Intels and 5xxx-series AMDs.

I don't think anyone would deny that a newer platform is always better... you will eventually gain Smart Access Memory for your GPU, and maybe someday PCIe 4.0 will actually make a difference for video cards (it does not matter at all now). But for anyone with an HEDT X99 or X299 setup with a good OC, I would wait for the second half of 2021 for DDR5 and maybe PCIe 5.0 at this point. As of right now, most games will run perfectly fine at 2K and 4K on these older CPUs. I would say you need 8C/16T minimum though, with a good OC.
 

evhvis

n00b
Joined
Feb 12, 2021
Messages
34
A 7700K was a massive bottleneck in modern games with a 3080. I got my GPU before my new CPU, so I had to run on the 7700K for a while (5 weeks delivery time on the CPU even though I ordered 2 minutes after launch). Much smoother framerates with the new CPU, and in some games a 70% increase in framerate at 1440p. Even in Cyberpunk 2077 with ray tracing on Ultra and DLSS off, the difference was about 15% with lots of people in the scene; with DLSS the difference was closer to 50%.

While the graphics card does a lot more of the work at 4K, the CPU will still be a problem, since the stuttering is the CPU not being able to process data fast enough. Higher res means more draw calls etc., and an overloaded CPU will struggle. Even when GPU bound, the min framerates will still suffer. The problem is the lack of threads on the 6700K and 7700K. With an 8-core/16-thread you should be fine, and probably with a 6-core/12-thread even at lower clocks, but 4 cores/8 threads is too low.
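To make the "a frame takes as long as the slower of the CPU and GPU" intuition concrete, here is a minimal back-of-the-envelope sketch in Python. All the per-frame millisecond costs are made-up illustrative numbers, not measurements from any system in this thread:

```python
# Rough model of a CPU bottleneck: the CPU prepares each frame (game logic,
# draw calls) and the GPU renders it, so frame time is roughly the slower of
# the two. All per-frame costs below are hypothetical.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate FPS when CPU and GPU work mostly overlap."""
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_1440p_ms = 8.0   # fast GPU at 1440p
gpu_4k_ms = 14.0     # same GPU at 4K (heavier render load)

for label, cpu_ms in [("4c/8t", 16.0), ("8c/16t", 9.0)]:
    print(f"{label}: {fps(cpu_ms, gpu_1440p_ms):5.1f} fps at 1440p, "
          f"{fps(cpu_ms, gpu_4k_ms):5.1f} fps at 4K")

# The hypothetical 4c/8t part caps both resolutions at ~62 fps, while the
# 8c/16t part lets the GPU run free at 1440p (~111 fps) and is only
# GPU-limited at 4K (~71 fps). Stutter shows up when the CPU's per-frame cost
# spikes above the GPU's, which is why minimums suffer first.
```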
 

WilyKit

n00b
Joined
Dec 18, 2020
Messages
53
I'm also on a 6700K and will continue to be, until Intel actually makes some kind of advancement on the CPU front. A 10900K's benefit would have been minimal, and that's over FOUR years after my 6700K build.
I considered AMD too for a while, but I don't want to go with AM4, which is also already 4 years old.
I'll wait for AM5, unless Intel gets their sh*t straight before that.

A 3090 will be my GPU, and I don't expect much difficulty with my current 4-year-old CPU/mobo combo. The older PCIe generation might bottleneck at very high framerates, which won't be a problem for 4K gamers.
Big Navi might be good, but they're 1.5 months late with their release, so they will lose a lot of enthusiasts' money to Nvidia.
The older PCIe generation is the LEAST of your bottlenecks.
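On the PCIe point, a quick bit of arithmetic shows why the older generation mostly matters at very high framerates: the bus moves a roughly fixed number of bytes per second, so the per-frame transfer budget shrinks as FPS climbs. The bandwidth figures below are approximate single-direction peaks and the whole thing is a simplification, not a measurement:

```python
# Per-frame PCIe transfer budget at different framerates. Bandwidth figures
# are approximate one-direction peaks; real games transfer far less than this
# budget most of the time, which is why the link is rarely the limiter at 4K/60.

PCIE3_X16_GBS = 15.75   # ~GB/s, PCIe 3.0 x16
PCIE4_X16_GBS = 31.5    # ~GB/s, PCIe 4.0 x16

for fps in (60, 120, 240):
    budget3 = PCIE3_X16_GBS * 1024 / fps   # MB available per frame
    budget4 = PCIE4_X16_GBS * 1024 / fps
    print(f"{fps:>3} fps: ~{budget3:5.0f} MB/frame on 3.0 x16, ~{budget4:5.0f} MB/frame on 4.0 x16")
```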
 