Xeon bottleneck 1070?

Hi,

I'm running a Xeon X5670 @ 3.8GHz, sometimes 4.2 depending on the task.

Is this CPU likely bottlenecking my Asus GTX 1070 STRIX OC?

I ran the Valley benchmark with Task Manager open on the Performance tab to keep an eye on the CPU, and GPU-Z to watch the GPU. The GPU was hitting 99% usage while the CPU was around 30-40%. Does this generally confirm there is no bottleneck happening?

Is that how you would generally test this type of thing?

Cheers!
 
That depends entirely on the game and the resolution you're playing at. Some games are pretty CPU demanding (Valley is far from that) and would bottleneck that GPU even at 2560x1440, and it only gets worse at 1920x1080. So yes, depending on the game you can see anything from a minor bottleneck to a severe one; the thing to do is play the games you actually like and see how they behave. I recommend the MSI Afterburner OSD, since it can monitor both GPU load and the load on each individual CPU core. If even a single core is constantly sitting at 80%+ load, you are likely CPU-bottlenecked in that game.
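If you'd rather log it than eyeball an overlay, here's a minimal sketch of the same per-core check in Python, assuming psutil is installed and an NVIDIA driver provides nvidia-smi (this is just a rough stand-in, not what the Afterburner OSD does):

```python
import subprocess

import psutil  # assumed installed: pip install psutil


def gpu_utilization():
    """Query GPU load via nvidia-smi (assumes an NVIDIA driver is installed)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])


for _ in range(60):  # sample for about a minute while the game runs
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # % load per logical core
    gpu = gpu_utilization()
    hottest = max(per_core)
    print(f"GPU {gpu:3d}% | hottest core {hottest:5.1f}% | "
          f"all cores {[f'{c:.0f}' for c in per_core]}")
    # Rule of thumb from the post above: a single core pinned at 80%+ while the
    # GPU sits well below ~99% points to a CPU (single-thread) bottleneck.
```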
 
It might.

Running an i7-990X @ 3.47GHz (no overclock) and a GTX 1070 on a 1080p monitor. The only game I have seen so far that can make this setup stumble a bit is Dying Light with the video options all maxed out. 90% of the time it's fine, but in The Following DLC there are parts of the game in open fields with 75+ zombies stumbling around. In those situations I see a stutter or two while the CPU is figuring out the positioning of a crapload of moving bad guys, waving grass, shimmering trees and leaves, and me, hacking them all to oblivion with a meat cleaver, blood and body parts flying.

I'm sure more processing power would cure it all, but I'm still waiting for a 1366 replacement that doesn't suck.
 
Check out my little test I did not too long ago. I know it's just a few applications, but still :)

It's not just a few applications, it's only one game, and the conclusions don't quite match your own table of results; the commentary should be explained better. For example:
Interesting fact is that both 3.67GHz and 4.2GHz overclocks on the Core i5 760 processor yielded almost identical results, meaning, a first gen quad-core processor above the 3.67GHz mark is able to take full advantage of the GTX 1070 GPU in GTA V

But the table shows an average of 112 FPS there, while the simulated 4c/8t result at the same 4.2GHz is 115 FPS and the 980X at the same speed gets 121 FPS. So yes, there is a bottleneck, especially since minimums aren't displayed, and those are important and valuable information.

I know you talked mostly about 3DMark, but with those GTA V results in there a totally different conclusion could be reached, since you didn't use another platform to see whether even the 121 FPS the X5670 achieves at 4.2GHz falls short of what a more modern CPU would deliver.
 
There is no such thing as a 6c/12t CPU running at 3.8GHz being so weak that it holds back a mid-range card like the GTX 1070 from reaching its potential. The opposite is more likely; I dare say that will remain true over the next two graphics card generations as well. Any game that reaches 30% CPU utilization on an overclocked Xeon X5670 is pretty brutal CPU-wise. You would need something otherworldly CPU-intensive like Arma or Civ V to stress this CPU, and even then there is not much you can do in a single-socket build to solve it that improving cooling to reach 4.2GHz on the X5670 would not already deliver.
 
There is no such thing as a 6c/12t CPU running at 3.8GHz being so weak that it holds back a mid-range card like the GTX 1070 from reaching its potential. The opposite is more likely; I dare say that will remain true over the next two graphics card generations as well. Any game that reaches 30% CPU utilization on an overclocked Xeon X5670 is pretty brutal CPU-wise. You would need something otherworldly CPU-intensive like Arma or Civ V to stress this CPU, and even then there is not much you can do in a single-socket build to solve it that improving cooling to reach 4.2GHz on the X5670 would not already deliver.

You don't need to reach 30% CPU utilization =) you only need to hit 80-100% utilization on a single core/thread to be abysmally CPU-bottlenecked, and one fully loaded thread out of 12 is less than 10% total CPU load on a 6c/12t CPU (about 8%).

A GTX 1070 performs more or less on par with a 980 Ti, and even at stock clocks I found some games to be single-thread bottlenecked by my 6700K. And that's JUST with a 980 Ti; I need 4.6GHz to get the full potential out of my card overclocked to 1500MHz at 1080p. I also know of a few titles that are heavily CPU-bottlenecked with an overclocked i7 6700K and a Titan XP even at 1440p, so yes, I know of a lot of games that can be trouble for that CPU.

The opposite is more likely; I dare say that will remain true over the next two graphics card generations as well.

Try that again with a Titan X Pascal, not with a low-end GTX 970.
 
Xeon W3690 and R9 Fury combo here; I've never noticed any bottleneck in the games I've been playing.
 
Download and run Process Explorer: https://technet.microsoft.com/en-us/sysinternals/processexplorer.aspx
Select the process/game you want to know about.
Go into the per-thread CPU utilization view.
If any thread is hitting close to 100% divided by your number of logical cores (i.e. one logical core fully loaded), then yes, you are CPU-bottlenecked.
If your overall CPU utilization is hitting close to 100%, then yes, you are CPU-bottlenecked.


A bottleneck is not set in stone for a given set of hardware; it depends on the load pattern of your system/software. So the only precise way to find out is to actually look at your own load pattern.

I'm still working on a program that will do this for you easily, but really this takes like 5 mins to do and it's way more precise than asking some random dude who doesn't play your game/settings/resolution etc.
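For what it's worth, here is a minimal sketch of that per-thread check in Python with psutil (my own choice of tooling, not the program mentioned above; the process name is just a placeholder):

```python
import time

import psutil  # assumed installed: pip install psutil

GAME_EXE = "game.exe"  # hypothetical process name; change to your game's executable

# Find the game's process by executable name (raises StopIteration if not running).
proc = next(p for p in psutil.process_iter(["name"])
            if p.info["name"] and p.info["name"].lower() == GAME_EXE)

interval = 2.0

# Sample per-thread CPU time twice and work out how busy each thread was.
before = {t.id: t.user_time + t.system_time for t in proc.threads()}
time.sleep(interval)
after = {t.id: t.user_time + t.system_time for t in proc.threads()}

for tid, t_after in sorted(after.items()):
    busy = (t_after - before.get(tid, t_after)) / interval * 100  # % of one core
    if busy > 1:
        print(f"thread {tid}: {busy:5.1f}% of one logical core")

# If one thread sits near 100% of a single core (while overall CPU usage may be low),
# that thread is the limiting factor -- a classic single-thread CPU bottleneck.
```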
 
There is no such thing as a 6c/12t CPU running at 3.8GHz being so weak that it holds back a mid-range card like the GTX 1070 from reaching its potential. The opposite is more likely; I dare say that will remain true over the next two graphics card generations as well. Any game that reaches 30% CPU utilization on an overclocked Xeon X5670 is pretty brutal CPU-wise. You would need something otherworldly CPU-intensive like Arma or Civ V to stress this CPU, and even then there is not much you can do in a single-socket build to solve it that improving cooling to reach 4.2GHz on the X5670 would not already deliver.

I'm sorry, but you clearly don't know how multithreaded software behaves on multicore systems.
30% overall CPU utilization can easily be a CPU bottleneck on any quad-core or better system.
And statements like "CPU X can't bottleneck GPU Y" also show a lack of understanding of what causes a bottleneck: without taking the software load into account, you can't determine whether a system is bottlenecked.
 
Check out my little test I did not too long ago. I know it's just a few applications, but still :)

Some issues with your "test":
1: You are only testing 2 titles, so it's really not representative of a lot of games.
2: Did I miss this, or did you not even state what resolution/settings you ran this in?


Different software loads, from different software or different settings, change when a CPU becomes a bottleneck.
Older games and lower resolutions put a harder demand on the CPU to be able to feed the GPU fast enough.

Let's say you have a CPU that can deliver 75 FPS worth of data (results from doing physics, collision detection, AI, path-finding, etc.),
and your GPU can rock out 120 fps at 800x600 and 90 fps at 1280x1024 but only 50 fps at 1920x1200.

Then you test at 1920x1200 and say "hey, this CPU is not a bottleneck for this graphics card", even though it actually is in the other two situations (see the sketch further down).
The lack of information on how you tested makes your results pretty meaningless :(
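To put that in numbers, here is a tiny sketch of the min(CPU fps, GPU fps) idea, using the hypothetical frame rates from the example above:

```python
# The effective frame rate is capped by whichever side finishes its work last:
# roughly min(what the CPU can prepare, what the GPU can render).
CPU_FPS = 75  # hypothetical: frames of game logic the CPU can prepare per second

GPU_FPS = {   # hypothetical GPU limits from the example above
    "800x600": 120,
    "1280x1024": 90,
    "1920x1200": 50,
}

for resolution, gpu_fps in GPU_FPS.items():
    effective = min(CPU_FPS, gpu_fps)
    limiter = "CPU" if CPU_FPS < gpu_fps else "GPU"
    print(f"{resolution:>9}: ~{effective} fps, limited by the {limiter}")

# 800x600 and 1280x1024 come out CPU-bound (the GPU could go faster), while
# 1920x1200 is GPU-bound -- so testing only at 1920x1200 hides the CPU limit.
```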
 
2: Did I miss this, or did you not even state what resolution/settings you ran this in?

I really thought I had added the GTA V settings, thanks for pointing it out; I'm going to add them to my post. I benched GTA V at 1680x1050 / no AA / the rest pretty much maxed out, and I'll specify the settings a bit later. When I have more free time I might do a similar test with more games, as well as min FPS and average GPU/CPU utilization figures.
 
Thank you.
May I advise also looking at per-thread utilization in the software; it shows more direct results with regard to CPU bottlenecks.
The extra information will help analyse the results, though it might scare the less tech-savvy readers of the article.
 
I'm sorry, but you clearly don't know how multithreaded software behaves on multicore systems.
30% overall CPU utilization can easily be a CPU bottleneck on any quad-core or better system.
And statements like "CPU X can't bottleneck GPU Y" also show a lack of understanding of what causes a bottleneck: without taking the software load into account, you can't determine whether a system is bottlenecked.

And you clearly did not understand that the OP is talking about GAMING with CPU X and GPU Y, hence my comment was based on how games behave with a 3.8GHz 6c/12t Xeon, which most of the time is: "OMFG, I was not coded to take advantage of this many cores!" I even went the extra mile and pointed out 2 games infamous for high CPU usage.

Back to the OP, [H]ere is an idiot's definition of bottlenecks:

1-CPU bottleneck: when your gaming experience improves with a faster CPU

2-GPU bottleneck: when your gaming experience improves with a faster GPU

Scenario 1 is rare: a very old game played at very low settings and resolution with a very new GPU pumping out an obscene number of frames per second, but still below your otherworldly monitor capable of 200+ Hz refresh rates. While it is pumping out that obscene fps, look at CPU usage; if it is 100%, then you MAY have a CPU bottleneck. The absurd number of frames can be improved with a faster CPU, but only if you are still below the monitor refresh rate.

Scenario 2 happens all the time: just pick a new game, put it at the highest possible resolution across a 5x1 multi-monitor setup and give Ultra settings a try. If your fps is below the monitor refresh rate and your CPU usage is below 100%, your gaming experience will improve with a faster GPU.

When I said that a 1070 cannot be bottlenecked by an overclocked X5670, I assumed that:

A-The 1070 was bought for 1080p/120Hz, 1440p/60Hz, or 4K/60Hz with non-Ultra settings.
B-The 1070 will be used with new games, nothing more than 10 years old.

Assuming both conditions, it is not possible to CPU-bottleneck a 1070.

Now pick a 200-240Hz monitor and a very old game at 1280x800, and there is a theoretical possibility that the game hits 100% CPU usage before the monitor refresh rate is reached. But even then I doubt it will be a frequent situation for most games, because we are talking about a CPU with 4-6 times the computational power of the typical CPU the game was originally designed for, one that was already capable of 60 fps at 1280x800.

There is a last type of bottleneck that I call an engine bottleneck: a moronically coded game that allows an obscene number of objects to be processed by complex algorithms at the same time, even when they are off screen. This happens in RTS games or games with complex logic. Arma and Civ V can suffer such bottlenecks if you create thousands of units. When this happens, the game slows down a lot, and the solution usually lies outside the available hardware options: there is no single-CPU solution capable of 60 fps in a Civ V savegame with so many units that it drags a 6c/12t system down to 30 fps.
 
Even more angry, wrong stuff, because honor is more important than listening and learning.

And you clearly did not understand that the OP is talking about GAMING with CPU X and GPU Y, hence my comment was based on how games behave with a 3.8GHz 6c/12t Xeon, which most of the time is: "OMFG, I was not coded to take advantage of this many cores!" I even went the extra mile and pointed out 2 games infamous for high CPU usage.

I understood, but you clearly still don't get that an example from one game at one resolution/setting does not carry over to the same game at another resolution/setting.
So the conclusions you are drawing are just not correct.


1-CPU bottleneck: when your gaming experience improves with a faster CPU

2-GPU bottleneck: when your gaming experience improves with a faster GPU

Again, if you had listened to what I and others have mentioned, there are different ways of being CPU-bottlenecked, and a CPU that is faster in one way may NOT help even though you are CPU-bottlenecked.
Again, this is because you don't know how multiple cores and multiple threads work.
If your new "faster" CPU has more cores but you are bottlenecked on core speed, it does not improve the situation even though you have a CPU bottleneck.
So your way of measuring "1-CPU bottleneck" is just wrong and inconclusive.
Your simplistic view of how a CPU works is not correct.


if it is 100%, then you MAY have a CPU bottleneck
Wrong again; in this case it's not a MAY but a sure thing. It's a MAY when a thread is above 100% divided by the number of logical cores; once you hit 100% overall, it's pretty much a certainty.


If your fps is below the monitor refresh rate and your CPU usage is below 100%, your gaming experience will improve with a faster GPU.
Absolutely wrong again. This scenario COULD still be a CPU bottleneck.
1: Let's say you are running a quad-core CPU at 50% overall, but the game only has 2 CPU-heavy threads. Then you are CPU-bottlenecked: both of the CPU-heavy threads are eating all the CPU they can get and want more, but can't have it because your core speed is too slow (2 pegged threads out of 4 cores is exactly 50% overall).
2: Whether your fps is below your monitor refresh rate says nothing about a CPU/GPU bottleneck. It's at best some kind of rule of thumb.


When I said that a 1070 cannot be bottlenecked by an overclocked X5670, I assumed that:
Assumptions, you know what they do, right...
Assuming things like this is why you are incorrect when you make blanket statements.



Assuming both conditions, it is not possible to CPU-bottleneck a 1070.
So if I find one game that is, will you drop the attitude and try to learn instead of being a yelling fucktard?


Now pick a 200-240Hz monitor and a very old game at 1280x800, and there is a theoretical possibility that the game hits 100% CPU

I don't know why you NEED a high refresh rate for that. Are you assuming people only play with vsync on?


There is a last type of bottleneck that I call an engine bottleneck:

Irrelevant to the debate, and it's also not the only other kind of bottleneck, so definitely not the last.
It's also not a bottleneck but a code inefficiency. The bottleneck happens because whichever xPU it runs on is not fast enough for its part of the load, so in effect the other parts can't perform at their best.
 
To clarify, in short:

It seems like you still think of a CPU as one bucket of resources, like back in single-core times.
Back then you would be right.

But today we have multi-core CPUs and multi-threaded software, and it gets quite a bit more convoluted than what you are explaining.
 