worth it to upgrade my older CPU?

I'd recommend downloading MSI Afterburner and enabling frametime monitoring as combo (instead of text), as shown in the video below. Check for spikes and whether games feel fluid. Witcher 3 in Novigrad is a good test spot. The YouTube user in the video below has an i7-8700K paired with a GTX 1050 Ti; not high average framerates, but smooth frametimes, and therefore still very fluid in modern games. :)

I'm going to test out frametime monitoring later with some games I currently have installed on my system: Tomb Raider, Rise of the Tomb Raider, etc.
 
I haven't updated my sig...a few months back I got a 1440p 144Hz G-Sync monitor...everything else in my sig is current...I'm running my i7 980x at stock...games I like to play are the Tomb Raider reboot series, the Batman Arkham series, Dark Souls, Witcher, Crysis, Fallout...and upcoming ones like Metro Exodus, Cyberpunk, Rage 2, etc.

someone told me about a site called Bottlenecker and when I input my current specs it tells me: 'Bottleneck detected: Your CPU is too weak for this graphic card...Average bottleneck percentage: 11%'

http://thebottlenecker.com/
I guess I'm the only one who agrees with that site, then? I don't agree with those who say to OC it. You can try, but I don't think it will make much difference. I know plenty of people here know more than I do, but when I looked up this CPU...it was good at a lower resolution, 1680x1050, and you're trying 1440p with it now?

https://www.anandtech.com/show/2960/12

You can try to OC it, but I don't think you'll get the results you want...at least, I would be surprised. I would upgrade the CPU to something in that generation, used, or plan/invest in a new system: new CPU, memory, etc. As for the GPU, it's fine. I'd guess that one is fine with the 1070 at 1440p even with modern i5 / R5 processors.
 
but if you look at the 1080p Far Cry 5 (Ultra) benchmark with a 1080 Ti, you see that the 980X averages 80fps (minimum 68fps)…so bumping the resolution up to 1440p should still average close to 60fps

https://www.techspot.com/article/1666-old-1000-cpu-vs-budget-ryzen/
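For a rough sanity check on that estimate: scaling FPS by raw pixel count is pessimistic, since GPU-bound games rarely scale perfectly linearly with resolution, so the naive number below is more of a lower bound than a prediction.

```python
# Back-of-the-envelope check on the 1080p -> 1440p step.
px_1080p = 1920 * 1080
px_1440p = 2560 * 1440
ratio = px_1440p / px_1080p         # ~1.78x the pixels to render

fps_1080p = 80                      # the 980X's Far Cry 5 average above
naive_1440p = fps_1080p / ratio     # ~45 fps if scaling were perfectly linear
```

Real results usually land somewhere between that worst case and the 1080p number, which is how an 80fps average can still end up near 60fps at 1440p.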
 
how do I get the 1% and 0.1% frametime info to show up in the display like in that video?...I have them all checked in the settings...the frametime chart works fine but I don't see the last line underneath the chart (like in the linked video) showing my min, avg, max, 1% low and 0.1% low
 
In RivaTuner statistics, enable "show own statistics", open setup, and check "enable benchmark mode".

Nice tool right? :)
 

thanks...got it...I didn't know you needed to run the benchmark for it to show up...I thought it would display on screen constantly like the rest of the stats like GPU Temp, Fan Speed etc

I've been using Afterburner for a long time but never realized they added the frametime and 1% features...good stuff...I tested it out with Tomb Raider and my frametime definitely has spikes
 
so I messed around with Tomb Raider (2013)...traveled to different areas etc...frame-times average from 18.5 ms to 22 ms (apparently the ideal frame-time at 60fps is 16.7 ms)...definitely some spikes but it's fairly straight most of the time...meanwhile my CPU load never goes above 30% while my GPU load is always 100%...I'm playing at 1440p maxed out except with SSAA 2X...seems to me like the GPU is more of a bottleneck here

I even tested things by lowering the in-game resolution to 1080p and my frame rates jumped by ~40 fps...or is it that since I'm a graphics whore who loves the highest image quality that it will always be the GPU that is the bottleneck when using high settings?
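As a side note on what those overlay numbers mean, here is a minimal sketch of how stats like "1% low" are commonly derived from per-frame times. It assumes the usual approach of averaging the slowest 1% / 0.1% of frames; Afterburner's exact implementation may differ.

```python
def frametime_stats(frametimes_ms):
    """Min/avg/max FPS plus 1% and 0.1% lows from frame times in ms."""
    fps = sorted(1000.0 / t for t in frametimes_ms)  # per-frame FPS, ascending
    n = len(fps)
    avg_fps = n / (sum(frametimes_ms) / 1000.0)      # frames per total second
    worst_1 = fps[: max(1, n // 100)]                # slowest 1% of frames
    worst_01 = fps[: max(1, n // 1000)]              # slowest 0.1% of frames
    return {
        "min": fps[0],
        "avg": avg_fps,
        "max": fps[-1],
        "1% low": sum(worst_1) / len(worst_1),
        "0.1% low": sum(worst_01) / len(worst_01),
    }

# Mostly ~60 fps (16.7 ms frames) with two 40 ms hitches:
samples = [16.7] * 98 + [40.0, 40.0]
stats = frametime_stats(samples)
```

The point of the 1% / 0.1% lows is visible in the example: the average barely moves, but the lows drop to ~25 fps, which is exactly the stutter you feel.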
 
Spikes can be caused by several things, and some are to be expected.
If the spikes are far apart and short (and everything else is fine), it is often the game loading new assets.
If they are close together, you need to see what is going on on screen. Memory issues can be one cause. If you remember the GTX 970 3.5GB issue, where games used the slower 0.5GB portion of the GPU memory, people got stutter. The same goes if the game wants more GPU memory than you have; then it needs to use slower system memory. Too little system memory, or system memory that is too slow (some games really suffer with single-channel memory instead of dual-channel, for example), can also cause spikes.
Then there is the game engine itself, or software in the background. Even Afterburner can cause stutter in some cases.
To narrow it down to the CPU, pick parts of the game that are more CPU intensive.
Crowds, for instance, are a good place: lots of AI, collision detection, sound, position calculations and so on.
Other good spots are explosions, or anywhere lots of things are going on at the same time that the CPU needs to account for.
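To make that checking less eyeball-driven, a frametime log (for example one exported from RivaTuner's benchmark mode) can be scanned for spikes programmatically. A minimal sketch; the 2x-median threshold is my own assumption, not anything Afterburner uses:

```python
from statistics import median

def find_spikes(frametimes_ms, factor=2.0):
    """Return (frame index, frame time) for frames over factor * median."""
    limit = factor * median(frametimes_ms)
    return [(i, t) for i, t in enumerate(frametimes_ms) if t > limit]

# Isolated spikes far apart often mean asset streaming; clusters of
# them point at memory pressure or a CPU-bound scene.
log = [16.7, 16.9, 16.5, 50.0, 16.8, 16.6, 17.0]
spikes = find_spikes(log)
```

Looking at how far apart the flagged indices are tells you which of the causes above to suspect first.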

Example (from the games you play):
Witcher 3: Novigrad, daytime with lots of population
Crysis 3: Welcome to the jungle with grass physics
Fallout 4: Diamond City

I don't know about Tomb Raider spots, though I have all the games and love them. Tomb Raider :)

Should be pretty smooth though on 1440P:
 
At higher resolutions, the game is more often than not GPU limited, meaning the framerate is bottlenecked by the GPU. But you can be GPU limited and still have CPU bottlenecks. Think of it this way: the GPU draws the game world, while the CPU brings it to life. When the game is GPU limited, the CPU has more idle time, so it is ready with the next frame for the GPU. This is why you need to find areas where the GPU has to wait for the CPU. Variation in frametime is normal, since it is directly connected to the framerate (frame time in ms = 1000 / FPS). It's the large spikes, and spikes that come very close together, that you need to watch out for. If they are not there, the impact of the bottleneck is more about framerate than frametimes, and if you have enough framerate, there is no need to do anything about the bottleneck (CPU or GPU wise). The game is then smooth enough! :)
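The framerate/frametime relationship is just a reciprocal, which a trivial sketch makes concrete:

```python
# Frame time (ms) and framerate (FPS) are reciprocals scaled by 1000:
#   frame_time_ms = 1000 / fps        fps = 1000 / frame_time_ms

def frame_time_ms(fps):
    return 1000.0 / fps

def fps_from(ms):
    return 1000.0 / ms

t60 = frame_time_ms(60)   # ~16.7 ms, the 60 fps target
f22 = fps_from(22.0)      # ~45.5 fps, the slow end of the range measured above
```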
 
I'm just happy to hear the OP is still running with no complaints on a 980x. I just upgraded some older systems to i5-680s (just general-use systems), and you can barely tell the difference between them and an i5-2500, which is very close to an i7-2600 in our normal non-gaming workload. And knowing the 2600 is still potent as a gaming machine means it has a lot of life left as a general workhorse, and that trickles down even to the 680s. :cool:
 
That 980x should be fine for a bit longer. I'm still on the X58 platform with a Xeon at 4GHz. With all cores at that speed it should feed any GPU without hurting your frames too much. It definitely isn't going to beat new CPUs, but as those videos show, they aren't useless yet.
 
can it last until Intel releases their Cannon Lake series (or is it Ice Lake?)? :D
 
The question I've always found ties into the answer here is: are developers making software that needs Cannon/Ice Lake yet? The only reason I've ever seen to upgrade hardware is when the software you're using makes you.
 
Heh, I can't say. Maybe the new RTX GPUs will have issues with PCIe, or newer games with ray tracing will hit the CPU harder? But then, why would they, with the GPUs taking on that load.

If you overclock that CPU to around 4GHz you should see some pretty solid improvement. It will never match the modern stuff in single-threaded tasks, but in gaming it should be able to hold its own; most reviews show just a few frames' difference on X58 platforms, and very CPU-dependent games will show bigger differences, but nothing that seems to render games unplayable.

I have my X5670 at 4GHz and am waiting on a GTX 1080 to come in, but I had a 1070 for a few weeks and was very impressed with the performance. In games like PUBG and Far Cry I had 110+ FPS at 1080p on customized settings. My GPU usage was pretty much always 99-100%, and CPU usage was lower.


Getting your CPU to 4 or 4.2GHz on all 6 cores will really help out the GPU and aid in getting more life out of the system. It should be fairly simple, especially since you have an unlocked multiplier; there are tons of X58 overclocking tutorials out there to use as a guide.

I have been looking at Coffee Lake or a 9th-gen platform as an upgrade for a while myself, but I think my system will run another year or so with the 1080, or if I step up to an RTX GPU. I'm a much more casual gamer, though, without a lot of time to game, so the system works for me. I'd rather hold out and wait for memory to come down in price. I wish I could push my X5670 higher, but it requires too much tweaking with the limited multiplier it has.
 
Have you looked into the X5690s? If they can overclock as well as the X5670, you should easily be able to break 4GHz. (y)
 
The Intel Xeon W3690 runs at 3.46GHz, turbos to 3.73, and fits the socket. It's as far as it gets on that socket for one CPU pre-overclocking. I hit 4.4GHz with them.
 
Yeah, I just bought an X5670 back in March to replace an i7 950. The 950 clocked better and more easily, but the extra threads cut the time for transcoding my DVDs and Blu-rays significantly.

I was being cheap and hopeful that I would get a decent one. Wrong. I'd be interested to see what that 980x can achieve.
 
Oh wow, I thought you were running dual X5670s. Yeah, for a single CPU I'd definitely look for a 980x and see what you can get out of it.
 
Go for 10 years!

Only having to buy like 8 CPUs in your life would be a pretty stellar track record. ;)
pretty sure I've had that many in the past year....

edit:
celeron G3240
celeron G3900
Athlon x4 630
FX8320
A9700
2x Ryzen 1600
2x Ryzen 1700X

That doesn't include stuff I currently have.... or my laptops that I've had. And that is 2018 only, not the past 12 months.

Does [H] have a hardware rehab program?
 
I would also suggest waiting for the 9900k.
You seem to like your hardware to last a long time so a little while longer won't hurt.
Since we seem to be hitting an IPC wall, you might want to look at Threadripper or Intel HEDT if you want to be set for a long while....
 
Threadripper is great for programs that use a lot of cores, but for gaming it's not as good as Intel...maybe Zen 2 will get gaming performance up to par...does AMD even care about gaming performance, or are they totally focused on multi-core usage and have conceded?...would be nice to see AMD topple Intel with Zen 2 for the first time in forever (since Thunderbird?)...also, it's not about wanting to keep hardware a long time...it's just the way things turned out...advancements in CPU tech have pretty much dried up since the clock speed race died out...I change my GPU frequently because that's the only technology still advancing

upgrading for the sake of upgrading serves no point...I'd rather wait for something worth upgrading to
 
Hence the "or Intel HEDT"; man, this isn't an AMD vs Intel thread.

Personally, I have a 144Hz monitor too and a 1700, and I can't tell the difference between 135 fps and 144 fps; I'm not that OCD.
I'm usually pegged at 144 anyway because of the games I play.

HEDT is good going forward in general. We are moving in the direction of using more programs and background activities at once anyway. I swear this happens every time a higher-core-count CPU comes out:
when quads first came out, dual cores were better for gaming.
The same is going to happen with 4 vs 6, 4 vs 8, 8 vs 16, etc.

But then I also use my PC for more than just gaming........
 