Oldest CPU pairing with a GTX 1070/1080?

I still have an i7-920 running at stock speed (2.66 GHz) with only 6 GB of RAM, and I just got a GTX 1080. I played Rise of the Tomb Raider, Doom, GTA 5, and Witcher 3 with everything maxed on a 2560x1440 Acer Predator monitor, and every game runs smoothly. The only game I had some lag in was Just Cause 3, but I've heard that game doesn't run well on systems with less than 8 GB of RAM. I might upgrade when either Kaby Lake or Cannon Lake comes out. I don't see any reason to upgrade now since I'm not seeing any problems (except in Just Cause 3).
 
My 1070 will be installed with an i5-2300. It's not the oldest in the list, but I too was told I'd be bottlenecking the card. For a year or two it will be fine.
 
I still have an i7-920 running at stock speed (2.66 GHz) with only 6 GB of RAM, and I just got a GTX 1080. I played Rise of the Tomb Raider, Doom, GTA 5, and Witcher 3 with everything maxed on a 2560x1440 Acer Predator monitor, and every game runs smoothly. The only game I had some lag in was Just Cause 3, but I've heard that game doesn't run well on systems with less than 8 GB of RAM. I might upgrade when either Kaby Lake or Cannon Lake comes out. I don't see any reason to upgrade now since I'm not seeing any problems (except in Just Cause 3).
You are only fooling yourself, but ignorance is bliss, as they say. IMO, putting a 1080 in a system with a stock 920 and only 6 GB of RAM is pretty damn stupid. We can compare benchmarks if you like, and then you would realize that overall you are not even getting 1070 performance from your 1080. Not to mention some games will stutter and hitch with only 6 GB of system RAM. By the time you upgrade your CPU we will have faster cards, and you will have accomplished basically nothing by getting a 1080 over a 1070 other than wasting money.
 
I put a GTX 1070 into my i5 2500K @ 4.6 GHz system, and it's a big step up from my R9 280X OC.

This video shows a GTX 1060 in an i7 5820K system getting 36 - 49 FPS in Witcher 3 @ 1080p, Ultra settings:



Meanwhile, I get 63 - 80 FPS in Witcher 3 with everything maxed, including Hairworks (except for Hairworks AA, which I have at 4x instead of 8x), at 1920x1200.


And the second half of this video shows an i5 4690K @ 4.0 GHz getting 58 - 74 FPS using the same settings I use (Ultra, Hairworks on Full, Hairworks AA at 4x):




And this video shows an i7 6700K getting generally 55 - 74 FPS, though there is one part of the video where FPS goes over 100 (it probably would for me there, too):




Anyone thinking that a 1070 is too much for an i5 2500K (or a similar CPU) or for 1080p gaming is incorrect. However, I can see resolutions of 1440p and higher being more likely to run into CPU bottlenecks.

One game where the 1070 didn't make any difference from my R9 280X OC is Arma 3... which is a CPU-heavy game. But pretty much every other game has seen massive gains from the 1070. GTA V at Ultra, with virtually everything maxed, including MSAAx4 and TSAAx2... with only reflection AA at 4x instead of 8x, runs at a constant 105 - 140 FPS. If I put reflection AA at 8x, then it's like 85 - 115 FPS.

I have some other results from my 1070 GPU upgrade posted in this thread on LTT: Results of putting a GTX 1070 SC in my i5 2500k @4.6 Ghz


I'm pretty sure that higher resolutions shift more of the load onto the GPU and off the CPU, so lower resolutions are where the CPU is more likely to bottleneck a GPU.
 
Heh, do what you want. At the end of the day, your rig is your rig. Nobody else but you really cares about it anyways.
 
Got my 1070 hooked up to my toaster.

Well some people call it an i5 3570K but they're basically the same thing now after Skylake dropped, right?
 
I still have an i7-920 running at stock speed (2.66 GHz) with only 6 GB of RAM, and I just got a GTX 1080. I played Rise of the Tomb Raider, Doom, GTA 5, and Witcher 3 with everything maxed on a 2560x1440 Acer Predator monitor, and every game runs smoothly. The only game I had some lag in was Just Cause 3, but I've heard that game doesn't run well on systems with less than 8 GB of RAM. I might upgrade when either Kaby Lake or Cannon Lake comes out. I don't see any reason to upgrade now since I'm not seeing any problems (except in Just Cause 3).

CPUs have been behind GPUs for a while now in general. I used to run an i7-920 @ 4.0; it's still lying on my desk these days. The last thing I bought was a used X5680 for about $125, which runs at 4.5 on the same mobo.

I wouldn't be surprised if an i7-920 could still push a 1080 pretty well these days. I'd imagine it would be struggling a bit, though, depending on a few other possible issues.

I don't even game much these days, but it wouldn't cost much to double the RAM on that board, which would probably help a bit.

*shrug*

G.SKILL Ripjaws Series 12GB (3 x 4GB) 240-Pin DDR3 SDRAM DDR3 1333 (PC3 10666) Desktop Memory Model F3-10666CL7T-12GBRH-Newegg.com
 
"Here's my Witcher 3 performance for my i5 2500K @ 4.6 Ghz"

Hey, what do you have for max vcore under load at 4.6? I'm at 1.31 vcore under load for 4.2 on my 2500K; haven't tried to push further yet.

Also, to the OP: "Appreciate the advice, but I didn't come here looking for it"
If you do something that makes no sense compared to what most people recommend, then you can expect to get feedback about it. This is a public forum, so you don't have to get all smartass in your comments when people point out that your upgrade sequence looks out of balance.
 
That could be, and I honestly don't know, but I saw a person's Afterburner hardware performance graphs from their run of Witcher 3 with a GTX 1070, and they were using 1440p DSR to 1080p. Their CPU, which was better than mine (I don't recall which model), was at a constant 100% utilization, and it was reducing the GPU's utilization notably. However, on my i5 2500K @ 4.6 GHz, running Witcher 3 at 1920x1200 with all settings maximized, I don't have a CPU bottleneck, and my GPU utilization stays at 99%, even in Novigrad.

So, maybe DSR is a huge workload for a CPU?
Sorry, but there is no way you stay at 100% GPU utilization with that CPU at 1920, regardless of what you claim. I can turn off HT on my 4770K and my CPU is pegged quite often, and I even get some stuttering. And I've seen plenty of videos that show a CPU like yours cannot fully push the GPU, especially in the area you are talking about.
 
The big question should be whether anyone can actually play the game in reality, rather than throwing graphs and shit all over the place :)
 
I still have an i7-920 running at stock speed (2.66 GHz) with only 6 GB of RAM, and I just got a GTX 1080. I played Rise of the Tomb Raider, Doom, GTA 5, and Witcher 3 with everything maxed on a 2560x1440 Acer Predator monitor, and every game runs smoothly. The only game I had some lag in was Just Cause 3, but I've heard that game doesn't run well on systems with less than 8 GB of RAM. I might upgrade when either Kaby Lake or Cannon Lake comes out. I don't see any reason to upgrade now since I'm not seeing any problems (except in Just Cause 3).

My i7 920 only saw stock clocks when I first booted it in 2009. That CPU should easily overclock to 3.8 or 4.0 GHz. DDR3 is really cheap; there's no reason not to have at least 12 GB in triple channel. I only recently swapped my i7 920 for a Xeon W3690 hexacore I got cheap from eBay. You should definitely look into something like that to breathe new life into that X58. My Fire Strike 1.1 physics score went from 8,062 to 14,290 with the upgrade.
 
I still have an i7-920 running at stock speed (2.66 GHz) with only 6 GB of RAM, and I just got a GTX 1080. I played Rise of the Tomb Raider, Doom, GTA 5, and Witcher 3 with everything maxed on a 2560x1440 Acer Predator monitor, and every game runs smoothly. The only game I had some lag in was Just Cause 3, but I've heard that game doesn't run well on systems with less than 8 GB of RAM. I might upgrade when either Kaby Lake or Cannon Lake comes out. I don't see any reason to upgrade now since I'm not seeing any problems (except in Just Cause 3).

You are my hero.
I also have an i7 920 with a generous overclock and just bought an MSI GTX 1080 Gaming X.
My rig has 12 GB of nice (at the time) RAM, though.
My system is GPU limited with dual GTX 580s in newer titles. Sure, I won't be able to get every last ounce of performance out of the 1080 for the time being, but it will be a helluva upgrade and consume less power than the system does as it currently sits.
A system overhaul is in the future, and my g/f will get my leftovers, as she is looking to build her first rig.

I tend to go BIG, then sit on a build for a couple of years before I do it all over again. Hell, I will probably upgrade monitors before I upgrade the rest of my rig.
 
Just got my GTX 1080 last night and was able to rock and roll in DOOM with all settings maxed out, and I never saw frame rates dip under 60 FPS :)
 
Just see how much CPU utilization you get when playing a game; if it's above 70%, switch to a newer CPU. Make sure the background stuff is off.
 
Bottom line, GPUs make the biggest difference.

I'm not surprised some folks are able to max out some modern games on old CPUs. Keep in mind that modern games like Doom and the BF series aren't very demanding, so they're not the best games to use as a benchmarking standard. Rise of the Tomb Raider, Hitman, and Project CARS come to mind when I think about today's demanding games.

If you're going for 144 FPS, you'll really see the CPU bottleneck, and the difference between, say, a GTX 970 and a 1080 or Titan is really nil since the CPU is holding it back no matter what. If you're just looking to play at a steady 40-60 FPS, all the power to ya! :)
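If you want a feel for why a 144 FPS target exposes the CPU so much more than 60 FPS does, the per-frame time budget is simple arithmetic. A quick illustrative sketch (Python, nothing system-specific):

```python
# Per-frame time budget at common FPS targets: at 144 FPS the CPU has to
# prepare each frame in under ~7 ms, versus ~16.7 ms at 60 FPS.
for target_fps in (60, 75, 100, 144):
    budget_ms = 1000.0 / target_fps
    print(f"{target_fps:>3} FPS target -> {budget_ms:.1f} ms per frame")
```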
 
Take me down to the bottleneck city, where the minimums are low and GPU usage is shitty!

In all seriousness, a friend of mine purchased a GTX 1070 to use in his stock 8350 system. I will say, the bottlenecks are real at 1080p. GPU usage was around 50-60% and the minimums were pretty bad... We cranked it up to 4K with DSR and saw 99% GPU usage and surprisingly good frame rates.
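For anyone who wants to watch GPU usage themselves during a session like that, here's a rough sketch that just polls nvidia-smi once a second (it assumes an NVIDIA card with nvidia-smi on the PATH; MSI Afterburner or GPU-Z logging will show the same thing):

```python
# Minimal GPU utilization logger. Sustained utilization well below ~95-99%
# while in-game usually points to a CPU (or engine) limit rather than the GPU.
import subprocess
import time

while True:
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    print(time.strftime("%H:%M:%S"), result.stdout.strip())
    time.sleep(1)
```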
 
Take me down to the bottleneck city, where the minimums are low and GPU usage is shitty!

In all seriousness, a friend of mine purchased a GTX 1070 to use in his stock 8350 system. I will say, the bottlenecks are real at 1080p. GPU usage was around 50-60% and the minimums were pretty bad... We cranked it up to 4K with DSR and saw 99% GPU usage and surprisingly good frame rates.

Good post :)

I'm sure I will face that too with my aging [email protected] rig running a 1080p@75Hz display... I'm still going to buy a GTX 1070 to prolong the rig's life for another 3-4 years. After that I will build a new system from the ground up...
 
Good post :)

I'm sure I will face that too with my aging [email protected] rig running a 1080p@75Hz display... I'm still going to buy a GTX 1070 to prolong the rig's life for another 3-4 years. After that I will build a new system from the ground up...

The [email protected] will offer much better performance than the FX-8350 at 1080p... you will be fine, though you will face some stuttering in certain games; everything else will be fine, which is not the case for FX-8350 users at 1080p.
 
Just see how much CPU utilization you get when playing a game; if it's above 70%, switch to a newer CPU. Make sure the background stuff is off.
Umm, no, that's not how it works. Using your logic, I would never be CPU limited, which is completely false because I am in plenty of games at lower res. Not every game uses all of the CPU, which is why what you're suggesting is a terrible way of judging CPU limitations.
 
Umm, no, that's not how it works. Using your logic, I would never be CPU limited, which is completely false because I am in plenty of games at lower res. Not every game uses all of the CPU, which is why what you're suggesting is a terrible way of judging CPU limitations.

It's a simple way to do it. Sure, it's not perfect, but if your game doesn't use that much CPU then there's no point upgrading, since THOSE games don't use much CPU anyway; if a certain game does hammer the CPU, then it's worth upgrading. That way, if you play a game and CPU utilization is high, the CPU is the bottleneck for that game; if you only play Tetris, what's the point of upgrading the CPU? Now, if you have a new CPU and still see high utilization, there's not much you can do, but most of the time a new CPU won't show utilization that high.
 
It's a simple way to do it. Sure, it's not perfect, but if your game doesn't use that much CPU then there's no point upgrading, since THOSE games don't use much CPU anyway; if a certain game does hammer the CPU, then it's worth upgrading. That way, if you play a game and CPU utilization is high, the CPU is the bottleneck for that game; if you only play Tetris, what's the point of upgrading the CPU? Now, if you have a new CPU and still see high utilization, there's not much you can do, but most of the time a new CPU won't show utilization that high.
Um, no, and you seem to have no idea how things work. I can be fully CPU limited in a game even if it's only using 25-35% of my CPU, because the game may only effectively use 2 or 3 cores. That is why Intel beats AMD in gaming: IPC matters more than just core count. Using your logic, you would wrongly think AMD 8-core users are never CPU limited, since they clearly don't show high CPU usage in modern games that typically use 4 cores or less. So again, going by CPU usage is a poor way to test whether you are CPU limited. The easiest way is to just lower the resolution; if performance goes up, then your CPU is not the main limitation.
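To put that resolution-drop test in concrete terms, here's a toy sketch; the 10% threshold and the function name are just assumptions for illustration, not a hard rule:

```python
def likely_bottleneck(fps_native: float, fps_low_res: float) -> str:
    """Crude read on the resolution-drop test: if FPS barely improves when you
    lower the resolution, the GPU probably wasn't the limiting factor."""
    gain = (fps_low_res - fps_native) / fps_native
    if gain > 0.10:  # assumed threshold, purely illustrative
        return "GPU-limited at native res (FPS scaled up at lower res)"
    return "CPU (or engine) limited -- lowering the resolution barely helped"

# 62 FPS at 1440p vs 64 FPS at 1080p -> CPU/engine limited
print(likely_bottleneck(62, 64))
# 45 FPS at 1440p vs 70 FPS at 1080p -> GPU limited
print(likely_bottleneck(45, 70))
```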
 
Has anyone noticed more of an increase in PCIe limitations with these new cards? 2.0 vs 3.0? Still minimal?
 
Um, no, and you seem to have no idea how things work. I can be fully CPU limited in a game even if it's only using 25-35% of my CPU, because the game may only effectively use 2 or 3 cores. That is why Intel beats AMD in gaming: IPC matters more than just core count. Using your logic, you would wrongly think AMD 8-core users are never CPU limited, since they clearly don't show high CPU usage in modern games that typically use 4 cores or less. So again, going by CPU usage is a poor way to test whether you are CPU limited. The easiest way is to just lower the resolution; if performance goes up, then your CPU is not the main limitation.
Looking at CPU utilization is not perfect, but it's generally a good indication, and it saves you the trouble of other methods. You don't need to look at the overall usage across ALL cores; look at the usage of each core. I've been doing this since '05, and yes, it works. As an ASIC/FPGA engineer I do know a little about CPU architecture, etc. Either that, or use another method.
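For the per-core angle, something like this shows whether one or two cores are pegged even when the overall number looks low (a rough sketch, assuming the third-party psutil package is installed):

```python
# Per-core CPU utilization logger. A game that pegs one or two cores can be
# CPU-limited even when total usage reads only 25-35%.
import time
import psutil

while True:
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    average = sum(per_core) / len(per_core)
    hottest = max(per_core)
    print(f"{time.strftime('%H:%M:%S')}  avg {average:5.1f}%  "
          f"hottest core {hottest:5.1f}%  all cores: {per_core}")
```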
 
Has the OP said what the only game he plays is? Because if he really wants help he could tell us and we could then give him a definitive answer.
 
I saw a guy using a GTX 970 who upgraded to a 1070. His CPU is a Core 2 Q7700, lol. He's going to upgrade the CPU/mobo later this year.
 
Probably just referring to a QX9770?... The Q7000-series Core 2 Quads were very, very strange OEM-only die-shrunk 45nm versions of the Q6000 CPUs... basically an alternative to the Q8000 CPUs, more expensive and OEM only.
 
That's a CPU I've never heard of or seen.... Neat.

Although 1.25v looks like a crazy high voltage for stock. (it's what, 90nm?)
 
Yes, it's a real CPU, just with slightly higher stock clocks than the Q7600. And there is another Core 2 Extreme with Xx7700 in the name, IIRC.
 