Is a 2500K @ 4.2GHz enough for GTX 680 SLI?

heatthegreat

Limp Gawd
Joined
Oct 12, 2007
Messages
424
Also, I have a Corsair AX750. Will both of these suffice if I add another 680 to my rig?
 
Yes, it's fine. The 2500K may be old, but it's still a fast proc, especially overclocked like I see yours is.
 
Yup, you should be good with a 2500K even though it's a little old. I'm running mine at stock with my GTX 760 no problem. Later on I plan on running the Maxwell equivalent of a GTX 770 whenever they come out; maybe then I'll bump up my 2500K a bit more.
 
OK, thanks for the replies. I was browsing Google and some people said 4.2GHz wasn't enough... but as you can tell by my Limp Gawd status, I trust this site over any. Thanks again!
 
If you're asking whether your 750W PSU will be enough to add a 3rd 680, I am going to say probably not, and if it is, it's way too close for my comfort.
 
I've been running GTX 680 SLI with an [email protected] for a few years now. It ran everything I threw at it on a single 30-inch 2560x1600 monitor. I recently added two more monitors for surround, and at a custom resolution of 8080x1600 the cards have their hands full. If I keep the AA low they do a good job. I'm using an 850W PSU. For three 680s I would upgrade to at least a 900 or 1000W power supply.
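For anyone wanting to sanity-check the PSU sizing advice above, here's a rough power-budget sketch. The GPU TDP is NVIDIA's published 195W figure for the GTX 680; the overclocked-CPU and platform numbers are loose assumptions, not measurements:

```python
# Rough power-budget sketch for a 2x GTX 680 + overclocked 2500K rig.
# GTX 680 TDP = 195 W (NVIDIA spec). The CPU and platform numbers
# below are padded guesses, not measured draw.
gpu_tdp = 195    # W, per card (NVIDIA spec)
num_gpus = 2
cpu_oc = 150     # W, i5-2500K (95 W stock) padded for a 4.2 GHz OC (assumption)
platform = 75    # W, board/RAM/drives/fans (rough assumption)

total = gpu_tdp * num_gpus + cpu_oc + platform
psu = 750
print(f"Estimated draw: {total} W on a {psu} W PSU ({total / psu:.0%} load)")
```

Two cards land around 80% of a 750W unit, which is workable; a third card's ~195W pushes the estimate past the PSU's rating, which matches the "upgrade to 900-1000W for three cards" advice.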
 
Ya, my 2500K @ 4.8GHz is still crushing everything I throw at it. I also run SLI, but I have 570s.
I might upgrade the video cards if the next series is impressive, but I plan to stay with my 2500K for another 18 months; there's just no point in replacing it.
 
This really comes down to what games you play. Some are CPU limited, some are GPU limited.

Many games are so GPU limited, you could probably still be using a Q6600 and not see a CPU bottleneck.

On the flip side, some games are still almost all CPU. I play a lot of World of Warcraft, and that game eats my 2500K @ 5GHz alive, while my 680s in SLI will often be loafing along at 20-40% utilization each.
 
You won't get full usage out of 680 SLI in all games, and your 2500K will be tapped out at times. About the only game where lack of HT will really hold back 680 SLI is Crysis 3. There are parts of that game that will eat your 2500K for lunch. My 2500K at 4.4 was at 100% plenty of times and even caused the slightest bit of hitching in some spots. My 4770K fixed all that and took me from the low 40s to right at 60 for minimums in the same spots. Hitman: Absolution will still dip below 60 fps in a few rare spots even with the 4770K at 4.3, but HT really does basically nothing for it. To stay above 60 fps in that game, simply lower the screen reflections setting to medium.

At 2560 you will be turning down a graphics setting or two before the CPU is an issue in 99% of cases. If you play without vsync, you will likely never feel held back by the CPU or the GPUs.
 
Your CPU will be fine with 2x 680s. I used to have a 2600K at 4.2GHz and 2x 680s, and I saw a huge increase in frames in most games. Your power supply is fine too. I had a Corsair TX750, and yours is a bit better. :)
 
Yeah like everybody else has said, it'll be fine especially at 1920x1080. Maybe if you were gaming at 5760x1080 then you'd choke up a bit but at 1080, I think you'll be fine. Even if you start to run out of CPU horsepower, you've got so much GPU muscle that you'll still be able to shred 1080.
 
Yeah like everybody else has said, it'll be fine especially at 1920x1080. Maybe if you were gaming at 5760x1080 then you'd choke up a bit but at 1080, I think you'll be fine. Even if you start to run out of CPU horsepower, you've got so much GPU muscle that you'll still be able to shred 1080.
That makes no sense as you have things completely backwards.
 
The CPU load increases only (potentially) if the field of view increases. The same exact draw commands are issued by the CPU at 1280x720 and at 1920x1080, 2560x1440 and any other 16:9 resolution.
 
Higher resolutions put more load on the CPU. How is that backwards?
Lol no. Higher resolutions put more load on the GPU. How have you been a member here this long and don't even understand something as basic as this? By your logic we should be getting faster CPUs for 4K instead of adding more GPU power?
 
How have you been a member here this long and don't even understand something as basic as this?
at least he's not banned :rolleyes: right?.. :rolleyes:

Lol no. Higher resolutions put more load on the GPU. By your logic we should be getting faster CPUs for 4K instead of adding more GPU power?

In fact, yes. A higher resolution will put more load on the GPU, and at higher resolutions you need more GPU power, but the CPU gets loaded more too, especially in the case of 5760x1080, because the field of view is way greater than at 1920x1080. So yes, at 4K you need a powerful CPU too. The same applies to framerate: 1920x1080@60Hz is one thing, 120Hz+ is something pretty different. He will need way more CPU horsepower at the same resolution just to keep feeding both GPUs, and in those cases that 2500K can cause a hit in performance even at 4.2GHz.
 
at least he's not banned :rolleyes: right?.. :rolleyes:



In fact, yes. A higher resolution will put more load on the GPU, and at higher resolutions you need more GPU power, but the CPU gets loaded more too, especially in the case of 5760x1080, because the field of view is way greater than at 1920x1080. So yes, at 4K you need a powerful CPU too. The same applies to framerate: 1920x1080@60Hz is one thing, 120Hz+ is something pretty different. He will need way more CPU horsepower at the same resolution just to keep feeding both GPUs, and in those cases that 2500K can cause a hit in performance even at 4.2GHz.
Pay attention to the actual context. He was saying that the OP would be more limited by his CPU at higher resolutions in general than at 1080, which is nonsense. The OP's CPU would be more of a limitation for 680 SLI at 1920x1080 than it would at a higher res, and anyone with a lick of sense knows that.
 
Pay attention to the actual context. He was saying that the OP would be more limited by his CPU at higher resolutions in general than at 1080, which is nonsense.

No. I said his CPU would be more likely to run out of steam at higher resolutions, but at 1920x1080 with a pair of GTX 680s, it won't be an issue.

The OP's CPU would be more of a limitation for 680 SLI at 1920x1080 than it would at a higher res, and anyone with a lick of sense knows that.

 
That misterbobby guy is very condescending, don't listen to him. OP You will be fine. Load up the most demanding game you have and watch how your CPU handles it. I doubt it comes anywhere near max load during gameplay. I know the only issues I've had with my 2500K have been lack of GPU power, so I tone back settings a bit and I'm fine.
 
No. I said his CPU would be more likely to run out of steam at higher resolutions, but at 1920x1080 with a pair of GTX 680s, it won't be an issue.



The CPU is not going to run out of steam at higher resolutions. It would be his GPUs that were getting more load.

Of course I am serious. Do you really not grasp that a CPU would be more of a limitation pushing 680 SLI at 1080 than it would be at a higher resolution?

EDIT: You do know that when they test CPUs for gaming bottlenecks, they LOWER the resolution. ;)
 
That misterbobby guy is very condescending, don't listen to him. OP You will be fine. Load up the most demanding game you have and watch how your CPU handles it. I doubt it comes anywhere near max load during gameplay. I know the only issues I've had with my 2500K have been lack of GPU power, so I tone back settings a bit and I'm fine.
I already said his CPU was fine too. It will be nearly pegged at times, though, and that's a fact. The only shooter that will be held back is Crysis 3 in some very CPU-intense spots.

MacLeod is the only one here that seems to be confused, thinking it's at higher resolutions where the OP's CPU would start to hold him back.
 
EDIT: You do know that when they test CPUs for gaming bottlenecks, they LOWER the resolution. ;)

I had a feeling that's where you were coming up with this. They test at lower resolutions because it takes the GPU out of the equation. At 800x600 a modern GPU won't be a limiting factor, so it's easier to compare one CPU to another. Running Crysis 3 at max settings at 1920x1080 with a GTX 760 will give you roughly the same results with an FX-8350 and an i7-4770K because the GPU will be maxed. Running a game from 2001 at 800x600 won't be any problem for a modern GPU, so you'll get a more accurate comparison of one CPU to another.
 
I have had my setup for 2 years now, and straight up, there will be some games that struggle at 1080. Crysis 3 is an excellent example. The only way I can get full GPU usage is if I max out the AA. With FXAA or no AA my cards are barely breaking a sweat, and there are times where I can't keep it above 40 FPS. In some areas I can get over 100 fps.

Running a single card vs. a dual-card setup puts different demands on the CPU. So things like HT, overclocking, and a newer, more efficient CPU architecture make the GPU-to-CPU relationship much more efficient.

I know what I just said is a pretty vague description of how it works but from my experience and research, this is how I look at it.
 
Sandy Bridge is no joke. My 2600K is still running strong. I really don't see myself needing to upgrade for quite some time. The difference between Haswell and SB isn't even noticeable in gaming.
 