Seeing as most games these days are severely GPU limited (especially at the higher visual fidelity and resolutions most gamers use now), is there any point in overclocking modern CPUs (say i5/i7 etc.) if you're mostly gaming?
I'm asking because I never saw an improvement greater than maybe 1-2 FPS at 2560x1440 with 2x GTX 580s at 4.8GHz vs. the stock 3.4GHz. Will you see any benefit at those kinds of resolutions with maximum in-game settings, high AA and AF, and multiple GPUs?
All depends on the game. Some rely on both the GPU and the CPU. Take Skyrim: there are points where I'm at 99% GPU, and other times I'm at 65% GPU and have low FPS (60-70). Yet other times with full GPU load I break 120 FPS. So yes, OC it.
That is the simple test: if your GPU utilization is below 99%, your CPU is the bottleneck. Every game is different, e.g. BF3 is heavily CPU-bottlenecked in multiplayer but not in singleplayer.
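A minimal sketch of that "99% rule": given GPU-utilization samples logged while playing (e.g. from an overlay or `nvidia-smi`), flag stretches where the GPU is idling, which suggests the CPU is the limiter. The function name and the exact threshold are illustrative assumptions, not from this thread.

```python
# Hedged sketch of the bottleneck test described above.
# Assumption: you have GPU-utilization percentages sampled over a
# gameplay window; the 99% threshold follows the rule of thumb here.

def cpu_bottlenecked(gpu_util_samples, threshold=99):
    """Return True if average GPU utilization falls below the threshold,
    i.e. the GPU is waiting on the CPU rather than running flat out."""
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    return avg < threshold

# Example: a maxed-out GPU vs. a GPU waiting on the CPU
print(cpu_bottlenecked([99, 100, 99, 98]))  # GPU maxed -> False
print(cpu_bottlenecked([65, 70, 68, 72]))   # GPU idling -> True
```

In practice you'd want per-scene samples rather than one long average, since (as the Skyrim example above shows) a game can swing between GPU-bound and CPU-bound areas.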
Keep in mind this is Bethesda you're talking about. They wouldn't know a good graphics engine if it landed on their face and started to gyrate. I love the Elder Scrolls, but even after 10(?) years Morrowind still has limitations. And from what I've heard - please correct me if I'm wrong - things haven't changed that much.
But OCing has benefits for your other stuff, so why not if you can?
With uGridsToLoad at 9 and a bunch of other .ini tweaks and mods, I was getting 23-30 FPS in Skyrim at some places in Markarth (coming out of Understone Keep and looking down on the rest of the town on either the right or left side) with an i7 965 at stock (3.2GHz, 3.3GHz turbo). On my i7 3930K @ 4.0GHz I get 40-45 FPS in those same places with the same GTX 580. So a faster/overclocked CPU can make an appreciable difference in certain games. That said, I cannot wait for the 600 series, as in some places in the wilderness south of Riften I get ~30 FPS even on the new rig.