Worth it to "upgrade" from p8p67 rev 3.1 to Asus p8z77-v deluxe for overclocking?

duronboy · Gawd · Joined: Feb 1, 2003 · Messages: 549
When I bought the P8P67 rev 3.1, power phases weren't on my radar; they weren't listed in the ads or the reviews, and I still don't know how many it has. I also forgot it won't do SLI, only CrossFire. The P8Z77-V Deluxe offers 16 + 4 power phases, the P8P67 Pro was 12 + 2, and the non-Pro couldn't have been better than that. With that in mind, I might actually be able to OC the 2600K I have better than I ever did when I first got it. But the main reason I stopped OC'ing was that the system wouldn't run if the CPU down-clocked at idle. The settings had to be such that the CPU was at full speed continuously.

So, price-wise, I might not have to spend too much; I might even come out a very little bit ahead (not counting the time and risk). Some rando is selling a P8Z77-V Deluxe packaged with another mobo, a CPU, and RAM I won't use. Well, actually, I might be able to use the RAM. It's 32GB, which is more than I have now, but it's PNY. Is PNY from that era worth considering? I've been using G.Skill Ripjaws or whatever and it's been fine, I guess. Haven't really pushed it.

What kind of degradation do mobos see with extreme OCing?

My first reaction is just to say F it and wait until I can afford a new system. Even the bottom end of the current gen is more than double, possibly triple the speed of this if I successfully OC'd it. On the other hand, I kinda wanna OC the shit out of this 2600K. But if it won't down-clock at idle, that'll be a real bummer.
 
When you say extreme overclocking, do you mean a few occasions of competitive overclocking with exotic cooling, or eleven years of running a 2600K at 4.9GHz?

I'd imagine the most relevant degradation is to the semiconductors in the VRM, if they've been used abusively for this long. The high side FETs in the VRM can eventually fail and short the 12V power supply voltage directly to the switch node. This is pretty common on graphics cards, although anecdotally, it sounds like CPUs are less likely to survive it than a GPU is. When it happens on a graphics card, you can remove the dead phase and the card usually at least kind of works again.
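To put numbers on why phase count matters for long-term abuse: each phase carries a share of the total VRM output current, so more phases means less current (and heat) per FET. The wattage and vcore below are made-up but plausible figures for a heavily overclocked 2600K, not measurements:

```python
# Rough per-phase current estimate for a CPU VRM, showing why a 16-phase
# board runs each FET cooler than a 12-phase one. All numbers here are
# illustrative assumptions, not measured values.

def amps_per_phase(cpu_watts, vcore, phases):
    """Total VRM output current split evenly across phases.
    (Real controllers don't share perfectly, but it's close.)"""
    total_amps = cpu_watts / vcore
    return total_amps / phases

# Assumed: an overclocked 2600K pulling ~150 W at ~1.45 V vcore.
for phases in (12, 16):
    a = amps_per_phase(150, 1.45, phases)
    print(f"{phases} phases: ~{a:.1f} A per phase")
```

Since conduction losses in each FET scale with the square of its current, even that modest per-phase reduction noticeably lowers the heat stress on each device.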

If you back off the overclock, does your current board behave normally when returning to idle speed? Or is it a situation where any overclock causes that behavior?
 
I guess I should have stipulated sustained extreme OC. I have no idea if the seller of that mobo kit was OC'ing long term. The processor they were bundling is faster than the 2600K, but it's a non-K chip, so my limited knowledge suggests it probably wasn't too crazy of an OC if they were. However, the other, inferior mobo could have been the one they were using that non-K chip in, except that they have it mounted in the Z77 for the photo. Who knows. Maybe they toasted an unmentioned K chip in the Z77?

Possibly a lighter OC would've worked better, but the difference in speed seemed so small. With the 50+% OC, I could literally feel the difference in everything, as you might expect.
 
My first reaction is just to say F it and wait until I can afford a new system.
^I would lean towards this.
What you are thinking of buying from "some rando" sounds like a lot of unknowns based on your description, and seems likely to be just a waste of money.
I would focus on saving some more money and buying a more modern platform.
 
I guess I should have stipulated sustained extreme OC. I have no idea if the seller of that mobo kit was OC'ing long term. The processor they were bundling is faster than the 2600K, but it's a non-K chip, so my limited knowledge suggests it probably wasn't too crazy of an OC if they were. However, the other, inferior mobo could have been the one they were using that non-K chip in, except that they have it mounted in the Z77 for the photo. Who knows. Maybe they toasted an unmentioned K chip in the Z77?

Possibly a lighter OC would've worked better, but the difference in speed seemed so small. With the 50+% OC, I could literally feel the difference in everything, as you might expect.
What happens if you back off from say, 4.9GHz to like, 4.7?

I have a 9900K that doesn't like running at 5GHz. It will, if I dial the voltage up to 11, but if I back it off to 4.9, it'll happily do that forever at just over stock voltage. That last 100MHz doesn't make that much difference in performance, but makes a big difference in longevity and stability. Maybe you could do the same with your current hardware and get some of the usability features back.
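The "last 100MHz" tradeoff has a simple back-of-envelope explanation: dynamic CPU power scales roughly as frequency times voltage squared, so a small clock bump that demands a big voltage bump costs disproportionate power and heat. The voltages below are hypothetical examples, not measured values for any particular 9900K:

```python
# Why the last 100 MHz is expensive: dynamic power scales roughly as
# f * V^2, so a 2% clock gain that needs a voltage jump can cost far
# more than 2% in power. Voltages below are assumed for illustration.

def relative_power(f_ghz, vcore, f_ref, v_ref):
    """Dynamic power at (f, V) relative to a reference operating point."""
    return (f_ghz * vcore**2) / (f_ref * v_ref**2)

# Assumed: 4.9 GHz stable at 1.25 V vs. 5.0 GHz needing 1.35 V.
ratio = relative_power(5.0, 1.35, 4.9, 1.25)
print(f"~{(ratio - 1) * 100:.0f}% more power for ~2% more clock")
```

That extra power is also dissipated at a higher voltage, which is exactly the condition that accelerates electromigration and long-term degradation.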

^I would lean towards this.
What you are thinking of buying from "some rando" sounds like a lot of unknowns based on your description, and seems likely to be just a waste of money.
I would focus on saving some more money and buying a more modern platform.
Agreed. My opinion is probably warped, but I can't really imagine the payoff from this being all that great, even if it goes exactly how he wants. You're not going to get that much more speed from a 2600K, even on the best motherboard in the world. I'd focus on doing what I had to do in order to upgrade the whole system.

OP, what are you actually doing with this system? Games?
 
I'm not entirely sure, because it's been 11 years, but I think the deal was that if the speed increase was applied to all cores, instability or outright inoperability ensued when clock scaling was enabled. I don't remember if it was a P8P67 thing, a P67 thing, or a 2600K thing. A Z77 fixes two of those.

The other payoff was getting SLI and, as mentioned, going from 16GB to 32GB of RAM. I just bought two GTX 680s for the price of one, so SLI would be pretty sweet. However... that's a lotta watts for performance just barely better than a 1060 (120 watts).
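The wattage gap is easy to quantify. The TDPs below are the published board-power specs, and the "relative performance" figure is just this thread's rough assumption that 680 SLI lands slightly above a single 1060:

```python
# Performance-per-watt sanity check on the SLI idea. TDPs are published
# board-power specs; "rel_perf" is an assumed rough ranking from this
# thread (680 SLI just above a single GTX 1060), not a benchmark result.

configs = {
    "2x GTX 680 (SLI)": {"watts": 2 * 195, "rel_perf": 1.1},
    "1x GTX 1060":      {"watts": 120,     "rel_perf": 1.0},
}

for name, c in configs.items():
    watts_per_perf = c["watts"] / c["rel_perf"]
    print(f"{name}: {c['watts']} W total, ~{watts_per_perf:.0f} W per unit of performance")
```

Under those assumptions the SLI pair burns roughly three times the power per unit of performance, before counting the efficiency hit SLI scaling usually takes.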

And, my dad's old Mac Pro video card died, and the 680 is one of very few cards that work in it. Mac video cards are absurdly overpriced for some reason.

I mean, if someone looked at my Steam history, they would say games are the primary system usage. But it's a good thing hours logged are lies.

In theory, 3D CAD and web are the primary uses with some super light video editing. But yeah, there's some games... mostly old.
 
A current-gen i3 at stock will run rings around that 5GHz 2600K in any sort of CPU-intensive process, and it will lower your power bill. I'd leave things as-is until you're ready for a platform upgrade.
 
Yeah, I was looking, and i3s can have 10 cores now and cost damn near $300. I don't even know what i3 means any more. lol
 
Regardless, the seller of the mobo combo I was considering hasn't gotten back to me after a couple of attempts to reach them. So it's out of my hands, anyway.
 
When you say extreme overclocking, do you mean a few occasions of competitive overclocking with exotic cooling, or eleven years of running a 2600K at 4.9GHz?

I'd imagine the most relevant degradation is to the semiconductors in the VRM, if they've been used abusively for this long. The high side FETs in the VRM can eventually fail and short the 12V power supply voltage directly to the switch node. This is pretty common on graphics cards, although anecdotally, it sounds like CPUs are less likely to survive it than a GPU is. When it happens on a graphics card, you can remove the dead phase and the card usually at least kind of works again.

If you back off the overclock, does your current board behave normally when returning to idle speed? Or is it a situation where any overclock causes that behavior?
I built my wife her first proper gaming rig in the first week of August 2012, before we were married, and its 2500K ran at 5GHz from then until sometime in 2017, when it started to no longer be stable at that frequency; it was running at 4.88-something GHz when it was replaced with a 9900K rig in July 2018. It used a P8Z77-V Pro (not Deluxe) and 16GB of that itty bitty Samsung RAM from that era, and it still works at stock settings, so it sits in an anti-static bag on the shelf with the other old but operable things that occasionally get put to use.
 
I built my wife her first proper gaming rig in the first week of August 2012, before we were married, and its 2500K ran at 5GHz from then until sometime in 2017, when it started to no longer be stable at that frequency; it was running at 4.88-something GHz when it was replaced with a 9900K rig in July 2018. It used a P8Z77-V Pro (not Deluxe) and 16GB of that itty bitty Samsung RAM from that era, and it still works at stock settings, so it sits in an anti-static bag on the shelf with the other old but operable things that occasionally get put to use.
Was it set to an all-core OC? And did it stay at 5GHz continuously, as in no reducing clocks (and watts) at idle?

Anyway, I bought another board to play around with that supported heavy OC and SLI. Not as nice as the Asus. But after I paid, I got one message out of the seller and then nothing, and they never shipped. I did get a reversal on the payment, so I have that going for me, which is nice.
 
Yeah I was looking and i3s can have 10 cores now and cost damn near $300. I don't even know what i3 means any more. lol

I'm not sure what you're looking at, but a modern i3 caps out at 4C/8T just like the 2600K; it's much more power efficient, has a better IGP, and is probably 50% faster on average.

You can get cheap DDR4 memory here in the forums, and a board for under $100. The total upgrade would be around $250, I would imagine. Plus, in another 10 years, you can look for a 13900K to drop in it if you're still on the new system.

I get that it's entertaining to OC old parts, but chasing OCs out of old motherboards is going to cost you more in the long run.

TLDR: I don't think it's worth it to upgrade from the P8P67 board personally.
 
I don't remember which i3 it was, but I realized later it was 2 performance cores and 8 efficiency cores.

Yeah, it's not worth it. Except people keep posting their old CPU/mobo/RAM combos for reasonable prices and I'm like, OK, let's try this. It was another 2600K, mobo, and 32GB of RAM for like $65 shipped. But then no shippie. :'-(
 
The 2600K is a lost battle today. I run a 2600 non-K on a Z77 M5F, and the only reason I got it is that it has an updated BIOS available to boot from NVMe. Since then I've moved to X99 and will gift that Z77 to my GF, who runs a C2D.
 
Core 2 Duo, OMG. I'm running a 1st-gen i7 in a laptop right now, and it's barely meaningfully faster than the fastest Core 2 Duo, and it's done. I'm sure back in the day it was a huge difference, but now it's just solidly in the slow category on a lot of horrifically inefficient websites.
 