Do overclocked chips really die sooner?

I initially viewed overclocking as something stupid that just kills your CPU in a year or so, but recently I started overclocking and told myself that people are wrong about overclocking, that CPUs don't really burn out, and that it's just people trying to apply common sense to processors. But I seriously want to know: does OCing reduce the life of processors, and by how much?
 
Depends on what kind of chip and how much you OC. If you take an X2 4800+ and try to OC it higher, it will probably shorten its life more than OCing an X2 3800+ would :p
 
Big Fat Duck said:
I initially viewed overclocking as something stupid that just kills your CPU in a year or so, but recently I started overclocking and told myself that people are wrong about overclocking, that CPUs don't really burn out, and that it's just people trying to apply common sense to processors. But I seriously want to know: does OCing reduce the life of processors, and by how much?

It might not be *JUST* the processor whose life is reduced. Take it from my first-hand experience ^_^.

I was overclocking my old Ti 4200 graphics card, and I guess I went a tad too high. One of the transistors fried... from then on it crashed on anything 3D. So it's not just the CPU that's at risk.

I have no experience with new CPUs; I've only ever overclocked things like P2s and P3s, and after that I just stopped. The computer I have is fast enough as it is.
 
It doesn't kill it in a year... and raising the FSB isn't what kills it; the extra heat from pumping up the voltage is what can kill it. But if you have a good cooling solution and relatively low temps, you shouldn't have a problem. Besides, by the time the CPU would die, you'll probably have a new one. So don't worry about it... just don't cook the CPU.
 
Technically yes, but not even remotely as badly as the anti-OC folk would want you to believe. There's no set amount of time that overclocking takes from the life of the CPU, but it's not enough that you'll likely ever see it fail. I've still got 8-year-old processors that were always overclocked, still ticking. Heat (electromigration comes along with it) is your enemy, not overclocking itself.
 
xdkimx said:
Depends on what kind of chip and how much you OC. If you take an X2 4800+ and try to OC it higher, it will probably shorten its life more than OCing an X2 3800+ would :p

*itches head*
 
If you're an enthusiast who buys new stuff every once in a while, you probably won't get to see your OCed chips die (unless you keep them running, like FOLDING ;) )
 
Let me give you an example that shows how this is complicated.

My Opteron 165 is watercooled, and so are my video cards. Since heat is the main factor in the lifespan of an IC, none of those parts are likely to die. However, when overclocked, the caps and voltage regulators on the video cards, and especially on the motherboard, get FAR hotter than they usually would.

Odds are, my CPU will last twenty years and the motherboard... five. But as has been said above, the anti-oc crowd tends to overstate the damage done.

The golden rule still applies to overclocking and modding. If you can't afford to replace it, don't mess with it.
 
Big Fat Duck said:
I initially viewed overclocking as something stupid that just kills your CPU in a year or so, but recently I started overclocking and told myself that people are wrong about overclocking, that CPUs don't really burn out, and that it's just people trying to apply common sense to processors. But I seriously want to know: does OCing reduce the life of processors, and by how much?

Voltage (actually the electric field through the insulator around the gate, which is voltage divided by the thickness of the insulator) and temperature are by and large what determine the long-term effects that destroy silicon transistors.

At the same voltage and temperature, an overclocked chip has (statistically) the same lifespan as a chip at stock speeds (assuming the same chip and the same manufacturing process).

Now, the reality is that around here a 3-4 year old CPU is considered obsolete for the most part, and you can be pretty abusive to a chip and still expect it to last that long. (As a general rule of thumb, max die temp and stock voltage are picked for a lifespan of about 10 years or more in commercial CPUs.)

As towert7 and Advil point out, the CPU may not be the worst of your worries. The motherboard voltage regulator (and really everything on the motherboard that gets power from the PSU to the CPU) is designed to meet a certain max current draw from the CPU. Overclocking, particularly overvolting, can push that current draw beyond those design limits and dramatically reduce the life of the components on the motherboard.

At 40-60°C, most capacitors last for millions of hours of operation.
At their max rated temps (usually 85-105°C), they're only rated for maybe 1,000 hours of operation.
And a MOSFET that can drive 20 A at 80°C might only be able to deliver half that at 120°C.
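
To put rough numbers on that temperature curve: electrolytic capacitor life is commonly estimated with a doubling-per-10°C rule. Here's a minimal sketch in Python, with hypothetical datasheet values rather than figures for any real part:

Code:
def cap_life_hours(rated_hours, rated_temp_c, actual_temp_c):
    # Rule of thumb: electrolytic capacitor life roughly doubles
    # for every 10 C below its rated temperature.
    return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)

# Hypothetical 1000-hour, 105 C rated capacitor:
print(cap_life_hours(1000, 105, 105))  # 1000.0 h at its max rating
print(cap_life_hours(1000, 105, 50))   # ~45,000 h (~5 years of 24/7) at 50 C

This simple rule gives tens of thousands of hours rather than "millions", but it shows the shape of the curve: keeping parts cool buys lifetime exponentially.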
 
OCing *may* reduce the lifespan, but there are many factors. First of all, buying a low-end chip and OCing it to top-of-the-line speeds may not reduce its life at all, if you can do it without bumping the voltage.

Let's just say, for instance, that overclocking were guaranteed to cut the life of a CPU by 90%. I have several CPUs that are 10+ years old and work just fine, and most people who would overclock in the first place aren't going to stick with one CPU for more than a year or two. So if your CPU lives for two years instead of the twenty it might manage at stock voltage, it's still a no-loss proposition.

There's a caveat, though. If an OC newb or anybody else gets reckless, you can fry a CPU right away, and that's really unlikely at stock speeds and voltages. I.e., if I crank the voltage all the way up and leave it that way on stock cooling, it may be possible to cause damage pretty quickly.

Short answer: OCing without recklessness may or may not shorten the CPU's lifespan. Even if it does, the effect is negligible, as the CPU would likely be replaced before it dies.
 
Winchester1897 said:
I thought it was the heat that more voltage creates, not the voltage itself, that kills your CPU.

While higher voltage does create a lot more heat (dynamic power dissipation scales quadratically with voltage and linearly with everything else), higher voltage by itself also increases the wear and tear on the transistors.

Of course, heat creates more heat as well (power dissipation, particularly static power dissipation, increases significantly as the temperature of a chip rises).

The 'natural' death of silicon transistors is a result of heat and voltage.
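
To make the quadratic-vs-linear point concrete, here's a minimal sketch of the classic CMOS dynamic power relation P ≈ C·V²·f; the effective capacitance and the clock/voltage numbers are made up for illustration:

Code:
def dynamic_power(c_eff, vcore, freq_hz):
    # Classic CMOS dynamic power: P = C_eff * V^2 * f
    return c_eff * vcore ** 2 * freq_hz

# Hypothetical chip: stock 1.30 V @ 2.0 GHz vs. overvolted 1.45 V @ 2.4 GHz
stock = dynamic_power(1e-9, 1.30, 2.0e9)
oc = dynamic_power(1e-9, 1.45, 2.4e9)
print(round(oc / stock, 2))  # ~1.49x the heat output

Note that the ~12% voltage bump (a ~24% power increase, since it's squared) contributes more of that extra heat than the 20% clock bump does.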
 
There are 4 ways that chips break:
electromigration: metal atoms in a wire move around because lots and lots of electrons hit them. Move enough metal atoms and the wire will either short to another wire, or develop a hole and stop working.
TDDB: the insulation material that separates two key parts of a transistor breaks down, creating an electrical short, which breaks the transistor. In most cases, if any one of the hundreds of millions of transistors breaks, it breaks the chip.
hot-e: a key parameter of the transistor called Vt (threshold voltage) shifts over time, which essentially slows the transistor down. If it slows down enough, the chip will calculate an incorrect value.
BTI: similar to hot-e, but for a different type of transistor and for a different reason. Usually fixed in the factory.

In all but hot-e, increasing the temperature a little makes the chip a little more likely to break (this is all statistics... there is no "do this and it will kill your chip"; it's all a matter of probability). In hot-e, lowering the temperature makes it worse.

In all of these, increasing the voltage a small amount makes them much more likely to break the chip.

As to why 10% more voltage is much worse than 10% more temperature, well let's take the example of electromigration and look at it in detail.

Wires are made up of atoms all lined up. Electrons flow through these atoms. An electron is a very small thing, and atoms are a lot bigger. So the idea of an electron moving an atom around is a lot like someone trying to move a car (atom) by shooting a BB (electron) at it. Clearly to ever hope to move a car by shooting BB's at it, you would need a lot of BB's... a storm of BB's. But if you get enough, the car will move. Millions of BB's and that car will likely start getting pushed around. The temperature of the chip could be thought of as how slippery the road is. A little bit more slipperiness isn't going to help a BB move a car. It helps a little but not a lot. On the other hand, the voltage determines how many BB's you have, and it's not like 10% more voltage is 10% more BB's (electrons), you get a lot more than 10%. And worse than that, because increasing the voltage in a chip also increases the temperature (all things being equal, like same heatsink, same air temp, etc.), increasing the voltage is a double-whammy.

The others are similar... but more complex to explain (I'm still not sure that the experts really completely understand the low-level details of both TDDB and BTI...).
Quoted without permission, from a guy who, IMO, knows his shit far better than anyone else in this thread. :p For more, go here:
http://forums.anandtech.com/messageview.aspx?catid=28&threadid=1773169&enterthread=y
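
To attach a number to the electromigration example: the textbook model is Black's equation, MTTF = A · J^(-n) · exp(Ea / kT). This formula isn't from the quoted post, and the constants below (n = 2, Ea = 0.7 eV) are typical illustrative values, not measurements:

Code:
import math

K_EV = 8.617e-5  # Boltzmann constant in eV/K

def em_mttf(j, temp_k, a=1.0, n=2.0, ea=0.7):
    # Black's equation for electromigration mean time to failure.
    # a, n, and ea are process-dependent; these defaults are illustrative.
    return a * j ** -n * math.exp(ea / (K_EV * temp_k))

base = em_mttf(1.0, 330)                   # normalized current, ~57 C die
print(round(em_mttf(1.0, 340) / base, 2))  # +10 C  -> ~0.48x the life
print(round(em_mttf(1.1, 330) / base, 2))  # +10% J -> ~0.83x the life

With these numbers the exponential term reproduces the "10°C halves the life" rule mentioned later in this thread, and the current term compounds with the temperature rise that extra voltage causes: the double-whammy described above.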
 
Decent stolen quote, but it's only indirectly related to the OP. He didn't ask what the risks are or why overclocking can damage chips. He asked whether chips die sooner, which your quote doesn't address.
 
superkdogg said:
Decent stolen quote, but it's only indirectly related to the OP. He didn't ask what the risks are or why overclocking can damage chips. He asked whether chips die sooner, which your quote doesn't address.

Yes it does. If you read the thread at AT, you'd see he previously said that temperature is linearly related to the expected lifetime of a transistor, while voltage has a squared relationship. :p
 
I have a Sempron 2600+ @ 1.83 GHz.
I overclock it to 2.0 GHz. I think that is modest overclocking.
And my power supply is 350 W.
I have no idea if I'm pushing it a little too far or not even pushing it at all.
I only OC when I'm going to play. I use nTune.
 
toymachineman19 said:
It doesn't kill it in a year... and raising the FSB isn't what kills it; the extra heat from pumping up the voltage is what can kill it. But if you have a good cooling solution and relatively low temps, you shouldn't have a problem. Besides, by the time the CPU would die, you'll probably have a new one. So don't worry about it... just don't cook the CPU.

While I might have a new CPU before this one dies, the old one will still end up being used. It either gets handed down to the wife, turned into something else, or given to friends or family who need a computer for basic stuff. Essentially, I look at around 4 years minimum for anything I'm buying, so I take a far more cautious approach to OCing.
 
Yes, they will die sooner, but it's highly doubtful you'll ever see the day they die.
 
OK, time for a movie quote:

"The light that burns twice as bright burns half as long."
--Blade Runner

Kind of true in this case, but in proportion to the time we'll actually use these chips, I doubt (as stated) we'll see them die (depending on the Vcore increase over default).
 
MY GOD, please NEVER EVER say "will anyone want an A64 in 8 years?" YES, someone will. I still regularly use our 486 for Word documents, and I use my K6-2 for internet usage 8 years later. SO as you can see, I am still using 16-year-old computers.
 
Enjoicube said:
MY GOD, please NEVER EVER say "will anyone want an A64 in 8 years?" YES, someone will. I still regularly use our 486 for Word documents, and I use my K6-2 for internet usage 8 years later. SO as you can see, I am still using 16-year-old computers.

I am sorry, but a 486 is not 16 years old.
 
The relationship between temperature and the life expectancy of a silicon-based chip is not linear. It's exponential, and the rule is that every 10°C increase halves the life of the chip. Also, voltage has a square-law relationship with heat: doubling the voltage produces four times the heat. The temperature won't rise four times as much, though, because of Newton's law of cooling. To tell you the truth, all of this is accurate, but it's probably too complicated to draw a final conclusion from.


Here's my personal experience from OCing. Extreme OCing, where you push your chip to the limits of whatever cooling and voltage your setup can provide, kills the CPU very quickly. When I say very quickly, I mean the max OC the chip can reach will decrease on a daily-to-monthly basis.

Case in point: I had an Athlon XP 1700+ that I overclocked to the max. After about 3 months of use, the machine would just crash when running any 3D game. I didn't play games every single day, so I don't know exactly when the problems developed. Now the chip isn't stable at any speed and is as good as a keychain.

Another experience, this time with a P4 2.26 GHz. It could overclock to 3.2 GHz, and I left it at that. After about a year, the machine started having random little crashes. I ran Prime95 and found that the max stable OC had fallen to around 2.9 GHz. I backed it off to 2.8 GHz, lowered the voltage further, and it seems to be all right.

Another story: a P4 2.4C that used to do 3.46 GHz. I backed it off immediately to 3.4 GHz for an even number and lowered the voltage a bit. It ran for 1.5 years with no problem until small crashes finally appeared. This one was peltier-cooled, and the max overclock slid to 3.2 GHz. I suspect the peltier made no difference to the lifespan of the processor, even though it lowered temps by over 15°C.

Moral of the story: no one knows what lifespan you will get with an OCed CPU. It's like playing the lottery; you may be lucky, and you may not be.
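
Taking the two rules from the start of this post together (life halves per +10°C; heat scales with the square of voltage), you can build a crude lifetime estimator. A sketch where the 20-year baseline and the temperatures are made up:

Code:
def relative_life(delta_temp_c):
    # Rule of thumb from the post above: life halves per +10 C.
    return 0.5 ** (delta_temp_c / 10)

# Hypothetical: a voltage bump raises load temps from 45 C to 60 C.
print(round(relative_life(15), 2))       # ~0.35x of baseline life
print(round(20 * relative_life(15), 1))  # a 20-year chip -> ~7.1 years

Even that pessimistic reading lands well past a typical upgrade cycle, though as the stories above show, extreme OCing can erode the maximum stable clock long before the chip dies outright.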
 
Enjoicube said:
MY GOD, please NEVER EVER say "will anyone want an A64 in 8 years?" YES, someone will. I still regularly use our 486 for Word documents, and I use my K6-2 for internet usage 8 years later. SO as you can see, I am still using 16-year-old computers.

I am using my XP 2600+, which has been OC'd to 2.2 GHz, for ~4 years now... it's the machine I'm typing on right now, actually. :) (1.67 GHz is the default.) Pretty kick-ass. :)
 
Yeah, I know that, and that is great. BUT the point "you probably won't have your chip in 6 years" is a BLATANT LIE.
 
Enjoicube said:
MY GOD, please NEVER EVER say "will anyone want an A64 in 8 years?" YES, someone will. I still regularly use our 486 for Word documents, and I use my K6-2 for internet usage 8 years later. SO as you can see, I am still using 16-year-old computers.

Heh, my wife's still using my old 600 MHz Athlon. I think I ditched my K6-2 400. These days, I'm expecting a good amount of life out of the parts I buy. After I'm done, it'll probably go to my wife or maybe friends or family. It'll be in use for a long time.
 
Enjoicube said:
Yeah, I know that, and that is great. BUT the point "you probably won't have your chip in 6 years" is a BLATANT LIE.

For most peeps here, I'd be surprised if they still had the same chip 1 year from now! :D
 
Enjoicube said:
Yeah, I know that, and that is great. BUT the point "you probably won't have your chip in 6 years" is a BLATANT LIE.

It's not a blatant lie. Might have just been an overstatement. 90% of people won't be using the same chip in 6 years. I'd say 95%+ of the chips people here are using now will still be functional in 6 years.
 
There's a company that buys old computer parts, anything with gold plating (cards, mainboards, memory, CPUs, etc.), strips them down, and then extracts the gold (with heat and other chemical processes)... it seems like a major PITA, but they turn a profit after all is said and done... I bet this is where many of those old chips will end up... :)
 