Anyone else not give a single hoot about overclocking

 
Today's CPUs, depending on the model, are designed to be overclocked. In fact, Intel's warranty allows for it. Also, modern CPUs have safeguards against overheating and damage. While not 100% foolproof, you would have to be an idiot to kill your CPU.

Fun little side fact: you know how you often hear about people who experienced a traumatic event, such as a car wreck, report that everything seemed to be moving in slow motion? This is because when the brain is under high stress or anxiety, to protect itself, it tosses aside pieces of your thought process in real time; hence the slow motion some people experience. Intel CPUs do this as well to guard against heat: they throttle, tossing aside cycles to reduce temperature.
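You can actually watch that throttling happen by sampling the core frequency and comparing it to the base clock. A minimal sketch (the sample values, base clock, and the sysfs path in the comment are hypothetical; real numbers depend on your CPU and OS):

```python
def is_throttling(samples_khz, base_khz, tolerance=0.95):
    """Return True if the average sampled frequency has dropped
    noticeably below the base clock (a sign of thermal throttling)."""
    avg = sum(samples_khz) / len(samples_khz)
    return avg < base_khz * tolerance

# Hypothetical readings, e.g. from
# /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq on Linux
hot_samples = [3_100_000, 3_000_000, 2_900_000]   # kHz, well under base
cool_samples = [3_700_000, 3_800_000, 3_900_000]  # at or above base

print(is_throttling(hot_samples, base_khz=3_700_000))   # -> True
print(is_throttling(cool_samples, base_khz=3_700_000))  # -> False
```

The tolerance exists because frequency bounces around a bit even when the chip is cool; only a sustained drop below base clock suggests throttling.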

With that said, overclocking is fun. It's also encouraged, by motherboard makers, CPU suppliers, etc., for everything from graphics cards to memory to a few other components.

I've had friends give me a lot of pushback when attempting to get them to overclock. These days, I may casually mention it, but I don't push. So I understand and get it.

I started out on the PC back in the very early '90s, back when 1x CD-ROM drives were $1,000 at CompUSA and 1 megabyte of memory would cost you $150-$180. So free performance is something guys from my era are definitely going to go after. I know I do.

I think everyone should spend the time and learn how to do it. It's so easy these days. But it's cool if you don't want to.
 
Overclocking CPUs has so little reward in it compared to the "good old days", when you used lower resolutions and games were more CPU-dependent. The sense of accomplishment from overclocking a CPU is gone these days; the feeling of reward afterwards is gone. I get it for those who run "competitive" settings, using lower resolutions in games to get the most FPS they can. It's a means to an end. But from what I can see, the majority would not notice any difference between an overclocked i7-8700K and a stock one in a blind test.

As you can see from all those CPU reviews these days, they need to artificially lower settings and eye candy just to get a difference that can't be explained by margin of error. The biggest difference these days is between 4-core i5 and 4-core+HT i7 CPUs. More cores is where it's at, even if they're only logical cores.

My little sister needed an upgrade to her system (pre-Sandy Bridge). So, because I like building and had an excuse, I decided to upgrade my main system, an i7-4790K. Due to time constraints (work, and I had time this weekend), I went into a store and bought the parts there instead of online. I wanted an AMD Ryzen 2700X build (I haven't had an AMD CPU since the Athlon days), but the lack of choice in motherboards forced me to get a somewhat decent Intel system (i7-8700K, Asus Prime Z370-A, Noctua NH-U14S cooler, 32GB 3000MHz DDR4). Annoyingly enough, they didn't have any good 16GB memory kits, so I had to get 2x 16GB 3000MHz sticks to get dual channel with at least 16GB.

TLDR:
Upgrading from an i7-4790K system to an i7-8700K system is the most boring system upgrade I have ever had, even though I put in a Samsung 960 NVMe for the OS just for the fun of it. Don't get me wrong, it's nice to have 2 extra cores / 4 extra logical cores, since more crap is running in the background these days. I also love building systems, so that was fun too.
But things don't feel much snappier than they did on a fresh Windows install on the i7-4790K. The things I do on this PC are mostly GPU-limited anyway, so the i7-4790K had breathing room already. All I got was more breathing room "just in case".

I activated MCE, which is probably the only "overclock" I am going to do on this CPU in the near future. In my use case, overclocking is more "academic" than something I would care about without an FPS counter.
 
CPUs have changed over the years. It's not worth the extra 10-15% vs. the heat and computer problems. I never OC and never will.
 
I can't decide between the i7-8700K or R7 2700, but if I can buy one of those used or cheaper than market price, it will help make the choice easier. I will even consider the i7-8700 (non-K) because I don't need to OC either. I think these are already high-performance chips - good enough for me.

OC is always an option, regardless, but I am more concerned about temperatures and power consumption. I am also looking at AIO coolers to make the case look tidier. So OC-ing is not the priority. Any advice?
 
I still have a 4770K desktop ONLY because I got it plus a board at Microcenter for $250 on a Black Friday. I only replaced my Xeon E3-1240 v3-based Windows box because the motherboard died. An i5-8400 + Z370 and 1070 SLI runs everything I throw at it. My game budget is no more than $5-$7 a game, so you can guess the age of the games I play.

I'd much rather learn about networking, routing and firewalling these days than overclocking.


By the time the games are that price, most of the bugs have been ironed out, so that's not a bad thing.
 
By the time the games are that price, most of the bugs have been ironed out, so that's not a bad thing.
The only problem is that by that time the online communities are mostly dead......so that was part of the reason for my descent into single player ;-)
 
The reason OC isn't exciting anymore is that Intel and AMD push the boundaries of their CPUs harder these days, due to the lack of real progress in performance jumps. We get stock speeds that are much closer to the chip's limits.
 
I started with overclocking my IBM AT from 6 MHz to 8 MHz by changing the clock crystal. Now that's old school :D

Later I was overclocking 386/25 chips to 33 MHz, and later 486/25s to 33 MHz.

Come to think of it, I even overclocked my old RadioShack Color Computer. I could double the clock rate (from 0.9 MHz to 1.8 MHz) with software.
The only problem was, when running at the faster speed it would scramble the screen. So, in order to speed up some slow programs I was writing, I'd switch it to 2x speed, run the long process with the scrambled screen, and then switch it back. :p


lol.... good to see some other old-timers here that remember changing clock crystals!
I still have a crystal in one of my old parts boxes, just can't bring myself to toss it. :)

The days of cutting mobo traces and soldering on variable pots to change voltages were fun too.

And you get kids these days saying overclocking is too hard..... it locks up and I have to change settings and try again. :rolleyes:

I'm an old crusty tech dude (Kyle's age!) but I still overclock my CPU. Running my 8700K at 4.8GHz.
I don't go extreme, but it drives me crazy to pass up free performance just for clicking some BIOS settings.

 
The only problem is that by that time the online communities are mostly dead......so that was part of the reason for my descent into single player ;-)

It's good for single-player games is what I meant. If you want to play a game for multiplayer, you must buy it at release.
 
I've enjoyed overclocking every one of my PCs since the early 2000s. That being said, I'm on a 10-year-old machine that still hangs due to overclocking. Being an old guy and seeing modern specs, I would probably just buy what I wanted next time and run it stock. Overclocking is fun and worthwhile if you have the time; I enjoyed proving out 100% stability. Reliability has always been my top priority after an overclock.
 
I enjoyed OC more when hardware could not keep up with software and OC would improve the operating-system experience. Today, operating systems are incredibly lean compared to the hardware available to them, and in fact most UIs tend to add latency to improve the user experience, e.g. transition effects, because there is so much compute power to spare.
 
Care, yes, but in reality how many really need to OC?
It's more for fun and getting the last ounce out of it. Why buy a new CPU that is 10-15% faster when you can OC the CPU you have and get the same?
 
I do give a hoot, I buy the best I can afford and then make it squeal.

i9-7900X @ 4.7 GHz. I will never have to complain about a slow computer.
But you will have random crashes... Honestly, it's the stability issues that made me stop overclocking quite a few years ago. Every time I've started getting back into overclocking, crashes push me more towards underclocking.
 
But you will have random crashes... Honestly, it's the stability issues that made me stop overclocking quite a few years ago. Every time I've started getting back into overclocking, crashes push me more towards underclocking.

No random crashes. Not sure why you would even say that. I can actually overclock it to 5 GHz, but it only runs Prime95-stable at 4.7. I can understand why you would want to avoid crashes, but usually dialing it back 200-300 MHz is enough to stabilize an OC.
 
For me, not overclocking is like leaving free performance on the table. Why not put in a little more time and jack up that Q6600 from 2.4 to 3.2-3.6 GHz, or that 4770K from 3.5 to 4.6 GHz?
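Those jumps are easy to put a number on. A quick sketch of the percentage gains from the overclocks mentioned above (pure clock-speed arithmetic, nothing system-specific; real-world performance scaling will be lower):

```python
def oc_gain_pct(stock_ghz: float, oc_ghz: float) -> float:
    """Percentage clock-speed increase from an overclock."""
    return (oc_ghz / stock_ghz - 1) * 100

# The examples above:
print(round(oc_gain_pct(2.4, 3.6), 1))  # Q6600, 2.4 -> 3.6 GHz: 50.0
print(round(oc_gain_pct(3.5, 4.6), 1))  # 4770K, 3.5 -> 4.6 GHz: 31.4
```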

Besides, the risks are minimal as long as you're not crazy with Vcore, and it's generally a lot easier and cheaper than, say, trying to boost a car engine for loads more horsepower. (Car guys probably look at a 3800, Barra or 2JZ like we do a Mendocino or Sandy Bridge, to give you an idea of how much more such cheap, common engines can take. It's not uncommon to double or even triple the HP out of those things with mostly stock components!)

If anything, the 4770K at 4.6 GHz still proved adequate enough that I didn't feel much of an itch to upgrade until recently, with the new i9/Threadripper setups, and it's not because of CPU performance so much as needing more PCIe lanes. 16 PCIe 3.0 + 4 PCIe 2.0 lanes isn't a whole lot when you start wanting things like video capture cards that need PCIe x4 slots, quad-channel USB 3.0 controllers that need PCIe x4 as well (thanks, Oculus, for wasting so much USB bandwidth with your sensor cameras!), and NVMe SSDs that need four PCIe lanes apiece.

Yet I want as much single-threaded performance as possible (thanks to the likes of ArmA, DCS and IL-2), which usually means "mainstream" Intel platforms that haven't improved at all on the lane count! Why? Because they overclock that much better, of course.
 
Brush? That's what fingers are for.
You're starting to sound like Steve "If you see a stylus, they blew it" Jobs, and all the crap digital artists had to sift through in the wake of the iPad's smash success a decade ago, territory no Tablet PC had reached before Microsoft's own Surface Pro. Who needs proper Wacom EMR pen digitizers or the newfangled Apple Pencil when your fingers are good enough?

Yeah, consider me jaded there - doubly so in the smartphone market where your only real choice is a Samsung Galaxy Note of some sort.
 
You're starting to sound like Steve "If you see a stylus, they blew it" Jobs, and all the crap digital artists had to sift through in the wake of the iPad's smash success a decade ago, territory no Tablet PC had reached before Microsoft's own Surface Pro. Who needs proper Wacom EMR pen digitizers or the newfangled Apple Pencil when your fingers are good enough?

Yeah, consider me jaded there - doubly so in the smartphone market where your only real choice is a Samsung Galaxy Note of some sort.
Lenovo Yoga + SIM card... a 14" phone. Screw that tiny Note.
 
Lenovo Yoga + SIM card... a 14" phone. Screw that tiny Note.
It would be funny to see phone calls come in on a laptop like that... That said, I don't really trust Lenovo too much after how much they screwed up the ThinkPad line.

Also, last time I checked, they dropped EMR for AES on their newer models. Another point of distrust right there, until I get to test it in person.

Oh, how the Cintiq Companion Hybrid has spoiled me in pen performance... I'm probably better off just pairing that up with a decent laptop (read: packing a GPU that isn't utter garbage, like the typical integrated Intel crap).
 
Gave up on overclocking many moons ago. If you overclock, every little glitch, problem or HW failure leaves you wondering if overclocking is the cause. I don't need that.
 
Gave up on overclocking many moons ago. If you overclock, every little glitch, problem or HW failure leaves you wondering if overclocking is the cause. I don't need that.

You gave up free performance. Imagine golden-chip 2600K guys saying what you said, stuck at a bottlenecking 3.4 GHz, when those chips usually do a golden 4.8-5 GHz.
 
I've overclocked pretty much every CPU and GPU I've ever owned, going back to an Athlon 64 3000+ to gain speed for BF2, to go with my pair of 6600 GTs in SLI.

But now I have my X5670 on a gifted X58 system, overclocked. I just want to game with the little spare time I have. My next rig will be higher-end than anything I have ever built for myself, and I probably won't OC until I need to.

The GPU will likely get bumped, of course, but that's a heck of a lot easier than CPU and RAM overclocks.
 
I used to overclock back in the day with the old Athlon 2500+ and the Intel Celerons, but now not so much. My upgrades are further apart now, so I just buy what I can afford and try to make it last around 6 years or so. I upgraded from an i7 950 to an i7-8700K and am very happy with the speed increase without overclocking.
 
Overclocking CPUs has so little reward in it compared to the "good old days", when you used lower resolutions and games were more CPU-dependent.

I agree with you, but not all people use powerful computers only for games.
I just really dislike when people boil everything about high-performance computing down to games, because gaming isn't the be-all of performance demands. It's not even close to being the hardest workload for CPUs.

But yeah, it IS probably the single biggest factor.
 
I don't typically OC, but I look for chips that CAN and buy those. It's nice to know you aren't running something flat out and there's headroom in there if it becomes necessary.
 
But you will have random crashes... Honestly, it's the stability issues that made me stop overclocking quite a few years ago. Every time I've started getting back into overclocking, crashes push me more towards underclocking.

You’re doing it wrong. Rock solid OC stability is.. kinda the point.
 
You’re doing it wrong. Rock solid OC stability is.. kinda the point.

Right, it's time-consuming, but it's fun, and cheaper than buying more expensive hardware. If you're smart about it, stability and more performance are easy to achieve. The tweaking part is starting to get old for me, though; I only have limited time to game, and overclocking has taken a lot of that time trying to get my OC stable.
 
I'd be happy if I could run my i7-4790K at stock with turbo enabled much less think about overclocking. I'm running it in an NCASE M1 with a Cryorig M9i cooler, and it reaches 84C when transcoding videos. I had to disable turbo (keeping it at 4.0 GHz) just to keep the CPU below 75C while transcoding videos. It's on an H81 motherboard, so I can't even lower the multiplier to keep it cooler.
 
I don't overclock either really. Not on my main PC anyway.

I don't get any joy out of overclocking. If you do: cool.
 
I'd be happy if I could run my i7-4790K at stock with turbo enabled much less think about overclocking. I'm running it in an NCASE M1 with a Cryorig M9i cooler, and it reaches 84C when transcoding videos. I had to disable turbo (keeping it at 4.0 GHz) just to keep the CPU below 75C while transcoding videos. It's on an H81 motherboard, so I can't even lower the multiplier to keep it cooler.
Have you thought about delidding your 4790K to keep it cooler?
 
Have you thought about delidding your 4790K to keep it cooler?

I thought about it, but I'd rather not if at all possible. I'd want to re-lid, and there are always risks associated with that. I was under the impression that the TIM in the 4790K was better than in the 4770K, so I thought my temps at stock with the 4790K would be no worse than on my 4770K. My 4770K ran at 65C under load.
 
I used to be all about overclocking - started with a Cyrix P150+ or whatever it was called, then a Celeron 300A @ 450 MHz, then a P3-500 @ 850. I loved tweaking and maxing out performance. These days I can't really be bothered; I have enough power that it just isn't necessary, and I have other hobbies to occupy my time.
 