Do you overclock your GPUs?

Do you overclock?

  • Yes, boost is never enough

    Votes: 35 41.2%
  • No, boost is enough

    Votes: 44 51.8%
  • I disable boost and don't overclock at all

    Votes: 3 3.5%
  • What is overclock or boost? So confused!

    Votes: 3 3.5%

  • Total voters
    85
Used to do that, including hardware mods, BIOS flashing, et al.
Nowadays I just load the driver and play card games. :-P

Was really into CPU overclocking, and now with AMD PBO it's like stealing candy from a baby.
There are more gains to be had (of course), but at the expense of much greater power consumption and heat.
A few fps or points in a benchmark, both meaningless in real-world, seat-of-the-pants experience, aren't worth the hassle. IMHO of course.
Some like to extract every last hp out of their car at the track, while others sit in their shiny new Tesla Plaid, switch to Cheetah mode, and spank and embarrass the former.
Interesting times we live in, that's for sure. Just as long as there's something for everyone. ;-)
 
Agreed, mate, thank you! Wasn't a die shrink supposed to lead to lower voltages? Why do we see such huge volts on such a small node?

It usually does, but not always. Intel's 14nm can take a little more voltage than their 22nm, but they've gone a little crazy in recent days too.

Part of me wants to keep checking on this CPU every couple of months to see if it keeps getting worse. You know, for science. Another part of me wants to be a good brother and just set it for around [email protected] manual.
 
You know, I didn't want to post this because I know a lot of people will want to shoot the messenger, and one data point does not a trend make, but I did some testing yesterday on the 5800X machine I built for my sister.

When I first built it, the CPU could loop Cinebench at 4.7GHz at 1.3 volts. After 5 months of her gaming at BIOS-default everything, the CPU can only loop Cinebench at 4.65GHz at 1.3 volts. 4.7GHz fails instantly, and 4.675GHz fails within a minute. So 50MHz of degradation in 5 months. I wanted to wait another few months and test again to see if there was further degradation, but maybe it's better if I call some attention to it now. It might just be initial break-in, but that's a lot more break-in than I typically see.

My gut tells me that all that bouncing around at 1.40-1.49 volts at 4.6-4.8GHz while gaming is not something that will lead to a long life. I think that after maybe 3 years, the CPU is going to be in blue-screen heaven. Intel should not be let off the hook either - they run some crazy voltages and current levels with TVB, ABT, and whatever other insane algorithms they use to keep up with AMD in benchmarks. I guess time will tell, since these chips are still relatively new...
It will be interesting for sure to see how they last. AFAIK der8auer is running AMD CPUs 24/7 for 6 months to see how they hold up, and I believe he is closing in on 6 months now. He did OC and temp tests before letting them run and is planning to redo the tests when the time period has expired. That should give us some indication of how they hold up.

The 5800X is a hot-running CPU, though. My sample will boost to 4.6GHz on stock settings in Cinebench, but even with high-end cooling it is around 70 degrees at steady state (approx. 47 degrees above ambient). While most games run much cooler, there are a few that run quite a bit hotter (typically COD and similar).
 
I've run benchmarks, and the gain from the overclock is kinda worthless for real use. In benchmarks, sure, there's that extra bump in score, but in reality I'm refresh-rate limited, so I see no point. I actually run lower power targets and fps caps.
 
I've run benchmarks, and the gain from the overclock is kinda worthless for real use. In benchmarks, sure, there's that extra bump in score, but in reality I'm refresh-rate limited, so I see no point. I actually run lower power targets and fps caps.
I'm really fond of Radeon Chill and set it up whenever I can/remember.
 
Used to do that, including hardware mods, BIOS flashing, et al.
Nowadays I just load the driver and play card games. :-P

I'm in the same boat. At 1440p my 3080 and even my 1080 Ti played everything fine. There just isn't really a need to mess with it outside of e-peen.
 
You know, I didn't want to post this because I know a lot of people will want to shoot the messenger, and one data point does not a trend make, but I did some testing yesterday on the 5800X machine I built for my sister.

When I first built it, the CPU could loop Cinebench at 4.7GHz at 1.3 volts. After 5 months of her gaming at BIOS-default everything, the CPU can only loop Cinebench at 4.65GHz at 1.3 volts. 4.7GHz fails instantly, and 4.675GHz fails within a minute. So 50MHz of degradation in 5 months. I wanted to wait another few months and test again to see if there was further degradation, but maybe it's better if I call some attention to it now. It might just be initial break-in, but that's a lot more break-in than I typically see.

My gut tells me that all that bouncing around at 1.40-1.49 volts at 4.6-4.8GHz while gaming is not something that will lead to a long life. I think that after maybe 3 years, the CPU is going to be in blue-screen heaven. Intel should not be let off the hook either - they run some crazy voltages and current levels with TVB, ABT, and whatever other insane algorithms they use to keep up with AMD in benchmarks. I guess time will tell, since these chips are still relatively new...
That could also be drivers, firmware updates, any number of things. Hard to be totally apples to apples, but it's a valid point.
 
Used to do that, including hardware mods, BIOS flashing, et al.
Nowadays I just load the driver and play card games. :-P

Was really into CPU overclocking, and now with AMD PBO it's like stealing candy from a baby.
There are more gains to be had (of course), but at the expense of much greater power consumption and heat.
A few fps or points in a benchmark, both meaningless in real-world, seat-of-the-pants experience, aren't worth the hassle. IMHO of course.
Some like to extract every last hp out of their car at the track, while others sit in their shiny new Tesla Plaid, switch to Cheetah mode, and spank and embarrass the former.
Interesting times we live in, that's for sure. Just as long as there's something for everyone. ;-)
Agreed. I keep the OC off on my Threadripper - sure, the scores and e-peen are fun at a 4.4GHz all-core clock, which the motherboard ~did for me~, but... idle is almost 65°C, load easily pings 80°C, it's summer and hot, and it pulls damn near 750W from the wall doing that. Dropped it to standard boost, and I pull 600W at max draw running benchmarks, and it idles at 48°C. Still have the BIOS profile saved, just in case, but... I rarely use it.
 
There are enough variables in PC gaming already that cause instability and inconsistent performance. Some games are much more sensitive to OCs than others, too. So I've stopped messing with it (at least on the GPU) and let it boost on its own. I'll definitely take more stability over a negligible performance gain on a VRR display at 60+ FPS as well.
 
There are enough variables in PC gaming already that cause instability and inconsistent performance. Some games are much more sensitive to OCs than others, too. So I've stopped messing with it (at least on the GPU) and let it boost on its own. I'll definitely take more stability over a negligible performance gain on a VRR display at 60+ FPS as well.
Yeah, for this reason I just try to OC with fan-curve adjustments to get a more consistent boost clock. I haven't had any problems with this method. I also don't go crazy with the mem OC.
 
Undervolting is way more effective on most cards. Overclocking will trigger the power limit way too much; with undervolting you will see that far less, and that is what counts. When the card really needs to churn out the most fps, you normally run into the power limit.
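A toy model may make that power-limit point concrete. Dynamic power scales roughly with V²·f; the scaling constant, power limit, and voltage/clock points below are invented for illustration, not taken from any real card:

```python
# Rough dynamic-power model: P ~ c * V^2 * f.
# The constant c and the operating points are made-up illustrative values.

def dynamic_power(volts, mhz, c=0.15):
    """Estimate dynamic power in watts for a given core voltage and clock."""
    return c * volts**2 * mhz

POWER_LIMIT_W = 300  # hypothetical board power limit

stock = dynamic_power(1.05, 1900)        # stock voltage at 1900 MHz
undervolted = dynamic_power(0.90, 1900)  # same clock, lower voltage

# At the same clock, the undervolted point draws (0.90/1.05)^2 ~ 73% of the
# stock power, so it stays under the limit where the stock curve throttles.
print(f"stock: {stock:.0f} W, undervolted: {undervolted:.0f} W")
```

Real cards are messier (leakage, clock stretching, per-bin V/F curves), but that is the basic reason an undervolt keeps boost clocks pinned while a raw overclock slams into the power limit.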
 
I use an AC adapter that has one. Many people use UPSes that have them.
 
Corsair AX series PSUs have a built-in meter you can view through their software.
Mine is about 9 years old, and was made before they came out with monitoring software.

I am planning to get a UPS that measures real-time power consumption of all the plugged-in devices.
 
I always want the full stack, so I use the killawatt for what's pulling from the wall.

This is a horrible photo, but you get the point. It gives you total input and output and efficiency!

Mine is about 9 years old, and was made before they came out with monitoring software.

I am planning to get a UPS that measures real-time power consumption of all the plugged-in devices.
That’s interesting, I thought they all had it. Mine isn’t that old, so that is good to know!
 
This is a horrible photo, but you get the point. It gives you total input and output and efficiency!


That’s interesting, I thought they all had it. Mine isn’t that old, so that is good to know!
Doesn’t get the monitors, speakers, etc. though (I do get a baseline on them first, so I know what they pull and what the system pulls). :) I tend to be worried about the full kit.
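That baseline-subtraction approach can be sketched in a couple of lines; all wattages here are hypothetical examples, not real readings:

```python
# Sketch of baseline subtraction for whole-desk power measurement.
# Wattage values are hypothetical, not measurements.

def system_draw(wall_total_w, peripheral_baseline_w):
    """Subtract the pre-measured monitor/speaker baseline from the wall reading."""
    return wall_total_w - peripheral_baseline_w

baseline = 85   # monitors + speakers measured alone on the meter (hypothetical)
wall = 635      # meter reading with the PC under load (hypothetical)

print(f"PC alone: ~{system_draw(wall, baseline)} W of the {wall} W at the wall")
```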
 
Doesn’t get the monitors, speakers, etc. though (I do get a baseline on them first, so I know what they pull and what the system pulls). :) I tend to be worried about the full kit.
Oh nice, I didn't think about that. I would have to do the math and would need two Kill A Watts to determine it; it doesn't fit on one circuit breaker with the water chiller.
 
Oh nice, I didn't think about that. I would have to do the math and would need two Kill A Watts to determine it; it doesn't fit on one circuit breaker with the water chiller.
They do make some that feed off a circuit breaker (or two) and send to your phone. But that also captures the whole room, and depending on how the house was wired, all sorts of OTHER weird shit sometimes too (like my office also feeds the doorbell... on the other side of the house).
 
I built an overkill custom cooling loop with all EK, Bitspower, & Alphacool components. That is how I can achieve the highest boosts possible. Along with the extreme/gaming presets in the MSI 2080 Ti Seahawk EK-X software, I can max out the card with 3 power connectors plugged in. The cooler the card, the higher it will boost, until it reaches its max boost, within reason.
 
Well, taco has stock air cooling on hers and the card boosts very high as well

...is overkill.
My rig is dead silent. Yours isn't.

Also, the other components in your case are baking, since your case is an oven; mine isn't.

You may not care, some of us do.
 
My rig is dead silent. Yours isn't.

Also, the other components in your case are baking, since your case is an oven; mine isn't.

You may not care, some of us do.
My rig is also dead quiet... yes, it's overkill, but it's fun to do a loop, and with 2x480 and 2x360 rads on my 3960 and 3090, the GPU runs at about 1950-2100MHz (power limited) at a whopping 40°C at 100% load, dead quiet, and worth it to me. Before I got the 3090 block, the card would run at about 70°C and only clock to 1850MHz or so.
 
My rig is also dead quiet... yes, it's overkill, but it's fun to do a loop, and with 2x480 and 2x360 rads on my 3960 and 3090, the GPU runs at about 1950-2100MHz (power limited) at a whopping 40°C at 100% load, dead quiet, and worth it to me. Before I got the 3090 block, the card would run at about 70°C and only clock to 1850MHz or so.
Exactly, I didn't want to get into the details of how it boosts higher with custom loop cooling because of the thermal headroom. So thanks for chiming in, mate.
 
Yes it's baking, but it will still outlast its usefulness. There!!
At slower boost clocks, and with a shorter lifespan for the card & other components in the case because of prolonged exposure to higher temperatures.

The thread is about how to get higher boost clocks by OCing. It's not hard to see the benefits of custom loops - not just performance but other aspects as well, as the previous member mentioned in the last post.
 
I don't usually OC my card anymore, but I do run a custom fan curve that is much more aggressive than stock. Presumably that would allow it to boost higher, but I haven't done any actual testing.
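For what it's worth, a custom fan curve of the kind tools like Afterburner let you draw is just piecewise-linear interpolation between breakpoints; the breakpoints below are invented examples, not anyone's actual curve:

```python
# Minimal piecewise-linear fan curve: (temperature °C, fan duty %) breakpoints.
# Breakpoint values are invented for illustration.

CURVE = [(40, 30), (60, 50), (75, 80), (85, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Linearly interpolate fan duty between the two nearest breakpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]   # floor duty below the first breakpoint
    if temp_c >= curve[-1][0]:
        return curve[-1][1]  # max duty above the last breakpoint
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(67.5))  # halfway between the 60 °C and 75 °C points -> 65.0
```

A "more aggressive" curve just shifts the breakpoints so the duty ramps earlier and steeper, which holds temperatures (and thus boost bins) more consistently.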
 
Overclock till the day I die, or they die and I replace them. I clipped this out of a Path of Exile 4K screenshot.
 

Always OC, but I don't always run them daily. I haven't done variable-resistor volt mods or pencil mods since the HD 3870 or so, though, and I've been following the shunt mods but probably wouldn't do one. Now I tweak the BIOS if it supports voltage adjustment, if I can't get it with Afterburner, or live with the voltage, but I've been tempted to hop back into hard modding for higher OCs.

Some of these cards can push fairly high if you can keep them cool and boost the voltage.
 
No idea what that means, but I gave a like anyways. Stage 1, you mean P-state? Mine is on P5 when gaming, not sure. Confused, P5 state is like the max performance level. G'night

P0 is max performance. That's where you should be while gaming, unless the game is Stardew Valley.
 
Don't we just hard-OC the boost now? I mean, with BIOS-locked power limits, that is all we are doing :geek:
 
Having jumped from an RX 480 to a 6700 XT, I'm not too concerned, as I'm still using the same 1440p 60Hz monitor. Plus it's like going from 200HP to 400HP, so I've got power to spare for what I do. The joy of skipping two or three gens.
 
Still OC just for fun mostly. Have to scratch the itch to see what the card will do.

That said, it was a bit disheartening to get MPT up and running, case fans screaming and all at 100°C junction due to a 330W target, and discover that I'm barely getting 4 extra fps.

The OC-ing effort to performance ratio has been questionable for several generations now.
 
Getting a FreeSync/G-Sync-compatible monitor really sucked the wind out of my GPU overclocking sails. If 90fps looks pretty close to 100fps, I'm not going to waste a bunch of time trying to get another 2-4.
I started to work on a custom fan curve until I realized that I was basically getting the same clocks as the factory curve, except with much higher noise. Instead I just ramped up the power limits and left it alone.
 