Do you overclock your GPUs? (Edited: fixed grammar)

Do you overclock?

  • Yes, boost is never enough

    Votes: 35 41.2%
  • No, boost is enough

    Votes: 44 51.8%
  • I disable boost and don't overclock at all

    Votes: 3 3.5%
  • What is overclock or boost? So confused!

    Votes: 3 3.5%

  • Total voters
    85

UltraTaco

Limp Gawd
Joined
Feb 21, 2020
Messages
150
Mates, with all these boosts and stuff, do you still do it? Taco's GTX 460 didn't have any of these boost features, but the new 960 has boost and Taco didn't even try overclocking. What about you?
 
I stopped bothering with the 1080 Ti, and haven't bothered since with my 2080 Ti or now 3080 Ti. There's just no point. The gains are so minimal now anyways. Just like the latest CPUs, there's little point if you can't improve the cooling. The thing is going to boost to just about the best it can anyways given the cooling it has.
 
FF14 always shit itself with graphics overclocks for whatever reason, so I stopped bothering; on my GTX 570 I actually had to underclock its factory OC. Only that game too, never had issues with anything else lol.
 
Mates, with all these boosts and stuff, do you still do it? Taco's GTX 460 didn't have any of these boost features, but the new 960 has boost and Taco didn't even try overclocking. What about you?

My 960 is overclocked into the stratosphere. It just kept saying GIVE ME MOAR! (y)
 
Not really anymore. I did it as a hobby back in the day, but I no longer find it as fun as I once did. Let the software take care of itself. Maybe I tweak the fans a bit to make them ramp up more aggressively, which may hold off throttling a little, but that is about it.
 
Not really anymore. I did it as a hobby back in the day, but I no longer find it as fun as I once did. Let the software take care of itself. Maybe I tweak the fans a bit to make them ramp up more aggressively, which may hold off throttling a little, but that is about it.

Not surprising. Older cards had much more headroom, and the firmware wasn't as locked down.
 
Not surprising. Older cards had much more headroom, and the firmware wasn't as locked down.

Can you blame them? They bin top-end GPUs, sell them for more money as "OC" versions, and then make sure Joe Sixpack doesn't grab some utility off the internet that edits his firmware, unlocks the voltage for more power, and bricks his card.
 
Yes I do. However, I admit that I have had some bad luck overclocking graphics cards over the years. I've killed a number of them in short order with very modest overclocks and no voltage increases. As a result, I'm sometimes gun-shy about doing it. I haven't touched the clocks on my RTX 3090, largely because I wouldn't want to deal with replacing it. I did do quite a bit of overclocking with my GIGABYTE RTX 2080 Ti Aorus Xtreme 11G though.

However, I've killed the following through overclocking:
GeForce 3 Ti 500
GeForce Ti 4200
GeForce Ti 4600
Radeon 9600 Pro
GeForce 7800 GTX 512
GeForce 8800 GTX (1 of 4)
GeForce 9800 GX2 (1 of 3)
GeForce GTX 280 (1 of 4)
GeForce GTX 680 (1 of 3)
Radeon HD 7970 GHz Edition (1 of 2)
GeForce GTX 980 Ti (1 of 3)
 
I beat the crap out of my Ti 4600. That thing lived a hard and fast life.
 
Can you blame them? They bin top-end GPUs, sell them for more money as "OC" versions, and then make sure Joe Sixpack doesn't grab some utility off the internet that edits his firmware, unlocks the voltage for more power, and bricks his card.

Nope, I don't blame them. People will try to RMA anything, and Amazon will take just about anything back.
 
I overclock the memory and let boost handle the core clock; it's the most stable approach I've found.

Usually it's a balance of performance to voltage.
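For a rough sense of what a memory overclock alone buys, the bandwidth math is simple. This is just an illustrative sketch; the clock and bus-width figures below are hypothetical examples for a 384-bit card, not measurements:

```python
# Memory bandwidth = effective transfer rate * bus width in bytes.
# Illustrative numbers only (roughly 3080 Ti-class: 19 Gbps effective, 384-bit bus).
def mem_bandwidth_gbps(effective_mhz, bus_width_bits):
    return effective_mhz * 1e6 * (bus_width_bits // 8) / 1e9

stock = mem_bandwidth_gbps(19000, 384)  # stock memory clock
oc = mem_bandwidth_gbps(20000, 384)     # hypothetical +1000 MHz effective OC
print(f"{stock:.0f} -> {oc:.0f} GB/s (+{(oc / stock - 1) * 100:.1f}%)")
# prints "912 -> 960 GB/s (+5.3%)"
```

Which lines up with the ~5% real-world gains people report in this thread: bandwidth scales linearly with the memory clock, but frame rates usually scale sub-linearly with bandwidth.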
 
Nope, I don't blame them. People will try to RMA anything, and Amazon will take just about anything back.

I mean, it was one thing when cards were $200. Now mid-range has graduated to $600, with smaller margins for the manufacturers (supposedly).
 
I mean, it was one thing when cards were $200. Now mid-range has graduated to $600, with smaller margins for the manufacturers (supposedly).
Honestly, that's a big reason why I stopped. Same reason why I don't bother to modify my cars anymore. No point in modifying a nearly $100k car with 800 HP from the factory. Same goes for a $1200 GPU for only like 5% performance gains.
 
I stopped bothering with the 1080 Ti, and haven't bothered since with my 2080 Ti or now 3080 Ti. There's just no point. The gains are so minimal now anyways. Just like the latest CPUs, there's little point if you can't improve the cooling. The thing is going to boost to just about the best it can anyways given the cooling it has.
This, basically. There's not much room and instability is game by game. It's hard to know when it's truly stable or not.
 
This, basically. There's not much room and instability is game by game. It's hard to know when it's truly stable or not.
This. I used to, when you could gain a lot. I remember explaining that my CPU overclock gained me almost 20 seconds on load times in UT2k4, and the GPU overclock was worth 20 FPS… Now? PBO and boost do everything I need. And it's stable.
 
The last great CPU overclocker for me was the Intel Core i9 10980XE. While it did boost to 4.8GHz on a couple of cores, its all-core overclocks were rather modest. (3.5-3.8GHz or something like that.) I clocked our test CPU to 4.8GHz all-core. The fucker pulled over 500W of power, but man did it ever perform. I knew then that it was probably the last time I'd ever see gains like that. Of course, percentage-wise we've seen better over the years, but the days of massive performance gains from overclocking or ridiculous clock increases are over. All these modern chips are binned so close to the edge of what they can do, there just isn't a lot of performance left on the table.
 
Yes I do. However, I admit that I have had some bad luck overclocking graphics cards over the years. I've killed a number of them in short order with very modest overclocks and no voltage increases. As a result, I'm sometimes gun-shy about doing it. I haven't touched the clocks on my RTX 3090, largely because I wouldn't want to deal with replacing it. I did do quite a bit of overclocking with my GIGABYTE RTX 2080 Ti Aorus Xtreme 11G though.

However, I've killed the following through overclocking:
GeForce 3 Ti 500
GeForce Ti 4200
GeForce Ti 4600
Radeon 9600 Pro
GeForce 7800 GTX 512
GeForce 8800 GTX (1 of 4)
GeForce 9800 GX2 (1 of 3)
GeForce GTX 280 (1 of 4)
GeForce GTX 680 (1 of 3)
Radeon HD 7970 GHz Edition (1 of 2)
GeForce GTX 980 Ti (1 of 3)
Would it not have been cheaper in the long run just to buy a faster card? Just asking.
 
Because back in the olden days, you could overclock and actually get 30-40% extra performance, sometimes more, and that was a lot when you're talking BF 1942 being a slide show or not.
This again. Used to be that you couldn't buy it, or if you could, it was price prohibitive. My 20-second CPU overclock was on a Barton core, and it made a difference. Faster than the top end, and cost far less, and the top end normally clocked about the same at best too. Now? It just doesn't make a huge difference.
 
philb2, looks like a list of best available cards at the time, and multiples running different SLI configurations of those. If the current best card still isn't quite enough, and you can only get so much more out of running SLI, then you have to start overclocking to see what else can be gotten out of them.
 
Let it boost. By the time I need the extra performance, I tend to find a need for a new card overall.

I follow this train of thought. If I need to overclock, then it's already too late. The 5-10 percent you get from manually overclocking past boost is, in my case, not worth the heat/power/headache.

I'm much more likely to power limit and undervolt than anything else.
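Undervolting pays off because dynamic power scales roughly with frequency times voltage squared (a first-order CMOS approximation; the voltages below are illustrative examples, not measured values for any particular card):

```python
# First-order dynamic power scaling for CMOS logic: P ~ f * V^2.
# Static/leakage power is ignored, so real-world savings will differ.
def relative_power(f_new, v_new, f_ref=1.0, v_ref=1.0):
    return (f_new / f_ref) * (v_new / v_ref) ** 2

# Hypothetical undervolt: same clocks at 0.90 V instead of 1.00 V.
print(f"{(1 - relative_power(1.0, 0.90)) * 100:.0f}% less dynamic power")
# prints "19% less dynamic power"
```

That quadratic voltage term is why shaving even 0.1 V off at the same clock cuts heat noticeably, while a small clock bump at raised voltage costs far more power than it gains.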
 
Would it not have been cheaper in the long run just to buy a faster card? Just asking.
I'm still on a 1080 Ti; when it came out, there were no faster cards.
The 2000 series was a joke for price to performance, so I skipped that generation.
And we know how 3000 series availability is going. That said, my OC'd 1080 Ti is holding its own really well considering how old it is and the stress I've put on it.
 
Since the 1080 Ti, I've been undervolting my 2080 Ti and 3090 because stock power consumption is getting out of control. Also, with G-Sync there's really no need anymore for that last bit of FPS because it doesn't look any better, but the noise levels keep going up if you push it.
 
Yes, I buy components with this in mind. Everything in my build is geared towards overclocking. It's a hobby that I enjoy. I run my 7980XE at whatever is stable and just under thermal throttle. I also disable SpeedStep. I enjoy the snappiness of the system being run this way. My 3090 is run at default clocks with the power slider at -40% when not under load. Quick and dirty OC for gaming, and flashing the XOC 1KW vBIOS and really turning it up to 11 for benchmark runs.
 
I used to, back when cards still had a bit of headroom available. But no, the factory boost is usually within 5% of maximum (and overvolting is insanely power-hungry).

My best overclocking cards:
Riva TNT: 15%
Radeon 8500: 20%
GTX 460: 25%

But because I never overvolted, I have never killed an overclocked card (all these cards were used for a few years, then sold).

But modern cards are more of a mixed bag.
 
On the 3080 I just run factory settings (factory OC), but I have OCed to find the limits of the cards. Not really any point in running an OC on top of factory clocks as there is only around 5-6% more to gain when running borderline stable.

The last great overclocking card I've had was an Asus GTX 580 ROG Matrix Platinum, as it could do more than 20% over a reference card on air without issues. Other than that, the cards have typically had a 5-7% gain with most of it on memory compared to factory clocks (all cards have been OCed 5% or more from factory on the core clocks).
 
When there's more competition, overclocking headroom vanishes. Consumers just see the raw frame rate numbers and most don't understand that certain compromises had to be made to get there. As I look at how today's CPUs and GPUs are being configured, it seems like a lot of them are being pushed harder at their factory settings than most overclockers would be pushing them. I would be surprised if many of these parts last beyond the warranty period.
 
When there's more competition, overclocking headroom vanishes. Consumers just see the raw frame rate numbers and most don't understand that certain compromises had to be made to get there. As I look at how today's CPUs and GPUs are being configured, it seems like a lot of them are being pushed harder at their factory settings than most overclockers would be pushing them. I would be surprised if many of these parts last beyond the warranty period.
They're far more resilient than they used to be - and due to the competition, the vendors (after initial release) know their parts well. PBO and optimized boost are well designed on the CPU side, and the GPU boost... same. We've gotten good at the feedback loop.
 
They're far more resilient than they used to be - and due to the competition, the vendors (after initial release) know their parts well. PBO and optimized boost are well designed on the CPU side, and the GPU boost... same. We've gotten good at the feedback loop.

You know, I didn't want to post this because I know a lot of people will want to shoot the messenger, and one data point does not a trend make, but I did some testing yesterday on the 5800X machine I built for my sister. When I first built it, the CPU could loop Cinebench at 4.7GHz at 1.3 volts. After 5 months of her gaming at BIOS-default everything, the CPU can only loop Cinebench at 4.65GHz at 1.3 volts. 4.7GHz fails instantly, and 4.675GHz fails within a minute. So 50MHz of degradation in 5 months. I wanted to wait another few months and test again to see if there was further degradation, but maybe it's better if I call some attention to it now. It might just be the initial break-in, but that's a lot more break-in than I typically see. My gut tells me that all that bouncing around at 1.40-1.49 volts at 4.6-4.8GHz while gaming is not something that will lead to a long life. I think that after maybe 3 years, the CPU is going to be in blue screen heaven. Intel should not be let off the hook either; they run some crazy voltages and current levels with TVB, ABT, and whatever other insane algorithms they use to keep up with AMD in benchmarks. I guess time will tell, since these chips are still relatively new...
 
Yes I do. However, I admit that I have had some bad luck overclocking graphics cards over the years. I've killed a number of them in short order with very modest overclocks and no voltage increases. As a result, I'm sometimes gun-shy about doing it. I haven't touched the clocks on my RTX 3090, largely because I wouldn't want to deal with replacing it. I did do quite a bit of overclocking with my GIGABYTE RTX 2080 Ti Aorus Xtreme 11G though.

I'm not overclocking my 3080, as I too don't want to be bothered in this climate to replace it. I also game maybe 6-7 hours a week at most. If I was on more, I'd probably be more inclined. It helps that I have a decent 3080 model to begin with.
 
Not only do I overclock them, I flash XOC BIOSes onto them, remove all the protections, and run them until they scream and beg for mercy. And if I can, I'll solder an EVC2 on and take voltage control too.
 
I've tried a few times (across a couple different generations) and the performance increases have always been minimal. Every now and then a little increase will take a game from 58fps to 60fps (with vsync), but that's really rare. I keep PX1 or Afterburner on my system just in case, but 99% of the time I never use it. Especially now that I have Gsync. Usually it's just a shortcut to a couple hundred more points in 3DMark and very little else.
 