Post your GTX 980 TI Overclock Results

I wouldn't expect any 250W computer component to last more than a few years.

It's always the lower power components that never seem to die.
 
Most video cards that see continuous/daily use typically don't make it much past the 3-4 year mark, mostly because the fan assemblies wear out and the board gets cooked. Either that, or the heat slowly does them in, especially if they're overclocked. The exception is any video card that's waterblocked... those pretty much seem to last forever.
 
My GTX Titan X is still going strong in my friend's PC, still using a custom 425W BIOS running at around 1450 MHz with just the ACX 3.0 air cooler.

My pair of reference EVGA GTX 780, bought new in the middle of 2013, are still used daily in my dad's PC for gaming when he has downtime with work.

My pair of BFG Tech 8800 GTX cards, 190W each, lasted about 7 years of regular gaming sessions of at least 6 hours a day, back when I was in university and later looking for work, before they gave up the ghost.
 
I wouldn't expect any 250W computer component to last more than a few years.

Why not? My GTX 780 Ti (which the 980 Ti replaced) is still alive and kicking in my girlfriend's PC, and it's 6 years old.

This 980 Ti is the shortest-lived graphics card I've ever owned. Even tried baking it to see if that would bring it back to life, but so far, no luck.

Oh well, already replaced it with an RTX 2070...
 
Finally got around to water cooling the 980 Ti KPEs I had, out of boredom. I'd had the waterblocks sitting in the closet, but considering the cost of these cards I didn't want to take the risk until now. Anyhow, now that they're water cooled, the temps max out under load at 35-38 or 38-40 degrees depending on the card in SLI. I set up a custom ROM at 1507 MHz, locked in at 1.212 V. I think I can go higher, since the cards sometimes boosted to 1544-1551 MHz, but when I tried to set those frequencies in Afterburner, Furmark would crash or 3DMark would go to hell in a handbasket. Any Kingpin owners care to share their highest clock rates?
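The step-up approach described above (raise the clock a notch, stress test, back off at the first failure) can be sketched as a small loop. This is just an illustration: the 13 MHz step matches Maxwell's boost bins, and the stability check here is a stand-in lambda, not a real Furmark/3DMark run.

```python
def find_max_stable_clock(start_mhz, step_mhz, limit_mhz, is_stable):
    """Step the core clock up from start_mhz until is_stable() fails or
    limit_mhz is reached; return the last clock that passed."""
    best = None
    clock = start_mhz
    while clock <= limit_mhz:
        if not is_stable(clock):
            break  # first unstable step: stop and keep the previous clock
        best = clock
        clock += step_mhz
    return best

# Pretend the chip gives out above 1544 MHz, as in the post.
# Starting at 1507 with 13 MHz steps: 1507, 1520, 1533 pass; 1546 fails.
print(find_max_stable_clock(1507, 13, 1600, lambda mhz: mhz <= 1544))  # -> 1533
```

In practice you'd replace the lambda with a wrapper that applies the offset and runs a benchmark loop, but the search logic is the same.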

Thanks
 
I have a 980ti Classy on water that I've benched at 1600 core/8612 mem, but it just barely made it through. I have several runs ranging from 1580-1590 core. I would think the KPE on water could match or better a Classy.

Here's a link to a TimeSpy run from a while back. I do not recall the exact OC settings in Afterburner though.

https://www.3dmark.com/spy/157587
 
I have a 980ti Classy on water that I've benched at 1600 core/8612 mem, but it just barely made it through. I have several runs ranging from 1580-1590 core. I would think the KPE on water could match or better a Classy.

Here's a link to a TimeSpy run from a while back. I do not recall the exact OC settings in Afterburner though.

https://www.3dmark.com/spy/157587


Maybe it's because I'm running SLI (almost typed SOL, lol). Anyhow, I got a ProbeIt cable to measure exactly what voltage the cards are getting: at 1506 MHz they're getting 1.18 V and 1.17 V depending on the card. But if I'm hitting a wall, it's all Maxwell's fault, not the Kingpin design, which, good as it is, can't do much to help overclock a bad chip.
 
If I remember right, my cards didn't care if there was a voltage increase in Afterburner. Can you disable SLI/PCIe slots on your setup to try the cards individually? I had a pair of Gigabyte GHz Ed. 780s prior to the Classy and they would hit the same clocks in single and SLI. I'm not sure if it helped, but they were consecutive serials.
 
Maybe it's because I'm running SLI (almost typed SOL, lol). Anyhow, I got a ProbeIt cable to measure exactly what voltage the cards are getting: at 1506 MHz they're getting 1.18 V and 1.17 V depending on the card. But if I'm hitting a wall, it's all Maxwell's fault, not the Kingpin design, which, good as it is, can't do much to help overclock a bad chip.

I thought the KPE had special chips to allow real voltage readings using Precision X? I know I read that somewhere before.
 
Found it. Pretty good read too if you want to go [H]C with your KPE.

Enabling real voltage monitor
There is one more trick regarding voltage readout in Precision X and K|NGP|N Edition cards (both 980 and 980 Ti flavours). As you all already know, software is unable to show the real voltage supplied to the GPU; it shows the current GPU VID setting, not the actual voltage. Most users are not aware of this detail and expect to see the real voltage. That's why we often get confused people in forums who see LN2 records at 2000 MHz clocks and ask how that's possible with only 1.212 V.

Well, KPE cards have a solution for this, as Precision X can actually read the real voltage. The KPE PCB has special circuitry to measure the real voltage delivered to the GPU chip, and Precision X can get that reading. This feature is not enabled by default, so to enable real GPU voltage monitoring, follow a few simple steps before you start your session:

  • Download the latest Precision X; at the moment of writing it was 5.3.6.
  • Set both OVERVOLTAGE and OVERBOOST mode in the Voltage section, top right. No need to change voltages.
  • Now VOLTAGE in the center section will read the actual voltage delivered by the hardware.
  • You can disable voltage control via the secret menu afterwards; the reading will still work.
The screenshot below shows example usage with 1.400 V set. GPU voltage is reported correctly in the hardware monitor graph, log, and OSD as well, which can be handy for monitoring voltage droop/change during benchmark sessions. And after you've found the sweet spot for voltage, disable the OSD and run for records (the OSD has a small performance hit when running, so you want it OFF when benching for records).

[screenshot: px_vmon.png]


There is no limit on the reading, so even 1.8 V will be reported correctly. Again, this works only for the EVGA GeForce GTX 980 KPE and EVGA GeForce GTX 980 Ti KPE cards.

https://xdevs.com/guide/maxwell_big_oc/
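Since the guide mentions the monitor graph and log are handy for watching droop during a bench session, here's a tiny sketch that summarizes droop from a voltage log. The two-column "seconds,volts" CSV layout is an assumption for illustration, not Precision X's actual log format.

```python
import csv
import io

def droop_summary(csv_text):
    """Given 'time,volts' CSV lines, report max/min voltage and the
    droop between them in millivolts."""
    volts = [float(row[1]) for row in csv.reader(io.StringIO(csv_text))]
    vmax, vmin = max(volts), min(volts)
    return {"max_v": vmax, "min_v": vmin, "droop_mv": round((vmax - vmin) * 1000)}

# Idle reads the set 1.212 V; under load it sags into the 1.17s.
log = "0,1.212\n1,1.212\n2,1.181\n3,1.174\n4,1.212\n"
print(droop_summary(log))  # droop_mv -> 38
```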
 
I thought the KPE had special chips to allow real voltage readings using Precision X? I know I read that somewhere before.

Well, I just got my ProbeIt cable and took real-world voltage readings. Right now I'm at 1506 boost, and at that speed Precision XOC will report 1.212-1.202 V depending on the card under Furmark load, but the ProbeIt with a volt meter shows 1.17-1.18 V. The difference is vdroop, which I can compensate for by disabling vdroop with the Classified Maxwell Tool, resulting in a solid 1.222-1.232 V depending on the card. Furthermore, I found the Precision XOC readings are only accurate when not under load, because of that vdroop; a fixed voltage level will seem spot-on accurate on my volt meter until the card comes under load. There's also a roughly 6 mV discrepancy, I think, where the card supplies about 6 mV more than the set value.
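For anyone following the numbers: the droop is just the software-reported VID minus the meter reading at the ProbeIt header. A quick back-of-envelope check (the exact VID-to-meter pairing per card is my assumption):

```python
def vdroop_mv(set_v, measured_v):
    """Droop in millivolts between the software-set voltage and the
    voltage actually measured at the card under load."""
    return round((set_v - measured_v) * 1000)

# 1.212 V set, 1.17 V at the meter under Furmark load:
print(vdroop_mv(1.212, 1.17))  # -> 42 mV of droop
# 1.202 V set, 1.18 V measured on the other card:
print(vdroop_mv(1.202, 1.18))  # -> 22 mV of droop
```

That 22-42 mV range lines up with the "solid 1.222-1.232 V" seen after droop compensation.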

Anyhow, my cards give 1.212-1.202 V at 1506 boost on their own, without software like Afterburner, as I recently modded my firmware to max out at a constant 1506. Sometimes a little less: for example, it'll do 1215 MHz in cut scenes and drop to 1300-1430 MHz when it doesn't need the horsepower, but it sits at 1506 most of the time anyhow. I'm sure I could lock in 1506 with KBoost, but I don't think it's necessary. I'm going to try to boost it higher soon, as my card has DIP switches that give it a +25 to +50 mV boost via hardware. It also has DIP switches to control vdroop, but I haven't had luck figuring those out.

Thanks
 