Post your GTX 980 TI Overclock Results

I see you run it at 1080p. It may not throttle at that resolution. 1080p is not a good test for the 980 Ti; it won't max out your card. At 4K it will throttle by 200-300 MHz.

If you have a no-name PSU then I'm almost positive it's the PSU. Don't trust any no-name PSU to output its advertised wattage. Your 700 watt is probably gutless, and the rating may be AC wattage rather than DC. Also, if it's not 80 Plus certified, its efficiency is probably 60-70%. 70% of 700 W AC is 490 W DC.

The EVGA 850 G2 is rated for 850 W of DC output. That means it can draw up to roughly 950 watts of AC power at the wall. Huge difference. Your PSU is probably not defective, it's just a gutless wonder and can't handle the 980 Ti. Don't skimp on your power supply if you're buying a $700 video card!

I'm 99% certain it's industry standard that the wattage rating is output, not input. At NotebookReview we discussed this thoroughly at one point while figuring it out.

So when you buy a 700 watt PSU at 60% efficiency, it'll put out 700 watts but draw about 1,167 watts at the wall. An 80% unit would draw 875 watts.
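The wattage arithmetic being argued here is easy to sanity-check. A quick sketch (the function name and rounding are mine, purely illustrative):

```python
def ac_draw_watts(dc_output_w: float, efficiency: float) -> float:
    """AC power drawn at the wall to deliver dc_output_w at a given efficiency."""
    return dc_output_w / efficiency

# The figures quoted in the thread:
print(round(ac_draw_watts(700, 0.60)))  # ~1167 W at the wall for 700 W DC out
print(round(ac_draw_watts(700, 0.80)))  # 875 W
print(round(ac_draw_watts(850, 0.90)))  # ~944 W, roughly the EVGA 850 G2 case
```

Either way, if the rating-is-output convention holds, the wall draw is always higher than the label, never lower.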

Again, if I'm pulling 80% TDP in FurMark with no issue, how is 50% TDP or less crashing in games? That makes no sense. Also, the test is running at 100% load and 30 FPS; a higher resolution won't make a difference.

4K MSAA 8x, 1550 MHz, +0mV
https://www.dropbox.com/s/4ederdsabtkbblx/furmark 4k msaa8x 1550 +0mv.png?dl=0

375 seconds with minor throttling. I forgot what the original clock is, but it's about 20 MHz less.

I get this voltage-drop theory, but I don't believe it's a max-output issue. The numbers don't add up for that, though the up-and-down, changing voltages would make sense if that were the problem.
 
I see you guys mention FurMark issues - what about Kombustor? Isn't it based on FurMark? I tinker with it, but generally use Heaven and Firestrike in my tests.
 
Yes, industry standards and reality are different: http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story4&reid=335

Well, if it's not your power supply, that means it's your card. It'd be a shame to RMA a card that hits 1550.
 
I agree. I am just trying to figure it out. I would hate to have to buy a new case and PSU just to troubleshoot. I will buy a new one eventually, but I'd rather not spend that money yet.

I'll make a thread on that (buying items)
 
Just got my MSI 980 Ti today. Only overclocked it to 1250/1850 so far.

My ASIC is terrible: 61.5%

Firestrike score with 5820K @ 4.3 GHz:

t0ncwk.jpg
 
I was able to test PS2 successfully for a few minutes at stock and it runs at 90% TDP, so for whatever reason FurMark, GPU-Z, and Afterburner are not reporting correctly while FurMark runs. I do think it might be the PSU now. I made a thread about a case. If I buy a new PSU I am getting a new case, so if you have opinions on a case for my needs, let me know.

http://hardforum.com/showthread.php?t=1869174
 
Boy howdy, does Wolfenstein TNO not like my overclock - it results in a TDR after 30 minutes even after bumping the core voltage up a smidge. I've got it on +50mV now, and if I get another TDR it looks like I need to revisit my OC.
 
Try hitting above 18K.

I can reach 17,331 with the card overclocked to 1275/1900.

I don't think over 18K is within reach, since my card isn't 100% stable at 1275/1900. I think my CPU is holding me back as well, since it'll only go to 4.3 GHz.
 
Yes, at 4.3 GHz you aren't touching 18K. I managed 18K only after overclocking the processor to 4.6 and hitting 1500 boost on my Classified.

17.3K is what I got with 4.2 on the proc and 1468 on the core.
 
My continuing pursuit of the elusive 5K in Firestrike Ultra led me to Kingpin and his BIOS. His BIOS does the one thing my card was struggling with: it pegs the voltage at a rock-solid 1.212 V. Fired up his BIOS and ramped the clocks up to 1524/7366.

Card remained at 65-66C during benching. No change in temperatures whatsoever. However, the throttling is gone.

And guess what...

I hit fucking 5K ... on AIR COOLING.

5K

Some more results.

http://www.3dmark.com/3dm/7804929? Firestrike 18238
http://www.3dmark.com/3dm/7804969? Firestrike extreme 9183
 
It's hard to believe you are not throttling at all. I am using the same core clock as you, but memory at 7800, and I throttle like crazy in parts of Firestrike - yet I still got a higher graphics score than yours.
 
Waiting for my water block to arrive, but with 1.23v I can hit 1495MHz on my EVGA ACX using a custom BIOS. Just an extra 5 MHz, though, and it's unstable after an hour or two. +455 on the memory; also seeing a good FPS increase in Far Cry 4 with the memory OC. We'll see what it can do with 1.271v.

Got the block installed yesterday, very impressed with the results. Increased the voltage through the BIOS to 1.281v (1.274v after vdroop) and I'm stable at 1550MHz/+488 on the memory. Temps are great, mid-40s after 3 hours of Witcher 3. It may go higher; I just wanted several hours of testing at 1550 to ensure complete stability. Firestrike graphics score: 21400!
 
I had to stop OCD'ing on overclocking and benchmarking for what little benefit there really was - minor number changes. I do enjoy tweaking for the best overclock, but I have really neglected the main reason for buying SLI 980 Tis... and that's GAMING! So, I put the damn slider for Core at +250 and Memory at +275.

Gives me about 1455 on the core and about 7560 on the memory. No artifacts and everything is stable. I'm going to fire up Witcher 3, heavily modded Skyrim, Batman, Wolfenstein, etc. ... and just play.


...then, I'll overclock and test again tomorrow :rolleyes:
 
The voltage mod gave me higher clocks, up to 1540, but Firestrike scores weren't much better. I just reverted back to the stock BIOS at 1480 and get pretty much the same performance. Did it to see how much it would improve.
 
So my system specs are in my signature. I ended up setting a modest +109 on the core on my EVGA SC ACX cooler, which puts me at an even 1400MHz on boost. My Firestrike score is around 15100 on average. Does this sound right? With the OC I do notice about a 700-1000 point difference in scores. Is the main reason I'm lower than the people getting 16000+ in Firestrike the fact that I'm on an i5 and not getting higher Physics and Combined scores? Even at my highest OC, which was about 1530-1550 on boost, my score is only around 15350 in regular Firestrike (trial version).
 

I'd say the i5 is what's contributing to the lower Firestrike scores. I see folks with those hexacore i7s destroying my score with the same OCed 980 Ti.

Speaking of OCing, I had a pretty stable overclock at +140/+500 which was solid in TW3 and Wolfenstein, but it caused a hard lock in BF4 with lots of yellow artifacts all over the screen. Does that sound like a memory issue? I backed the memory down to +450 and will keep an eye out.

This reminds me of OCing my GTX 980, where BF4 hated ANY amount of memory OC and I settled on just adjusting the core clock. Hope that's not the case here.
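On the i5 question: the overall Firestrike number combines the Graphics, Physics, and Combined sub-scores (a weighted harmonic mean), so a weak CPU drags the total even when the graphics score is strong. The weights and sub-scores below are assumed placeholders for illustration, not 3DMark's published constants:

```python
def overall_score(graphics, physics, combined,
                  w_g=0.75, w_p=0.15, w_c=0.10):
    """Weighted harmonic mean of the three Firestrike sub-scores.
    Weights here are assumptions, not 3DMark's official values."""
    return 1.0 / (w_g / graphics + w_p / physics + w_c / combined)

# Same hypothetical GPU (graphics) score, different CPUs:
fast_cpu = overall_score(19000, 14000, 8500)
slow_cpu = overall_score(19000, 9000, 7000)   # weaker i5 physics/combined
print(round(fast_cpu), round(slow_cpu))
```

The harmonic mean punishes the weakest component, which is why the same OCed 980 Ti scores noticeably lower overall on an i5 - and why the post below says to compare graphics scores, not overall scores.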
 
People look at the graphics score, not the overall score, when comparing GPUs ;)

Here's mine at 1550mhz/+480 memory:
 
Figured out how to run 2 PSUs, so I could try my EVGA G2 1300 on my GPU, and guess what! My GPU crashing woes are over. I am stable at stock in every test in Kombustor.

I am stable in every test except physics in Kombustor at 1551/3758!!!

For whatever reason physics is very unstable on this GPU, which I bet is what's happening with many of you guys when you play certain games. If the game is physics-heavy, that might be why it's unstable there but not elsewhere. Try MSI Kombustor and see if you crash in the physics test. This GPU architecture seems to be touchy in that test.

Also, I have added 20 mV as well. I don't think I need it; I just added it to be on the safe side to see if I could get up to 1550. I stopped there because I am happy. I'll try some higher clocks and lower voltages later.

So is 1550/7500 pretty good? That's just where I stopped on my Gigabyte G1.

Why do some games require more TDP when they are using memory and GPU at the same utilization?

Saints Row 3 at 4K uses like 60-80% TDP and Planetside 2 uses 80-120+%, both at 100% GPU (or less at times for PS2).

Saints is 100% GPU all the time, while Planetside 2 goes up and down in utilization and its TDP is way, way higher.

Batman: Arkham Knight is like 60-80% TDP with 60-100% utilization.

Why are they so different?
 

Occupancy says how much of the GPU is being utilised, not what it is doing.
Code efficiency will have a bearing too.

Take, for example, tests like Unigine Heaven and FurMark:
both run the GPU at 100%, but one of them can use a lot more power and make the GPU a lot hotter.
 

I don't get how that can happen. It has the power to do only X amount of work, and that is the same no matter what uses it, so if it's at 100% it should be the same no matter what - unless:

100% can happen when a program only uses a single aspect of the GPU, like the CUDA cores or the ROPs, or even both, which would result in different total power numbers.

I forget the technical names these days. It's gone from something I don't even remember, to pixel pipes and vertex pipes and so on, to compute cores or something.
 
It's the same thing for CPUs. A game can push your CPU to 100% and it will use nowhere near the power of running your CPU at 100% with Prime95.
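The point about 100% load not implying maximum power can be made concrete with a toy model: "utilization" only says the chip is busy, while power depends on which functional blocks the workload actually exercises. The unit names and wattages below are invented for illustration, not real 980 Ti figures:

```python
# Made-up per-block power figures for a hypothetical GPU.
UNIT_POWER_W = {"cuda_cores": 150, "rops": 40, "memory_bus": 50, "video_engine": 10}

def board_power(active_units):
    """Total draw = static/idle power plus each block the workload lights up."""
    base = 30  # idle/static power, also a made-up figure
    return base + sum(UNIT_POWER_W[u] for u in active_units)

# Both workloads would report "100% utilization", but draw different power:
furmark_like = board_power(["cuda_cores", "rops", "memory_bus"])  # hammers everything
game_like    = board_power(["cuda_cores", "memory_bus"])          # ROPs mostly idle
print(furmark_like, game_like)  # → 270 230
```

This is the same reason a power virus like FurMark runs hotter than a game at identical reported load.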
 
I already explained that isn't the case.
You could have researched before replying.

Power has many different ways of being defined.
Saying a GPU has the power to do X amount of work isn't relevant unless X is very specific; it has the ability to do many different things.

Execution units within a processor have different elements.
Not all are required for certain functions, but the unit is still occupied nonetheless.
A function can utilise a different number of circuits within an execution unit.
 

That's what I said -_- wow, way to read. Coding efficiency has no impact on this. If someone codes to use just the compute units, the compute units can only do X amount of work. Coding can't make a compute unit pull more or less energy if it's already 100% used. rofl
 
Anyone know why, for the life of me, I can't get my GPU to take more than 1.224v? No matter what, I can't get it to draw 1 mV extra. I am trying to get 1.25v so I can actually test 1600MHz. 1550 works most of the time at 1.224v, but it could use a hair extra as well. I just can't get it to actually draw extra. What is wrong?
 
Some cards are simply capped at a certain voltage limit. And 1550? Lol, sorry, I don't believe for a second that 1550 is game stable at settings that actually tax the card, like demanding games with settings cranked at 1440p or 4K.
 
Don't know if I posted my updated overclock results, but since I went SLI I can do 1480/7800 all day long, though it gets quite noisy and hot.

I have kept it at 1450/7700 for daily use with a bearable fan speed. Individually the cards will overclock to 1524/7800 on the Classified and 1510/7800 on the ACX SC+. I could possibly get my Classified to 1550 if I gave it 1.23 volts, which I haven't done thus far.

25692 Firestrike
15847 Firestrike Extreme
8633 Firestrike Ultra

Some pictures as well.

Z0LOZ7V.jpg


PYHDdwv.jpg
 
1550 is stable in Planetside 2 (and all my games - PS2 is the worst) with 1.224/1.243v at 4K max settings. I hit 123% power, rofl, at 1.224v.

I figured out what's wrong. I either have to have 1.224 or 1.243v. I had to crank the slider all the way to +50 (or was it +60?) before it went to 1.243v. I can crank the slider to its max of +87mV and nothing happens. I can't get 1.25 or 1.3v like I wanted to. I guess I have to use the BIOS?

I can't get 1600 to be stable in PS2, but it works fine in other things. I'll give it a try in some other games that don't have me hitting 132% TDP (wtf). I bet it's my crap ASIC (65.7) that is causing this. 1570MHz at 1.243 is almost stable... it works for a while, but Planetside 2 has some crazy power draw (132%!!!). It didn't even crash at that power draw :D. It happened much later in battle, like an hour in. Alternatively:

Arkham Knight
Saints Row 3
ARK: Survival Evolved
and others

at 4K do not have anything close to 100% draw at 1550MHz. They chill at like 60-80% TDP. I might get away with 1600MHz in those with reasonable success. I don't expect it to be 100% stable, but I expect it to work well. I'll try some ARK and other games at 1570 and 1600 to see how they go.

As I stated before, physics appears to be very unstable at any OC. I bet it has to do with this architecture or the drivers. Pure core benchmarks and usage can take a high OC, but physics crashes at like 1450-1500 for some reason.

Planetside 2, on the other hand, is freaking nuts! I hit 132% TDP!!!! I had 1570MHz at 1.243v and I was good for a while, until I hit an area that was being blown up like Iwo Jima!!! This was also at 7600 memory. I haven't messed with memory yet.

I am finding certain games are way more stable than others, so I am trying to make a custom OC for each, since I am trying to get 4K in all of them :)

F@H only works well at 1500. For some reason it can go to 1600 without crashing, but the load is all wonky. It goes to 100%, then slowly lowers to 50%, then goes back to 100% for a few seconds, and cycles in that manner continuously. That happens at anything over 1520 or so. :/

EDIT: Yeah, I am stable at 1550 in everything I have tried so far. I complete Firestrike at 1550, but anything higher crashes. I can play many games at 1570 for a decent amount of time to indefinitely, depending on the game, but 1600 crashes in everything. If I had more voltage, MAYBE, but I doubt it. I could use a hair more voltage for 1570 for sure in those select games like PS2... wtf, seriously, 132% TDP o_O

Oh, also 1600 is stable in FurMark (and F@H, sort of) but nothing else, lol. I think it has to do with physics. Physics is so damn unstable at any OC. Can you guys try the MSI Kombustor physics test at various OCs? If you have the same issue as me, then the architecture has major OCing issues in regards to physics, and not plain compute/core usage.

3DMark 11 for fun:
http://www.3dmark.com/3dm11/10104063

Firestrike (are those scores any good? I am at 1550/7600):
http://www.3dmark.com/3dm/7926024?
 
Your graphics score is much less than 1% better than what I get bouncing around between 1460-1523 (1490 average), so you must be throttling at some point, at least by a little bit.

http://postimg.org/image/65jgvzns3/

So what clocks did you use? I had like 100 points higher on GPU, lol. Though I don't know if it matters that I have 2 screens plugged in (4K/1200p), Chrome, and a bunch of monitoring programs. Also I had voltage at max, which might have been counterproductive, but I'm not sure. I had voltage at max to make sure voltage wasn't a stability issue. I can try lowering it. I only capped TDP at 123%, so throttling shouldn't be an issue. I've hit 132% before, rofl.

FYI, when I looked at the charts I didn't notice throttling. I tend to have nearly static clocks, but I'll test again.

EDIT: Like 10 points higher this time. There is a dip in usage in one benchmark. I see GPU usage go down but haven't a clue why. Power isn't high, temp isn't high, clocks are 100% static, the CPU is clear, and everything else looks good, but it drops in utilization for some odd reason in a single test - that's it. Everything else is 100% static. Weird, right?

EDIT 2: I tried 1.224 volts for 1552/7600 instead of 1.243v and it auto-shifts down to 1540MHz; I got halfway through the test before the drivers crashed, so I need 1.243v for 1552/7600 for sure. I wish I could get some extra voltage without having to tinker with the BIOS so I could try 1570 or 1600, but I guess 1550/7600 is pretty good :) I just need to tinker with the memory some more for max OC. I am pretty happy with the card so far and it does great in games. I am happy I passed that 1500 marker :) That's what I really wanted, so I am happy, especially considering the crap ASIC, lol.
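For anyone wanting to double-check throttling the same way, one approach is to log clocks during a run with `nvidia-smi --query-gpu=clocks.sm,power.draw --format=csv,noheader -l 1` and scan the log for dips below the target clock. A minimal sketch, using a made-up sample log:

```python
# The sample log below is invented for illustration; a real log would come
# from redirecting the nvidia-smi command above to a file during a bench run.
sample_log = """1552 MHz, 245.3 W
1552 MHz, 250.1 W
1539 MHz, 252.0 W
1552 MHz, 248.7 W"""

def throttle_events(log_text, target_mhz):
    """Return (sample_index, clock) for every sample below the target clock."""
    events = []
    for i, line in enumerate(log_text.splitlines()):
        clock = int(line.split(" MHz")[0])
        if clock < target_mhz:
            events.append((i, clock))
    return events

print(throttle_events(sample_log, 1552))  # → [(2, 1539)]
```

A one-second poll can still miss very brief dips, so a monitoring tool's min-clock readout is worth cross-checking against a log like this.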
 
Here's mine at 1555MHz/+480 memory compared with yours, SomeGuy. About a 4% difference in the graphics score; you must be throttling at some point during the bench. Mine is under water and does not throttle. Also, what's your 4770 clocked at?

 
4.2 GHz, so a meh OC. I sorta had my old 3920XM at 4.3, lol. I can't get any higher :/

As I posted above, I am not throttling, but it loses utilization for some odd reason. Frequencies are 100% constant through the entire test. It just drops in usage for some weird reason... no clue why. But again, clocks are 100% constant, not even a 1MHz drop through the whole thing. It just won't use the whole GPU -_-
 
You guys must be getting some throttling going on, because my 290X CrossFire setup can get 22K in Firestrike normal. I get hammered on physics, though, due to running just an 8350 CPU.
 
Well, a 980 Ti is only about 50% faster than a 290X, and you are getting only a 33% better score than a single 980 Ti, so that sounds about right.
 
Hmm, I just figured the 980 Ti SLI would score better than that overclocked, since I am running at default clocks on the 290Xs - well, one runs the core at 1050, the other at 1000. Though I admit I am no benchmark junkie.
 

We are getting 21K on a single OC'd 980 Ti, not SLI.
 