NVIDIA GeForce GTX 980 SLI Overclocked GPU Review @ [H]

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
55,629
NVIDIA GeForce GTX 980 SLI Overclocked GPU Review - We take 2-Way GeForce GTX 980 SLI and overclock both video cards. While we are at it, let's also overclock Radeon R9 290X CrossFire and GeForce GTX 780 Ti SLI. Once we get these clocked to the highest rates we can reach, let's see how they all compare at 4K resolution, NV Surround, and AMD Eyefinity.
 
Wow, first? Nice review. I like the voltage discrepancy mentioned between the two cards. It's about time someone acknowledged this finding. The more I played with overclocking my cards, the more I noticed that this limited my overclocking. The card with the lower voltage would not be able to sustain the overclock I had stable on the higher-volted card because of the lack of voltage. It took a lot of testing: trying only one card in the system, then the other, then both, and back again. This would also cause in-game stuttering, and even a crash here and there in some games. Now that I have gone further and fixed the issue via a BIOS flash, my cards are much more stable and my overclock is higher because of it.
 
i pinged evga about the status of my lengthy step-up to reference gtx 980 sli over the weekend. i asked if the step-up would happen faster if i switched over to the acx gtx 980s instead, and the rep replied that if i chose to do so i'd have to go back to the end of the queue. after reading this article, maybe i should change my step-up to the acx cards. hmmm......
 
I just have to throw in my own experience here.
You can look at what I own.
I am no fanboy.
I run both my systems on the same triple monitor set you all use, same drivers, same in game settings.

My clocks on my 980s are straight lined at 1446 in game.
My 290Xs run at 1000.

I find little or no frame rate difference between the two setups. I have tested Crysis 3, Far Cry 3, Metro Last Light Redux and Alien Isolation.

I do find just the opposite of what you all say.....CrossFire stutters, especially in Metro, whereas SLI does not......both systems are very playable, but all else equal, I'd go SLI.:D
 
i pinged evga about the status of my lengthy step-up to reference gtx 980 sli over the weekend. i asked if the step-up would happen faster if i switched over to the acx gtx 980s instead, and the rep replied that if i chose to do so i'd have to go back to the end of the queue. after reading this article, maybe i should change my step-up to the acx cards. hmmm......

I run my Zotac reference 980s at 1446 Mhz in game.
I have the fans set at 75% and the temps never go above 70C.:D
 
I run my Zotac reference 980s at 1446 Mhz in game.
I have the fans set at 75% and the temps never go above 70C.:D

Yeah, my reference cards never went above 75C on the stock fan curve in games. Heaven or FurMark was different, and maybe 3DMark Fire Strike too. Not sure why the temps were so hot in the review. On top of that, in the past I have used a custom fan curve at stock BIOS volts for even better temps.

Now I have an aggressive custom fan curve because my BIOS runs my volts at 1.275 solid on both cards, and in games I rarely get above 70C. The noise isn't bad either. :confused:
 
I'm surprised the smoothness is being somewhat attributed to AMD going over the PCIe bus. I would have guessed that a custom card-to-card bus, keeping the PCIe lanes clear, would have been better.

Not that I really care, I have no plans for multiGPU :)
 
I have to say that was one crappy overclocking 290X.. same with the 780 Ti.

1150 on the core and 1500 MHz on the RAM with Elpida is very common on the 290X WITH THE REFERENCE cooler... and Elpida is not even that good. My ASIC score is only 74.1 and it tops out at 85C with this overclock, fan running at 55%.

My GTX 780 Ti runs 1255 MHz on the core and almost 7.5 GHz on the memory and never reaches over 83C with the fan at 100%.

This article is like damage control. I am still waiting on the next NVIDIA flagship card.. these cards aren't that good.. I want raw horsepower rather than relying on optimizations and overclocks.
 
As it stands, the big advantage of the 980 is not its performance (only 10% faster than a 290X, and that advantage somewhat disappears in SLI vs 290X CrossFire), but its low power consumption, which enables bigger OC headroom. That, and your room does not feel like a sauna after a couple of hours of gaming - although that's an advantage in the winter.


Wow, first? Nice review. I like the voltage discrepancy mentioned between the two cards. It's about time someone acknowledged this finding. The more I played with overclocking my cards, the more I noticed that this limited my overclocking. The card with the lower voltage would not be able to sustain the overclock I had stable on the higher-volted card because of the lack of voltage. It took a lot of testing: trying only one card in the system, then the other, then both, and back again. This would also cause in-game stuttering, and even a crash here and there in some games. Now that I have gone further and fixed the issue via a BIOS flash, my cards are much more stable and my overclock is higher because of it.



It's always harder to overclock multiple cards. The heat output is doubled, of course, and you will be bottlenecked by the weaker of the two cards.

All in all, you gotta lower the max clocks by around 100 MHz or so (more if your weak card is really bad on the silicon lottery) to make it work out.
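As an illustration, the rule of thumb above fits in a few lines of Python. The per-card clocks here are hypothetical, not figures from the review:

```python
# Hypothetical per-card maximum stable clocks (MHz), found by testing each card alone.
single_card_clocks = [1505, 1440]

# Rule of thumb from the post above: the pair runs at the weaker card's clock,
# minus roughly 100 MHz of extra margin for the added heat in SLI/CrossFire.
MULTI_GPU_MARGIN_MHZ = 100

def estimated_multi_gpu_clock(clocks, margin=MULTI_GPU_MARGIN_MHZ):
    # Bottlenecked by the weakest card, then backed off by the safety margin.
    return min(clocks) - margin

print(estimated_multi_gpu_clock(single_card_clocks))  # 1340
```

A really unlucky second card just drags `min(clocks)` down further, which is why matched silicon matters more in multi-GPU setups.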
 
I have to say that was one crappy overclocking 290X.. same with the 780 Ti.

1150 on the core and 1500 MHz on the RAM with Elpida is very common on the 290X WITH THE REFERENCE cooler... and Elpida is not even that good. My ASIC score is only 74.1 and it tops out at 85C with this overclock, fan running at 55%.

My GTX 780 Ti runs 1255 MHz on the core and almost 7.5 GHz on the memory and never reaches over 83C with the fan at 100%.

This article is like damage control. I am still waiting on the next NVIDIA flagship card.. these cards aren't that good.. I want raw horsepower rather than relying on optimizations and overclocks.

You aren't running in SLI/CF configuration. Temps are much more difficult to keep under control in a 290X CF config.
 
I run my Zotac reference 980s at 1446 Mhz in game.
I have the fans set at 75% and the temps never go above 70C.:D

very reassuring to hear mr magoo. i'll stay on the reference train. though i must disagree with you about your previous comment regarding sli vs xdma cf smoothness. at 4k and restricting myself to only a 2 card solution, i find myself sitting in the 30-40fps range more often than not. at this resolution the performance of 290x cf, gtx 780 ti sli, and gtx 780 6gb sli is pretty close among the three arrays i've dallied with in this last generation of cards. aside from some momentary hitches i find that the cf platform produces none of the trademark micro-stutter flutteriness of the sli set-ups.
 
Thanks for the thorough review.

I like reading about the overclocking, even though I haven't bothered since the gtx280 days :)
 
as a new owner of the msi 980 4g, i'm a little worried about 4k gaming in the next 1-2 years.

4k gaming with current and past games is obviously very taxing even for SLI.
imagine what it'll be like when that next "crysis" state-of-the-art game for 4k comes out; an sli 980 setup won't even be worth the try.

the 980 could be the last badazz 1080p card we see, everything after this will be 4k gaming ready.
 
It's always harder to overclock multiple cards. The heat output is doubled, of course, and you will be bottlenecked by the weaker of the two cards.

All in all, you gotta lower the max clocks by around 100 MHz or so (more if your weak card is really bad on the silicon lottery) to make it work out.
While true, this doesn't explain the differing voltages with this release. The problem has only arisen with the 344-release drivers on Kepler and Maxwell cards. The voltage discrepancy did not occur with previous drivers. ASIC quality differences are not an explanation, either, because my 780s differed in ASIC quality more than my current 970s do, and there was never any discrepancy in voltages with the 337.88 driver and SLI. Without modifying the BIOS, one will always get a voltage reliability warning on at least one card when running in SLI.

Yes, you will not be able to overclock as high as in single-card configurations, but with the added voltage discrepancy, overclocking and in fact stability are being compromised to a higher degree.

The "official" response from NVIDIA? Add a note in the driver release notes that this is normal behavior :rolleyes:. I bet they're hoping that this will quiet us.
Page 10 of 344.65 Release Notes said:
Differing GPU Voltages in SLI Mode
When non-identical GPUs are used in SLI mode, they may run at different voltages. This occurs because the GPU clocks are kept as close as possible, and the clock of the higher performance GPU is limited by that of the other. One benefit is that the higher performance GPU saves power by running at slightly reduced voltages.

An end-user gains nothing by attempting to raise the voltage of the higher performance GPU because its clocks must not exceed those of the other GPU.
NVIDIA Customer Care said:
We recently clarified in the 344.65 driver release notes that it is normal for the voltage to differ in some cases, as the system maintains the cards at the same clock speed, and in some cases cards of the same model can achieve the same clock speed at differing voltages. If this is not what you are seeing, or you are seeing different clock speeds on your cards as well as different voltages, please let me know.
So just because both cards are running the same clocks, it's okay...

DirtySouthWookie on the master thread over on the GeForce forums summarized it best:
DirtySouthWookie said:
Long Read Warning: Cliffs notes= "Nvidia you're wrong about ASIC voltages!"

The feedback I sent:

Is the SLI voltage issue mentioned in the driver notes officially being blamed on asic quality?

Out of the 8 EVGA GTX 980 cards I own, I have matched TWO of them with identical 80% ASIC quality. The problem persists even though the ASIC quality is identical. Furthermore, a lower ASIC quality card will draw MORE voltage regardless of the PCIe slot or whether SLI is enabled.

These cards perform with the proper voltages only when hooked up without SLI.

For testing purposes, I used my lowest ASIC quality card (out of 8), which was 65%, in the top PCIe SLI slot. I paired this with my highest card, which is 85%, and ran the 85% card in the lower PCIe slot.

The 65% ASIC quality card was set at the proper voltage of 1.21v during load, while the 85% ASIC quality card ran at 1.15v.

If Nvidia's theory about ASIC quality is correct, once I swap the cards (top to bottom), the voltages should stay the same on each card due to their individual ASIC quality.

It is quite the opposite. No matter what the card's ASIC value, the top (primary) card will always run at the correct voltage (1.21v) and the second card will run at a lower voltage (1.15v), regardless of their individual ASIC quality. This causes instability and is being complained about across all of your GeForce forums.

I'm looking forward to resolving this issue. Advanced users are reporting that a custom BIOS resolves it. I do not believe the customer should have to flash a custom BIOS, because this voids any warranties.

This issue will determine whether I need to return these 8 GTX 980s and go back to the 700 series, which DID hold a constant matching voltage regardless of ASIC %.

I do not understand how ASIC quality was ignored for years and never caused problems, but is now being blamed for SLI instability on new flagship cards.

My 660s, 680s, 770s, 780s and 780 Ti cards did not suffer any of these problems, and none of them had ASIC % numbers anywhere close to matching.
And, of course, the moderators are deleting his posts... Thankfully someone quoted this in a reply before it was deleted.
https://forums.geforce.com/default/...n-the-other-driver-bug-/post/4361150/#4361150
 
This article is like damage control.

You are entitled to your opinion, but I really take offense to that. We are not running damage control for NVIDIA. I would suggest you read some of our other articles on the 980.
 
I have to say that was one crappy overclocking 290X.. same with the 780 Ti.

1150 on the core and 1500 MHz on the RAM with Elpida is very common on the 290X WITH THE REFERENCE cooler... and Elpida is not even that good. My ASIC score is only 74.1 and it tops out at 85C with this overclock, fan running at 55%.

My GTX 780 Ti runs 1255 MHz on the core and almost 7.5 GHz on the memory and never reaches over 83C with the fan at 100%.

I think all the overclocks were a bit conservative. Even the 980s could perhaps have been pushed further with more voltage. Many can reach another 25-50 MHz as well.

I disagree about the damage control comment too.

No one else has even acknowledged the voltage discrepancy on top of that. [H]ard tells the story like it is.
 
I think all the overclocks were a bit conservative. Even the 980s could perhaps have been pushed further with more voltage. Many can reach another 25-50 MHz as well.

First off we know that all cards overclock differently.

Our overclocks are 100% stable overclocks for hours and hours of gaming. There is no doubt in my mind that we could get the benchmark gameplay pushed out with higher clocks as well, but that is not the way we do things.
 
Is the micro stuttering apparent at lower resolutions? Is it an internal bottleneck with the massive 4K images, etc.?

I do remember a review site taking frame times and seeing a lot of dropped frames they couldn't explain. I wonder if it's related. They said it didn't affect gameplay though (they couldn't notice any side effects), and it may have been their capture system. They were at a lower resolution and higher FPS....
 
Last edited:
Is the micro stuttering apparent at lower resolutions? Is it an internal bottleneck with the massive 4K images, etc.?

I do remember a review site taking frame times and seeing a lot of dropped frames they couldn't explain. I wonder if it's related. They said it didn't affect gameplay though (they couldn't notice any side effects), and it may have been their capture system. They were at a lower resolution and higher FPS....
SLI frame pacing *feels* better to me with Maxwell than it did with Kepler. Someone was speculating that it may be a hardware compatibility issue with the program they use to capture frame times. Looking at and comparing my Metro 2033 benchmarks, there are fewer valleys in the graphs it generated with Maxwell than with Kepler at the same settings. There were some weird peaks, though, where it was a complete vertical line for a millisecond or two every now and then. Kepler didn't have those anomalies. Overall, though, Maxwell does feel smoother to me.

I play at 1080p, by the way. I can't compare to Crossfire because I've never owned an AMD multi-GPU system.
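For what it's worth, the kind of frame-time log these capture tools write out is easy to summarize yourself. A minimal sketch in Python; the frame times below are made-up illustrative numbers, not captured data:

```python
import statistics

# Made-up frame times in milliseconds, standing in for a capture tool's log.
# A smooth 60 FPS run sits near 16.7 ms; the ~33 ms entries are the "valleys"
# you would see as dips in an FPS graph.
frame_times_ms = [16.7, 16.9, 16.5, 33.1, 16.8, 17.0, 16.6, 31.8, 16.7, 16.9]

def stutter_stats(times, spike_factor=1.5):
    """Summarize a frame-time log: median pace plus any frames that took
    much longer than the median (a crude micro-stutter indicator)."""
    median = statistics.median(times)
    spikes = [t for t in times if t > spike_factor * median]
    return {
        "median_ms": median,
        "worst_ms": max(times),
        "spike_count": len(spikes),
    }

print(stutter_stats(frame_times_ms))
```

An even average FPS can hide exactly this pattern, which is why frame-time graphs show stutter that FPS counters miss.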
 
A most useful article. I noticed several significant dips at 4K. Did you measure VRAM usage to see if the GPUs were running out of VRAM? Also, I didn't spot any comment about noise: how noisy was it with both GPUs at 100% fan?
 
Good read. Benchmarking OC'd cards is a bit of a crapshoot, but I think it gives a general idea of what to expect.

For the smoothness, I agree the CF 290X cards feel very smooth in most games. Mantle games take that smoothness an even bigger step higher in my experience.

Now waiting on next gen cards please!
 
Is there any chance of a future review comparing SLI GTX 970s to crossfire 290Xs, since they are at about the same retail price now?
 
A most useful article. I noticed several significant dips at 4K. Did you measure VRAM usage to see if the GPUs were running out of VRAM? Also, I didn't spot any comment about noise: how noisy was it with both GPUs at 100% fan?


Yes, those are loud at 100%. I think Brent mentioned it on the overclock page.
 
I have to say that was one crappy overclocking 290X.. same with the 780 Ti.

1150 on the core and 1500 MHz on the RAM with Elpida is very common on the 290X WITH THE REFERENCE cooler... and Elpida is not even that good. My ASIC score is only 74.1 and it tops out at 85C with this overclock, fan running at 55%.

My GTX 780 Ti runs 1255 MHz on the core and almost 7.5 GHz on the memory and never reaches over 83C with the fan at 100%.

This article is like damage control. I am still waiting on the next NVIDIA flagship card.. these cards aren't that good.. I want raw horsepower rather than relying on optimizations and overclocks.
No, just no. I have had a very good, enjoyable experience with my 290X CrossFire setup, but they simply are not overclockable in CrossFire with the reference cooler. The top card will throttle. I use headphones so I've let my fans go up to 65-70% and there are still games that will cause my top card to throttle even without a voltage increase.

Non-reference cards are a different story, but the reference coolers on the 290X were garbage and AMD knows it. That's why the 390X will have a much better cooler, based on the leaks we have so far.
 
Why compare reference vs custom made card and not reference vs reference, or custom vs custom?
It's obvious from graphics card history that the custom models work better than the reference ones. It doesn't seem a fair comparison to pit one company's "basic" design against the other company's "top" design :eek:
We are using two XFX Radeon R9 290X Double Dissipation video cards for testing. While these are custom cooled video cards, these aren't the best cooled 290X video cards. If we were to have used two reference R9 290X cards we wouldn't be able to overclock at all since these can clock throttle even at stock settings depending on your cooling setup.
I know that you explained your point in the quoted text, but I still think that you have to compare similar things to have comparable results.
 
Why compare reference vs custom made card and not reference vs reference, or custom vs custom?
It's obvious from graphics card history that the custom models work better than the reference ones. It doesn't seem a fair comparison to pit one company's "basic" design against the other company's "top" design :eek:

I know that you explained your point in the quoted text, but I still think that you have to compare similar things to have comparable results.

it really comes down to an availability issue. when the tis were in the retail channel, and even when production tapered off ahead of the gtx 9xx release, cards with the reference cooler were well represented in the market. often, the reference cards were the cheapest option for prospective buyers. the opposite is true for the 290s. as soon as the aftermarket cooled cards popped up, the reference cooled cards began to disappear like the bad dream that they were. right now, the only aib that still sells reference cards is powercolor, and even then it isn't the cheapest solution available to buyers.
 
Why isn't 'price' a factor in these AMD/Nvidia comparisons? If the 290X and the 980 were the exact same price, then of course the 980 would be the uncontested winner. But according to the benchmarks in this review, AMD Crossfire 290X is providing 80% - 100% the performance of SLI 980, but is doing it at 55% of the cost ($300 290X vs $550 980). That's a $500 difference when running them in pairs. Since we actually have to pay for our cards, I'd say this makes AMD 290X Crossfire the winner here, not 980 SLI.
 
Great review, glad the voltage issue is mentioned. Was hoping to see some 1440p results though!
 
Why isn't 'price' a factor in these AMD/Nvidia comparisons? If the 290X and the 980 were the exact same price, then of course the 980 would be the uncontested winner. But according to the benchmarks in this review, AMD Crossfire 290X is providing 80% - 100% the performance of SLI 980, but is doing it at 55% of the cost ($300 290X vs $550 980). That's a $500 difference when running them in pairs. Since we actually have to pay for our cards, I'd say this makes AMD 290X Crossfire the winner here, not 980 SLI.

They didn't outright say buy GTX 980s, they kind of left it open for you to draw your own conclusion based on what matters to you. They laid all the cards on the table.

People don't always factor price in, so it's best not to draw a solid conclusion, because different things matter to different people.

To give some kind of real-world example: people don't go into a Mercedes-Benz dealership after reading a review and end up settling for a Honda Accord Coupe because it costs less. (Although they are awesome to drive.)

Sometimes people want the fastest regardless of how much it costs.

Budget-conscious people would never have walked into the Benz dealership in the first place. Here at [H]ardOCP, many/most (but not all) people want to see what is faster. I'd say if I cared about price I'd lean toward a 290X, but since I find the 980 a good value overall, and they are the fastest, I choose them. They still have more potential, being a newer architecture; drivers will allow them to perform even better over time as they mature. They run cooler, use much less power (even when overclocked), are quieter, and have a bunch of features that people like.

I won't say anything extreme, e.g. take that budget talk to the oft Forum :)
 
Why isn't 'price' a factor in these AMD/Nvidia comparisons? If the 290X and the 980 were the exact same price, then of course the 980 would be the uncontested winner. But according to the benchmarks in this review, AMD Crossfire 290X is providing 80% - 100% the performance of SLI 980, but is doing it at 55% of the cost ($300 290X vs $550 980). That's a $500 difference when running them in pairs. Since we actually have to pay for our cards, I'd say this makes AMD 290X Crossfire the winner here, not 980 SLI.
That's like suggesting that the Formula One team that made a car 80% as fast as the fastest car, but only spent 55% as much should be the winner.

That's also a false economy, in a sense. If you think about the cost of the rest of the system required to make use of an SLI 980 setup, the extra three or four hundred dollars the graphics cards cost isn't that much to spend when you're talking about getting 20% more performance.

If, conservatively, you're spending $1000 on the guts of the box, plus another $1000 or so on monitors, then the total cost of the 290Xs might be something like $2600, vs. maybe $3000 for the 980s. At that price point, the extra 20% looks like a much better deal, to me, especially if it also comes with benefits like less noise and power consumption.
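Putting rough numbers on that argument, using the street prices quoted in this thread and the ~80% relative-performance figure from the earlier post (both assumptions, not benchmark data):

```python
# Street prices quoted in this thread (late 2014), in USD -- assumptions, not benchmarks.
PRICE_290X, PRICE_980 = 300, 550
REST_OF_SYSTEM = 1000 + 1000          # guts of the box + monitors, per the estimate above

# Relative performance: 290X CrossFire at ~80% of 980 SLI, per the earlier post.
REL_PERF_CF, REL_PERF_SLI = 0.8, 1.0

def dollars_per_perf(card_cost, rel_perf, system_cost=0):
    # Lower is better: total dollars spent per unit of relative performance.
    return (card_cost + system_cost) / rel_perf

# Cards alone: CrossFire is the clear value winner.
print(dollars_per_perf(2 * PRICE_290X, REL_PERF_CF))                   # 750.0
print(dollars_per_perf(2 * PRICE_980, REL_PERF_SLI))                   # 1100.0
# Whole system: the gap closes, and 980 SLI comes out slightly ahead.
print(dollars_per_perf(2 * PRICE_290X, REL_PERF_CF, REST_OF_SYSTEM))   # 3250.0
print(dollars_per_perf(2 * PRICE_980, REL_PERF_SLI, REST_OF_SYSTEM))   # 3100.0
```

The point the whole-system view makes: once fixed costs are included, the dollars-per-performance ordering can flip even though the cards alone heavily favor the cheaper pair.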
 
I'm sorry, but I can't agree with "People don't always factor price in". The vast majority of people do take pricing into account. In fact, it's probably the single most limiting factor when deciding what components to purchase. If pricing were not an issue, we'd all be running $1,000 processors on $500 motherboards with $3,000 worth of video cards, etc. If pricing were not an issue, AMD and Nvidia would only release one card per generation: their ultimate, highest-performance card. In reality, we have multiple versions of cards that vary in performance along a whole range of pricing tiers, because if you can't afford card X, you can drop down in price and purchase card Y.

Your Mercedes-Benz/Honda Accord analogy is fundamentally flawed. The performance difference between those two cars is tremendous. The overall difference in performance between 290X CrossFire and 980 SLI is minor, at best, and yet the difference in cost is not inconsiderable. In fact, you could run 290X Tri-Fire vs 980 SLI and still have $200 left over.

Yes, there are differences in energy consumption, in features, etc. But the most important metric in choosing a video card is, "How much performance can I get for $X?"
 
That's like suggesting that the Formula One team that made a car 80% as fast as the fastest car, but only spent 55% as much should be the winner.
What's with these misleading car analogies?

This isn't a race. Nobody cares how much a company spends on their race car, only that it finishes first. We're talking video cards here. People care about how much things cost because they don't have unlimited funds.

That's also false economy, in a sense. If you think about the cost of the rest of the system that's required to make use of an SLI 980 setup, the extra 3 or 4 hundred dollars the graphics cards cost isn't that much to spend when you're talking about getting 20% more performance.

If, conservatively, you're spending $1000 on the guts of the box, plus another $1000 or so on monitors, then the total cost of the 290Xs might be something like $2600, vs. maybe $3000 for the 980s. At that price point, the extra 20% looks like a much better deal, to me, especially if it also comes with benefits like less noise and power consumption.
Most people only have $X amount to spend. If they spend more on one component, then they have to spend less on another. Spending $1100 on 980 SLI makes zero sense to nearly everybody with a budget when they can get almost the exact same performance from 290X Crossfire for $600. That extra $500 can then be spent on a faster processor, a larger SSD, more memory, etc.
 
@Creig, those are your opinions, not everyone else's facts.

My post left things as: I understand where you are coming from, but not everyone feels that way (e.g. me). Your post is saying: this is how you and everyone should feel, because this is how I feel.

See the difference?

The car analogies were added to emphasize that people pay for first place; how much they pay depends on the person.
 
What's with these misleading car analogies?

This isn't a race. Nobody cares how much a company spends on their race car, only that it finishes first. We're talking video cards here. People care about how much things cost because they don't have unlimited funds.

A GPU review IS a race. The fastest card wins, just like in an auto race, and I'd argue that some folks probably do have hardware budgets that are more or less unlimited. I'm sorry if you don't fall into that category, but lucky for you, there are less expensive options than a GTX 980 SLI setup.

Most people only have $X amount to spend. If they spend more on one component, then they have to spend less on another. Spending $1100 on 980 SLI makes zero sense to nearly everybody with a budget when they can get almost the exact same performance from 290X Crossfire for $600. That extra $500 can then be spent on a faster processor, a larger SSD, more memory, etc.

Spending $1100 on the 980s totally makes sense to someone who wants the additional performance they deliver, and can afford to spend the extra money on it. Whether or not you fall into that category yourself is something only you can decide.
 
I am not disagreeing with either of you. For someone who has the money to spend and doesn't care about price/performance ratios, the 980 makes perfect sense. What I am saying is that the vast majority of people don't have unlimited 'fun money' at their disposal to buy nothing but top shelf components. And for that reason, price should be taken into consideration in these reviews and their conclusions. Because in comparison to the performance of 290X CrossFire, 980 SLI represents a very poor return on your investment.
 
I am not disagreeing with either of you. For someone who has the money to spend and doesn't care about price/performance ratios, the 980 makes perfect sense. What I am saying is that the vast majority of people don't have unlimited 'fun money' at their disposal to buy nothing but top shelf components. And for that reason, price should be taken into consideration in these reviews and their conclusions. Because in comparison to the performance of 290X CrossFire, 980 SLI represents a very poor return on your investment.

Okay, let's run with that thought. IF you're saying that the vast majority of people consider price/performance ratios, then those people will fall under "4K gaming is very niche and expensive and doesn't return much." Most of them will own a single 25x14 or 1080p screen. With these same overclocks at that type of resolution (1080p/25x14), you are looking at about a 30% performance difference between 2x 980 overclocked vs 2x 290X overclocked.

Now, when I and most people who have 980s purchased them, the 290Xs were not selling for $300 after mail-in rebate. But looking at this review, I think the team here did it right: not pushing price/performance or leaning either way. They should have done exactly what they did, put all the info out there and let readers decide what they think their best bet will be.

This is a website for hardware enthusiasts, and many people choose the fastest just because. Hell, I couldn't wrap my head around people buying Titan Zs for $3k or Titans for $1k+, but people will do what people do. No sense in forcing any opinion down anyone else's throat as fact.
 
Interesting article, more interesting discussion. Having to choose my words carefully here...has anyone done a proper clock-for-clock comparison with, say, a 780 Ti and a 980? You may find the results very interesting.
I will also agree, Kyle Bennett doesn't do damage control; he's totally WYSIWYG...
 
Interesting article, more interesting discussion. Having to choose my words carefully here...has anyone done a proper clock-for-clock comparison with, say, a 780 Ti and a 980? You may find the results very interesting.
I will also agree, Kyle Bennett doesn't do damage control; he's totally WYSIWYG...

Thanks for the kind words, although a lot of folks don't like WYSIWYG. :)

What do you consider a "proper clock-for-clock comparison", exactly?
 
He probably meant to downclock the 980's core and VRAM speeds to match those of the 780 Ti (if feasible) and test the performance difference then.

I am personally also interested in GTX 980 SLI vs 290X 8GB CrossFire, with the 980 downclocked if possible, to see the actual effects of VRAM, if any.
 