NVIDIA GeForce GTX 980 SLI 4K Video Card Review @ [H]

FrgMstr

NVIDIA GeForce GTX 980 SLI 4K Video Card Review - If you thought one GeForce GTX 980 was a great gaming solution, wait till you see what GeForce GTX 980 SLI can do for gaming at 4K. We look at performance, efficiency, and even a little 1440p SLI goodness. However, some of the results may not be exactly what you would expect.
 
Wow, those are some unexpected results... really makes me wonder if AMD's bridgeless CrossFire solution is that much superior.

Kudos, AMD!!!! For 4K gaming you can get two 290Xs for $600 with MIR,

or two 980s for $1100.
 
I would assume drivers can fix some of it, as a 14% increase with SLI in Aliens seems like something is going wrong. Still, it's pretty impressive how the 290X handles 4K.
 
I would assume drivers can fix some of it, as a 14% increase with SLI in Aliens seems like something is going wrong. Still, it's pretty impressive how the 290X handles 4K.

Maybe, but not enough to make up the $500 difference.

The real question is how well the 970s do.
 
I have completely differnt results then them.
Going from 580 680 and 780ti sli this 980 sli has been the best sli experience ever. I been sli since the 6800gtx days.
 
Maybe, but not enough to make up the $500 difference.

The real question is how well the 970s do.

That's what I was thinking. With the 290X after-rebate cards going for the same or less than the 970, what's the difference between the two in that price range? Probably substantial, considering the 290X is running with a +$250 card.

Baah... I really liked the 1440p comparison more than the 4K one anyway. I really think 1440p is going to be more important than 4K, especially with the new 21:9 monitors being released at that resolution.

Great article, though. Still shows the R9 290X hanging in there.
 
I have completely differnt results then them.
Going from 580 680 and 780ti sli this 980 sli has been the best sli experience ever. I been sli since the 6800gtx days.

Reading is fundamental.

Please share your "completely differnt [sic] results."
 
Interesting.

I have a 290X CrossFire system sitting right next to a GTX 980 SLI system, driving Eyefinity/Surround.
I don't have all of those games, but I do have Metro LL Redux and Crysis 3 loaded, and I will run them head to head to see if this is actually the real deal.

I have to say that I did run Metro LL Redux in a comparison, and the 980s were by far superior. :D
 
Nice review as usual.

AMD needs to go on a power diet. What is that, like $30 more a month if you game in CrossFire?
 
I didn't see any mention of the voltage discrepancy under SLI in this article, which could account for the instability and poor scaling you mention in some games. Did you guys not see it, or did you think it was a non-issue?

Discussion and experience from users can be found here and here. It appears to affect Kepler and Maxwell setups with the 34x.xx release drivers. This issue did not occur with the 337.88 driver and my SLI 780s.
 
Nice review as usual.

AMD needs to go on a power diet. What is that, like $30 more a month if you game in CrossFire?

I enjoyed reading the review as well.

Two observations:

1) AMD has had a lot more time to shake out their drivers. The 980 just came out, so things should get better once NVIDIA matures its drivers.

2) The most shocking thing to me was the power: NVIDIA at 446 watts under load vs. AMD's 746! A full 300 watts more! Wow. The power company is gonna love AMD owners.
 
Two adverbs at the beginning of a sentence. I feel like you guys don't proofread before publishing.


Thanks for the extra eyes, fixed. Actually that was the editor's fault in fubaring that one. :)
 
I wonder if that high idle power draw will be fixed through driver updates. Are the cards not throttling down on idle?
 
Nvidia has some serious driver issues to work out. I don't game at 4K and probably won't for at least another 3+ years; I'm looking to go with a G-Sync/FreeSync 1440p monitor after Christmas. Their driver issues are affecting SLI at all resolutions, though.

In case you're wondering, there is currently a voltage-discrepancy bug that is affecting stability in multi-card overclocking and boost performance in general. Speaking of overclocking, that was seriously lacking in an otherwise awesome article. I'd say that of the 3% of this site's readership that actually runs 4K, 99.9% will overclock their cards to get the most out of their 4K experience.

Still, a lot of hard work went into this article, and that is appreciated.

I agree wholeheartedly that SLI needs to evolve.

Nvidia also has to do a better job with the drivers. Hopefully they are working on some driver magic to get better scaling and fix their Maxwell driver issues. I feel they can eke out another 10-15% performance from this architecture, in addition to fixing their SLI scaling and reaching 80+ percent scaling in most games.
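For reference, the scaling figures thrown around in this thread (14%, 80+%) are just the multi-GPU frame-rate gain over a single card. A quick sketch, using hypothetical frame rates rather than the review's numbers:

```python
def sli_scaling(single_fps: float, sli_fps: float) -> float:
    """Percent frame-rate gain from adding a second GPU in SLI."""
    return (sli_fps / single_fps - 1.0) * 100.0

# Hypothetical numbers, for illustration only:
print(sli_scaling(40.0, 45.6))  # poor scaling, around 14%
print(sli_scaling(40.0, 72.0))  # healthy scaling, around 80%
```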
 
I enjoyed reading the review as well.

Two observations:

1) AMD has had a lot more time to shake out their drivers. The 980 just came out, so things should get better once NVIDIA matures its drivers.

2) The most shocking thing to me was the power: NVIDIA at 446 watts under load vs. AMD's 746! A full 300 watts more! Wow. The power company is gonna love AMD owners.

Power consumption varies greatly with the underlying architecture. AMD followed the brute-force method with the R9 series, applying more power to a similar architecture with a more mature manufacturing process, as I understand it. NVIDIA achieved incredible efficiency improvements with the Maxwell architecture, making the aging AMD architecture look that much more power-hungry. I expect the power usage scenario to change a bit when the next series of AMD cards come out.
 
I imagine we'll see a revisit in a couple of months once new drivers and game patches fix some bugs.
 
Missing from the discussion is the $500 price difference between a set of R9 290X and GTX 980 cards; you would have to play a lot before you make that money back through the electric bill!

300W ≈ $0.033/h. That's almost two years of nonstop gaming, or 15 years playing 20 hours a week.
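A minimal sketch of that break-even math. The 300 W difference comes from the review's load numbers (746 W vs. 446 W); the $0.11/kWh rate is an assumption, so adjust it for your locale:

```python
# Back-of-envelope break-even: how long until a $500 price gap is repaid
# through the electric bill? The $0.11/kWh rate is an assumed figure.
PRICE_GAP_USD = 500.0
EXTRA_WATTS = 300.0
USD_PER_KWH = 0.11  # adjust for your local electricity rate

extra_cost_per_hour = EXTRA_WATTS / 1000.0 * USD_PER_KWH  # ~$0.033/h
hours_to_break_even = PRICE_GAP_USD / extra_cost_per_hour  # ~15,000 h

print(f"extra cost: ${extra_cost_per_hour:.3f}/h")
print(f"break-even: {hours_to_break_even:,.0f} h "
      f"({hours_to_break_even / (24 * 365):.1f} years nonstop, "
      f"{hours_to_break_even / (20 * 52):.1f} years at 20 h/week)")
```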
 
Fundamentally, I agree that SLI has to evolve. For two generations in a row, we are seeing that R9 290X CrossFire provides the best and most consistent 4K gaming experience. I am looking forward to the R9 390X, which is rumoured to bring HBM (high-bandwidth memory) and an improved GCN 2.0 architecture. This is a good contest: AMD needs to improve power efficiency, while Nvidia needs to improve SLI scaling and consistency. :)
 
Nice review as usual.

AMD needs to go on a power diet. What is that, like $30 more a month if you game in CrossFire?

Well, my PC consists of an FX-9370 that runs wide open at 4.7GHz 24/7 and an OC'd R9 290 with power saving off. The lights in my home also run 24/7, as my mother can't see very well. My electricity bill due on November 10th is $130.00 before taxes and $141.00 after.

I really doubt $15 of that is because of the video card. :)
 
I had a flex SLI cable that was giving me problems with my Samsung 4K display. When I used the EVGA SLI Pro bridge, all my problems went away. What kind of bridge are you using for these reviews?
 
Not really the gulf I was expecting to see between the 780 Ti and the 980 in SLI.

Quite happy about it, to be honest :)
 
Interesting.

I have a 290X CrossFire system sitting right next to a GTX 980 SLI system, driving Eyefinity/Surround.
I don't have all of those games, but I do have Metro LL Redux and Crysis 3 loaded, and I will run them head to head to see if this is actually the real deal.

I have to say that I did run Metro LL Redux in a comparison, and the 980s were by far superior. :D


In heat output and electricity?
 
The 290X is surprisingly future-proof for high resolutions; the 512-bit bus and XDMA CrossFire are certainly aging well. The 83% more expensive solution can only match it a year on.

Pricing aside, it's a decent mid-range NV card, and the interesting cards (and prices) of the next year are yet to come (980 Ti/390X).

It's ironic, after the huge frame-time and "slower but smoother" (680 vs. 7970) campaign, to see that NV has clearly missed the mark on 970/980 SLI.
 
I think this gives me more reason to upgrade to a second R9/295x2 in the near future when I find a decent used one on Ebay.
 
The 290X is surprisingly future-proof for high resolutions; the 512-bit bus and XDMA CrossFire are certainly aging well. The 83% more expensive solution can only match it a year on.

Pricing aside, it's a decent mid-range NV card, and the interesting cards (and prices) of the next year are yet to come (980 Ti/390X).

It's ironic, after the huge frame-time and "slower but smoother" (680 vs. 7970) campaign, to see that NV has clearly missed the mark on 970/980 SLI.

Remember AMD has had plenty of time to optimize drivers
 
It's ironic, after the huge frame-time and "slower but smoother" (680 vs. 7970) campaign, to see that NV has clearly missed the mark on 970/980 SLI.

Sure, it looks that way if you're one of the 1,632 people in the world's six-billion-plus population gaming at 4K on a computer and absolutely couldn't wait for Nvidia to iron out driver issues on a less-than-two-month-old architecture ;)
 
Quite surprising. I was expecting the 980s to blow the 290X out of everything.
 
Remember AMD has had plenty of time to optimize drivers

Does that excuse NV for the poor SLI after they campaigned on it? I see your point, but it's still odd that they are slacking here after such a huge campaign; they even created software to measure it (FCAT).

I would expect more from an 83% more expensive GPU than "power savings".
 
Sure, it looks that way if you're one of the 1,632 people in the world's six-billion-plus population gaming at 4K on a computer and absolutely couldn't wait for Nvidia to iron out driver issues on a less-than-two-month-old architecture ;)


I might not jump into 4K gaming right away. The 21:9 LG curved 34" screen looks nice at 3440x1440; not quite 4K, but it could still offer some improvements over 2560x1440.
 
Page 10:
You also are not seeing things in regards to XFX Radeon R9 290X CrossFire wattage. We experienced peak wattage at 746W, the wattage was consistently over 700W while gaming.

Missing a word there near the beginning.
 
Sure, it looks that way if you're one of the 1,632 people in the world's six-billion-plus population gaming at 4K on a computer and absolutely couldn't wait for Nvidia to iron out driver issues on a less-than-two-month-old architecture ;)

I am.

1/1632 only? Nice deflection. 4K monitors are getting cheaper all the time, and people are certainly buying them. Even so, high-end display/GPU buyers have a couple of choices; why would you pay 83% more for a clearly-not-superior experience? You could buy 290X CrossFire plus a 4K display for nearly the same price as the 980 SLI.

Does that excuse the 83% more expensive GPU for losing/tying to a year-old GPU? (After the huge SLI smoothness campaign, no less.)
 
Does that excuse NV from the poor SLI after they campaigned for it? I see your point, but it's still odd that they are slacking with that after a huge campaign and they even created software to monitor it (fcat).

I could say the same thing about Mantle. I am still curious why G-Sync is taking forever to show up in monitors. I still think Nvidia is the superior choice for a video card, just because it has ShadowPlay, GameStream, and Shield support. AMD needs something else to convince me it is worth getting. I hope they do release something amazing in their next lineup; I would hate to see AMD go down the tubes and leave Nvidia with no competition.
 
I am.

1/1632 only? Nice deflection. 4K monitors are getting cheaper all the time, and people are certainly buying them. Even so, high-end display/GPU buyers have a couple of choices; why would you pay 83% more for a clearly-not-superior experience? You could buy 290X CrossFire plus a 4K display for nearly the same price as the 980 SLI.

Does that excuse the 83% more expensive GPU for losing/tying to a year-old GPU? (After the huge SLI smoothness campaign, no less.)

What about the 83% higher cost of electricity?
 
I am.

Does that excuse the 83% more expensive GPU for losing/tying to a year-old GPU? (After the huge SLI smoothness campaign, no less.)

Technically, no, it doesn't excuse that, especially with me being an owner of 2x 980s. I don't game at 4K, but Nvidia is definitely behind the 8-ball regarding drivers with these new cards. I give them a D- so far, as there are issues they should have ironed out at least two weeks ago, once they were confirmed.

Anyhow, AMD has had far longer to optimize its drivers on that architecture, and I'm not even sure Nvidia is truly focused on 4K gaming. This result doesn't surprise me, since even before this review I knew driver issues would affect an SLI review of these cards. It seems that, at least as far as 4K gaming is concerned, Nvidia barely tried on the 780 series and is not 100% focused even on Maxwell.

Interestingly, they didn't add overclocking to the review; I'm sure there was no reason other than lack of time. As I said in my first comment in this thread, of the people gaming at 4K, I strongly believe most would overclock to get the most out of their 4K experience. That may have turned the tide. Once the drivers get a chance to catch up, things may also be different.
 
Nice review. Good to know to skip an SLI setup for now until those issues are addressed. Sounds like driver problems.
 
What about the 83% higher cost of electricity?

Enjoy the power savings and go for the 980s, then. Let's see if your tune changes when the 980 Ti comes out.

Technically, no, it doesn't excuse that, especially with me being an owner of 2x 980s. I don't game at 4K, but Nvidia is definitely behind the 8-ball regarding drivers with these new cards. I give them a D- so far, as there are issues they should have ironed out at least two weeks ago, once they were confirmed.

Anyhow, AMD has had far longer to optimize its drivers on that architecture, and I'm not even sure Nvidia is truly focused on 4K gaming. This result doesn't surprise me, since even before this review I knew driver issues would affect an SLI review of these cards. It seems that, at least as far as 4K gaming is concerned, Nvidia barely tried on the 780 series and is not 100% focused even on Maxwell.

Interestingly, they didn't add overclocking to the review; I'm sure there was no reason other than lack of time. As I said in my first comment in this thread, of the people gaming at 4K, I strongly believe most would overclock to get the most out of their 4K experience. That may have turned the tide. Once the drivers get a chance to catch up, things may also be different.

All valid points, and as the mid-range NV card (marketed as high-end, but it isn't the big Maxwell), it's impressive from a technical standpoint. The only problems I have here are NV's double standard, and the price.
 
Back to the review: thanks for the comparison, and I look forward to the overclocking results. 4K is demanding, and we need all the performance we can get.

Big-die Maxwell and the 390X can't get here soon enough.
 
Technically, no, it doesn't excuse that, especially with me being an owner of 2x 980s. I don't game at 4K, but Nvidia is definitely behind the 8-ball regarding drivers with these new cards. I give them a D- so far, as there are issues they should have ironed out at least two weeks ago, once they were confirmed.

Anyhow, AMD has had far longer to optimize its drivers on that architecture, and I'm not even sure Nvidia is truly focused on 4K gaming. This result doesn't surprise me, since even before this review I knew driver issues would affect an SLI review of these cards. It seems that, at least as far as 4K gaming is concerned, Nvidia barely tried on the 780 series and is not 100% focused even on Maxwell.

Interestingly, they didn't add overclocking to the review; I'm sure there was no reason other than lack of time. As I said in my first comment in this thread, of the people gaming at 4K, I strongly believe most would overclock to get the most out of their 4K experience. That may have turned the tide. Once the drivers get a chance to catch up, things may also be different.

I am pretty sure Nvidia will be releasing new drivers focused on 4K performance. You can see it in the 980-series non-reference cards from MSI and ASUS (though not Gigabyte): they all come with HDMI 2.0 for 4K at 60Hz, or three DisplayPorts.
These cards are well equipped to handle the future of 4K once the drivers get more optimized. Of course, by the time monitors come down in price, new video cards will be released. I also can't believe G-Sync can't be had on a nice 2560x1440 IPS monitor. Still disappointed by that.
 