NVIDIA GeForce GTX 980 SLI 4K Video Card Review @ [H]

I'm not sure why you're surprised by the results. AMD has done a great job with XDMA; it's clear that AMD now has better scaling and smoothness than NVIDIA. Since SLI hasn't changed, why would it be any different?

Anyway, kudos to AMD. Xfire users can patiently wait for whatever comes next.

Nvidia has stated that it's working on SLI to improve smoothness, so hopefully things will get better.
 
The 290X is surprisingly future-proof for high resolutions. The 512-bit bus and XDMA CrossFire are certainly aging well. The 83% more expensive solution can only match it a year on.
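For anyone checking the math, here's a quick sketch (Python; both per-card prices are my assumptions, roughly the 980's launch MSRP and 290X street pricing a year after launch):

```python
# Back-of-the-envelope cost comparison; both prices are assumptions
# (980 launch MSRP, 290X street price about a year after launch).
gtx_980_price = 549
r9_290x_price = 300

sli_cost = 2 * gtx_980_price   # $1098 for 980 SLI
cfx_cost = 2 * r9_290x_price   # $600 for 290X CrossFire

premium = sli_cost / cfx_cost - 1
print(f"980 SLI costs {premium:.0%} more")  # ~83% more
```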

Pricing aside, it's a decent mid-range NV card, and the interesting cards (and prices) of next year are yet to come (980 Ti/390X).

It's ironic, after the huge frame-times campaign and the "slower but smoother" (680 vs. 7970) narrative, to see that NV has clearly missed the mark with 970/980 SLI.

Pretty sure the smoothness on AMD CF has been fixed. It sure is super smooth on my system. Once they got the frame times fixed, it made a huge difference.
 
The 980 SLI idle power draw seems odd to me. My reference system with reference 980 SLI idles at 81W.
 
I didn't see any mention of the voltage discrepancy under SLI in this article, which could account for the instability and poor scaling you mention in some games. Did you guys not see it, or did you just consider it a non-issue?

Discussion and experience from users can be found here and here. It appears to be affecting Kepler and Maxwell setups with the 34x.xx release drivers. This issue did not occur with the 337.88 driver and my SLI 780s.

In case you're wondering, there is currently a voltage discrepancy bug that affects stability in multi-card overclocking and boost performance in general. Speaking of overclocking, that was seriously lacking in an otherwise awesome article.

Naturally, I will evaluate voltage behavior in the overclocking review. If I see an issue, I will report it.

I like to start with a baseline review before doing the overclocking review. IMO, you need to know where you are coming from, and what you have, before you can compare that to where you are going in terms of maximum performance. Overclocking these cards and gaming while overclocked is an entire article in and of itself. It would be too much for one article, and it would take a lot more time to get the review published.

Now that I have this baseline, I can take my overclocking data, have something to compare it to, and make a really nice overclocking article.
 
I bought a 980 a while ago to drive my Sammy 50-inch UHD TV via HDMI 2.0, which I also bought around the same time. It's a great GPU, but performance-wise, at 2560x1600, it performs exactly the same as my old 680 SLI setup in 3DM13 (both get 12,000 points). But yes, it has more vRAM, better efficiency, is a single-GPU solution, and so on. That's my take on it.
 
I remember all the whining when AMD couldn't get their sh!t together with CrossFire. Now SLI has issues, and people who have laid out money on GTX 980s are defending their investments. The fact that two generations on, Nvidia still sucks at 4K is quite laughable. Forking out for new GPUs for incremental increases seems quite silly now; compare the latest generation with the 7970 and there is not that much of an improvement. Nvidia sells gimped GPUs based on an architecture that is too hot to scale properly. Wake me up when the AMD 3*** series arrives to give a damn about GPUs.
 
AMD is being lauded for their multi-GPU performance.

While nVidia is making power-efficient, cool-running GPUs that perform slightly below AMD's toaster ovens.

The world is batshit crazy. I want off.
 
I remember all the whining when AMD couldn't get their sh!t together with CrossFire. Now SLI has issues, and people who have laid out money on GTX 980s are defending their investments. The fact that two generations on, Nvidia still sucks at 4K is quite laughable. Forking out for new GPUs for incremental increases seems quite silly now; compare the latest generation with the 7970 and there is not that much of an improvement. Nvidia sells gimped GPUs based on an architecture that is too hot to scale properly. Wake me up when the AMD 3*** series arrives to give a damn about GPUs.

I get 19,160 in Fire Strike with 2x 980 SLI at stock. BTW, my two cards are dead silent and are not portable heaters.

What are you getting?
 
AMD is being lauded for their multi-GPU performance.

While nVidia is making power-efficient, cool-running GPUs that perform slightly below AMD's toaster ovens.

The world is batshit crazy. I want off.

You know what? I'm glad I'm not the only person feeling this way. As someone who doesn't upgrade every cycle (maybe even less often than every two), this whole situation has me boggled. Add a few comments ragging on Nvidia for lackluster drivers...

STAHP! My face can't take any more!
 
AMD is being lauded for their multi-GPU performance.

While nVidia is making power-efficient, cool-running GPUs that perform slightly below AMD's toaster ovens.

The world is batshit crazy. I want off.

A complete role reversal from the GTX 480 days :D Who would have guessed?
 
You know what? I'm glad I'm not the only person feeling this way. As someone who doesn't upgrade every cycle (maybe even less often than every two), this whole situation has me boggled. Add a few comments ragging on Nvidia for lackluster drivers...

STAHP! My face can't take any more!

I'm a small-form-factor guy; this crazy new landscape scares and confuses me.
 
A complete role reversal from the GTX 480 days :D Who would have guessed?

Nvidia is all about ISO standards when it comes to the environment and clean energy.

Go to AMD's website and over half of their ISO certs in China are expired:

http://www.amd.com/en-us/who-we-are/corporate-information/iso-certificates

Good thing AMD has partnerships with Sony and Microsoft to make console CPUs, or else they would have flopped. I can almost guarantee most large server-focused companies will opt for Nvidia over AMD just because they are so much more efficient.
 
Nothing has changed. People replacing 780s with 980s was always silly.

To most who can't afford it, yes.

I have seen a huge gain in performance going from a stock MSI Lightning to a stock MSI Gaming. I do not even need to overclock yet.

The only thing that has me concerned is the 4GB of video RAM. Games are now coming out with large texture packs as extra downloads (Shadow of Mordor, Skyrim, etc.). I'm curious to know how GTA V will run with 4GB of VRAM.
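Some rough napkin math on why those packs add up (a sketch; the texture count and compression ratio are illustrative assumptions, not figures from any actual game):

```python
# Why big texture packs crowd 4GB of VRAM; the texture count and
# compression ratio below are illustrative assumptions, not measurements.
def texture_mib(width, height, bytes_per_pixel=4, mipmaps=True):
    """One uncompressed RGBA texture in MiB; a full mip chain adds ~1/3."""
    size = width * height * bytes_per_pixel
    if mipmaps:
        size += size // 3
    return size / 2**20

per_texture = texture_mib(4096, 4096)   # ~85 MiB uncompressed
compressed = per_texture / 6            # assume ~6:1 block compression
resident = 300                          # assumed resident texture count
print(f"{per_texture:.0f} MiB raw, {compressed:.1f} MiB compressed each")
print(f"{resident} textures ≈ {resident * compressed / 1024:.1f} GiB")
# Add render targets at 3840x2160 and geometry, and 4 GiB fills up fast.
```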

I have a feeling AMD will release a monster GPU with 6-8GB of RAM in the next 3-4 months, competitively priced against Nvidia.
 
That's never been a problem before... the numbers between desktop and mobile always came out at the same time.

The 800M series is not the same as the 900M series.

Sanghani, Nvidia’s general manager of notebook GPUs, said “the GeForce GTX 980M can deliver 70 percent of the performance of its desktop counterpart.” Sanghani also said that “Maxwell delivers twice the performance per watt compared to Kepler.” Kepler is Nvidia’s previous-generation graphics architecture.

Nvidia says its latest mobile GPUs, the GeForce GTX 980M and GTX 970M, deliver very high performance with AAA titles, even when the laptop is running on battery power.

According to Nvidia, 75 percent of gamers play in multiple locations, whether that’s different rooms inside their home, at a friend’s house, or at a LAN party. More of these gamers would buy a gaming laptop over a desktop PC if they could get the same performance with games. Sanghani said the four new technologies in Nvidia’s Big Maxwell architecture, combined with Nvidia’s improved BatteryBoost technology, work to close the gap between playing games on a desktop and playing games on a laptop.



Looks like Nvidia owns the laptop and mobile markets.
 
AMD cards are good at 4K, but if you want to use them with a large 4K TV, pretty much all of those TVs have only HDMI 2.0 and no DP ports, meaning you are limited to 30 Hz on an AMD card.

It may be a niche use case, but that is why I went with a 980.
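For anyone wondering why the limit is exactly 30 Hz: standard UHD timings need a pixel clock that HDMI 1.4 can't carry. A quick sketch using the published CEA-861 4K timing and TMDS clock limits:

```python
# Why 4K TVs cap non-HDMI-2.0 cards at 30 Hz: standard UHD timing
# (CEA-861: 4400x2250 total pixels including blanking) vs. TMDS limits.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Required pixel clock in MHz for a given total timing."""
    return h_total * v_total * refresh_hz / 1e6

HDMI_1_4_MAX_MHZ = 340  # HDMI 1.4 TMDS clock ceiling (~10.2 Gbps)
HDMI_2_0_MAX_MHZ = 600  # HDMI 2.0 TMDS clock ceiling (~18 Gbps)

for name, hz in [("4K60", 60), ("4K30", 30)]:
    clk = pixel_clock_mhz(4400, 2250, hz)
    print(f"{name}: {clk:.0f} MHz | HDMI 1.4 OK: {clk <= HDMI_1_4_MAX_MHZ}"
          f" | HDMI 2.0 OK: {clk <= HDMI_2_0_MAX_MHZ}")
# 4K60 needs 594 MHz, beyond HDMI 1.4's 340 MHz, hence the 30 Hz ceiling
# on cards (like the 290X) without HDMI 2.0 when the TV lacks DisplayPort.
```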
 
AMD cards are good at 4K, but if you want to use them with a large 4K TV, pretty much all of those TVs have only HDMI 2.0 and no DP ports, meaning you are limited to 30 Hz on an AMD card.

It may be a niche use case, but that is why I went with a 980.

They need to get with the times. :)

Agreed.
 
Thanks for the review.

I'm definitely enjoying gaming at 4K, but Nvidia's drivers need some work. There's just something about the 344 series that isn't working quite right. The random drops in performance using SLI are bad, but the other thing that really annoys me is the weird stuttering in video cut-scenes that appeared after I installed my 980s. It's possibly an anomaly derived from using G-Sync, like the mouse lag and screen flickering in menus that G-Sync has also created, but either way, it needs addressing.
 
Thanks for the review.

I'm definitely enjoying gaming at 4K, but Nvidia's drivers need some work. There's just something about the 344 series that isn't working quite right. The random drops in performance using SLI are bad, but the other thing that really annoys me is the weird stuttering in video cut-scenes that appeared after I installed my 980s. It's possibly an anomaly derived from using G-Sync, like the mouse lag and screen flickering in menus that G-Sync has also created, but either way, it needs addressing.

I have noticed the same issues with videos with the 344s on my 980s. I seem to recall I had the same problem with the initial batches of drivers when I first got my Titans over a year and a half ago, but that sorted itself out eventually. It just seems like NVIDIA has been lagging behind on driver releases, though that may just seem that way because they skipped August for a release.
 
I'm not sure why you're surprised by the results. AMD has done a great job with XDMA; it's clear that AMD now has better scaling and smoothness than NVIDIA. Since SLI hasn't changed, why would it be any different?

Anyway, kudos to AMD. Xfire users can patiently wait for whatever comes next.

Nvidia has stated that it's working on SLI to improve smoothness, so hopefully things will get better.

Except that I can't play BF4 with Mantle on my 290X CFX setup without a game-killing, performance-destroying memory leak. Performance starts out good enough at 5760x1200 and then declines significantly after a round or two. It has been a known issue for months and I don't expect it to be fixed. DICE is certainly at least partially to blame, but once again we have an AMD-only issue here.
 
It sucks the results weren't better.

I have a feeling this is something they can correct in the next driver. I mean, [H] has put a big spotlight on their SLI scaling being poor; they can't just ignore it.

In prior driver generations I remember seeing 85% to maybe 90% scaling for two GPUs, which is really excellent (that was probably on just one title, with the rest close behind), and something like 70% for the third. There's no immediate reason to think the same results are unreachable with this card. Give them a month to fix the drivers.
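To put those percentages in frame-rate terms, here's a quick sketch (the baseline FPS is made up purely for illustration):

```python
# What those scaling figures mean in frame rates; the 40 FPS baseline
# is a made-up single-GPU number purely for illustration.
def multi_gpu_fps(single_fps, extra_gpu_efficiencies):
    """Each additional GPU contributes its efficiency share of the baseline."""
    return single_fps * (1 + sum(extra_gpu_efficiencies))

base = 40.0
print(multi_gpu_fps(base, [0.90]))        # 2-way at 90%: 76 FPS
print(multi_gpu_fps(base, [0.90, 0.70]))  # 3-way, third GPU at 70%: 104 FPS
print(multi_gpu_fps(base, [0.10]))        # ~10% Watch Dogs case: 44 FPS
```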

Both sides have had driver issues. I think there were some issues with CrossFire on older cards (4xxx) that never got fixed... Fanboys have a short memory when their brand has a fuckup. I honestly believe the 980 is better silicon/technology than the 290X, and nVidia has a better track record of timely driver fixes and performance improvements in every generation. AMD has some good scaling with Xfire, which is what we should all expect; the new system AMD put in was a much-needed overhaul from the previous generations. And nVidia's history shows that SLI is generally 75% to 85% efficient for two GPUs. Obviously the 10% or whatever Watch Dogs is seeing is a bug. Bugs can be fixed.
 
The 290X is surprisingly future-proof for high resolutions. The 512-bit bus and XDMA CrossFire are certainly aging well.

This situation certainly reminds me of the X1800 XT vs. 7800 GTX days: one with a 512-bit bus and high power usage, the other with high clocks, power efficiency, and a 256-bit bus. The first did better at high resolutions, the other better at standard resolutions.

Call me a naysayer, but I really don't think SLI inefficiency is 100% to blame here.
 
Nvidia has the power efficiency and great value for money this round; the only question I have is how far this architecture is being pushed. AMD and Nvidia being out of sync with releases is a weird place to be, though it's not the first time this has happened. I'm guessing AMD surprised them with how good their hardware scaling is, and Nvidia is now sitting with the legacy of poor scaling. I'm still keen on a single GPU for solving my gaming needs, so the best next-gen card gets my money when I get a new monitor.

4K is pretty cool, but I think 21:9 will have more traction than 4K; ultrawide satisfies most of us dual-screen users from the 17-inch days. :)
 
This situation certainly reminds me of the X1800 XT vs. 7800 GTX days: one with a 512-bit bus and high power usage, the other with high clocks, power efficiency, and a 256-bit bus. The first did better at high resolutions, the other better at standard resolutions.

Call me a naysayer, but I really don't think SLI inefficiency is 100% to blame here.

Back in the day I also owned an X1800 XT with my AMD FX processor, vapochilled at 3.4GHz.

It destroyed all Intel chips until the Core 2 Duo arrived.

Anyway, I don't know how it happened, but my X1800 XT caught on fire and I was scared as hell. Luckily I was able to RMA it.

That was the last time I bought AMD.
 
I know I'm not the only one who noticed that SLI GPU scaling (and overall performance) is better on the older titles. Can anyone comment on whether Nvidia has released drivers with SLI profiles specifically supporting the tested games?
 
Any chance you were able to test this in Surround? I am at 4800x2560; if I scale your 4K results to my resolution with my Titans, it does not seem to be much of an upgrade.

Appreciate the time you put in.
 
Back in the day I also owned an X1800 XT with my AMD FX processor, vapochilled at 3.4GHz.

It destroyed all Intel chips until the Core 2 Duo arrived.

Anyway, I don't know how it happened, but my X1800 XT caught on fire and I was scared as hell. Luckily I was able to RMA it.

That was the last time I bought AMD.

Nvidia fried my laptop (and some friends' laptops too, though they didn't keep the receipt/box). I got a free POS laptop from the class-action settlement after mine had been replaced once and fried again.
 
Nvidia has the power efficiency and great value for money this round; the only question I have is how far this architecture is being pushed. AMD and Nvidia being out of sync with releases is a weird place to be, though it's not the first time this has happened. I'm guessing AMD surprised them with how good their hardware scaling is, and Nvidia is now sitting with the legacy of poor scaling. I'm still keen on a single GPU for solving my gaming needs, so the best next-gen card gets my money when I get a new monitor.

4K is pretty cool, but I think 21:9 will have more traction than 4K; ultrawide satisfies most of us dual-screen users from the 17-inch days. :)

From what I understand, NCX reported that the new curved 21:9 LG monitor has worse black levels and response times than last year's LG. I still do not understand why Nvidia can't incorporate G-Sync into a nice IPS monitor.
 
I am still a fan of staying the hell away from SLI and CrossFire. I just want to run the most powerful single card; every day I get confirmation of this.
:)
 
Except that I can't play BF4 with Mantle on my 290X CFX setup without a game-killing, performance-destroying memory leak. Performance starts out good enough at 5760x1200 and then declines significantly after a round or two. It has been a known issue for months and I don't expect it to be fixed. DICE is certainly at least partially to blame, but once again we have an AMD-only issue here.

It was a DICE BF4 patch that bugged out Mantle; the issue does not happen in any other Mantle-enabled game, so it's not AMD's fault.
 
Looks like Nvidia owns the laptop and mobile markets.

They hardly own either. Intel owns both, and by an overwhelming margin. The only thing Nvidia owns is the high-end, overpriced market, but that's because they can.
 
4K gaming today = tiny, tiny niche

It is also really the only way to push SLI and CrossFire to show the potential performance, without hitting CPU bottlenecks that can occur at lower resolutions.
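A toy model of that trade-off, as a sketch (all the frame times are assumed numbers, just to show the shape of it):

```python
# Toy model: a frame takes max(CPU time, GPU time / effective GPU count).
# All millisecond figures are assumptions chosen to show the shape of it.
def fps(cpu_ms, gpu_ms, n_gpus=1, scaling=0.9):
    """Frames per second with n GPUs at a given per-extra-GPU efficiency."""
    effective_gpus = 1 + scaling * (n_gpus - 1)
    return 1000 / max(cpu_ms, gpu_ms / effective_gpus)

# Low resolution: GPU frame time (9 ms) is close to CPU time (8 ms),
# so a second GPU barely helps (111 -> 125 FPS, capped by the CPU).
print(fps(cpu_ms=8, gpu_ms=9), fps(cpu_ms=8, gpu_ms=9, n_gpus=2))

# 4K: the GPU dominates (25 ms), so the second GPU nearly doubles FPS
# (40 -> 76 FPS), which is why scaling only shows at high resolution.
print(fps(cpu_ms=8, gpu_ms=25), fps(cpu_ms=8, gpu_ms=25, n_gpus=2))
```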

I know there are 120Hz gamers at 1440p, though, and that is certainly something I want to evaluate eventually.
 
It is also really the only way to push SLI and CrossFire to show the potential performance, without hitting CPU bottlenecks that can occur at lower resolutions.

I know there are 120Hz gamers at 1440p, though, and that is certainly something I want to evaluate eventually.

MY GOD this is something I have been wanting for a LONG TIME!!!

Great news Brent!
 
It was a DICE BF4 patch that bugged out Mantle; the issue does not happen in any other Mantle-enabled game, so it's not AMD's fault.

Yes, AMD says it is DICE's fault and DICE says it is AMD's fault. The fact remains that it only affects AMD cards, and seemingly mostly just in CrossFire. It is extremely, extremely annoying. I frequently jump back and forth between AMD and Nvidia, but this sort of shit just seems to happen way more regularly when I have AMD cards.
 
Great article.

I do have one question: given the higher DPI of 4K monitors, how does upping the graphical detail compare with going for 2x MSAA?
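Some context for that question: pixel density alone already shrinks jaggies at 4K. A rough comparison, assuming typical panel sizes:

```python
# Rough pixel-density comparison; the panel sizes are assumed examples.
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
print(f'28" 4K:    {ppi(3840, 2160, 28):.0f} PPI')  # ~157 PPI
# ~44% higher linear pixel density at 4K makes jaggies smaller to begin
# with, so light (2x) MSAA plus higher detail is often the better trade.
```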
 
AMD cards are good at 4K, but if you want to use them with a large 4K TV, pretty much all of those TVs have only HDMI 2.0 and no DP ports, meaning you are limited to 30 Hz on an AMD card.

It may be a niche use case, but that is why I went with a 980.

You mean to say that the AMD cards have no DP ports?
 