NVIDIA GeForce GTX 980 SLI 4K Video Card Review @ [H]

New Watch Dogs patch released yesterday is a miracle for GeForce cards

http://www.guru3d.com/news-story/watch-dogs-new-pc-update-oct-27.html

Over 5 months later, they have finally released a patch with the fixes for GeForce cards that should have come as patch #1:

•Better performance on more recent models of cards - 770 onwards will see better improvements.
•Moved all resource creation to an exclusive worker thread.
•Changed technique used for updating texture mipmaps in Ultra textures mode.
I'll have to check it out. I was going to uninstall it today to make room for other games more worthy of my time :cool:.

I'm just as surprised that I still have it installed as you are :p. At least I got it for free with my second 780.
 
I just installed two R9 290x cards in my rig to push a 21:9 3440x1440 monitor. I am getting quite a bit of stuttering in a few games. I thought frame pacing was supposed to correct this. Is frame pacing a setting that I have overlooked? I thought it was built into the drivers. I thought it was the monitor so I hooked my three 1920x1200 monitors back up and I still get it. I did not get the stuttering when I had my SLI 680s. I ask this because I was thinking about building another system. I am torn between keeping the R9s and using the 680s in my secondary system or purchasing two 980s and placing the R9s in my secondary.

I don't have an immediate answer; we'd likely need to know which games you're talking about, what your motherboard and CPU setup looks like, what power supply you're using, what drivers you're using, etc. I'll also note that even with the frame-pacing driver, SLI usually still pushes more consistent frametimes than Crossfire in similar setups and testing conditions, even at lower average FPS.

I'd recommend that you take your question over to the AMD forum and ask there directly with the details I mentioned above, because it's also very likely that there's a simple answer to your problem :).
 
I just installed two R9 290x cards in my rig to push a 21:9 3440x1440 monitor. I am getting quite a bit of stuttering in a few games. I thought frame pacing was supposed to correct this. Is frame pacing a setting that I have overlooked? I thought it was built into the drivers. I thought it was the monitor so I hooked my three 1920x1200 monitors back up and I still get it. I did not get the stuttering when I had my SLI 680s. I ask this because I was thinking about building another system. I am torn between keeping the R9s and using the 680s in my secondary system or purchasing two 980s and placing the R9s in my secondary.

In my experience SLI frametimes are still more consistent at lower resolutions. It's only when you get up to 4K that AMD is as good or better.

3440x1440 isn't a low resolution of course, but it's still 3 million+ pixels less than 4K, quite a big difference.
 
FWIW I also forgot to post this yesterday when I saw it

Guru3D did some FCAT testing and found that the GTX 980 (Maxwell 2.0) has made some serious improvements in frame pacing over the 780 cards.

So Nvidia did improve SLI to some degree on this generation of GeForce cards as far as frame pacing is concerned. I thought it was already pretty good, but lagging behind AMD after they implemented XDMA; it would be interesting to see this revisited now that Nvidia has improved things.

http://www.guru3d.com/articles_pages/geforce_gtx_980_sli_review,9.html
 
A bit of an exaggeration, even for a troll post.

3 hours of gaming/day (borderline unhealthy)
* 300 W of extra draw for 290X CrossFire
* 30 days/mo
= 27 kWh/mo
* $0.10/kWh
= $2.70/mo
Over 2 years = ~$65

For a $500 cost savings up front, 8% compounded monthly for 2 years = $586.
Gee, seems like the cost savings offsets the power usage, even for a heavy gamer.
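Quick sanity check in Python for anyone who wants to poke at the numbers (these are just the assumed figures from above, not measurements):

Code:
# Sanity check of the arithmetic above (hypothetical figures from this post).
hours_per_day = 3            # borderline unhealthy gaming time
extra_watts = 300            # assumed extra draw for 290X CrossFire
days_per_month = 30
price_per_kwh = 0.10         # USD

kwh_per_month = hours_per_day * extra_watts * days_per_month / 1000   # 27.0 kWh
cost_per_month = kwh_per_month * price_per_kwh                        # $2.70
cost_two_years = cost_per_month * 24                                  # ~$64.80

# The $500 saved up front, invested at 8% per year compounded monthly for 2 years.
value_two_years = 500 * (1 + 0.08 / 12) ** 24                         # ~$586

print(f"Extra power cost over 2 years: ${cost_two_years:.2f}")
print(f"Up-front savings grown for 2 years: ${value_two_years:.2f}")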

I too get tired of people who constantly make a big deal about that. Efficiency is always a plus, but it is not a deal killer for hardcore gamers who care about the eye candy. That is like Camaro and Mustang owners arguing about which car gets 2 mpg better on the highway. Both cars are fast, and 2 mpg should not matter when both cars hit the 12's in the 1320.
 
I too get tired of people who constantly make a big deal about that. Efficiency is always a plus, but it is not a deal killer for hardcore gamers who care about the eye candy. That is like Camaro and Mustang owners arguing about which car gets 2 mpg better on the highway. Both cars are fast, and 2 mpg should not matter when both cars hit the 12's in the 1320.

The cost savings in actual electrical usage shouldn't be a big deal, but the extra heat dumped into the room, and the noise (or the additional cost of higher-end cooling) that comes with getting rid of it, is a big deal.

Nvidia's Gxx04 GPUs have been kicking ass in this respect since the first Half-Keplers.
 
I could direct you to dozens of threads on this very issue, but I suspect you don't care and just want to carry on, so whatever. If you are interested, head over to the Battlelog forums.

I have already been there, and I have complained about the issue numerous times across forums, but that's beside the point when it comes to blame; even BF Hardline, which uses the same engine, does not have the Mantle memory leak.
 
Great review, thanks!

As one of the rapidly growing population of 120Hz+ 1440p gamers, SLI performance is pretty important to me, and I was hoping a couple of 980s would get me above 120 fps in BF4 (among other games). I think I'll hold off for the 980 Ti; these results are a little disappointing, and the evidence seems to be pointing at the 256-bit bus as much as the drivers, so I'm not sure the solution is only a small download away.
 
Great review, thanks!

As one of the rapidly growing population of 120Hz+ 1440p gamers, SLI performance is pretty important to me, and I was hoping a couple of 980s would get me above 120 fps in BF4 (among other games). I think I'll hold off for the 980 Ti; these results are a little disappointing, and the evidence seems to be pointing at the 256-bit bus as much as the drivers, so I'm not sure the solution is only a small download away.

Hmm... if a 980 SLI setup is getting 69 FPS at 4K (as stated in the [H] review) it seems likely you'd get more than 120 at 1440p. That's less than half the pixels.
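Rough back-of-the-envelope, if anyone cares (this assumes performance scales roughly with pixel count, which it won't exactly thanks to CPU limits and fixed per-frame costs):

Code:
# Back-of-the-envelope pixel scaling (assumes roughly linear scaling with pixel count).
pixels_4k = 3840 * 2160        # 8,294,400
pixels_1440p = 2560 * 1440     # 3,686,400

fps_4k = 69                    # SLI 980 average at 4K from the [H] review
estimated_fps_1440p = fps_4k * pixels_4k / pixels_1440p   # ~155 fps

print(f"1440p is {pixels_1440p / pixels_4k:.0%} of the pixels of 4K")
print(f"Naive estimate at 1440p: ~{estimated_fps_1440p:.0f} fps")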

There probably won't be a 980 Ti either, just like there wasn't a 680 Ti, considering 980 is the full GM204 chip.
 
I too get tired of people who constantly make a big deal about that. Efficiency is always a plus, but it is not a deal killer for hardcore gamers who care about the eye candy. That is like Camaro and Mustang owners arguing about which car gets 2 mpg better on the highway. Both cars are fast, and 2 mpg should not matter when both cars hit the 12's in the 1320.

That's fair, but I feel there's actually another side to efficiency that some are not seeing -- electrical limitations of the circuits in your house.

My room shares a 15A circuit with the living room, so on a good day I have maybe about 1200W of power left if I load the circuit up to 90% (1620W). Given that the very best PSUs tend to be around 90% efficient, that translates into 1080W of actual deliverable power.
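Roughly, my budget works out like this (a minimal sketch; the living-room draw is just an assumed figure, and US 120V mains are assumed):

Code:
# Rough household circuit budget (approximate/assumed figures, US 120V mains).
circuit_watts = 15 * 120                      # 1800 W total on the 15A circuit
safe_limit = 0.90 * circuit_watts             # ~1620 W if I only load it to 90%
living_room_draw = 420                        # assumed share used by the living room
wall_budget = safe_limit - living_room_draw   # ~1200 W left at the wall for the PC
psu_efficiency = 0.90                         # a very good PSU
dc_budget = wall_budget * psu_efficiency      # ~1080 W actually deliverable to the parts

print(f"90% circuit limit: {safe_limit:.0f} W")
print(f"Wall budget: ~{wall_budget:.0f} W -> ~{dc_budget:.0f} W DC after PSU losses")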

For this reason alone I will most likely never buy any high end AMD cards until they can get the power consumption down, especially when I have a 4930K that pulls anywhere from 90-115W when gaming.

Yes, I know I'm probably being a bit conservative and paranoid with the power figure, but I want to avoid getting into a situation where I'm constantly loading the circuit at 90%+ and have to worry whether turning on that 100W floor lamp is going to trip the breaker. And this is why I have double appreciation for Maxwell -- with my 4930K @ 4.5GHz/1.39V and 2x Gigabyte 970 @ 1506/7600, I'm only pulling 520-550W from the wall when playing Crysis 3. That is an astonishingly low figure for the amount of graphics power the 970 is putting out (at those clocks my Gigabyte 970 is essentially a stock 980).

Before anyone asks: yes, there are two 20A circuits in the house, one in the kitchen and one in the laundry room. You couldn't pay me to set up camp in either location. :p (Not to mention they're on a 20A circuit for a reason...)
 
I just installed two R9 290x cards in my rig to push a 21:9 3440x1440 monitor. I am getting quite a bit of stuttering in a few games. I thought frame pacing was supposed to correct this. Is frame pacing a setting that I have overlooked? I thought it was built into the drivers. I thought it was the monitor so I hooked my three 1920x1200 monitors back up and I still get it. I did not get the stuttering when I had my SLI 680s. I ask this because I was thinking about building another system. I am torn between keeping the R9s and using the 680s in my secondary system or purchasing two 980s and placing the R9s in my secondary.

Which games?
 
I wish AMD did more to leverage its advantage at 4K. DSR, which was recently added to the NV cards, renders at a higher resolution and downsamples to your native res, enhancing image quality.

AMD should have beaten Nvidia to this feature - I find it very useful, especially for games that lack AA or are limited to poor-quality AA options like FXAA.

Such a missed opportunity IMO.

I've posted the same thing in the latest drivers thread in the AMD subforum; no answer. The problem is that AMD intentionally hampered downsampling on their cards back in the 13.1 drivers, where they rendered useless the tool we could use for it. I got no answer from warsam, of course.
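For what it's worth, the downsampling idea itself is simple: render more pixels than the display has, then filter them back down to native res. Here's a minimal box-filter sketch of the concept (DSR reportedly uses a Gaussian filter with adjustable smoothness, so this is just the idea, not NVIDIA's or any driver's actual implementation):

Code:
import numpy as np

def downsample_box(frame: np.ndarray, factor: int) -> np.ndarray:
    """Average each factor x factor block of rendered pixels into one output pixel."""
    h, w, c = frame.shape
    out_h, out_w = h // factor, w // factor
    blocks = frame[:out_h * factor, :out_w * factor].reshape(out_h, factor, out_w, factor, c)
    return blocks.mean(axis=(1, 3))

# Example: a 4x-style factor renders at 2x width and 2x height (4x the pixels),
# then filters down to the native 2560x1440 target.
hires = np.random.rand(1440 * 2, 2560 * 2, 3).astype(np.float32)  # stand-in for the rendered frame
native = downsample_box(hires, factor=2)
print(native.shape)  # (1440, 2560, 3)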
 
AMD is pretty far behind when it comes to anti-aliasing in general. Adaptive AA causes a bigger hit than Nvidia's TrSSAA and usually doesn't even work, whereas Nvidia's TrSSAA usually does in games that support MSAA. Their SSAA seems to work more often, but only in a fraction of the games that SGSSAA works in.

They should add something like DSR considering how well their cards do at higher resolutions. Still, I would like to see better SSAA and TrSSAA implementations, which is what I tend to use more often.
 
I'm thinking the 256-bit bus on the 980 is affecting performance to some degree as well. Pretty unacceptable on a product selling for over $500; you'd think Nvidia would have at least used a 384-bit bus.
 
I'm thinking the 256-bit bus on the 980 is affecting performance to some degree as well. Pretty unacceptable on a product selling for over $500; you'd think Nvidia would have at least used a 384-bit bus.

Nvidia stopped using their larger parts consistently when their smaller parts were able to compete with AMD's larger parts; and generally speaking, the GTX980 would be faster all-around with proper optimizations. It's pretty clear that when the Nvidia setup is behind, it's REALLY behind and not in situations that make sense. Performance drops to single-card levels when staring at a static computer display in-game? That's likely just poor coding on the developer's part, and is something that may be fixed any number of ways by the developer or by Nvidia.
 
Sure, looks that way if you're one of the 1632 people in the entire 6 billion+ population of the world running 4K for computer gaming and absolutely couldn't wait for Nvidia to iron out driver issues on a less-than-two-month-old architecture ;)

You left out the price premium. $500 more for no extra gaming experience. Unless you are very sensitive to your power meter. ;)

I am pretty sure Nvidia will be releasing new drivers in the future to focus on 4K benchmarks. You can see it on the 980 series non-reference MSI and ASUS cards (Gigabyte excepted): they all come with an HDMI 2.0 connection for 4K at 60Hz, or three DisplayPorts.
These cards are well equipped to handle the future of 4K once drivers get more optimized. Of course, by the time monitors come down in price, new video cards will be released. I also can't believe G-Sync can't be had on a nice 2560x1440 IPS monitor. Still disappointed with that.

Considering how badly the 780 Ti SLI setup is doing relative to the 290Xs, I wouldn't chalk it up to a two-month-old arch. Kepler is very mature. It goes deeper than drivers.
 
Except that I can't play BF4 with mantle on my 290x CFX setup without a game-killing and performance-destroying memory leak. Performance starts out good enough at 5760x1200 and then declines significantly after a round or two. It has been a known issue for months and I don't expect it to be fixed. DICE is certainly at least partially to blame, but once again we have an AMD-only issue here.

Mantle is still in beta. Expect some bugs.
 
Yes, Nvidia should follow AMD's lead and transition over to a completely bus-based SLI platform.
 
Nvidia has the power efficiency and great value for money this round; the only question I have is how far this architecture is being pushed. AMD and Nvidia being out of sync with their releases is a weird place to be, though it's not the first time this has happened. I'm guessing AMD surprised them with how good their hardware scaling is, and Nvidia is now sitting with the legacy of poor scaling. I'm still keen on a single GPU to solve my gaming needs, so the best next-gen card gets my money when I get a new monitor.

4K is pretty cool, but I think 21:9 will have more traction than 4K; ultrawide satisfies most of us dual-screen users from the 17-inch days :).

We need better 21:9 gaming monitors. All of them I've looked at have horrendous input lag.
 
AMD is being lauded for their multi-GPU performance.

While nVidia is making power efficient, cool running GPUs that perform slightly lower than AMD toaster ovens.

The world is bat shit crazy, I want off.

Pretty funny, isn't it, that 4 years ago we were saying the same thing about Nvidia's 400 series and why it was so power hungry and ran so hot while AMD's cards were much more efficient in power usage and heat... and then 3 years before that you had the exact opposite, with the Nvidia 8000 and 9000 series being way more power efficient than AMD's 3000 and 4000 series. Keep looking through the history between Nvidia and AMD and that cycle constantly repeats; it will continue to for the foreseeable future.
 
FWIW I also forgot to post this yesterday when I saw it

Guru3D did some FCAT testing and found that the GTX 980 (Maxwell 2.0) has made some serious improvements in frame pacing over the 780 cards.

So Nvidia did improve SLI to some degree on this generation of GeForce cards as far as frame pacing is concerned. I thought it was already pretty good, but lagging behind AMD after they implemented XDMA; it would be interesting to see this revisited now that Nvidia has improved things.

http://www.guru3d.com/articles_pages/geforce_gtx_980_sli_review,9.html


Guru3D said:
GTX 980 SLI does roughly 170 FPS on average in this scene sequence
Well, those dropped frames look atrocious. I think the apparent smoothness is because they're running at 140-210 fps.
 
You left out the price premium. $500 more for no extra gaming experience. Unless you are very sensitive to your power meter. ;).

So did you? How much do 4K displays cost again? ... Oh yeah ;)

And the difference in gameplay experience. In every scenario other than 4K there is a clear winner, save for a game issue with Nvidia or an un-optimized driver here or there. Only at 4K, with early and crappy drivers, are the 290 cards this competitive; keep that in mind. :)

Well, those dropped frames look atrocious. I think the apparent smoothness is because they're running at 140-210 fps.

They said in that same review that they believe it is a problem with their FCAT system, but they aren't 100% sure what is causing it: "The frame-drops in FCAT we are still investigating. Rest assured you cannot see/detect these frame-drops yourself. Hence we think it could be an issue with our FCAT system."

If I had to guess, I'd say that because it was running at over 120 fps, the monitor (Dell 3007WFP) couldn't keep up with the video card setup and so the video cards were dropping frames, but it could very well have been an issue with the FCAT system.
 
And the difference in gameplay experience. In every scenario other than 4K there is a clear winner, save for a game issue with Nvidia or an un-optimized driver here or there. Only at 4K, with early and crappy drivers, are the 290 cards this competitive; keep that in mind. :)
Even at 5760x1080/1200 or 2560x1440, 290X cards are still a bargain compared to the GTX 980; the real competition for the 290X is the GTX 970 for the best bang-for-buck fps/$ balance.

If I were asked to build a gaming machine now for 1440p gaming, the 290X would be my first pick, for a single-card or CrossFire setup.
 
Even at 5760x1080/1200 or 2560x1440, 290X cards are still a bargain compared to the GTX 980; the real competition for the 290X is the GTX 970 for the best bang-for-buck fps/$ balance.

If I were asked to build a gaming machine now for 1440p gaming, the 290X would be my first pick, for a single-card or CrossFire setup.

That's fine and all, since you're bargain conscious. Since at those resolutions the GTX 970s and 980s do quite a bit better overclock vs. overclock, that's where my purchase decision comes from, so I'd choose 970s or 980s. AMD will bring some major competition soon, but the 290s... not so much. Unless you happen to own a 4K computer monitor; then the 290X is the safe bet. Nvidia may fix these driver issues, but then again they may not.

Edit: It looks like the Watch Dogs issues may have been fixed; Alien: Isolation is a newer game, but Nvidia is supposedly working with the developer to get a good working SLI patch; and the Shadow of Mordor dev team pretty much says they don't care about multi-card.

I guess what games you play and when you play them count too, and part of the result in this review may also be that the handful of games chosen ended up looking pretty bad for Nvidia.

Edit 2: Not saying those games were in any way purposely chosen; that's just the way it happened to turn out.
 
Lately? I view this as a stupendously positive step from Ubisoft, that they took any meaningful action in the first place!

Hell, I might actually be interested in buying the game now...

There have been a few gems lately: Might & Magic X: Legacy, and the last AC (not the new one coming out). Also looking forward to FC4; loved the FC series.

But there are a lot of stinkers too.
 
That's fine and all, since you're bargain conscious.
Look at my sig; I would not call myself really bargain conscious, with a €10K+ rig. :cool:
Since at those resolutions the GTX 970s and 980s do quite a bit better overclock vs. overclock, that's where my purchase decision comes from, so I'd choose 970s or 980s. AMD will bring some major competition soon, but the 290s... not so much. Unless you happen to own a 4K computer monitor; then the 290X is the safe bet. Nvidia may fix these driver issues, but then again they may not.
I like the 980 as a card, but not at its current price; it has to drop at least $100 to be really competitive against the 290X or its smaller brother the 970. At the moment the 980 is just, IMHO, not really a good deal for its price.
 
Look at my sig; I would not call myself really bargain conscious, with a €10K+ rig. :cool:

Still, the reason you gave was bang for buck, or bargain; that's all I was referring to.

I like the 980 as a card, but not at its current price; it has to drop at least $100 to be really competitive against the 290X or its smaller brother the 970. At the moment the 980 is just, IMHO, not really a good deal for its price.

There is a price premium for being #1 :) Always has been, always will be. The reason I like the GTX 980 is that this time we're talking about maybe ~$200, vs. almost double that price difference in the last-gen cards; the price difference from 780 to Titan was much worse.

I guess Nvidia is now succeeding at getting us used to paying more, because my GTX 680 cards were each $499.99.
 
...[speculation] It's pretty clear that when the Nvidia setup is behind, it's REALLY behind and not in situations that make sense. [/speculation] Performance drops to single-card levels when staring at a static computer display in-game? That's likely just poor coding on the developer's part, and is something that may be fixed any number of ways by the developer or by Nvidia.

The FPS drop when the display is static could either be on-purpose throttling (since the GPU is effectively idle), perhaps to save power, or some simple bug that doesn't even affect gameplay. Those FPS averages and minimums in that game (Alien: Isolation) are likely not an accurate measurement of the GPU's performance.

Brent, when playing Alien: Isolation, how did it feel? Smooth? Any stuttering???

I am not pushing as high of a res, but alien isolation plays fine for me.
 
Definitely an interesting result; I would hardly have expected the tables to have turned so much with respect to scaling and performance in multi-card setups. I don't regret buying my 290X CF setup as much as I did when the 980/970s released.
 
Hmm... if a 980 SLI setup is getting 69 FPS at 4K (as stated in the [H] review) it seems likely you'd get more than 120 at 1440p. That's less than half the pixels.

There probably won't be a 980 Ti either, just like there wasn't a 680 Ti, considering 980 is the full GM204 chip.

They did the 980 in SLI as well in this review. In BF4, SLI 980s only averaged 104 fps at 1440p; that's a far cry from a 120 fps minimum, which is really what I'm looking for in ULMB mode. And that's only on Siege of Shanghai; the new maps are much more demanding. Big disappointment for me.
 
They did the 980 in SLI as well in this review. In BF4, SLI 980s only averaged 104 fps at 1440p; that's a far cry from a 120 fps minimum, which is really what I'm looking for in ULMB mode. And that's only on Siege of Shanghai; the new maps are much more demanding. Big disappointment for me.

Wait for the overclocking review Brent is already working on. ;)
 
Really great article. I game at 2560x1440 currently and just picked up an ASUS ROG Swift 144Hz monitor. I'm happy with my 290Xs right now, and it's nice to know that at higher resolutions they are still pretty stout. It looks like a lot of NVIDIA's issues could be ironed out with driver updates, but the bridge could be a limiting factor versus the XDMA solution AMD used.

I'm looking forward to the overclocking review.
 
I am patiently waiting to see how two OC'd 970s with mature drivers do. ;) Because that is my future.
 
Really great article. I game at 2560x1440 currently and just picked up an ASUS ROG Swift 144Hz monitor. I'm happy with my 290Xs right now, and it's nice to know that at higher resolutions they are still pretty stout. It looks like a lot of NVIDIA's issues could be ironed out with driver updates, but the bridge could be a limiting factor versus the XDMA solution AMD used.

I'm looking forward to the overclocking review.

Gotta ask- not interested in G-Sync?
 
They did the 980 in SLI as well in this review. In BF4, SLI 980s only averaged 104 fps at 1440p; that's a far cry from a 120 fps minimum, which is really what I'm looking for in ULMB mode. And that's only on Siege of Shanghai; the new maps are much more demanding. Big disappointment for me.

You're not going to be getting anywhere near a 120 fps minimum in BF4 with a 930. You also definitely do not need 4x MSAA at 1440p; use 2x MSAA with an SMAA injector and you'll get similar image quality with less of a performance hit. Furthermore, those are stock 980s; an average overclock of 1500 MHz will put you over the 120 fps mark even with 4x MSAA.
 
You're not going to be getting anywhere near a 120 fps minimum in BF4 with a 930. You also definitely do not need 4x MSAA at 1440p; use 2x MSAA with an SMAA injector and you'll get similar image quality with less of a performance hit. Furthermore, those are stock 980s; an average overclock of 1500 MHz will put you over the 120 fps mark even with 4x MSAA.

That's good advice, though the 930 is handling my 780s very nicely and I'm averaging over 85 fps with them, which is currently what I'm running my ULMB at (85Hz).

That said, a 5930K is on order along with a Case Labs S8 and water-cooling components. I want to add a couple of 980s and their respective blocks to the list, but I'm not convinced they are up to the task yet. With the 780s good for all but the new maps at 85Hz, it's hard to justify 980s when I won't be able to get a solid 120 fps for 120Hz ULMB; even overclocked, and with driver updates, I don't think they'll make it, but here's hoping! I really don't want to be forced to sell body parts to pick up a Titan 2, or whatever is next for Nvidia. Thanks to how amazing ULMB is, AMD is unfortunately not an option for me.
 