NVIDIA GeForce GTX 980 SLI 4K Video Card Review @ [H]

You mean to say that the AMD cards have no DP ports?

No? I'm referring to the 4K TVs. Samsung, Sony, LG: none of them have DisplayPort. Therefore you are limited to 30 Hz with an AMD card, and there are no DisplayPort-to-HDMI 2.0 adapters. Panasonic is the only option.
 

I read that a new adapter will be available sometime in late 2015 that will convert the signal.

AMD just wasn't thinking, were they?
 

You really are trying hard here to make Nvidia look like the best in every aspect and make AMD look like shit. Keep fighting the good fight there; you're really convincing everyone that your information is not biased at all!
 

Hey man, this is a discussion forum, get used to it.
 
Oh, I forgot I'm in the children's section here. Sorry, won't happen again. Anyway, I don't feel like feeding the trolling or commenting on how someone posts.

Nvidia has some work to do with drivers and SLI when it comes to 4K, since they look to be a bit behind. That said, while their performance isn't dominating, the power consumption is pretty impressive.
 

I am having déjà vu; hasn't this been mentioned already? Anything else new? Come on man, you can do better than this.
 
I read that a new adapter will be available sometime in late 2015 that will convert the signal.

AMD just wasn't thinking, were they?

Why don't you now go do some research and let us know when this new adapter will be available?
 
The reason this happens is that Crossfire scales better than SLI.

Crossfire typically adds about 70-85% more performance, while SLI at times adds only 60-75%. If you look at 3-way and 4-way configurations, even though the 780 Ti is faster as a single GPU, 4x 290X will outperform 4x 780 Ti.

The other reason is that even with a single GPU, AMD is almost competitive in raw performance at 4K. The 980 may be about 15% faster than a 290X at 2560x1440, but the gap closes to within 10% on average at 4K. I think the wider memory bus plays a role here.

Now if only AMD could get its power consumption down, it would have a very competitive solution.
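To put rough numbers on that argument, here is a quick Python sketch. The frame rates are made-up placeholders, and the model simply assumes each extra card adds a fixed fraction of one card's performance, which is a simplification of how real scaling behaves:

```python
# Quick sketch of how per-card speed and multi-GPU scaling interact.
# The fps figures are hypothetical; the point is the ratio, not the numbers.

def effective_fps(single_gpu_fps, num_gpus, scaling):
    """Assume each card beyond the first adds `scaling` of one card's output."""
    return single_gpu_fps * (1 + (num_gpus - 1) * scaling)

fps_780ti = 40.0   # pretend the 780 Ti is ~5% faster as a single card at 4K
fps_290x  = 38.0

for n in (2, 3, 4):
    sli = effective_fps(fps_780ti, n, 0.675)   # midpoint of the 60-75% SLI range
    cfx = effective_fps(fps_290x,  n, 0.775)   # midpoint of the 70-85% CFX range
    print(f"{n}-way: 780 Ti SLI ~{sli:.0f} fps vs 290X CFX ~{cfx:.0f} fps")

# With better scaling, the slower single card pulls ahead as you add GPUs.
```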
 
The reason this happens is that Crossfire scales better than SLI.

That's a case-by-case thing. SLI (or Crossfire) not scaling properly in newer games is nothing new.

Now if only AMD could get its power consumption down, it would have a very competitive solution.

History would suggest that Nvidia will get SLI sorted out long before AMD gets its power consumption in check...
 
The reason this happens is that Crossfire scales better than SLI.
The SLI scaling gap behaves much like the resolution gap described above. I see near-perfect 100% scaling in a lot of games at 1080p, which lets me push my 144 Hz monitor without sacrificing detail with my 970s. The scaling gets worse when I go to 1.78x and 4.00x DSR. In Shadow of Mordor, scaling drops to around 50-60% at 4K compared to 100% at 1080p.
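For reference, DSR factors are pixel-count multipliers, so each axis scales by the square root of the factor; a quick sketch of what 1.78x and 4.00x mean on a 1080p panel:

```python
# DSR factors multiply the pixel count, so each axis scales by sqrt(factor).
from math import sqrt

def dsr_resolution(native_w, native_h, factor):
    """Approximate render resolution for a given DSR factor."""
    scale = sqrt(factor)
    return round(native_w * scale), round(native_h * scale)

print(dsr_resolution(1920, 1080, 1.78))  # ~(2562, 1441) -> effectively 2560x1440
print(dsr_resolution(1920, 1080, 4.00))  # (3840, 2160)  -> 4K
```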

Naturally, I will evaluate voltage behavior in the overclocking review. If I see an issue, I will report it.

I like to start with a baseline review before doing the overclocking review. IMO, you need to know where you are coming from and what you have before you can compare that to where you are going in terms of maximum performance. Overclocking these cards and gaming while overclocked is an entire article in and of itself. It would be too much for one article, and it would take a lot more time to get the review published.

Now that I have this baseline, I can take my overclocking data, have something to compare it to, and make a really nice overclocking article.
Good to hear, but keep in mind that the voltage discrepancy affects SLI at all clock speeds, default included. In the [H] discussion I link in my original post, we found that offsetting the core clock on the card that wasn't undervolting brought voltages up to a consistent level on both cards before starting to overclock. Doing so improved stability and performance in most cases, with the side effect of higher Boost clocks that were more in line with what we were seeing in single-card situations.
 
This situation certainly reminds me of the X1800 XT vs. 7800 GTX days. One with a 512-bit bus and high power usage, the other with high clocks, power efficiency, and a 256-bit bus. The first does better at high resolution, the other better at standard resolution.
Call me a naysayer, but I really don't think that SLI inefficiency is 100% to blame here.

I think you meant the HD 2900 XT(X) with the 512-bit bus, not the X1800.

I'd like Nvidia to implement their own bridgeless SLI solution for the large-die Maxwell, but that wouldn't really help anyone at the moment with the issues pointed out in the article.
It would be interesting to see whether a large-die Maxwell paired with a wider memory bus and the "old fashioned" bridge for SLI could overcome the issues, or whether it really is time for the bridge to die off. In any case, hopefully drivers get pumped out ASAP.
 
The thing is, though, the 290/290X are over a year old, and Nvidia just countered one-year-old tech. I would expect some improvement with a new generation of cards.

Nvidia delivered on power/heat/performance. The problem is that Nvidia just now matched AMD in 4K performance. I don't think drivers are going to fix all that much (remember, Hawaii is a year old).

And something to think about: 4K is the future, especially with 4K monitors hovering around $550.

Anyway, now I'm rambling.
 
Noticed a small typo. In the Alien: Isolation segment you've got "We are go into more detail in an upcoming article" instead of "We will go".

As for the article, I am very surprised that a pair of 290Xs at half the price can give you almost the same performance as a pair of 980s! With 290Xs going for almost $300 right now, that's a hard deal to pass up for so much GPU horsepower. I honestly didn't think AMD would lower their prices that much, but knocking $200 off those cards makes them the best bang for the buck for a high-end GPU.
 

I wish AMD did more to leverage its advantage at 4K. DSR, which was recently added to the NV cards, renders at a higher resolution and downsamples to your native res, enhancing image quality.

AMD should have beaten Nvidia to this feature. I find it very useful, especially for games that lack AA or are limited to poor-quality AA options like FXAA.

Such a missed opportunity, IMO.
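For anyone wondering what the downsampling step looks like conceptually, here is a minimal NumPy sketch using a plain box filter. NVIDIA's actual DSR filter is a tunable Gaussian, so treat this only as an illustration of the render-high-then-average idea:

```python
# Minimal illustration of the render-high-then-downsample idea behind DSR.
# NVIDIA's real filter is a tunable Gaussian; a simple box filter is used
# here just to show the concept.
import numpy as np

def box_downsample(image, factor):
    """Average `factor` x `factor` blocks of a (H, W, C) image."""
    h, w, c = image.shape
    h2, w2 = h // factor, w // factor
    blocks = image[:h2 * factor, :w2 * factor].reshape(h2, factor, w2, factor, c)
    return blocks.mean(axis=(1, 3))

# Pretend this is a frame rendered at 4x the native pixel count (2x per axis).
rendered = np.random.rand(2160, 3840, 3)
native = box_downsample(rendered, 2)    # -> 1080x1920, smoother edges
print(native.shape)                     # (1080, 1920, 3)
```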
 
Yes, AMD says it is DICE's fault and DICE says it is AMD's fault. The fact remains that it only affects AMD cards - and seemingly mostly just in crossfire. It is extremely, extremely annoying. I frequently jump back and forth between AMD and Nvidia, but this sort of shit just seems to happen way more regularly when I have AMD cards.

A link where DICE says it's AMD's fault? DICE made the patch, not AMD; there are no memory leaks in any other Mantle game, and BF4 is a bug-ridden mess. And of course it only affects AMD cards, as Mantle only runs on AMD :rolleyes:
When there are memory leaks in DX games, you don't see developers trying to blame Microsoft's DX.
 
A modest grammar error I've spotted in the conclusion:

'We thought we were going to be able to conclude praising GeForce GTX 980 SLI as the hands down best 4K gaming experience to date.'

should be

'We thought we were going to be able to conclude by praising GeForce GTX 980 SLI as the hands down best 4K gaming experience to date.'
 
...increased ROPs, but on a 256-bit bus. Not a problem at 1080p; at 'almost 4K' 2160p? Not so much...
 
Question:
I never quite understood the testing methodology [H] uses when comparing NVidia and AMD GPUs.
From what I notice on the "test setup" screen, they test NVidia GPUs with two extra features enabled that the AMD GPUs don't get: CUDA and PhysX.
I know these two features cannot be enabled on AMD GPUs since they are exclusive to NVidia, but you cannot compare two things using different setups just because one of them offers extra features!! :eek:
It's obvious and logical that if you want to compare something with something else, you have to test with exactly the same factors!! :confused:
 

They shouldn't be doing that in the "apples to apples" section.

I would have liked to see max OC vs. max OC. That's where Maxwell really shines vs. AMD. AMD's cards are pretty much maxed out to start with, while many people with Maxwell are getting 20-30% overclocks.
 


There is no doubt that, when overclocked, the GTX 980 destroys the 290X in every benchmark.

That's why these cards sell out like hot cakes. Don't be fooled. If Nvidia doesn't fix SLI for 4K, then with the next lineup AMD releases we could see AMD as king of the hill within the next 3 months.

Plus, this review was using reference video cards with no boost or memory increase. Who buys reference cards, lol, on HardOCP?
 
Very interesting review. The R9 290X cards keep on going.

Looks like the 970/980 cards were really rushed to market. Issues such as coil whine, unpolished drivers, and very low availability seem to indicate this.

I'm really curious about next-gen AMD cards. AMD has been focusing on 4K. If the R9 M295X is any indication, good things are on the way, especially for people using high-resolution displays.
 
The thing is, though, the 290/290X are over a year old, and Nvidia just countered one-year-old tech. I would expect some improvement with a new generation of cards.

And AMD hasn't delivered anything worthwhile in that year at all. You can't argue with the power and heat advantages, and NVIDIA at least will improve with driver updates.
 
Nice review as usual.

AMD needs to go on a power diet. What is that, like $30 more a month if you game in CrossFire?

A bit of an exaggeration, even for a troll post.

3 hours of gaming per day (borderline unhealthy)
× 300 W extra for 290X CrossFire
× 30 days/month
= 27 kWh/month
× $0.10/kWh
= $2.70/month
Over 2 years ≈ $65.

For a $500 cost savings up front, 8% compounded monthly for 2 years = $586.
Gee, seems like the cost savings offsets the power usage, even for a heavy gamer.
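The same math in a few lines of Python, in case anyone wants to plug in their own hours or electricity rate (the 300 W delta and $0.10/kWh are the figures assumed above):

```python
# Reproducing the back-of-the-envelope numbers from the post above.
hours_per_day   = 3        # gaming time
extra_watts     = 300      # assumed additional draw of 290X CrossFire vs. 980 SLI
rate_per_kwh    = 0.10     # USD per kWh
days_per_month  = 30

extra_kwh_month = hours_per_day * extra_watts / 1000 * days_per_month   # 27 kWh
cost_month      = extra_kwh_month * rate_per_kwh                        # $2.70
cost_two_years  = cost_month * 24                                       # ~$64.80

# Opportunity value of the ~$500 saved up front, 8% APR compounded monthly.
future_value = 500 * (1 + 0.08 / 12) ** 24                               # ~$586

print(f"extra electricity over 2 years: ${cost_two_years:.2f}")
print(f"$500 saved, invested for 2 years: ${future_value:.2f}")
```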
 
I tend to leave my computer on 24/7. Once I get home from work, feed everyone, and get the kids in bed, I game for about 3-5 hours, then head to bed.

Looking at the power numbers, the green team cards are 5-10% more energy efficient over the day than the competing red team cards. For no performance advantage, it's clear right now that the AMD cards are still the best 4K cards by value.

At some point, Nvidia will get the drivers in shape and a pair of 980s will eventually pull away from a pair of 290X cards. For the foreseeable future, anyone actually basing their decision on the posted facts would pick up 290Xs if they were using a 4K monitor at home.

If you're concerned about future-proofing, then you'd honestly be waiting for next-gen AMD to drop to make an actual comparison.
 
At some point, Nvidia will get the drivers in shape and a pair of 980s will eventually pull away from a pair of 290X cards. For the foreseeable future, anyone actually basing their decision on the posted facts would pick up 290Xs if they were using a 4K monitor at home.
Actually, I think the 512-bit memory bus with 320 GB/s of bandwidth is what's giving AMD's year-old card such nice performance at 4K, versus the 256-bit bus with only 224 GB/s of bandwidth on the GTX 980.

And I think the reason the 780 Ti did so badly is probably that 3 GB is too little memory, as the 8.3 MP of 4K uses a lot of it.

I see my Titan setup driving a 7680x1600 Surround setup with 12.3 MP often getting close to the 6 GB memory limit.
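For what it's worth, those bandwidth and megapixel figures fall straight out of the published specs (5 Gbps effective GDDR5 on a 512-bit bus for the 290X, 7 Gbps on a 256-bit bus for the GTX 980):

```python
# Peak memory bandwidth = effective data rate per pin * bus width in bytes.
def bandwidth_gbs(gbps_per_pin, bus_width_bits):
    return gbps_per_pin * bus_width_bits / 8

print(bandwidth_gbs(5.0, 512))   # R9 290X: 320.0 GB/s
print(bandwidth_gbs(7.0, 256))   # GTX 980: 224.0 GB/s

# Pixel counts mentioned above.
print(3840 * 2160 / 1e6)         # 4K UHD:              ~8.3 megapixels
print(7680 * 1600 / 1e6)         # 7680x1600 Surround:  ~12.3 megapixels
```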

If you're concerned about future-proofing, then you'd honestly be waiting for next-gen AMD to drop to make an actual comparison.
And I too am waiting for the 390X, as that card will come with 8 GB of VRAM according to the latest smoke signals.
 
A link where DICE says it's AMD's fault? DICE made the patch, not AMD; there are no memory leaks in any other Mantle game, and BF4 is a bug-ridden mess. And of course it only affects AMD cards, as Mantle only runs on AMD :rolleyes:
When there are memory leaks in DX games, you don't see developers trying to blame Microsoft's DX.

I could direct you to dozens of threads on this very issue, but I suspect you don't care and just want to carry on, so whatever. If you are interested, head over to the Battlelog forums.
 
Gee, seems like the cost savings offsets the power usage, even for a heavy gamer.

If you live in a cold environment, it probably offsets heating. But if you are in a warmer environment, another 300 W of heat in a room is not trivial, especially during the summer. Secondly, there is a difference in power supply cost that should be accounted for as well, and possibly case size. A rough SWAG of around $75/year in savings would probably be reasonable.
 
New Watch Dogs patch released yesterday is a miracle for GeForce cards

http://www.guru3d.com/news-story/watch-dogs-new-pc-update-oct-27.html

It took over 5 months, but they have finally released a patch (Patch #1) that should fix the issues on GeForce cards:

• Better performance on more recent models of cards - 770 onwards will see better improvements.
• Moved all resource creation to an exclusive worker thread.
• Changed technique used for updating texture mipmaps in Ultra textures mode.
 

Only 5 months late!!!

Man, Ubisoft really is sucking lately.
 
If you live in a cold environment, it probably offsets heating. But if you are in a warmer environment, another 300 W of heat in a room is not trivial, especially during the summer. Secondly, there is a difference in power supply cost that should be accounted for as well, and possibly case size. A rough SWAG of around $75/year in savings would probably be reasonable.

Case size? :confused: Yeah... riiiight. Big size difference between the GPUs, especially because people who have multiple GPUs have small cases. :rolleyes:

My post was purely targeted at the monthly power consumption of the video card, because that's what the troll post talked about.
Even for the secondary power cost of home AC/heating, the heat generated in the winter is money that would have been spent on the home's central heating anyhow, so it averages out over summer and winter even for people who live in high-desert climates like me.

Rough SWAG of $75/year? Show your numbers.
But before you do, is it even worthwhile to continue this discussion?

There's a whole list of assumptions that would have to be made, and it could easily be skewed either way to show advantages or disadvantages for the GTX 980 or 290X. It's just an exercise in futility with no real-world value on a forum.
 
Let's drop the questioning of each other and please stay on topic.
 
I just installed two R9 290X cards in my rig to push a 21:9 3440x1440 monitor. I am getting quite a bit of stuttering in a few games. I thought frame pacing was supposed to correct this. Is frame pacing a setting that I have overlooked? I thought it was built into the drivers. I thought it was the monitor, so I hooked my three 1920x1200 monitors back up and I still get it. I did not get the stuttering when I had my SLI 680s. I ask because I was thinking about building another system; I am torn between keeping the R9s and using the 680s in my secondary system, or purchasing two 980s and putting the R9s in my secondary.
 
Only 5 months late!!!

Man, Ubisoft really is sucking lately.

Lately? I view this as a stupendously positive step from Ubisoft, in that they took any meaningful action in the first place!

Hell, I might actually be interested in buying the game now...
 