4 Weeks with NVIDIA GeForce GTX 980 SLI @ [H]

Would love to see a technical explanation of how AMD is achieving better scaling with their multi-GPU solutions.

With all the work they did to overcome their frame stuttering issues, I am not entirely surprised they have their CrossFire architecture maxed out.
 
I'm glad Nvidia released the GTX 970/980, but the 970 is the only one I would buy if I wanted an upgrade, because of its price. This is still 28nm, which we have been stuck at since the HD 7970 release, and there's no need to keep throwing money at a dead end.


So Nvidia has made great values of every card out there, in both new and used prices. It gives new life to older cards at a discount, like the GTX 770 that I bought used for $200; it handles my needs and gives me time to wait for 20nm.
 
Also, on the 290X heat issue: I know firsthand that it can be handled with an aftermarket cooler, which will drop temps 20°C and keep the clock rates up. I did a lot of testing on my 290 and logged GPU-Z data back before AIBs started releasing their own coolers. Most of the heat issues are with the reference design.
 
Also, on the 290X heat issue: I know firsthand that it can be handled with an aftermarket cooler, which will drop temps 20°C and keep the clock rates up. I did a lot of testing on my 290 and logged GPU-Z data back before AIBs started releasing their own coolers. Most of the heat issues are with the reference design.

This helps keep you from throttling, but it still doesn't change the amount of heat being released.
 
Also, on the 290X heat issue: I know firsthand that it can be handled with an aftermarket cooler, which will drop temps 20°C and keep the clock rates up. I did a lot of testing on my 290 and logged GPU-Z data back before AIBs started releasing their own coolers. Most of the heat issues are with the reference design.
The card still produces the exact same amount of heat. The aftermarket cooler simply removes more of that heat from the GPU itself. But your room will heat up the same whether it's a reference-cooled 290X or one with a custom water loop and block. The only way to reduce the heat output would be to somehow improve the power efficiency of the card.
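A quick back-of-the-envelope check on that point, as a minimal Python sketch; the ~290 W board power is an assumed illustrative figure, not a measured one:

```python
# Minimal sketch of the point above: at steady state, every watt the card
# draws becomes heat in the room, no matter the cooler. The ~290 W board
# power is an assumed illustrative figure, not a measured one.

def watts_to_btu_per_hr(watts: float) -> float:
    # 1 W dissipated continuously is ~3.412 BTU/hr of heat into the room
    return watts * 3.412

card_power_w = 290.0
print(f"Reference cooler:   {watts_to_btu_per_hr(card_power_w):.0f} BTU/hr")
print(f"Custom water loop:  {watts_to_btu_per_hr(card_power_w):.0f} BTU/hr (identical)")
```

The cooler choice changes GPU temperature, not the room heat load.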
 
Interestingly, there was some serious conversation inside NVIDIA about opening up PhysX, and I thought it was actually going to happen a couple of years ago. Something changed in regards to that along the way though.

Well, Nvidia invited both AMD and Intel to support PhysX shortly after they acquired Ageia. Both declined. I think nowadays Nvidia sees it as added value, so they are probably no longer interested in whether AMD or Intel support it.

I did read rumors about Nvidia opening PhysX a few years ago, so I guess they were true.

BTW, whatever happened to Havok-accelerated physics?
 
Well, Nvidia invited both AMD and Intel to support PhysX shortly after they acquired Ageia. Both declined. I think nowadays Nvidia sees it as added value, so they are probably no longer interested in whether AMD or Intel support it.

I did read rumors about Nvidia opening PhysX a few years ago, so I guess they were true.

BTW, whatever happened to Havok-accelerated physics?

FWIU, they said it could be licensed by others. nVidia isn't opening anything up to anyone.
 
Kyle, to be honest, I am quite puzzled by the power draw difference.
Most reviews report roughly a 100 W difference between a single GTX 980 and an R9 290X at full load.
A 300 W difference when using two cards therefore seems very odd. How can this be explained? (See the sketch below.)

BTW, as a GTX 970 owner I am also very curious about a 970 in SLI.
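A rough sketch of the arithmetic behind that question; the per-card delta and PSU efficiency are assumptions for illustration, not review data:

```python
# Rough sketch of the arithmetic in the question above. All numbers are
# illustrative assumptions, not review measurements.

single_card_delta_dc = 100.0  # assumed DC-side difference per card, in watts
psu_efficiency = 0.88         # assumed PSU efficiency at this load

dual_delta_wall = (2 * single_card_delta_dc) / psu_efficiency
print(f"Naive expected wall-side difference: {dual_delta_wall:.0f} W")
# -> ~227 W, not 300 W. The remaining gap would have to come from higher
# CPU load (the faster cards push more frames), temperature-dependent
# leakage, or the cards boosting differently under CrossFire/SLI.
```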
 
Great review. WOW, what performance for the watts used. I tested my rig (below) and it uses a lot more with my watercooled 290s cranked up.

I turned all of my fans to max (3 Corsair SP120s on my RX360 radiator and 9 XSPC 140s on my MO-RA3 420 external radiator). I run dual D5 pumps with my 3930K cranked to 4.6 GHz. Inside my case I also have a 230mm fan and a 90mm fan. My Sabertooth motherboard has a chipset fan. Both of my Sapphire Tri-X GPUs are watercooled and overclocked via Afterburner to 1075 core/1400 memory with power settings maxed.

I decided to run my licensed version of AIDA64, which has a stress test feature to stress various components. With fans set to high, my idle draw was 175W (155W with fans set to low). Max power usage with ALL components stressed was 910W. When I ran the test without stressing the GPUs, the lowest total usage was 410W (it went as high as 425W).

All tests were done with my Kill A Watt meter. I only tested my computer, not my monitor.

It's a fair conclusion that the GPUs use as much as 500W stressed, or 250W per GPU.

No doubt the GTX980 uses less.
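For what it's worth, a minimal sketch reproducing that subtraction, with a reminder that a Kill A Watt reads wall-side power; the PSU efficiency here is an assumed figure:

```python
# Reproducing the subtraction above, plus a reminder that a Kill A Watt
# measures wall-side power; the PSU efficiency figure is an assumption.

full_load_wall_w = 910.0    # all components stressed, at the wall
no_gpu_load_wall_w = 410.0  # same test without stressing the GPUs
psu_efficiency = 0.88       # assumed

gpu_delta_wall_w = full_load_wall_w - no_gpu_load_wall_w  # 500 W at the wall
gpu_delta_dc_w = gpu_delta_wall_w * psu_efficiency        # ~440 W delivered
print(f"Per overclocked 290: ~{gpu_delta_wall_w / 2:.0f} W at the wall, "
      f"~{gpu_delta_dc_w / 2:.0f} W DC")
```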
 
Well, Nvidia invited both AMD and Intel to support PhysX shortly after they acquired Ageia. Both declined. I think nowadays Nvidia sees it as added value, so they are probably no longer interested in whether AMD or Intel support it.

I did read rumors about Nvidia opening PhysX a few years ago, so I guess they were true.

BTW, whatever happened to Havok-accelerated physics?

Havok is a physics engine that is still rather commonly used (Elder Scrolls Online and Watch Dogs this year, for example) and was acquired by Intel.

It was never tied into hardware acceleration.
 
The card still produces the exact same amount of heat. The aftermarket cooler simply removes more of that heat from the GPU itself. But your room will heat up the same whether it's a reference-cooled 290X or one with a custom water loop and block. The only way to reduce the heat output would be to somehow improve the power efficiency of the card.


True, but that depends on how well the cooler stores heat and releases it over time, and on fan RPM. Copper can store a lot of heat; the reference cooler lacks heat pipes, so it needs higher fan RPM to make up for the lack of thermal mass and just dumps the heat at a faster rate.
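A small worked example of that thermal-mass point, using Q = m·c·ΔT; the cooler mass, allowed temperature rise, and board power below are assumed figures:

```python
# Sketch of why thermal mass only buffers heat briefly: Q = m*c*dT gives
# the energy a cooler can soak up, but at steady state dissipation has to
# equal the card's power draw. Mass, rise, and power are assumed figures.

copper_specific_heat = 385.0  # J/(kg*K)
cooler_mass_kg = 1.0          # assumed copper mass of a beefy aftermarket cooler
allowed_rise_k = 20.0         # assumed temperature rise before throttling range
card_power_w = 250.0          # assumed board power under load

energy_j = cooler_mass_kg * copper_specific_heat * allowed_rise_k
print(f"Buffer time with no airflow: {energy_j / card_power_w:.0f} s")
# -> ~31 s. After that, every watt in must be a watt out, whatever the cooler.
```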


But I do miss my 290; winter is coming, and this is where the card makes up for its heat output. Kyle will have a colder winter with that Nvidia rig, lol.
 
Looking at the price difference, it seems to me that the real winner here would be 970 SLI compared to 290X CrossFireX and 980 SLI.
Best of both worlds.
 
Looking at the price difference, it seems to me that the real winner here would be 970 SLI compared to 290X CrossFireX and 980 SLI.
Best of both worlds.


Why, when the 290X costs about the same as the 970 and trades blows with the 980?
 
Absolutely, I mean dumping most of that heat back into the case is much better. :p

Get a good case and don't worry about it.

[Attached thermal images: 290systemtemps.png, 290refsystemtemps-1.png]


Here's reference compared to Tri-X. Difference of ~2°C at the "hotspot" above the card. This is in a CM 690 II, not exactly a state-of-the-art case either.
 
I thought we were talking about multi-GPU in this thread, gents. You got one of those pics for CrossFire?
 
I thought we were talking about multi-GPU in this thread, gents. You got one of those pics for CrossFire?

That is a valid point, but no, I don't have any images. There are few sites that give us anything other than what software reports for temp readings (which I find interesting, since TPU says they caught a supplier, which they won't name, inaccurately calibrating sensors to cheat on temp readings). The few that do thermal imaging have only run reference SLI/CrossFire tests.
 
Can we drop the "insane heat" comments? If you live in Hawaii, next to a volcano, and run R9 290X cards, then I can understand the heat comments, but other than that it's not that big of a deal. To call it insane is downright silly.

I'm in Canada and it gets cold here, but my gaming room is a small room in my basement with little airflow.
I'm switching from my GTX 670 SLI setup to a 980 simply for temp reasons.
 
Just wondering whether [H] is going to add Ryse: Son of Rome as another game for benchmarking video cards. The game looks awesome and is very resource hungry. It would be perfect for benchmarking SLI and CF configurations.
 
Well, seeing this thread has quite a few followers... I wanted to ask a question.

I have a triple monitor setup (2x 23-inch and 1x 27-inch) that is currently powered by an XFX R7970 OC Black Edition.

I use it mostly for gaming (Eve Online, World of Warcraft, and Civilization 5), and I effectively run 2, sometimes 3, games at the same time.

I'm experiencing the following problems:

- Heat build-up.
- The fans make a hell of a lot of noise. (XFX made one of the first custom designs, and in my opinion they made a bad cooling setup.)
- Sluggish response at times when running 3 games at the same time.

Would a setup like this with 2x GTX 980 in SLI improve my gaming a LOT, or are we talking about a marginal upgrade like 10%-20%?
 
Well, seeing this thread has quite a few followers... I wanted to ask a question.

I have a triple monitor setup (2x 23-inch and 1x 27-inch) that is currently powered by an XFX R7970 OC Black Edition.

I use it mostly for gaming (Eve Online, World of Warcraft, and Civilization 5), and I effectively run 2, sometimes 3, games at the same time.

I'm experiencing the following problems:

- Heat build-up.
- The fans make a hell of a lot of noise. (XFX made one of the first custom designs, and in my opinion they made a bad cooling setup.)
- Sluggish response at times when running 3 games at the same time.

Would a setup like this with 2x GTX 980 in SLI improve my gaming a LOT, or are we talking about a marginal upgrade like 10%-20%?

Exactly how are your monitors arranged?
Do you run one game on all three monitors, or one on each?

The 980s still produce heat, but less than the R9 290X and probably about the same as a 7970.

The 980s will be WAY quieter than your current card up to about 75% fan speed, maybe 80%; beyond that they do become audible.
 
What do you guys do to get SLI to work in the newest titles that don't currently have SLI support in the latest drivers? Oftentimes you show SLI-enabled benchmarks but don't say whether it's automatically enabled in the newest driver, manually set to AFR 1 or AFR 2 in the control panel, or some magic profile forced via Nvidia Inspector.
 
What do you guys do to get SLI to work in the newest titles that don't currently have SLI support in the latest drivers? Oftentimes you show SLI-enabled benchmarks but don't say whether it's automatically enabled in the newest driver, manually set to AFR 1 or AFR 2 in the control panel, or some magic profile forced via Nvidia Inspector.

If forcing AFR2 or AFR1 doesn't work in the control panel, usually someone has figured out how to do it in Nvidia Inspector; a Google search will usually show you how. That is usually an interim fix until a patch or driver update comes out with a working profile.
 
If forcing AFR2 or AFR1 doesn't work in the control panel, usually someone has figured out how to do it in Nvidia Inspector; a Google search will usually show you how. That is usually an interim fix until a patch or driver update comes out with a working profile.

Would it be possible for [H] to say in their video card reviews which method they are using?
 
Would it be possible for [H] to say in their video card reviews which method they are using?

I don't speak for them, but from my time here as a reader, they test with the latest driver and game patches, multi-card support be damned. If the game doesn't support it, they post the results and mention the lack of support in the article.

They don't go through all of those measures to get SLI or CrossFire working because the average user may not know how to do all of that, and they want to report the out-of-box experience most users will run into. This sometimes makes the graphics card manufacturers look like they are lacking in the driver dept, or the game developer look like they shoveled out a shitty console port, but it is what it is.

Sometimes this has changed the industry for the better. AMD took note of some things reported at this site and has improved their multi-card scaling by leaps and bounds. Game devs have improved their support, likely to avoid the bad press this and other sites give when they get called out.

In the words of the great Kyle Bennett regarding Rage:

I want my damn money Back!
 
Exactly how are your monitors arranged?
Do you run one game on all three monitors, or one on each?

Thanks for your answer, Magoo.

I run one game per monitor. The 2x 23-inch are connected through DVI and the main 27-inch is connected through HDMI. (Also, a small remark: whichever monitor is connected through HDMI is always a lot darker in contrast than the two screens connected through DVI, no matter what I do with the settings.)

Also, for me the amount of noise is really annoying when those games are running...
 
Thanks for your answer, Magoo.

I run one game per monitor. The 2x 23-inch are connected through DVI and the main 27-inch is connected through HDMI. (Also, a small remark: whichever monitor is connected through HDMI is always a lot darker in contrast than the two screens connected through DVI, no matter what I do with the settings.)

Also, for me the amount of noise is really annoying when those games are running...

Wow, you play 3 games at the same time? Daaamn, you're efficient. Sounds like a lot of work for a single 7970. As far as I can tell, a GTX 980 should be about even with 7970s in CrossFire, though it can vary depending on the game and resolution.

Why don't you buy a single GTX 980 instead of two right away? You will see a substantial increase in performance, as well as reduced power consumption, heat, and noise.
 
Why don't you buy a single GTX 980 instead of two right away? You will see a substantial increase in performance, as well as reduced power consumption, heat, and noise.

I'm quite convinced about the heat and noise reduction; as for the power consumption, I'm guessing that depends on what you're doing at that specific moment.

The reason for two cards is that I notice the R7970 can cope with the heavy load, but everything is not as fluid as it should be. While playing, for example, 2 Eve Online clients and 1 WoW client, would the load be divided over the 2 cards, or am I looking at this wrong?
 
I'm not sure I've heard of too many people playing more than one game at once, let alone three. I believe the card would try to distribute resources to each game; I don't think it would be divided evenly. But I think this is another case where the GTX 980's extra 1GB of VRAM is beneficial, because I doubt VRAM would be shared between games.

If a 7970 can almost run 3 games to your liking, then I think a single 980 would be a substantial upgrade and would probably solve your problems; it would definitely be quieter and a lot cooler. It's nearly twice the performance of a 7970.
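As an aside, one way to sanity-check the per-game VRAM question is to watch the GPU with nvidia-smi while the games run. A minimal sketch that just shells out to it; the query flags are standard nvidia-smi options, though output formatting can vary by driver version:

```python
# Poll nvidia-smi for per-GPU utilization and memory use while several
# games are running; the tool's default (no-flag) output also lists
# memory per process, which shows each game's allocation separately.

import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=index,utilization.gpu,memory.used,memory.total",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in result.stdout.strip().splitlines():
    print(line)  # e.g. "0, 87 %, 3012 MiB, 4096 MiB"
```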
 
AMD is sooooo far behind Intel/Nvidia...why do people even bother with AMD hot/loud garbage!!
 
Wow, you play 3 games at the same time? Daaamn, you're efficient. Sounds like a lot of work for a single 7970.

Is Eve really that much of a stress? After all, it's little more than a pretty front end to a spreadsheet and a chat app, right?
 
AMD is sooooo far behind Intel/Nvidia...why do people even bother with AMD hot/loud garbage!!

Please read THIS review on [H] and see how ridiculous that statement is. Also keep in mind it's a comparison of year-old AMD tech against nVidia's latest and greatest.

Don't forget to read the conclusion if you want to think that AMD is sooooo far behind.
 
Please read THIS review on [H] and see how ridiculous that statement is. Also keep in mind it's a comparison of year-old AMD tech against nVidia's latest and greatest.

Don't forget to read the conclusion if you want to think that AMD is sooooo far behind.

Ignore him. He has never made an intelligent post here or on any other forum. He spams inaccurate crap about his 770 SLI setup on the AnandTech forums all the time.

EDIT: lol, he was just recently banned from AnandTech because they were sick of his crap.
 