Testing: 1080 GTX SLI HB + 970 GTX PHYSX PCIE 3.0 Scaling at 4K Resolution

pc1x1

[H]ard|Gawd
Joined
Jan 1, 2008
Messages
1,165
Hi guys,

Because I have been reading contradictory information on the subject, I decided to test it myself.

This article still seems right, and logic would make me agree with it:
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_PCI_Express_Scaling/25.html

But some are reporting that it does make a difference at 4K and above, so I wanted to test it myself and give others a reference. A Dell 4K monitor will be used.

Here's the issue:
Because a 970 GTX is being used as a dedicated PhysX card, and a 28-lane CPU is being used, the slots will be limited to PCIe 3.0 x8. I am curious to see if there is an impact on the 1080 GTXs in SLI running at x8. An Nvidia HB 2-slot bridge will be used on the 1080s.

I am testing an Asus Rampage V Edition 10 with a 6800K. The 6800K was chosen because there didn't seem to be a real reason to go for the 6850K, and the 6800K was on sale. I hope not to regret that decision. But because of that, we can't have all slots at x16. The purpose of this is to test whether that will affect gaming at 4K resolution and above (using scaling).
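For anyone following along, here's a rough sketch of the lane math on a 28-lane CPU. The slot widths are my assumptions based on the configurations discussed in this thread, not numbers pulled from the board manual:

```python
# Rough lane math for a 28-lane CPU (e.g. the 6800K). Slot widths are
# assumptions based on the builds discussed in this thread.
CPU_LANES = 28

configs = {
    "2x 1080 SLI + 970 PhysX": [8, 8, 8],  # all CPU-fed slots drop to x8
    "2x 1080 SLI only":        [16, 8],    # top slot can keep x16
}

for name, widths in configs.items():
    used = sum(widths)
    assert used <= CPU_LANES, f"{name} exceeds the lane budget"
    layout = "/".join(f"x{w}" for w in widths)
    print(f"{name}: {layout} -> {used}/{CPU_LANES} lanes")
```

Either way the CPU lanes are nearly spent, which is why adding a CPU-attached M.2 drive would force further splitting.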

Bonus question: Previously this rig was using 4x SSDs in RAID 0, which gave about 1 GB/s read/write performance. I intend to continue using that. However, I noticed M.2/U.2 configurations are popular now, and it makes sense to have a direct connection to PCIe. Will this PC suffer from having the 4x SSD RAID on the SATA controller? And what are the pros and cons vs. M.2/U.2? Obviously there's not having the liability of RAID 0, but I wanted to know if there is any other technical reason. I presume the bandwidth taken from the CPU comes from the same place, just a different, more convenient form factor that doesn't need RAID?
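On the bonus question, a back-of-the-envelope comparison helps frame it. One key point: the SATA controller hangs off the chipset (over DMI), not the CPU's 28 PCIe lanes, so a SATA RAID doesn't compete with the GPUs for lanes. The ceiling figures below are approximations, not measurements:

```python
# Back-of-the-envelope storage ceilings (MB/s); illustrative, not measured.
SATA3_PORT = 550   # ~6 Gb/s per port after 8b/10b encoding overhead
PCIE3_LANE = 985   # ~8 GT/s per lane after 128b/130b encoding overhead

raid0_4x_sata = 4 * SATA3_PORT   # also bounded by the chipset's DMI uplink
nvme_m2_x4    = 4 * PCIE3_LANE   # e.g. a 950 Pro on a PCIe 3.0 x4 link

print(f"4x SATA SSD RAID 0 ceiling: ~{raid0_4x_sata} MB/s")
print(f"PCIe 3.0 x4 NVMe ceiling:  ~{nvme_m2_x4} MB/s")
```

So a single x4 NVMe drive has headroom well past a 4-drive SATA stripe, without the RAID 0 failure risk.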

Please post suggestions and scores for testing. I am planning on testing with Unigine and any other suggestions here.

The PC is not yet built, so please give it a few days; I'm awaiting backordered fans. But I wanted to start gathering preliminary information and knowledge from people with similar setups.

Cheers!
 
There's no point to the testing, as you don't have anything to reference and compare against, especially when you will be using up all your lanes. If you had 40 lanes you could test x16 widths vs. x8 widths, but you can't do that on a 28-lane CPU. Also, when you start using the M.2 slot you start eating into those PCIe lanes.

Moving your SSDs to PCIe is also a waste, as it eats up PCIe lanes and slots (lotta slots), and the SSDs themselves won't be any faster. If you want serious speed, you should start with a high-end M.2 SSD like a 950 Pro or higher; that will get you double the speed of your current SSD RAID.
 

I can test a reference? All I have to do is shut off the PCIe lanes via the motherboard DIP switches and only run SLI (unless that doesn't free up lanes). That should give me 16x/8x (not sure if I can force 16x/16x by shutting off all other lanes). Or, worst case scenario, test only a single 1080 GTX at 16x. But that's why I posted here; ideally someone with a 40-lane CPU and a similar setup can also chime in, so we can compare. And the M.2 eats up the PCIe x4 slot, which I believe is the X99 one. It seems the four large PCIe slots come off the CPU, and the little ones off the chipset.


Bonus: Thanks, that's what I thought. Honestly, I don't see too much benefit beyond roughly 1 GB/s read and write. But good to know that if I ever want to upgrade without spending a fortune, dropping in a 40-lane CPU and a 950 Pro M.2 should help a little.

Thanks
 
Wooo, awesome Torry, keep me posted. And if you have any questions let me know, as I assembled the Rampage already. I did have to buy a USB header expansion, so you may want to look at that if you have a bunch of Corsair Link etc. stuff. The board has two headers, but one gets taken by the SupremeFX audio module, and that leaves one. Just food for thought.
 
Ok Torry, I am good to go, PC built, installing all software now. Let me know when you are ready! Thanks!
 
I can't figure out why my 2x 1080 GTXs are running at 2.0 instead of 3.0. My GPU-Z reports 16x at 2.0, and I'm sure that's hurting performance.
 
Mine are saying the exact same thing. Not sure if GPU-Z is reporting incorrectly or something is off. Going to troubleshoot tomorrow; keep me posted if you find a solution.

Mine are all PCIe 2.0 x8.
 

Attachments

  • capture_001_16122016_203504.png (52.9 KB)
You guys didn't set it to 2.0 in the BIOS maybe? If not, reinstall drivers or do the reg edit?
 
If I recall somewhere that I read, it's a 1% difference.

Anyways, I figured it out. Like thesmokingman said, I needed to go into the BIOS and change it manually. The PCIe speed was set to Auto, so I set both cards to Gen 3, and GPU-Z now reports 3.0 for both.

So here's what you need to do:

1. Enter the BIOS
2. Click on the Advanced tab
3. Click on Advanced System Agent Configuration
4. Click on NB PCIe Configuration
5. Change each setting from Auto to Gen 3
6. Ta-da, now we have 3.0!!

Let me know how that works for you.
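As a cross-check on GPU-Z, nvidia-smi can report the same link state from the command line. The `--query-gpu` fields below are standard nvidia-smi options; the small parsing helper around them is my own sketch:

```python
# Cross-check the negotiated PCIe link from the command line with
# nvidia-smi, instead of (or in addition to) GPU-Z.
import subprocess

QUERY = ["nvidia-smi",
         "--query-gpu=name,pcie.link.gen.current,pcie.link.width.current",
         "--format=csv,noheader"]

def parse_links(csv_text):
    """Turn nvidia-smi CSV output into (name, gen, width) tuples."""
    links = []
    for line in csv_text.strip().splitlines():
        name, gen, width = (field.strip() for field in line.split(","))
        links.append((name, int(gen), int(width)))
    return links

if __name__ == "__main__":
    try:
        out = subprocess.run(QUERY, capture_output=True, text=True).stdout
        for name, gen, width in parse_links(out):
            print(f"{name}: PCIe Gen{gen} x{width}")
    except FileNotFoundError:
        print("nvidia-smi not found; is the Nvidia driver installed?")
```

Note that cards drop to a lower link gen at idle to save power, so run this while the GPU is under load to see the real negotiated speed.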


Edit: Testing The Witcher 3 at 2.0, I was getting 59 fps at 4K at the same spot of my game save. After loading up that same save spot at Gen 3, my fps jumped to 71, all by switching from 2.0 to 3.0... Amazing!
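For what it's worth, that reported jump works out to roughly a 20% gain:

```python
# The fps jump reported above, as a percentage.
fps_gen2, fps_gen3 = 59, 71
gain = (fps_gen3 - fps_gen2) / fps_gen2 * 100
print(f"Gen 2 -> Gen 3: {fps_gen2} -> {fps_gen3} fps (+{gain:.1f}%)")  # +20.3%
```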
 
I can't believe there are PhysX believers even after all these years. It was a clear scam from the very start. Dedicating GPUs to it is mind-boggling.
 
You guys didn't set it to 2.0 in the BIOS maybe? If not, reinstall drivers or do the reg edit?
No, I had it on Auto; will review Torry's post.
8x PCIE lanes shouldn't affect performance much I wouldn't think.
If it's Gen 3, then it shouldn't, because Gen 3 x8 roughly equals Gen 2 x16. But x8 at Gen 2 on a 1080 GTX, I am sure, affects performance.
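That equivalence is easy to sanity-check with approximate per-lane throughput figures (Gen 1/2 use 8b/10b encoding, Gen 3 uses 128b/130b):

```python
# Approximate usable throughput per PCIe lane, in GB/s, after encoding
# overhead (Gen 1/2: 8b/10b; Gen 3: 128b/130b).
PER_LANE_GBPS = {1: 0.25, 2: 0.5, 3: 0.985}

def link_bw(gen, width):
    """Total link throughput in GB/s for a given generation and width."""
    return PER_LANE_GBPS[gen] * width

print(f"Gen 3 x8 : {link_bw(3, 8):.2f} GB/s")   # roughly matches Gen 2 x16
print(f"Gen 2 x16: {link_bw(2, 16):.2f} GB/s")
print(f"Gen 2 x8 : {link_bw(2, 8):.2f} GB/s")   # about half of either
```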
Cool, I am still installing software, but will do. Most likely tomorrow or so, I can start testing.

I can't believe there are PhysX believers even after all these years. It was a clear scam from the very start. Dedicating GPUs to it is mind-boggling.
Def not worth it, but I had 970 GTX tri-SLI before, and I decided, for ease of use, for that rig to become another PC as just SLI. The third card really didn't make much of a difference. So I kept one 970 GTX; since I own it, I might as well do something with it.

I use this many cards because I use 6+ monitors. Now cards can support 4 monitors per card, so it's not as bad, but I like having input variability.

Hopefully that makes sense.

Thanks!
 
Yup, cards now reporting Gen3. Not sure why Asus in their infinite wisdom defaulted to Gen2, but good to go now :).
 
PS: Minor update, but I am going to have to deal with this later. So I have:

PCIEX16/X8_1 : 1080GTX
PCIEX8_2 : 1080 GTX
PCIE4_1 : Blank
PCIEX16/X8_3 : 970 GTX
PCIEX1_1 : Blank
PCIEX8_4 : Asus Xonar Xense (Headphone)

PCIEX16/X8_1: reports Gen 3 x16 @ Gen 3 x8
PCIEX8_2: reports Gen 3 x16 @ Gen 1.1 x8 (problem)
PCIEX16/X8_3: reports Gen 2 x16 @ Gen 1.1 x8

Haven't checked the sound card link speed yet.

I set the top two PCIe slots to Gen 3 and the bottom two to Gen 2, hoping that would leave bandwidth, but that did not work. So I'll have to play with it. Possibly move one of the 1080s to the 970 GTX's location; I hope not, as my SLI HB bridge is a 2-slot size, but we'll see. When I have time to troubleshoot I'll be back. :) In theory the top two slots should be able to do Gen 3 x8 each and the others Gen 2 x8 no problem, but I assume lanes aren't divided that granularly by bandwidth; they're divided by link, meaning it doesn't matter if it's Gen 2 or Gen 3. So things may get messy, or I'll have to pull something out of the computer, heh! I am going to play with the BIOS and see if I can limit links. We'll figure it out. I'll be back!

Final PS: haha, since I keep thinking about it, I have three options. One: contact Asus and see if it's possible to control link speed per PCIe slot, or suggest that as a feature.

Two: tape the 970 GTX down to a x4 device, so the lanes won't try to keep it at x8. The sound card is a x2 device and shows as not detected in the BIOS, so I am assuming anything below x8 doesn't get a full lane allocation. My concern is that if I cut the 970 GTX's lanes in half, it may not get enough bandwidth to run well, or it may cause issues.

Three, and the most likely: pull the 970 GTX out of the system. The PhysX advantage is most likely minimal, and currently I have all my 6 (sometimes 8) monitors plugged into the 1080s anyhow, with no foreseeable need to plug in more. So technically I don't need the 970 GTX here; it will go back to its old system and become tri-SLI again.

Suggestions welcome. The third option is the most sensible, but it would be nice if Asus gave us more control.

And moral of the story: don't buy a 6800K, kids, hahaha, if you plan on having a lot of stuff connected ;). Any takers for a brand new 6800K, $360 shipped? :)
 
I use this many cards because I use 6+ monitors. Now cards can support 4 monitors per card, so it's not as bad, but I like having input variability.

Hopefully that makes sense.

Thanks!

I never liked SLI, though it made sense back in the 3dfx days. But yeah, your use case makes some sense.

As for what to do with excess cards, why not give them away to people who can't afford them? Just a thought. Cheers :)
 
I've never been a huge fan of SLI or Crossfire either. Both have given me a lot of headaches, but I've been using them since close to launch. The dollar value isn't there, but to push a few more frames I use it anyhow. It had its high point in the beginning, and low points with the frame-pacing issues. Now it's in the middle: not great, but workable, and it doesn't crash as much, so I often just SLI, especially since I've been pushing 3+ monitors for over a decade. I had hoped Mini DisplayPort, and the Radeon that had 6 ports, would become the de facto standard, but alas, lol.

As far as donating, I actually do. I've never really sold anything, though it goes to family first. I used to give it to my dad and my mom, which was why they (typical email, Facebook, maybe Word/Excel/PowerPoint users) would have Crossfire gaming machines as their computers. People were always like, wtf? lol. My dad even had a Pentium 4 EE for a while, lol. And their old machines would go to friends, family, or charity services. Reason being, I am thankful, because there was a time when I couldn't afford to give my parents computers, and we got one from my school. So I definitely believe in paying it forward. Though the cycle was a bit broken recently, because I gave my parents laptops, since that's what they wanted; they didn't want bulky gaming PCs anymore. So that's why, this time, the 970 GTXs became my parents' living-room PC, which at the moment just plays Spotify on a 4670K and 3x 970s.

I do need to clean up and donate some stuff eventually, though I keep some things just for emergencies, like my 6950 Crossfire pair in case something dies, though I doubt I'll ever use them. I also found my old 3870X2, which was in my dad's PC for a while too, heh! But thought noted, and I'll keep it in mind.


As far as updates:

Ok guys, so after analyzing and studying the architecture, it really seems each slot is married to a PCIe link, so it doesn't matter what the card actually uses. It also seems the last slot and the x4/x1 slots are wired to the X99 chipset, whereas the other four main PCIe slots are wired to the CPU. Which means that if I keep the 970 GTX, all slots are forced to x8; if I remove it, I get x16/x8, plus much better breathing room.

So I decided to pull the 970 GTX out, which I will do once my new 4-slot HB bridge gets here. Because, unlike in the past, these slots seem married to the lanes, if I set SLI only, the third main PCIe slot becomes the SLI-compatible one, in x16/x8 mode. But due to my CPU, I'll have x16 on slot 1 and x8 on slot 3. Seems like the best compromise for now.

Then we test.

Thanks!
 
Cool. One question: did you notice the lag with SLI? It ought to be worse than vsync. Am I wrong?
 
Ok guys, so after analyzing and studying the architecture, it really seems each slot is married to a PCIe link.

Married isn't quite the word I would use. You should have consulted a block diagram beforehand. And obviously, working with a 28-lane CPU is going to be a headache when the block diagrams are written for 40-lane CPUs.

[Attached: X99 chipset block diagram images]
 
There's no point to the testing, as you don't have anything to reference and compare against, especially when you will be using up all your lanes. If you had 40 lanes you could test x16 widths vs. x8 widths, but you can't do that on a 28-lane CPU. Also, when you start using the M.2 slot you start eating into those PCIe lanes.

Moving your SSDs to PCIe is also a waste, as it eats up PCIe lanes and slots (lotta slots), and the SSDs themselves won't be any faster. If you want serious speed, you should start with a high-end M.2 SSD like a 950 Pro or higher; that will get you double the speed of your current SSD RAID.

Not if he uses PCIe on the PCH.
 
8x PCIE lanes shouldn't affect performance much I wouldn't think.
Depends on the game, and they never really tested SLI at x8; it was only single cards. As far as I know, TechPowerUp is the only reliable place that tests these things.
 
Cool. One question: did you notice the lag with SLI? It ought to be worse than vsync. Am I wrong?
Not recently. With my 970 GTXs I didn't have any major lag problems. I play hitscan characters, recently in Overwatch, and was able to hit the boxes pretty well. I don't think it's as bad. For my next upgrade I plan on 144 Hz+ monitors, so I'll let you know. I'll also try playing with just one 1080 GTX, then with SLI, and report back. Results will be holistic, and my current 4K monitor is only 60 Hz, though.

Married isn't quite the word I would use. You should have consulted a block diagram beforehand. And obviously, working with a 28-lane CPU is going to be a headache when the block diagrams are written for 40-lane CPUs.
Yeah, I knew this would be the case in theory, but I didn't see these charts when I was researching. Good to know the community has these resources available; I know what to look for next time. Thanks!

Not if he uses PCIe on the PCH.
Yeah, that last slot I believe is safe, even on the 28-lane CPU.

Depends on the game, and they never really tested SLI at x8; it was only single cards. As far as I know, TechPowerUp is the only reliable place that tests these things.
Yeah, and TechPowerUp said it shouldn't be a huge difference. So I'll find out first hand, heheh!

My 4-slot HB SLI bridge gets here on the 22nd.

Keep you posted guys, thanks for the feedback!
 
Not recently. With my 970 GTXs I didn't have any major lag problems. I play hitscan characters, recently in Overwatch, and was able to hit the boxes pretty well. I don't think it's as bad. For my next upgrade I plan on 144 Hz+ monitors, so I'll let you know. I'll also try playing with just one 1080 GTX, then with SLI, and report back. Results will be holistic, and my current 4K monitor is only 60 Hz, though.

You're using all the right words in the right way, brother. You should consider contributing to [H] in an official way. Looking forward to your findings.

Old school Quake/UT guy?
 
Yup, computer engineer. I actually had my own IT hardware site, and I worked in the industry, helping create computer cases, mice, keyboards, etc. I may bring it back if I have a chance.

And yup, loved me some Rocket Arena 3 back in the day. And Quake 2, though in those days I mostly played Quake III / RA3. ProMode by Vo0 was my favorite / most inspirational video, lol. Man, you had to capture BMP sequences to record video, lol; now we just have ShadowPlay. For UT I mostly played UT2004, still my favorite UT, though 2003 and the original were close. UT3 was fun but, like Quake 4, just felt off; the tracking wasn't there. Played some CS 1.6 and some Source, but was never huge into it. Recently I just play some Overwatch. It's different, but it's fun. I enjoy it.


Ok guys, I got my SLI-HB bridge. The first slot is now at x16 3.0, so we are set. However, the second PCIe slot (fourth physical slot) says the link is x8, but at Gen 1.1. I am not sure if it's because the card is downclocking and I don't have SLI enabled yet, but I will investigate. Finishing installing software; I'll do the rest tomorrow.

Thanks!
 
1.6....gtfo...Beta 5.2 :p
 
Hey hey, I played before too; 1.6 was just the final version that stuck, haha. Steam didn't even exist back then, and when it did, it was the buggiest piece of crap I had ever seen, lol.

TorryHolt81, do you have 3DMark, 3DMark Vantage, 3DMark 11, Unigine, etc.? Let's pick a few applications and/or games and start benching; my computer is basically ready. I installed my 4-slot bridge.

Only oddity, if anyone can chime in: I see the PCIe link as v1.1, and I'm not sure why. I haven't tried SLI yet; I will do so hopefully later today.

Happy Holidays everyone!
 
I love overwatch, so fun!
Yeah, def one of my favorite games of 2016, if not my favorite. Love me some Pharah :). Brings back the Quake days.

Anyhow, messaging TorryHolt81 so we can do the tests.


Open Question:

Does anyone know if it's normal for the second card to idle at PCIe Gen 1.1? I am going to set the computer into SLI and see if that changes, but I'm curious whether this is normal when all displays are active, or whether I have to dig into the BIOS. I removed the 970 GTX, so there's no reason why it should be Gen 1.1. Thoughts?

Thanks!

Video:
http://screencast.com/t/7rl1nxMy
 


Linus already tested the difference between 28 lanes and 40. No appreciable difference.

I have a 5820k w/3x 980Ti. All 3 cards are 8x. The problem I have is more heat than anything.

I'm hoping 2 1080 Tis will be adequate for new games at high settings @ 4K. Unfortunately, SLI support is dying anyway.
 



Right, we know the tests, but there's some variance between sites, and also the 1080 GTX is almost twice as fast as the 980, so it has to be tested to see whether it saturates the lanes or not, especially at 4K and above.

So far the 1080 GTX can handle basically any game, not all at 144 Hz etc., but the quality of experience thus far has been good.

Update: Guys, waiting on Torry as he had to RMA his CPU. I'll keep you posted when we are moving again.

Happy New Year!
 