Intel CPU / PCIe Lanes / 4-way SLI

clayton006

Okay, so this topic straddles Nvidia video cards and Intel CPUs, but it probably belongs here since I'm deciding between CPUs.

So I've got the two systems in my sig (updating the mobile computer sig line; it's now a 4770K and an Asus Maximus Extreme Z87 with 2400MHz DDR3). And I'm going to go 3-way / 4-way SLI as I have in the past with my 670s. That GPU power worked well in my 3930K.

My 3930K board already uses PLX chips to add extra PCIe 3.0 lanes. I know the 4770K system needs PLX chips just to accomplish 4-way SLI at all.

Are there any comparisons / reviews of how much these PLX chips hurt performance in a 4-way system? How much slower will my 4770K system (O/C to 4.4GHz, may go higher) perform with 4-way SLI than my 3930K (O/C safe at 4.7GHz) with 4-way SLI (let's assume I still had all four of my original 670s)?

My ultimate goal is to use either system for 4K gaming, whether at home on my big water-cooled system or on my portable one. I would just pull the cards from one system and put them in the other when I go out on the road. I don't plan on buying 8 video cards ;-)
 
Dutchman
All I know is that at 4K those had better be 4GB cards. Four cards trying to get by on 2GB of 256-bit memory at 4K resolution will probably run like crap in most games: good frame rates, but stuttering like crazy from the VRAM hit.

Edit: my bad, I saw the 4GB in your sig. I doubt there would be much difference in performance between the two setups at 4K resolution; the 3930K may push them a little harder. At 1440p it might make a much bigger difference, but the 4770K will hold playability just as well at that, or any, resolution. Running 120Hz at 1080p is the only time you really need that 3930K juice for gaming. If you have the parts, it would only take you about half an hour of running a couple of benchmarks to determine the difference.
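A rough sketch of why resolution bites into VRAM (the bytes-per-pixel and buffer count here are illustrative assumptions, not measured figures, and remember that in AFR SLI every card holds its own full copy, so four 2GB cards still only give you 2GB of usable VRAM):

```python
# Rough, illustrative VRAM math: resolution alone doesn't tell the whole
# story (textures usually dominate), but render targets scale with pixels.
def rt_bytes(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate render-target footprint: color + depth + one extra target.
    The buffer count and 4 bytes/pixel are assumptions for illustration."""
    return width * height * bytes_per_pixel * buffers

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    mb = rt_bytes(w, h) / 1024**2
    print(f"{name}: ~{mb:.0f} MB of render targets")
```

The point being that 4K roughly quadruples the render-target cost versus 1080p before textures even enter the picture.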
 

Priller

An X79 board has 40 PCIe lanes from the CPU, so it can easily run two cards at x16 and one card at x8 while only using a PLX chip for the remainder of the lanes. It depends on how your motherboard feeds 4 GPUs.

Haswell only has 16 PCIe lanes, meaning it can run only one card at x16, and anything after that requires a PLX chip. Some boards will run 2 GPUs in x8 mode (Gen 3 PCIe lanes) to avoid any latency from the PLX chip; I'm not sure all boards do this. Anything over those 16 lanes and Haswell has to go through a PLX chip.
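That lane math can be sketched like this (the greedy split policy is an assumption for illustration; real boards hard-wire their slot widths in firmware):

```python
# Illustrative sketch of splitting native CPU PCIe lanes across GPU slots.
# Assumed greedy policy: x16 where possible, x8 minimum per slot. Real
# boards fix their slot layouts, so treat this as a rough model only.
def split_lanes(total_lanes, num_gpus):
    """Per-GPU link widths using native lanes only; None => PLX required."""
    widths, remaining = [], total_lanes
    for i in range(num_gpus):
        slots_left = num_gpus - i
        if remaining - 16 >= 8 * (slots_left - 1):
            width = 16        # enough left to still feed the rest at x8
        elif remaining >= 8 * slots_left:
            width = 8
        else:
            return None       # native lanes exhausted: a PLX switch is needed
        widths.append(width)
        remaining -= width
    return widths

print("X79 (40 lanes), 3 GPUs:", split_lanes(40, 3))      # [16, 16, 8]
print("X79 (40 lanes), 4 GPUs:", split_lanes(40, 4))      # [16, 8, 8, 8]
print("Haswell (16 lanes), 2 GPUs:", split_lanes(16, 2))  # [8, 8]
print("Haswell (16 lanes), 4 GPUs:", split_lanes(16, 4))  # None -> PLX
```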

The X79 will perform better with that many GPUs, depending on the game engine, but given the choice I would dump 4-way SLI and go down to two stronger cards to avoid the PLX issue entirely.

Also, I thought most of the latest drivers from Nvidia won't allow 4-way SLI, and that's why a lot of the GPUs have 3-way SLI on the box and not 4-way. Not to mention scaling with 4 cards is horrible.
 

clayton006

My 4GB 670s scaled fine in 4-way SLI when running at 7680x1600. My board uses PLX chips for all slots. I didn't notice horrible gaming performance, but I was just curious about the overall behavior with Haswell and 3-way / 4-way setups. I am definitely doing 3 or 4 cards for the next gen.
 

clayton006

Priller said: "Also, I thought most of the latest drivers from Nvidia won't allow 4-way SLI, and that's why a lot of the GPUs have 3-way SLI on the box and not 4-way. Not to mention scaling with 4 cards is horrible."

I believe that is mainly an issue with GTX 780s (which have a workaround) and possibly 770s (but I'm not sure of that).
 

IdiotInCharge

There isn't a good Nvidia reference that I know of; only Ivy-E would be able to run those GPUs at PCIe 3.0, natively or through a PLX switch, and as far as I know that hasn't been explored by a benchmark shop yet. And even then, I wouldn't settle for cards with less than 6GB of VRAM for 4K, which means that today it's Titan or bust. Maybe you're okay with that.

But there is an AMD example, and it's been shown that AMD cards do indeed perform better: the HD 7970 refresh released as the R9 280X, when put in a two-card CrossFire array on a Haswell system with a PLX bridge, since those cards exchange data solely over the PCIe bus. So the switch can be an advantage, and there wasn't the 'latency' penalty for using it that has been seen in the past, at least not as much.

Either way, why not go Ivy-E? Games will use the extra cores/threads, and the GPUs could use the PCIe 3.0 lanes, PLX switch or not.
 

Vega

At 4K resolution, your 3930K / Extreme 11 system will outperform the 4770K system in 4-way SLI. Trust me.

You get PLX lag either way, though it's minimal with the new PLX bridge design. You get a full 32 native lanes split into quad-x16 on the E11, versus 16 native lanes split into quad-x8 on Z87. The IPC increase on the 4770K would barely make up the clock difference with the 3930K, and you're also losing two physical cores, which new games like BF4 will utilize much more.
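For raw numbers, the per-card difference between quad-x16 and quad-x8 works out roughly like this (a sketch; note that a PLX switch multiplies downstream links but can't add upstream bandwidth beyond the CPU's native lanes):

```python
# Approximate usable PCIe 3.0 bandwidth per direction: 8 GT/s per lane
# with 128b/130b encoding, i.e. ~0.985 GB/s per lane. Rough figures only.
GB_PER_LANE = 8 * 128 / 130 / 8   # ~0.985 GB/s per lane per direction

def link_bw(lanes):
    """Approximate one-direction bandwidth of a PCIe 3.0 link in GB/s."""
    return lanes * GB_PER_LANE

print(f"x16 link: ~{link_bw(16):.1f} GB/s per card (E11 quad-x16 via PLX)")
print(f"x8 link:  ~{link_bw(8):.1f} GB/s per card (Z87 quad-x8)")
# Caveat: behind a PLX switch the quad-x16 links still share the CPU's
# native lanes upstream; the win is mainly in card-to-card transfers.
```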
 
Dutchman
People make me laugh. Yup, totally worth selling off your 3930K for a 4930K because the split-second hack and restart is so much effort. :rolleyes: Just because Nvidia didn't enable native PCIe 3.0 support on the 3930K is not a reason to upgrade. They have a patch to activate it which is basically instant. AMD has native PCIe 3.0 on the 3930K.
 

IdiotInCharge

Dutchman said: "People make me laugh. Yup, totally worth selling off your 3930K for a 4930K because the split-second hack and restart is so much effort. :rolleyes: Just because Nvidia didn't enable native PCIe 3.0 support on the 3930K is not a reason to upgrade. They have a patch to activate it which is basically instant. AMD has native PCIe 3.0 on the 3930K."

A hack plus triple/quad SLI or Crossfire? This is a recommendation? Making this stuff work like it's supposed to is already a delicate balance. The price of a CPU upgrade is nothing when you're trying to keep that many GPUs stable, GPUs that cost 3x-8x as much as the CPU new. An upgrade might set you back, what, $100-$200?
 
Dutchman
It doesn't matter; it's not even really a hack. It was released by Nvidia. All it is is a file you run: a command window pops up for a split second, you restart, and PCIe 3.0 is running fine. No difference from a 4930K. I know; I've owned many Nvidia cards and just got rid of my X79 setup since I'm not gaming much anymore. It's not complicated whatsoever and takes no real technical knowledge, no hiccups or issues with it. So how is that worth $100-$200? If you're really [H] you'd like having to do the little tweak.
 

IdiotInCharge

Dutchman said: "It doesn't matter; it's not even really a hack. It was released by Nvidia. All it is is a file you run: a command window pops up for a split second, you restart, and PCIe 3.0 is running fine. No difference from a 4930K. I know; I've owned many Nvidia cards and just got rid of my X79 setup since I'm not gaming much anymore. It's not complicated whatsoever and takes no real technical knowledge, no hiccups or issues with it. So how is that worth $100-$200? If you're really [H] you'd like having to do the little tweak."

If it's a 'hack' or a 'tweak', that means that it could work today and not work tomorrow. Stuff goes bad. UEFI updates. Drivers. Doesn't matter.

And there's still both a performance jump and a spec jump by going Ivy-E, as long as you don't sacrifice clockspeed in the process; and that's a roll of the dice either way.
 
Dutchman
Perhaps I was wrong in my wording; I guess it's more of an unlock tool than a hack or tweak. Really it's only about a 100MHz difference to even out the performance, from the reviews I've seen. The only thing that entices me about Ivy-E (which is really a good reason) is the lower power usage. That would likely help with the VRM overheating/throttling problem that plagued most X79 boards, since everything is stuffed into such a tiny spot. If I were buying new, then obviously Ivy-E. But there's definitely no reason to upgrade.
 

IdiotInCharge

Dutchman said: "Perhaps I was wrong in my wording; I guess it's more of an unlock tool than a hack or tweak. Really it's only about a 100MHz difference to even out the performance, from the reviews I've seen. The only thing that entices me about Ivy-E (which is really a good reason) is the lower power usage. That would likely help with the VRM overheating/throttling problem that plagued most X79 boards, since everything is stuffed into such a tiny spot. If I were buying new, then obviously Ivy-E. But there's definitely no reason to upgrade."

That's about how I'm seeing it on the upgrade: it's not a great upgrade, but there aren't any real negatives either, aside from the overclocking lottery.

As for the 'tweak', it's something that Intel should have fixed; but they didn't, and there's no real guarantee that it'll keep working, though there's no real reason it shouldn't; except, again, that it shouldn't be needed and it introduces an unneeded variable into an already overly complex system. There's nothing to like about that :).
 

ccityinstaller

Dutchman said: "Perhaps I was wrong in my wording; I guess it's more of an unlock tool than a hack or tweak. Really it's only about a 100MHz difference to even out the performance, from the reviews I've seen. The only thing that entices me about Ivy-E (which is really a good reason) is the lower power usage. That would likely help with the VRM overheating/throttling problem that plagued most X79 boards, since everything is stuffed into such a tiny spot. If I were buying new, then obviously Ivy-E. But there's definitely no reason to upgrade."

While I think an upgrade to a very well-clocking 4930K is warranted in certain situations, this one is on the fence. Most of the time it's good for you guys on the fringe running 3-4 GPUs, but the issue I see is that you already have a solid-clocking 3930K. It's going to be tough to get a random 4930K up to 4.7GHz, let alone any faster. If you only reach ~4.4-4.5GHz, then you're left with only 90-95% of the CPU power you have now, albeit with less power usage. OTOH, you might get a golden chip and be looking at 4.6-4.8GHz, which would be equal to a 4.9-5.1GHz Sandy, depending on the application.
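That clock-versus-IPC tradeoff can be sanity-checked with some quick arithmetic (the ~5% IPC uplift for Ivy-E over Sandy-E is an assumed ballpark from typical reviews, not a number from this thread):

```python
# Quick clock-times-IPC comparison. The 5% Ivy-over-Sandy IPC uplift is
# an assumed ballpark figure, not a measured result.
def relative_perf(clock_ghz, ipc_factor=1.0):
    """Very rough per-core throughput proxy: clock times IPC factor."""
    return clock_ghz * ipc_factor

sandy    = relative_perf(4.7)          # current 3930K at 4.7 GHz
ivy_avg  = relative_perf(4.4, 1.05)    # a typical 4930K sample
ivy_gold = relative_perf(4.7, 1.05)    # a golden 4930K sample

print(f"typical 4930K vs the 3930K: {ivy_avg / sandy:.0%}")
print(f"golden 4930K vs the 3930K:  {ivy_gold / sandy:.0%}")
```

In other words, an average 4930K sample roughly breaks even with the existing 4.7GHz 3930K; only a golden chip pulls meaningfully ahead.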

The other thing to think about is the fact that Ivy has an improved IMC and will handle much faster memory speeds. I don't know how much 8 DIMMs will hold you back, but with new games moving to 64-bit native code we are going to be using more and more RAM, and speed might start making an even bigger difference. It was touched on in the BF4 beta by Brett on their system that was only using 8GB, and speed already makes a big difference in a lot of RTS games. As for RPG/FPS titles, I don't think it does much for *current gen* titles.

*IF* you are willing to buy a couple of 4930Ks and pick the best overclocker, and money isn't a big deal, then I would say go for it. Otherwise listen to Vega and just rock that 3930K insane-GPU goodness.

Oh, BTW OP, what Vcore are you using for that 4.7GHz O/C? If it's fairly low, you should be able to squeeze another ~100-300MHz out of her if you take it up to 1.47-1.5V, which many have been doing since the Sandy chips launched without an issue. You clearly have a high-end custom loop, so temperatures shouldn't be that big a deal.

Dutchman, I know you are *fairly* new here, and you weren't directing your response @ him, but be warned: Vega is one of the foremost authorities on all things [H]ard. That man spends more money on WC'ing fittings alone PER SYSTEM than most do on a car, my friend ;)
 
Dutchman
IdiotInCharge said: "That's about how I'm seeing it on the upgrade: it's not a great upgrade, but there aren't any real negatives either, aside from the overclocking lottery.

As for the 'tweak', it's something that Intel should have fixed; but they didn't, and there's no real guarantee that it'll keep working, though there's no real reason it shouldn't; except, again, that it shouldn't be needed and it introduces an unneeded variable into an already overly complex system. There's nothing to like about that :)."

Shouldn't blame Intel; it's gotta be Nvidia. AMD has native PCIe 3.0 on Sandy Bridge-E, and that right there tells you it's Nvidia. Why they decided to do it as an unlock tool instead of putting it in the drivers is beyond me. I suppose it could be something in the Windows registry from launch that Nvidia couldn't get around and AMD could. From what I hear, the unlock is just a small registry tweak.
 

clayton006

ccityinstaller said: "Dutchman, I know you are *fairly* new here, and you weren't directing your response @ him, but be warned: Vega is one of the foremost authorities on all things [H]ard. That man spends more money on WC'ing fittings alone PER SYSTEM than most do on a car, my friend ;)"

I modeled my current setup after one of Vega's builds. I trust his advice quite a bit. Thanks for chiming in! Been a while since we've chatted.

FYI - I don't need a hack to enable PCIe 3.0; my dual PLX bridges let the system run with 3.0 turned on with no issues. Other than the fan noise I love the board.

To the other question: my Vcore at 4.7GHz is 1.46 or 1.47 on water (one notch down from max LLC). My 4.8GHz clock normally takes 1.485 Vcore (same LLC). 5.0GHz is possible 24/7, but I think it takes max LLC and 1.5V (temps are still tolerable).

I guess where I was really going with this is... my main rig's 3930K I could swap out for a 4930K net-net because of a replacement plan I have. Not that I'll probably end up doing that (not entirely ethical, and I'm not sure I'd get a high-clocking 4930K). I upgraded my Ivy to a 4-way-capable Haswell system for when I travel and want to game. The idea is to remove the cards from my main rig when I leave and put them back when I'm done.

I was wondering how much performance I would lose by doing this? It won't be a permanent setup; I'm just curious.

Eventually I'll upgrade to Haswell-E, even though that will be a while. I can use the 8 cores for VMs and other stuff. So I'm not looking to dump my 3930K anytime soon (unless it were to degrade; then I think that's grounds to use my replacement plan).
 