8/8 vs 16/16 with 290 Crossfire

MorgothPl (2[H]4U, joined Oct 13, 2008, 3,020 messages)
Question about PCIe 3.0 Crossfire. I caved in and I'm getting a new system, and since I'm going for that nice 34" 21:9 LG, I reckon this needs something more than my single 290, so I'm getting its sibling.

Now the question about PCIe lanes: will 8x/8x be enough for 290 Crossfire and a 4790K, or should I invest in a 16x/16x PLX mobo? As I remember, at PCIe 2.0 there was little to no difference between 8x and 16x setups. Is it the same now?

The question matters because the MSI MPower AC is about 800 PLN ($270), while the PLX 16x/16x XPower is at 1400 PLN ($470), so it's a substantial difference :)
 
I would say no, unless you benchmark and submit to HWBOT, where the 16x lanes would help your final numbers. But that is a synthetic benchmark. You won't notice any difference gaming on dual 8x vs dual 16x, as the cards can't saturate 8x PCIe lanes to begin with.
 

I was facing the same choice as you when I went from an 8x/8x PCI-E 2.0 setup.

I eventually went with the MSI Z97 Gaming 7, which runs 8x/8x PCI-E 3.0 with two 290Xs in Crossfire (4790K @ 4.7 GHz).

No major bottlenecks that I've seen.

I also reused my original 1600 MHz DDR3 RAM from my 2600K system, if you're looking for some savings.

My resolution is 7680x1440 (three 32" Acer 1440p panels).
 
From what I've heard, an x8 link on PCIe 3.0 has about the same bandwidth as an x16 link on PCIe 2.0. I doubt an x8 PCIe 3.0 link would hold back a 290X at all in 99% of applications.
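
If anyone wants to check that arithmetic, here's a quick back-of-the-envelope sketch (Python, using just the published PCI-SIG per-lane transfer rates and line encodings, nothing specific to these boards):

```python
# Quick check of the "PCIe 3.0 x8 ~= PCIe 2.0 x16" claim using the
# published PCI-SIG per-lane transfer rates and line encodings.

GEN2 = {"gt_per_s": 5.0, "encoding": 8 / 10}     # 8b/10b line code
GEN3 = {"gt_per_s": 8.0, "encoding": 128 / 130}  # 128b/130b line code

def link_bandwidth_gb_s(gen, lanes):
    """Usable bandwidth per direction, in GB/s, for a PCIe link."""
    gbit_per_lane = gen["gt_per_s"] * gen["encoding"]  # usable Gbit/s per lane
    return gbit_per_lane / 8 * lanes                   # bits -> bytes, whole link

print(f"PCIe 2.0 x16: {link_bandwidth_gb_s(GEN2, 16):.2f} GB/s")  # 8.00
print(f"PCIe 3.0 x8 : {link_bandwidth_gb_s(GEN3, 8):.2f} GB/s")   # 7.88
print(f"PCIe 3.0 x16: {link_bandwidth_gb_s(GEN3, 16):.2f} GB/s")  # 15.75
```

So a 3.0 x8 link really is within about 2% of a 2.0 x16 link; the small gap comes from the more efficient 128b/130b encoding not quite doubling the raw 8 GT/s rate.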
 
Beautiful setup, lgg :) And thanks for the good advice. I'll get the MPower then; I was considering the Gaming 7, but it has a Killer NIC, not an Intel one.

And [H] could once again test 16x/16x vs 8x/8x PCIe, just to debunk the myths and let us save some hard-earned dollars :)
 

To be honest, I haven't had any issues with the Killer NIC once I stopped its useless application from launching (the app installs with the driver).

I've transferred 600 GB back and forth between my HDD and my home server over Cat 6 with no issues.
 
@MorgothPl, you are welcome......

Not sure, as mine was 100% rock stable with zero complaints. It seems more and more companies are using this setup now. Some people will stick with Intel NICs no matter what; just a preference, I guess. Both are great in my opinion, though.
 
What do people have against Killer NICs? I thought they were higher end?
 
A lot of people still regard the Killer NIC as a marketing gimmick. I had one of the add-in cards years ago and it was fine, but I don't believe there is any significant advantage.
 
No difference. I opted for 16x/8x on my Z97-WS so I could keep my two cards as far apart as possible. Hell, the 290s ran just as well on my Z68 board on 2.0 8x lanes.
 
My mobo has a Killer NIC in it. To be honest, it's about the same as the Intel NIC in my old P67 board. For standard desktop/gaming use you'll be hard pressed to tell a difference. The drivers were a problem with Killer NICs in the past, but that's worked out now.

I really don't understand the PLX chip. On an Intel CPU there are 16 PCIe 3.0 lanes for the GPUs, so what's the point of the PLX chip? If you have two GPUs running at 16x on a PLX chip, they still only connect to the CPU at 16x total (the equivalent of 8x/8x). Am I missing something?
 

There are lots of devices that want access to the PCIe lanes, so the PLX chip allows for more flexibility by switching bandwidth between slots. It also makes things a bit easier for configuration purposes.
 

But the 16x PCIe 3.0 lanes from the CPU are for the GPUs; those lanes aren't shared with other devices.

http://www.intel.com/content/www/us/en/chipsets/performance-chipsets/z97-chipset-diagram.html

So no matter how the PLX chip slices it, it only talks back to the CPU at 16x PCIe 3.0.
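
Here's a rough sketch of that arithmetic (Python; the 15.75 GB/s figure is the usable bandwidth of an x16 Gen3 link from the earlier post, and it assumes both GPUs hammer the uplink at the same time):

```python
# Rough sketch of the point above: a PLX switch gives each GPU an
# electrical x16 slot, but both slots share one x16 uplink to the CPU,
# so CPU-bound bandwidth under full load is no better than plain 8x/8x.

PCIE3_X16_GB_S = 15.75  # usable one-direction bandwidth of an x16 Gen3 link

def per_gpu_cpu_bandwidth(active_gpus):
    """CPU-bound bandwidth each GPU sees through the shared x16 uplink."""
    return PCIE3_X16_GB_S / active_gpus

print(per_gpu_cpu_bandwidth(1))  # 15.75 - one busy GPU can still burst at full x16
print(per_gpu_cpu_bandwidth(2))  # 7.875 - both busy: same as a plain 8x/8x board
```

As I understand it, the real wins are that a single GPU can still burst at a full x16 when the other is idle, and that GPU-to-GPU (peer-to-peer) transfers can be routed inside the switch without ever crossing the shared uplink.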
 

I chose the ASUS Z97-WS mobo as well for my two Sapphire Tri-X OC 290X Crossfire setup. I wanted to make sure I had lots of space between those cards for better cooling.
 

OK, I was basing my question on MorgothPl's comment earlier about the Gaming 7 having a Killer NIC instead of an Intel one... I don't see why someone wouldn't want a board just because it has a Killer NIC, lol.

 