When SB drops, what motherboard to get for 2600k?

arachn1d

I'm seeing a few motherboard threads about SB already.

Are there going to be a lot of motherboards coming out or just a few?

What is the recommended motherboard to pick for SB (gaming centric)?
 
I'd like to get the one with the Nvidia add-on chip to enable full x16 bandwidth for two cards, which would mean a Gigabyte UD7 for future-proofing. I don't have the same faith as other people that x8 will be good enough for an SLI setup a few years into the future, which is as often as I upgrade. Really I should just wait for LGA 2011, but that remains to be seen. Most will get the Gigabyte UD4 for the basic SLI capability (x16/x8)... I don't blame them, since there's a $150 price difference.
 
I'm probably going to get a 2x vid card setup - either GTX 570s or 69x0s.

Should I be worried about this 16x/8x thing?
 
will the Gigabyte GA-P67A-UD4 support both crossfire and sli? their website states:

Support for ATI CrossFireX™/NVIDIA SLI technology
* The PCIEX16 slot operates at up to x8 mode when ATI CrossFireX™ is enabled.

...so I assume it supports both, but I'm pretty wet behind the ears when it comes to this stuff.

Edit: read further on in a review I had open - looks like it does indeed support both.
 

The chip you're referring to is the NF200. A word of warning: it doesn't magically create more bandwidth. It's more of a regulator of the available bandwidth on the PCIe lanes. To put it simply, the NF200 can boost performance, but when something goes wrong it can cause more problems as well.

In short, if you're looking to do a heavy XFire/SLI build, you're better off waiting for LGA 2011 if possible.
 

Thanks for the link. I have been trying to find P67A-UD4 info because I want a solid midrange board.

The end of the review bugged me - he complained that all four memory slots are the same color, so people who don't read the manual won't be sure how to arrange their memory. Come on, that's stupid. Having all the slots black matches the black PCB and expansion slots anyway.
 

Great post, agree on all counts... Also, x16/x8 isn't a problem anyway for anyone who does want to go SLI on these boards.
 

I agree. If you aren't savvy enough to check which memory slots you should be using, you probably shouldn't be putting a computer together in the first place. All-black memory slots are a selling point based on aesthetics, not an oversight or disadvantage.
 

No, not really. Very, very small performance difference between x16x16 and x8x8.

I'm hearing three different things on this.

I'm probably going 2x 6950's...

I read that [H] article - the performance difference wasn't noticeable in-game, but it was slightly visible in the benchmark numbers.

I don't want to bite myself in the ass though. Would I be better off just buying what's out there now?
 

The answer:

http://www.techpowerup.com/reviews/NVIDIA/GTX_480_PCI-Express_Scaling/

:)

Loss of 2% going to PCI-E 2.0 x8 vs. x16, with a GTX 480. The analysis on how this affects your actual gameplay is left as an exercise for the reader. ;) Note that SLI/CF will impose a small extra loss of ~1% with both cards in this mode.
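To put that 2-3% in perspective, here's a quick back-of-the-envelope script (the 60 fps baseline is just an illustrative number, not taken from the review):

```python
# Rough arithmetic: what a 2-3% PCIe scaling loss looks like in frame rates.
base_fps = 60.0  # hypothetical baseline frame rate at x16/x16

for loss in (0.02, 0.03):
    reduced = base_fps * (1 - loss)
    print(f"{loss:.0%} loss: {base_fps:.1f} fps -> {reduced:.1f} fps")
```

A drop from 60 to roughly 58 fps is well below what most people can actually perceive in gameplay.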
 
So I'm looking at a 3% FPS difference in penalty if I go with Sandy Bridge in January over i7 motherboards now?
 

I can't answer your question but perhaps some of that 3% would be offset by an x% increase due to the higher IPC of the new CPUs?

Just thinking out loud....
 

I'd say this is probably accurate. You'll lose potentially 3% over a hypothetical full x16 Sandy Bridge solution. How that compares to an existing i7 with full x16 is yet to be determined. That being said, doesn't the Gigabyte UD7 (and others from other manufacturers) offer full x16x16 SLI? You have to pay for it, but it is available.
 
I guess it would only be the case if the game in question was CPU limited rather than (or as well as) GPU/bus bandwidth limited...

I've been out of the loop so long I could just be talking rubbish though :p
 
Oh for some reason I thought 16x/16x wasn't available. I guess a lot of people just don't want to pay the premium for 16x/16x?

What's the price difference? I am guessing over $100. Don't know the model numbers offhand otherwise I'd use google...
 
Yeah, I think I recall someone saying the UD7 was something like $340 vs. $240 for the UD5 and $200 for the UD4 (I could be wrong, but those are the ballpark figures I recall seeing).
 

That's about the right spread based on the current pricing - they might all drop in price after the release date, but the spread will probably be the same. Pretty hefty price premium for the small gain of full x16 (depending how much you value the other UD7 extras, of course).
 
For those who worry about future-proofing, you're better off waiting for LGA 2011 (PCIe 3.0).
 

Are we anywhere close to saturating a x16 2.0 link (8 GB/sec)? I don't think there is any need to wait for 2011 for PCIe bandwidth.
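For reference, that 8 GB/sec figure comes straight from the lane math: PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, so each lane carries 500 MB/s of payload per direction. A quick sketch using spec figures only (nothing board-specific):

```python
# Per-lane payload: GT/s * (8 data bits per 10 line bits) / (8 bits per byte).
GTS = {"PCIe 1.x": 2.5, "PCIe 2.0": 5.0}  # gigatransfers/s per lane

for gen, gt in GTS.items():
    per_lane_gb = gt * (8 / 10) / 8  # GB/s of payload per lane, per direction
    for lanes in (8, 16):
        print(f"{gen} x{lanes}: {per_lane_gb * lanes:.0f} GB/s")
```

That puts x16 2.0 at 8 GB/s per direction, and even an x8 slot still gives 4 GB/s each way - the same as x16 on the PCIe 1.x boards most people are upgrading from.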
 

Only if you're using two GPUs, and only if they are very good ones. A 2% performance drop due to bandwidth on a GTX 480 probably means no performance hit at all on a lesser card that doesn't push as much data over PCIe.
 
Apparently 2% difference if that.


And that's only in Crossfire/SLI - I don't think anyone has shown a real-world impact on a single card, unless I missed it somewhere.
 
My argument is whether x8 will be enough 2-3 years down the road, when cards are much more powerful; how it does with a 480 now doesn't concern me. I don't think x16 will be seriously compromised, though, so the need for PCIe 3.0, while nice to have, isn't there yet. My plan is to just get the UD7, and that should be fine as long as everything else is in place - the USB 3.0 and SATA 3 are good to go, and having them native vs. added on is a non-issue, so I don't think I will wait. Plus I can get $500 for my current rig right now, so that definitely helps. And little extras like 24-phase power, front USB 3.0 support, extra USB 3.0 ports in the back, extra SATA ports, etc. could actually come in handy too.
 
Meh, yeah, I guess I'll just end up buying the $300 motherboard and having 2x 16x lanes. Whatever, might as well have top performance right? =p
 
So to be clear the scale is UD3-UD7 and UD7 is the top end bracket, right?

What are YOU guys getting?
 

Correct. I will go with the UD4. It has 12-phase power and is the lowest-end board to offer SLI, which I don't use but would like the option for in the future.
 
What's the difference between the UD3, UD3R, and UD3P? Why is everyone getting the UD4?
 

Big reason is CrossFire/SLI support.

Apparently, the UD3 supports x16/x4 CrossFire.

The UD4 supports x8/x8 CrossFire or x8/x8 SLI.
 
From what I can see from the pictures:

UD3 -> UD3R: larger, matte black PCB with a more spaced-out layout, additional heatsink, more fan/USB headers
UD3R -> UD3P: beefier heatsink/heatpipe, USB 3.0 front-panel header, more power phases
UD3P -> UD4: right-angle SATA, SLI support, and eSATA

I'm getting the UD4 for the right-angle SATA and eSATA ports.
 
lol @ this thread.

The % increase from running x16/x16 instead of x16/x8 with an NF200 is offset by its additional latency. Most NF200 boards are SLOWER than non-NF200 boards. The only reason to ever get a board with an NF200 chip is if you must run Quad SLI/QuadFire. Otherwise it is a total waste.
 
So basically, "don't worry about 16x/8x?"
 