MSI v. Gigabyte - MSI Calls Bluff on Gigabyte's Claims

entropy13 (Gawd) | Joined: Jul 16, 2009 | Messages: 922
http://www.techpowerup.com/151718/MSI-Calls-Bluff-on-Gigabyte-s-PCIe-Gen-3-Ready-Claim.html


In August, Gigabyte made a claim that baffled at least MSI: that scores of its motherboards are "Ready for Native PCIe Gen. 3." Along with the likes of ASRock, MSI was one of the first with motherboards featuring PCI-Express 3.0 slots, and the company took pains to educate buyers on what PCI-E 3.0 is and how to spot a motherboard that features it. MSI thinks Gigabyte made a factual blunder bordering on misinformation by claiming that as many as 40 of its motherboards are "Ready for Native PCIe Gen. 3," and has set its engineering and PR teams to building a technically sound presentation rebutting Gigabyte's claims.
 
It's not a lie; the simple fact is that once you put an Ivy Bridge CPU in a current board, the x16 slots will move from 2.0 to 3.0, as there's no physical difference between the PCB design for PCI Express 2.0 and 3.0.
Boards with old switches will end up with the top slot running at half speed (i.e. x8) in PCI Express 3.0; whenever the second slot is in use, it's limited to PCI Express 2.0.
Only boards with the new switches will support dual PCI Express 3.0 slots.
As such, boards with only a single x16 slot will transition from 2.0 to 3.0 once an Ivy Bridge CPU is used.
Also note that boards with nForce 200 chips won't run at PCI Express 3.0 speed if cards are fitted in slots connected to the nForce 200 chip.

Link
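To put rough numbers on "half speed" (my own illustrative figures, not from the quoted post): PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding, while PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, so a Gen3 x8 link works out to roughly the same one-way bandwidth as a Gen2 x16 link. A minimal sketch of that arithmetic:

```python
# Illustrative only: theoretical one-direction PCIe payload bandwidth.
# Gen2 = 5 GT/s per lane with 8b/10b encoding (80% efficient);
# Gen3 = 8 GT/s per lane with 128b/130b encoding (~98.5% efficient).
GENERATIONS = {
    "PCIe 2.0": (5.0, 8 / 10),
    "PCIe 3.0": (8.0, 128 / 130),
}

def bandwidth_gb_per_s(gen: str, lanes: int) -> float:
    """Theoretical payload bandwidth in GB/s, one direction."""
    gt_per_s, efficiency = GENERATIONS[gen]
    return gt_per_s * efficiency * lanes / 8  # 8 bits per byte

for gen in GENERATIONS:
    for lanes in (8, 16):
        print(f"{gen} x{lanes}: {bandwidth_gb_per_s(gen, lanes):.1f} GB/s")
```

That comes out to about 4 / 8 GB/s for Gen2 x8 / x16 and about 7.9 / 15.8 GB/s for Gen3 x8 / x16, which is why dropping the top slot to x8 on an old-switch board cancels out most of what Gen3 was supposed to buy you.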
 
Yeah, checked a bit; looks like Gigabyte is BS'ing it on every board except for the new Sniper... but I don't think $400 is worth it for that board.
 
Marketing mumbo jumbo...

Sure, it will "support" PCIe 3.0 - at half speed...

lol

Besides, technically, all boards will support a PCIe 3.0 card, as it's backwards compatible, so it will work...
 
I don't understand why companies are getting all gung-ho for PCI-E 3 anyway. By the time it actually matters we probably won't be using these boards anyway.
 
I don't understand why companies are getting all gung-ho for PCI-E 3 anyway. By the time it actually matters we probably won't be using these boards anyway.

It's for the uneducated masses that think buying a Diablotek power supply is the same as buying a Corsair power supply with the same specs.
 
I don't understand why companies are getting all gung-ho for PCI-E 3 anyway. By the time it actually matters we probably won't be using these boards anyway.

It's a bigger number, so it must be better. Same reason manufacturers put ridiculous amounts of VRAM on crappy GPUs. I've heard lots of people describe their new laptop's capability by telling you how big the hard drive is.
 
It's a bigger number, so it must be better. Same reason manufacturers put ridiculous amounts of VRAM on crappy GPUs. I've heard lots of people describe their new laptop's capability by telling you how big the hard drive is.

Yes, I love it when people say they have a 2GB card, only to find it's like a 4570 or something.
 
FYI, Gigabyte has removed ALL references to Gen3 on the product pages for their mainboards, EXCEPT for the Sniper v2.

I guess this tells you who's telling the truth and who's not ;)

Example Sniper v2:
http://www.gigabyte.com/products/product-page.aspx?pid=3962#ov
Gen3 above UD3

Z68A-D3-B3:
http://www.gigabyte.com/products/product-page.aspx?pid=3863#ov
No reference to Gen3

Smoking gun?

Appears they have seen the light... Those marketing guys drive engineers crazy. Bet someone gets a demotion over this one.
 
The difference between PCI-E 1.0 and 2.0 hardly matters right now, so idk why people would even bother with 3.0.
 
The difference between PCI-E 1.0 and 2.0 hardly matters right now, so idk why people would even bother with 3.0.

No, Gen3 mobos actually do make a difference even right now.
And not having to buy a new mobo when Ivy Bridge comes out saves money; you know, wallets care?
 
No, Gen3 mobos actually do make a difference even right now.
And not having to buy a new mobo when Ivy Bridge comes out saves money; you know, wallets care?

PCI-E 3 won't matter for at least a couple of years. Even the ASUS Mars 2 doesn't get close to maxing out a PCI-E 2 bus.
 
PCI-E 3 won't matter for at least a couple of years. Even the ASUS Mars 2 doesn't get close to maxing out a PCI-E 2 bus.

Dude, it's not just about bandwidth and graphics; anyone running a decent PCI Express SSD setup actually gets more performance now.

First, there are some numbers provided by MSI in this presentation comparing a Gen3 board versus a Gen2 board with an LSI SSD card: http://media.msi.com/main.php?g2_itemId=67153

And second, it pretty much DOES matter if you run x8/x8 versus x16/x16; it's not much, but framerates on x16/x16 boards were better than on x8/x8 boards in two thirds of the tests last year: http://www.hardocp.com/article/2010/08/23/gtx_480_sli_pcie_bandwidth_perf_x16x16_vs_x8x8/3

That was with GTX 480s, and as you know, 580s require more bandwidth :)

The difference between PCI-E 1.0 and 2.0 hardly matters right now, so idk why people would even bother with 3.0.
Ha, way to back up your claim! :)
25% fps increase going from PCI-E 1.0 to 2.0 speeds with a GTX480
http://www.tomshardware.com/reviews/pcie-geforce-gtx-480-x16-x8-x4,2696-5.html
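Side note (my addition, not from MSI's slides or either review): on Linux you can check what a slot actually negotiated with lspci, since the LnkCap and LnkSta lines report Speed (2.5 GT/s = Gen1, 5 GT/s = Gen2, 8 GT/s = Gen3) and Width. A minimal sketch, assuming pciutils is installed and using 01:00.0 as a placeholder device address:

```python
# Minimal sketch: compare a device's PCIe link capability vs. negotiated status.
# Assumes Linux with pciutils (lspci); some fields may require root to appear.
import re
import subprocess

def link_status(device: str = "01:00.0") -> dict:
    """Parse LnkCap/LnkSta Speed and Width from `lspci -vv` for one device."""
    out = subprocess.run(
        ["lspci", "-vv", "-s", device],
        capture_output=True, text=True, check=True,
    ).stdout
    status = {}
    for key in ("LnkCap", "LnkSta"):
        m = re.search(rf"{key}:.*?Speed ([\d.]+\s?GT/s).*?Width (x\d+)", out)
        if m:
            status[key] = {"speed": m.group(1), "width": m.group(2)}
    return status

if __name__ == "__main__":
    print(link_status())
```

If LnkSta shows a lower width or speed than LnkCap (e.g. x8 negotiated on a slot that advertises x16 once a second card is fitted), that's exactly the x16-vs-x8 downgrade being argued about here.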
 
Dude, it's not just about bandwidth and graphics; anyone running a decent PCI Express SSD setup actually gets more performance now.

First, there are some numbers provided by MSI in this presentation comparing a Gen3 board versus a Gen2 board with an LSI SSD card: http://media.msi.com/main.php?g2_itemId=67153

And second, it pretty much DOES matter if you run x8/x8 versus x16/x16; it's not much, but framerates on x16/x16 boards were better than on x8/x8 boards in two thirds of the tests last year: http://www.hardocp.com/article/2010/08/23/gtx_480_sli_pcie_bandwidth_perf_x16x16_vs_x8x8/3

That was with GTX 480s, and as you know, 580s require more bandwidth :)


Ha, way to back up your claim! :)
25% fps increase going from PCI-E 1.0 to 2.0 speeds with a GTX480
http://www.tomshardware.com/reviews/pcie-geforce-gtx-480-x16-x8-x4,2696-5.html

Yes, that article does state that x8/x8 runs slower than x16/x16. And a 580 requires marginally more bandwidth than a 480, but not enough to invalidate that article.

If you read the article, the difference is negligible at best, and noticeable only in benchmarks. In-game performance is exactly the same. Now, running x8/x8 with two 6990s is a completely different matter.
 