Do I need to jump to an X370?

MeatballCB

So, here's my situation. I'm migrating over to an AM4 platform and already have an MSI B350 Tomahawk that seems pretty good, but I'm trying to figure out the best way forward on a minor issue.

I currently have a Plextor 1 TB NVMe M8PeY drive as well as a Gigabyte GTX 1070 G1 Gaming GPU. The B350 Tomahawk only has a single PCIe 3.0 x16 slot, but it does have a second PCIe 2.0 x16 slot. Additionally, there's an M.2 PCIe 3.0 x4 slot on the board. I generally game at 2560x1440, if that helps formulate an answer.

So...I see a couple of options.

1) Install the 1070 in the 3.0 x16 slot, then pull the NVMe drive off the add-in card and put it in the M.2 slot. My primary concern here is that the drive won't have a heat sink and may start thermal throttling, since it'll be sitting right underneath the 1070.

2) Install the 1070 in the 3.0 x16 slot and install the NVMe add-in card in the 2.0 x16 slot. That will probably cap the M8PeY's speeds (2500 MB/s read | 1400 MB/s write) given the 2.0 slot's bandwidth (rough math after the list). I also suspect populating both slots will drop them to x8 each, and I'm not sure whether that would starve the 1070 at 2560x1440.

3) I could just jump to a different B350/X370 board that offers a pair of PCIe 3.0 x16 slots, but I suspect most of those boards still drop both slots to x8 when both are populated.
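
If my math on option 2 is right, the 2.0 slot is a bottleneck on paper: PCIe 2.0 runs at 5 GT/s with 8b/10b encoding, so roughly 500 MB/s per lane. The M8PeY add-in card is a x4 device, so that's about 2000 MB/s total, under the drive's 2500 MB/s rated read before you even count protocol overhead. PCIe 3.0 runs at 8 GT/s with 128b/130b encoding, about 985 MB/s per lane, so a 3.0 x4 link (~3.9 GB/s) has headroom to spare.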

Any thoughts?
 
https://www.msi.com/Motherboard/support/B350-TOMAHAWK.html#down-manual

MSI B350 Tomahawk manual, page 18. You can have your GTX 1070 and your M.2 NVMe SSD at full bandwidth, no problem!

[attached screenshot: manual page 18, slot/bandwidth table]
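
The short version, if I'm reading the AM4 block diagram right: the Ryzen CPU has 16 dedicated lanes for the main x16 slot plus a separate x4 link for the M.2 slot, so populating the M.2 slot doesn't steal lanes from the GPU. The PCIe 2.0 slots hang off the B350 chipset entirely, on their own link.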
 
Also, it'll be under the video card, which has a lot of fans and holes in the cooler shroud that will probably blow some air onto the SSD. I think you're okay. Just make sure the case has some intake airflow coming in.
 
Being right under the GPU might seem like a bad spot, but my guess is the exact opposite. Sure, the air hitting it will be above ambient, but the M.2 card produces a fair amount of heat itself and will probably sit at a higher temperature than the GPU's exhaust air, so it's actually better off than with no GPU there at all: moving air, even warm air, still carries heat off the chips. A heat sink without a fan only works if the device never gets very hot on its own, or if the sink is big enough that the device can't saturate it. Memory back in the day, with and without heat spreaders, was designed around catching airflow from the CPU cooler, same with older VRMs (or whatever we called them back then), and it didn't matter that the airflow came off a really hot CPU.
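
Quick sanity check with made-up but plausible numbers: steady-state chip temperature is roughly T_chip = T_air + P x R_th, where R_th is the junction-to-air thermal resistance. A bare M.2 controller dissipating ~5 W in still 30 C case air might see R_th around 10 C/W, so 30 + 5x10 = 80 C, into throttle territory. With GPU exhaust washing over it, the air might be 45 C but the forced convection could cut R_th to ~4 C/W: 45 + 5x4 = 65 C. Warmer air, cooler chip. The exact numbers are guesses, but the direction isn't.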
 