Slow Network Transfer Speed

Discussion in 'Networking & Security' started by JayteeBates, Sep 19, 2013.

  1. JayteeBates

    JayteeBates [H]ard|Poof

    Messages:
    4,659
    Joined:
    Jul 21, 2007
    Server: HP Proliant Windows Storage Server 2003 SP2

    My System: HP Z820 2xXeon E5-2630 64GB RAM and 4xIntel 300GB SSD in RAID 0 running Windows 8.1

    Network: Gigabit Wirespeed switches

    I am transferring backup images (roughly 20-40 GB each) from the storage server to my local system and I am only getting 10-11 MB per second, which works out to about 88 Mbit per second. That is about 9% of a 1 Gb connection. My system can write a lot faster than 11 MB per second, so I know the slowdown isn't coming from my local storage subsystem.

    It has to be something with the network. When I ping the server the response is under 1 ms. Does anyone have any idea why this would be happening? From what I found online, people were suggesting everything from turning off flow control to editing registry keys. Curious if any of you have suggestions before I go off trying random googled ideas.
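    For reference, a quick sanity check on the math (the 100 Mbit line at the end is just a rule-of-thumb comparison, not a diagnosis):

    # rough unit conversion for the observed transfer rate
    observed_mb_per_s = 11            # MB/s seen during the copy
    link_mbit = 1000                  # nominal gigabit link

    observed_mbit = observed_mb_per_s * 8
    print(f"{observed_mb_per_s} MB/s = {observed_mbit} Mbit/s")        # 88 Mbit/s
    print(f"= {observed_mbit / link_mbit:.0%} of the gigabit link")    # 9%

    # for comparison, a link that only negotiated 100 Mbit tops out around
    # 100 / 8 = 12.5 MB/s before overhead, which is in the same ballpark
    print(f"100 Mbit ceiling ~ {100 / 8:.1f} MB/s")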
     
  2. Mackintire

    Mackintire 2[H]4U

    Messages:
    2,892
    Joined:
    Jun 28, 2004
    The first thing I would look at is what kind of disks are behind the storage on the server and what RAID configuration, if any, is being used.

    The second issue is that you're using Server 2003, which uses the older SMB 1 protocol for data transfers. If I/O on the storage server isn't the problem, you may have a tuning issue. There are plenty of options for trying to re-tune this. Getting a copy of Server 2008 or newer might be worthwhile.
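    For a rough sense of why the protocol version matters for tuning: a request/response transfer that keeps only one block in flight can never move data faster than block size divided by round-trip time. A back-of-envelope sketch in Python (the 64 KB block and the RTT values are illustrative assumptions, not measurements from your server):

    # crude ceiling for a stop-and-wait style transfer: one block per round trip
    block_bytes = 64 * 1024                    # assume ~64 KB per request
    for rtt_ms in (0.5, 1, 5, 20):
        ceiling_mbit = block_bytes * 8 / (rtt_ms / 1000) / 1e6
        print(f"RTT {rtt_ms:>4} ms -> ceiling ~{ceiling_mbit:,.0f} Mbit/s")

    On a sub-millisecond LAN that ceiling sits at or above gigabit, so the old protocol alone probably isn't an 88 Mbit/s cap here, but it falls off fast as latency rises, which is one reason the newer SMB in 2008 and later (larger, pipelined requests) is worth having anyway.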
     
  3. /usr/home

    /usr/home [H]ardness Supreme

    Messages:
    6,166
    Joined:
    Mar 18, 2008
    Sounds like they aren't negotiating at gigabit. Try different cables and a different switch, and verify they can actually do gigabit with jperf.
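    If you want to read the negotiated speed off the Windows box without clicking through adapter properties, something like this does it (needs the third-party psutil package; speed comes back in Mbit/s):

    # print what each active NIC negotiated (speed is 0 if the driver won't say)
    import psutil

    for name, stats in psutil.net_if_stats().items():
        if stats.isup:
            print(f"{name}: {stats.speed} Mbit/s, duplex={stats.duplex}")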
     
  4. nry

    nry Limp Gawd

    Messages:
    409
    Joined:
    Jul 10, 2008
    I would highly recommend using iperf to test the network speed between the two boxes; if there are no issues there, then move on to investigating disk speed.
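    iperf is the right tool; for what it's worth, the thing it automates is just timing raw TCP between the two boxes, so a crude stand-in sketch looks like this (port, chunk and total sizes are arbitrary; needs Python 3.8+). Run it with "server" on one box and "client <hostname>" on the other:

    # tiny TCP throughput check, a rough stand-in for iperf
    import socket, sys, time

    PORT = 5201
    CHUNK = 64 * 1024             # 64 KB sends
    TOTAL = 512 * 1024 * 1024     # push 512 MB total

    def server():
        # receive until the client disconnects, then report the rate
        with socket.create_server(("", PORT)) as srv:
            conn, _ = srv.accept()
            with conn:
                received = 0
                start = time.time()
                while True:
                    data = conn.recv(CHUNK)
                    if not data:
                        break
                    received += len(data)
                secs = time.time() - start
                print(f"received {received / 1e6:.0f} MB in {secs:.1f} s "
                      f"= {received * 8 / secs / 1e6:.0f} Mbit/s")

    def client(host):
        # blast zero-filled chunks at the server and report the send rate
        payload = b"\x00" * CHUNK
        with socket.create_connection((host, PORT)) as sock:
            sent = 0
            start = time.time()
            while sent < TOTAL:
                sock.sendall(payload)
                sent += len(payload)
        secs = time.time() - start
        print(f"sent {sent / 1e6:.0f} MB in {secs:.1f} s "
              f"= {sent * 8 / secs / 1e6:.0f} Mbit/s")

    if __name__ == "__main__":
        if sys.argv[1] == "server":
            server()
        else:
            client(sys.argv[2])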
     
  5. JayteeBates

    JayteeBates [H]ard|Poof

    Messages:
    4,659
    Joined:
    Jul 21, 2007
    While upgrading the OS isn't a bad idea, we can't at the moment; the system only stores data and isn't a priority for an upgrade. The server's local storage isn't the issue either: it's running RAID 5 on eight 1 TB Seagate Barracuda ES.2 drives and can copy files between partitions in a snap. So it is definitely network related.
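    (If anyone wants to double-check the array anyway, timing a sequential read of one of the images locally on the server is enough to take the disks out of the equation. The path below is just a placeholder, and the number will be inflated if the file is already in cache:)

    # rough sequential read test, run locally on the server (placeholder path)
    import time

    path = r"D:\Backups\some_image.vhd"    # placeholder, not a real path here
    chunk = 8 * 1024 * 1024                # 8 MB reads
    read_bytes = 0
    start = time.time()
    with open(path, "rb") as f:
        while True:
            buf = f.read(chunk)
            if not buf:
                break
            read_bytes += len(buf)
    secs = time.time() - start
    print(f"{read_bytes / 1e6:.0f} MB in {secs:.1f} s = {read_bytes / 1e6 / secs:.0f} MB/s")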

    I'll try different cables and see; I have some shielded, Kevlar-coated Cat6 Ethernet cables. Don't ask why lol.
     
  6. JayteeBates

    JayteeBates [H]ard|Poof

    Messages:
    4,659
    Joined:
    Jul 21, 2007
    Well, that seems to have fixed it. When CSC moved our server room around, it seems they did the main cabling nicely but grabbed whatever was lying around for the patch cables in the actual racks. The one on this run was some hand-made piece of crap where the sleeving didn't even go into the connector.

    Much better now:

    C:\Users\jaytee\Downloads\jperf and iperf\jperf-2.0.2\jperf-2.0.2\bin>iperf -c cfor-storage001
    ------------------------------------------------------------
    Client connecting to cfor-storage001, TCP port 5001
    TCP window size: 64.0 KByte (default)
    ------------------------------------------------------------
    [276] local 192.27.226.166 port 51892 connected with 192.27.128.15 port 5001
    [ ID] Interval       Transfer     Bandwidth
    [276]  0.0-10.0 sec   964 MBytes   809 Mbits/sec