Alternatives to DFS for Server Duplication

I have what I (stupidly) thought was a simple problem. I will soon have two 6-bay Windows servers.

I want to duplicate Server A to Server B efficiently and without implementing DFS. Why? Neither server is on a domain, and I don't want to promote either one to a domain controller. I simply want to duplicate the non-OS files to Server B on a regular schedule, with logs. I have looked at a couple of solutions.

1). DFS: Problem. From what I have read, replication does NOT occur unless the servers are joined to an Active Directory domain.

2). Robocopy: Seems like a reasonable solution (rough sketch of what I'm picturing at the end of this post). However, the file sync process needs to handle ALL manner of file sizes and shapes.

3). Whatever my fellow technicians come up with.

Any thoughts?
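
For the Robocopy route, here is roughly what I'm picturing -- the drive letters, share names, and log path below are placeholders, not anything I've settled on:

REM Mirror the data volume to Server B, appending results to a log
REM (note /MIR also deletes files on B that were removed from A)
robocopy D:\Data \\ServerB\Data /MIR /COPY:DAT /R:1 /W:1 /NP /LOG+:C:\Logs\sync.log

REM Schedule it nightly at 2 AM via Task Scheduler
schtasks /Create /TN "SyncToServerB" /SC DAILY /ST 02:00 /TR "C:\Scripts\sync.cmd"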
 
Look at SyncToy. It's an old MS app but should get the job done. It's not instant like DFS, but you can run scheduled sync jobs.
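
If memory serves, SyncToy also installs a command-line runner (SyncToyCmd.exe) that you can point Task Scheduler at once you've defined a folder pair in the GUI -- the install path and pair name below are just examples:

REM Run a single SyncToy folder pair on a schedule ("ServerMirror" is an example pair name)
"C:\Program Files\SyncToy 2.1\SyncToyCmd.exe" -R "ServerMirror"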
 
Morning, gentlemen. Good to see my fellow problem solvers have had their cup of java already! I always come back to this forum. It seems to be the old reliable in a sea of forums.

Back on topic -- I'd like to get an affirmative "I've broken the glass" from someone who has dealt with very large files using either of these suggestions, if possible.

To be clear, the servers I'm referring to are in the exact same location. They will be linked via a switch. Maybe that's not the best way, but it's my current plan. Can either of you tell me whether rsync or SyncToy is rock solid when it comes to large files?

Should I bind my two servers together with some other linking method? Is there a faster interface than Ethernet available on my Dell PowerEdge that I'm unaware of?
 
Caution with Robocopy if you have millions of files. It'll try to index all files/folders before starting to sync, and if you don't have enough memory, you're in trouble.

I can vouch for rsync working on large files. I just migrated to a new NAS at home and moved 7TB with it, small and large files alike.

rsync is a Linux program. I don't see the value of running a Linux compatibility layer on top of Windows just to run rsync.
 
That's REALLY good information and the guidance I was looking for. Unfortunately, I am forced to work in a Windows environment.

Can you disable indexing while running Robocopy?

Should I not consider using Cygwin (forgive me if I typo the name) to run rsync?

I will be using Windows Server 2012.

It's looking more and more like rsync will be my answer here. Can I avoid using SSH and just run the command straight over the network, to reduce the overhead of the SSH connection? The switch will be dedicated to connecting these two servers.
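
What I'm picturing is rsync running in daemon mode on Server B so the transfer skips SSH entirely -- the module name and paths below are made up for illustration (under Cygwin/cwRsync they'd be /cygdrive/... style paths):

# /etc/rsyncd.conf on Server B -- "data" is an example module name
[data]
    path = /cygdrive/d/data
    read only = no

# Start the daemon on Server B, then push from Server A:
rsync --daemon
rsync -av --delete /cygdrive/d/data/ rsync://serverb/data/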
 
Just use a crossover cable. 10Gb Ethernet cards are getting cheaper, unless you need the switch for other purposes.
 
I guess I went a little overkill. I keep seeing "10 Gigabit" Ethernet cards, but I have had bad experiences with "1 Gigabit" cards that never performed anywhere near that. So -- I bought two SFP+ 10 Gigabit cards and am going to run a short "whip" DAC cable from one server to the other. I have seen these things perform almost to their specs.

I guess I could have gone cheaper. Now that the throughput won't be an issue, I should get blazing fast transfers.
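
Once the DAC cable is in, I plan to sanity-check the link with iperf3 before trusting it (the hostname is a placeholder):

REM On Server B (receiver)
iperf3 -s

REM On Server A (sender), push traffic for 30 seconds
iperf3 -c serverb -t 30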

Going to be looking at this script for the start of a template to run on a schedule.

Is there a way to gracefully "PAUSE" rsync (in case, for example, a shutdown is imminent)?
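
From what I can tell so far, there's no true pause button; the usual approach seems to be keeping partial files so an interrupted run can simply be re-started and resume -- roughly like this (paths reuse the placeholder module from earlier):

# Keep partially transferred files so a re-run picks up where it left off
rsync -av --partial --delete /cygdrive/d/data/ rsync://serverb/data/

# On a Unix-ish side (or inside Cygwin) the process itself can also be frozen and resumed:
kill -STOP <rsync_pid>
kill -CONT <rsync_pid>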
 
You can use PowerShell with a Get-ChildItem command and then run Robocopy against each folder further down the tree. This takes some knowledge of PowerShell/Robocopy, though.

Below is a snippet of the PowerShell script I used to migrate 10k folders in a single directory. Robocopy would choke if I ran it against the root folder containing all of them, but breaking them down with PowerShell and running Robocopy against each sub-directory worked.

#set these to the source and destination roots (trailing backslash matters for the concatenation below)
$sd = "D:\Data\"
$dd = "\\ServerB\Data\"

#grab all sub directories, excluding anything that has _OLD at the end of the name
$sfolders = Get-ChildItem -Path $sd -Directory | Where-Object { $_.Name -notlike "*_OLD" }

foreach ($source in $sfolders)
{
    #build full paths to source and destination
    $sfolder = $sd + $source.Name
    $dfolder = $dd + $source.Name

    #copy one sub directory at a time so Robocopy never scans the whole tree at once
    robocopy $sfolder $dfolder /E /XO /dcopy:T /COPY:DAT /MT:32 /tee /R:0 /W:0
}
 
I have a couple of thoughts. First off, thank you for suggesting the PowerShell script. It looks very interesting.

As I'm contemplating this more and more, I'm getting concerned that I have no control over this procedure. I don't have ...

1). Some sort of progress indicator (this isn't a criticism of your particular suggestion).

2). I cannot pause or abort the script (can I?)

3). I have no way of knowing what has been successfully copied and what hasn't.

4). I have no GUI to work with.

I can get along without a GUI, but I'm just carefully thinking this through.

Having a 10G SFP+ link should really alleviate much of my concern, but if I'm going to rely on this for my clients, I have to have control over the whole process.

Am I missing anything here? Are there progress indicators for rsync or Robocopy that I am unaware of?

I want to thank everyone for their help and support. It's been a bit jarring getting back heavily into server management.

I had a similar project (actually posted on this forum) several years ago along these lines. It is finally catching up with me now because my clients need backups in a more robust procedural manner.
 
1. There is no progress bar in Robocopy. You can add one in PowerShell, but it wouldn't be of much value (rough sketch below anyway).
2. This is Robocopy: you either let it run, or abort and start over at the beginning. It'll rescan all files until it finds one that has been modified since the last sync or hasn't been copied to the destination yet.
3. Nope, you don't. I had to rely on a third-party tool built into the destination. Even then it was a toss-up once we cut all the users over.
4. There is a Robocopy GUI, but not one that'll work with PowerShell.

I wouldn't recommend Robocopy if you are presenting this to a client. I'd go with something else given the 4 points above.
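
If you do want to bolt those on anyway, a rough version of points 1 and 3 would look something like this -- the roots and log path are placeholders:

#per-folder progress plus an appended log of everything Robocopy touched
$sd = "D:\Data\"                      # placeholder source root
$dd = "\\ServerB\Data\"               # placeholder destination root
$log = "C:\Logs\sync-$(Get-Date -Format yyyyMMdd-HHmm).log"

$sfolders = @(Get-ChildItem -Path $sd -Directory)
$i = 0
foreach ($source in $sfolders)
{
    $i++
    #coarse progress: which sub directory we're on, not bytes copied
    Write-Progress -Activity "Syncing to Server B" -Status $source.Name -PercentComplete (($i / $sfolders.Count) * 100)

    #/LOG+ appends every copied/skipped file to one log so you can audit what made it across
    robocopy ($sd + $source.Name) ($dd + $source.Name) /E /XO /COPY:DAT /R:1 /W:1 /NP /LOG+:$log
}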
 
I've also used ViceVersa to replicate data. Nice GUI and easy to work with, and it runs on Windows. It should do what you want.
 
I will take a serious look at that program. What type of transfers have you done with it? Large files, small files, etc.

Thank you very much for your recommendation. I will add it to my list of potentials.
 
Both. I was using an internal tool on a NAS and it broke, so I switched to ViceVersa for both small and large file transfers.

ViceVersa was my go-to when a user would open a ticket saying they had missing files. Typically it was due to permissions. It allows incremental and full syncs.
 