I want to back up a NAS with some autosync software, what do you recommend?

sram

Hi to all,

There is this ~14TB NAS connected to a network, about 5TB full. I noticed that there is no backup for it, and the guys working on the network didn't know any better. Anyway, it wasn't hard to explain that losing 5TB of data would be a disaster, so I'm working on a backup solution for them. I did copy the entire array to another location, but I want an automated, or at least semi-automated, way of backing up. We have another NAS which we can use (we also have a large ~10TB external hard drive). Is there software that can do a full backup from one NAS to another, and then keep doing differential backups on request? One full backup, then differentials from there, seems the most convenient.

Or should I just use a syncing program and have it update my manual full backup every now and then, or at whatever interval I specify? I remember using Allway Sync at one point; it was neat software.
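For concreteness, the kind of incremental "update the full copy on a schedule" pass described above could be sketched in Python (paths and the change-detection rule are illustrative; a real sync tool handles attributes, locks, and errors far more carefully). It deliberately never deletes anything on the backup side:

```python
import shutil
from pathlib import Path

def sync_tree(src: Path, dst: Path) -> list[Path]:
    """Copy files that are new or modified under src into dst.

    Never deletes anything in dst, so the backup only grows;
    run it from cron / Task Scheduler at whatever interval fits.
    Returns the relative paths that were copied this pass.
    """
    copied = []
    for src_file in src.rglob("*"):
        if not src_file.is_file():
            continue
        rel = src_file.relative_to(src)
        dst_file = dst / rel
        # Copy when the file is missing, or its size/mtime say it changed.
        if (not dst_file.exists()
                or dst_file.stat().st_size != src_file.stat().st_size
                or dst_file.stat().st_mtime < src_file.stat().st_mtime):
            dst_file.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src_file, dst_file)  # copy2 preserves mtime
            copied.append(rel)
    return copied
```

A second run right after the first copies nothing, since `copy2` preserves the modification times it compares against.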

The NAS is a Synology DS1515+, currently set up as RAID 5, which at least provides one-disk fault tolerance. The second NAS is a DS1517+.

Sorry if I sound too stupid, but I'm interested in how you guys back up your huge NASes...you should know better.
 
Most folks that I know use either Backblaze or Google Drive for their backend storage. If you use the Corporate Google Drive, you are looking at $10/month for unlimited storage.
 

Unlimited with a minimum of 5 licenses ($50/mo) and a limit of 750 GB of transfer per day per user. Come on, we all watched that LTT video last night...
 


You can work with them to have just one user on your plan for $10/month, and from my reading of the OP it seems to be a corporate environment. The transfer limit you mention is correct; however, it is something I would code/design around just for the cost savings.
 
https://gsuite.google.com/pricing.html - Says right on the page: 1 TB of usage if fewer than 5 users.
I see you are correct; this must have been updated recently. Still, with the OP being in a corporate environment, 5 users ($50/month for unlimited storage) is not impractical. Personally, that's the cheapest I've found for something unconstrained. There are plenty of cloud storage providers out there, but here's some pricing to think about, based on 4 TB:

AWS S3 (US East): $92/month
Google Cloud (single region): $80/month
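The arithmetic behind those figures is just capacity times the published per-GB-month rate; the rates below are the approximate ones implied by the numbers above (roughly $0.023/GB for S3 Standard and $0.020/GB for Google Cloud regional storage at the time), so check the current pricing pages before budgeting:

```python
def monthly_cost(gb: float, rate_per_gb: float) -> int:
    """Back-of-envelope monthly storage cost: capacity (GB) x per-GB rate."""
    return round(gb * rate_per_gb)

# 4 TB taken as 4000 GB for a rough estimate.
s3 = monthly_cost(4000, 0.023)   # AWS S3 Standard, US East
gcs = monthly_cost(4000, 0.020)  # Google Cloud, single region
```

Egress and request charges are extra, which matters if you ever need to restore the whole array.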

To answer the OP's question: CloudBerry can work as your backup client, with the destination being the cloud storage provider of your choice.
 
Why are you lot going on about cloud solutions when that's not what the OP wanted?
 

That's a fair point. If the OP wants to go on-prem -> on-prem, any of a multitude of tools such as rsync will do. Off-site backup is best practice for businesses, which is what led me to bring up cloud storage providers.
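For the on-prem -> on-prem case, the rsync invocation one NAS (or any Linux box) would run to push a share to the other can be assembled like this. The hostname, account and paths are made up for illustration; the destination account is just a local user on the target NAS with its rsync service enabled, so no domain is needed:

```python
def rsync_cmd(src: str, user: str, host: str, dest: str,
              delete: bool = False) -> list[str]:
    """Build the argument list for an rsync push to a remote host.

    -a preserves permissions/times recursively; --partial keeps
    partially transferred files so a dropped link can resume.
    """
    cmd = ["rsync", "-a", "--partial", src, f"{user}@{host}:{dest}"]
    if delete:
        cmd.append("--delete")  # mirror mode: removals propagate too
    return cmd

cmd = rsync_cmd("/volume1/share/", "backupuser",
                "192.168.1.20", "/volume1/backup/")
```

Without `--delete` (the default here), files removed from the source stay on the backup, which is usually what you want for a safety copy.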
 

Thanks. I actually don't want a cloud solution; it's against their policy.


Sorry if I didn't clarify, but as explained above, cloud solutions are a no-go here. Consider me a noob and show me how to use some of those tools, especially if they work with a Synology NAS.
 
I've never used a NAS and don't know much about them. I'm getting confused reading this thread as to the reasons for doing an onsite backup from the NAS to another storage device. Is it to mitigate the risk of all (5?) of the NAS hard drives failing at once? I can see the reasons for an offsite backup (fire, flood or something else), or am I missing something?
 

Yeah, I can see that, but I don't really know how to use it or set it up with the two NASes. From the page you posted, it seems I need to set up one NAS as the backup server, enable the rsync service on it, and also enable rsync accounts so that user accounts on the other NAS can back up to it. But how will one NAS detect users from another NAS? Yes, they are on the same network, but they are not joined to a domain; it is just a workgroup network. Or maybe Synology has some special configuration for this? I'm investigating as we speak. If you can clarify, I will be in your debt.

Thanks.
 

You are not missing anything. The NAS is RAID 5, so one disk can fail and you are fine; if two or more disks fail, you are in trouble. You are right that the backup should be stored somewhere else, but having another copy at the same site still serves a purpose: it is less likely that both NASes go bad at the same time. This doesn't mean you should NOT have an offsite backup. You actually should. It all depends on how critical the data is.


 
Don't forget that a NAS doesn't protect against a user deleting files, either. I'm in the same boat for my home setup. I have a DS412+ and was looking into the best method of backing up to a large external disk. My plan was to try the built-in HyperBackup.
 

That's a good point. What I used to do with Allway Sync was act on file additions/modifications but not deletions, so the backup location always grows even if the original shrinks. Allway Sync is pretty neat software.

Anyway, regarding the NAS in question, I did make a full backup using RichCopy, but I still haven't set up automatic backup since I'm away. DeltaCopy seems to be the simplest option, though.
 
I'm using Cloud Sync to Backblaze B2.

I sync all my docs, music, photos and home videos to Backblaze. I don't use it for things like recorded TV or ripped movies, since those aren't something I need to back up. It's currently using about 800GB of data and I have been averaging about $3.30 a month for the storage.

My Overall Backup Strategy is:
I use Cloud Station Drive to keep basic working folders (like my desktop and a few other folders I use as temporary workspaces) synced with my DS918+.
I use Cloud Station Backup to save stuff I want to keep permanently.
I use Cloud Sync to Backblaze to back up everything from the Cloud Station Backup folders for off-site protection.

I'm pretty happy with my solution at the moment. I have reasonable protection from data loss on my desktop PC and my NAS.
 
The main issue is that data is getting out of control. Maybe a new angle is required? Stricter deletion of unwanted data, levels of data usefulness, etc.?

We had an issue with data getting out of hand many years ago.

So we set up a 1TB dump drive (yes, that long ago) called "The Dump" that could be used for temporary data storage by those doing development/experimenting/fooling around. Everyone was advised that it could be wiped at the end of the month and that data older than a week was fair game. Those were the rules.

It worked pretty well. Business data growth rates dropped considerably and we never had an issue with anything important getting lost. Work carried on.
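The "older than a week is fair game" rule for a scratch share is easy to automate. A sketch (the function name and the review-before-delete split are my own choices; actually removing files is left to the caller so the list can be eyeballed first):

```python
import time
from pathlib import Path

def stale_files(root: Path, max_age_days: float, now=None) -> list[Path]:
    """List files under root not modified within max_age_days.

    Run from a monthly scheduled task against the dump share;
    pass the result to a deletion step once it has been reviewed.
    """
    now = time.time() if now is None else now
    cutoff = now - max_age_days * 86400  # seconds in a day
    return [p for p in root.rglob("*")
            if p.is_file() and p.stat().st_mtime < cutoff]
```

Comparing modification times is crude (copying can reset them), but for a scratch area where everything is disposable by policy, it is usually good enough.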
 
That's a good point. Levels of data usefulness.... I ran into this multiple times.
 
I will look into that, although I remember not really liking it the first time I tried it with my own NAS at home. Maybe I need to do some learning & reading.
I figured out how to use Hyper Backup with two NASes, although I had some difficulties in the beginning. I had to do offline installs of the packages because the network is not connected to the internet. What you do is install Hyper Backup on the source NAS and Hyper Backup Vault on the destination NAS. Then run Hyper Backup on the source NAS and set the second NAS as the destination (it will be detected automatically since they are on the same network, as long as Hyper Backup Vault is installed there). Then just choose the shared folder on the second NAS you want your files to go to. Hyper Backup turned out to be neat, actually.

So I'm now backing up in two different ways: 1) syncing the NAS to a local drive, and 2) backing it up with Synology Hyper Backup to another NAS. I will move the second NAS to a slightly more distant location to make it more reasonable.
 