How many Backup USB drives do you have for Windows 10?

Comixbooks
If push comes to shove I could use my laptop and make a backup copy on a USB drive. Burning an ISO is out of the question, I think, since optical drives are pretty much done; even ATX cases today don't include them. I had four copies of Windows 10, but now I'm down to two because Windows 10 needs more than 4 GB of space on a USB drive, so I picked up two 32 GB USB drives to replace the 4 GB ones, which are basically useless now.
 
I have one. It's a 250 GB external USB hard drive that cost $30. I use the Windows 10 backup and recovery tool to make an image and just restore from that image. I use it all the time when I get the itch to try out some new distros and then need to go back to Windows 10.
 
None. I keep data on other drives and reload off a 16 GB stick with the installer. I've only had to do this once, after a mobo failure with RAID 0, and I've been on Insider since day one.
 
None. I keep data on other drives and reload off a 16 GB stick with the installer. I've only had to do this once, after a mobo failure with RAID 0, and I've been on Insider since day one.
And then when you get hit by ransomware, all your data will be encrypted, lol.
 
I have several older internal 3.5" HDDs that got retired over the years, and I use them as external storage via a self-powered USB 3.0 docking station.
The topic title is kind of mindless, though. If you had intended to ask "what are your backup/archiving strategies" I'd understand.
External drives are just one stage of a multi-stage backup strategy, and each stage can guard against different problems.
My first stage is to back up (compressed to a single file, incrementally) to a second drive in the system; this (almost) only helps if the primary drive fails for some reason. Then the files are copied to another machine (server/NAS) through a limited-rights FTP account, to guard against malware infecting the already uploaded files. The FTP account can only write/list/create files, not rename/delete/append. I don't particularly like or trust cloud services, so the third stage is external drives, although that one runs less often. Even less often, I upload the most important data to Google Drive.
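A minimal Python sketch of that second stage, assuming the server admin has already stripped the account of DELE/RNFR/RNTO/APPE so the client can only add files; the host, credentials, and paths here are made up for illustration:

# Sketch: push a finished backup archive through a write-only FTP account.
# With rename/delete/append removed server-side, stolen credentials can
# add files but cannot corrupt the backups already uploaded.
import ftplib
from pathlib import Path

def upload_backup(archive: Path, host: str, user: str, password: str) -> None:
    with ftplib.FTP(host, user, password) as ftp:
        with archive.open("rb") as f:
            # STOR (create file) is the only write the account needs.
            ftp.storbinary(f"STOR {archive.name}", f)

if __name__ == "__main__":
    upload_backup(Path("C:/backups/system-full.7z"),
                  "nas.local", "backup-push", "example-password")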
 
Handling backups manually seems like a recipe for failure.

I use Backblaze (reference link). Cheap and problem-free.
There is nothing wrong at all with doing manual backups, and they are not a recipe for failure as you say; I don't know why you would say that. I did manual backups for decades and they worked flawlessly; only recently did I move over to a scheduled GFS backup system, and it has also been working flawlessly.

Obviously the software, or the way you had been doing your backups, was the issue. Perhaps you need to revisit your system, processes, policies, and procedures.
 
There is nothing wrong at all with doing manual backups, and they are not a recipe for failure as you say; I don't know why you would say that. I did manual backups for decades and they worked flawlessly; only recently did I move over to a scheduled GFS backup system, and it has also been working flawlessly.

Obviously the software, or the way you had been doing your backups, was the issue. Perhaps you need to revisit your system, processes, policies, and procedures.
I can't count the number of times I've seen backups fail (usually at the worst possible time, too), and in almost all cases it was because the process was too complex. Manually backing up introduces complexity.

For a long-term, stable backup process, you need as much automation as possible, not just in the backing up, but also in the restoration tests.
 
I can't count the number of times I've seen backups fail (usually at the worst possible time, too), and in almost all cases it was because the process was too complex. Manually backing up introduces complexity.

For a long-term, stable backup process, you need as much automation as possible, not just in the backing up, but also in the restoration tests.

I don't know how complexity can be introduced with manual backups other than from very poorly designed software, a poorly set up HDD, or user error.

You:
  1. start the backup package,
  2. select the HDD/partition that Windows is installed on and boots from (any decent package will identify this for you),
  3. select Full backup, which selects the boot partition, the main install partition, and the entire HDD (again, any decent package should do this),
  4. choose the backup location (it cannot be the same HDD/partition you are backing up; any decent package will NOT allow this, and if one does, dump it from your list immediately),
  5. select start,
  6. wait for completion and you are done,
  7. remember to test your backup,
  8. remember to also test your backup medium if it is an external USB HDD,
  9. remember to test your boot USB and/or CD.
Steps seven to nine are optional but should be done to make sure you can use those items for recovery when the time comes.

Where is the complexity in that process? Then again, I am not an idiot running four or more OSes with a ten-item boot menu either (just so you can have bragging rights with your family and friends). If that is you, select one OS and stick to it, remove the others, and you will remove your general everyday operating problems, not to mention your backup problems/complexities.

Sorry to say, Grasshopper, but you must be doing something wrong or using poorly designed backup software. Regardless, the rule with backups is that you should test them after creation. Only three days ago I had to use a backup to recover my Surface Pro 3. I couldn't access the network location of my backup, and this was my fault: I didn't know that I had to load the network driver from within Macrium Reflect, duh... Once that was loaded, the backup was found and away it went. So the operator (you, me, us) needs to know how to use the package, not just Next, Next, Finish the install and be done with it. Grab the manual and learn stuff.

I am using the full subscription version of Macrium Reflect Backup for Workstation; I have been seriously impressed with Macrium Reflect Free over the years, and I am even more impressed with the Workstation version. All I can suggest and recommend to anyone looking at or thinking about backups is to try three or four different packages for yourself. Most offer free versions, which are cut-down full packages, and some if not all today offer 30-day trials of the full package with discounts on purchase. Think about how often you want to back up, how long you want to keep backups, whether you want to back up over a network to a NAS unit (network backup, to me, is a very important feature), whether you want differential and incremental backups, whether you want to back up to USB HDDs, whether you want backup schemes such as GFS (Grandfather, Father, Son), which maps to full, differential, and incremental backups, whether you want scheduling, and whether you want a boot recovery USB, CD, and/or OS boot menu. Macrium Reflect Backup for Workstation has all the features I mentioned at a reasonable price.
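For anyone new to GFS, the idea is a rotating hierarchy: for example, a monthly full backup (grandfather), a weekly differential (father), and a daily incremental (son). The exact cadence is configurable in any GFS-capable package; this toy Python sketch just illustrates the scheduling idea:

# Toy illustration of a Grandfather-Father-Son rotation:
# monthly full, weekly differential, daily incremental.
# The cadence here is an example, not any particular product's default.
import datetime

def gfs_backup_type(day: datetime.date) -> str:
    if day.day == 1:            # first of the month -> grandfather
        return "full"
    if day.weekday() == 6:      # Sunday -> father
        return "differential"
    return "incremental"        # every other day -> son

for offset in range(7):
    d = datetime.date(2020, 3, 2) + datetime.timedelta(days=offset)
    print(d, gfs_backup_type(d))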

I am in no way affiliated or paid by Macrium for testimonials or forum posts regarding their products or services, I speak from a private home user of their products and services. I strongly recommend that you have Macrium Reflect Backup among your testing packages.
 
I have a 3 TB WD external drive (USB 3.0/2.0) that I run routine automated backups to. I also have several USB thumb drives that I store things on.
 
There is no real ground for arguing that one of manual or automatic is much better than the other.
I second the thesis that automatic is better in that you cannot forget to make a backup. But I've been doing manual backups for years. It all depends on how often the system or files change (installing or uninstalling software, changing settings, etc.) and how often your system fails to the point of requiring a restore. Mine: never. Only the server once had to have its OS restored, after an unsuccessful try of new memory broke the OS to the point where fixing it wasn't feasible.
I'm still doing manual OS images offline (from another OS) once every one to two months. Virtually nothing changes other than normal updates to 5-6 programs like browsers, email clients, etc.
I'm now migrating to automatic incremental data backups with my own scripts and rar/7z.
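A minimal sketch of that scripted approach, assuming 7z is on the PATH and a timestamp file marks the last run so only changed files get packed; every path and name here is illustrative:

# Sketch: timestamp-based incremental backup with 7z.
# The first run (no stamp file) packs everything; later runs pack
# only files modified since the previous run. Paths are examples.
import datetime, subprocess
from pathlib import Path

SOURCE = Path("C:/data")
DEST = Path("D:/backups")
STAMP = DEST / "last_run.stamp"

def incremental_7z() -> None:
    DEST.mkdir(parents=True, exist_ok=True)
    last = STAMP.stat().st_mtime if STAMP.exists() else 0.0
    changed = [p for p in SOURCE.rglob("*")
               if p.is_file() and p.stat().st_mtime > last]
    if not changed:
        return
    listfile = DEST / "changed.lst"
    listfile.write_text("\n".join(str(p) for p in changed), encoding="utf-8")
    archive = DEST / f"incr-{datetime.datetime.now():%Y%m%d-%H%M}.7z"
    # 7z's @listfile syntax reads the file names from changed.lst.
    subprocess.run(["7z", "a", str(archive), f"@{listfile}"], check=True)
    STAMP.touch()  # record this run's time for the next pass

if __name__ == "__main__":
    incremental_7z()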
 
You:
  1. start the backup package,
  2. select the HDD/partition that Windows is installed on and boots from (any decent package will identify this for you),
  3. select Full backup, which selects the boot partition, the main install partition, and the entire HDD (again, any decent package should do this),
  4. choose the backup location (it cannot be the same HDD/partition you are backing up; any decent package will NOT allow this, and if one does, dump it from your list immediately),
  5. select start,
  6. wait for completion and you are done,
  7. remember to test your backup,
  8. remember to also test your backup medium if it is an external USB HDD,
  9. remember to test your boot USB and/or CD.
For my home system, want to know my process?

1) Check that Backblaze is running and that it doesn't have any errors.

So, one step versus six: which is less complex?

My work systems are even easier; I wrote the backup and restoration-test scripts and plugged them into Nagios, so all I really need to do is look at the dashboard. If something fails I (a) get an email, and (b) it's up there with any other issues I need to address. I'll admit to periodically testing those backups anyway, because I'm paranoid, but that's once a month or so.
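A sketch of the sort of check that setup implies, following the standard Nagios plugin convention (exit 0 = OK, 1 = WARNING, 2 = CRITICAL); the directory and age thresholds are made up:

# Sketch: Nagios-style freshness check for the newest backup file.
# Standard plugin exit codes: 0 OK, 1 WARNING, 2 CRITICAL.
# Directory and thresholds below are illustrative.
import sys, time
from pathlib import Path

BACKUP_DIR = Path("/srv/backups")
WARN_HOURS, CRIT_HOURS = 26, 50   # a daily job, with some slack

def main() -> int:
    files = [p for p in BACKUP_DIR.iterdir() if p.is_file()]
    if not files:
        print("CRITICAL: no backup files found")
        return 2
    age_h = (time.time() - max(p.stat().st_mtime for p in files)) / 3600
    if age_h > CRIT_HOURS:
        print(f"CRITICAL: newest backup is {age_h:.0f}h old")
        return 2
    if age_h > WARN_HOURS:
        print(f"WARNING: newest backup is {age_h:.0f}h old")
        return 1
    print(f"OK: newest backup is {age_h:.0f}h old")
    return 0

if __name__ == "__main__":
    sys.exit(main())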

If a manual process is working out fine for you, great. I'm sure you're quite the Bill Gates when it comes to this stuff, so you should be fine. However, automatic beats manual for any critical system, hands down.
 
I need an 8 TB backup, so it would take almost three months of uploading to send my backup to Backblaze at 10 Mbps.
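(Checking the math: 8 TB is about 6.4×10^13 bits, and at 10×10^6 bits/s that's 6.4×10^6 seconds, roughly 74 days of nonstop uploading before any protocol overhead, so close to three months in practice.)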

And I would blow through my 1 TB monthly bandwidth cap.
 
Not sure what you're asking with "do you have for Windows 10". For Windows 10 for what? Bootable UEFI Win10 media on a USB stick or SD card (via an SD card reader) to be used to install Win10 onto an SSD/HDD?

Anyway, I have bootable UEFI Win10 media on an SD card for making installs to my PC. I also make regular image backups of my Win10 with Macrium Reflect Free (booted from an SD card in a card reader) and save the image to a "storage" HDD. Then I use FreeFileSync to mirror whatever is new on my storage HDD to my storage backup HDD, and I keep an offline HDD for further backups.
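For anyone unfamiliar with the mirror mode mentioned there, this toy Python sketch shows the concept, copying new or newer files one way and removing anything no longer present at the source; it's an illustration, not FreeFileSync's actual engine, and the paths are made up:

# Toy one-way "mirror": make DST match SRC.
# Copies new/newer files, then deletes files and folders that no
# longer exist on the source side. Paths are examples.
import shutil
from pathlib import Path

SRC, DST = Path("E:/storage"), Path("F:/storage-backup")

def mirror(src: Path, dst: Path) -> None:
    dst.mkdir(parents=True, exist_ok=True)
    for item in src.rglob("*"):
        target = dst / item.relative_to(src)
        if item.is_dir():
            target.mkdir(parents=True, exist_ok=True)
        elif (not target.exists()
              or item.stat().st_mtime > target.stat().st_mtime):
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(item, target)   # copy new or newer files
    # Walk deepest-first so empty folders go after their contents.
    for item in sorted(dst.rglob("*"), reverse=True):
        if not (src / item.relative_to(dst)).exists():
            if item.is_file():
                item.unlink()
            else:
                item.rmdir()

if __name__ == "__main__":
    mirror(SRC, DST)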
 
A tad confused about what you're asking as well. For installs, I just have a flash drive set up to boot for Windows 10 installs.

For backups? I've got a multi-layered backup strategy.

- Near real-time: files stored in a MEGA cloud-backed drive. These are my active work files.
- Hourly (if changed): File History turned on for multiple drives, backed up to an external server. This covers files outside of my active work that are still important.
- Daily: VEEAM backup of the entire system to an external server (the server is RAID 1).
- Monthly: that entire server is backed up, encrypted, to a few different external 6 TB drives, which are rotated monthly and moved off-site. I do it monthly because these backups take approximately 36 hours to complete.

Spinner storage space is cheap these days; there's no reason not to have multiple disconnected backups. Worst-case scenario, my house burns down? I lose at most a month of backups of files not in my MEGA drive.
 
For Windows 10? Absolutely none. :D

For other purposes? Oh, probably 15 USB sticks with a variety of stuff on 'em. I can use external hard drives over USB, but since I have eSATA ports on my laptop and docking station (Dell Latitude), I prefer those for full-speed access (the USB ports are all 2.0, since it's an older E6420).
 
For backups I use Acronis True Image via FTP to my NAS, limiting access to the backup subfolders (multiple PCs) ONLY to the FTP user Acronis uses. I've come not to trust shared-folder backups, since viruses can scan the network for any available network shares, but I haven't heard of them doing this over FTP yet.
 
Expanding on tporter's FTP-account solution (though I use my own backup scripts, built on the NCFTP package for scripting FTP), I further limited the FTP account's permissions to write (create file) and list only, removing the RENAME/DELETE/APPEND permissions. That way, if something "steals" the FTP credentials, nothing can delete or otherwise corrupt the already uploaded backups.
A scheduled script runs on the machine hosting the FTP server; it keeps the number of backups under control by deleting old ones so they don't overwhelm the disk space.
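A minimal sketch of that server-side rotation job, assuming dated archives collect in one directory and a keep-the-newest-N policy; the directory and count are illustrative:

# Sketch: server-side rotation, keep only the newest N backups.
# Runs outside the restricted FTP account, so it alone may delete.
from pathlib import Path

BACKUP_DIR = Path("/srv/ftp/backups")
KEEP = 10  # number of most recent archives to retain

def rotate() -> None:
    archives = sorted((p for p in BACKUP_DIR.iterdir() if p.is_file()),
                      key=lambda p: p.stat().st_mtime, reverse=True)
    for old in archives[KEEP:]:
        old.unlink()  # drop everything older than the newest KEEP

if __name__ == "__main__":
    rotate()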
 
2 TB of storage in a hot-swap drive bay for everything important. One 250 GB SSD for the Win 10 OS. One 250 GB SSD with a clean Win 10 that is nothing but an emergency backup. One 250 GB SSD for my Linux OS, one 250 GB SSD for learning different OSes, and one 250 GB SSD with Win 7 for legacy devices.

All of the SSDs sit in a four-slot, hot-swappable carriage that takes up one 5.25" drive bay.
I only use USB drives to transfer information.
 