At the beginning of 2020 the church switched from relying on Customware to provide network administration to hiring a part-time network administrator (my own self, Ben Bellamy). It took over a month to get the administrative credentials from Customware so we could access the administrative portions of the file server, or any other device in the church. Once we had access, backups became a priority. Customware had been performing backups in some fashion, but they would not disclose exactly how. I found a few backup programs on the file server, but none were active or had any configuration files or settings. So we do not know how backups had been performed.
Note that the following is the most recent information. At the end of this page is the previous documentation, which no longer applies but is being kept for historical purposes.
Ben, get some screen shots and flesh this out.
https://www.idrive.com/
At the time this subscription was purchased, the following offer was in effect:
* Important note on the offer: Normal storage for the Personal Plan is 2 TB, but for a limited time it is set at 5 TB, and if you sign up now it will stay at 5 TB. Similarly, the normal 5 TB Personal Plan is set at 10 TB for a limited time.
Pricing Terms
WARNING: IDrive does not store your private encryption key on its servers. It uses a special technique to encrypt your private encryption key that can only be used to verify, but not retrieve it. It is recommended that you archive it safely. Without your private encryption key, you will NOT be able to retrieve your data. We will not be able to provide the private encryption key information.
The iDrive subscription is used to configure the File Server (staffsrv 10.32.10.11) to back up its files to the iDrive 'cloud' as production changes are noticed. Then each evening it performs a backup of all changed files.
All of this is configured by logging into the iDrive account, going to the backup section (and logging in there as well), and then setting the configuration.
So we developed a new procedure which is detailed below.
First, we identified the disk resources on the file server. There are two network-based disk systems: a Synology, which was powered off, and a WD My Cloud. In the Inventory spreadsheet there is a sheet that details these disks, which are mounted, or mapped, by the file server.
There are two basic groups of files on the file server: the production files and the media files. The media files do not need to be backed up; they will be archived.
The following is the approach we are taking.
Subscribe to a cloud-based backup service. Then begin backing up only the file server regularly. It is not enough to ask staff to keep all of their important files on the file server rather than on their local hosts; I need to provide an easy-to-use routine that copies all 'important' files up to the file server.
Keep in mind the difference between backing up and archiving. The off-site service is for backups, not for archiving. Archiving should be done by moving files to off-line storage media such as CDs, DVDs, and removable USB media. Note that SSD media is not appropriate for either backups or archiving (SSDs can begin losing data within 7 days).
So, in coordination with staff, take the following steps.
1. Ask staff to begin moving their important files to the file server, and to work with them there where they can and are comfortable doing so. For the important files they want to keep on their local systems, provide the update-to-the-file-server routine.
2. At the same time, add a script to each client host that backs up selected files from the local drive to the file server, where the off-site backup will run (see the sketch after this list).
3. Have staff examine their file server files and delete what is no longer needed, or move it to their local drives in order to make room for the new files that should be on the file server and consequently backed up.
4. I need to do some disk management to make sure there is adequate disk space to store everything that needs to be backed up. The file server has some extra partitions that need to be figured out.
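As a sketch of the client script mentioned in step 2 (the folder names here are only illustrative, and a Windows host would use an equivalent robocopy batch file instead):

#!/bin/bash
# Copy a user's selected 'important' folders up to the Staff share on the
# file server so the off-site backup running there will pick them up.
SRC="$HOME/Documents/Important"      # illustrative local folder
DEST="/mnt/staff/$USER/Important"    # illustrative mount point of the Staff share
mkdir -p "$DEST"
rsync -av "$SRC/" "$DEST/"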
I need to develop an archive of backup configuration files and other system files from the different devices and services.
Second layer
6. Client near-line storage. This would be the next layer in the hierarchy. Staff can decide to keep files backed up on a local USB device in addition to the off-site process.
These are the shares:

Sharename     Type   Comment
---------     ----   -------
ADMIN$        Disk   Remote Admin
C$            Disk   Default share
D$            Disk   Default share
E$            Disk   Default share
IPC$          IPC    Remote IPC
S$            Disk   Default share
Staff Drive   Disk
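For reference, a listing like this can be reproduced from a Linux host with smbclient (substitute an account that has access to the server):

$ smbclient -L 10.32.10.11 -U <admin account>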
I found this on the root of the file server. 10.32.10.97 is the WD My Cloud.

/media/ben/winfs$ cat NAS_to_My_Cloud.BAT
ROBOCOPY \\10.32.10.96\BackupDrive\Backups\Staff \\10.32.10.97\Backups\Staff /MIR
This is the best cloud-based backup service I have found so far. The cost is US$10.00 per month per device ("Always just US$10 a month per device").
No file size restrictions
This would be the file server, where I plan to consolidate all of the material that needs to be backed up.
The service I have selected is CrashPlan
Note that there are currently two devices being backed up to the CrashPlan account: one is 'Staff' and the other is 'Ben's Laptop'. There is a monthly cost of $10.00 per host. I (Ben Bellamy) am paying for the backup of my laptop, while the church is paying for the staff backup. Be sure to keep the two separate when restoring or configuring.
External address: 67.131.58.35
Internal address: 10.32.10.3:4242
GUID: 940981828364981175
Username: wbbellamy@gmail.com
Hostname: STAFF
The material below deals with the new backup processes for both the Staff backups to CrashPlan and the rsync backups of the media material.
June 2020
Previously there did not appear to be any backup process for the Media G: and H: files which were housed on the Media Server.
Using CrashPlan to back up the many terabytes of media material is impractical due to the processing necessary to constantly check so many large files for changes, report that information across the LAN, and then upload many large files whenever they do change. So a separate backup approach is needed.
So I am setting up a dedicated Linux server with adequate disk capacity to keep a local copy of the Media files. Initially, a single 8 TB disk in an older PC is being used as a prototype of the process. It will run a cron job (scheduling program) to launch a shell script (batch file) that will run the backup commands to copy updates from the production material to the local backup.
The media backup server will be housed in Ben Bellamy's office due to the safety that room provides, and its separation from the production servers.
On a nightly schedule, the backup server will copy the updated files from the media areas onto the bkup server.
Thought should be given to some way to keep an off-site copy of the media backups.
Initially the Linux cp utility was used to perform the backup. Soon after, the rsync program was adopted instead. No commercial backup product is necessary at this time for the media backups.
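For illustration, the difference between the two approaches looks roughly like this (the paths are placeholders, not the actual mount points):

# full copy with cp: recopies everything on every run
cp -a /mnt/media/. /bkup/media/

# incremental copy with rsync: only new or changed files are transferred
rsync -av /mnt/media/ /bkup/media/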
Below are the scripts and commands used to run both nightly and weekly backup routines. All aspects of both steps are shown below and could be run manually. Following that information, steps for restoring are illustrated.
Currently (Tue 09 Jun 2020 02:18:45 PM EDT) the bkupsrv is:
From a bash terminal launch the following commands:
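A minimal sketch of what a manual run looks like, assuming the script sits in Ben's home directory:

$ sudo ~/bkup.routine.sh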
This causes the backup process to run under the root account. Consequently, all backed-up files are owned by root. Keep this in mind when restoring or working with the backed-up files.
At this point, the backup routine (~/bkup.routine.sh) is run via cron job.
The cron program is used to schedule the running of the bkuproutine.sh script. Note that cron files are stored in /var/spool/cron/crontabs/(user name), but they should only be edited with crontab -e.
List the cron jobs:
$ crontab -l
Edit the cron jobs. This will start the nano editor for editing the cron file for the current user account.
$ crontab -e
In the case of the backups, they are scheduled to run at 01:00 AM every night. The cron command to do this is shown below:
0 1 * * * ~/bkuproutine.sh
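The five fields are minute, hour, day of month, month, and day of week, so 0 1 * * * means minute 0 of hour 1, every day.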
The bkupsrv host is running with a single 8 TB disk. A second will be installed and LVM will be used to merge the two drives into a single volume.
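A rough sketch of adding that second disk with LVM, assuming the existing disk is already part of an LVM volume group (the device, group, and volume names are illustrative):

$ sudo pvcreate /dev/sdb                         # initialize the new 8 TB disk
$ sudo vgextend bkupvg /dev/sdb                  # add it to the existing volume group
$ sudo lvextend -l +100%FREE /dev/bkupvg/bkuplv  # grow the logical volume across both disks
$ sudo resize2fs /dev/bkupvg/bkuplv              # grow the ext4 filesystem to match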
The bkupsrv is located in Ben Bellamy's office for physical safety and to keep it separate from the computer room for redundancy.
Every night a shell script (~/bkup.routine.sh) will run via cron. That script copies any updated files and then emails a report.
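The script itself is not reproduced here; a minimal sketch of that kind of routine, with the mount points, destination paths, and email address all assumed, looks like this:

#!/bin/bash
# Nightly media backup: mirror the production media areas to the local
# backup disk, then email the rsync output as a report.
LOG=/tmp/media-backup-$(date +%F).log
rsync -av --delete /mnt/media-g/ /bkup/media-g/  > "$LOG" 2>&1
rsync -av --delete /mnt/media-h/ /bkup/media-h/ >> "$LOG" 2>&1
mail -s "Media backup report $(date +%F)" admin@example.org < "$LOG"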
The following material will be used to move from cp to rsync for backing up the media material to the bkupsrv.
The command below will synchronize the contents of Directory1 to Directory2, leaving no differences between the two. If rsync finds that Directory2 has a file that Directory1 does not, it will delete it. If rsync finds a file that has been changed, created, or deleted in Directory1, it will reflect those same changes in Directory2.

$ rsync -av --delete /Directory1/ /Directory2/

1. -a = archive: recursive (recurse into directories), links (copy symlinks as symlinks), perms (preserve permissions), times (preserve modification times), group (preserve group), owner (preserve owner), and preserve device and special files.
2. -v = verbose. The reason verbose is important is so you can see exactly what rsync is backing up. Think about this: what if your hard drive is going bad and starts deleting files without your knowledge, then you run your rsync script and it pushes those changes to your backups, thereby deleting all instances of a file that you did not want to get rid of?
3. --delete = This tells rsync to delete any files that are in Directory2 that aren't in Directory1. If you choose to use this option, I recommend also using the verbose option, for the reasons mentioned above.

Ref: https://www.howtogeek.com/135533/how-to-use-rsync-to-backup-your-data-on-linux/
Cron can be used on Linux to automate the execution of commands, such as rsync. Using cron, we can have our Linux system run nightly backups, or however often you would like them to run. To edit the cron table file for the user you are logged in as, run:

$ crontab -e

The line in cron to launch bkuproutine.sh every morning at 01:00 AM is:

0 1 * * * ~/bkuproutine.sh
Suppose we want to copy files and directories from a remote machine (192.168.1.29) to our local system. In the example below, the remote folder /opt/rpms_db is copied to the local machine under the /tmp folder:

# rsync -zarvh root@192.168.1.29:/opt/rpms_db /tmp

Ref: https://www.linuxtechi.com/rsync-command-examples-linux/
Be sure to review this URL: https://opensource.com/article/19/5/advanced-rsync
You can perform a restore from practically any machine, but it tends to be easiest if you are sitting at the host you are restoring files to. For this example, assume you are sitting at a user host rather than the staff server.
Note that within the CrashPlan application you can open a command window where you can execute several commands. See https://support.code42.com/CrashPlan/6/Configuring/Pause_backups_and_downloads for details.