There are many tools today that can back up your data. Most of them come with shiny eye-candy GUIs, and with a few clicks you can synchronize your data to Dropbox, Google Drive or wherever you want. So, why not use one of them and end this blog post right here? First of all, these solutions are boring; second, there is the problem of handing your data to third parties (call me crazy, but I’m never going to upload private SSH keys to Google); and finally, I wanted daily snapshots. So, I wrote a small shell script that does the job.
```bash
#!/bin/bash
## Automated backup script
## Uploads backed up archives to the server, runs daily
## Author Milos Milutinovic

#create archives into this dir
cd /home/milos/secure/tmp/bkp

#archive name for today, in YYYY-MM-DD format
today=$(date +%F)

#archive the SSH keys
tar -cjf ssh.tar.bz2 /home/milos/.ssh

#encrypt SSH archive
rm ssh.tar.bz2.gpg #remove old
gpg --passphrase-file /home/milos/secure/keys/gpg.key --simple-sk-checksum --batch -c ssh.tar.bz2
rm ssh.tar.bz2 #remove plaintext

#same treatment for the secure folder
rm sec.tar.bz2.gpg #remove old
tar -cjf sec.tar.bz2 /home/milos/secure
gpg --passphrase-file /home/milos/secure/keys/gpg2.key --simple-sk-checksum --batch -c sec.tar.bz2
rm sec.tar.bz2 #remove plaintext

#create one daily archive with all backup dirs
tar -cjf $today.tar.bz2 ssh.tar.bz2.gpg /home/milos/scripts /home/milos/Documents/AC /home/milos/secure/tmp/bkp/sec.tar.bz2.gpg /home/milos/Documents/db1/code

#scp to the server
scp -p -i /home/milos/secure/keys/bkpuser $today.tar.bz2 firstname.lastname@example.org:/path/to/folder/
```
Let me explain it. The first interesting bit is how the archive name is generated: `date +%F` gives the current date, so the daily archive name is in YYYY-MM-DD format. Then I archive my ~/.ssh folder and encrypt it with gpg, using symmetric encryption with a passphrase file stored in a secure location. Before encrypting, I remove the encrypted archive from the previous day, and after encrypting, I remove the plaintext one.
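Restoring works in reverse: decrypt with the same passphrase file, then untar. Here is a self-contained sketch of the round trip, using a scratch directory, a throwaway passphrase file and a stand-in for ~/.ssh rather than the real paths (note that newer GnuPG versions need `--pinentry-mode loopback` before they will read a passphrase file in batch mode):

```shell
cd "$(mktemp -d)"                        # scratch dir instead of the real backup dir
echo 'correct horse battery staple' > gpg.key   # throwaway passphrase file
mkdir ssh && echo 'fake key' > ssh/id_rsa       # stand-in for ~/.ssh

# backup side: archive, then encrypt symmetrically, as in the script
tar -cjf ssh.tar.bz2 ssh
gpg --batch --pinentry-mode loopback --passphrase-file gpg.key -c ssh.tar.bz2
rm ssh.tar.bz2 ssh/id_rsa && rmdir ssh

# restore side: decrypt the .gpg file, then unpack the archive
gpg --batch --pinentry-mode loopback --passphrase-file gpg.key -o ssh.tar.bz2 -d ssh.tar.bz2.gpg
tar -xjf ssh.tar.bz2
cat ssh/id_rsa
```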
I then do a similar thing with another location I want to back up securely, and finally I create the daily archive that contains all of the data. You might say that for those encrypted archives I didn’t need the bzip2 option (I could have created plain .tar archives), since they get packed into the final compressed archive anyway, but think again. Those archives are encrypted: had I created uncompressed tar archives (which are compressible) and then encrypted them, the final compression pass would gain nothing, because encrypted data looks like random data, and random data is not compressible. Compressing before encrypting is the only order that works.
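The point about random data is easy to demonstrate: highly compressible input shrinks dramatically under bzip2, while the same amount of random bytes (which is what encrypted output looks like) does not shrink at all, and usually grows slightly from the container overhead. A quick sketch, assuming `bzip2` and `/dev/urandom` are available:

```shell
# 100 KB of zeros compresses down to a few dozen bytes
head -c 100000 /dev/zero | bzip2 | wc -c

# 100 KB of random bytes does not shrink under bzip2
head -c 100000 /dev/urandom | bzip2 | wc -c
```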
Another approach would be to create a folder each day, put the several archives in it, and upload the folder to the server. This would be a bit more efficient, as it would avoid running bzip2 compression on archives that are already compressed (and encrypted), but the difference is negligible, and having single files instead of folders makes it a lot easier to get rid of old files on my server. On the server, I just have something like this in a file in /etc/cron.daily:
```bash
find /var/www/miloske.tk/bkp/ -type f -mtime +15 | xargs -r rm
```
This deletes any files older than 15 days in this location.
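You can watch this pruning in action by giving a file an artificially old modification time (a sketch using GNU touch's `-d` option in a scratch directory instead of the real bkp folder):

```shell
d=$(mktemp -d)                           # scratch stand-in for the bkp folder
touch "$d/new.tar.bz2"                   # fresh file, should survive
touch -d '20 days ago' "$d/old.tar.bz2"  # old file, should be deleted

# same pruning command as on the server
find "$d" -type f -mtime +15 | xargs -r rm

ls "$d"   # only new.tar.bz2 remains
```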
In the end, I scp the data to my server. I’m uploading only one file, so rsync is not necessary here. I do use rsync on my home backup server to pull the data from the online server, but there I’m synchronizing several folders, so rsync earns its keep. This script is set to run as a cron job on my work machine, so I always have backups of my important files.
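The cron job itself is just an ordinary crontab entry; something like the following, where the script path and the time of day are placeholders to adjust to taste:

```
# run the backup script every day at 03:00 (hypothetical path)
0 3 * * * /home/milos/scripts/backup.sh
```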