My webhosting is on a virtual machine hosted by Linode. Although Linode has great uptime and a good backup system, I also do my own backups to a local Linux server. It isn’t hard to set up, uses only free scripts, and is now completely automated.
Why should you do your own backups instead of relying solely on your webhost? There are several reasons. First, if you ever have a problem with your host (a billing dispute, a TOS-violation claim against you, or anything else that could put your hosted data in jeopardy), you still have all of your data. Second, keeping a local copy gives you more flexibility over your backups: I personally keep about three months’ worth, which would be far more expensive through my host. Finally, a local copy of your data makes it easy to set up a test environment if you need one.
With the approach described here, I capture backups of both the webserver files and the databases.
Each night, the databases are backed up on the webserver itself. Because my databases aren’t that large, I keep multiple copies there. After the database backups are complete, I synchronize the webserver files and the database backups to my local machine. Finally, I create a rotating archive of the files locally, so I always have multiple generations of backups, which is a good practice for critical data.
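The local half of that nightly routine can be sketched as a short script. This is only an illustration, not the actual dailymaint.sh: the directory paths, the 90-archive retention count, and the archive naming are all my assumptions here.

```shell
#!/bin/sh
# Sketch of the nightly local step: archive the synced site files and
# rotate old archives. SYNC_DIR and ARCHIVE_DIR are hypothetical paths;
# the real dailymaint.sh uses whatever you configure in it.
SYNC_DIR=${SYNC_DIR:-$HOME/backups/sync}
ARCHIVE_DIR=${ARCHIVE_DIR:-$HOME/backups/archives}
KEEP=90   # roughly three months of daily archives

mkdir -p "$SYNC_DIR" "$ARCHIVE_DIR"

# (The rsync-from-webserver step would run first, along the lines of:
#  rsync -az -e "ssh -i $HOME/.ssh/webserverrsync-key" \
#        user@yourwebsite.com:public_html/ "$SYNC_DIR/")

# Create today's archive from the synced files.
tar -czf "$ARCHIVE_DIR/site-$(date +%Y%m%d).tar.gz" -C "$SYNC_DIR" .

# Keep only the newest $KEEP archives; delete the rest.
ls -1t "$ARCHIVE_DIR"/site-*.tar.gz | tail -n +$((KEEP + 1)) | xargs -r rm --
```

The `ls -1t | tail | xargs rm` rotation is the simplest scheme that works when archives are created once per day; tools like logrotate can do the same job if you prefer.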
Before you begin, be aware of a few basic requirements. Not every host gives you the flexibility to do this kind of backup; the biggest stumbling block will probably be ssh access. If you don’t have ssh access to your webserver, you won’t be able to implement this solution. For the local computer, a Linux box that is on 24/7 is ideal, but you can also adapt the instructions for a Windows computer. And if it isn’t on 24/7, you can schedule the backups for when it is on, or run them manually.
- ssh – The standard remote shell program (Windows equivalents exist). Note that many webhosts don’t allow ssh access.
- rsync – A great utility for synchronizing data between two locations.
- mail – This, or any other simple mail utility that can be called from a script.
- automysqlbackup – http://sourceforge.net/projects/automysqlbackup/
- filesbackup.sh – https://pganderson.com/files/filebackup.sh.txt (remove .txt and edit file)
- dailymaint.sh – https://pganderson.com/files/dailymaint.sh.txt (remove .txt and edit file)
- Download the latest version of automysqlbackup from http://sourceforge.net/projects/automysqlbackup/
- Untar the file to an empty folder (automysqlbackup).
- Open the README file for detailed setup instructions.
- Copy the folder to your user directory on your website.
- Run install.sh
Passwordless rsync setup
Generate the public/private key pair:
ssh-keygen -t dsa -b 1024 -f $HOME/.ssh/webserverrsync-key
* use a blank passphrase
Copy the public key to every host you will connect TO:
scp ~/.ssh/webserverrsync-key.pub email@example.com:~/.ssh/webserverrsync-key.pub
* this should prompt you for a password
SSH into your website:
Authorize the key by adding it to the list of authorized keys:
cat ~/.ssh/webserverrsync-key.pub >> ~/.ssh/authorized_keys
Log out of the current shell.
Test that you can log in with no password:
ssh -i ~/.ssh/webserverrsync-key yourwebsite.com
If this still prompts for a password:
- ensure the remote user owns the authorized_keys file (and the ~/.ssh directory)
- ensure the private key “webserverrsync-key” is readable only by your local user
- make sure you didn’t use a passphrase when you generated the key
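In practice these failures are almost always file-permission problems, because sshd silently ignores keys that are group- or world-accessible. A sketch of the usual fix, assuming the default paths used above (the private-key line applies on your local machine, the authorized_keys line on the webserver):

```shell
# Tighten the permissions sshd requires before it will honor key auth.
mkdir -p ~/.ssh
chmod 700 ~/.ssh
# Local machine: the private key must be readable by the owner only.
[ ! -f ~/.ssh/webserverrsync-key ] || chmod 600 ~/.ssh/webserverrsync-key
# Webserver: the same goes for the authorized_keys list.
[ ! -f ~/.ssh/authorized_keys ] || chmod 600 ~/.ssh/authorized_keys
```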
Set up cron to run the jobs at a scheduled time:
30 3 * * * /usr/local/bin/automysqlbackup /etc/automysqlbackup/myserver.conf
30 4 * * * /home/yourname/dailymaint.sh
Make the directories for the backup files:
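For example (these directory names are my own illustration; use whatever paths your copy of dailymaint.sh and your rsync command actually expect):

```shell
# Hypothetical backup layout under the local user's home directory.
mkdir -p ~/backups/mysite/files      # rsync target for the webserver files
mkdir -p ~/backups/mysite/db         # synced automysqlbackup dumps
mkdir -p ~/backups/mysite/archives   # rotating local tar archives
```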