Simple VPS Server Backups

Datetime: 2016-08-22 23:56:13

Most, if not all, VPS providers worth their salt these days offer a full backup solution where you can simply restore to a previous backup if something goes wrong. But that doesn’t mean we should be complacent and just rely on the provider to do all the work for us.

Luckily it’s pretty simple to make backups these days, and with services like Amazon S3, storing those backups is also dirt cheap.

I have three little scripts which I use alongside DigitalOcean’s automated backups, just in case there is ever an issue. Plus, with DigitalOcean at least, the automated backups don’t happen every single day, so if their backup is a few days stale you can use this to patch the changes their backup missed.

First, start by creating a folder on the root filesystem called /backup.
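On most systems that’s just:

    sudo mkdir /backup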

Full File System Backup

While generally the least important part of a backup these days (as most sites are version controlled), it’s always good to keep a full filesystem backup. This backup script simply automates the process of creating a tar archive of the entire system, minus a few locations you don’t need to back up. I wouldn’t necessarily recommend doing a filesystem restore from this type of backup (although it is possible), but it can always help by giving you an exact copy of everything your server hosted, like config files, upload directories, etc.

This little bash script backs up the filesystem and deletes backups older than 7 days.

Simply place this in the /etc/cron.daily directory to run the backup every day.
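A minimal version might look something like this (the archive name, the list of excluded paths, and the exact clean-up command are just examples; adapt them to your own server):

    #!/bin/bash
    # Where the archives live and today's date for the filename
    BACKUP_DIR=/backup
    DATE=$(date +%Y-%m-%d)

    # Tar up the whole filesystem, skipping paths that don't need backing up
    tar -czpf "$BACKUP_DIR/filesystem-$DATE.tar.gz" \
        --exclude=/backup \
        --exclude=/proc \
        --exclude=/sys \
        --exclude=/dev \
        --exclude=/run \
        --exclude=/tmp \
        --exclude=/mnt \
        --exclude=/media \
        /

    # Delete filesystem archives older than 7 days
    find "$BACKUP_DIR" -name 'filesystem-*.tar.gz' -mtime +7 -delete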

Full MySQL Server Backup

This one is arguably more important as your database is usually the data you want back after a system failure.

What’s neat about this little script is that you don’t need to back up every database yourself; just give it a user account capable of accessing all databases and it will search the server for all of them and create a backup of each for you.

Again, simply place this in the /etc/cron.daily directory to run the backup every day.

Don’t forget to replace {{user}} and {{password}} in the script below.
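The idea is roughly the following (the mysqldump options and the 7-day clean-up here are assumptions; tweak them to suit):

    #!/bin/bash
    # MySQL account that can read every database
    DB_USER={{user}}
    DB_PASS={{password}}

    BACKUP_DIR=/backup
    DATE=$(date +%Y-%m-%d)

    # Find every database on the server, skipping MySQL's internal schemas
    DATABASES=$(mysql -u "$DB_USER" -p"$DB_PASS" -N -e 'SHOW DATABASES;' \
        | grep -Ev '^(information_schema|performance_schema|mysql|sys)$')

    # Dump each database to its own gzipped file
    for DB in $DATABASES; do
        mysqldump -u "$DB_USER" -p"$DB_PASS" --single-transaction "$DB" \
            | gzip > "$BACKUP_DIR/$DB-$DATE.sql.gz"
    done

    # Delete database dumps older than 7 days
    find "$BACKUP_DIR" -name '*.sql.gz' -mtime +7 -delete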

Automate delivery to Amazon S3

Right, so now we have backups of everything, but they’re still on the server, which doesn’t really give you any protection at all; they need to be hosted elsewhere to be truly useful. Let’s take care of that. But before we do, we need to install a couple of packages: trickle and s3cmd.

Let’s start with trickle:

sudo apt-get install trickle

That was pretty simple. trickle is simply a bandwidth limiter for a command; I recommend it as you don’t want your backup upload to flood your network connection, leaving website visitors with long waits.

We will come back to this in a moment.

Next we need to install s3cmd, found here: http://s3tools.org/repositories.

First, simply try sudo apt-get install s3cmd, and if it can’t find the package, follow the docs from the link above:

  1. Import S3tools signing key: wget -O- -q http://s3tools.org/repo/deb-all/stable/s3tools.key | sudo apt-key add -
  2. Add the repo to sources.list: sudo wget -O/etc/apt/sources.list.d/s3tools.list http://s3tools.org/repo/deb-all/stable/s3tools.list
  3. Refresh package cache and install the newest s3cmd: sudo apt-get update && sudo apt-get install s3cmd

Once we have these packages we can then create a config file to store our AWS credentials.

Don’t forget to replace {{access_key}} and {{secret_key}} with your AWS keys.
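A bare-bones config only really needs the two keys; something like this is enough (s3cmd falls back to its defaults for everything else, and you can always run s3cmd --configure to have it generate a fuller file for you):

    [default]
    access_key = {{access_key}}
    secret_key = {{secret_key}}
    use_https = True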

I would suggest saving this file in your home directory as ~/.s3cfg, as it will then be picked up automatically if you want to run s3cmd from the terminal without having to provide a config file location.

Then we just need to create the sync script to place in the /etc/cron.daily directory:

Don’t forget to replace {{user}}, {{bucketname}}, and {{location}}.
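The sync itself might look something like the following (the --config path assumes the .s3cfg file lives in {{user}}’s home directory):

    #!/bin/bash
    # Sync the /backup folder to S3, with trickle capping the upload at 4000 KB/s
    trickle -s -u 4000 s3cmd sync --config=/home/{{user}}/.s3cfg \
        /backup/ s3://{{bucketname}}/{{location}}/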

In the script above we have set s3cmd to upload whatever resides in the /backup folder to the S3 bucket and sub-location within the bucket. We have then wrapped this in a trickle command, which will limit the transfer rate to 4000 KB/s, limiting the impact on any outward-facing websites residing on the server.

You can test any of these scripts by running sudo ./scriptname to make sure everything is working.
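For example, assuming a hypothetical script name of backup-filesystem (note that on Debian/Ubuntu, run-parts ignores cron.daily scripts with dots in their names, so leave off any .sh extension):

    sudo chmod +x /etc/cron.daily/backup-filesystem
    sudo /etc/cron.daily/backup-filesystem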

Then hop over to your S3 bucket and set some expiry policies to purge old backups at an interval you’re comfortable with.
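Depending on your s3cmd version, you may even be able to set a simple rule from the terminal instead of the console, something along these lines (check s3cmd --help, as the expire options vary between releases):

    s3cmd expire s3://{{bucketname}} --expiry-days=30 --expiry-prefix={{location}}/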

Conclusion

There are more sophisticated backup methods and solutions, which I’d advise you to look into as well, but that doesn’t mean we can’t have a solid backup-backup solution should the primary backup service fail to deliver during an outage.