Not going to get fancy with this one: it’s a bash shell script that takes the last X backups and pushes them to Amazon S3 for disaster recovery. I wrote a couple of versions before settling on this simple one. It requires s3cmd, which, if you’re using Linux, is pretty much the only way to interface with S3 and stay sane. Here are docs on how to install it with yum if you’re on Red Hat/CentOS.
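If you just want the short version of the install (assuming you have the EPEL or s3tools repository enabled; their docs cover adding it):
# s3cmd lives in the EPEL / s3tools repos, not base CentOS; enable one first, then:
yum install s3cmd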
Now, to the script:
#!/bin/bash
# init vars
BACKUP_ROOT=/dblogs/backups
S3CMD=/usr/bin/s3cmd
S3DIR=s3://your-bucket-name/your-folder-name
# take the last X backups and make sure they are on S3 (but not older ones)
BACKUP_COUNT=5
BACKUP_FILELIST=/tmp/.pgbackup_s3cmd_filelist
# first, list out the newest X files we want to sync up; ls sorts by name,
# so this assumes backup filenames sort chronologically (e.g. timestamped)
ls -1 "$BACKUP_ROOT" | tail -n "$BACKUP_COUNT" > "$BACKUP_FILELIST"
# now sync them; requires the trailing slash to sync directories
$S3CMD sync --progress --delete-removed --acl-private --exclude '*' --include-from "$BACKUP_FILELIST" "$BACKUP_ROOT" "$S3DIR/"
# clean up the temp file list
rm -f "$BACKUP_FILELIST"
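To try it out, save the script somewhere like /usr/local/bin/backup_to_s3.sh (the name and location here are just examples), make it executable, and run it by hand once s3cmd is configured (next paragraph):
# make the script executable and run it manually to watch the transfer
chmod +x /usr/local/bin/backup_to_s3.sh
/usr/local/bin/backup_to_s3.sh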
Set up s3cmd after installing it by running s3cmd --configure with your Amazon credentials handy. I also use S3Fox, a Firefox add-on, as another way of quickly accessing S3 with a GUI.
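Once configured, a quick sanity check that your credentials and bucket are working:
# list the target folder to confirm access (substitute your actual bucket/folder)
s3cmd ls s3://your-bucket-name/your-folder-name/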
As written, the script keeps only the last 5 backups on S3, deleting what would be the 6th each night when cron runs it. Adjust the BACKUP_COUNT parameter to keep more or fewer. I’m personally using this to back up a PostgreSQL server, but you can point it at any directory of files.
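For the nightly run, a crontab entry along these lines does it (the 2am schedule, script path, and log file are just examples):
# run the S3 sync every night at 2am, appending output to a log
0 2 * * * /usr/local/bin/backup_to_s3.sh >> /var/log/backup_to_s3.log 2>&1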
Update 2/17 – I realized the default behavior is not to delete files on the remote side that no longer exist locally. I updated the script above to include the --delete-removed argument, and now it deletes the oldest file as expected.
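If --delete-removed makes you nervous, s3cmd also has a --dry-run flag that shows what a sync would transfer and delete without touching anything, so you can preview the same command first:
# preview what sync would upload and delete, without actually doing it
s3cmd sync --dry-run --delete-removed --acl-private --exclude '*' --include-from /tmp/.pgbackup_s3cmd_filelist /dblogs/backups s3://your-bucket-name/your-folder-name/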