Backup Scripts for Cloud Sites
If you like this script and use it, please consider donating below to offset my costs - after all, you know it's hard to find this sort of thing!
First, a disclaimer: I am a web designer; I make things look pretty. I don't know how to write command-line code. I don't know how this all functions, but it does, and it has been doing so on my sites for about 6 months now. You're completely on your own to use and implement this.
What I was looking for:
Like many of you, I have clients on Rackspace's Cloud Sites and was looking for a way to run backups on a regular, automated schedule.
What this is:
I worked with a very talented developer to come up with two scripts that can be uploaded to the Cloud Sites directory and run nightly with a cron job. They copy the files associated with the web site as well as any defined databases. All the content is compressed and stored locally each time the script is run (in my case nightly). Then, at the end of the cycle of runs (every 7 days in my case), the script FTPs the compressed archives off-site to an FTP server I have.
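The cycle described above can be sketched roughly like this. To be clear, this is a minimal illustration, not the actual script: the variable names and paths are my own, and the echo stands in for the real mysqldump and FTP steps.

```shell
#!/bin/sh
# Illustrative sketch of the nightly backup cycle (not the real script).

BACKUP_DIR="./backup"   # where archives accumulate locally (example path)
SITE_DIR="./web"        # site files to archive (example path)
CYCLE=7                 # ship archives off-site every 7th run

mkdir -p "$BACKUP_DIR" "$SITE_DIR"

# 1. Archive the site files. A real run would also mysqldump each
#    configured database into the same folder before compressing.
STAMP=$(date +%Y%m%d)
tar -czf "$BACKUP_DIR/site-$STAMP.tar.gz" "$SITE_DIR"

# 2. Track how many runs have happened since the last off-site transfer.
COUNT_FILE="$BACKUP_DIR/.run-count"
COUNT=$(cat "$COUNT_FILE" 2>/dev/null || echo 0)
COUNT=$((COUNT + 1))

# 3. On the Nth run, ship the local archives off-site and reset the count.
if [ "$COUNT" -ge "$CYCLE" ]; then
    echo "cycle complete: FTP the archives off-site here"
    COUNT=0
fi
echo "$COUNT" > "$COUNT_FILE"
```

The counter file is what lets the script keep 7 local copies and only do the (slower) FTP transfer once per cycle.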
What this is NOT:
A fully secure/vetted solution. I'm NOT a security expert, nor am I transferring/backing up data that is sensitive. This fits my needs, and in searching I haven't seen anything like it, so I thought I would share.
How to implement:
Download a .zip of the files here.
There are two shell script files, backup-config.sh and backupmysql.sh. I created a "backup" folder in my Cloud Site's domain root (right alongside "logs", "lib" and "web"). Into that "backup" folder I put both files after configuring them. Then you go into the Cloud control panel for your site and set up a cron job that points to "backup/backup-config.sh".
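For reference, the schedule you set up in the control panel amounts to a traditional crontab entry like the one below. This is illustrative only - the time and path are placeholders, and on Cloud Sites you configure the job through the panel rather than editing a crontab yourself.

```shell
# Run the backup script nightly at 2:00 AM (placeholder time and path).
0 2 * * * /bin/sh /path/to/your-site/backup/backup-config.sh
```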
That's it. Wait for the cron to run and see the results in both the local backup folder and on any FTP server you define. Below is my attempt at a plain-English explanation of the config file options.
- The full path to your site, as defined under "Features" in your Cloud Sites control panel.
- The name of your domain, with the "www".
- The number of iterations/local backups before the script sends a copy to your off-site FTP server. In my case the script runs every night, creating 7 local copies; on the 7th night it cycles and triggers the FTP transfer.
- The name of the first database you'd like to back up.
- The cloud host where your database lives.
- The username for the first database you'd like to back up.
- The password for the first database you'd like to back up.
- The same settings repeated for a 2nd (and any additional) databases.
- A simple way to omit one or more directories you don't want backed up. In my example, I had a video folder full of large files that I also had locally, so I didn't want to include it in the backup archive.
- The URL of your off-site FTP server where you want the backups sent.
- Your remote FTP username.
- Your remote FTP password.
- The directory path where you want the backups to go on the remote server.
- You can edit this if you NEED to, but it should be good for Cloud Sites as of 10/13/11.
- This needs to match wherever you upload the script files to - as I said, "backup/" in my example.
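Pulling the list above together, the settings in backup-config.sh would look something like the fragment below. The variable names and values here are my own illustrative guesses - check them against the actual file, since I'm describing the options in plain English rather than quoting the script.

```shell
# Hypothetical backup-config.sh settings (names/values are placeholders).
SITE_PATH="/path/to/your-site"       # full path from "Features" in the panel
DOMAIN="www.example.com"             # your domain, with the "www"
ITERATIONS=7                         # local backups before the off-site FTP

DB1_NAME="example_db"                # first database to back up
DB1_HOST="mysql.example-host.com"    # cloud host of the database
DB1_USER="db_user"                   # database username
DB1_PASS="db_password"               # database password
# ...repeat the four settings above for a 2nd (and more) databases

EXCLUDE_DIRS="web/content/video"     # directories to omit from the archive

FTP_HOST="ftp.example.com"           # off-site FTP server
FTP_USER="ftp_user"                  # remote FTP username
FTP_PASS="ftp_password"              # remote FTP password
FTP_DIR="/backups/example.com"       # remote directory for the archives
BACKUP_DIR="backup/"                 # must match where you uploaded the scripts
```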
So that's pretty much it - download it, try it, use it. If you like it, buy me a beer through PayPal below. I spent a lot of time and money working with my developer to get this to do everything I needed, while still being flexible enough to use from one site to the next in my Cloud Sites environment. If you have any suggestions on how to improve the script or how to edit this how-to, please e-mail me at: firstname.lastname@example.org - Thanks, Jay.