Even when Linode's backup system comes out of beta, you should not rely on it exclusively. Depending on your OS of choice, recovering from a total failure may be quick or slow. At any rate, critical data should be backed up off-site. This includes server/service configuration as well as service data -- web-served files, databases, and so on.
One option is to have another Linode just for backups. A cheaper option is to purchase monthly FTP hosting for a few bucks. Either way, a cron script can make regular daily (or more frequent, if required) backups of critical data.
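For the scheduling side, a single crontab entry is enough. A sketch, assuming your backup script is saved at /usr/local/bin/backup.sh (a placeholder path -- point it at your own script):

```
# crontab entry (edit with `crontab -e`): run the backup every night at 03:30,
# appending output to a log so failures leave a trace.
30 3 * * * /usr/local/bin/backup.sh >> /var/log/backup.log 2>&1
```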
Putting critical backups on a hosted service should, in my opinion, always include encryption. The process I recommend is as follows:
1. Dump the database(s) into relevant SQL file(s), or whatever format is suitable for quick recovery.
2. Tarball the domain files, including the relevant SQL files. Optionally keep a list of directories/files you want to omit.
2a. Every now and then, tarball your entire /etc for future reference or quick recovery. Optionally also tarball your package manager's important files.
3. Using openssl, encrypt the tarballs. I recommend the salted aes-256-cbc cipher and a rather strong passphrase.
4. Using curl, ship the resulting tarballs off to the backup server or hosting account via FTP. curl also supports FTP over SSL/TLS, which is recommended to avoid sending passwords in plaintext.
5. If you don't have sufficient space to make tarballed and encrypted copies of your existing data, you can do it all on the fly, as in the following example script (modify where applicable):
Code:
#!/bin/bash
SOURCE="dir.to.backup"                              # directory to back up
PASS="AES encryption passphrase"
USER="ftp username"
FTPPASS="ftp password"
URL="ftp://url.to/remote/file_to_store.tar.gz.enc"

# tar to stdout, encrypt the stream, and upload it -- no temporary files.
# Variables are quoted so spaces in the passphrase etc. don't break the pipeline.
tar -cpzf - "$SOURCE" \
  | openssl enc -e -aes-256-cbc -salt -k "$PASS" \
  | curl -u "$USER:$FTPPASS" --ftp-pasv -T - "$URL"
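For the omit list mentioned in step 2, GNU tar can read exclusion patterns from a file via -X (--exclude-from). A small self-contained sketch -- the directory and file names here are made up for illustration:

```shell
#!/bin/bash
# Build a tiny tree, then tar it while skipping everything listed in exclude.txt.
set -e
mkdir -p site/html site/cache
echo "keep me" > site/html/index.html
echo "scratch" > site/cache/tmp.dat

printf '%s\n' 'cache' > exclude.txt           # one pattern per line
tar -cpzf site.tar.gz -X exclude.txt -C site .

tar -tzf site.tar.gz                          # cache/ no longer appears in the listing
```

An excluded directory is skipped along with its contents, so a single `cache` pattern drops the whole subtree.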
And to state the obvious: do NOT forget or lose the passphrase, or all of this is in vain.
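It also pays to verify, now and then, that an encrypted backup actually decrypts and untars cleanly. A minimal local round-trip sketch (paths and the passphrase are placeholders; recovery is the same pipeline with openssl's -e swapped for -d and tar extracting instead of creating):

```shell
#!/bin/bash
set -e
PASS="test passphrase"            # placeholder -- use your real passphrase
SRC=$(mktemp -d)                  # stand-in for the directory you back up
RESTORE=$(mktemp -d)              # scratch directory for the restore
echo "important data" > "$SRC/file.txt"

# Same tar | openssl pipeline as the backup script, but writing to a local file:
tar -C "$SRC" -cpzf - . | openssl enc -e -aes-256-cbc -salt -k "$PASS" > backup.tar.gz.enc

# Recovery: decrypt with -d, then untar into the scratch directory.
openssl enc -d -aes-256-cbc -salt -k "$PASS" < backup.tar.gz.enc | tar -C "$RESTORE" -xpzf -

diff "$SRC/file.txt" "$RESTORE/file.txt" && echo "round trip OK"
```

If the files differ or the passphrase is wrong, the script exits non-zero instead of printing "round trip OK".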