I am having a good experience backing up my key Linode files (including all of /usr/local, /etc, /var/log, and /home) using s3sync, a Ruby script that uploads files to Amazon S3. Storage costs, I believe, 15 cents per gigabyte per month, plus 20 cents per gigabyte of transfer bandwidth.
I already had an S3 account for JungleDisk, a Windows app I use to back up my home PC. You can sign up for one at Amazon.
I got s3sync set up using this tutorial:
http://blog.eberly.org/2006/10/09/how-a ... ng-s3sync/
and the README:
http://s3.amazonaws.com/ServEdge_pub/s3sync/README.txt
Note: If you use the SSL cert provided verbatim in the README (and, I believe, referenced in the tutorial), you need to name the file "f73e89fd.0". This isn't mentioned anywhere except the main s3sync thread on the Amazon forums:
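That odd filename is just OpenSSL's subject-name hash for the cert, with ".0" appended, which is how OpenSSL locates CA certs in a directory. You can compute it yourself rather than trusting the forum thread. A sketch, demonstrated here with a throwaway self-signed cert (for s3sync you'd run the hash command on the cert text saved from the README; note the hash algorithm changed in later OpenSSL releases, so newer versions may print a different value than f73e89fd):

```shell
#!/bin/sh
# Make a throwaway self-signed cert just to demonstrate the naming scheme.
openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/demo.key \
    -out /tmp/demo.pem -days 1 -subj "/CN=demo" 2>/dev/null

# OpenSSL expects CA certs to be named <subject-hash>.0 in the cert dir.
HASH=$(openssl x509 -noout -hash -in /tmp/demo.pem)
cp /tmp/demo.pem "/tmp/$HASH.0"
```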
http://developer.amazonwebservices.com/ ... 5&tstart=0
I have a shell script that calls s3sync on various directories, which I invoke via cron, weekly for now. For my public web directory I add the --public-read option, which lets me access those files at a URL of the form:
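My script looks roughly like the following. This is a dry-run sketch, not my actual script: the bucket name, paths, and web directory are examples, and the leading "echo" prints each command instead of running it (drop it for real transfers, with the AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY variables from the README exported first):

```shell
#!/bin/sh
# Weekly s3sync backup sketch. Crontab entry might be:
#   0 4 * * 0 /usr/local/bin/s3backup.sh
BUCKET=mybucketname

# -r recurses into directories; --ssl encrypts the transfer.
# Destination format is bucket:prefix (a prefix is always required).
for DIR in usr/local etc var/log home; do
    echo ruby s3sync.rb -r --ssl "/$DIR/" "$BUCKET:$DIR"
done

# Public web directory: --public-read makes each uploaded key readable
# at http://mybucketname.s3.amazonaws.com/<prefix>/<file>
echo ruby s3sync.rb -r --ssl --public-read /home/me/public_html/ "$BUCKET:web"
```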
mybucketname.s3.amazonaws.com
but this can be aliased to, for example, www2.mydomain.com, according to Amazon docs, though I have not tried this.
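For the aliasing to work, the Amazon docs say the bucket has to be named exactly the same as the hostname. A hypothetical zone-file entry (domain and bucket name invented here, since I haven't tried this) would look something like:

```
; Bucket must be named "www2.mydomain.com" for this to resolve correctly.
www2.mydomain.com.    IN    CNAME    www2.mydomain.com.s3.amazonaws.com.
```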
The only downside is that s3sync cannot put items at the root level of a bucket; there always has to be a prefix, even if it's only a slash. So you can't make a perfect mirror of your site: www.mydomain.com/foo.mov would become mybucket.s3.amazonaws.com//foo.mov (or www2.mydomain.com//foo.mov). Hopefully this will be fixed in a future release.
I still back up to my home PC using rdiff-backup, as detailed elsewhere on this board. But it's nice to have redundant backups, plus the option of easily serving large files like video from Amazon's servers rather than my Linode, should I ever need to.
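For completeness, a pull-style rdiff-backup run from the home PC might look like this (hostname and paths are hypothetical; echoed as a dry run here, drop the "echo" to actually transfer):

```shell
#!/bin/sh
# Mirrors /home from the Linode over SSH, keeping reverse diffs so
# older versions of changed files stay recoverable.
SRC="me@mylinode.example.com::/home"
DEST="/backups/linode-home"
echo rdiff-backup "$SRC" "$DEST"
```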