Linode Community Forums
Author Message
 Post subject: Manual backups
PostPosted: Thu Sep 01, 2011 10:33 am 
Senior Member

Joined: Fri May 20, 2011 2:45 am
Posts: 63
Location: Spain
Hello,

I am planning to back up my site with a regular shell script. How should I download the backups to my local computer? Is there a good way to do it, or is scp the only option?

Thank you


 Post subject:
PostPosted: Thu Sep 01, 2011 10:38 am 
Senior Member

Joined: Mon Jul 05, 2010 5:13 pm
Posts: 392
Rsync is your friend.


 Post subject:
PostPosted: Thu Sep 01, 2011 10:55 am 
Senior Newbie

Joined: Tue Dec 14, 2010 10:30 am
Posts: 16
Even better: rdiff-backup.
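A hypothetical crontab entry for it might look like this (host and paths are invented). rdiff-backup keeps a current mirror plus reverse increments, so older versions of files stay restorable, not just the latest.

```
# Pull an incremental backup every night at 03:00 (run from the local machine)
0 3 * * * rdiff-backup user@example.com::/var/www/mysite /home/me/backups/mysite
```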


 Post subject:
PostPosted: Thu Sep 01, 2011 10:59 am 
Senior Member

Joined: Fri May 20, 2011 2:45 am
Posts: 63
Location: Spain
Thank you both.

Does downloading the files affect the site while it's running?


 Post subject:
PostPosted: Thu Sep 01, 2011 11:04 am 
Senior Newbie

Joined: Tue Dec 14, 2010 10:30 am
Posts: 16
Any backup method will use some CPU, but it will mostly be I/O-bound.

I don't think you can directly limit bandwidth usage with rdiff-backup (it's possible, but impractical), but rsync gives you this:
Code:
     --bwlimit=KBPS          limit I/O bandwidth; KBytes per second


As for what effect the backup will have, it will mostly depend on how many resources the sites you're referring to need while the backup runs.

To summarize, my guess is: Not a lot, but possibly, yes.


 Post subject:
PostPosted: Thu Sep 01, 2011 11:06 am 
Senior Member

Joined: Fri May 20, 2011 2:45 am
Posts: 63
Location: Spain
What about Bacula?


 Post subject:
PostPosted: Thu Sep 01, 2011 1:07 pm 
Senior Member

Joined: Fri May 02, 2008 8:44 pm
Posts: 1121
If you're just backing up a couple of small sites, Bacula might be overkill. For simple needs, I'd just set up a cron job on the server that dumps the database every day, and rsync/rdiff-backup/rsnapshot both the DB dump and the website files to a different location at regular intervals.
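A crontab sketch of that setup might look like the following (database name, paths, and host are all made up; the rsync line would be run from, or push to, the machine holding the backups):

```
# Nightly DB dump, then sync the dump and the web files off-box
15 2 * * * mysqldump -u backupuser mydb > /srv/backup/mydb.sql
45 2 * * * rsync -az /srv/backup /var/www user@home.example.net:backups/
```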


 Post subject:
PostPosted: Thu Sep 01, 2011 5:33 pm 
Senior Member

Joined: Fri May 20, 2011 2:45 am
Posts: 63
Location: Spain
Thank you to all of you!


 Post subject:
PostPosted: Fri Sep 02, 2011 3:56 am 
Senior Member

Joined: Tue Aug 02, 2011 2:45 pm
Posts: 55
I use Amazon Web Services S3 and duplicity. There are a few little quirks with that, but overall it works pretty well. That said, I don't have much data to back up, so outbound transfer isn't an issue for me. AWS gives you free storage for a year when you sign up (or they did), up to 5 GB I believe. After that it's still ridiculously cheap.

If you Google duplicity and Amazon Web Services, you should be able to find a good example of the cron script to set up. If you can't find it, I can post mine. You can also encrypt the backups, so your private stuff stays private.
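For illustration, a nightly cron entry could look roughly like this; the bucket name, GPG key ID, and paths are invented. duplicity encrypts with GPG and uploads incremental volumes to the S3 backend.

```
# Encrypted incremental backup of the web root to S3 every night
0 4 * * * duplicity --encrypt-key ABCD1234 /var/www s3+http://my-backup-bucket/www
```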


 Post subject:
PostPosted: Fri Sep 02, 2011 4:07 am 
Senior Member

Joined: Fri May 20, 2011 2:45 am
Posts: 63
Location: Spain
Thank you again.

Can't you use duplicity with a local PC, or do you need Amazon S3?


 Post subject:
PostPosted: Sat Sep 03, 2011 10:26 am 
Senior Member

Joined: Tue Nov 24, 2009 1:59 pm
Posts: 362
Daily mysqldump | gzip -9 --rsyncable, then rdiff-backup of almost everything (including the dbdump.gz) except stuff like logfiles. The disk scan/checksumming takes about an hour, and the actual amount of data transferred usually ends up being dozens of megabytes at most, unless someone made big changes to their site that day.
A restore would take a few hours, as it's backed up over a DSL line with 512 kbps upload, but as I'm a noncommercial host, that's acceptable.
If you need fast backups, look into Linode's offering; S3 can be a double-edged sword.
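A cron sketch of that recipe (paths and host are invented): GNU gzip's --rsyncable restarts the compression stream periodically, so a small change in the input stays a small delta in the output, which is what keeps the nightly rdiff-backup transfer down to megabytes.

```
# Nightly dump, compressed delta-friendly, then incremental backup of the box
30 2 * * * mysqldump --all-databases | gzip -9 --rsyncable > /srv/dbdump.gz
0 3 * * * rdiff-backup --exclude '**/logs' /srv user@backuphost::/backups/linode
```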

LATE EDIT: Accidental double negation fixed.

_________________
rsk, providing useless advice on the Internet since 2005.


Last edited by rsk on Sun Sep 04, 2011 7:44 am, edited 1 time in total.

 Post subject:
PostPosted: Sun Sep 04, 2011 12:30 am 
Senior Member

Joined: Sun Jan 18, 2009 2:41 pm
Posts: 830
rsk wrote:
gzip -9 --rsyncable

Whoa, thanks! Had never heard of the --rsyncable flag before.


 Post subject:
PostPosted: Sun Sep 04, 2011 7:46 am 
Senior Member

Joined: Tue Nov 24, 2009 1:59 pm
Posts: 362
Vance wrote:
rsk wrote:
gzip -9 --rsyncable

Whoa, thanks! Had never heard of the --rsyncable flag before.

And it actually works quite well.

_________________
rsk, providing useless advice on the Internet since 2005.


 Post subject:
PostPosted: Sun Sep 04, 2011 8:17 am 
Senior Member

Joined: Thu Oct 02, 2008 8:56 am
Posts: 99
A note on database dumps: on the first of the month, I take a fresh dump. Each day after that, I take a fresh dump, diff it against the first-of-the-month dump, and then delete the fresh dump. That way, until the beginning of the next month, I'm only downloading what has changed since the first, not the entire dump.

I use rsync over ssh.
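That scheme can be simulated locally with plain files standing in for mysqldump output (the paths are made up): keep the base dump, store only a diff for each later day, and reconstruct a day's dump from base + diff when restoring.

```shell
#!/bin/sh
# Keep the first-of-month dump; each later day, keep only a diff against it.
set -e
dir=/tmp/dbdiff_demo
mkdir -p "$dir"
printf 'row1\nrow2\n' > "$dir/first_of_month.sql"   # dump taken on the 1st
printf 'row1\nrow2\nrow3\n' > "$dir/today.sql"      # today's fresh dump
# diff exits 1 when files differ, so don't let that abort the script
diff "$dir/first_of_month.sql" "$dir/today.sql" > "$dir/today.diff" || true
rm "$dir/today.sql"                                 # only the small diff is kept
# Restore today's dump by applying the diff to the base dump
patch -s -o "$dir/restored.sql" "$dir/first_of_month.sql" "$dir/today.diff"
```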


Powered by phpBB® Forum Software © phpBB Group