Linode Community Forums
PostPosted: Tue Mar 18, 2014 7:45 pm 
Junior Member

Joined: Mon Jun 06, 2011 1:33 am
Posts: 40
I haven't done a full server restore yet, but yes, I've done several spot check restores, including full directories, and everything worked like a charm.


PostPosted: Tue Mar 18, 2014 9:57 pm 
Senior Newbie

Joined: Wed Jan 22, 2014 8:11 pm
Posts: 6
Website: http://eightyfiveconsulting.com
Location: Orange, CA
bbergman wrote:
Update: I have finally found the (almost) perfect backup service, and it couldn't be easier to use with Linode (and another service I use that has droplets). It's here: http://www.opsmate.com

Thanks for the link. I signed up for the beta; it looks really promising.

_________________
http://eightyfiveconsulting.com


PostPosted: Wed Mar 19, 2014 12:34 am 
Senior Newbie

Joined: Wed Feb 29, 2012 7:49 pm
Posts: 11
bbergman wrote:
Update: I have finally found the (almost) perfect backup service, and it couldn't be easier to use with Linode (and another service I use that has droplets). It's here: http://www.opsmate.com


Thanks for sharing! That looks really promising.


PostPosted: Wed Mar 19, 2014 3:42 pm 
Junior Member

Joined: Tue Dec 27, 2005 1:33 am
Posts: 43
Location: USA
Hey everyone, Opsmate founder here. I'd be happy to answer any questions you may have about the service.


PostPosted: Thu Mar 20, 2014 12:10 am 
Newbie

Joined: Tue Mar 18, 2014 7:51 pm
Posts: 2
I take Linode's backup service for what it is. They say that RAID1 (or RAID10, for that matter) is about availability, not backup, right? That's exactly why I'm paying Linode a (very humble) extra so that I can take a snapshot whenever I mess around with a production server. If that [apt-get dist-upgrade] goes wrong, I can redeploy my Linode in a matter of minutes and start brainstorming about where exactly it went wrong. Availability, baby! (Actually, I'd clone the Linode from my snapshot for what, a couple of cents, stage the upgrade there, and then upgrade production.)

That said, I do not rely on Linode's backup service as my sole backup. That'd be silly, come on. Eggs and baskets, remember? If worst comes to worst (Linode backup went FUBAR, Linode itself went belly up, 3-letter agency pulled an entire rack including the backup server, you name it...) I rdiff to one local and one remote server anyway. Granted, this will take time to be up and running again, so there goes the availability but my data is safe.

As for Opsmate, it does look great really, but I find the pricing rather steep. I do realize there's a lot of compression and optimization going on, but that really only applies to the OS and log files. The actual data, on my Linode at least, is oh so very not compressible (RAW photo files anyone?). My Linode 4096 (196 GB storage) runs me $80 plus $20 backup. Backing it up to Opsmate would cost me another $75 for 200 GB storage. (Like I said, my data doesn't compress well, so the 100 GB plan won't do). Well, frankly, for $75 I'd rather just spin up another Linode in another datacenter and rdiff.

Bottom line, I love Linode, they've been extremely reliable in my experience and I appreciate their backup feature for its convenience. I will not exclusively rely on it for reasons mentioned earlier, so I do keep my 2nd and 3rd backup and backup of the backup elsewhere, but I won't pay another $75/mo for it.


PostPosted: Thu Mar 20, 2014 6:43 pm 
Senior Newbie

Joined: Fri Jan 03, 2014 6:21 pm
Posts: 8
It isn't really difficult to write a script to back up to S3 on your own. Here's an example:

Code:
#!/bin/bash

BACKUP_DIR="/backups"
MYSQL_FILENAME="$BACKUP_DIR/prod-$(date +%m%d%Y).sql"
DOCROOT_FILENAME="$BACKUP_DIR/prod-$(date +%m%d%Y).tar.gz"
ETC_FILENAME="$BACKUP_DIR/etc-$(date +%m%d%Y).tar.gz"
BACKUP_HISTORY="365 days"

echo "Creating backups..."

mysqldump -u username --password=password database > "$MYSQL_FILENAME"
gzip "$MYSQL_FILENAME"

pushd /opt > /dev/null
tar czf "$DOCROOT_FILENAME" www/prod/ scripts/
popd > /dev/null
pushd /etc > /dev/null
tar czf "$ETC_FILENAME" php* nginx* sphinx* my.cnf
popd > /dev/null

s3cmd put "${MYSQL_FILENAME}.gz" s3://your-s3-path/ && rm -f "${MYSQL_FILENAME}.gz"
s3cmd put "$DOCROOT_FILENAME" s3://your-s3-path/ && rm -f "$DOCROOT_FILENAME"
s3cmd put "$ETC_FILENAME" s3://your-s3-path/ && rm -f "$ETC_FILENAME"

echo "... done."
echo
echo "Cleaning up old backups, deleting files older than $BACKUP_HISTORY..."

# The cutoff only needs to be computed once, not per file.
olderThan=$(date -d "-$BACKUP_HISTORY" +%s)

# s3cmd ls prints: DATE TIME SIZE s3://path
s3cmd ls s3://your-s3-path/ | while read -r line
do
   createDate=$(date -d "$(echo "$line" | awk '{print $1" "$2}')" +%s)

   if [ "$createDate" -lt "$olderThan" ]
   then
      fileName=$(echo "$line" | awk '{print $4}')

      if [ -n "$fileName" ]
      then
         s3cmd del "$fileName"
      fi
   fi
done

echo "... done."
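One caveat on the mysqldump line: passing --password on the command line makes the password visible to other users via ps. An alternative sketch using MySQL's option-file mechanism (the file path and credentials below are placeholders):

```
# /root/.my.cnf, readable only by root (chmod 600):
[client]
user=backupuser
password=yourpassword

# then in the script, instead of -u/--password:
# mysqldump --defaults-extra-file=/root/.my.cnf database > "$MYSQL_FILENAME"
```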


robert


PostPosted: Fri Mar 21, 2014 12:05 am 
Senior Member

Joined: Sun May 23, 2010 1:57 pm
Posts: 315
Website: http://www.jebblue.net
Or rsync to another host:

Code:
#!/bin/sh

# [Note: This is a FULL system backup script and requires root. If you
#  only want to back up your user files, tailor the script.]
# Use "sudo crontab -e" to set up a cron job to run it.
#
# [Note: --delete removes target files and dirs that no longer exist in
#  the source; you may or may not want that kind of syncing.]
#
# [Note: The first backup will take a while, to add the files to the
#  target; after that it should only take a matter of minutes.]
#
# [Note: rsync must be installed on both the source and the target.]
#

# -a is archive mode, equivalent to -rlptgoD (recursive, links, perms,
# times, group, owner, devices/specials); -v for verbose output.
RSYNC="rsync -a -v --delete"
TARGET="<target_user>@<target_host_ip>:/<target_backup_dir>"
EXCLUDES="--exclude=/mnt --exclude=/tmp --exclude=/proc --exclude=/dev "
EXCLUDES=$EXCLUDES"--exclude=/sys --exclude=/var/run --exclude=/srv "
EXCLUDES=$EXCLUDES"--exclude=/media "

date >> /root/start

# Quote the remote-shell option so "ssh -p 22" reaches rsync as one value;
# $EXCLUDES is deliberately unquoted so it splits into separate options.
$RSYNC -e "ssh -p 22" / $EXCLUDES "$TARGET"

date >> /root/stop


PostPosted: Fri Mar 21, 2014 2:18 am 
Senior Newbie

Joined: Fri Jan 03, 2014 6:21 pm
Posts: 8
The problem with rsyncing to another host is that you only have one copy. If something gets corrupted, hacked, or whatever today, and the backup/rsync script runs tonight, you've lost any chance to recover things. With my script (or any other 'real' backup system), you have 365 (or however many you choose) days of backups to choose from.
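For reference, the retention logic in that script boils down to an epoch-seconds comparison with GNU date (the -d flag is GNU-specific and won't work on BSD/macOS date; the sample timestamp below is made up):

```shell
#!/bin/sh
# Sketch of the cleanup-loop retention check: anything stamped before
# the cutoff gets deleted.
BACKUP_HISTORY="365 days"

now=$(date +%s)                              # current time, epoch seconds
olderThan=$(date -d "-$BACKUP_HISTORY" +%s)  # the cutoff, one year ago

# s3cmd ls prints "DATE TIME SIZE s3://path"; a backup stamped with this
# (hypothetical) timestamp parses to epoch seconds the same way:
createDate=$(date -d "2014-03-15 10:00" +%s)

if [ "$createDate" -lt "$olderThan" ]; then
    echo "delete"
else
    echo "keep"
fi
```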

robert


PostPosted: Fri Mar 21, 2014 10:00 am 
Senior Member

Joined: Sun May 23, 2010 1:57 pm
Posts: 315
Website: http://www.jebblue.net
robertcope wrote:
The problem with rsyncing to another host is that you only have one copy. If something gets corrupted, hacked, or whatever today, and the backup/rsync script runs tonight, you've lost any chance to recover things. With my script (or any other 'real' backup system), you have 365 (or however many you choose) days of backups to choose from.

robert


I provided a basic script so anyone can figure it out and get it running. I run my script on the client (Linode here) nightly.

On the target server I also run weekly and monthly cron jobs that copy from the daily location into weekly and monthly locations on the target. It's not perfect; yours is better in that you can pick any day of the year, as long as you're willing to pay for the S3 storage.

Mine works with any target system: Linux, Mac, or even Windows (there are ssh and rsync solutions for Windows), as long as you can run:

1) ssh
2) rsync

That's all you need. The target is where you can break out weekly, monthly, bi-daily, whatever periods you want, with cron, as long as your target has the disk space available.

edit: rsync saves immensely on disk wear and tear too.
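One common way to sketch that target-side rotation is with hard links: cp -al makes a near-instant copy of the previous snapshot in which unchanged files share disk space, and a subsequent rsync into the live directory only breaks the links for files that actually changed. (Directory names here are made up for illustration.)

```shell
#!/bin/sh
# Hypothetical target-side rotation: promote yesterday's daily snapshot
# with a hard-link copy before rsync updates daily.0 in place.
BACKUP=/tmp/demo-backup
mkdir -p "$BACKUP/daily.0"
echo "hello" > "$BACKUP/daily.0/file.txt"

# cp -al links instead of copying data: near-instant, and unchanged
# files consume no extra disk space across snapshots.
cp -al "$BACKUP/daily.0" "$BACKUP/daily.1"

# Both directory entries point at the same inode until the next rsync
# replaces the file in daily.0.
stat -c %i "$BACKUP/daily.0/file.txt"
stat -c %i "$BACKUP/daily.1/file.txt"
```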


PostPosted: Fri Mar 21, 2014 5:02 pm 
Senior Member

Joined: Sat Oct 23, 2010 12:56 pm
Posts: 73
Website: http://www.ingber.com
Location: Oregon
If you want to save multiple versions, I suggest using rsync for the current updated backups, and then run rsnapshot {daily, weekly, monthly, etc.} for the multiple versions (using hard links -- very fast and saves space).
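A minimal rsnapshot setup along those lines might look like this (hostnames and paths are placeholders; rsnapshot requires literal tabs, not spaces, between fields, and older versions spell "retain" as "interval"):

```
# /etc/rsnapshot.conf (fields must be TAB-separated)
snapshot_root	/backups/snapshots/
retain	daily	7
retain	weekly	4
retain	monthly	6
# pull the Linode's filesystem over ssh+rsync
backup	root@your-linode:/etc/	your-linode/
backup	root@your-linode:/home/	your-linode/

# driven by cron on the backup host:
# 30 3 * * *   /usr/bin/rsnapshot daily
# 0  3 * * 1   /usr/bin/rsnapshot weekly
# 30 2 1 * *   /usr/bin/rsnapshot monthly
```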

Lester

_________________
http://www.ingber.com


PostPosted: Fri Apr 18, 2014 9:53 am 
Senior Newbie

Joined: Fri Feb 27, 2009 2:59 pm
Posts: 6
Thanks for the pointer to opsmate.com; I'm going to check them out.


PostPosted: Fri Apr 18, 2014 11:51 pm 
Senior Newbie

Joined: Mon Sep 16, 2013 12:23 pm
Posts: 5
bjl wrote:
bbergman wrote:
Does anyone have any good backup scripts they want to share? Debian 6 on my boxes.


I'm currently using a headless install of crashplan to backup to my always on network storage at home.


Thank you for mentioning this. I've been using CrashPlan on my home computers for a while now; I don't know why I never thought to use it on my server. But it's done now :)


Powered by phpBB® Forum Software © phpBB Group