How to do continuous SQL DB backup to the cloud?

We are running a Magento website on CentOS.

Everything is going fine, apart from the fear of "what if we get hacked?"

All the Magento files are backed up and stored off-server in my Google Drive, but it's the SQL DB that really matters.

How do professionals back up SQL databases? How regularly? What would be the ideal solution for backing up SQL with complete peace of mind that, if something goes wrong, we would have the latest database backup readily available?

I am pretty new to server administration and still learning, so please be polite :)

6 Replies

Having a replication slave (a separate server) will ensure database changes are logged up to the transaction. It also means that a hacker's DROP TABLE will have an effect on the slave too. If you're on MySQL, ensure binary logs are enabled on the slave, that you do your mysqldump with --single-transaction and --master-data=2, and that all Magento tables are InnoDB.
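For concreteness, a minimal sketch of such a dump, assuming credentials live in ~/.my.cnf and the database is named magento (both assumptions):

```
#!/bin/bash
# Consistent nightly dump of an InnoDB database (sketch; the database name
# "magento" and the backup path are hypothetical).
# --single-transaction takes a consistent snapshot without locking InnoDB
# tables; --master-data=2 writes the binlog file/position into the dump as
# a comment, which a later point-in-time restore needs as its start point.
BACKUP_DIR=/var/backups/mysql
mkdir -p "$BACKUP_DIR"
mysqldump --single-transaction --master-data=2 --routines --triggers \
    magento | gzip > "$BACKUP_DIR/magento-$(date +%F).sql.gz"
```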

The speed of recovery needs to be tested to ensure it's within your business requirements. Also see Percona XtraBackup.

A combination of the regular point-in-time mysqldump/XtraBackup backups and the MySQL binary logs will be needed to restore the database to the point just before the hack. Restoring past that point may be a little fiddly; row-based replication format will help.
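To sketch what that restore looks like (every path, file name, position, and timestamp here is a placeholder): reload the last good dump, then replay the binlogs up to just before the hostile statement.

```
# Reload the most recent good dump (credentials assumed in ~/.my.cnf).
gunzip < /var/backups/mysql/magento-2015-06-01.sql.gz | mysql magento

# Replay binary logs from the position recorded by --master-data=2 in the
# dump, stopping just before the hack. Binlog file, start position, and
# stop datetime below are hypothetical.
mysqlbinlog --start-position=107 \
    --stop-datetime="2015-06-02 03:14:00" \
    /var/lib/mysql/mysql-bin.000042 | mysql magento
```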

Of course, I've assumed MySQL; if you're using PostgreSQL or SQLite, your path may be different.

We do a nightly backup that involves a database dump. The tricky thing is that a proper database dump will lock tables and interrupt your website, so doing it more frequently (like hourly) would be problematic.

Still, we probably ought to do it more frequently. Reconstructing registration records from PayPal receipts is not fun.

Switch to InnoDB and use mysqldump --single-transaction: no locking :) Or, even better, switch to InnoDB and use innobackupex, again with no locking.
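Roughly what that looks like (paths and the timestamped directory name are assumptions; innobackupex ships with Percona XtraBackup):

```
# Take a non-blocking physical backup; innobackupex writes it to a
# timestamped directory under the target path.
innobackupex /data/backups

# Before restoring, "prepare" the backup so its data files are consistent.
# The directory name below is a placeholder for whatever timestamp you got.
innobackupex --apply-log /data/backups/2015-06-02_03-00-00
```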

I use the automysqlbackup program to back up to a folder in my home directory, then I simply upload that to Google Drive myself, and I also rsync it to a backup VPS.
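If anyone wants to wire up something similar, a crontab sketch (schedule, paths, and host are all assumptions; automysqlbackup writes to whatever BACKUPDIR is set to in its config):

```
# m h dom mon dow  command
# Run automysqlbackup nightly, then mirror the dumps to a backup VPS.
30 2 * * *  /usr/sbin/automysqlbackup
0  3 * * *  rsync -az --delete /var/lib/automysqlbackup/ backup@backup-vps:/srv/db-backups/
```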

We use Percona XtraBackup to take nightly incremental backups that get pushed to S3 using the AWS CLI. They're essentially non-blocking when using InnoDB. The two features unique to Percona are:

http://www.percona.com/doc/percona-server/5.6/management/changedpagetracking.html

http://www.percona.com/doc/percona-server/5.6/management/backup_locks.html

So you don't even need a FLUSH TABLES WITH READ LOCK.
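A sketch of that kind of pipeline (bucket name and paths are assumptions):

```
# Nightly incremental: store only pages changed since the previous backup.
# LAST_BACKUP_DIR is a placeholder for the previous full or incremental.
innobackupex --incremental /data/backups \
    --incremental-basedir=/data/backups/LAST_BACKUP_DIR

# Push the backup tree to S3 with the AWS CLI.
aws s3 sync /data/backups s3://example-db-backups/mysql/
```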

We also use the Linode backup service for added redundancy.

@affliction:

> I use the automysqlbackup program to back up to a folder in my home directory, then I simply upload that to Google Drive myself, and I also rsync it to a backup VPS.

I also use a script like the one you mentioned: MySQLDumper, which also makes automatic database backups in the home folder of your website.
