Linode Community Forums
Author Message
 Post subject:
PostPosted: Wed Oct 21, 2009 6:06 am 
Senior Member

Joined: Sun Sep 20, 2009 3:23 pm
Posts: 52
Website: http://keithnet.dyndns.org/
WLM: keithint37@hotmail.com
Yahoo Messenger: keithint1234
AOL: keithint1234
Location: Prescott, Arizona
Okay.
But my point is that I want to back up the entire Linode disk image itself.
So: what would be the best steps for this?
Thanks.


 Post subject:
PostPosted: Wed Oct 21, 2009 11:01 am 
Senior Member

Joined: Tue Apr 27, 2004 5:10 pm
Posts: 212
You basically just need to follow this guide:

http://library.linode.com/linode-manage ... te-account

In your case, instead of setting up another linode for the "receiving end", you'll be using a local linux box at home (or wherever).
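In rough terms, the guide has you stream the raw disk with dd over SSH to the receiving machine. Here's a sketch using a small scratch file as a stand-in for the real device, so you can try the pipeline anywhere; for real use you'd replace the first dd with something like `ssh root@your-linode "dd if=/dev/xvda bs=1M"` (the device name and hostname are placeholders, check the guide for the exact invocation):

```shell
# Stand-in for the Linode's disk device (real case: /dev/xvda read over ssh).
dd if=/dev/zero of=/tmp/fake-disk.img bs=1M count=4 2>/dev/null

# Stream the "disk" and write it out on the receiving side.
dd if=/tmp/fake-disk.img bs=1M 2>/dev/null | dd of=/tmp/disk-copy.img bs=1M 2>/dev/null

# Verify the copy is bit-identical to the source.
cmp /tmp/fake-disk.img /tmp/disk-copy.img && echo "copy OK"
```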


 Post subject:
PostPosted: Wed Oct 21, 2009 4:21 pm 
Senior Member

Joined: Sun Sep 20, 2009 3:23 pm
Posts: 52
Website: http://keithnet.dyndns.org/
WLM: keithint37@hotmail.com
Yahoo Messenger: keithint1234
AOL: keithint1234
Location: Prescott, Arizona
Thanks for the guide.
That answers my question almost entirely, but the one thing I still need to know is:

Does Linux have any compression tools in general to reduce the size of the resulting backup?

Suppose that you have a drive that is, for instance, a little smaller than your Linode image, but you still need that entire image in case you ever decide to restore/copy it over to a new Linode, etc.
What could you do in that case? Thanks!
If there is no compression, that won't work, because: 1. My budget is tight. 2. I don't have enough money to purchase a lot of drives. 3. I can't do so for now anyway (yet I need the backups).
So whatever you can tell me about compressing a backup, so that the space it takes up isn't as large as the full image, is great.
dd copies files as large as the disk, so if you have a 700GB Linode or whatever and a disk smaller than that, you can't back it up without compressing.


 Post subject:
PostPosted: Wed Oct 21, 2009 4:29 pm 
Senior Member

Joined: Tue Apr 27, 2004 5:10 pm
Posts: 212
I believe you can pipe the output of dd through gzip, so that would get you some compression. I'm not sure of the specifics of how that would look on the command line, though.
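For what it's worth, the pipeline would look roughly like `dd if=/dev/xvda bs=1M | gzip | ssh user@home-box 'cat > linode.img.gz'` (hypothetical device and hostname). Here's the same dd-through-gzip idea round-tripped on a small local file, so you can see it actually works:

```shell
# Make a small file to stand in for the disk device.
printf 'some disk contents\n' > /tmp/source.bin

# dd the "disk" through gzip into a compressed image.
dd if=/tmp/source.bin bs=512 2>/dev/null | gzip > /tmp/demo.img.gz

# Decompressing gives back exactly what went in.
gunzip -c /tmp/demo.img.gz | cmp - /tmp/source.bin && echo "round-trip OK"
```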

One thing to consider: because image backups require downtime for your server, you really don't want to rely on them as your primary backup method. You ought to have another method that backs up your data in between your "full" image backups, using rdiff-backup, duplicity, or similar. You certainly don't want to have to take down your server for hours at a time each day just to back up the disk image, do you?


 Post subject:
PostPosted: Thu Oct 22, 2009 10:57 am 
Senior Member

Joined: Tue May 26, 2009 3:29 pm
Posts: 1691
Location: Montreal, QC
You don't need dd to make image backups. My approach:

Desktop:

nc -l 9001 > /media/megalith/Backups/my_image.img.gz

Server:

pv /dev/sda | gzip --fast | nc my_desktop_ip_here 9001

Alternatively, you can decompress on the receiving end if you intend to mount the image. Also, since you will be CPU-limited trying to do any gzip at all, it's highly recommended to use "pigz" for the compression instead of gzip.

pigz is a parallel (multithreaded) version of gzip. It's a drop-in replacement (it supports the exact same options and produces the exact same output, messages, etc.), except it will use all four cores in the Linode whereas gzip will only use one.
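Because the options match, you can even pick pigz conditionally and fall back to gzip when it isn't installed. A small sketch (the file names are just for the demo):

```shell
# Use pigz if present (same flags as gzip), otherwise plain gzip.
command -v pigz >/dev/null 2>&1 && GZ=pigz || GZ=gzip

# Compress a sample file at the fastest level, same as in the pipeline above.
printf 'image data\n' > /tmp/img.raw
"$GZ" -1 -c /tmp/img.raw > /tmp/img.raw.gz

# Either tool's output decompresses with ordinary gunzip.
gunzip -c /tmp/img.raw.gz | cmp - /tmp/img.raw && echo "ok"
```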

I also recommend filling the disk's free space with a single large zeroed file (i.e., cat or dd from /dev/zero into a file until you have very little disk space left), then deleting that file.

This is because a disk image stores every sector of the disk as-is, regardless of whether the data has been deleted. So your "empty space" takes up real space in the image. By zeroing it out first, gzip can compress that space to almost nothing and save a ton of room.
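The effect is easy to see at a small scale: 10 MB of zeros gzips down to roughly 10 KB. (The /zerofill path in the comment is a hypothetical name; on a real system you'd fill the actual free space.)

```shell
# On the real system you'd do roughly:
#   dd if=/dev/zero of=/zerofill bs=1M; sync; rm /zerofill
# Small-scale demonstration that zeroed space compresses to almost nothing:
dd if=/dev/zero of=/tmp/zeros.img bs=1M count=10 2>/dev/null
gzip -c /tmp/zeros.img > /tmp/zeros.img.gz

# Compare the raw 10 MB file with its tiny compressed counterpart.
ls -l /tmp/zeros.img /tmp/zeros.img.gz
```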


 Post subject:
PostPosted: Thu Oct 22, 2009 11:25 am 
Senior Member

Joined: Thu Apr 03, 2008 12:02 am
Posts: 103
AOL: derole
anderiv wrote:
Keith-BlindUser wrote:
How much does S3 cost?
Is it extremely costly?

I have ~1.5GB in S3 and my daily incremental backups are about 5MB or so. My monthly backup bill is under a dollar for that server, so not too bad.


Similar situation here. I have a few hundred megs of backups on S3 with daily incremental backups and it costs me about $0.30 a month. Totally worth it...


 Post subject:
PostPosted: Sat Oct 24, 2009 2:47 am 
Senior Member

Joined: Sat Jun 05, 2004 12:49 am
Posts: 333
What do you guys use to back up to S3, though?


 Post subject:
PostPosted: Sat Oct 24, 2009 1:50 pm 
Senior Member

Joined: Thu Apr 03, 2008 12:02 am
Posts: 103
AOL: derole
http://www.linode.com/forums/viewtopic.php?t=2612


 Post subject:
PostPosted: Sat Oct 24, 2009 9:09 pm 
Senior Member

Joined: Tue Apr 27, 2004 5:10 pm
Posts: 212
OverlordQ wrote:
What do you guys use to backup to S3 though?


As stated earlier in this thread, I use Duplicity. Simple, efficient, secure, works great.


Powered by phpBB® Forum Software © phpBB Group