Encrypted offsite backup - a question

9 replies [Last post]
oysterboy

I am a member!

I am a translator!

Offline
Joined: 02/01/2011

I have an external backup of all my personal data, in case my house burns down. It is done the following way:

tar zcvf - FOLDER_TO_BACKUP | gpg -c > FOLDER_TO_BACKUP.tar.gz.gpg

This is a simple method. It uses symmetric encryption, so I only need to remember a passphrase - no messing around with public/private keys. The drawback to this method is that it's not incremental. Every time I refresh my offsite backup, all the data is processed, instead of just the delta. This takes too much time.
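For the record, restoring from that scheme is just the pipe reversed. A sketch (the passphrase and file names are placeholders; the --batch/--pinentry-mode flags only make the example scriptable — interactively, plain `gpg -c` and `gpg -d` prompt for the passphrase):

```shell
# Placeholder data, for illustration only
mkdir -p FOLDER_TO_BACKUP
echo "hello" > FOLDER_TO_BACKUP/note.txt

# Backup: tar to stdout, symmetric encryption with gpg -c
tar zcf - FOLDER_TO_BACKUP \
  | gpg --batch --pinentry-mode loopback --passphrase 'example-passphrase' -c \
  > FOLDER_TO_BACKUP.tar.gz.gpg

# Restore: decrypt, then unpack
rm -rf FOLDER_TO_BACKUP
gpg --batch --pinentry-mode loopback --passphrase 'example-passphrase' \
    -d FOLDER_TO_BACKUP.tar.gz.gpg | tar zxf -
```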

Searching on the web, I saw that duplicity seemed to be the answer to my needs. Duplicity looks great, but it seems to me (and this is where I need your advice) that it uses asymmetric encryption. Now the problem is that if my house does burn down, this offsite backup will be useless since my private key will have disappeared in the fire. Is there a way to use duplicity with a simple symmetric encryption?

What simple backup tools would you recommend to answer that need (offsite, encrypted, incremental, easy to restore with a passphrase)?

Thanks!

oysterboy

Offline
Joined: 02/01/2011

Cannot edit first message. Oh well. I think I mistyped the command (and seem to remember that there's an xkcd somewhere about tar syntax :)):

tar -zcvf - FOLDER_TO_BACKUP | gpg -c > FOLDER_TO_BACKUP.tar.gz.gpg

onpon4
Offline
Joined: 05/30/2012
oysterboy

Offline
Joined: 02/01/2011

That's the one indeed :). I am Rob :(.

G4JC
Offline
Joined: 03/11/2012

It seems like you can use duplicity with a passphrase instead of a key if that's what you would like.

The Ubuntu wiki has a script for it that also addresses your "what if the computer is destroyed" problem:
https://help.ubuntu.com/community/DuplicityBackupHowto

oysterboy

Offline
Joined: 02/01/2011

Yes, it looks like duplicity can indeed do everything I need. This page + man duplicity have all the answers. Thank you very much!

jbar
Offline
Joined: 01/22/2011

I've never used it, but attic seems to be an alternative.

From its documentation: "Attic is a deduplicating backup program written in Python. The main goal of Attic is to provide an efficient and secure way to backup data. The data deduplication technique used makes Attic suitable for daily backups since only the changes are stored."
https://attic-backup.org/

It's available in the Trisquel repo: http://packages.trisquel.info/belenos/attic

A tutorial
http://www.debian-administration.org/article/712/An_introduction_to_the_attic_backup_program

marioxcc
Offline
Joined: 08/13/2014

xdelta isn't a complete solution, but it may help with deduplication. Note that you may need to tune its parameters to increase the deduplication ratio.

GPG has built-in compression, but if you want stronger compression you may want to use an external tool; if you do, don't forget to disable the built-in compression, since compressing twice only makes things slower and will likely increase the size slightly.
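For example (a sketch; xz is just one choice of external compressor, and --compress-algo none turns off gpg's own compression — the passphrase and names are placeholders, with --batch flags only to make it scriptable):

```shell
mkdir -p FOLDER_TO_BACKUP
echo "hello" > FOLDER_TO_BACKUP/note.txt

# Compress externally with xz, then encrypt with gpg's compression disabled
tar cf - FOLDER_TO_BACKUP \
  | xz -9 \
  | gpg --batch --pinentry-mode loopback --passphrase 'example-passphrase' \
        --compress-algo none -c \
  > FOLDER_TO_BACKUP.tar.xz.gpg

# Restore: decrypt, decompress, unpack
rm -rf FOLDER_TO_BACKUP
gpg --batch --pinentry-mode loopback --passphrase 'example-passphrase' \
    -d FOLDER_TO_BACKUP.tar.xz.gpg | xz -d | tar xf -
```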

Also, you can encrypt your private key with a symmetric algorithm and store the encrypted key alongside the backup; then you can recover the backup knowing only the passphrase used for the symmetric encryption.
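A sketch of that key-escrow idea (the identity and passphrase are placeholders, and the throwaway GNUPGHOME just keeps the example self-contained — with a real key you would only run the export and encrypt steps):

```shell
# Throwaway keyring so the sketch doesn't touch a real one
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Generate a disposable key pair (placeholder identity)
gpg --batch --passphrase '' --quick-generate-key \
    'Backup <backup@example.org>' default default never

# Escrow: export the private key, protected only by a symmetric passphrase
gpg --export-secret-keys --armor backup@example.org \
  | gpg --batch --pinentry-mode loopback --passphrase 'example-passphrase' -c \
  > private-key.asc.gpg

# After the fire, on a new machine:
#   gpg -d private-key.asc.gpg | gpg --import
```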

t3g
Offline
Joined: 05/15/2011

I know that the majority of you would never use Dropbox, Google Drive, or OneDrive, but what if, in a pinch, you uploaded the files to those services after encrypting them with GnuPG? They would still host your files for free, and the encrypted files couldn't be opened by anyone without your key.

jbar
Offline
Joined: 01/22/2011

The problem is that if you locally encrypt a 200 GB folder, then modify one text file and encrypt the folder again, the two encrypted files will be completely different, which makes synchronization difficult.

That's why a way to detect redundant data is needed.