Remote encrypted backup

7 replies [Last post]
Avron

I am a translator!

Offline
Joined: 08/18/2020

Is anyone doing remote encrypted backups? (I want encryption because the backups go to a server I don't control, rsync.net.)

The Back In Time documentation warns about an EncFS vulnerability when someone can obtain multiple encrypted versions of the same file, and I suspect this is likely to be the case with multiple backups. Déjà Dup uses duplicity, which stores tar files encrypted with GnuPG; I have no clue whether that is better or not, but I found no similar warning about it, so I decided to use it.
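For reference, my understanding is that Déjà Dup drives duplicity underneath, with a symmetric GnuPG passphrase taken from the environment; a manual run would look something like this (the path and server are just placeholders, and I have not checked exactly what Déjà Dup invokes):

PASSPHRASE=mysecret duplicity ~/Documents sftp://username@server/mydirectory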

I used an SSH URL with a directory name (only one level deep), did a manual backup, and Déjà Dup shows the backed-up directories and files. I did another manual backup after adding files, and I can see both backups.

I also see the option to do periodic backups; the only choices are weekly or daily, keeping backups at least 3 months, 6 months, 1 year, or forever. Below that, there is a sentence saying that the oldest backups will be deleted earlier if there is not enough space, or kept 3 more months to avoid removing "linked backups" (I am not sure what it is exactly in English, my system is not in English).

On the backup server, I see the following files:
duplicity-full-signatures.20211231T181311Z.sigtar.gpg
duplicity-full.20211231T181311Z.manifest.gpg
duplicity-full.20211231T181311Z.vol1.difftar.gpg
many more like this, with the volume number increasing (vol2, ..., vol10, vol100, etc.)
duplicity-inc.20211231T181311Z.to.20220102T105805Z.manifest.gpg
duplicity-inc.20211231T181311Z.to.20220102T105805Z.vol1.difftar.gpg
many more like this, with the volume number increasing (vol2, ..., vol10, vol100, etc.)
duplicity-new-signatures.20211231T181311Z.to.20220102T105805Z.sigtar.gpg

So that somehow tells me when things were backed up and that the second backup is incremental. I used "duplicity list-current-files sftp://usernameATserver/mydirectory"; it fetches the sigtar and manifest files, decrypts them, and then I get a file list (but it really takes a while).
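If I read the manpage correctly, restoring should be something like the following (I have not tried it yet; the local paths are just examples):

duplicity restore sftp://usernameATserver/mydirectory ~/restored
duplicity restore --file-to-restore some/file sftp://usernameATserver/mydirectory ~/restored-file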

Supposing my PC dies and I install Déjà Dup on a new PC: besides the server name, directory name, server credentials, and encryption password, do I need anything other than the above files to recover from the backup?

Am I using Déjà Dup and duplicity correctly?

Is that kind of encryption enough to keep my private data from being accessed? (I don't have anything really critical but this is still private data.)

Is there anything to be especially careful about?

SkedarKing
Offline
Joined: 11/01/2021

I use rclone; it's simple enough for me and it's way less bloated than Nextcloud or ownCloud and their likes, etc...

Just get a version that supports those extensions and it all works with cloud providers that support ownCloud.

Not sure if the Nextcloud protocol works properly in rclone yet, but ownCloud does for sure! :)

I forget which version the ownCloud functionality was added in, btw, but it's at least in 1.54, I believe, just to clarify...

EDIT: Checking further (3rd edit, btw), it appears that this functionality was introduced in 1.39, so you just need a version that is at least that new, preferably from 2019 Q3 or later.

It's probably more stable at that point anyhow.

Hope this helps!

Sorry, final edit/4th edit...

Trisquel 9 is based on Ubuntu 18.04, so it ships an older version than the one you want; you will need to see if it's in a backport, use a PPA, or compile it yourself.
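You can check what version your repositories would give you with:

apt policy rclone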

Tell me if you find any of this helpful.

Avron

I am a translator!

Offline
Joined: 08/18/2020

For backups, my preference is for something that can store multiple versions. My impression is that rclone just synchronizes the latest version and does not keep any previous versions; is that correct?
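I did notice rclone has a --backup-dir option that moves replaced or deleted files aside instead of discarding them, something like the sketch below, but if I understand correctly that only keeps what changed in one run, not real versioned backups:

rclone sync ~/data remote:current --backup-dir remote:old/2022-01-02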

I know that some storage providers say they store previous versions, including rsync.net which I am using, but I don't know whether I should rely on that. What do you think?

Besides, I read at https://forum.rclone.org/t/rclone-instead-of-rsync/11746 (linked from Wikipedia) that "rclone doesn't currently preserve permissions, ownership or attributes"; this is about the version in Ubuntu 18.04, but there are plans to implement this. Do you know whether this was done?

About Nextcloud/ownCloud: I never used them. I am running a Seafile server at home (first on Trisquel, with SQLite and Apache; now on Debian on an ARM-based machine, with NGINX and MariaDB) to keep some things synchronized across multiple computers (and accessible on a mobile phone); it works very well. I tried to set up Syncthing twice but failed.

SkedarKing
Offline
Joined: 11/01/2021

I have at least 2 different cloud providers in my rclone.conf file. That being said, I never used Syncthing either; it looked way too complicated for me at the time, and I was very much a newbie then. It might be easier than I think, but I haven't tried it since. That was 4+ years ago.
I don't have that rclone issue you found on that forum, though, just to be clear...

It might be something that doesn't happen on Hyperbola, due to their strict standards on freedom, privacy/security, and their debloat policies, but I don't know for certain.

That issue might have been fixed since then.

rclone uses WebDAV to sync, and if you have cloud storage that supports ownCloud, plus a new enough rclone version as I said before, it works well enough.
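Just as a sketch (the names are made up), an ownCloud remote in rclone.conf looks something like this, with pass being the obscured value that rclone config writes:

[mycloud]
type = webdav
url = https://cloud.example.com/remote.php/webdav/
vendor = owncloud
user = myusername
pass = obscured-by-rclone-config

and then you sync with something like:

rclone sync ~/Documents mycloud:backups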

I do something similar with rclone to what you do with Seafile. Either way, use whatever works best for you.

Small edit, btw: I never tried rsync sites with rclone, just as a heads up.

And no, I don't know if that issue was fixed, sorry...

You could ask the devs themselves, or research it though.

lanun
Offline
Joined: 04/01/2021

> I don't have anything really critical but this is still private data
> my preference is for something that can store multiple versions

I guess there are specific reasons why you prefer to archive your backups, but if your goal is simply to keep one reasonably recent copy of everything on a remote server, then you might not need to keep the older versions.

Sometimes files are heavily edited and versioned, in which case users may want to be able to recover a specific version from a specific date, hence the usefulness of also keeping older backups. There is a subtle but important nuance between archiving and backing up, and a general rule that safely backing up static data means keeping three copies, each on a different medium. In such a scenario, more copies on the same medium are unnecessarily redundant (as opposed to the recommended redundancy of having copies on multiple media).

If you do not have specific reasons to keep the older copies, then I think your worry about Back In Time is gone, and you can use its standard archive formats instead of the duplicity-specific formats. If you have a clear preference for multiple versions, I assume these are supposed to be different versions, in which case the ciphertexts will hopefully be different.

Avron

I am a translator!

Offline
Joined: 08/18/2020

> I guess there are specific reasons why you prefer to archive your backups

My motivation is to recover things deleted by mistake even if I notice only after a while.

> a general rule that safely backing up static data means keeping three copies, each on a different medium.

My desktop has 3 HDDs configured with mdadm so that no data is lost if one fails. I also use a server with 5 HDDs configured so that no data is lost even if 2 fail. There is an "open source project" for its software, but I have no serious clue whether that is really the software my server runs, and the only software it runs. I had it before I started considering free software. I do remote backups from that machine. I recently got https://www.crowdsupply.com/gnubee/personal-cloud-2 but I haven't set it up yet.
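For reference, an mdadm array like that is created along these lines (RAID 5 as an example, the device names are made up; surviving two failed disks out of five would be --level=6):

mdadm --create /dev/md0 --level=5 --raid-devices=3 /dev/sda1 /dev/sdb1 /dev/sdc1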

> you can use [Back In Time's] standard archive formats instead of the duplicity-specific formats

I did not know that Back In Time uses something more standard than duplicity; I'll try it for remote backups to a machine I control.

cuculus
Offline
Joined: 11/29/2017

You could use 7zip to make an encrypted archive and just send that to wherever you want. I do that like this:

7z a -m0=Copy -mhe=on -p archiveToMake.7z inputfilenameORfoldername anotherfileORfolder
-m0=Copy makes it not compress (compressing my large files that are already compressed is slow)
-mhe=on makes it encrypt the file list as well
-p makes it prompt for a password for encryption (you can also enter it in plaintext: -pMyPassword)
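To extract it later, 7z prompts for the password (since -mhe=on also encrypted the file list):

7z x archiveToMake.7z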

You could use the program cron to run this on any schedule you want (see the crontab example below). Personally, I don't need remote servers, so I back up unencrypted to a folder on my local computers with rsync:

rsync -avh --exclude folderIDoNotWantBackedUp --progress --delete-after --ignore-errors -e "ssh -i ~/.ssh/hereToMyPCKey" ~/ me@backupPC:~/my-home-backup/

If it's just one archive file, you can use scp:
scp testFile.7z me@backupPC:~/testFile.7z
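As for cron, a crontab entry for the 7z command could look like this (every Sunday at 3am; note that % has to be escaped as \% inside crontab):

0 3 * * 0 7z a -m0=Copy -mhe=on -pMyPassword /backups/weekly-$(date +\%Y\%m\%d).7z ~/Documents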

I also have an ssh jump host for sending over the internet. Rclone looks interesting though; I have a Mega.nz account that I could try it with.

SkedarKing
Offline
Joined: 11/01/2021

I recommend encrypting files with gpg, and using the method you mentioned, an encrypted 7zip archive, for your keys, if you think it's safe, though I don't really recommend it... passwords that you may not have written down could be put in that 7zip archive.
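For a single file, symmetric gpg encryption is something like this (it prompts for a passphrase and writes myfile.tar.gpg next to the input):

gpg --symmetric --cipher-algo AES256 myfile.tar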

Some of this sounds like a good idea to me.