
Rethinking my Backup Strategy

In this wiki you can find my current backup script mobi_backup, which has basically done everything I needed so far. However, after years of using it both privately and on customer installations, mostly for local backups, it is time to take a step back and rethink my backup strategy.

The main reason I feel my current backup solution is no longer the right fit is that I want to better protect my data from ransomware and similar attacks, as well as from targeted hacking attacks where a hacker (or a group of hackers) manually hijacks a server and then tries to cause damage to the owner of the server by messing with the data on it.

Mobi is not very secure in this regard, as it usually runs on a backup server that needs full root access to all the backup clients it pulls backups from. So if someone hijacks the backup server, that person automatically has password-less root access to every Linux system it backs up, which is really bad, to be honest!

On my private server, mobi runs on the client server itself and the backup is stored on another set of local disks. That's also not very good, as ransomware would then encrypt both my data and my backup at once, so even the backup would be rendered useless. Even worse, since unchanged files between backups are hardlinked instead of copied, encrypting all the backups is extremely fast, as each version of a file only has to be encrypted once.

Mobi does a great job of protecting against accidental data loss, data loss due to hardware issues like multiple disk failures, loss of complete RAID sets etc., and also against cases where the client server is hacked but the backup server is not. Since the backup is completely controlled from the backup server, there isn't anything a hacker can do on a client server (the one being backed up) to mess up the backups from there.

Since there have recently been an increasing number of reports of targeted hacking attacks on companies in my vicinity (meaning Switzerland in general, or similar fields of operation, customers or direct competitors of customers etc.), I realized it's time to rethink how I do backups and to set new goals as far as security goes.

The main goal of Mobi was to be as portable and simple as possible and to provide incremental backups, with full snapshots of each backup in a simple folder structure to facilitate restores and make browsing backups easy.

What I want from my new backup solution

discussion of available tools and solutions

I have already discussed some tools, mainly regarding encrypted backups, in encrypted_backups_to_the_cloud. In addition to that, I have looked at some other tools for this project:

Borg

Borg is a very smart and capable backup solution with lots and lots of features and, most importantly, block-level deduplication. It can do a lot more than what I need, BUT it sadly does not allow creating a user that can only write new backups without also being able to delete old ones. While there is an append-only mode available, it is impractical to rely on it: either you give up on automatic purging of old backups entirely, or the moment the repository is switched out of append-only mode to prune, any deletions an attacker has queued up in the meantime get applied as well. That's the deal breaker for me. Read more about this in the FAQ and the Drawbacks of append-only mode.
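
For reference, this is roughly how append-only access is usually enforced on the backup host, via the SSH forced command (the key, user and repository path below are placeholders, not a real setup), and it still leaves the pruning problem described above unsolved:

# ~/.ssh/authorized_keys on the backup host: limit this client's key to
# "borg serve" in append-only mode, restricted to its own repository path
command="borg serve --append-only --restrict-to-path /srv/borg/client1",restrict ssh-ed25519 AAAA...keydata... client1@example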

Burp

Burp is the best backup tool I know for backing up clients that aren't available 24/7, such as workstations or even notebooks. It also supports Linux, Windows and macOS. Burp actually does pretty much anything I want, and a lot more, out of the box. The only drawback I have found so far is that it seems rather difficult to create secondary backups. There is an offsite-backup script, but the header of that script says it doesn't quite work. The main issue seems to be that burp always moves the full backup along to the youngest backup and only keeps the previous versions of changed or deleted files stored in the previous backup's data directory. This makes it very easy to purge old backups, of course, but it makes creating secondary backups a little trickier. I haven't tested whether this can be overcome by using hardlinked backups (a config option in burp) though. Burp also supports client-side encryption, which breaks delta uploads of modified files.

Burp is a complete and working piece of software and would probably save some time on my end versus a home-made solution of other tools glued together, and it offers many more features which I currently don't use (like Windows backups) but which might come in handy in the future. So maybe my final solution could be writing an offsite-backup script for burp to complete the required feature set for me :)
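
If I ever test the hardlinked-backups route mentioned above, the relevant knob seems to be a single server-side option (a sketch; the exact behaviour would need to be verified against the burp documentation):

# in burp-server.conf: keep each finished backup as a full, hardlinked tree
# instead of generating reverse differences and deleting the originals
hardlinked_archive = 1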

Restic

Restic seems to be an awesome tool that does almost everything I want my new backup tool to do. Most importantly, it creates client-side encrypted, incremental backups of your servers and can store them on a broad range of backends, including S3-compatible storage etc. This is all very nice, BUT restic itself runs on the client side only. This means that if you use it as a standalone solution, an attacker who has (root) shell access to your system can delete your backups and then encrypt your data. The keyword here is “append only” backups, and it has to be enforced on the storage side. This in turn means that restic itself can no longer rotate backups, so rotation also needs to happen on the storage side. Restic comes with its own solution: the project provides a REST server which can run in append-only mode and therefore protects your backups from such attackers. This last part was something I had missed at first, but with this server in the package, it makes for a good alternative to burp.
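
A minimal sketch of that combination, with made-up host, port, repository name and paths (the --no-auth flag is only there to keep the example short; a real setup would use the server's .htpasswd-based authentication):

# on the backup host: serve the repositories in append-only mode
rest-server --path /srv/restic --listen :8000 --append-only --no-auth

# on the client: initialise the repository once, then back up
restic -r rest:http://backuphost:8000/myserver init
restic -r rest:http://backuphost:8000/myserver backup /etc /home /var/www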

dar

dar is a tool I have to take a closer look at. It is used in various other backup software as an alternative to rsync and produces backup archives, which means permissions can be retained in backups even with only unprivileged access to the target storage. This would allow implementing a backup script that writes a backup to a write-only share which is then made available read-only for future use, hence protecting it from a possible attacker who has gained write access to the backup target. The big question is how incremental backups are handled with regard to rotating old backups and restoring them. There is also a decremental backup method (where the full backup is always the latest one, and for older backups only the decrements, i.e. the file states as they were before they were changed to the current state, are kept). Decrementals obviously create a lot of load at the end of a backup and will probably not work when older backups are read-only, so this might be tricky.
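
As a rough sketch of dar's full/differential workflow (the paths are placeholders and the flags are my reading of the dar manpage):

# full backup of /, gzip-compressed, excluding the archive directory itself
dar -c /backups/full -R / -z -P backups

# differential backup, using the full archive as the catalogue of reference
dar -c /backups/incr1 -R / -z -P backups -A /backups/full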

SFTPgo

SFTPgo would make a good server side for a backup solution. It comes with lots of features that help implement an append-only server to store backups. It is, as the name says, an SFTP server for Linux and a few other OSes, which at first sounds strange, as Linux already has OpenSSH, which is broadly used for SFTP as well since it is on almost every Linux server out there to provide SSH access anyway. However, SFTPgo is much more than just a simple SFTP server: it comes with its own user management, it provides additional protocols such as WebDAV, and it even accepts rsync connections. It can share more than just a plain Linux filesystem too; it can share S3 storage, provide data-at-rest encryption, and it supports very granular access rights on top of the POSIX layer. So a share could be configured where a user only has write but not overwrite or delete access. Pretty nifty.
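
As an illustration of that permission model, a backup user could be limited to listing, uploading and creating directories, with no delete, overwrite or rename rights. The username and paths below are made up, and the permission names should be double-checked against the SFTPgo docs:

{
  "username": "backup-writer",
  "home_dir": "/srv/backups/client1",
  "permissions": {
    "/": ["list", "upload", "create_dirs"]
  }
}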

possible solutions

Burp

Since burp has all the features we want, it seems worth just digging into the remote-backup issue and fixing it in whatever way I can find.

possible solutions to consider are:

the main advantages of using burp over a self-made variant are

self-made collection of other tools

So “self-made” is a bit flexible here. What I mean is a larger script that uses a combination of several tools together:

Unsolved issues of this solution:

First POC - Burp + rsync

NOTICE: I had to give up on rsyncd as the server to push the backup to, as rsyncd seems to be unsuitable for sharing files over the internet. Instead I later went with SFTPgo (see above) on the burp server to share the latest backups via SFTP and have the offsite server pull the data from the burp server. I reckon this is still pretty safe: authentication can be done with SSH keys, SFTPgo makes sure that the user gets read-only access and nothing else, and the data the user can read is encrypted with a key that neither the burp server nor the offsite server knows, so a data leak through this channel would be pretty worthless, I'd reckon (given a good encryption password, of course).
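
The pull on the offsite server could then look roughly like this, assuming rsync over SSH is actually enabled on the SFTPgo side (user, key, port and paths are placeholders; 2022 is SFTPgo's default SFTP port):

# on the offsite server: mirror the read-only SFTPgo share of the latest backups
rsync -aH --numeric-ids -e "ssh -p 2022 -i /root/.ssh/offsite_pull_key" offsite-pull@burpsrv:/ /srv/offsite/burp/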

With all the arguments above considered, I decided to proceed with a burp-based solution and just add off-site capabilities to burp. The targeted setup: the clients back up to a central burp server using client-side encryption, and a separate offsite server pulls the already-encrypted backups from the burp server via a read-only SFTPgo share.

To try it all out I used a bunch of Ubuntu test Docker containers.

docker network create burp

to create the custom network

docker run --net burp --name burpsrv -ti ubuntu-test:latest

to create the container for the burp server, and similar commands for the other servers.
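
for example (the container names burpclient and offsitesrv are simply what I picked for this test):

docker run --net burp --name burpclient -ti ubuntu-test:latest
docker run --net burp --name offsitesrv -ti ubuntu-test:latest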

For testing I ran the burp server with this command line:

burp -v -F -c /home/jdoe/burp/etc/burp-server.conf

which outputs any logs directly to stdout and keeps the daemon in the foreground.
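
on the client container, a backup can then be triggered manually; the config path below simply mirrors where I put the server config in this test:

burp -a b -c /home/jdoe/burp/etc/burp.conf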

offsite backup file encryption

Since all files are encrypted on the client side before they are sent to the backup server, we don't have to encrypt them again when uploading them from the backup server to the offsite backup. This basically removes all the challenges mentioned in the “self-made” backup solution above.
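
for reference, the client-side encryption is enabled with a single option in the client's burp.conf (a sketch; use a proper passphrase and, as noted further below, keep it somewhere safe outside the backup servers):

encryption_password = CHANGE-ME-to-a-long-random-passphrase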

restore from an offsite backup

Here is how I tested the restorability of an offsite backup, for the case where the backup server has been completely lost in between.

  1. set up a new burp backup server with the same client config
  2. set up a new client, or delete the certificates if the client is still there and should be re-used
  3. using burp -a l, execute the initial connection between the client and server and let burp create all the SSL keys
  4. using rsync -aAHhvXxR --numeric-ids, copy the desired backup to the /var/spool/burp/<client name>/<backup name> folder, where the backup name is identical to the one stored on the offsite server
  5. now use burp -a l again and you should see the backup listed
  6. restore it using burp -a r -b 5 -d / or a similar command, depending on your situation

This worked flawlessly in my test, of course only as long as I still had the encryption password available from somewhere! Needless to say, if you don't store your encryption password, your backup is completely useless, so make sure the encryption password is saved somewhere you will still have it even if you lose the client, and of course, don't store it together with the backup on the backup servers :)