Simple backup tool that chops a directory into smaller chunks, with compression and (optional) encryption. The main use case is backing up to smaller media (e.g. Blu-ray) for cold storage.
Additionally, the techniques used to encrypt and store the data are meant to be "simple": if the software is ever lost in the future (no one knows the future), the user can still restore the files with bash and common tools. Currently only bash, tar, bz2 and gnupg (if encrypted) are needed; SQLite is optional.
It works in the following steps:
- Ingests the whole directory and builds a database of all files with sha256/sha512 checksums
- Builds a diff against the supplied database (typically the database from the previous backup)
- Creates single-disc backups by looping over the following steps (see the sketch after this list):
  - Gather around 1G of source data
    - If a source file is larger than 1G, binary-split it into 1G chunks
  - Compress the gathered data
  - Encrypt it (if a passphrase was given)
  - Record in the database which file ended up in which archive on which medium
  - Repeat until a full Blu-ray is filled
  - Burn all archives and the current state of the DB to disc
- Continue until all differences have been stored
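Conceptually, one iteration of that loop maps onto standard tools roughly as follows. This is only an illustrative sketch; the exact file names, archive layout and tool options PyButcherBackup uses may differ.

```bash
# Illustrative sketch only -- paths and file names below are placeholders,
# not PyButcherBackup's actual naming scheme.

# Ingest: checksum every file so the database can diff later backups.
find /src -type f -exec sha256sum {} + > files.sha256

# Source files larger than ~1G are binary-split into 1G chunks first.
split -b 1G /src/huge.iso /tmp/huge.iso.chunk-

# A batch of roughly 1G of source data is packed, compressed and
# (optionally) encrypted into one archive on the current medium.
tar -cf - file1 file2 /tmp/huge.iso.chunk-aa \
  | bzip2 \
  | gpg --symmetric --output /dest/00001/archive.tar.bz2.gpg -
```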
git clone https://github.com/markus-seidl/pybutcherbackup.git
pip install -r requirements.txt
python main.py #see usage below#
docker run -it -v <path-to-src>:/src -v <path-to-dest>:/dest augunrik/pybutcherbackup backup /src file:///dest
python main.py backup #src# #dest# --passphrase "password"
python main.py restore #src# #dest# --passphrase "password"
PyButcherBackup is designed so that every backup can be restored with a bit of bash magic and standard Unix tools (tar, gpg, cat, gzip/bzip2, sqlite).
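For example, restoring one archive by hand could look roughly like this; the archive and database names below are placeholders, so check the disc contents (or the database) for the actual names.

```bash
# Manual restore sketch -- file names are placeholders, not necessarily
# the tool's actual naming scheme.

# Decrypt, decompress and unpack one archive from a medium.
gpg --decrypt /dest/00001/archive.tar.bz2.gpg \
  | bunzip2 \
  | tar -xf - -C /restore

# Source files that were binary-split before archiving are reassembled
# with cat.
cat /restore/huge.iso.chunk-* > /restore/huge.iso

# If present, the backup database can be inspected with the sqlite3 CLI.
sqlite3 /dest/00001/backup.db ".tables"
```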
Stores everything inside the given directory. Creates one subdirectory per medium, named 00001, 00002, etc., each filled up to the size of the target medium.
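For illustration, a destination that received a backup spanning two media would look roughly like this (each directory holds the archives plus the database state for that medium):

```
dest/
├── 00001/   # archives + DB state for the first medium
└── 00002/   # archives + DB state for the second medium
```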
This backup type allows hooks to run after all internal tasks have completed. See usages of hookhelper.py#execute_hook.
TODO
TODO