
🏠 home

I finally have offline backups

2023-10-06

My brother's company got hit by an encryption attack the other week. Time to get an offline backup myself.

So my brother's company got hit by a cyber attack the other week that encrypted all their data: on every server, printer, copier and client machine in the (virtual) network.

The company was smart and had everything backed up on offline tapes. They lost one day of data and one week of work to get everything back up and running. I can only assume they took measures to get the infection out, too.

This reminded me that while I believe I have a reasonable backup strategy resembling 3-2-1 (three copies of the data, two local but on different media, one offsite copy), what I do not have is a cold offline backup: something that isn't connected to anything and can't be reached no matter how badly the network is infected.

Time to get on that.

Starting point

All machines (mine, my wife's, my self-hosting setup, our phones) back up their vital data to a central point in our home network, in addition to machine-local backups on most machines. The central backup store then pushes the delta to an offsite hoster in another country weekly.
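
In spirit, that weekly offsite push is just a scheduled sync from the central store. A minimal sketch of the idea (the remote name "offsite" and the path /srv/backup are placeholders, not my actual setup):

    #!/usr/bin/env bash
    # sketch: weekly delta push from the central store to the offsite hoster
    # "offsite:" and /srv/backup are placeholder names
    rclone sync /srv/backup offsite:backup \
        --transfers 4 \
        --log-file /var/log/offsite-sync.log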

So all I needed to do was back up everything from that central point.

Other desires

An 8TB hard drive was the obvious choice as a backup medium: it has enough storage to hold all the data, it is relatively cheap, and it is small and light enough to quickly grab and run with in case of a natural disaster.

Keeping that last point in mind, I want a self-contained unit: not just a dumb hard drive with data, but a system that can boot and be used to bootstrap the recovery.

My approach

The solution I've settled on is this:

The hard drive has a boot partition and a 5GB partition holding Debian Bookworm for ARM64 devices.

A third partition is also 5GB but empty; it is meant as free space to create a bootable system for x64 machines. Granted, I probably have to sacrifice the boot partition to swap between ARM64 and x64, but such is life.

The fourth and last partition is an almost-8TB NTFS storage space that houses all backups along with backup / restore tools.
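
Roughly, the intended layout looks like this (the device name and exact sizes are illustrative, not measured):

    # intended layout, assuming the drive shows up as /dev/sdb
    # /dev/sdb1   ~0.5GB  FAT32  boot / firmware
    # /dev/sdb2     5GB   ext4   Debian Bookworm (ARM64) root
    # /dev/sdb3     5GB   empty, reserved for a future x64 system
    # /dev/sdb4   ~7.9TB  NTFS   backup storage and tools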

Setting it up

This section is mostly for my own benefit and serves as documentation.

Getting a Debian image for Raspberry Pi to boot from a >2TB HDD took some time. In the end what worked for me was this:

First, download the Debian image and write it onto the HDD with:

    xzcat 20230612_raspi_4_bookworm.img.xz \
    | sudo dd of=/dev/sdb bs=64k oflag=dsync status=progress

Then, convert the partition table from Master Boot Record (MBR) to a GUID Partition Table (GPT). This is the step that allows partitions larger than 2TB, and it seems to be crucial to do it _before_ booting the system: on first boot Debian will try to expand the root partition to its maximum size, and that hangs if the partition table is still MBR but the drive is larger than 2TB.

I used the `mbr2gpt` bash script from the usb-boot tools zip file that I found in a Raspberry Pi forum to achieve this, link below.
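
If you would rather not dig through forum attachments, the same conversion should also be possible with sgdisk from the gdisk package. I have not verified this on the backup drive, so treat it as a sketch:

    # untested alternative: convert the partition table in place from MBR to GPT
    apt install gdisk
    sgdisk --mbrtogpt /dev/sdb
    # print the table afterwards to confirm it is now GPT
    sgdisk --print /dev/sdb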

The system is then ready to be booted and will expand the root partition to fill the entire disk.

Once that is done, I reboot into my actual system again, mount the backup HDD and shrink the root partition back down to 5GB. I also add the additional partitions mentioned above.
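
The shrink-and-repartition step looks roughly like this. This is a sketch only: /dev/sdb, the partition numbers and the start/end positions are placeholders and need to be checked against the real layout before running anything like it.

    # sketch: shrink the expanded root filesystem back to 5GB,
    # shrink the partition to match, then add the two extra partitions
    e2fsck -f /dev/sdb2
    resize2fs /dev/sdb2 5G
    parted /dev/sdb resizepart 2 6GiB   # parted asks for confirmation when shrinking
    # empty slot reserved for a future x64 system
    parted /dev/sdb mkpart rescue ext4 6GiB 11GiB
    # the big NTFS partition for the actual backups
    parted /dev/sdb mkpart backup ntfs 11GiB 100%
    mkfs.ntfs -Q -L backup /dev/sdb4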

Lastly, I can boot back into the Debian system and configure it.

I've kept track of the cornerstone steps of installation in this script:

    #!/usr/bin/env bash
    
    # this script ISN'T a 100% faithful recreation of this system
    # but it outlines the most important parts that set it apart
    # from a baseline Debian Bookworm
    
    # set the hostname to something sensible
    hostnamectl hostname backupper
    
    # ensure up to date packages
    apt update
    apt -y upgrade
    
    # install essentials
    apt -y install avahi-daemon ntfs-3g
    
    # install backup / restore related packages
    apt -y install ecryptfs-utils clonezilla
    apt -y install rsync rclone duplicity
    apt -y install zulumount-cli zulucrypt-cli zulusafe-cli
    
    # various utilities
    apt -y install git wget tmux
    apt -y install figlet lolcat
    ln -s /usr/games/lolcat /usr/bin/lolcat
    wget -P /usr/share/figlet/ http://www.figlet.org/fonts/chunky.flf
    
    # create a better pre-login banner
    figlet -f chunky "I make backups" | lolcat -S 3 -F 0.2 -f > /etc/issue
    echo >> /etc/issue
    
    # install a browser
    apt -y install w3m
    
    # backup sessions can be long, so here's the original rogue
    apt -y install bsdgames-nonfree
    ln -s /usr/games/rogue /usr/bin/rogue

The system only has a root user without a password. This isn't an issue, since the disk will be offline and powered off nearly all the time.

In addition to the steps above, I have disabled some daemons, moved sshd to another port and hardened its configuration. I can SSH into the machine with my default key if I ever need to.
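
The SSH part boils down to a few configuration lines; roughly this (the port number is made up, and the drop-in path assumes Debian's default include of sshd_config.d):

    # sketch: move sshd to a non-standard port and allow key-based root login only
    printf '%s\n' \
        'Port 2222' \
        'PermitRootLogin prohibit-password' \
        'PasswordAuthentication no' \
        > /etc/ssh/sshd_config.d/hardening.conf
    systemctl restart ssh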

Backup Scripts

Here's an overview of the backup scripts that come with the disk:

Further, there are some tools which were written for the Raspberry Pi but can be used _from_ a running Raspberry Pi to back up mounted disks. They differ in what they can do and whether or not they can shrink backups.

Other tools

Links

RonR-RPi-image-utils

pi-safe backup utilities. Can't back up the running system, but will shrink backed-up partitions. Useful for SD cards.

Forum thread where you can download the usb-boot tools

---

see all my articles