RPi Backup via dd: Good idea via SSH?

Hello,
I frequently have to back up my Raspberry Pi.

Until now I used to shut down the Raspberry, plug the SD card into my Manjaro system, and back up the whole card via
sudo dd if=...

I now finally had time to get the same thing going via SSH by using
ssh pi@[ip] "sudo -S dd if=... | gzip -" | dd of=... status=progress
(With the awesome help of this community)
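
For reference, a filled-in version of that pipeline (IP, device and file names are placeholders; /dev/mmcblk0 is typically the SD card on a Pi, and passwordless sudo for the pi user is assumed):

# backup: raw-read the card on the Pi, compress in transit, store locally
ssh pi@192.168.1.50 "sudo dd if=/dev/mmcblk0 bs=4M | gzip -" | dd of=pi-backup.img.gz status=progress

# restore: decompress on the fly while writing to the new card (double-check /dev/sdX!)
gunzip -c pi-backup.img.gz | sudo dd of=/dev/sdX bs=4M status=progress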

Backing up by directly plugging in the SD card creates an image of nearly the size of the card, ~30 GB.
Backing up via SSH creates files of only ~5.3 GB.

Here are my questions:

  1. Is the file size of the SSH backup smaller because of the gzip compression?
  2. Even though the SSH backup .img file is much smaller, does it still include absolutely everything?
  3. The SSH backup is made while the RPi is running and takes a long time → is there a chance that the system changes too much during the backup, so that the created .img will not run after flashing because of inconsistencies caused by the slow backup of changing system files?

I'm afraid that I keep backing up and backing up, and in an emergency the images won't work.
I did a test where flashing such a compressed .img generated via SSH recovered the system without any errors (so far).
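
One way to get more confidence without flashing: loop-mount the image and run a read-only filesystem check after each backup. A sketch, assuming the usual boot+root layout (file and device names are placeholders; a gzipped image has to be unpacked first, e.g. with gunzip -k):

# attach the image; -P scans the partition table and creates /dev/loopXp1, /dev/loopXp2, ...
sudo losetup -fP --show pi-backup.img
# read-only check of the root filesystem (here assumed to be the second partition)
sudo fsck.ext4 -fn /dev/loop0p2
# detach again
sudo losetup -d /dev/loop0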

dd of a mounted disk is not “proper”. Will it work most of the time? Yes, but it is a “dirty” backup. Better than nothing, but not a good backup plan. There are examples of how to back up a live system, which can be done via ssh, but they involve rsync and exclude the state filesystems (/run, /sys, /proc, etc.).

Yes, the use of gzip will make an output file that is smaller than the raw image created with dd.

3 Likes

Hi @SHUred,

I suspect the size difference is because of multiple reasons:

  1. When you back it up directly, there is no compression. And via ssh there is.
  2. I suspect there is a lot of free space on the SD card? Free space is…well, free, so it compresses extremely well, as there is no info to keep. I suspect if you back up a mostly empty 20 GB card with gzip like above, it’ll only be a few MBs, if not KBs (see the zero-fill sketch below).
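
One caveat on that: deleted files leave stale data behind in the “free” space, and that does not compress away. A common trick, if the image compresses worse than expected, is to overwrite the free space with zeros on the Pi before taking the backup (the filler file name is arbitrary):

# run on the Pi: fill the free space with zeros, then delete the filler file
# (dd ends with “no space left on device” once the card is full; that is expected)
sudo dd if=/dev/zero of=/zero.fill bs=1M status=progress
sudo rm /zero.fill
sync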

Seeing as you restored it fine, I don’t think there’ll be a problem in that regard.

Hope this helps!

Edit:

What @0n0w1c said. :point_up_2:

2 Likes

Why not use rsync?

The initial sync would take time, but then you could run rsync periodically using a systemd timer (see the sketch below).

Should the need to restore arise, you can rsync the backup location back onto a new SD card using your workstation.
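
A minimal sketch of that setup (unit names, host, paths and the exclude list are made up for illustration; reading root-owned files additionally requires rsync to run with root rights on the Pi side):

# /etc/systemd/system/pi-backup.service
[Unit]
Description=Pull an rsync backup of the Pi

[Service]
Type=oneshot
ExecStart=/usr/bin/rsync -aAXH --delete \
    --exclude=/dev --exclude=/proc --exclude=/sys --exclude=/run --exclude=/tmp \
    pi@192.168.1.50:/ /srv/pi-backup/

# /etc/systemd/system/pi-backup.timer
[Unit]
Description=Run the Pi backup daily

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target

Enable it with sudo systemctl enable --now pi-backup.timer.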

1 Like

Regarding rsync:
Currently I'm using rsync on my Manjaro system via Timeshift.
For my Raspberry I think this would be problematic because:

  • Using rsync without Timeshift → no version control after the sync is done → risk of syncing errors that are already present in the current system? I would like to be able to go back to e.g. the 3rd previous backup.
  • Using Timeshift → I would need ext4 storage mounted to the system?
  • rsync generally just copies files and does not create flashable images for “fast recovery”?

In general:
One very important part of my Raspberry are my docker volumes located at
/var/lib/docker/volumes
which (I think) only root has access to.
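
One common pattern for those volumes is to tar them through a throwaway container, which side-steps the root-only permissions (the volume and file names here are just examples):

# mount the volume read-only into a temporary container and tar its contents
docker run --rm \
  -v influxdb-data:/data:ro \
  -v "$PWD":/backup \
  alpine tar czf /backup/influxdb-data.tar.gz -C /data .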

Looks like my problem is that I want to back up a running system while it is not running :sweat_smile: → so it's impossible.

I think my solution will be:

  • Automated backups with dd via ssh (e.g. via cron, see the crontab sketch below)
  • Every time before big changes I will create a manual SD card backup; these are the “clean” backups
  • I will research whether backing up /var/lib/docker/volumes/* is convenient in my case
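
The cron part could be as simple as this (script name and schedule are placeholders):

# crontab -e: run the dd-over-ssh backup script every Sunday at 03:00
0 3 * * 0 /usr/local/bin/pi-dd-backup.sh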

Thanks for your fast and helpful replies

  • Then you may have a look at backuppc;
  • There are similar solutions which use hardlinks (sketch below)
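
The hardlink idea also works with plain rsync via --link-dest: every run produces a full-looking snapshot directory, but unchanged files are hardlinked into the previous snapshot, so only changed files cost space. A rough sketch (host, paths and dates are illustrative):

PREV=/srv/pi-backup/2024-05-01          # previous snapshot
NEW=/srv/pi-backup/2024-05-02           # today's snapshot
rsync -aAXH --delete --link-dest="$PREV" \
    --exclude=/dev --exclude=/proc --exclude=/sys --exclude=/run --exclude=/tmp \
    pi@192.168.1.50:/ "$NEW/"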

rsync

Pro: You would get a clean filesystem without fragmentation :wink:
Con: You would need to prepare the bootloader and partitions beforehand
Solution: Use a base image, then overwrite it with rsync.
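
i.e. roughly like this (device and paths are placeholders; triple-check /dev/sdX before running dd; the excludes also protect the card's empty mount-point directories from --delete):

# 1. write a known-good base image to the fresh card
sudo dd if=base.img of=/dev/sdX bs=4M status=progress
# 2. mount the card's root partition and overlay the latest rsync backup
sudo mount /dev/sdX2 /mnt
sudo rsync -aAXH --delete \
    --exclude=/dev --exclude=/proc --exclude=/sys --exclude=/run --exclude=/tmp \
    /srv/pi-backup/latest/ /mnt/
sudo umount /mnt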
P.S. I work with lots of embedded devices like the RasPi and would NEVER use dd on a running system :man_shrugging:

3 Likes
  1. rpi-imager

  2. pi-shrink

One caveat: my Pi 3 runs Buster… but both of these worked like a charm.
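
In case it helps someone reproduce this: as far as I remember, PiShrink is simply run on a dd image (the file name is an example; -z re-compresses the shrunken image with gzip):

# shrink the root filesystem inside the image to its minimal size
sudo pishrink.sh -z pi-backup.img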

:point_up: :point_up: :point_up:

2 Likes

rsync synchronizes your files from one location to another.

If you want version control you could use git on the target - perhaps just attach a USB device.

  • Create a branch and check it out
  • Then rsync
  • When done, stage and commit the changes
  • Checkout master
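
Roughly like this, assuming the backup tree is a git repo on the USB device (host, paths and branch names are made up; .git is excluded so --delete leaves the repo itself alone):

cd /media/usb/pi-backup
git checkout -b backup-$(date +%F)   # snapshot branch, e.g. backup-2024-05-02
rsync -aAXH --delete --exclude=/.git \
    --exclude=/dev --exclude=/proc --exclude=/sys --exclude=/run --exclude=/tmp \
    pi@192.168.1.50:/ .
git add -A
git commit -m "backup $(date +%F)"
git checkout master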

@merlock

rpi-imager is a nice tool - it has made it into the official repo - I don't recall it being there a couple of days ago.

3 Likes

I also thought about pushing this stuff to git, because I'm already doing that with my dotfiles.
At the moment my content is not that big, but I added a dockerized InfluxDB database and soon it could grow.
Also, I don't know about all the “secrets” and “privacy data” I would be pushing to git…

All of this is stuff I simply want to fully re-flash in case e.g. the SD card breaks.

Nice!
What exactly is “rpi-imager”?
On Git I can't find a link to documentation or anything…

It is the raspberry pi imaging tool.

It can download and write a whole heap of different images, and for some of the options you can even specify system settings such as the ssh server and the initial username and password - among other things.

1 Like

This is only possible in a filesystem with snapshots!

:man_shrugging:

And even then it is not certain that the system will run unharmed when started. (Only the filesystem itself is protected from harm.)

Like btrfs send/receive :rofl:

1 Like

It is possible to use xz and xzcat instead of gzip (see the sketch at the end of this post), and rsync with the proper flags plus --delete and --exclude={} works on a running system, for example:

rsync -aAXHv --delete /boot/* /media/$USER/$label/
rsync -aAXHv --delete --exclude={/boot/,/dev/,/proc/,/sys/,/tmp/,/run/,/mnt/,/media/,/lost+found,/var/cache/pacman/pkg/*} / /media/$USER/$label/
Search for a tutorial.
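
The xz variant of the dd pipeline could look like this (IP, device and file names are placeholders; -T0 uses all cores, though xz is noticeably heavier on a Pi than gzip):

# backup: compress on the Pi with xz, stream the result to the workstation
ssh pi@192.168.1.50 "sudo dd if=/dev/mmcblk0 bs=4M | xz -T0 -" > pi-backup.img.xz

# restore: decompress on the fly while writing to the new card
xzcat pi-backup.img.xz | sudo dd of=/dev/sdX bs=4M status=progress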

1 Like

Thanks for your advice!
I will fight my way through!

1 Like
