Best way to create system and files backups

You’re right, Clonezilla is still the Swiss army knife, but when it comes to server backups it’s weak, and that’s the thread opener’s main problem. A good backup strategy for servers is far more difficult, especially if the server has to run 24/7.

I see. I thought the OP wanted to store the backup on an online server, not to back up the server itself. If the server is on a VM, then the easiest way would be to copy the whole VDI file (the file that represents the virtual hard drive) and store it somewhere else.

Well, in that case you have two options. The first is to use the provider’s paid backup service, which usually does exactly what you are looking for: saving snapshots of the image.

The other choice is what I already outlined: save only the variable data (web, database, mail, whatever you have) and logs. The rest of the server can be recreated at any time and its contents should not matter.
For that I recommend using rsync over SSH, but you can use FTP as well, just make sure it’s encrypted.
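A minimal sketch of that data-only approach, assuming a MySQL/MariaDB database, web content under /var/www, and an SSH-reachable storage host; the host name, user and paths are all placeholders:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Dump the database first; copying live database files is not safe.
mkdir -p /srv/backup/db-dumps
mysqldump --all-databases > /srv/backup/db-dumps/all-databases.sql

# Mirror the variable data over SSH.
# -a keeps permissions/ownership/timestamps, -z compresses in transit,
# --delete makes the remote copy match the source exactly.
rsync -az --delete -e ssh \
    /var/www /var/log /srv/backup/db-dumps \
    backupuser@backup.example.com:/backups/myserver/
```

Run it from cron or a systemd timer and the remote copy stays current without ever transferring the parts of the system you can simply reinstall.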

Note that you likely have transfer quotas, and transferring manual backups will count towards them, which is another reason doing full system backups on your own is not recommended.

Not really. Let’s say I have:

  1. VPS or dedicated server
  2. FTP storage

With those, I want to make a full backup of the VPS system to the FTP storage. In case of data loss / crash / hack / meteorite fall, I can easily download a snapshot from the storage and restore the system along with all the data.

I could just save my sites and other data, but a system snapshot allows you to save much more: packages, configs, and so on. The system administrator does not have to set everything up again. In addition, a full snapshot would even allow you to switch to another hosting provider without losing your system.

Unfortunately, cheap providers do not offer this. It is much more profitable for me to rent 150 GB of FTP storage than to look for a hosting provider with such a service.

I know what the purpose of the backup is; I do that every week for my system.
You could ask the hosting provider to install Clonezilla on another storage volume of the VPS and then use it (through the control panel) to create the backup.

I think you need a backup tool with these useful features: compression, deduplication, encryption, transfer, and multiple snapshots with different timestamps. Together they are a good solution for small storage.

Timeshift uses rsync, which does not support deduplication. You could do the compression manually or automatically, but that is much slower than BorgBackup/Restic/Kopia, which have many of the features you need.
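As a rough sketch of how that looks with BorgBackup (the repository location, passphrase and paths below are placeholders, and borg must also be installed on the remote host for ssh:// repositories):

```bash
#!/usr/bin/env bash
set -euo pipefail

export BORG_REPO=ssh://backupuser@backup.example.com/./borg-repo
export BORG_PASSPHRASE='change-me'      # encrypts the repository

# One-time setup: create an encrypted, deduplicating repository.
# borg init --encryption=repokey-blake2 "$BORG_REPO"

# Each run adds a compressed, deduplicated, timestamped archive.
borg create --compression zstd --stats \
    ::'{hostname}-{now:%Y-%m-%d}' \
    /etc /var/www /home

# Keep a limited set of snapshots and reclaim the freed space.
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6
borg compact   # borg >= 1.2
```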

See the different backup programs here:

https://wiki.archlinux.org/title/Synchronization_and_backup_programs


That is masking an entirely different issue than data protection.
You could clean the config of secrets and push it to a private repo to keep track of its changes, or use a tool like Ansible and write a playbook for setting up the server.
Config is static, and packages are already “backed up” in their respective repositories.
If rsync+SSH is not an option, then just use tar with -g; no need to overthink it.
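A rough sketch of that (paths are illustrative, and the pacman line assumes an Arch-based system):

```bash
#!/usr/bin/env bash
set -euo pipefail

BACKUP_DIR=/srv/backup             # local staging area (placeholder)
SNAR=$BACKUP_DIR/state.snar        # tar's incremental state file

mkdir -p "$BACKUP_DIR"

# Record the explicitly installed packages; this list plus the config
# is enough to recreate the system without a full image.
pacman -Qqe > "$BACKUP_DIR/pkglist.txt"

# -g / --listed-incremental: the first run is a full archive and writes
# the state file; later runs with the same state file archive only changes.
tar --listed-incremental="$SNAR" -czf \
    "$BACKUP_DIR/backup-$(date +%F).tar.gz" \
    /etc /var/www "$BACKUP_DIR/pkglist.txt"
```

Deleting the state file starts a fresh full-backup cycle, which is relevant to the quota point below.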

As for backing up everything:
If you do full backups, you will hurt your quota.
If you do incremental backups, every system upgrade will add to what they consume, and a restore (the full backup plus every increment since) will likely exhaust your quota.
If you do differential backups, each one made after a system upgrade will keep eating away at your quota, because every differential contains all changes since the last full backup.
With both the incremental and the differential method, you will have to do a full backup after every system upgrade to serve as a new starting state and minimize the drain on the quota.
As you can see, you do not have a good choice here.

Ultimately you can change providers to one that offers backing up the image, do what I advised and save only the data you need, or pick whichever of these backup methods is the least bad for you.

I believe it is possible to do incremental backups but limit how many are kept. For example, Timeshift can make daily backups and delete the old ones once there are more than N.
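Roughly what that looks like with Timeshift’s command-line interface (the retention counts themselves live in Timeshift’s schedule settings; the snapshot name below is made up):

```bash
# Create an on-demand snapshot tagged as a daily one.
sudo timeshift --create --comments "pre-upgrade" --tags D

# List existing snapshots.
sudo timeshift --list

# Delete a specific old snapshot by name.
sudo timeshift --delete --snapshot '2024-01-01_12-00-00'
```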

A friend of mine and I are planning to write an open-source service that would let us do this conveniently and store the backups wherever we want (including on FTP). If there really is no solution that covers my needs, maybe we really should code it.

I wanted something all-in-one: system backups, backups of individual files, automatic deletion of old backups, automatic upload to an FTP server. But I was sure that someone had already implemented it a long time ago and that I should not reinvent the wheel.

You can’t execute anything on an FTP server, so to do that you would have to redownload the backup archives, merge them, and upload them again. Even if you merge increments 1 and 2, the data will still accumulate over time.
Also, you should encrypt all the archives; that goes double considering you are storing them on an FTP server that is not controlled by you.
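For the encryption part, a simple sketch using symmetric GPG before the upload (the archive name, credentials and FTP paths are placeholders, and lftp is just one client you could use):

```bash
#!/usr/bin/env bash
set -euo pipefail

ARCHIVE=backup-$(date +%F).tar.gz

# Encrypt with a passphrase before the archive ever leaves the machine.
# (gpg prompts for the passphrase; automate with --batch/--passphrase-file.)
gpg --symmetric --cipher-algo AES256 --output "$ARCHIVE.gpg" "$ARCHIVE"

# Upload only the encrypted file to the FTP storage.
lftp -u ftpuser,ftppassword \
    -e "put $ARCHIVE.gpg -o /backups/$ARCHIVE.gpg; bye" \
    ftp.example.com
```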

Update:

Restic version 0.14.0 or newer supports zstd compression (enabled by default for new repositories).
Its --compression option accepts off, auto or max.
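A hedged example of a Restic setup with those features, assuming an SFTP-reachable host (restic has no plain FTP backend, though rclone can bridge to one); repository location, password handling and retention numbers are placeholders:

```bash
#!/usr/bin/env bash
set -euo pipefail

export RESTIC_REPOSITORY=sftp:backupuser@backup.example.com:/backups/restic
export RESTIC_PASSWORD='change-me'     # every restic repository is encrypted

# One-time setup.
# restic init

# Deduplicated, encrypted, compressed snapshot (restic >= 0.14 for zstd).
restic backup --compression max /etc /var/www /home

# Keep a limited number of snapshots and free the space they held.
restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 6 --prune
```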


The issue with Clonezilla is that you need to boot the ISO to do the backup (unless I am wrong).

What I would like is something similar to what Macrium Reflect and others do on Windows, which is to create a snapshot of a running system and send it to some location as one giant file. This file can be mounted and browsed, and individual files can simply be copied out. Alternatively, if the drive dies, you can boot the Macrium ISO and restore the image to a drive of the same size or larger. A very easy and verifiable backup, all via a GUI.

Btrfs lets you make snapshots, but it’s a mess when you want to store them on, for example, an FTP server that doesn’t do Btrfs. Also, how do you mount and browse files, restore an entire system, and so on? It is very complex and you can easily do something destructive.
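For reference, this is roughly what shipping a Btrfs snapshot to dumb storage looks like, assuming the root filesystem is a subvolume and /.snapshots exists (all paths are placeholders):

```bash
# Create a read-only snapshot of the root subvolume.
sudo btrfs subvolume snapshot -r / /.snapshots/root-$(date +%F)

# Serialize it into a single compressed file that any storage can hold.
sudo btrfs send /.snapshots/root-$(date +%F) | zstd > /srv/backup/root-$(date +%F).btrfs.zst

# Restoring means receiving the stream back into a Btrfs filesystem:
# zstd -d < root-YYYY-MM-DD.btrfs.zst | sudo btrfs receive /mnt/restore
```

Browsing a single file still means receiving the stream into a Btrfs filesystem first, so the convenience complaint stands.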

Not necessarily. The AUR contains a clonezilla package that can be installed and used while the distro is running, so I assume it must be capable of doing the job without booting an ISO. I admit I’ve never used it that way, but considering Clonezilla excludes all the busy partitions before loading its interface, I think it can be used the normal way without booting. Especially if the source partition (where the thing you want to back up is located) and the target (home) partition are different from the distro partition.

:point_up: :point_up: :point_up:


Clonezilla and “image” backups are useful for long-term archives, data recovery, and migrating devices that require a cloned copy of the block device, bit-for-bit. :+1:

However, they are poor choices for traditional and frequent backups, and are non-intuitive if you want to browse or view your backups, let alone grab a single file from the backup. :-1:


@Zesko provided the best overall solution that addresses most use-cases; especially in regards to the OP of this thread.

Another file-based alternative is BackInTime, which uses rsync as the backend, and has a user-friendly GUI.


I can only agree with you. Reflect is a wonderful piece of software and the program I miss the most from my Windows days. It’s so incredibly handy to be able to make a full image of a running system and easily schedule it with version control. You can even do differential and incremental backups to save disk space.
I believe the lack of anything like it on Linux comes down to Linux not having anything similar to the Windows Volume Shadow Copy Service.

ZFS (and Btrfs, but like-for-like ZFS is superior, FIGHT ME, BRO! :facepunch: ) innately supports this, even more streamlined due to its block-based (records-based) design, copy-on-write, and snapshots.

It’s not that “Linux lacks anything like this”; rather, it’s an unfortunate situation of there being no user-friendly GUI options, and a legal grey area with regard to ZFS in particular.

So, yes it’s very possible (and powerful) to do in the terminal in Linux. (I know, because I’ve done it myself.)

But we lack a GUI, and ZFS is a “second-class citizen” on Linux because of uncertain legal bullcrap.
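As a rough illustration of that terminal workflow (pool, dataset and snapshot names are made up):

```bash
# Take an atomic snapshot of a running dataset.
zfs snapshot tank/home@2024-06-01

# Serialize it into a file for dumb storage, or stream it to another pool.
zfs send tank/home@2024-06-01 | zstd > /backup/home-2024-06-01.zfs.zst
zfs send tank/home@2024-06-01 | ssh backuphost zfs receive backuppool/home

# Later snapshots can be sent incrementally against the previous one.
zfs send -i tank/home@2024-06-01 tank/home@2024-06-02 | ssh backuphost zfs receive backuppool/home
```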


Vorta is a front-end GUI for Borg.

If you are interested in the current benchmark in 2022: Borg, Restic, Kopia vs. Bupstash
