Best way to create system and files backups

You’re right, Clonezilla is still the Swiss Army knife, but when it comes to server backups it is weak, and that is the main problem of the thread opener. A good backup strategy for servers is far more difficult, especially if the server has to run 24/7.

I see. I thought the OP wanted to store the backup on an online server, not to back up the server itself. If the server runs in a VM, then the easiest way would be to copy the whole VDI file (the file that represents the virtual hard drive) and store it somewhere else.

Well, in that case you have two options. The first is to use the provider’s paid backup service, which usually does exactly what you are looking for: saving snapshots of the image.

The other choice is what I already outlined: save only the variable data (web, database, mail, whatever you have) and logs; the rest of the server can be recreated at any time and its contents should not matter.
For that I recommend rsync over SSH, but you can use FTP too, just make sure it is encrypted.
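
A minimal sketch of the rsync-over-SSH approach; the paths, user and host below are placeholders, and the database directory is assumed to contain dumps rather than live database files:

```bash
#!/bin/sh
# Mirror only the variable data to a remote backup host over SSH.
# -a       preserve permissions, ownership, timestamps, symlinks
# -AX      preserve ACLs and extended attributes
# --delete keep the remote copy an exact mirror (files removed locally disappear there too)
rsync -aAX --delete \
    /var/www /srv/db-dumps /var/mail /var/log \
    backupuser@backup.example.com:/srv/backups/myserver/
```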

Note that you likely have transfer quotas, and transferring manual backups will count against them, which is another reason doing full system backups on your own is not recommended.

Not really. Let’s say I have:

  1. VPS or dedicated server
  2. FTP storage

With those, I want to make a full backup of the VPS system to the FTP storage. In case of data loss / crash / hack / meteorite fall, I can easily download a snapshot from the storage and restore the system along with all the data.
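
For reference, a rough sketch of what such a full backup could look like with standard tools; the host, credentials and paths are placeholders, and note that archiving a running system this way is not a crash-consistent snapshot:

```bash
#!/bin/sh
# Archive the whole filesystem, skipping pseudo-filesystems and the backup itself.
STAMP=$(date +%F)
tar --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/run \
    --exclude=/tmp --exclude=/backup \
    -czpf "/backup/system-$STAMP.tar.gz" /

# Upload over FTPS; ftp:ssl-force makes lftp refuse to send credentials in clear text.
lftp -u ftpuser -e "set ftp:ssl-force true; put /backup/system-$STAMP.tar.gz; bye" \
    ftp.storage.example.com
```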

I could just save my sites and other data, but a system snapshot allows you to save much more: packages, configs, and so on. The system administrator does not have to do everything again. In addition, a full snapshot will even allow you to switch to another hosting provider without losing your system.

Unfortunately, cheap providers do not offer this. It is much more cost-effective for me to rent 150 GB of FTP storage than to look for a host with such a service.

I know what the purpose of the backup is; I do that every week for my system.
You could ask the hosting provider to install Clonezilla on another storage volume of the VPS and then use it (through the control panel) to create the backup.

I think you need a backup tool with these capabilities: compression, deduplication, encryption, remote transfer, and multiple snapshots with different timestamps. Together they are a good solution for small storage.

Timeshift uses rsync, which does not support deduplication. You can do the compression manually or automate it, but that is much slower than BorgBackup/Restic/Kopia, which have many of the features you need.
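
As an illustration, a minimal BorgBackup sketch covering those points (deduplication, encryption, compression, timestamped archives); the repository location and backed-up paths are placeholders:

```bash
# Create an encrypted repository on the remote host (one-time step).
borg init --encryption=repokey ssh://backupuser@backup.example.com/./borg-repo

# Each run creates a new timestamped, compressed, deduplicated archive.
borg create --compression zstd --stats \
    ssh://backupuser@backup.example.com/./borg-repo::'{hostname}-{now}' \
    /etc /var/www /home
```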

See this comparison of different backup programs:

https://wiki.archlinux.org/title/Synchronization_and_backup_programs

That is conflating an entirely different issue with data protection.
You could strip secrets from your config and push it to a private repo to keep track of its changes, or use a tool like Ansible and write a playbook for setting up the server.
Config is static, packages are already “backed up” on their respective repositories.
If rsync+SSH is not an option, then just use tar with -g, no need to overthink it.
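
A minimal sketch of the tar approach with -g (--listed-incremental); the snapshot file and paths are placeholders:

```bash
# First run with a fresh snapshot file produces a full (level 0) archive.
tar -g /backup/data.snar -czpf /backup/data-full.tar.gz /etc /var/www

# Later runs against the same snapshot file only store what changed since then.
tar -g /backup/data.snar -czpf "/backup/data-inc-$(date +%F).tar.gz" /etc /var/www
```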

As for backing up everything:
If you do full backups, every single run will hurt your quota.
If you do incremental backups, every system upgrade will add a large increment on top, and a restore (the full backup plus all increments) will likely exhaust your quota.
If you do differential backups, every backup taken after a system upgrade will keep re-uploading those changes and continuously eat away at your quota.
With both the incremental and the differential method you will have to do a fresh full backup after every system upgrade, to use as a new starting state and minimize the drain on the quota.
You do not have a good choice here, as you can see.

Ultimately you can change providers to one that offers backing up the image, do what I advised and save only the data you need, or pick whichever of those imperfect backup methods is the least bad for you.

I believe it is possible to have incremental backups but limit their number. For example, Timeshift can make daily backups and delete old ones once there are more than N.
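
That kind of retention policy is also built into the tools mentioned above. For instance, restic can prune old snapshots automatically; the repository location below is a placeholder, and it assumes the repository lives somewhere restic can reach (e.g. over SFTP, since plain FTP is not a native backend):

```bash
# Keep the last 7 daily and 4 weekly snapshots, delete and compact the rest.
restic -r sftp:backupuser@backup.example.com:/srv/restic-repo \
    forget --keep-daily 7 --keep-weekly 4 --prune
```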

A friend and I are planning to write an open-source service that would let us do this conveniently and store the backups wherever we want (including on FTP). If there really is no solution that covers my needs, maybe we should actually code it.

I wanted something all-in-one: system backups, backups of individual files, automatic deletion of old backups, automatic upload to an FTP server. But I was sure that someone had already implemented this long ago and that I should not reinvent the wheel.

You cannot execute anything on an FTP server, so to do that you would have to redownload the backup archives, merge them, and upload them again. Even if you merge archives 1 and 2, the data will still accumulate over time.
Also, you should encrypt all archives; that goes double considering you are storing them on an FTP server that is not controlled by you.
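
A minimal sketch of encrypting an archive before upload, using symmetric GPG; the file name is a placeholder:

```bash
# Encrypt with a passphrase; this produces system-backup.tar.gz.gpg, upload that instead.
gpg --symmetric --cipher-algo AES256 system-backup.tar.gz

# To restore later:
gpg --decrypt --output system-backup.tar.gz system-backup.tar.gz.gpg
```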

Update:

Restic version 0.14.0 or newer supports zstd compression by default. It is controlled with the --compression option, which accepts off, auto or max.
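
A short sketch of how that looks in practice, with a placeholder repository path:

```bash
# New repositories created by restic >= 0.14 use the compression-capable format.
restic -r /srv/restic-repo init

# Back up with the strongest compression setting (the default is "auto").
restic -r /srv/restic-repo backup --compression max /etc /var/www
```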