Hi Everyone
I need some help
I’m currently self-hosting some of my applications on DigitalOcean, and I run containers using Portainer CE. I was wondering how you guys keep backups for applications running on Docker.
I’m currently using DigitalOcean’s snapshot feature, but is there a better way? Any help on this is highly appreciated.
I use Docker in Proxmox and back up all containers.
I use an Ubuntu VM for all my containers in Proxmox and make backups of the VM onto my ZFS pool.
Uuuh… Timeshift and Borg??
Hey that is the plot to First Contact.
Unraid with Duplicacy and Appdata Backup, incremental to Backblaze.
Proxmox Backup Server (PBS) snapshotting all my VMs/LXCs.
For external VPSes and anything that can’t run the PBS client, I rsync important data into my home network first, then do a file-based backup of that data to PBS via the PBS client tool. All of this is automated through cron jobs.
Those backups then get synced to a second datastore for a bit of redundancy.
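The cron automation described above could look roughly like this. A minimal sketch, assuming SSH key auth and a PBS API token are already set up; the hostnames, paths, and datastore name are placeholders, not from the post:

```shell
# m  h  dom mon dow   command  (all hosts/paths/repository names are placeholders)
30 2 * * *  rsync -az --delete backup@vps1.example.com:/srv/appdata/ /tank/staging/vps1/
45 2 * * *  proxmox-backup-client backup vps1.pxar:/tank/staging/vps1 --repository 'backup@pbs@pbs.lan:tank'
```

The second job runs after the first so the PBS archive always captures a settled copy of the staged data.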
I back up all the mounted Docker volumes once every hour (snapshots). Additionally, I create dumps of all databases with https://github.com/tiredofit/docker-db-backup (hourly or daily, depending on the database).
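A sidecar dump container for this could be launched like the sketch below. The image name and environment variable names are from memory of that project’s README, so verify them against the repo linked above; the network, database credentials, and paths are placeholders.

```shell
# Hedged sketch: run tiredofit's db-backup image next to a Postgres container.
# All names and credentials below are placeholders; check the project's README
# for the current env var names before using.
run_db_dumper() {
    # DB_DUMP_FREQ is in minutes (60 = hourly, matching the post above).
    docker run -d --name db-backup \
        --network mynet \
        -e DB_TYPE=postgres \
        -e DB_HOST=postgres \
        -e DB_NAME=appdb \
        -e DB_USER=app \
        -e DB_PASS=secret \
        -e DB_DUMP_FREQ=60 \
        -v /srv/db-dumps:/backup \
        tiredofit/db-backup
}
```

Dumping through the database’s own tooling gives you consistent restores, which raw volume snapshots of a running database cannot guarantee.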
ZFS snapshots.
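For reference, a timestamped ZFS snapshot of the dataset holding the Docker data is a one-liner; a minimal sketch, where the dataset name is a placeholder:

```shell
# Sketch: recursive, timestamped ZFS snapshot.
# "tank/docker" below is a placeholder dataset name.
snapshot_dataset() {
    dataset="$1"
    zfs snapshot -r "${dataset}@auto-$(date +%Y%m%d-%H%M)"
}
# e.g. from cron:  snapshot_dataset tank/docker
```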
When backing up Docker volumes, shouldn’t the container be stopped first? I can’t see any support for that in the backup tools mentioned.
Yes, the containers do need to be stopped. I actually built a project that does exactly that.
Thanks, I will look into this.
I use Nautical. It will stop your containers before performing an rsync backup.
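The stop-backup-start pattern discussed here can be sketched in a few lines of shell; tools like Nautical automate the same idea. The container name and both paths are placeholders:

```shell
# Sketch: stop a container, archive its data directory, restart it.
# "myapp" and the paths are placeholders.
backup_container() {
    name="$1"; data_dir="$2"; out_dir="$3"
    mkdir -p "$out_dir"
    docker stop "$name"
    tar -czf "$out_dir/${name}-$(date +%Y%m%d-%H%M).tar.gz" -C "$data_dir" .
    docker start "$name"
}
# usage: backup_container myapp /srv/myapp/data /mnt/backups
```

The downtime is only as long as the tar run; for large volumes, a filesystem snapshot taken while the container is briefly stopped keeps that window even shorter.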
I have bind mounts to NFS shares that are backed by ZFS pools, with snapshots and sync jobs to another storage device. All containers are ephemeral.
On Proxmox, I use a Hetzner Storage Box for my backup solution.
For databases and data I use restic-compose-backup, because you can use labels in your docker-compose files.
For config files I use a git repository.
Kopia has been great.
I use Duplicati to back up to a secure off-site location. Useful for something like Vaultwarden.
Most of mine are lightweight, so private Git repos.
For big data I have two NAS that sync on the daily.
Cron jobs to back up important folders to a separate disk.
Git repo(s) for services & configs with weekly automated commits and pushes
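The weekly auto-commit can be a tiny script driven by cron; a sketch, where the repo path, remote, and branch names are placeholders:

```shell
# Sketch: commit any pending changes in a config repo; run weekly from cron.
# Repo path, remote, and branch are placeholders.
autocommit() {
    repo="$1"
    git -C "$repo" add -A
    # Only commit when the index actually changed.
    git -C "$repo" diff --cached --quiet || \
        git -C "$repo" commit -m "auto: config snapshot $(date +%F)"
    # git -C "$repo" push origin main    # if a remote is configured
}
# crontab: 0 4 * * 0 /usr/local/bin/autocommit.sh /srv/configs
```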
I do the reverse… all configs are Ansible playbooks and files, and I just push them to the servers. That way I can spin up a new machine from scratch, completely automated, within minutes: just the time it takes the machine to set itself up.