Backup script for docker-compose/ZFS/rsync based Nextcloud installation

Hi folks,

First time posting here!

My current environment is:

Where the folders are:

  • Server
    • ~/nextcloud/ to host the docker-compose, etc.
    • ~/nextcloud/scripts to host some scripts I use for nextcloud (like update.sh and backup.sh)
    • /tank/nextcloud-db a ZFS subvolume to store the MariaDB database
    • /tank/nextcloud a ZFS subvolume to store nextcloud
  • Pine64
    • /HDD

The script does the following:

  • Sets Nextcloud to maintenance mode
  • Dumps a MariaDB backup
  • Takes a ZFS snapshot of the nextcloud subvolume
  • Disables maintenance mode (so total downtime is just the duration of the MariaDB dump, since ZFS snapshots are instantaneous!)
  • Backs up the ZFS snapshot and the MariaDB dump remotely using rsync in an incremental fashion (using hardlinks, etc., see https://blog.interlinked.org/tutorials/rsync_time_machine.html currently down :frowning: )
  • Keeps older backups named with the backup date
  • Maintains a current symlink to the latest successful backup

The script is the following:

#!/bin/bash
set -eo pipefail

# Folders
BACKUPFOLDER="/tank/backup"
DESTINATION="/HDD/nextcloud-backup"
# Nextcloud volume
NEXTCLOUDVOLUME="tank/nextcloud"
# Files
DOCKERCOMPOSEFILE="/home/myuser/nextcloud/docker-compose.yml"
DBENVFILE="/home/myuser/nextcloud/db.env"
DBFILE="${BACKUPFOLDER}/nextcloud.sql"
EXCLUDEDFILES="/home/myuser/nextcloud/excluded-files.txt"
# Others
HOST="pine64.example.com"
DESTUSER="myuser"
SSHKEY="/home/myuser/.ssh/id_rsa"
SSHPORT="1234"
BWLIMIT="2048"

# Get mysql credentials
source ${DBENVFILE}

DATE=$(date "+%Y-%m-%dT%H_%M_%S")

echo "Started at ${DATE}"
echo "Checking /HDD"

ssh ${HOST} -p ${SSHPORT} \
  "grep -q /HDD /proc/mounts || exit 2"

# Set nextcloud to maintenance mode
docker-compose -f ${DOCKERCOMPOSEFILE} exec -T --user www-data app \
  php occ maintenance:mode --on
# Dump the MariaDB database (-T is needed here too: the pseudo-TTY
# docker-compose allocates by default can corrupt the redirected dump)
docker-compose -f ${DOCKERCOMPOSEFILE} exec -T db \
  sh -c 'exec mysqldump --single-transaction -u"${MYSQL_USER}" -p"${MYSQL_PASSWORD}" "${MYSQL_DATABASE}"' > ${DBFILE}
# Create a zfs snapshot
sudo zfs snapshot ${NEXTCLOUDVOLUME}@backup
# Restart nextcloud
docker-compose -f ${DOCKERCOMPOSEFILE} exec -T --user www-data app \
  php occ maintenance:mode --off

# Create a symlink in the backup folder to the zfs snapshot (read-only);
# -sfn replaces a stale link left behind by a failed previous run
ln -sfn /${NEXTCLOUDVOLUME}/.zfs/snapshot/backup/ ${BACKUPFOLDER}/nextcloud

cd ${BACKUPFOLDER}

# Perform the backup
# https://blog.interlinked.org/tutorials/rsync_time_machine.html
# Removed compression due to https://gist.github.com/KartikTalwar/4393116
sudo rsync -aPL \
  -e "ssh -l ${DESTUSER} -i ${SSHKEY} -p ${SSHPORT} -o Compression=no" \
  --delete \
  --delete-excluded \
  --exclude-from ${EXCLUDEDFILES} \
  --bwlimit=${BWLIMIT} \
  --link-dest=${DESTINATION}/current \
  ${BACKUPFOLDER}/ ${HOST}:${DESTINATION}/incomplete_back-${DATE}

# Organize the stuff in the backup host
ssh ${HOST} -p ${SSHPORT} \
  "mv ${DESTINATION}/incomplete_back-${DATE} ${DESTINATION}/back-${DATE} && \
  rm -f ${DESTINATION}/current && \
  ln -s ${DESTINATION}/back-${DATE} ${DESTINATION}/current"

# Clean up the symlink
rm -f ${BACKUPFOLDER}/nextcloud
# Clean up the database dump
rm -f ${DBFILE}
# Destroy the snapshot
sudo zfs destroy ${NEXTCLOUDVOLUME}@backup

DATEEND=$(date "+%Y-%m-%dT%H_%M_%S")
echo "Finished at ${DATEEND}"
exit 0
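One thing the script deliberately leaves open is pruning: the back-<date> directories accumulate forever on the Pine64. If you want automatic retention, a minimal sketch could look like the helper below. The function name, the KEEP count, and the throwaway demo directory are all made up for illustration; the date format matches the script's, so a plain lexical sort is already chronological.

```shell
#!/bin/bash
# Hypothetical retention helper (not part of the original script):
# keep only the newest $2 back-* directories under $1.
set -euo pipefail

prune_backups() {
  local dest="$1" keep="$2"
  # back-2024-01-01T00_00_00 style names sort chronologically,
  # so lexical sort + "all but the last $keep" is enough (GNU head).
  ls -1d "${dest}"/back-* | sort | head -n -"${keep}" | xargs -r rm -rf
}

# Demo on a throwaway directory with ten fake backups:
demo=$(mktemp -d)
for i in $(seq -w 1 10); do mkdir "${demo}/back-2024-01-${i}T00_00_00"; done
prune_backups "${demo}" 7
ls -1 "${demo}"   # the seven newest remain
```

This would run on the backup host (e.g. wrapped in the existing ssh call after the current symlink is rotated), so a half-finished incomplete_back-* directory is never counted against the retention limit.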

The content of the excluded-files.txt is:

nextcloud/html/data/myuser/files_trashbin/
nextcloud/html/data/myuser2/files_trashbin/
nextcloud/html/data/admin/files_trashbin/

HTH!


Hi minWi,

Sorry to reply so long after this topic was posted, but I’m planning a Nextcloud install on Ubuntu 20.04, with ZFS, using the same docker-compose as the one you used.

I’ve done a couple of test runs on a pretty basic setup (single drive, default partitioning), and I have a working example.

I’m trying to work through a couple of other issues that I’d like to sort out for my production setup (including moving the data directories to my chosen dataset), and was wondering if it would be possible for you to post a version of your docker-compose showing your data mounting strategies/commands.

Thanks.

I usually put config & data into /opt/nextcloud/{data,config} (or wherever you point nextcloud_base_dir to in the inventory).

The Nextcloud code and the database are stored in Docker volumes under /var/lib/docker; you don’t need to back up those files.

My playbook creates a backup script with restic and/or rclone.

The database is dumped during backup to /opt/nextcloud/database-dump.

After the backup of /opt/nextcloud you should have everything you need to restore the installation:
just run the playbook, stop everything, restore /opt/nextcloud, load the database dump, and start everything again.

Of course you can use the ZFS snapshot feature instead of putting Nextcloud into maintenance mode. It depends on how much time you have for your backup.

Thanks for your reply.

I was planning on using the built in ZFS tools and sanoid/syncoid for backup, but I’ll have a look at your script as an alternative.
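In case it helps anyone else looking at the sanoid/syncoid route: a minimal sanoid.conf sketch might look like the fragment below. The dataset name is taken from the thread above; the retention numbers are arbitrary examples, adjust to taste.

```ini
# /etc/sanoid/sanoid.conf -- example values only
[tank/nextcloud]
        use_template = production

[template_production]
        frequently = 0
        hourly = 24
        daily = 30
        monthly = 3
        autosnap = yes
        autoprune = yes
```

Replication would then be a cron job along the lines of `syncoid tank/nextcloud myuser@pine64.example.com:backuppool/nextcloud` — but that assumes the Pine64 side also runs ZFS (the original setup rsyncs to a plain /HDD), and syncoid replicates the raw dataset, so the database-consistency step (maintenance mode plus mysqldump) still has to happen around the snapshot.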

I think I’ve got the procedure of mounting an ‘external’ data folder to the container from here and here.

I added the following to the app:nextcloud section of my docker-compose:

- /data/nextcloud.files:/var/www/html/data

and everything built and runs fine on my test VM, with data present in the specified folder.
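For anyone following along, the relevant part of the compose file ends up looking roughly like the sketch below. This is based on the stock nextcloud + mariadb compose example; the service names, the db.env file, and the /data/nextcloud.files path are assumptions from my own setup and may differ from minWi’s.

```yaml
# Sketch only -- modelled on the standard nextcloud/mariadb compose example
services:
  db:
    image: mariadb
    volumes:
      - db:/var/lib/mysql                          # database in a named volume
    env_file:
      - db.env

  app:
    image: nextcloud
    volumes:
      - nextcloud:/var/www/html                    # application code
      - /data/nextcloud.files:/var/www/html/data   # bind-mounted data directory
    depends_on:
      - db

volumes:
  db:
  nextcloud:
```

The named volumes keep the application code and database where the images expect them, while the single bind mount moves only the user data onto the chosen dataset.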