How to back up a Nextcloud installation (+MariaDB and Let's Encrypt) with Docker

Hey!
I am new here and also new to the whole Nextcloud/Docker universe in general, so I just want to apologize first of all for this unsophisticated post.

I did a fresh Nextcloud installation on my OMV server via the GUI, with MariaDB and Let's Encrypt. Everything works just fine, but I was wondering how to do automated backups of my data and the whole installation in a reliable way.

I connected a second external drive to my server and tried to follow the steps described in the Nextcloud documentation. Unfortunately, I was not able to write a working script that backs up everything I would need to rebuild the server in case of a system crash or anything similar…

Here are the bash commands I tried to use:

docker exec -it nextcloud bash
sudo -u abc php /config/www/nextcloud/occ maintenance:mode --on
exit
tar -cpzf /sharedfolders/Next_Backup/nextcloud_files_$now.tar.gz /sharedfolders/Nextcloud-Data/
docker exec mariadb sh -c 'exec mysqldump --single-transaction -h localhost -u 1000 -p100 nextcloud' > /sharedfolders/Next_Backup/nextcloudDB_1.sql
docker exec -it nextcloud bash
sudo -u abc php /config/www/nextcloud/occ maintenance:mode --off

I was able to create backup files of the data and the database… but I would like to automate the process… Is this possible? And how? Or is there a better way to do this? (And how would I restore the data if needed?)

Thank you very much in advance and kind regards! :slight_smile:

Here is my System-Configuration:

Server configuration detail

Operating system: Linux 4.19.0-0.bpo.6-amd64 #1 SMP Debian 4.19.67-2+deb10u2~bpo9+1 (2019-11-12) x86_64

Webserver: nginx/1.16.1 (fpm-fcgi)

Database: mysql 10.4.10

PHP version: 7.3.11

Modules loaded: Core, date, libxml, pcre, zlib, filter, hash, readline, Reflection, SPL, session, cgi-fcgi, bz2, ctype, curl, dom, fileinfo, ftp, gd, gmp, iconv, intl, json, ldap, mbstring, openssl, pcntl, PDO, pgsql, posix, standard, SimpleXML, smbclient, sodium, sqlite3, xml, xmlwriter, zip, exif, imap, mysqlnd, pdo_pgsql, pdo_sqlite, Phar, xmlreader, pdo_mysql, apcu, igbinary, redis, memcached, imagick, mcrypt, libsmbclient, Zend OPcache

Nextcloud version: 17.0.1 - 17.0.1.1

You don’t need to sudo in the bash command. docker exec -u www-data nextcloud php occ ... will do the job.

You have to create a cron job, for root or any user on your host that can run docker commands, to call your backup script: crontab -e
You'll find the cron syntax on Google.
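
A minimal sketch of such a backup script, built only from the commands in the first post (the abc user and the /config/www/nextcloud path are taken from there; DBUSER/DBPASSWORD are placeholders to replace with your own MariaDB credentials):

#!/bin/bash
# nextcloud-backup.sh - rough sketch, adjust paths and credentials to your setup
now=$(date +%Y-%m-%d_%H%M%S)
backup_dir=/sharedfolders/Next_Backup

# enable maintenance mode without opening an interactive shell
# abc is the user from the sudo command in the first post; other images use e.g. www-data
docker exec -u abc nextcloud php /config/www/nextcloud/occ maintenance:mode --on

# archive the data directory
tar -cpzf "$backup_dir/nextcloud_files_$now.tar.gz" /sharedfolders/Nextcloud-Data/

# dump the database (replace DBUSER / DBPASSWORD with your MariaDB credentials)
docker exec mariadb sh -c 'exec mysqldump --single-transaction -h localhost -u DBUSER -pDBPASSWORD nextcloud' > "$backup_dir/nextcloudDB_$now.sql"

# disable maintenance mode again
docker exec -u abc nextcloud php /config/www/nextcloud/occ maintenance:mode --off

Saved for example as /root/nextcloud-backup.sh and made executable with chmod +x, a crontab line like this would run it every night at 03:00 (path and time are just examples):

0 3 * * * /root/nextcloud-backup.sh >> /var/log/nextcloud-backup.log 2>&1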

You forgot config.php, didn't you?

And you may have a look at

I prefer filesystem snapshots for both MariaDB and the data. They are more atomic and you reduce downtime. A mysqldump-based backup is tricky to get right, as there is more to the DB than just the table data.

Thank you for the reply!
Unfortunately I use ext4 as my filesystem… As far as I know this doesn't support snapshots. Otherwise this would be the way to go…

Thank you for your reply!

You don’t need to sudo in the bash command.

Yes :sweat_smile: you are right…

docker exec -u www-data nextcloud php occ ... will do the job.

…it does not… the command gives the following output:

# docker exec -u www-data nextcloud php occ
unable to find user www-data: no matching entries in passwd file

Yes, could be… where can I find this file?

However, this procedure seems quite laborious to me. Isn't there an easier way of backing everything up, besides snapshots? ^^

Off the top of my head, I think you need to:

  • back up the SQL database. mysqldump is an option; just make sure you include the correct flags.
  • back up the Nextcloud installation folder, including the config and data folders
  • back up any user files, especially if they are outside the nextcloud/data folder
  • you may also want to back up your Apache/nginx and PHP configuration files.

There are many ways to back up files. rsync is very common. Another way is to use tar to create archives.

Some dedicated backup software might be needed if you want to do incremental backups and store several copies.
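
For example, a single rsync call can mirror the data directory to the backup drive (a sketch with the paths from your posts; -a preserves permissions and timestamps, --delete removes files from the copy that no longer exist in the source):

# mirror the data directory into a subfolder of the backup share (target name is just an example)
rsync -aAX --delete /sharedfolders/Nextcloud-Data/ /sharedfolders/Next_Backup/Nextcloud-Data/

Run repeatedly, this only transfers changed files, which is what makes it cheaper than re-creating a full tar archive every time.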

This is the layout on my installation. config.php is in the config directory. I'm not using Docker though, just a plain Linux server with Apache, php-fpm and MariaDB.

okay, this is what I initially wanted to do…

I tried to do so with this:

docker exec mariadb sh -c 'exec mysqldump --single-transaction -h localhost -u 1000 -p1000 nextcloud' > /sharedfolders/Next_Backup/nextcloudDB_1.sql

but I am not sure if it's working… a file is created in the target directory… but how can I check that it actually worked?

I think

tar -cpzf /sharedfolders/Next_Backup/nextcloud_files_$now.tar.gz /sharedfolders/Nextcloud-Data/

will do this job in my case… I just have to add another entry for my config-files, right?
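
For reference, that extra entry could look something like this – just a sketch, assuming the config directory sits next to occ at /config/www/nextcloud/config inside the container (which matches the path from the first post); streaming tar out of the container avoids having to know the host-side location:

# archive the container's config directory onto the backup share
docker exec nextcloud tar -czf - /config/www/nextcloud/config > /sharedfolders/Next_Backup/nextcloud_config_$now.tar.gz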

Yes! But how can I do this? (Sorry, as I told you, I still have much to learn about this stuff.)
Or would you recommend rebuilding the whole system without Docker?

Is this the same as the data directory from my screenshot?

Doesn't Docker store all files in an image? There may be a way to back up the whole image. That would contain the config file.

How are PHP and Apache installed?

Some related threads about backing up containers: How can I backup a Docker-container with its data-volumes? - Stack Overflow
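
Backing up the whole container as an image, as mentioned above, would look roughly like this – but note that bind mounts and named volumes are not included in such an image, so the data and config directories still need their own backup:

# freeze the container's filesystem into an image and export it to a tar file
docker commit nextcloud nextcloud-snapshot
docker save nextcloud-snapshot > /sharedfolders/Next_Backup/nextcloud-image.tar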

nginx is installed instead of Apache… it is part of another Docker image that provides the automatic generation of an SSL certificate (Let's Encrypt).


This is the content of Nextcloud-Data… it is mounted into the filesystem outside the container…

PHP is installed via another Docker container

That means you need to read up on how to back up Docker volumes, as all the configuration files are inside those.

@JoRo1990 that's because you have a different webserver user in your container. Which image are you using?

i’m using the command in my playbooks and it works.
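
if it is the linuxserver.io image (which the /config/www/nextcloud path in the first post suggests), the webserver user is abc rather than www-data. a quick way to check which users exist in the container and to run occ with the right one – just a sketch:

# list the users defined inside the container
docker exec nextcloud cat /etc/passwd

# run occ as the abc user of the linuxserver.io image
docker exec -u abc nextcloud php /config/www/nextcloud/occ status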

well.

you create a new tar file each time. and you do this on the same device nextcloud is running on. so you have to make sure older tar files get deleted. and hope your device/server/nas will never fail.

that’s why backup is always kind of challenging.

your first “problem” is solved with restic because you can do automatic housekeeping.

for the second “problem” you would need a second device or cloud storage.
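
for illustration, a rough restic sketch covering both points – restic handles the housekeeping itself, and the repository can just as well live on a second device or on sftp/cloud storage (paths and password handling are placeholders):

# one-time: create the repository (could also be e.g. sftp:user@backuphost:/nextcloud-repo)
export RESTIC_PASSWORD='choose-a-strong-password'
restic -r /sharedfolders/Next_Backup/restic-repo init

# per backup run: back up the data directory, then prune old snapshots
restic -r /sharedfolders/Next_Backup/restic-repo backup /sharedfolders/Nextcloud-Data
restic -r /sharedfolders/Next_Backup/restic-repo forget --keep-daily 7 --keep-weekly 4 --keep-monthly 6 --prune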

you do a restore. :wink:
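
for example, by restoring the dump into a throwaway database inside the mariadb container, without touching the live one – a sketch where ROOTPASSWORD is a placeholder for your MariaDB root password:

# create a scratch database, load the dump into it, and check that the tables are there
docker exec mariadb mysql -u root -pROOTPASSWORD -e 'CREATE DATABASE nextcloud_test'
docker exec -i mariadb mysql -u root -pROOTPASSWORD nextcloud_test < /sharedfolders/Next_Backup/nextcloudDB_1.sql
docker exec mariadb mysql -u root -pROOTPASSWORD -e 'SHOW TABLES IN nextcloud_test'

# clean up the scratch database afterwards
docker exec mariadb mysql -u root -pROOTPASSWORD -e 'DROP DATABASE nextcloud_test'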

no. because they come with their images. unless you have additional settings. but those would be stored on your host.

no. but you should do docker right. :wink:

would you mind sending me your docker-compose file via DM? or the output of docker ps and docker inspect <container-id>?

just to be precise: the images of nextcloud, nginx, mariadb etc. are stored on docker hub. you pull them and start containers. anything changed or created in the filesystem of these containers is lost in case you remove the container and start a new one from the image. that happens in case of an update.

to keep data persistent you use volumes. there are two types of volumes. you can define a directory (like /sharefolders/nextcloud-data) and map it into the container's filesystem (-v /sharefolders/nextcloud-data:/var/www/html/data). the disadvantage is that you easily get lost/confused with the ownership and permissions of those files. I'm pretty sure there should be no user Johannes in your container, so your screenshot is a bit confusing to me (and maybe to your docker system).

it's better to let docker handle this. you do that by defining a volume in your compose file and using -v without a / at the beginning (e.g. -v nextcloud-data:/var/www/html/data). you will then find the files somewhere below /var/lib/docker; docker inspect will tell you where. those files can be backed up like all other files on your host.
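
for example, finding out where such a named volume actually lives on the host could look like this (nextcloud-data is the volume name from the example above; docker volume ls shows the exact name, with compose it gets a project prefix):

# full metadata of the volume
docker volume inspect nextcloud-data

# or just the host path
docker volume inspect -f '{{ .Mountpoint }}' nextcloud-data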

so what you want to back up is your docker-compose file and all volumes. if you take this to another server and the system comes up correctly, you can be absolutely sure that backup/restore is working. if you run nextcloud the traditional way, it's a lot more work collecting all the files from /etc and so forth.
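
a common pattern for backing up a named volume is to mount it read-only into a throwaway container together with the backup directory and create the tar archive from there – a sketch using the names from above:

# archive the contents of the nextcloud-data volume onto the backup share
docker run --rm -v nextcloud-data:/source:ro -v /sharedfolders/Next_Backup:/backup alpine tar -czf /backup/nextcloud-data_volume.tar.gz -C /source .

restoring works the same way in reverse: mount an empty volume plus the archive into a throwaway container and unpack the tar into the volume.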

p.s.: in case Johannes and Julia are only using nextcloud to share some files, it might be enough to have the synced files on your desktop and/or laptop. no need for a backup here at all. :wink:

I did not want to be offensive, sorry. I just thought that my solution is not the best and there would be a more elegant way :innocent:

However, thank you so much for your very helpful answer! Unfortunately I won't have time this weekend to work on my project, but on Monday I can send you the files. (I did not use docker-compose; I rather built separate containers and linked them to work with each other. I followed a tutorial on YouTube to do so: https://www.youtube.com/watch?v=YWkWARXzW0k&t=19s (it's in German, but the visual information given in the video should be enough :wink: ).) I can also send you the configuration of the docker containers on Monday.

Thank you once again and have a nice weekend! :slight_smile: