Backups and Maintenance Mode

I am currently testing a Nextcloud deployment using Fedora CoreOS and Docker Compose at the company I work for.
But I am running into the problem of how to back up Nextcloud when you have a larger user base that needs access to the instance 24/7. The Nextcloud Admin Guide states that you should put the Nextcloud instance into maintenance mode before creating a backup.
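For context, maintenance mode is toggled with the `occ` tool. A minimal sketch of wrapping a backup step in it might look like this, assuming a Docker Compose setup with a service called `app` (adjust `OCC` to match your deployment):

```shell
#!/bin/bash
# Hedged sketch: the compose service name "app" and the OCC invocation
# below are assumptions -- override OCC for your environment.
OCC="${OCC:-docker compose exec -u www-data app php occ}"

with_maintenance_mode() {
    $OCC maintenance:mode --on    # block user access during the backup
    "$@"                          # run the actual backup command
    local rc=$?
    $OCC maintenance:mode --off   # let users back in, even if backup failed
    return $rc
}
```

Usage would then be something like `with_maintenance_mode rsync -a /path/to/data/ /backup/data/`, so the instance is never left stuck in maintenance mode after a failed backup.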

This is somewhat problematic if you are planning on creating several backups per day, as each backup would mean a short downtime.

My question is: what is the worst that could happen if you simply create backups without activating maintenance mode?

I am also very curious whether there is a better way, since Nextcloud does offer enterprise scalability, and I am quite sure those customers would also want regular backups without downtime.

Any tips, links or suggestions are welcome. To keep this post short, I have omitted some technical details, but I am glad to share them if required :slight_smile:


I think this specific question requires a detailed answer that takes your specific setup/environment into account, and I believe you will get the best answer from the Nextcloud support or consulting team.

@jospoortvliet Do you offer consulting services for setting up backups (without a support contract), or is a support contract required for help with these kinds of questions?

Anyway, for big NC environments where 24/7 availability is important, a support contract is a good way to go, in my opinion.

@Schmu Thank you for your reply :slight_smile:

Allow me to backpedal a bit. The company I work for has around 45 people, and 24/7 availability would be nice to achieve but is not absolutely necessary.

A quick fix would be to back up the files on an hourly basis to prevent data loss, and also to have a cron job on the host system that creates a “clean” restore point during off hours once a week. This would be easy enough to implement.
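The hourly part of that quick fix could be sketched roughly like this (all paths and the retention count are placeholder assumptions to adjust): copy the data directory into a timestamped folder, then keep only the newest `KEEP` copies.

```shell
#!/bin/bash
# Hedged sketch of an hourly file backup with simple retention.
# DATA_DIR, BACKUP_ROOT and KEEP are assumptions for illustration.
DATA_DIR="${DATA_DIR:-/var/lib/nextcloud/data}"
BACKUP_ROOT="${BACKUP_ROOT:-/backup/nextcloud}"
KEEP="${KEEP:-24}"    # one day's worth of hourly backups

backup_hourly() {
    local dest="$BACKUP_ROOT/$(date +%Y%m%d-%H%M%S)"
    mkdir -p "$dest"
    rsync -a "$DATA_DIR/" "$dest/"   # plain copy; snapshots scale better
    prune_old_backups
}

prune_old_backups() {
    # list newest first, skip the first KEEP entries, delete the rest
    ls -1 "$BACKUP_ROOT" | sort -r | tail -n +"$((KEEP + 1))" | \
        while read -r old; do rm -rf "$BACKUP_ROOT/${old:?}"; done
}
```

Run `backup_hourly` from cron (e.g. `0 * * * *`); the weekly “clean” restore point would be the same idea wrapped in maintenance mode.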

I was mostly looking for further clarification on what is meant by “prevent inconsistencies of your data”.

I was also astounded that Nextcloud offers clustering support in its highest enterprise tier, yet I could not find a single mention of people trying to minimize downtime during backups. I thought I had missed something rather obvious.

I guess the basic thing is that you do not want a file to be changed (or deleted) while it is being copied for backup, since that backup would be useless, so some kind of locking/downtime is sensible.
I think you can still get most of what you want by doing hourly (or even more frequent) file or filesystem snapshots and DB dumps or DB copies. For the files I’d consider something like a btrfs snapshot the best idea, because in this use case it probably works better than just copying the files (even with rsync with hardlinks and the like).
The database has its own tools for that, like mysqldump and mariabackup.
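For mysqldump specifically, the `--single-transaction` flag gives a consistent snapshot of InnoDB tables without locking them for the whole dump, which is what you want for an online backup. A minimal sketch, where the database name and dump directory are placeholder assumptions:

```shell
#!/bin/bash
# Hedged sketch of a consistent online dump for an InnoDB-backed DB.
# DB_NAME and DUMP_DIR are assumptions; credentials are left to your
# usual mysql client configuration.
MYSQLDUMP="${MYSQLDUMP:-mysqldump}"
DB_NAME="${DB_NAME:-nextcloud}"
DUMP_DIR="${DUMP_DIR:-/backup/nextcloud-db}"

db_dump() {
    local out="$DUMP_DIR/${DB_NAME}-$(date +%Y%m%d-%H%M%S).sql"
    mkdir -p "$DUMP_DIR"
    $MYSQLDUMP --single-transaction --quick "$DB_NAME" > "$out" &&
        printf '%s\n' "$out"   # print the dump path on success
}
```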
If there are heavy load, many users and open files on your system, there will probably still be small inconsistencies after a bad crash, but I can imagine they should be easily repairable (never tested that, though).
Tell your users there will be one hour of downtime per night; during that time you can back up the files and do a DB dump or something like that. If you are really hard-pressed for uptime, start maintenance mode and stop the DB, take a btrfs snapshot of everything, start the DB and Nextcloud again, and do the backup work from the snapshot (obviously this requires your stuff to run on btrfs).
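That last variant could be sketched like this, assuming btrfs and a Docker Compose stack with services `app` and `db` (all names and paths below are placeholders): downtime lasts only as long as the snapshot takes, and the slow copy to backup storage then runs from the read-only snapshot.

```shell
#!/bin/bash
# Hedged sketch of the short-window snapshot flow; OCC, DC, NC_VOL and
# SNAP_DIR are assumptions to adapt to your deployment.
OCC="${OCC:-docker compose exec -u www-data app php occ}"
DC="${DC:-docker compose}"
SNAP="${SNAP:-btrfs subvolume snapshot -r}"
NC_VOL="${NC_VOL:-/srv/nextcloud}"
SNAP_DIR="${SNAP_DIR:-/srv/snapshots}"

snapshot_backup() {
    local name="$SNAP_DIR/nextcloud-$(date +%Y%m%d-%H%M%S)"
    $OCC maintenance:mode --on     # keep clients out for a moment
    $DC stop db                    # make sure the DB files are flushed
    $SNAP "$NC_VOL" "$name"        # read-only snapshot, near-instant
    $DC start db
    $OCC maintenance:mode --off    # users are back after seconds
    printf '%s\n' "$name"          # back this path up at leisure
}
```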


@pete.dawgg thank you, this seems like a sensible approach. I will see how I can solve the implementation; the underlying concept sounds good, thank you :slight_smile:
