Scripts for backup/restore

Besides the explanations about backup and restore in the Nextcloud documentation, I’ve created some bash scripts for easy backup/restore of your Nextcloud instance.

The scripts can be found on Codeberg:

Using the scripts

  • The backup script will back up the file directory (e.g. /var/www/nextcloud) and the data directory (e.g. /var/nextcloud_data) as tar.gz archives. The database backup will be an SQL dump file (*.sql).
  • The backups will be stored in some mounted location (e.g. /mnt/Share/NextcloudBackups). This could be anything like a flash drive, external hard drive or a network share.
  • To back up your Nextcloud, simply call the backup script. This will back up Nextcloud to a folder named with a time stamp (e.g. /mnt/Share/NextcloudBackups/20170910_132703).
  • To restore, pass the time stamp (e.g. 20170910_132703) as a parameter to the restore script.


  • Do not use the scripts out of the box: they need to be customized for your specific Nextcloud instance. Everything that needs to be customized is marked with TODO in the scripts’ comments.
  • The scripts assume that your data directory is located outside of your web root.
  • If your data directory is located inside the web root, the scripts need to be altered, otherwise the data directory will be copied twice.
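A minimal usage sketch (the script file names here are assumptions; check the repository for the actual names):

```shell
# Create a backup (writes to a time-stamped folder, e.g.
# /mnt/Share/NextcloudBackups/20170910_132703):
sudo ./NextcloudBackup.sh

# Restore a specific backup by passing its time stamp:
sudo ./NextcloudRestore.sh 20170910_132703
```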

Feel free to give it a try. Suggestions and/or contributions are welcome!


Thank you for putting this together :slight_smile:

Nice :+1:

This could easily be extended to also support the Apache webserver (service apache2 stop/start): either implement a check, suppress the error messages about the missing service, or add a variable for the webserver service name at the top of the scripts to adjust.
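As a sketch, such a variable could look like this (the variable name is a suggestion, not taken from the scripts):

```shell
# Name of the webserver service to stop/start during the backup;
# set to "apache2" when using Apache instead of nginx.
webserverServiceName="nginx"

# The hard-coded service calls in the scripts would then become:
#   systemctl stop "${webserverServiceName}"
#   systemctl start "${webserverServiceName}"
```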

Depending on the size/usage of your data folder, keeping many backups of it could quickly fill your backup drive. I personally keep just a single backup of it. Internally, trash and versions already act as a kind of additional backup for changed files (text/docs vs. images). So e.g. an additional variable for the number of backups kept just for the data folder could be implemented.
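A hypothetical sketch of such pruning, assuming the layout `<backupMainDir>/<timestamp>/nextcloud-datadir.tar.gz` (the directory layout, function and file names are assumptions, not from the scripts):

```shell
# Keep only the newest $2 data-folder archives under $1; older backup
# directories keep their database/webroot archives, only the (large)
# data archive is removed.
prune_data_backups() {
    local backupMainDir="$1"
    local keep="$2"
    # Timestamped names sort chronologically, so a plain sort lists
    # oldest first; head -n -N drops the newest N from the kill list.
    ls -1 "${backupMainDir}" | sort | head -n "-${keep}" | while read -r dir; do
        rm -f "${backupMainDir}/${dir}/nextcloud-datadir.tar.gz"
    done
}
```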

But anyway thank you very much for sharing this :slight_smile:!

Hey @MichaIng thanks for the hint. I totally forgot to mention that the scripts assume nginx as web server.
I’ve changed it so that the service name of the web server can now be specified.

The only thing I’m uncertain about is keeping different numbers of file/data backups. Will older data directory backups still work when they don’t match the rest of the backup?


In case you choose to keep e.g. just one data folder backup but e.g. 10 database and nextcloud folder backups, the data folder backup would always be the same age as or newer than the others :wink: . Yes, if you restore an older database backup you would lose e.g. new comments, tags, shares and other things related to the files, but at least the files themselves would still be there. To bring the database up to the current state of the files folder (and thus make the newer files show up in the Files app), you simply need to run occ files:scan --all and occ files:cleanup. I haven’t seen any situation where restoring an older database did not work with a newer (different) data folder.
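The occ calls mentioned above, sketched for a typical setup (the installation path and the webserver user www-data are assumptions; adjust for your system):

```shell
cd /var/www/nextcloud                        # assumed installation path
sudo -u www-data php occ files:scan --all    # re-index all user files in the database
sudo -u www-data php occ files:cleanup       # drop stale file cache entries
```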
The nextcloud folder again is actually totally irrelevant, as you can always just download your version again. Only config.php and possibly .htaccess/.user.ini would need to be restored/reconfigured. But the nextcloud folder also takes hardly any drive space, so many backups of it do not hurt at all :laughing:.

I personally keep just 1 data folder backup against drive crashes/data loss and create/update it every night. I assume I would notice a drive crash during the day, since it would break Nextcloud access in most cases. Otherwise trash/versions are there to restore accidentally changed/removed data.

Database corruption sometimes shows up days later, e.g. due to a slowly failing SD card on SBCs (which was just the case for me some months ago). I also had the case that the database dump simply stopped at a certain table, and some days later I found the last dumps were all incomplete. So for the database it is nice to keep some more backups, also because they do not take much space.

Of course, keeping several backups of all 3 parts together is a great situation. I just guess that many people/organizations are not able (or do not want to invest in the drive space) to keep the size of e.g. 10 data folders on their drives. In that case it would be better, from my point of view, if one could keep just 1 or 2 data folder backups while still keeping 10 or 20 database backups alongside. Again, 10 to 20 nextcloud folder backups simply don’t hurt there.


The script works fine, but after it finishes the backup of the Nextcloud data it freezes on the console. The process is still running, though: it also finishes the backup of the SQL database and turns maintenance mode off again.
Do you know why that happens?


Have you tried running the commands of the script one by one on the console? Can you reproduce the problem that way?

So I just ran the script again, and now it is printing its progress on the console again. That is weird.

Backups are essential for every system. There are many ways to take a backup, but it depends on what type of data you want to back up. Your backup script is very useful; I use it to back up my website, and if it works well I will use it for my other websites too.

Nice script. I think I’ll take some inspiration. Thanks.

One thing though – is it necessary to stop the webserver? I haven’t seen this step in other tutorials or the official docs. IMO maintenance mode should be enough to make sure there are no inconsistencies…

I always do stop the webserver. Because my files are on external storage (which is not part of the backup), backup is done very quickly.
Feel free to test if stopping the webserver is really necessary for your needs. Maintenance mode should be enough for the NC backup.
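For reference, a hedged sketch of toggling maintenance mode around the backup instead of (or in addition to) stopping the webserver; the installation path and webserver user are assumptions for a typical setup:

```shell
cd /var/www/nextcloud                             # assumed installation path
sudo -u www-data php occ maintenance:mode --on    # block client access
# ... create the backup here ...
sudo -u www-data php occ maintenance:mode --off   # re-enable access
```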

Hey, using this, could I hook in rsync so that after a backup it moves the backup to an external storage server for an off-site copy?

Thanks, Tom.

Hi Tom,

you could also edit the script in order that backups are saved directly to an external storage server.
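Either way works; as a sketch, a post-backup rsync push could look like this (host, user and paths are placeholders, not from the scripts):

```shell
# Mirror the local backup folder to an external server over SSH;
# --delete removes backups on the remote that were pruned locally.
rsync -avz --delete "/mnt/Share/NextcloudBackups/" \
    backupuser@backup.example.com:/srv/nextcloud-backups/
```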

Thanks for the script. I have added it to cron and it works fine every night around 03:00 AM, so no need to worry about stopping the server.
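As an illustration, such a crontab entry could look like this (the script path and log file are placeholders):

```shell
# m h dom mon dow  command — run the backup every night at 03:00
0 3 * * * /root/scripts/NextcloudBackup.sh >> /var/log/nextcloud-backup.log 2>&1
```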

As a side comment, I came to Nextcloud on Ubuntu from NextcloudPi running on a Raspberry Pi 3. NextcloudPi has an admin module to easily configure/use most of the admin requirements, including ufw, rsync, ssl, …

I’ve now introduced versioning of the scripts. The version can be found in the header of the scripts. I think it’s now easier to update already deployed scripts when changes are made in the GitHub repo.

Also, v1.0.0 comes with some changes improving the maintainability.

You can also download an archive from the release page:


I’ve moved from GitHub to Codeberg.
You can find the repository with the scripts here:

Sir, if I take your script and change nginx to apache2, will everything work OK?

Nice script! Thank you very much.
Just one question (maybe I don’t get it right):
Why do you need to stop the webserver during the backup?
Isn’t it enough to put it into maintenance mode?
I mean, now my clients won’t get the information that the server is currently in maintenance, because the webserver is not running, right?


Hello Matthias,

it should work when the webserver is still running and only the maintenance mode of NC is active.
However, I do stop the webserver in order to make absolutely sure that there are no activities in NC.

If your concern is about other web applications running on the same server, you could instead disable the virtual host for NC and reload the webserver config. This makes sure that no other web application becomes unavailable while the backup of NC is running.
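For nginx with the common sites-enabled layout (an assumption; site name and paths are placeholders, adjust for your distribution), that could be sketched as:

```shell
# Take only the Nextcloud vhost offline; other sites keep running
rm /etc/nginx/sites-enabled/nextcloud
systemctl reload nginx
# ... run the backup ...
ln -s /etc/nginx/sites-available/nextcloud /etc/nginx/sites-enabled/nextcloud
systemctl reload nginx
```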

Best regards,

OK, I removed the stop/start of the webserver from the script.

My concern is that the Nextcloud clients (for example my brother, who is using my server) don’t get the “maintenance/backup in progress” notice when I stop the webserver, so to the users out there it looks like a problem.

And a suggestion/small improvement: I added the “-I pigz” option to the tar call to enable parallel gzip:

tar -I pigz -cpf "${backupdir}/${fileNameBackupFileDir}" -C "${nextcloudFileDir}" .
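For completeness, a sketch of the matching restore call; this assumes pigz is installed (e.g. via the distribution’s package manager) and uses the same variable names as the backup call above:

```shell
# Extract with pigz for parallel decompression; -p preserves permissions
tar -I pigz -xpf "${backupdir}/${fileNameBackupFileDir}" -C "${nextcloudFileDir}"
```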