It seems to be working; however, when I test with slightly bigger backups (385 MB total zip size), the .sh file deletes itself and then of course it doesn't run anymore.
Here is my cron command:
*/10 * * * * /var/nextcloudbackup/runbackup.sh
Works perfectly fine with very small storage sizes.
Please note: this is a test environment on my local network to see if I can get a reliable backup solution working; that's why the users and passwords are generic, and why the backup frequency is as high as every 10 minutes.
First of all, it's not a good idea to run it this frequently on a huge installation, especially every 10 minutes :). But it seems this is only your test system.
Have a look at snapshot solutions with ZFS, Btrfs, etc.
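Just as a rough sketch of the idea, not something from your script: with Btrfs, assuming the Nextcloud folder sits on its own subvolume and you have a /snapshots directory (both paths are placeholders), a read-only snapshot per run looks roughly like this:

SRC=/var/www/nextcloud                      # Btrfs subvolume holding Nextcloud (placeholder)
DST=/snapshots/nextcloud-$(date +%Y%m%d-%H%M)
btrfs subvolume snapshot -r "$SRC" "$DST"   # read-only, near-instant snapshot

ZFS has an equivalent with zfs snapshot; either way the snapshot is cheap enough to run every 10 minutes.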
If the script is saved in the same folder where the backup is stored, it's quite possible that it deletes itself. I'd also say the script isn't very defensive: it doesn't check much and assumes everything is in place and that only Nextcloud is there.
Otherwise I don't see a problem on the script side, it's a simple tar with compression. It could be that on a bigger installation your data folder is not inside the Nextcloud installation folder; then you have to specify it explicitly, but that line is commented out in your script. Something like the sketch below.
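Rough sketch only, with placeholder paths you'd have to adjust: keep the backup target outside the folders being archived, exclude it from the tar call just in case, and list the external data folder explicitly if you have one.

NC_DIR=/var/www/nextcloud                   # placeholder: Nextcloud installation
DATA_DIR=/srv/nextcloud-data                # placeholder: only if data lives outside NC_DIR
BACKUP_DIR=/var/nextcloudbackup/archives    # placeholder: outside of what gets tarred
mkdir -p "$BACKUP_DIR"
tar -czf "$BACKUP_DIR/nextcloud-$(date +%Y%m%d-%H%M).tar.gz" \
    --exclude="$BACKUP_DIR" \
    "$NC_DIR" "$DATA_DIR"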
Try adding echoes with step information to the script to see where it fails.
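For example (the log path is just an example), so the log shows which step was reached and what cron actually ran:

#!/bin/bash
# sketch: log every step so you can see where it stops
exec >> /var/log/nextcloudbackup.log 2>&1   # send all output to a log file
set -x                                      # print each command as it runs
echo "=== backup started $(date) ==="
# ... your existing tar / copy steps here ...
echo "=== backup finished $(date) ==="

You can also redirect the cron job itself, e.g. */10 * * * * /var/nextcloudbackup/runbackup.sh >> /var/log/nextcloudbackup.log 2>&1, so you even catch errors that happen before the script body starts.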
And if you want to put your backup into the cloud (AWS, Dropbox, et al.): it's encrypted, so it should be safe.
You'll find an example backup script here:
You'll have to adapt it, because it would put your Nextcloud into maintenance mode and dump your database every 10 minutes.
Backing up just your files folder every 10 minutes should be fine, since restic does delta backups; see the sketch below.
(I don't like these bash scripts trying to do their own backup housekeeping.)
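A minimal sketch of what I mean, assuming a restic repository has already been initialized; the repository, password file and data path are placeholders, not values from your setup:

#!/bin/bash
# sketch only: files-only backup every 10 min; DB dump can run less often
export RESTIC_REPOSITORY=s3:s3.amazonaws.com/my-nc-backup   # placeholder bucket
export RESTIC_PASSWORD_FILE=/root/.restic-password          # placeholder path

# delta backup of the data folder; restic only uploads changed chunks
restic backup /var/www/nextcloud/data

# housekeeping is built into restic; retention values are just an example
restic forget --keep-hourly 24 --keep-daily 7 --prune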