Backup Nextcloud with separate DATA folder and the database?!?!?

Hello,
I have the following setup:

  • Ubuntu Server 16.04.1 LTS
  • Nextcloud 10.0.1 with PHP 7, Apache 2, MariaDB, and a Redis server
  • my Nextcloud folder is here: /var/www/nextcloud
  • my Nextcloud data is here: /var/nextclouddata
  • I have three folders to back up the Nextcloud folder, the data folder, and the database:
    • /media/ncbackup/nextcloud
    • /media/ncbackup/data
    • /media/ncbackup/database

I just found this in the admin manual (https://docs.nextcloud.com/server/10/admin_manual/maintenance/backup.html)
rsync -Aax nextcloud/ nextcloud-dirbkp_$(date +"%Y%m%d")/
and
mysqldump --lock-tables -h [server] -u [username] -p[password] [db_name] > nextcloud-sqlbkp_$(date +"%Y%m%d").bak

I'd like a scheduled task that saves these three parts every day.
My first problem is that I have no idea how to adapt the rsync command to my paths (/media/ncbackup…).
Second, the mysqldump command aborts with the message “No permission”.

Do I have to change the mysqldump command like this:
mysqldump --lock-tables -h nextcloudserver -uroot -psecret nextclouddb > /media/ncbackup/database/nextcloud-sqlbkp_date +"%Y%m%d".bak

My Linux skills are very, very low, but I hope you can help me.

Thank you in advance.
Best

Hi there,

rsync -Aax /var/nextclouddata/ /media/ncbackup/data/

That would cover your data directory. As you can see, you just change the paths as required, keeping the trailing slash so that everything inside the directory is copied.
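With the paths from your post, all three backup locations would look like this (a sketch; the third one is the database dump handled below, not an rsync):

rsync -Aax /var/www/nextcloud/ /media/ncbackup/nextcloud/
rsync -Aax /var/nextclouddata/ /media/ncbackup/data/
# the database goes to /media/ncbackup/database/ via mysqldump, see below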

mysqldump --lock-tables -u root -psecret nextclouddb > /media/ncbackup/database/nextcloud-sqlbkp_date +"%Y%m%d".bak

Run this as a user with permission to export the database. It may be root, or it may be the ncdb user; that depends on your setup. Also, presumably the database is on the local machine, so there's no need for the -h flag.
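If you're not sure which account has the needed privileges, you can inspect them first (a sketch; the user name here is just an example):

mysql -u root -p -e "SHOW GRANTS FOR 'nextcloudadmin'@'localhost';"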

Small correction. You have to omit the space between -p and the password.

mysqldump --lock-tables -u root -psecret nextclouddb > /media/ncbackup/database/nextcloud-sqlbkp_date +"%Y%m%d".bak

Edited, thank you :slight_smile:

Thanks.

#1
When I try to run the mysqldump command as the nextcloud db user, I get this error:
mysqldump --lock-tables -u nextcloudadmin -pSECRET nextcloud /media/ncbackup/NextcloudDBExport/nextcloud-sqlbkp_date +"%Y%m%d".bak

-- MySQL dump 10.15  Distrib 10.0.27-MariaDB, for debian-linux-gnu (x86_64)
--
-- Host: localhost    Database: nextcloud
-- ------------------------------------------------------
-- Server version       10.0.27-MariaDB-0ubuntu0.16.04.1

/*!40101 SET @OLD_CHARACTER_SET_CLIENT=@@CHARACTER_SET_CLIENT */;
/*!40101 SET @OLD_CHARACTER_SET_RESULTS=@@CHARACTER_SET_RESULTS */;
/*!40101 SET @OLD_COLLATION_CONNECTION=@@COLLATION_CONNECTION */;
/*!40101 SET NAMES utf8mb4 */;
/*!40103 SET @OLD_TIME_ZONE=@@TIME_ZONE */;
/*!40103 SET TIME_ZONE='+00:00' */;
/*!40014 SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS=0 */;
/*!40014 SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS=0 */;
/*!40101 SET @OLD_SQL_MODE=@@SQL_MODE, SQL_MODE='NO_AUTO_VALUE_ON_ZERO' */;
/*!40111 SET @OLD_SQL_NOTES=@@SQL_NOTES, SQL_NOTES=0 */;
mysqldump: Couldn't find table: "/media/ncbackup/NextcloudDBExport/nextcloud-sqlbkp_date"

How can I solve this?

#2
After I run rsync once, what happens when I run the same command again? Does it sync from source to destination?

Thanks again :smiley:

Just had a quick look at https://mariadb.com/kb/en/mariadb/mysqldump/ and at the bottom, under Examples, the syntax is:

shell> mysqldump db_name > backup-file.sql

so you could try

mysqldump --lock-tables -u nextcloudadmin -pSECRET nextcloud > /media/ncbackup/NextcloudDBExport/nextcloud-sqlbkp_date +"%Y%m%d".bak

You missed the >

Thanks.
But now I have another two problems:

  1. I can only run the mysqldump command as root:
    sudo -s
    mysqldump --lock-tables …
    Just using
    sudo mysqldump --lock-...
    does not work!
    When I add +%Y%m%d.bak, I get this error:
    mysqldump: Couldn't find table: +%Y%m%d.bak

  2. and my last question from my post above:
    #After I run rsync once, what happens when I run the same command again? Does it sync from source to destination?

Thanks again :smiley:

Try this instead:

mysqldump --lock-tables -u nextcloudadmin -pSECRET nextcloud > /media/ncbackup/NextcloudDBExport/nextcloud-sqlbkp_$(date +\%F).bak

I think the brackets matter. My own mysqldump script looks like this:

mysqldump jason_band > /var/backup/jason_band_$(date +\%F_\%R).sql

Yes, it always syncs in that one direction whenever you run that command: it overwrites old with new and adds any files that aren't yet in the backup location. I personally use subfolders to create snapshots of the system outside of this script, which gives me point-in-time restore capability should something go wrong and the rsync copy it all over to the backup location.
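As a rough sketch of that snapshot idea (paths assumed from this thread; rsync's --link-dest makes unchanged files hardlinks to the previous snapshot instead of fresh copies):

TODAY=/media/ncbackup/snapshots/$(date +%F)
LAST=$(ls -1d /media/ncbackup/snapshots/*/ 2>/dev/null | tail -n 1)
mkdir -p "$TODAY"
# if an earlier snapshot exists, hardlink unchanged files against it
rsync -Aax ${LAST:+--link-dest="$LAST"} /var/nextclouddata/ "$TODAY/"

Each dated folder then looks like a full copy, but only changed files take up new space.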

I couldn't tell you why; it works fine on all my systems under my normal user. One improvement, however, would be to create a .my.cnf file, which lets you omit the user and password on the command line (or in cron):

cd ~
touch .my.cnf
vim .my.cnf

[client]
user = root
password = "secret"
host = localhost

Save and quit with ESC, then :wq, then Enter.

Then you can run:

mysqldump --lock-tables nextcloud > /media/ncbackup/NextcloudDBExport/nextcloud-sqlbkp_$(date +\%F).bak

Whether on the command line or in your crontab, where a cron job might look like this:

0 0 1 * * mysqldump jason_band > /var/backup/jason_band_$(date +\%F_\%R).sql

Which would run once a month, at midnight on the first.
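Since the goal in this thread is a daily backup, the equivalent daily entries would look something like this (times are just an example; note that % has to be escaped as \% inside a crontab):

0 3 * * * mysqldump --lock-tables nextcloud > /media/ncbackup/database/nextcloud-sqlbkp_$(date +\%F).bak
30 3 * * * rsync -Aax /var/www/nextcloud/ /media/ncbackup/nextcloud/
45 3 * * * rsync -Aax /var/nextclouddata/ /media/ncbackup/data/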

It's probably a good idea to also secure that new .my.cnf file with:

chmod 0600 ~/.my.cnf

Thanks.
Now the mysqldump command with the date extension works, and thanks also for the explanation of how rsync behaves.

But the .my.cnf approach doesn't work. Do I need to activate the root user first?

Because I log in with my admin account and then change to root with sudo -s.

You change the user to whatever user you’d usually run it with, presumably not root.
If you’ve gotten the mysqldump command working, use the username and secret of that user in the .my.cnf file.
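With the credentials that worked earlier in this thread, the file would look like this (substitute your real password, of course):

[client]
user = nextcloudadmin
password = "SECRET"
host = localhost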

This might be helpful:

You can back up different locations and all databases stored on the source in an automated process. It makes database dumps, and you can manage retention and rotation. It is based on rsnapshot and rsync.
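For reference, the retention and backup points in rsnapshot live in /etc/rsnapshot.conf and look roughly like this (a sketch; rsnapshot insists on tabs between fields, and the database dump script named here is hypothetical):

snapshot_root	/media/ncbackup/snapshots/
retain	daily	7
retain	weekly	4
backup	/var/www/nextcloud/	localhost/
backup	/var/nextclouddata/	localhost/
backup_script	/usr/local/bin/nc-dbdump.sh	localhost/db/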

I'm using this script to save encrypted backups, which then get rsync'ed to another server for safekeeping.

#!/bin/bash
# Source: https://stackoverflow.com/a/1963940
# Source: https://stackoverflow.com/a/15440967
# Assumption: Using MariaDB with the unix_socket authentication plugin enabled
# for the root user. On Ubuntu >= 16.04 servers this is the default,
# therefore no need for the -u root and -p password params.
# Add them to the $Dump variable if required.
# Also assuming you have GPG keys generated and imported into the
# server's gpg keyring (-r "add your key ID")
Host=localhost
Server=servername
BDir=/opt/backup/mysql

Dump="/usr/bin/mysqldump --skip-extended-insert --force"
MySQL=/usr/bin/mysql

Long="`date +%A,` `date +%B` `date +%d,` `date +%Y,` `date +%r`"
short_date=`date +%Y-%m-%d-%H-%M`

# Delete backups older than 15 days
# Source: https://www.howtogeek.com/howto/ubuntu/delete-files-older-than-x-days-on-linux
find $BDir/* -mtime +15 -exec rm -f {} \; 

# Get a list of all databases
Databases=$(echo "SHOW DATABASES" | $MySQL -h $Host)

echo "Backing up databases from '$Server.$Host' on $Long "
for db in $Databases; do
        # Skip the "Database" header line and MariaDB's internal schemas
        if [ $db != "performance_schema" ] && [ $db != "information_schema" ] && [ $db != "Database" ] && [ $db != "mysql" ]; then
                # Make a separate directory for each database
                mkdir -p $BDir/$db

                file="$BDir/$db/$Server-$db-$short_date.sql.gz.asc"
                echo "Backing up $db to $file"

                # Export, compress, and then encrypt the file.
                # Use the following command to decrypt:
                # gpg --decrypt [database_name].sql.gz.asc | gunzip > database_name.sql
                $Dump -h $Host $db | gzip | gpg --always-trust --encrypt -r "key_id" > $file
        fi
done
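To restore one of those dumps later, the decrypt command from the script's comments can be piped straight back into the server (the file and database names here just follow the script's naming pattern; run it as the same user the script runs as):

gpg --decrypt servername-nextcloud-2017-01-01-03-00.sql.gz.asc | gunzip | mysql nextcloud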

I'm using a third-party app called Backupninja. I have a Nextcloud file-sharing environment supporting about 50 users, and Backupninja works great! I initially tried to run simple scripts scheduled with cron, but they were inconsistent and failed to complete. Anyway, Backupninja made backing up Nextcloud very easy. You can create a MySQL dump and a data directory backup, which is then rsynced to remote NAS storage and backed up to tape. I had to restore from backup once already, and the restore went smoothly. A great solution if you are not big on scripting, like me.
