Nextcloud 10.0.1 with PHP 7, Apache 2, MariaDB and a Redis server.
My Nextcloud folder is here: /var/www/nextcloud
My Nextcloud data is here: /var/nextclouddata
I have three folders to back up the Nextcloud folder, the data and the database:
• /media/ncbackup/nextcloud
• /media/ncbackup/data
• /media/ncbackup/database
My wish is to have a scheduled task which saves the three parts every day.
My first problem is that I have no idea how to adapt the rsync command to my paths (/media/ncbackup…).
Second, the mysqldump command aborts with the message “No permission”.
Do I have to change the mysqldump command like this:
mysqldump --lock-tables -h nextcloudserver -uroot -psecret nextclouddb > /media/ncbackup/database/nextcloud-sqlbkp_`date +"%Y%m%d"`.bak
My Linux skills are very, very low, but I hope you can help me.
Run this as a user with permission to export the database. It may be root, it may be the ncdb user; that depends on your setup. Also, presumably the database is on the local machine, so there's no need for the -h flag.
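Put concretely, and assuming the local database and the credentials from the original post (both are placeholders for your own setup), the dump command might look like this:

```shell
# -h is dropped because the database server is local.
# $(date +"%Y%m%d") expands to today's date, so the file gets a
# name like nextcloud-sqlbkp_20161123.bak
sudo mysqldump --lock-tables -u root -psecret nextclouddb \
  > /media/ncbackup/database/nextcloud-sqlbkp_$(date +"%Y%m%d").bak
```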
#1
When I try to run the mysqldump command as the Nextcloud DB user:
mysqldump --lock-tables -u nextcloudadmin -pSECRET nextcloud /media/ncbackup/NextcloudDBExport/nextcloud-sqlbkp_date +"%Y%m%d".bak
I get this error:
-- MySQL dump 10.15 Distrib 10.0.27-MariaDB, for debian-linux-gnu (x86_64)
--
-- Host: localhost Database: nextcloud
-- ------------------------------------------------------
-- Server version 10.0.27-MariaDB-0ubuntu0.16.04.1
/*!40101 SET @OLD_CHARACTER_SET_CLIENT=@@CHARACTER_SET_CLIENT */;
/*!40101 SET @OLD_CHARACTER_SET_RESULTS=@@CHARACTER_SET_RESULTS */;
/*!40101 SET @OLD_COLLATION_CONNECTION=@@COLLATION_CONNECTION */;
/*!40101 SET NAMES utf8mb4 */;
/*!40103 SET @OLD_TIME_ZONE=@@TIME_ZONE */;
/*!40103 SET TIME_ZONE='+00:00' */;
/*!40014 SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS=0 */;
/*!40014 SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS=0 */;
/*!40101 SET @OLD_SQL_MODE=@@SQL_MODE, SQL_MODE='NO_AUTO_VALUE_ON_ZERO' */;
/*!40111 SET @OLD_SQL_NOTES=@@SQL_NOTES, SQL_NOTES=0 */;
mysqldump: Couldn't find table: "/media/ncbackup/NextcloudDBExport/nextcloud-sqlbkp_date"
How can I solve this?
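The error happens because mysqldump is reading the path as a table name: the command above is missing the output redirection (>) before the file path, and the date call is not wrapped in command substitution, so the literal text ends up in the argument list. A corrected form, with the same credentials and path as above:

```shell
# The > redirects the dump into a file instead of passing the path
# as a table argument; $(date ...) runs date and substitutes its
# output into the filename.
mysqldump --lock-tables -u nextcloudadmin -pSECRET nextcloud \
  > /media/ncbackup/NextcloudDBExport/nextcloud-sqlbkp_$(date +"%Y%m%d").bak
```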
#2
After I run rsync once, what happens when I run the same command again? Does it sync from source to destination?
I can only run the mysqldump command as root:
sudo -s
mysqldump --lock-tables …
Just using sudo mysqldump --lock-... does not work!
When I add +%Y%m%d.bak, I get this error:
mysqldump: Couldn’t find table: +%Y%m%d.bak
And my last question from my post above: after I run rsync once, what happens when I run the same command again? Does it sync from source to destination?
Yes, it’ll always sync in the one direction whenever you use that command, meaning it’ll overwrite old with new, and add files that aren’t in the backup location. I personally use subfolders to create snapshots of the system outside of this script, as it then gives me point-in-time restore capabilities should something go wrong and the rsync copies it all over to the backup location.
I couldn’t tell you why; it works fine on all my systems under my normal user. One improvement, however, would be to create a .my.cnf file, allowing you to omit the user and password on the command line (or in cron):
cd ~
touch .my.cnf
vim .my.cnf
[client]
user = root
password = "secret"
host = localhost
You change the user to whatever user you’d usually run it with, presumably not root.
If you’ve gotten the mysqldump command working, use the username and secret of that user in the .my.cnf file.
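With the credentials in ~/.my.cnf, the dump line shrinks accordingly; since the file holds a password in plain text, it should also be readable only by your user. A sketch (database name and backup path as in this thread):

```shell
# Restrict the credentials file to your own user.
chmod 600 ~/.my.cnf

# mysqldump now reads user and password from ~/.my.cnf automatically,
# so neither -u nor -p is needed on the command line.
mysqldump --lock-tables nextcloud \
  > /media/ncbackup/database/nextcloud-sqlbkp_$(date +"%Y%m%d").bak
```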
You can back up different locations and all databases stored on the source in an automated process. It makes database dumps, and you can manage retention and rotation. It is based on rsnapshot and rsync.
I’m using this script to save encrypted backups, which get rsync’ed to another server for safe keeping.
#!/bin/bash
# Source: https://stackoverflow.com/a/1963940
# Source: https://stackoverflow.com/a/15440967
# Assumption: using MariaDB with the unix_socket authentication
# plugin enabled for the root user. On latest >=16.04 servers this
# is the default. Therefore no need for -u root and -p password params.
# Add them in the $Dump variable if required.
# Also assuming you have GPG keys generated, uploaded and imported
# into the server's gpg keyring (-r "add your key ID")
Host=localhost
Server=servername
BDir=/opt/backup/mysql
Dump="/usr/bin/mysqldump --skip-extended-insert --force"
MySQL=/usr/bin/mysql
Long="`date +%A,` `date +%B` `date +%d,` `date +%Y,` `date +%r`"
short_date=`date +%Y-%m-%d-%H-%M`
# Delete backups older than 15 days
# Source: https://www.howtogeek.com/howto/ubuntu/delete-files-older-than-x-days-on-linux
find $BDir/* -mtime +15 -exec rm -f {} \;
# Get a list of all databases
Databases=$(echo "SHOW DATABASES" | $MySQL -h $Host)
echo "Backing up databases from '$Server.$Host' on $Long "
for db in $Databases; do
# Skip MariaDB's internal schema databases
if [ "$db" != "performance_schema" ] && [ "$db" != "information_schema" ] && [ "$db" != "Database" ] && [ "$db" != "mysql" ] ; then
# Make separate directories for separate databases
mkdir -p $BDir/$db;
file="$BDir/$db/$Server-$db-$short_date.sql.gz.asc"
echo "Backing up $db to $file"
# Export, compress and then encrypt file
# use the following command to decrypt
# gpg --decrypt [database_name].sql.gz.asc | gunzip > database_name.sql
$Dump -h $Host $db | gzip | gpg --always-trust --encrypt -r "key_id" > $file
fi
done
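To run a script like this every night, a cron entry along these lines could be used (the script path, log file and time are assumptions; add it with crontab -e as the user that may run mysqldump):

```shell
# m  h  dom mon dow  command
# Run the backup script every night at 02:30 and log its output.
30 2 * * * /usr/local/bin/nextcloud-backup.sh >> /var/log/nextcloud-backup.log 2>&1
```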
I'm using a third-party app called Backupninja. I have a Nextcloud file-share environment supporting about 50 users, and Backupninja works great! I initially tried to run simple scripts scheduled with cron, but they were inconsistent and failed to complete. Anyway, Backupninja made backing up Nextcloud very easy. You can create a MySQL dump and a data-directory backup, which is then rsynced to remote NAS storage and backed up to tape. I had to restore from backup once already, and the restore went smoothly. Great solution if you are not big on scripting, like me.