Remote CIFS share syncing with database update & some logic

I have an AIO printer with a scan function. Scans are saved to a USB flash drive housed in a nearby router running in AP mode (Asus Merlin). My Windows PC has a scheduled task that periodically (several times a day) syncs the files from the flash drive to a folder on the PC for backup. Access to the share on the router can be configured for FTP or Samba.

The PC is not always on. My Nextcloud instance is, so why not back up to it as well.

My Linux knowledge is very limited; I cobbled my way through to produce the following.

Implementation required several tasks:

  1. Map the remote share to a mount point
  2. Sync the files
  3. Update the Nextcloud database

I came up with the following for each step

  1. /etc/fstab

// /mnt/ cifs credentials=/mnt/ncdata/.,vers=2.0,uid=www-data,gid=www-data 0 0

File . contains credentials for the share in plaintext. Set as root:root, chmod 600.
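For reference, a CIFS credentials file pointed to by the fstab `credentials=` option normally takes the following shape. Every value here is a placeholder, not my actual setup:

```
username=scanuser
password=changeme
domain=WORKGROUP
```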

Is there a way to hash it?

  2. The following script handles steps #2 and #3.

In a nutshell, the flow is: perform the sync; if the sync fails, generate a notification email; if it succeeds, compare the destination folder's state to its state before the sync. If the state is unchanged, end the script; if it changed, run the Nextcloud database update.

This script is the final result after many iterations. It started as a simple rsync and database update which morphed into something more sensible.

Any suggestions for improving it, making it more secure, or other ideas?

The script lives in the /mnt/ncadmin folder, owned www-data:www-data with 755 permissions. It's called from a cron job every 3 hours between 6 am and midnight via the following line in /etc/cron.d/rsync_cron:

0 0,6,9,12,15,18,21 * * * www-data /mnt/ncdata/ > /dev/null 2>&1


Why not use the external storage app for this?

I think my definition of syncing needs clarifying. Specifically, I don't just want the remote files to be visible through Nextcloud; I want those files copied into the Nextcloud storage, i.e. an additional backup location.

Maybe I missed it in the External Storage app. How would I accomplish the above with it?