Most graceful way to update file database for files not sent via browser or client?

Hello everyone!

I am very happy with NC 16 on my Ubuntu 18.04 server. It’s not perfect, but I get way more done than I did with Dropbox.

Since it’s a multi-purpose server, I also use the Nextcloud directories to transfer data over SFTP into the folder of a dedicated “ftp” user. It’s about 2 to 200 GB a day, namely an offsite backup.

Now the small annoyance is that if files aren’t transferred via the desktop client or the browser, the file database isn’t updated. So the files are on the drive inside the folder, but they aren’t listed in Nextcloud. Right now I am running a cron job for occ files:scan at midnight. Is there a more elegant and efficient way to do this? Best case without a cron job?
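For context, such a cron entry looks roughly like this (the install path and the www-data user are assumptions for a typical Ubuntu/Apache setup, adjust to yours):

    # /etc/cron.d/nextcloud-scan (hypothetical file name)
    # Rescan all users' files every night at midnight, running occ as the web server user
    0 0 * * * www-data php /var/www/nextcloud/occ files:scan --all >> /var/log/nextcloud-scan.log 2>&1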

Edit: I am no Linux expert, but maybe there is a service I can run that monitors the data folder and triggers a “small” scan whenever something new appears?


Hi Hikari,
You should look at the inotifywait command in Linux; I believe it will solve your problem. From what I have read, inotify is a kernel notification API, so the OS knows the moment a file changes, and that should make this approach more responsive than a cron job (although I have never used cron myself).
Correct me if I am wrong.

I had a similar situation where I had to monitor a directory for new images and execute an upload program to push each image to a server. I have added the script to a gist. Take a look at it and modify it as you like -

inotifywait’s man page
How inotify works
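A rough, untested sketch of how you could wire inotifywait to a targeted Nextcloud scan (the install path, data directory and the “ftp” user name are assumptions you would adapt):

    #!/bin/bash
    # Watch the ftp user's files and rescan just that user when something changes.
    # WATCH_DIR and the occ path are assumptions for a typical install - adjust them.
    WATCH_DIR="/var/www/nextcloud/data/ftp/files"
    OCC="/var/www/nextcloud/occ"

    inotifywait -m -r -e close_write -e moved_to -e create --format '%w%f' "$WATCH_DIR" |
    while read -r changed_file; do
        echo "Change detected: $changed_file - rescanning user ftp"
        sudo -u www-data php "$OCC" files:scan ftp
    done

In practice you would probably want to debounce this (collect events for a minute or so before scanning), otherwise a large SFTP upload triggers one scan per file.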


I wrote a short script that rescans only the external shares and run it via cron a few times per day. You can combine both solutions.
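A sketch of that kind of script, if you want to adapt it (the install path and the mount points are placeholders, run occ files_external:list to see yours):

    #!/bin/bash
    # Rescan only the external share mount points instead of the whole instance.
    # The install path and the mount paths below are placeholders.
    OCC="/var/www/nextcloud/occ"

    for mount in "/alice/files/backup-share" "/bob/files/offsite-nas"; do
        sudo -u www-data php "$OCC" files:scan --path="$mount"
    done

A crontab entry then runs it every few hours.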