Automate OCC Filescan

Hey @gas85, thank you for your script. I tried writing a simple cron job (*/5 * * * * sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ") and I also tried running your script as a cron job (*/5 * * * * sudo -u www-data /usr/local/bin/ …). There is no error output, but in both cases the external folder isn’t updated.

When I run sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ" the folder is updated, but when I run your script or let it run as a cron job, the folder isn’t updated.

Please help me. What am I doing wrong? I really tried all the steps.

This will never work. You have to create the cron job for the user www-data. To do so, type:

sudo -u www-data crontab -e

After this, enter your line from above without sudo and the user, and save it:

*/5 * * * * php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ" 
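For reference, the five time fields in that line break down like this (a generic crontab sketch, not specific to any install):

```
# m    h   dom  mon  dow   command
*/5    *    *    *    *    php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ"
# every 5th minute, every hour, every day of month, every month, every weekday
```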

The aim of my script is to get info about all your shares and rescan them. Please check the documentation repo for the supported syntax. I believe you simply did not configure it correctly.

Thank you @gas85 for your fast answer.

When I type your cronjob command I get:

touch: cannot touch '/var/www/.selected_editor': Permission denied
Unable to create directory /var/www/.local/share/nano/: No such file or directory
It is required for saving/loading search history or cursor positions.

I also put the line without sudo (*/5 * * * * php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ") in before, but the folder isn’t updated either.

Stupid question: where do I find the documentation repo? I installed your script and checked the highlighted areas. Everything seemed to match my configuration.

When I run sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ" manually, the folder is updated.

I think it is strange, but in the screenshot you can see that the two top folders didn’t get scanned (they only appear there after running the occ command manually), unlike the old familiar ones below.

Then try it another way. This command makes you root and tells crontab to edit the www-data user’s tab; the old command made you user www-data and told crontab to edit that user’s own tab.

sudo crontab -u www-data -e

Docu related to the script is here:

Docu to the official command is here.

Try using the --verbose flag to see what happens when you rescan manually.

It still isn’t working. I don’t know what I am doing wrong.

When I ran

sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/USERNAME/files/FTP_Mount_Point"

manually it is working, but my cron job with

php /var/www/nextcloud/occ files:scan --path="/USERNAME/files/FTP_Mount_Point"

is not working. Neither with “sudo crontab -u www-data -e” nor with “sudo -u www-data crontab -e”

The script isn’t working either. Both (the script and the occ command in the cron job) just refresh the old folders and do not show new folders/files.

I checked the configuration, and the new folders/files are still missing:

LOCKFILE=/tmp/nextcloud_file_scan <-- Lock file to not execute script twice, if already ongoing

LOGFILE=/var/www/nextcloud/data/nextcloud.log <-- Location of Nextcloud LOG file, will put some logs in Nextcloud format

CRONLOGFILE=/var/log/next-cron.log <-- Location for the bash log, used when a command generates output. AND IT IS GENERATED…
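For context, the LOCKFILE above is what keeps two scans from running at the same time. A minimal sketch of that pattern, assuming the same /tmp/nextcloud_file_scan path (the function name and echo messages are illustrative, not taken from the script):

```shell
#!/bin/sh
# Minimal lock-file pattern: skip the run if a previous scan still holds the lock.
LOCKFILE=/tmp/nextcloud_file_scan

run_scan() {
  if [ -e "$LOCKFILE" ]; then
    echo "locked"      # a previous run is still active, so do nothing
    return 0
  fi
  touch "$LOCKFILE"
  echo "scanning"      # the actual occ files:scan call would go here
  rm -f "$LOCKFILE"    # release the lock when the scan finishes
}

run_scan   # prints "scanning" normally, "locked" while another run holds the lock
```

A real script would additionally trap signals so that a killed scan does not leave a stale lock file behind.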

Please, I need a solution for getting my scanner files into my Nextcloud automatically.

It is extremely complicated to give advice without knowing your system. Please provide more data about your environment.

Nevertheless, it could be that cron has no path to php, so simply use the absolute path to it.

  1. Find php. For me it is under /usr/bin/php:
whereis php
php: /usr/bin/php /usr/bin/php7.4 /usr/lib/php ...
  2. Redirect the error output to standard output so that you will be able to see it. Add 2>&1 to the command in cron.
  3. Redirect all output to a logfile to analyse it. Add >> /tmp/nextcloud_rescan_debug.log.
  4. Increase the output information with the --verbose flag.

All together, the line in your crontab should look like:

*/5 * * * * /usr/bin/php /var/www/nextcloud/occ files:scan --path="/USERNAME/files/FTP_Mount_Point" --verbose >> /tmp/nextcloud_rescan_debug.log 2>&1

Now you should be able to see output and errors in /tmp/nextcloud_rescan_debug.log. Simply run sudo tail -f /tmp/nextcloud_rescan_debug.log and wait until the job is executed.

After the troubleshooting is done, please clean up the command again (remove the additions from points 2–4).
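One detail worth knowing about those redirections: the order matters. `>> file 2>&1` sends both streams to the file, while `2>&1 >> file` lets stderr escape to wherever stdout pointed beforehand (under cron, usually the mail spool). A small standalone demo (nothing here is from the thread):

```shell
#!/bin/sh
# Demonstrate that `>> file 2>&1` captures both stdout and stderr.
demo() {
  echo "out"          # goes to stdout
  echo "err" >&2      # goes to stderr
}

log=$(mktemp)
demo >> "$log" 2>&1   # correct order: both lines land in the log
wc -l < "$log"        # prints 2
rm -f "$log"
```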

I see that I need to improve the documentation quality here. You only need to set correctly where your occ command is; the script will then check your config and extract the path to the Nextcloud log file from it. If it is not present in the config, you have to configure it for the script yourself.
I see that you have only 1 external storage, which makes it easier for you to use the occ command directly, as described above.

It is working now. Hallelujah. Thank you very much.

Maybe it was the path to php? In the first line the cron job refers to cron.php, and there was no path to php there either, so that is why I thought there was no need for it.

So I removed the troubleshooting additions. And yes, since I have only one external drive, the cron job is enough for me.

Last question: do you know how to configure it so that the modified date isn’t set to 0 every time? And for some reason the timestamp for the files says “in 25 minutes” and not “25 minutes ago”.


Try using the --unscanned flag to not touch existing files; this will at least leave the timestamp of the first scan on the files.

Funny, but it works very well for me on ext4:

What kind of FS are you using?

What do you mean with FS?

I noticed that the cron jobs are suddenly deleted/reset to their original state. Is it possible that an update resets the cron job? How can I prevent that?

FS is File System, like FAT, NTFS, ext, Btrfs…

I don’t know your environment; if you are on NextcloudPi, Docker or snap, then it could be. In other cases, Nextcloud should not be able to do something like this.

I use BTRFS and Nextcloudpi.


I am still trying to get external shares to synchronize.

When running the @gas85 script manually using:
sudo -u www-data
it works fine; the sync worked.

But when it is integrated into cron, it does not work.
Here is my cron for www-data:

hep@bigfoot2:~$ sudo crontab -l -u www-data
# Edit this file to introduce tasks to be run by cron.
# Each task to run has to be defined through a single line
# indicating with different fields when the task will be run
# and what command to run for the task
# To define the time you can provide concrete values for
# minute (m), hour (h), day of month (dom), month (mon),
# and day of week (dow) or use '*' in these fields (for 'any').
#
# Notice that tasks will be started based on the cron's system
# daemon's notion of time and timezones.
# Output of the crontab jobs (including errors) is sent through
# email to the user the crontab file belongs to (unless redirected).
# For example, you can run a backup of all your user accounts
# at 5 a.m every week with:
# 0 5 * * 1 tar -zcf /var/backups/home.tgz /home/
# For more information see the manual pages of crontab(5) and cron(8)
# m h  dom mon dow   command
*/5  *  *  *  * php -f /home/www/
34 2 * * * /usr/local/bin/

This runs on Ubuntu server 20.04.2 LTS

Probably the php path is unknown to the cron user. Check where your php is installed and set that path in the script.
E.g., to find out:

whereis php

Then take your path and configure it in the script.

whereis php
php: /usr/bin/php

In the script:

    # Your PHP location


This looks good

Well, I changed the cron settings to test it right now.
Something is running:
(Screenshot: Capture d’écran_2021-05-24_14-39-22)

We can see that PID 441722 looks like it is running the script.
And in the NC Logging window I see this:

[] Info: +++ Starting Cron Filescan +++
by occ at 2021-05-24T14:35:01+02:00

And after some time:

[] Info: +++ Cron Filescan Completed. Execution time: 10 minutes and 8 seconds +++
by occ at 2021-05-24T14:45:09+02:00

Well, I retried running the command:
sudo -u www-data
And after almost the same 10 minutes, it is still not synced.
Very strange.

You can always check the actual status of an external-share scan via the ps command.

ps -ef | grep "files:scan"
www-data 1040883 1040341  0 May22 ?        00:00:14 /usr/bin/php /var/www/nextcloud/occ files:scan --path=user/files/Externals/Yandex

Here you can see that user/files/Externals/Yandex is actually being scanned and, by the way, that it has been hanging since May 22 (today is the 25th of May)…
When I kill it, it causes:

+++ Cron Filescan Completed. Execution time: 4082 minutes and 14 seconds +++

This is not an issue with the script, but rather with how Yandex reports “no free space left”.

Well …

With this command:

ps -ef | grep "files:scan"

I see only one user with 2 mount points being updated.
And this user is the last user in the account table or users list in the admin panel.
The script is not scanning all the previous users.

But, if I do this:

php occ files_external:list

And this:

php occ files_external:list | awk -F'|' '{print $8"/files"$3}'| tail -n +4 | head -n -1 | awk '{gsub(/ /, "", $0); print}'

I got the list of 5 mount points with the associated users. These mount points are the ones defined under the admin account’s Settings → External storages.
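To make that pipeline less opaque, here is a self-contained sketch of what each stage does, run against a fabricated table (the column layout mimics files_external:list output, but a real table may differ):

```shell
#!/bin/sh
# Wrap the parsing stages of the pipeline in a function:
#  - awk -F'|' builds "<user>/files<mountpoint>" from columns 8 and 3,
#  - tail -n +4 / head -n -1 drop the table borders and header,
#  - the final awk strips the padding spaces.
mounts_to_paths() {
  awk -F'|' '{print $8"/files"$3}' \
    | tail -n +4 | head -n -1 \
    | awk '{gsub(/ /, "", $0); print}'
}

# Fabricated sample resembling `occ files_external:list` output.
sample_table() {
cat <<'EOF'
+----------+-------------+---------+------+---------------+---------+------------------+-------------------+
| Mount ID | Mount Point | Storage | Auth | Configuration | Options | Applicable Users | Applicable Groups |
+----------+-------------+---------+------+---------------+---------+------------------+-------------------+
| 1        | /FTP        | FTP     | Pwd  | host: x       |         | alice            |                   |
+----------+-------------+---------+------+---------------+---------+------------------+-------------------+
EOF
}

sample_table | mounts_to_paths   # prints: alice/files/FTP
```

Each printed path can then be fed to occ files:scan --path=… in a loop, which is essentially what the script automates.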

I do not see the users’ own external storages (not very important at this moment).