Automate OCC Filescan

How can I use OCC (or something else?) to trigger a rescan of external storage whenever a change is made?

I have Nextcloud running on Docker on my Mac with this docker-compose. I made a small tweak to it so I can access my external storage (external hard drive):

  app:
    build: ./app
    restart: always
    volumes:
      - nextcloud:/var/www/html
      - /Volumes/LaCie:/Volume    # my tweak: bind mount for the external hard drive
    env_file:
      - db.env
    depends_on:
      - db
      - redis
    networks:
      - proxy-tier
      - default

I was able to successfully enable local storage:

Immediately after setting up local storage, I am able to access all of my files and the data size shows "pending". However, after some time only the files I have accessed remain. For example, if I have a TV show folder, click on the show SpongeBob, and select Season 1, all episodes from Season 1 will load and the data size will update. But if I don't open the other seasons (i.e. Seasons 2, 3, 4), those folders will disappear from Nextcloud.
Also, if I create a new folder with the same name (e.g. Season 2), Nextcloud will state that Season 2 already exists and the folder will show up again.

Running the command php occ files:scan --all as the user www-data appears to have done the job… all of my external storage files are showing! However, is there a better way to automate this? I think running a full scan every time will use a lot of resources… is there an alternative solution?

By the way, files:scan --all will rescan the whole file system together with the externals…
I used this script to rescan external shares only. The script fetches data about all external storages of all users and scans them. I run it from cron about twice per day, and it consumes far fewer resources compared to the "rescan all" command.

Thanks for the info! Can you please share the path where you stored this script and how you call it from cron?
Also, I noticed some features in the script are commented out. Can you please share what you uncommented to automate scanning of external drives?


Basically it works out of the box. You only have to check your Nextcloud path and log path, and create a log file for the php occ output.

I put it in


with chmod 755

I run it as the Nextcloud user (for me it is www-data), basically twice per day at 2:30 and 14:30. You can also run it hourly. This is my cron config:

30 2,14 * * * perl -e 'sleep int(rand(1800))' && /usr/local/bin/ #Nextcloud file sync

Here I added perl -e 'sleep int(rand(1800))' to inject a random start delay of up to 30 minutes, but since the script scans externals only, it is not necessary any more. Your cron config to run it hourly could be:
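For reference, the same random delay can be produced in plain bash without perl. In this sketch the delay is only computed and printed, not actually slept, so the example runs instantly:

```shell
# bash equivalent of perl -e 'sleep int(rand(1800))':
# pick a random delay between 0 and 1799 seconds to spread out start times.
DELAY=$(( RANDOM % 1800 ))
echo "would sleep ${DELAY}s before starting the scan"
# a real cron line would then do: sleep "$DELAY" && /path/to/your/script
```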

0 * * * * /usr/local/bin/

Let's go through what it does (valid for commit 44d9d2f):

COMMAND=/var/www/nextcloud/occ <-- This is where your Nextcloud OCC command is located
OPTIONS="files:scan" <-- This is the command to run; just leave it as it is
LOCKFILE=/tmp/nextcloud_file_scan <-- Lock file so the script is not executed twice if a run is already ongoing
LOGFILE=/var/www/nextcloud/data/nextcloud.log <-- Location of the Nextcloud log file; the script will write some entries in Nextcloud format
CRONLOGFILE=/var/log/next-cron.log <-- Location of the bash log, for any output the command generates. AND IT IS GENERATED…
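The LOCKFILE mechanism is a standard shell pattern; a minimal sketch of the idea (the file name here is shortened for the demo, and the real script may implement it differently) looks like this:

```shell
#!/bin/sh
# Skip this run if a previous scan is still going; otherwise take the lock.
LOCKFILE=/tmp/nextcloud_file_scan_demo

if [ -e "$LOCKFILE" ]; then
    echo "previous scan still running, exiting"
    exit 0
fi
touch "$LOCKFILE"
trap 'rm -f "$LOCKFILE"' EXIT   # release the lock even if the scan fails

echo "lock acquired"            # the occ files:scan call would go here
```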

Line 22 will generate an NC log entry. You will see it in the GUI as:

Line 26 starts the job. Basically it is left over from an older version of the script, and it does exactly what you did - scan all users, all shares, and all local storage with all folders. It takes ages on big installations, so I commented it out.

The second option (line 31) is to scan a specific user, but as soon as I got more than one user with external shares it no longer worked. Besides, it still scans the whole partition (local and remote) for that user - also commented out.

Lines 35 to 42 are comments so I don't forget how I get the users from NC. Basically everything happens in line 45: the script generates the exact paths of the external shares to be updated for all users (you can run it and test the output). Here is the command by itself:

sudo -u www-data php occ files_external:list | awk -F'|' '{print $8"/files"$3}'| tail -n +4 | head -n -1 | awk '{gsub(/ /, "", $0); print}'

and output:


Those lines are read one by one and synced in line 49.
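The whole mechanism can be dry-run without a real installation. The table below only imitates the shape of files_external:list output (the users and mount points are invented), and the final loop just echoes what would be scanned instead of calling occ:

```shell
#!/bin/sh
# SAMPLE imitates `occ files_external:list` output; in the real script the
# table comes from the occ command itself.
SAMPLE='+----------+-------------+---------+---------------------+---------------+---------+------------------+-------------------+
| Mount ID | Mount Point | Storage | Authentication Type | Configuration | Options | Applicable Users | Applicable Groups |
+----------+-------------+---------+---------------------+---------------+---------+------------------+-------------------+
| 1        | /Media      | Local   | None                | datadir       |         | alice            |                   |
| 2        | /Backup     | Local   | None                | datadir       |         | bob              |                   |
+----------+-------------+---------+---------------------+---------------+---------+------------------+-------------------+'

# Same pipeline as in the thread: take user ($8) and mount point ($3),
# skip the 3 header lines and the bottom border, then strip all spaces.
PATHS=$(printf '%s\n' "$SAMPLE" \
  | awk -F'|' '{print $8"/files"$3}' \
  | tail -n +4 | head -n -1 \
  | awk '{gsub(/ /, "", $0); print}')

printf '%s\n' "$PATHS" | while read -r p; do
    # the real script runs: php occ files:scan --path="$p"
    echo "would scan: $p"
done
```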

After this, the script will generate NC log output:

I had some issues (like described here: OCC files:cleanup - Does it delete the db table entries of the missing files?) in older NC versions, so I added a workaround in lines 60 to 67 using the files:cleanup command. Not sure if it is still needed, but it does not harm anything.


I have a few docker containers for my nextcloud install. Which one should I use for this script?

Here is what my current cron.php looks like:

Where do I put this cron line:

30 2,14 * * * perl -e 'sleep int(rand(1800))' && /usr/local/bin/ #Nextcloud file sync

Just to confirm, my file will be in the same directory?

And just to confirm…I don’t need to modify the file?

So I have all of my external shares mounted under the admin user (and certain shares are shared with certain users). My users don't have the ability to mount their own external storage.


I do not know your installation. If you have several containers as replicas of your cloud, each with the DB inside, I'm not sure this script can help you at all.

If you have several containers connected to one shared DB, theoretically you can run it in one of them and it will do the job for the others. But that container could be stopped and the job would not finish.

If you have several containers, each with its own DB, run it everywhere external shares are present.

With Docker it is a bit more complicated to run a cron job (this is not about cron.php from Nextcloud, but the system cron). Try to read this and find the way that works best for you.
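As one possible illustration (not from this thread - the base image, script name, and packaging are all assumptions), a Debian-based Nextcloud image could get a system cron entry via a Dockerfile fragment roughly like this; note that the cron daemon itself still has to be started by the container's entrypoint:

```dockerfile
FROM nextcloud:apache
# Install system cron if the image does not already provide it
RUN apt-get update && apt-get install -y cron && rm -rf /var/lib/apt/lists/*
# scan_external.sh is a hypothetical name for the rescan script
COPY scan_external.sh /usr/local/bin/scan_external.sh
RUN chmod 755 /usr/local/bin/scan_external.sh \
 && echo '0 * * * * www-data /usr/local/bin/scan_external.sh' > /etc/cron.d/nextcloud-scan \
 && chmod 0644 /etc/cron.d/nextcloud-scan
# the entrypoint must also start the cron daemon, e.g. `service cron start`
```

/etc/cron.d entries carry an extra user field (www-data here), so the job runs as the NC user without needing a per-user crontab.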

As I wrote, you also do not need to run it with a random time shift, especially under Docker; use the simple line. Again, this is not cron.php, this is system cron.

Steps to do:

  1. Put the script in whatever place you like within your NC Docker container.
  2. Adjust the header of the script if your NC installation folder differs from what is in the header, e.g. COMMAND=/Your/NC/Path, and likewise for LOGFILE.
  3. Read how to run a cron job under Docker.
  4. Create a cron job (not cron.php) for your NC user with a simple line like:
0 * * * * /path/to/the/
  5. Test it.
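Step 4 can be scripted. Here is a dry run that only builds and prints the new crontab text (the script name scan_external.sh is hypothetical); installing it for real is left as the commented line:

```shell
#!/bin/sh
# Build the crontab entry for the NC user without installing anything.
ENTRY='0 * * * * /usr/local/bin/scan_external.sh'
# keep whatever crontab already exists (ignore errors if there is none)
NEW_CRONTAB="$( { crontab -u www-data -l 2>/dev/null || true; }; echo "$ENTRY" )"
printf '%s\n' "$NEW_CRONTAB"
# to install for real (as root): printf '%s\n' "$NEW_CRONTAB" | crontab -u www-data -
```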

I updated the script: it no longer requires the log file paths to be set, and it will drop all output if the log files are not present or not writable.
I also added an -all option to make it rescan the whole partition the old way. If you run it without any arguments, it will rescan external shares only. You only need to check the header and leave the rest untouched.
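Based on that description, the new behaviour boils down to something like this sketch (the variable names are mine, not the script's):

```shell
#!/bin/sh
# Default: rescan external shares only; "-all" switches to the old full rescan.
MODE="externals-only"
if [ "${1:-}" = "-all" ]; then
    MODE="full-rescan"
fi

# Drop output when the log file is absent or not writable, instead of failing.
CRONLOGFILE=/var/log/next-cron.log
if [ -w "$CRONLOGFILE" ]; then
    OUT="$CRONLOGFILE"
else
    OUT=/dev/null
fi

echo "mode=$MODE, command output goes to $OUT"
```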

# Adjust to your NC installation
    # Your NC OCC Command path
    # Your NC log file path
    # Your log file path for other output if needed
    # If you want to perform cache cleanup, please change CACHE value to 1

I basically copied my Docker setup for Nextcloud from here:

So the configuration already had a docker file inside the app folder…so I simply added the command to it (hopefully correctly):

And I copied the contents of the file into /usr/local/bin with vim (I created the file and copied/pasted the code):

The only thing I changed in the file is the file paths:
# Adjust to your NC installation
# Your NC OCC Command path
# Your NC log file path
# Your log file path for other output if needed
# If you want to perform cache cleanup, please change CACHE value to 1

I ran docker-compose up -d and waited an hour… but I don't think the file scan occurred.

I believe I am creating the cron job incorrectly in the Dockerfile. Any tips?


@gas85 do you know if it's possible to run the cron script directly in bash so I can confirm whether it's working? Currently I am not sure if the script isn't working or if my call to run it isn't.


Of course - run it as your NC user (e.g. www-data). Not sure how it works under Docker, but on a normal system (e.g. Ubuntu) it is something like:

sudo -u www-data

I made the .sh file executable: chmod +x

And had to adjust the file paths in the sh file:

# Adjust to your NC installation
	# Your NC OCC Command path
	# Your NC log file path
	# Your log file path for other output if needed
	# If you want to perform cache cleanup, please change CACHE value to 1

When I just ran it in the terminal, I got:

And when I ran it with sudo -u www-data (I had to install sudo in my Docker container), I got:

WARNING - could not write to Log file /var/www/html/data/next-cron.log, will drop log messages. Is User Correct?

In both scenarios, my files didn't update. So I am assuming I need to edit something in the .sh file so that it scans my paths?


@gas85 any idea why nothing updates after the scan? Do I have to make any modifications (please see the post above)?


Hey. It looks like the file /var/www/html/data/next-cron.log does not exist or is not writable. That's all - not a big issue. You will not see the output of occ, but the script will not stop because of it.
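Creating the log file and making it writable by the NC user removes that warning. The demo below uses a temporary directory so it is safe to run anywhere; in the container the real path is /var/www/html/data/next-cron.log:

```shell
#!/bin/sh
# Use a temp dir for the demo; in the container, target the real path instead.
LOGDIR=$(mktemp -d)
LOGFILE="$LOGDIR/next-cron.log"

touch "$LOGFILE"
# In the real container (as root): chown www-data:www-data /var/www/html/data/next-cron.log
[ -w "$LOGFILE" ] && echo "log file exists and is writable"
```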

Regarding sudo, I'm not sure you need to install it. I thought the container user is your NC user, so you should be able to run the script directly from the command line in the container. Sorry, I haven't worked with containers directly, so I can't help you with running the script there.

On 30 August 2018 at 21:02:02 GMT+02:00, K-MTG wrote:

I think I have the script running. But it seems to be scanning the wrong thing…because it says scan completed but nothing updates.

Try to run (under NC user):

php occ files_external:list

you should see something similar to this:

This will output a list of all external shares. Check that it is not empty.

Then run the following and check the output (it should be a list of USERNAME/files/SHARENAME entries):

php occ files_external:list | awk -F'|' '{print $8"/files"$3}'| tail -n +4 | head -n -1 | awk '{gsub(/ /, "", $0); print}'

Then take one path (from the first table, or from the processed list) and run this command to see what happens:

php occ files:scan --path="/USERNAME/files/Mount_Point"

Everything must be executed as your NC user. For Ubuntu I used the sudo -u www-data command, as www-data is my NC user. Hope this helps with your troubleshooting.