Automate OCC Filescan

Of course, run it under your NC user (e.g. www-data). Not sure how it works under Docker, but on a normal (e.g. Ubuntu) system it is something like:

sudo -u www-data nextcloud-file-sync.sh
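
If you are on Docker, something along these lines might work from the host. This is only a sketch; the container name nextcloud and the script location /usr/local/bin/ are assumptions, so adjust them to your setup:

# run the script inside the container as the web server user (www-data in the official image)
docker exec -u www-data nextcloud /usr/local/bin/nextcloud-file-sync.sh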

I made the .sh file executable: chmod +x nextcloud-file-sync.sh

And I had to adjust the file paths in the .sh file:

# Adjust to your NC installation
# Your NC OCC Command path
COMMAND=/var/www/html/occ
# Your NC log file path
LOGFILE=/var/www/html/data/nextcloud.log
# Your log file path for other output if needed
CRONLOGFILE=/var/www/html/data/next-cron.log
# If you want to perform cache cleanup, please change CACHE value to 1
CACHE=0

When I just ran nextcloud-file-sync.sh in the terminal, I got:

And when I ran sudo -u www-data nextcloud-file-sync.sh (I had to install sudo in my Docker container), I got:

WARNING - could not write to Log file /var/www/html/data/next-cron.log, will drop log messages. Is User Correct?
root@c2663b1e77ea:/usr/local/bin#

In both scenarios, my file path didn't update, so I am assuming I need to edit something in the .sh file so it scans my paths?

Thanks

@gas85 any idea why the script doesn't update after the scan? Do I have to make any modifications (please see the post above)?

Thanks

Hey. It looks like the file /var/www/html/data/next-cron.log does not exist or is not writable. That's all, not a big issue. You will not see the output of occ, but the script will not stop because of it.
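
If you do want that output, creating the file and handing it to the web server user should be enough. A minimal sketch, assuming the paths and user from your post above:

# create the cron log file and make it writable for the NC user
touch /var/www/html/data/next-cron.log
chown www-data:www-data /var/www/html/data/next-cron.log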

Regarding sudo, I am not sure you need to install it. I thought the container user is your NC user, so you should be able to run your script directly on the command line in the container. Sorry, I have not worked with containers directly, so I can't help you with how to run the script there.


I think I have the script running, but it seems to be scanning the wrong thing, because it says the scan completed but nothing updates.

Try to run (under NC user):

php occ files_external:list

You should see something similar to this:

This will output a list of all external shares. Check that it is not empty.

Then try to run this and check the output (it should be a list of USERNAME/files/SHARENAME entries):

php occ files_external:list | awk -F'|' '{print $8"/files"$3}'| tail -n +4 | head -n -1 | awk '{gsub(/ /, "", $0); print}'

Then take one path (from the first table, or from the last adapted list) and run this command to check what happens:

php occ files:scan --path="/USERNAME/files/Mount_Point"

Everything must be executed under your NC user. On Ubuntu I used sudo -u www-data COMMAND, as www-data is my NC user. Hope it helps you with troubleshooting.
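
For example, on Ubuntu the full calls could look like this (assuming /var/www/nextcloud as the install path and www-data as the NC user; adjust both to your system):

# list external shares, then scan one of the listed paths
sudo -u www-data php /var/www/nextcloud/occ files_external:list
sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/USERNAME/files/Mount_Point"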

Dear Friend,

thank you for your script!

I have the same issue with external FTP uploads and Nextcloud, but I have an FTP plan on a shared server.

Is there any way to run this script on a shared FTP?

I do have a cPanel where I manage my server space, but I'm not sure I am able to follow the instructions that you have written (upload to /usr/local/bin/, etc.).

Any help would be appreciated!

If you mean

then yes, it works for all external shares independently of the type (ftp, smb, local, etc.).

You can upload this script anywhere and add it to a cron job.
The aim of this script is to reduce scanning time by doing a focused scan of external shares only. Otherwise you can run (also as a cron job) the following commands:

  1. If you have only one external storage, then simply run this command to rescan it:
php /var/www/nextcloud/occ files:scan --path="/USERNAME/files/FTP_Mount_Point"

You can find more info about it, and how to do it, in this comment:

OR

  2. You can simply try to rescan the whole NC (external storage included) with this command:
php /var/www/nextcloud/occ files:scan --all

This is an inefficient way to rescan external shares, but it simply works. You can find more info in the official docu: Using the occ command — Nextcloud Administration Manual
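
For reference, either alternative could go into the www-data crontab as a scheduled job. The schedules and paths below are only placeholders, adjust them to your installation:

# edit the crontab of the NC user: sudo -u www-data crontab -e
# focused scan of one external mount, nightly at 03:00
0 3 * * * /usr/bin/php /var/www/nextcloud/occ files:scan --path="/USERNAME/files/FTP_Mount_Point"
# or a full rescan of everything, weekly on Sunday at 04:00
0 4 * * 0 /usr/bin/php /var/www/nextcloud/occ files:scan --all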

Hi,

Thanks for that script.
I get this error:

.: nextcloud-scripts-config.conf: not found

If I remove this line, the script seems to not work:

45: [: Illegal number:
stat: missing operand
Try 'stat --help' for more information.
WARNING - could not write to Log file , will drop log messages. Is User Correct? Current log file owner is
stat: cannot stat '/var/log/next-cron.log': No such file or directory
WARNING - could not write to Log file /var/log/next-cron.log, will drop log messages. Is User Correct? Current log file owner is
root@bigfoot2:/jukebox/Pictures/Hep_Familly/Vacance# joe /etc/cron.weekly/NextCloud_Hep

Hi, you can create this file from the template here: nextcloud_scripts/etc/nextcloud-scripts-config.conf at master · GAS85/nextcloud_scripts · GitHub

Then you have to configure the variables in the script:

# Adjust to your NC installation

# Your NC OCC Command path
COMMAND=/var/www/nextcloud/occ

# Your NC log file path
LOGFILE=/var/www/nextcloud/data/nextcloud.log

# Your log file path for other output if needed
CRONLOGFILE=/var/log/next-cron.log

# If you want to perform cache cleanup, please change CACHE value to 1
CACHE=0

# Your PHP location
PHP=/usr/bin/php
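
Judging by the ".: nextcloud-scripts-config.conf: not found" error above, the script loads this file with the shell's . (source) command, so the conf file itself is just these same assignments in a plain text file. A minimal sketch with example values (adjust every path to your installation):

# nextcloud-scripts-config.conf
COMMAND=/var/www/nextcloud/occ
LOGFILE=/var/www/nextcloud/data/nextcloud.log
CRONLOGFILE=/var/log/next-cron.log
CACHE=0
PHP=/usr/bin/php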

Thank you @gas85, I was using external storage as primary storage, and the quota usage would never update. You saved my whole operation.


Hey there, hey @gas85, thank you for your script. I tried writing a simple cron job (*/5 * * * * sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ") and I tried it with your script run as a cron job (*/5 * * * * sudo -u www-data /usr/local/bin/nextcloud-file-sync.sh). There is no error output, but in both cases the external folder isn't updated.

When I run sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ" the folder is updated, but when I run your script or let it run as a cron job, the folder isn’t updated.

Please help me. What am I doing wrong? I really tried all the steps.

This will never work. You have to create the cron job for the user www-data. To do so, type:

sudo -u www-data crontab -e

After this, enter your line from above without sudo and the user, and save it:

*/5 * * * * php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ" 

The aim of my script is to get info about all your shares and rescan them. Please check the documentation in the repo for the supported syntax. I believe you simply did not configure it correctly.
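
To double-check that the line really landed in the www-data crontab and not in root's, you can list that user's crontab afterwards:

sudo -u www-data crontab -l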

thank you @gas85 for your fast answer.

When I type your cronjob command I get:
"touch: cannot touch '/var/www/.selected_editor': Permission denied Unable to create directory /var/www/.local/share/nano/: No such file or directory It is required for saving/loading search history or cursor positions.

I also put the line without sudo (*/5 * * * * php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ") in before, but the folder isn't updated either.

Stupid question: Where do I find the documentation repo? I installed your script and checked the highlighted areas. Everything seemed to match my configuration.

When I run sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ" manually, the folder is updated.

I think it is strange, but in the screenshot you can see that the two top folders didn't get scanned (they only get there when I run the occ command manually), unlike the old familiar ones below.

Then try it the other way. This makes you root and tells crontab to edit the www-data user's tab. The old command made you user www-data and told crontab to edit that user's current tab.

sudo crontab -u www-data -e

Docu related to the script is here:

The docu for the official command is here:
https://docs.nextcloud.com/server/latest/admin_manual/configuration_server/occ_command.html#file-operations

Try to use the --verbose key to see what happens when you rescan manually.
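
For example (same placeholder path as above, adjust it to your mount):

sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ" --verbose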

It still isn't working. I don't know what I am doing wrong.

When I ran

sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/USERNAME/files/FTP_Mount_Point"

manually it is working, but my cron job with

php /var/www/nextcloud/occ files:scan --path="/USERNAME/files/FTP_Mount_Point"

is not working. Neither with “sudo crontab -u www-data -e” nor with “sudo -u www-data crontab -e”

The script isn’t working either. Both (script and occ-command in cronjob) are just refreshing the old folders and not showing new folders/files.

I checked the configuration, and the folders/files are missing:

LOCKFILE=/tmp/nextcloud_file_scan ← Lock file so the script is not executed twice if a run is already ongoing

LOGFILE=/var/www/nextcloud/data/nextcloud.log ← Location of the Nextcloud LOG file; the script will put some logs there in Nextcloud format

CRONLOGFILE=/var/log/next-cron.log ← Location for the bash log, in case a command generates output. AND IT IS GENERATED…

Please, I need a solution for getting my scanner files into my Nextcloud automatically.

It is extremely complicated to give advice without knowing your system. Please provide more data about your environment.

Nevertheless, it could be that cron has no path to php, so simply use the absolute path to it.

  1. Find php. For me it is under /usr/bin/php:
whereis php
php: /usr/bin/php /usr/bin/php7.4 /usr/lib/php ...
  2. Redirect all output to a logfile so you can analyse it: add >> /tmp/nextcloud_rescan_debug.log to the command in cron.
  3. Also redirect the error output into that logfile so you can see errors too: add 2>&1 after the logfile redirect.
  4. Increase the amount of output with the --verbose key.

All together, your crontab entry should look like this:

*/5 * * * * /usr/bin/php /var/www/nextcloud/occ files:scan --path="/USERNAME/files/FTP_Mount_Point" --verbose >> /tmp/nextcloud_rescan_debug.log 2>&1

Now you should be able to see output and errors in /tmp/nextcloud_rescan_debug.log; simply run sudo tail -f /tmp/nextcloud_rescan_debug.log and wait until the job is executed.

After troubleshooting is done, please clean up the additions from points 2-4.
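
Once it works, the cleaned-up crontab entry would shrink back to something like:

*/5 * * * * /usr/bin/php /var/www/nextcloud/occ files:scan --path="/USERNAME/files/FTP_Mount_Point"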

I see that I need to improve the documentation quality here. You only need to set correctly where your occ command is; the script will then check your config and extract the path to the Nextcloud log file from it. If it is not present in the config, you have to configure it for the script.
I see that you have only one external storage, which makes it easier for you to use the occ command directly, as described above.

It is working now. Hallelujah. Thank you very much.

Maybe it was the path to php? In the first line the cron job refers to cron.php and there is also no path to php, so that is why I thought there was no need for it.

So I removed the troubleshooting additions. And yes, since I have only one external drive, the cron job is enough for me.

Last question: Do you know how to configure it so that the modified date doesn't get set to 0 every time? And for some reason the timestamp for the files says "in 25 minutes" and not "25 minutes ago".


Try to use the --unscanned key to not touch existing files; this will at least leave the timestamp of the first scan on the files.
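
For example (the usual placeholder path and www-data user, adjust to your setup; whether --unscanned fits your case is something to test):

sudo -u www-data php /var/www/nextcloud/occ files:scan --unscanned --path="/USERNAME/files/Mount_Point"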

Funny, but it works for me very well on ext4.

What kind of FS are you using?

What do you mean by FS?