Then yes, it works for all external shares regardless of type (FTP, SMB, local, etc.).
You can upload this script anywhere and add it to a cron job.
The aim of this script is to reduce scanning time by running a focused scan of external shares only. Otherwise you can run the following commands (also as a cron job):
If you have only one external storage, then simply run a command to rescan it:
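For the single-storage case, the occ command quoted later in this thread can go straight into a crontab; a sketch (the "/XYZ/files/XYZ" placeholder path is the one used in the later posts, so adjust it to your own share):

```shell
# Rescan one external storage every 5 minutes, running occ as the
# web-server user. Paths follow this thread's examples; adjust to yours.
*/5 * * * * sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ"
```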
If I remove this line, the script seems to stop working:
45: [: Illegal number:
stat: missing operand
Try 'stat --help' for more information.
WARNING - could not write to Log file , will drop log messages. Is User Correct? Current log file owener is
stat: cannot stat '/var/log/next-cron.log': No such file or directory
WARNING - could not write to Log file /var/log/next-cron.log, will drop log messages. Is User Correct? Current log file owener is
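The two stat errors above just mean the log file does not exist yet. A minimal fix, sketched here against a temp path so it is safe to try anywhere (on the real system you would do the same to /var/log/next-cron.log as root):

```shell
# Create the cron log file so the script can write to it. The warning about
# the owner suggests it should belong to the user running the job
# (www-data in this thread). /tmp is used here purely for illustration.
LOGFILE=/tmp/next-cron.log
touch "$LOGFILE"                       # fixes "No such file or directory"
# chown www-data:www-data "$LOGFILE"   # on the real file, run this as root
[ -w "$LOGFILE" ] && echo "log file is writable"
```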
root@bigfoot2:/jukebox/Pictures/Hep_Familly/Vacance# joe /etc/cron.weekly/NextCloud_Hep
# Adjust to your NC installation
# Your NC OCC Command path
COMMAND=/var/www/nextcloud/occ
# Your NC log file path
LOGFILE=/var/www/nextcloud/data/nextcloud.log
# Your log file path for other output if needed
CRONLOGFILE=/var/log/next-cron.log
# If you want to perform cache cleanup, please change CACHE value to 1
CACHE=0
# Your PHP location
PHP=/usr/bin/php
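For context, variables like these would typically be combined into the rescan invocation roughly as follows. This is a sketch, not the actual script (which iterates over all external shares); the command is echoed rather than executed so nothing runs against a live install, and the share path is the one that appears later in this thread:

```shell
# Sketch only: how the configured paths plug into an occ rescan call.
COMMAND=/var/www/nextcloud/occ
CRONLOGFILE=/var/log/next-cron.log
PHP=/usr/bin/php
# echo instead of executing, so this is safe to run anywhere:
echo sudo -u www-data "$PHP" "$COMMAND" files:scan --path="/user/files/Externals" ">>" "$CRONLOGFILE"
# prints: sudo -u www-data /usr/bin/php /var/www/nextcloud/occ files:scan --path=/user/files/Externals >> /var/log/next-cron.log
```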
Hey there, hey @gas85, thank you for your script. I tried writing a simple cron job (*/5 * * * * sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ") and I tried it with your script run as a cron job (*/5 * * * * sudo -u www-data /usr/local/bin/nextcloud-file-sync.sh). There is no error output, but in both cases the external folder isn't updated.
When I run sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ" the folder is updated, but when I run your script or let it run as a cron job, the folder isn’t updated.
Please help me. What am I doing wrong? I really tried all the steps.
The aim of my script is to get info about all your shares and rescan them. Please check the documentation in the repo for the supported syntax. I believe you simply did not configure it correctly.
When I type your cronjob command I get: "touch: cannot touch '/var/www/.selected_editor': Permission denied. Unable to create directory /var/www/.local/share/nano/: No such file or directory. It is required for saving/loading search history or cursor positions."
I also put the line without sudo (*/5 * * * * php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ") before, but the folder isn't updated either.
Stupid question: Where do I find the documentation repo? I installed your script and checked the highlighted areas. Everything seemed to be like in my configuration.
When I run sudo -u www-data php /var/www/nextcloud/occ files:scan --path="/XYZ/files/XYZ" manually, the folder is updated.
I think it is strange, but in the screenshot you can see that the two top folders didn't get scanned (they only appear after manually running the occ command), unlike the old familiar ones below.
Then try it another way. This makes you root and tells crontab to edit the www-data user's crontab. The old command makes you user www-data and tells crontab to edit that user's own crontab.
Now you should be able to see output and errors in /tmp/nextcloud_rescan_debug.log; simply run sudo tail -f /tmp/nextcloud_rescan_debug.log and wait until the job is executed.
After troubleshooting is done, please clean up the commands from points 2-4.
I see that I need to improve the documentation quality here. You only need to set correctly where your occ command is; the script will then check your config and extract the path to the Nextcloud log file from it. If it is not present in the config, you have to configure it for the script yourself.
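To illustrate the "extract from config" part: Nextcloud's config/config.php stores the log path under the 'logfile' key, and it can be pulled out with a grep along these lines (GNU grep with -P is assumed; shown against a sample file in /tmp, so point it at your real config/config.php instead):

```shell
# Write a sample config.php just for this demonstration.
cat > /tmp/sample-config.php <<'EOF'
<?php
$CONFIG = array (
  'datadirectory' => '/var/www/nextcloud/data',
  'logfile' => '/var/www/nextcloud/data/nextcloud.log',
);
EOF
# Extract the value of the 'logfile' key (GNU grep, PCRE \K to drop the prefix).
grep -oP "'logfile'\s*=>\s*'\K[^']+" /tmp/sample-config.php
# prints: /var/www/nextcloud/data/nextcloud.log
```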
I see that you have only one external storage, which makes it easier for you to use the occ command directly, as described above.
It is working now. Hallelujah. Thank you very much.
Maybe it is the path to php? In the first line the cron job refers to cron.php and there is also no path to php, so that is why I thought there was no need for it.
So I removed the troubleshooting. And yes, since I have only one external drive, the cron job is enough for me.
Last question: do you know how to configure it so that the modified date isn't set to 0 every time? And for some reason the timestamp for the files says "in 25 minutes" and not "25 minutes ago".
I noticed that suddenly the cron jobs are deleted/reset to their original state. Is it possible that an update resets the cron job? How can I prevent that?
I don't know your environment; if you are on NextcloudPi, Docker, or snap, then it could be. In other cases Nextcloud should not be able to do something like this.
I am still trying to get external shares to synchronize.
When running the @gas85 script manually using: sudo -u www-data nextcloud-file-sync.sh
it works fine; the sync worked.
But when integrated into cron, it does not work.
Here is my cron for www-data:
hep@bigfoot2:~$ sudo crontab -l -u www-data
# Edit this file to introduce tasks to be run by cron.
#
# Each task to run has to be defined through a single line
# indicating with different fields when the task will be run
# and what command to run for the task
#
# To define the time you can provide concrete values for
# minute (m), hour (h), day of month (dom), month (mon),
# and day of week (dow) or use '*' in these fields (for 'any').
#
# Notice that tasks will be started based on the cron's system
# daemon's notion of time and timezones.
#
# Output of the crontab jobs (including errors) is sent through
# email to the user the crontab file belongs to (unless redirected).
#
# For example, you can run a backup of all your user accounts
# at 5 a.m every week with:
# 0 5 * * 1 tar -zcf /var/backups/home.tgz /home/
#
# For more information see the manual pages of crontab(5) and cron(8)
#
# m h dom mon dow command
*/5 * * * * php -f /home/www/www.hephoto.ch/cloud/cron.php
34 2 * * * /usr/local/bin/nextcloud-file-sync.sh
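One common reason a job runs fine in an interactive shell but not under cron is cron's minimal environment (PATH in particular). A variant of the two lines above with an absolute php path (the /usr/bin/php from the script config earlier) and output captured to the debug log mentioned earlier in the thread might look like this; the paths are this thread's, so adjust them to yours:

```shell
# Same jobs, but with an absolute php path and stdout/stderr captured,
# so failures under cron become visible in the debug log.
*/5 * * * * /usr/bin/php -f /home/www/www.hephoto.ch/cloud/cron.php
34 2 * * * /usr/local/bin/nextcloud-file-sync.sh >> /tmp/nextcloud_rescan_debug.log 2>&1
```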
Here you can see that user/files/Externals/Yandex is actually being scanned and, by the way, it has been hanging since May 22 (today is the 25th of May)…
When I kill it, it causes: