I’m sure this will be easy, but I just can’t see how to do it.
We have a web server that generates files from various data. The files are put into a big directory structure. I would like to sync that directory structure to a directory on a separate Nextcloud server. From there, users are given access to various directories through shares, and that part I understand.
What I am not sure about is how to sync the directory structure on our headless CentOS web server to the remote Nextcloud server. Is there a PHP script I can use to do that? Or would I need to write my own script that pushes files over via WebDAV? Or is there another API I should use? Or is there a way to do it by installing Nextcloud at both ends and having them regularly sync to each other?
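For reference, the raw WebDAV route I’m picturing looks roughly like this (the host, the nc_user account, the app password and all paths are placeholders I’ve made up; the endpoint under remote.php/dav/files/ is the standard Nextcloud files API, as far as I know):

```bash
# Create the remote directory (WebDAV MKCOL), then upload one file with PUT.
# Host, credentials and paths below are placeholders.
curl -u nc_user:app-password -X MKCOL \
    "https://cloud.example.com/remote.php/dav/files/nc_user/generated"

curl -u nc_user:app-password -T /var/www/output/report.pdf \
    "https://cloud.example.com/remote.php/dav/files/nc_user/generated/report.pdf"
```

That works file by file, but it is exactly the hand-rolled syncing code I would rather not have to maintain.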
Ideally I would not have to create my own syncing code - I would just like to point some tool at the local directory and the Nextcloud account and say, “sync ’em up, please”. Running cron to do a sync catch-up once an hour would be sufficient if that was the best approach.
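Something as simple as this hourly cron job is what I am hoping exists (a rough sketch only: I am assuming nextcloudcmd from the desktop client package runs headless, and the account, paths and URL are placeholders; I gather it syncs the local directory against the account both ways, so it may need narrowing):

```bash
#!/bin/sh
# sync-to-nextcloud.sh - hypothetical hourly catch-up, run from cron:
#   0 * * * * /usr/local/bin/sync-to-nextcloud.sh
# nc_user, the password file, the local path and the URL are all placeholders.
nextcloudcmd --non-interactive \
    -u nc_user -p "$(cat /etc/nextcloud-sync.pass)" \
    /var/www/output https://cloud.example.com
```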
Thanks
Edit: if I understand correctly, I can put files into the data area for a user and then rescan that user for the new files to be recognised. That would work for a local Nextcloud. However, I do want to get the files physically onto the remote server. Maybe cron running rsync to pull the [changed] files over, then an occ files:scan to pick up any changes? Or maybe it’s better to drop files into an “external local storage” directory if I’m moving them around from outside the application? So many options! What would be the best approach? A sketch of what I’m imagining follows.
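In case it helps, this is the rsync-plus-rescan version I have in mind, run hourly from cron on the Nextcloud server (a sketch only: webserver.example.com, nc_user and all the paths are placeholders, and I am assuming apache is the web server user on CentOS):

```bash
#!/bin/sh
# Pull changed files from the web server into one user's data area,
# then run occ files:scan so Nextcloud's database notices them.
# Host, user and paths below are placeholders.
rsync -a --delete webserver.example.com:/var/www/output/ \
    /var/nextcloud/data/nc_user/files/generated/

# Files written straight into the data directory must be readable by
# the web server user before the scan will register them.
chown -R apache:apache /var/nextcloud/data/nc_user/files/generated
sudo -u apache php /var/www/nextcloud/occ files:scan --path="/nc_user/files/generated"
```

With the “external local storage” variant I would rsync into a directory outside the data folder instead; as I understand it, external storage is meant for files that change outside Nextcloud, which might avoid the rescan step entirely.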