Adding files via script

Hello guys.
First of all: my Nextcloud instance is doing well.
Another server of mine is generating files of various types. I am now looking for a way to bring those files into Nextcloud via FTP, SSH or something similar. Furthermore, the script should work there with root-like rights in order to, for example, change or delete files, and so on. I think I know what I am looking for. Can you help me?

Greetings
Niffecs

WebDAV is the way to go. You could use something like rclone https://rclone.org/

I would not recommend using other protocols, because if you copy files to your Nextcloud server in any way other than via WebDAV, the web UI or the official desktop clients, Nextcloud will not recognize them and you will have to scan them manually using occ files:scan.
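
For example, a minimal rclone setup could look like the sketch below. The remote name nc, the URL, the user and the app password are all placeholders you would replace with your own values; the key=value form of rclone config create needs a reasonably recent rclone.

# create a WebDAV remote named "nc" pointing at your Nextcloud
# (--obscure lets you pass the app password in plain form here)
rclone config create nc webdav \
    url=https://cloud.example.com/remote.php/dav/files/USER \
    vendor=nextcloud user=USER pass=APP_PASSWORD --obscure

# copy everything from the local output directory into a Nextcloud folder
rclone copy /path/to/generated/files nc:Target/Dir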

Another option would be to mount a local folder outside of the Nextcloud data folder into your Nextcloud via the External Storage app, but again I would only recommend this if there is really no other way, and if you are aware of the possible security implications. Should you decide to do so, try to use a secure protocol, e.g. SFTP with key authentication, especially if the connection is made over the public Internet.


@Niffecs
Nextcloud can do a lot. But for some things, like making coffee :coffee:, or even for your idea, Nextcloud is less suitable. Why do you really want to have the files in your Nextcloud? Why don’t you make the files available to the users in another way? Explain the goal of your plan.


If that’s all you need, nothing could be easier!

Let’s take SSH. One could do something like this:

server1: hosts Nextcloud

server2: produces files and wants to copy them to server1

Set up SSH PubkeyAuthentication

First of all, SSH public key authentication must be configured and working.
In this example, for the sake of simplicity, I assume that you are root on both sides.

/etc/ssh/sshd_config:

PubkeyAuthentication yes

etc…
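
A quick sketch of the key setup on server2 (key type and file names are just examples; adapt the user if you don’t run this as root):

# on server2: generate a key pair without a passphrase, since a script will use it
ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519 -N ""

# install the public key on server1 (asks for the password one last time)
ssh-copy-id -i ~/.ssh/id_ed25519.pub root@server1

# verify that the login now works without a password prompt
ssh root@server1 true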

The script on server2:

Now the elements of your script on server2:

# the file (or directory) you want to copy
copy_object="/path/to/file"

# in case it is a dir, set recursive
[ -d "$copy_object" ] && scp_opt="-r" || scp_opt=""

# the Nextcloud user "UserID" who should receive the files
nc_user="UserID"

# and the directory inside that user's files where they should end up
target_dir="desired/target/dir/"

# how occ has to be invoked on the remote machine
# the webserver user on server1
ht_user="www-data"
# the webserver group on server1
ht_group="$ht_user"
# change path accordingly:
occ_call="sudo -u $ht_user php -f /var/www/nextcloud/occ"

# where the files land inside the Nextcloud data directory
dest="/path/to/nextcloud/data/${nc_user}/files/$target_dir"

# the main work:
# note: $scp_opt must stay unquoted so an empty value simply disappears
scp $scp_opt "$copy_object" "server1:$dest"
# we chown recursively because it is cheap ;) :
ssh server1 "chown -R $ht_user:$ht_group '$(dirname "$dest")'"
# files:scan only the added files:
ssh server1 "$occ_call files:scan --path='/${nc_user}/files/$target_dir'"

Based on these simple building blocks, it’s easy to create a script just the way you want it.

There are of course many other ways to do this, but this is probably the easiest one can think of. It works virtually “out of the box” :sunglasses:

Please consider that if you have SSH (port 22) open, you are responsible for the security of your servers and have to ensure adequate protection!

Just my 2 cents
I hope I could help

Even though I have created this post with the greatest possible care, I know for certain that I (as usual) made at least some small mistakes. If you find any inaccuracies, please point them out to me; I will correct them immediately if possible, or your comment will serve as the correction.

Happy hacking

I also would not recommend opening the SSH port, adding files and scanning them afterwards.

Use WebDAV instead.

An example of how to upload using curl can be found here:

I use it in many, many scripts and it is reliable. My example only covers adding files, but it can be extended.
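
For reference, a minimal sketch of such an upload; host, user, app password and paths are placeholders:

# create the target folder (WebDAV MKCOL); ignore the error if it already exists
curl -u USER:APP_PASSWORD -X MKCOL \
    "https://cloud.example.com/remote.php/dav/files/USER/Target/Dir"

# upload one file into that folder
curl -u USER:APP_PASSWORD -T /path/to/file.txt \
    "https://cloud.example.com/remote.php/dav/files/USER/Target/Dir/file.txt"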

You also? Who else? :wink:


There is no rule of thumb one can recommend as to which way is best.

In general, scp is faster than using curl with WebDAV for file transfers.

scp is optimized for speed and efficiency and can take advantage of compression to reduce the amount of data transmitted over the network. For resuming interrupted transfers, which can save time when transferring large files, rsync over SSH is the tool of choice (plain scp cannot resume).

On the other hand, while it is possible to use curl with WebDAV to transfer files, this approach may be slower than using scp. While curl can transfer large files, it is not as optimized for bulk file transfers as scp. scp is specifically designed for efficient and secure file transfers over a network, whereas curl is a more general-purpose tool for transferring data over various protocols, including HTTP, FTP and others.
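
To illustrate the point with concrete flags (host and paths are just examples):

# scp with compression enabled (-C); add -r for directories
scp -C /path/to/largefile server1:/target/dir/

# resuming interrupted transfers is rsync's domain (--partial keeps partial files)
rsync -az --partial /path/to/files/ server1:/target/dir/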

As for authentication, public key authentication with SSH is a more secure solution than storing cleartext passwords in an old-fashioned .netrc file. While the .netrc file is still supported by many tools, it is considered a security risk because it stores credentials in plain text. Public key authentication uses cryptographic keys to authenticate users and does not expose passwords to potential attackers.
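
For comparison, this is what the .netrc approach looks like, i.e. exactly the kind of cleartext storage to avoid on shared systems (host and credentials are placeholders):

# a .netrc entry is plain text, hence the risk:
cat > ~/.netrc <<'EOF'
machine cloud.example.com
login USER
password CLEARTEXT_PASSWORD
EOF
chmod 600 ~/.netrc

# curl picks it up when invoked with -n / --netrc
curl -n -T /path/to/file.txt \
    "https://cloud.example.com/remote.php/dav/files/USER/Target/Dir/file.txt"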

That being said, the actual transfer speed depends on many factors, including the size of the files, the network speed and latency, the processing power of the servers, and the configuration of the transfer tools. I would recommend testing both methods in your specific environment to determine which one performs better for your use case.

If your hosts are on the same network, opening port 22 for SSH access is not an issue, since you may have it open anyhow for your normal shell access.

peace :heart:, and
Happy hacking


On the other hand, do you really have to upload large batches of files to your server so often that speed really matters, and do they actually have to be accessible from your Nextcloud instance? Don’t get me wrong, there might be edge cases where this is needed and speed could be an issue. But I guess more often people just want to use Nextcloud for everything, because it’s already there, right?

I would argue that in many of these cases, some generic storage solution with a simple web front end, e.g. a Synology DiskStation or even a simple Linux server with some Samba or NFS shares, would be better suited than Nextcloud. Nextcloud is many things, but it’s not meant to be a generic storage backend for whatever arbitrary application or use case people might have. There are probably better solutions out there for most of these use cases.

Just my 5 cents on this topic :wink:

Why not let the user decide what to use it for? If it suits their needs, it is “meant” to be the solution. It is cloud software, not a religion :wink:

This is what the original poster was looking for:

… and I want him to know more precisely what he is looking for. Let him decide.

I can imagine an example like this: different servers create content, say log files, and he wants to gather them together in a folder somewhere in his world-reachable cloud, with all the advantages of Nextcloud.

Sure, but if there are easier solutions, Nextcloud would actually be the religion instead of the solution. :wink:

Also, I just wanted to share my general thoughts. I like easy and simple solutions, and therefore I tend to recommend solutions that deviate as little as possible from the intended use of a product, the documentation, the officially supported interfaces etc.

Well, but then speed is hardly an issue, and rclone via WebDAV might be just perfect for that use case: more secure than some curl script with the password in clear text in it, and better than running occ files:scan every time a file gets added. :wink:

You are thinking of
occ files:scan $userid
which is a long “expensive” scan

I wrote:

which is extremely fast (“cheap”).

Look for the -p, --path=PATH option in the help:

:~# occ files:scan --help
Description:
  rescan filesystem

Usage:
  files:scan [options] [--] [<user_id>...]

Arguments:
  user_id                  will rescan all files of the given user(s)

Options:
      --output[=OUTPUT]    Output format (plain, json or json_pretty, default is plain) [default: "plain"]
  -p, --path=PATH          limit rescan to this path, eg. --path="/alice/files/Music", the user_id is determined by the path and the user_id parameter and --all are ignored
      --generate-metadata  Generate metadata for all scanned files
      --all                will rescan all files of all known users
      --unscanned          only scan files which are marked as not fully scanned
      --shallow            do not scan folders recursively
      --home-only          only scan the home storage, ignoring any mounted external storage or share
  -h, --help               Display this help message
  -q, --quiet              Do not output any message
  -V, --version            Display this application version
      --ansi               Force ANSI output
      --no-ansi            Disable ANSI output
  -n, --no-interaction     Do not ask any interactive question
      --no-warnings        Skip global warnings, show command output only
  -v|vv|vvv, --verbose     Increase the verbosity of messages: 1 for normal output, 2 for more verbose output and 3 for debug

When uploading several files into one directory, you can use --unscanned --shallow, which is fast as well.
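
Assuming occ lives in /var/www/nextcloud, the two variants could be invoked like this:

# scan only the path that just received new files ("cheap")
sudo -u www-data php -f /var/www/nextcloud/occ \
    files:scan --path="/UserID/files/desired/target/dir"

# or: scan only files not yet known to Nextcloud, without recursing into subfolders
sudo -u www-data php -f /var/www/nextcloud/occ \
    files:scan --unscanned --shallow UserID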

OK, thanks. That actually sounds like a viable solution.

But wouldn’t you have to connect as root or as the webserver user, or at least as a user added to the webserver user’s group, in order to be able to execute the occ command remotely? Not sure if I would feel comfortable with that…

But yes, it’s actually a really good solution, certainly better than using the External Storage app to mount a local folder (or literally to mount anything). That should be avoided at all costs… But I’m starting to ramble… :wink:


You can rest assured that, as a former software engineer, once I give an answer, I have thought about it very carefully and tried it out. It may be worthwhile to read my proposed solutions at least very carefully.

peace :heart: and
happy hacking


Yeah I got it :stuck_out_tongue_winking_eye:

And yes, my answer may not have fit 100% here. (Note to self: I should read posts more carefully before I answer.) Nevertheless, I stand by my opinion (even if it may not apply here). People want to “abuse” Nextcloud for all kinds of things after they just managed to copy&paste it together somehow, and then they run into all kinds of issues…

If I want a media streaming service, I use Jellyfin, Plex or Emby.

If I want a mail server, I put Postfix, Dovecot etc. on a separate server, or I use something like Mailcow, Mail-in-a-Box etc.

If I want to synchronize files for some specific software, I use the Sync Client.

If I want to upload something to Nextcloud, I use WebDAV.

If I want to host a website, I use a separate server and some minimal CMS.

If I want to host a wiki, a photo album… (although Nextcloud is getting better at this…)

If I want a replacement for Google Workspace or M365, I use Nextcloud! :smiley:

But yes, in this thread my rant was probably out of place :wink:


Here is my little contribution.
Context: I take a picture via a camera every hour (a report) and push it to a folder on my Nextcloud, organized per month (not per day, but that can be done if needed).

#!/bin/bash
wget "http://127.0.0.1:8888/?action=snapshot" -O "/path/$(date +%Y-%m-%d-%H-%M-%S).jpeg" \
  && sleep 2 \
  && curl -u useronnextcloud:passwordofuser -T /path/*.jpeg "https://example.com/remote.php/dav/files/USER/webdavpath/$(date '+%Y-%m')/" \
  && mv /path/*.jpeg /path/bak/

To explain :slight_smile:
wget takes a snapshot from my camera and saves it to /path/ with a date-based name ending in .jpeg;
curl, with authentication, puts the file into the selected folder for this user;
and afterwards I move the file locally to a backup folder (in case there is no internet connection).

No need to occ-scan the user’s files, etc.
This is “normal” use of the Files app in Nextcloud.
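
If it helps, the hourly schedule can come from cron; the script path here is just an assumed example:

# crontab -e  -- run the snapshot/upload script at the top of every hour
0 * * * * /usr/local/bin/camera-upload.sh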


Btw, if you have any devices which do not support WebDAV and on which you can’t run any scripts, I found a very slick solution to upload files automatically to Nextcloud without having to run occ files:scan afterwards:

TL;DR: Set up a separate server / VM and install rclone and the inotify-tools on it. inotifywait can then monitor one or multiple folders, and as soon as a file is added to one of these folders, it can trigger rclone, which then uploads the file via WebDAV to your Nextcloud.
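
A rough sketch of such a watcher, assuming an rclone WebDAV remote named nc is already configured; the folder names are placeholders:

#!/bin/bash
# watch the hotfolder and upload every file that finishes writing
inotifywait -m -e close_write --format '%w%f' /path/to/hotfolder |
while read -r file; do
    rclone copy "$file" nc:Target/Dir/ \
        && mv "$file" /path/to/done/   # keep the local copy outside the watched folder
done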

A nice blog, using a scanner:
https://thesmarthomejourney.com/2021/12/04/automatic-upload-to-nextcloud/#autoupload

GitHub repo with the script used in the above post:
https://github.com/OliverHi/autouploaded
