Good day,
I am running Nextcloud in a Debian container and it works very well.
However, I would also like to access the data directory directly via Samba: Nextcloud serves the files to the internet, and I could reach the same data directly over my private network.
I think it works, as I have configured
'filesystem_check_changes' => 1,
in config.php, but I would still like to ask whether this is an accepted practice or something I should avoid.
The idea is: I put files directly into the data folder (e.g. via Samba), bypassing Nextcloud, and the files then appear in Nextcloud and can be accessed. Is this an OK practice?
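For reference, the setting mentioned above lives in config.php. A minimal excerpt (the surrounding settings are placeholders; only the last line is the relevant one, where 0 means never check and 1 means check once per direct file access):

```php
<?php
// config/config.php (excerpt)
$CONFIG = array (
  // ... your existing settings ...

  // Check the filesystem for changes made outside of Nextcloud.
  // 0 = never (default, fastest), 1 = once per direct access
  // (slower, but picks up files written by Samba, rsync, etc.).
  'filesystem_check_changes' => 1,
);
```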
Why do you want to do it like this? Do you think you gain performance? Do you know that you can use WebDAV to access your files, too? That would be the more Nextcloud-y way. I am not sure the Samba approach will work well with Nextcloud: the files themselves are one thing, but the metadata is another. The metadata is stored in the Nextcloud database, so I would expect issues with this setup…
Not really sure; the feature seems to have been around for a long time judging by the logs, but the recommendation has always been to leave the native Nextcloud storage to itself and not make any manual changes to it. I am not sure whether it is just a performance concern, or whether some applications and apps don't handle such modifications (e.g. you directly delete files that are shared by link, etc.).
For such use cases, it is recommended to use the external storage feature, where Nextcloud expects other processes to have access. If you use Samba heavily, you can even use it as your primary storage and hook up each user's folder. Unfortunately, that is not well documented…
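If you want to try the external storage route, a rough sketch of the occ commands follows. The hostname nas.local and the share name media are made up, and the exact backend/option names may differ between Nextcloud versions, so check occ files_external:backends on your install first:

```shell
# Run as the web server user from the Nextcloud installation directory.
# Enable the External storage app:
sudo -u www-data php occ app:enable files_external

# Mount //nas.local/media at /media inside Nextcloud, prompting each
# user for their own SMB credentials:
sudo -u www-data php occ files_external:create /media smb password::logincredentials \
    -c host=nas.local -c share=media

# Verify the mount was created:
sudo -u www-data php occ files_external:list
```

With external storage, Nextcloud expects the share to change underneath it, so files added directly via Samba show up without any manual rescans.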
Thanks for your hints.
In this case, since directly accessing the data directory is not recommended, I will leave it alone and try to use the web interface only.
However, there is one drawback. For instance, when I upload a large folder over the LAN, containing several hundred files, I often get random errors and have to retry the upload.
For this reason, I wanted to upload the folder directly, accessing the data directory via Samba/CIFS.
It is your life, your data, and your choice. Unless others are actually doing this themselves, feel free to disregard their suggestions while finding how to make this work in the manner you prefer.
Nothing further to offer you besides a hearty good luck.
I will do as recommended and use the web interface only. I thought I could shorten things a bit by accessing the data directory directly, but I understand that is not recommended, so I won't do it.
You can try to chase those errors down; they could be related to file locking or to database tuning to improve performance.
One alternative for many files might be WebDAV; you can even use WinSCP on Windows. I'm not sure about speed, but you can probably upload a whole folder more easily and compare the contents afterwards.
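For scripted uploads, plain curl against the WebDAV endpoint also works. The server name, user, and file names below are placeholders; an app password (Settings → Security) is safer than your login password:

```shell
# Upload a single file into alice's root folder via WebDAV:
curl -u alice:app-password -T ./report.pdf \
    "https://cloud.example.com/remote.php/dav/files/alice/report.pdf"

# Directories must be created with MKCOL before uploading into them:
curl -u alice:app-password -X MKCOL \
    "https://cloud.example.com/remote.php/dav/files/alice/photos"
```

For whole folder trees, a graphical client like WinSCP is more convenient, since it issues the MKCOL requests for you while recursing.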
You can copy files into data using any protocol you like. I would avoid Samba from a security point of view and use SFTP/SSHFS or rsync.
After copying, fix the permissions of the files and folders and run occ files:scan --all
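Put together, the whole workflow might look like this. The paths, host, and user name are assumptions for a typical Debian install under /var/www/nextcloud; adjust them to yours:

```shell
# 1. Copy the folder over SSH instead of Samba:
rsync -av ./big-folder/ root@server:/var/www/nextcloud/data/alice/files/big-folder/

# 2. Give the web server user ownership of the new files:
ssh root@server 'chown -R www-data:www-data /var/www/nextcloud/data/alice/files/big-folder'

# 3. Make Nextcloud register the files in its database
#    (files:scan alice is faster than --all if only one user changed):
ssh root@server 'cd /var/www/nextcloud && sudo -u www-data php occ files:scan alice'
```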
In general: Nextcloud will struggle with folders containing hundreds of subfolders and millions of files unless you have a hell of a machine.
I came at this problem from the other direction.
I started with a set of folders and files that I accessed locally via an SMB share.
When I added Nextcloud, the challenge was to include these files within Nextcloud, which I did as an "external storage" (it required some admin steps).
I think it should be possible to mount the folder structure within NC.
I prefer to keep applications and data separate. NC has proven to be a bit fragile in a TrueNAS Core environment; I'm hoping NC on TrueNAS Scale will be more robust.