I use Nextcloud to exchange large amounts of video footage with clients (100s of GBs per project) and I still find it relatively difficult to come up with a clean workflow.
WebDAV should be the way to go, but the hosting provider (Hetzner) caps the file size at 10 GB, which is nowhere near enough for me.
The Nextcloud client doesn't work "headless" enough, as I often get error messages. If anything goes wrong during the sync process, it just gives up and stops syncing until I click "sync now" again. I also strongly dislike that if I delete a file on the synced Nextcloud instance, it automatically gets deleted in my local folder as well.
This is a recipe for disaster in my daily work!
My current workflow depends on an Ubuntu VM that mounts my local storage server via SMB and uploads the data through the web UI. This works fairly reliably until any kind of error occurs during an upload. If I retry the upload, I get the picker asking which version I want to keep, local or Nextcloud.
Unfortunately, this does not work recursively through subfolders. That is a big deal for me: if even 1% of a single video file is missing, the whole file becomes unreadable. Data loss is a big no-no in my line of work; reshoots are often impossible.
Am I missing something? Can I get a file-level comparison when uploading via the web UI, like the "merge" option when copying under Windows/macOS?
If it is important to you, have you asked them whether they offer options or other packages that allow larger file sizes?
The desktop client uses chunked uploads, which should let you get around this limit. If you have the error messages, perhaps there is a way to debug this further and make this option work for you.
Chunking uses 10 MB chunks by default, and the protocol limits an upload to 10000 chunks, so at 10 MB × 10000 = 100 GB per file you run into a problem. For large files and fast connections, a larger chunk size can be worthwhile, so you might try changing this (the chunkSize variable is the one you'd want to change).
That sounds really bad; you should have something that works better. Or, if you know you don't need the two-way sync, how about an automated upload procedure, with a script using the chunking protocol? A sketch of what that could look like is below.
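For illustration, here is a minimal Python sketch of such a script against Nextcloud's chunked-upload WebDAV endpoint (create an upload collection, PUT numbered chunks, then MOVE the assembled file into place). The server URL, user, app password, paths, and the 100 MB chunk size are placeholder assumptions; retry logic and resumption are left out.

```python
import uuid
import requests

# --- placeholders: adjust for your instance ---
BASE = "https://cloud.example.com"
USER = "youruser"
APP_PASSWORD = "app-password-here"   # use an app password, not your login password
CHUNK_SIZE = 100 * 1024 * 1024       # 100 MB instead of the 10 MB default

def chunked_upload(local_path: str, remote_path: str) -> None:
    """Upload one large file via Nextcloud's chunked-upload WebDAV API."""
    auth = (USER, APP_PASSWORD)
    upload_id = f"upload-{uuid.uuid4()}"
    upload_url = f"{BASE}/remote.php/dav/uploads/{USER}/{upload_id}"
    # note: remote_path with spaces/special characters would need URL-encoding
    dest_url = f"{BASE}/remote.php/dav/files/{USER}/{remote_path}"

    # 1. create a temporary upload collection for the chunks
    requests.request("MKCOL", upload_url, auth=auth).raise_for_status()

    # 2. PUT the chunks; zero-padded names so the server assembles them in order
    with open(local_path, "rb") as f:
        index = 0
        while chunk := f.read(CHUNK_SIZE):
            r = requests.put(f"{upload_url}/{index:05d}", data=chunk, auth=auth)
            r.raise_for_status()
            index += 1

    # 3. MOVE the assembled file to its final location
    requests.request(
        "MOVE",
        f"{upload_url}/.file",
        headers={"Destination": dest_url},
        auth=auth,
    ).raise_for_status()

if __name__ == "__main__":
    chunked_upload("/mnt/footage/project/video.mov", "Projects/video.mov")
```

Because the chunks are individual requests, a failed chunk can simply be retried in a loop without restarting the whole transfer, which is exactly what you'd want for multi-hundred-GB files.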
There is an app that gives you checksums of files: with that, you can compare the server-side checksum against a checksum of the local file to verify they are identical (a small local-side sketch follows).
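For the local side, a sketch like this (plain Python, hashlib) streams a large file through a hash without loading it into memory; you would then compare the digest against what the checksum app reports for the uploaded copy. The path is a placeholder, and the algorithm has to match whichever one the app uses.

```python
import hashlib

def file_checksum(path: str, algo: str = "sha256",
                  block_size: int = 8 * 1024 * 1024) -> str:
    """Stream a large file through a hash so memory use stays flat."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while block := f.read(block_size):
            h.update(block)
    return h.hexdigest()

if __name__ == "__main__":
    # compare this digest with the one reported for the file on Nextcloud
    print(file_checksum("/mnt/footage/project/video.mov"))
```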
I already spoke with Hetzner, and they don't offer a "file size upgrade" for this kind of service.
I didn't know there was a chunked file upload option via the API; it sounds like I could build a reliable upload solution for my use case with it. I will definitely read up on that and try to follow your idea of an automated upload procedure via script.