Guide is unclear on manual File chunk upload process

Hi,

I am facing issues attempting to upload a large file (>30GB) using the guidance available here in the official docs https://docs.nextcloud.com/server/latest/developer_manual/client_apis/WebDAV/chunking.html

I am able to chunk the file manually into 256MB pieces using the “split” utility on Linux; however, the instructions are unclear on how to reassemble these parts on the server side.
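For reference, this is roughly the command I use to create the chunks (the filename here is just an example):

# split a large file into 256MB chunks; the default chunk names (xaa, xab, ...) sort in upload order
split -b 256M bigfile.zip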

The instructions suggest naming the chunks based on byte positions, and then issuing this example command:

curl -X MOVE -u roeland:pass --header 'Destination:https://server/remote.php/dav/files/roeland/dest/file.zip' https://server/remote.php/dav/uploads/roeland/myapp-e1663913-4423-4efe-a9cd-26e7beeca3c0/.file

to concatenate the chunked file and move it to the location given in the “Destination” header.
However, how is this supposed to work with the suggested naming scheme? How do I select all chunks of the file to be rejoined? The .file location cannot be found.

I’d be appreciative of any help, thank you.

Seems to work fine here.

In step 1 you create a unique folder with MKCOL.

In step 2 you upload the chunks into this folder.

In step 3 you MOVE it to the final destination.

Prerequisite (create a 20MB test file and split it into two chunks, xaa and xab):

dd if=/dev/zero of=20mb.txt bs=1M count=20 && split -n2 20mb.txt

  1. curl -X MKCOL -u username:password https://domain/remote.php/dav/uploads/username/UNIQ/

UNIQ can be anything you want; it is a temporary directory for your chunks.
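If you want something unique without thinking about it, uuidgen is one option (any unique string will do):

# generate a throwaway unique folder name for this upload (uuidgen ships with util-linux)
UNIQ="myapp-$(uuidgen)"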

2.1 curl -X PUT -u username:password https://domain/remote.php/dav/uploads/username/UNIQ/xaa --data-binary @xaa

2.2 curl -X PUT -u username:password https://domain/remote.php/dav/uploads/username/UNIQ/xab --data-binary @xab

Upload the chunks in any order you want; just make sure that when you do

ls | sort

the chunks come out in the order they should be assembled in.
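If you want to double-check what the server has received before the final MOVE, a PROPFIND on the UNIQ folder should list the uploaded chunks (the uploads endpoint behaves like a normal WebDAV collection, as far as I know):

curl -X PROPFIND -u username:password https://domain/remote.php/dav/uploads/username/UNIQ/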

  3. curl -X MOVE -u username:password --header 'Destination:https://domain/remote.php/dav/files/username/20mb.txt' https://domain/remote.php/dav/uploads/username/UNIQ/.file

Nextcloud will assemble all chunks in UNIQ in sort order and move the result to the destination 20mb.txt in the user's root folder.

.file is only a pointer to all files in UNIQ
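After the MOVE you can sanity-check the result with a HEAD request; the Content-Length should match the size of the original file (assuming your setup exposes it):

curl -I -u username:password https://domain/remote.php/dav/files/username/20mb.txt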

Setting variables and looping through the files.
Make sure your current working directory contains only the chunks:

TMPCOL="https://server/remote.php/dav/uploads/username/tmpuniq/"
USRPAS="username:password"
DEST="https://server/remote.php/dav/files/username/20mb.txt"

curl -X MKCOL -u "$USRPAS" "$TMPCOL"
for FILE in *; do curl -X PUT -u "$USRPAS" --data-binary "@$FILE" "$TMPCOL$FILE"; done
curl -X MOVE -u "$USRPAS" --header "Destination:$DEST" "${TMPCOL}.file"
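For a big source file like yours, the whole round trip could look something like this (paths and names are only examples; I have not tested this against a 30GB file):

# work in an empty directory so the loop only sees the chunks
mkdir /tmp/chunks && cd /tmp/chunks
split -b 256M /path/to/bigfile.zip
TMPCOL="https://server/remote.php/dav/uploads/username/bigfile-upload/"
USRPAS="username:password"
DEST="https://server/remote.php/dav/files/username/bigfile.zip"
curl -X MKCOL -u "$USRPAS" "$TMPCOL"
for FILE in *; do curl -X PUT -u "$USRPAS" --data-binary "@$FILE" "$TMPCOL$FILE"; done
curl -X MOVE -u "$USRPAS" --header "Destination:$DEST" "${TMPCOL}.file"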

Thanks Vincent! You’ve solved my headache!

My issue was that I was using the wrong base URL for cURL: instead of putting the chunks under the “uploads” endpoint, I was putting them under “files”, as in the standard cURL upload procedure. One cannot initiate the merge from

<server>/remote.php/dav/files/<userid>
to
<server>/remote.php/dav/files/<userid>

it has to be from:

<server>/remote.php/dav/uploads/<userid>
to
<server>/remote.php/dav/files/<userid>

I don’t know how many times I read the docs and I still missed it!