Big 2GB+ files duplicating during rebuild after upload

I’m running Nextcloud on a Raspberry Pi 2 with the NextcloudPi distro. It’s a 32-bit system, so files over 2 GB are split into 100 MB chunks during upload; afterwards they are rebuilt into a single file and the chunks are deleted. However, after enough time, while the file is being rebuilt, my desktop client gives me an error message that says:

... "503 Service Unavailable" to "MOVE https://my_domain/remote.php/dav/uploads/my_user/file_id/.file"

An SSH session reveals that even after this error message, the rebuild continues in the background, and it will complete if given enough time, at which point everything is fine. However, usually after this error a second copy of the file also starts being created in the same directory, but with a different transfer ID. Essentially my Raspberry Pi needlessly splits its resources in half to build two copies of the same file. After the bigger copy that started first finishes, the other copy is deleted along with the 100 MB chunks.

Does anyone know how to fix this? It means I need more than twice the file’s size in free space when uploading anything over 2 GB. And for all I know, a file much larger than 2 GB might spawn more than one duplicate, requiring even more space and time to finish a single file. In the worst case, a big enough file might keep getting duplicated, splitting resources and taking longer and longer until it crashes my Raspberry Pi.


For long-term planning, I would say you definitely need to upgrade to a 64-bit system.

Otherwise, no clue, sorry.

This problem seems to occur because PHP cannot handle files larger than 2 GB on a 32-bit system.


Yes, that’s why they are split into many 100 MB chunks and then rebuilt. The problem happens during the rebuild process. The rebuild can still finish and the upload will be fine afterwards, but I suspect the problem could get worse depending on the size of the file.

The problem happens during the rebuild process.

What programs are you using to rebuild?
Actually it’s not only PHP: on a 32-bit system, every process built without large-file support has a 2 GB file-size limit, so you need to recompile your program with LFS options. I don’t think splitting into chunks helps by itself, because when the combined chunks grow past 2 GB, the rebuild will crash.

If you use the nextcloud/server package on the Raspberry Pi, you need to recompile PHP with LFS options. If you use the nextcloud/docker package, you need to rebuild the PHP container with LFS options and then rebuild the Nextcloud container.


nextcloud-snap had a 2 GB limit issue on the Raspberry Pi and the ODROID-XU4, and they fixed it. I guess ownCloud has the same problem, so check that issue.

I hope this helps.

I tried going with the Nextcloud snap before, but it didn’t help. Now I’m using the NextcloudPi distro for the Raspberry Pi 2, but I don’t know which processes it uses to split and rebuild the file. All I know is that every file over 2 GB is split into 100 MB chunks in the user/uploads/file_id directory while it’s being uploaded, and afterwards it starts being rebuilt into the destination directory.