Can't upload files bigger than 2 GB via the web interface

Nextcloud version (eg, 10.0.2): 12.0.2
Operating system and version (eg, Ubuntu 16.04): Ubuntu Mate 16.04
Apache or nginx version (eg, Apache 2.4.25): (where can I see this?)
PHP version (eg, 5.6): 7.0.22
Is this the first time you’ve seen this error?: no

Can you reliably replicate it? (If so, please outline steps):
Upload a file bigger than 2GB (host: Raspberry pi 3)

The issue you are facing:
I can't upload files bigger than 2 GB (my file is 3.4 GB). It always shows “Bad Request”.

The output of your Nextcloud log in Admin > Logging:
Fatal webdav Sabre\DAV\Exception\BadRequest: expected filesize 3381661144 got 2147483648 2017-08-18T20:55:22+0200

The output of your config.php file in /path/to/nextcloud (make sure you remove any identifiable information!):

<?php
$CONFIG = array (
  'filelocking.enabled' => true,
  'memcache.distributed' => '\OC\Memcache\Redis',
  'memcache.locking' => '\OC\Memcache\Redis',
  'memcache.local' => '\OC\Memcache\APCu',
  'redis' => array (
    'host' => 'localhost',
    'port' => 6379,
  ),
  'instanceid' => '',
  'passwordsalt' => '',
  'secret' => '',
  'trusted_domains' => array (
    0 => '',
    1 => '',
  ),
  'datadirectory' => '/var/www/nextcloud/data',
  'overwrite.cli.url' => '',
  'dbtype' => 'mysql',
  'version' => '12.0.2.0',
  'dbname' => '',
  'dbhost' => 'localhost',
  'dbport' => '',
  'dbtableprefix' => 'oc_',
  'dbuser' => '',
  'dbpassword' => '',
  'logtimezone' => 'UTC',
  'installed' => true,
  'updater.release.channel' => 'stable',
  'maintenance' => false,
  'theme' => '',
  'loglevel' => 2,
  'updater.secret' => '',
);

That's a limit of 32-bit PHP.

The RPi3 is theoretically 64-bit, but there are not many 64-bit operating systems for it with 64-bit PHP.
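By the way, the “got 2147483648” in your log is exactly 2^31 bytes, the point where a signed 32-bit integer overflows. You can check which integer size your PHP build uses with a tiny snippet like this (run it with the same PHP that serves Nextcloud):

<?php
// On 32-bit PHP: PHP_INT_SIZE is 4 and PHP_INT_MAX is 2147483647 (2^31 - 1),
// so file sizes above roughly 2 GB cannot be represented as integers.
// On 64-bit PHP: PHP_INT_SIZE is 8 and PHP_INT_MAX is 9223372036854775807.
echo 'PHP_INT_SIZE: ' . PHP_INT_SIZE . " bytes\n";
echo 'PHP_INT_MAX:  ' . PHP_INT_MAX . "\n";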

“The RPi3 is theoretically 64-bit, but there are not many 64-bit operating systems for it with 64-bit PHP.”
I know, but I heard the Pi isn't that good with 64-bit because of the processor or something like that? :confused:
Which 64-bit Raspberry Pi operating systems are there? openSUSE?

Is there no way to upload more than 2 GB? What if we split the 3 GB into 1 GB chunks? :exploding_head:

And why does uploading the 3 GB file work with the (Linux) client? :thinking:

And if it doesn't work (more than 2 GB on 32-bit), why does the web interface let me start the upload at all?

Usually 64-bit only becomes interesting on larger servers with a lot of RAM (more than 4 GB). You could create chunks; perhaps something like that could be integrated into the web interface…

The client uploads chunks (each far below 2 GB), and they are merged into a single file on the server afterwards.
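Roughly, it works like this over WebDAV (just a sketch, not the real client code; it assumes the chunked-upload endpoints under remote.php/dav/uploads/, and the server URL, user name, password and file names below are made up):

<?php
// Sketch of a chunked upload to Nextcloud over WebDAV.
$server    = 'https://cloud.example.com';
$user      = 'alice';
$pass      = 'app-password';
$source    = '/home/alice/big-file.iso';   // 3.4 GB local file
$target    = 'big-file.iso';               // destination name in Nextcloud
$chunkSize = 10 * 1024 * 1024;             // 10 MB per chunk, far below 2 GB

// Small helper that sends one WebDAV request with HTTP basic auth.
function dav(string $method, string $url, array $extra = array()) {
    global $user, $pass;
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_CUSTOMREQUEST, $method);
    curl_setopt($ch, CURLOPT_USERPWD, "$user:$pass");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt_array($ch, $extra);
    curl_exec($ch);
    curl_close($ch);
}

$uploadId = 'web-' . uniqid();
$base     = "$server/remote.php/dav/uploads/$user/$uploadId";

// 1. Create a temporary upload folder for this transfer.
dav('MKCOL', $base);

// 2. Send the file in small pieces; chunk names are zero-padded so that
//    they sort in the order they belong in the final file.
$in = fopen($source, 'rb');
for ($i = 0; !feof($in); $i++) {
    $data = fread($in, $chunkSize);
    if ($data === false || $data === '') {
        break;
    }
    dav('PUT', sprintf('%s/%05d', $base, $i), array(CURLOPT_POSTFIELDS => $data));
}
fclose($in);

// 3. Ask the server to assemble the chunks into the final file.
dav('MOVE', "$base/.file", array(
    CURLOPT_HTTPHEADER => array("Destination: $server/remote.php/dav/files/$user/$target"),
));

Each PUT stays far below the 2 GB mark, which is why the desktop client can get the file onto the server even though one single big PUT from the web interface fails.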
