Maximum Upload Size

So, according to what you said, I should change the value in the admin panel, then chown the .user.ini to root:www-data?

I changed the max upload size in .user.ini yesterday, and I got a file integrity error.

This will let your Nextcloud modify .user.ini, so you will be able to change the file size in the admin panel:

chown www-data:www-data /var/www/nextcloud/.user.ini

This will not let Nextcloud modify the file:

chown root:www-data /var/www/nextcloud/.user.ini
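
Putting the two together, a minimal workflow sketch (paths and user/group taken from this thread; adjust them to your own install):

```shell
# Hand .user.ini to the web server so the admin panel can write to it
chown www-data:www-data /var/www/nextcloud/.user.ini
# ...now change the max upload size in the Nextcloud admin panel...
# Then lock the file down again so Nextcloud can no longer modify it
chown root:www-data /var/www/nextcloud/.user.ini
```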

Okay, I found out what I was doing wrong, but I can’t remember specifically which conf it was. So far I’ve changed the /etc/php/7.0 apache2, fpm and cli ones, the .htaccess in the Nextcloud root, and (for some stupid reason) created a .user.ini as well, despite it being for NGINX and not apache2.

The issue for me on the uploads was simply a syntactical one, the correct setting being 512M:

not 512MB, not Mb. It was simply M.

I am using 512M as one example; others would be 5120M. I will also try with G and GB etc. No need to actually let the whole file finish copying either; just drop a file in, and if it goes through, good stuff.
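
As a side note, PHP’s ini shorthand only understands the K, M and G suffixes, and they are binary multiples. GNU numfmt happens to use the same IEC meaning for those suffixes, so it makes a quick sanity check of what a shorthand value works out to in bytes:

```shell
# 512M in PHP ini shorthand means 512 * 1024 * 1024 bytes
numfmt --from=iec 512M   # prints 536870912
```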

Okay, so some settings appear to have reverted the next day and I’m back to an 8M upload limit. I’ve left it like that for the week due to work commitments, until I can find time to track down where the reset is occurring.

Does anyone have any input here? Maybe package updates to apache2 or php are setting me back?

Is there an option in the SQL database that manages file sizes at all? In my mind’s overly simplistic eye, that would be the place to manage the NC instance’s file sizes anyway, rather than the text files, but that’s not the issue here.

Did you succeed in getting it working? Here on apache2 it accepts my new limit in the NC settings (19.5G), but when I try to upload a file that is only 1.5G, it runs to the end and then gives a server error, and in the Apache error log I have this:
mod_fcgid: HTTP request length 1073750016 (so far) exceeds MaxRequestLen (1073741824)
I have added this to the fcgid module parameters:
FcgidMaxRequestLen 107374182400
but without much more success. NC tells me my limit is 19.5G for uploads, but the web server already refuses a 1.5G file.
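
For reference, the mod_fcgid limit from that error is controlled by FcgidMaxRequestLen. A sketch of the relevant Apache config (the file location varies by distribution, and the value below is an assumption sized for ~20 GiB uploads):

```apache
<IfModule mod_fcgid.c>
    # Must be at least as large as the biggest upload you expect;
    # the error above shows an effective limit of 1073741824 (1 GiB)
    FcgidMaxRequestLen 21474836480
</IfModule>
```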

Hi vincen,
did you already adjust the php.ini files accordingly?
You will find more information here at my blog.


post_max_size = 10240M
upload_max_filesize = 10240M

For NGINX (not familiar with apache2, sorry) further params exist:

post_max_size = 10240M
upload_max_filesize = 10240M
client_max_body_size 10240M;
fastcgi_param PHP_VALUE "upload_max_filesize = 10240M
post_max_size = 10240M";

cheers, carsten
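
For context, a sketch of where those NGINX directives live (the server/location structure below is an assumption, not from the original post):

```nginx
server {
    # NGINX-level request body limit
    client_max_body_size 10240M;

    location ~ \.php$ {
        # Pass the PHP limits straight to PHP-FPM
        fastcgi_param PHP_VALUE "upload_max_filesize = 10240M
            post_max_size = 10240M";
        # fastcgi_pass, fastcgi_index etc. as in your existing setup
    }
}
```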

I really cannot remember what it was. It was only occurring on Ubuntu, as I was ‘testing’ Ubuntu Server again. I moved on from it (everything just seemed slow to respond; I really did not get what I expected from it, to the point that using caching technologies like APCu and Redis was pointless, and this was on multiple different machines).

Sorry I can’t be of much use. What I do remember, I can guarantee, was not the fix.

@riegerCLOUD sorry for the late answer, but I had to focus on other things. I’m still having no luck with this and it’s driving me crazy!
I adjusted the global PHP configuration for my Apache and also the fcgid parameter as above, but I keep getting this error:

mod_fcgid: HTTP request length 1073750016 (so far) exceeds MaxRequestLen (1073741824)

Now the difference is that instead of getting a server error at the end of the upload, I get a notification in Nextcloud about a lost connection with the server… and nothing more in the Apache error log.

Update: I found this in the Nextcloud logs, so it looks like it uses the WebDAV process for file uploads, but how do you set that up? I found nothing in Nextcloud.

{"reqId":"ITy\/hBrXW7zRxpK\/3UPs","remoteAddr":"","app":"webdav","message":"Exception: {\"Message\":\"HTTP\\\/1.1 400 expected filesize 1207883929 got 524656640\",\"Exception\":\"Sabre\\\\DAV\\\\Exception\\\\BadRequest\",\"Code\":0,\"Trace\":\"#0 \\\/home\\\/titi\\\/domains\\\/\\\/public_html\\\/apps\\\/dav\\\/lib\\\/Connector\\\/Sabre\\\/Directory.php(137): OCA\\\\DAV\\\\Connector\\\\Sabre\\\\File->put(Resource id #127)\\n#1 \\\/home\\\/titi\\\/domains\\\/\\\/public_html\\\/3rdparty\\\/sabre\\\/dav\\\/lib\\\/DAV\\\/Server.php(1072): OCA\\\\DAV\\\\Connector\\\\Sabre\\\\Directory->createFile('AN1_h264-420_4K...', Resource id #127)\\n#2 \\\/home\\\/titi\\\/domains\\\/\\\/public_html\\\/3rdparty\\\/sabre\\\/dav\\\/lib\\\/DAV\\\/CorePlugin.php(525): Sabre\\\\DAV\\\\Server->createFile('AN1_h264-420_4K...', Resource id #127, NULL)\\n#3 [internal function]: Sabre\\\\DAV\\\\CorePlugin->httpPut(Object(Sabre\\\\HTTP\\\\Request), Object(Sabre\\\\HTTP\\\\Response))\\n#4 \\\/home\\\/titi\\\/domains\\\/\\\/public_html\\\/3rdparty\\\/sabre\\\/event\\\/lib\\\/EventEmitterTrait.php(105): call_user_func_array(Array, Array)\\n#5 \\\/home\\\/titi\\\/domains\\\/\\\/public_html\\\/3rdparty\\\/sabre\\\/dav\\\/lib\\\/DAV\\\/Server.php(479): Sabre\\\\Event\\\\EventEmitter->emit('method:PUT', Array)\\n#6 \\\/home\\\/titi\\\/domains\\\/\\\/public_html\\\/3rdparty\\\/sabre\\\/dav\\\/lib\\\/DAV\\\/Server.php(254): Sabre\\\\DAV\\\\Server->invokeMethod(Object(Sabre\\\\HTTP\\\\Request), Object(Sabre\\\\HTTP\\\\Response))\\n#7 \\\/home\\\/titi\\\/domains\\\/\\\/public_html\\\/apps\\\/dav\\\/appinfo\\\/v1\\\/webdav.php(60): Sabre\\\\DAV\\\\Server->exec()\\n#8 \\\/home\\\/titi\\\/domains\\\/\\\/public_html\\\/remote.php(165): require_once('\\\/home\\\/orn6hon4i...')\\n#9 
{main}\",\"File\":\"\\\/home\\\/titi\\\/domains\\\/\\\/public_html\\\/apps\\\/dav\\\/lib\\\/Connector\\\/Sabre\\\/File.php\",\"Line\":150,\"User\":\"vincen\"}","level":4,"time":"2017-06-06T12:19:54+00:00","method":"PUT","url":"\/remote.php\/webdav\/AN1_h264-420_4K_25_HQ.mp4","user":"vince","version":""}

So it looks like there is no way to get around the half-gig limit on uploads through the web, right?

Exactly the same problem here. Tried many ideas, but found no solution yet…

It seems to be an issue with the combination with mod_fcgid. I switched to php-fpm/mod_fastcgi and now it works like a charm. My php configuration is still untouched after this change…
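
For anyone wanting to try the same switch, a hedged sketch of the commands on Debian/Ubuntu (package and module names are assumptions; adjust the PHP version to yours):

```shell
# Install PHP-FPM and disable mod_fcgid in favour of proxy_fcgi
sudo apt install php7.0-fpm
sudo a2dismod fcgid
sudo a2enmod proxy_fcgi setenvif
sudo a2enconf php7.0-fpm
sudo systemctl restart apache2
```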

Hi, I had the same issue, thanks for the help. I’ve changed it in the .htaccess. Which size do you prefer for the max upload setting? What happens if you use Nextcloud at home and want to save complete system backups on it? Do I need to increase the values up to 1TB or something like that?

EDIT: I am using a NGINX container to serve the SSL certificate. NGINX apparently has a default limit of just over 1MB. You must define a max size in your nginx.conf file if you want anything other than that.

I am using Docker on CentOS 7. I have mariadb, nextcloud, and nginx containers. I have checked the host for the php.ini file (all the container files are stored in /var/lib/docker/volumes/), and on the host I only see /etc/php.ini. I changed the lines:

upload_max_filesize = 512M
post_max_size = 512M
memory_limit = 512M

After restarting all the containers, the Web Interface still only shows “2 MB”, and won’t save changes.

I also edited the following files and put 512M in them …

With no improvement… still stuck on “2 MB”.
And actually, anything over 1 MB hangs (so a 1.1 MB file hangs).

Add this to your nginx container:

client_max_body_size 1024M;

This allows a 1 GB upload; even when PHP can handle 512M of data, the server will otherwise cut the transmission at its default limit (1MB).
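
In a Docker setup, that means editing the nginx.conf the nginx container actually uses and restarting it. A minimal sketch (the context shown is an assumption; the directive is also valid in server and location blocks):

```nginx
# nginx.conf — http context
http {
    # Allow request bodies up to 1 GiB instead of the 1 MB default
    client_max_body_size 1024M;
}
```

Then restart the container, e.g. `docker restart nginx` (container name assumed).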


Hello, I have the same issue, with the error message “413 Request Entity Too Large”.
I changed the values in /etc/php/7.0/apache2/php.ini and /var/www/nextcloud/.htaccess:
post_max_size = 10240M
upload_max_filesize = 10240M
client_max_body_size 10240M;
I tried with M or 1G.
I restarted Apache, but I still can’t upload files bigger than 1M.

Hi, I’m asking for help! My situation is exactly the opposite: it is impossible to limit the size of uploaded files. Through the browser, any big file gets uploaded despite the 5M restriction set in php.ini, .htaccess and .user.ini. The configuration files are read correctly. The output of phpinfo shows:
PHP Version 5.6.35
Directive Local Value Master Value
default_charset UTF-8 UTF-8
file_uploads On On
memory_limit 512M 512M
post_max_size 5M 5M
upload_max_filesize 5M 5M
The changes are accepted and applied through the admin settings. I also tried making the changes directly in php.ini, .htaccess and .user.ini — nothing helps. On the same server a Moodle site works, and there all the restrictions apply correctly as specified in post_max_size and upload_max_filesize.
FreeBSD 11.1; Apache 2.4; Nextcloud 13.0.6; MySQL 5.6.39; PHP 5.6.35
Why does the file size restriction not work on Nextcloud? Where should I dig? It seems I have checked everything… (Translated into English with Google, so I apologize in advance for any mistakes.)

Make sure, if you are changing it in php -> apache -> php.ini, that you change both “memory_limit” and “post_max_size”,
where memory_limit must always be greater than post_max_size:

memory_limit = 3G
post_max_size = 2G

Then restart the Apache server,
and then go to the Nextcloud settings and change to the max size and/or save.
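
Put together as a config sketch (memory_limit and post_max_size are the values from this post; upload_max_filesize is added as an assumption to complete the chain):

```ini
; php.ini — each limit should be at least as large as the one below it
memory_limit = 3G        ; must exceed post_max_size
post_max_size = 2G       ; must be >= upload_max_filesize
upload_max_filesize = 2G
```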

The problem is that Nextcloud does not react at all to the size limits in the php.ini directives memory_limit = 512M and post_max_size = 128M; a file larger than 512M is easily uploaded. It also does not respond to changes to these directives in nextcloud/.htaccess and .user.ini; files of any size get uploaded. Apache was restarted after the changes in php.ini, the changes were saved in the Nextcloud settings, and there is no other php.ini file on the server.

Has anyone actually gotten to the root cause of this? I’m seeing that any file > 2MB fails to upload. I’ve checked my php.ini settings and I’ve got:
upload_max_filesize = 10G
post_max_size = 10G
(as reported by phpinfo(), so I know this is active for the web server)

In nginx I’ve set:
client_max_body_size 10240M;
and reloaded nginx though I’m unsure how to test it.

Finally, in Nextcloud 16.0.1 I can confirm that in Settings -> Monitoring, the PHP section shows a 1GB memory limit and a 10G max upload size, so my 3MB files should be uploading just fine.

I looked in the www user’s ~/.user.ini and it doesn’t have any settings related to restricting file size. I tried hacking a few settings in there, but none seemed to matter.

Anyone figure this out once and for all?

Never mind… I figured it out. I was running through a reverse proxy at my firewall, and I didn’t have that proxy’s nginx configured to allow large transfers. SOLVED.
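
In other words, every proxy layer in front of Nextcloud needs its own body size limit. If that front proxy is also nginx, the fix is the same directive again (the server_name and upstream below are assumptions for illustration):

```nginx
server {
    listen 443 ssl;
    server_name cloud.example.com;

    # Without this, the proxy itself rejects large uploads regardless
    # of what the Nextcloud backend allows
    client_max_body_size 10240M;

    location / {
        proxy_pass http://nextcloud-backend;
    }
}
```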