Chunked data upload fails (Updated): file limit <20 MB

The Basics

  • Nextcloud Server version (e.g., 29.x.x):
    • 33.0.0.16
  • Operating system and version (e.g., Ubuntu 24.04):
    • docker.io / library / nextcloud:33.0.0-apache
  • Web server and version (e.g., Apache 2.4.25):
    • Apache/2.4.66 (Debian)
  • Reverse proxy and version (e.g., nginx 1.27.2)
    • nginx version: nginx/1.28.0
  • PHP version (e.g., 8.3):
    • PHP 8.4.18 (cli)
  • Is this the first time you’ve seen this error? (Yes / No):
    • no
  • When did this problem seem to first start?
    • ~20.02.2026
  • Installation method (e.g. AIO, NCP, Bare Metal/Archive, etc.)
    • podman (docker)
  • Are you using Cloudflare, mod_security, or similar? (Yes / No)
    • No, on premise

Summary of the issue you are facing:

Uploading any file that is above the files.chunked_upload.max_size limit fails.

Steps to replicate it (hint: details matter!):

  1. Create a file in a synchronised folder,
    or
    upload the same file via the web interface
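To see which limit the server is actually enforcing, the setting can be read and changed via occ. A sketch, assuming the usual mapping where the files app setting max_chunk_size backs the files.chunked_upload.max_size capability (0 means unlimited):

```shell
# Read the configured maximum chunk size in bytes (empty or 0 = unlimited)
php occ config:app:get files max_chunk_size

# Example: raise it to 50 MB (52428800 bytes)
php occ config:app:set files max_chunk_size --value 52428800
```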

Log entries

Nextcloud

Please provide the log entries from your Nextcloud log that are generated during the time of problem (via the Copy raw option from Administration settings->Logging screen or from your nextcloud.log located in your data directory). Feel free to use a pastebin/gist service if necessary.

08:17:16||###.mp4|8|1|1771940297||528885013||12|You do not have access to this resource. If you think this is an error, please contact the server administrator.|403|0|0|a961112e-5465-4072-9c4b-a96faaa4719f|

Web server / Reverse Proxy

The output of your Apache/nginx/system log in /var/log/____:

I found no errors or other indications.

Configuration

Nextcloud

The output of occ config:list system or similar is best, but, if not possible, the contents of your config.php file from /path/to/nextcloud is fine (make sure to remove any identifiable information!):

<?php
$CONFIG = array (
  'htaccess.RewriteBase' => '/',
  'memcache.local' => '\\OC\\Memcache\\APCu',
  'apps_paths' =>
  array (
    0 =>
    array (
      'path' => '/var/www/html/apps',
      'url' => '/apps',
      'writable' => false,
    ),
    1 =>
    array (
      'path' => '/var/www/html/custom_apps',
      'url' => '/custom_apps',
      'writable' => true,
    ),
  ),
  'instanceid' => '###',
  'passwordsalt' => '###',
  'secret' => '###',
  'trusted_domains' =>
  array (
    0 => '127.0.0.1:80',
    1 => '###',
  ),
  'trusted_proxies' =>
  array (
    0 => '10.89.0.21',
  ),
  'datadirectory' => '/var/www/html/data',
  'dbtype' => 'mysql',
  'version' => '32.0.0.13',
  'overwrite.cli.url' => 'http://localhost:8008',
  'overwriteprotocol' => 'https',
  'dbname' => 'container-nextcloud',
  'dbhost' => 'container-nextcloud-db',
  'dbport' => '',
  'dbtableprefix' => 'oc_',
  'mysql.utf8mb4' => true,
  'dbuser' => '###',
  'dbpassword' => '###',
  'installed' => true,
  'defaultapp' => 'files',
  'default_language' => 'de',
  'skeletondirectory' => '',
  'maintenance' => false,
  'loglevel' => 0,
  'theme' => '',
  'mail_from_address' => '###',
  'mail_smtpmode' => 'smtp',
  'mail_sendmailmode' => 'smtp',
  'mail_domain' => '###',
  'maintenance_window_start' => 1,
  'mail_smtpauth' => 1,
  'mail_smtphost' => '###',
  'mail_smtpport' => '587',
  'mail_smtpname' => '###',
  'mail_smtppassword' => '###',
  'trashbin_retention_obligation' => '90,180',
  'trashbin:expire' => '',
  'versions_retention_obligation' => '90, 180',
  'app_install_overwrite' =>
  array (
  ),
  'default_phone_region' => 'DE',
);

Apps

The output of occ app:list (if possible).

I checked the parameters in the Nextcloud and nginx containers:
APACHE_BODY_LIMIT = 10 GB
upload_max_filesize = 10 GB
post_max_size = 10 GB
All time limits are unrestricted.
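One way to double-check that those limits are what PHP actually sees inside the container (the container name here is an assumption; adjust it, and use podman exec where applicable):

```shell
# Print the effective PHP upload limits from inside the running container
docker exec container-nextcloud php -r \
  'echo ini_get("upload_max_filesize"), " / ", ini_get("post_max_size"), PHP_EOL;'
```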

update: I did a lot more testing and checked every setting twice. I found that chunking files is not working in the updated container either. Now on 33.0.0; before it was 32.0.0 (32.0.6 was not tested, but I updated through it).

Please review the manual and the many similar topics tagged uploading_big_files.

Supposedly, adding:

maxChunkSize=50000000

to Windows client config file:

“~\AppData\Roaming\Nextcloud\nextcloud.cfg” (in the [General] section)

… helps with large file uploads on Windows. This is largely anecdotal evidence and it might be something else, but I have this added and I have no problems uploading even large ISO and video files, while I used to get errors uploading large files before.
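For reference, the resulting section of nextcloud.cfg would look like this (50000000 bytes ≈ 50 MB):

```
[General]
maxChunkSize=50000000
```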

My php limits are also set to 10GB like yours. I have APCu too, but I also have Redis and my instance is manual LAMP install, so not sure how that affects things.

Largest files in my Nextcloud dir are over 6GB. This seems to be Windows client issue as I never saw this with Linux clients during my sporadic Linux tests.

Do you have any non-Windows clients? Do they have this problem too? Can’t hurt to try.

Hello.

@wwe Thanks again. I checked the manual again, and everything is set. All checks pass: everything is configured for file uploads up to 10 GB, which is also shown in the web interface. Space is not an issue, it is a 64-bit system, and there is no problem with the upload itself: it fails after the file is uploaded, with a “file not found” permission error, which is odd since a smaller file in the same place works.

@AdamAnon Yes, I have read that as well and tested it, with no change. No, we do not have any clients other than Windows ones.

How can I check the permission issue in the container for chunking?
The error logs are not helping, and the error itself does not make any sense.

The error occurred after the upload finished and says it cannot find the file. There is no log message containing the file’s name at all.

The files in question are between 500MB and 600MB.

That, I don’t know as I never ran Nextcloud in a container and I’m unfamiliar with containers. I also don’t run any proxies. For manual Apache install you just need to make sure that the Nextcloud web and data directories are owned by user:group www-data:www-data and I set permissions to 775 for both.

Exactly, that’s what I would say. :smiling_face_with_sunglasses: Missing permissions.

sudo chown -R www-data:www-data /var/www/nextcloud/
sudo find /var/www/nextcloud/ -type d -exec chmod 750 {} \;
sudo find /var/www/nextcloud/ -type f -exec chmod 640 {} \;

More information here: #PERMISSIONS web server/user
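To check ownership rather than blindly re-chmod everything, something like this can be run first (the paths, user, and container name are assumptions; in the official container image the web root is /var/www/html and the web server user is www-data):

```shell
# List anything under the data directory that is NOT owned by www-data
docker exec container-nextcloud find /var/www/html/data -not -user www-data -ls

# Chunked uploads are assembled under the per-user uploads directory,
# so that is one place a permission problem could bite:
docker exec container-nextcloud sh -c 'ls -ld /var/www/html/data/*/uploads'
```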

Are those the recommended permissions? If I set mine to anything else but 775 I start getting errors in the logs and in the client.

I think so :slight_smile:

1 Like

I checked the permissions again and they are fine; otherwise smaller files would have the same issue. Permissions are 755 and 644. If the permissions in the container were wrong, a lot of people would be having problems.

PS: I also checked, and in older versions it worked; there are a lot of >500 MB files in the data storage. So in NC27 it worked, with no change in the configuration.
I will try to update to 32.0.6 and to 33.0 and see if there is a change. (Not sure when I will have time for that.)

Have you also updated the clients? Now and then, there is a buggy Windows client release. I’m on 4.0.6 now and this one seems to be alright. My server is 32.0.5, btw.

Hello, that is very interesting. Are you on the stable branch? Mine is 4.0.5 and I cannot update. I downloaded the new client (4.0.6) from the web page, but it is still the same.

I just confirmed chunking is the problem. I can only upload files below the files.chunked_upload.max_size.

I will do some digging again and see if I can find a solution.

Yes, stable release; I never touch anything else, as I had enough issues with Nextcloud “stable” releases in the past :slight_smile: I simply got it from Download and install Nextcloud

What happens when you try to update the client? Don’t use the built-in update feature, that can be wonky indeed. Quit the client and run the .msi manually.

I also find it useful to kill explorer.exe manually when doing this as the Nextcloud installer will often fail to restart it properly.

If that still fails then something is clearly off. At this point I’d try trashing the Nextcloud client settings and cache and doing a clean install.

Nextcloud client for Windows has folders in both user Local and Roaming AppData.

Good luck!

Sorry if I was not clear. The internal updater said I have the latest version. So I checked the website, downloaded it, and installed it manually, which worked well, but the error is still the same.
Chunked data upload is still not possible… I am looking into it and hope to find a better hint as to why, or to solve it.

Thank you, and thank you for the tip that the update information of the client is off.

PS: Uploads using the web interface produce the same error.

Oh, OK. No worries.

Though I have no more ideas, sorry :frowning:

It is working, but it is still frustrating.

What I did: installed a Redis container and configured it. It did not work after that.

Then I set the Server ID and restarted the container again. The Server ID is still marked as not set, but I can read it out (via the occ command and in config.php), and in that process it magically accepted the test file.

So I tried the real file again and it worked… I had run php occ maintenance:repair several times before.

No idea what made the change, but it is working at the moment…

1 Like

Good sign that it’s working now.

The most likely fix was the Redis/cache setup plus the restart, not the serverid change. In Nextcloud’s DAV chunking flow, chunking v2 depends on a proper distributed cache, and large uploads are also more stable when file locking uses Redis.

Some possibilities:

  • likely: cache/locking config issue (resolved after Redis + restart)

  • likely: missing/incorrect memcache.distributed before

  • unlikely: serverid itself as root cause

  • unknown: exact original ~20MB trigger without matching log lines

To keep it stable, verify:

  • occ config:system:get memcache.local → \OC\Memcache\APCu

  • occ config:system:get memcache.distributed → \OC\Memcache\Redis

  • occ config:system:get memcache.locking → \OC\Memcache\Redis
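If any of those come back empty, the corresponding config.php fragment would look roughly like this (the Redis host and port are assumptions for a containerised setup; adjust to yours):

```php
'memcache.local' => '\\OC\\Memcache\\APCu',
'memcache.distributed' => '\\OC\\Memcache\\Redis',
'memcache.locking' => '\\OC\\Memcache\\Redis',
'redis' => array (
  'host' => 'container-nextcloud-redis', // hypothetical container name
  'port' => 6379,
),
```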

1 Like

hello @hweihwang ,

That was my initial thought as well, so I looked for a Redis tutorial and applied it. Yes, all three parameters are set and persistent.

I was just surprised that, after the setup with everything running again, the upload still failed. I did not think the change would need time to take effect after the restart, with the Redis container configured.

My guess: the 20 MB is the current max chunk size, and without the configuration for chunking v2 it falls back to chunking v1, which has issues.
And yes, it still works; I just tested it to be sure.
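Just to put numbers on that guess (hypothetical values, not measured): with a 20 MB chunk cap, one of the affected ~600 MB files would be split into this many chunk requests:

```shell
# Hypothetical example: 600 MB file, 20 MB maximum chunk size
FILE_BYTES=$((600 * 1024 * 1024))
CHUNK_BYTES=$((20 * 1024 * 1024))
# Ceiling division: how many chunk uploads the client must perform
CHUNKS=$(( (FILE_BYTES + CHUNK_BYTES - 1) / CHUNK_BYTES ))
echo "$CHUNKS"   # prints 30
```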

2 Likes

Redis made a huge difference for me in terms of client synchronization speed, literally day and night. I’m on wired LAN and I was so frustrated with sync speeds that were a tiny fraction of the actual bandwidth. After installing Redis, my full client sync went from an entire afternoon to way under an hour.

So perhaps it helps with large files too? I really don’t know.

Glad you got it working!

1 Like