Issue uploading large files (4GB) with S3, Redis, and Memcache

Hi everyone,

I’m having issues uploading large files (4GB) to Nextcloud, even with all performance settings configured. Here’s my environment and setup:

  • Nextcloud with S3 configured, Redis, and Memcache

  • Redis, APCu, and Memcache installed and working on the server and PHP

  • PHP 8.3 (also tested 8.1, 8.2, and 8.4)

  • php.ini settings:

    • memory_limit = 1024G

    • upload_max_filesize = 1024G

    • post_max_size = 1024G

    • max_execution_time = 86400 (24 hours)

  • Server: Ubuntu 22, 6 CPUs, 12GB RAM

  • Panel: FastPanel

  • Tested with multiple servers and S3 providers, issue persists
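Side note, as a sketch only: PHP-FPM frequently reads a different php.ini than the CLI, so it is worth confirming these limits are actually active for the web server process (e.g. via phpinfo or the admin Overview page), not just for `php` on the command line. The path and values below are purely illustrative:

```ini
; /etc/php/8.3/fpm/php.ini — path is an assumption; FastPanel may use its own pool config
upload_max_filesize = 16G
post_max_size = 16G
max_execution_time = 86400
; memory_limit does not need to exceed physical RAM (12 GB here);
; chunked uploads are streamed, not held in memory whole
memory_limit = 2G
```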

Problem:
When I try to upload 4GB files, the upload fails at the end. If I refresh the page, a .part file appears, but then it disappears, and the final file does not show up.

Nextcloud config.php (sensitive info removed/hashed):

<?php
$CONFIG = array (
  'instanceid' => 'ocfphhjmliqh',
  'objectstore' => 
  array (
    'class' => 'OC\\Files\\ObjectStore\\S3',
    'arguments' => 
    array (
      'bucket' => '****',
      'key' => '****',
      'secret' => '****',
      'hostname' => '****',
      'port' => 443,
      'use_ssl' => true,
      'region' => '****',
      'use_path_style' => true,
    ),
  ),
  'memcache.local' => '\\OC\\Memcache\\Redis',
  'memcache.locking' => '\\OC\\Memcache\\Redis',
  'memcache.distributed' => '\\OC\\Memcache\\Redis',
  'redis' => 
  array (
    'host' => '127.0.0.1',
    'port' => 6379,
  ),
  'trusted_domains' => array('cloud.gravonyx.com'),
  'datadirectory' => '/var/www/.../data',
  'dbtype' => 'mysql',
  'dbhost' => '127.0.0.1',
  'dbname' => '****',
  'dbuser' => '****',
  'dbpassword' => '****',
  'installed' => true,
);

Nextcloud logs:

Fails how precisely? What’s the error / state you experience?

Is this via the Web UI or some other upload method?

Can you confirm you have no warnings or errors under Administration settings → Overview?

Also:

  • Please post just the couple of log entries that come up during the upload attempt (ideally directly in your post as preformatted text). We’re volunteers and don’t have time to sift through irrelevant log entries.
  • Do the S3 platforms you’re testing support S3 MultipartUpload?

As long as you’re using an official client, logged in (not an anonymous user), and not using anything like Cloudflare, file chunking should be in use automatically, so the above shouldn’t matter too much (the chunk size is 100 MiB for Web UI uploads by default).
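If chunking turns out to be the suspect, the chunk size can be tuned as an app setting of the files app via occ. A sketch (the value below is just the 100 MiB default recomputed in bytes; smaller chunks sometimes help flaky S3 backends):

```shell
# Compute a chunk size in bytes and print the occ command that would apply it.
# 100 MiB is the default for Web UI uploads.
CHUNK=$((100 * 1024 * 1024))
echo "chunk size: $CHUNK bytes"
echo "sudo -u www-data php occ config:app:set files max_chunk_size --value $CHUNK"
```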

A reverse proxy or WAF could be a factor as well.


Thanks for the follow-up!

  • I’m uploading via the web interface and also tested with the Windows Nextcloud client — same issue.

  • In Admin → Overview I see some configuration warnings (missing indices, missing PHP modules, HSTS header not set, etc.), but nothing directly related to file uploads.

  • During the 4GB upload, the process goes to the end, then fails. After refreshing, a .part file shows up briefly and then disappears, without the final file appearing.

  • Both Contabo S3 and another provider were tested — both support Multipart Upload.

  • No Cloudflare or WAF is in use, only Nginx (FastPanel).

I would closely monitor what really happens. Big files are uploaded in multiple “chunks”, which are temporarily stored as .part files and combined into the full file at the end. Many things can go wrong: chunks could be too large, downloading the chunks to combine them could fail, temp storage could be exhausted, etc.
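To watch what happens during that assembly step, a rough sketch (the log path is an assumption — adjust it to your datadirectory; Nextcloud writes JSON log lines where "level": 3 means error and 4 fatal):

```shell
# Pull recent error/fatal entries and anything mentioning .part files.
LOG="${LOG:-/var/www/nextcloud/data/nextcloud.log}"
if [ -f "$LOG" ]; then
  grep -i -E '"level":[34]|\.part' "$LOG" | tail -n 20
else
  echo "log not found at $LOG"
fi
# Chunk assembly also needs headroom in PHP's temp dir (>4 GB free for a 4 GB file):
df -h "${TMPDIR:-/tmp}"
```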

These references could help with troubleshooting:

There are also config settings which might help (see config.sample.php for details):

	/**
	 * Store part files created during upload in the same storage as the upload
	 * target. Setting this to false stores part files in the root of the user's
	 * folder, which may be necessary for external storage with limited rename
	 * capabilities.
	 *
	 * Defaults to ``true``
	 */
	'part_file_in_storage' => true,

	/**
	 * Allow storage systems that do not support modifying existing files to overcome
	 * this limitation by removing files before overwriting.
	 *
	 * Defaults to ``false``
	 */
	'localstorage.unlink_on_truncate' => false,
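If the final rename of the .part file is what fails on your S3 backend, these are the non-default values one would try (a sketch only, not a recommendation for every setup — test on a non-production instance first):

```php
// config.php — hypothetical values to try when .part handling fails on object storage
'part_file_in_storage' => false,             // keep part files out of the object store
'localstorage.unlink_on_truncate' => true,   // delete before overwrite on limited storages
```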

I see some warnings in Admin > Overview, and here are a few related to memcache.
However, memcache (Redis + APCu) is already installed on the server and also configured in my config.php.

Warnings shown:

  • Transactional file locking is using the database. To improve performance, configure memcache if available.

  • No memory cache has been configured. To improve performance, configure a memcache if available.

So even though I have Redis and APCu installed and active, Nextcloud still shows these warnings.

Your posted config isn’t even using APCu (see the “Memory caching” section of the Nextcloud Administration Manual).

As for the warning about Redis, that’s pretty definite. It means Nextcloud isn’t actually able to use your Redis installation.

Having functional Redis is fairly important for S3 primary object storage; see “Uploading big files > 512 MB” in the Nextcloud Administration Manual.
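For reference, the caching block from the admin manual’s example looks roughly like this (a sketch — host/port must match your Redis, and the `redis` PHP extension plus APCu must be loaded for PHP-FPM, not only the CLI):

```php
'memcache.local' => '\OC\Memcache\APCu',         // fast per-server cache
'memcache.locking' => '\OC\Memcache\Redis',      // transactional file locking
'memcache.distributed' => '\OC\Memcache\Redis',
'redis' => array(
  'host' => '127.0.0.1',
  'port' => 6379,
),
```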

Suggestions:

  • Focus on figuring out what is going on with Redis
  • Use occ config:list system to see your real parsed config (as the support template suggests) rather than posting your config.php; it can also help catch syntax errors and similar mistakes
  • Check the logs!
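To dig into the Redis point, a quick sketch (paths, user, and host below are assumptions for a typical install):

```shell
# Step 1: confirm the tooling exists at all; prints "found" or "MISSING" per tool.
check() { command -v "$1" >/dev/null 2>&1 && echo "$1: found" || echo "$1: MISSING"; }
check redis-cli
check php
# Step 2 (run by hand once both are found):
#   redis-cli -h 127.0.0.1 -p 6379 ping              # expect: PONG
#   php -m | grep -i -E '^(redis|apcu)$'             # extensions loaded for the CLI
#   sudo -u www-data php /var/www/nextcloud/occ config:list system   # are the memcache.* keys really parsed?
```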

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.