Hello!
I'm having some issues downloading and uploading large files (>1 GB, up to 16 GB) on my Nextcloud instance, e.g. nginx "connection reset by peer", …
My Nextcloud runs on an Ubuntu 18.04 dedicated server with:
- Intel i7-6700K
- 64 GB RAM
- 2x 2 TB HDD in RAID
- 1 Gbps connection
My instance runs with MariaDB, PHP 7.2 and NGINX, all on their latest stable releases.
Nextcloud is on version 17.0.3.
My config.php:
<?php
$CONFIG = array (
'instanceid' => '***',
'passwordsalt' => '***',
'secret' => '***',
'trusted_domains' =>
array (
0 => '***',
),
'datadirectory' => '/var/www/nextcloud/data',
'dbtype' => 'mysql',
'version' => '17.0.3.1',
'overwrite.cli.url' => '***',
'dbname' => 'nextcloud',
'dbhost' => 'localhost',
'dbport' => '',
'dbtableprefix' => 'oc_',
'dbuser' => '***',
'dbpassword' => '***',
'installed' => true,
'mail_smtpmode' => 'smtp',
'mail_smtpsecure' => 'tls',
'mail_sendmailmode' => 'smtp',
'mail_from_address' => 'noreply',
'mail_domain' => '***',
'mail_smtphost' => '***',
'mail_smtpport' => '587',
'mail_smtpauth' => 1,
'mail_smtpauthtype' => 'LOGIN',
'mail_smtpname' => '***',
'mail_smtppassword' => '***',
'maintenance' => false,
'has_rebuilt_cache' => true,
'skeletondirectory' => '',
'filelocking.enabled' => true,
'config_is_read_only' => true,
'memcache.local' => '\\OC\\Memcache\\APCu',
'memcache.locking' => '\\OC\\Memcache\\Redis',
'redis' =>
array (
'host' => '/var/run/redis/redis.sock',
'port' => 0,
'timeout' => 0,
),
'tempdirectory' => '/tmp/nextcloudtemp',
'updater.secret' => '***',
'theme' => '',
'loglevel' => 2,
'mysql.utf8mb4' => true,
);
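One thing I wondered about myself: 'tempdirectory' points at /tmp/nextcloudtemp, and I assume the filesystem holding it needs at least as much free space as the largest transfer (16 GB here). A quick sanity check (nothing Nextcloud-specific):

```shell
# Show free space on the filesystem that holds /tmp/nextcloudtemp
df -h /tmp
```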
In my /fpm/php.ini I've set:
output_buffering = Off
max_execution_time = 3600
max_input_time = 3600
memory_limit = 4G
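I haven't listed upload_max_filesize and post_max_size above, and I'm not sure whether they even apply to Nextcloud's chunked uploads; but if they do, I guess they would need to match the 20G nginx limit, something like this (these values are my assumption, not my current config):

```ini
; hypothetical values to match client_max_body_size 20G in nginx
upload_max_filesize = 20G
post_max_size = 20G
```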
And my Nextcloud site file in nginx/sites-available:
...
ssl_session_cache shared:SSL:1m;
ssl_session_timeout 1440m;
ssl_buffer_size 8k;
ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
ssl_ciphers '***';
ssl_prefer_server_ciphers on;
ssl_stapling on;
ssl_stapling_verify on;
...
# Add headers to serve security related headers
add_header X-Frame-Options "SAMEORIGIN";
add_header X-Content-Type-Options nosniff;
add_header X-XSS-Protection "1; mode=block";
add_header X-Robots-Tag none;
add_header X-Download-Options noopen;
add_header X-Permitted-Cross-Domain-Policies none;
add_header Strict-Transport-Security 'max-age=31536000; includeSubDomains;';
add_header X-Accel-Buffering no;
...
# set max upload size
client_max_body_size 20G;
fastcgi_buffers 64 4K;
...
fastcgi_split_path_info ^(.+\.php)(/.*)$;
include fastcgi_params;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
fastcgi_param PATH_INFO $fastcgi_path_info;
fastcgi_param HTTPS on;
fastcgi_param modHeadersAvailable true;
fastcgi_param front_controller_active true;
fastcgi_pass php-handler;
fastcgi_intercept_errors on;
fastcgi_request_buffering off;
fastcgi_read_timeout 300;
}
location ~ ^/(?:updater|ocs-provider)(?:$|/) {
try_files $uri/ =404;
index index.php;
}
# Adding the cache control header for js and css files
# Make sure it is BELOW the PHP block
location ~* \.(?:css|js|woff|svg|gif)$ {
try_files $uri /index.php$uri$is_args$args;
add_header Cache-Control "public, max-age=15778463";
add_header X-Content-Type-Options nosniff;
add_header X-XSS-Protection "1; mode=block";
add_header X-Robots-Tag none;
add_header X-Download-Options noopen;
add_header X-Permitted-Cross-Domain-Policies none;
# Optional: Don't log access to assets
access_log off;
}
location ~* \.(?:png|html|ttf|ico|jpg|jpeg)$ {
try_files $uri /index.php$uri$is_args$args;
# Optional: Don't log access to other assets
access_log off;
}
}
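One more thought, in case it helps diagnose the resets: with fastcgi_read_timeout 300 above, I wonder whether a 16 GB transfer on a slower client link simply exceeds the timeout. If that's the cause, I imagine raising the timeouts along these lines would be the test (the values are a guess on my part):

```nginx
# hypothetical: allow up to an hour for a single large transfer
fastcgi_read_timeout 3600;
fastcgi_send_timeout 3600;
send_timeout 3600;
```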
I hope all the information needed is pasted here.
Any idea why downloading and uploading don't work for files bigger than 1 GB?
Thank you a lot, and have a nice day!