Hi everyone,
First of all, this is my environment:
- Debian 10 x64
- Nextcloud 17.0.1
- Nginx 1.14.2
- PHP-FPM 7.3
- Nextcloud client: 2.6 on Windows 10 x64.
Everything is OK, and I'm really satisfied with Nextcloud! Great software.
However, I have a problem: it is impossible to download a 3.4 GB file on my client computer. Synchronization starts fine, but the transfer stops as soon as the file reaches 1 GB. Of course I looked around before asking for help, but at this point I'm out of ideas.
Here are the logs when the file download reaches 1GB on the client:
*1503 readv() failed (104: Connection reset by peer) while reading upstream, client: MY_PUBLIC_IP, server: myserver.io, request: "GET /remote.php/dav/files/f.moreau/COSIKA/%23COSIKA%20CONCEPT/PARCOURS/FORMATION/FORMATION%20COMMERCIALE/FORMATION%20START/formation%20start%20V10.19.key HTTP/1.1", upstream: "fastcgi://unix:/var/run/php/php7.3-fpm-nextcloud.sock:", host: "myserver.io"
I have added the following value in my Nginx configuration:
proxy_buffering off;
Despite this, the problem persists.
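One thing I noticed while reading the nginx docs (I'm not sure this is the cause): `proxy_buffering` only applies to `proxy_pass` upstreams, while PHP-FPM is reached through `fastcgi_pass`, which is governed by the `fastcgi_*` directives instead. And `fastcgi_max_temp_file_size` defaults to 1024m, which is suspiciously close to my 1 GB cutoff. Could something like this be what I need (the 4096m value is just a guess)?

```nginx
# Response buffering for a FastCGI upstream is controlled by the
# fastcgi_* directives, not proxy_*; disable it so large downloads
# are streamed to the client instead of buffered to disk:
fastcgi_buffering off;
# ...or keep buffering but raise the 1024m default temp-file cap:
#fastcgi_max_temp_file_size 4096m;
```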
An extract from my Nginx configuration:
# Add headers to serve security related headers
add_header Referrer-Policy "no-referrer" always;
add_header X-Content-Type-Options "nosniff" always;
add_header X-Download-Options "noopen" always;
add_header X-Frame-Options "SAMEORIGIN" always;
add_header X-Permitted-Cross-Domain-Policies "none" always;
add_header X-Robots-Tag "none" always;
add_header X-XSS-Protection "1; mode=block" always;
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains;";

# Remove X-Powered-By, which is an information leak
fastcgi_hide_header X-Powered-By;

# Path to the root of your installation
root /srv/nextcloud/www/;

# Added after the buffering issues encountered with Nginx
proxy_buffering off;

location = /robots.txt {
    allow all;
    log_not_found off;
    access_log off;
}

# The following rule is only needed for the Social app.
# Uncomment it if you're planning to use this app.
rewrite ^/.well-known/webfinger /public.php?service=webfinger last;

location = /.well-known/carddav {
    return 301 $scheme://$host:$server_port/remote.php/dav;
}
location = /.well-known/caldav {
    return 301 $scheme://$host:$server_port/remote.php/dav;
}

# set max upload size
client_max_body_size 10G;
fastcgi_buffers 64 4K;

# Enable gzip but do not remove ETag headers
gzip on;
gzip_vary on;
gzip_comp_level 4;
gzip_min_length 256;
gzip_proxied expired no-cache no-store private no_last_modified no_etag auth;

# Uncomment if your server is built with the ngx_pagespeed module
# This module is currently not supported.
#pagespeed off;

location / {
    rewrite ^ /index.php;
}

location ~ ^\/(?:build|tests|config|lib|3rdparty|templates|data)\/ {
    deny all;
}
location ~ ^\/(?:\.|autotest|occ|issue|indie|db_|console) {
    deny all;
}

location ~ ^\/(?:index|remote|public|cron|core\/ajax\/update|status|ocs\/v[12]|updater\/.+|oc[ms]-provider\/.+)\.php(?:$|\/) {
    fastcgi_split_path_info ^(.+?\.php)(\/.*|)$;
    set $path_info $fastcgi_path_info;
    try_files $fastcgi_script_name =404;
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_param PATH_INFO $path_info;
    fastcgi_param HTTPS on;
    # Avoid sending the security headers twice
    fastcgi_param modHeadersAvailable true;
    # Enable pretty urls
    fastcgi_param front_controller_active true;
    fastcgi_pass php-handler;
    fastcgi_intercept_errors on;
    fastcgi_request_buffering off;
}

location ~ ^\/(?:updater|oc[ms]-provider)(?:$|\/) {
    try_files $uri/ =404;
    index index.php;
}

# Adding the cache control header for js, css and map files
# Make sure it is BELOW the PHP block
location ~ \.(?:css|js|woff2?|svg|gif|map)$ {
    try_files $uri /index.php$request_uri;
    add_header Cache-Control "public, max-age=15778463";
    # Add headers to serve security related headers (It is intended to
    # have those duplicated to the ones above)
    # Before enabling Strict-Transport-Security headers please read into
    # this topic first.
    #add_header Strict-Transport-Security "max-age=15768000; includeSubDomains; preload;";
    #
    # WARNING: Only add the preload option once you read about
    # the consequences in https://hstspreload.org/. This option
    # will add the domain to a hardcoded list that is shipped
    # in all major browsers and getting removed from this list
    # could take several months.
    add_header Referrer-Policy "no-referrer" always;
    add_header X-Content-Type-Options "nosniff" always;
    add_header X-Download-Options "noopen" always;
    add_header X-Frame-Options "SAMEORIGIN" always;
    add_header X-Permitted-Cross-Domain-Policies "none" always;
    add_header X-Robots-Tag "none" always;
    add_header X-XSS-Protection "1; mode=block" always;
    add_header Strict-Transport-Security "max-age=31536000; includeSubDomains;";
    # Optional: Don't log access to assets
    access_log off;
}

location ~ \.(?:png|html|ttf|ico|jpg|jpeg|bcmap)$ {
    try_files $uri /index.php$request_uri;
    # Optional: Don't log access to other assets
    access_log off;
}
}
And an extract from my PHP-FPM pool configuration:
[…]
request_terminate_timeout = 45s

; number of requests each php-fpm child process may handle
pm.max_requests = 500
; idle time before a child process is killed
pm.process_idle_timeout = 60s

php_admin_value[open_basedir] = ".:/srv/$pool/www:/srv/$pool/var/tmp:/usr/share/php"
php_value[include_path] = ".:/srv/$pool/www:/srv/$pool/www/include"

; UPLOAD
php_admin_flag[file_uploads] = 1
php_admin_value[upload_tmp_dir] = "/srv/$pool/var/tmp/upload"
php_admin_value[cache_dir] = "/srv/$pool/var/tmp"
; Maximum allowed size for uploaded files.
php_admin_value[upload_max_filesize] = "16G"
php_admin_value[max_input_time] = 3600
php_admin_value[post_max_size] = "16G"
php_admin_value[max_input_vars] = "118500"
;php_admin_value sendmail_path "/usr/sbin/sendmail -t -i -C /etc/postfix-$pool/"

; #### LOGS
php_admin_value[error_log] = /srv/$pool/var/log/php/error.log
php_admin_flag[log_errors] = on
php_admin_value[log_errors] = 1
;php_flag[display_errors] = on
php_admin_value[display_errors] = 0
php_admin_value[display_startup_errors] = 1
php_admin_value[html_errors] = 1
php_admin_value[define_syslog_variables] = 1
php_value[error_reporting] = 6143
request_slowlog_timeout = 30s
slowlog = /var/log/fpm-slow.log

; Maximum amount of time each script may spend parsing request data
php_value[max_input_time] = "120"
; Maximum execution time of each script, in seconds (30)
php_value[max_execution_time] = "300"
; Maximum amount of memory a script may consume (8MB)
php_value[memory_limit] = "512M"

; Sessions: IMPORTANT reactivate garbage collector on Debian!!!
php_value[session.gc_maxlifetime] = 3600
php_admin_value[session.gc_probability] = 1
php_admin_value[session.gc_divisor] = 100

; SECURITY
php_admin_value[magic_quotes_gpc] = 0
php_admin_value[register_globals] = 0
php_admin_value[session.auto_start] = 0
php_admin_value[session.gc_maxlifetime] = 3600
php_value[session.save_path] = '/srv/$pool/var/sessions'
php_admin_value[mbstring.http_input] = "pass"
php_admin_value[mbstring.http_output] = "pass"
php_admin_value[mbstring.encoding_translation] = 0
php_admin_value[expose_php] = 0
php_admin_value[allow_url_fopen] = 1
php_admin_value[safe_mode] = 0
php_admin_value[expose_php] = 0

; enforce filling PATH_INFO & PATH_TRANSLATED
; and not only SCRIPT_FILENAME
chdir = /
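Another thing I'm wondering about (not sure it's the cause): `request_terminate_timeout = 45s` in my pool hard-kills any PHP-FPM worker that runs longer than 45 seconds, regardless of `max_execution_time`. A multi-gigabyte download streamed through remote.php could easily take longer than that, and killing the worker mid-response would look like exactly the "Connection reset by peer" nginx logs. Something I could try (the 3600s value is just a guess):

```ini
; request_terminate_timeout kills the worker even when
; max_execution_time allows more; raise it so long WebDAV
; transfers are not cut off mid-stream:
request_terminate_timeout = 3600s
```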
If anyone has ever had a case like this, I’m interested!
Supras.