HTTP/2 with HAProxy and Nginx: multiple file upload problem

Hi folks,

I've been trying to upgrade from HTTP/1.1 to HTTP/2. The issue I'm facing is that when uploading multiple files, the upload starts, then stops and fails; with HTTP/1.1 everything is fine. I've added the following to the server block of my nginx install, but the problem remains. I'm using HAProxy on pfSense for SSL termination.

listen 81 http2 proxy_protocol; # haproxy SSL termination + HTTP/2
listen 82 proxy_protocol; # haproxy SSL termination for HTTP/1.1 and lower
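For context, here is roughly what the HAProxy side needs to look like for those two listeners: HTTP/2 negotiated with the client via ALPN at the TLS frontend, then forwarded to nginx with the proxy protocol (`send-proxy`) and, for port 81, HTTP/2 to the backend (`proto h2`). This is a minimal sketch, assuming HAProxy 1.9+; the certificate path, backend address, and names are placeholders, not my actual pfSense config:

```
frontend https_in
    bind :443 ssl crt /path/to/cert.pem alpn h2,http/1.1
    default_backend nginx_h2

backend nginx_h2
    # send-proxy matches nginx's proxy_protocol listen option;
    # proto h2 matches nginx's "listen 81 http2" cleartext HTTP/2 listener
    server nginx1 192.168.1.10:81 send-proxy proto h2
```

If the frontend negotiates h2 with the browser but the backend server line lacks `proto h2`, HAProxy will translate to HTTP/1.1 toward nginx, which changes which code path the uploads exercise.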

My actual server block, which works with HTTP/1.1:

server {
listen 80;
listen [::]:80;
server_name cloud.mydomain;

# Add headers to serve security related headers
add_header X-Content-Type-Options nosniff;
add_header X-XSS-Protection "1; mode=block";
add_header X-Robots-Tag none;
add_header X-Download-Options noopen;
add_header X-Permitted-Cross-Domain-Policies none;
add_header Referrer-Policy no-referrer;
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;

#I found this header is needed on Ubuntu, but not on Arch Linux. 
add_header X-Frame-Options "SAMEORIGIN";

# Path to the root of your installation
root /usr/share/nginx/nextcloud/;

access_log /var/log/nginx/nextcloud.access;
error_log /var/log/nginx/nextcloud.error;

location = /robots.txt {
    allow all;
    log_not_found off;
    access_log off;
}

# The following 2 rules are only needed for the user_webfinger app.
# Uncomment them if you're planning to use this app.
#rewrite ^/.well-known/host-meta /public.php?service=host-meta last;
#rewrite ^/.well-known/host-meta.json /public.php?service=host-meta-json last;

location = /.well-known/carddav {
    return 301 $scheme://$host/remote.php/dav;
}
location = /.well-known/caldav {
   return 301 $scheme://$host/remote.php/dav;
}

location ~ /.well-known/acme-challenge {
  allow all;
}

# set max upload size
client_max_body_size 1024M;
fastcgi_buffers 64 4K;

# Disable gzip to avoid the removal of the ETag header
gzip off;

# Uncomment if your server was built with the ngx_pagespeed module.
# This module is currently not supported.
#pagespeed off;

error_page 403 /core/templates/403.php;
error_page 404 /core/templates/404.php;

location / {
    rewrite ^ /index.php;
}

location ~ ^/(?:build|tests|config|lib|3rdparty|templates|data)/ {
   deny all;
}

location ~ ^/(?:\.|autotest|occ|issue|indie|db_|console) {
   deny all;
 }

location ~ ^/(?:index|remote|public|cron|core/ajax/update|status|ocs/v[12]|updater/.+|ocs-provider/.+|core/templates/40[34])\.php(?:$|/) {
include fastcgi_params;
fastcgi_split_path_info ^(.+\.php)(/.*)$;
try_files $fastcgi_script_name =404;
fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
fastcgi_param PATH_INFO $fastcgi_path_info;
#Avoid sending the security headers twice
fastcgi_param modHeadersAvailable true;
fastcgi_param front_controller_active true;
fastcgi_pass unix:/run/php/php7.4-fpm.sock;
fastcgi_intercept_errors on;
fastcgi_request_buffering off;
fastcgi_read_timeout 600;
proxy_read_timeout 600;
}

location ~ ^/(?:updater|ocs-provider)(?:$|/) {
   try_files $uri/ =404;
   index index.php;
}

# Adding the cache control header for js and css files

# Make sure it is BELOW the PHP block
location ~* \.(?:css|js)$ {
    try_files $uri /index.php$uri$is_args$args;
    add_header Cache-Control "public, max-age=7200";
    # Add headers to serve security related headers (It is intended to
    # have those duplicated to the ones above)
    add_header X-Content-Type-Options nosniff;
    add_header X-XSS-Protection "1; mode=block";
    add_header X-Robots-Tag none;
    add_header X-Download-Options noopen;
    add_header X-Permitted-Cross-Domain-Policies none;
    add_header Referrer-Policy no-referrer;
    # Optional: Don't log access to assets
    access_log off;

}

location ~* \.(?:svg|gif|png|html|ttf|woff|ico|jpg|jpeg)$ {
try_files $uri /index.php$uri$is_args$args;
# Optional: Don’t log access to other assets
access_log off;
}
}
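One thing that may be worth testing for the multi-file upload failures: the config above disables FastCGI request buffering (`fastcgi_request_buffering off`), and some nginx versions have reportedly had trouble combining unbuffered request bodies with HTTP/2. As a diagnostic only (Nextcloud recommends buffering off for large uploads), temporarily re-enabling it in the PHP location block can tell you whether that interaction is the culprit; the temp path and buffer size below are illustrative values, not requirements:

```
# Diagnostic: buffer the request body before passing it to PHP-FPM,
# to see if the HTTP/2 multi-file upload failures stop.
fastcgi_request_buffering on;

# If buffering helps, give nginx room to spool large bodies to disk.
client_body_buffer_size 512k;
client_body_temp_path /var/cache/nginx/client_temp;
```

If uploads succeed with buffering on over HTTP/2, the problem is in the unbuffered HTTP/2 body path rather than in Nextcloud.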

Thanks

Jack.

Update,

Uploading single files is fine, but multiple files are the problem. When using HTTP/2, the connection drops on multiple file uploads.
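A quick way to take the browser out of the equation is to reproduce the parallel upload with curl, forcing each protocol in turn. This is a sketch; the credentials and file names are placeholders, and it assumes uploads go through Nextcloud's WebDAV endpoint:

```
# Two simultaneous WebDAV uploads over HTTP/2
curl --http2 -u user:password -T file1.bin \
  https://cloud.mydomain/remote.php/dav/files/user/file1.bin &
curl --http2 -u user:password -T file2.bin \
  https://cloud.mydomain/remote.php/dav/files/user/file2.bin &
wait

# Repeat with --http1.1 in place of --http2. If only the --http2 runs
# fail, the fault is in the HTTP/2 path (HAProxy or nginx), not Nextcloud.
```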

I'm beginning to wonder if Nextcloud is just badly written. WordPress is perfectly fine over HTTP/2, but Nextcloud is problematic. Downloading huge files is also a problem, with the nginx worker consuming huge amounts of RAM. It might be time to find something else, because there's been no response and no solution online, except people saying it works fine for them, which is no help!

The problem seems to be with nginx. I tested with Apache and it works fine, so I'm guessing nginx is broken with Nextcloud. The HTTP/2 issue and the memory exhaustion don't happen with Apache.

Update: I have tried everything with nginx, including what people recommended, and the problem is still there. It just keeps eating memory; the nginx worker process is a memory hog on downloads. Apache works, but I cannot get the 301 redirect to work. I think it's time I give up on this and find an alternative.
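For anyone hitting the same download-side memory growth: nginx buffers FastCGI responses in memory up to its buffer limits and spills the rest to temp files, so the worker size can be bounded by tightening those limits. A hedged sketch of directives to experiment with in the PHP location block; the values are starting points to tune, not recommendations:

```
# Cap per-request in-memory buffering of the PHP response and
# spill large downloads to disk instead of growing the worker.
fastcgi_buffering on;
fastcgi_buffers 8 16k;
fastcgi_busy_buffers_size 32k;
fastcgi_max_temp_file_size 1024m;
```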

Update: problem solved by using iSCSI instead of NFS for the storage backend. I no longer get timeout errors.