Looks right! Put it inside */conf.d/include/proxy.conf:
proxy_connect_timeout 1d;
proxy_send_timeout 1d;
proxy_read_timeout 1d;
send_timeout 1d;
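To be concrete about what "inside" means here, a hedged sketch of the placement (the surrounding server/location block, hostname, and upstream are assumptions about your file's layout, not a copy of it):

```nginx
# hypothetical layout -- your proxy.conf may differ
server {
    listen 443 ssl;
    server_name cloud.example.com;      # placeholder hostname

    location / {
        proxy_pass http://nextcloud:80; # placeholder upstream

        # the timeout lines go here, inside the block,
        # not appended after the closing braces at the end of the file
        proxy_connect_timeout 1d;
        proxy_send_timeout    1d;
        proxy_read_timeout    1d;
        send_timeout          1d;
    }
}
```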
Hm, in that case I may have tried it already, but apparently I appended it to the end of the file instead of placing it inside that area. I'll have to get back to you on that. I'll also still have to find the nginx.conf file for the FastCGI timeouts, or is that supposed to be the same file?
Edit: It seems placing it in that section worked; apparently that was what I did wrong. I've only tested it once, so I'm not gonna cheer too soon, but it looks hopeful. You're a lifesaver, mate, thank you!
you’re welcome!
If uploads abort after 60 seconds into "processing", then for the image linuxserver/nextcloud change the following in /config/nginx/nginx.conf:
http {
fastcgi_connect_timeout 60;
fastcgi_send_timeout 1800;
fastcgi_read_timeout 1800;
}
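For reference, a commented sketch of that section (the surrounding layout is an assumption, not copied from the image's shipped file):

```nginx
# /config/nginx/nginx.conf -- sketch, not the image's exact file
http {
    # time allowed to establish a connection to the PHP backend
    fastcgi_connect_timeout 60;
    # time allowed between successive writes to the backend
    fastcgi_send_timeout 1800;
    # time allowed to wait for the backend's response; this is the
    # usual culprit when uploads abort mid-"processing"
    fastcgi_read_timeout 1800;
}
```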
I changed mine to 3600 and I still get the same error.
Uploading and storing large files is one of Nextcloud's core functions, and out of the box Nextcloud can't handle it. It's disgraceful if you ask me.
I'm wondering why such an essential thing as uploading a file doesn't work out of the box. First I have to increase the max file size somewhere deep in the system internals; then this error happens and there is no single simple solution. Why is all open source so nerd-oriented? These are rhetorical questions I ask myself, and the more I ask, the more I'm looking into going back to good old FTP.
The issue is actually in the apache.conf file. Open the file and increase the Timeout directive to match PHP's max_execution_time, e.g. max_execution_time = 3600 and Apache Timeout 3600.
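A hedged sketch of that matching pair (the file path is a typical Debian/Ubuntu location and an assumption; your distro may put apache.conf elsewhere):

```apache
# /etc/apache2/apache2.conf (excerpt) -- path is an assumption
# match Apache's request timeout to PHP's max_execution_time
# (max_execution_time = 3600 in php.ini)
Timeout 3600
```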