Hi, I think I must have made a mistake in my config! Everything seems to work, but I have an issue when downloading large files from the server (I haven't tried uploading to the server yet). All my large files are in folders on my array drives and are accessed using external storage. When downloading, my Docker image fills up to 100%, causing my other containers to freeze and sometimes crash! Is there a cache directory that I should have mapped elsewhere?
Nextcloud v17.0.2 is installed on my Unraid server v6.8.0 using Linuxserver's Docker image, and accessed through their Letsencrypt reverse proxy container.
Now the downloads are failing!! This is useless to me! Does anyone know of better software I could try, as there is obviously no answer to this problem? Perhaps it's just not suited to running in Docker?
Please have some patience, the people here have to work and stuff
I just saw your thread and, interestingly, I've been seeing this issue myself for a few days now. For me it is triggered by php-fpm in combination with external storage.
Whenever I access big files via SMB (mounted in Nextcloud via the external storage feature), php-fpm downloads the whole file to the TEMP folder configured for PHP. Last time this was 140 GB. For me the temp folder is:
and more precisely it is:
Restarting php-fpm clears the temp files, but this is a workaround rather than a solution.
(I scripted a regular check which triggers a restart if my filesystem fills up again. It's bad for the SSD, though.)
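For reference, a crude sketch of such a watchdog (the mount point, threshold, and container name are assumptions for my setup; adjust them for yours):

```shell
#!/bin/sh
# Hypothetical watchdog: restart the Nextcloud container when the
# filesystem holding the PHP temp folder crosses a usage threshold.
THRESHOLD=90        # percent full before we act (assumed value)
MOUNT=/tmp          # assumed mount the PHP temp dir lives on; adjust
CONTAINER=nextcloud # assumed container name; check with `docker ps`

# df -P prints "Use%" in column 5; strip the trailing % sign.
usage=$(df -P "$MOUNT" | awk 'NR==2 {gsub("%","",$5); print $5}')

if [ "$usage" -ge "$THRESHOLD" ]; then
    echo "Filesystem at ${usage}%, restarting $CONTAINER"
    docker restart "$CONTAINER"
fi
```

Run it from cron every few minutes. It is blunt (a restart kills in-flight downloads), which is why I call it a workaround rather than a fix.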
Can you please check if that is the same for you?
The strange thing is that this issue started a few weeks ago without any changes to the server, apart from an update to NC 17.0.2 and system updates.
So I'm not sure whether it was an update to PHP and php-fpm that caused this, or the NC update.
Maybe worth opening a github issue:
I haven't had the time yet to analyze this in more detail and open an issue. If you can confirm that php-fpm and the temp folder are the problem for you too, please feel free to open one.
Hi, thanks for responding, and sorry for my lack of patience! Typically, people suddenly started complaining that downloads were failing! That hasn't happened before!
My problem only happens while files are downloading; once they're done it returns to normal! I'm thinking I need to map the tmp directory to somewhere outside the Docker image?
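In case it helps, this is roughly what I mean by mapping the temp directory out. A sketch, assuming the linuxserver image; the host path and container name here are hypothetical:

```shell
# Sketch, assuming the linuxserver Nextcloud image on Unraid:
#
# 1. In the Unraid docker template for Nextcloud, add a path mapping:
#      Host path:      /mnt/cache/appdata/nextcloud-temp   (hypothetical)
#      Container path: /nc-temp                            (hypothetical)
#
# 2. Point Nextcloud at it, either in config.php:
#      'tempdirectory' => '/nc-temp',
#    or via occ from the host:
#      docker exec nextcloud occ config:system:set tempdirectory --value /nc-temp
#
# 3. Restart the container so PHP picks up the change.
```

That way the temp files land on a host path with plenty of space instead of filling the Docker image, though it wouldn't stop them being written in the first place.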
Could you first check that this is the same issue? Do you know how to inspect filesystem contents inside a Docker container? Maybe you could download a bigger file and then look inside the container to find the folder that is filling up the filesystem. If that is also the php-fpm temp folder, we have a clue and can probably raise this with the developers.
Hi, sorry for the delay getting back to you, but work got in the way! I'm not sure how to do what you ask. Is it a command-line thing, or am I looking in the GUI somewhere?
No problem. It's a command-line thing. Something with docker exec, I think. I'm not using Docker much.
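Something along these lines might work; the container name here is a guess, so check yours with `docker ps` first:

```shell
# Step 1 (on the Unraid host): open a shell inside the running container.
# "nextcloud" is an assumed container name; use the one from `docker ps`.
#
#   docker exec -it nextcloud sh
#
# Step 2 (inside the container, while a large download is running):
# list the largest entries under the temp directory to see what is
# eating the space. -x stays on one filesystem, -h prints sizes
# human-readably, and sort -rh puts the biggest first.
du -xh /tmp 2>/dev/null | sort -rh | head -20
```

If the big entries turn out to be php-fpm temp files, that would confirm we're seeing the same issue.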
OK, does anyone else know? Or would log files help? I'm happy to test anything; I'd really like this to work!
No one? Would memory caching help with this issue?