Hello Nextcloud Community,
I am running Nextcloud 27.0 on Linux in a virtual machine with 32 GB RAM and 20 cores. The VM is dedicated to the Nextcloud application server only; the database runs externally. I have configured PHP to allow uploads of large files up to 15 GB and set the appropriate timeouts mentioned in the official documentation (using an Apache reverse proxy). I have also created a separate 200 GB /tmp partition to hold temporary files during upload. Whenever I upload a large file (>13 GB), the entire system RAM gets consumed and the VM freezes until the upload is over. Once the upload completes, everything returns to normal.
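For reference, these are roughly the php.ini directives I adjusted for large uploads. The exact values below are illustrative, not my literal config:

```ini
; Illustrative php.ini values for ~15 GB uploads (actual values may differ)
upload_max_filesize = 16G
post_max_size = 16G
max_input_time = 3600
max_execution_time = 3600
memory_limit = 512M
```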
I have tried setting "tempdirectory" in config.php, but it does not seem to take effect, even though my /tmp partition itself is 200 GB (XFS). What could be the reason for such behaviour?
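This is roughly how I set it in config/config.php (the directory path here is an example; it must exist and be writable by the web-server user):

```php
<?php
$CONFIG = array (
  // ... other settings ...
  // Example path; must be writable by the web-server user (e.g. apache)
  'tempdirectory' => '/tmp/nextcloud-temp',
);
```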
Application version: Nextcloud 27.0
OS: RHEL 8.7 (VM, 32 GB RAM, 20 vCPUs; runs only the application server)
Dedicated /tmp partition (XFS, 200 GB)
Note: I have disabled chunked file upload (`--value 0`)
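For completeness, chunking was disabled with a command along these lines (web-server user and occ path may differ on your system):

```shell
# Setting max_chunk_size to 0 disables chunked uploads in the files app
sudo -u apache php /var/www/nextcloud/occ config:app:set files max_chunk_size --value 0
```

I am wondering whether re-enabling chunking would avoid the RAM spike, since the whole file would no longer be handled in one request.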