Error 400 when uploading multiple files concurrently during stress tests with curl and WebDAV

I’m designing some stress tests to observe my server’s behaviour under pressure. Each test consists of a sequence: create a randomly named folder, upload a file, download the file, remove the file, remove the folder. All requests are performed with curl over the WebDAV protocol.

I’m simulating 10 simultaneous users, each running N concurrent test sequences (so 10×N simultaneous requests). I’ve also introduced a short random wait between 0 and 2.5 s before each request, to avoid all of them reaching the server at the same moment.
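For reference, the setup above can be sketched roughly as follows. This is an illustrative reconstruction, not my actual script: the server URL, credentials, and file name are placeholders, and `DRY_RUN` just prints the requests instead of sending them.

```shell
#!/usr/bin/env bash
# Sketch of one stress-test iteration over WebDAV (assumed URL/credentials):
# MKCOL (create folder) -> PUT (upload) -> GET (download) -> DELETE x2.
BASE_URL="${BASE_URL:-https://cloud.example.com/remote.php/dav/files/testuser}"
CREDS="${CREDS:-testuser:secret}"
DRY_RUN="${DRY_RUN:-0}"

run_curl() {
  # In dry-run mode just print the request; otherwise execute it.
  if [ "$DRY_RUN" = "1" ]; then
    echo "curl $*"
  else
    curl -sf -u "$CREDS" "$@"
  fi
}

one_iteration() {
  local dir="stress-$RANDOM-$RANDOM"   # random folder name
  local file="payload.bin"             # placeholder test file
  # random 0-2.5 s jitter so requests don't all arrive at once
  sleep "$(awk 'BEGIN{srand(); printf "%.2f", rand()*2.5}')"
  run_curl -X MKCOL "$BASE_URL/$dir"            # create folder
  run_curl -T "$file" "$BASE_URL/$dir/$file"    # upload file
  run_curl -o /dev/null "$BASE_URL/$dir/$file"  # download file
  run_curl -X DELETE "$BASE_URL/$dir/$file"     # remove file
  run_curl -X DELETE "$BASE_URL/$dir"           # remove folder
}

# Driver: 10 simulated users, N iterations each (guarded so sourcing the
# file only defines the functions).
if [ "${RUN_STRESS:-0}" = "1" ]; then
  N="${N:-1}"
  for user in $(seq 1 10); do
    ( for i in $(seq 1 "$N"); do one_iteration; done ) &
  done
  wait
fi
```

Run it with `RUN_STRESS=1 N=10 BASE_URL=... bash stress.sh` to get the 10×N load described above, or with `DRY_RUN=1` to inspect the generated requests first.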

When I increase the file size to 50 MB and the concurrency to 10, I start receiving 400 errors on the upload operation after about 15 simultaneous uploads. From that point on most uploads fail, with only a few succeeding. No errors are observed in the Apache log. A couple of questions, then…

Is this normal behaviour? Is this Nextcloud protecting itself from a DoS attack? (All requests come from the same users and the same IP.) Are there any parameters I should check to allow higher upload loads (PHP, Apache, or Nextcloud config)?
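For anyone hitting the same question: these are the PHP settings most commonly involved in capping upload size and concurrency. The values below are illustrative only, not recommendations, and other layers (Apache's `LimitRequestBody`, a reverse proxy's `client_max_body_size`, etc.) can impose their own limits.

```ini
; php.ini — illustrative values for larger concurrent uploads
upload_max_filesize = 100M   ; must be >= the largest single file
post_max_size = 100M         ; must be >= upload_max_filesize
max_file_uploads = 20        ; files per request
max_execution_time = 300     ; slow large uploads need more time
memory_limit = 512M          ; Nextcloud recommends a generous limit
```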

Just to clarify: I’m receiving the error on the UPLOAD request, and consequently on the download and delete requests for those same files. The rest of the requests run normally, and the server remains responsive and available for other requests.

Never mind, the problem was not Nextcloud, it was the PHP configuration. The upload_tmp_dir set for PHP was pointing to a mount with limited storage; as it filled up, the uploads failed.
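In case it helps others, the fix amounts to this (the path is illustrative; the directory must exist and be writable by the web server user):

```ini
; php.ini — point PHP's upload staging area at a mount with enough free space
upload_tmp_dir = /var/big-disk/php-uploads
```

After changing it, restart PHP-FPM/Apache and verify with `php -i | grep upload_tmp_dir` that the new value is active.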

I was able to confirm this in the Nextcloud logs, which clearly pointed out the error. Reconfiguring php.ini to point to another location fixed the problem. Now I’m trying to figure out why Nextcloud is not using the temporary folder I set in config.php.
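As far as I understand it, that last point is expected: PHP writes the raw HTTP upload body to `upload_tmp_dir` before Nextcloud’s code ever runs, so the `tempdirectory` option in config.php cannot affect that stage. It only controls Nextcloud’s own server-side temporary files (chunk assembly, previews, and so on). A sketch of that setting, with an illustrative path:

```php
<?php
// config/config.php — Nextcloud's own temp location (not PHP's upload staging)
$CONFIG = array (
  // ... existing settings ...
  'tempdirectory' => '/var/big-disk/nextcloud-tmp',
);
```

So for large uploads both locations matter: `upload_tmp_dir` for the initial PHP upload, and `tempdirectory` for what Nextcloud does with the file afterwards.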