Is it possible to limit the maximum file size that a user can **try** to upload in a single file?

We are running some tests with very limited user storage quotas but huge shared folders. We've found that a user can try to upload huge files that are bigger than their own quota and even bigger than the limit of the shared resource.

For example, a user with 1GB of storage tries to upload a 10GB file to a shared resource that has a 5GB limit. Monitoring the filesystem, I've seen the chunks accumulate in the user's upload folder until the transfer completes. Only then does the system raise an error saying that the shared folder cannot store the file because of its limits, and it discards the temporary chunks.
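For context, this is roughly what such a transfer looks like from the client side, as far as I understand the chunked-upload WebDAV endpoint: the chunks are pushed first and the quota/limit checks only seem to happen at the final MOVE ("consolidation") step. This is only a sketch; the server URL, credentials, file names and chunk naming are made up for illustration.

```python
# Rough sketch of a chunked upload against Nextcloud's WebDAV uploads endpoint.
# Server URL, credentials and paths are placeholders, not a real setup.
import requests

BASE = "https://cloud.example.com/remote.php/dav"
AUTH = ("testuser", "app-password")        # hypothetical credentials
UPLOAD_ID = "web-file-upload-123"          # arbitrary transfer id
CHUNK_SIZE = 10 * 1024 * 1024              # 10 MiB per chunk

# 1. Create the transfer folder; chunks end up under data/<user>/uploads/ on the server.
requests.request("MKCOL", f"{BASE}/uploads/testuser/{UPLOAD_ID}", auth=AUTH)

# 2. Upload the chunks one by one. Nothing in this loop is stopped by the
#    user quota or the shared folder limit.
with open("huge.bin", "rb") as f:          # placeholder local file
    index = 0
    while chunk := f.read(CHUNK_SIZE):
        requests.put(f"{BASE}/uploads/testuser/{UPLOAD_ID}/{index:05d}",
                     data=chunk, auth=AUTH)
        index += 1

# 3. Only this MOVE assembles the file; in the tests described above this is
#    the step that fails and throws the accumulated chunks away.
requests.request(
    "MOVE",
    f"{BASE}/uploads/testuser/{UPLOAD_ID}/.file",
    auth=AUTH,
    headers={"Destination": f"{BASE}/files/testuser/SharedFolder/huge.bin"},
)
```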

That's OK. But my fear is: what happens if a user tries to upload a 100GB, a 2TB or a 50TB file? Could this affect the whole system by filling up the storage?

I've been looking for these limits in the documentation, but everything I found refers to raising the PHP limits, which does not apply to this scenario where chunking is involved.
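If I understand them correctly, those are settings like the following in `php.ini`, which only cap the size of a single HTTP request body, so they don't bound the total size of a chunked upload where every chunk is a small request (values below are just examples):

```ini
; Example values only - these limit one HTTP request, not the assembled chunked upload
upload_max_filesize = 512M
post_max_size = 512M
max_input_time = 3600
max_execution_time = 3600
```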

I've found that with the Flow app I can limit the maximum file size a user can upload to Nextcloud, BUT again the user is still able to temporarily upload a huge file far beyond their limits.

For example, I've set a rule to prevent users from uploading files bigger than 2GB. Then I place a 10GB file in the client's Nextcloud folder. The upload starts, and on the server you can see the chunks grow until they reach 10GB; only at the consolidation stage does Nextcloud reject the file and remove the chunks.
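Something like this quick polling script is enough to watch the chunks pile up on the server during the test (the data directory path and user name are placeholders for my setup):

```python
# Watch the size of a user's chunked-upload folder grow during a transfer.
# Path and user name below are placeholders.
import os
import time

UPLOADS_DIR = "/var/www/nextcloud/data/testuser/uploads"  # hypothetical path

def dir_size(path):
    """Total size in bytes of everything under `path`."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # a chunk may disappear between listing and stat
    return total

while True:
    print(f"{dir_size(UPLOADS_DIR) / 1024**3:.2f} GiB of pending chunks")
    time.sleep(5)
```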

So theoretically, as I understand it, a malicious (or simply careless) user could create a kind of DoS on the server by uploading huge files.

Again, is there any way to prevent this behaviour and limit the maximum file size a user can upload to the system BEFORE the consolidation stage?

Regards