We are running some tests with very limited per-user storage but huge shared folders. We've found that a user can attempt to upload files bigger than their own quota, and even bigger than the quota of the shared resource.
For example, a user with 1GB of storage tries to upload a 10GB file to a shared resource that has a limit of 5GB. Monitoring the filesystem, I've seen that the chunks keep accumulating in the user's upload folder until the upload completes. Only then does the system raise an error saying the shared folder cannot store the file because of its limits, and it deletes the temporary chunks.
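For reference, this is roughly how I monitored the chunk growth. The data directory path is an assumption based on a default installation layout; adjust it to your instance:

```shell
# Assumed default data directory; override with DATADIR=... if yours differs.
DATADIR="${DATADIR:-/var/www/nextcloud/data}"

# Report the size of every user's upload (chunk) area.
# While a chunked upload is in progress, this number keeps growing
# even when the target shared folder could never hold the final file.
du -sh "$DATADIR"/*/uploads 2>/dev/null || echo "no chunk folders found"
```

Running this repeatedly (e.g. under `watch`) during the 10GB upload showed the chunks filling up well past the 1GB user quota.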
That's fine for a 10GB file. But my fear is: what happens if a user tries to upload a 100GB file, a 2TB file, or a 50TB file? Could a user fill up the server's storage this way?
I've been looking for these limits in the documentation, but everything I found refers to raising PHP upload limits, which is not relevant to this scenario where chunking is involved.