Using NC 19.0.6. Trying to use files_antivirus with clamscan.
Randomly getting unrelated error messages about infected files. I found out it depends on the size of the uploaded file: files over 10MB are uploaded in multiple pieces (this is fine), but the uploaded pieces are passed on stdin to clamscan as if they were separate files (this is weird). Consequently, the subsequent pieces trigger random behavior in the virus scanner, because it can only determine the file type from the content itself (piped on stdin).
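To illustrate what I think is happening, here is a minimal sketch (in Python, not the actual PHP app code; it assumes clamscan is on PATH and the file name is just a placeholder):

```python
# Minimal sketch of the behavior described above: split an upload into
# 10 MB pieces and hand each piece to clamscan on stdin as if it were a
# complete file. Assumes clamscan is on PATH; the path is a placeholder.
import subprocess

CHUNK_SIZE = 10 * 1024 * 1024  # the default 10 MB chunk size

def scan_in_pieces(path: str) -> None:
    with open(path, "rb") as f:
        index = 0
        while True:
            piece = f.read(CHUNK_SIZE)
            if not piece:
                break
            # "clamscan -" reads the data to scan from stdin, so each piece
            # is scanned as a standalone file and its type has to be guessed
            # from the piece's content alone.
            result = subprocess.run(["clamscan", "-"], input=piece,
                                    capture_output=True)
            print(f"piece {index}: clamscan exit code {result.returncode}")
            index += 1

scan_in_pieces("big_upload.pdf")  # placeholder path
```

With a PDF, for example, only the first piece starts with the PDF header; the later pieces are arbitrary byte ranges, which is where the misclassifications come from.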
Yes I know, “executable” mode is the slowest mode of all – but what is the intent behind this behaviour? And how can it be made to ‘just work’?
Just for completeness: I repeated the test with the other two modes (TCP and Unix domain socket) – they behave exactly the same way: files over 10MB are split into pieces, and every piece is passed as a single file to the virus scanner.
Not really; it depends on the file being uploaded. Sometimes the error is simply “can’t upload file”, sometimes it says the file is infected (for instance: you upload a PDF file and it says it is a ZIP bomb), sometimes the upload stops at 10MB and times out.
Hmm, I have left all the settings at their default values. Which of the three modes (executable, tcp, unix socket) works for you?
So you have increased the ‘max stream size’ in Nextcloud’s admin panel too? Would your server really handle files of 1G with ease?
I did not increase these limits. I see that Nextcloud does not honor the limit I set (the limit in the admin panel, not the one of clamd). Instead of STOPPING when the stream limit is reached, it CONTINUES to send snippets of the file, each of LIMIT size, until the end of the uploaded file. Some of these pieces then trigger weird infection messages. I can even reproduce this behavior without ClamAV being involved, by using a wrapper which simply logs what it gets.
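In case it is useful: the wrapper is trivial, roughly like this (a Python sketch configured in place of the clamscan executable; the log path is an arbitrary choice):

```python
#!/usr/bin/env python3
# Logging stand-in for the clamscan executable: record how many bytes
# arrive on stdin per invocation, then exit 0 ("no virus found") so the
# upload is allowed to proceed. The log path is an arbitrary choice.
import sys
from datetime import datetime

LOG_FILE = "/tmp/av_wrapper.log"

data = sys.stdin.buffer.read()
with open(LOG_FILE, "a") as log:
    log.write(f"{datetime.now().isoformat()} "
              f"received {len(data)} bytes on stdin\n")

sys.exit(0)
```

One log line appears per piece, so the chunking is visible even with ClamAV completely out of the picture.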
@it25fg On Nextcloud, I have also changed the value in the admin panel. My server has no problem with these values and scans without performance problems (I have an i5 4xxxx).
The pieces of the file are due to the chunk size, which is set to 10M by default. In my case, I have set it to 1000M, so I had to increase the max stream size value as well (set to 1100M, because sometimes an error was raised due to the max stream size limit being reached). Maybe increasing your value to 15M in clamd.conf can solve your issue.
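For reference, the two knobs meant here are clamd’s stream limit and Nextcloud’s upload chunk size; roughly like this (the values are the ones from this post; the admin-panel keys of files_antivirus itself are not shown here):

```
# /etc/clamav/clamd.conf – clamd rejects streams larger than this
StreamMaxLength 1100M

# Nextcloud upload chunk size in bytes (1000M ≈ 1048576000), set via occ
sudo -u www-data php occ config:app:set files max_chunk_size --value 1048576000
```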
No. I can reproduce the weird behavior even without clamd, so clamd is not the culprit.
The first chunk – yes, this is expected: I set the limit, Nextcloud sends this many bytes. The problem is that Nextcloud does not stop at that point. It sends more and more chunks until the file size is exhausted. Increasing the limit only shifts the problem, it does not solve it. You would hit it again as soon as somebody uploads a file bigger than 1G. Would you solve that by increasing the limit to, say, 2G?
@it25fg I don’t have any issue when uploading files of more than 1 GB (I have set the chunk size to 1G to save time when I upload big files, around 3-4 GB per file).