Weird behavior of Nextcloud when using clamscan

Hello everyone,

Using NC 19.0.6. Trying to use files_antivir with Clamscan.

Randomly getting unrelated error messages about infected files. I found out it depends on the size of the uploaded file: files over 10 MB are uploaded in multiple pieces (this is fine), but each uploaded piece is passed on stdin to clamscan as if it were a separate file (this is the weird part). Consequently, the subsequent pieces trigger random behavior in the virus scanner, because it can only determine the file type from the content itself (piped on stdin).
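To illustrate what I mean (a sketch with shrunk sizes and a fabricated file, no clamscan needed): only the first chunk of a split file carries the magic bytes that content-based type detection relies on.

```shell
# Fake "PDF" larger than one chunk: a PDF header followed by filler bytes.
{ printf '%%PDF-1.7\n'; head -c 32768 /dev/zero | tr '\0' 'A'; } > /tmp/big.pdf
# 16 KiB stands in here for the real 10 MB chunk size.
split -b 16384 /tmp/big.pdf /tmp/chunk.
head -c 5 /tmp/chunk.aa; echo    # %PDF- : first chunk still looks like a PDF
head -c 5 /tmp/chunk.ab; echo    # AAAAA : later chunks carry no magic bytes at all
```

A scanner fed /tmp/chunk.ab on stdin has to guess its type from arbitrary bytes, which is exactly where the misdetections come from.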

Yes, I know “executable” mode is the slowest of all the modes – but what is the intent behind this behaviour? And how can it be made to ‘just work’?

Just for completeness: I repeated the test with the other two modes (TCP and Unix domain socket) – they behave exactly the same way: files over 10 MB are split into pieces, and every piece is passed to the virus scanner as if it were a single file.

Someone out there who knows how to fix this?

Hi @it25fg, I have never encountered this issue on my Nextcloud. Can you send the error message?

Not really, it depends on the file being uploaded. Sometimes the error is simply “can’t upload file”, sometimes it says the file is infected (for instance: you upload a PDF file and it says it is a ZIP bomb), sometimes the upload stops at 10MB and times out.

Hmm, I have left all the settings at their default values. Which of the three modes (executable, tcp, unix socket) works for you?

@it25fg I’m using the Unix socket. This is really strange…

@Mageunic Thanks, let’s see if someone else™ has some clue.

@it25fg here is my clamd.conf

my clamd.conf

LocalSocket /var/run/clamav/clamd.ctl
FixStaleSocket true
LocalSocketGroup clamav
LocalSocketMode 666
User clamav
ScanMail true
ScanArchive true
ArchiveBlockEncrypted false
MaxDirectoryRecursion 15
FollowDirectorySymlinks false
FollowFileSymlinks false
ReadTimeout 180
MaxThreads 12
MaxConnectionQueueLength 15
LogSyslog false
LogRotate true
LogFacility LOG_LOCAL6
LogClean false
LogVerbose false
PreludeEnable no
PreludeAnalyzerName ClamAV
DatabaseDirectory /var/lib/clamav
OfficialDatabaseOnly false
SelfCheck 3600
Foreground false
Debug false
ScanPE true
MaxEmbeddedPE 10M
ScanOLE2 true
ScanPDF true
ScanHTML true
MaxHTMLNormalize 10M
MaxHTMLNoTags 2M
MaxScriptNormalize 5M
MaxZipTypeRcg 1000M
ScanSWF true
ExitOnOOM false
LeaveTemporaryFiles false
AlgorithmicDetection true
ScanELF true
IdleTimeout 30
CrossFilesystems true
PhishingSignatures true
PhishingScanURLs true
PhishingAlwaysBlockSSLMismatch false
PhishingAlwaysBlockCloak false
PartitionIntersection false
DetectPUA false
ScanPartialMessages false
HeuristicScanPrecedence false
StructuredDataDetection false
CommandReadTimeout 30
SendBufTimeout 200
MaxQueue 100
ExtendedDetectionInfo true
OLE2BlockMacros false
AllowAllMatchScan true
ForceToDisk false
DisableCertCheck false
DisableCache false
MaxScanTime 120000
MaxScanSize 1000M
MaxFileSize 1000M
MaxRecursion 25
MaxFiles 15000
MaxPartitions 50
MaxIconsPE 100
PCREMatchLimit 10000
PCRERecMatchLimit 5000
PCREMaxFileSize 100M
ScanXMLDOCS true
ScanHWP3 true
MaxRecHWP3 16
StreamMaxLength 1100M
LogFile /var/log/clamav/clamav.log
LogTime true
LogFileUnlock false
LogFileMaxSize 0
Bytecode true
BytecodeSecurity TrustSigned
BytecodeTimeout 60000
OnAccessMaxFileSize 5M

So you have increased the ‘max stream size’ in Nextcloud’s admin panel too? Can your server really handle files of 1 GB with ease?

I did not increase these limits. I see that Nextcloud does not honor the limit I set (the limit in the admin panel, not the one in clamd). Instead of stopping when the stream limit is reached, it continues to send snippets of the file, each of the limit’s size, until the end of the uploaded file is reached. Some of these pieces then trigger weird infection messages. I can even reproduce this behavior without ClamAV being involved, by using a wrapper which simply logs what it receives.
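For completeness, here is a minimal sketch of such a wrapper (the path and file names are my own choice): configure it as the scanner binary in “executable” mode and it logs the size of every “file” it is handed on stdin, then reports clean.

```shell
cat > /tmp/av-wrapper.sh <<'EOF'
#!/bin/sh
# Log how many bytes arrive on stdin per invocation, then report "clean".
wc -c >> /tmp/av-wrapper.log
exit 0
EOF
chmod +x /tmp/av-wrapper.sh
printf '0123456789' | /tmp/av-wrapper.sh   # simulate one 10-byte "chunk"
cat /tmp/av-wrapper.log                    # one line per invocation: its byte count
```

With this in place, uploading a 35 MB file produces several log lines (one per chunk) instead of the single one you would expect.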

I’m not sure whether “Filter False-Positive PUA.Doc.Packed.EncryptedDoc-6563700-0” (issue #161 in nextcloud/files_antivirus on GitHub), among other false-positive bugs, is ultimately the same thing, but it smells that way.

@it25fg On Nextcloud, I have also changed the value in the admin panel. My server has no problem with these values and scans without performance problems (I have an i5 4xxxx).
The file being split into pieces is due to the chunk size, which defaults to 10 MB. In my case, I have set it to 1000 MB, so I also had to increase the max stream size (to 1100 MB, because sometimes errors occurred when the max stream size limit was reached). Maybe increasing your value to 15 MB in clamd.conf can solve your issue.
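For reference, the upload chunk size can be changed via occ (the web-server user and install path below are assumptions for a typical Debian setup; adjust to yours):

```shell
# Set the chunked-upload size to ~1000 MB (the value is in bytes).
sudo -u www-data php /var/www/nextcloud/occ \
  config:app:set files max_chunk_size --value 1048576000
```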

No. I can reproduce the weird behavior even without clamd, so clamd is not the culprit.

The first chunk – yes, this is expected: I set the limit, Nextcloud sends this many bytes. The problem is that Nextcloud does not stop at this point. It sends more and more chunks until the file is exhausted, and that is the problem. Increasing the limit can shift the problem somewhat, but it does not solve it: you would encounter it as soon as somebody uploads a file bigger than 1 GB. Would you solve that by increasing the limit to, say, 2 GB?

@it25fg I don’t have any issue when uploading files of more than 1 GB (I have set the chunk size to 1 GB to save time when uploading big files of around 3–4 GB each).