ClamAV scan of data directory - unreasonably huge size


Nextcloud version: 15.0.5
Operating system and version: Ubuntu Server 18.04 LTS
Apache or nginx version: Apache 2.4.29
PHP version: 7.2.15

The issue you are facing: My data directory is mounted from a 2 TB NAS, and I assigned 50 GB to it. At the moment I have 3 users, each with a 10 GB quota. If I run:

    du -sh data

the result is 22T, which is beyond any reasonable explanation and more than the NAS could even hold.
For instance, USER1 has used 2.5 GB according to the web interface, but when I run:

    du -sh data/USER1

the result is 312G.
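One way to narrow this down (a sketch, assuming GNU coreutils `du` on the server) is to compare allocated blocks with apparent size; a large gap in either direction points at sparse files or the NAS filesystem's block accounting rather than real content. The demo below uses a temporary directory with a sparse file to show the difference:

```shell
# Show how du's block accounting can differ from logical file size
# (assumes GNU coreutils: truncate, du --apparent-size).
demo=$(mktemp -d)
truncate -s 1G "$demo/sparse.img"    # 1 GiB logical size, ~0 blocks on disk

du -sh "$demo"                       # blocks actually allocated (tiny)
du -sh --apparent-size "$demo"       # logical size (about 1.1G)

rm -rf "$demo"
```

Running the same pair of commands against the real directory, `du -sh data` vs. `du -sh --apparent-size data`, would show whether the 22T figure comes from block accounting on the NAS mount.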

It could be a miscalculation by `du`, or maybe Nextcloud creates cross-linked files somewhere that distort the reported directory size.

Has anybody faced a similar situation?

Is this the first time you’ve seen this error?: Y