A way to notify users of file/folder deletion

Hello!

Our Nextcloud instance has hundreds of users, with dozens of shares used as common folders.
Almost everybody uses the desktop client, and accidental deletions were common… So we decided to write a script that parses the Apache access log and sends a daily report of deleted items to all users.

That was working pretty well until we got more and more false positives, mostly from the same accounts but not exclusively.
So I wonder if there is an explanation, or a better way to accomplish what we want…

The access log uses the format %h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"
So we use every line with DELETE in it, such as:

178.x.x.x - - [03/Jul/2024:15:44:26 +0200] "DELETE /remote.php/dav/files/user/thefolderthenfile HTTP/1.1" 204 635 "https://clood.compagnie.org/" "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/115.0" 52086

Which is a real deletion. But the same day we also had:

178.x.x.x - - [03/Jul/2024:13:36:02 +0200] "DELETE /remote.php/dav/files/user/thefolderthenfile HTTP/1.1" 204 635 "-" "Mozilla/5.0 (Linux) mirall/3.1.1-2+deb11u1 (Nextcloud)" 47382

About a file that wasn’t actually deleted.
Sometimes these false positives come with HTTP codes like 404/500, and the script ignores those. But here we have the same kind of log entry twice: one for a real deletion and one for a false positive…
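
For context, the relevant part of our script boils down to something like this (a simplified sketch; the real script differs in details and the recipient address is a placeholder):

#!/bin/bash
# Simplified sketch of our current approach: keep successful DELETE
# requests against the WebDAV endpoint (HTTP 204; the status is
# field 9 in this log format) and mail them as a daily report.
LOG="/var/log/apache2/access.log"

grep '"DELETE /remote.php/dav/files/' "$LOG" \
  | awk '$9 == 204 { print $4, $7 }' \
  | mail -s "Files deleted today" all-users@clood.compagnie.org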

Any idea how to reliably detect actual deletions using the Apache logs? Or a Nextcloud log? (I couldn’t find a way with Nextcloud.) Or something else?

Perhaps you can simply use rsync under Linux. You synchronise all files to another storage (you might be doing this for backups anyway) and log which files get deleted there (first in the source, then in the destination).

Use the rsync option -v in combination with --delete.
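
Here is a minimal sketch of that idea (paths and recipient are placeholders; test with --dry-run first):

#!/bin/bash
# Mirror the data directory to a second storage. With --delete, rsync
# removes files from the destination that have vanished from the source,
# and with -v it prints each of them as a "deleting <path>" line.
SRC="/var/nextcloud/data/"
DST="/mnt/backup/nextcloud-data/"

rsync -av --delete "$SRC" "$DST" \
  | grep '^deleting ' \
  | mail -s "Files deleted since last sync" admins@example.org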

Overall, however, I would rather work out a backup/restore strategy: if something is missing, it has to be restored from the backup. I wouldn’t inform anyone; I would just tell everyone how long backups are kept and whom to ask if they really miss a file.

As a compromise, you could perhaps build the query on the rsync log so that the deletion of entire directories is recognised and evaluated. That could be rather interesting.

Thanks for the idea! I wonder how that would work when moving files, maybe it would be a hassle…

We have to warn our users: we rotate backups, and a missing file might only be noticed months later… And massive accidental deletions do happen, regularly…

You should not parse the Apache logfile. You should use the admin_audit logfile instead.

The good part of the audit logfile is that it is in JSON format, which means it is extremely easy to handle. → The command-line JSON processor jq is needed for this.

You must → activate the audit logfile ← (as a file, not to syslog). Then you can monitor all file deletions without any false positives.
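
If I remember the option names correctly, enabling it looks like this with occ, run from the Nextcloud directory (adjust the web server user and the logfile path to your setup):

# Enable the bundled auditing app:
sudo -u www-data php occ app:enable admin_audit

# Write the audit log to its own file instead of the default log/syslog:
sudo -u www-data php occ config:system:set log_type_audit --value 'file'
sudo -u www-data php occ config:system:set logfile_audit --value '/var/log/nextcloud/audit.log'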

Here is an example to obtain a list of all files deleted since ‘yesterday 00:00:00’ in your local timezone:

LOGFILE="/path/to/audit.log"
# everything since yesterday at midnight, in your local timezone
START_DATE="$(date -d 'yesterday 00:00:00' +%Y-%m-%dT%H:%M:%S%z)"

# -r prints raw strings instead of JSON-quoted ones
jq -r --arg start "$START_DATE" '
  select(.method == "DELETE"
    and (.time | strptime("%Y-%m-%dT%H:%M:%S%z") | mktime
      >= ($start | strptime("%Y-%m-%dT%H:%M:%S%z") | mktime))) |
  [.time, .remoteAddr, .user, .url, .message, .userAgent] | join(" - ")
' "$LOGFILE"

Instead of ‘yesterday 00:00:00’, you can use ‘2 days ago 00:00:00’ or ‘last week 00:00:00’
(these alternatives are described in the “Date input formats” section of the info date documentation).


Put these commands in a little script, invoke it from a cron job (see the example below), and you have your daily list of deleted files.
Maybe you want to combine it with logrotate as a “prerotate” job; if you rotate on a daily basis, you do not need the time calculation from my example, which makes the jq command a lot easier.
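
For example, a cron entry could look like this (illustrative paths and recipient; the script would contain the jq command from above):

# /etc/cron.d/nextcloud-deletion-report — every day at 06:00, mail the
# list of files deleted since yesterday to the admins:
0 6 * * * root /usr/local/bin/deleted-files.sh | mail -s "Deleted files report" admins@example.org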


→ Here ← I explained some more about filtering the audit.log with jq.


Good luck,
ernolf

Thanks Ernolf, that sounds very promising! I’ll try that when I have time, and will mark your answer as the solution if it works (I’m confident it will).