Delete users files and directory with script


So at the company where I work we have an internal Nextcloud that we only want to use as a sharing instance for customers etc., where we can put stuff in and share it.

After a while (7 days-ish) the files should be deleted, because the Nextcloud only has 100 GB of space.

I have been looking in the DB and tried a script that deletes directly in the users' folders, but I guess I still need to find the file in the DB as well.

Is there any easy way to find every user's files/folders older than x minutes/days and delete them with a script? I can't find a good option for that in occ, or a way to tell which file is owned by whom in oc_filecache.

I think an easy possibility is to use subfolders (or independent groupfolders) with names like “week 01”, “week 02”. Your customers do not see them.

Then set all users to a small quota and the groupfolders to a big quota (if that is possible). Also, educate the users to always use the newest week folder for sharing. If the users like, they can create their own subfolders, but they should not mark one folder as a favorite and never change it :wink:
After a few generations, or when you run out of space, you delete the old folders.
The advantage is that your users can go back a few weeks if necessary.

I think there are better ideas.

The thing is that we don’t really want to maintain it, the plan is:

  1. A colleague asks for access to the server (LDAP connected)
  2. The colleague should then be able to log in and share whatever needs to be shared with someone outside of the company, for example images or files that are too big for e-mail and that they might want to keep a bit more secure.
  3. The shared object should then be “stored” on the server for approximately 7–14 days and then get deleted, so that the server doesn't get overloaded with unused files.

My first plan was to delete the files in /data/account by comparing their Unix time to the current time: if older than x days, delete.

result=$(sudo mysql "$database" -s -N -e \
        "SELECT $dbItem \
        FROM $dbTable \
        ORDER BY $dbOrderBy ASC")

for account in $result; do
    accountFiles="$nxtcFolder/$account/files"
    [ -d "$accountFiles" ] || continue
    sudo find "$accountFiles" -type f -mtime +1 -delete
    sudo find "$accountFiles" -type d -empty -delete
done

But I found out that even if you delete the file directly in /data/account, it will still show up in the DB, and if you take a look at the table oc_filecache, it isn't really possible to tell which file belongs to whom. Not sure if I'm just blind or stupid, but I can't find the connection in the DB anyway.
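For reference, the standard Nextcloud schema does seem to link files to owners: oc_filecache.storage joins to oc_storages.numeric_id, and per-user home storages have ids of the form home::&lt;username&gt;. A sketch of a query along those lines (default table prefix assumed, database name passed via the hypothetical $NEXTCLOUD_DB variable):

```shell
# Assumed standard schema: oc_filecache.storage -> oc_storages.numeric_id,
# and per-user home storages have ids of the form "home::<username>".
QUERY="SELECT s.id AS owner_storage, f.path, FROM_UNIXTIME(f.mtime) AS modified
FROM oc_filecache f
JOIN oc_storages s ON s.numeric_id = f.storage
WHERE s.id LIKE 'home::%'
  AND f.path LIKE 'files/%'
  AND f.mtime < UNIX_TIMESTAMP() - 7*24*3600"

# Only run the query where the Nextcloud database is actually reachable.
if command -v mysql >/dev/null 2>&1 && [ -n "${NEXTCLOUD_DB:-}" ]; then
    sudo mysql "$NEXTCLOUD_DB" -s -N -e "$QUERY"
fi
```

Filtering on `f.path LIKE 'files/%'` skips cache/trashbin entries that live under the same storage.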

Our plan was to run this script as a cronjob each day or so.
I don't think groupfolders will work without a lot of manual work, because the problem is finding the files that are older than x days.
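The daily cronjob could be sketched roughly like this (the data path, the 7-day cutoff, and the helper name cleanup_old are all assumptions; the occ rescan at the end is what keeps oc_filecache in sync with the disk):

```shell
#!/usr/bin/env bash
# Sketch of a daily cleanup for a share-only Nextcloud instance.
# Assumptions: data directory layout <data>/<account>/files, GNU find.
set -u

cleanup_old() {
    local data_dir="$1" days="$2" account_files
    for account_files in "$data_dir"/*/files; do
        [ -d "$account_files" ] || continue
        # Remove files older than $days days, then any directories that
        # are left empty (find cannot -delete a non-empty directory).
        find "$account_files" -mindepth 1 -type f -mtime +"$days" -delete
        find "$account_files" -mindepth 1 -type d -empty -delete
    done
}

cleanup_old "${NEXTCLOUD_DATA:-/var/www/nextcloud/data}" "${DAYS:-7}"

# Resync the DB with what is now on disk (run as the web server user):
# sudo -u www-data php /var/www/nextcloud/occ files:scan --all
```

A cron entry like `0 3 * * * root /usr/local/bin/nextcloud-cleanup.sh` (script path assumed) would then run it once a day.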

I think for this sharing feature you would be better off with dedicated tools like (demo without ssl: ) (demo: )

On Nextcloud I think the user is the owner of his data.
On all Nextclouds I know of, this problem is solved with quotas.

With 100 GB you can host 100 users with a 1 GB quota, or 20 users with 5 GB each.
If you delete files after 7 days, 1000 users can only upload 100 MB each per week.
You need a quota.

Yes, the user is the owner of the data, but as an admin with CLI access and access to occ, you should be able to delete files and such if needed.

> With 100 GB you can host 100 users with 1 GB quota or 20 users with 5 GB.
> If you delete files after 7 days 1000 users can only upload 100 MB each a week.
> You need a quota.

Not really; we're not that big of a company. We have 300 employees and not everyone will use this Nextcloud, so a quota won't really be needed. If more people start using it, we'll just add more disk storage to the server.

The idea of the server is to be able to share big folders, files, pictures or whatever with the people you need to share with.

So this isn’t a quota issue.

Perhaps you can use automatic tags. Read:
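If the tag route means what I think it does, it combines the “Files automated tagging” and “Retention” apps: the first tags new uploads, the second deletes files carrying a tag after a configured period. A hedged sketch of enabling them (app ids as listed in the app store, occ path assumed; verify against your version):

```shell
# Hypothetical helper: enable the two apps usually used for tag-based
# expiry. "Files automated tagging" tags new uploads; "Retention"
# deletes files carrying a given tag after a configured period.
enable_retention_apps() {
    local occ="$1"   # e.g. /var/www/nextcloud/occ
    sudo -u www-data php "$occ" app:enable files_automatedtagging
    sudo -u www-data php "$occ" app:enable files_retention
}

# Only attempt this on a real installation.
if [ -f /var/www/nextcloud/occ ]; then
    enable_retention_apps /var/www/nextcloud/occ
fi
```

The retention period itself is then configured per tag in the admin settings, not on the command line.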

I think this is exactly what you need:


This one seems to work, thanks for the help!