S3 as primary storage not being cleaned up; suspected cron issue on one.com - missing posix



Here is my setup:
Latest Nextcloud running on one.com, with S3 as primary storage (Wasabi).

Storage usage is not decreasing when I remove files - not even after 2-3 days.

I suspect it's a problem with cron - running it either as webcron or AJAX makes no difference, and I get an error from cron saying it hasn't run correctly in XX days.

I tried to run the occ cleanup command, but here I get:
The posix extensions are required - see https://www.php.net/manual/en/book.posix.php

Running php cron.php gives the same error.
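For reference, once CLI PHP works, the usual fix for the "cron hasn't run" warning is to switch the background-job mode from AJAX/webcron to a real cron job. A sketch of the crontab entry, assuming the install path from the error output below and a working PHP CLI on the PATH (adjust both to your setup):

```shell
# Run Nextcloud background jobs every 5 minutes.
# Path is an example taken from the error output; adjust to your install.
*/5 * * * * php -f /customers/f/1/0/DOMAIN/httpd.www/cloud/cron.php
```

On shared hosting without crontab access, the hoster's control panel usually offers an equivalent "scheduled task" feature.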

I have contacted one.com to ask whether posix support can be enabled.

Since I do have SSH access, I tried compiling another PHP, but that led to a whole new set of problems.

./php-cgi7.4 cron.php
X-Powered-By: PHP/7.4.25
Content-type: text/html; charset=UTF-8

Fatal error: Uncaught Error: Call to a member function getLogger() on null in /customers/f/1/0/DOMAIN/httpd.www/cloud/cron.php:165 Stack trace: #0 {main} thrown in /customers/f/1/0/DOMAIN/httpd.www/cloud/cron.php on line 165

Any suggestions here ?

Ok, so an update here.

The cron issue was resolved - a PHP module was missing on the shared hosting.

And after spending (way too much) time on GitHub reading through the different reports on S3, it honestly scares me - it seems to me that we have some rather big bugs:

  1. Uploads can fail and leave a lot of dead files on S3. E.g. in my test, my users have 11 GB of data, but it takes up 19 GB on S3. I could go through the DB manually and compare files - but I shouldn't have to.

  2. If you delete a user WITHOUT deleting the user's data, the data will just hang there, with no way of cleaning it up.

  3. occ commands do not help clean up the S3 storage.

I think it would be wise to create some kind of script that compares the DB to the S3 storage and cleans up orphans… and of course to fix the bugs above.
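The core of such a comparison is small. Below is an illustrative, hypothetical helper (not an official Nextcloud tool) sketching the idea; it assumes the default object-store naming, where each file is stored as an object named `urn:oid:<fileid>` with the numeric id matching `oc_filecache.fileid` - you would feed it the fileids from the DB and the object keys listed from the bucket:

```python
def find_inconsistencies(db_file_ids, s3_keys, prefix="urn:oid:"):
    """Compare oc_filecache fileids against S3 object keys.

    Returns (orphaned, missing):
      orphaned -- object ids on S3 with no DB row (dead data, cleanup candidates)
      missing  -- DB fileids with no S3 object (potential data loss!)
    """
    s3_ids = {int(key[len(prefix):]) for key in s3_keys
              if key.startswith(prefix) and key[len(prefix):].isdigit()}
    db_ids = set(db_file_ids)
    return s3_ids - db_ids, db_ids - s3_ids


# Example with made-up ids:
orphaned, missing = find_inconsistencies(
    db_file_ids=[101, 102, 103],
    s3_keys=["urn:oid:101", "urn:oid:102", "urn:oid:999"],
)
print(orphaned)  # {999} -- on S3 but not in the DB
print(missing)   # {103} -- in the DB but not on S3
```

Obviously never delete anything based on such a check without a verified backup - previews and versions have fileids of their own in the same table, so they match up the same way, but any mistake here is unrecoverable.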


Hi @valkin. I’m curious what your experience with S3 has been so far? I’m thinking of switching from volume-based storage to this.

I gave up on S3… sadly.

I think it all comes down to migrating the right way…

There is one catch, though: file names etc. only exist in the database when using S3. I have suggested here and there that it would be a good idea to use the file/folder capability of S3 for redundancy, but that has not happened.

Therefore, it is a good idea to set up a MySQL master/slave structure to keep a live backup of your SQL data… (better safe than sorry :wink: )
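A minimal sketch of what that replication setup involves, assuming stock MySQL/MariaDB config files (the server ids and log names are arbitrary examples):

```ini
# primary, in my.cnf
[mysqld]
server-id = 1
log_bin   = mysql-bin

# replica, in my.cnf on the standby host
[mysqld]
server-id = 2
read_only = ON
relay_log = mysql-relay
```

The replica is then pointed at the primary with `CHANGE MASTER TO … / START SLAVE` (or `CHANGE REPLICATION SOURCE TO …` on newer MySQL versions) and keeps a continuously updated copy of the database.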

I’ve been using S3 for over a year now, and it is running smoothly.

Do keep in mind there are quite a few “extra files” out there, preview files and such. If all goes well, about the same number (and size) of files should be in S3 as there would be on local storage.

That said, I have very little uploading trouble. I have read that extremely large files can have trouble uploading… and I do remember having done some tweaks there (sorry, but I don’t remember which ones). Is the S3 hoster you chose a bit slow? Have you tested another one?

I use https://www.ovhcloud.com/ and am quite content with them as my Nextcloud/S3 solution.

I have tested this (inadvertently). And the files did get removed (version 25). Some caching files weren’t removed, though (do read on… :wink: )

@tarek asked me about the “local → S3” tool… and I decided to share the tool I built to perform a proper migration with a bunch of checks. One of the checks needed for a “multi-step migration” was to look for files on S3 that weren’t in the database (and vice versa).

So the “bonus” of my migration script is that it can be used to check for inconsistencies!
So in essence this script is the “occ files:cleanup” that works on local storage but not with S3 (Nextcloud team: hint, hint :wink: )

When I deleted a user I saw some cache files being deleted. When I have the time, I’ll look into this further.

I have only recently published it on GitHub and no one but me has used it (to my knowledge), so please read the README.md!

Take a look: