High AWS S3 costs due to Nextcloud requests

The “Check for changes on direct access” mount option causes Nextcloud to rescan the entire directory on every access to see whether any files were added or changed.

I don’t know exactly how the S3 backend implements that check, but given that S3 support is not a priority for the developers, it likely issues a request per file. So if you have 30,000 files in a directory, just opening it can burn 30,000 API requests. I’m not sure whether the same applies to cron scans, either.
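To get a feel for the numbers, here is a rough back-of-the-envelope sketch. It assumes one GET/HEAD-class S3 request per file per directory scan and AWS’s published us-east-1 price of $0.0004 per 1,000 GET/HEAD requests; check current pricing for your region, and note the real request pattern may differ:

```python
FILES = 30_000               # files in the directory (from the example above)
PRICE_PER_1000 = 0.0004      # assumed USD per 1,000 GET/HEAD requests

def cost_per_scan(files: int, price_per_1000: float) -> float:
    """Cost of one full directory scan, one request per file."""
    return files / 1000 * price_per_1000

per_scan = cost_per_scan(FILES, PRICE_PER_1000)
per_month = per_scan * 24 * 30   # e.g. one scan per hour, every day
print(f"${per_scan:.4f} per scan, ~${per_month:.2f}/month")
```

A single scan looks cheap (around a cent), but multiplied across frequent accesses or hourly background jobs it adds up quickly, which matches the surprise bills people report.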

Nextcloud is the metadata server for S3: uploads, downloads, changes, deletions, and new files are all handled directly by Nextcloud. Checking the bucket for external changes is pointless and a waste of money.
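If you’d rather change this from the command line than the admin GUI, external mounts can be configured via `occ`. This is a sketch, assuming your mount ID is `1` (list mounts first to find yours); the option name matches the “check for changes” setting, where `0` means “never”:

```shell
# List external storage mounts to find the mount ID
sudo -u www-data php occ files_external:list

# Disable "check for changes on direct access" for mount 1
sudo -u www-data php occ files_external:option 1 filesystem_check_changes 0
```

After that, Nextcloud trusts its own database for the bucket’s contents instead of re-listing it on access.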

Hopefully that fixes it for you!