Failed to open stream: Too many open files

My Nextcloud (v17) cron job runs every 15 minutes, with the Preview Generator pre-generate cron running every hour. The environment is small (family use) with few changes at any one time. Every 12 hours, on the button, I get errors similar to the ones below when the cron runs. I'm on a shared host, so I don't have full control of the environment. When I asked my hosting provider, they said: "For stability and performance reasons no hosting account on the server should have more than 500 files opened simultaneously."
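For reference, the two cron entries look roughly like the sketch below. The PHP binary path is a guess on my part (the shared host's real path may differ); the install path is the one that appears in the log messages further down.

    # Nextcloud background jobs every 15 minutes
    */15 * * * * /usr/local/bin/php -f /home/pmhdomain/www/www/nextcloud/cron.php

    # Preview Generator pre-generate run every hour
    0 * * * * /usr/local/bin/php -f /home/pmhdomain/www/www/nextcloud/occ preview:pre-generate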

Is there anything I can do to fix these 12-hour errors? What does the cron job do every 12 hours versus the interim runs? It seems the 12-hour run must be doing more. This happens in the middle of the night, when nobody is doing anything on my Nextcloud install.

{"reqId":"1nScOr2rsD5ZvU2LLeJ5","level":3,"time":"2019-10-13T22:57:26+00:00","remoteAddr":"","user":"--","app":"PHP","method":"","url":"--","message":"include(/home/pmhdomain/www/www/nextcloud/lib/public/Files/NotPermittedException.php): failed to open stream: Too many open files at /home/pmhdomain/www/www/nextcloud/lib/composer/composer/ClassLoader.php#444","userAgent":"--","version":"17.0.0.9"}
{"reqId":"1nScOr2rsD5ZvU2LLeJ5","level":3,"time":"2019-10-13T22:57:26+00:00","remoteAddr":"","user":"--","app":"PHP","method":"","url":"--","message":"include(): Failed opening '/home/pmhdomain/www/www/nextcloud/lib/composer/composer/…/…/…/lib/public/Files/NotPermittedException.php' for inclusion (include_path='/home/pmhdomain/www/www/nextcloud/3rdparty/pear/archive_tar:/home/pmhdomain/www/www/nextcloud/3rdparty/pear/console_getopt:/home/pmhdomain/www/www/nextcloud/3rdparty/pear/pear-core-minimal/src:/home/pmhdomain/www/www/nextcloud/3rdparty/pear/pear_exception:/home/pmhdomain/www/www/nextcloud/apps') at /home/pmhdomain/www/www/nextcloud/lib/composer/composer/ClassLoader.php#444","userAgent":"--","version":"17.0.0.9"}
{"reqId":"1nScOr2rsD5ZvU2LLeJ5","level":3,"time":"2019-10-13T22:57:26+00:00","remoteAddr":"","user":"--","app":"PHP","method":"","url":"--","message":"include(/home/pmhdomain/www/www/nextcloud/lib/private/Log/ExceptionSerializer.php): failed to open stream: Too many open files at /home/pmhdomain/www/www/nextcloud/lib/composer/composer/ClassLoader.php#444","userAgent":"--","version":"17.0.0.9"}
{"reqId":"1nScOr2rsD5ZvU2LLeJ5","level":3,"time":"2019-10-13T22:57:26+00:00","remoteAddr":"","user":"--","app":"PHP","method":"","url":"--","message":"include(): Failed opening '/home/pmhdomain/www/www/nextcloud/lib/composer/composer/…/…/…/lib/private/Log/ExceptionSerializer.php' for inclusion (include_path='/home/pmhdomain/www/www/nextcloud/3rdparty/pear/archive_tar:/home/pmhdomain/www/www/nextcloud/3rdparty/pear/console_getopt:/home/pmhdomain/www/www/nextcloud/3rdparty/pear/pear-core-minimal/src:/home/pmhdomain/www/www/nextcloud/3rdparty/pear/pear_exception:/home/pmhdomain/www/www/nextcloud/apps') at /home/pmhdomain/www/www/nextcloud/lib/composer/composer/ClassLoader.php#444","userAgent":"--","version":"17.0.0.9"}
{"reqId":"1nScOr2rsD5ZvU2LLeJ5","level":3,"time":"2019-10-13T22:57:26+00:00","remoteAddr":"","user":"--","app":"PHP","method":"","url":"--","message":"Error: Class 'OC\Log\ExceptionSerializer' not found at /home/pmhdomain/www/www/nextcloud/lib/private/Log.php#316","userAgent":"--","version":"17.0.0.9"}
{"reqId":"1nScOr2rsD5ZvU2LLeJ5","level":3,"time":"2019-10-13T22:57:26+00:00","remoteAddr":"","user":"--","app":"PHP","method":"","url":"--","message":"fopen(/home/pmhdomain/www/www/nextcloud/data/nextcloud.log): failed to open stream: Too many open files at /home/pmhdomain/www/www/nextcloud/lib/private/Log/File.php#84","userAgent":"--","version":"17.0.0.9"}

A maximum of 500 open files is quite low. The cron job works through a queue of background operations; I have mine set to run every 15 minutes on my virtual server.
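If you want to see what is actually queued, the background jobs are stored in the database. A rough sketch, assuming the default "oc_" table prefix, a MySQL/MariaDB database, and placeholder credentials (substitute your own database name and user):

    # List the queued background jobs and when each last ran
    mysql -u nextcloud_user -p nextcloud_db \
      -e "SELECT class, FROM_UNIXTIME(last_run) AS last_run FROM oc_jobs ORDER BY last_run;"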

If that vhost needs more than 500 open files, you should consider moving it to a host with a higher limit.
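On a server you control, you can check and raise that limit yourself; on shared hosting only the provider can change it. A rough sketch, assuming a Linux host with shell access:

    # Show the per-process open-file limit for your user
    ulimit -n

    # Count how many files a running cron.php process has open
    # (replace <pid> with the process id of the PHP process)
    ls /proc/<pid>/fd | wc -l

    # On your own server, a persistent limit is usually set in
    # /etc/security/limits.conf, for example:
    #   youruser  soft  nofile  4096
    #   youruser  hard  nofile  8192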

Would running the cron jobs more frequently (than every 15 minutes) help? Other than the mild annoyance of the messages, what are the downsides if this keeps happening? Will these failures "fix themselves" in subsequent runs?

What are the cron jobs actually doing? (I'm a technical noob.)

Just to provide feedback: I persuaded my hosting provider to increase the file limit for my cron jobs (I don't know to what value) and have not had a cron error since.