NC file access leads to system crash


Nextcloud 26 in Docker, apache-stable image, MariaDB 10.5.19


Nextcloud version (eg, 20.0.5): 26.0.0
PHP version (eg, 7.4): 8.1.17

The issue you are facing:

Is this the first time you’ve seen this error? (Y/N): N

Steps to replicate it:

  1. Start the container
  2. Log into NC
  3. Go to the Files → Media folder
  4. Host crashes

Issues: I have a large data directory of 500 GB of images, which I am mapping as a volume into the Nextcloud container. One of the subfolders contains about 1000 files in a single directory.

When I log in via any web browser and open the folder with the 1000 images, the NC container’s activity goes through the roof and the browser becomes unresponsive. After a minute or two the entire host system breaks down and reboots.

Host system has an ARM processor, 4 cores, 4 GB RAM.

What I have tried:

  • restricting the NC container to 525 MB / 1 GB of memory
  • restricting the NC container to 1.0–3.5 CPUs
  • adding Redis caching, with and without the vm.overcommit_memory=1 option

I cannot stabilise the container once I enter said folder. What is NC doing there that takes so much activity and how can I stabilise the container?

The output of your Nextcloud log in Admin > Logging:


The output of your config.php file in /path/to/nextcloud (make sure you remove any identifiable information!):

$CONFIG = array (
  'htaccess.RewriteBase' => '/',
  'apps_paths' =>
  array (
    0 =>
    array (
      'path' => '/var/www/html/apps',
      'url' => '/apps',
      'writable' => false,
    ),
    1 =>
    array (
      'path' => '/var/www/html/custom_apps',
      'url' => '/custom_apps',
      'writable' => true,
    ),
  ),
  'instanceid' => 'ockrumwjt7bi',
  'passwordsalt' => 'aaa',
  'secret' => 'xxx',
  'trusted_domains' =>
  array (
    0 => 'x.x.x.x:8585',
    1 => 'myDomain',
  ),
  'datadirectory' => '/var/www/html/data',
  'dbtype' => 'mysql',
  'version' => '',
  'overwrite.cli.url' => 'http://1x.x.x.x:8585',
  'overwriteprotocol' => 'https',
  'dbname' => 'nextcloud',
  'dbhost' => 'db',
  'dbport' => '',
  'dbtableprefix' => 'oc_',
  'mysql.utf8mb4' => true,
  'dbuser' => 'nextcloud',
  'dbpassword' => 'xxx',
  'installed' => true,
#  'memcache.locking' => '\\OC\\Memcache\\Redis',
#  'memcache.distributed' => '\\OC\\Memcache\\Redis',
  'memcache.local' => '\\OC\\Memcache\\APCu',
#  'redis' =>
#  array (
#    'host' => 'redis',
#    'password' => '',
#    'port' => 6379,
#  ),
);

The output of your Apache/nginx/system log in /var/log/____:



Output errors in nextcloud.log in /var/www/ or as admin user in top right menu, filtering for errors. Use a pastebin service if necessary.


Sounds like you are running your Nextcloud on a Raspi… it is definitely limited in terms of performance, and once you enter a directory with a lot of files, Nextcloud starts generating previews of these files, which might crash the system… definitely not good behaviour, and you should create an issue to address the problem long term…

As a quick and dirty solution:

  • you can try pre-generating previews, see previewgenerator; please read and understand the docs first to get it right fast…
  • you can tell Nextcloud the directory has no media stored (a .nomedia file in the directory)
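Both workarounds above can be sketched as shell commands. This is only a sketch: the container name `nextcloud_app` and the media path are assumptions, so adjust them to your setup.

```shell
# Option 1: mark the folder as containing no media, so Nextcloud
# skips preview generation for it entirely.
# (Path is an example; use your actual mapped media directory.)
touch /path/to/data/youruser/files/Media/.nomedia

# Option 2: install the Preview Generator app via occ, run as the
# web server user inside the container (container name assumed).
docker exec -u www-data nextcloud_app php occ app:install previewgenerator
```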
  • Is this a new installation or did this start after a recent upgrade?
  • ARM is a bit vague, but I’m guessing it’s a Raspberry Pi? Which model?
  • Until you get this situation sorted, you may want to disable on-demand preview generation entirely: Previews configuration — Nextcloud latest Administration Manual latest documentation
  • You mentioned the data directory is mapped as a volume. Can you confirm it is on local storage? (i.e. not connected via an SMB mount/etc.)
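Disabling or limiting on-demand previews, as suggested above, is a config.php change that can also be made from the command line. A minimal sketch via occ; the container name `nextcloud_app` and the 512-pixel cap are assumptions:

```shell
# Disable preview generation entirely (no thumbnails at all):
docker exec -u www-data nextcloud_app php occ config:system:set enable_previews --value=false --type=boolean

# Or keep previews but cap their resolution to reduce the load:
docker exec -u www-data nextcloud_app php occ config:system:set preview_max_x --value=512 --type=integer
docker exec -u www-data nextcloud_app php occ config:system:set preview_max_y --value=512 --type=integer
```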

Thanks, will try that.
Yes, it’s a Raspberry Pi 4, but the performance is fine until I enter the said folder. Even when I access NC via the mobile apps and fetch files, it’s no problem for the server. The RPi runs at 20–50% CPU and uses 2 GB out of 4 GB RAM, with only maybe 50 PIDs running in the NC app container, so no issues. Only when I access the media folder via a browser does NC CPU usage go to 400% and the Apache PID count climb to 450+. Definitely a problem with the NC server and Apache calls. Still, it should never crash the host; if anything, it should lock up the container.
I will try your suggestions above. Thanks.

I have the volume mapped as local storage, but I tested it as SMB and it is the same problem, so it is not a file-protocol issue.
It did not start after a recent upgrade; it happens whenever I enter that Media folder (1000+ images in one folder). I found a (sad) workaround: if I split the 1000+ files into subfolders of around 100 files each, NC can cope. So I guess NC is missing a mechanism for handling large numbers of files in one folder. It seems that NC tries to generate previews for all 1000 files the moment I enter the folder. That is so unnecessary; NC should throttle the preview activity until it has spare system resources.
Thank you, I will try your suggested link.
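The subfolder workaround described above can be scripted. A sketch that moves the files of a flat directory into `part_N` subfolders of 100 files each; the chunk size and the example path are assumptions, adjust them to your media layout:

```shell
# Move all regular files in the given directory into part_0, part_1, …
# subfolders of 100 files each. Subdirectories are left untouched.
split_folder() {
  src=$1
  i=0
  for f in "$src"/*; do
    [ -f "$f" ] || continue            # skip the part_N dirs we create below
    d="$src/part_$((i / 100))"
    mkdir -p "$d"
    mv "$f" "$d/"
    i=$((i + 1))
  done
}
```

Run it once against the mapped media directory, e.g. `split_folder /path/to/Media`, then rescan with `occ files:scan` so Nextcloud picks up the new layout.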

The on-demand (i.e. upon-access) preview generation only happens if you don’t set up the daily/weekly/hourly job for preview generation. It’s pretty taxing (and wasteful) to do it every time you access a folder. I definitely recommend setting up the external cron job that does that. Then the previews get generated once and never again.

The upon-access generation will still occur briefly for newly uploaded files until they get picked up in the next cron run (e.g. that evening, or whatever time you schedule it for).
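For reference, a host-side sketch of that cron setup, assuming the Preview Generator app is already installed and the container is named `nextcloud_app`:

```shell
# One-off: generate previews for all existing files (can take a while
# on a large library, so run it off-hours).
docker exec -u www-data nextcloud_app php occ preview:generate-all

# Then add a host crontab entry (crontab -e) so previews for newly
# uploaded files get pre-generated, e.g. nightly at 02:00:
# 0 2 * * * docker exec -u www-data nextcloud_app php occ preview:pre-generate
```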

That’s how I have my Raspberry Pi 4 set up, and it made a huge difference when browsing folders with media.

As for the crashes… I suspect something else is going on here, like a hardware issue.

Do your host system logs have any hints? A system doesn’t just crash without leaving hints in the logs unless there is a severe enough hardware issue.

  • Is your storage on an external drive by chance?
  • Is your host perhaps overheating?
  • Do you have a spare power supply for your Pi to try?

As an example, I’ve had issues where using two external USB drives seemed okay until certain combined loads across the Pi and the drives saturated the power supply and caused crashes. The Pi is definitely sensitive to that.

Indeed. However, this got improved with NC 26. See 26.0 PHP extension sysvsem

Awesome! Thanks for the tip with cron. I did that. It was a bit awkward because I run NC in a Docker container, but I found a solution and it does improve performance vastly. Many thanks. That cron job should be set up by NC by default; it makes such a difference.

Re the crashes: there is absolutely nothing in syslog etc. The system just stops and reboots. I have a few HDs attached to the RPi via USB, obviously, so maybe the intense DB activity that the previews create is just too much for the HDs.

Yes, but my problems occurred on v26.0.

Do you have php-sysvsem installed, as recommended?

No idea. I run NC in the Docker container.
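Checking this inside the container is quick; the container name `nextcloud_app` is an assumption here:

```shell
# List the PHP modules loaded in the container and look for sysvsem;
# no output means the extension is not loaded.
docker exec nextcloud_app php -m | grep -i sysvsem
```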

If you can track down a Pi compatible externally powered USB hub to attach those drives to, you may see reliability increase since they will no longer tax the Pi’s own power.

I have two USB hubs, each with its own external power supply, running 7 HDs on the Pi. They are only 2.5″ drives, so they don’t draw that much power.