Bad performance on folder with many files


I’m on NextCloud 10.0.1, lighttpd, mysql, debian.
The db files, www home and data dir are mounted on ext4 (an external disk).

The overall performance of Nextcloud in the web interface is pretty fine (about 2-3 secs for a page, sometimes 10). The same goes for WebDAV access - it’s a tad slower but still in the seconds.

I have one folder with about 5000 pictures - and only pictures. All around ~5MB. No subfolders.
Opening that folder in webdav takes about 20-30sec - which is Ok and something which is within my expectations.

But opening the same folder via the web interface takes about 4+ minutes.

While this request runs, iotop shows the disk idling and the mysql process idling. top shows the lighttpd fastcgi process at about 30-40% CPU for the full 4 minutes.

When doing a SHOW PROCESSLIST in mysql I see nothing but a few short bursts of SELECT statements, usually too fast to catch. The slow-query log is empty.
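(Side note: an empty slow-query log doesn’t necessarily rule the database out - MySQL’s default `long_query_time` is 10 seconds, so moderately slow queries never show up. A hypothetical `my.cnf` fragment to lower the threshold; paths and values are examples, adjust to your setup:)

```ini
[mysqld]
slow_query_log      = 1
slow_query_log_file = /var/log/mysql/slow-query.log
long_query_time     = 1   # log anything slower than 1 second
```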

I recently switched from Apache2 to lighttpd because I had the same issues on Apache2 and was under the impression things would get better with lighty. I was wrong…

Does anyone have an idea where I could look further? It seems the code used by the web interface does some extra work that the WebDAV code doesn’t.

And really, this is the first time in my entire life that WebDAV is faster than a plain HTTP request…


Thanks, but that’s about the Files app (like the link in the post, /apps/files/?dir=…).
The gallery initially takes the same amount of time, but scrolling down is pretty fast given my limited upstream bandwidth.

No one experiences similar issues? It’s absolutely down to that folder…

Same. I think it’s a client side problem (guessing at javascript trying to read the metadata on all the files at once?) as it drives the fans on my laptop wild.

In all honesty it’s becoming a bit of a deal breaker, which is a shame as I really don’t want to hand my files back to third party services who scan everything like Google or Dropbox.

Same or similar issue here. We have a folder for publications with currently 41,036 entries. On most clients (Windows, Mac, Linux) none of the entries are shown; the wait cursor circles forever. It simply takes too long to build the list. The search function, however, works surprisingly well. But if I get a small selection, say three items, none of them can be downloaded. So I was looking for a paging plugin - so far to no avail. I know I could remedy this by introducing a large number of subfolders and reorganizing the whole library, yet I’m a bit reluctant to do so. Can somebody point me to an acceptable solution/plugin? Or is this a work item for the next version of Nextcloud?

I am having the same trouble with my system. I use the iOS app to sync the pictures on my phone to Nextcloud.
Trying to view or list them from the Files section takes forever, but the Gallery is fine: it creates a 2x3 grid of icons with the picture name in them and loads reasonably fast. The problem is the phone does not let you break the photos up into different directories. One option is by month and year, but that creates way too many directories, some with only a few pictures in them.


Just wanted to revisit this; I notice similar behavior here. The folder contains about 2000-3000 files (the only large folder).
Browsing other folders is pretty quick. Browsing this folder takes significant time. While this folder is open, searching for filenames also takes significant time (I’m talking 30+ seconds).

Interestingly if I go to another folder I can search for the same filenames in seconds.

I looked at the dev tools in the Chrome browser to see what takes longest to load, and it looks like core.js loads the folder content and the browser then renders it. So I guess PC performance is the key factor here.

I think the problem here is that it loads all the folder content and then renders it as you scroll,
versus the traditional approach where loading is fast because the view is broken down into a certain number of items per page.

I bet a lot of people have the same issue and not sure how it’s addressed. :confused:

For now I only have ~3k files, but the folder is growing; it takes 2.2 MB to load the content, and then the browser is just rendering it.
Any action like searching or changing the sort order by name/file size makes the browser re-render the content, which takes a long time. So I figured the fastest way is to just search by file name from another folder.

Personally I think this is bad practice and there should be a better solution implemented (I don’t mind breaking it down by pages).


I generally avoid very large folders because many file systems become very slow with them. In your case it seems to be the code of the web-interface; there seems to be potential for improvement. You should create a bug report on the bug tracker to improve the performance:

I run a Nextcloud server with nearly 50 active users who upload many small video files. We currently have over 130 GB of video files, and browsing the folder where all of the videos live is painfully slow for many of the users. This is likely because the response to /remote.php/dav/files/&lt;username&gt;/&lt;folder&gt; is 7.5 MB, which they have to download each time. It is entirely one large XML response, and likely the cause of freezing on some clients.
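To illustrate where those megabytes come from (a minimal sketch - the sample body below is hand-written and trimmed, not taken from a real instance): each file in the PROPFIND multistatus response costs its own `<d:response>` element, so the payload grows linearly with the number of files.

```python
# Sketch: count entries in a WebDAV PROPFIND "multistatus" response
# with the stdlib XML parser. Real responses carry a dozen or so
# properties per file, i.e. roughly 1-2 KB of XML each, which is how
# a folder listing can reach several MB.
import xml.etree.ElementTree as ET

# Hand-written sample, trimmed to the href per entry.
sample = """<?xml version="1.0"?>
<d:multistatus xmlns:d="DAV:">
  <d:response>
    <d:href>/remote.php/dav/files/alice/Photos/</d:href>
  </d:response>
  <d:response>
    <d:href>/remote.php/dav/files/alice/Photos/img1.jpg</d:href>
  </d:response>
  <d:response>
    <d:href>/remote.php/dav/files/alice/Photos/img2.jpg</d:href>
  </d:response>
</d:multistatus>"""

root = ET.fromstring(sample)
entries = root.findall("{DAV:}response")
files = len(entries) - 1  # the first <d:response> is the folder itself
print(f"{files} files, {len(sample)} bytes of XML")
```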

I’ve tried searching for similar issues and can’t find a solid answer or a fix. The server completes the request almost instantly; from what I can tell it’s a matter of transmitting the response, and not all clients have super fast internet.
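If the bottleneck really is the transfer, compressing the DAV responses might help (a hypothetical snippet, assuming nginx - check against your own config). XML compresses extremely well, so a multi-megabyte listing can shrink to a few hundred KB on the wire:

```nginx
# Hypothetical nginx tuning -- compress WebDAV XML responses.
gzip on;
gzip_types text/xml application/xml;   # multistatus bodies are XML
gzip_min_length 1024;                  # skip tiny responses
```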

(Another side note: this forum is also extremely slow, with up to 10-second response times.)


Does the problem only occur with nginx, or does the slow page load with many files also occur when using apache2?

I have the same problem with nginx - any progress or solutions/fixes?

I have the same problem with apache2, and I’m also searching for a solution.

Same here. Nginx, the newest Nextcloud server version (19.0.3), and my software development folder with 80,000 very small files makes my Nextcloud die.

Any solution welcome

Did anyone find a solution for this? I’m dealing with the same issue on NC 20.

I guess I am having the same problem. I am downloading 8 TB from Google.
While most of my folders don’t have hundreds of items in them, they are nested… and it takes forever to load. Not to mention the I/O is crazy.
I’m using a fast Seagate HDD.
I don’t have any problem browsing the same data in File Explorer on Windows…

I’ve been experiencing the same issue with my local Nextcloud instance. It seems to be related to the number of files/folders in a directory and not to the size of said files. This is not a new problem; it has been persistent for me ever since I installed Nextcloud a few months ago. This thread dates back to Nov 2016, which indicates that it has been with us for some time.

Running ls in the same directory returns instantly, which shows that it is not a problem with the drive or filesystem.

Looking at the Network tab in Firefox’s developer tools does shed some light on the situation. One request took 43 seconds to return: a PROPFIND on the directory. These tools measure the time between the request to the server and the response from the server, so this confirms that it’s a server-side issue.

Now, 43 seconds is a long time to wait for a page to load from a local server, but the additional concern is that even after waiting that long, the web UI only renders the first dozen or two files. This means that to get halfway through the list, one has to press Page Down repeatedly and wait for each batch of entries to load. This pagination problem appears to be a client-side issue, as I do not see any other long web requests. If anyone knows of an option somewhere that would allow me to turn this off, please let me know.

When trying to move a file into the directory which contains a lot of files, it takes the same ~45 seconds to load the contents, but then the entire list is available and one can scroll around quickly to find the folder they want without having to wait for any pagination.

If anyone has any suggestions on how I can debug this further to determine where the problem is in remote.php (the file that is responding to the PROPFIND request), I’d appreciate them. Based on the path, it looks like this could be coming from a problem in a WebDAV library or something. If so, I’d like to help get the problem isolated so we can report it upstream.

There it queries the filecache table. Better caching settings in your database can speed up repeated calls considerably.
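For example (hypothetical my.cnf values - size them to your available RAM): making sure the oc_filecache working set fits in the InnoDB buffer pool avoids re-reading it from disk on every listing.

```ini
[mysqld]
innodb_buffer_pool_size = 1G    # example value; should hold oc_filecache comfortably
tmp_table_size          = 64M   # keep temporary tables for sorts in memory
max_heap_table_size     = 64M
```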

If you want to separate the issues a bit, you can take a native webdav client and check the speed, since the web-interface is probably not suited for a large number of files. That is a different issue to work on.


I see the same issue in Nextcloud.
A folder with more than 15,000 subfolders, each containing 6 files, does not render and spins for more than 30 minutes.

I have to split this into folders with 150 subfolders each; then performance is good. However, this is now a manual activity for users: if the response is slow, create a subfolder and move files around.

Is there a pagination App which can alleviate this problem ?

There is a report from a user with about 10,000 files complaining about a loading time of 6-7 s:

There was the idea of disabling certain apps; for NC 21 this could be interesting, since quite a few apps were added recently (recommendations, …).

I have the very same problem with NC 20 installed. In fact, even the MariaDB server for NC goes way up in load (around 5-10) when a cloud client accesses such a big folder. Since several clients access it simultaneously, the load on the cloud box rises to 25-50…
The NFS server where the data is located sleeps at a load of around 0.10. It seems the DB handling in NC is really not good. I thought it should speed up filesystem access, but looking at the filesystem and the client, it really slows things down significantly.