Bad performance on folder with many files

Hi,

I’m on NextCloud 10.0.1, lighttpd, mysql, debian.
dbfiles, www-home and data dir is mounted on ext4 (external disk)

The overall performance of Nextcloud in the web interface is fine (about 2-3 seconds per page, sometimes 10). The same goes for WebDAV access - it’s a tad slower but still in the seconds.

I have one folder with about 5000 pictures - and only pictures. All around ~5MB. No subfolders.
Opening that folder via WebDAV takes about 20-30 seconds - which is OK and within my expectations.

But opening the same folder via the web-interface takes about 4+ minutes. (https://example.com/index.php/apps/files/?dir=/Photos/Camera roll)

While this request runs, iotop shows the disk idling and the mysql process idling. top shows the lighttpd process running FastCGI at about 30-40% CPU usage for the full 4 minutes.

When doing a SHOW PROCESSLIST in mysql I see nothing but a few short bursts of SELECT statements, usually too fast to catch. The slow-query log is empty.
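One way to confirm where the 4 minutes go: fetch the same URL with a plain HTTP client and compare. If it returns in seconds, the time is spent in client-side rendering, not in PHP. This is a minimal sketch; the URL and session cookie are placeholders you would copy from your own browser session.

```python
# Rough timing of a raw page fetch, bypassing the browser's rendering.
import time
import urllib.request

def timed_get(url, headers=None):
    """Fetch url and return (seconds elapsed, bytes received)."""
    req = urllib.request.Request(url, headers=headers or {})
    start = time.monotonic()
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
    return time.monotonic() - start, len(body)

# Hypothetical usage (cookie value copied from the browser's dev tools):
# elapsed, size = timed_get(
#     "https://example.com/index.php/apps/files/?dir=/Photos/Camera%20roll",
#     headers={"Cookie": "oc_sessionPassphrase=..."},
# )
```

If this finishes quickly while the browser takes minutes, the bottleneck is the JavaScript building the file list, not the server.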

I recently switched from Apache2 to lighttpd because I had the same issues on Apache2 and was under the impression things would get better with lighty. I was wrong…

Does anyone have an idea where I could look further? It seems that the code used by the web interface does some extra work that the WebDAV code doesn’t.

And really, this is the first time in my entire life that WebDAV is faster than a plain HTTP request…

Thanks,
AS

Thanks, but that’s about the Files app (like the link in the post /apps/files/?dir=…)
The Gallery initially takes the same amount of time, but scrolling down is pretty fast for my limited upstream bandwidth.

No one experiences similar issues? It’s absolutely down to that folder…

Same. I think it’s a client-side problem (my guess is JavaScript trying to read the metadata of all the files at once?), as it drives the fans on my laptop wild.

In all honesty it’s becoming a bit of a deal breaker, which is a shame, as I really don’t want to hand my files back to third-party services like Google or Dropbox that scan everything.

Same or similar issue here. We have a folder for publications with currently 41,036 entries. On most clients (Windows, Mac, Linux) none of the entries are shown; the wait cursor spins forever. It simply takes too long to build the list. The search function, however, works surprisingly well. But if I get a small selection, say three items, none of them can be downloaded. So I was looking for a paging plugin, so far to no avail. I know I could remedy this situation by introducing a large number of subfolders and reorganizing the whole library, but I’m a bit reluctant to do so. Can somebody point me to an acceptable solution/plugin? Or is this a work item for the next version of Nextcloud?

I am having the same trouble with my system. I use the iOS app to sync the pictures on my phone to Nextcloud.
Trying to view or list them from the Files section takes forever, but the Gallery is fine: it creates 2x3 icons per picture with the name in them and loads reasonably fast. The problem is the phone app does not let you break photos up into different directories. One option is by month and year, but that creates way too many directories, some with only a few pictures in them.

john

Just wanted to revisit this: I notice similar behavior here. The folder contains about 2000-3000 files (the only large folder).
Browsing other folders is pretty quick. Browsing this folder takes significant time. While this folder is open, searching for filenames also takes significant time (I’m talking 30+ seconds).

Interestingly if I go to another folder I can search for the same filenames in seconds.

I looked at Chrome’s dev tools to see what takes longest to load, and it looks like core.js loads the folder content and then the browser renders it. I guess PC performance is the key factor here.

I think the problem is that it loads the entire folder content and then renders it while you scroll, versus the traditional way, where the page loads fast because the view is broken down into a certain number of items per page.

I bet a lot of people have the same issue, and I’m not sure how it’s being addressed. :confused:

For now I only have ~3k files, but the folder is growing; it takes 2.2 MB to load the content, and then the browser is just rendering it.
Any action like searching or changing the sort order by name/file size makes the browser re-render the content, which takes a long time. So I figured the fastest way is to just search by file name from another folder.

Personally I think this is bad practice and a better solution should be implemented (I don’t mind the view being broken down into pages).
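The paging idea mentioned above can be sketched in a few lines: instead of rendering all ~3k entries at once, split the listing into fixed-size pages and only render the current one. This is purely illustrative; `paginate` is a hypothetical helper, not a Nextcloud API.

```python
# Split a directory listing into fixed-size pages for incremental rendering.
def paginate(entries, page_size=100):
    """Return consecutive pages of at most page_size entries each."""
    return [entries[i:i + page_size] for i in range(0, len(entries), page_size)]

files = [f"IMG_{i:04d}.jpg" for i in range(3000)]
pages = paginate(files)
print(len(pages), len(pages[0]))  # 30 pages of 100 entries each
```

The browser then only has to lay out one page of DOM nodes at a time, which is what keeps "traditional" paged file listings responsive.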

Cheers

I generally avoid very large folders because many file systems become very slow with them. In your case it seems to be the code in the web interface; there is clearly potential for improvement. You should create a report on the bug tracker so the performance gets looked at: https://github.com/nextcloud/server/issues

I run a Nextcloud server with nearly 50 active users who upload many small video files. Currently we have over 130 GB of video files, and browsing the folder containing all of the videos is painfully slow for many of the users. This is likely because the response to /remote.php/dav/files/&lt;username&gt;/&lt;folder&gt; is 7.5 MB, which they have to download each time. It is entirely one large XML response, and likely the cause of freezing on some clients.
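If you control the client making that request, WebDAV itself (RFC 4918) lets a PROPFIND ask for only the properties it needs instead of everything, which can shrink the XML response. This is an untested sketch against a generic WebDAV endpoint; the property selection, user, and password are placeholder assumptions.

```python
# Build a Depth-limited WebDAV PROPFIND that requests only two properties.
import base64
import urllib.request

PROPFIND_BODY = b"""<?xml version="1.0"?>
<d:propfind xmlns:d="DAV:">
  <d:prop><d:getcontentlength/><d:getlastmodified/></d:prop>
</d:propfind>"""

def build_propfind(url, user, password, depth="1"):
    """Return a PROPFIND request with basic auth, ready for urlopen()."""
    req = urllib.request.Request(url, data=PROPFIND_BODY, method="PROPFIND")
    req.add_header("Depth", depth)
    req.add_header("Content-Type", "application/xml")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req
```

`Depth: 1` keeps the listing to the folder’s immediate children; how much the smaller property set actually saves depends on what the server includes per entry.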

I’ve tried searching for similar issues and can’t find a solid answer or solution. From what I can tell, the server completes the request almost instantly; it’s the transfer of the response that takes time, and not all clients have super fast internet.
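Since the bottleneck seems to be transfer size, it may be worth checking that the web server compresses responses from the DAV endpoint: the PROPFIND XML is extremely repetitive and compresses very well. A rough, synthetic illustration (the XML here is fabricated for the demo, not a real Nextcloud response):

```python
# Show how well repetitive multistatus-style XML compresses.
import gzip

entry = b"<d:response><d:href>/remote.php/dav/files/u/f/video.mp4</d:href></d:response>"
xml = b"<d:multistatus>" + entry * 5000 + b"</d:multistatus>"
packed = gzip.compress(xml)
print(len(xml), len(packed))
```

Even a less repetitive real-world listing should shrink substantially, which would cut the 7.5 MB download for clients on slow links.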

( another side note, this forum is also extremely slow, with up to 10 second response time )

Does the problem only occur with nginx, or does the slow page load with many files also occur when using Apache2?

I have the same problem with nginx - any progress or solutions/fixes?