Sync extremely slow with large number of files (on Desktop)

I have a user with 33 GB of data spread across 21,135 directories and 167,198 files. Syncing this on a laptop with the desktop client (3.0.3, macOS) is extremely slow, and the client crashes multiple times without any error. The 'Checking for changes in remote' step takes especially long to finish.

Are there any recommendations on how to tackle this, or explanations for why it is so slow?

From the logs I retrieved with the nextcloud --logdir ... command, it seems like the client is performing a single HTTP request for every directory. Is this true?
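
In case it's useful, this is roughly how I collect the logs (the app-bundle path is from my machine, and --logdebug is an assumption based on the client's --help output; adjust as needed):

# macOS: the client binary lives inside the app bundle
/Applications/Nextcloud.app/Contents/MacOS/Nextcloud --logdir /tmp/nc-logs --logdebug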

Thanks in advance!

So the logs don't capture why it crashes? What is the last thing logged before the crash?

A long time ago, I tested how much I could speed up the upload of a large number of files:

I managed to get up to 1000 files/minute. I'm not sure what has improved since then, or what impact SSDs and faster systems have, but it may be a starting point for you.

Yes, and one for every file as well, and the filecache table is updated each time.

There is also a bug report: https://github.com/nextcloud/desktop/issues/691
(I think there is a section about server configuration that can help improve the situation; the crashing and freezing of the client seem to be a client-side issue, and of course the upload could be more intelligent.)
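
For what it's worth, the server-side part usually boils down to enabling memory caching so filecache lookups and locking hit APCu/Redis instead of the database. A minimal sketch using occ (assuming a stock install in /var/www/nextcloud with Redis on localhost; adjust paths and hosts to your setup):

# run from the Nextcloud directory as the web server user
cd /var/www/nextcloud
sudo -u www-data php occ config:system:set memcache.local --value '\OC\Memcache\APCu'
sudo -u www-data php occ config:system:set memcache.locking --value '\OC\Memcache\Redis'
sudo -u www-data php occ config:system:set redis host --value localhost
sudo -u www-data php occ config:system:set redis port --value 6379 --type integer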

I see. I ended up using rsync and then scanning the files, which was way faster.
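
In case it helps anyone, the workaround looked roughly like this (the paths and the user name alice are placeholders for my setup; occ files:scan is the standard way to make Nextcloud register files added outside the client):

# copy the data straight into the user's files directory on the server
rsync -a /local/data/ server:/var/www/nextcloud/data/alice/files/

# then, on the server, register the new files in the filecache
sudo -u www-data php /var/www/nextcloud/occ files:scan alice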

There is an issue discussing the sync speed problem here: https://github.com/nextcloud/desktop/issues/691


@jvhoffbauer's solution is the correct workaround.

The proper fix is to add the following at the bottom of php.ini for FPM, CLI, and CGI:

memory_limit = 10G
post_max_size = 25G
upload_max_filesize = 25G
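
You can verify that the new limits were picked up for the CLI with something like:

php -i | grep -E 'memory_limit|post_max_size|upload_max_filesize'

(FPM reads its own php.ini, so check a phpinfo() page for the web-facing values.)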

Then increase the chunk size in /var/www/nextcloud/config/config.php:

'chunkSize' => '5120MB',
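
If you prefer not to edit config.php by hand, the same value can be set with occ (assuming the usual install path and web server user):

sudo -u www-data php /var/www/nextcloud/occ config:system:set chunkSize --value '5120MB'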

Then you need to restart php-fpm:

systemctl restart php7.4-fpm

Simply replace 7.4 with your version number, which you can find with php -v or php -i.

So far, desktop sync has sped right up. Still testing to see if the Android app can finally finish its uploads.

I also added the preview image cacher app, ran the initial generation, and added the cron job. Next I will offload my MySQL, Redis, and ClamAV scanning to a second Raspberry Pi. That should take a lot of work off the Pi running Nextcloud.
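
For reference, the app I mean is (I believe) Preview Generator; the initial run and the cron job looked roughly like this on my setup (install path and schedule are mine, adjust to yours):

# one-time initial run (can take a long time on a large library)
sudo -u www-data php /var/www/nextcloud/occ preview:generate-all

# crontab entry for the web server user: generate previews for new files every 10 minutes
*/10 * * * * php -f /var/www/nextcloud/occ preview:pre-generate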

I wish there were a plugin or some way to add a secondary Pi to offload the primary Pi's resources for Nextcloud, so you could add processing power from another Pi. It would be cool to have a way to set up multiple Nextcloud servers in a primary/replica arrangement, so several servers could serve Nextcloud together.


Thanks for the excellent answer - it worked perfectly, though a 10G memory limit is rather heavy (if you only have 8 GB of memory).
Setting memory_limit = 6G made everything work smoothly. Thanks!