Windows Desktop Client: Uploads 100 files at a time and freezes

Nextcloud version: 25.0.10
Operating system and version: Gentoo
Apache version: Apache/2.4.54
PHP version: 8.1

The issue you are facing:

So we have a fleet of old Nextcloud clients and one computer with the latest version (3.12.0). The reason we’ve held the rest of our computers back is the nasty bug a few years ago that removed timestamps from files, so we’ve been cautious about upgrading…

The new client is way faster at uploading. Using 3 MB files we hit 135 Mbps upload, which goes through a low-end MikroTik router doing a lot of packet mangling, so we suspect that headline speed actually reflects the router’s CPU hitting 100%, not a limit of the servers, the client, or Nextcloud.

However, what we are seeing with this latest version of the client is that it uploads 100 files in a few seconds, then hangs, uploading nothing, for about a minute, then quickly uploads another 100 files, hangs again, and continues looping in this fashion. During the client’s hangs there are no spikes on the servers (MySQL, application servers, or storage servers).

What makes this stranger is that the older clients don’t do this: they upload consistently with no breaks, but at a lower speed (about 35-40 Mbps). Because of the huge gaps on the new client, despite its substantially faster headline speed, it is actually much slower overall.

We have Redis configured as the file-locking memcache.
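
For reference, the locking entries in config.php look roughly like this (host and port here are placeholders for our setup):

'memcache.locking' => '\OC\Memcache\Redis',
'redis' => [
    'host' => 'localhost',
    'port' => 6379,
],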

If we set both the new client and the old clients uploading at the same time, when the new client freezes, the old clients continue to upload with no drop in speed.

Apache is configured with mod_php. Max upload/POST size is 2 GB, and I set max_file_uploads to 1000 (it was 100, which I thought might have been the reason it hung every 100 files, but that change and an Apache restart didn’t fix it).
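
The relevant php.ini lines for that (values as described above; the exact file location depends on the mod_php setup) are roughly:

upload_max_filesize = 2G
post_max_size = 2G
max_file_uploads = 1000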

Is anyone else having this issue? I feel like it’s a client issue, considering the older client doesn’t seem to have the same problem… but I’m also wondering if something is misconfigured on the server side that the new client doesn’t like.

Haven’t pasted any log files as no errors happen during the freezing.


Some further info:

This is using “virtual files”. I haven’t tested it with normal file sync yet, but I will when I get a chance.

Also, I’ve just clicked “Make files available locally” on a large folder to test downloading, as opposed to uploading, and the issue isn’t there on downloads. It consistently downloads without any pausing, so the issue only affects uploads.

Any limit set under the Download Bandwidth setting in the impacted client installations?

“Haven’t pasted any log files as no errors happen during the freezing.”

What are the last couple lines in the client debug log before each freeze period?

I believe the bulk uploader - which optimizes the transfer of small files - has a batch size of 100. Is this happening solely with groups of small files, by chance?

It appears to be a compatibility issue with the latest Nextcloud client. To troubleshoot:

  1. Check client logs for error messages.
  2. Ensure the server runs the latest compatible version.
  3. Explore forums for similar issues and solutions.
  4. Reach out to Nextcloud support or community for assistance, specifying your client and server versions.

Right, when I originally uploaded files and this was happening, they were all around 2.5-3 MB. I’m not sure what the client counts as “small files” that need batching, but from what you’ve said, and the speed the client hits when uploading, I’m going to assume it was batching them together.

I just tried the same thing with 30 MB files and there is no freezing/pausing at all.

I will upload the 3mb files again shortly and see if anything is logged to the server/client log files during the upload and get back.

And no, there are no bandwidth limits set in the client.

Quite an old post, but I’m facing exactly the same trouble after adding a local folder for sync containing tens of thousands of photos.

Did you get around the problem by downgrading the NC desktop client?

How could it be a compatibility issue when my server is up to date (30.0.2) and so is my client (3.14.3)?

The issue likely lies with the new Nextcloud client (3.12.0) and how it handles uploads in bursts. Here’s how to troubleshoot:

  1. Check Client Logs for errors (%AppData%\Nextcloud\logs or ~/.config/Nextcloud/nextcloud.log).
  2. Review PHP Settings (max_input_time, memory_limit) to ensure they support large uploads.
  3. Check Redis Load for any delays in file-locking.
  4. Test Without Router to rule out network issues.
  5. Verify Nextcloud Settings like max_chunk_size and background jobs (see the occ example below).
  6. Test Single Large File Upload to isolate the issue.

These steps will help pinpoint if it’s a client or server issue.
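
For step 5, a hedged example of adjusting the server-side chunk size with occ (assuming a standard occ installation; run it as the web-server user):

occ config:app:set files max_chunk_size --value 0    # 0 disables chunking
occ config:app:delete files max_chunk_size           # revert to the default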

Thank you very much for your help.

FYI, I created a bug report here.

  1. Client log seems to blame the server?

First errors (if I read correctly) are below; the quoted French means “The server replied ‘504 Gateway Time-out’ to ‘POST https://cloud.huynen.fr/remote.php/dav/bulk’”:

2024-11-23 10:18:46:711 [ warning nextcloud.sync.networkjob C:\Users\User\AppData\Local\Temp\windows-26654\client-building\desktop\src\libsync\abstractnetworkjob.cpp:223 ]:	QNetworkReply::UnknownServerError "Le serveur a répondu \"504 Gateway Time-out\"  à \"POST https://cloud.huynen.fr/remote.php/dav/bulk\"" QVariant(int, 504)
2024-11-23 10:18:46:711 [ warning nextcloud.sync.credentials.webflow C:\Users\User\AppData\Local\Temp\windows-26654\client-building\desktop\src\gui\creds\webflowcredentials.cpp:207 ]:	QNetworkReply::UnknownServerError
2024-11-23 10:18:46:711 [ warning nextcloud.sync.credentials.webflow C:\Users\User\AppData\Local\Temp\windows-26654\client-building\desktop\src\gui\creds\webflowcredentials.cpp:208 ]:	"Error transferring https://cloud.huynen.fr/remote.php/dav/bulk - server replied: Gateway Time-out"
2024-11-23 10:18:46:711 [ info nextcloud.sync.networkjob.put.multi C:\Users\User\AppData\Local\Temp\windows-26654\client-building\desktop\src\libsync\putmultifilejob.cpp:85 ]:	POST of "https://cloud.huynen.fr/remote.php/dav/bulk" "" FINISHED WITH STATUS "UnknownServerError Le serveur a répondu \"504 Gateway Time-out\"  à \"POST https://cloud.huynen.fr/remote.php/dav/bulk\"" QVariant(int, 504) QVariant(QString, "Gateway Time-out")

Then exactly 100 lines like this (“Erreur inconnue” = “Unknown error”):

2024-11-23 10:18:46:712 [ warning nextcloud.sync.networkjob.put.multi C:\Users\User\AppData\Local\Temp\windows-26654\client-building\desktop\src\libsync\putmultifilejob.cpp:94 ]:	oneDevice has error: "Erreur inconnue"

And 400 lines (4 per file) like this (“Erreur réseau : 499” = “Network error: 499”):

2024-11-23 10:18:46:712 [ info nextcloud.sync.propagator.bulkupload C:\Users\User\AppData\Local\Temp\windows-26654\client-building\desktop\src\libsync\bulkpropagatorjob.cpp:541 ]:	Item completed "photos du NAS/IMG_20170904_091921.JPG" OCC::SyncFileItem::NormalError CSyncEnums::CSYNC_INSTRUCTION_SYNC "Erreur réseau : 499"
2024-11-23 10:18:46:712 [ warning nextcloud.sync.propagator C:\Users\User\AppData\Local\Temp\windows-26654\client-building\desktop\src\libsync\bulkpropagatorjob.cpp:723 ]:	Could not complete propagation of "photos du NAS/IMG_20170904_091921.JPG" by OCC::BulkPropagatorJob(0x21a21ea60c0) with status OCC::SyncFileItem::NormalError and error: "Erreur réseau : 499"
2024-11-23 10:18:46:712 [ info nextcloud.sync.propagator.bulkupload C:\Users\User\AppData\Local\Temp\windows-26654\client-building\desktop\src\libsync\bulkpropagatorjob.cpp:753 ]:	modify final status NormalError OCC::SyncFileItem::NormalError OCC::SyncFileItem::NormalError
2024-11-23 10:18:46:714 [ info nextcloud.gui.activity C:\Users\User\AppData\Local\Temp\windows-26654\client-building\desktop\src\gui\tray\usermodel.cpp:849 ]:	Item  "photos du NAS/IMG_20170904_091921.JPG"  retrieved resulted in error  "Erreur réseau : 499"
...
  2. PHP settings are:
upload_max_filesize = 16G
post_max_size = 16G
memory_limit = 16G
max_file_uploads = 100       ; changed to 1000 with no effect on the bug
output_buffering = Off
max_input_time = 3600
max_execution_time = 3600
session.gc_maxlifetime = 3600
default_socket_timeout = 600 ; changed from 60 with no effect on the bug
  3. I have not installed Redis; file locking is managed by the database.

I disabled file locking in config.php, without any change.
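
The switch in question is, I believe, the standard one from the admin docs:

'filelocking.enabled' => false,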

  4. Router: same behaviour at work and at home… where I never had this bug.
    Edit: I’m not sure I’ve ever tried to upload this many files from the desktop client before…
  5. Nextcloud settings: I tried disabling chunking with no effect (occ config:app:set files max_chunk_size --value 0). I don’t know what other setting might help…
  6. No issue with large files.

Finally, I guess the gateway timeout is something to look into?


I think I found a solution:

I tried several PHP and nginx settings; it appears to work after adding the following to my /etc/nginx/sites-available/cloud config file in the container, in the location ~ \.php(?:$|/) section:

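# raise timeouts (in seconds) so slow POSTs to /remote.php/dav/bulk can finish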
fastcgi_send_timeout 600;
fastcgi_connect_timeout 600;
proxy_connect_timeout 600;
proxy_send_timeout 600;
proxy_read_timeout 600;
send_timeout 600;

Note that my NC instance runs in an Incus container behind an nginx reverse proxy. I also added some of those parameters on the host side.
After (tens of) hours of testing, I must admit that I haven’t kept track of all of my config-file changes.

Correct me if I’m wrong: I naively understand that the NC desktop client uploads the first 100 files one by one, then in batches of 100.
The timeout occurs if the sync of a 100-file batch takes too long.

I’ve been having the same problem and I found a hackaround by reading the source code.

There’s a check for whether a file should be batch-uploaded, and one of the conditions in the OwncloudPropagator::isDelayedUploadItem() function is that _syncOptions.minChunkSize() > item->_size, i.e. a file won’t be batch-uploaded unless it’s smaller than the minimum chunk size.

I tried setting minChunkSize=1 in the client config and now all files are uploaded without batching. It’s super slow, but at least it works. Note that a minimum chunk size of 0 does not work.
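
In case it helps anyone: as far as I can tell, the setting goes into the desktop client’s nextcloud.cfg (under [General]; on Windows the file lives in %APPDATA%\Nextcloud):

[General]
minChunkSize=1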

Same problem here, with Nextcloud Desktop 3.15.0 and the Nextcloud container docker.io/library/nextcloud:30.0.2.
Syncing stops after 100 files. After about an hour, another 100 files are processed. If I pause and resume syncing during the “wait” period, syncing starts again and the number of files waiting to sync drops a bit (by 3 or 4).

Where do you get those error messages from? The desktop client only shows “you added file …” and there are no errors in the server log.

So I put this original post up ages ago.

The client is still buggy (it has had this bug, along with numerous others, for ages now), and these issues are just ignored while the focus stays on constantly adding new functionality. I understand they need to focus on that, but it’s a shame as well, because core things that need fixing just aren’t implemented well.

For anyone frustrated by this bug: you can disable bulk uploads (packing 100 files at a time) by adding this to your server’s config.php:

'bulkupload.enabled' => false,

By adding that line, you disable packing 100 files into a single upload… Obviously, packing small files into a single upload SHOULD make it faster, as it eliminates the per-file latency, but something is broken somewhere, and for me a transfer that would have taken about 5 days came down to a few hours (over 100,000 small files).
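
If you manage the server with occ, the same flag can presumably be set with:

occ config:system:set bulkupload.enabled --type=boolean --value=false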

You may have to restart the client afterwards… I restarted mine to make sure it picked up the change, but it may just detect it on the next sync without a restart; I’m not sure.


Hi @mattewan,

Did you consider my post above?

It turns out that this bug is not the fault of the desktop client itself; rather, the bulk upload functionality makes it appear when the server is not configured appropriately.

I’m sure you will get rid of it if you edit the appropriate Apache or nginx config file.

That said, it would have been a good idea for the developers to publish the recommended parameters when the feature first appeared…

Sorry for my English, but I created an account just to thank you; your solution helped me a lot. Thank you very much.

Your solution works! After hours of scouring the internet… I finally have a smooth sync of more than 100 files at a time.

You are a great man. It works. The problem is gone; I had suffered with it for six months.

Thank you!

I am always amazed by the level of “competence” of the NC developers and the glorious user interface that reminds me of the beginning of the Internet.

Hello,

Do you know the configuration for Apache?

Thanks

Hello,

No, sorry, I manage my own server and I’ve never used Apache…
You can probably find the equivalent syntax in the Apache documentation.
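
Probably something along these lines (untested; Timeout is an Apache core directive, and ProxyTimeout applies if PHP runs behind mod_proxy/mod_proxy_fcgi):

# allow slow bulk-upload requests to complete (seconds)
Timeout 600
ProxyTimeout 600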

I marked disabling the bulk upload as the solution. This is only a temporary workaround. There is an overview post on GitHub if you want to follow up with the developers:

If you suspect request-size or timeout limits on your web server, the documentation mentions a few of these parameters. If it turns out to be one of these, the documentation will certainly be updated. If your setup is anything out of the ordinary, check the corresponding issues on the bug tracker and give feedback on what did or did not work.