Docker Download Issues on Desktop Client (10 GB folders or larger)

So I have a Nextcloud instance running in a Docker container on a Synology NAS.

The issue: any file or folder larger than 10 GB will NOT download via my desktop client (running on Windows 10). I can, however, download files/folders of any size via the web client, whether or not I am on my local network at home where the Nextcloud instance is hosted.

It’s currently set up behind an nginx reverse proxy.

The issue is not nginx. I know because I recently changed the nginx proxy config file and tested it, and it’s fine. The issue is as stated above.

Because this isn’t a “traditional” nextcloud installation I’m not really sure what to do about this. The config files mentioned in a lot of help pages don’t exist.

I have tried starting the image with a PHP config file that looks like this, but it doesn’t seem to be helping:

```ini
max_execution_time = 300
max_input_time = 300
memory_limit = 512M
post_max_size = 20G
upload_max_filesize = 20G
```
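For what it’s worth, with the official `nextcloud` Docker image (which is PHP-based), extra ini files are picked up from `/usr/local/etc/php/conf.d/`, so one way to apply overrides like these is to mount the file there. This is only a sketch, assuming the official image and an overrides file named `uploads.ini`:

```shell
# Assumes the php.ini overrides above are saved as ./uploads.ini on the host.
# The official nextcloud image loads any *.ini file found in
# /usr/local/etc/php/conf.d/ when PHP starts.
docker run -d \
  -p 8080:80 \
  -v ./uploads.ini:/usr/local/etc/php/conf.d/uploads.ini:ro \
  nextcloud

# Verify the values actually took effect inside the container
# ("<container-name>" is a placeholder):
docker exec <container-name> php -i | grep -E 'upload_max_filesize|post_max_size'
```

Third-party images (e.g. ones packaged for Synology) may use a different PHP ini directory, so the mount path would need adjusting there.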

Any ideas would be appreciated :slight_smile:

Hi bsquared :slight_smile:

What do you mean by “will not download”?
Does it initiate, but stops mid-download - or
Doesn’t start at all?

If it stops mid-download, does it stop after x amount of seconds or after a certain amount of data transferred?

Given that you can download files via the web, it doesn’t sound like PHP is the problem, but who knows :slight_smile: You can also try temporarily setting max_execution_time higher, or set it to 0, to take that variable out of the equation.


> What do you mean by “will not download”?
> Does it initiate, but stops mid-download - or
> Doesn’t start at all?

It doesn’t start at all. It sees that the object/folder is larger than 10 GB and gives me a message that it will not download the folder. Unfortunately the message is cut off in Windows and I don’t know where to find a history of notifications/messages to see the full message.

I’ve uploaded large files and folders with sizes exceeding 5 GB with no issue.

I could try the max_execution_time fix you are suggesting, but as you said this doesn’t seem to be it.

What is puzzling is that this isn’t a size people seem to run into issues with. Normally it’s 1 GB (which was an issue for me with nginx, which I fixed) or 2 GB (file-system limitations). But 10 GB seems unrelated to either of these.

I wonder if it is actually a limitation with WebDAV? I am not sure if the client uses WebDAV (correct me if that’s not the case!), but I know that you can increase this limit by following these steps (PSST: External link)
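For reference, on Windows that limit lives in the registry under the WebClient service, which backs Windows’ built-in WebDAV redirector (whether the Nextcloud desktop client goes through it is an assumption worth checking). A sketch of how to inspect and raise it, from an elevated Command Prompt:

```shell
:: Query the WebDAV redirector's file size cap (in bytes);
:: the default on many Windows installs is 50000000 (~47 MB).
reg query "HKLM\SYSTEM\CurrentControlSet\Services\WebClient\Parameters" /v FileSizeLimitInBytes

:: To raise it to the maximum (4294967295 = 0xFFFFFFFF, ~4 GB),
:: then restart the WebClient service so the change takes effect:
reg add "HKLM\SYSTEM\CurrentControlSet\Services\WebClient\Parameters" /v FileSizeLimitInBytes /t REG_DWORD /d 4294967295 /f
net stop WebClient && net start WebClient
```

Note the hard ceiling here is about 4 GB, so even the maximum would not explain a clean 10 GB threshold.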

Could you at least check what the value is set to now? That seems like a plausible theory :slight_smile:

@denNorske

First off thank you for your continued help :slight_smile:

So I checked the value of the FileSizeLimitInBytes value in the registry editor.

The interesting thing is that the value is 50000000 bytes. Which is weird, because I have used both OneDrive and Dropbox on my PC with no issues. (I have stored files and folders well over 10 GB on Dropbox when I had a lot of free space there…long story :wink: )

I still suspect this to be a config issue with Nextcloud, but with the Docker instance I just don’t know which config file/variable needs changing :confused:

(Still googling on the side btw)

Hi again!

So I just checked with Nextcloud 21.0.2 freshly installed in Docker, using docker-compose:

```yaml
version: '2'

services:
  db:
    image: mariadb
    restart: always
    command: --transaction-isolation=READ-COMMITTED --binlog-format=ROW
    volumes:
      - ./data/var/lib/mysql:/var/lib/mysql
    environment:
      - MYSQL_ROOT_PASSWORD=password
      - MYSQL_PASSWORD=password
      - MYSQL_DATABASE=nextcloud
      - MYSQL_USER=nextcloud

  app:
    image: nextcloud
    restart: always
    ports:
      - 8080:80
    links:
      - db
    volumes:
      - ./data/var/www/html:/var/www/html
    environment:
      - MYSQL_PASSWORD=password
      - MYSQL_DATABASE=nextcloud
      - MYSQL_USER=nextcloud
      - MYSQL_HOST=db
```
I’ve tried installing the clients on Linux and Windows, and then tried to sync the folders. Folder syncing works as expected; it downloads my 30 GB file without problems. I used version 3.2.3 on both Ubuntu and Windows.

Is it the folder syncing that fails, or are you doing something else? Any logs you can share from the client/server that would be useful? :slight_smile:

Hey again,

> Any logs you can share from the client/server that would be useful?

I wish I knew where the logs are. I am not sure whether Nextcloud logs to the general Docker logs or not.
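On the client side, the Nextcloud desktop app can also produce its own log via command-line flags. A sketch for Windows; the install path is an example and may differ:

```shell
:: Launch the desktop client with verbose logging written to a file.
"C:\Program Files\Nextcloud\nextcloud.exe" --logdebug --logfile nextcloud-client.log

:: Or open the live log window instead:
"C:\Program Files\Nextcloud\nextcloud.exe" --logwindow
```

That might capture the full text of the notification that gets cut off in the Windows UI.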

> So I just checked with nextcloud 21.0.2 freshly installed in docker, using docker-compose;

So I installed nextcloud using Synology’s Docker interface. I’m not sure if this is why things are going awry. Here are some screenshots of how I set it up:
(update: this site wasn’t allowing screenshots for some reason, so I entered the environment variables and things in a pseudo-docker-compose style)

Maria DB setup:

```
LANGUAGE en_US.UTF-8
MYSQL_DIR /config
DATADIR /data
TZ America/New_York
```

(hidden: PGID, PUID, ports, passwords, etc.)

Nextcloud setup:

```
environment:
  - PATH: /usr/local/sbin:/usr/local/bin
  - NEXTCLOUD_PATH: /config/www/nextcloud

volume:
  - nextcloud/nextcloud-web/config.php: /config/php_mine.ini
  - nextcloud/nextcloud-web/data: /data
  - nextcloud/nextcloud-web/config: /config
```

(hidden: PGID, PUID, ports, etc.)


I’m wondering if Synology’s Docker interface is interfering with this somehow?

Thanks for the continued help btw :slight_smile:

Ah hmm, I am hitting a wall for what you can try next… So I have one last thing I can think of. Docker exposes everything from stderr and stdout via “docker logs”.

Can you try follow mode:

```
docker logs -f [container name]
```

And let the command run while you try to download a massive file via your client?

If you don’t know the container name, it should be listed under

```
docker ps
```

or

```
docker ps | grep next
```

assuming it has “next” in it.
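To cut down the noise while the transfer runs, the follow output can be piped through grep. A sketch; “nextcloud” here is a placeholder for the actual container name:

```shell
# Follow the container's log (last 50 lines first), merging stderr into
# stdout, and keep only lines mentioning errors or DAV requests.
docker logs -f --tail 50 nextcloud 2>&1 | grep -iE "error|dav"
```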

And maaaybe we can see something helpful from that?

And no problem btw, I am happy to help. Could learn something from this.

Hi again,

Holidays just finished and I have to look busy at work again :slight_smile:

So I decided to set “loglevel” to 1 in my config.php file. (This variable was previously not set, and I believe the default level is just “warn”, as per the documentation:
https://docs.nextcloud.com/server/latest/admin_manual/configuration_server/logging_configuration.html )
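For reference, the relevant config.php entries might look like this; the logfile path is an example and may sit elsewhere in a Docker setup:

```php
<?php
// Fragment of config.php - merge into the existing $CONFIG array.
$CONFIG = array (
  'loglevel' => 1,  // 0=debug, 1=info, 2=warning (default), 3=error, 4=fatal
  'logfile' => '/var/www/html/data/nextcloud.log',  // example path, assumption
);
```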

I assume this will make nextcloud.log actually log something, because as of yesterday nothing had been recorded to it for over a week.

The Docker logs don’t seem to show the information we are looking for, at least at the glance I took at them. And they update too often.

Is there not a place in the web version of Nextcloud that contains logs? I keep seeing this mentioned online but haven’t found it myself yet.

I have uploaded a 30 GB file to my Nextcloud. Will attempt to download it tonight or tomorrow and see if the log updates.


@denNorske

Okay so:

Just did a test with a 30 GB .zip file via the windows desktop client.

It downloaded just fine (it was actually a zip of a folder that is over 10 GB).

Am going to test a 12-15 GB folder to see if it works.

One other thing: this download was over my local network; however, even downloads of 10 GB+ over my local network caused issues previously, so this is a win in that case too.

Gonna test with a folder of items instead of one large item, to see if that works…


@denNorske

Sorry for the blackout. I did some investigations and tests on my side.

It turns out that there is a setting in the desktop app, enabled by default, that stops folders of 10 GB or larger from syncing without confirmation :man_facepalming:

I don’t know why I didn’t see this sooner, but at least through all this I figured out a bunch of stuff about my setup and fixed it / made it more efficient.

I’m sorry if I wasted your time. Thank you so much for helping with this, and for your perseverance alongside mine. I’m going to mark this topic as solved now.