Cannot download some apps because of curl timeout

Nextcloud version (eg, 29.0.5): 29.0.3
Operating system and version (eg, Ubuntu 29.04): Fedora Server 40
Apache or nginx version (eg, Apache 2.4.25): Apache 2.4.59
PHP version (eg, 8.3): 8.2.20

The issue you are facing: I cannot download some large apps (40+ MB) because curl appears to time out before the download completes.

Is this the first time you’ve seen this error? (Y/N): Y

Steps to replicate it:

  1. Attempt to download a large app (e.g. Talk, Recognize); a CLI reproduction is sketched after this list
  2. Download stops after some time
  3. New error appears in admin “Logging” panel
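
For reference, the same download can also be triggered from the command line with occ, which makes the failure easier to observe. A minimal sketch, assuming the Compose service is named app (adjust to your setup):

    # occ app:install performs the same App Store download as the web UI
    docker compose exec -u www-data app php occ app:install recognize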

The output of your Nextcloud log in Admin > Logging:

[settings] Error: could not enable apps	POST /settings/apps/enable	from 172.16.1.101 by admin at Jul 4, 2024, 8:19:28 PM

The output of your config.php file in /path/to/nextcloud (make sure you remove any identifiable information!):

{
    "htaccess.RewriteBase": "\/",
    "memcache.local": "\\OC\\Memcache\\APCu",
    "apps_paths": [
        {
            "path": "\/var\/www\/html\/apps",
            "url": "\/apps",
            "writable": false
        },
        {
            "path": "\/var\/www\/html\/custom_apps",
            "url": "\/custom_apps",
            "writable": true
        }
    ],
    "upgrade.disable-web": true,
    "instanceid": "***REMOVED SENSITIVE VALUE***",
    "passwordsalt": "***REMOVED SENSITIVE VALUE***",
    "secret": "***REMOVED SENSITIVE VALUE***",
    "trusted_domains": [
        "172.16.1.95:8080"
    ],
    "datadirectory": "***REMOVED SENSITIVE VALUE***",
    "dbtype": "mysql",
    "version": "29.0.3.4",
    "overwrite.cli.url": "http:\/\/172.16.1.95:8080",
    "dbname": "***REMOVED SENSITIVE VALUE***",
    "dbhost": "***REMOVED SENSITIVE VALUE***",
    "dbport": "",
    "dbtableprefix": "oc_",
    "mysql.utf8mb4": true,
    "dbuser": "***REMOVED SENSITIVE VALUE***",
    "dbpassword": "***REMOVED SENSITIVE VALUE***",
    "installed": true
}

The output of your Apache/nginx/system log in /var/log/____:


Output errors in nextcloud.log in /var/www/ or as admin user in top right menu, filtering for errors. Use a pastebin service if necessary.

I am running Nextcloud from Nextcloud’s official Docker image (not AIO) through Docker Compose, with a MariaDB container defined in the same Compose file.

27 MB in 2 minutes: is that normal for your internet connection? You could try to adjust the timeout limits or download the apps manually, but for a file-sharing service, a connection that slow would be the first issue to address.
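
If you want to try raising the limits, here is a minimal sketch for the official image, assuming the stock PHP Docker layout (note that the App Store client may enforce its own HTTP timeout, so this alone may not be enough):

    # hypothetical override dropped into the image's PHP ini directory
    docker compose exec app bash -c 'printf "max_execution_time = 3600\ndefault_socket_timeout = 600\n" > /usr/local/etc/php/conf.d/zz-timeouts.ini'
    docker compose restart app

Keep in mind this lives inside the container, so it is lost when the container is recreated unless you mount it from the host.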

If you run these downloads directly with curl/wget from the command line, are they just as slow?
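
For example, something like this gives a rough number (the URL is only a placeholder for whichever app tarball you are testing):

    # measure raw download speed, from inside or outside the container
    curl -L -o /dev/null -w 'size: %{size_download} bytes, time: %{time_total}s, speed: %{speed_download} B/s\n' \
      https://example.com/path/to/app.tar.gz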


This is not a good sign. Are you sure you’re not having other problems within your environment? Where is your Docker mount for /var/www/html/config located? Any chance it’s on a NAS or something?

{"reqId":"Sj5SCpUBeaNI3DkkoA9j","level":3,"time":"2024-07-05T01:39:31+00:00","remoteAddr":"172.16.1.101","user":"--","app":"PHP","method":"GET","url":"/","message":"fopen(/var/www/html/config/config.php): Failed to open stream: No such file or directory at /var/www/html/lib/private/Config.php#221","userAgent":"Mozilla/5.0 (X11; Linux x86_64; rv:127.0) Gecko/20100101 Firefox/127.0","version":"","data":{"app":"PHP"}}

I have very slow DSL where I live, so this is normal for me.

So far I have no issues. I currently have /var/www/html mounted on my computer’s SSD (no network drives yet).

:astonished:
In this case, I’d download the apps manually from apps.nextcloud.com and then extract them into the apps folder. The one download in the logs was about 270 MB; at your speed that would take 20 minutes or so, and then you run into script time limits, …
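
A rough sketch of that manual route, assuming the Compose service is named app and the writable custom_apps path from the config above; the tarball URL is a placeholder for whatever apps.nextcloud.com links you to:

    # fetch the release tarball on the host (or any machine with a faster connection)
    curl -L -o recognize.tar.gz https://example.com/recognize-vX.Y.Z.tar.gz
    # copy it into the container and unpack it into the writable apps directory
    docker compose cp recognize.tar.gz app:/var/www/html/custom_apps/
    docker compose exec app bash -c 'cd /var/www/html/custom_apps && tar xzf recognize.tar.gz && rm recognize.tar.gz && chown -R www-data:www-data recognize'
    # enable the app without going through the App Store download
    docker compose exec -u www-data app php occ app:enable recognize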