Download takes a long time to start, but download speed is then very fast

Check your preview directory size and be happy :wink:
I wrote a bit about previews here: Previews / data size. There is also linked information on how to decrease the size of the preview folder (and maybe the DB).

Did you move your data directory at any point after installation? I had this issue when the DB was twice as big as it should have been: Files amount after moving of data directory is wrong (much bigger)

I adapted my config.php to yours but it didn’t help.

No, I set the data directory to /var/nc_data during the installation. At that time this problem didn’t exist either.


Changing the config alone will not help you. You also have to “reset” your preview cache, as described here
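For reference, the usual “reset” boils down to deleting the generated previews from the appdata folder and letting Nextcloud rescan. A sketch, assuming the data directory is /var/nc_data, the install lives in /var/www/html/nextcloud, and abc123 is a hypothetical instance id (look up `instanceid` in config/config.php):

```shell
# Optional but safer: stop serving requests while previews are wiped
sudo -u www-data php /var/www/html/nextcloud/occ maintenance:mode --on

# Delete all generated previews (abc123 is a placeholder instance id)
sudo rm -rf /var/nc_data/appdata_abc123/preview

# Make Nextcloud's file cache aware of the deletion
sudo -u www-data php /var/www/html/nextcloud/occ files:scan-app-data

sudo -u www-data php /var/www/html/nextcloud/occ maintenance:mode --off
```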

Now I’ve done it as described.
But unfortunately still no improvement.
I’m really helpless now and I’m running out of ideas.
After all, the download works, but the long loading time before it starts is really grating.

Having to do a new installation would be really painful.

<IfModule mod_headers.c>
  <IfModule mod_setenvif.c>
    <IfModule mod_fcgid.c>
       SetEnvIfNoCase ^Authorization$ "(.+)" XAUTHORIZATION=$1
       RequestHeader set XAuthorization %{XAUTHORIZATION}e env=XAUTHORIZATION
    </IfModule>
    <IfModule mod_proxy_fcgi.c>
       SetEnvIfNoCase Authorization "(.+)" HTTP_AUTHORIZATION=$1
    </IfModule>
  </IfModule>

  <IfModule mod_env.c>
    # Add security and privacy related headers
    Header always set Referrer-Policy "no-referrer"
    Header always set X-Content-Type-Options "nosniff"
    Header always set X-Download-Options "noopen"
    Header always set X-Frame-Options "SAMEORIGIN"
    Header always set X-Permitted-Cross-Domain-Policies "none"
    Header always set X-Robots-Tag "none"
    Header always set X-XSS-Protection "1; mode=block"
    SetEnv modHeadersAvailable true
  </IfModule>

  # Add cache control for static resources
  <FilesMatch "\.(css|js|svg|gif)$">
    Header set Cache-Control "max-age=15778463"
  </FilesMatch>

  # Let browsers cache WOFF files for a week
  <FilesMatch "\.woff2?$">
    Header set Cache-Control "max-age=604800"
  </FilesMatch>
</IfModule>
<IfModule mod_php7.c>
  php_value mbstring.func_overload 0
  php_value default_charset 'UTF-8'
  php_value output_buffering 0
  <IfModule mod_env.c>
    SetEnv htaccessWorking true
  </IfModule>
</IfModule>
<IfModule mod_rewrite.c>
  RewriteEngine on
  RewriteCond %{HTTP_USER_AGENT} DavClnt
  RewriteRule ^$ /remote.php/webdav/ [L,R=302]
  RewriteRule .* - [env=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
  RewriteRule ^\.well-known/host-meta /public.php?service=host-meta [QSA,L]
  RewriteRule ^\.well-known/host-meta\.json /public.php?service=host-meta-json [QSA,L]
  RewriteRule ^\.well-known/webfinger /public.php?service=webfinger [QSA,L]
  RewriteRule ^\.well-known/nodeinfo /public.php?service=nodeinfo [QSA,L]
  RewriteRule ^\.well-known/carddav /remote.php/dav/ [R=301,L]
  RewriteRule ^\.well-known/caldav /remote.php/dav/ [R=301,L]
  RewriteRule ^remote/(.*) remote.php [QSA,L]
  RewriteRule ^(?:build|tests|config|lib|3rdparty|templates)/.* - [R=404,L]
  RewriteCond %{REQUEST_URI} !^/\.well-known/(acme-challenge|pki-validation)/.*
  RewriteRule ^(?:\.|autotest|occ|issue|indie|db_|console).* - [R=404,L]
</IfModule>
<IfModule mod_mime.c>
  AddType image/svg+xml svg svgz
  AddEncoding gzip svgz
</IfModule>
<IfModule mod_dir.c>
  DirectoryIndex index.php index.html
</IfModule>
AddDefaultCharset utf-8
Options -Indexes
<IfModule pagespeed_module>
  ModPagespeed Off
</IfModule>
#### DO NOT CHANGE ANYTHING ABOVE THIS LINE ####

ErrorDocument 403 //
ErrorDocument 404 //
<IfModule mod_rewrite.c>
  Options -MultiViews
  RewriteRule ^core/js/oc.js$ index.php [PT,E=PATH_INFO:$1]
  RewriteRule ^core/preview.png$ index.php [PT,E=PATH_INFO:$1]
  RewriteCond %{REQUEST_FILENAME} !\.(css|js|svg|gif|png|html|ttf|woff2?|ico|jpg|jpeg|map)$
  RewriteCond %{REQUEST_FILENAME} !core/img/favicon.ico$
  RewriteCond %{REQUEST_FILENAME} !core/img/manifest.json$
  RewriteCond %{REQUEST_FILENAME} !/remote.php
  RewriteCond %{REQUEST_FILENAME} !/public.php
  RewriteCond %{REQUEST_FILENAME} !/cron.php
  RewriteCond %{REQUEST_FILENAME} !/core/ajax/update.php
  RewriteCond %{REQUEST_FILENAME} !/status.php
  RewriteCond %{REQUEST_FILENAME} !/ocs/v1.php
  RewriteCond %{REQUEST_FILENAME} !/ocs/v2.php
  RewriteCond %{REQUEST_FILENAME} !/robots.txt
  RewriteCond %{REQUEST_FILENAME} !/updater/
  RewriteCond %{REQUEST_FILENAME} !/ocs-provider/
  RewriteCond %{REQUEST_FILENAME} !/ocm-provider/
  RewriteCond %{REQUEST_URI} !^/\.well-known/(acme-challenge|pki-validation)/.*
  RewriteRule . index.php [PT,E=PATH_INFO:$1]
  RewriteBase /
  <IfModule mod_env.c>
    SetEnv front_controller_active true
    <IfModule mod_dir.c>
      DirectorySlash off
    </IfModule>
  </IfModule>
</IfModule>

Maybe there is a problem here?

<IfModule mod_rewrite.c>
  RewriteEngine on
  RewriteCond %{HTTP_USER_AGENT} DavClnt
  RewriteRule ^$ /remote.php/webdav/ [L,R=302]
  RewriteRule .* - [env=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
  RewriteRule ^\.well-known/host-meta /public.php?service=host-meta [QSA,L]
  RewriteRule ^\.well-known/host-meta\.json /public.php?service=host-meta-json [QSA,L]
  RewriteRule ^\.well-known/webfinger /public.php?service=webfinger [QSA,L]
  RewriteRule ^\.well-known/nodeinfo /public.php?service=nodeinfo [QSA,L]
  RewriteRule ^\.well-known/carddav /remote.php/dav/ [R=301,L]
  RewriteRule ^\.well-known/caldav /remote.php/dav/ [R=301,L]
  RewriteRule ^remote/(.*) remote.php [QSA,L]
  RewriteRule ^(?:build|tests|config|lib|3rdparty|templates)/.* - [R=404,L]
  RewriteCond %{REQUEST_URI} !^/\.well-known/(acme-challenge|pki-validation)/.*
  RewriteRule ^(?:\.|autotest|occ|issue|indie|db_|console).* - [R=404,L]
</IfModule>

Please have a look at:

Hope this helps.
:smile:

Nope :frowning:

Anyone else have an idea?
That really sucks.


@nxcl – Please understand I am very sorry to hear of your mishap and continuing hassle. Please find my virtual gesture of sympathy included.

Unfortunately, too many cooks spoil the broth. :exploding_head:

IMHO one could have chosen to stay on NC 16 to reduce complexity and avoid the usual beta-tester issues of an early, so-called stable NC 17 release. However, this happens all the time, with almost every fresh major release, unfortunately.
:shield:

I am very sorry not to be able to provide advice more specific to your issue, and I cannot help with the repair, unfortunately.
:safety_vest:


However, here is some more general advice, again aimed mainly at the wider audience of this user forum.

One could consider removing php-fpm and using libapache2-mod-php7.3 instead. This would be quite appropriate for most home-user NC scenarios, and one’s data would stay safe with the more conservative approach, I presume.

php7.3-fpm — server-side, HTML-embedded scripting language (FPM-CGI binary)

This package provides the Fast Process Manager interpreter that runs as a daemon and receives Fast/CGI requests. Note that MOST Apache users probably want the libapache2-mod-php7.3 package.
https://packages.debian.org/en/buster/php7.3-fpm
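Under the assumption of a Debian 10 “buster” box where FPM was wired up via proxy_fcgi and the stock Debian conf, the switch could look roughly like this sketch (conf and module names depend on how FPM was originally set up):

```shell
# Stop routing PHP through FPM (names assume the stock Debian conf)
sudo a2disconf php7.3-fpm
sudo a2dismod proxy_fcgi

# Remove the FPM service ('remove' keeps config files, unlike 'purge')
sudo apt remove php7.3-fpm

# Install and enable the Apache PHP module; on Debian this pulls in
# the prefork MPM, which mod_php requires
sudo apt install libapache2-mod-php7.3
sudo a2enmod php7.3
sudo systemctl restart apache2
```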

Furthermore, one may consult the below article for background information.

Naturally, you may not be using a RasPi embedded system, but the article compares Apache2 vs. NGINX in a reasonable way, based on factual measurements.
:lab_coat:


Good luck and all the best.
:four_leaf_clover:

Happy hacking.
:sunflower:

I will try this!

Hmm, at the moment some employees are testing and see this:

CPU at 398 %??

Now I’m getting 504 GW Timeout …

@nxcl – Sorry, this complete thread is too long and the IMHO somewhat contradictory steps far too confusing for my poor old brain.
:crazy_face:

Just guessing in the dark:

Please understand I do not blame you and am basically just lending a helping hand. IMHO, judging from the many unofficial or vague sources of some of the advice above, you could be in some serious trouble, unfortunately.
:nerd_face:


This is the home-user forum, and although you may ask virtually anything freely, what you get answered for free may vary. Any enterprise should refer to Nextcloud GmbH, where professional support is available:

  1. Customers and Partners (mainly) closed portal — https://portal.nextcloud.com/
  2. An Enterprise Subscription from Nextcloud is available with email and phone support.

NOTE: Please be aware I am not affiliated with Nextcloud GmbH; my views are the free views of a volunteer and an EU citizen, given solely in a private capacity.


Good luck.
:four_leaf_clover:

I think I did
sudo apt-get purge 'php*'

How do I test this correctly?

image

image

Do I have to give www-data permission here?

I installed Redis as described here:

@nxcl – You are in trouble: you may have lost your data, or may yet lose it inadvertently. Honestly, this procedure was not exactly brilliant.
:exploding_head:

Warning: You may lose all of your data (including the data of presumed customers) if you have no backup. :warning:

Please make a full backup of both the database and the nextcloud/data file tree onto an external volume. Afterwards detach this volume from your computer and put this backup volume in a secure and safe place.
:safety_vest:
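As a sketch of such a backup, assuming MySQL/MariaDB, the paths mentioned in this thread, and an external volume mounted at /mnt/backup (all of these are assumptions to adapt):

```shell
# Put Nextcloud into maintenance mode so DB and files stay consistent
sudo -u www-data php /var/www/html/nextcloud/occ maintenance:mode --on

# 1. Database dump (nc_user and db name 'nextcloud' are placeholders;
#    take the real values from config/config.php)
mysqldump --single-transaction -u nc_user -p nextcloud \
  > /mnt/backup/nextcloud-db_$(date +%Y%m%d).sql

# 2. Data directory and config onto the external volume
sudo rsync -aAX /var/nc_data/ /mnt/backup/nc_data/
sudo rsync -aAX /var/www/html/nextcloud/config/ /mnt/backup/nc_config/

sudo -u www-data php /var/www/html/nextcloud/occ maintenance:mode --off
# Then unmount and physically detach /mnt/backup.
```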

IMHO your team seems to be completely in the dark. If you are a business and if you truly have employees please get somebody professional to assist you at your earliest convenience.
:lab_coat:

Unfortunately, you did not, although your statement is correct. Apparently you did not only ‘purge’ the php-fpm module; you may have removed each and every PHP package, and thus the complete PHP software stack including all configuration files, from your system.

  • Please recall that the phrase ‘remove’, not ‘purge’, was used by me above.
  • Please be aware that the wildcard pattern php* matches almost every PHP package on your system.
  • Please consult the CLI manual for the apt command, i.e. just run man apt and look for the entries on ‘remove’ and ‘purge’.
  • Please consider the grave error you may have made: you may have destroyed your PHP configuration completely and inadvertently lost any chance of a safe return.

Please think first and act second, and always keep the protection of your data in mind.
:innocent:


BTW a kind gesture such as marking the problem solved (i.e. :white_check_mark:) or a little ACK on one or more of my comments (i.e. a click on the heart icon :heart:) would show you are satisfied. This would motivate me, like the authors of other advice, to continue lending a hand freely…
:smiley:


Please immediately start studying the basics and in your own best interest consult the documentation:

Take your time to study and make some trials in a safe environment. Consider to install a test system, if you have a spare machine at hand and you can afford the time.

Stick to the better-known Un*x flavours. Naturally, you can choose Ubuntu, CentOS, FreeBSD, ArchLinux and probably NextCloudPi or other flavours besides Debian Linux. Some other OSes are also available …


As I am in a good mood today and lending a hand please find an excerpt from the a.m. APT manpage:

Removing a package removes all packaged data, but leaves usually small (modified) user configuration files behind, in case the remove was an accident. Just issuing an installation request for the accidentally removed package will restore its function as before in that case. On the other hand you can get rid of these leftovers by calling purge even on already removed packages. Note that this does not affect any data or configuration stored in your home directory.
https://manpages.debian.org/stretch/apt/apt.8.en.html

:nerd_face:

Happy hacking.
:sunny:

We are still testing Nextcloud with pilot users only.
We also use the cloud exclusively for temporary file exchange with external users. Accordingly, we don’t make a real backup of the data, which has also been communicated to the employees. We delete all uploaded files automatically after 30 days anyway, so a loss of data is relatively unimportant.

I’m new to the world of Linux.
I already noticed my mistake myself. Well, too late.


Good to know and I rest my case. Your approach is a good one and I may have misunderstood some of your a.m. phrasings, apparently.
:smile:

Furthermore, one could consult the Nextcloud Community Guidelines


Be pragmatic

Nextcloud is a pragmatic community. We value tangible results over having the last word in a discussion. We defend our core values like freedom and respectful collaboration, but we don’t let arguments about minor issues get in the way of achieving more important results. We are open to suggestions and welcome solutions regardless of their origin. When in doubt support a solution which helps getting things done over one which has theoretical merits, but isn’t being worked on. Use the tools and methods which help getting the job done. Let decisions be taken by those who do the work.


Better safe than ugly.
:stuck_out_tongue_closed_eyes:

Good luck.
:four_leaf_clover:

Unfortunately, the problem is still there.

I had already executed the fatal command “sudo apt-get purge ‘php*’” earlier, when I was working on something else related to PHP.
I’m not really sure whether this caused my problem, but I know that I didn’t have the above problem at the beginning.

So I will reinstall my Nextcloud completely.

However, I have set up about 50 local users, which I would rather not rebuild. Instead, I want to reinstall Nextcloud over the weekend so that the users don’t notice anything. They should be able to log back in on Monday with the same accounts, and their documents should still be available.

How can I do this?
I looked around for something, but unfortunately couldn’t find a really good, up-to-date manual.
Do you know of any instructions, ideally self-tested?

Currently I have on my screen:

  • Database backup - HOW?
  • /var/nc_data is on separate drive
  • /var/www/html/nextcloud/config/config.php
  • /etc/apache2/sites-available/ -everything-
  • /etc/php/7.3/apache2/php.ini
  • /etc/fail2ban/jail.d/nextcloud.local

Basically, backing up the DB takes only one command:
https://docs.nextcloud.com/server/16/admin_manual/maintenance/backup.html?highlight=backup#backup-database
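The MySQL/MariaDB variant from that manual page boils down to the following; the bracketed values are placeholders to fill in from your config/config.php:

```shell
# Dump the Nextcloud database into a date-stamped SQL file
mysqldump --single-transaction -h [server] -u [username] -p[password] [db_name] \
  > nextcloud-sqlbkp_$(date +"%Y%m%d").sql
```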

But I believe this is not your problem. Your problem is that Apache2 eats a lot of CPU; for me, Apache is only a very small part of the CPU time, while php-fpm and mysql account for most of it.

I found here that mod_status can give you an overview of what is happening with apache2:

I would also try lsof to check what the hell they are doing, e.g. for your first screenshot:

lsof -p 1828,1756

Try to do a clean PHP 7.3 install after you purge php*.
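A clean reinstall on Debian 10 could look like the following sketch; the module list is the usual set for running Nextcloud on Apache with mod_php, so adjust it to what your apps actually need:

```shell
sudo apt update
# Core PHP plus the Apache module and the common Nextcloud dependencies
sudo apt install php7.3 libapache2-mod-php7.3 \
  php7.3-mysql php7.3-xml php7.3-mbstring php7.3-zip \
  php7.3-curl php7.3-gd php7.3-intl php7.3-bcmath php7.3-gmp \
  php-imagick php-apcu
sudo systemctl restart apache2
```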

As long as you have the problem with a single user, giving access to about 50 users will make things worse and even harder to debug.
To clarify: the problem already appears when you access the instance as a single user (not with many others using it at the same time)?

A few ideas, I’d try out:

  • Put a large file in a folder and download it directly (no PHP or anything, just a static file); this way you can check your basic setup, including the network
  • Did all of your cron jobs run properly? (Check on the admin page when the cron job last ran successfully)
  • Did you try to access via WebDAV directly? Check with WinSCP/Cyberduck or similar whether access is faster (if it is, I’d look into the apps installed and used on the web interface)
  • Disable unused stuff. I’d start with HTTP/2 and all additional apps (especially things like preview generation and anything that works in the background)
  • Check more logs. Did you look at the Nextcloud log? Increase the log level; over 1 minute looks a bit like something running into a timeout (could be an old external storage, an incorrect IPv6 setup, …)
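The first point above (static-file download) can be checked from the command line: curl’s timing variables separate “server is slow to start sending” from “the transfer itself is slow”. The URL and file name below are placeholders for a large static file placed in your web root:

```shell
# time_starttransfer: how long until the first byte arrives
# time_total: how long the whole download takes
curl -s -o /dev/null \
  -w 'first byte after: %{time_starttransfer}s, total: %{time_total}s\n' \
  https://cloud.example.com/static-test.bin
```

A large gap between the two numbers points at the server side (PHP, DB, previews) rather than the network.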

It seems I have the same issue…
CPU at 100 % while the file is loaded into memory before the download starts.
It takes ~50 s before the download of a 500 MB file starts.

Can’t find anything in the config files that might have an impact… also thinking of reinstalling the whole thing.

I’m not sure I’m supposed to like it when other people have the same problem. But at least the problem becomes public, and we might find a solution.

Well, I will reinstall my Nextcloud in the coming days and report back

image

Could it be a problem that /usr/sbin/apache2 is owned by root and not www-data?
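For what it’s worth, the apache2 binary being owned by root is normal; what matters is which user the worker processes run as. A quick sketch of how to check (assumes a standard Linux ps):

```shell
# The master process runs as root; the workers should show up as www-data
ps -o pid,user,comm -C apache2
```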