X-Robots-Tag does not support more than "none"

Hi,

I am using Nextcloud 19.0.1 in Docker from Linuxserver.io behind Nginx.
I always get the message “the "X-Robots-Tag" HTTP header is not configured to equal to "none"” when more than "none" is configured, e.g. `add_header X-Robots-Tag "none, nosnippet, noarchive";`. When only "none" is configured, everything is fine. Any ideas why? Is it a bug?
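For reference, this is roughly what the two variants look like in my Nginx server block (a minimal sketch, not my full config):

```nginx
# Passes the Nextcloud setup check:
add_header X-Robots-Tag "none" always;

# Triggers the warning, even though it is at least as strict:
# add_header X-Robots-Tag "none, nosnippet, noarchive" always;
```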

Anyone? :sweat_smile:

Bump bump :slight_smile:

Bump… :neutral_face:

And again :grin:

Hi, I see the same issue in my installation; adding more directives to the X-Robots-Tag header triggers this error. Maybe this is because "none" is enough, as it should already prevent crawlers from indexing the website, so there is no need for nosnippet and noarchive?

Sadly, that's not the case. "none" is Google-only; Bing and other crawlers do not support it:
https://www.bing.com/webmaster/help/which-robots-metatags-does-bing-support-5198d240

I have not given up yet :upside_down_face:

Please use the support intro:

Some or all of the below information will be requested if it isn't supplied; for fastest response please provide as much as you can:

Nextcloud version (eg, 18.0.2):
Operating system and version (eg, Ubuntu 20.04):
Apache or nginx version (eg, Apache 2.4.25):
PHP version (eg, 7.1):

The issue you are facing:

Is this the first time you’ve seen this error? (Y/N):

Steps to replicate it:

The output of your Nextcloud log in Admin > Logging:

PASTE HERE

The output of your config.php file in /path/to/nextcloud (make sure you remove any identifiable information!):

PASTE HERE

The output of your Apache/nginx/system log in /var/log/____:

PASTE HERE

Nextcloud version 20.0.0
Operating system and version is Docker on QNAP, see here for more information about the versions inside the container.

The issue you are facing:
The X-Robots-Tag does not support more than “none”

Is this the first time you’ve seen this error? (Y/N):
No, for several weeks and versions now.

Steps to replicate it:
Easy: just add "none, nosnippet, noarchive" to your config and tell me the result. It would be great if at least one person could test that in their environment. Maybe it's only a Docker issue; that would be good to know too.

Logs are empty; this is a fresh Nextcloud install on version 20.

The output of your config.php file in /path/to/nextcloud (make sure you remove any identifiable information!):

<?php
$CONFIG = array (
  'memcache.local' => '\\OC\\Memcache\\APCu',
  'datadirectory' => '/data',
  'instanceid' => '',
  'passwordsalt' => '',
  'secret' => '',
  'trusted_domains' => 
  array (
    0 => 'asdf',
  ),
  'dbtype' => 'mysql',
  'version' => '20.0.0.9',
  'overwrite.cli.url' => 'https://asdf/nextcloud',
  'dbname' => 'nextcloud',
  'dbhost' => 'mariadb',
  'dbport' => '',
  'dbtableprefix' => 'oc_',
  'mysql.utf8mb4' => true,
  'dbuser' => '',
  'dbpassword' => '',
  'installed' => true,
  'trusted_proxies' => ['swag'],
  'overwritewebroot' => '/nextcloud',
);

I’m running Nextcloud in Docker on QNAP as well (with the Traefik reverse proxy), and I can confirm that the system check reports an issue if X-Robots-Tag is extended with additional directives (like nofollow, noarchive). The check only succeeds if you define exactly X-Robots-Tag: none, not more, not less. Technically, additional directives are more “secure”, but the check fails then…
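The behavior we're both seeing is consistent with the setup check doing an exact string comparison against "none" rather than parsing the individual directives. A minimal sketch of that hypothesis (this is illustrative, not the actual Nextcloud source):

```python
def x_robots_tag_check_passes(header_value: str) -> bool:
    """Mimic a setup check that only accepts the literal value 'none'.

    Hypothetical reconstruction of the observed behavior: extra
    directives make the string differ from 'none', so the check fails
    even though the header is at least as restrictive.
    """
    return header_value.strip().lower() == "none"


print(x_robots_tag_check_passes("none"))                        # True
print(x_robots_tag_check_passes("none, nosnippet, noarchive"))  # False
```

If that is indeed how the check works, supporting additional directives would mean splitting the header on commas and verifying that "none" (or equivalent directives) is among them, which is probably what a fix from the devs would look like.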

Thanks for the confirmation wwe.

Guess it has to be fixed by the devs then :wink: