robots.txt prevents indexing of shared PDF documents

Hello,
I have a website that uses shared PDF documents from a Nextcloud installation (on a subdomain), but Google Search Console warns me that they are blocked by robots.txt.

I think Google Search Console is referring to the Nextcloud robots.txt:
User-agent: *
Disallow: /
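
Would it work to allow only the public share links, for example like this? (Just a guess on my side, assuming the shares use Nextcloud's default /s/ path; as far as I know, Googlebot gives a more specific Allow rule precedence over Disallow: /.)

User-agent: *
Allow: /s/
Disallow: /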

Is there a way to allow search engines to index these shared documents (public ones, without password protection)?
Thanks.

Sorry, no idea. But perhaps it is not really possible with those URLs; you find very few public Nextcloud shared documents via e.g. Google.

Perhaps you need the “Sharing Path” app, which is meant for CDN and SEO optimization.
https://apps.nextcloud.com/apps/sharingpath

Alternative to Sharing Path:
I use an HTTP 301 redirect PHP script outside Nextcloud to map nice names to Nextcloud public shares (folders and files). You can also use .htaccess and URL rewriting. Then Google can index them.
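
The idea is roughly the following (an untested sketch; the file names, share tokens, domain, and the redirect.php name are placeholders for your own values):

<?php
// redirect.php (hypothetical name), hosted outside Nextcloud.
// Maps "nice" file names to Nextcloud public share links with a
// permanent (301) redirect.
$shares = [
    'manual.pdf'    => 'https://cloud.example.com/s/AbCdEfGhIjKlMnO',
    'pricelist.pdf' => 'https://cloud.example.com/s/PqRsTuVwXyZ1234',
];

// basename() strips any path components from the requested name.
$name = basename($_GET['file'] ?? '');

if (isset($shares[$name])) {
    // 301 tells search engines the redirect is permanent.
    header('Location: ' . $shares[$name], true, 301);
} else {
    http_response_code(404);
    echo 'Not found';
}

With a .htaccess rule like this (assuming Apache with mod_rewrite enabled), a nice URL such as https://example.com/docs/manual.pdf is handed to the script:

RewriteEngine On
RewriteRule ^docs/(.+)$ /redirect.php?file=$1 [L]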


I’ll try testing the “Sharing Path” app, but I would prefer to use the normal shared-file URL, partly because Safari blocks PDFs by default and you have to enable a plugin to open them, whereas with shared-file URLs we get the built-in viewer.

About the “http-301-rewrite-php-script outside Nextcloud”: could you give me more information about it, with a concrete example? Thanks!

@Michele1
I’ve sent you a PM.


Anyway, I hope that future versions of Nextcloud will be more SEO friendly.