Protect Nextcloud using Anubis

I’d like to use Anubis to protect my instance from AI-powered plunder bots (and, more broadly, to mitigate DDoS), but putting it in front as-is will certainly break API calls from clients (since they can’t/won’t execute JS).

Fortunately, I can whitelist URLs and/or user agents using regexes in Anubis’ botPolicy.json.

Here’s some example configuration:

    {
      "name": "robots-txt",
      "path_regex": "^/robots.txt$",
      "action": "ALLOW"
    },
    {
      "name": "internal-traffic",
      "remote_addresses": ["10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16"],
      "action": "ALLOW"
    },
    {
      "name": "generic-browser",
      "user_agent_regex": "Mozilla|Opera",
      "action": "CHALLENGE"
    }

Now, if I want to protect my Nextcloud instance, what do you think would be best?

  • Allowing the .well-known and API URLs?
    • Is there documentation for the API endpoints?
  • Allowing specific user agents (the Nextcloud desktop, Windows, and Android clients)?
    • Is there documentation covering all the clients’ user agents?
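For what it’s worth, here’s a sketch of what Nextcloud-specific rules could look like in the same botPolicy.json format. The paths cover the .well-known discovery URLs plus the WebDAV/OCS endpoints, and the user-agent regex matches strings I believe the official desktop client (mirall), the mobile apps, and DAVx5 report — treat both regexes as assumptions to verify against your own access logs, not an exhaustive list:

    {
      "name": "nextcloud-well-known",
      "path_regex": "^/\\.well-known/(carddav|caldav|webfinger|nodeinfo)$",
      "action": "ALLOW"
    },
    {
      "name": "nextcloud-dav-and-api",
      "path_regex": "^/(remote\\.php|public\\.php|ocs|status\\.php)(/.*)?$",
      "action": "ALLOW"
    },
    {
      "name": "nextcloud-clients",
      "user_agent_regex": "Nextcloud|mirall|DAVx5",
      "action": "ALLOW"
    }

The path rules are broader than the user-agent ones, so you may prefer to start with only the user-agent rule and add paths as clients break.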

Disclaimer: Anubis is not an all-powerful protection, and allowing specific URLs/user agents is like punching a big hole in it. BUT it’s still better than getting DDoSed by some moron’s botnet and waiting for them to calm down.

I’d say that’s probably a non-issue with Nextcloud. I mean, since most of Nextcloud’s content is behind authentication, there’s not much for the plunder bots to crawl, so I’d say the load it would cause if they tried is negligible.

As far as general protection against bots is concerned, I’d say Fail2Ban is totally sufficient for a personal instance, as it is rather unlikely that someone will launch an actual DDoS attack against your server. If you want more preventive protection, you may want to take a look at CrowdSec.
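If you do go the Fail2Ban route, a minimal setup could look something like the following. The log path, filter regexes, and date pattern are assumptions based on Nextcloud’s default JSON log format — adapt them to your own installation before relying on them:

    # /etc/fail2ban/filter.d/nextcloud.conf (illustrative)
    [Definition]
    failregex = ^.*"remoteAddr":"<HOST>".*"message":"Login failed:.*$
                ^.*"remoteAddr":"<HOST>".*"message":"Trusted domain error.*$
    datepattern = ,?\s*"time"\s*:\s*"%%Y-%%m-%%d[T ]%%H:%%M:%%S(%%z)?"

    # /etc/fail2ban/jail.d/nextcloud.local (illustrative)
    [nextcloud]
    enabled  = true
    port     = 80,443
    filter   = nextcloud
    logpath  = /var/log/nextcloud/nextcloud.log
    maxretry = 3
    bantime  = 86400

After editing, reload with `fail2ban-client reload` and check the jail with `fail2ban-client status nextcloud`.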

Otherwise, the solution would be a reverse proxy with a web application firewall and commercial filter lists, which costs money, or a CDN like Cloudflare. The latter is free for home users, but comes with its own challenges, and Cloudflare can read your traffic.


Fully agree. As a SOHO user you can’t protect yourself against a real targeted DoS attack (simply because it’s easy to saturate the pipe of a typical home or small-company connection); only some powerful infrastructure in front of your installation can do that. Fail2Ban or CrowdSec should suffice for normal crawler bots.

Even if you use another product, take a look at CrowdSec’s Nextcloud whitelist: it isn’t complete, but it’s already a good jump start.
