I solved the rescan issue by disabling GeoIP in NGINX.
After restarting NGINX and clicking the rescan icon, the new results appeared after a few minutes.
Nextcloud's scan server seems to be located outside of Germany and outside of the US, which is why the scan failed for me with GeoIP filtering enabled.
I understand that this scanner works by querying domain.com/status.php (and also searching for it in a couple of other locations). I applaud any effort to increase security, as a large portion of my day job is IT security and this is a great initiative. However, I have a few questions.
The idea of having a /status.php page that can be queried by anyone unnerves me a little, and in fact I've blocked public access to it on my server for now. Is there a better way of providing the information to the scanner? A POST request from the Nextcloud server to the scanner could provide the same information without leaking it publicly. You could simply provide a Scan button in the admin interface, or run a regular cron job.
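The push model suggested above could look something like the following minimal sketch. Note the assumptions: the scanner endpoint URL (`https://scan.nextcloud.com/api/report`) is hypothetical, not a real API, and the report fields simply mirror the kind of data status.php exposes.

```python
# Sketch of a push-based status report: the server POSTs its own
# status to the scanner, so status.php need not be world-readable.
# The endpoint URL below is an assumption for illustration only.
import json
import urllib.request


def build_report(installed, version, edition="", maintenance=False):
    """Assemble the status fields a server would push to the scanner."""
    return {
        "installed": installed,
        "version": version,
        "edition": edition,
        "maintenance": maintenance,
    }


def push_report(report, endpoint="https://scan.nextcloud.com/api/report"):
    """POST the report as JSON to the (hypothetical) scanner endpoint."""
    data = json.dumps(report).encode("utf-8")
    req = urllib.request.Request(
        endpoint,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)
```

Triggered from a Scan button or a cron job, this would keep the information flow one-directional, from the server to the scanner, instead of the scanner polling a public page.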
If status.php is required by clients (as suggested elsewhere), shouldn't it be visible only after authentication?
Besides the version info provided by status.php, is there any other information assessed by the scan service to determine the grade? (I'm guessing the presence of SSL is one element, perhaps some of the SSL parameters as well… and what else?) Maybe we could figure out how to get those back to the scan server too.
I asked the same thing, and it essentially boils down to public APIs. At some point you need to know which APIs to access. You can of course run requests against all your endpoints and see if they 404, but that's inefficient. You can of course use a version in your API URL or pass a version token. The issue with that is that it's a pain for forward compatibility, because you never know which new features you can use from an API standpoint.
To solve that, you could add an API call that tells you which API subversion you are dealing with, and now you have arrived at the current situation.
Apart from that, the version is not really an issue, since automated attacks usually just brute-force all vulnerabilities. Personally, I'd also ignore status.php, since it's more work to parse the version than to simply try all known vulnerabilities, starting with the latest one.
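The version-probe idea discussed above can be sketched as follows. This is an illustration, not the scanner's actual logic; the sample payload uses fields that status.php commonly returns (`installed`, `version`, `versionstring`), and the minimum-version check stands in for a client deciding which API features it may use.

```python
# Sketch of a client probing a server's version via a status.php-style
# JSON payload, then gating features on a minimum version.
import json


def parse_version(status_json):
    """Extract a comparable version tuple from a status.php-style payload."""
    status = json.loads(status_json)
    return tuple(int(part) for part in status["version"].split("."))


def supports(version, minimum):
    """True if the server version is at least `minimum`."""
    return version >= minimum


sample = '{"installed":true,"version":"9.0.53.0","versionstring":"9.0.53"}'
v = parse_version(sample)
print(supports(v, (9, 0)))  # prints True: a 9.0.53.0 server meets a 9.0 minimum
```

This is exactly the trade-off described: one extra round trip per server tells the client which features exist, versus just trying every endpoint (or, for an attacker, every exploit) and watching what sticks.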
Hi folks…
I registered an automatic rescan every 8 hours.
Where on the scanner page is the hint that your URL is saved for automatic scans?
I want to delete my URL from the scan. Is that possible?
I have my firewall locked down pretty tight, and the scanner cannot seem to access my Nextcloud instance. This is likely because the IP addresses used by the scanner are blocked by my server. Is there any chance you could post the IP addresses the scanner uses, or even which hosting provider it comes from, so that I could unblock it?
Thanks.
Yeah, and the scan could also be run and its results shown in the admin panel, as part of the other checks that are already done there anyway. In my view it is important enough to surface directly and obviously in Nextcloud, rather than "just" posting the link here, which people might or might not notice.
But of course, thanks for the scan anyway; this was, and is, already a great step. It would just be the consistent next step to make Nextcloud even better at bringing security to the attention of every admin.
I just checked my Apache log, as I do regularly, and found attempts from a French IP to access status.php in my webserver root (Nextcloud is in a totally different location). These fail, because there is no status.php in the webroot ;).
The attempts occur every ~6 hours, ~10 times at once, across the whole period my Apache logs cover (~1 week now).
This has been discussed a lot, and I agree that for security reasons there is a benefit to collecting statistical data about cloud security and giving webhosts with shortcomings a hint. But this looks a little aggressive and inefficient, and it floods my Apache log quite a bit.
Okay, these checks should not cost much server performance or traffic, so it should not matter much how unnecessarily often they are done. But to avoid annoying admins too much, couldn't there be some intelligent scheduling? Usually ISPs rotate IPs once a day when there is no static IP, so I guess doing the check once a day (really once, stopping after the first error (file not found)) should be totally enough. For me it is done ~50 times a day…
If static IPs are known, they could be checked far less often. In my case I have a fixed domain, which I also used on scan.nextcloud.com.
In such cases, domains could be collected and the IPs they point to could be excluded from the check, or at least checked much less often, say once a month or just after every Nextcloud upgrade.
Yeah, just some ideas, assuming this is actually under the control of Nextcloud GmbH; maybe it is run entirely externally with no easy way to influence it. In any case, I will now ban the related IP (it has been the same since the beginning of my log) and take care of my security myself, including running scan.nextcloud.com regularly ;).
I just found the current IP address of the scans. They come from France: 51.15.140.197
Regarding the problems some are having with being scanned too often, I think I'll just leave this address blocked in my firewall and unblock it whenever I do a major upgrade and want to run a security scan.
Today our provider received a letter from the BSI because of this scan. Some months ago we had an old ownCloud instance for testing on a single vhost on the net. Sending the results directly to the BSI is not a good idea on Nextcloud's part. We will delete our Nextcloud instances and warn our customers against using Nextcloud and this security tool… a very bad idea!