Talk with HPB gives HTTP 429

  • Nextcloud Server version (e.g., 29.x.x):
    • 31.0.3
  • Operating system and version (e.g., Ubuntu 24.04):
    • openSUSE Tumbleweed
  • Web server and version (e.g., Apache 2.4.25):
    • Apache 2.4.63
  • Reverse proxy and version (e.g., nginx 1.27.2):
    • Apache 2.4.63
  • PHP version (e.g., 8.3):
    • 8.4
  • Is this the first time you’ve seen this error? (Yes / No):
    • Yes
  • When did this problem seem to first start?
    • After installation of HPB (2 weeks ago)
  • Installation method (e.g., AIO, NCP, Bare Metal/Archive, etc.):
    • Archive
  • Are you using Cloudflare, mod_security, or similar? (Yes / No):
    • No

Summary of the issue you are facing:

I am running Nextcloud on an Apache web server and the high performance backend (HPB) in a Docker container on the same machine. When there are 2 or 3 people in a meeting / Talk conversation, additional people who try to join get the error message:

Failed to join the conversation.
Please try to reload the page.

When I disable the high performance backend, everything seems to work fine.

Apart from the error message in the Nextcloud log, I did not find any other errors in the Apache logs or in the HPB Docker container (I have to say that I did not find any log files inside the container; the only output I could get was via docker logs).
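For completeness, this is roughly how I read the container output; the container name signaling is a placeholder for whatever yours is called:

# Show the last 10 minutes of HPB output and keep following it
docker logs -f --since 10m signaling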

Log entries

Nextcloud

I see the following entry multiple times in the Nextcloud log:

{"reqId":"Qsjs97yonUcLOpNJzNFh","level":3,"time":"2025-05-04T18:16:10+00:00","remoteAddr":"172.17.0.2","user":false,"app":"spreed","method":"POST","url":"/ocs/v2.php/apps/spreed/api/v3/signaling/backend","message":"Failed to send message to signaling server","userAgent":"nextcloud-spreed-signaling/2.0.2~docker","version":"31.0.3.2","exception":{"Exception":"GuzzleHttp\\Exception\\ClientException","Message":"Client error: `POST https://nc.cronic.de/standalone-signaling/api/v1/room/h5p468yo` resulted in a `429 Too Many Requests` response:\nToo many requests\n\n","Code":429,"Trace":[{"file":"/srv/www/htdocs/nc/3rdparty/guzzlehttp/guzzle/src/Middleware.php","line":72,"function":"create","class":"GuzzleHttp\\Exception\\RequestException","type":"::","args":["*** sensitive parameters replaced ***"]},{"file":"/srv/www/htdocs/nc/3rdparty/guzzlehttp/promises/src/Promise.php","line":209,"function":"{closure:{closure:{closure:GuzzleHttp\\Middleware::httpErrors():60}:61}:67}","class":"GuzzleHttp\\Middleware","type":"::","args":["*** sensitive parameters replaced ***"]},{"file":"/srv/www/htdocs/nc/3rdparty/guzzlehttp/promises/src/Promise.php","line":158,"function":"callHandler","class":"GuzzleHttp\\Promise\\Promise","type":"::"},{"file":"/srv/www/htdocs/nc/3rdparty/guzzlehttp/promises/src/TaskQueue.php","line":52,"function":"{closure:GuzzleHttp\\Promise\\Promise::settle():156}","class":"GuzzleHttp\\Promise\\Promise","type":"::","args":["*** sensitive parameters replaced ***"]},{"file":"/srv/www/htdocs/nc/3rdparty/guzzlehttp/promises/src/Promise.php","line":251,"function":"run","class":"GuzzleHttp\\Promise\\TaskQueue","type":"->"},{"file":"/srv/www/htdocs/nc/3rdparty/guzzlehttp/promises/src/Promise.php","line":227,"function":"invokeWaitFn","class":"GuzzleHttp\\Promise\\Promise","type":"->"},{"file":"/srv/www/htdocs/nc/3rdparty/guzzlehttp/promises/src/Promise.php","line":272,"function":"waitIfPending","class":"GuzzleHttp\\Promise\\Promise","type":"->"},{"file":"/srv/www/htdocs/nc/3rdparty/guzzlehttp/promises/src/Promise.php","line":229,"function":"invokeWaitList","class":"GuzzleHttp\\Promise\\Promise","type":"->"},{"file":"/srv/www/htdocs/nc/3rdparty/guzzlehttp/promises/src/Promise.php","line":69,"function":"waitIfPending","class":"GuzzleHttp\\Promise\\Promise","type":"->"},{"file":"/srv/www/htdocs/nc/3rdparty/guzzlehttp/guzzle/src/Client.php","line":189,"function":"wait","class":"GuzzleHttp\\Promise\\Promise","type":"->"},{"file":"/srv/www/htdocs/nc/lib/private/Http/Client/Client.php","line":277,"function":"request","class":"GuzzleHttp\\Client","type":"->"},{"file":"/srv/www/htdocs/nc/apps/spreed/lib/Signaling/BackendNotifier.php","line":57,"function":"post","class":"OC\\Http\\Client\\Client","type":"->","args":["*** sensitive parameters replaced 
***"]},{"file":"/srv/www/htdocs/nc/apps/spreed/lib/Signaling/BackendNotifier.php","line":134,"function":"doRequest","class":"OCA\\Talk\\Signaling\\BackendNotifier","type":"->"},{"file":"/srv/www/htdocs/nc/apps/spreed/lib/Signaling/BackendNotifier.php","line":218,"function":"backendRequest","class":"OCA\\Talk\\Signaling\\BackendNotifier","type":"->"},{"file":"/srv/www/htdocs/nc/apps/spreed/lib/Signaling/Listener.php","line":334,"function":"roomSessionsRemoved","class":"OCA\\Talk\\Signaling\\BackendNotifier","type":"->"},{"file":"/srv/www/htdocs/nc/apps/spreed/lib/Signaling/Listener.php","line":137,"function":"notifySessionLeftRoom","class":"OCA\\Talk\\Signaling\\Listener","type":"->"},{"file":"/srv/www/htdocs/nc/apps/spreed/lib/Signaling/Listener.php","line":81,"function":"handleExternalSignaling","class":"OCA\\Talk\\Signaling\\Listener","type":"->"},{"file":"/srv/www/htdocs/nc/lib/private/EventDispatcher/ServiceEventListener.php","line":68,"function":"handle","class":"OCA\\Talk\\Signaling\\Listener","type":"->"},{"file":"/srv/www/htdocs/nc/3rdparty/symfony/event-dispatcher/EventDispatcher.php","line":220,"function":"__invoke","class":"OC\\EventDispatcher\\ServiceEventListener","type":"->"},{"file":"/srv/www/htdocs/nc/3rdparty/symfony/event-dispatcher/EventDispatcher.php","line":56,"function":"callListeners","class":"Symfony\\Component\\EventDispatcher\\EventDispatcher","type":"->"},{"file":"/srv/www/htdocs/nc/lib/private/EventDispatcher/EventDispatcher.php","line":67,"function":"dispatch","class":"Symfony\\Component\\EventDispatcher\\EventDispatcher","type":"->"},{"file":"/srv/www/htdocs/nc/lib/private/EventDispatcher/EventDispatcher.php","line":79,"function":"dispatch","class":"OC\\EventDispatcher\\EventDispatcher","type":"->"},{"file":"/srv/www/htdocs/nc/apps/spreed/lib/Service/ParticipantService.php","line":927,"function":"dispatchTyped","class":"OC\\EventDispatcher\\EventDispatcher","type":"->"},{"file":"/srv/www/htdocs/nc/apps/spreed/lib/Controller/SignalingController.php","line":866,"function":"leaveRoomAsSession","class":"OCA\\Talk\\Service\\ParticipantService","type":"->"},{"file":"/srv/www/htdocs/nc/apps/spreed/lib/Controller/SignalingController.php","line":673,"function":"backendRoom","class":"OCA\\Talk\\Controller\\SignalingController","type":"->"},{"file":"/srv/www/htdocs/nc/lib/private/AppFramework/Http/Dispatcher.php","line":200,"function":"backend","class":"OCA\\Talk\\Controller\\SignalingController","type":"->"},{"file":"/srv/www/htdocs/nc/lib/private/AppFramework/Http/Dispatcher.php","line":114,"function":"executeController","class":"OC\\AppFramework\\Http\\Dispatcher","type":"->"},{"file":"/srv/www/htdocs/nc/lib/private/AppFramework/App.php","line":161,"function":"dispatch","class":"OC\\AppFramework\\Http\\Dispatcher","type":"->"},{"file":"/srv/www/htdocs/nc/lib/private/Route/Router.php","line":307,"function":"main","class":"OC\\AppFramework\\App","type":"::"},{"file":"/srv/www/htdocs/nc/ocs/v1.php","line":49,"function":"match","class":"OC\\Route\\Router","type":"->"},{"file":"/srv/www/htdocs/nc/ocs/v2.php","line":7,"args":["/srv/www/htdocs/nc/ocs/v1.php"],"function":"require_once"}],"File":"/srv/www/htdocs/nc/3rdparty/guzzlehttp/guzzle/src/Exception/RequestException.php","Line":111,"message":"Failed to send message to signaling server","exception":[],"CustomMessage":"Failed to send message to signaling server"},"id":"6817d84e75f6f"}

Web server / Reverse Proxy

I did not see any errors in the logs, only the following entries in the access log:

nc-access_log:127.0.0.1 - - [04/May/2025:20:14:15 +0200] "POST /standalone-signaling/api/v1/room/h5p468yo HTTP/1.1" 429 18 "-" "Nextcloud Server Crawler"
nc-access_log:127.0.0.1 - - [04/May/2025:20:14:21 +0200] "POST /standalone-signaling/api/v1/room/h5p468yo HTTP/1.1" 429 18 "-" "Nextcloud Server Crawler"
nc-access_log:127.0.0.1 - - [04/May/2025:20:14:48 +0200] "POST /standalone-signaling/api/v1/room/h5p468yo HTTP/1.1" 429 18 "-" "Nextcloud Server Crawler"
nc-access_log:127.0.0.1 - - [04/May/2025:20:14:56 +0200] "POST /standalone-signaling/api/v1/room/h5p468yo HTTP/1.1" 429 18 "-" "Nextcloud Server Crawler"
nc-access_log:127.0.0.1 - - [04/May/2025:20:16:05 +0200] "POST /standalone-signaling/api/v1/room/h5p468yo HTTP/1.1" 429 18 "-" "Nextcloud Server Crawler"
nc-access_log:127.0.0.1 - - [04/May/2025:20:16:06 +0200] "POST /standalone-signaling/api/v1/room/h5p468yo HTTP/1.1" 429 18 "-" "Nextcloud Server Crawler"
nc-access_log:127.0.0.1 - - [04/May/2025:20:16:10 +0200] "POST /standalone-signaling/api/v1/room/h5p468yo HTTP/1.1" 429 18 "-" "Nextcloud Server Crawler"

Configuration

The output of occ config:list system:

{
    "system": {
        "instanceid": "***REMOVED SENSITIVE VALUE***",
        "passwordsalt": "***REMOVED SENSITIVE VALUE***",
        "secret": "***REMOVED SENSITIVE VALUE***",
        "trusted_domains": [
            "nc.server.tld",
            "..."
        ],
        "datadirectory": "***REMOVED SENSITIVE VALUE***",
        "overwrite.cli.url": "https:\/\/nc.server.tld",
        "dbtype": "mysql",
        "version": "31.0.3.2",
        "dbname": "***REMOVED SENSITIVE VALUE***",
        "dbhost": "***REMOVED SENSITIVE VALUE***",
        "dbport": "",
        "dbtableprefix": "oc_",
        "dbuser": "***REMOVED SENSITIVE VALUE***",
        "dbpassword": "***REMOVED SENSITIVE VALUE***",
        "logtimezone": "UTC",
        "installed": true,
        "maintenance": false,
        "theme": "",
        "loglevel": 2,
        "updater.release.channel": "stable",
        "memcache.local": "\\OC\\Memcache\\APCu",
        "filelocking.enabled": true,
        "memcache.locking": "\\OC\\Memcache\\Redis",
        "redis": {
            "host": "***REMOVED SENSITIVE VALUE***",
            "port": 6379,
            "timeout": 0,
            "password": "***REMOVED SENSITIVE VALUE***",
            "dbindex": 0
        },
        "mysql.utf8mb4": true,
        "default_phone_region": "DE",
        "mail_smtpmode": "smtp",
        "mail_smtpsecure": "tls",
        "mail_sendmailmode": "smtp",
        "mail_from_address": "***REMOVED SENSITIVE VALUE***",
        "mail_domain": "***REMOVED SENSITIVE VALUE***",
        "mail_smtphost": "***REMOVED SENSITIVE VALUE***",
        "mail_smtpport": "587",
        "maintenance_window_start": 2
    }
}

429 means brute force protection. I assume you tried too many times with a wrong shared secret. Re-check the secret and the URL, then restart the HPB.
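A quick way to compare both sides is sketched below; the container name signaling, the config path (the Docker image is commonly run with the config mounted at /config/server.conf), and openSUSE's wwwrun web server user are assumptions, so adjust to your setup:

# Shared secret on the signaling side ([backend] section of server.conf)
docker exec signaling grep -A 5 '\[backend\]' /config/server.conf

# URL and secret configured on the Nextcloud side
sudo -u wwwrun php /srv/www/htdocs/nc/occ talk:signaling:list

# Restart the HPB after changing anything
docker restart signaling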

I do not believe this is the issue. I just tried with an intentionally wrong secret and could not join the conversation. The error in the log is:

Client error: `POST https://nc.server.tld/standalone-signaling/api/v1/room/bvg7w4w5` resulted in a `403 Forbidden` response: Authentication check failed 

With the original secret, two or three people could join a video call before the error occurred (at least yesterday). As stated above, the error is HTTP 429 in this case.

Today it seems that nobody can join a call. The HTTP 429 error is gone. In the Docker logs of the HPB I see the following error message:

hub.go:862: Register user user1@backend-1 from 2003:e6:170e:7900:683c:d7f4:5432:6f64 in unknown-country (Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:138.0) Gecko/20100101 Firefox/138.0) some_base64_string (private=some_other_base64_string)
backend_client.go:167: Could not send request {"type":"room","room":{"version":"1.0","roomid":"bvg7w4w5","userid":"user1","sessionid":"..."}} to https://nc.server.tld/ocs/v2.php/apps/spreed/api/v3/signaling/backend: Post "https://nc.server.tld/ocs/v2.php/apps/spreed/api/v3/signaling/backend": context deadline exceeded
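One way to check whether the container can reach that backend URL at all would be something like the following (signaling is a placeholder for the container name; if curl is not available in the image, run the same request from the host):

# Any HTTP status code (even 400/405) proves basic connectivity;
# "context deadline exceeded" means no response arrived at all
docker exec signaling curl -s -o /dev/null -w '%{http_code}\n' \
    https://nc.server.tld/ocs/v2.php/apps/spreed/api/v3/signaling/backend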

I tried increasing the Apache timeout, but this did not change anything:

<VirtualHost *:443>
  DocumentRoot "/srv/www/htdocs/nc"
  ServerName ...
  ServerAdmin ...
  ErrorLog /var/log/apache2/nc-error_log
  CustomLog /var/log/apache2/nc-access_log combined

  SSLEngine on

  SSLCertificateKeyFile ...
  SSLCertificateFile ...

  TimeOut 300

  <IfModule mod_headers.c>
    Header always set Strict-Transport-Security "max-age=15768000; includeSubDomains; preload"
  </IfModule>

  # Enable proxying WebSocket requests to the standalone signaling server.
  ProxyPass "/standalone-signaling/" "ws://127.0.0.1:8080/" connectiontimeout=30 timeout=300

  RequestHeader set X-Real-IP %{REMOTE_ADDR}s

  RewriteEngine On
  # WebSocket connections from the clients.
  RewriteRule ^/standalone-signaling/spreed/$ - [L]
  # Backend connections from Nextcloud.
  RewriteRule ^/standalone-signaling/api/(.*) http://127.0.0.1:8080/api/$1 [L,P]
</VirtualHost>

This is correct; that happens before brute force protection kicks in. The idea is to prevent people from randomly guessing secrets and eventually figuring out the right one. A single test with a wrong secret will not trigger anything here.

The error you mention today seems to be the other way round. Check whether you ended up in brute force protection on the Nextcloud side. Also check that the connection between the signaling server and Nextcloud works correctly.
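For the Nextcloud side, occ can clear recorded brute force throttling; a minimal sketch, assuming 172.17.0.2 (the container address from the log above) is the throttled client and wwwrun is the web server user. You can also inspect the oc_bruteforce_attempts table directly:

# Clear any throttling recorded for the HPB's address
sudo -u wwwrun php /srv/www/htdocs/nc/occ security:bruteforce:reset 172.17.0.2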

OK. I emptied the database table oc_bruteforce_attempts. Additionally, I found that the coturn password was wrong. Now it seems to work as long as there are fewer than 5 people in the call. For the 5th person joining there is only a spinning wheel. There are a lot of repeating error messages in the Docker logs of the HPB:

hub.go:2666: No MCU subscriber found for session CSlWCFAL7Sg9C31-Z2n6A_lQ335nqMaDOyDHVnlOQA98PT1ndmozSDlpZTVwNDlfRHhVSE0zZ1cweUFTNnhkaTdFRmtPTUxuNUlweGpIV19KRDVwSXF2cDZGSEJ2fDQzMTA2NDY0NzE= to send &{Type:candidate Sid:6007343880900517 RoomType:video Payload:map[candidate:map[candidate:candidate:13 1 TCP 2105393407 192.168.42.3 9 typ host tcptype active sdpMLineIndex:0 sdpMid:0]] Bitrate:0 AudioCodec: VideoCodec: VP9Profile: H264Profile: offerSdp:<nil> answerSdp:<nil>} to Fz6k9Y9QUX0pzZbYXaaqpIgGh8ZoVwGhDX9NDqtjpGF8aWhiMUxpSFlyMmFBR2xKTnN4VEF3US1kRUJVLWNlTTZJV1R6dGRiYTNKNnYxOHpBWFRrXzRjeEV8NTMwMDY0NjQ3MQ==

or

[WARN] [5846893409475670] ICE failed for component 1 in stream 1, but let's give it some time... (trickle pending, answer received, alert not set)

or

janus_client.go:480: Unable to deliver message {
   "janus": "detached",
   "session_id": 1171844923872441,
   "sender": 3706428384055338
}. Handle 3706428384055338 gone?
mcu_janus.go:685: Attached subscriber to room 2894711437156378 of publisher CSlWCFAL7Sg9C31-Z2n6A_lQ335nqMaDOyDHVnlOQA98PT1ndmozSDlpZTVwNDlfRHhVSE0zZ1cweUFTNnhkaTdFRmtPTUxuNUlweGpIV19KRDVwSXF2cDZGSEJ2fDQzMTA2NDY0NzE= in plugin janus.plugin.videoroom in session 1171844923872441 as 4278146555349861
mcu_janus_subscriber.go:193: Already connected subscriber 157 for video, leaving and re-joining on handle 4278146555349861
mcu_janus_client.go:165: Started listener &{{janus.plugin.videoroom map[room:2894711437156378 started:ok videoroom:event]} map[] 1171844923872441 4278146555349861}
[ERR] [plugins/janus_videoroom.c:janus_videoroom_handler:11164] Already in as a subscriber on this handle

I tried reloading the browser multiple times (including clearing the cache) and restarting the Docker container of the HPB, always with the same result.

How would I check the connection between Signaling and Nextcloud?