Notify_push randomly (not) working

  • Nextcloud Server version (e.g., 29.x.x):
    • 32
  • Operating system and version (e.g., Ubuntu 24.04):
    • Debian 13
  • Reverse proxy and version (e.g., nginx 1.27.2):
    • Nginx Proxy Manager
  • Installation method (e.g., AIO, NCP, Bare Metal/Archive, etc.):
    • Docker compose
  • Are you using Cloudflare, mod_security, or similar? (Yes / No)
    • No

My new notify_push setup randomly alternates between working and not working.

Can anybody point me in the right direction?

How should I debug this? What can be the cause?

I’m getting these responses (randomly not working - 🗴 push server can’t connect to the Nextcloud server):

root@data:/root# /usr/bin/docker exec -u www-data data-app-1 php -d memory_limit=-1 occ notify_push:self-test
✓ redis is configured
✓ push server is receiving redis messages
✓ push server can load mount info from database
🗴 push server can't connect to the Nextcloud server
root@data:/root# /usr/bin/docker exec -u www-data data-app-1 php -d memory_limit=-1 occ notify_push:self-test
✓ redis is configured
✓ push server is receiving redis messages
✓ push server can load mount info from database
✓ push server can connect to the Nextcloud server
✓ push server is a trusted proxy
✓ push server is running the same version as the app
root@data:/root# /usr/bin/docker exec -u www-data data-app-1 php -d memory_limit=-1 occ notify_push:self-test
✓ redis is configured
✓ push server is receiving redis messages
✓ push server can load mount info from database
🗴 push server can't connect to the Nextcloud server

I’m using this docker-compose.yml:

services:
  notify_push:
    image: nextcloud
    restart: unless-stopped
    depends_on:
      - app
    environment:
      - NEXTCLOUD_URL=http://app
    ports:
    - 7867:7867
    entrypoint: /var/www/html/custom_apps/notify_push/bin/x86_64/notify_push /var/www/html/config/config.php
    volumes:
    - /mnt/data:/var/www/html/data
    - html:/var/www/html
  app:
    image: nextcloud
    volumes:
    - /mnt/data:/var/www/html/data
    - html:/var/www/html
    ports:
    - 80:80
    environment:
    - APACHE_DISABLE_REWRITE_IP=1
    - NEXTCLOUD_TRUSTED_DOMAINS=data.localdomain data.ucl.cas.cz app
    - TRUSTED_PROXIES=172.16.0.0/12
...
    depends_on:
    - db
    - onlyoffice
    - redis
    restart: unless-stopped
...

Metrics of notify_push:

root@data:/root# /usr/bin/docker exec -u www-data data-app-1 php -d memory_limit=-1 occ notify_push:metrics
Active connection count: 2
Active user count: 2
Total connection count: 2
Total database query count: 1
Events received: 7
Messages sent: 0
Messages sent (file): 0
Messages sent (notification): 0
Messages sent (activity): 0
Messages sent (custom): 0

Setup (randomly not working - 🗴 push server can’t connect to the Nextcloud server):

root@data:/root# /usr/bin/docker exec -u www-data data-app-1 php -d memory_limit=-1 occ notify_push:setup https://data.ucl.cas.cz/push
✓ redis is configured
✓ push server is receiving redis messages
✓ push server can load mount info from database
🗴 push server can't connect to the Nextcloud server
root@data:/root# /usr/bin/docker exec -u www-data data-app-1 php -d memory_limit=-1 occ notify_push:setup https://data.ucl.cas.cz/push
✓ redis is configured
✓ push server is receiving redis messages
✓ push server can load mount info from database
✓ push server can connect to the Nextcloud server
✓ push server is a trusted proxy
✓ push server is running the same version as the app
  configuration saved
root@data:/root# /usr/bin/docker exec -u www-data data-app-1 php -d memory_limit=-1 occ notify_push:setup https://data.ucl.cas.cz/push
✓ redis is configured
✓ push server is receiving redis messages
✓ push server can load mount info from database
🗴 push server can't connect to the Nextcloud server

config.php (without the mail, db, and redis sections…):

<?php
$CONFIG = array (
  'htaccess.RewriteBase' => '/',
  'memcache.local' => '\\OC\\Memcache\\APCu',
  'apps_paths' => 
  array (
    0 => 
    array (
      'path' => '/var/www/html/apps',
      'url' => '/apps',
      'writable' => false,
    ),
    1 => 
    array (
      'path' => '/var/www/html/custom_apps',
      'url' => '/custom_apps',
      'writable' => true,
    ),
  ),
  'memcache.distributed' => '\\OC\\Memcache\\Redis',
  'memcache.locking' => '\\OC\\Memcache\\Redis',
  'trusted_proxies' => 
  array (
    0 => '172.16.0.0/12',
  ),
  'trusted_domains' => 
  array (
    0 => 'localhost',
    1 => 'data.localdomain',
    2 => 'data.ucl.cas.cz',
    3 => 'app',
  ),
  'datadirectory' => '/var/www/html/data',
  'version' => '32.0.0.13',
  'overwrite.cli.url' => 'https://data.ucl.cas.cz',
  'installed' => true,
  'instanceid' => 'xxx',
  'ldapProviderFactory' => 'OCA\\User_LDAP\\LDAPProviderFactory',
  'theme' => '',
  'default_phone_region' => 'CZ',
  'loglevel' => 0,
  'maintenance' => false,
  'overwritehost' => 'data.ucl.cas.cz',
  'overwriteprotocol' => 'https',
  'data-fingerprint' => 'xxx',
  'updater.release.channel' => 'stable',
  'auth.webauthn.enabled' => false,
  'defaultapp' => '',
  'maintenance_window_start' => 1,
);

Some logs of notify_push:

[2025-10-15 19:13:34.495757 +00:00] DEBUG [hyper::proto::h1::conn] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/conn.rs:283: read eof

[2025-10-15 19:13:34.498120 +00:00] DEBUG [hyper::proto::h1::io] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/io.rs:211: parsed 10 headers

[2025-10-15 19:13:34.498127 +00:00] DEBUG [hyper::proto::h1::conn] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/conn.rs:224: incoming body is empty

[2025-10-15 19:13:34.498168 +00:00] DEBUG [hyper_util::client::legacy::pool] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-util-0.1.16-8d9b05277c7e8da2c93a568989bb6207bef0112e8d17df7a6eda4a3cf143bc5e/src/client/legacy/pool.rs:269: reuse idle connection for ("http", app)

[2025-10-15 19:13:34.524619 +00:00] DEBUG [hyper_util::client::legacy::pool] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-util-0.1.16-8d9b05277c7e8da2c93a568989bb6207bef0112e8d17df7a6eda4a3cf143bc5e/src/client/legacy/pool.rs:395: pooling idle connection for ("http", app)

[2025-10-15 19:13:34.524689 +00:00] DEBUG [notify_push] /build/source/src/lib.rs:282: got remote test cookie 685031571

[2025-10-15 19:13:34.524802 +00:00] DEBUG [hyper::proto::h1::io] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/io.rs:330: flushed 125 bytes

[2025-10-15 19:13:34.524946 +00:00] DEBUG [hyper::proto::h1::conn] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/conn.rs:283: read eof

[2025-10-15 19:13:34.527585 +00:00] DEBUG [hyper::proto::h1::io] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/io.rs:211: parsed 10 headers

[2025-10-15 19:13:34.527594 +00:00] DEBUG [hyper::proto::h1::conn] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/conn.rs:224: incoming body is empty

[2025-10-15 19:13:34.527649 +00:00] DEBUG [hyper_util::client::legacy::pool] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-util-0.1.16-8d9b05277c7e8da2c93a568989bb6207bef0112e8d17df7a6eda4a3cf143bc5e/src/client/legacy/pool.rs:269: reuse idle connection for ("http", app)

[2025-10-15 19:13:34.553780 +00:00] DEBUG [hyper_util::client::legacy::pool] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-util-0.1.16-8d9b05277c7e8da2c93a568989bb6207bef0112e8d17df7a6eda4a3cf143bc5e/src/client/legacy/pool.rs:395: pooling idle connection for ("http", app)

[2025-10-15 19:13:34.553859 +00:00] DEBUG [notify_push] /build/source/src/lib.rs:322: got remote 1.2.3.4 when trying to set remote 1.2.3.4

[2025-10-15 19:13:34.553924 +00:00] DEBUG [hyper::proto::h1::io] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/io.rs:330: flushed 123 bytes

[2025-10-15 19:13:34.554086 +00:00] DEBUG [hyper::proto::h1::conn] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/conn.rs:283: read eof

[2025-10-15 19:13:34.557040 +00:00] DEBUG [hyper::proto::h1::io] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/io.rs:211: parsed 11 headers

[2025-10-15 19:13:34.557047 +00:00] DEBUG [hyper::proto::h1::conn] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/conn.rs:224: incoming body is empty

[2025-10-15 19:13:34.557595 +00:00] DEBUG [hyper::proto::h1::io] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/io.rs:330: flushed 119 bytes

[2025-10-15 19:13:34.557718 +00:00] DEBUG [hyper::proto::h1::conn] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/conn.rs:283: read eof

Is any other information needed?

https://data.ucl.cas.cz/index.php/apps/notify_push/test/version

→ Internal Server Error

Logs:

DomainException No responder registered for format xhtml+xml!

{"reqId":"Tvw0SnGMNA6ldWA8AnMB","level":3,"time":"2025-10-16T05:12:11+00:00","remoteAddr":"172.16.0.143","user":"spravce","app":"index","method":"GET","url":"/index.php/apps/notify_push/test/version","message":"No responder registered for format xhtml+xml!","userAgent":"Mozilla/5.0 (X11; Linux x86_64; rv:143.0) Gecko/20100101 Firefox/143.0","version":"32.0.0.13","exception":{"Exception":"DomainException","Message":"No responder registered for format xhtml+xml!","Code":0,"Trace":[{"file":"/var/www/html/lib/private/AppFramework/Http/Dispatcher.php","line":216,"function":"buildResponse","class":"OCP\\AppFramework\\Controller","type":"->","args":[null,"xhtml+xml"]},{"file":"/var/www/html/lib/private/AppFramework/Http/Dispatcher.php","line":118,"function":"executeController","class":"OC\\AppFramework\\Http\\Dispatcher","type":"->","args":[{"__class__":"OCA\\NotifyPush\\Controller\\TestController"},"version"]},{"file":"/var/www/html/lib/private/AppFramework/App.php","line":153,"function":"dispatch","class":"OC\\AppFramework\\Http\\Dispatcher","type":"->","args":[{"__class__":"OCA\\NotifyPush\\Controller\\TestController"},"version"]},{"file":"/var/www/html/lib/private/Route/Router.php","line":321,"function":"main","class":"OC\\AppFramework\\App","type":"::","args":["OCA\\NotifyPush\\Controller\\TestController","version",{"__class__":"OC\\AppFramework\\DependencyInjection\\DIContainer"},{"_route":"notify_push.test.version"}]},{"file":"/var/www/html/lib/base.php","line":1061,"function":"match","class":"OC\\Route\\Router","type":"->","args":["/apps/notify_push/test/version"]},{"file":"/var/www/html/index.php","line":25,"function":"handleRequest","class":"OC","type":"::","args":[]}],"File":"/var/www/html/lib/public/AppFramework/Controller.php","Line":139,"message":"No responder registered for format xhtml+xml!","exception":[],"CustomMessage":"No responder registered for format xhtml+xml!"},"id":"68f07ead3138a"}

https://data.ucl.cas.cz/index.php/apps/notify_push/test/remote

→ OK

https://data.ucl.cas.cz/index.php/apps/notify_push/test/cookie

→ Internal Server Error

Logs:

DomainException No responder registered for format xhtml+xml!

{"reqId":"7AbRX1HgNS2pAZUQ3eSB","level":3,"time":"2025-10-16T05:19:43+00:00","remoteAddr":"172.16.0.143","user":"spravce","app":"index","method":"GET","url":"/index.php/apps/notify_push/test/cookie","message":"No responder registered for format xhtml+xml!","userAgent":"Mozilla/5.0 (X11; Linux x86_64; rv:143.0) Gecko/20100101 Firefox/143.0","version":"32.0.0.13","exception":{"Exception":"DomainException","Message":"No responder registered for format xhtml+xml!","Code":0,"Trace":[{"file":"/var/www/html/lib/private/AppFramework/Http/Dispatcher.php","line":216,"function":"buildResponse","class":"OCP\\AppFramework\\Controller","type":"->","args":[{"__class__":"OCP\\AppFramework\\Http\\DataResponse"},"xhtml+xml"]},{"file":"/var/www/html/lib/private/AppFramework/Http/Dispatcher.php","line":118,"function":"executeController","class":"OC\\AppFramework\\Http\\Dispatcher","type":"->","args":[{"__class__":"OCA\\NotifyPush\\Controller\\TestController"},"cookie"]},{"file":"/var/www/html/lib/private/AppFramework/App.php","line":153,"function":"dispatch","class":"OC\\AppFramework\\Http\\Dispatcher","type":"->","args":[{"__class__":"OCA\\NotifyPush\\Controller\\TestController"},"cookie"]},{"file":"/var/www/html/lib/private/Route/Router.php","line":321,"function":"main","class":"OC\\AppFramework\\App","type":"::","args":["OCA\\NotifyPush\\Controller\\TestController","cookie",{"__class__":"OC\\AppFramework\\DependencyInjection\\DIContainer"},{"_route":"notify_push.test.cookie"}]},{"file":"/var/www/html/lib/base.php","line":1061,"function":"match","class":"OC\\Route\\Router","type":"->","args":["/apps/notify_push/test/cookie"]},{"file":"/var/www/html/index.php","line":25,"function":"handleRequest","class":"OC","type":"::","args":[]}],"File":"/var/www/html/lib/public/AppFramework/Controller.php","Line":139,"message":"No responder registered for format xhtml+xml!","exception":[],"CustomMessage":"No responder registered for format xhtml+xml!"},"id":"68f08070e615b"}

The error logs could be explained by the browser’s Accept header asking for the xhtml+xml format, which the test controller has no responder for.

But that still doesn’t explain why some of the URLs above return an error.

Hi Antonin,

The issue happens because the notify_push container is using the wrong internal URL to reach your Nextcloud app.

In your docker-compose file, change the line to use your public HTTPS domain instead of the internal service name:

NEXTCLOUD_URL=https://data.ucl.cas.cz

After that, restart your containers and run the notify_push setup again.
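
Concretely, that could look like this (a sketch; the container and service names are taken from the compose file and commands above):

# Recreate the container so the new NEXTCLOUD_URL takes effect
# (a plain restart would keep the old environment):
docker compose up -d notify_push
/usr/bin/docker exec -u www-data data-app-1 php -d memory_limit=-1 occ notify_push:setup https://data.ucl.cas.cz/push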

Also, make sure your reverse proxy is forwarding the correct headers (a quick check is sketched after the list):

  • Host

  • X-Forwarded-For

  • X-Forwarded-Proto (set to “https”)
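
A quick way to verify the headers actually arrive is the app’s test/remote endpoint (the one you already showed returning OK): it should echo back the remote address Nextcloud sees, as suggested by the “got remote … when trying to set remote …” line in your debug log. A sketch:

# Run from a machine outside the proxy. If this prints an internal proxy
# address instead of your own public IP, X-Forwarded-For is being ignored
# or trusted_proxies is wrong.
curl -s https://data.ucl.cas.cz/index.php/apps/notify_push/test/remote; echo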

The “xhtml+xml” error you see is a known issue in Nextcloud 32; it doesn’t affect functionality and can be safely ignored.

This should stabilize your notify_push connection and stop the random failures.

Hello @Antonin_Chadima,

welcome to the Nextcloud community! :handshake:

In the past I hit strange issues when I had multiple (Nextcloud) apps connected to the same reverse-proxy network. Because of that, multiple “app” containers existed and DNS resolution was random, so notify_push was connecting to a random “app”. As marvin suggested, use your public DNS name or make sure the service name resolves uniquely; a quick check is sketched below.
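
For example, something like this could show whether more than one container answers to the “app” name (a sketch; data-notify_push-1 is a guess based on the data-app-1 naming above):

# Resolve the "app" service name from inside the notify_push container a
# few times; a changing address means Docker DNS is round-robining
# between several containers sharing the same name or alias.
for i in $(seq 1 5); do
  docker exec data-notify_push-1 getent hosts app
done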

Thanks a lot.

Using:

environment:
  - NEXTCLOUD_URL=https://data.ucl.cas.cz

leads to exactly the same results: three ERRORs, then one OK, and so on, at random…

The reverse proxy headers are set correctly.

How can I debug this issue?

Should I try to ping from the notify_push container to the app container, or to data.ucl.cas.cz? Should I try wget/curl from notify_push, or capture network packets?

Ping is OK from inside the notify_push container to both app and data.ucl.cas.cz.
curl is not working (?).
wget is 100% OK against both http://app and https://data.ucl.cas.cz.

Should I try some specific address used by the notify_push test?
And/or try some specific headers?
Any ideas?
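
In the meantime, one concrete thing to try could be hitting the same endpoint the self-test fetches its cookie from, in a loop from inside the notify_push container (a sketch; wget is used because it already works there, and data-notify_push-1 is a guessed container name):

# The self-test fetches a test cookie from the Nextcloud app; looping the
# same request should reproduce the intermittent failure.
for i in $(seq 1 20); do
  docker exec data-notify_push-1 wget -qO- http://app/index.php/apps/notify_push/test/cookie && echo " ok" || echo "FAILED"
done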

I tried the test client:

antoninchadima@penguin:~/Downloads$ ./test_client-x86_64-unknown-linux-musl https://data.ucl.cas.cz chadima NA24.pw999
[2025-10-16 12:40:59.538173 +02:00] INFO [test_client] test_client/src/main.rs:39: Found push server at wss://data.ucl.cas.cz/push/ws
[2025-10-16 12:40:59.967311 +02:00] INFO [test_client] test_client/src/main.rs:73: Authenticated

It got stuck there, and after a while:

Error:   × IO error: Connection reset by peer (os error 104)
  ╰─▶ Connection reset by peer (os error 104)

Logs:

[2025-10-16 10:41:00.609055 +00:00] DEBUG [hyper::proto::h1::io] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/io.rs:211: parsed 9 headers

[2025-10-16 10:41:00.609089 +00:00] DEBUG [hyper::proto::h1::conn] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/conn.rs:224: incoming body is empty

[2025-10-16 10:41:00.609122 +00:00] DEBUG [notify_push] /build/source/src/lib.rs:262: new websocket connection from Some(172.16.0.1)

[2025-10-16 10:41:00.609211 +00:00] DEBUG [hyper::proto::h1::io] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-0.14.32-41dfc780fdec9373c01bae43289ea34c972e40ee3c9f6b3c8801a35f35586ce7/src/proto/h1/io.rs:330: flushed 166 bytes

[2025-10-16 10:41:00.672617 +00:00] DEBUG [notify_push::nc] /build/source/src/nc.rs:35: Verifying credentials for chadima

[2025-10-16 10:41:00.672665 +00:00] DEBUG [reqwest::connect] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/reqwest-0.12.22-cbc931937e6ca3a06e3b6c0aa7841849b160a90351d6ab467a8b9b9959767531/src/connect.rs:789: starting new connection: https://data.ucl.cas.cz/

[2025-10-16 10:41:00.673376 +00:00] DEBUG [hyper_util::client::legacy::connect::http] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-util-0.1.16-8d9b05277c7e8da2c93a568989bb6207bef0112e8d17df7a6eda4a3cf143bc5e/src/client/legacy/connect/http.rs:768: connecting to 147.231.80.37:443

[2025-10-16 10:41:00.673653 +00:00] DEBUG [hyper_util::client::legacy::connect::http] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-util-0.1.16-8d9b05277c7e8da2c93a568989bb6207bef0112e8d17df7a6eda4a3cf143bc5e/src/client/legacy/connect/http.rs:771: connected to 147.231.80.37:443

[2025-10-16 10:41:00.897814 +00:00] DEBUG [hyper_util::client::legacy::pool] /nix/store/0znp1v8mdyfkxabl7l7b8vqdxyfpsyc0-crates-io-dependencies/hyper-util-0.1.16-8d9b05277c7e8da2c93a568989bb6207bef0112e8d17df7a6eda4a3cf143bc5e/src/client/legacy/pool.rs:395: pooling idle connection for ("https", data.ucl.cas.cz)

[2025-10-16 10:41:00.897896 +00:00] INFO [notify_push::connection] /build/source/src/connection.rs:111: new websocket authenticated as chadima

Hi Antonin,

This issue happens because of how the push server connects through HTTPS inside Docker. You can fix it completely with these steps:

  1. In your docker-compose file, set
    NEXTCLOUD_URL=http://app
    (Use the internal container name, not the public HTTPS domain.)

  2. In config.php, make sure you have:
    'overwritehost' => 'data.ucl.cas.cz',
    'overwriteprotocol' => 'https',
    'trusted_proxies' => ['172.16.0.0/12'],
    'trusted_domains' => ['data.ucl.cas.cz', 'app', 'localhost'],

  3. In Nginx Proxy Manager, open the proxy for data.ucl.cas.cz:

    • Disable HTTP/2

    • Add these under “Advanced”:
      proxy_read_timeout 3600;
      proxy_send_timeout 3600;
      proxy_set_header Upgrade $http_upgrade;
      proxy_set_header Connection "upgrade";
      proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
      proxy_set_header X-Forwarded-Proto https;

  4. Restart both containers (app and notify_push), then run
    occ notify_push:setup https://data.ucl.cas.cz/push

After this, the push server should connect consistently without random errors.
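
To confirm the fix actually holds, the self-test can be run in a loop (reusing the command from above); it should then pass every time:

for i in $(seq 1 10); do
  /usr/bin/docker exec -u www-data data-app-1 php -d memory_limit=-1 occ notify_push:self-test
done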

Hello, could you fix this? I am struggling with the same issue, except I am not using a Docker install but a manual install, with apache2 as the reverse proxy.
The self-test also only randomly works. The version and cookie pages also lead to an internal server error, with the same error in the logs.

I am running the notify_push server as a systemd service:

[Unit]
After=network.target mariadb.service apache2.service redis.service
Description = Push daemon for Nextcloud clients
Documentation = https://github.com/nextcloud/notify_push

[Service]
Environment = PORT=7867
Environment = NEXTCLOUD_URL=https://cloud.mydomain.com
ExecStart = /var/www/html/nextcloud/apps/notify_push/bin/x86_64/notify_push /var/www/html/nextcloud/config/config.php

Type = notify
User = www-data
Restart = always
RestartSec = 60

[Install]
WantedBy = multi-user.target
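
To watch the daemon while reproducing the failure, following its journal could help (assuming the unit file is installed as notify_push.service):

journalctl -u notify_push.service -f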

In config.php in trusted_proxies I have:

'trusted_proxies' =>
array (
  0 => 'my public ipv4',
  1 => 'my public ipv6',
  2 => '127.0.0.1',
  3 => '::1',
),

In the apache vhost I have this:

ProxyPass /push/ws ws://127.0.0.1:7867/ws
ProxyPass /push/ http://127.0.0.1:7867/
ProxyPassReverse /push/ http://127.0.0.1:7867/
RequestHeader set Host %{HTTP_HOST}s
RequestHeader set X-Forwarded-For %{REMOTE_ADDR}s
RequestHeader set X-Forwarded-Proto "https"
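
To sanity-check the proxy path itself, the push server’s test endpoint could be queried both directly and through apache (a sketch; it assumes the daemon exposes /test/remote, which the self-test traffic above suggests):

# Directly against the daemon, bypassing apache:
curl -s http://127.0.0.1:7867/test/remote; echo
# Through the reverse proxy:
curl -s https://cloud.mydomain.com/push/test/remote; echo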

I’m a bit stuck…

Edit: apart from these errors, I think the app works.

I’m in the same situation as you, and I think it works as well… I created a .txt file through the web interface and it got synced to my PC right away, all while running the test_client binary, which reported when a change occurred…