Running Nextcloud Client inside Docker on Synology NAS

Hello everyone :wink:

As the title implies, I’m currently trying to get the Nextcloud client running in a Docker container on a Synology NAS. The reason is that I want to sync a folder on my NAS containing university-related material to the Nextcloud server provided by my university. Running an additional Nextcloud server on the NAS itself is difficult because it sits behind a Dual-Stack Lite internet connection, and I don’t want to run the Nextcloud client on another PC in the NAS’s network, to keep the number of required devices to a minimum. A possible solution would be a VM on the NAS with NAS folders mapped into it, but with Docker it should be possible to achieve the same thing without the VM overhead (increased RAM and CPU usage). Besides, this should be a good project for me to learn Docker :wink:

I started with this Dockerfile, was able to create an image from it, and could start the container on the NAS. Currently there is no network storage mapped into the container, and I’m unsure how exactly to achieve this (maybe like here? a rough sketch of what I’m planning is below), but for testing purposes I should be able to sync a folder that exists only inside the container anyway. So I created a folder (/home/sync/) in the container and executed nextcloudcmd /home/sync/ ABCD, and was prompted for my login credentials (ABCD stands for my university’s Nextcloud URL ‘https://box.uni-xxxx.de’ and EFGH for my username there). The login seems to work (wrong credentials fail immediately with an error), but nothing gets synced.
I’ve added the output at the end of this post.
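For the storage-mapping part, this is roughly what I’m planning to try (a sketch only; /volume1/university is just an example path on my NAS and none of this is verified yet):

# build the image from the Dockerfile mentioned above
docker build -t nc-client .

# start the container with a NAS shared folder bind-mounted to /home/sync
# (assumes the image's CMD keeps the container alive, e.g. sleep infinity)
docker run -d --name nc_client1 -v /volume1/university:/home/sync nc-client

# then run the sync manually from inside the container
docker exec -it nc_client1 nextcloudcmd /home/sync/ ABCD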

So it would be great if anyone knows why this fails, how nextcloudcmd behaves inside Docker, and which environment variables I have to map into the container.
Thanks in advance :wink:
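
To make the environment-variable part of my question a bit more concrete: what I currently have in mind is injecting the credentials at docker run time and handing them to nextcloudcmd via its -u/-p options. NC_USER and NC_PASS are just names I made up; as far as I can tell nextcloudcmd does not read any particular variables on its own.

# credentials as container environment variables (variable names invented by me)
docker run -d --name nc_client1 \
    -e NC_USER=EFGH -e NC_PASS='mypassword' \
    -v /volume1/university:/home/sync nc-client

# an exec'd shell sees the container's environment, so the variables can be expanded there
docker exec nc_client1 sh -c 'nextcloudcmd -u "$NC_USER" -p "$NC_PASS" /home/sync/ ABCD'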

##### Terminal Output #####

root@nc_client1:/home/sync# nextcloudcmd /home/sync/ ABCD
Please enter user name: EFGH
Password for user EFGH:
06-25 12:13:56:147 [ info nextcloud.sync.accessmanager ]: 2 “” “ABCD/ocs/v1.php/cloud/capabilities?format=json” has X-Request-ID “bb76ee1f-3b64-4c9f-a6cc-3993a6ecc94b”
06-25 12:13:56:147 [ info nextcloud.sync.networkjob ]: OCC::JsonApiJob created for “ABCD” + “ocs/v1.php/cloud/capabilities” “”
06-25 12:13:57:197 [ info nextcloud.sync.networkjob.jsonapi ]: JsonApiJob of QUrl(“ABCD/ocs/v1.php/cloud/capabilities?format=json”) FINISHED WITH STATUS “OK”
06-25 12:13:57:197 [ debug default ] [ main(int, char**)::<lambda ]: Server capabilities QJsonObject({“core”:{“pollinterval”:60,“webdav-root”:“remote.php/webdav”},“dav”:{“chunking”:“1.0”},“files”:{“bigfilechunking”:true,“blacklisted_files”:[".htaccess"],“undelete”:true,“versioning”:true},“files_sharing”:{“api_enabled”:true,“federation”:{“incoming”:false,“outgoing”:false},“group_sharing”:true,“public”:{“enabled”:true,“expire_date”:{“enabled”:false},“password”:{“enforced”:false},“send_mail”:false,“upload”:true,“upload_files_drop”:true},“resharing”:true,“user”:{“send_mail”:false}},“theming”:{“background”:“ABCD/themes/unimr/core/img/background.jpg”,“color”:"#745bca",“logo”:“ABCD/themes/unimr/core/img/logo.svg”,“name”:“Sync&Share”,“slogan”:“Sync&Share - sicherer Datenaustausch”,“url”:“https://nextcloud.com”}})
06-25 12:13:57:228 [ info nextcloud.sync.database ]: sqlite3 version “3.22.0”
06-25 12:13:57:228 [ info nextcloud.sync.database ]: sqlite3 journal_mode= “wal”
06-25 12:13:57:229 [ info nextcloud.sync.engine ]: There are 44420710400 bytes available at “/home/sync/”
06-25 12:13:57:229 [ info nextcloud.sync.engine ]: Sync with existing sync journal
06-25 12:13:57:229 [ info nextcloud.sync.engine ]: “Using Qt 5.9.5 SSL library OpenSSL 1.1.1 11 Sep 2018 on Ubuntu 18.04.2 LTS”
06-25 12:13:57:229 [ info nextcloud.sync.engine ]: NOT Using Selective Sync
06-25 12:13:57:229 [ info nextcloud.sync.engine ]: #### Discovery start ####################################################
06-25 12:13:57:230 [ info nextcloud.sync.engine ]: Server “”
06-25 12:13:57:230 [ info sync.csync.utils ]: Memory: 544720K total size, 23436K resident, 20856K shared
06-25 12:13:57:230 [ info sync.csync.csync ]: ## Starting local discovery ##
06-25 12:13:57:230 [ info nextcloud.sync.csync.updater ]: ._sync_caf7bb9e6616.db excluded (1)
06-25 12:13:57:230 [ info nextcloud.sync.csync.updater ]: ._sync_caf7bb9e6616.db-wal excluded (1)
06-25 12:13:57:230 [ info nextcloud.sync.csync.updater ]: ._sync_caf7bb9e6616.db-shm excluded (1)
06-25 12:13:57:231 [ info nextcloud.sync.csync.updater ]: <= Closing walk for /home/sync with read_from_db 0
06-25 12:13:57:231 [ info sync.csync.csync ]: Update detection for local replica took 0 seconds walking 0 files
06-25 12:13:57:231 [ info sync.csync.utils ]: Memory: 544720K total size, 23436K resident, 20856K shared
06-25 12:13:57:231 [ info sync.csync.csync ]: ## Starting remote discovery ##
06-25 12:13:57:231 [ info nextcloud.sync.accessmanager ]: 6 “PROPFIND” “ABCD/remote.php/dav/files/EFGH/” has X-Request-ID “0c9e7ddb-6f68-4abf-84b2-940560313667”
06-25 12:13:57:231 [ info nextcloud.sync.networkjob ]: OCC::LsColJob created for “ABCD” + “” “OCC::DiscoverySingleDirectoryJob”
06-25 12:13:57:843 [ warning nextcloud.sync.networkjob ]: QNetworkReply::NetworkError(ContentNotFoundError) “Server replied “404 Not Found” to “PROPFIND ABCD/remote.php/dav/files/EFGH/”” QVariant(int, 404)
06-25 12:13:57:843 [ info nextcloud.sync.networkjob.lscol ]: LSCOL of QUrl(“ABCD/remote.php/dav/files/EFGH/”) FINISHED WITH STATUS “ContentNotFoundError Server replied “404 Not Found” to “PROPFIND ABCD/remote.php/dav/files/EFGH/””
06-25 12:13:57:843 [ warning nextcloud.sync.discovery ]: LSCOL job error “Error transferring ABCD/remote.php/dav/files/EFGH/ - server replied: Not Found” 404 QNetworkReply::NetworkError(ContentNotFoundError)
06-25 12:13:57:843 [ warning nextcloud.sync.engine ]: ERROR during csync_update : "File or directory not found: "
06-25 12:13:57:843 [ info nextcloud.sync.database ]: Closing DB “/home/sync/._sync_caf7bb9e6616.db”
06-25 12:13:57:844 [ info nextcloud.sync.engine ]: CSync run took 614 ms
06-25 12:13:57:844 [ info nextcloud.sync.database ]: Closing DB “/home/sync/._sync_caf7bb9e6616.db”
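
One more observation, in case it helps: the capabilities response above reports “webdav-root”: “remote.php/webdav”, while the failing PROPFIND goes to remote.php/dav/files/EFGH/ and gets the 404. This is only a guess on my side, but if nextcloudcmd accepts a full WebDAV URL as the server argument (owncloudcmd documents that form), something like this might be worth a try:

# untested guess: point the client directly at the webdav-root the server advertises
nextcloudcmd -u EFGH /home/sync/ ABCD/remote.php/webdav/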

Why do you need the NC client?
Synology has a WebDAV client that does the job of syncing files as well.

That would be an okay-ish workaround, I guess. I just set it up, but I’m getting error messages in Synology DSM (like ‘This action is not supported and therefore cannot be completed.’) when I try to copy files from another folder into the WebDAV share. Also, with this share the files are not on the NAS but only inside the university’s box, am I right?
I still think it should work much better with Docker.

I use it as a backup, so the files are synced.

I hope we are talking about Cloud Sync.

https://www.synology.com/de-de/dsm/feature/cloud_sync

Oh, my bad. Cloud Sync does what I was looking for. Previously I had added the WebDAV share as a remote connection in File Station, which was not satisfactory.
Thanks for your help :wink: