How do I mount a NextCloud server drive using SFTP, as a virtual drive on Windows?

Using WebDAV would be super easy and convenient for your end-users. Very similar to Google Drive File Stream for G Suite users.

A lot would depend on how well the WebDAV stack is implemented in Nextcloud.

The last time I used it was several years ago with ‘ownCloud’ before Nextcloud existed.

Anyone have experience with how well the WebDAV protocol works with Nextcloud?

BJ

I use webdav to access my nextcloudpi server running on a raspberry pi 2b, so i can't really judge it performance-wise, but it behaves similarly to samba, with the difference that files get synced to the nextcloud database automatically, whereas samba needs the external files scan to do this. so my suggestion would definitely be to prefer webdav over samba. i don't know how sftp would fare, but i guess it has the same shortcomings, since it would be a separate server.

+1. this is something which is often misunderstood. nextcloud is not a “web front end for your filesystem”. it’s a cloud portal that keeps metadata for the files that are kept inside it. you should ensure that all files are processed through nextcloud, either via a sync agent on a computer, via the web portal (file upload/download) or webdav. you shouldn’t be updating files on the backend directly and expecting everything to work cleanly.

while you can use the external files scan, you can cause issues when attributes are applied to files in nextcloud. take for example a share link for a folder or a file. if you rename the file or folder in the back end, it would make sense that the share link would break as the operation wasn’t performed through nextcloud.
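for reference, the external files scan mentioned above is run through the occ tool. a minimal sketch, assuming a typical install under /var/www/nextcloud with the web server running as www-data (your paths and user names may differ):

```shell
# run the scan as the web server user; paths and the user name are assumptions
sudo -u www-data php /var/www/nextcloud/occ files:scan --all

# or limit the scan to a single nextcloud user
sudo -u www-data php /var/www/nextcloud/occ files:scan david
```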

webdav is a great solution. make sure you use SSL and have a valid SSL certificate (ie: LetsEncrypt or a provider purchased certificate) and not a self-signed certificate. I can’t speak for other platforms, but I know that webdav support to map a drive in windows won’t work unless it’s SSL and a certificate that validates correctly. (while you could use a private CA and distribute the CA ROOT certificate manually, it’s hardly worth it when LetsEncrypt is free and even CA purchased certificates are now dirt cheap. You only need a DV [domain validation] certificate)

the webdav link for each user is displayed in the bottom left hand corner of the nextcloud portal.

i hope the rest of your implementation goes well. everyone enjoys reading success stories and implementation experiences.

cheers, wizdude.

Sorry, I’m a bit new to this stuff, but are you saying that I shouldn’t be directly manipulating my files in the file explorer? That I should only be accessing my files through the NextCloud folders?

what i’m saying is that your interaction with nextcloud should either be through the web portal, via a desktop file sync agent or webdav. you shouldn’t be interacting with the backend filesystem directly.

for files that are synced to your local machine via a desktop sync agent, you can happily use file explorer on these as that is exactly how you are meant to use nextcloud. when you add/remove/change files they are sync’d back via the sync agent into nextcloud.

i would avoid solutions (SFTP, SAMBA, etc) that permit you to bypass nextcloud and access the files in the back end directly as you’ll need to use the external files scan to pick up the changes. the exception would be if anyone developed an SFTP or similar plugin for nextcloud, but nothing like this currently exists.

Okay so I can use the file explorer on my windows laptop which is synced by the desktop sync agent, but can I also directly manipulate files on my desktop Linux machine that is hosting the files? I have my Documents and Media folders in the Home folder mounted as external folders to Nextcloud through the external folders app.

So far that seems to be fine, but I am also running into these problems and I can’t figure out how to fix them. Why will these files not sync?

I’m worried that I’ve been using it incorrectly and that’s why I’m encountering this problem with the client.

I might have figured out my problem. It seems the .vscode folder had somehow made itself write-protected on the Windows machine. I’m changing that now, and hopefully that will clear up the last little bits of incorrect syncing I’ve been having. Then I’ll get around to mounting the big archive folder through webdav.

It’s taking a while for the permissions to change on the .vscode folders, so I decided to try connecting the archive folders via webdav, and I keep getting an error message saying that the server can’t be found.

According to my nextcloud web portal my webdav url should be: https://cloud.jackalope.tech/remote.php/webdav/
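For what it’s worth, Nextcloud also exposes a newer per-user DAV endpoint alongside the /remote.php/webdav/ one shown in the portal. A quick sketch of both forms (the user name and the commented-out credentials are placeholders):

```shell
# assemble the two webdav endpoint forms nextcloud exposes
BASE="https://cloud.jackalope.tech"
NC_USER="david"                                      # placeholder user name
LEGACY_URL="${BASE}/remote.php/webdav/"              # the URL shown in the portal
DAV_URL="${BASE}/remote.php/dav/files/${NC_USER}/"   # newer per-user endpoint
echo "$LEGACY_URL"
echo "$DAV_URL"

# quick reachability check (needs real credentials, so left commented out):
# curl -u "$NC_USER":password -X PROPFIND -H "Depth: 0" "$LEGACY_URL"
```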

[redacted guest login after confirming it works]

I’m going to keep fiddling with this too.

Wellll… Now nextcloud simply won’t connect at all. The client on my laptop can’t connect. When I go to cloud.jackalope.tech I can’t connect, and when I try to go to my IP address, it tells me I can’t connect because the cert is for cloud.jackalope.tech. If I then add an exception it takes me to this page:

Then when I checked back on that window after writing up most of this message, it had a login prompt, but logging into my non-admin account gave me a message saying I was locked out and needed to be admin. I’m going to try to log in as admin. But it’s acting weird.

I just installed an Ubuntu security update. Could this have done something to misalign my settings or something?

i’ve just connected from a windows 10 pc and it appears to be working fine. i’m doing a test copy of a music folder which i’ll then upload again into a test folder.

i mapped from my windows 10 pc using:

net use q: \\cloud.jackalope.tech@ssl\remote.php\webdav

and provided the credentials when prompted.
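for what it’s worth, the linux-side equivalent of that mapping would be davfs2; a sketch, assuming the davfs2 package is installed and the mount point already exists:

```shell
# mount the nextcloud webdav share (prompts for username/password)
sudo mount -t davfs https://cloud.jackalope.tech/remote.php/webdav/ /mnt/nextcloud

# unmount when done
sudo umount /mnt/nextcloud
```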

update: i’ve just finished the file copy test. no errors, albeit a bit slow, but then again i’m located over 250ms away :slight_smile:

you’ll see a folder called “test2” where i copied a random music folder into.

perhaps it’s been locked out in the bruteforce table. the error you screenshotted is normal - if the name doesn’t match, nextcloud won’t let you log in.

you can check the bruteforce table manually. see this post for more information:

you could also install the bruteforce plugin which helps to manage this easier:
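if you want to check it by hand, the throttling state lives in the bruteforce attempts table. a sketch, assuming a mysql/mariadb database named nextcloud and the default oc_ table prefix (both are assumptions; the IP is a placeholder):

```shell
# list current bruteforce entries
mysql -u root -p nextcloud -e "SELECT ip, action, occurred FROM oc_bruteforce_attempts;"

# clear the entries for a locked-out address
mysql -u root -p nextcloud -e "DELETE FROM oc_bruteforce_attempts WHERE ip = '203.0.113.5';"
```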


i would avoid doing this. the “files on my desktop linux machine that is hosting the files” is the “backend file system” i am referring to.

if you modify or drop files directly into the backend file system you are going to notice issues until you perform an external files scan.

i have a separate box setup for nextcloud and all of my machines are clients, either using the desktop file sync or connected via webdav.


Okay, gotcha. Just to be clear, I’m not directly manipulating any files that are under the /var/www/nextcloud data folders etc. It’s just the Documents folder in my user’s home directory that’s been mounted. But that counts as backend anyway and shouldn’t be messed with, correct?

If that’s the case, I think what I might do is mount my work drives into nextcloud but sync them to my laptop directly using a sync program. That way, changes to those files made directly in my home/work and home/documents folders won’t mess up or get messed up by nextcloud, but I’ll also have them accessible through nextcloud’s web interface in case I need to share a document or something. Does that sound like a good solution?

And then I’ll be able to access the archive folder remotely using the webdav through nextcloud. Sounds like I won’t even really be using the sync client on my laptop at all.

I use GoodSync on my Windows machine to sync between Windows machines and to upload via SFTP to my personal portfolio website. I’m not sure how well their stuff works on Linux; I believe they only provide a Linux CLI with their enterprise package, but I can always use something like rsync if I have to. From what I’ve read, syncing is pretty straightforward.

Also I can confirm you are correct about the bruteforce thing. I have fail2ban installed (I used Reiner’s excellent ansible script to install) and it was triggered. Not sure if I really need fail2ban if nextcloud already has an integrated solution though.

Thank you everyone for the help! I’m hoping to get this pipeline finished up today. Then I can get back to working on my website, and I want to write a blog post promoting the nextcloud platform when I do. I really like the idea of owning one’s own cloud! I’m trying to get more people I know to try it, and having such a helpful community has been a lifesaver.

Update:

@wizdude @Reiner_Nippes Sorry to bother you both, but I wanted to make sure I understood the situation correctly. I should not be directly manipulating files on my desktop/server using a file manager or programs other than NextCloud, correct? Doing so screws up the sync process for NextCloud?

A process like this however might work better, correct?

[image: diagram of the proposed Syncthing + Nextcloud external storage setup]

I could sync the laptop /user/david/work and the desktop /home/david/work using Syncthing, and then have those files accessible for sharing on the web through NextCloud by mounting the /home/david/work folder into NextCloud using the external folders application, correct?

The web front end for NextCloud would sometimes be a little out of sync with the actual files, because changes made on the laptop would have to sync to the desktop, and then a cron job on nextcloud would have to run to reindex the changes made to those folders, but otherwise it should basically work, right?

I’ve been having some trouble getting Syncthing to work but I think I’ve figured out the problem.

The process you describe here still does what they’ve been warning you about - it modifies files that Nextcloud thinks it owns (even if it’s mounted as external storage), without going through Nextcloud. Modifying the files in your Linux home folder, or syncing them with SyncThing from Windows, is exactly the same as directly messing with files in /var/www/nextcloud/data from Nextcloud’s standpoint. It will still cause the issues they warned about.
Edit: the Nextcloud docs aren’t adamant that problems will occur, but they do say it might not always keep up to date with external storages. I can say that in practice this can lead to a very poor user experience, so the cron job is indeed well recommended. See Adding files to External Storages. Also, internal vs. external files may be handled differently by apps other than the Files app, but they seem to be exactly the same for that purpose at least.

However. As long as you know you are messing with things behind Nextcloud’s back, and can take responsibility for scanning them before having Nextcloud interact with them again, I can say this causes relatively few issues. It’s not supported, but I’ve done it without frying anything.

A note on the diagram, I think the “webdav” dotted line should go from the laptop to Nextcloud, where Nextcloud then has the direct pink line to Archive on the Linux workstation - Nextcloud’s WebDAV is right alongside the GUI, as it’s all PHP. Unless of course you have a different server providing WebDAV from that machine.

Here I am told something different though:

But I think the gist in the end is basically the same. I’m not really supposed to mess with the files in the external storage, but as long as I am aware of how it might mess with things and let the cron job scan for stuff in between my changes, I should be basically okay.

And this is just the default cron job right? I see in the NextCloud options in the web app a thing that talks about running a cron job every 15 minutes and I wanted to double check that is the correct one.
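For reference: the 15-minute entry the admin page talks about is Nextcloud's background-jobs cron (cron.php), which runs queued jobs; picking up out-of-band changes to storage is typically a separate occ files:scan invocation. A sketch of a crontab for the web server user, assuming a standard /var/www/nextcloud install (the scan interval is a guess, not a recommendation):

```shell
# nextcloud background jobs (this is the 15-minute cron the admin page refers to)
*/15 * * * * php -f /var/www/nextcloud/cron.php

# additionally rescan storage so out-of-band changes show up in the web UI
*/15 * * * * php /var/www/nextcloud/occ files:scan --all
```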

As stated on my last post and also others clarified:

  • As long as you keep the SyncThing/SFTP/SMB/NFS/whatever shared folders outside of your Nextcloud internal data folder, everything will work fine and you don’t need any cron job to manually sync the data, AFAIK.
  • I am not sure how SyncThing works, whether it is similar to Nextcloud in having its own data index, or uses an open protocol, but in either case you can then add that folder via external storage to Nextcloud. As said, no manual sync is needed then, as this should be done by Nextcloud automatically on access.
  • And more natively you can access the Nextcloud data via webdav or its own clients, which use the webdav backend as well. And you can access the external folder via the related protocol (SMB, SFTP) or the SyncThing client respectively. All directly from all clients of course (work and home desktop/notebook).

Regarding performance, it does not make much sense to access the external protocol data via Nextcloud. If you care about speed, use the related clients to access/mount those directly on your notebook and workplace machine. The only benefit of accessing those via the Nextcloud interface is that you will find everything within a single UI and can make use of Nextcloud’s share and access features.

Okay, I should make the structure here really clear, because I think some of the confusion is due to the fact that when I first started this thread it was about SFTP connections etc.

The folders I’m trying to sync are not on SFTP. They are the /home/david/work folders on the Linux desktop, which is ALSO running the server. The /home/david/work folders are mounted to the NC server by the external folders app. I am trying to make a system in which I can directly manipulate (delete, rename, edit in a text editor, edit in a photo editor, etc.) the files on my desktop in /home/david/work and have those files sync to my laptop and be available on the web through the NextCloud web GUI.
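One way to sanity-check that kind of setup is to list the configured external mounts from the command line. A sketch, assuming a typical install under /var/www/nextcloud run as www-data (the mount id is a placeholder):

```shell
# show all mounts configured through the external storage app
sudo -u www-data php /var/www/nextcloud/occ files_external:list

# check that a given mount is reachable
sudo -u www-data php /var/www/nextcloud/occ files_external:verify 1
```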

This is true of a computer that is running the client yes? But it’s not true of the desktop which is also running the server and is not running the client, yes? That is what I’m trying to clarify.

@thedonquixotic I’ll try to clear up some of the apparently contradictory statements by quoting his post that he referred to earlier. Emphasis mine.

  1. He’s correct that the internal data folders are intended for only Nextcloud to use. He’s also correct that the cron job running occ files:scan can mitigate but not entirely negate any issues caused by messing with this (tags and comments may be lost, for example).
  2. He’s correct that Nextcloud will try to update its view of external storage without the occ cron job. The problems with this approach are pointed out by @FadeFx (namely, your files can easily get out of sync). Ultimately, I agree with his suggestion to use WebDAV or Nextcloud sync rather than the behind-the-scenes approach you want, but again, I’ve done it the cron-job way without losing anything.

I think where he and I would disagree is with how well Nextcloud does at keeping up with changes to external storages.
This depends on the external storage type, as for example SMB/CIFS has some optional configuration to keep it up to date, but is unworkable without either that or the cron scan.
You may also be interested in an app that was just brought to my attention, which is designed for local external storage you’d be using: Files_INotify. It also shares some issues, though, so read carefully before you decide on it. :slight_smile:

Again, in the documentation section I linked, it’s mentioned that Nextcloud does not always keep up to date on external storages, just like FadeFX said. I’ve personally never had it keep up to date with a local external storage without using the cron job, regardless of the “scan on every access” setting. The cron job is the only thing that’s worked for me other than the preferred methods such as the Nextcloud client or WebDAV access.

@MichaIng Apologies if I’ve misconstrued your meaning, and here’s hoping we can get this all sorted out. :slight_smile:


Jep, sorry for always bringing SFTP and the like into this. It perhaps confuses more than it clarifies. I just wanted to make the point clear that everything you want to share/sync with a method other than Nextcloud internally needs to be accessed by/attached to Nextcloud only as external storage, even if you just want to manually edit local files within a shared folder.
Yeah, and I have also read about many cases where these external storages, depending on the type, do not work perfectly, and Nextcloud might sync them either slowly or with a delay.

If you want to make use of Nextcloud’s share features for all the data, then indeed (like the others stated) it should be easiest to keep everything Nextcloud-internal, skip Syncthing, and use webdav or the Nextcloud clients, respectively, to access the data from the notebook. Your intention to use Syncthing (or SFTP, as your first idea) for part of the files was to use it for large files only, so they can sync faster, right? I think the mentioned issues outweigh the possible advantages, and I agree that you should try to use Nextcloud-internal data + webdav for all files you want to sync.

But now the second question: as your desktop machine is the Nextcloud server machine as well, you want to be able to directly edit the files via desktop software.
I first thought you could just install the Nextcloud client on the server as well. But this would lead to all the files being duplicated, once inside the Nextcloud server data folder and once inside the client data folder. I think merging both together would break the setup.
I just searched around a bit, and it seems that indeed there is no perfect solution for this. It is just not intended that the server and the work desktop are the same machine.

If you just want/need to be able to directly edit files within /user/david/work, then I think the solution you showed in the graphic above is the best you can do: mounting this local directory into Nextcloud via the external storage engine.
I am not sure whether Syncthing is then the best solution to keep those in sync between desktop and notebook.

As said, there is no cron job needed/possible to sync external storages with Nextcloud, AFAIK? occ files:scan just scans the Nextcloud internal data directory.

It would actually be a nice feature to have a specially configured kind of Nextcloud client on the server machine itself that enables direct local editing. The client would then not need to sync the changed/added files to the server, but would just need to trigger the database to recognize the changes, just as if you had changed files on a real external client.