How do I mount a NextCloud server drive using SFTP, as a virtual drive on Windows?

I’ve just connected from a Windows 10 PC and it appears to be working fine. I’m doing a test copy of a music folder, which I’ll then upload again into a test folder.

I mapped from my Windows 10 PC using:

net use q: \\cloud.jackalope.tech@ssl\remote.php\webdav

and provided the credentials when prompted.

Update: I’ve just finished the file copy test. No errors, albeit a bit slow, but then again I’m located over 250 ms away :slight_smile:

You’ll see a folder called “test2” into which I copied a random music folder.

Perhaps it’s been locked out in the bruteforce table. The error you screenshotted is normal - if the name doesn’t match, Nextcloud won’t let you log in.

You can check the bruteforce table manually. See this post for more information:
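If you’d rather query the table directly, it looks roughly like this. This is only a sketch: it assumes a MySQL/MariaDB backend, the default “oc_” table prefix, and a database and database user both named “nextcloud”, and the IP address is a placeholder - adjust all of that to your own setup.

# list the recorded failed login attempts
mysql -u nextcloud -p nextcloud -e "SELECT * FROM oc_bruteforce_attempts;"
# clear the entries for the client that got throttled/locked out (placeholder IP)
mysql -u nextcloud -p nextcloud -e "DELETE FROM oc_bruteforce_attempts WHERE ip = '203.0.113.10';"

Once the rows for your IP are gone, the login throttling should be lifted.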

You could also install the Brute-force settings app, which makes this easier to manage:


I would avoid doing this. The “files on my desktop Linux machine that is hosting the files” are the “backend file system” I am referring to.

If you modify or drop files directly into the backend file system, you are going to notice issues until you perform an external file scan.
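To be clear, the external file scan I’m talking about is the occ files:scan command. A rough sketch, assuming Nextcloud lives in /var/www/nextcloud and your web server runs as www-data:

# rescan all files for every user
sudo -u www-data php /var/www/nextcloud/occ files:scan --all
# or limit the rescan to a single user (the user ID here is a placeholder)
sudo -u www-data php /var/www/nextcloud/occ files:scan david

Until that scan has run, the database still reflects the old state of the files, which is why the web UI and clients look out of sync.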

I have a separate box set up for Nextcloud and all of my machines are clients, either using the desktop file sync or connected via WebDAV.


Okay, gotcha. Just to be clear, I’m not directly manipulating any files that are under the /var/www/nextcloud data folders, etc. It’s just my user’s home/documents folder that’s been mounted. But that counts as backend anyway and shouldn’t be messed with, correct?

If that’s the case, I think what I might do is mount my work drives into Nextcloud but sync them to my laptop directly using a sync program. That way, changes and manipulations to those files directly in my home/work and home/documents folders don’t mess up or get messed up by Nextcloud, but I’ll also have them accessible through Nextcloud’s web interface in case I need to share a document or something. Does that sound like a good solution?

And then I’ll be able to access the archive folder remotely using WebDAV through Nextcloud. Sounds like I won’t even really be using the sync client on my laptop at all.

I use GoodSync on my Windows machine to sync between Windows machines and upload via SFTP to my personal portfolio website. I’m not sure how well their stuff works on Linux (I believe they only provide a Linux CLI with their enterprise package), but I can always use something like rsync or whatever if I have to. From what I’ve read, syncing is pretty straightforward.

Also, I can confirm you are correct about the bruteforce thing. I have fail2ban installed (I used Reiner’s excellent Ansible script to install it) and it was triggered. Not sure if I really need fail2ban if Nextcloud already has an integrated solution, though.
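In case anyone else runs into the same thing, checking and clearing a fail2ban ban looks roughly like this - the jail name depends on how the playbook configured it, so treat “nextcloud” and the IP below as placeholders:

sudo fail2ban-client status                               # list the configured jails
sudo fail2ban-client status nextcloud                     # show currently banned IPs for that jail
sudo fail2ban-client set nextcloud unbanip 203.0.113.10   # unban a specific (placeholder) IP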

Thank you everyone for the help! I’m hoping to get this pipeline finished up today. Then I can get back to working on my website, and I want to write up a blog post promoting the Nextcloud platform when I do. I really like the idea of owning one’s own cloud! I’m trying to get more people I know to try it, and having such a helpful community has been a lifesaver.

Update:

@wizdude @Reiner_Nippes Sorry to bother you both, but I wanted to make sure I understood the situation correctly. I should not be directly manipulating files on my desktop/server using a file manager or programs other than Nextcloud, correct? Doing so screws up the sync process for Nextcloud?

A process like this, however, might work better, correct?

(diagram: laptop /user/david/work and desktop /home/david/work kept in sync with Syncthing, with the desktop work folder mounted into Nextcloud as external storage)

I could sync the laptop /user/david/work and the desktop /home/david/work using Syncthing, and then have those files made accessible for sharing etc. on the web through Nextcloud by mounting the /home/david/work folder into Nextcloud using the external folders application, correct?

The web front end for Nextcloud would sometimes be a little out of sync with the actual files, because changes made on the laptop would have to sync to the desktop and then a cron job on Nextcloud would have to run in order to reindex the changes that had been made to those folders, but otherwise it should basically work, right?

I’ve been having some trouble getting Syncthing to work but I think I’ve figured out the problem.

The process you describe here still does what they’ve been warning you about - it modifies files that Nextcloud thinks it owns (even if it’s mounted as external storage), without going through Nextcloud. Modifying the files in your Linux home folder, or syncing them with SyncThing from Windows, is exactly the same as directly messing with files in /var/www/nextcloud/data from Nextcloud’s standpoint. It will still cause the issues they warned about.
Edit: The Nextcloud docs aren’t adamant that problems will occur, but they say it might not always keep up to date with external storages. I can say in practice that can lead to a very poor user experience, so the cron job is indeed well recommended. See Adding files to External Storages. Also, internal vs external files may have different handling when it comes to apps other than the Files app, but they seem to be exactly the same for that purpose at least.

However. As long as you know you are messing with things behind Nextcloud’s back, and can take responsibility for scanning them before having Nextcloud interact with them again, I can say this causes relatively few issues. It’s not supported, but I’ve done it without frying anything.

A note on the diagram: I think the “webdav” dotted line should go from the laptop to Nextcloud, where Nextcloud then has the direct pink line to Archive on the Linux workstation - Nextcloud’s WebDAV is right alongside the GUI, as it’s all PHP. Unless of course you have a different server providing WebDAV from that machine.

Here I am told something different though:

But I think the gist in the end is basically the same. I’m not really supposed to mess with the files in the external storage, but as long as I am aware of how it might mess with things and let the cron job scan for stuff in between my changes, I should be basically okay.

And this is just the default cron job, right? I see in the Nextcloud options in the web app a thing that talks about running a cron job every 15 minutes, and I wanted to double check that is the correct one.

As stated in my last post, and as others also clarified:

  • As long as you keep the SyncThing/SFTP/SMB/NFS/whatever shared folders outside of your Nextcloud internal data folder, everything will work fine and you don’t need any cron job to manually sync the data, AFAIK.
  • I am not sure how SyncThing works, whether it’s similar to Nextcloud with its own data index or an open protocol, but in any case you can then add that folder via external storage to Nextcloud. As said, there is no manual sync needed then, as this should be done by Nextcloud automatically on access.
  • And more natively, you can access the Nextcloud data via WebDAV or its own clients, which use the WebDAV backend as well. And you can access the external folder via the related protocol (SMB, SFTP) or the SyncThing client respectively. All directly from all clients of course (work and home desktop/notebook).

In terms of performance, it does not make much sense to access the external protocol data via Nextcloud. If you care about speed, use the related clients to access/mount those directly on your notebook and workplace machine. The only benefit of accessing those via the Nextcloud interface is that you will find everything within a single UI and can make use of Nextcloud’s share and access features.

Okay, I should make the structure here really clear, because I think some of the confusion comes from the fact that when I first started this thread it was about SFTP connections etc.

The folders I’m trying to sync are not SFTP. They are the /home/david/work folders on the Linux desktop, which is ALSO running the server. The /home/david/work folders are mounted to the NC server by the external folders app. I am trying to make a system in which I can directly manipulate (delete, rename, edit in a text editor, edit in a photo editor, etc.) the files on my desktop in /home/david/work and have those files sync to my laptop, and be available on the web through the Nextcloud web GUI.

This is true of a computer that is running the client, yes? But it’s not true of the desktop, which is also running the server and is not running the client, yes? That is what I’m trying to clarify.

@thedonquixotic I’ll try to clear up some of the apparently contradictory statements by quoting his post that he referred to earlier. Emphasis mine.

  1. He’s correct that the internal data folders are intended for only Nextcloud to use. He’s also correct that the cron job running occ files:scan can mitigate but not entirely negate any issues caused by messing with this (tags and comments may be lost, for example).
  2. He’s correct that Nextcloud will try to update its view of external storage options without the occ cron job. The problems with this approach are pointed out by @FadeFx (namely, your files can easily get out of sync). Ultimately, I agree with his suggestion to use WebDAV or Nextcloud sync rather than the behind-the-scenes stuff you want, but again, I’ve done it the cron job way without losing anything.

I think where he and I would disagree is with how well Nextcloud does at keeping up with changes to external storages.
This depends on the external storage type, as for example SMB/CIFS has some optional configuration to keep it up to date, but is unworkable without either that or the cron scan.
You may also be interested in an app that was just brought to my attention, which is designed for the local external storage you’d be using: Files_INotify. It also shares some issues, though, so read carefully before you decide on it. :slight_smile:

Again, in the documentation section I linked, it’s mentioned that Nextcloud does not always keep up to date on external storages, just like FadeFX said. I’ve personally never had it keep up to date with a local external storage without using the cron job, regardless of the “scan on every access” setting. The cron job is the only thing that’s worked for me other than the preferred methods such as the Nextcloud client or WebDAV access.

@MichaIng Apologies if I’ve misconstrued your meaning, and here’s hoping we can get this all sorted out. :slight_smile:


Jep, sorry for always bringing SFTP and stuff into this. That perhaps confuses more than it clarifies. I just wanted to make the point clear that everything you want to share/sync with a method other than Nextcloud internally needs to be accessed by/attached to Nextcloud only as external storage, even if you want to manually edit local files within a shared folder.
Yeah, and I have also read about many cases where these external storages, depending on the type, do not work perfectly and Nextcloud might sync them either slowly or with a delay.

If you want to make use of Nextcloud’s share features for all the data, then indeed (as the others stated) it should be easiest to keep everything internal to Nextcloud, skip Syncthing, and use WebDAV or the Nextcloud clients to access the data from the notebook. Your intention to use Syncthing (or SFTP, as your first idea) for part of the files was to use it for large files only, so they can sync faster, right? I think the mentioned issues outweigh the possible advantages, and I agree that you should try to use Nextcloud internal data + WebDAV for all files you want to sync.

But now the second question is: as your desktop machine is the Nextcloud server machine as well, you want to be able to directly edit the files via desktop software.
I first thought you could just install the Nextcloud client on the server as well, but this would lead to all the files being duplicated: once inside the Nextcloud server data folder and once inside the client data folder. I think merging both together would break the setup.
I just searched around a bit and it seems that indeed there is no perfect solution for this. It is just not intended that server and work desktop are on the same machine.

If you just want/need to be able to directly edit files within /user/david/work, then I think the solution you showed in the graphic above is the best you can do: mounting this local directory into Nextcloud via the external storage engine.
Not sure if Syncthing is then the best solution to keep those in sync between desktop and notebook.

As said, there is no cron job needed/possible to sync external storages with Nextcloud, AFAIK? occ files:scan just scans the Nextcloud internal data directory.

It would actually be a nice feature to have a specially configured kind of Nextcloud client on the server machine itself that enables direct local editing. The client would then not need to sync the changed/added files to the server, but would just need to trigger the database to recognize those changes, just as if you had changed files on a real external client.

Just for the record, problems may not occur when adding files via SMB or any other external file transfer protocol, but they may occur when moving files inside your Nextcloud which already have share links, for example, or are shared with other users on the system. I also uploaded lots of files via SMB, which was fine to initially fill up my installation with the files I wanted there, but later I discovered that WebDAV is the better way to access them. Also, WebDAV is not limited to the local network like Samba.
Btw, regarding the “not found” message, I had similar problems, but found out that you have to change something in the registry.

Taken from here: https://docs.nextcloud.com/server/13/user_manual/files/access_webdav.html
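For reference, the registry change described on that page is the BasicAuthLevel value of the WebClient service. A rough sketch, run from an elevated command prompt; it is only needed when the server is reached over plain HTTP, since the Windows default (1) already allows basic auth over HTTPS:

REM 2 = allow basic authentication over both HTTP and HTTPS (default 1 = HTTPS only)
reg add HKLM\SYSTEM\CurrentControlSet\Services\WebClient\Parameters /v BasicAuthLevel /t REG_DWORD /d 2 /f
REM restart the WebDAV client service so the change takes effect
net stop webclient
net start webclient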

Ah, and I had to use the path and syntax from there as well, as the path shown in the Nextcloud settings seems to not work on Windows, but I could use it on Android using Solid Explorer.

net use Z: https://<drive_path>/remote.php/dav/files/USERNAME/ /user:youruser yourpassword
where <drive_path> is the URL to your Nextcloud server.
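For example, with a placeholder hostname and username, and a * so Windows prompts for the password instead of it ending up in your command history:

net use Z: https://cloud.example.com/remote.php/dav/files/david/ * /user:david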


That’s actually exactly what the occ files:scan command does, and it works on either internal or external storage, per that same doc page. I think you’re incorrect on these two points.


Wow, indeed, I never knew that. Apologies for the wrong info then. With the --path option the cron job can then even be limited to the actual external storage, so it does not rescan all the internal files every time as well. Hmm, but this should actually be done by Nextcloud itself, ideally in a more efficient way, but somehow it does not work reliably?

Jep, if a regular occ files:scan is okay for one’s needs (although it takes quite a lot of time, and thus leads to a larger delay), then it is not a huge issue to make local changes, as long as the file IDs are not used by any Nextcloud app other than the Files app.
But file tags, comments, shares, previews and all of this are always attached to the file ID within the NC database. On files:scan (at least on internal storage), if a file was externally moved, Nextcloud will create a new file ID, so all of the above is lost. Also, the database fills up with obsolete entries, and I am not sure how well Nextcloud cleans this up automatically. Something was done about it, but some time ago obsolete oc_filecache entries, for example, just added up, leading to a very large database table…


Okay, that’s a lot of discussion to take in so I’m going to try and do my best to parse these things piece by piece.

And from the documentation:

We recommend configuring the background job Webcron or Cron (see Defining background jobs) to enable Nextcloud to automatically detect files added to your external storages.
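For reference, the background job the docs mean there is the system cron entry from “Defining background jobs”. A minimal sketch, assuming Nextcloud is installed in /var/www/nextcloud and the web server runs as www-data (added with sudo crontab -u www-data -e):

# run Nextcloud's background jobs every 15 minutes as the web server user
*/15 * * * * php -f /var/www/nextcloud/cron.php

With that in place, “Cron” is selected under the background jobs setting on the admin page.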

Okay, so a couple of things. I want to make sure I’m understanding these correctly; tell me which of these things I’m getting right or wrong. The context for all of these questions is a vanilla install of Nextcloud 13.

A cron job can be set up to enable Nextcloud to automatically detect files added, deleted, edited or moved on the local storage of the desktop, which is mounted to Nextcloud as an external folder.

I can have a cron job that runs every time a file is edited, moved, deleted, etc.?

In your experience, even with a cron job attempting to rescan every time a change is made to the files, Nextcloud has never been able to keep up with the changes.

You are saying this also because the documentation says “Nextcloud may not always be able to find out what has been changed remotely (files changed without going through Nextcloud), especially when it’s very deep in the folder hierarchy of the external storage.” And what that means is that sometimes folders are so deeply nested that they might not get covered by the cron job correctly.

— Why don’t they get covered by the cron job? Do they time out or something?

The solution to this problem is to make sure you run a second cron job which runs every 15 minutes and scans the entirety of the Nextcloud folders, both the external storage folders and the internal data folders.

– I am confused when you say this: “The problems with this approach are pointed out by @FadeFx, and I agree with his suggestion to use WebDAV or Nextcloud sync rather than behind-the-scenes stuff you want, but again, I’ve done what you want without losing anything.” I don’t follow what you’re trying to say. I started this topic asking about SFTP, and since then I have decided to use WebDAV, and it works totally fine. BUT my new issue, which seems to have sprung out of talking about this, is that I am having trouble getting Nextcloud to sync properly. And I was told in this thread that it was because I was directly manipulating these files locally on my desktop server, in their local folders, which were mounted as external folders in NC. If the only thing you’re trying to say is:

Use WebDAV to mount instead of SFTP

Then I agree, and that’s not really the issue I’m having right now.
Or perhaps you mean something else. I spoke with some people on the Seafile forums and they suggested that, in order to access files locally on the desktop on which I was running the server, I could mount parts of my Seafile database into my local environment to access those files directly. It would allow me to directly use the files on the same desktop as the one running the server, but it would come at a small performance loss.

Is that what you’re suggesting?

Or something else?

Okay, now I think I’m starting to understand. What y’all are saying is that I should keep everything internal, and just use WebDAV to access those files locally on my desktop, correct?

No, my intention was to use SFTP to mount an archive HDD that was too large to fit on my laptop. It’s a 3 TB archive, and I only have 500 GB of space on the laptop. I wanted to be able to access those files if I needed to, or drag and drop files into folders on that drive to get them off my laptop drive, BUT that was only for the Archive drive. I figured that out. That issue is covered. I am currently using WebDAV with it and I’m perfectly happy with its speed at this time.

The second part of the issue was to use either Nextcloud or Syncthing to keep the local files on the Windows laptop at /users/david/work and the local files on the Linux desktop at /home/david/work in sync with each other. Whether I used Syncthing or Nextcloud was a question of which would screw the Nextcloud database up the least. I originally planned to use Nextcloud, but I seemed to be told that I should not do that because manipulating the local files on the desktop directly would create problems for the sync, so I thought that if I used a third-party sync function, I could keep them in sync and just allow a little slack for the web interface of Nextcloud to catch up as it ran its cron job every 15 minutes.

One possible solution would be to put everything in the internal data folders, and then mount the Nextcloud drives to the desktop via WebDAV, similar to what the Seafile people suggested I try. This would create a performance hit, though, and if I’m going to do that I kind of think I might as well just use Seafile.

So I could do this solution, but I might have trouble with the syncing for the reasons that have been listed above, i.e.:

  1. Direct manipulation of local files on the desktop that have been mounted as external drives to Nextcloud causes a cron job to run which checks for changes, but in very deep folders this might end up failing and causing errors.

Wait… okay so no cron job runs when changes are made to the local folders?

EDIT: Okay, I see. There was some confusion on this point, but I see that actually it does run the cron job for both. Gotcha.

Cool, thank you, I believe I’ve already made that change though. Thanks. The issues I’m having with files not being found are not to do with the WebDAV stuff; they have to do with the externally mounted folders, which are locally present on both the laptop and the desktop.

Okay, so it sounds like this is indeed possible, aside from the possibility of making a huge database by accident.

One last question

There are 2 major reasons I wanted to use Nextcloud:

  1. Its fully fleshed-out web GUI, which makes it easy to access files online and to share those files with clients or friends; and

  2. The ability to locally edit files on the desktop that is also running the server.

If doing so is possible but will likely cause syncing headaches, then it seems my only solution is this:
I use Seafile to sync my laptop and desktop. I use Seafile to mount my Archive drive to my laptop. So that I can manipulate the files on my desktop locally, I mount the Seafile server files to the desktop’s local folders using a Seafile virtual drive. Then I have that virtual drive mounted to Nextcloud using the external folders app so that I can access and share those files from the web.

Would this be possible? Can Nextcloud mount a virtual Seafile drive as an external folder?

Basically I see three solutions here, and which one is best depends on the facts of how Nextcloud works, etc.

Which of these is most viable?

  1. Mount the local /home/david/work folders to Nextcloud using the external folders app, and sync this to my laptop using Nextcloud. Directly manipulate files locally on the desktop and on the laptop, and allow the cron job to keep stuff in sync.
  2. Keep everything in the Nextcloud data folder, use WebDAV to mount those folders to the local /home/david folder on the desktop, and keep stuff synced to the laptop using Nextcloud client sync.
  3. Keep everything in the Seafile database, use the Seafile virtual drive to access and manipulate those files locally on the desktop, and use the external folders app in Nextcloud to make those files available online for sharing or whatever using the Nextcloud web GUI.

Or perhaps a 4th:
4. None of these are possible solutions. There is no way to do what I’m trying to do.

Ah yeah, sorry, I missed that. Note that within the desktop client you can choose which folders to sync and which to skip. Thus you can keep the 3 TB drive available via Nextcloud without it filling up your laptop drive.


Can’t say anything about Seafile, but 1. and 2. will work, and I think you have to try out which method suits you best. As you nicely summed up, both have their limits:

  • Mounting the drive locally via WebDAV should lead to decreased read/write speed when you work on these files locally. But Nextcloud will always be in perfect sync, and tags/comments/previews/shares are not at risk.
  • Mounting the local drive via the external storage engine does not influence local read/write performance, but one way or another your Nextcloud is not perfectly in sync, and I am not sure how well tags/comments/previews/shares work with this.

I think I would go with method 2. first, and especially try it without adding another cron job. Try it for a while and watch how well (or how badly) Nextcloud keeps things in sync by itself. I’m not sure how this is technically done, but it seems to be a method with far less overhead than files:scan, with the downside of a not 100% reliable sync state.
If you want to add the additional cron job, make sure that you use occ files:scan --path /<username>/files/path/to/external so you at least do not scan all the internal folders again, but only the external one: Using the occ command — Nextcloud 13 Administration Manual
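A rough sketch of what that could look like as a crontab entry for the web server user (sudo crontab -u www-data -e), with a placeholder username and path and assuming Nextcloud lives in /var/www/nextcloud:

# rescan only the external mount every 15 minutes; adjust the schedule and path to your setup
*/15 * * * * php /var/www/nextcloud/occ files:scan --path="/david/files/work"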

If your priority is more towards the Nextcloud sync state and the tags/comments/previews/shares features, and/or you do not notice much performance loss when accessing a WebDAV mount, go with 1.


You said to try method 2, but from the context of the rest of your statement it seems you meant method 1.

But yes, sounds good. Once I finish getting these things backed up on an external drive, I will try method 1 again with a clean slate and see how stuff goes from there.

Ah yes, I meant your method 1. of course. Too late at night, my brain does not work well any more :laughing:.


I’ve edited my own post some, to remove some ambiguity I now realize I left. English is hard.

Yes. This works in my experience, so long as you’re using the simple features. Comments and share links may or may not work after you move or rename things.

Cron jobs run on a set schedule (e.g., every 15 minutes), and not based on things you do. That said, this INotify app, if I understand it correctly, will run the same occ command that your cron job would, and only when it detects changes. Handy if that’s what it does.

This is where I was ambiguous. Without the cron job, Nextcloud has never updated external local storages for me, regardless of whether I set the “scan on direct access” check-box in the web GUI. That check-box does not run the same occ command as the cron job does, and the cron job has worked just fine.

I’m saying the documentation confirms my own experience, which is that external storages often need the cron job. The cron job works correctly; it’s that GUI checkbox that doesn’t work. So the solution is to run the cron job on whatever schedule suits you. I personally have a cron job that runs once a week to scan only my Archive folder. I had jobs that ran more often and/or scanned more locations, but I changed those other folders so they aren’t external storage anymore.

“What you’re suggesting” in that case was directly accessing your folders on your Linux workstation, having it mounted into Nextcloud as external local storage, and then configuring a cron job. I’ve done that without losing data or failing a sync. I was saying that your “solution 1” was viable.

I didn’t mention anything about WebDAV vs SFTP for the remote mount. I realized that’s not what you’re asking about now, and I see you made a good choice when it comes to that. I was talking about what you’ve dubbed solution 2; using WebDAV to work with the files even on your Linux workstation. That’s still what I’d suggest as the most stable option, but the choice is yours to make. As I’ve been saying since my first post, your solution 1 is also viable, just not a setup I chose to keep once I had it.