Can I send files TO my Nextcloud server via (s)ftp?

I’m using the docker version of NC, currently at version 21.

I have a webhost elsewhere, where every night I have a cron set to backup my sql db’s. I want to be able to send the backups over to my NC server when it’s done backing up. Is there a way to send them somehow? FTP perhaps, or something else?

Hello, you could try running a Nextcloud sync client on the server that does the backup.

The safest method is to upload via WebDAV. Accessing the data folder directly via SFTP or similar protocol is not recommended.
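For example, a nightly cron script on the web host can push each dump straight to Nextcloud with curl against the standard WebDAV endpoint (`remote.php/dav/files/<user>/`). A minimal sketch; the hostname, user, app password and folder name below are placeholders:

```shell
#!/bin/sh
# Hypothetical values -- replace with your own server, user and an
# app password (created under Settings > Security in Nextcloud).
NC_URL="https://cloud.example.com"
NC_USER="backupuser"
NC_PASS="app-password"
REMOTE_DIR="Backups"   # must already exist in Nextcloud

# Build the WebDAV target URL for a given local file.
remote_url() {
    printf '%s/remote.php/dav/files/%s/%s/%s' \
        "$NC_URL" "$NC_USER" "$REMOTE_DIR" "$(basename "$1")"
}

# PUT one file; -f makes curl fail loudly on HTTP errors.
upload() {
    curl -fsS -u "$NC_USER:$NC_PASS" -T "$1" "$(remote_url "$1")"
}

# upload /backups/db-dump.tar.gz   # uncomment once the values above are real
```

Called from the existing backup cron after the dump finishes, this replaces the Dropbox upload step one-for-one.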

@Kacey_Armstrong

Why not just use SSH and something like rsync, or any kind of backup script or backup software, and store the backups in a separate folder on your Nextcloud server, independently of Nextcloud? I don’t see any advantage in using Nextcloud as a storage backend for database backups, or for any kind of backups at all. It’s not as if you could do anything with these backed-up files within Nextcloud, except look at them in the web interface.
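That could be as small as a single crontab entry on the web host. Everything here (schedule, host, user, paths) is made up for illustration and assumes SSH key authentication is already set up:

```
# crontab fragment on the web host (hypothetical schedule/host/paths):
# push the night's dumps over SSH at 04:30
30 4 * * * rsync -az /home/me/backups/*.tar.gz backup@nc-box.example.com:/srv/db-backups/
```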

@Kacey_Armstrong
You can’t really send them from the host to your Nextcloud that way.
If you copy the data into the Nextcloud data directory directly, e.g. with sftp, you must then “re-scan” your Nextcloud data dir with “occ”.
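With the docker image, that rescan is a single occ call from the host. The container name “nextcloud” and the user/path below are assumptions for illustration:

```
# Make Nextcloud index files that were copied into the data
# directory from outside Nextcloud (e.g. via sftp):
docker exec -u www-data nextcloud php occ files:scan --path="backupuser/files/Backups"
```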

But perhaps you can get them into your Nextcloud with “External Storage”.
I don’t know whether you can use a Flow process for automation.

https://docs.nextcloud.com/server/latest/admin_manual/configuration_files/external_storage/sftp.html

Nextcloud is not really a backup host. Nextcloud is a file sharing platform.
You can use sftp to send the files to another part of your Linux server instead.

Well, to better explain my line of thinking: I’m using a self-hosted NC server as a direct replacement for DropBox (for everything, not just for these db backups). For years now, the cron db backup script on my separate, paid webhost would, after the db backup itself, send the .tar.gz file over to my DropBox. I do this to save space on my webhost (as you can imagine, nightly backups of 6 db’s can add up fast), and to have the db’s on an entirely different backup host in case of any issues with my paid webhost.

My DropBox script broke due to some changes with the DropBox API itself, and looking at the coding I’d have to do just to get it working again, it feels like a PITA. Plus DropBox has a storage limit as well, and since I already have my own NC cloud with terabytes of space, why not just replace DropBox with Nextcloud?

I don’t plan to “view” the backed up files in NC. If I ever need to restore from them, or cherry pick through them I’ll just open them up on my actual computer to do what I need.

So to sum up, it’s an offsite backup storage. A cloud. To store stuff.

I did take a look at the External Storages addon, but from what I can tell, all it does is allow NC to login to a remote sftp server and enumerate the files on it, which is awesome if someone needs to do so, but it’s not able to do any automatic “fetching” of specific files from that server, or anything of that sort.

restic.readthedocs.io + rclone.org
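rclone speaks WebDAV natively, so one remote plus a nightly `rclone copy` would cover this. A sketch of the config; the remote name, URL and user are made up, and the password must be stored obscured (entered via `rclone config` or generated with `rclone obscure`):

```ini
# ~/.config/rclone/rclone.conf -- hypothetical remote named "nc"
[nc]
type = webdav
url = https://cloud.example.com/remote.php/dav/files/backupuser
vendor = nextcloud
user = backupuser
pass = OBSCURED_APP_PASSWORD
```

Then something like `rclone copy /home/me/backups nc:Backups` in cron uploads anything new.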

I forgot to add: I suppose as an alternative I could look into adding an entirely different container (like Syncthing, or similar), go through the learning curve of how to use it, etc., just to sync these 6 .gz files every night, and take on all that extra overhead for one small function. But NC has dozens of addons and capabilities, and being able to send files to it seems like a natural function that would be built right into NC, or at least available via an addon.

My assumption was NC/OC were designed to be full-on self-hosted replacements for DropBox, and DropBox does have the ability to send files over to it from remote servers.

Hi Kacey,

Your needs sound very interesting. Try watching this video; maybe it will give you some ideas:

What I thought you could do is set up a cron job to send the files to your NC host, to the path where the NC data directory is located.

I hope this helps somehow.

Is it a “cloud server” aka “vserver”,
or
“web hosting space” with preinstalled WordPress/Joomla/etc.?

Are you able to install additional software on this server?

I’m not familiar with using webdav at all. Can you point me in a direction that can directly give me an example on how to do what I’m looking for with NC? I’d appreciate it.

Also, I’m not sure I’d even be able to do so in my case. I’ve got these errors in my overview which I’ve not been able to resolve yet, and I think(?) they might be related to webdav?

I’ve tried in the past to resolve them, and come across many others fighting with it as well, and came up short so far.

@Kacey_Armstrong from the rclone docs:

btw: if your “webhost” is a real server, you should be able to mount a Nextcloud folder via a WebDAV filesystem and write your backups directly into this folder.

If you are using NGINX with your NC, then add these lines to your NGINX config to resolve the caldav and carddav warnings:

location = /.well-known/carddav {
    return 301 https://YOUR_NC_DOMAIN/remote.php/dav;
}

location = /.well-known/caldav {
    return 301 https://YOUR_NC_DOMAIN/remote.php/dav;
}

Be sure to replace YOUR_NC_DOMAIN with your actual NC domain.


You can use Nextcloud as a backup target via WebDAV. But I don’t see an advantage compared to backups via rsync or another backup tool directly to your server’s file system. Besides that, the performance via WebDAV would most likely be worse.

Of course, if you were using a hosted Nextcloud, or Dropbox as you did in the past, you would have no choice. But since you run your own server, with full root access, there is no longer any need to handle it like that, imho.

It’s nothing you would have to do on NC, but rather on the client. All modern operating systems can mount a drive via WebDAV. Visually it looks similar to mounting an SMB share. You’ll have to look up how to mount it for your OS. I believe the Ubuntu package is called davfs2. Windows can mount it natively.
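On Ubuntu with davfs2, for instance, the mount can be declared in /etc/fstab; the host, user and mount point below are placeholders, and the app password goes into /etc/davfs2/secrets (mode 600):

```
# /etc/fstab -- hypothetical entry (requires the davfs2 package)
https://cloud.example.com/remote.php/dav/files/backupuser /mnt/nextcloud davfs noauto,user 0 0

# /etc/davfs2/secrets -- mount URL, login, app password
https://cloud.example.com/remote.php/dav/files/backupuser backupuser app-password
```

After that, `mount /mnt/nextcloud` makes the share available, and a plain cp of the .tar.gz files goes through WebDAV.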

This is a documented error, and you have to change something in your web server and/or proxy (depending on your specific setup) to fix it.

I know a lot of people have issues resolving this, but this is absolutely the correct resolution. It’s just a matter of implementing it properly in your config.

Your Web server is not set up properly to resolve /.well-known/caldav/ or /.well-known/carddav/

tyvm! That pointed me in the right direction to (mostly) solve it.

I originally had that, like this:

location = /.well-known/carddav   { return 301 /remote.php/dav; }
location = /.well-known/caldav    { return 301 /remote.php/dav; }
location = /.well-known/webfinger { return 301 /public.php?service=webfinger; }
location = /.well-known/nodeinfo  { return 301 /public.php?service=nodeinfo; }

But it wasn’t working. For ishs & grins, I did as you suggested and added the hardcoded full path instead, like:

location = /.well-known/carddav   { return 301 https://my.domain.com:3333/remote.php/dav; }
location = /.well-known/caldav    { return 301 https://my.domain.com:3333/remote.php/dav; }
location = /.well-known/webfinger { return 301 https://my.domain.com:3333/public.php?service=webfinger; }
location = /.well-known/nodeinfo  { return 301 https://my.domain.com:3333/public.php?service=nodeinfo; }

And the top 2 errors vanished! So, preferring to keep variables instead of hardcoding paths, I switched to this, and it still works:

location = /.well-known/carddav   { return 301 $scheme://$http_host/remote.php/dav; }
location = /.well-known/caldav    { return 301 $scheme://$http_host/remote.php/dav; }
location = /.well-known/webfinger { return 301 $scheme://$http_host/public.php?service=webfinger; }
location = /.well-known/nodeinfo  { return 301 $scheme://$http_host/public.php?service=nodeinfo; }

Now I’m only left trying to solve the problem with the last 2 (webfinger & nodeinfo) that give 404. It looks like I’m not the only one, and it might not be resolved at all yet, according to this.

I guess the main reason I want it to go into NC instead of just anywhere on my machine is so that I can easily view the list of files from within the NC file browser, along with all the other stuff I store in my NC cloud, rather than have to “separate” them out, and view the files via Explorer instead.

That, and having to set up some other software just to sync 6 small zip files.

It’s a standard shared/managed webhost (A2). It doesn’t come with WordPress pre-installed, but other things are, like cpanel, etc.

I don’t think I can install things that “low level” on it, but I can install higher-level things, like CMSes.

You want to upload your files from your web host to your Nextcloud.
What about changing the direction and writing a small script to download the files instead?

Is your web host accessible via SFTP? Most hosters offer this.

Then you can download the backup file “into the docker volume” where your Nextcloud data folder lives.
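Sketched as two commands on the Nextcloud host; the host name, paths, container name and user are all assumptions for illustration:

```
# 1. pull last night's dump from the shared host over SFTP/SCP
scp me@webhost.example.com:backups/db-latest.tar.gz \
    /srv/nextcloud/data/backupuser/files/Backups/

# 2. make Nextcloud index the new file
docker exec -u www-data nextcloud php occ files:scan --path="backupuser/files/Backups"
```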