I’m using the docker version of NC, currently at version 21.
I have a webhost elsewhere, where every night I have a cron set to backup my sql db’s. I want to be able to send the backups over to my NC server when it’s done backing up. Is there a way to send them somehow? FTP perhaps, or something else?
Why not just use SSH and something like rsync, or any kind of backup script or backup software, and store them in a separate folder on your Nextcloud server, independently of Nextcloud? I don’t see any advantage in using Nextcloud as a storage backend for database backups, or any kind of backups at all. It’s not as if you could do anything with these backed-up files within Nextcloud, except look at them in the web interface.
@Kacey_Armstrong
You can’t really send them from the host to your Nextcloud directly.
If you send the data from your server to Nextcloud, e.g. with sftp, you must re-scan your Nextcloud data dir with “occ”.
But perhaps you can get them into your Nextcloud with “External Storage”.
I don’t really know whether you can use a Flow process for automation.
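For reference, that occ rescan step can be sketched like this, assuming the official Nextcloud Docker image; the container name, user, and folder are placeholders you would adapt:

```shell
# Hypothetical example: after copying backups into the data directory
# over sftp, tell Nextcloud to index the new files so they appear
# in the web interface. "nextcloud" is an assumed container name,
# "admin" an assumed Nextcloud user, "Backups" an assumed folder.
docker exec -u www-data nextcloud php occ files:scan --path="/admin/files/Backups"
```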
Well, to better explain my line of thinking: I’m using a self-hosted NC server as a direct replacement for DropBox (for everything, not just for these db backups). For years now, the cron db backup script on my separate, paid webhost would, after the db backup itself, send the .tar.gz file over to my DropBox. I do this to save space on my webhost (as you can imagine, nightly backups of 6 db’s can add up fast), and to have the db’s on an entirely different backup host in case of any issues with my paid webhost.
My DropBox script broke due to some changes with the DropBox API itself, and looking at the coding I’d have to do just to get it working again, it feels like a PITA. Plus DropBox has a storage limit as well, and since I already have my own NC cloud with terabytes of space, why not just replace DropBox with Nextcloud?
I don’t plan to “view” the backed up files in NC. If I ever need to restore from them, or cherry pick through them I’ll just open them up on my actual computer to do what I need.
So to sum up, it’s an offsite backup storage. A cloud. To store stuff.
I did take a look at the External Storages addon, but from what I can tell, all it does is allow NC to log in to a remote sftp server and enumerate the files on it, which is awesome if someone needs to do so, but it’s not able to do any automatic “fetching” of specific files from that server, or anything of that sort.
I forgot to add: I suppose as an alternative I could look into adding an entirely different container (like syncthing, or similar), go through the learning curve of how to use it, etc., just to sync these 6 .gz files every night, and add all that extra overhead for a small function. But NC has dozens of addons and capabilities, and being able to send files to it seems like it would be a natural function built right into NC, or at least available via an addon.
My assumption was NC/OC were designed to be full-on self-hosted replacements for DropBox, and DropBox does have the ability to send files over to it from remote servers.
I’m not familiar with using webdav at all. Can you point me in a direction that can directly give me an example on how to do what I’m looking for with NC? I’d appreciate it.
Also, I’m not sure I’d even be able to do so in my case. I’ve got these errors in my overview which I’ve not been able to resolve yet, and I think(?) they might be related to webdav?
btw: if your “webhost” is a real server, you should be able to mount a Nextcloud folder via a WebDAV filesystem and write your backups directly into this folder.
You can use Nextcloud as a backup target via WebDAV. But I don’t see an advantage compared to backups via rsync or another backup tool directly to your server’s file system. Besides that, the performance via WebDAV would most likely be worse.
Of course, if you were using a hosted Nextcloud, or Dropbox as you did in the past, you would have no choice. But since you run your own server, with full root access, there is no longer any need to handle it like that imho.
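For what it’s worth, a backup script can push a file straight to Nextcloud’s WebDAV endpoint with a single curl call, no rescan needed; a minimal sketch, where the host, user, app password, and paths are all placeholders you’d replace with your own:

```shell
# Hypothetical values: replace host, user, app password, and paths.
NC_URL="https://cloud.example.com"
NC_USER="backupuser"
NC_PASS="app-password-here"   # generate one under Settings > Security
BACKUP="/var/backups/db-$(date +%F).tar.gz"

# PUT the file into the user's "Backups" folder via WebDAV.
# Files uploaded this way are indexed by Nextcloud automatically.
curl -u "$NC_USER:$NC_PASS" -T "$BACKUP" \
  "$NC_URL/remote.php/dav/files/$NC_USER/Backups/$(basename "$BACKUP")"
```

Appended to the end of the existing nightly cron script, this would replace the old DropBox upload step.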
It’s nothing you would have to do on NC, but rather on the client. All modern operating systems can mount a drive via WebDAV. Visually it looks similar to mounting a SMB share. You’ll have to look up how to mount it for your OS. I believe the Ubuntu package is called davfs2. Windows can mount it natively.
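On a Linux client, that mount could look roughly like this (a sketch assuming Debian/Ubuntu with davfs2; the hostname, username, and mount point are placeholders):

```shell
# Install the WebDAV filesystem driver (Debian/Ubuntu).
sudo apt install davfs2

# Mount the Nextcloud WebDAV endpoint; you'll be prompted for credentials.
sudo mkdir -p /mnt/nextcloud
sudo mount -t davfs \
  https://cloud.example.com/remote.php/dav/files/backupuser/ /mnt/nextcloud

# Backups can then be written straight into the mounted folder.
cp /var/backups/db-backup.tar.gz /mnt/nextcloud/Backups/
```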
This is a documented error, and you have to change something in your web server and/or proxy (depending on your specific setup) to fix it.
I know a lot of people have issues resolving this, but this is absolutely the correct resolution. It’s just a matter of implementing it properly in your config.
Now I’m only left trying to solve the problem with the last 2 (webfinger & nodeinfo) that give 404. It looks like I’m not the only one, and it might not be resolved at all yet, according to this.
I guess the main reason I want it to go into NC instead of just anywhere on my machine is so that I can easily view the list of files from within the NC file browser, along with all the other stuff I store in my NC cloud, rather than having to “separate” them out and view the files via Explorer instead.
That, and not having to set up some other software just to sync 6 small zip files.