Copy files from one NC to another


Is there a good way to get some specific files automatically copied from one Nextcloud installation to another?
For example, from the NC install on my main server to a Nextcloud Box, for backup reasons?

I would go for a federated share – it’s basically like sharing with local users, but instead of a local user name you enter the federated cloud ID of the remote user (found on your Personal page, section Federated Cloud).

The remote user has to accept the share, before the data gets “automagically” synced :slight_smile:

Federated Cloud Sharing needs to be enabled on both servers; see the documentation for more details.
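For reference, a sketch of enabling it from the command line. The installation path and config keys below are assumptions from memory and may differ between NC versions, so double-check against your admin sharing settings:

```shell
# Run on BOTH servers; /var/www/nextcloud and www-data are assumptions.
cd /var/www/nextcloud

# Enable the federation app
sudo -u www-data php occ app:enable federation

# Allow sending/receiving federated shares (key names may vary by version)
sudo -u www-data php occ config:app:set files_sharing outgoing_server2server_share_enabled --value 'yes'
sudo -u www-data php occ config:app:set files_sharing incoming_server2server_share_enabled --value 'yes'
```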


EDIT: Just read “for backup reasons”. The federated share mentioned above might not fulfill your expectations concerning backup, depending on your understanding of “backup”.

I don’t have an issue with accessing it, but if the NC server that hosts the files is gone, the files are gone too; that’s why I want to actually copy them.

That means we are talking about backup/recovery procedures for a stand-alone installation?

I want to back up some specific files, e.g. the SQL database, so I create a “backup” directory into which I put various data that I want to have redundant.
After I have placed them there, some module, app, or other tool should take the files from that Nextcloud backup directory and copy them to the backup folder of my local Nextcloud Box, from where I can store them via LAN on a USB hard drive that I only attach from time to time.

Sounds like a job for a federated share at first glance, but since I don’t know how changes are propagated on a federated share (are they pushed or pulled?), it might not fit your needs.

I think the most straightforward approach would be to mount both folders via WebDAV on a Linux box and rsync between them. Not very elegant, but easy to set up.

So you mean a federated share is actually copying files? I thought the files are only shared, so they would be gone if you shut down the first Nextcloud Box.

No, I misunderstood your requirements first, hence I recommended using a federated share.

That’s why I proposed to use an rsync job in my previous post, to get the data reliably transferred to your backup box.

Ah okay, thanks. I think I will do the rsync with two davfs directories then.

I am just trying that; copying from one davfs share to another is slow as hell. You get about 12–24 kB/s in iftop.

I also thought first about some sharing/external-storage solution, but all of them have in common that the data is not actually copied to the destination, just made accessible. Using a desktop client, you could add the two instances, main and backup NC, adding the desired folders/files to both of them; at least I guess this should work. Same result using other WebDAV solutions. But as you said, this will be slow as hell…

For backup I would recommend a solution outside of Nextcloud: use a cron job to mount an external drive, rsync the files/folders from /nextcloud/data//files/ “manually”, and unmount the drive again. Depending on your server/backup locations, network drives will do the job.

It would actually be nice to be able to store shared folders and the content of external storages locally as well, or at least to mark single files/folders for this.

The Nextcloud client detects if you are using the same folder even if you use different NC servers. As I already posted here in the forums, the devs strongly advise against using the NC client as a backup tool between two servers.

Ah okay, yeah, as stated it would indeed be an ugly solution.

I get the feeling a lot of people don’t like cloud backup services (not to be confused with cloud storage services like Nextcloud). But I do a combination of local backup and cloud backup:

  • My Nextcloud VM is Ubuntu with an NFS share for all data, and Nextcloud’s data directory is within the NFS share
  • A cron job on the Nextcloud server backs up the nextcloud directory to a local disk under the NFS share, outside of the cloud data directory (daily)
  • A cron job on the Nextcloud server backs up the data directory to a local disk under the NFS share, outside of the cloud data directory (daily)
  • A cron job on the Nextcloud server backs up the Nextcloud database to a local disk under the NFS share, outside of the cloud data directory (daily)
  • Crashplan runs on a VM which backs up everything on the NFS share (I would substitute this for a local backup server if applicable)
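The three daily cron jobs could look roughly like this; the paths, database name, and credentials handling are assumptions, not the poster's actual setup:

```shell
#!/bin/sh
# Create a dated tar.gz of a directory in a destination folder.
archive_dir() {
    src="$1"; dest="$2"; name="$3"
    tar czf "$dest/$name-$(date +%F).tar.gz" -C "$src" .
}

# Daily cron usage (paths and DB name are assumptions; mysqldump would
# read its credentials from ~/.my.cnf):
# archive_dir /var/www/nextcloud /mnt/nfs/backups nextcloud
# archive_dir /var/ncdata        /mnt/nfs/backups ncdata
# mysqldump --single-transaction nextcloud > /mnt/nfs/backups/ncdb-$(date +%F).sql
```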

Technically I use Nextcloud as a storage AND backup solution, but only because the data it stores is backed up in real time by Crashplan.

Ok, so you more or less skip NC and do that only via a file-sharing protocol. I could do that too, but if I store the files via NFS or SMB etc. directly to the NC box at home, I have to run a cron job that does occ files:scan every now and then.
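For completeness, such a cron entry could look like the sketch below; the Nextcloud path, web server user, and scan path are assumptions:

```shell
# /etc/crontab entry: rescan the backup folder every night at 2:30
# (nextcloud path, user, and scan path are assumptions)
30 2 * * * www-data php /var/www/nextcloud/occ files:scan --path="USER/files/backup"
```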

Right, I don’t see any point in using a second instance of NC if you’re only using it for “backup”. Federation might be a good idea, but I wouldn’t personally use that unless there was already another NC set up and primarily used for cloud storage/sync.

If you’re just setting up a remote server primarily for backup, maybe look into a dedicated backup tool like Duplicati or something else, otherwise you’re limited to a single full backup, making it difficult to do previous versions etc.

As far as having to do occ files:scan, I don’t do this. The NC data directory is only writable by NC, so I do not make changes outside of NC.

Another (very similar) way I could have done this is by having a read only NFS mount which the backup server sees and backs up, then underneath that have a read/write NFS share for local users, and beside that my cloud data directory.

In my case it’s the other way round. As I got a Nextcloud Box (the black thing you can see on the homepage) as a low-power NAS at home, I thought I could use it as a backup endpoint too.

Of course you could do that, but as linucksrox said: why do you need the Nextcloud instance to manage your backup data? As you have the box with a drive inside, use it to store the backup, but outside the NC data folder :stuck_out_tongue_winking_eye:. You will find another task for the Nextcloud instance on the box where all the database/webservice features make sense ;).

We’re interested in creating an NC-level backup/replication solution, which doesn’t aim to create an offline copy of an installation, but rather 2-way replicates files as wanted, along with relevant state, in as near real time as is achievable/desirable in a given situation (probably bandwidth dependent). The goals:

  • users get the best possible bandwidth access to files (they access the local office instance or the public-cloud-hosted instance, whichever would be fastest for them);
  • BC backup is done (just as versioning takes care of user-error backup);
  • ongoing file access if any given instance isn’t available;
  • managed replication on a rules basis (don’t replicate/update large files until after working hours; don’t replicate sensitive files to the public cloud, or only to the central copy, not other branches);
  • controlled state backup (the app knows when a file has changed and when it has been replicated, and passes that info to the user as needed).


Mostly it is convenience, so it’s easily accessible too (and I am a bit lazy :stuck_out_tongue:). I found a program called GoodSync, which copies the stuff around now.
The main backup script makes a tgz archive and stores it on the first instance via davfs.
As I only have a few big files for that, the performance of WebDAV is sufficient in my case.
I get that this is not the best solution, but it does the job in an acceptable way.