MASSIVE amount of data to upload

Hey Guys,

I have a MASSIVE amount of data (20+ TB) to upload to Nextcloud. The first few TB are currently on a NAS. I've installed the Windows client on my laptop... Our network is full Gigabit, but man is this slow. Any tips/tricks for getting this data onto Nextcloud faster? For the last 30 minutes or so it's been bouncing around "looking for changes in xyz".

I know I'm not supposed to just do a bulk upload directly into the data directory... but is there really not a faster way?

Maybe use davfs to mount the Nextcloud folder you want to copy to? That would let you copy directly from the command line of the NAS device where the data currently resides to the Nextcloud server (assuming I understood correctly where the data is and where it should go). I did that, although "only" with 100 GB of data.
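A minimal sketch of that approach, run on the NAS (assuming a Debian-based system; the server URL, username, and folder names below are placeholders for your own):

```shell
# Install the davfs2 WebDAV filesystem driver (package name varies by distro)
sudo apt install davfs2

# Mount the Nextcloud WebDAV endpoint for your user
sudo mkdir -p /mnt/nextcloud
sudo mount -t davfs https://cloud.example.com/remote.php/dav/files/USERNAME/ /mnt/nextcloud

# Copy from the NAS share; rsync can resume if the transfer is interrupted
rsync -av --progress /mnt/nas-share/ /mnt/nextcloud/Target-Folder/
```

Because the copy goes through WebDAV, the files land in Nextcloud normally and no database rescan is needed afterwards.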

I find this a lot faster than using the Nextcloud client. If you have lots of small files it will be slow anyway.

Other than that, the only faster way is to upload via ssh/sftp or even Samba. Afterwards you need to run an occ files:scan --all -vvv to add the files to the Nextcloud database. I do this frequently with no adverse effects.
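A rough sketch of that workflow (the paths and usernames are examples; adjust them to your installation's data directory and web server user):

```shell
# Copy straight into the user's files directory on the Nextcloud server
rsync -av -e ssh /mnt/nas-share/ admin@server:/var/www/nextcloud/data/USERNAME/files/Target-Folder/

# On the server: make sure the web server user owns the new files
sudo chown -R www-data:www-data /var/www/nextcloud/data/USERNAME/files/Target-Folder

# Register the new files in Nextcloud's database; scanning only the affected
# path is faster than --all on a large instance
sudo -u www-data php /var/www/nextcloud/occ files:scan --path="USERNAME/files/Target-Folder" -vvv
```

Until the scan finishes, the copied files exist on disk but won't show up in the web UI or sync clients.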

My initial load using the client seems to have stalled at 120 GB of 330. Any advice on how to get it going again? I'm not a Linux admin...

Rclone with webdav remote.

Bulk upload with happiness.
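A sketch of the rclone route, run from wherever the data is reachable (the remote name "nc", the server URL, and the folder names are placeholders; use an app password rather than your login password):

```shell
# One-time: create a WebDAV remote pointing at your Nextcloud user's endpoint
rclone config create nc webdav \
    url https://cloud.example.com/remote.php/dav/files/USERNAME/ \
    vendor nextcloud user USERNAME pass 'app-password'

# Bulk copy with several parallel transfers; safe to re-run, it skips
# files that are already uploaded
rclone copy /mnt/nas-share nc:Target-Folder --transfers 8 --progress
```

The parallel transfers are what make this faster than the desktop client for large trees, and since it goes through WebDAV no occ files:scan is needed afterwards.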

Well, I've found myself in a pinch, and I personally can't seem to make Nextcloud do what you make appear so easy. Is anyone available to help?