Tutorial: How to migrate mass data to a new NextCloud server

@Tom_Forge didn’t want to discourage you… At work I sifted through 90 TB of data dating back to 1993, and had to face some not-so-happy user faces while doing so. You may wish to create the file structure first, then invite users to see if they like it, and only then apply it. I have the full version of Scooter and it helps a lot with transferring data, but try to use a high-end PC beefed up with RAM, so you can keep working your way through the data transfer without waiting for each cut-and-paste to complete.

I am happy that you solved the ownership and rwx permissions, as this is very much dependent on how you installed NextCloud.

Currently I am working on another NextCloud project at home, whereby I will be trying to make better use of my router running LEDE (a port of OpenWRT) to act as a NextCloud server. If I succeed I will have accomplished my goal of exploiting and making better use of my embedded system (Linksys WRT1900ACS v2), removing a power-hungry rack server and a NAS from home, and running everything I need on just 0.03 kW :wink:

1 Like

@lebernd, correction noted and done, thanks for pointing it out :wink:

Hello fab, how are you?

I have some questions:

  • About this: "… data that you wish to migrate to NextCloud. Ideally the data within is already sorted and organised in a folder structure that you are happy with." Does the organisation of the data have to be like data/[username]/? I can’t understand how the scan associates a file with a user. Say I have 4 users with 10 files each: what would be the best organisation for the sync, and how does it impact NextCloud?

  • So, as a prerequisite, do the users have to exist in the database?

Regards,

Alexis.

@alexis.rosano Hi, the scope of this tutorial is to migrate existing mass data into a new NextCloud install. The data you have to migrate should be readily available on an external drive, organised in a folder structure friendly to the user who is going to use it. Scan does not associate the data with a user but populates the database used by NextCloud (MariaDB or MySQL) with the file names of the data - a kind of index. You as an admin would associate a user with a folder from the NextCloud web interface.
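As a rough sketch of the layout the scan expects — per a later reply in this thread, files must sit under each user’s files/ subfolder. The paths and the user "alice" below are made up; a temp directory stands in for the real data directory:

```shell
# Stand-in for /var/www/nextcloud/data; "alice" is a hypothetical user.
DATADIR=$(mktemp -d)

# Files must live under <datadir>/<username>/files/ to show up after a scan.
mkdir -p "$DATADIR/alice/files/Documents"
echo "hello" > "$DATADIR/alice/files/Documents/notes.txt"

# Show what the scan would index for this user.
find "$DATADIR" -mindepth 1
```

On a real install you would then run the files:scan command from the tutorial so the database picks up the new entries.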

Thanks for your answer.

Regards!

Great post! I have needed to do this with over 16 GB of data a few times, and I never knew about the console.php script to force a scan and update the database, so I always spent a few days using the sync app for each user. In the ownCloud days (and I think in the early NextCloud versions) you used to be able to just copy the data into the webroot data folder and it would just work, but this option was removed a few years ago for reasons I never understood.

Just an FYI: rather than running ls to check the progress of rsync, you can use rsync -avP to view rsync’s progress in real time. You can also speed up the transfer by adding compression during transfer using the -z option.

rsync is a very powerful tool with tons of useful options great for tasks like this.

-a
archive mode
-v
increase verbosity
-P
show progress during transfer (must use uppercase P)

more on rsync here
https://linux.die.net/man/1/rsync

thanks again
tim

@fab

Fab, I read through your tutorial and I was relieved to see it was possible to migrate mass data onto my NextCloud quickly. Thank you.

I followed your tutorial and I can see that the (test) file I attempted to move was in fact relocated to the destination.
But when I ran
sudo -u www-data php console.php files:scan --all
in " /var/www/nextcloud ", an error was returned which says:

Your data directory is invalid

Ensure there is a file called ".ocdata" in the root of the data directory.

Your data directory is readable by other users

Please change the permissions to 0770 so that the directory cannot be listed by other users.

An unhandled exception has been thrown:

Exception: Environment not properly prepared. in /var/www/nextcloud/lib/private/Console/Application.php:148

Stack trace:

#0 /var/www/nextcloud/console.php(89): OC\Console\Application->loadCommands(Object(Symfony\Component\Console\Input\ArgvInput), Object(Symfony\Component\Console\Output\ConsoleOutput))

Also, I can’t even access any of my user accounts. I’m getting an error that says:
“Your data directory is invalid
Ensure there is a file called ‘.ocdata’ in the root of the data directory.”
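The two warnings spell out their own fix. A minimal sketch on a stand-in directory — in a real install this would be your actual data directory, run as root, and you’d likely also need chown -R www-data:www-data as mentioned elsewhere in this thread:

```shell
# Stand-in for the real data directory, e.g. /var/www/nextcloud/data.
DATA=$(mktemp -d)

# ".ocdata" marks the folder as a valid Nextcloud data directory.
touch "$DATA/.ocdata"

# 0770 stops other users from listing the directory, as the warning asks.
chmod 0770 "$DATA"

stat -c '%a' "$DATA"
```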

In my case, I didn’t realize that the files needed to be inside of the users “./files” folder /data/username/files and I was placing them in the /data/username folder. The scan indexed the files, but they didn’t show up in the web interface. Moving them to the correct sub-directory and rescanning fixed it.

2 Likes

Hi. How can I migrate when NextCloud uses S3 as external storage? Because… the directory structure doesn’t exist.

For example… I have a directory structure with files (/tmp/user1 for example: 1 GB of data distributed into 100 files) and I need to upload the directory structure and its files into a specific user account. But… I don’t know the password of that user. I only know the admin user and password.

How can I do this?

My regards.


Would be nice to do something like this from one NextCloud iocage to another NextCloud iocage. This is because my instance was done from a third-party install and some features don’t work.

My db and files folders are located outside of the iocage/jail. Theoretically, if it’s safe to map the files folder without the data being lost, that should be as simple as editing the files location, but I’ve yet to get any replies on the subject… aside from “don’t knows”. There are no comments or contacts, etc. There are some notes that would be easily copied over.

Thanks in advance!


Hi,

I am new to the community and have a question regarding this post.
I have a one-node cluster running Google’s Ganeti with multiple VMs; the nextcloud and backup VMs are the only VMs with external data storage. The Ganeti master (host) and all VMs run Ubuntu 16.04.

  • For nextcloud I have a RAID 5 array attached over NFS from the host (Ganeti) to the guest (nextcloud)
  • I am using Samba 4 AD to manage user accounts
  • the users’ folders in the data directory are named after the users’ UUIDs

Now I want to move from Ubuntu to Debian for the Ganeti master (host) as well as for the VMs (guests). That means I have to delete everything, keep the existing RAID, and reinstall host and guests.
So when I set up a new Samba 4 AD and create the users again, they will have different UUIDs.
Is there any way to migrate the data of the existing accounts to the new accounts with the new UUIDs?

Thanks in advance

After moving to Debian I decided to move from mysql-5.7 to mariadb-server 10.1.37-0+deb9u1, from the default stretch repo. So I did the following:

  1. installed all packages from the default Debian repo that are needed for nextcloud
  2. created my nextcloud vhost with the new server name
  3. copied the /var/www/nextcloud/ directory to the new server
  4. made an sqldump from the old server and copied it to the new server

The first problem is that in mariadb-server 10.1.37 innodb_large_prefix is not enabled, which caused me a headache with some tables while importing the mysqldump — while it is enabled in mysql-server 5.7. So I did the following:

# mysql -u root -p
MariaDB [(none)]> SET GLOBAL innodb_file_format=Barracuda;
MariaDB [(none)]> SET GLOBAL innodb_file_per_table=ON;
MariaDB [(none)]> SET GLOBAL innodb_large_prefix=1;

then log out of mysql. Next I created the nextcloud database, imported the mysqldump, and upgraded:

mysql -u root -p -e "CREATE DATABASE nextcloud CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci"
mysql -u root -p nextcloud < /tmp/nextcloud.sql
mysql_upgrade -u root -p --force

then I added the certificate for apache ssl vhost and LDAPS and restarted apache2 and mysql.

The last step was to add the new IP of the server to the hosts /etc/exports and to auto mount the share in /etc/fstab in the new nextcloud server.
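For that last step, the edits look roughly like this. The export path and IPs below are made-up placeholders, not from the original post:

```
# /etc/exports on the Ganeti host (placeholder path and client IP),
# followed by re-exporting with: exportfs -ra
/srv/nextcloud-data  192.168.1.50(rw,sync,no_subtree_check)

# /etc/fstab entry on the new nextcloud server (placeholder host IP):
192.168.1.10:/srv/nextcloud-data  /var/www/nextcloud/data  nfs  defaults  0  0
```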

And there you go, everything is there and works perfectly.

NOTE:

if the new nextcloud server has a different host name, you have to do the following:

  • replace the server name in the apache vhost

  • replace the server name in /var/www/nextcloud/config/config.php

  • replace all appearances of the old server name with the new one in the nextcloud.sql dump before importing it into the database:

    cp nextcloud.sql nextcloud-org.sql
    sed -i 's/old\.cloud\.server/new.cloud.server/g' nextcloud.sql
    

Done
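If in doubt, the substitution can be sanity-checked on a throwaway file before touching the real dump (hostnames below are placeholders):

```shell
# Try the rewrite on a scratch file first.
f=$(mktemp)
echo "https://old.cloud.server/index.php/apps/files" > "$f"

# Same pattern as used on the dump: escape dots in the match side.
sed -i 's/old\.cloud\.server/new.cloud.server/g' "$f"

cat "$f"
```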

Note:

If you are using mariadb from Debian buster, which is mariadb-server 1:10.3.13-1, innodb_large_prefix is re-enabled for compatibility. I haven’t tested it yet though.

1 Like

Everything is working well except that the scan doesn’t result in folders or files that are seen by the nextcloud client. The output is the same as above (ie the small table) and the files are in the correct folder (on an external local disk) but just cannot be seen by the nextcloud client.

If I create a file or folder, via the nextcloud client, in the folder they’re in, then it is seen both by the client and by cd’ing into the folder on the server.

I assume that this is a permission issue? If so what permissions should I assign?

I think it would be a case of reassigning the correct permissions:
sudo chown -R www-data:www-data /path/to/nextcloud/data/
In pure layman’s terms, the process in total seems to be: move the files, change their ownership, run the php console.
At least, that’s what worked for me!


Hey, I know this is an old thread, however I’m having an issue. I have migrated successfully, however forcing Nextcloud to search for the files is having a mental breakdown.

The following is what I am getting in my terminal, and I have no idea what to do, as the folder it states is missing is actually there.

Command - root@freenas:/mnt/NAS/iocage/jails/Nextcloud/root/usr/local/www/nextcloud # php console.php files:scan data

Error - App directory “/usr/local/www/nextcloud/apps” not found! Please put the Nextcloud apps folder in the Nextcloud folder or the folder above. You can also configure the location in the config.php file.

I’m running on FreeNAS; the previous parts of the tutorial worked a charm, it’s just this part I’m having issues with.

Hi
Great post, helps tons. I am in the middle of switching drives but I have a slightly different setup due to NextCloudPi, I believe. I will attach a screenshot; not sure what to do with the linked directories.
Thanks in advance.

Appreciate this thread is a little old now, but just chiming in to say it was helpful, though my Docker container didn’t have sudo installed. The trick was to exec into the container as the www-data user:

docker exec -it --user www-data nextcloud-app /bin/bash

then I could run the php console.php files:scan --all command and it scanned the files - no sudo etc needed.