My Nextcloud repository, hosted on the Internet, is 21 GB. When I set up a new computer, I need to synchronize it from scratch, which takes around 20 hours. Is this normal? How can I avoid such a lengthy process? Where am I going wrong? Thank you for any hint. Have a nice day!
@lbloch Depending on your internet speed, this could be a reasonable result. Syncing 21 GB in 20 hours gives an average speed of 2.3 Mbit/s (0.3 MB/s). Depending on your internet connection and on the number and size of the files, this could be the expected sync speed (from the internet). Maybe your hosting provider applies some throttling?
Small files in particular are less efficient to sync. I recently had a similar issue: syncing within the local LAN was as slow as 10-15 KBit/s because of a configured external storage. A “clean” Nextcloud running on a 4-core Celeron with 8 GB RAM can sync 50 GB of data within 1-2 hours in the LAN at an average speed of 30-50 MB/s (again, given the network is not the limit, the speed greatly depends on the kind of files). A fully saturated 1 Gbit/s network link can ship roughly 110 MB/s net, so 30-50 MB/s from Nextcloud sounds pretty good to me.
bandwidth calculator: https://calculators.io/bandwidth/
How many computers? Perhaps you would be better off hosting the 21 GB at home.
Nextcloud is, for me, a platform to get access to my data from anywhere, on any computer, and to have anybody else worry about power outages and network failures. So remote access is… useful. I’m looking for a way to make a backup of the repository, hopefully as a single archive file, and to transfer it faster than in 20 hours. Does such a thing exist?
Have a nice day!
Somebody, not anybody, sorry. My cloud operator, as it happens.
Archiving/backup is a completely different situation. If you want to perform a backup, you don’t just sync the files to another host: Nextcloud consists of additional components, mainly the DB, which provides functionality on top of the file storage, e.g. sharing. Usually you access the files through the underlying server system and copy them locally or remotely; additionally, you back up the DB and apps… the whole procedure in detail:
There are more guides and scripts out there… Copying the files from the server should run at the full speed this hardware/network is capable of, and you can expect it to be faster than syncing, as no additional checks and DB operations are necessary.
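The steps described above can be sketched as a small script. This is only a minimal example, assuming a typical Apache + MySQL setup; all paths, the database name and credentials are assumptions, not from the thread:

```shell
# Minimal backup sketch for a self-hosted Nextcloud (paths are assumptions).
cd /var/www/nextcloud

# 1. Put Nextcloud into maintenance mode so files and DB stay consistent
sudo -u www-data php occ maintenance:mode --on

# 2. Copy the installation, config and data directories at full disk/network speed
rsync -a /var/www/nextcloud/ /backup/nextcloud-dir/

# 3. Dump the database that holds users, shares and file metadata
mysqldump --single-transaction -u nextcloud -p nextcloud_db > /backup/nextcloud-db.sql

# 4. Leave maintenance mode again
sudo -u www-data php occ maintenance:mode --off
```

The maintenance-mode step matters: without it, files can change while the DB dump runs and the backup ends up inconsistent.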
What kind of repo is this? Is the whole 20 GB always updated?
You can use e.g. Restic to back up the data in increments, so that you probably do not need to sync the whole 20 GB every time.
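An incremental Restic backup of the data directory could look like the sketch below. The repository location, password file and data path are all assumptions for illustration; only data that changed since the last snapshot is transferred:

```shell
# Hypothetical Restic setup: repo path, password file and data dir are assumptions.
export RESTIC_REPOSITORY=/mnt/backup/nextcloud-repo
export RESTIC_PASSWORD_FILE=/root/.restic-password

# Create the repository on the first run only
restic init 2>/dev/null || true

# Back up the Nextcloud data directory; subsequent runs are incremental
restic backup /var/www/nextcloud/data

# Keep a bounded history and reclaim space
restic forget --keep-daily 7 --keep-weekly 4 --prune
```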
Thank you for your answer, I will look at the manual (RTFM, of course!).
Well, I understand the server operations described by the manual. But if all I need to do is install the data on a new client computer, may I just copy the data files from the server, via rsync or scp for instance, or do I need to perform some other operations?
I think if you want to use the data in your Nextcloud desktop client, you must also use the client’s own functions to sync the data; you cannot use the much faster rsync, sftp, …
Normally, someone hosting 20 GB on the internet uses e.g. a managed Nextcloud with high availability and a backup (mostly at extra cost) directly in the cloud from the hoster. Then you do not need a full local copy, because of the fast internet connection, and you can sync only parts of the Nextcloud. If you are self-hosting Nextcloud on the internet, you can also back up the data on the internet.
If you always sync all data between the cloud and all clients, then perhaps a cloud is not the right solution for you.
I get the feeling you are a little confused about the terms backup, restore and sync. Self-hosting a complex software package requires at least a basic understanding of the technology you use, from the start (setup) through daily operations like backup, monitoring and security, to restore and disaster recovery. If this is not the case, you had better choose a hoster who offers a good package with backup and monitoring included, or look for professional support.
As you see, there are people in this forum who are happy to help with your issues, but you can’t expect somebody to explain all the basics to you. So far this thread is completely confusing: we started with sync speed, now we are talking about backup/restore, and you still haven’t provided an exact description of the issue you are facing and what you have tried so far.
Thank you for your message. I understand perfectly that I am no longer eligible to ask questions on this forum. I apologize for the time you lost with me. Too bad. Bye bye.
Your initial post is not very clear. It’s like saying, “Every time I go to New York, it takes me more than 5 hours.” In such statements, it is more about what you didn’t mention, so it’s really hard to help, and people don’t want to spend much time asking for all the details (if you open a new topic, it should show you an issue template).
The other problem can be shared hosters: some work fine, others use very “strange” setups that make it difficult to host Nextcloud. On top of that, on shared environments it is much harder to debug things, and users on shared environments tend to be less experienced. So you can spend days debugging with the help of more experienced users, just to find out that something in the shared hoster’s environment is not compatible with Nextcloud (or that some tuning to improve performance is not possible)…
The transfer performance with a large number of files in particular depends heavily on the database cache and the Redis file locking, things you mostly can’t change in hosted environments.
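For reference, on a server where you do control the environment, caching and file locking are enabled in Nextcloud’s `config.php`. This is only an illustrative fragment; the Redis host and port are assumptions:

```php
// Illustrative config.php fragment (inside the $CONFIG array):
// APCu as local cache, Redis for transactional file locking.
// Host/port values are assumptions for a local Redis instance.
'memcache.local' => '\OC\Memcache\APCu',
'memcache.locking' => '\OC\Memcache\Redis',
'redis' => [
    'host' => 'localhost',
    'port' => 6379,
],
```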
If you are interested, you can start at home with a Raspberry Pi or some virtual server, and in the end, if it’s not enough at home, you can rent virtual servers on the net (for a few $/€ per month).
Thank you for your analysis. Maybe I should try to move from SQLite3 to a more elaborate database system. Fortunately, I have shell access on my hoster’s system…
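For the record, Nextcloud ships an `occ` command that converts the database in place. A minimal sketch, assuming shell access and an already-created empty MySQL/MariaDB database; the database name, user and host here are placeholders:

```shell
# Hypothetical SQLite-to-MySQL conversion; DB name, user and host are assumptions.
cd /var/www/nextcloud

# Avoid writes while the data is being converted
sudo -u www-data php occ maintenance:mode --on

# Convert the current (SQLite) database to the target MySQL database
sudo -u www-data php occ db:convert-type \
    --password "db-password" \
    mysql nextcloud_user 127.0.0.1 nextcloud_db

sudo -u www-data php occ maintenance:mode --off
```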
Many small files are tricky. I did some tests a long time ago, when it was still ownCloud; back then, the configuration of the database and caching had a huge impact on the sync performance of many small files: https://github.com/owncloud/core/issues/20967
So MySQL should be better, but I wouldn’t expect too much.
I will try it. Thanks again.