Hello. I’m new to NextCloud and delighted to have found it. That said, I’m having a heck of a time getting it to work.
I’ve successfully synced one computer and am now working on the second. On the second I’m trying to sync about 90 GB of data. It would be great if this initial sync would just work tirelessly in the background, but:
- it frequently times out (after about an hour), then restarts the whole “checking for changes” process;
- alternatively it says it can’t get a connection to the server; or
- connection timed out; or
- server replied Internal Server Error; or
- some file is blacklisted.
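The repeated hour-long timeouts suggest the client-side network timeout is worth tuning first. The desktop client reads a plain config file (on Windows it lives at %APPDATA%\Nextcloud\nextcloud.cfg); the keys below are documented client options, but the values are only examples to experiment with, not something taken from this setup:

```ini
; %APPDATA%\Nextcloud\nextcloud.cfg (Windows sync client)
[General]
; network timeout in seconds (client default is 300)
timeout=600
; upload chunk size in bytes; smaller chunks recover better
; from flaky connections on shared hosting (example: 5 MB)
chunkSize=5242880
```

Restart the client after editing the file so the new values are picked up.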
I’m running the Windows 10 sync client against a shared hosting environment (on InMotion) with a MySQL database and PHP 7 on an Apache server.
Both computers are syncing to the same Nextcloud instance, but each syncs to its own folder under the main Nextcloud directory.
I do have some 15 GB files, and it would be helpful to know if there’s something I have to change to allow these files to sync. I’m just using the default config files, I’ve never touched a blacklist file, and the hosted server is always up and running when I check (as you might expect from a reputable hosting service).
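For files this large, PHP’s stock upload limits are a common culprit on shared hosting. If InMotion allows a local php.ini or .user.ini in the web root, these are the standard PHP directives involved; the values are examples only, and shared hosts may cap or ignore some of them:

```ini
; .user.ini or php.ini in the Nextcloud web root (shared hosting)
upload_max_filesize = 16G
post_max_size = 16G
max_input_time = 3600
max_execution_time = 3600
memory_limit = 512M
; output buffering must be off for large transfers through PHP
output_buffering = 0
```

post_max_size must be at least as large as upload_max_filesize, or uploads will still fail at the smaller limit.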
Is there an optimal set of settings and configuration options I should be using? After checking support forums, documentation and error logs, I am no smarter about how to tackle this problem of getting my PC synced. Any assistance, links, etc. would be appreciated.
Do you happen to know how much memory and swap your server has?
And during the sync with errors, how is the memory distributed? (free, in use, cache/buffer)
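For reference, the distribution I mean is what the standard procps tools report (assuming a typical Linux server, e.g. Debian/Armbian):

```shell
# Snapshot memory while the sync is running: the "buff/cache" column
# is the page cache; "available" is what a new process could actually
# claim, which matters more than "free".
free -h
```

Running `vmstat 5` in a second terminal during the sync shows how those numbers and swap activity change over time.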
I’m asking because I have the same kind of symptoms on my system as well.
In my case my server is an odroid HC2 running Armbian with Nextcloud 13.0.1 installed via snap (13/candidate).
I noticed that while syncing a lot of large files (tutorial videos) I get the same kinds of errors you mention. I logged in on the server and saw that memory was full (2 GB of RAM, 128 MB of zram swap). The strange thing (to me) was that 1.4 GB of it was buffer/cache. Usually under Linux that is no problem, since the kernel reclaims it when a process needs the memory (at least, that is what I read on the internet). When I clear the page cache (sync; echo 1 > /proc/sys/vm/drop_caches) I see memory being freed and syncing working better, but within a few seconds the memory is full again (back to 1.4 GB of buffer/cache).
Syncing 155 GB of smaller files is no problem, however.
I agree. I am trying to use this as a backup solution, but everything I do seems to cause some sort of syncing or timeout error. The concept is great, but the software is a bit flaky, and on top of that I don’t see any way of managing the actual storage used / space remaining on the drives or partitions themselves.
In response to pietere, I believe you have a different problem. If I’m correct, you’re using a SoYouStart ARM storage server, in which case the default install creates several partitions. In that case you need to install Nextcloud under the /home directory, as that is where the majority of the storage is.
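A quick way to confirm that partition layout is the problem (assuming shell access to the server):

```shell
# On these default images the root partition is small and /home
# holds the bulk of the disk; compare the two filesystems.
df -h / /home
```

For a manual install you can then point 'datadirectory' in config/config.php at a path under /home; note the snap package confines its data path, so moving the data directory works differently there.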