Hi, I have a client with a very slow internet connection, so I uploaded a backup of their hard drive to the Nextcloud server (which uses an external S3 backup location) from another site with better bandwidth. The only problem is that all the “last modified” metadata was lost in the process. In other words, when I start the CLI sync, it looks like the client tries to upload or download every single file (I’m not sure which). With this poor connection, that could take weeks or months.
So my question: is there a way to tell the server or the CLI client that the local files are the latest in any case, and to just upload the metadata and re-synchronize from there? (Yes, I know some files might end up with wrong metadata, but I’ll take that if the alternative is a sync that takes over a month.)
First, I looked into the documentation here, but I didn’t find anything.
Before brute-forcing your way into a hand-crafted solution, you may want to look through the issues on the official GitHub page to see if there is an easier solution to apply.
As a last resort, I tried to find a way to “manually” produce an acceptable result.
If the synchronization only relies on the metadata, file size and hash, then maybe you can do the following:
- Act when the user isn’t working on their files.
- Back up the user’s files.
- On both the server and the client, run a script that sets all the metadata to the same date (within the scope you want to operate on; you may want to try this in a test environment first).
- Then reactivate the client.
- The server should ask you to resolve conflicts; just push the version you want.
I’m not 100% sure the client will report this as a conflict, but it may be worth a try on a test account with dummy data.
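The “set all metadata to the same date” step above could look roughly like this on a Linux shell. This is only a sketch: the directory and the reference date are placeholders you’d adapt, and you should run it on a test copy first, as suggested above.

```shell
#!/bin/sh
# Sketch: normalize the "last modified" time of every regular file under
# a directory to one fixed reference date, so both sides end up with
# identical mtimes.
# Both arguments are example values, not anything Nextcloud-specific:
#   ./normalize-mtime.sh /path/to/test-copy "2024-01-01 00:00:00"
DIR="${1:-.}"                       # directory to process (placeholder)
STAMP="${2:-2024-01-01 00:00:00}"   # arbitrary reference date

# Recursively set the mtime of all regular files to $STAMP.
find "$DIR" -type f -exec touch -d "$STAMP" {} +
```

On the server side, note that Nextcloud keeps its own file cache in the database, so after touching files directly on the storage you would presumably also need a rescan (e.g. `occ files:scan`) for the new timestamps to be picked up; I haven’t verified how that interacts with an external S3 backend.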