Learning from other Open Source projects like 'Syncthing'


Since we now have a chance to learn from past mistakes, I wanted to make the following suggestion:

As we look down the road toward making a killer Nextcloud app for iOS and Android, perhaps we can learn from the blood, sweat and tears that have already been invested in other Open Source projects?

For example, perhaps we can learn from the following project to help us make an incredible Nextcloud app for Windows, macOS and mobile devices: https://syncthing.net/

Several years ago, I was a hard-core user of a revolutionary Sync client called BitTorrent Sync, which was later renamed to just ‘Sync’.

I actually used it together with my ownCloud server on a headless hosted setup. It was incredible.

Later, BitTorrent Inc. made a huge change to their business model, essentially going back on a promise they had made to the Open Source community.

This change severely handicapped ‘Sync’, forcing anyone who wanted some of its more unique and beloved features to subscribe to a yearly license.

This move was almost the death of ‘Sync’. In about 48 hours they lost almost their entire following.


Anyway, most of us landed at ‘Syncthing’.

Syncthing has a global community united in one common cause: To make the best, fastest and most secure Sync app under the sun.

With such a focus on the ‘Syncing’ aspect alone, there is a lot I’m sure we can learn from them.

Just my thoughts.


Any thoughts on this, or even about Syncthing?

Hmm, perhaps we could fork that project to create Nextcloud Sync?




I’m currently totally new to the ownCloud/Nextcloud world; I was literally listening to a podcast about Nextcloud a minute before writing this text. Unfortunately, I know for a fact that people have trouble downloading 100 GiB files without the download interrupting their use of the internet connection for other activities. For example, one of my clients said that he was not able to download a VirtualBox virtual appliance that I had prepared for him, because it would stop him from using the internet connection for other, business-related, activities. Often, people who live in detached houses or in the countryside, instead of in a multi-storey block of flats in a town with at least 100k residents, have difficulty getting a proper, fast internet connection, because the price of laying a private fibre-optic cable is prohibitive.

Long story short: BitTorrent, or something that mimics it, allows people to pause and resume downloads of those multi-GiB files, and that feature is ESSENTIAL for allowing many people to download VirtualBox (VMware, etc.) virtual appliances.

The format for virtual appliances is universal: for example, a virtual appliance can be exported from VirtualBox and imported into VMware. The reason virtual appliances are important is that any project has a lot of dependencies; practically speaking, there’s dependency hell. To be able to re-compile a project years after its release, maybe to add a menu, fix a typo or modify some feature, a whole development environment that has all of the original dependencies is needed. The only way to make that feasible is to use a virtual appliance, because Linux/BSD/whatever distributions change so quickly, old software might not run on new hardware, old operating systems cannot possibly contain drivers for future hardware that did not exist at the time of their release, and so on.

Sometimes one also needs to use closed-source development tools, and the only way to make those “future-proof”, provided their licensing allows it, is to install them into a virtual machine. Whenever one wants to test the installation of some software, open-source or closed-source, on a freshly installed operating system that is guaranteed not to have the various environment variables, files and configuration modifications that accumulate on a developer’s machine, the only way to do that is to clone a virtual machine whose operating system is freshly installed and not customised. Again, that means virtual appliances, and those, if exported, are HUGE.

So, it is paramount to be able to download really huge files, and the ordinary click-the-link-and-download-from-start-to-finish-in-one-go approach will not work. That is to say, having a “Dropbox” that is purely server-side software is USELESS, because the client side needs a BitTorrent client or something that mimics BitTorrent by allowing downloads to be performed in chunks, paused, and resumed.
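As an aside, the pause-and-resume behaviour described here does not strictly require BitTorrent; plain HTTP Range requests can provide it whenever the server supports them. Below is a minimal sketch in Python using only the standard library — the URL, chunk size, and function names are illustrative, not taken from any Nextcloud code:

```python
import os
import urllib.request

CHUNK = 5 * 1024 * 1024  # 5 MiB per read; an arbitrary illustrative size


def resume_offset(dest):
    """Byte offset to resume from: the size of any partial file on disk."""
    return os.path.getsize(dest) if os.path.exists(dest) else 0


def range_header(start):
    """HTTP header asking the server for everything from `start` onward."""
    return {"Range": f"bytes={start}-"} if start else {}


def resume_download(url, dest):
    """Download `url` to `dest`, continuing from a partial file if present."""
    start = resume_offset(dest)
    req = urllib.request.Request(url, headers=range_header(start))
    with urllib.request.urlopen(req) as resp, open(dest, "ab") as f:
        # 206 Partial Content means the server honoured the Range header;
        # a plain 200 means it resent the whole file, so overwrite from zero.
        if start and resp.status != 206:
            f.seek(0)
            f.truncate()
        while chunk := resp.read(CHUNK):
            f.write(chunk)
```

Calling `resume_download(url, dest)` again after an interruption appends from wherever the partial file stopped instead of restarting from zero, which is exactly the pause/resume behaviour described above.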

Does Nextcloud have a client-side component?

Thank You for reading my letter and
thank You for Your answers.

Actually, given the dependency requirements and the complexity of configuring/customising some of the open source projects, virtual appliances will probably become THE UNIVERSAL SOFTWARE PACKAGING FORMAT for many applications. An example is what I have done with the Gigablast open source search engine. (I am not the author of Gigablast, just a user, a “leech”.)

My current comment is a separate post instead of an edit, because there’s the stupid and archaic limit that new forum users are allowed to put only 2 links in a post.

Hi, thanks for your interest in Nextcloud. We actually have clients for Linux, macOS and Windows. However, you currently have to download them from the ownCloud website or repos. See: https://owncloud.org/install/#install-clients

I luckily don’t have a slow internet connection, but in my country it is still common for ISPs running over cable to have small data limits (like 100 GB, or a fair-use policy that is essentially a 500 GB limit) and slow upload speeds. As far as I know, the Nextcloud/ownCloud client supports interrupted connections and pausing; I do this a lot when uploading big files. This works because the client uploads files in chunks (5 MB or 10 MB, IIRC).

Also note that the client supports rate limiting: https://doc.owncloud.org/desktop/2.2/navigating.html#using-the-network-window

Do note that when you have a slow internet connection it’s possible that the client times out, but it starts where it left off (that’s why chunking is important :slight_smile: ) and tries again.