Mass upload when using e2e encryption and an S3 backend

Nextcloud version (eg, 12.0.2): 16.0.5
Operating system and version (eg, Ubuntu 17.04): Ubuntu 17.04
Apache or nginx version (eg, Apache 2.4.25): Apache whatever-version-came-with-the-official-snap
PHP version (eg, 7.1): php whatever-version-came-with-the-official-snap

I have enabled end-to-end encryption on my instance and am using S3-compatible storage (I’m on DigitalOcean, so I’m using their Spaces object store) to back the Nextcloud instance.

I’d like to upload around 25GB of data, but I’m not keen on using the Linux client for this, since I know it would take many hours, if not days. What would be better is if I could push an archive to the instance, or otherwise push the files up to either the server or the object storage, and have Nextcloud populate from there.
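
For concreteness, if pushing over WebDAV is the only real option, the sort of thing I could script myself looks roughly like this. The hostname, username, app password, and paths are placeholders for my own setup, and I assume a plain WebDAV upload wouldn’t play well with an end-to-end encrypted folder anyway, since that encryption happens in the client:

```python
# Rough sketch: walk a local tree and push it to the Nextcloud WebDAV endpoint.
# BASE, AUTH, and the paths below are placeholders for my own setup.
import os
import requests

BASE = "https://cloud.example.com/remote.php/dav/files/alice"
AUTH = ("alice", "app-password-here")
LOCAL_ROOT = "/home/me/bulk-import"
REMOTE_ROOT = "bulk-import"

session = requests.Session()
session.auth = AUTH

for dirpath, dirnames, filenames in os.walk(LOCAL_ROOT):
    rel = os.path.relpath(dirpath, LOCAL_ROOT)
    remote_dir = REMOTE_ROOT if rel == "." else f"{REMOTE_ROOT}/{rel}"

    # Create the remote directory; 405 just means it already exists.
    r = session.request("MKCOL", f"{BASE}/{remote_dir}")
    if r.status_code not in (201, 405):
        r.raise_for_status()

    for name in filenames:
        with open(os.path.join(dirpath, name), "rb") as fh:
            r = session.put(f"{BASE}/{remote_dir}/{name}", data=fh)
        r.raise_for_status()
        print("uploaded", f"{remote_dir}/{name}")
```

That still funnels everything through the web server one request at a time, though, which is exactly what I was hoping to avoid.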

I’m aware of the files:scan tooling, but my understanding is that it only works for files already on the server’s filesystem, and I’m unclear whether it would then push those into the object store or whether they need to stay on the filesystem itself. I also don’t know whether there is any tooling that works with archive files (e.g., tgz, tar.xz). From what I’ve read, files:scan may not work at all if you are using E2E encryption.
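
For concreteness, what I imagined the files:scan workflow looking like is roughly this. The data directory path is only my guess at the snap’s layout, and I have no idea whether any of it still applies once the primary storage is an object store:

```python
# Rough sketch of the unpack-then-scan workflow I had in mind. The data
# directory is an assumption about the snap layout, and "alice" is a placeholder.
import subprocess
import tarfile

DATA_DIR = "/var/snap/nextcloud/common/nextcloud/data"  # assumed snap data dir
USER = "alice"
TARGET = f"{DATA_DIR}/{USER}/files/notes"

# tarfile auto-detects the compression, so this covers .tgz and .tar.xz alike.
with tarfile.open("notes.tar.xz") as tar:
    tar.extractall(TARGET)

# The snap exposes occ as the `nextcloud.occ` command.
subprocess.run(
    ["sudo", "nextcloud.occ", "files:scan", f"--path={USER}/files/notes"],
    check=True,
)
```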

Any suggestions/guidance would be greatly appreciated!

Update: I’ve since removed end-to-end encryption, but I’m still using S3-compatible storage and would still love some tooling for bulk upload.

It would also be interesting to have a way to do bulk downloads. I recently used Synchronize Ultimate to sync a directory of notes to an Android tablet, and the 270MB directory took 36 hours to sync… (it’s much faster on the desktop with the official clients).
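
In the meantime, the kind of bulk-download helper I keep wishing existed is roughly this: list a folder over WebDAV and fetch each file directly (again, the hostname, username, and app password are placeholders for my instance):

```python
# Rough sketch: recursively fetch a Nextcloud folder over WebDAV.
# BASE, PREFIX, and AUTH are placeholders for my own instance.
import os
import xml.etree.ElementTree as ET
from urllib.parse import unquote

import requests

BASE = "https://cloud.example.com/remote.php/dav/files/alice"
PREFIX = "/remote.php/dav/files/alice/"
AUTH = ("alice", "app-password-here")

def list_dir(path):
    """Return (subdirs, files) directly under `path`, relative to BASE."""
    r = requests.request("PROPFIND", f"{BASE}/{path}", auth=AUTH,
                         headers={"Depth": "1"})
    r.raise_for_status()
    dirs, files = [], []
    for resp in ET.fromstring(r.content).iter("{DAV:}response"):
        rel = unquote(resp.find("{DAV:}href").text)[len(PREFIX):]
        if rel.rstrip("/") == path.rstrip("/"):
            continue  # skip the entry for the folder itself
        if rel.endswith("/"):
            dirs.append(rel.rstrip("/"))
        else:
            files.append(rel)
    return dirs, files

def download_tree(path, dest):
    dirs, files = list_dir(path)
    for f in files:
        local = os.path.join(dest, f)
        os.makedirs(os.path.dirname(local), exist_ok=True)
        with requests.get(f"{BASE}/{f}", auth=AUTH, stream=True) as r:
            r.raise_for_status()
            with open(local, "wb") as out:
                for chunk in r.iter_content(1024 * 1024):
                    out.write(chunk)
    for d in dirs:
        download_tree(d, dest)

download_tree("notes", "./notes-backup")
```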