FTP download directories without zipping first

Is there a way to download entire directories from Nextcloud without zipping first, i.e. add all files in a directory to a download queue and have them download individually? I'm asking because transferring directories with many large files is not only incredibly slow due to the zipping step (think a 10 GB directory full of video and audio files), it also requires double the space on the target, since you need room for both the zipped file AND the extracted files. Many devices have limited storage or no SD card slot, which makes this a real issue for some.

The alternative - clicking each file sequentially to download - is not only tedious when a large number of files is involved but also a start/stop operation, in Chrome at least, because of the browser's hard limit on the number of simultaneous connections to a host.

At any rate, is there any way to do this? I know Filezilla does it (adding a directory to the download queue simply adds all the individual files) and I was hoping Nextcloud could do it too. Thanks.

You can use any WebDAV client and copy files directly (WinSCP, Cyberduck, …).

Right, but those are third-party programs that require installation, which isn't possible on some corporate computers and isn't viable on a Chromebook. The advantage of Nextcloud is browser access to my own private cloud, and it seems there should be a way to download an entire directory of my files without having to zip them first, click each file one by one, or hunt down a third-party program to do it.

Windows has a native WebDAV client, though it has a few bugs. Perhaps ChromeOS has one as well?

Without zipping, you can't just download several files in one go from the browser. Or do you know a web application that can do it?

If all that doesn't work for you, you can set up an FTP server to download the files from and mount it as external storage in Nextcloud.

Are you saying this is an inherent limitation of WebDAV/HTTP downloads? If so, is there a way for Nextcloud to queue all the links within a directory for download over WebDAV, rather than zipping the directory and queuing that single file?
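For what it's worth, this kind of "queue every file in the folder" behavior is scriptable against Nextcloud's WebDAV endpoint: a PROPFIND request with `Depth: 1` lists a folder's contents, and each non-folder entry can then be fetched with a plain GET. Below is a minimal sketch of that idea. The server URL, user name, and password are placeholders, the third-party `requests` library is assumed for the HTTP calls, and `/remote.php/dav/files/<user>/` is Nextcloud's standard WebDAV path; treat it as an illustration, not a polished tool.

```python
# Sketch: list a Nextcloud folder over WebDAV (PROPFIND, Depth: 1)
# and download each file individually, skipping the server-side zip step.
import urllib.parse
import xml.etree.ElementTree as ET

DAV_NS = "{DAV:}"  # ElementTree's notation for the DAV: XML namespace


def extract_hrefs(propfind_xml: str) -> list[str]:
    """Parse a PROPFIND multistatus response and return the href of
    every non-collection resource (plain files, not sub-folders)."""
    root = ET.fromstring(propfind_xml)
    hrefs = []
    for response in root.findall(f"{DAV_NS}response"):
        href = response.findtext(f"{DAV_NS}href")
        # Folders carry a <d:collection/> element inside <d:resourcetype>.
        is_folder = response.find(
            f"{DAV_NS}propstat/{DAV_NS}prop/"
            f"{DAV_NS}resourcetype/{DAV_NS}collection"
        ) is not None
        if href and not is_folder:
            hrefs.append(href)
    return hrefs


def download_folder(server: str, user: str, password: str, folder: str) -> None:
    """Fetch every file in `folder` one by one.
    server: e.g. "https://cloud.example.com" (placeholder)
    folder: e.g. "Photos/Trip" relative to the user's files root."""
    import requests  # third-party HTTP client; any WebDAV-capable client works

    base = f"{server}/remote.php/dav/files/{user}/"
    reply = requests.request(
        "PROPFIND", base + folder,
        headers={"Depth": "1"}, auth=(user, password),
    )
    for href in extract_hrefs(reply.text):
        name = urllib.parse.unquote(href.rstrip("/").rsplit("/", 1)[-1])
        with requests.get(server + href, auth=(user, password), stream=True) as r:
            with open(name, "wb") as out:
                for chunk in r.iter_content(1 << 16):
                    out.write(chunk)
```

Because each file is streamed straight to disk, only the size of the files themselves is needed on the target, with no temporary zip archive on either end.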

Perhaps this is a feature request for the GitHub issue tracker rather than a tech support question for this forum.