Disable Zipping

I don’t think it’s a RAM issue, I think it’s a PHP issue, because the server doesn’t get the right argument and the request crashes.
@voxdemonix do you have time this afternoon so we can troubleshoot this together?

Is there really no way to scale up the RAM? I mean, RAM is not that expensive nowadays, and the need obviously seems to be on the user’s side.

I thought it was a RAM issue, but that was a mistake on my part.

do you have time this afternoon so we can troubleshoot this together?

Of course :slight_smile:

Well, some download attempts (on the ARM board, with a remote SSHFS mount):

  • more than 100 small files in one zip => works
  • 2 GB downloaded as one zip => works
  • more than 2 GB downloaded as one zip => does not work

PS: I’m using a translator, so I hope it doesn’t disturb the reading too much ^^

How about adding a zip compression option in the settings?
That way you could set it anywhere from -0 to -9, where -0 stores the files without compression, like tar.
How about that?
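
For what it’s worth, PHP’s ZipArchive can already mark entries as “store only”, which is the equivalent of -0. This is just an illustration of the idea with made-up paths, not how Nextcloud actually builds its downloads:

```php
<?php
// Illustration only: "store" entries instead of compressing them,
// which is the zip equivalent of -0 (behaves like tar).
// Requires PHP >= 7.0 with the zip extension; paths are made up.
$zip = new ZipArchive();
$zip->open('/tmp/example.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);
$zip->addFile('/path/to/photo.jpg', 'photo.jpg');
$zip->setCompressionName('photo.jpg', ZipArchive::CM_STORE); // store, don't deflate
$zip->close();
```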

After discussion with @Nemskiller: the zip function relies on a PHP capability that only works correctly on 64-bit systems.
Downloading more than 2 GB of files as a single zip is therefore impossible (and not reported as an error) from the web UI (there is no problem with WebDAV, it only affects zip downloads in the web UI). When you hit the server error, refreshing doesn’t help either (you land on the server error again).
Downloading files one by one doesn’t need this PHP function.
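
If anyone wants to confirm this on their own server, here is a quick check (my own snippet, not taken from the Nextcloud code) that shows whether PHP itself is a 32-bit build:

```php
<?php
// On a 32-bit PHP build, PHP_INT_SIZE is 4 and PHP_INT_MAX is 2147483647
// (about 2 GiB), so any byte count above that overflows and the zipped
// download breaks. On 64-bit builds PHP_INT_SIZE is 8.
echo 'PHP_INT_SIZE: ' . PHP_INT_SIZE . " bytes\n";
echo 'PHP_INT_MAX:  ' . PHP_INT_MAX . "\n";
if (PHP_INT_SIZE < 8) {
    echo "32-bit PHP: zip downloads larger than ~2 GB will fail.\n";
}
```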

This issue is really problematic, because it’s a feature regression with respect to the specs required for Nextcloud.

Nextcloud does not require a 64-bit system in its specs, and lots of people run it on non-64-bit ARM systems such as older Raspberry Pis or custom open-hardware NAS boxes with low CPU/RAM.

I think there are two aspects to this problem:

  • Create an archive
  • Compress this archive

Nowadays lots of people have good internet access, so compression is not that interesting.
Also, in a cloud you mostly store things that are already compressed, like JPEG images, MP3s, or MS Office files.
If you store raw text for your next sci-fi book, or code, you’d be better off putting it in a GitLab :slight_smile:

So, IMHO, our question is how to download multiple files and folders in a single operation, not how to compress them.

I found this article that seems related to the zip function issue: https://stackoverflow.com/questions/5745255/php-aborting-when-creating-large-zip-file

Is there any workaround for this? It’s really problematic for the user experience.

What about these potential solutions:

  • Use the CLI or an alternate PHP function to create the zip archive (see the sketch after this list)
  • Offer other archive formats in the UI (tar, 7zip…)
  • Show a warning in the web UI that takes the size of the data to be archived into account
  • Update the specs to require a 64-bit system for Nextcloud
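
To make the first option more concrete, here is a rough sketch, assuming the server has the Info-ZIP `zip` binary installed; the function name and the use of exec() are my own assumptions, not how Nextcloud does it today:

```php
<?php
// Sketch: let the system `zip` binary build the archive instead of PHP,
// so PHP's integer size no longer matters. Hypothetical helper, not Nextcloud code.
function buildArchiveWithCli(array $files, string $archivePath): bool
{
    $quoted = array_map('escapeshellarg', $files);
    // -0 = store only (no compression), -r = recurse into folders
    $cmd = 'zip -0 -r ' . escapeshellarg($archivePath) . ' ' . implode(' ', $quoted);
    exec($cmd, $output, $exitCode);
    return $exitCode === 0;
}
```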

EDIT: The workflow could also be different, and that might help solve the problem. Maybe the archive could be prepared somewhere on disk instead of being built as a stream? That shouldn’t be a problem, because it only uses disk space instead of CPU.
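
As a sketch of that idea (again with my own assumptions about paths and headers, not Nextcloud code), the prepared archive could then be streamed back in small chunks so it never has to fit in RAM:

```php
<?php
// Stream a previously prepared archive from disk in small chunks.
// Content-Length is deliberately omitted: filesize() is itself unreliable
// for files larger than 2 GB on 32-bit PHP.
function sendPreparedArchive(string $archivePath): void
{
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="download.zip"');

    $handle = fopen($archivePath, 'rb');
    while (!feof($handle)) {
        echo fread($handle, 8192); // 8 KiB at a time keeps memory usage flat
        flush();
    }
    fclose($handle);
    unlink($archivePath);          // free the disk space once the download is done
}
```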

Thanks for your help

Normally you can download the biggest file in the world with no problem, even with limited RAM.

There is a new issue for this:

Please give it a :+1: if you care about it.

This is effectively the same as only being able to download one file at a time, since a zip file is always used as soon as you select two files or a whole folder. Maybe it would be better to disable downloading more than one file, or a whole folder, in that case.