Is it possible to disable the zipping (compression of files) when a user wants to download more than one file via the WebUI?
This feature can be used to crash a Nextcloud instance easily (just try to download a folder bigger than the web server’s RAM and BOUUUUM).
Hi voxdemonix,
I had a look at the current version (16.0.4) but could not find an option for that. I also checked GitHub to see if someone else had mentioned this, but it doesn’t seem so.
I would suggest you open a new issue on GitHub (https://github.com/nextcloud/server/issues), explain your problem there, and ask for an admin option to switch zipping on or off. I do agree with you on the memory part, but this could be prevented by setting a certain memory limit in your PHP config. That would not fix the actual problem, however, and would probably just cause more errors.
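For reference, the limit mentioned above is the `memory_limit` directive in `php.ini`; the value below is just an example, pick one that fits your server:

```ini
; php.ini — cap how much memory a single PHP request may allocate
memory_limit = 512M
```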
Hello @CFelix
I don’t have GitHub anymore (Microsoft…).
You can post the issue and copy-paste the link here if you want.
I do agree with you on the memory part, but this could be prevented by setting a certain memory limit in your PHP config. That would not fix the actual problem, however, and would probably just cause more errors.
I think this would cause errors when trying to download files that produce a zip larger than what the PHP config accepts. Users would not understand the problem and would just conclude that “it doesn’t work”.
The server may also choke when several users together ask for more than the RAM allows (e.g. you have 2 GB of RAM and 3 users each try to zip a 1 GB folder).
Hi voxdemonix,
I opened a new Issue / Feature request in Github (https://github.com/nextcloud/server/issues/16971) about this. Let’s see what they come back with.
Big Thanks
By the way, a little feedback:
- running Nextcloud on the BTRFS filesystem should be discouraged, at the risk of repeated “expected filesize xxx got xxx” errors when your users upload files.
You cannot bypass the zip process.
Have you ever downloaded several files from a web browser without them being zipped?
Sorry to answer in the negative, but for the moment it’s impossible for Nextcloud to deliver several files from the WebUI without zipping them.
Have you ever downloaded several files from a web browser without them being zipped?
Of course, without any problem. Why?
You cannot bypass the zip process.
In my situation:
= server crash when a user wants to download several files = users prefer GoogleCloud.
There is no such problem over WebDAV, in the OS filesystem (but users don’t [want to] understand what WebDAV is).
Hi @voxdemonix
Do you have any logs regarding the crash of your server/instance? Are you using a 32bit operating system by any chance?
Hello,
Could you provide me a video where you download several files at once from a browser without them getting zipped? I’m very curious.
Do you have the PHP Zip module (php-zip) installed on your system?
You could talk with me via PM if you want me to do a direct diagnostic.
And if it is a large folder, you can probably crash a server as well if you download many files at the same time (going through individual connections, …). I am not even sure whether the zip actually compresses the data or can just stream everything together as one large file, so that it doesn’t need so many resources. Google Drive/Dropbox (https://help.dropbox.com/installs-integrations/sync-uploads/download-entire-folders) also zips everything for you.
With a browser plugin, this could be possible. Or some fancy javascript?
Could you provide me a video where you download several files at once from a browser without them getting zipped? I’m very curious.
Instant lesson:
Zipping into a single file is a practice that trades server CPU power for network bandwidth. It is not required in order to download multiple files simultaneously.
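To illustrate that trade-off, here is a rough Python sketch (illustrative only, nothing to do with Nextcloud’s actual code) comparing a stored (uncompressed) archive with a deflated one: the stored entry costs almost no CPU but saves no bandwidth.

```python
import io
import zipfile

# Compare archive sizes for highly compressible data:
# ZIP_STORED spends no CPU on compression, ZIP_DEFLATED trades CPU for size.
data = b"example " * 10_000  # 80,000 bytes of repetitive payload

sizes = {}
for name, method in [("stored", zipfile.ZIP_STORED), ("deflated", zipfile.ZIP_DEFLATED)]:
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", compression=method) as zf:
        zf.writestr("data.bin", data)
    sizes[name] = len(buf.getvalue())

print(sizes)  # the stored archive is roughly the payload size; deflated is far smaller
```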
And if it is a large folder, you can probably crash a server as well if you download many files at the same time (going through individual connections, …).
Correct me if I am wrong, but:
- When the server zips several files for transfer, it must assemble ALL the files into one big file in RAM.
- When transferring the same files one by one (not zipped), the server doesn’t need to cache all the file data in RAM. This is much more acceptable for people who do not have Google-scale servers.
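For what it’s worth, whether the first point holds depends on the implementation: a ZIP archive can be written as a forward stream with bounded memory. A minimal Python sketch (illustrative only, not Nextcloud’s actual code) that copies files into an archive in small chunks:

```python
import os
import zipfile

CHUNK_SIZE = 64 * 1024  # copy in 64 KiB chunks; peak RAM stays near this size

def stream_files_into_zip(paths, out_path):
    """Write each file into the archive chunk by chunk (no compression)."""
    with zipfile.ZipFile(out_path, "w", compression=zipfile.ZIP_STORED) as zf:
        for path in paths:
            entry = zipfile.ZipInfo(os.path.basename(path))
            # ZipFile.open(..., "w") gives a writable entry stream, so the
            # whole file never has to sit in memory at once.
            with open(path, "rb") as src, zf.open(entry, "w") as dst:
                while chunk := src.read(CHUNK_SIZE):
                    dst.write(chunk)
```

Whether PHP’s zip streamer behaves like this is a separate question, but the format itself does not force the whole archive into RAM.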
With a browser plugin, this could be possible. Or some fancy javascript?
Isn’t it simply possible to let the admin choose?
Do you have any logs regarding the crash of your server/instance?
I’ll test it as soon as I’ve managed to fix the WebUI, which has been broken since earlier.
Are you using a 32bit operating system by any chance?
32-bit (ARM) and 64-bit; it’s a cluster.
Transferring files one by one still uses RAM as the data needs to be processed by the web server / PHP script. Is it using less RAM? I kind of doubt that, but that always depends on the RAM management of the software in use. The main question is, is the used RAM released quick and sufficient enough, for it not to crash the server?
The only workaround I see at the moment is to increase the RAM, or to advise users to download smaller chunks or single files.
Not if it is a stream and it can add files one by one. If this is possible without zip, then please share your knowledge with the developers; it seems they weren’t able to figure it out. You can do this here:
He doesn’t have a GitHub account anymore because it’s owned by Microsoft (he mentioned it in his first reply).
Okay, I have tested:
- on x64, it works: a download test of 25 GB went through without any problem.
- on ARM with the data on an SSHFS mount path, with a test of ±6 GB:
{"reqId":"**REPLACED**","level":3,"time":"2019-09-03T17:11:01+00:00","remoteAddr":"**REPLACED**","user":"**REPLACED**","app":"index","method":"GET","url":"\/index.php\/apps\/files\/ajax\/download.php?dir=%20linux&files=%5B%22Domotique_LinuxMCE-8.10-final.iso%22%2C%22FreeBSD-10.0-RELEASE-amd64-dvd1.iso%22%2C%22FreeBSD-8.2-RELEASE-i386-disc1.iso%22%5D&downloadStartSecret=**REPLACED**","message":{"Exception":"TypeError","Message":"Argument 2 passed to OC\\Streamer::__construct() must be of the type integer, float given, called in \/var\/www\/html\/nextcloud\/lib\/private\/legacy\/files.php on line 166","Code":0,"Trace":[{"file":"\/var\/www\/html\/nextcloud\/lib\/private\/legacy\/files.php","line":166,"function":"__construct","class":"OC\\Streamer","type":"->","args":[{"__class__":"OC\\AppFramework\\Http\\Request"},6686298112,3]},{"file":"\/var\/www\/html\/nextcloud\/apps\/files\/ajax\/download.php","line":64,"function":"get","class":"OC_Files","type":"::","args":["\/linux",["Domotique_LinuxMCE-8.10-final.iso","FreeBSD-10.0-RELEASE-amd64-dvd1.iso","FreeBSD-8.2-RELEASE-i386-disc1.iso"],{"head":false}]},{"file":"\/var\/www\/html\/nextcloud\/lib\/private\/Route\/Route.php","line":155,"args":["\/var\/www\/html\/nextcloud\/apps\/files\/ajax\/download.php"],"function":"require_once"},{"function":"OC\\Route\\{closure}","class":"OC\\Route\\Route","type":"->","args":["*** sensitive parameters replaced 
***"]},{"file":"\/var\/www\/html\/nextcloud\/lib\/private\/Route\/Router.php","line":297,"function":"call_user_func","args":[{"__class__":"Closure"},{"_route":"files_ajax_download"}]},{"file":"\/var\/www\/html\/nextcloud\/lib\/base.php","line":975,"function":"match","class":"OC\\Route\\Router","type":"->","args":["\/apps\/files\/ajax\/download.php"]},{"file":"\/var\/www\/html\/nextcloud\/index.php","line":42,"function":"handleRequest","class":"OC","type":"::","args":[]}],"File":"\/var\/www\/html\/nextcloud\/lib\/private\/Streamer.php","Line":46,"CustomMessage":"--"},"userAgent":"**REPLACED**"}
I think the interesting line is:
Argument 2 passed to OC\\Streamer::__construct() must be of the type integer, float given, called in \/var\/www\/html\/nextcloud\/lib\/private\/legacy\/files.php on line 166"
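A possible reading of that error (my assumption, not confirmed by the developers): on a 32-bit PHP build, PHP_INT_MAX is 2^31 − 1, so a total download size above ~2 GiB cannot be represented as an integer and PHP silently promotes it to a float, which the Streamer constructor then rejects. The numbers from the trace fit:

```python
# The second argument passed to OC\Streamer in the trace above:
total_size = 6686298112            # bytes, roughly 6.2 GiB
php_int_max_32bit = 2**31 - 1      # largest signed integer on a 32-bit PHP build

# The total exceeds what a 32-bit integer can hold, so PHP would
# represent it as a float — matching the "integer, float given" error.
print(total_size > php_int_max_32bit)  # True
print(round(total_size / 2**30, 1))    # 6.2 (GiB)
```

That would also explain why the x64 machine handled 25 GB fine while the 32-bit ARM node failed at ±6 GB.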
If there is any sensitive data in the log, please edit it, mods.
If this is possible without zip, then please share your knowledge with the developers
A simple list of the files, and the server sends them one by one? With this method, the user can directly use a downloaded file without waiting for the transfer of all the files to finish. And it consumes half as much disk space for the user (with zip you need the space for the zip plus the space for the uncompressed files).
Imagine this: the folder a user wants to download has subfolders with files in them; how would you download those without zipping the folder?
How about using TAR?
It does not compress, and it packs everything into one file. It also handles files and/or folders.
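As a sketch (using Python’s tarfile here purely to illustrate the format, not Nextcloud code), an uncompressed TAR can be produced as a pure forward stream, and it preserves subfolder paths:

```python
import io
import tarfile

# Write an uncompressed TAR in stream mode ("w|"): entries are emitted
# one after another, so nothing forces the whole archive into memory.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w|") as tar:
    for name, payload in [("a.txt", b"hello"), ("sub/b.txt", b"world")]:
        info = tarfile.TarInfo(name)
        info.size = len(payload)
        tar.addfile(info, io.BytesIO(payload))

# Read it back as a stream too, subfolder path included.
buf.seek(0)
with tarfile.open(fileobj=buf, mode="r|") as tar:
    print([member.name for member in tar])  # ['a.txt', 'sub/b.txt']
```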
We can turn the question around:
What is more important: that the user can download their files regardless of the structure, or that the user cannot download their files at all?
This could be a problem on Windows client systems, since there is no default application to extract tar archives.
I understand your point, but if a folder structure exists, in my opinion it should then also be possible to download it.
Is there really no way to scale up the RAM? I mean, RAM is not so expensive nowadays, and obviously the need seems to be on the user side.