Hey all! New user here, wondering what I can do to safely allow .htaccess files without zipping the folder up… I look forward to any replies! Thanks all!
Nextcloud version : Latest, downloaded today
Operating system and version (eg, Ubuntu 17.04): 18.04 LEMP
Apache or nginx version (eg, Apache 2.4.25): nginx/1.14.0
PHP version (eg, 7.1): 7.1
The issue you are facing: I get an error every time I try to upload old websites to my cloud… It will not sync .htaccess files.
I’ve looked online and the only results I can find are from 2017, saying to zip it up or deal with it. But it’s 2019 now; isn’t there a way I can upload these files like I can in ownCloud? I’m a web developer, so I put all my client work on my cloud as a backup.
Note: this is a fresh install today with Ubuntu 18.04 Digital Ocean LEMP stack.
Is this the first time you’ve seen this error? (Y/N): Y
Steps to replicate it:
- Syncing .htaccess files creates an error seen here
The output of your Nextcloud log in Admin > Logging:
The output of your config.php file in /path/to/nextcloud (make sure you remove any identifiable information!): Unchanged from install.
The output of your Apache/nginx/system log in
I’m pretty much still a newbie, so I don’t know if this is relevant to your issue, but just yesterday I noticed this line in the Server Admin Manual:
Blacklist a specific file or files and disallow the upload of files with this name. .htaccess is blocked by default.
You can search for that text within the manual and it might help to point you in the right direction (seems that you can delete .htaccess from the array associated with the “blacklisted_files” configuration within config.php).
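If I’m reading the manual right, the change would look something like this in config/config.php (treat this as a sketch, not official advice; as far as I know the built-in default is array('.htaccess')):

```php
// config/config.php — override the default blacklist (default: array('.htaccess'))
'blacklisted_files' => array(),
```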
Look: since Nextcloud is basically a web server, and permissions are organised via .htaccess files in the folder structure, it cannot allow .htaccess files to be uploaded into those same folders.
Do you see the problem here?
Please just don’t do it, or try it only at your own risk, for the sake of your server’s security and the stability of your system.
Wouldn’t it make sense to simply encode the filename?
for filenames that match $filesystem_conflict||potential_security_risk ||other_reason_here, perform the following gsub when uploading and storing…
Hell, a general-purpose file-mangler/transmogrifier pipeline at the client <-> disk layer, as well as the server <-> disk layer, might be the better way to go, which I imagine is already in place for E2EE… It would seem to me that abstracting the manipulation in question into a pipeline which handles E2EE could also potentially perform other needful filename mangling, permissions alteration, checksumming/fingerprinting/deduping, and content handling (transcoding / compressing / obfuscating / nerfing / etc.) without other parts of the system needing to care much. shrug Then again, I am sorta crazy… this might be a horrible idea.
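To make the hand-waving concrete, here is a tiny sketch of the kind of name mangling I mean. Nothing like this exists in Nextcloud; the mangle/unmangle names and the %2E-style encoding are entirely made up for illustration:

```shell
# Hypothetical mangling step: encode blacklisted names on the way to disk,
# decode them on the way back out. Purely a sketch of the idea.
BLACKLIST='^\.htaccess$'

mangle() {    # original name -> safe on-disk name
  if printf '%s' "$1" | grep -q "$BLACKLIST"; then
    printf '%%2E%s' "${1#.}"        # ".htaccess" -> "%2Ehtaccess"
  else
    printf '%s' "$1"
  fi
}

unmangle() {  # on-disk name -> original name
  case "$1" in
    %2E*) printf '.%s' "${1#%2E}" ;;
    *)    printf '%s' "$1" ;;
  esac
}
```

Anything outside the blacklist passes through untouched, so the pipeline is invisible for normal files.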
Having written a web server or two, I do understand the problems that come with allowing random .htaccess files to be placed where they will be interpreted by Apache (or any other web server). However, these .htaccess files should not be visible to Apache. After all, it is NextCloud that is serving them, not Apache. (If NextCloud is interpreting them, that’s another story and a questionable design decision.)
I suspect this restriction is an anachronism that goes back to the early days of OwnCloud and is carried forward because ensuring introduction of .htaccess files does not cause a security problem is a significant challenge.
I will have a look at the code base. In the meantime, I guess I’ll stick with Dropbox and Resilio Sync for my Devel folder.
Oh, BTW, there are no .htaccess files under any of the user directories; so, I doubt they are used for access control.
Here’s hoping this limitation is removed soon.
Okay. I have had a chance to do a bit more research on this. It appears that the Sabre DAV code does not allow access to .htaccess files, and this restriction is inherited by Nextcloud. Since I still want to use Nextcloud to synchronize web development directories, I wrote a bash script to help work around this limitation. It is neither a perfect nor an automatic solution, but it will help if you want to synchronize .htaccess files.
Using the script requires these steps:
- On the machine with the .htaccess files, change to the directory containing the .htaccess files and run the script. This will create a .htaccess_xxxcloud_workaround file for each .htaccess file in the directory and its child directories.
- Wait for Nextcloud to synchronize the .htaccess_xxxcloud_workaround files.
- On each machine without the .htaccess files, run the script. This will create a .htaccess file for each .htaccess_xxxcloud_workaround file.
- The .htaccess_xxxcloud_workaround files are simply hard links to the corresponding .htaccess files. This ensures the content and metadata are the same for both files.
- The hard links also allow .htaccess file updates to synchronize automatically without rerunning the script.
- However, if .htaccess files are deleted and recreated instead of updated, the hard links are broken and will need to be recreated. Rerun the script with the -r option to remove orphaned .htaccess_xxxcloud_workaround files and recreate the links.
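The steps above can be sketched roughly like this. This is an illustrative reconstruction, not the exact script; the .htaccess_xxxcloud_workaround suffix is taken from the steps above, and the function names are my own:

```shell
# Workaround sketch: Nextcloud refuses to sync files named .htaccess, but it
# will happily sync hard links to them under another name.
SUFFIX="_xxxcloud_workaround"

# On the machine that has the .htaccess files: create a synced twin for each.
link_up() {
  find . -name '.htaccess' | while read -r f; do
    [ -e "$f$SUFFIX" ] || ln "$f" "$f$SUFFIX"   # hard link: same content/metadata
  done
}

# On a machine that only received the synced twins: recreate the originals.
link_down() {
  find . -name ".htaccess$SUFFIX" | while read -r f; do
    orig="${f%$SUFFIX}"
    [ -e "$orig" ] || ln "$f" "$orig"
  done
}

# The -r mode from the steps above: drop orphaned twins and repair links
# broken when a .htaccess was deleted and recreated (new inode).
relink() {
  find . -name ".htaccess$SUFFIX" | while read -r f; do
    orig="${f%$SUFFIX}"
    if [ ! -e "$orig" ]; then
      rm -f "$f"                       # orphaned twin: original is gone
    elif ! [ "$f" -ef "$orig" ]; then
      rm -f "$f" && ln "$orig" "$f"    # original was recreated: relink it
    fi
  done
}
```

Run link_up on the source machine, link_down on the receiving machines, and relink for the -r cleanup pass.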
Hi, .htaccess files are blacklisted by default (search for ‘blacklisted_files’ parameter in doc).
But this can be overridden in your config/config.php file by adding
'blacklisted_files' => array(),
This empty array will allow .htaccess files to be synced!
As stated in the doc, WARNING: USE THIS ONLY IF YOU KNOW WHAT YOU ARE DOING
I think you can add your own other blacklist rules in that array as it will override the global setting.
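For example (the extra entries here are purely illustrative; as far as I know only .htaccess is blacklisted by default):

```php
// config/config.php — allow .htaccess but keep blocking other names of your choosing
'blacklisted_files' => array('Thumbs.db', 'desktop.ini'),
```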
Apache by default prevents access to those files from the web (from my debian-buster apache2.conf):
# The following lines prevent .htaccess and .htpasswd files from being
# viewed by Web clients.
<FilesMatch "^\.ht">
    Require all denied
</FilesMatch>
This is an important and sensible security setting.
In order to avoid using those files you could (theoretically) put all the settings into Apache config files, but this requires a lot of work and is IMHO contrary to using Nextcloud in a productive way.
You could also access those files via the shell, FTP, or some other protocol.
Got it. I run several Apache servers and completely understand the situation. Still, from a user perspective the web server and its limitations should be invisible. When using the Nextcloud client, many users probably do not even realize a web server is involved (and they shouldn’t have to).
I am currently using many synchronization and sharing tools (Resilio Sync, rsync, Dropbox, etc.), and I was really hoping to get it down to a single solution for synchronization and sharing with Nextcloud. Unfortunately, between this .htaccess limitation and the inability to sync Unix/Linux execute permissions, I must continue to use several solutions (Nextcloud and Syncthing). Fortunately, all the users except me are served well by Nextcloud alone.
I’ve got the same problem. I’m trying to sync folders where I keep some websites I made; the .htaccess files are not synced, and the client shows an error. Today, when I opened the client and saw many messages with this error, it even crashed (closed without any reason).
Since Nextcloud is a layer between the web server and clients, and has its own structure for stored data, it would be simple to just change the file name on the fly (if it is .htaccess, save it as .htaccess-dangerousfilenextclouddetected on the server).
But what I’ve noticed is that Nextcloud development is very slow. I believe there is not enough coding power in this project. The Android app is weak: it does not even allow you to upload existing files, and it crashes even when creating a new directory. The Nextcloud server part is bloated like hell, and while optimizations are needed because it is quite slow, they just keep adding fancy new things, looks, etc., forgetting the main purpose of the project. The Windows client is a Qt5 hybrid (adding a library to show one GUI screen is like using a rocket to travel to a city 5 kilometres away) that is not even stable. A good example is the Virtual Drive feature that was introduced a year ago; there is still no working official client with it.
So I believe even a simple filename change is too hard for Nextcloud to implement.
If your coding expertise is only half as good as your ranting capabilities, I would suggest you look into the codebase yourself and play the pull-request game: https://github.com/nextcloud
I know everybody has a bad day sometimes and needs to put their energy somewhere; I’ve also been guilty of this on one or another occasion. But insulting the awesome people developing here (believe me, they freaking ARE awesome) is a bit much.
Indeed. There could be more contributors. If you have an idea how to improve something: patches are welcome.
Probably. I was not able to find any enhancement or bug ticket at nextcloud/server (I searched for “.htaccess blacklisted”). Hooks are in place: you can register (even from an app) for write and rename events and manipulate the final path. That needs some testing, but it should be possible.
Also, if you are using nginx, if .htaccess is disabled, or if the data directory is not within the document root, changing blacklisted_files (as suggested earlier) should work for you.
I guess it’s a good default to disable them. If your setup will work with those files, you can enable them. That should explain why no one has built a rename-blacklisted-files feature.
@Ascendancer Thanks, I’ve been on GitHub for a while; I’m getting used to such ranting people.
I have: sell a good-quality client. There are developers who sell cloud storage clients that work as they should. I’m 100% sure that if the Nextcloud clients for Windows/Mac and Android had some low price, it would be possible to make money and hire some devs. There are more than 500,000 installations of the Android client. If it cost $1, I’m sure 90% of people would pay for it; then you would have $450,000 in cash to spend on devs who will make a good-quality product.
You might even sell it for $5 or $10 to make it even better. The same goes for the Windows and Mac clients.
There is nothing wrong with wanting people to pay for a good product.
Can anybody point to a source on what the real danger of allowing .htaccess uploads is?
The only lead I’ve found is that the restriction lives in the WebDAV code being used. Is this a security risk or simply best practice? Might it be relevant only if you access the files via WebDAV?
I’d like to know this too.
Is there any risk at all if the /data dir is outside the web root (as is the recommended setup)?
I can’t have missing files in projects.
The danger is that you’re adding a file that’s interpreted on the server, and not just stored as is.
If it contains rewriting or redirection rules you might end up messing up the file system, or locking yourself out of your own files, etc.
I suggest this solution to the problem at hand:
Simply change the AccessFileName setting in Apache’s config file to some long random string of characters, and use that in place of .htaccess wherever needed within the NextCloud universe.
This allows users to upload .htaccess files without the web server treating them in any special way.
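Concretely, that would be a one-line change in apache2.conf (the random name below is only an example; pick your own):

```apache
# apache2.conf — use an unguessable name so uploaded .htaccess files
# are treated as ordinary data by the web server
AccessFileName .htaccess-k7f3q9x2
```

Note that the Require all denied block quoted above would then no longer protect the renamed file, so that FilesMatch pattern would need adjusting too.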