Drupal - The settings file is not writable

I'm new to Drupal 8. I installed it locally with XAMPP on my Mac, but at the Verify Requirements step I got this problem:
The Settings file is not writable.
The Drupal installer requires write permissions to ./sites/default/settings.php during the installation process. The webhosting issues documentation section offers help on this and other topics.
I searched a lot on the Internet but couldn't find an effective solution. Does anyone know how to fix this?

In a terminal, run this command, where /path/to/drupal/root/ is just that: the path to the folder where you installed Drupal.
chmod 644 /path/to/drupal/root/sites/default/settings.php
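Note that 644 makes the file writable only by its owner, so this works when the installer (i.e. the web server user) owns settings.php. If it doesn't, a common approach is to open the file up just for the installation and lock it down again afterwards - a sketch, reusing the path from above:
chmod 666 /path/to/drupal/root/sites/default/settings.php   # writable during installation
# run the Drupal installer, then restrict the file again:
chmod 444 /path/to/drupal/root/sites/default/settings.php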

Have you tried this?
Go to the path sites/default/ and execute the command below:
chmod -R 777 files
Also check this link: https://www.drupal.org/forum/support/installing-drupal/2018-02-13/solved-the-directory-sitesdefaultfiles-is-not-writable
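A world-writable files directory is a quick fix but is generally discouraged. A less permissive sketch, assuming Apache runs as www-data (the web server user name varies by system), is to give the web server ownership instead:
sudo chown -R www-data:www-data sites/default/files
sudo chmod -R 775 sites/default/files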

Related

Permissions and performance issues using Windows 11 + WSL2 (Ubuntu), Docker Desktop & WordPress via docker-compose local environment

I am working on a project handed over by a previous developer which utilises Docker Desktop and a docker-compose.yml file to bring up a WordPress project on my Windows PC. This worked okay on my last PC (Windows 10); I've now upgraded to a new PC running Windows 11.
The project worked fine with a non-WSL version of Docker, just using Hyper-V and docker-compose with some mounted volumes to create the containers and code up a bespoke WordPress theme (running on roots/sage 8).
Since moving to Windows 11, the project has been almost unusable due to massively slow page speed (30 seconds minimum to load just about anything).
I tried to solve this by switching to WSL2. I followed tutorials from Microsoft & Docker to get WSL2 set up with Ubuntu.
I did a totally clean install of:
WSL2
Ubuntu
Docker Desktop
VS Code
My understanding is that adding the project files directly into Ubuntu is the best way to go, as it should be a lot faster than trying to mount the files from Windows, so I set up SSH keys and cloned the repo into my /home/andy/ directory.
Running docker-compose up successfully fires up the website, but I then get hit with loads of issues I never encountered while running the project via Docker Desktop without WSL2.
WordPress or the web user seems to be lacking permissions to write to the wp-content directory
-- Can't install any plugins
-- Can't upload media
VS Code (launched via code . in the project directory) doesn't have permission to write anywhere in wp-content
FileZilla is unable to create any folders within wp-content, including in wp-content/uploads
I need to get this project working. I'm obviously not a Docker or Linux expert, I just need to get it fired up so I can continue coding with WordPress!
I've trawled through most pages of Google's search results for people having similar issues, and I've done a fresh install of the WSL2/Docker setup multiple times after trying different things, but nothing is working.
I've tried:
Adding chown -R www-data:www-data /var/www/html to docker-entrypoint.sh
-- This was to try and give the www-data user ownership of everything mounted into the /var/www/html directory, which initially appeared to help but broke other permissions, like FileZilla being able to write to wp-content
Ensured that all the directories/files in my project folder had the correct permissions for WordPress (755 for directories & 644 for files)
-- Had no effect on the issue
Attempted to set 777 recursively on the entire project directory in Ubuntu
-- Still had issues with FileZilla & VS Code, and for some reason, when I did docker-compose up the permissions on wp-content reset themselves from 777 to 755
Added user: $USER within the wordpress part of the docker-compose.yml file, and then ran USER=root docker-compose up (a sketch of what this looked like is shown after this list)
-- This appeared to give WordPress the permissions/access needed to write to wp-content, as I could now download and install plugins and upload media, but the entire backend of WordPress was very slow and I still couldn't save files with VS Code or download with FileZilla.
Attempted to take ownership of the /home/andy directory where my project files live via sudo chown -R andy /home/andy/
-- This allowed VS Code & FileZilla to both work as expected, saving files and downloading files from a remote server into Ubuntu via WSL2
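For reference, the user: override mentioned above looked roughly like this (a sketch only; the service name, image and volume mapping here are placeholders, not the project's real values):
services:
  wordpress:
    image: wordpress            # placeholder image/tag
    user: "${USER}"             # the override that was added
    volumes:
      - ./wp-content:/var/www/html/wp-content   # placeholder mount
It was then brought up with USER=root docker-compose up.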
With that hacky root-user solution plus chown on the user directory, I appeared to have got everything "working", but wp-admin was so ridiculously slow it was barely usable.
Another post I came across said that using .local to access the site was a bad idea, since that could cause slow performance with Docker. I then changed the website URL in my hosts file and did a find-and-replace on the DB/files to reflect the change. Now the backend of WordPress works fast, but there appear to be permissions issues: lots of plugins are complaining about not having the correct permissions, and I'm unable to download, install, or even deactivate existing plugins.
I'm basically at a total loss at this point as to how to get this working. If anyone has any suggestions I'd be very grateful.
Thanks!

Drupal - Drush installation on MAMP server

I have a Drupal project, and I am trying to install Drush for a MAMP server. I have run:
composer global require drush/drush:dev-master
in my project directory, but when I try to run:
drush status
I get:
command not found: drush
Follow the information listed at https://www.drupal.org/node/1674222.
At the very least, you definitely did not carry out these two steps correctly, which symlink the drush binary:
cd /usr/bin
ln -s /Users/myusername/drush/drush
Replace /Users/myusername/ with the name of the directory where you unpacked Drush.
You may need to quit and restart Terminal after completing these instructions before running any drush commands.
See http://youtu.be/TCg02d4am_Q for more details.
If you're still having problems, I suggest following the instructions at https://www.drupal.org/node/1674222 and reporting back, editing your question to reflect the new error you get, so we can actually help you figure out which step of the install you're not running successfully.
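Alternatively, since Drush was installed with composer global require, you can put Composer's global bin directory on your PATH instead of symlinking. A sketch - note that the directory may be ~/.composer/vendor/bin or ~/.config/composer/vendor/bin depending on your Composer setup:
export PATH="$HOME/.composer/vendor/bin:$PATH"
# add the line above to ~/.bash_profile (or ~/.zshrc) so it persists, then check:
drush status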

Access denied when installing Symfony on XAMPP?

I am new to Symfony and don't know how to install Symfony on my XAMPP server. Following the Symfony website, I try to download it with this command:
c:\> php -r "readfile('http://symfony.com/installer');" > symfony
Then "Access denied" appears in my cmd.
I am also a beginner with Symfony and had the same problem.
I think you also need to install Composer first to make it work. I downloaded Composer from https://getcomposer.org/download/ and then installed it. After that the Symfony installer command worked for me. Try it, and good luck.
I just had a similar problem: it was not allowing me to download the symfony file (and displaying "Access denied") because I had a folder named Symfony. Renaming the folder solved the problem.
You probably do not have write access to your c:\ directory - that is the default setting on Windows 7, 8, and later.
Create a folder (for example c:\symfony_project) and run the installer command from inside that folder, as shown below.
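A minimal sketch of that, using the example folder name above:
cd c:\symfony_project
php -r "readfile('http://symfony.com/installer');" > symfony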
For those who are still facing the problem, here is how I solved mine. First, open XAMPP as administrator by right-clicking it (make sure it's closed first). After opening it as admin, enable Apache and MySQL, then open the shell, head to htdocs, make sure Composer is installed there, and then paste the Symfony install command. I hope it works after that.

How to correctly install dokku - with or without sudo?

I'm learning dokku right now for simple web deployment. The official install instructions state this command:
wget -qO- https://raw.github.com/progrium/dokku/v0.3.12/bootstrap.sh | sudo DOKKU_TAG=v0.3.12 bash
I'm not a devops person or admin, but as far as I understand this line, it performs all bootstrapping and installation under the root account, thanks to sudo. So dokku will be checked out into a directory with root access rights, and all additional directories like /var/lib/dokku/ will also have root access rights.
The problem is that all articles across the internet about dokku instruct you to execute the dokku command, or do dokku-related actions, without sudo. For example, the instructions for this dokku database plugin, https://github.com/krisrang/dokku-mariadb, say to install it via:
cd /var/lib/dokku/plugins
git clone https://github.com/krisrang/dokku-mariadb mariadb
dokku plugins-install
This is not working, since /var/lib/dokku/plugins has root access rights and git clone fails with access denied. It's hard to be a non-admin nowadays, but maybe someone can hint at what I'm doing wrong? Do I need to install dokku some other way, or do all dokku-related tutorials across the internet assume that I'm executing them as root (which, by my limited admin knowledge, is highly not recommended for security reasons)?
You should run those three commands as root:
sudo su -
The dokku binary will run code as the dokku user even if you execute it as root, so it should be fine to run it as is. Once you are root, just run the install instructions listed in your question. Hope my answer helps! :)
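Putting that together with the commands from the question, the plugin install would look roughly like this (a sketch; dokku-mariadb may have additional requirements of its own):
sudo su -
cd /var/lib/dokku/plugins
git clone https://github.com/krisrang/dokku-mariadb mariadb
dokku plugins-install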
I also contacted them, and they mentioned:
In the future, we'll have a method to install plugins directly with a
dokku command
As far as I can tell, you need to run it as root. A traditional way to install a program without root privileges is to download the source and compile it, which can be done by running:
git clone https://github.com/progrium/dokku.git
make
make install
Dokku's makefile depends on apt-get, which requires root access to run.
I'm not familiar with dokku or dokku-mariadb, but I think the author of dokku-mariadb also assumes root access.
For people running into the question of whether it's fine to install through the root user (on freshly created VMs, as per the guide), try checking this GitHub issue:
https://github.com/dokku/dokku/issues/961
Since the commands related to dokku are prefixed with # rather than $, it means that it's not necessary to run them as a non-root user. It also makes writing sudo unnecessary (and from my experience counterproductive).

Error "could not delete" with Composer on Vagrant

I have a Vagrant box running Linux and I'm trying to install Symfony.
After the command composer create-project symfony/framework-standard-edition ./ "2.5.*" I get the error:
[RuntimeException]
Could not delete ./.git/objects/pack/tmp_idx_llwUKb:
If I try to run composer update on another project, I always get this kind of "Could not delete" error.
Any ideas?
Edit: For a simple sudo composer update -vvv on another project:
- Installing sonata-project/admin-bundle (dev-master 8a022aa)
Failed to download sonata-project/admin-bundle from source: Could not delete /vagrant/crm_neo/vendor/sonata-project/admin-bundle/.git/objects/pack/tmp_idx_hchQhc:
Now trying to download from dist
- Installing sonata-project/admin-bundle (dev-master 8a022aa)
Failed: [RuntimeException] Could not delete /vagrant/crm_neo/vendor/sonata-project/admin-bundle/.git/objects/pack/tmp_idx_hchQhc:
[RuntimeException]
Could not delete /vagrant/crm_neo/vendor/sonata-project/admin-bundle/.git/o
bjects/pack/tmp_idx_hchQhc:
Exception trace:
() at phar:///usr/local/bin/composer/src/Composer/Util/Filesystem.php:193
Composer\Util\Filesystem->unlink() at phar:///usr/local/bin/composer/src/Composer/Util/Filesystem.php:151
Composer\Util\Filesystem->removeDirectoryPhp() at phar:///usr/local/bin/composer/src/Composer/Util/Filesystem.php:129
Composer\Util\Filesystem->removeDirectory() at phar:///usr/local/bin/composer/src/Composer/Util/Filesystem.php:35
Composer\Util\Filesystem->remove() at phar:///usr/local/bin/composer/src/Composer/Util/Filesystem.php:80
Composer\Util\Filesystem->emptyDirectory() at phar:///usr/local/bin/composer/src/Composer/Downloader/FileDownloader.php:108
Composer\Downloader\FileDownloader->doDownload() at phar:///usr/local/bin/composer/src/Composer/Downloader/FileDownloader.php:89
Composer\Downloader\FileDownloader->download() at phar:///usr/local/bin/composer/src/Composer/Downloader/ArchiveDownloader.php:35
Composer\Downloader\ArchiveDownloader->download() at phar:///usr/local/bin/composer/src/Composer/Downloader/DownloadManager.php:201
Composer\Downloader\DownloadManager->download() at phar:///usr/local/bin/composer/src/Composer/Installer/LibraryInstaller.php:156
Composer\Installer\LibraryInstaller->installCode() at phar:///usr/local/bin/composer/src/Composer/Installer/LibraryInstaller.php:87
Composer\Installer\LibraryInstaller->install() at phar:///usr/local/bin/composer/src/Composer/Installer/InstallationManager.php:152
Composer\Installer\InstallationManager->install() at phar:///usr/local/bin/composer/src/Composer/Installer/InstallationManager.php:139
Composer\Installer\InstallationManager->execute() at phar:///usr/local/bin/composer/src/Composer/Installer.php:548
Composer\Installer->doInstall() at phar:///usr/local/bin/composer/src/Composer/Installer.php:217
Composer\Installer->run() at phar:///usr/local/bin/composer/src/Composer/Command/UpdateCommand.php:128
Composer\Command\UpdateCommand->execute() at phar:///usr/local/bin/composer/vendor/symfony/console/Symfony/Component/Console/Command/Command.php:252
Symfony\Component\Console\Command\Command->run() at phar:///usr/local/bin/composer/vendor/symfony/console/Symfony/Component/Console/Application.php:889
Symfony\Component\Console\Application->doRunCommand() at phar:///usr/local/bin/composer/vendor/symfony/console/Symfony/Component/Console/Application.php:193
Symfony\Component\Console\Application->doRun() at phar:///usr/local/bin/composer/src/Composer/Console/Application.php:135
Composer\Console\Application->doRun() at phar:///usr/local/bin/composer/vendor/symfony/console/Symfony/Component/Console/Application.php:124
Symfony\Component\Console\Application->run() at phar:///usr/local/bin/composer/src/Composer/Console/Application.php:84
Composer\Console\Application->run() at phar:///usr/local/bin/composer/bin/composer:43
require() at /usr/local/bin/composer:15
It happened to me once, and it turned out that I was hitting Composer's timeout.
You could take the following measures to gain some speed:
Increase composer process-timeout (default 300) (not really needed if the following settings will help you gain speed, but can't hurt)
Set dist as preferred install type.
Enable https protocol for github, which is faster.
~/.composer/config.json
{
    "config": {
        "process-timeout": 600,
        "preferred-install": "dist",
        "github-protocols": ["https"]
    }
}
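If you prefer not to edit the file by hand, the same settings can be applied with Composer's built-in config command (--global writes to the same global config file):
composer config --global process-timeout 600
composer config --global preferred-install dist
composer config --global github-protocols https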
If you still have problems after that, you can also clear composer's cache:
rm -rf ~/.composer/cache
I was trying to update project dependencies (using composer update) during a Laravel Framework upgrade exercise in my local Homestead environment (having run vagrant ssh to login as the default "vagrant" user) and none of the previous answers in this thread made any difference to the...
Could not delete /home/vagrant/projects/projectname/vendor/kylekatarnls/update-helper/src/UpdateHelper
...error message I repeatedly encountered.
The only thing that worked for me was to include a composer option as follows:
composer update --no-plugins
Plugins are used to alter or extend the functionality of Composer. The above command disables all installed plugins. Unfortunately, I'm not clear as to why this command worked for me, as I certainly haven't written any plugins myself. All I can conclude is that there was an erroneous Composer plugin installed that was causing this issue.
TL;DR Switch to Docker. It is the industry standard.
I came across this issue and spent quite some time doing research. I've tried every possible option to fix it but none of them worked for me. For me, the bug occurred on GNU/Linux host with Vagrant and VirtualBox provider.
It turns out it's a VirtualBox bug related to the file system layer and race conditions when creating/deleting files. It occurs only for VirtualBox shared folders, not for regular ones. The sad part is that it seems like it's not going to be fixed any time soon.
Some guys reported that they were able to solve the problem using the following tricks:
Downgrading to VirtualBox version 6.0.4.
Using nfs or rsync instead of shared folders.
Patching composer to add some pauses after certain operations.
Disabling plugin usage with --no-plugins option.
But all of this seemed dirty to me. I personally was able to use a workaround suggested on GitHub which is to configure composer to install packages from sources. That's a simple and kind of clean trick which should not have significant negative side effects on your workflow. Try putting the following config into your ~/.config/composer/config.json. Or instead you can edit your composer.json accordingly depending on your needs. Keep in mind that composer.json will override your global config.
{
    "config": {
        "preferred-install": "source"
    }
}
I just got the same issue.
The problem was with access to some local files. In my case the target directory was under "root", and I'm not the root user.
Solution
Change permissions/owner of your files/directory.
Redefine owner:
sudo chown myuser:myuser -R /path/to
Or maybe the group you are in lacks some permissions.
So, try to run:
sudo chmod g+rwX -R /path/to
Or you could run your command with sudo if that works for you (not recommended). :)
P.S. Never use 777. It's not secure.
UPD1
Another thing you may find useful, to address the root cause: wrap your composer binary so that it always runs on behalf of a certain user.
$ cat /usr/local/bin/composer
#!/bin/bash
# Run composer on behalf of the www-data user
set -o pipefail
set -o errexit
set -o nounset
#set -o xtrace
[[ "${DEBUG:-}" = "true" ]] && set -o xtrace || true
composer_debug=$([[ 'true' != "${COMPOSER_DEBUG:-}" ]] || echo '-vvv' )
sudo -u www-data -- /usr/bin/composer ${composer_debug:-} "$@"
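If you go this route, the wrapper needs to be executable, and the COMPOSER_DEBUG variable it reads toggles verbose output - a short sketch using the paths from the script above:
sudo chmod +x /usr/local/bin/composer
# run Composer as usual, or with verbose output:
COMPOSER_DEBUG=true composer install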
I had this problem when provisioning the machine, which was bootstrapped to run composer install. I simply exited the VM and ran composer install on the code on my host machine and it worked.
So, if you're facing this problem while running Composer inside the VM, just try running Composer from outside the VM.
Update: As pointed in the comments below, this can pose some problems with different versions of packages being installed owing to the difference in system configurations between the local and Vagrant environments, so exercise appropriate caution while trying this.
We're running into this issue as well. Several people seem to have it, and a fix has not been provided. For more information you can look into the GitHub issues of vagrant-winnfsd.
In my case, I just used the NFS folder type instead of the shared folders and it works:
folders:
    - map: ~/code/cs-cart-trial
      to: /home/code/cs-cart-trial
      type: "nfs"
Just run
sudo chmod -R 777 /folder/path
This will give you write access to the folder you are running composer in.
I know this is an old post but this works so I have to share it.
In my case I was trying composer update but I got
[RuntimeException]
Could not delete .../vendor/bin/php-parse:
Although I'm using the Laravel framework, this question was the first link on Google, so I decided to post an answer.
My solution was to change the ownership of vendor: sudo chown -R $USER:www-data vendor/ and
sudo chown -R $USER:www-data composer.json
Update: my host OS was Ubuntu 16.04.
I had the same issue with CakePHP 4.2.1.
Error:
Could not delete /var/www/vendor/cakephp/plugin-installer/src:
Solution:
Based on https://stackoverflow.com/a/63139337/1110760
After trying out several options mentioned above, for me this was the easiest way to solve it.
composer install --prefer-source
The --no-plugins argument worked as well, sort of: it's faster, but it skipped some packages, though my localhost seemed to work just fine.
On AWS I got this error while deploying a Yii framework project. There was a
/var/app/current/vendor/
folder; I deleted everything inside it, went back to my document root, and ran composer update - it fetched all the repos again.
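Put as commands, that amounts to roughly the following (assuming the document root is /var/app/current, as in the path above):
cd /var/app/current
rm -rf vendor/*
composer update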
In my case, removing the plugin and re-creating the box solved the issue.
For me it was caused by Composer's timeout. I checked my internet speed and found it had dropped to 0.7M, which is nearly unusable. After I reconnected the Wi-Fi and got my connection speed back to normal, the errors were gone.
This has something to do with the synchronization of folders between the host and guest OSes; the folder might simply be temporarily locked by your host machine.
The solution is simply to delete the offending .git folder from your host OS, or reboot the machine, and launch composer install again.
Ideally each OS has its own dependencies and different binaries, so you should isolate your /vendor folder from the rsync/Vagrant folder share, just as you would do with /node_modules on a Node.js project.
Another thing to check for: Composer needs to run in the context of a directory it has permissions to.
In my case I was trying to issue a create-project command from /var/www, aimed at /var/www/html. /var/www is owned by root; /var/www/html is owned by the same user I executed Composer as (www-data). I got the following error: Could not delete /var/www/html/:
Issued the same Composer command from within /var/www/html itself and it worked perfectly.
For me, it helped to install a (new) version of Composer via the command line from the download page https://getcomposer.org/download/. I can rule out file permissions, since I was root and had run chmod -R 0777, though I had a VirtualBox-mounted drive. Since the new version worked, it must have been either the version itself, or the fact that I was running the new version via php and the .phar while the original binary belonged to root:
php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');"
php -r "if (hash_file('sha384', 'composer-setup.php') === '48e3236262b34d30969dca3c37281b3b4bbe3221bda826ac6a9a62d6444cdb0dcd0615698a5cbe587c3f0fe57a54d8f5') { echo 'Installer verified'; } else { echo 'Installer corrupt'; unlink('composer-setup.php'); } echo PHP_EOL;"
php composer-setup.php
php -r "unlink('composer-setup.php');"
I solved the problem by creating a mount:
In /home/vagrant, create a folder named vendor,
then run the command: mount --bind /home/vagrant/vendor /path/to/source/vendor
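Note that a plain mount --bind does not survive a reboot. If you want the mount to persist, one option (a sketch using the same placeholder paths as above) is an /etc/fstab entry:
/home/vagrant/vendor  /path/to/source/vendor  none  bind  0  0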
It's a bit unrelated to the question, but in my case it happened with Docker. It was failing because Webpack was watching files and didn't allow other files to be deleted.
It worked when I turned Webpack off.
I had the same problem when running composer install:
- Installing aws/aws-sdk-php (3.218.3): Extracting archive
Install of aws/aws-sdk-php failed
In Filesystem.php line 330:
Could not delete /home/vagrant/code/my-project/vendor/composer/cefa44c2/aws-aws-sdk-php-a1bd217/src:
What I did was comment out
type: "nfs"
in my homestead.yaml
and run a fresh vagrant provision.
I'm using Oracle VirtualBox 6.1 on Windows 10.
Turn off Dropbox or any other file sync.
The best hack I found was to replace the unlink commands with the one below. I am running Ubuntu.
sudo nano +219 /usr/share/php/Composer/Util/Filesystem.php
exec("sudo rm -rf $path");
return true;
For Windows users
Wow, I can't believe how long it took me to realize this, and sadly it has happened multiple times, and I'm finally writing this note so that I and others can quickly recover next time.
Just use Windows Explorer to go delete the /vendor/whatever_project_name folder instead of trying to delete it from the Vagrant command line.
Then run composer update to reinstall the dependencies.
