I am a newbie with PHP and MySQL. I have written a hello.php script, which I am trying to copy into the /var/www directory (and will later want to open it through a web browser). The problem is that I am not allowed to save/write any files in /var/www despite being root. I tried implementing the steps in this question, but I get the following error when I run the third line
find /var/www/ -type f -exec chmod g+w '{}' ';'
chmod: changing permissions of `/var/www/index.html': Operation not permitted
I know a symlink is also an option, but I want to be able to write/copy files directly to the /var/www/ directory.
Any suggestions on what is going wrong?
It's a matter of Unix permissions. Gain root access, for example by typing
sudo su
[then type your password]
and try to do what you have to do
Do you already have a file in /var/www called hello.php with restrictive permissions on it? Maybe the system can't replace the file?
Then again, root access should supersede any user on the system.
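A quick way to check is to look at what is already there and who owns it (hello.php and /var/www are taken from the question; adjust as needed):
ls -ld /var/www
ls -l /var/www/hello.php
The output shows the owner, group and permission bits of the directory and the existing file.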
Have you tried applying permissions to the www folder?
If you can do this, try the following:
sudo chmod -R 777 /var/www
then do:
sudo cp hello.php /var/www
I only recommend doing this if you know 100% that it is OK to set permissions on the whole www folder. By the sounds of it, you are running your own server, as most live/shared hosting servers are set up so that the www folder is not in /var (instead it is in the home folder of the user).
Be VERY careful when doing anything with the sudo prefix, though; you can seriously damage your system if you do it wrong.
Are you in a development environment? If yes, you can do
chown -R user:group /var/www
so that you will be able to write there with your own user.
Execute the following command
sudo setfacl -R -m u:<user_name>:rwx /var/www
This will change the permissions of the /var/www directory so that you can upload, download and delete files or directories.
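You can verify the resulting ACL afterwards with getfacl, which ships alongside setfacl:
getfacl /var/www
This lists the owner, group, and any extended ACL entries, so you can confirm the rwx entry for your user was added.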
Encountered a similar problem today. Did not see my fix listed here, so I thought I'd share.
Root could not erase a file.
I did my research. Turns out there's something called an immutable bit.
# lsattr /path/file
----i-------- /path/file
#
When this bit is set, even root is prevented from modifying or removing the file.
To remove this I did:
# chattr -i /path/file
After that I could rm the file.
Conversely, it's a neat trick to know if you have something you want to protect from being deleted.
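For example, to protect a file (a sketch; the path is a placeholder):
# chattr +i /path/file
After that, even rm -f as root will fail until you clear the bit again with chattr -i.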
:)
sudo chown -R $USER:$USER /var/www
First off, this has nothing to do with PHP. This is a Unix permission issue. You need to log in as a superuser (sudo/su), type your password, and then try that command.
$ su
(type password )
# your command
$ sudo command
(type password)
It might also help if you actually specified the operating system you use.
sudo cp hello.php /var/www/
What output do you get?
If none of the above works, you might be dealing with a vfat filesystem. Use "df" to check.
See http://www.charlesmerriam.com/blog/2009/12/operation-not-permitted-and-the-fat-32-system/ for more details.
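For example, with GNU df the -T flag prints the filesystem type next to each mount point:
df -T /var/www
If the type column shows vfat, the filesystem simply has no Unix permission bits, which would explain the chmod errors.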
First of all, you need to log in as root, then go to the /etc directory and execute the commands given below.
[root@localhost ~]# cd /etc
[root@localhost etc]# vi sudoers
and enter this line at the end
kundan ALL=NOPASSWD: ALL
where kundan is the username. Then save it, try to transfer the file again, and add sudo as a prefix to the command you want to execute:
sudo cp hello.txt /home/rahul/program/
where rahul is the second user on the same server.
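A safer way to make the same edit is visudo, which opens sudoers in an editor and validates the syntax before saving, so a typo cannot lock you out of sudo:
visudo
Then append the same kundan ALL=NOPASSWD: ALL line at the end.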
You just have to write sudo instead of su.
Then just copy the PHP file to the /var/www/ directory.
Then go to the browser and open localhost/test.php, or whatever the .php filename is.
Run the following commands in the directory whose permissions you want to modify:
for example the directory: /var/www/html
sudo setfacl -m g:username:rwx .      # access ACL, applies to the directory itself
sudo setfacl -d -m g:username:rwx .   # default ACL, inherited by newly created files and directories
This will solve the problem.
Replace username with your username.
The problem is a privilege issue. Navigate to /var/www/, right-click in it and select 'Open as administrator', then continue your work.
I've just started to learn Linux Command Line. The setup I am on is AWS Lightsail bitnami Wordpress. I work with wordpress primarily.
I'm still confused about file permissions in Linux. Why do I have permissions denied when I sign in as the owner?
Whenever I have to ftp, overwrite, edit files and folders, I have to change the permissions settings for each affected folders and files manually via SSH.
More often than not, at the end of the day I lose track of which folders' and files' permissions I have edited and need to reset to default. I find this a chore and I believe there is a better way.
I wonder if there are lines of command that can
give me full access to all directories, folders, subfolders and files at once?
change the permissions for directories, folders, subfolders and files at once?
reset the permissions of all edited files to default/original all at once?
To check the permissions of the target folder:
sudo stat TARGETFOLDER
To change the permissions of the target folder:
sudo chmod 777 TARGETFOLDER
Bitnami Engineer here,
We configure the permissions of the WordPress files by setting bitnami as the user owner and daemon as the group owner of the files. This configuration lets you edit the files using the bitnami user, while the web server can use the daemon group to do the same.
However, if you make changes to the application using the web interface (install plugins or themes), those new files are owned by daemon:daemon (the Apache and PHP-FPM services use that user and group, so they create the files with that ownership) and you won't be able to edit them unless you use the command line and sudo. In that case, you can run the following commands to be able to edit those files using the bitnami user:
sudo chown -R bitnami:daemon /opt/bitnami/apps/wordpress/htdocs
sudo find /opt/bitnami/apps/wordpress/htdocs -type d -exec chmod 775 {} \;
sudo find /opt/bitnami/apps/wordpress/htdocs -type f -exec chmod 664 {} \;
sudo chmod 640 /opt/bitnami/apps/wordpress/htdocs/wp-config.php
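To spot anything the chown may have missed, one option (a sketch using find's -user test) is:
sudo find /opt/bitnami/apps/wordpress/htdocs ! -user bitnami -ls
This lists every file not owned by the bitnami user, which should be nothing after the commands above.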
You can learn more about this here
https://www.youtube.com/watch?list=PLGgVZHi3XQNn4x0DU7Qj1r_inej3xEUda&v=nKfle7O0vN8&feature=emb_title
For 1 and 2, you can try chmod with the -R option.
For 3, I don't think that is possible; you would have to restore from a backup. Maybe that helps.
First of all, I read a lot of topics on Stack Overflow and other sites, but I can't find the answer.
I am trying to set up my own WordPress hosting for my WordPress sites. I have a server from DigitalOcean on which I installed nginx, HHVM and MySQL. I can install WordPress without errors. But when I try to install a plugin from the WordPress dashboard, WordPress asks for my FTP credentials. How can I remove that and automatically install plugins without FTP credentials?
I know I can modify the wp-config file by adding define('FS_METHOD','direct'), but then I get a 'Could not create directory.' error, and it is not the right solution anyway, because I would have to do it again for every new WordPress install.
Writing FTP credentials to the wp-config file is likewise not the right solution, because I would have to repeat it on every site.
The last thing I found is this command:
sudo chown -R www-data:www-data wordpress-foldername
it works, but again I have to do that for every WordPress site. Can I automate this? I don't want to do this for every new WordPress install. How can I configure it once so it works all the time?
I also use DigitalOcean for hosting personal and development projects.
Whenever I have a new project I use the command line to install WordPress and configure the install, so I'm not sure how you did it.
Here's the full code, using command line
browse to the html directory
cd /var/www/sites-folder-directory
download wordpress package and extract
wget http://wordpress.org/latest.tar.gz && tar xfz latest.tar.gz
move the wordpress files to the root folder, then delete the wordpress folder and the tar file
mv wordpress/* ./
rmdir ./wordpress/ && rm -f latest.tar.gz
Assign these new files and folders to the default nginx user and group. You should check your default nginx user and group in /etc/nginx/nginx.conf:
chown -R nginx:nginx /var/www/sites-folder-directory
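If you are unsure which user and group nginx runs as, you can check the user directive (assuming the stock config path):
grep '^user' /etc/nginx/nginx.conf
A line like user nginx; means files should be owned by nginx:nginx as above; on Debian/Ubuntu it is often www-data instead.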
Not really necessary, but if you want to change folder and file permissions you can use the commands below:
# Change Folders Permission
sudo find <directory> -type d -exec chmod 755 {} \;
# Change Files Permission
sudo find <directory> -type f -exec chmod 644 {} \;
If you have correct folder and file permissions, and the current directory is assigned to the default user and group, it won't ask for FTP credentials.
If you want to do it for every website that is already installed, you have to change the file owner inside /var/www to www-data.
So your commands should be like this:
$ cd /var/www/
$ sudo chown -R www-data:www-data *
this will affect all of your websites
I have installed nginx on Ubuntu 12.04. However, nginx does not seem to follow symlinks. I understand that there is a config change required for this but I am not able to find where to make the change. Any help appreciated.
In my case nginx was already configured to follow symbolic links, but the issue was that the nginx user could not access my home files, and therefore the symbolic link to my home directory was not working.
Example
Suppose we have symbolic link /usr/share/nginx/www/mylink -> /home/u/html
cd /usr/share/nginx/www
mkdir -p /home/u/html
sudo ln -sv /home/u/html mylink # creates mylink -> /home/u/html
Give permissions
Give the read and execute permissions using chmod and find:
chmod +rx /home/u
chmod +rw /home/u/html
find /home/u/html/php -type d -exec chmod +rx {} +
find /home/u/html/php -type d -exec chmod +w {} + # optional
Notes:
The permission x is named execute, but when applied to a directory it allows traversing the directory tree (see Unix modes).
The command find ... -exec chmod ... recursively changes the permissions. We could also use chmod -R +rx /home/myuser/html, but that command would also give the execute permission to all regular files, and we do not want that. The option -type d applies chmod only to directories.
The last, optional command gives write permission, in case your PHP scripts need to write data. For security reasons, try to limit write permission to only the directories that require it.
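To verify the whole chain is readable by the web server, you can list the directory as the nginx user (assuming the worker runs as nginx; substitute www-data on Debian/Ubuntu):
sudo -u nginx ls /home/u/html
If this prints the directory contents without a permission error, nginx can follow the symlink.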
Test
No need to restart nginx, just press Ctrl+F5 in your browser.
Caution: It is not recommended to create symbolic links pointing to your home directory because a mistake on read/write access or a wrong symbolic link may expose your digital data...
Reference: Arch wiki on nginx
Have a look at the following config option from nginx docs:
Syntax:  disable_symlinks off;
         disable_symlinks on | if_not_owner [from=part];
Default: disable_symlinks off;
Context: http, server, location
This directive appeared in version 1.1.15.
If olibre's answer doesn't help, edit the file /etc/nginx/sites-available/default and add this line where you've specified your server root directory:
autoindex on;
Save the file and restart the server:
/etc/init.d/nginx restart
Today I installed testlink. After I selected 'New installation' and chose the 'I agree' option, it failed at the second step. The failure messages are as follows:
Read/write permissions
For security reason we suggest that directories tagged with [S] on following messages, will be made UNREACHEABLE from browser
Checking if C:\xampp\htdocs\testlink\gui\templates_c directory exists OK
Checking if C:\xampp\htdocs\testlink\gui\templates_c directory is writable (by user used to run webserver process) OK
Checking if /var/testlink/logs/ directory exists [S] Failed!
Checking if /var/testlink/upload_area/ directory exists [S] Failed!
So, can anyone give me a hand? Many thanks!
In C:\xampp\htdocs\testlink\config.inc.php file, change
$g_repositoryPath = 'C:\xampp\htdocs\testlink\upload_area';
$tlCfg->log_path = 'C:\xampp\htdocs\testlink\logs';
Worked for me. Make sure you don't have the slash at the end;
i.e., make sure that it is NOT:
$g_repositoryPath = 'C:\xampp\htdocs\testlink\upload_area\';
$tlCfg->log_path = 'C:\xampp\htdocs\testlink\logs\';
If you installed the XAMPP or testlink in another directories, change the paths above accordingly.
Go to config.inc.php and edit the log directory path ($tlCfg->log_path) to C:\xampp\testlink\logs and the upload directory path ($g_repositoryPath) to C:\xampp\testlink\upload_area.
In some cases, you would do it like this:
Go to C:\xampp\htdocs\testlink\config.inc.php
and edit the log directory path ($tlCfg->log_path) to C:\xampp\htdocs\testlink\logs
and the upload directory path ($g_repositoryPath) to C:\xampp\htdocs\testlink\upload_area.
Then you have:
$g_repositoryPath = 'C:\xampp\htdocs\testlink\upload_area';
$tlCfg->log_path = 'C:\xampp\htdocs\testlink\logs';
I had the paths set correctly, and the user, group and access rights were set correctly too, yet I still could not get rid of the issue. It took me very long to find the root cause: the httpd daemon did not have access to the files in question due to SELinux policies, so simple chown/chmod (user and group access) would not help. For testlink 1.16 I resolved it by re-installing as a sudo user, but on upgrade the issue arose again even with a sudo user.
I resolved the issue by executing the following commands; I hope this helps. (Note: you might have to adjust the paths to run them successfully.)
$ chcon -t httpd_sys_content_rw_t "<path_to_testlink_folder>/gui/templates_c/"
$ chcon -t httpd_sys_content_rw_t "<path_to_testlink_folder>/upload_area/"
$ chcon -t httpd_sys_content_rw_t "<path_to_testlink_folder>/logs"
$ semanage fcontext -a -t httpd_sys_content_rw_t "<path_to_testlink_folder>(/.*)?"
$ restorecon -R -v <path_to_testlink_folder>
Ubuntu 12.04 - all you have to do is chmod 777 these directories, and the Fails will become Pass.
~$ cd /var/www/testlink
~$ sudo chmod 777 ./gui/templates_c/
~$ sudo chmod 777 ./upload_area/
~$ sudo chmod 777 ./logs/
Whatever the instructions say is total BS. Making these directories unreachable from the browser is optional, and that created the confusion. If you chmod 777 them, your Fails will turn into Pass and you'll be able to proceed to step 3 of your testlink installation. Tested with testlink version 1.9.5.
For Mac OS users, try this in version 1.9.19:
Double-check your folder name.
In config.inc.php file:
$tlCfg->log_path = TL_ABS_PATH . 'logs' . DIRECTORY_SEPARATOR;
$g_repositoryPath= TL_ABS_PATH . 'upload_area' . DIRECTORY_SEPARATOR;
If after this you still get a read/write permission failure:
Go to testlink -> logs / upload_area -> press Command + I -> under Permissions, enable Read & Write for everyone.
On Linux, ensure that the paths
$tlCfg->log_path
$g_repositoryPath
are
/var/www/html/testlink/logs/
/var/www/html/testlink/upload_area/
Valid for Ubuntu 16.04 LTS; add permissions.
Change:
$g_repositoryPath = '/var/www/html/testlink/upload_area'; // linux
$tlCfg->log_path = '/var/www/html/testlink/logs';
~$ cd /var/www/testlink
~$ sudo chmod 777 ./gui/templates_c/
~$ sudo chmod 777 ./upload_area/
~$ sudo chmod 777 ./logs/
In CentOS, go to /var/www/html/testlink-code-1.9.16 and edit the file custom_config.inc.php, replacing these two lines
// $tlCfg->log_path = '/var/testlink-ga-testlink-code/logs/'; /* unix example */
// $g_repositoryPath = '/var/testlink-ga-testlink-code/upload_area/'; /* unix example */
with
$tlCfg->log_path = '/var/www/html/testlink-code-1.9.16/logs/';
$g_repositoryPath = '/var/www/html/testlink-code-1.9.16/upload_area/';
Make sure you have disabled SELinux. If not, edit the file /etc/sysconfig/selinux, change the SELINUX variable to disabled, and reboot the machine. These errors should then be gone.
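If you want to confirm SELinux really is the culprit before disabling it permanently, a less drastic first step is:
getenforce
sudo setenforce 0
getenforce prints the current mode (Enforcing/Permissive/Disabled), and setenforce 0 switches to permissive mode until the next reboot, so you can retry the installer before committing to the change.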
On Ubuntu 18.04, you will need to run
apt-get remove apparmor
in order to complete the installation.
To solve the problem :
Checking if /var/www/html/testlink-1.9.16/gui/templates_c directory is writable (by user used to run webserver process) on Centos 7.
Disable SELinux, and then restart your system.
You should no longer have this error message.
I'm hosting a WordPress site on EC2 and I'm trying to update my theme through the admin screen. It's asking me for a hostname and an FTP username and password. Is ec2-xxx.compute-1.amazonaws.com:22 my hostname? I tried that, along with ec2-user and root for my FTP username, but no luck. What am I doing wrong?
Skip the FTP info altogether and just change the permission of the directory structure where Wordpress is installed.
VIA SSH
sudo chown -R apache:apache path/to/wordpress
sudo makes sure you execute as the root user
chown will change the owner of the directory
-R will make it recursive, so it changes all files and directories within
apache:apache is user:group
And then the path to wordpress. Could be /var/www/html/sitename.com or if you navigate to the folder where Wordpress is installed, you can use a period (.) to tell it to change the current directory.
This will make it so that you can't copy files via SFTP though, so it is good to change at least the themes directory back to the ec2-user:ec2-user user and group.
So this changes back to your ssh/sftp user:
sudo chown -R ec2-user:ec2-user path/to/wordpress
You can assign the folders to the ftp user and the apache group and then make them group writable as well. This will allow you to ftp into the directory, and allow everything to be auto updated within Wordpress.
# Set wp-content into the apache group and then make files group-writable
sudo chgrp -R apache wp-content
sudo chmod -R g+w wp-content
# This makes new files created in wp-content and all of its sub-directories group-writable.
sudo chmod g+s wp-content
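You can confirm the setgid bit took effect with ls; the s in the group execute position is what makes new files inherit the apache group:
ls -ld wp-content
Expected output looks something like drwxrwsr-x with apache in the group column.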
Then add this to wp-config.php to force WordPress to update when you have only applied this to wp-content:
define('FS_METHOD', 'direct');
You can also apply to the whole Wordpress install to auto update Wordpress and not just plugins/themes. If you do this, I would recommend putting your wp-config.php file one directory above your Wordpress install though, so you can lock it down separately.
EDIT: Whenever I am having permission troubles on EC2, I go to the site root directory and paste these lines in. I apply them to the whole WordPress install these days:
sudo find . -type d -exec chmod 0755 {} \;
sudo find . -type f -exec chmod 0644 {} \;
sudo chown -R ec2-user:apache .
sudo chmod -R g+w .
sudo chmod g+s .
I use something similar on my Mac as well.
In your wp-config.php under directives add this line:
define('FS_METHOD', 'direct');
You can simply solve this problem by doing this via ssh:
sudo chown -R apache path/to/wordpress
then
sudo chmod -R 755 path/to/wordpress
Your hostname would be ec2-107-20-192-98.compute-1.amazonaws.com.
Your username will be the username you use to SFTP to the instance normally - ec2-user for some instance types, ubuntu for Ubuntu AMIs, etc. EC2 generally doesn't use passwords, preferring SSH keys, so you'll have to set a password for your account by running passwd on the command line.
Try adding FTP credentials to wp-config.php: http://codex.wordpress.org/Editing_wp-config.php and http://codex.wordpress.org/Editing_wp-config.php#WordPress_Upgrade_Constants
That should make WP admin stop asking for FTP details. But depending on how you've set up permissions, you may still have to go to the command line to edit files like wp-config.php. And you may not have sufficient permissions to upload a theme and for WP to unzip it.
As per other answers, I use SFTP with a server of ec2-xx-xxx-xx-xx.compute-1.amazonaws.com username of ec2-user
ec2-107-20-192-98.compute-1.amazonaws.com:22 represents both the hostname and the ssh port. (SSH is normally on port 22, though it can run on any port.)
Try just ec2-107-20-192-98.compute-1.amazonaws.com in the hostname field.
I'm still skeptical of a webpage asking for a username and password. Seems a bit silly to me, since you should just use SFTP to directly upload whatever content you want using your SSH identity key instead of a password.
You could simply use 127.0.0.1 as the hostname and check FTP in the WordPress FTP settings.
To summarize what has been said:
the user is the same one you actually use for SSH/SFTP
the password needs to be set/updated by logging in via SSH and typing
sudo passwd your-user-name