FTP files up to Amazon AWS server - nginx

I have not been able to find an answer to this question, although there have been some similar questions. I have an Amazon EC2 instance that I launched using a "Magento quick start" from the AWS Marketplace. The site is working: I can reach it via my domain and can also SSH in.
However, when I log in via FileZilla using my key and the username ubuntu, trying to FTP something into my var/www/magento folder gives me a permission denied error. The default owner/group is nginx:nginx.
If I do a
sudo chown -R ubuntu:nginx var/www/magento
I am then able to FTP files up to the server. However, when I open the URL in the browser, Chrome gives a "this site can't be reached" error. But if I
sudo chown -R nginx:nginx var/www/magento
I can see the site in the browser again, but I am back to not being able to upload anything.
I also tried doing
sudo adduser ubuntu nginx
I got a success message, but I am still not able to FTP; I get a permission denied error.
So what SSH command(s) would let me log in over SFTP as the ubuntu user and upload files (and even change file permissions) without breaking the site when its domain name is opened?
I believe I may just need to add the "ubuntu" user to the nginx group.
If so, what would that command be?
Thanks

Try sudo usermod -a -G nginx ubuntu. This adds your existing ubuntu user to the nginx group. Also, log out of any existing session and log back in so the new group membership takes effect.
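Group membership alone is often not enough: the directory must also grant write permission to its group. A minimal sketch of the mechanics, run on a throwaway directory so it works without sudo (on the real server the equivalent would be chmod -R g+w on /var/www/magento after chowning it nginx:nginx):

```shell
# Throwaway stand-in for /var/www/magento; no sudo needed here.
demo=$(mktemp -d)/magento
mkdir -p "$demo"

chmod 755 "$demo"               # rwxr-xr-x: only the owner may write
stat -c '%A' "$demo"            # prints drwxr-xr-x

chmod g+w "$demo"               # grant the owning group write access
stat -c '%A' "$demo"            # prints drwxrwxr-x: group members can now upload
```

After applying the same change on the server, a fresh SFTP login is still required so the ubuntu user's new nginx group membership is picked up.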

Related

Cannot delete files on AWS EC2 via FTP, permission denied

I run a WordPress site on AWS EC2 with LiteSpeed.
When I log in via FTP I cannot delete anything, neither plugin nor theme files. FileZilla shows an rm /path/to/file: permission denied error.
This is a permissions issue: you are trying to delete files while logged in as a user that does not own them. www-data is the web-server user; either ask your hosting provider to remove the files, or change the owner of the files yourself.
This command should work:
sudo chown -R www-data:www-data /var/www/html
Run this command over SSH and try again.
Regards
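Before changing ownership, it is worth confirming who actually owns the files. A quick sketch, demonstrated on a temporary file so it runs anywhere (on the server you would point stat at /var/www/html instead):

```shell
# Stand-in for /var/www/html; substitute the real docroot on the server.
docroot=$(mktemp -d)
touch "$docroot/index.php"

# %U:%G prints owner and group, e.g. www-data:www-data on the server.
stat -c '%U:%G' "$docroot/index.php"
```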

How do I create an FTP user in Amazon Lightsail to update WordPress plugins

I successfully migrated a WordPress site from BlueHost to AWS Lightsail. When I go to update the plugins, WordPress asks for FTP credentials (see the image).
By default, you can only connect to the Lightsail instance via an SSH certificate, which I have successfully done via Transit.
In your Lightsail firewall rules, make sure you allow access to TCP ports 21 and 1024-1048 from 127.0.0.1.
SSH into your Lightsail instance (use PuTTY on Windows unless you know how to edit files with vim).
Run the following commands to install vsftpd and open its config file:
sudo apt install vsftpd
sudo nano /etc/vsftpd.conf
uncomment these lines:
local_enable=YES
write_enable=YES
add these lines:
pasv_enable=YES
pasv_min_port=1024
pasv_max_port=1048
pasv_address=127.0.0.1
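Taken together, the relevant fragment of /etc/vsftpd.conf from the steps above would read:

```ini
# /etc/vsftpd.conf (excerpt)
local_enable=YES
write_enable=YES
pasv_enable=YES
pasv_min_port=1024
pasv_max_port=1048
pasv_address=127.0.0.1
```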
Press Ctrl+X, then Y, then Enter to save the changes to the file (this is why I suggested PuTTY).
Run this command to see which group owns the wp-content directory:
ls -l /home/bitnami/apps/wordpress/htdocs/
In my Lightsail instance, it was the "daemon" group.
Note: other articles suggest adding the user to the bitnami group instead, but in my experience this caused errors during updates, citing that it was not able to create directories.
Run the following to create a new user and assign it to this group so that it can write to the wp-content directory
(in the following lines, substitute your own username for ftpuser):
sudo /etc/init.d/vsftpd restart
sudo adduser ftpuser
sudo usermod -d /home/bitnami ftpuser
sudo usermod -a -G daemon ftpuser
sudo /etc/init.d/vsftpd restart
Now try your updates again and it should work: use 127.0.0.1 for the hostname and the new ftpuser credentials you just created.

WordPress: plugin updates not working

To be clear on some things, I have tried:
going into wp-config.php and defining FS_METHOD, FTP_BASE, FTP_CONTENT_DIR, FTP_PLUGIN_DIR, FTP_USER, FTP_PASS, FTP_HOST, and FTP_SSL
setting file permissions to 755 on wp-content, wp-content/uploads, wp-content/plugins
Things I do not have access to: cPanel, a file manager, the Ubuntu shell, or SSH credentials.
I have spoken to my web host (it is a shared host account), and they will not provide me info on SSH. The only backend I have access to is wordpress admin and FTP through FileZilla or WinSCP. The web host has declared this issue to be in my court and refuses to help me out (unless I want to be charged a hefty fee).
Now, the issue is updating plugins. I can activate and deactivate plugins, but I can't install, delete, or update them. Originally the error was "can't create directory", but then I changed define('FS_METHOD') from direct to ftpsockets (ftpext did not work at all).
NOW the issue is "Update Failed: Could not copy file. all-in-one-wp-migration/all-in-one-wp-migration.php" for the plugin All-in-one WP Migration.
Can anyone help me out or point out what I'm doing wrong?
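For reference, the FTP constants the question mentions are set in WordPress's wp-config.php and take roughly this shape (every value below is a hypothetical placeholder, not taken from the question):

```php
/* wp-config.php (excerpt) -- all values are placeholders */
define( 'FS_METHOD',       'ftpsockets' );
define( 'FTP_BASE',        '/path/to/wordpress/' );
define( 'FTP_CONTENT_DIR', '/path/to/wordpress/wp-content/' );
define( 'FTP_PLUGIN_DIR',  '/path/to/wordpress/wp-content/plugins/' );
define( 'FTP_USER',        'username' );
define( 'FTP_PASS',        'password' );
define( 'FTP_HOST',        'ftp.example.com' );
define( 'FTP_SSL',         false );
```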
Check your disk quota (the space assigned to your account); it looks like you may be over quota, which would cause the updates to fail.
As other posts indicate, the root cause is a permissions problem in /var/www/html/wordpress. In my case, I used the Microsoft tutorial https://learn.microsoft.com/en-us/azure/virtual-machines/linux/tutorial-lamp-stack to install LAMP on Azure on Ubuntu 18.x LTS, and set the app to use the SFTP plugin for updates and uploads per https://wordpress.org/plugins/ssh-sftp-updater-support/. Then I changed permissions as needed on the plugins, themes, upgrade, and uploads directories: group www-data (I used top to determine this, but other tools will do the trick), 775 on directories, and 664 on files. Of course, the user ID used to SFTP files had to be added to the www-data group on the system. I did NOT set permissions to 777, as some have suggested in other posts and blogs.
Your required group ownership and permissions may vary, so analyze accordingly.
And best of luck.
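The 775-directories / 664-files scheme described above can be applied recursively with find. A sketch on a throwaway tree so it runs without privileges (on the real server you would target the WordPress content directories and add sudo plus chgrp -R www-data):

```shell
# Build a small stand-in tree for the WordPress content directories.
root=$(mktemp -d)
mkdir -p "$root/plugins" "$root/themes"
touch "$root/plugins/hello.php"

# 775 on directories (group may write and traverse), 664 on files.
find "$root" -type d -exec chmod 775 {} +
find "$root" -type f -exec chmod 664 {} +

stat -c '%a' "$root/plugins"            # prints 775
stat -c '%a' "$root/plugins/hello.php"  # prints 664
```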
Get your web host to do this, or do it yourself if you have SSH access:
sudo usermod -aG www-data $USER
sudo chown -R www-data:www-data /var/www
sudo chmod -R 774 /var/www
Afterwards, you may want to revert to the default and remove group write access:
sudo chmod -R 755 /var/www
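The difference between the two modes above: 774 leaves the tree group-writable for uploads, while 755 takes group write away again. A throwaway-directory sketch of the round trip (no sudo needed on a temp dir):

```shell
# Stand-in for /var/www.
web=$(mktemp -d)

chmod -R 774 "$web"       # rwxrwxr--: owner and group may write
stat -c '%a' "$web"       # prints 774

chmod -R 755 "$web"       # rwxr-xr-x: back to the default, group read-only
stat -c '%a' "$web"       # prints 755
```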

WordPress nginx can't create Directories - Permissions correct

I know there are a bunch of posts all over the internet about WordPress permissions, but I am facing an issue I can't explain from the other posts. I am running DebOps WordPress on Ubuntu 16.04 with nginx.
Basically, updates from within WordPress are failing with a "Could not create directory" error. So I checked the permissions, and they are all correct (755 for directories, 644 for files).
Furthermore, I checked that nginx is actually running as the www-data user, which it is:
ps aux|grep nginx|grep -v grep
Shows that nginx is running as www-data.
To verify the permissions, I tried:
sudo -u www-data mkdir test
which worked and created the test directory.
Then some other posts made me think it has to do with an FTP configuration; most of them point to the vsftpd.conf file, but I don't have vsftpd installed (though I am still able to connect to the Ubuntu machine via SFTP, since SFTP runs over SSH rather than an FTP daemon).
Question: what else might cause this issue? Technically, WordPress has all the permissions it needs to create its directories.
OK, I found the problem:
nginx was indeed running as the www-data user, but that wasn't the issue. From the DebOps issues I learned that the correct owner of the WordPress directory is the 'wordpress' user, not www-data:
chown wordpress:wordpress /var/www/ -R
Now everything works well with the updates.

We weren't able to verify your property: www.website.com

While using Google Webmaster Tools to verify a newly uploaded website, I get this error when I use the HTML file upload verification method:
We weren't able to verify your property: www.website.com
My server is an EC2 Ubuntu 64-bit micro instance running Apache 2.
I moved the file to /var/www/html, and yet the verification failed. I then used chown to change the ownership of the file, and it worked after that:
sudo chown www-data:www-data google.html
