How to edit the .htaccess file when it is hidden in ftp - drupal

I usually cannot see the .htaccess file when I log in to remote servers with FTP access, because it is hidden.
Since I don't have shell access, I usually perform the following steps to edit the file:
I change the settings on my Mac (from Terminal) to show invisible files (see the commands after this list)
I open the .htaccess file of a standard Drupal installation and edit it
I upload it to the remote server and overwrite the existing one
I disable hidden files on my Mac again
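For reference, the Finder toggle used in the first and last steps usually comes down to these Terminal commands on recent versions of macOS (a sketch; older releases may expect TRUE/FALSE instead of -bool values):
defaults write com.apple.finder AppleShowAllFiles -bool true
killall Finder
Running the same commands with -bool false hides the files again when you are done.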
I was wondering if there is a faster solution.
Thanks

I often have a separate file in my Drupal root called production.htaccess or something along those lines. Not only does this expose the file in Finder without revealing every single .DS_Store on my system, it also allows me to set separate .htaccess directives for different environments. Then, I just rename production.htaccess to .htaccess after I upload it to the server.
More often than not, the two .htaccess files are identical, but even in that case, I still use this method for the sake of convenience.
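If your FTP client has a command-line counterpart, the upload-and-rename step can be collapsed into one command. A minimal sketch using lftp, with placeholder credentials, host, and path:
lftp -u USER,PASSWORD ftp.example.com -e "cd /path/to/drupal; put production.htaccess -o .htaccess; bye"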

Your FTP application should have an option to show hidden files; that option is normally available in FTP clients for Mac OS X.

Related

Is doing drush archive-dump enough to transfer a drupal site to another server?

I am relatively new to Drupal. I have a Drupal site on my staging server, and I would like to transfer it to the production server. My question is: is running drush archive-dump enough to do this? I tried it, and it seems the site is not loading the configuration correctly. I have already executed the SQL commands from the file generated by the dump.
There are three components to moving a Drupal site to another server:
Database
Code
Files (e.g. files uploaded by content creators; usually sites/default/files)
drush archive-dump is specifically there to grab all three and tar them. So yes, that is all the data you need. There can still be other issues (e.g. server permissions, software versions, DB credentials, etc.).
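For example, a minimal round trip might look like the following (a sketch; exact options vary by Drush version and the paths are placeholders):
drush archive-dump --destination=/tmp/mysite.tar.gz
and then, on the production server:
drush archive-restore /tmp/mysite.tar.gz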
To go live you need to:
Test your site in the same environment as the production site has.
Move code to production server.
Move database to production server.
That's all.
Please read:
http://www.slideshare.net/erikwebb/the-basics-of-smart-drupal-deployment
https://www.drupal.org/best-practices
Almost. It seems that archive-dump will not include the private files directory if it resides outside the DocumentRoot.
Some administrators will place the private files in a directory such as DOCROOT/sites/default/files/private/, and although Apache 2.x should deny direct access to that directory via .htaccess rules, placing it entirely outside the DocumentRoot ensures that protection regardless of the HTTPD service...
So no, archive-dump comes up short if you have private files outside your DocumentRoot directory.
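In that case you would have to grab the private files separately, e.g. with a plain tar (the path below is a placeholder for wherever your private files actually live):
tar -czf private-files.tar.gz /path/to/private/files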

open_basedir restriction in effect at WordPress [duplicate]

I'm getting this error on an avatar upload on my site. I've never gotten it before and nothing was changed recently for me to begin getting this error...
Warning: is_writable() [function.is-writable]:
open_basedir restriction in effect.
File(/) is not within the allowed path(s):
Modify the open_basedir setting in your hosting account and set it to none. You will find the open_basedir setting under the 'PHP Settings' area of your Plesk/cPanel; set it to 'none' from the dropdown given there.
To resolve this error, you can edit the file httpd.conf.
Its location can be found in the phpinfo() output, in the apache2handler section, under the Server Root directive.
For example, in my case it was /etc/httpd/httpd.conf.
Open httpd.conf, find the open_basedir parameter, and set it to none (php_admin_value open_basedir none).
If you're running this with php file.php, you need to edit php.ini.
Find the file:
locate php.ini
/etc/php/php.ini
And append the file's path to the open_basedir property:
open_basedir = /srv/http/:/home/:/tmp/:/usr/share/pear/:/usr/share/webapps/:/etc/webapps/:/run/media/andrew/ext4/protected
For me the problem was bad/missing config values for the Plesk server running the whole thing.
I just followed the directions here:
http://davidseah.com/blog/2007/04/separate-php-error-logs-for-multiple-domains-with-plesk/
You can configure PHP to have a separate error log file for each VirtualHost definition. The trick is knowing exactly how to set it up, because you can’t touch the configuration directly without breaking Plesk.
Every domain name on your (dv) has its own directory in /var/www/vhosts. A typical directory has the following top level directories:
cgi-bin/
conf/
error_docs/
httpdocs/
httpsdocs/
...and so on
You’ll want to create a vhost.conf file in the domain directory’s conf/ folder with the following lines:
php_value error_log /path/to/error_log
php_flag display_errors off
php_value error_reporting 6143
php_flag log_errors on
Change the first value to match your actual installation (I used /tmp/phperrors.log). After you’re done editing the vhost.conf file, test the configuration from the console with:
apachectl configtest
…or if you don’t have apachectl (as Plesk 8.6 doesn’t seem to)…
/etc/init.d/httpd configtest
And finally tell Plesk that you’ve made this change.
/usr/local/psa/admin/bin/websrvmng -a
Laravel
If you have this problem when using Laravel:
just go to the bootstrap/cache folder, rename config.php to anything you like, and reload the site.
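If you can run Artisan on the server, clearing the cached configuration is normally done with the built-in command instead of renaming the file by hand:
php artisan config:clear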
If you use ISPConfig 3:
Go to the Website section -> Options -> PHP open_basedir.
This field lists the allowed paths, each separated
with ":"
/var/www/clients/client2/web3/image:/var/www/clients/client2/web3/web:/var/www/...
and so on
So here you must put the path you want to have access to; in my case it is:
/var/www/clients/client2/web3/image:
The problem appears because:
When a script tries to access the filesystem, for example using include or fopen(), the location of the file is checked. When the file is outside the specified directory tree, PHP will refuse to access it.
The path you're referring to is incorrect and not within the directory root of your workspace. Try building an absolute path to the file you want to access, where you are now probably using a relative path...
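A minimal PHP sketch of that idea, assuming a hypothetical uploads/ folder next to the running script:
// build an absolute path from the script's own directory
// instead of relying on the current working directory
$path = __DIR__ . '/uploads/avatar.png';
if (is_writable(dirname($path))) {
    // safe to write the upload here
}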
If you have this kind of problem with ISPConfig 3 and get an error like this:
open_basedir restriction in effect.
File(/var/www/clients/client7/web15) is not within the allowed
path(s):.........
To solve it (in my case), just set PHP to SuPHP in the Website panel of ISPConfig 3.
Hope it helps someone :)
I had this problem at one of my WordPress sites after updating and/or moving it :)
Check the 'upload_path' entry in the database table 'wp_options' and edit it properly...
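A quick way to inspect and fix it directly in the database (a sketch; the wp_ table prefix and the default upload path may differ on your install):
SELECT option_value FROM wp_options WHERE option_name = 'upload_path';
UPDATE wp_options SET option_value = 'wp-content/uploads' WHERE option_name = 'upload_path';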
For Plesk, you can change or set the open_basedir setting via the panel:
https://support.plesk.com/hc/en-us/articles/360006170513-How-to-add-custom-or-additional-path-to-the-open-basedir-option-for-Plesk-domain-
Edit the php.ini or .user.ini file located in the site's main directory:
open_basedir = none
If you are running a PHP IIS stack and have this error, it is usually a quick permission fix.
If you administer the windows server yourself and have access, try this FIRST:
Navigate to the folder that is giving you grief on writing to and right click it > open properties > security.
See what users have access to the folder, which ones have read only and which have full. Do you have a group that is blocking write?
The fix will be specific to your IIS setup, are you using Anonymous Authentication with specific user IUSR or with the Application Pool identity?
At any rate, you are going to end up adding full write permission for one of IUSR, IIS_IUSRS, or your application pool identity. Like I said, this will vary depending on your setup and how you want to do it; you can go down the Google rabbit hole on this one (one such post: IIS_IUSRS and IUSR permissions in IIS8). I use anonymous authentication with my app pool identity, so I can get away with giving MACHINE_NAME\IIS_IUSRS full read/write on any temp or upload folders.
I did not need to add anything extra to open_basedir = in the php.ini.
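For illustration, granting modify rights from an elevated command prompt could look like this (the folder path and the IIS_IUSRS group are assumptions; adjust them to your own site and identity):
icacls "C:\inetpub\wwwroot\uploads" /grant "IIS_IUSRS:(OI)(CI)M"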
In addition to @yogihosting's answer, if you are using DirectAdmin, then follow these steps:
Go to the DirectAdmin's login page. Usually, its port is 2222.
Login as administrator. Its username is admin by default.
From the "Access Level" on the right panel, make sure you are on "Admin Level". If not, change to it.
From the "Extra Features" section, click on "Custom HTTPD Configurations".
Choose the domain you want to change.
Enter the configuration you want to change in the textarea at the top of the page. You should take the existing configuration file into account and modify values based on it. For example, if you see that open_basedir is set inside a <Directory> block, you should probably wrap your change in the related <Directory> tag:
<Directory "/path/to/directory">
php_admin_value open_basedir none
</Directory>
After making necessary changes, click on the "Save" button.
You should now see your changes saved to the configuration file if they were valid.
There is another way of editing the configuration file, however:
Caution: Be careful, and use the following steps at your own risk, as you may run into errors or cause downtime. The recommended way is the previous one, as it prevents you from saving an invalid configuration and shows you the error.
Login to your server as root.
Go to /usr/local/directadmin/data/users. From the listed users, go to the one related to the domain you want to change.
Here there is an httpd.conf file. Make a backup of it:
cp httpd.conf httpd.conf.back
Now edit the configuration file with your editor of choice; for example, change the existing open_basedir to none. Do not try to remove things, or you may experience downtime. Save the file after editing.
Restart the Apache web server using one of the following ways (use sudo if needed):
httpd -k graceful
apachectl -k graceful
apache2 -k graceful
If you encounter any errors, replace the main configuration file with the backed-up file and restart the web server.
Again, the first solution is the preferred one, and you should not try the second method first. As noted in the caution, the advantage of the first way is that it prevents you from saving an invalid configuration.
Hope it helps!
I am using an Apache vhost file to run PHP with application-specific INI options on my Windows server. For that I use the -d option of the php command.
I set the open_basedir for every application as one of these options.
I needed to set multiple paths as open_basedir, including a UNC path, and the syntax for this case was a bit hard to find. You have to separate the paths with semicolons, and if your first path starts with a drive letter you might have to start the list with a semicolon too. At least that's what works for me.
Example:
php.exe -d open_basedir=;d:/www/applicationRoot;//internal.unc.path/ressource/
I uploaded my CodeIgniter project to a DirectAdmin panel and was getting the same error.
Then I changed these PHP settings:
open_basedir =
session.save_path = ./temp/
Then it worked for me.
Most people do not find a solution because the suggested fixes for WordPress are broad, and many don't fully understand why things are the way they are.
I've found that you may have to whitelist your server's IP, especially when using Cerber; in some cases it can decide you are uploading .js files instead of .png files.
The server IP needs to be whitelisted, and in some rare cases the uploaders too.
It is also good to have a tmp folder with 755 permissions in your base directory (it does not literally have to be called tmp). Also remember to set the related paths properly, as below:
open_basedir = "/home/user/site.com/:/tmp"
upload_tmp_dir = /home/user/site.com/tmp
The best option for a quick setup is cPanel's MultiPHP INI Editor: when you save there, both .htaccess and php.ini are updated and the settings take effect on the site at the same time.
It is NOT recommended to set open_basedir to "none", since that exposes files outside the site root to something as simple as a file editor inside WordPress, if that truly is possible.
Check \httpdocs\bootstrap\cache\config.php file in plesk to see if there are some unwanted paths.
Just search for
open_basedir =
in php.ini and disable it (comment it out). That's the simplest way to solve this issue.
Before the change: open_basedir =
After the change: ;open_basedir =
P.S. After the change, don't forget to restart your web server.
Enjoy ;)
Modify the open_basedir settings in your PHP configuration (See Runtime Configuration).
The open_basedir setting is primarily used to prevent PHP scripts for a particular user from accessing files in another user's account. So usually, any files in your own account should be readable by your own scripts.
Example settings via .htaccess if PHP runs as Apache module on a Linux system:
<DirectoryMatch "/home/sites/site81/">
php_admin_value open_basedir "/home/sites/site81/:/tmp/:/"
</DirectoryMatch>

Checking Wordpress core files

Is there a script or something that can check whether all core files are installed properly? I am installing a WordPress site on a client's hosting, and for some reason around 100 files were not transferred because the connection timed out. Now I am moving them one by one, but I would still like to check somehow, once I am done, that all the transferred files are there and that their size is more than 0 bytes.
Thanks.
Since you are using FileZilla, drag and drop all the files into the folder again.
Then, when the "file exists" message shows up, pick "Overwrite if different size" and check "apply to current queue only". Then only the files with a different size (or the ones that weren't transferred at all) will be overwritten/updated.
There's an easier way:
If you have access to some kind of control panel like cPanel, you can make a .zip file and upload only that one file via FileZilla.
Then, in cPanel, go to the File Manager and unzip it from there. It will be faster, and you only have to upload one file (rather than opening tons of connections and running into timeouts).
Or if you have shell access, you can login with your key using Terminal(mac) or Putty(win), browse the folder and run the unzip command.
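A rough sketch of that zip-and-extract route from a shell, with placeholder paths (run the first command locally inside the WordPress folder and the second on the server, or use cPanel's extractor instead):
zip -r site.zip .
unzip site.zip -d /home/USER/public_html/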

download directory and sub directories off a virtual repository

Wondering if there is a way to download the root folder plus a bunch of subfolders (and subfolders of those folders) with all the files, keeping them in their respective folders.
I've tried some Firefox plugins like FlashGot and DownThemAll, but they grab the actual web files in addition to the files in the repository, and only if those are visible. For example, if I don't expand all the folders and expose the files in the repository, the plugins won't detect them.
I would just expand all the folders and expose the files, but these plugins won't recognize the folders... they just download as "foldername".html, and all the files end up mixed together in one folder.
I've also tried VisualWget and allowed recursive downloads, but again, this only grabs the actual website files, not the files in the repository.
If anyone could help it'd be greatly appreciated. I've been copying them manually, but there are literally thousands of files and folders, so I'm looking for a quicker solution.
As a client you can only download what's accessible. You either need to know the list of files, or crawl the pages for the links, which is what the Firefox plugins do.
There's no way to get a list of files on the server without access to the server beyond HTTP (unless the server has WebDAV or exposes some other API).
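If the repository does expose a plain HTTP index whose pages link to the files, a recursive wget is one way to crawl it while keeping the folder structure; a sketch with a placeholder URL (the --cut-dirs depth depends on the path):
wget -r -np -nH --cut-dirs=1 -R "index.html*" http://example.com/repository/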
I ended up getting it to work. I used the following command in Terminal:
scp -r username@hostaddress:/file/path/to/directory /path/to/my/computer/directory
-r is for recursive, so it downloads all files, directories, and subdirectories.
If you try this, be sure to run the command from your local terminal. I made the mistake of doing it from the SSH connection to the server (no negative effects, just frustrating).

ASP.NET Web.config question

The server is IIS7.
Is there a way to disable web.config files in subfolders?
I am asking because, I have a folder on the web server that is for uploads. When someone uploads files, a new folder is created for the user's session and the files they upload go in the folder.
So the path to uploads would be like this:
~/uploads/3F2504E0-4F89-11D3-9A0C-0305E82C3301/somefile.txt
In the ~/uploads/ directory there is a web.config file that removes all http handlers except the static file handler and adds a wildcard mime type. So every file that a user uploads will only ever be served statically.
If a user uploads a web.config file, I want to disallow any of the settings in that file from being applied.
How can I do this?
EDIT
Could I just make the upload folder an application that is a member of an application pool configured to run in Classic mode instead of Integrated Pipeline mode? That way it wouldn't even care about a web.config file.
EDIT 2
Is there another type of web server I could install for serving all files statically? I could just access the files through a different port. Is there some software that I can be sure won't run any scripts and is safe?
I simply wouldn't allow them to upload a file with that name. In fact, I normally wouldn't trust any filename that the user gave me... makes a great candidate for an injection-style attack.
Ok I have a different angle on this...
What if your uploads folder was not part of the website and instead part of the file system? This way ASP.NET is not processing requests to the folder and thus web.config wouldn't be loaded by the ASP.NET runtime.
You'd have to give your app pool's account read/write access to the file system where these files are stored, but I think it better fits what you're trying to accomplish.
Obviously it could be done in code.
If the folders always exist, you could pre-populate each one with a web.config with no (significant) content and an ACL to ensure it cannot be overwritten, but looking at the path I suspect you create the upload folders dynamically, which means this would not work.
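For what it's worth, locking such a pre-populated file down could be a one-liner with icacls; the path and the IIS_IUSRS account here are placeholders and would need to match your own site and application pool identity:
icacls "D:\sites\myapp\uploads\web.config" /deny "IIS_IUSRS:(W,D)"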
I don't believe there is a way to tell IIS not to use a web.config (but I could be wrong). Personally, I would add a check to my save code and rename the file.
Why not just check the filename first to prevent the user from uploading a file named web.config? You're probably going to want to check for other things too before allowing the upload - files that are too big, etc.
