WordPress - Hardened permissions with automatic updates?

Is there a way to allow WordPress to automatically update while still using hardened permissions?
It seems the recommended security setup for WordPress is to use hardened permissions, which are mostly achieved using the permissions given in this answer. However, these permissions result in WordPress not being able to update automatically, or be updated through the administrator web interface; attempting to do so results in an error:
Downloading update from https://downloads.wordpress.org/release/wordpress-x.x.x-partial-x.zip…
Unpacking the update…
The update cannot be installed because we will be unable to copy some files. This is usually due to inconsistent file permissions.: wp-admin/includes/update-core.php
Installation Failed
Allowing the web server to update update-core.php violates the hardened permissions (as far as I can tell). Unfortunately, without automatic updates we also don't get automatic security updates, which is a security problem in itself. Is there a way to allow automatic updates without the need for weak permissions? What are the strongest permissions that can be used while still allowing automatic updates, and are they strong enough?

The Hardening WordPress guide describes a secure setup and recommends automatic updates, but conveniently omits that the two don't work together.
To my knowledge, every admin just has a very unpleasant choice to make:
Keep the hardened permissions, which means staying on top of every single minor update and flipping the permissions back and forth to apply it
Loosen the permissions in an undocumented way and accept the associated increase in risk
As somebody who primarily deals with automation, I personally just can't get behind the manual approach. It seems like less of a risk, but only if you never let an update go unattended for a week or two; at that point the risk from unpatched vulnerabilities is arguably higher than it would have been with the looser permissions.
Here's the snippet I use to switch to "insecure" mode for the few seconds it takes to update (and that I'll be using until something better comes along or my patience with this manual approach ends):
sudo chown -R <wordpress_user> <wp_rootdir>; read; sudo chown -R <myuser> <wp_rootdir>
It changes the owner of everything to the user that runs WordPress and uses the "read" command simply to pause until you press Enter, at which point it restores the original owner.
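Written out as a small script, the same idea looks like this (a minimal sketch; the paths and user names are placeholders for your own install, not anything WordPress mandates):
#!/bin/sh
WP_ROOT=/var/www/wordpress              # root of the WordPress install (example path)
WP_USER=www-data                        # user the web server / PHP runs as (example)
ADMIN_USER=myuser                       # hardened owner to restore afterwards (example)
sudo chown -R "$WP_USER" "$WP_ROOT"     # relax ownership so WordPress can write its own files
echo "Run the update from wp-admin now, then press Enter..."
read dummy                              # pause until Enter is pressed
sudo chown -R "$ADMIN_USER" "$WP_ROOT"  # restore the original, hardened owner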
TL;DR: No, there is only the choice between two extremes.

Related

SVN: excluding a file type from repository but not between 2 machines

I am using Less to manage a website's styling, and since I am the styling curator, the .less files remain hidden in the repository from the rest of the PCs: no one else needs to edit or modify them (a security measure, since some proactive people have edited the .css files, which gives me headaches when I compile the Less without their changes), so only I should edit and compile them.
But now I'll need to share and sync the .less and .css files with one other person on the network, and either my changes or his will have to be compiled and synced up: if I update a file, I need SVN to notify him that his version is old and needs to sync up, and vice versa if he's the one who makes changes.
So, since the .css files are compiled from the .less files and some of them have dependencies on each other, I only need to share them with one coworker via SVN, and since he is starting out with Less variables and such, I want the option to keep and review logs and perform reverts to previous versions. That effectively means keeping the .less files invisible to everyone else in SVN except that one other PC.
Is it possible? If it is, how would that go?
SVN 1.8
You can't have IP-based ACLs (which would be easy); only username-based ACLs are available
These ACLs must be implemented on the server side (and path-based access control must be enabled before defining rules)
With path-based authorization you can have allow/deny rules for individual files
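As a sketch of what that could look like in the server-side authz file (the repository name, path and user names are made up for illustration; with Apache this is the file referenced by AuthzSVNAccessFile, with svnserve the authz-db setting in svnserve.conf):
[groups]
stylers = me, coworker

[site:/]
* = rw

[site:/styles/main.less]
* =
@stylers = rw
Everyone keeps access to the rest of the repository, but the individual .less file (or a whole directory of them) is visible and writable only to the two users in the stylers group.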

What should I do, chmod-wise, to secure my server?

I want to let my friend access my server so he can host his website. Let's call him Joris.
# useradd joris
note that I'm on Debian. So now a /home/joris has been created. This is cool and all. BUT. He can
cd /
cd /etc/
cd /var/www/
He can cd practically everywhere; he may not be able to delete anything, but he can see everything, which I don't want him to. Is that normal?
First, I would suggest reading the Debian Administrator's Handbook, either by running aptitude install debian-handbook or by using a search engine to find a copy online. It covers many topics about security that will be of use to you, especially when sharing a server with multiple users.
As far as being able to access various directories goes, Debian is VERY relaxed for my tastes with its default permissions setup. Check the default UMASK setting (/etc/login.defs) so that you have a more secure setup when adding users.
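For example, a stricter default can be set there (027 stops newly created files from being world-readable; the exact value is up to you):
UMASK           027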
I remove o-rx (read/execute for "other") from things like /var/www and grant access to those directories using Access Control Lists (ACLs). If you are unfamiliar with ACLs, I highly recommend you familiarize yourself with them, as they are much more robust than the default permissions system.
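A minimal sketch of that approach, assuming the acl package is installed and the filesystem supports ACLs (the path and user name are examples only):
sudo chmod o-rx /var/www                  # drop read/execute for "other"
sudo setfacl -m u:joris:rx /var/www       # grant one specific user read/execute back
sudo setfacl -m d:u:joris:rx /var/www     # default ACL so new files and directories inherit it
getfacl /var/www                          # verify the resulting ACL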
As far as what you should protect, that will depend on your setup. For most things in /etc it will be self-explanatory whether you can remove read access for users outside of the owner/group (like your web server's configuration directory). You can also use permissions to limit access to specific binaries that ordinary users should never need, like mysql or gcc.
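For example (the binaries are illustrative; on Debian, dpkg-statoverride makes the change survive package upgrades, unlike a bare chmod):
sudo chmod o-rx /usr/bin/mysql /usr/bin/gcc                         # quick, but may be reset on upgrade
sudo dpkg-statoverride --update --add root adm 0750 /usr/bin/gcc    # persistent override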
In the long run your setup will be unique to your specific needs. Reading the Debian Handbook will be immensely helpful in helping you secure your box not only from the outside, but from the inside as well.
Hope this helps point you in the right direction.

Updating a Classic ASP website without interrupting service

A couple of questions:
1) How can I update a Classic ASP website/page without interrupting service (users getting an error or service unavailable message) or shutting the website down temporarily?
2) When updating/restoring a MSSQL DB via SQL Server Management Studio, will the website users get an error message?
Thanks in advance.
A smart practice is to use at least one separate development environment with the same setup as your production environment and debug all changes there to ensure that they work. Once your entire site is running and tested on that identical environment, you should be able to simply move the files into production and they should work. This model is only effective if you can actually keep the environments as close to identical as possible.
When updating/restoring a MSSQL DB
Be careful with your terminology; UPDATE and RESTORE are two very different commands.
If the database is locked by the changes being made, then it will be inaccessible to users and may cause error messages depending on your IIS and code setup. Scheduling a maintenance period and blocking user access to any pages that touch the database will help you avoid messy errors and avoid revealing any information about your infrastructure while the changes are being made.
It seems like you might want to do some basic research on both development and databases to make sure you understand what you're doing and can cover all of your bases. Looking up commands like RESTORE and UPDATE and using them correctly is crucial.
For example, when you rewrite one or more of your website files
via FTP, in that very moment when rewriting is taking place,
users will get a 500 Service Unavailable error. How can I avoid this?
This really shouldn't happen, although you could upload the files to a different folder, avoiding any delay there, and sync the files with a diff tool such as WinMerge (which also helps you keep track of changes and revert quickly) once the upload is done.

Where to put a new ASP.NET website?

Where's the best place for a production ASP.NET application? I mean a place that needs less permission manipulation on folders, and preferably the experts' choice.
Under C:\inetpub\wwwroot, C:\inetpub, or elsewhere?
In development/test phases I usually put it under C:\inetpub\wwwroot and create a new web application without setting bindings. But on production version with binding I'm not sure where's the right place.
You can put it anywhere you like; the key thing is to ensure that the app pool it is running under is set to run as a low-privileged user (like NT AUTHORITY\NETWORK SERVICE), then ensure that user has Read (and possibly Browse if you want it) permissions on the folder you put your web app in. Very seldom (if ever) will the user need Write or Modify permissions on the folder.
and on a new system I had a lot of problems modifying batch files and setting permissions
Setting permissions should not be a problem; you should set the same basic permissions I mentioned above for the user you want to run the app pool as. You can use PowerShell or WMI for this, and you should use the same permissions no matter what folder you install into.
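As a sketch of what that could look like from the command line (the folder and application pool name are examples; "IIS AppPool\FooPool" is the pool's built-in identity):
icacls "C:\WebSites\www.Foo.com" /grant "IIS AppPool\FooPool:(OI)(CI)RX"
This grants the application pool identity Read & Execute on the folder and, via the (OI)(CI) inheritance flags, on everything underneath it; Write or Modify would only be added on specific subfolders (uploads, logs) if the application genuinely needs them.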
You could always wrap all this up into an installer, then it can be as simple as hitting Next.. Next... Finish... in an installer wizard to set up your website on any machine. Doing this in an installer also gives you some certainty that nothing has been missed.
Personally I have a 'Development' folder on my D: drive which is then subdivided into different categories depending on the work. I generally don't use inetpub directory and any permission issues I come across I just set directly onto the relevant folder within my own development structure.
On production environments I've used in the past, we've generally done the same thing. Mainly to help backup scenarios really, but also because there's no strict need to use the default IIS directories - you're free to structure things how you like.
Personally, I always create a new folder (in the root of a drive) called WebSites. I then make sure it has the appropriate permissions for the website process(es) (aka App Pools).
eg.
C:\
|_WebSites
   |_www.Foo.com
   |_www.Bah.com
It also makes it easier to manage because you don't have to hunt through the folder structures to find any/all websites.
But technically, it can be (more or less) anywhere - just needs to have the correct permissions set.
Bonus Answer
I also remove the Default Website from IIS, which in effect means I can also delete c:\inetpub\wwwroot.
You can put the website anywhere on the server's hard disk. Just make sure it is a secure folder, and I also recommend not putting it on the same drive as the OS, in case that drive fails and you need to format it.
C:\inetpub\wwwroot and C:\inetpub are just the default places nothing more.
It really depends on how the production server is configured and how operations likes to run things over there. Typically we set up a second "data" drive on servers for a few reasons:
a) Back in the old days, there were a lot of canonicalization attacks where the attacker would try to navigate from c:\inetpub to c:\winnt\cmd.exe. Putting things on a different drive prevented this sort of thing.
b) Recovery -- if the OS gets hosed, you can pretty easily reinstall/reimage or move the data disks to another box and get things stood up fast.
c) It's typically a lot easier to do things like swap out the non-OS disk in case you need more disk space or faster disks or whatever.
Basically, off the OS drive is a good idea. Though virtualization and modern deployment tools make lots of this matter less.

Upgrading my Wordpress installation...click and pray?

I never know what to do when my Wordpress installation tells me there's an update available. I am using version 2.8 so whenever there is an update, all I have to do is click update, some magic happens behind the scenes, and it gets updated. But should I create backup files? And how? I have custom themes and plugins that I don't want to get lost because I don't have backups! Is it safe to assume that nothing bad will happen when you click the upgrade button? What is your process when you decide to upgrade to the newest version?
Backup the database, wp-content directory and configuration files first.
There are plug-ins to make this easier, but since you're asking on StackOverflow, I'll assume you could write a script to do it yourself. While you're at it, add the script as a cron job.
http://codex.wordpress.org/WordPress_Backups#Backup_Resources
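A minimal sketch of such a script, assuming mysqldump and tar are available (every path, database name and credential below is a placeholder for your own values):
#!/bin/sh
WP_DIR=/var/www/wordpress               # WordPress install root (example)
BACKUP_DIR=/var/backups/wordpress       # where the backups go (example)
STAMP=$(date +%F)
mkdir -p "$BACKUP_DIR"
# dump the database (credentials can also live in ~/.my.cnf instead of on the command line)
mysqldump -u backup_user -p'secret' wordpress_db > "$BACKUP_DIR/db-$STAMP.sql"
# archive wp-content and the configuration file
tar czf "$BACKUP_DIR/files-$STAMP.tar.gz" -C "$WP_DIR" wp-content wp-config.php
A crontab entry like "0 3 * * * /usr/local/bin/wp-backup.sh" then runs it nightly.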
Always backup before making a big change like that.
You'll want to copy all your files to a safe place via FTP. Copy 'em, zip 'em up, and keep them somewhere safe where you can remember where they are. You'll also want to backup or "export" the database and keep that also safe. This way if something goes wrong, you can restore it to the way it was.
There's a good backup script here for Wordpress sites:
http://www.guyrutenberg.com/2008/05/07/wordpress-backup-script/
based on Bash and bzip2.
I usually don't update anything in production without testing it first, unless it's a simple modification and it's about security (like the 2.8.4 update).
The ideal thing to do is create a separate installation to act as a test server: it can be on your local machine, or just a whole separate installation on your server. Why? Remember that you have plugins installed and some may break; updating everything can't be a "blind" decision!
So, before updating in the production installation/server, always test in the "test environment".
Nothing is worse than having your website down because of an update error.
