Storing Plugin-Specific Data Securely - WordPress

The WordPress plugin I am developing needs an API key in order to fetch information from a service's API, and I want to store that key. There are two options that I am aware of:
1) WordPress's options mechanism
2) Create a new database table
As far as I can tell, at the end of the day both are the same in that they store the information in a MySQL table, and that data could potentially be accessed by another plugin.
Is there any way to store data so that it cannot be read by other plugins?
Is this even a concern I should be worried about?

A plugin can potentially dump your entire database and email it to its authors, so wherever you store the key in the database, it's pretty much equally exposed.
This essentially boils down to two options: store the key in an external database that only your plugin has access to, or do a two-way encrypt/decrypt with a salted key so that your plugin is the only thing that can decrypt it.
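A minimal sketch of the encrypt/decrypt option using PHP's openssl extension; MYPLUGIN_SECRET_KEY is a hypothetical constant you would define in code (for instance in wp-config.php), never in the database:

<?php
// The secret key lives in code, so a database dump alone cannot decrypt the value.
function myplugin_encrypt($plaintext) {
    $iv = openssl_random_pseudo_bytes(16); // fresh IV for every encryption
    $ciphertext = openssl_encrypt($plaintext, 'aes-256-cbc', MYPLUGIN_SECRET_KEY, OPENSSL_RAW_DATA, $iv);
    return base64_encode($iv . $ciphertext); // keep the IV with the ciphertext
}

function myplugin_decrypt($stored) {
    $raw = base64_decode($stored);
    $iv  = substr($raw, 0, 16);
    return openssl_decrypt(substr($raw, 16), 'aes-256-cbc', MYPLUGIN_SECRET_KEY, OPENSSL_RAW_DATA, $iv);
}

// The encrypted value can then go through the normal options API:
update_option('myplugin_api_key', myplugin_encrypt($api_key));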
If database access from other plugins is still a concern, then store the API key within your PHP file. It won't be changeable without editing the file, but you can take MySQL off the list.
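For example (the constant name and value are placeholders):

<?php
// Hard-coded in a plugin file (or better, in wp-config.php); never touches MySQL.
define('MYPLUGIN_API_KEY', 'replace-with-your-real-key');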
In my personal opinion, unless you are installing the worst and least-known plugins on WordPress, you should be quite confident about the security of your website. To be fair, a stolen API key is probably the least concerning thing when someone could access all your user details and passwords, and potentially gain FTP access to your server.

Related

Understanding Firebase and the purpose of Google Cloud Functions

Let's say I'm developing an app like Instagram: for iOS, Android, and Web. I decided to use Google Firebase, as it really seems to simplify the work.
The features users need in the app are:
Authorization/Registration
Uploading photos
Searching for other people, following them, and seeing their photos
I come from traditional "own-backend" development, where I set up a server, create a database, and finally write the API to let the frontend retrieve data from the server. That's the reason why it's unclear to me how it all works in Firebase.
So the question is how can I create such app:
Should I create my own API with Cloud Functions? Or is it OK to work with the database directly from the client side?
If I work with the database directly, why do I need Cloud Functions? Should I use them?
Sorry for such silly questions, but it is really hard to grasp from scratch.
The main difference between Firebase and the traditional setup you describe is that with Firebase, as far as the app developer is concerned, the client has direct access to the database, without the need for an intermediate custom API layer. Firebase provides SDKs in various languages that you would typically use to fetch the data you need / commit data updates.
You also have admin SDKs that you can use server-side, but these are meant for you to run custom business logic - such as analytics, or caching in an external service, for example - not for you to implement a data-fetching API layer.
This has 2 important consequences:
You must define security rules to control who is allowed to read/write at which paths in your database. These security rules are defined at the project level and rely on the authenticated user (using Firebase Authentication). Typically, if you store the user profile at the path users/$userId, you would define a rule saying that this node can be written to only if the authenticated user has an id of $userId (see the first sketch below these two points).
You must structure your data in a way that makes it easily readable - without the need for complex database operations such as JOINs, which are not supported by Firebase (you do have some limited querying options, though). The second sketch below shows what such a structure can look like.
These 2 points allow you to skip the 2 main roles of traditional APIs: validating access and fetching/formatting the data.
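A minimal Realtime Database rules sketch for the users/$userId example; the exact tree layout, and the choice to let any signed-in user read profiles, are assumptions:

{
  "rules": {
    "users": {
      "$userId": {
        // any signed-in user may read a profile
        ".read": "auth != null",
        // only the profile's owner may write it
        ".write": "auth.uid === $userId"
      }
    }
  }
}

And a sketch of structuring for easy reads: instead of JOINs you denormalize, duplicating the few fields each screen needs at the path it reads. All node names here are illustrative:

{
  "users":  { "alice":  { "name": "Alice" } },
  "photos": { "photo1": { "owner": "alice", "url": "https://example.com/p1.jpg" } },
  "feeds":  { "bob":    { "photo1": true } }
}

Here feeds/bob simply lists the ids of the photos bob should see, so rendering his feed is a single shallow read.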
Cloud Functions allow you to react to data changes. Let's say every time a new user is created you want to send them a welcome email: you could define a Cloud Function that sends this email every time a new node is appended to the users path. They allow you to run the code you would typically run server-side when writes happen, so they can have a very broad range of use cases: side effects (such as sending an email), caching data in an external service, caching data within Firebase for easier reads, analytics, etc.
You don't really need a server, you can access the database directly from the client, as long as your users are authenticated and you have defined reasonable security rules on Firebase.
In your use case you could, for example, use Cloud Functions to create a thumbnail when someone uploads a photo (Firebase Cloud Functions has ImageMagick included for that), to denormalize your data so your application is faster, or to generate logs. So basically, you can use them whenever you need to do some server-side processing when something changes in your database or storage. But I find Cloud Functions hard to develop and debug, and there are alternatives, such as creating a Node application that subscribes to real-time changes in your data and processes them. The downside is that you need to host it outside Firebase.
My answer is definitely NOT complete or professional, but here are the reasons why I chose Cloud Functions.
Performance
You mentioned that you're writing an Instagram-like mobile app, so I assume that people can comment on others' pictures and view those comments. How would you like to download comments from the database and display them on users' devices? There could be hundreds, maybe thousands of comments on one post, so you'll need to paginate your results. Why not let the server do all the hard work, freeing up users' devices to just wait for the results? This may not seem much better at small scale, but let's face it: if your app is incredibly successful, you'll have millions of users and millions of comments to deal with, and a server will do those hard jobs far better than a mobile phone.
Security
If your project is small, then it's true that you won't worry about performance, but what about security? If you do everything on the client side, you're basically allowing every device to connect to your database, meaning that every device can read from/write into your database. Once a malicious user has found out your database URL, all he has to do is run:
firebase.database().ref(...).remove();
With one line of code, you'll lose all your data. Okay, you might say, then I'll just come up with some good security rules, like the one below:
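Such a rule might look like the following sketch (assuming each post records its owner's uid in an ownerId field):

{
  "rules": {
    "posts": {
      "$postId": {
        ".read": "auth != null && data.child('ownerId').val() === auth.uid",
        ".write": "auth != null && data.child('ownerId').val() === auth.uid"
      }
    }
  }
}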
This means that for each post, only the owner of that post can make changes to it or read from it; other people are forbidden to do anything. It's good, but not realistic. People are supposed to be able to comment on a post, and that's modifying the post, so this rule does not fit the situation. But again, if you let everybody read/write, it's not safe either. Then why not just make .read and .write false, like this:
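{
  "rules": {
    ".read": false,
    ".write": false
  }
}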
It's 100% safe, because nobody can do anything to your database directly. Then you write an API to perform all the operations on your database; the API limits what can be done to it. And since you have experience writing APIs, I'm sure you can do something to make your API strong in terms of security. For example, if a user wants to delete a post that he created, your deletePost API is supposed to authenticate the user first. This way, 'nobody' can cause any damage to your database.

SQL vs flat files and some security-related issues

The basic requirement of the site is to provide a platform for blogging and a single interface for managing various social networks, emails, blogs, etc.
For blogging I am using .netblogengine, and Facebook, Gmail, and Blogger are currently being managed by signing in via their APIs.
I am using MySQL, and blog content is being saved in a MySQL column.
My first question: suppose that instead of saving blog content in a MySQL table column, I save the content in a text file and store the name of the file in the column. Doing so will reduce the size of the table, but will it also affect performance? I know I will not be able to search within the content, but the MySQL database grows quickly (50 MB with 8 users), and the minimum number of users is at least 30.
My second question: I am thinking of asking for all the account credentials (Facebook, Gmail, etc.) and storing them in the DB. Whenever a user signs in, all related accounts would be signed in using the saved data, so that separate logins are not required. However, username/password hacks are common headlines nowadays. I want to know how secure shared-server environments are in this regard. Will I need to take extra steps to secure this data, or can I be sure that as long as my MySQL login details are safe, all the data is safe?
The short answer is that saving text content like blog posts in the database is going to be the best bet. Saving any attachments, if there are any, in separate files is a good idea, though.
The data size on your disk is going to be similar either way, so there's no savings there. And since you would need to develop a unique naming convention for the text files, you'd essentially be programming a partial database yourself; might as well leave it to the experts :)
In general, the table size getting to hundreds of megabytes is not going to hurt performance substantially. Assuming you've set up your indexes appropriately, the database engine will be able to seek directly to the data it needs.
Short answer on the account credentials: definitely don't save the usernames, and especially not the passwords, in unencrypted form in the database. How to properly encrypt these to ensure security is a question for the security experts: https://security.stackexchange.com/

Storing sensitive data with Drupal

I need to use sensitive data with Drupal for a custom module to use. If I simply set them through the GUI, they will be stored unencrypted in the database. Anyone having access to it will have access to my sensitive data.
I can see two solutions for the moment:
Find a way to securely store those credentials in the database;
Put the sensitive data into a credentials_inc.php file, include it from settings.php to set variables my custom module can use, and make sure that nobody else can read the file.
Which solution do you think is best? What do you recommend? Is there another, better option?
Best regards.
I would start off by using the SecurePages module, to make sure the data entered is not snooped somewhere along the way.
Then, to encrypt the information, try using PHP's mcrypt; a short example of how to encrypt and decrypt follows.
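A minimal sketch (mcrypt was current when this was written; it is deprecated in modern PHP, where openssl_encrypt() is the usual replacement; key and data values are placeholders):

<?php
// Derive a 32-byte key for Rijndael-128 in CBC mode from a passphrase.
$key  = substr(hash('sha256', 'your secret passphrase'), 0, 32);
$data = 'the sensitive value';

$ivSize = mcrypt_get_iv_size(MCRYPT_RIJNDAEL_128, MCRYPT_MODE_CBC);
$iv     = mcrypt_create_iv($ivSize, MCRYPT_DEV_URANDOM);

// Prepend the IV so it is available again at decryption time.
$encrypted = base64_encode($iv . mcrypt_encrypt(MCRYPT_RIJNDAEL_128, $key, $data, MCRYPT_MODE_CBC, $iv));

// Decryption: split the IV off, decrypt, and trim mcrypt's zero-byte padding.
$raw       = base64_decode($encrypted);
$iv        = substr($raw, 0, $ivSize);
$decrypted = rtrim(mcrypt_decrypt(MCRYPT_RIJNDAEL_128, $key, substr($raw, $ivSize), MCRYPT_MODE_CBC, $iv), "\0");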
Once the information is secured, you should have no problem storing the data in Drupal's DB structure. Also, an important note: you might check out hook_init() instead of trying to append something to settings.php; that is in general a bad practice.
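A minimal Drupal 7 sketch of that idea; the module name mymodule and the mymodule_decrypt() helper are hypothetical:

<?php
/**
 * Implements hook_init().
 *
 * Runs early on each request - a cleaner place to expose credentials to the
 * rest of the module than an include appended to settings.php.
 */
function mymodule_init() {
  // The value was stored encrypted via variable_set() or the admin GUI.
  $encrypted = variable_get('mymodule_api_key', NULL);
  if ($encrypted !== NULL) {
    // mymodule_decrypt() wraps whatever decryption scheme you chose above.
    $GLOBALS['mymodule_api_key'] = mymodule_decrypt($encrypted);
  }
}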
The Encryption module provides an API that supports a few different encryption methods, including mcrypt (if you have it enabled).
The Encryption module is an excellent way to encrypt sensitive data within Drupal. However, this module does not provide adequate key management (it stores the encryption key within the Drupal database - like storing the keys to your house under your Welcome mat).
Along with Encrypt, you will also need an additional module like Townsend Security Key Connection which allows you to manage the encryption keys outside of the Drupal database in an encryption key manager (HSM, Cloud, VMware, etc.). Just remember - if you aren't properly managing your encryption keys, you aren't properly encrypting your data.
Full Disclosure: I work with Townsend Security on the Drupal team.

In SaaS architecture, how do I handle DB schemas and MVC user logins for multiple tenants

Our requirement is something like this.
We are building a multi-tenant website in ASP.NET MVC, and each customer should be able to create their own users as per predefined user roles.
We are thinking about creating a schema for a few tables that would be common across customers, so each customer can log in to the system with their own schema's login and we need not alter any queries to serve all of them.
We are referring to http://msdn.microsoft.com/en-us/library/aa479086.aspx (Shared Database, Separate Schemas).
Can someone advise on the following:
1. After creating the schemas, how do we authorize a user against a particular schema?
2. Is it possible for the DB to serve multiple tenants without any changes to the queries?
Thanks in advance
Anil
After much research, I can say that, although it takes more development up front and more checks along the way, shared database and shared schema is the way to go. It puts some limits on how easily you can cater to a client's specific needs, but from my point of view SaaS isn't about catering to a single client's weird needs; it's about catering to the majority of clients. Not that it's SaaS, but take the iPhone as an example: it was built to cater to the masses. Rather than focusing on doing everything, it's built to be one-size-fits-all just by its simplicity. This doesn't help your case when it comes to authorization, but it'll save you dev hours in the long run.
If you are asking this in the context of SQL Server's authentication/authorization mechanism, I can answer by saying that every user has a default schema, which helps the query engine find the required object in the database.
The SQL query engine will look at the user's default schema first to find the required object (table). If it finds the object in the user's schema it uses it; otherwise it goes to the system default schema (dbo) to find it.
Check this article's "How to Refer to Objects" section to find out how it works. The article also has some information about security concepts related to schemas.

User authentication when using single database per client?

My company is building an ASP.NET HR application and we have decided to create one database per client. This ensures that clients cannot accidentally view another client's data, while also allowing for easy scalability (among other benefits, already discussed here).
My question is - what is the best way to handle security and data access in such a scenario? My intent is to use a common login/account database that will direct the user to the correct server/database. This common database would also contain the application features that each user/role has access to.
I was not planning to put any user information in each individual client database, but others on my team feel that the lack of security on each database is a huge hole (but they cannot articulate how duplicating the common access logic would be useful).
Am I missing something? Should we add an extra layer of security/authentication at the client database level?
Update:
One of the reasons my team felt dual user management was necessary is due to access control. All users have a default role (e.g. Admin, Minimal Access, Power User, etc.), but client admins will be able to refine permissions for users with access to their database. To me it still seems feasible for this to be in a central database, but my team doesn't agree. Thoughts?
We have a SaaS solution that uses the one DB per client model. We have a common "Security" database too. However, we store all user information in the individual client databases.
When the user logs into the system, they give us three pieces of information: username, password, and client-id. The client-id is used to look up their home database in the "security" database, and then the code connects to their home database to check their username/password. This way a client is totally self-contained within their database. Of course, you need some piece of information beyond the username to determine the home database; it could be our client-id approach, or the requested domain name if you're using the subdomain-per-client approach.
The advantage here is that you can move "client" databases around without having to keep them synced up with the security database. Plus, you don't need to deal with cross-DB joins when you're trying to look up user information.
Update: In response to your update... One of the advantages of each customer having their own DB is the ability to restore a single customer if they really need it. If you've split a customer's data into two databases, how do you restore it? Also, again, you'll need to worry about cross-DB data access if the users are defined in a DB other than the home DB.
I've always been of the opinion that security should be enforced at the application level, not the database level. With that said, I see no problem with your intended approach. Managing accounts and roles through a central database makes the application more maintainable in the long run.
You may want to look into using the ASP.NET membership provider for handling the authentication plumbing. That would work with your stated approach, and you can still keep all of the authentication data in a separate database. However, I agree with Chris that keeping one DB will ultimately be more maintainable.
