What is common practice for coding web applications where part of the site has to be secured (e.g. the checkout section) and part not necessarily, say the homepage? As far as I know, sharing sessions between the HTTP and HTTPS parts of the site is not easily possible (or is it?). What would be a common approach if I wanted to display, on an HTTP page like the homepage, the shopping cart data (items) that users ordered on the HTTPS pages? How would those two parts of the site communicate if necessary? Also, isn't it a security flaw in popular shopping carts that many of them seem to secure only the checkout pages (SSL) and not the rest?
I'm using PHP if it makes any difference.
The simplest answer is to have all links to your "secure" pages link to https://. Obviously this can be somewhat of a nightmare depending on the site.
Another alternative is to set up URL rewrite rules that automatically redirect requests for secure pages to https:// when they arrive via http://.
Check out mod_rewrite for Apache if you are not familiar with the concept. Depending on what web server you are using, there are other options available to achieve the same functionality, but that should give you an idea of what your options are. I assume, since you're using PHP, that you're on Apache, though that may not be the case.
I would say that is probably the most common approach. If all of the secure pages reside in a given directory, that makes it even easier, as you can write rules saying that everything in that directory must be requested via https://, while http:// is fine for everything else.
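If you'd rather enforce this at the application level instead of (or in addition to) the rewrite rules, a minimal PHP sketch, included at the top of each secure page, could look like the following (it assumes the standard $_SERVER keys and no SSL-terminating proxy in front of Apache):

```php
<?php
// force-https.php - a sketch to include at the top of every page that must be
// served securely. Assumes the standard $_SERVER keys and no SSL-terminating
// proxy in front of the web server.
if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
    $secureUrl = 'https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
    header('Location: ' . $secureUrl, true, 301);
    exit;
}
```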
It's pretty common practice to use cookies to store cart data throughout a site. Security isn't an issue because you only care about your credit card data going over the wire. The list of things I want to buy isn't particularly sensitive.
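A minimal PHP sketch of that approach - the cookie name and the item structure are just illustrative:

```php
<?php
// Illustrative helpers for keeping (non-sensitive) cart contents in a cookie,
// so both the HTTP and HTTPS parts of the site can read them.
function save_cart_cookie(array $items)
{
    // Not flagged "secure", so the cookie travels on both http:// and https:// requests.
    setcookie('cart', json_encode($items), time() + 60 * 60 * 24 * 7, '/');
}

function load_cart_cookie()
{
    return isset($_COOKIE['cart']) ? json_decode($_COOKIE['cart'], true) : array();
}
```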
I can tell you what I did for an e-commerce site I created from scratch for a client. His whole site is HTTP, including checking out with a check (i.e. they fill in their info, an invoice is generated, and a check is snail-mailed to the seller). The credit card processing, though, is done on PayPal's side, which is HTTPS. To get the cart data over to PayPal I used hidden POST fields, and PayPal did the rest.
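Roughly, the hidden-field part looked like the sketch below. The field names follow PayPal's legacy Website Payments Standard "cart upload" format, so verify them against PayPal's current documentation before relying on them:

```php
<?php
// Sketch of handing the cart to PayPal with hidden POST fields. Field names
// (cmd=_cart, upload=1, item_name_N, ...) come from PayPal's legacy Website
// Payments Standard cart upload format; double-check against current docs.
$items = array(
    array('name' => 'Widget', 'price' => '9.99', 'qty' => 2),
    array('name' => 'Gadget', 'price' => '4.50', 'qty' => 1),
);
?>
<form method="post" action="https://www.paypal.com/cgi-bin/webscr">
    <input type="hidden" name="cmd" value="_cart">
    <input type="hidden" name="upload" value="1">
    <input type="hidden" name="business" value="seller@example.com">
    <?php foreach ($items as $i => $item): $n = $i + 1; ?>
        <input type="hidden" name="item_name_<?php echo $n; ?>" value="<?php echo htmlspecialchars($item['name']); ?>">
        <input type="hidden" name="amount_<?php echo $n; ?>" value="<?php echo htmlspecialchars($item['price']); ?>">
        <input type="hidden" name="quantity_<?php echo $n; ?>" value="<?php echo (int) $item['qty']; ?>">
    <?php endforeach; ?>
    <input type="submit" value="Check out with PayPal">
</form>
```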
Not the greatest system, but it works.
I am trying to learn how to implement a DNS server, and am looking first at Heroku.
Why do they have # point to hidden-sierra-7936.herokudns.com.? Why not just foo.herokudns.com., or better yet, herokudns.com. (no subdomain), or even better, heroku.com. (the main website)? What are the reasons for this? Is it security, performance, architecture needs, something else, or all of the above? More specifically, what are the details of those reasons: does it depend on the number of requests coming through, and that's why there are many <dynamic-name>.herokudns.com. names? Or perhaps so that if there is an error in one, they can quickly switch it?
Finally, can these reasons be avoided/countered/argued against so you could make the domain a little nicer and just use heroku.com.? Why can't you just do it on heroku.com.? (If you were building Heroku, that is; obviously Heroku doesn't support this.)
I am also looking at this. It looks like Heroku used to do it like proxy.heroku.com, but for some reason they switched it. Why?
As far as using subdomains on .heroku.com, this is at least partly a security mitigation.
Consider the arguments made in this blog post from GitHub, published when they moved Pages sites from .github.com to .github.io:
There are two broad categories of potential security vulnerabilities that led to this change.
Session fixation and CSRF vulnerabilities resulting from a browser security issue sometimes referred to as “Related Domain Cookies”. Because Pages sites may include custom JavaScript and were hosted on github.com subdomains, it was possible to write (but not read) github.com domain cookies in a way that could allow an attacker to deny access to github.com and/or fixate a user’s CSRF token.
Phishing attacks relying on the presence of the “github.com” domain to create a false sense of trust in malicious websites. For instance, an attacker could set up a Pages site at “account-security.github.com” and ask that users input password, billing, or other sensitive information.
If I set up a simple web server online (eg nginx), and generate a very large random string (such that it is unguessable), and host that endpoint on my domain, eg
example.com/<very-large-random-string>
would I be safe in say, hosting a webapp at that endpoint with no authentication to store my personal information (like a scratch-pad or notes kind of thing)?
I know Google Docs does this; is there anything special one has to do (again, e.g. for nginx) to prevent someone from getting a list of all available pages?
I guess what I'm asking is: is there any way for a malicious actor to find out about the existence of such a page, preferably irrespective of which web server I use?
I'd be pretty alarmed if my online bank started using this system, but it should give you a basic level of security. Bear in mind that this is security through obscurity, which is rather frowned upon and will immediately turn into no security whatsoever the moment someone discovers the hidden URL.
To prevent this from happening, you will need to take a few precautions:
Install an SSL certificate on your server, and always access the url via https, never via http (otherwise the URL path will be sent in plain view and visible to everyone along the way).
Make sure your secure document contains no outgoing links. This includes not only hyperlinks (<a href="...">) but also embedded images, stylesheets, scripts, media files and so on. Otherwise the URL will be leaked to other domains via the Referer request headers.*1
(A bit of a no-brainer, but) make sure there are also no inbound links to this page. Although they aren't so common now, web hosts used to generate automatic "web stats" pages showing the traffic to each web domain. Some content management systems generate a site map automatically. This would be just as bad.
Disable directory browsing on your server. In other words, make sure that someone who visits the directory level above your hidden directory isn't presented with a list of subdirectories.
Bear in mind that the URL will always be visible in your address bar and browser history, and possibly in other places like your browser's cookie jar. Your browser will probably provide the rest of the URL by auto-complete when someone types the domain into your address bar.
*1: Actually, your browser will only send a Referer header when you access other https pages, but still...
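One practical note on the question's premise: "unguessable" only holds if the random string comes from a cryptographically secure generator. A minimal PHP sketch (random_bytes requires PHP 7+):

```php
<?php
// Generate the unguessable URL path segment with a CSPRNG.
$secretPath = bin2hex(random_bytes(32));   // 64 hex characters, 256 bits of entropy
echo 'https://example.com/' . $secretPath . "/notes\n";
```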
I've just put an SSL certificate on a WP site and was wondering if all pages should be https, or just the key ones (checkout, etc).
It's about 1500 pages and posts. So going through and finding all non secure assets could take a while.
1) Is it worth making the whole site https?
2) Is speed an issue these days? (From the research I've been doing, it appears it's not so much of an issue anymore.)
3) If only key pages are https, is it possible to make the links on the page http? (i.e. after ordering on a secure page, the customer is redirected to a secure confirmation page. But let's say they then click through to the blog... the blog shows up as https, but because it has insecure elements, it shows error messages in the browser. So, is it possible to click from an https page to a non-https page?)
(I am using the "Wordpress Https Plugin", which has a "Force SSL Exclusively" function, but, this causes problems with the shopping cart on there, so it can't be used.) Thanks
You know, honestly, at this point if you're making any page secured with https -- which means you've somehow dealt with the cert issue -- just make them all. The performance hit is less noticeable when the first SSL/TLS handshake happens on the landing page itself, and there aren't many advantages to sticking with HTTP.
Update
I guess that wasn't clear enough; I think I just got a tl;dr on a one-paragraph answer.
IF "you're going to use HTTPS at all"
THEN
"You might as well just use it everywhere."
ELSE
"Don't."
FI
Yes, you should definitely make your entire website https if you are able. However, mixing non-https content into the same page will make most browsers show users warnings, which might confuse them into thinking your site has security problems.
Linking to non-https sites is not a problem, but using assets (javascript, css, images) from non-https sites is.
Unless your site is visited daily by millions of users, you probably shouldn't worry about the performance hit and make the whole thing https. Remember that nowadays Google takes https as a signal for better ranking your site, so it's good for SEO as well.
We have a scenario whereby we are hosting an ASP.NET MVC web site on behalf of someone else.
The customer in this case wants us to restrict access to the web site, to those users who have logged in to their main portal. They should then only be able to get to our web site via a link from that portal.
At this point I'm not yet sure what technology or authentication mechanism the 3rd party is using, but I just wanted to clarify what the possible options might be.
If we call our hosted site B, and their portal web site A, as I see it we could:
Check the referrer for all requests to B, unless they've come from A they can't get in
Check for a specific cookie (assuming A uses cookies)
I'm sure there are other options, anyone any ideas?
Check the referrer for all requests to B, unless they've come from A they can't get in
Can be faked, but most normal users won't do it.
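As a rough PHP sketch of what that check could look like on site B (the portal URL is a placeholder, and the header can be spoofed or stripped):

```php
<?php
// Naive referrer gate for site B. HTTP_REFERER is supplied by the client, so
// treat this as a convenience filter, not real security.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (strpos($referer, 'https://portal.example.com/') !== 0) {
    header('HTTP/1.1 403 Forbidden');
    exit('Please access this site via the portal.');
}
```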
Check for a specific cookie (assuming A uses cookies)
Ask them to embed a small piece of code from your site in their portal. That way, visiting their portal will result in your setting a cookie for your domain, which you can easily read later.
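One hypothetical way to do the embedding: have the portal include a tiny endpoint from your site (for example via an <img> or <script> tag), and have that endpoint set the cookie for your domain. A sketch, with made-up names:

```php
<?php
// beacon.php, hosted on site B. Portal A embeds it, e.g. with
// <img src="https://site-b.example.com/beacon.php">, so any browser that has
// loaded the portal also receives this cookie from B's domain.
$token = bin2hex(random_bytes(16));   // opaque marker; validate it server-side as needed
setcookie('came_from_portal', $token, time() + 3600, '/');

// Return a 1x1 transparent GIF so the embedded tag renders harmlessly.
header('Content-Type: image/gif');
echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');
```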
One more thing to mention: if you're talking about public sites, then it suffices for a search engine to somehow discover these hidden URLs once, after which the game is over - it will index the pages and keep a cache of them. You may want to consider including some noindex/nocache meta tags in these pages.
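If editing every page's markup is awkward, the same hint can be sent as a response header from PHP instead - a small sketch (the header must go out before any output):

```php
<?php
// Equivalent of the noindex/nocache meta tags, sent as an HTTP header.
header('X-Robots-Tag: noindex, nofollow, noarchive');
```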
But seriously, if you wish to have it done properly and securely, you're going to need some form of shared user authentication that the portal and your site both support.
The solutions you have posted are not secure.
In case this is an enterprise application with real requirements for security, you may want to look at some single sign-on solutions.
List of single sign-on implementations
I'd like to be able to use these "best of breed" open-source solutions, with the only requirement being some sort of single sign-on between the different sites. I don't want my users having to log in in 3 different places, so I thought it could be possible with OpenID.
Has anyone tried something similar?
OpenID will not avoid the problem of having to sign in 3 separate times. It will allow the user to share the same login credentials between the sites, but they will still have to actually log in to each of the three systems. If that is not a problem, go with OpenID. If it is, you have two options:
Use an LDAP server to authenticate on all three sites. I think all three software packages have modules/plugins for LDAP (Drupal, Moodle, MediaWiki). Once you have the LDAP server running, the rest should be easy (see the sketch after this list for what the check itself looks like).
Write custom modules/plugins for each platform that authenticate against a single database. Maybe you could use the Drupal database as the primary one, and have MediaWiki and Moodle authenticate with that. So, effectively, the user will only have an account on the Drupal site, but will get access to all three. This is basically the same idea as an LDAP server, but might save you some overhead and complication.
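For the LDAP option, a minimal PHP sketch of what the underlying check looks like - the server URL and DN pattern are placeholders for whatever your directory actually uses:

```php
<?php
// Minimal sketch of authenticating a user against an LDAP directory from PHP.
// The hostname and DN layout below are placeholders.
function ldap_authenticate($username, $password)
{
    if ($password === '') {
        return false;   // avoid accidental anonymous binds
    }

    $conn = ldap_connect('ldaps://ldap.example.com');
    if (!$conn) {
        return false;
    }
    ldap_set_option($conn, LDAP_OPT_PROTOCOL_VERSION, 3);

    // Bind as the user; a successful bind means the credentials are valid.
    $dn = 'uid=' . ldap_escape($username, '', LDAP_ESCAPE_DN) . ',ou=people,dc=example,dc=com';
    $ok = @ldap_bind($conn, $dn, $password);
    ldap_unbind($conn);

    return (bool) $ok;
}
```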
There is also the Moodle Integration module for Drupal that attempts to accomplish the same thing, only without MediaWiki in the mix. I would check that out.
Good luck!
Here are three possible solutions: (1) a single sign-in site, (2) injecting login/register forms into all sites using server-side includes (SSI), and (3) Ajax.
Single sign-in site.
Suppose you have site1.domain.com and site2.domain.com and you want to log in/register at both simultaneously. Probably the easiest way to do it is to create another domain, e.g. login.domain.com, that does the job. Your login/register application will need access to the databases for site1 and site2 and/or their APIs. Since login status usually resides in cookies, your login application will need to set those login cookies for both sites simultaneously (on successful login/registration) and delete them on logout.
To set cookies for all sites from login.domain.com, all of them must sit under .domain.com and the cookie's domain parameter must be .domain.com.
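A minimal PHP sketch of that cookie-setting step on login.domain.com - the cookie name, lifetime, and ".domain.com" are all illustrative:

```php
<?php
// Runs on login.domain.com after a successful login or registration.
$sessionToken = bin2hex(random_bytes(32));   // also store this server-side, keyed to the user

setcookie(
    'sso_session',
    $sessionToken,
    time() + 3600,     // 1 hour
    '/',
    '.domain.com',     // parent domain, so site1 and site2 both receive the cookie
    true,              // secure: only sent over HTTPS
    true               // httponly: not readable from page JavaScript
);
```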
If your solution needs both API access (to the other applications) and access to the same database from several applications, you may need to deal with database transactions. This is because new registrations won't be visible to the other sites until the transaction is committed - so, for example, you cannot call an API from within the login code to retrieve cookies before committing the transaction containing the new registration.
One important detail: if you already have users separately registered at site1 and/or site2 but not on both, your sign-on site will either have to handle those cases or you'll need to sync the registrations manually yourself when deploying your new registration system. A manual fix won't be possible when extra user input is required to complete the cross-site registration. This point also becomes important when you add new sites that require some new user input for registration.
Finally, choose the domain name that handles OpenID carefully. To the best of my knowledge it is impossible to transfer OpenID endorsements across subdomains without the users' consent - please correct me if I am wrong. You don't want to ask users to re-register just because you decided to rename the subdomain.
Server-side include (SSI) method.
Another solution is to inject those forms into all the sites via server-side includes. This may be considerably harder, will depend on the type of web server in use, and will be slower.
A pre-requisite here is that all your applications run on the same subdomain - so that openid works for all of them.
I once built a common user registration system for MW (PHP) and cnprog (Python/Django).
My solution was to display the exact same registration form on the wiki and the forum site, while generating and processing that form with Django. I did it this way because the wiki and forum "skins" are so different that I did not want to surprise visitors with a dramatic change in site appearance when they went to the registration page. This is complicated and I would not do it again :) - instead I would go with the single sign-in method.
In order to display Django output through MediaWiki, I created a wiki extension that prints an Apache "include virtual" call to glue the Django-generated content into the wiki output. This comes with problems.
Apache's include virtual, on my installation, cannot POST to subrequests, cannot pass cookies from subrequests, and cannot pass redirect responses (all HTTP headers get thrown out) back to the upstream user request.
So I added a "was_posted=true" marker to flag the posts for Django, and a secret code to prevent cross-site forgery. To get the cookies out, I had them printed with cookie_morsel.output_js() in Python, so JavaScript must run on the client for this to work. Any redirects have to be done with JavaScript too. Extra work is still needed to upload files (like an avatar picture).
So single sign-on may be the best solution.
Ajax may be a neat way around this - just build the forms on all of your sites with JavaScript and submit them via Ajax. It will work fast and will not break the appearance of your various sites, but it won't please the folks who are allergic to JavaScript.
Actually, the only method that does not require any JavaScript is the single sign-in site.
Posted this because I've spent enough time building this thing for MW and django - an hour of typing did not make a difference :).