Server-side HTTP referrer blocking - as complete and universal as possible

Referrer blocking may be achieved by including this in the page's <head>:
<meta name="referrer" content="no-referrer" />
Remove http referer
https://scotthelme.co.uk/a-new-security-header-referrer-policy
However, Referrer Policy is fairly new, so it is not supported universally across all browsers: http://caniuse.com/#feat=referrer-policy
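A more truly server-side variant is to send the policy as an HTTP response header rather than a meta tag; here is a minimal PHP sketch (assuming a PHP-served page):

<?php
// Send the Referrer-Policy response header before any output.
// Browsers that support it will omit the Referer header on
// navigations and subresource requests originating from this page.
header('Referrer-Policy: no-referrer');
?>
<!DOCTYPE html>
<html>
<head><title>Example</title></head>
<body><a href="https://example.com/">outbound link</a></body>
</html>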
To mask the referrer instead, redirection may be used (PHP):
https://www.willmaster.com/library/security/hiding-referrer-information.php
https://lincolnloop.com/blog/2012/jun/27/referrer-blocking-hard/
But this only ensures the URL of the referring page is replaced with a different URL, not with an empty string.
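For context, a minimal PHP dereferrer along the lines those articles describe might look like this (the file name and url parameter are illustrative, not taken from the articles):

<?php
// redirect.php - hypothetical dereferrer page. Outbound links point here
// (e.g. redirect.php?url=https://example.com), so any Referer the browser
// sends to the destination is this page's URL, not the original page's.
$target = $_GET['url'] ?? '/';

// Allow only http(s) destinations, to avoid abuse via javascript: and the like.
if (!preg_match('#^https?://#i', $target)) {
    $target = '/';
}

header('Location: ' . $target, true, 302);
exit;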
Most browsers don't send a referrer when instructed to redirect using the "Refresh" field.
https://en.wikipedia.org/wiki/HTTP_referer#Referrer_hiding
Is there any better way to hide all referrer information more universally?
What if these two approaches are combined:
<head>
<meta name="referrer" content="no-referrer" />
<!-- Escape the query string so it cannot break out of the attribute. -->
<meta http-equiv="refresh" content="0; URL=<?php echo htmlspecialchars($_SERVER['QUERY_STRING'], ENT_QUOTES); ?>">
</head>
Would the referrer policy here have an effect, i.e. block the referrer in a browser that respects Referrer Policy but would otherwise send a referrer when redirected via the "Refresh" field?
In other words, does Referrer Policy apply even when navigation is triggered not by user interaction but by some other mechanism, as in this example?

Related

Why should we include CSP headers in the HTTP response for an API?

OWASP recommends using Content-Security-Policy: frame-ancestors 'none' in API responses in order to avoid drag-and-drop style clickjacking attacks.
However, the CSP spec seems to indicate that after the HTML page is loaded, any other CSP rules in the same context would be discarded without effect. That matches my mental model of how CSP works, but if OWASP recommends it then I'm surely missing something.
Can anyone explain how a CSP header on an XHR response can improve security, given that the HTML page is already loaded and the "main" CSP already evaluated? How does that work in the browser?
how a CSP header on an XHR response can improve security, given that the HTML page is already loaded and the "main" CSP already evaluated?
You are right: browsers use the CSP from the main page and simply ignore a CSP header sent along with XHR responses.
But you haven't considered the second scenario - the API response being opened directly in the browser's address bar or in a frame. In that case, cookies are available to the response page, and if an XSS is found in the API (as, for example, in the PyPI simple endpoint API), the user's confidential data may become available to an attacker.
Therefore, it is better to protect API responses with a default-src 'none' policy, as well as the 404/403/500 etc. pages.
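As a sketch of that advice (the endpoint and payload are illustrative), a PHP API handler could send a locked-down CSP with every response:

<?php
// api.php - hypothetical JSON endpoint. The CSP is ignored when the
// response is consumed via XHR, but it protects the user if the endpoint
// is ever opened directly in the address bar or loaded into a frame.
header('Content-Type: application/json; charset=utf-8');
header("Content-Security-Policy: default-src 'none'; frame-ancestors 'none'");

echo json_encode(['status' => 'ok']);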
Can anyone explain how a CSP header on an XHR response can improve security, given that the HTML page is already loaded and the "main" CSP already evaluated? How does that work in the browser?
Adding to the correct answer by granty above: frames are commonly used for CSP bypasses.
If a frame is allowed on a page (not blocked by the CSP), the frame has its own CSP scope. So if you create some API for data, you don't want to allow it to be framed, as that could be used to bypass the original CSP (for data exfiltration, for example).
You can block this by setting Content-Security-Policy: frame-ancestors 'none', and then your API will refuse to be framed.
See this article on bypassing CSP for more info. The POC uses a creative hack:
frame = document.createElement("iframe");
frame.src = "/%2e%2e%2f";
document.body.appendChild(frame);
which in turn triggers the NGINX error page, which does not have any CSP set. Many production CSPs are vulnerable to this issue.
Since not setting a CSP on a framed page would essentially default to no CSP (everything is open), the article suggests:
CSP headers should be present on all the pages, even on the error pages returned by the web server
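A sketch of that advice in PHP (a hypothetical custom error page, wired up via the web server's error-document mechanism):

<?php
// 404.php - hypothetical custom error page. Error responses need a CSP
// too; otherwise a frame pointed at a non-existent path such as /%2e%2e%2f
// renders in a CSP-free context and can be used to bypass the main policy.
http_response_code(404);
header("Content-Security-Policy: default-src 'none'; frame-ancestors 'none'");
?>
Not Found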
The frame-ancestors 'none' directive will indicate to the browser on page load that it should not be rendered in a frame (including frame, iframe, embed, object, and applet tags). In other words the policy does not allow it to be framed by any other pages.
The CSP header for the API or page is read at load; it is not something that happens after the fact. The "main" CSP isn't pertinent because it's the URI in the frame that sends the CSP for itself. The browser simply honors the frame-ancestors 'none' request by that URI.
The frame-ancestors directive restricts the URLs which can embed the resource using frame, iframe, object, or embed. Resources can use this directive to avoid many UI Redressing [UISECURITY] attacks, by avoiding the risk of being embedded into potentially hostile contexts.
References
CSP frame-ancestors
Clickjacking Defense Cheat Sheet
Content Security Policy
Web Sec Directive Frame Ancestors

Mixed Content: The page at was loaded over HTTPS, but requested an insecure resource. This request has been blocked; the content must be served over HTTPS

Mixed Content: The page at '' was loaded over HTTPS, but requested an insecure resource ''. This request has been blocked; the content must be served over HTTPS.
There's no way to disable mixed content blocking from JavaScript, but you can add this tag
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
to your HTML; it makes the browser upgrade insecure (http) requests to https instead of blocking them.
To allow mixed content:
1- Add this meta tag to the page (HTML file):
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
2- Add unsafe-url as the referrerPolicy of your fetch requests if you get ERR_CONNECTION_REFUSED.
example:
fetch('http://URL', {
  // ...
  referrerPolicy: "unsafe-url"
});
Warning: This policy will leak potentially-private information from
HTTPS resource URLs to insecure origins. Carefully consider the impact
of this setting.
For more info, check these two references:
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Referrer-Policy
https://javascript.info/fetch-api

Is :secure_url necessary in opengraph meta tag if all site resources behind https?

My site and all its resources are served over HTTPS; nginx redirects all HTTP requests to HTTPS.
When using OpenGraph meta tags, do I need to duplicate all resources with :secure_url?
For example,
<meta property="og:url" content="http://example.com">
<meta property="og:url:secure_url" content="https://example.com">
Or is it enough to specify only one property (and if so, which one - the secure variant or not) with an HTTPS resource?

Why am I suddenly getting a "Blocked loading mixed active content" issue in Firefox?

This morning, upon upgrading my Firefox browser to the latest version (from 22 to 23), some of the key aspects of my back office (website) stopped working.
Looking at the Firebug log, the following errors were being reported:
Blocked loading mixed active content "http://code.jquery.com/ui/1.8.10/themes/smoothness/jquery-ui.css"
Blocked loading mixed active content "http://ajax.aspnetcdn.com/ajax/jquery.ui/1.8.10/jquery-ui.min.js"`
among other errors caused by the latter of the two above not being loaded.
What does the above mean and how do I resolve it?
I found this blog post which cleared up a few things. To quote the most relevant bit:
Mixed Active Content is now blocked by default in Firefox 23!
What is Mixed Content?
When a user visits a page served over HTTP, their connection is open for eavesdropping and man-in-the-middle (MITM) attacks. When a user visits a page served over HTTPS, their connection with the web server is authenticated and encrypted with SSL and hence safeguarded from eavesdroppers and MITM attacks.
However, if an HTTPS page includes HTTP content, the HTTP portion can be read or modified by attackers, even though the main page is served over HTTPS. When an HTTPS page has HTTP content, we call that content “mixed”. The webpage that the user is visiting is only partially encrypted, since some of the content is retrieved unencrypted over HTTP. The Mixed Content Blocker blocks certain HTTP requests on HTTPS pages.
The resolution, in my case, was to simply ensure the jquery includes were as follows (note the removal of the protocol):
<link rel="stylesheet" href="//code.jquery.com/ui/1.8.10/themes/smoothness/jquery-ui.css" type="text/css">
<script type="text/javascript" src="//ajax.aspnetcdn.com/ajax/jquery.ui/1.8.10/jquery-ui.min.js"></script>
Note that the temporary 'fix' is to click on the 'shield' icon in the top-left corner of the address bar and select 'Disable Protection on This Page', although this is not recommended for obvious reasons.
UPDATE: This link from the Firefox (Mozilla) support pages is also useful in explaining what constitutes mixed content and, as given in the above paragraph, does actually provide details of how to display the page regardless:
Most websites will continue to work normally without any action on your part.
If you need to allow the mixed content to be displayed, you can do that easily:
Click the shield icon Mixed Content Shield in the address bar and choose Disable Protection on This Page from the dropdown menu.
The icon in the address bar will change to an orange warning triangle Warning Identity Icon to remind you that insecure content is being displayed.
To revert the previous action (re-block mixed content), just reload the page.
It means you're calling http from https. You can use src="//url.to/script.js" in your script tag and it will match the page's protocol automatically.
Alternatively, you can use https in your src even if you will be publishing it to an http page. This avoids the potential issue mentioned in the comments.
In the absence of a whitelist feature, you have to make the "all or nothing" choice. You can disable mixed content blocking completely.
The Nothing Choice
You will need to permanently disable mixed content blocking for the current active profile.
In the "Awesome Bar," type "about:config". If this is your first time you will get the "This might void your warranty!" message.
Yes you will be careful. Yes you promise!
Find security.mixed_content.block_active_content. Set its value to false.
The All Choice
iDevelApp's answer is awesome.
Put the below <meta> tag into the <head> section of your document to force the browser to replace insecure connections (http) with secure connections (https). This can solve the mixed content problem if the connection is able to use https.
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
If you want to block all mixed content instead, add the below tag into the <head>:
<meta http-equiv="Content-Security-Policy" content="block-all-mixed-content">
The error is raised for security reasons: use "https", not "http", in the resource URLs.
For example :
"https://code.jquery.com/ui/1.8.10/themes/smoothness/jquery-ui.css"
"https://ajax.aspnetcdn.com/ajax/jquery.ui/1.8.10/jquery-ui.min.js"
On the page that makes the mixed-content (HTTPS to HTTP) call to a resource that isn't accessible, we can add the following entry to the <head> and get rid of the mixed content error.
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
If you are consuming an internal service via AJAX, make sure the url points to https, this cleared up the error for me.
Initial AJAX URL: "http://XXXXXX.com/Core.svc/" + ApiName
Corrected AJAX URL: "https://XXXXXX.com/Core.svc/" + ApiName
Simply changing HTTP to HTTPS solved this issue for me.
WRONG :
<script src="http://code.jquery.com/jquery-3.5.1.js"></script>
CORRECT :
<script src="https://code.jquery.com/jquery-3.5.1.js"></script>
I had this same problem because I bought a CSS template that grabbed an external JavaScript file via http://whatever.js.com/javascript.js. I visited that URL in my browser, changed it to https://whatever... and it loaded fine over SSL, so in my HTML script tag I just changed the URL to use https instead of http and it worked.
To force redirecting to the https protocol, you can also add this directive to the .htaccess in the root folder:
RewriteEngine on
# If the request arrived over plain http...
RewriteCond %{REQUEST_SCHEME} =http
# ...redirect permanently to the same host and path over https.
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
@Blender's comment is the best approach. Never hard-code the protocol anywhere in the code, as it will be difficult to change if you move from http to https, since you would need to manually edit and update all the files.
A protocol-relative URL is always better, as it automatically matches the page's protocol:
src="//code.jquery.com
I managed to fix this using the following:
For Firefox users
Open a new tab and enter about:config in the address bar to go to the configuration page.
Search for security.mixed_content.block_active_content
Change its value from true to false.
For Chrome users
Click the Not Secure Warning next to the URL
Click Site Settings on the popup box
Change Insecure Content to Allow
Close and refresh the page
I found that if you have issues including content from something like http://www.example.com, you can fix that by putting //www.example.com instead.
I faced the same problem when my site went from HTTP to HTTPS. We had added a rule to redirect all HTTP requests to HTTPS.
You need the redirect rule for requests within the site, but it does not help for external JS/CSS references; those URLs have to be updated to HTTPS themselves.
I just fixed this problem by adding the following code in the header:
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
@if (env('APP_DEBUG'))
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
@endif
This is Laravel Blade syntax. Remember to use it for debugging only, to avoid MITM attacks and eavesdropping.
Also, switching from http to https for Ajax calls, normal JS scripts, or CSS will solve the issue.
If your app server is WebLogic, make sure the WLProxySSL ON entry exists (and is not commented out) in the weblogic.conf file in the web server's conf directory, then restart the web server. It will work.

Which browsers' back button does not generate request to the server?

I need to test my web application against a browser whose Back button doesn't generate a request to the server.
Could you give me examples of such browsers?
That doesn't depend on the browser used, but on the HTTP response headers sent to it. If the response headers instruct the browser to cache the page, then it will serve the cached copy when the user goes back. But if it is instructed not to cache the page, it will fire a real request.
You have control over the response headers on the server side.
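For example, a minimal PHP sketch of response headers that force the browser to re-request the page on Back (this header combination is a common convention, not the only valid one):

<?php
// Forbid caching so that navigating Back triggers a fresh server request.
header('Cache-Control: no-store, no-cache, must-revalidate');
header('Pragma: no-cache'); // HTTP/1.0 fallback
header('Expires: 0');       // an already-expired date for old caches
?>
<!DOCTYPE html>
<html><body>This page is re-requested when the user navigates Back.</body></html>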
Internet Explorer 6, not sure about 7/8. Make sure you don't have the following meta statements in your header (they will force a page reload):
<META HTTP-EQUIV="Pragma" CONTENT="no-cache">
<META HTTP-EQUIV="Expires" CONTENT="-1">
Check this page for more info:
http://support.microsoft.com/kb/234067
