Which browsers' back button does not generate a request to the server? - http

I need to test my web application against a browser whose back button doesn't generate a request to the server.
Could you give me examples of such browsers?

That doesn't depend on the browser used, but on the HTTP response headers sent to it. If the response headers instruct the browser to cache the page, it will serve the cached copy when the user navigates back. If they instruct it not to cache the page, it will fire a real request instead.
You have control over the response headers on the server side.
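For illustration, here is a minimal sketch of both behaviors, assuming a Node/Express server (the framework is an assumption; any server stack can set the same headers):
const express = require("express");
const app = express();

app.get("/cached", (req, res) => {
  // Cacheable: the back button can show this page without hitting the server.
  res.set("Cache-Control", "private, max-age=3600");
  res.send("<h1>Cacheable page</h1>");
});

app.get("/uncached", (req, res) => {
  // Not cacheable: navigating back fires a real request.
  res.set("Cache-Control", "no-store, no-cache, must-revalidate");
  res.set("Pragma", "no-cache");
  res.set("Expires", "0");
  res.send("<h1>Non-cacheable page</h1>");
});

app.listen(3000);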

Internet Explorer 6; not sure about 7/8. Make sure you don't have the following meta tags in your page's head (they will force a page reload):
<META HTTP-EQUIV="Pragma" CONTENT="no-cache">
<META HTTP-EQUIV="Expires" CONTENT="-1">
Check this page for more info:
http://support.microsoft.com/kb/234067

Related

Force mobile browser to skip cache when browser is reactivated

I've got a single-page app that I'm updating several times a day. In spite of aggressive cache controls on the page, mobile users (especially Safari users) are frequently several days out of date.
I know for a fact that the page is being reloaded, i.e., that my startup JS is being executed correctly. So browsers like mobile Safari are simply loading the page from cache despite my headers and meta tags, probably when the browser itself is reactivated after being closed.
How can I prevent this behavior? I'd rather not resort to forcibly calling location.reload(), as that gets complicated (need some timestamp in local storage), could disrupt the user, and could result in slower perceived loading times.
I'm already using aggressive cache prevention measures:
<Files index.html>
Header set Cache-Control "no-cache, max-age=0, must-revalidate"
Header set Pragma "no-cache"
</Files>
<!-- srsly plz dont cache this page kthxbye -->
<meta http-equiv="Cache-Control" content="no-cache, no-store, must-revalidate">
<meta http-equiv="Pragma" content="no-cache">
<meta http-equiv="Expires" content="0">
Why Safari isn't reloading over the network I don't know, but you might try adding location.reload() to a listener on the window's focus event. If the cache is still valid, the reload should complete almost instantly, and if it's not, it'll pull the fresh data.
window.addEventListener("focus", () => {
  // other logic here, perhaps to restrict how often it reloads
  location.reload();
});
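If you want to restrict how often it reloads, one sketch is to keep a timestamp in localStorage (the one-hour interval and the "lastReload" key are assumptions):
const RELOAD_INTERVAL_MS = 60 * 60 * 1000; // at most one reload per hour

window.addEventListener("focus", () => {
  const last = Number(localStorage.getItem("lastReload") || 0);
  if (Date.now() - last > RELOAD_INTERVAL_MS) {
    localStorage.setItem("lastReload", String(Date.now()));
    location.reload();
  }
});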

How to remotely force a client to purge a cached website?

We are experiencing an issue where a previous version of our home page is being displayed. Even though there have been changes since then, the web page always shows the old version.
This issue stems from us using a WordPress plugin that added a
Last-Modified: Tue, 19 Apr 2016 15:18:40 GMT
header to the response.
The only way we have found to fix this issue is a forced refresh in the browser. Is there a way to invalidate that cache remotely for all clients?
If you mean the stylesheets or JavaScript, for example, you can update the version of the asset; see below for an example.
<link rel="stylesheet" type="text/css" href="mystyle.css">
You can change to
<link rel="stylesheet" type="text/css" href="mystyle.css?v=1.0">
Notice the ?v=1.0 parameter at the end of the href; this works for JavaScript too.
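If you don't want to bump that number by hand, one sketch (assuming a PHP page and that mystyle.css lives next to it) is to use the stylesheet's modification time as the version, so the URL changes automatically whenever the file does:
<link rel="stylesheet" type="text/css" href="mystyle.css?v=<?php echo filemtime('mystyle.css'); ?>">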
If you need images and other assets to update, you can find lots about cache busting here:
Refresh image with a new one at the same url
You can also try adding
<META HTTP-EQUIV="CACHE-CONTROL" CONTENT="NO-CACHE">
<META HTTP-EQUIV="EXPIRES" CONTENT="Mon, 20 Feb 2012 00:00:01 GMT">
to the head of the HTML page.
Browsers are going to honor the cache settings that were originally provided to them; you should be able to look in the browser's developer tools to see what the cached headers are.
For example, if the content sent something like:
Cache-Control: public, max-age=86400
Then it will have no reason to request an updated version of the content from your server for a day (86,400 seconds).
If the server can handle the load of revalidation requests for the content, you can ensure that there are ETag and Last-Modified headers and use a short expiration time, such as:
Cache-Control: public, max-age=600
ETag: abcdefg
Last-Modified: Tue, 19 Apr 2016 15:18:40 GMT
Then, after 10 minutes the browser will issue a request asking the server whether the content has changed. If not, the server should return an empty 304 Not Modified response to indicate there is no difference. This saves on your bandwidth, and the only cost is however "expensive", resource-wise, it is to determine the headers to send.
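For illustration, the revalidation round trip looks roughly like this (the header values are the placeholders from above):
GET /index.html HTTP/1.1
If-None-Match: abcdefg
If-Modified-Since: Tue, 19 Apr 2016 15:18:40 GMT

HTTP/1.1 304 Not Modified
ETag: abcdefg
Cache-Control: public, max-age=600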
I would absolutely suggest using small cache times for your primary HTML (or any dynamic content) if you know it will change. The entire purpose of those caching headers is to let browsers serve the version they have as quickly as possible, while saving you CPU and bandwidth.
Side note: if you were able to "reach out" to clients' caches in that way, it would actually be somewhat terrifying.
Based on all of the information provided, you're missing the Varnish HTTP Purge plugin and/or have not configured the VCL for it.
If you're seeing an old cached version of the homepage, it means the page's cache was not purged after its contents were updated in the WordPress admin.
In a typical WordPress scenario, you set a long maximum cache lifetime and use a plugin like the one mentioned to invalidate the cache from the relevant WordPress hooks.
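For reference, here is a minimal sketch of the purge handling such plugins rely on (Varnish 4 VCL syntax; the ACL entry is an assumption and should match the host WordPress runs on):
acl purge_allowed {
    "127.0.0.1"; # assumed: WordPress and Varnish on the same host
}

sub vcl_recv {
    if (req.method == "PURGE") {
        if (!client.ip ~ purge_allowed) {
            return (synth(405, "Not allowed"));
        }
        return (purge);
    }
}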

Server-side http referrer blocking - as complete and universal as possible

Server-side HTTP referrer blocking may be achieved by including this in the webpage header:
<meta name="referrer" content="no-referrer" />
Remove http referer
https://scotthelme.co.uk/a-new-security-header-referrer-policy
However, referrer policy is fairly new, so it is not supported universally across all browsers. http://caniuse.com/#feat=referrer-policy
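The same policy can also be delivered as an HTTP response header rather than a meta tag:
Referrer-Policy: no-referrer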
To mask the referrer, redirection may therefore be used (PHP):
https://www.willmaster.com/library/security/hiding-referrer-information.php
https://lincolnloop.com/blog/2012/jun/27/referrer-blocking-hard/
But this only ensures the URL of the referring page is masked by a different URL, not an empty string.
Most browsers don't send a referrer when instructed to redirect using the "Refresh" field.
https://en.wikipedia.org/wiki/HTTP_referer#Referrer_hiding
Is there any better way to hide all referrer information more universally?
What if these two approaches are combined:
<head>
<meta name="referrer" content="no-referrer" />
<meta http-equiv="refresh" content="0; URL=<?php echo($_SERVER['QUERY_STRING']) ?>">
</head>
Would the referrer policy here have an effect, i.e., block the referrer, in a browser that respects referrer policy but also sends a referrer when instructed to redirect using the "Refresh" field?
In other words, does referrer policy have an effect even when the redirection happens not because of user interaction, but for some other reason, as in this example?

Is :secure_url necessary in OpenGraph meta tags if all site resources are behind https?

My site and all its resources are served over HTTPS; nginx redirects all HTTP requests to HTTPS.
When using OpenGraph meta tags, do I need to duplicate all resources with :secure_url?
For example,
<meta property="og:url" content="http://example.com">
<meta property="og:url:secure_url" content="https://example.com">
Or is it enough to specify only one property (and if so, which one: the secure variant or not?) with the HTTPS resource?

Why am I suddenly getting a "Blocked loading mixed active content" issue in Firefox?

This morning, upon upgrading my Firefox browser to the latest version (from 22 to 23), some of the key aspects of my back office (website) stopped working.
Looking at the Firebug log, the following errors were being reported:
Blocked loading mixed active content "http://code.jquery.com/ui/1.8.10/themes/smoothness/jquery-ui.css"
Blocked loading mixed active content "http://ajax.aspnetcdn.com/ajax/jquery.ui/1.8.10/jquery-ui.min.js"
among other errors caused by the latter of the two above not being loaded.
What does the above mean and how do I resolve it?
I found this blog post which cleared up a few things. To quote the most relevant bit:
Mixed Active Content is now blocked by default in Firefox 23!
What is Mixed Content?
When a user visits a page served over HTTP, their connection is open for eavesdropping and man-in-the-middle (MITM) attacks. When a user visits a page served over HTTPS, their connection with the web server is authenticated and encrypted with SSL and hence safeguarded from eavesdroppers and MITM attacks.
However, if an HTTPS page includes HTTP content, the HTTP portion can be read or modified by attackers, even though the main page is served over HTTPS. When an HTTPS page has HTTP content, we call that content “mixed”. The webpage that the user is visiting is only partially encrypted, since some of the content is retrieved unencrypted over HTTP. The Mixed Content Blocker blocks certain HTTP requests on HTTPS pages.
The resolution, in my case, was to simply ensure the jquery includes were as follows (note the removal of the protocol):
<link rel="stylesheet" href="//code.jquery.com/ui/1.8.10/themes/smoothness/jquery-ui.css" type="text/css">
<script type="text/javascript" src="//ajax.aspnetcdn.com/ajax/jquery.ui/1.8.10/jquery-ui.min.js"></script>
Note that the temporary 'fix' is to click on the 'shield' icon in the top-left corner of the address bar and select 'Disable Protection on This Page', although this is not recommended for obvious reasons.
UPDATE: This link from the Firefox (Mozilla) support pages is also useful in explaining what constitutes mixed content and, as given in the above paragraph, does actually provide details of how to display the page regardless:
Most websites will continue to work normally without any action on your part.
If you need to allow the mixed content to be displayed, you can do that easily:
Click the shield icon in the address bar and choose Disable Protection on This Page from the dropdown menu.
The icon in the address bar will change to an orange warning triangle to remind you that insecure content is being displayed.
To revert the previous action (re-block mixed content), just reload the page.
It means you're calling http from https. You can use src="//url.to/script.js" in your script tag and it will auto-detect.
Alternatively, you can use https in your src even if you will be publishing it to an http page. This will avoid the potential issue mentioned in the comments.
In the absence of a whitelist feature, you have to make the "all" or "nothing" choice. You can disable mixed content blocking completely.
The Nothing Choice
You will need to permanently disable mixed content blocking for the current active profile.
In the "Awesome Bar," type "about:config". If this is your first time you will get the "This might void your warranty!" message.
Yes you will be careful. Yes you promise!
Find security.mixed_content.block_active_content. Set its value to false.
The All Choice
iDevelApp's answer is awesome.
Put the below <meta> tag into the <head> section of your document to force the browser to upgrade insecure connections (http) to secure connections (https). This can solve the mixed content problem if the connection is able to use https.
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
If you instead want to block mixed content outright, add the below tag into the <head>:
<meta http-equiv="Content-Security-Policy" content="block-all-mixed-content">
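Both directives can also be delivered as plain HTTP response headers instead of meta tags, for example:
Content-Security-Policy: upgrade-insecure-requests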
It's giving the error because of security.
To fix this, please use "https" not "http" in the resource URLs.
For example :
"https://code.jquery.com/ui/1.8.10/themes/smoothness/jquery-ui.css"
"https://ajax.aspnetcdn.com/ajax/jquery.ui/1.8.10/jquery-ui.min.js"
On a page that makes a mixed-content https-to-http call that gets blocked, we can add the following entry to the <head> and get rid of the mixed content error.
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
If you are consuming an internal service via AJAX, make sure the URL points to https; this cleared up the error for me.
Initial AJAX URL: "http://XXXXXX.com/Core.svc/" + ApiName
Corrected AJAX URL: "https://XXXXXX.com/Core.svc/" + ApiName
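As an alternative to hard-coding the scheme, here is a sketch that derives the API base from the page's own protocol (XXXXXX.com, Core.svc, and the endpoint name are placeholders carried over from above; jQuery is assumed):
var apiName = "SomeApi"; // hypothetical endpoint name
var apiUrl = window.location.protocol + "//XXXXXX.com/Core.svc/" + apiName;
$.getJSON(apiUrl, function (data) {
  // handle the response here
  console.log(data);
});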
Simply changing HTTP to HTTPS solved this issue for me.
WRONG :
<script src="http://code.jquery.com/jquery-3.5.1.js"></script>
CORRECT :
<script src="https://code.jquery.com/jquery-3.5.1.js"></script>
I had this same problem because I bought a CSS template and it grabbed an external JavaScript file through http://whatever.js.com/javascript.js. I went to that URL in my browser and changed it to https://whatever..., and it worked over SSL, so in my HTML script tag I just changed the URL to use https instead of http and it worked.
To force a redirect to the https protocol, you can also add this directive to the .htaccess in the root folder:
RewriteEngine on
RewriteCond %{REQUEST_SCHEME} =http
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
@Blender's comment is the best approach. Never hard-code the protocol anywhere in the code, as it will be difficult to change if you move from http to https, since you would need to manually edit and update all the files.
Using a protocol-relative URL is always better, as it automatically detects the protocol:
src="//code.jquery.com
I've managed to fix this using these steps:
For Firefox users
Open a new tab and enter about:config in the address bar to go to the configuration page.
Search for security.mixed_content.block_active_content
Change true to false.
For Chrome users
Click the Not Secure Warning next to the URL
Click Site Settings on the popup box
Change Insecure Content to Allow
Close and refresh the page
I found that if you have issues with including or mixing your page with something like http://www.example.com, you can fix that by putting //www.example.com instead.
I faced the same problem when my site went from http to https. We had added a rule to redirect all http requests to https.
You need to add the redirection rule for intra-site requests, but you have to remove the redirection rule for external js/css.
I just fixed this problem by adding the following code in the header:
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
@if (env('APP_DEBUG'))
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
@endif
This is Laravel Blade syntax. Remember to use it for debugging only, to avoid MITM attacks and eavesdropping.
Also, switching from http to https for Ajax calls, normal JS scripts, or CSS will solve the issue.
If your app server is WebLogic, then make sure the WLProxySSL ON entry exists (and is not commented out) in the weblogic.conf file in the web server's conf directory. Then restart the web server; it will work.
