I can't seem to get a Vaadin (7) BrowserFrame to open https sources, and am struggling to understand why that might be. With http:// sources the web page opens just fine, but I just get a blank page with https://www.google.co.uk; tcpdump shows that a request was served, but it's not displayed in the browser window.
class BrowserWindow extends Window {
    BrowserWindow(URI externalUri) {
        center()
        setClosable(false)
        setDraggable(false)
        setResizable(false)
        setSizeFull()
        setModal(true)
        def ex = new ExternalResource(externalUri.toString())
        BrowserFrame browser = new BrowserFrame("Browser", ex)
        browser.setSizeFull()
        content = browser
    }
}
It works just fine with
getUI().getCurrent().addWindow(new BrowserWindow(new URI("http://www.truespeed.com")))
but not with
getUI().getCurrent().addWindow(new BrowserWindow(new URI("https://www.google.co.uk")))
Does anyone know why that might be?
This is usually a problem caused by mixed (https and http) content. The BrowserFrame might be trying to load an http page or resource, which the browser treats as potentially dangerous. If you open the browser console, you will see the error explaining why the content was not loaded.
In short: if an https parent window tries to load a resource over http, that compromises the security of the entire page, so the browser cannot (or will not) load that resource.
More detailed information about mixed content can be found here.
Make sure that the resource you are trying to load uses https to get around the problem (if mixed content is indeed the problem).
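If the parent application itself is served over https, a minimal sketch of that fix is to upgrade the frame's URL to https before handing it to the BrowserFrame. The helper below is hypothetical (not part of Vaadin), and it assumes the target site is actually reachable over https:

import java.net.URI;
import java.net.URISyntaxException;

// Rewrite an http URI as https so the framed page uses the same scheme
// as the (https) parent and is not blocked as mixed content.
static URI forceHttps(URI uri) throws URISyntaxException {
    if ("http".equalsIgnoreCase(uri.getScheme())) {
        return new URI("https", uri.getSchemeSpecificPart(), uri.getFragment());
    }
    return uri; // already https (or another scheme): leave it untouched
}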
Related
I have a use case where I need to retrieve the initiator from the Chrome Network tab. This works fine, except for the following case:
The iframe is HTTPS;
The enclosing page is HTTP;
The page was opened by Selenium
In this case, the Network tab (and any extension on the debug protocol) shows that the fetch of the iframe content remains pending forever, and none of the child loads are emitted.
If I change the page URL to HTTPS, the iframe is loaded and the child loads are displayed.
If I manually control the Selenium-opened browser and open a new tab, it does not matter whether the fetch is over HTTP or HTTPS. It really is only the tab that WebDriver creates at startup that seems to suffer this effect.
Is there some security protection at play, or is this just a weird bug?
This appears to be caused by out-of-process iframe isolation.
Passing --disable-features=IsolateOrigins,site-per-process to the Chrome process makes the iframe network traces show up.
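For example, with Selenium's Java bindings, the flag can be passed like this (a minimal sketch):

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;

// Launch Chrome with out-of-process iframe isolation disabled, so the
// iframe's network traffic shows up in the Network tab / debug protocol.
ChromeOptions options = new ChromeOptions();
options.addArguments("--disable-features=IsolateOrigins,site-per-process");
WebDriver driver = new ChromeDriver(options);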
Let's take the following hypothetical situation:
an HTTP server has a custom error page set up at /404.html and does a server-side forward to that page for any URL that returns a 404 response (for example /blabla.html)
a browser requests an existing page from the server, say /home.html
the page contains <img src="a.jpg" alt="a" />, but that resource does not exist on the server
the browser receives a 404 for the resource, marks it as missing, and shows no response body (tested in Chrome and FF in the Network tab of the dev console; the response section is empty)
My question is: what happens on the server when the image is requested?
My guess is that the browser cuts off the connection when it sees the 404 status in the header, so it doesn't wait for or download the response body. My other guess is that this is implementation specific, but I'm curious whether servers notice that the connection has been cut off.
The browser does get your error page, but it can't render HTML inside an image element. (It will throw an error in the console instead.)
If you did the same thing with a frame, it would show your error page.
What does "Pending" mean under the status column in the "Network" tab of Google Chrome Developer window?
This happens when my page script issues a GET request whose response contains the content headers for downloading a CSV file:
Content-type: text/csv;
Content-Disposition: attachment; filename=myfile.csv
This works fine in FF and IE7, downloading a CSV file as expected and opening a file picker to save the file, but Chrome does nothing. I confirmed that the server responds to the request, so it appears that Chrome will not process the response.
Curiously, everything works as expected if I type the URL into Chrome's address bar and hit <enter>.
FYI: Chrome 10.0.648.204 on Windows XP
In my case, I found that the "pending" status was caused by the AdBlock extension. The image that I couldn't get to load had the word "ad" in the URL, so AdBlock kept it from loading.
Disabling AdBlock fixes this issue.
Renaming the file so that it doesn't contain "ad" in the URL also fixes it, and is obviously a better solution. Unless it's an advertisement, in which case you should leave it like that.
I also get this when using the HTTPS Everywhere plugin.
This plugin has a list of sites that are also available over https instead of http, so I assume the request is somehow cancelled before it is actually made.
For example, when I go to http://stackexchange.com, in the developer tools I first see a request with status (terminated). This request has only a few headers (GET, User-Agent, and Accept) and no response.
Then there is a request to https://stackexchange.com with full headers, etc.
So I assume this status is shown for requests that are never actually sent.
I had some problems with pending requests for mp3 files.
I had a list of mp3 files and one player to play them. If I picked a file that had already been downloaded, Chrome would block the request and show "pending request" in the network tab of the developer tools.
All versions of Chrome seem to be affected.
Here is a solution I found:
player[0].setAttribute('src','video.webm?dummy=' + Date.now());
You just add a dummy query string to the end of each URL. This forces Chrome to download the file again.
Another example with the Popcorn player (using jQuery):
url = $(this).find('.url_song').attr('url');
pop = Popcorn.smart( "#player_", url + '?i=' + Date.now());
This works for me; this way the resource is not served from the cache. The same approach should also work for .csv files.
I had the same issue on OS X Mavericks. It turned out that Sophos anti-virus was blocking certain requests; once I uninstalled it, the issue went away.
If you think it might be caused by an extension, one easy way to test this is to open Chrome with the --disable-extensions flag and see if that fixes the problem. If it doesn't, consider looking beyond the browser to see whether any other application might be causing the problem, specifically security apps, which can affect requests.
I had a similar issue with application/json ajax calls. In FF/IE they were fine, but in Chrome the Status in the developer Network window was always (pending) because a different status code was being returned.
In my case I changed my JSON response to send an HttpStatusCode of 200; then Chrome was fine, and the status text changed to 200 OK.
For example, using ASP.NET Web API:
return new HttpResponseMessage(HttpStatusCode.OK) {
    Content = request.Content
};
A pending entry in the Network tab means your request is still in progress; as soon as the response arrives, the Time column is updated with the total elapsed time.
[Screenshot: a network call in the processing (pending) state]
[Screenshot: the total time taken once the network call completes]
The fix, for me, was to add the following to the top of the PHP file that was being requested.
header("Cache-Control: no-cache,no-store");
Same problem with Chrome: I had the following code in my HTML page:
<body>
    ...
    <script src="http://myserver/lib/load.js"></script>
    ...
</body>
But load.js always stayed in pending status in the Network panel.
I found a workaround: loading load.js asynchronously:
<body>
    ...
    <script>
        setTimeout(function () {
            var head, script;
            head = document.getElementsByTagName("head")[0];
            script = document.createElement("script");
            script.src = "http://myserver/lib/load.js";
            head.appendChild(script);
        }, 1);
    </script>
    ...
</body>
Now it's working fine.
Encountered a similar issue recently.
My app is in Angular 11, and we have a form with some validators that use regexes to validate the data. One data element contained a special character that the regex wasn't handling, and it hung the entire browser. In fact, even though all network calls succeeded with 200 OK, Chrome showed no response from the backend and listed the requests as pending, with no console errors or anything. Fixing the regex fixed the issue.
After I found the issue, I googled more about it. Here is a more detailed explanation:
https://javascript.info/regexp-catastrophic-backtracking
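To illustrate the failure mode (this is the classic example from that article, not my app's actual validator), a pattern with nested quantifiers such as ^(\d+)*$ takes exponential time on a long input that almost matches:

import java.util.regex.Pattern;

public class BacktrackDemo {
    public static void main(String[] args) {
        // Nested quantifiers: the engine tries every way of splitting the
        // digits between the inner + and the outer *, which is exponential.
        Pattern bad = Pattern.compile("^(\\d+)*$");
        // A long run of digits ending in a non-digit can never match, but
        // this call hangs for a very long time, freezing the thread (or the
        // browser tab, when the same pattern runs in JavaScript).
        String input = "111111111111111111111111111111111111111111111!";
        System.out.println(bad.matcher(input).matches());
    }
}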
I came across this issue while debugging a local web application. It turned out to be caused by AVG Antivirus and its firewall restrictions. I had to allow an exception through the firewall to get rid of the "pending" status.
In my case, a simple restart of my browser (Chrome) fixed it; afterwards it worked straight away, like magic!
A little context: I refreshed my frontend web page and immediately went on to make a change to my API, which caused it to restart. During that window the frontend was making calls to the API, which ended up "pending" because the API was reloading. The browser then cached that pending state. To get out of it I could either set no-cache (which I didn't want to do) or simply restart the browser; I chose the restart.
A little background
I encountered this issue when requesting a URL in my Django project. The server is set up using the Apache HTTP web server with basic auth for user authentication.
The URL I was accessing required no authentication, i.e. in my Apache config I had set Require all granted on the URL using the LocationMatch directive.
The issue
The URL I was trying to access returned a 200 status (in the Network tab in Chrome), but the static assets used for styling the requested webpage (CSS, JavaScript, font files, etc.) associated with the requested URL did not load and showed a pending status.
Meanwhile, the page loaded partially and kept on loading, all with the basic-auth dialog showing in the browser, even though my URL was granted full access.
What worked for me
Interestingly, as soon as I entered my credentials and logged in, the requested page loaded all the static assets. This made it very clear that the static assets directory did NOT have the necessary access permissions.
I then granted access to the static assets directory by updating my Apache config, after which the requested URL and webpage loaded fine (200 status) without any basic-auth dialog or pending status.
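For reference, the change amounted to something like this in the Apache config (the filesystem path here is hypothetical; Require all granted is the same directive already applied to the URL):

# Grant anonymous access to the static assets, not just the page URL.
<Directory "/var/www/myproject/static">
    Require all granted
</Directory>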
In my case, Chrome had a pending update that kept pages from loading until I restarted the browser. Cheers.
I encountered the same problem when requesting certain images from a page. I use JavaScript to set the src attribute of an img element, and when the network is poor, "pending" is displayed in the Network panel of the Chrome developer window. I think it's due to the poor network.
Setting Response.StatusCode = 404 doesn't serve content under either IE8 or Chrome. It works in Mozilla, though, which I find strange!
Do the simplest of things: an empty ASP.NET web application project with an empty Default.aspx page. In the Page_Load event, use the following:
protected void Page_Load(object sender, EventArgs e)
{
    Response.StatusCode = 404;
}
This effectively sets the status code of the current request to 404, no doubt about that. But when rendering under IE8 or Chrome (and possibly other browsers; I haven't tested), the actual page doesn't show up at all. These browsers display their own default 404 error pages (NOT the IIS custom errors). Example in IE8:
The webpage cannot be found
HTTP 404
Most likely causes:
•There might be a typing error in the address.
•If you clicked on a link, it may be out of date. ... and so on ...
What I really want is to serve a 404 error page with a 404 status code, which actually tells the browser or the crawler or whoever that this page doesn't exist, rather than show some fancy custom error message with a 200 OK status.
Using Fiddler shows that I am actually serving the request, but the browser totally ignores it?!
My question: how can I set a 404 status code and still render the page content? Example: http://www.intel.com/invalidpage.wow. Fiddler shows that this page is served with a 404 status code.
By default, IE will show its own error page if the body of the error response is smaller than a configurable amount. I believe the threshold is 512 bytes, but I will try to find confirmation of this. So all you need to do is put more content in your response.
EDIT: This blog post describes the limits. One of the comments shows the registry key settings for changing these values. The key is:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Internet Explorer\MAIN\ErrorThresholds
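In other words, keep the real 404 status but pad the body past the threshold. A hedged sketch of the idea, written here as a Java servlet for illustration (the question is ASP.NET, where the equivalent is simply writing more markup after setting Response.StatusCode):

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class NotFoundServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        resp.setStatus(HttpServletResponse.SC_NOT_FOUND); // a real 404 status
        resp.setContentType("text/html");
        PrintWriter out = resp.getWriter();
        out.println("<html><body><h1>Page not found</h1>");
        // Pad the body past IE's ~512-byte "friendly error" threshold so
        // the browser renders this page instead of its built-in error page.
        for (int i = 0; i < 64; i++) {
            out.println("<!-- padding to exceed the friendly-error threshold -->");
        }
        out.println("</body></html>");
    }
}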
You don't serve the content yourself; you set a custom error page at the web server level (IIS), or in web.config in the case of ASP.NET.
I'm working on a web site that contains sections which need to be secured by SSL.
I have the site configured so that it runs fine when it's always in SSL, and I see the SSL padlock in IE7/IE8/Firefox/Safari/Chrome.
To implement the SSL switching, I created a class that implements IHttpModule and wired up HttpApplication.PreRequestHandlerExecute.
I go through some custom logic to determine whether or not the request should use SSL, and then I redirect. I have to deal with two scenarios:
Currently in SSL and request doesn't require SSL
Currently not in SSL but request requires SSL
I end up doing the following (where ctx is HttpContext.Current and pathAndQuery is ctx.Request.Url.PathAndQuery):
// SSL required and current connection is not SSL
if (requestRequiresSSL && !ctx.Request.IsSecureConnection)
    ctx.Response.Redirect("https://www.myurl.com" + pathAndQuery);

// SSL not required but current connection is SSL
if (!requestRequiresSSL && ctx.Request.IsSecureConnection)
    ctx.Response.Redirect("http://www.myurl.com" + pathAndQuery);
The switching back and forth now works fine. However, when I go into SSL mode, Firefox and IE8 warn me that my request isn't entirely encrypted.
It looks like my module is short-circuiting my request somehow; I would appreciate any thoughts.
I suspect that when you determine which resources require encryption and which do not, you are not including the images, or some headers and footers, or even the CSS files, if you use any.
Since you always drop SSL for such content, it can happen that part of the page (the main HTML) requires SSL, but the subsequent request for an image on that page does not.
The browser is warning you that some parts of the page were not delivered using SSL.
I would check whether the request is for HTML, and only then drop SSL if needed. Otherwise, keep things the way they are (most probably images and the like are referenced with relative paths rather than a full-blown URL).
I.e., if you have:
<html>
<body>
    Some content...
    <img src="images/someimage.jpg">
</body>
</html>
and you request this page using SSL, but your evaluation of requestRequiresSSL does not count the images as secured resources, the image will be requested over http rather than https, and you will see the warning.
So make sure, when a resource is requested and you evaluate requestRequiresSSL, to check the referrer and whether the resource is an image:
// SSL not required but current connection is SSL: only drop SSL for HTML
// pages; images and other assets keep the scheme they were requested with
if (!requestRequiresSSL && ctx.Request.IsSecureConnection && isHtmlContent)
    ctx.Response.Redirect("http://www.myurl.com" + pathAndQuery);
Just figure out how to determine isHtmlContent (if you serve images from a disk location rather than from a database, etc., just check the resource filename: .aspx, .asmx, .ashx, .html, and so on).
That way, if the connection is encrypted but the resource itself is not HTML and is not marked for "encryption", you will not drop the encryption.
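A hedged sketch of that check (written in Java for illustration; the extension list is the one from the paragraph above):

// Treat the request as a page (HTML content) only when the path ends in
// one of the known page extensions; images, CSS, and scripts then keep
// whatever scheme they were requested with.
static boolean isHtmlContent(String path) {
    String p = path.toLowerCase();
    return p.endsWith(".aspx") || p.endsWith(".asmx")
            || p.endsWith(".ashx") || p.endsWith(".html");
}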
I highly recommend using this (free / open source) component to do what you're trying to do:
http://www.codeproject.com/KB/web-security/WebPageSecurity_v2.aspx
Any content that is not normally handled by .NET (such as regular HTML and most graphics files) will not invoke the HttpModule, because it doesn't go through .NET.
Your best bet is to just handle this at the IIS level. See the following for information on how to configure your server:
http://www.jameskovacs.com/blog/HowToAutoRedirectToASSLsecuredSiteInIIS.aspx
I highly recommend this product:
http://www.e2xpert.com/web/Http-Https-Switch.aspx
It is professional and easy to use, and it comes with a powerful configuration tool with which a single click finishes the entire configuration for you.
Just use SSL throughout your site, for all pages and for all images/scripts/stylesheets. That makes everything oh-so-simple. IE and Firefox will no longer complain, and you will no longer have crazy modules trying to guess whether any given request should be redirected, etc.
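As an illustration of how little logic "SSL everywhere" needs, here is a hedged sketch of a blanket http-to-https redirect, written as a Java servlet filter (in IIS the same thing is a single rewrite rule):

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class ForceHttpsFilter implements Filter {
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;
        if (!request.isSecure()) {
            // Redirect every plain-http request to its https equivalent.
            String query = request.getQueryString();
            response.sendRedirect("https://" + request.getServerName()
                    + request.getRequestURI()
                    + (query != null ? "?" + query : ""));
        } else {
            chain.doFilter(req, res); // already https: pass it through
        }
    }
    public void init(FilterConfig config) {}
    public void destroy() {}
}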
For the average user it's nearly impossible to make an informed decision when the only thing Firefox vaguely tells them is, "Parts of the page you are viewing were not encrypted before being transmitted over the Internet." This is about as helpful as the "something's wrong" engine light, and in fact it tells them after their information has already been transferred.
At the very least, this message should be accompanied by a list of the URLs, the type of content (images, JavaScript, CSS), and what it means to the user. BTW, I get this message when using GMail.
Until that happens, as others stated, your code should work once you determine the unsecured elements. You can then use Firebug (http://getfirebug.com) to check the content being delivered over the connection.