IE9 redirect caching, fonts, and cross-origin resource sharing (CORS) CDN HTTP headers - css

I thought I had somehow found a solution to the very vexing problem of Firefox and CDN-hosted font access, but here comes IE9.
I recently ran into a very frustrating IE9 caching problem and chanced upon this blog post (IE9 Redirect Caching Nightmare), which enlightened me about the actual issue.
I have to admit I'm not sure whether that is actually the issue here, but it seems close enough.
Problem:
I have a website set up with two domains (a base domain and a subdomain) pointing to the same server, serving the exact same site, which uses the same set of resources from a CDN hosted on Amazon S3 and served by CloudFront.
https://example.com
https://www.example.com
I get this kind of error message in the IE9 developer tools console when loading fonts from my CSS file using @font-face:
CSS3117: @font-face failed cross-origin request. Resource access is restricted.
This happens when I load either of the URLs first and then visit the other. IE9 is not running in Compatibility Mode; it is running in Document Mode: IE9 standards.
From my limited understanding of CORS and the need to set the Access-Control-Allow-Origin HTTP header, I have dutifully set it up in the S3 CORS policy, and it works perfectly fine in Firefox.
Requests from either domain get their respective header when requesting the CDN resource.
It seems that IE9 tries to do some optimization with caching, and caches the redirect too.
This causes a problem because the Access-Control-Allow-Origin header is cached along with it. Without a new request being sent to the CDN server, the Access-Control-Allow-Origin header cannot change for a different requesting domain.
So I'm left with a situation where the request comes from https://www.example.com and yet the Access-Control-Allow-Origin is https://example.com. This leads to the restricted-resource problem with the error message above.
Further look: I did a check with Firefox 19; the above situation actually occurs there too, but Firefox does not apply the same strict restriction as IE9. A request from the subdomain (https://www.example.com) will accept an Access-Control-Allow-Origin of the main domain (https://example.com). Chrome (WebKit) doesn't seem to care. I'm at a loss as to which browser's implementation is correct.
With my current CDN settings, Chrome and Firefox seem to automatically reroute all www subdomain requests to the main domain; only after multiple attempts at typing the www subdomain into the address bar will they obey. IE9, on the other hand, just goes to whichever address is typed into the address bar. IE9 seems to be the odd one out here, but I'm not sure which browser's behaviour is actually correct.
From a usability standpoint, Chrome and Firefox seem to exhibit the "correct" behaviour.
Known Possible Solutions:
Set Access-Control-Allow-Origin header to allow all, i.e. *
Turn off caching in the browser
Redirect one domain to the other
Use a query string to differentiate resource requests from different domains
Embed the font in the CSS as a data URI
For solution 1, let's just say I'm paranoid enough that I want to allow only specific domains.
Solution 2 is not optimal if I have to apply it for all browsers; my site also has to run on mobile devices with usually less-than-desirable download speeds.
Solution 3 is possible, but I'm still curious about a way to deal directly with the IE9 caching issue.
Solution 4 is very hard to implement, especially when the resource is requested from @font-face. Does it mean I'll have to dynamically regenerate the CSS for each requesting domain just to load a font and bypass the issue? That seems to defeat the purpose of CSS itself, and of caching resources for that matter.
Edit: the stylesheet works, but the font loading doesn't.
Solution 5 is tedious to maintain and update, especially when the font files change periodically (a rough sketch of automating it is shown below).
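For what it's worth, solution 5 could be scripted rather than maintained by hand. A minimal sketch in PHP, assuming a hypothetical fonts/myfont.woff next to the script (the file name and font family are made up for illustration):
<?php
// Hypothetical build/serve step: emit a @font-face rule with the font
// embedded as a base64 data URI, regenerated whenever the font file changes.
header("Content-Type: text/css; charset=UTF-8");

$fontFile = __DIR__ . '/fonts/myfont.woff';               // hypothetical path
$data     = base64_encode(file_get_contents($fontFile));

echo "@font-face {\n";
echo "  font-family: 'MyFont';\n";
echo "  src: url('data:font/woff;base64,{$data}') format('woff');\n";
echo "}\n";
?>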
Question: Are there any known ways to deal specifically with IE9's redirect caching behaviour in this particular case?
Answers and comments are very much appreciated. Thanks in advance!
Edit: More browser test information.

Solution 1:
Check this question.
Solution 4: rename your CSS file to style.php and use whatever code you need to call the appropriate resource.
Set the content type at the top of the page.
<?php
header("Content-Type: text/css; charset=UTF-8");
?>
More info about style.php from Chris Coyier.
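As a rough illustration of that approach (not the answer's exact code; the CDN hostname, font path, and query-string parameter below are made up), style.php could emit the @font-face rule with a per-host query string so that each domain triggers its own request to the CDN:
<?php
// Sketch: vary the font URL per requesting host so that
// https://example.com and https://www.example.com do not share
// the same cached CDN response (hostname and path are hypothetical).
header("Content-Type: text/css; charset=UTF-8");

$host    = isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : 'example.com';
$fontUrl = 'https://cdn.example.com/fonts/myfont.woff?host=' . urlencode($host);

echo "@font-face {\n";
echo "  font-family: 'MyFont';\n";
echo "  src: url('{$fontUrl}') format('woff');\n";
echo "}\n";
?>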

We observed the same weird behavior in IE10 and IE11 as well.
Resetting the browser cache makes the fonts load without any problem, as does toggling compatibility mode on and off.
But when switching to another subdomain, IE does not render the font, because the request's origin does not match the cached response header, which still contains the URL from the previous request. And IE always shows the full URL, even if the definition on the bucket is *.ourdomain.com.
So the general issue with allowing cross-origin requests to assets like webfonts was solved by adding CORS permissions to the S3 bucket - that made the webfonts work perfectly in Firefox.
But we still have no idea how to avoid * and tell IE not to cache the response headers.
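For a self-hosted asset, the usual way to avoid * is to echo back a whitelisted Origin and send Vary: Origin so that caches key the response by origin. A minimal PHP sketch with hypothetical domains and paths follows; on S3/CloudFront the equivalent has to be expressed in the bucket's CORS policy rather than in code, and whether any of this stops IE from reusing the cached header as described above is exactly the open question.
<?php
// Minimal sketch: allow only specific origins instead of *, and tell
// caches that the response varies by Origin. Domains/paths are hypothetical.
$allowed = array('https://example.com', 'https://www.example.com');
$origin  = isset($_SERVER['HTTP_ORIGIN']) ? $_SERVER['HTTP_ORIGIN'] : '';

if (in_array($origin, $allowed, true)) {
    header('Access-Control-Allow-Origin: ' . $origin);
    header('Vary: Origin');
}

header('Content-Type: font/woff');
readfile(__DIR__ . '/fonts/myfont.woff');   // hypothetical font path
?>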

Related

Some CSS does not show when switching the website from HTTP to HTTPS

I have a website written in Ruby using the Ruby on Rails framework. Everything was fine when using the HTTP protocol, but problems appeared when switching to the HTTPS protocol.
Some of the CSS is not shown, but some of it is.
The font is not shown; originally it displayed as designed, but now it does not.
Does anyone know what is happening?
Without any specific error, I assume the browser is probably blocking files because of mixed content, i.e. using both HTTP and HTTPS. Use your browser's developer tools network tab to confirm this.
You can use // instead of http:// so that resources load over the same protocol the page itself is loaded from; see Can I change all my http:// links to just //?
Also read: How to fix a website with blocked mixed content

Can I change all my http:// links to just //?

Dave Ward says,
It’s not exactly light reading, but section 4.2 of RFC 3986 provides for fully qualified URLs that omit protocol (the HTTP or HTTPS) altogether. When a URL’s protocol is omitted, the browser uses the underlying document’s protocol instead.
Put simply, these “protocol-less” URLs allow a reference like this to work in every browser you’ll try it in:
//ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js
It looks strange at first, but this “protocol-less” URL is the best way to reference third party content that’s available via both HTTP and HTTPS.
This would certainly solve a bunch of mixed-content errors we're seeing on HTTP pages -- assuming that our assets are available via both HTTP and HTTPS.
Is this completely cross-browser compatible? Are there any other caveats?
I tested it thoroughly before publishing. Of all the browsers available to test against on Browsershots, I could only find one that did not handle the protocol relative URL correctly: an obscure *nix browser called Dillo.
There are two drawbacks I've received feedback about:
Protocol-less URLs may not work as expected when you "open" a local file in your browser, because the page's base protocol will be file:///, especially when you're using the protocol-less URL for an external resource like a CDN-hosted asset. Using a local web server like Apache or IIS to test against http://localhost addresses works fine though.
Apparently there's at least one iPhone feed reader app that does not handle the protocol-less URLs correctly. I'm not aware of which one has the problem or how popular it is. For hosting a JavaScript file, that's not a big problem since RSS readers typically ignore JavaScript content anyway. However, it could be an issue if you're using these URLs for media like images inside content that needs to be syndicated via RSS (though, this single reader app on a single platform probably accounts for a very marginal number of readers).
The question of whether one could change all their links to be protocol-relative may be moot, considering the question of whether one should do so. According to Paul Irish:
2014.12.17: Now that SSL is encouraged for everyone and doesn’t have performance concerns, this technique is now an anti-pattern. If the asset you need is available on SSL, then always use the https:// asset.
If you use protocol-less URLs to load stylesheets, IE 7 & 8 will download them twice:
http://www.stevesouders.com/blog/2010/02/10/5a-missing-schema-double-download/
So, this is to be avoided for CSS if you like good performance.
Yes, network-path references were already specified in RFC 1808 and should work with all browsers.
Is this completely cross-browser compatible? Are there any other caveats?
Just to throw this in the mix, if you are developing on a local server, it might not work. You need to specify a scheme, otherwise the browser may assume that src="//cdn.example.com/js_file.js" is src="file://cdn.example.com/js_file.js", which will break since you're not hosting this resource locally.
Microsoft Internet Explorer seems to be particularly sensitive to this; see this question: Not able to load jQuery in Internet Explorer on localhost (WAMP)
You would probably always try to find a solution that works on all your environments with the fewest modifications needed.
The solution used by HTML5Boilerplate is to have a fallback when the resource is not loaded correctly, but that only works if you incorporate a check:
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<!-- If jQuery is not defined, something went wrong and we'll load the local file -->
<script>window.jQuery || document.write('<script src="js/vendor/jquery-1.10.2.min.js"><\/script>')</script>
I posted this answer here as well.
UPDATE: HTML5Boilerplate now uses <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"> after deciding to deprecate protocol-relative URLs, see here.
If you would like to make sure all requests are upgraded to a secure protocol, there is a simple option: the Content-Security-Policy directive upgrade-insecure-requests.
Content-Security-Policy: upgrade-insecure-requests;
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy/upgrade-insecure-requests
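If your pages are generated by PHP and you can't change the server configuration, the same header can be sent from code; a trivial sketch, not tied to any particular framework:
<?php
// Ask the browser to upgrade any http:// subresource requests to https://.
header('Content-Security-Policy: upgrade-insecure-requests');
?>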
I have not had these issues when using ://example.com - but you do need to add the colon at the beginning. Yoast had a good write-up about this a while back, but it's lost in his pile of blog posts.

HTTPS does not work - Secure and Non-secure data on web page?

I have a browser compatibility problem with HTTPS. I have SSL installed and in use. Until this morning, the HTTPS part of my site was working well; since then, HTTPS is shown as https (struck through in red), saying the page has some insecure content.
I have not changed any code, and I suddenly see this problem in Chrome. In IE 8 I see the same problem, but on every page it shows me a popup asking whether to load both secure and non-secure content or just secure. Firefox has no issues; it shows correct HTTPS without any problem. I am fed up with searching all over for this. Why is this happening to me in Chrome and IE 8?
Could someone tell me what the problem is and what can be done to solve it?
PS: I have also checked whether the page source is any different when IE8 shows it with and without the secure data. Everything is the same, but the viewstateID was different. Is that something that is creating this problem?
Thanks a lot in advance.
This is usually caused by having the absolute path to a resource specified somewhere on the page without having https specified, eg:
<img src="http://someurl.com/image.png">
If it's a link to something on your site, use https: or a relative path.
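If the pages are generated server-side, one quick way to apply that advice across the board is to rewrite absolute http:// asset URLs to protocol-relative ones in the generated HTML. A rough sketch in PHP (the helper name is made up, and the proper fix is still to correct the URLs at the source):
<?php
// Hypothetical helper: rewrite absolute http:// asset URLs in generated
// HTML to protocol-relative ones so they inherit the page's scheme.
function make_protocol_relative($html)
{
    return str_replace(array('src="http://', 'href="http://'),
                       array('src="//', 'href="//'),
                       $html);
}

echo make_protocol_relative('<img src="http://someurl.com/image.png">');
// Outputs: <img src="//someurl.com/image.png">
?>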
Do you have any third-party JavaScript included, like Google Analytics or anything else that might have changed?
If you try with Firefox, there is Firebug you can add as an add-on.
In there is a tab for network (Net).
It lists everything the page loads.
In that list you should be able to find anything that gets loaded without HTTPS.
IE (correctly) raises a security warning when there is mixed HTTP/HTTPS content. Most other browsers do not typically complain when dealing with mixed content, so your source is very likely the same in both instances.
I would second David Mårtensson's answer and say the issue is likely a third-party library (Google- or MS-hosted jQuery, for example) or a static asset server.

Caching Typekit CSS

I'm using TypeKit to provide fonts for a site I'm developing. The page loads slowly (more than a second), and it turns out that this is because it downloads the fonts on every request. It's beyond me that a service such as this doesn't have ETags configured to get clients to cache the fonts... but I digress. Until TypeKit fixes this, I'm temporarily hosting the CSS locally.
Has anyone had this issue with TypeKit? How did you work around it? Or perhaps I'm wrong?
According to a posting on their getsatisfaction.com account, they have at least some caching in place:
One thing to note is that although the fonts are served with an Expires header, they're also served with an Etag. The browser is required to make a request after 5 minutes, but will normally use the Etag to generate a 304 (Not Modified) response - meaning, the fonts aren't actually downloaded again.
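To make the quoted behaviour concrete, here is a rough PHP sketch of the Expires + ETag pattern being described; a real font CDN does this at the server level, and the five-minute lifetime and file path here are just placeholders:
<?php
// Sketch of Expires + ETag handling: after the short Expires lifetime the
// browser revalidates with If-None-Match and normally gets a 304, so the
// font bytes are not downloaded again. Path and lifetime are placeholders.
$file = __DIR__ . '/fonts/myfont.woff';
$etag = '"' . md5_file($file) . '"';

header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 300) . ' GMT'); // ~5 minutes
header('ETag: ' . $etag);

if (isset($_SERVER['HTTP_IF_NONE_MATCH']) &&
    trim($_SERVER['HTTP_IF_NONE_MATCH']) === $etag) {
    http_response_code(304);   // Not Modified: send no body
    exit;
}

header('Content-Type: font/woff');
readfile($file);
?>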
Can you check what happens using Firebug?

IE7 not Caching CSS Image over SSL

I'm using the WebDevHelper toolbar for Internet Explorer to troubleshoot HTTP requests/roundtrips on my SSL site and noticed that IE re-downloads my CSS :hover images every time they are triggered. This causes a huge number of roundtrips.
How can I prevent this from happening?
Edit: All static content is served with cache-control: public, so images, javascript etc. are cached in Firefox and Chrome. This problem is IE specific.
Serve static content via HTTP, sure, but don't use separate images for :hover states; proper CSS image sprites should be used instead. That's just good practice all around, over HTTPS or HTTP. There are tons of resources available for creating sprites. Supposedly SpriteMe (http://spriteme.org/) is an attempt to automate CSS sprite creation.
If the images are being delivered from a different hostname than your main page, then you're hitting the artifact described here:
http://blogs.msdn.com/ieinternals/archive/2010/04/21/Internet-Explorer-May-Bypass-Cache-for-Cross-Domain-HTTPS-Content.aspx
Well, there are multiple issues according to other Stack Overflow posts. Firefox 2.x also has this problem, but Firefox 3.x doesn't.
Will web browsers cache content over https
Also in Internet Explorer, you go to Tools > Internet Options > Advanced tab > Security section > Do not save encrypted pages to disk. It appears to be unchecked by default in IE6, 7 and 8.
Content served via SSL will not be cached for security reasons. If you want something to be cached, serve it via HTTP.
Have you tried adding this to the headers for those types of static files?
P3P: CP="CAO PSA OUR"
I know this works in IE to allow storage of cookies through framesets and the like. Not sure if it works with static files under HTTPS.
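If those static files happen to be served through PHP, sending the header is a one-liner (value copied from above; as noted, its effect on IE's HTTPS caching of static files is unverified):
<?php
// Send the P3P compact policy suggested above (effect on HTTPS caching
// of static files in IE is unverified).
header('P3P: CP="CAO PSA OUR"');
?>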
I know it sounds weird...
Try pointing a URL at something that doesn't exist (a 404 error). After this, all the rest of the images will be cached.
