I'm using TypeKit to provide fonts for a site I'm developing. The page loads slowly (more than a second), and it turns out this is because the fonts are downloaded on every request. It's beyond me that a service like this doesn't have ETags configured so clients can cache the fonts...but I digress. Until TypeKit fixes this, I'm temporarily hosting the CSS locally.
Has anyone had this issue with TypeKit? How did you work around it? Or perhaps I'm wrong?
According to a posting on their getsatisfaction.com account, they have at least some caching in place:
One thing to note is that although the fonts are served with an Expires header, they're also served with an Etag. The browser is required to make a request after 5 minutes, but will normally use the Etag to generate a 304 (Not Modified) response - meaning, the fonts aren't actually downloaded again.
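For illustration, that revalidation round trip looks roughly like this (hostname, path, and ETag value are invented for the example):

GET /fonts/fonts.css HTTP/1.1
Host: fonts.example.com
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified
ETag: "abc123"

The 304 carries no body, so only a small set of headers crosses the wire instead of the font data.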
Can you check what happens using Firebug?
Related
How long will cached CSS file get updated in browser if I don't do anything specifically?
I googled this but haven't found a clear answer. I know I can use file.css?v=1 to force the browser to load the updated version, or I can use the browser's hard-reload feature. But what if I don't do any of these? So far the browser always loads the cached old version.
Without a hard reload or any special server setup, how long until the browser updates the cached CSS file? Will the cached version stay there forever (unless the cache is full and space has to be reclaimed)?
Browsers generally follow the IETF spec for HTTP caching, introduced in HTTP/1.1. But their behaviour varies when the content is served without an HTTP Cache-Control header. Ultimately, you can't rely on the hope that your updated file will be loaded by the client unless you either use a URL cache-buster, as you mentioned, or serve your content with proper cache-control headers.
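For example, one common policy (a sketch, not the only valid one) is to serve the versioned URL with a long, immutable lifetime and change the URL itself whenever the file changes:

Cache-Control: max-age=31536000, immutable

With that header the browser won't even revalidate file.css?v=1 for a year; bumping the URL to file.css?v=2 makes it fetch a fresh copy.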
I thought I had somehow found a solution to the very vexing problem of Firefox and CDN-hosted font access, but then along came IE9.
I recently ran into a very frustrating IE9 caching issue and chanced upon this blog post (IE9 Redirect Caching Nightmare), which enlightened me about the actual problem.
I have to admit I'm not sure the above is actually my issue, but it seems close enough.
Problem:
I have a website set up with two domains (a base domain and a subdomain) pointing to the same server, serving the exact same website, which uses the same set of resources from a CDN hosted on Amazon S3 and served by CloudFront.
https://example.com
https://www.example.com
I get this kind of error message in my IE9 developer tools console when loading fonts from my CSS file using @font-face:
CSS3117: @font-face failed cross-origin request. Resource access is restricted.
This happens when I load either of the URLs first and then visit the other. IE9 is not running in Compatibility Mode; it is running in Document Mode: IE9 Standards.
From my limited understanding of CORS and the need to set the Access-Control-Allow-Origin HTTP header, I have dutifully set it up in the S3 CORS policy, and it works perfectly fine in Firefox.
Requests from both domains get their respective header when requesting the CDN resource.
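(An S3 CORS policy allowing specific origins looks roughly like this - a sketch, not the exact policy from this setup:)

<CORSConfiguration>
  <CORSRule>
    <AllowedOrigin>https://example.com</AllowedOrigin>
    <AllowedOrigin>https://www.example.com</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
  </CORSRule>
</CORSConfiguration>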
It seems that IE9 tries to do some optimization with caching, and caches the redirect too.
This causes a problem as the Access-Control-Allow-Origin header is cached as well. Without sending a request to the CDN server, the Access-Control-Allow-Origin header cannot change for different domains.
So I'm left with a situation where the request is from https://www.example.com and yet the Access-Control-Allow-Origin is https://example.com. This leads to the restricted resource problem with the error message above.
Further look: I checked with Firefox 19, and the situation above actually occurs there too, but Firefox does not apply the same strict restriction as IE9: the subdomain (https://www.example.com) requesting the resource will accept an Access-Control-Allow-Origin of the main domain (https://example.com). Chrome (WebKit) doesn't seem to care. I'm at a loss as to which browser's implementation is correct.
With my current settings in the CDN, Chrome and Firefox automatically reroute all www subdomain requests to the main domain. Only after multiple attempts at entering the www subdomain in the address bar will Chrome and Firefox obey. IE9, on the other hand, just goes to whichever address is typed in the address bar. IE9 seems to be the odd one out here, but I'm not sure which browser's behaviour is actually correct.
From a usability standpoint, Chrome and Firefox seem to exhibit the "correct" behaviour.
Known Possible Solutions:
Set Access-Control-Allow-Origin header to allow all, i.e. *
Turn off caching in the browser
Redirect one domain to the other
Use query string to differentiate different domain calls for resource
Embed the font into the CSS as data-uri
For solution 1, let's just say I'm paranoid and want to allow only specific domains.
For solution 2, it's not optimal if I have to set it for every browser; also, my site has to run on mobile devices with usually less-than-desirable download speeds.
For solution 3, it's possible, but I'm still curious about a way to deal directly with the IE9 caching issue.
For solution 4, it is very hard to implement, especially when the resource is called from @font-face. Does it mean I'll have to dynamically regenerate the CSS per domain just to change the one line that loads a font, to bypass the issue? That seems to defeat the purpose of CSS itself, and of caching resources for that matter.
Edit: the stylesheet works; font loading doesn't.
For solution 5, it is tedious to maintain and update, especially when the font files change periodically.
Question: Are there any known ways to deal specifically with IE9's redirect caching behaviour in this particular case?
Answers and comments are very much appreciated. Thanks in advance!
Edit: More browser test information.
Solution 1:
Check this question.
Solution 4: rename your CSS file to style.php and use whatever code you need to call the appropriate resource.
Set the content type at the top of the page.
<?php
header("Content-type: text/css; charset: UTF-8");
?>
More info about style.php from Chris Coyier.
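A minimal sketch of the idea (the CDN URL and font filename here are hypothetical): the stylesheet appends the requesting host to the font URL, so each domain fetches a distinct resource and IE cannot reuse a response cached for the other domain.

<?php
// Hypothetical sketch: vary the font URL per requesting host so each
// domain gets its own separately cached copy of the font.
header("Content-Type: text/css; charset=UTF-8");
$host = urlencode($_SERVER['HTTP_HOST']);
?>
@font-face {
    font-family: 'MyWebFont';
    src: url('https://cdn.example.com/fonts/mywebfont.woff?host=<?php echo $host; ?>') format('woff');
}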
We discovered the same weird behavior also in IE10 and IE11.
Resetting the browser cache makes the fonts load without any problem, as does toggling compatibility mode on and off.
But when switching to another subdomain, IE does not render the font, because the request's Origin does not match the cached response's Access-Control-Allow-Origin header, which still contains the URL from the previous request. And IE always uses the full URL, even if the definition on the bucket is *.ourdomain.com.
So the general issue with allowing cross-origin requests to assets like webfonts was solved by adding CORS permissions to the S3 bucket - that made the webfonts work perfectly in Firefox.
But we still have no idea how to avoid * and tell IE not to cache the response headers.
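One avenue that may be worth testing (not verified here to change IE's behaviour) is having the CDN send a Vary: Origin response header, which is meant to tell caches that the response differs per requesting origin:

Vary: Origin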
My code to load the JavaScript file is:
<script src="/path/to/app.js?1350550684711"></script>
where 1350550684711 is just a server-side generated timestamp. This practice of cache busting is quite popular (link 1, link 2).
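The timestamp itself can come from any server-side templating; a PHP sketch, purely for illustration since the question doesn't name a server language:

<script src="/path/to/app.js?<?php echo (int) round(microtime(true) * 1000); ?>"></script>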
In Chrome and Firefox this mechanism works, and in theory it should work in all browsers, since a different HTTP resource is requested every time.
Still, reports are coming in of users getting cached versions of the JS file, specifically those on Apple Safari. Any ideas?
That could mean that the affected Safaris reuse a cached version of the HTML page that contains the <script> elements, and thus naturally they would not get a new timestamp from the server.
I'm speculating here, but that could be because they interpret the HTTP cache-related headers differently, potentially due to differing default settings, an offline browsing mode, or whatever.
Check which cache settings apply to that HTML file.
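If the HTML page is indeed being cached, one hedge is to serve it with a header that forces revalidation, so each visit re-renders the markup and picks up a fresh timestamp, for example:

Cache-Control: no-cache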
Dave Ward says,
It’s not exactly light reading, but section 4.2 of RFC 3986 provides for fully qualified URLs that omit protocol (the HTTP or HTTPS) altogether. When a URL’s protocol is omitted, the browser uses the underlying document’s protocol instead.
Put simply, these “protocol-less” URLs allow a reference like this to work in every browser you’ll try it in:
//ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js
It looks strange at first, but this “protocol-less” URL is the best way to reference third party content that’s available via both HTTP and HTTPS.
This would certainly solve a bunch of mixed-content errors we're seeing on HTTP pages -- assuming that our assets are available via both HTTP and HTTPS.
Is this completely cross-browser compatible? Are there any other caveats?
I tested it thoroughly before publishing. Of all the browsers available to test against on Browsershots, I could only find one that did not handle the protocol relative URL correctly: an obscure *nix browser called Dillo.
There are two drawbacks I've received feedback about:
Protocol-less URLs may not work as expected when you "open" a local file in your browser, because the page's base protocol will be file:///. Especially when you're using the protocol-less URL for an external resource like a CDN-hosted asset. Using a local web server like Apache or IIS to test against http://localhost addresses works fine though.
Apparently there's at least one iPhone feed reader app that does not handle the protocol-less URLs correctly. I'm not aware of which one has the problem or how popular it is. For hosting a JavaScript file, that's not a big problem since RSS readers typically ignore JavaScript content anyway. However, it could be an issue if you're using these URLs for media like images inside content that needs to be syndicated via RSS (though, this single reader app on a single platform probably accounts for a very marginal number of readers).
The question of whether one could change all their links to be protocol-relative may be moot, considering the question of whether one should do so. According to Paul Irish:
2014.12.17: Now that SSL is encouraged for everyone and doesn’t have performance concerns, this technique is now an anti-pattern. If the asset you need is available on SSL, then always use the https:// asset.
If you use protocol-less URLs to load stylesheets, IE 7 & 8 will download them twice:
http://www.stevesouders.com/blog/2010/02/10/5a-missing-schema-double-download/
So, this is to be avoided for CSS if you like good performance.
Yes, network-path references were already specified in RFC 1808 and should work with all browsers.
Is this completely cross-browser compatible? Are there any other caveats?
Just to throw this in the mix, if you are developing on a local server, it might not work. You need to specify a scheme, otherwise the browser may assume that src="//cdn.example.com/js_file.js" is src="file://cdn.example.com/js_file.js", which will break since you're not hosting this resource locally.
Microsoft Internet Explorer seems to be particularly sensitive to this; see this question: Not able to load jQuery in Internet Explorer on localhost (WAMP)
You will probably always want to find a solution that works in all your environments with the fewest modifications needed.
The solution used by HTML5Boilerplate is to have a fallback when the resource is not loaded correctly, but that only works if you incorporate a check:
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<!-- If jQuery is not defined, something went wrong and we'll load the local file -->
<script>window.jQuery || document.write('<script src="js/vendor/jquery-1.10.2.min.js"><\/script>')</script>
I posted this answer here as well.
UPDATE: HTML5Boilerplate now uses <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"> after deciding to deprecate protocol relative URLs, see here.
If you would like to make sure all requests are upgraded to the secure protocol, there is a simple option: the Content-Security-Policy header's upgrade-insecure-requests directive.
Content-Security-Policy: upgrade-insecure-requests;
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy/upgrade-insecure-requests
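The same directive can also be declared in the markup itself via a meta tag:

<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">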
I have not had these issues when using ://example.com - but you do need to add the colon at the beginning. Yoast had a good write-up about this a while back, but it's lost in his pile of blog posts.
Any suggestions on how to do browser caching within an ASP.NET application? I've found some different methods online but wasn't sure which would be best. Specifically, I would like to cache my CSS and JS files. They do change, but usually only about once a month at most.
Another technique is to store your static images, CSS and JS on another server (such as a CDN) which has the Expires header set properly. The advantage of this is twofold:
The expires header will encourage browsers and proxies to cache these static files
The CDN will offload the serving of static files from your server.
Serving your static content from another domain name also lets browsers download it faster, because spreading resources across four or five different hostnames increases parallelization of downloads.
If the CDN is configured properly and uses a cookieless domain, then you don't have unnecessary cookies going back and forth.
It's worth bearing in mind that even without Cache-Control or Expires headers, most browsers will cache content such as JS and CSS. What should happen, though, is that the browser requests the resource every time it's needed but typically gets a "304 Not Modified" response, and then uses the cached item. This can still be quite costly, since it's a round trip to the server, but the resource itself isn't re-sent, so the bytes transferred are limited.
Left with no specific instructions regarding caching, IE will by default use its own heuristics to determine whether it should even bother to re-request an item it has cached - this despite never being explicitly told that it can cache the resource. Its heuristics are based on the Last-Modified date of the resource: the older it is, the less likely it is to have changed by now, goes the typical reasoning. Very woolly.
Frankly, if you want to make a site performant, you need control over such cache settings. If you don't have access to these settings, I wouldn't worry about performance; just inform the sponsor that the site may not perform well because they haven't provided a platform that lets you deliver that.
Your best bet is to set an Expires header in IIS on the folders whose content you want cached. This will tell most modern browsers and proxies to cache this static content. In IIS 6:
Right-click on the folder (for example, CSS or JS) you want the browser to cache.
Click properties
Go to the HTTP Headers Tab
Check "Enabled content expiration"
Set some long period for expiration, like "Expires after 90 days"
Yahoo Developer's Blog talks about this technique.
Unless you configure IIS to give ASP.NET control of JS/CSS/image requests, it won't see them by default. Hence your best plan (for long-term maintainability) is to deliberately tweak the response headers at your firewall/traffic manager/server, or (better, and what most of the world does at this point) to version your files in the path, i.e.:
Instead of writing this in your mark-up:
http://www.foo.com/cachingmakesmesad.css
Use this:
http://www.foo.com/cachingmakesmesad.css?v1
...and change the version number when you need to effectively clear the cache. If that's every time, then you could even append a GUID or datestamp instead, but I can't think of any occasion where I would really want to do that.
I thought your question was anti-cache but re-reading it I see I wasted a good answer there :P
Long story short, browsers are normally very aggressive about caching "simple" resources, so you shouldn't have to worry about this; but if you did want to do something about it, you would have to have access to a firewall/traffic manager/IIS for the reasons above (ASP.NET won't be given a chance by default).
However... there is no way you can absolutely force caching, and nor should you. What is and isn't cached is rightfully the decision of the end-user, all you can do is strongly request.
In .NET you can set up your JavaScript, CSS and images as embedded resources.
.NET will then handle the file expiration for you.
The downside to this approach is that you have to do a new build for each set of changes (this might be an upside, depending on your deployment and workflow).
You could also use ETags, but from what I understand, in some cases they don't work well if you have a mix of IIS and Apache web servers hosting your images (or if you plan to switch in the future).
You can also just make sure the file date is newer and let the server handle it for you, but you've got to make sure the server is configured correctly.
You can cache static content by adding the following to web.config:
<system.webServer>
<staticContent>
<clientCache httpExpires="Tue, 12 Apr 2016 00:00:00 GMT" cacheControlMode="UseExpires" />
</staticContent>
</system.webServer>
See the clientCache documentation for more details.
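If you'd rather have a rolling lifetime than a fixed expiry date, the same element also supports a max-age mode (30 days in this sketch):

<system.webServer>
  <staticContent>
    <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="30.00:00:00" />
  </staticContent>
</system.webServer>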