Is there a way to use the new standard link[rel=preload] with yo webapp so that grunt serve will load the CSS file?
Example:
<link rel="preload" href="path" as="style" onload="this.rel='stylesheet'">
I assume that by "so grunt serve will load the CSS file" you mean that the HTTP server started by the grunt task will preemptively serve (push) the CSS file in addition to serving the HTML file, thereby saving time?
Unfortunately, getting this to work is not currently as trivial as setting up grunt serve in the right way. Push is an HTTP/2 feature, and the server used by grunt serve by default is the stock Node one, which is HTTP/1.1 only.
If you feel like helping yourself, and also contributing back to the community, you could author a project that wraps or forks 'grunt-serve' and replaces require('http') with an HTTP/2-capable server, while also having either a server-side parser that examines the pages served looking for 'preload' attributes, or some other way of signifying to your plugin which files need to be served alongside each page.
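As a rough illustration only (not wired into grunt-serve), here is a minimal sketch of what such a push-capable server could look like, assuming Node's built-in http2 module; the certificate paths, file paths, and port are placeholders:

const http2 = require('http2');
const fs = require('fs');

// minimal sketch, assuming Node's built-in http2 module; key/cert, file paths and port are placeholders
const server = http2.createSecureServer({
  key: fs.readFileSync('server.key'),
  cert: fs.readFileSync('server.crt')
});

server.on('stream', (stream, headers) => {
  if (headers[':path'] === '/') {
    // push the stylesheet alongside the page so the browser already has it when it parses the HTML
    stream.pushStream({ ':path': '/styles.css' }, (err, pushStream) => {
      if (err) throw err;
      pushStream.respondWithFile('app/styles.css', { 'content-type': 'text/css' });
    });
    stream.respondWithFile('app/index.html', { 'content-type': 'text/html; charset=utf-8' });
  }
});

server.listen(8443);

You would still have to wire this into the grunt task and decide which files to push for which pages (for example, by scanning the served pages for 'preload' attributes as described above).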
I don't know of any browsers that actually support link[rel=preload] as of now, nor do I even see it listed on caniuse.com.
This is really down to the browser implementation rather than something controlled by any application or server code, so unfortunately, unless you're writing a browser, I think the answer is no, there is no way to use the new standard. We'll just need to wait for it to be adopted and implemented by the browsers.
Related
I'm generating dynamic CSS URLs for cache-busting. I.e. they're in the format styles-thisisthecontenthash123.css.
I also want to use HTTP Link headers to load the files slightly faster. I.e. have the header Link: <styles-thisisthecontenthash123.css>; rel=stylesheet
I'm pretty sure it's possible to do this in Fastly using VCL, but I'm not familiar enough with the ecosystem to figure it out. The CSS URL is in index.html, which is cached. I'm thinking I can open index.html and maybe use regex to parse out the CSS URL. How would I do this?
If I'm understanding your question correctly, you want to include a Link header on all requests for index.html. You can do that with Fastly, but if the URL for the CSS file is changing, you're not going to be able to pull that info out with VCL (you can't inspect the response body).
You could use edge dictionaries and, whenever your CSS filename changes, update the reference via the API (see the sketch below).
Thing is, if you're going to make an API call whenever the file changes, you might as well just keep the filename consistent (styles.css) and, whenever you publish a new version, send a cache invalidation (purge). Fastly will clear the cache in ~150ms, so then all you have to do is add the header, which can be done in the Fastly web portal with a condition.
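If you do go the edge dictionary route anyway, the API call itself is small. This is a rough sketch only (Node 18+ for built-in fetch); the service ID, dictionary ID, and the css_file item key are placeholder names, and your VCL would read that item when setting the Link header:

// rough sketch: update a Fastly edge dictionary item after each deploy
// SERVICE_ID, DICTIONARY_ID and the "css_file" key are placeholders for illustration
const FASTLY_KEY = process.env.FASTLY_KEY;
const SERVICE_ID = 'your-service-id';
const DICTIONARY_ID = 'your-dictionary-id';

async function updateCssReference(newFilename) {
  const res = await fetch(
    `https://api.fastly.com/service/${SERVICE_ID}/dictionary/${DICTIONARY_ID}/item/css_file`,
    {
      method: 'PUT',
      headers: { 'Fastly-Key': FASTLY_KEY, 'Content-Type': 'application/x-www-form-urlencoded' },
      body: new URLSearchParams({ item_value: newFilename })
    }
  );
  if (!res.ok) throw new Error('Fastly API returned ' + res.status);
}

updateCssReference('styles-thisisthecontenthash123.css').catch(console.error);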
I'm working on a .NET/ASP project and my responsibility is to work on the design part of the application only (mostly changing CSS, JS, images, and cshtml files).
I'm working directly on the server, so my app is not running in Visual Studio, locally, or in any environment where I can rebuild the app.
That being said, any changes I make to the website take about 45 minutes before they show up (I do clear the browser cache every time as well).
Is there any way I can manually clear the application cache or rebuild it on the server so my changes start showing immediately?
This is something I added to the web.config, but it's still not helping:
<caching>
<outputCacheSettings enableOutputCache="false"/>
</caching>
Well, if I'm not mistaken, the cache you're targeting does not apply to these kinds of resources, as they're simply not processed by .NET. This output cache is rather about the final rendered HTML, so I don't think that's where you should be looking. But there may be some kind of proxy somewhere caching those resources as well.
To avoid client/server caching problems with CSS and JavaScript, I usually add a timestamp query string to every request so those resources aren't cached client-side. It should also bypass any "server caching" of those resources, and I would advise testing it manually before putting an automatic solution in place. So, if you're including a JS file this way:
<script src="/mypath/myscript.js"></script>
you could just do this:
<script src="/mypath/myscript.js?123"></script>
and see if you now get the changes to the file immediately. If so, just automate adding that number (ideally a timestamp, so it's always different on each request) to every JavaScript/CSS inclusion you make...
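If you only have access to the markup and want a quick way to try this before automating it properly, a tiny client-side helper is one option. This is just a sketch; loadFresh is a made-up name for illustration, and the cleaner long-term fix is to append the value server-side where the includes are generated (e.g. in the cshtml):

<script>
  // sketch: append the current time to each include so the browser (and any proxy) never reuses a stale copy
  // "loadFresh" is a made-up helper name for illustration
  function loadFresh(src) {
    if (/\.css(\?|$)/.test(src)) {
      document.write('<link rel="stylesheet" href="' + src + '?' + Date.now() + '">');
    } else {
      document.write('<script src="' + src + '?' + Date.now() + '"><\/script>');
    }
  }
  loadFresh('/mypath/myscript.js');
  loadFresh('/mypath/mystyles.css');
</script>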
Dave Ward says,
It’s not exactly light reading, but section 4.2 of RFC 3986 provides for fully qualified URLs that omit protocol (the HTTP or HTTPS) altogether. When a URL’s protocol is omitted, the browser uses the underlying document’s protocol instead.
Put simply, these “protocol-less” URLs allow a reference like this to work in every browser you’ll try it in:
//ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js
It looks strange at first, but this “protocol-less” URL is the best way to reference third party content that’s available via both HTTP and HTTPS.
This would certainly solve a bunch of mixed-content errors we're seeing on HTTP pages -- assuming that our assets are available via both HTTP and HTTPS.
Is this completely cross-browser compatible? Are there any other caveats?
I tested it thoroughly before publishing. Of all the browsers available to test against on Browsershots, I could only find one that did not handle the protocol relative URL correctly: an obscure *nix browser called Dillo.
There are two drawbacks I've received feedback about:
Protocol-less URLs may not work as expected when you "open" a local file in your browser, because the page's base protocol will be file:///, which is especially a problem when the protocol-less URL points to an external resource like a CDN-hosted asset. Using a local web server like Apache or IIS to test against http://localhost addresses works fine, though.
Apparently there's at least one iPhone feed reader app that does not handle the protocol-less URLs correctly. I'm not aware of which one has the problem or how popular it is. For hosting a JavaScript file, that's not a big problem since RSS readers typically ignore JavaScript content anyway. However, it could be an issue if you're using these URLs for media like images inside content that needs to be syndicated via RSS (though, this single reader app on a single platform probably accounts for a very marginal number of readers).
The question of whether one could change all their links to be protocol-relative may be moot, considering the question of whether one should do so. According to Paul Irish:
2014.12.17: Now that SSL is encouraged for everyone and doesn't have performance concerns, this technique is now an anti-pattern. If the asset you need is available on SSL, then always use the https:// asset.
If you use protocol-less URLs to load stylesheets, IE 7 & 8 will download them twice:
http://www.stevesouders.com/blog/2010/02/10/5a-missing-schema-double-download/
So, this is to be avoided for CSS if you like good performance.
Yes, network-path references were already specified in RFC 1808 and should work with all browsers.
Is this completely cross-browser compatible? Are there any other caveats?
Just to throw this in the mix, if you are developing on a local server, it might not work. You need to specify a scheme, otherwise the browser may assume that src="//cdn.example.com/js_file.js" is src="file://cdn.example.com/js_file.js", which will break since you're not hosting this resource locally.
Microsoft Internet Explorer seems to be particularly sensitive to this; see this question: Not able to load jQuery in Internet Explorer on localhost (WAMP)
You would probably always prefer a solution that works in all your environments with the fewest modifications needed.
The solution used by HTML5Boilerplate is to have a fallback when the resource is not loaded correctly, but that only works if you incorporate a check:
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"></script>
<!-- If jQuery is not defined, something went wrong and we'll load the local file -->
<script>window.jQuery || document.write('<script src="js/vendor/jquery-1.10.2.min.js"><\/script>')</script>
I posted this answer here as well.
UPDATE: HTML5Boilerplate now uses <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js"> after deciding to deprecate protocol relative URLs, see here.
If you would like to make sure all requests are upgraded to a secure protocol, there is a simple option: the Content-Security-Policy header's upgrade-insecure-requests directive:
Content-Security-Policy: upgrade-insecure-requests;
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Content-Security-Policy/upgrade-insecure-requests
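If you can't change server configuration, the same directive can also be delivered from the markup with the equivalent meta tag:

<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">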
I have not had these issues when using ://example.com - but you do need to add the colon at the beginning. Yoast had a good write-up about this a while back, but it's lost in his pile of blog posts.
I am building an AJAX-intensive web application (using ASP.NET, jQuery, and WCF web services) and am looking into building an HTTP handler that handles script combining and compression for my JavaScript files and my CSS files. I know not combining the scripts is generally a less preferred approach, and I'm sure it's probably the way I will end up going, but my question is this...
Since so many of my JS files are there because of the controls I use, don't they get cached by the browser after the first load anyway? Since so many of these controls can be found on many of the pages of my web application, is it actually faster to combine all of my scripts and serve that one file (which will vary for every page), or to serve the individual files, which will get cached? I guess what I'm getting at is: by enabling script combining, am I now losing part of the caching ability of the browser? I know I can cache the combined script, but the combined script will be different for every page, whereas with the individual control scripts each one will be cached and the number of new scripts will be minimal for each page call.
Does this make any sense? Thoughts?
The fewer JS files you serve, the faster your pages will be, due to a smaller number of round trips to the server. I would manually put all the common JS code into one file (or as few files as possible), all the CSS code into one file, etc., and not worry about using a handler to combine the files. The handler is going to take processing time to combine the files, so you are going to pay that penalty as well. You can turn on gzip compression on IIS and have it handle that for you. I would run something like YUI Compressor on the JavaScript files used in production.
If the handler changes the file contents from page to page, browsers won't be able to cache it. If you are using SSL this point will be moot though as the browser won't cache the files anyway.
EDIT
I've been corrected: some browsers (like Firefox) can cache SSL content, but not all.
As others mentioned: minify, gzip, and turn on caching (set an expire time and make sure you support ETags) for the one static JS file and the one static CSS file you have. On top of this, it's recommended to load your CSS files as early as possible and your JS files as late as possible (JS file loading blocks other downloads, and it's faster for the browser to render the page if it gets the CSS as soon as possible). Sprites also help if you have many small images/icons. Loading static content from subdomains helps the browser make more simultaneous downloads, and you could drop all your cookies for those subdomains to lower the HTTP header size.
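As a rough illustration of that ordering advice (static.example.com here is just a placeholder for a cookie-less subdomain):

<head>
  <!-- CSS as early as possible so the page can start rendering without waiting -->
  <link rel="stylesheet" href="https://static.example.com/css/site.min.css">
</head>
<body>
  ...
  <!-- JS as late as possible so it doesn't block the rest of the page -->
  <script src="https://static.example.com/js/site.min.js"></script>
</body>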
You could consult YSlow for performance analysis, it's a great tool!
"generally a less preferred"
is it for live JS? I'm thinking of JavaScriptMVC, which compresses all the code into one file when compiled for production, not development... It's heavyweight, I believe.
Usually it's better to combine all scripts; that way you'll reduce HTTP overhead. Minified control scripts are usually quite small. In the rare case when you are using a quite large control, you could leave it out of the combined main JS.
What are some of the disadvantages of using an external JS file over including the JS as a part of the ASPX page?
I need to make an architectural decision and heard from coworkers that external JS does not play nice sometimes.
The only downside that I am aware of is the extra HTTP request needed. That downside goes away as soon as the Javascript is used by two pages or the page is reloaded by the same user.
One con is that the browser can't cache the JS if it's in the page. If you reference it externally the browser will cache that file and not re-download it every time you hit a page. With it embedded it'll just add to the file-size of every page.
Also maintainability is something to keep in mind. If it's common JS it'll be a bit more of a pain to make a change when you need to update X number of HTML files' script blocks instead of one JS file.
Personally I've never run into an issue with external files vs embedded. The only time I have JS in the HTML itself is when I have something to bind on document load specifically for that page.
Caching is both a pro and potentially a con, if you are not handling it properly.
The pro is obvious, as it will improve page loading on every page load past the first one.
The con is that when you release new code, it may still be cached by the user's browser, so they may not get the update. This can easily be solved by changing the name of your JS file. We automatically version our JS with the file's timestamp, and then make sure that points to the correct file in the web request through configuration on our web server (mod_rewrite, Apache).
Ask them to define "play nice". Aside from better logical organization, external js files don't have to be transmitted when already cached.
We use YUI compressor to automatically minify and combine external scripts into one when doing production/staging builds.
The only disadvantage I know of is that another request must be made to the server in order to retrieve the external JS file. As was said before me, you can use tools like YUI Compressor to minimize the effects of this.
The advantage however would be that you can keep all of your JS code in a separate more maintainable format.
Another huge advantage of external JavaScript is the ability to check your syntax with JSLint. That, added to the ability to minify, combine, and cache external scripts, makes internal JavaScript seem like a poor choice.