Google Tag Manager (GTM) minifies all tags and snippets and serves them minified, which is good. But the JavaScript that loads the tags is itself not minified.
For example: https://www.googletagmanager.com/gtm.js?id=GTM-WPGCQNW
// Copyright 2012 Google Inc. All rights reserved.
(function(w,g){w[g]=w[g]||{};w[g].e=function(s){return eval(s);};})(window,'google_tag_manager');(function(){
var data = {
"resource": {
"version":"137",
"macros":[{
"function":"__jsm",
"vtp_javascript":["template","(function(){var a=new Date(document.querySelector('meta[name\\x3d\"article_date_original\"]').content);return a.toISOString()})();"]
...
Here you can see that the overall JavaScript is not minified, even though the contents of each line are. My question: how can I link to a fully minified version? Does Google offer this as well?
Google does not offer this.
Since the GTM file does not need a backchannel (it is pure JavaScript), you could download it to your server, minify it there, and link the resulting file in your website (by "download" I mean something like a proxy that downloads and minifies on the fly, or at prescribed intervals, so that you always serve the latest version of the GTM file).
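For illustration, a minimal sketch of such a proxy in Node.js, assuming Node 18+ (for the global fetch) and the uglify-js package; the container ID and port are placeholders:
// sketch: fetch the current gtm.js, minify it further, serve the result
const http = require("http");
const UglifyJS = require("uglify-js");
const GTM_URL = "https://www.googletagmanager.com/gtm.js?id=GTM-WPGCQNW";
http.createServer(async (req, res) => {
  const source = await (await fetch(GTM_URL)).text(); // always fetch the latest version
  const result = UglifyJS.minify(source);
  res.writeHead(200, { "Content-Type": "application/javascript" });
  res.end(result.error ? source : result.code); // fall back to Google's original on error
}).listen(8080);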
It is of course possible that further minification will break the file. Also, since the file is delivered gzipped to the browser, and a bunch of spaces compress pretty well, it is unlikely that further minification will have a big effect (you would need to make sure that your own server gzips the file again before it is served from your site, or you will just make things worse).
I do not think this is actually a worthwhile idea, but it is basically the only way to minify the file beyond what Google does for you.
Related
What tool/editor do you recommend to (live) test your local CSS changes against an externally hosted site?
The site lives on domain.test (no server access), and I need to write extensive CSS overrides to reskin the entire site. All changes will be in a single CSS file, with no preprocessor.
The ideal setup would let me work in the comfort of my regular code editor (Visual Studio Code), with the site open in a browser and the CSS auto-refreshing as I save my changes to a local directory.
It's a big site, so I'm open to a complex setup rather than relying on testing edits in the browser's inspect mode or mounting CSS files in Firefox.
Note: I can inject a JS script/library into the site if it helps with the setup.
Bonus: if I can do that for vanilla JS too.
I would do it as follows with ModHeader and ScriptAutoRunner (Chrome extensions).
With ModHeader, redirect the site's CSS and JS to local files using the "Redirect URL" function (if necessary, use ScriptAutoRunner to inject JS).
In one of these JS files, set up a poll that downloads your local files every 2 seconds via AJAX (or any other way) and checks whether the response equals the previous one. If a file has changed, reload the page to pick up the update, as in the sketch below.
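A rough sketch of that poll (the local URL and the 2-second interval are assumptions; adjust them to your setup):
// poll the local CSS file and reload the page when it changes
var cssUrl = "http://localhost:8080/overrides.css";
var lastCss = null;
setInterval(function () {
  fetch(cssUrl + "?t=" + Date.now()) // cache-buster so we always get fresh content
    .then(function (res) { return res.text(); })
    .then(function (css) {
      if (lastCss !== null && css !== lastCss) location.reload();
      lastCss = css;
    });
}, 2000);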
I know there are tools that show CSS changes in real time without you having to reload, but I don't use them, and with the approach above the reload also kicks in when the JS is updated.
Whenever you reference something in your public or assets directory via link/script tags, are you only downloading what you reference?
For example, say I have an enormous amount of images in my public directory. Only images that are referenced on that particular page are downloaded, right?
Taking this further, say your stylesheet references a lot of images and it serves every single page on your website. Are all of those images downloaded?
You see, I'm probably lacking some fundamental background on this as I'm pretty new to web dev, and I don't have much experience with nginx/Apache or the like. Please explain!
Yes. The browser has no idea what files are located on the server; it only knows the paths that are referenced in the HTML (via <link/>, <script/>, <img/>, <a/>, etc.).
You can also have a look at the so-called access log of your web server or (better) your browser's developer tools (Firebug, Chrome DevTools) to investigate what happens under the hood.
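For example, if /assets held hundreds of images, a page like this would only trigger requests for the two files it actually references (the paths are made up):
<link rel="stylesheet" href="/assets/site.css">
<img src="/assets/logo.png" alt="logo">
<!-- nothing else in /assets is requested unless a loaded file references it -->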
I've recently started using uglify-js to compress my JavaScript. Source maps are a nice feature for debugging but, for us, part of the benefit of compressing our JavaScript is the obfuscation.
Would putting the source map in a password-protected directory prevent a passive observer from using it to re-beautify our JavaScript? Would this have any undesirable side effects?
I'm not familiar with how and when browsers request this file. I don't want it to trigger password prompts and inconvenience users, but I also don't want it to be publicly viewable.
Well, using Grunt you can set up different routines for development and production; e.g. you may find Sass comments useful in development, but for production you want all the comments stripped out. The same goes for source maps. What I like to do is test with minified scripts to make sure everything works before the site goes live, so that my development environment is as close to the production environment as possible.
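As a sketch, a Gruntfile using grunt-contrib-uglify could emit a source map only for the dev target (the file paths here are assumptions):
module.exports = function (grunt) {
  grunt.initConfig({
    uglify: {
      dev: { options: { sourceMap: true }, files: { "dist/app.min.js": "src/app.js" } },
      prod: { options: { sourceMap: false }, files: { "dist/app.min.js": "src/app.js" } }
    }
  });
  grunt.loadNpmTasks("grunt-contrib-uglify");
  // run "grunt uglify:dev" while developing, "grunt uglify:prod" for the live build
};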
Ideally, you should have a local clone of your production site in which you can bugfix, enhance etc, rather than debugging a live site.
Yes, it's possible someone can still take your JavaScript and beautify it again, but I think they would need the (uncompressed) source files, which you wouldn't store on your website in the first place; all they'd then be left with is beautified JS with one-character variable and function names, practically useless to anyone :-)
I do not really understand how Google Code handles file versioning.
I am building a jQuery plugin that anyone can access. Like so:
<script type="text/javascript" src="http://jquery-old-browser-warning.googlecode.com/files/jquery.browser-warning.js"></script>
This script accesses other files in the same project (via AJAX).
The problem is that when I upload a new file, it just seems like there aren't any changes to it. Google recommends that new files should have new names.
But then I would have to change the filenames that the script loads.
But then I would have to change the script file as well, and that would break everybody's implementation (with the script tag above).
Is there a way to force a file to change when uploading with the same filename?
PS: If I go directly to the project page's file list, I do get the file with the updated content. But as I said, not when fetching it through AJAX.
The cheapest trick in the book to prevent caching is adding some random content to a GET parameter:
www.example.com/resources/resource.js?random=1234567
You can for example use the current timestamp for this.
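For example, assuming jQuery is available (the plugin in question loads its files via AJAX anyway):
// append the current timestamp so every request looks unique to the cache
$.getScript("http://www.example.com/resources/resource.js?random=" + new Date().getTime());
// or let jQuery add the timestamp itself: cache: false appends _=<timestamp>
$.ajax({ url: "http://www.example.com/resources/resource.js", dataType: "script", cache: false });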
This, however, causes any and every access to re-fetch the content, and invalidates any client-side caching mechanism as well. I would use this only as a last resort. If Google are that stringent about caching, I'd rather develop a workflow that allows for easy renaming of files.
I don't know your workflow, but maybe you can work with versioned directories?
Like so:
www.example.com/50/resources/resource.js
www.example.com/51/resources/resource.js
That would keep whatever caching the client employs intact, but whenever there's a change on your end, the browser would reload the content.
I think it's just the browser cache, so when you request the file via AJAX, just add a random parameter or a version number.
For example, Stack Overflow adds a version parameter to static content:
http://sstatic.net/so/all.css?v=6638
Are you talking about uploading files to the "Downloads" area? Those should have distinct filenames; for example, they should be versioned. If you're uploading the script code, that should be submitted via the version control system you're using, and should most definitely keep the same name across revisions.
Edit: your code snippet didn't show up on my page, so I misunderstood what you're trying to do. I don't imagine Google would be happy with you referencing the SVN repository every time some client page is loaded :)
What are some of the disadvantages of using an external JS file over including the JS as a part of the ASPX page?
I need to make an architectural decision and heard from coworkers that external JS does not play nice sometimes.
The only downside that I am aware of is the extra HTTP request needed. That downside goes away as soon as the JavaScript is used by two pages or the page is reloaded by the same user.
One con is that the browser can't cache the JS if it's in the page. If you reference it externally the browser will cache that file and not re-download it every time you hit a page. With it embedded it'll just add to the file-size of every page.
Also, maintainability is something to keep in mind. If it's common JS, it'll be a bit more of a pain to make a change when you need to update X number of HTML files' script blocks instead of one JS file.
Personally I've never run into an issue with external files vs embedded. The only time I have JS in the HTML itself is when I have something to bind on document load specifically for that page.
Caching is both a pro and potentially a con, if you are not handling it properly.
The pro is obvious, as it will improve page loading on every page load past the first one.
The con is that when you release new code, it may still be cached by the user's browser, so they may not get the update. This can easily be solved by changing the name of your JS file. We automatically version our JS with the file's timestamp, and then make sure the web request points to the correct file through configuration on our web server (mod_rewrite, Apache).
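A rough sketch of that kind of timestamp versioning (the paths and the rewrite rule are illustrative assumptions, not our exact setup):
// compute the versioned script name from the file's last-modified timestamp
var fs = require("fs");
var stamp = fs.statSync("public/js/app.js").mtime.getTime();
var scriptUrl = "/js/app." + stamp + ".js"; // e.g. /js/app.1262304000000.js, embedded in the page
// an Apache mod_rewrite rule then maps the versioned name back to the real file:
//   RewriteRule ^js/app\.\d+\.js$ /js/app.js [L]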
Ask them to define "play nice". Aside from better logical organization, external JS files don't have to be transmitted when already cached.
We use YUI compressor to automatically minify and combine external scripts into one when doing production/staging builds.
The only disadvantage I know of is that another request must be made to the server in order to retrieve the external JS file. As was said before me, you can use tools like YUI Compressor to minimize the effects of this.
The advantage however would be that you can keep all of your JS code in a separate more maintainable format.
Another huge advantage of external JavaScript is the ability to check your syntax with JSLint. That, added to the ability to minify, combine, and cache external scripts, makes inline JavaScript seem like a poor choice.