Putting a JS sourcemap in a password-protected directory - build-process

I've recently started using uglify-js to compress my JavaScript. Source maps are a nice feature for debugging but, for us, part of the benefit of compressing our JavaScript is the obfuscation.
Would putting the source map in a password-protected directory prevent a passive observer from using it to re-beautify our JavaScript? Would this have any undesirable side effects?
I'm not familiar with how and when browsers request this file. I don't want it to trigger password prompts and inconvenience users but I also don't want it to be publicly viewable.

Well, using Grunt you can set up different routines for development and production; e.g. you may find Sass comments useful in dev, but when you go to production you want all the comments stripped out. The same goes for source maps. What I like to do is test with the minified scripts to make sure everything works before the site goes live, so that my development environment is as close to the production environment as possible.
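As a rough illustration (not from the original answer), here is a minimal Gruntfile sketch using grunt-contrib-uglify, with made-up file paths, where only the dev target emits a source map:

module.exports = function (grunt) {
  grunt.initConfig({
    uglify: {
      dev: {
        options: { sourceMap: true },   // emit app.min.js.map for local debugging only
        files: { 'build/app.min.js': ['src/app.js'] }
      },
      prod: {
        options: { sourceMap: false },  // no map is generated for the live site
        files: { 'build/app.min.js': ['src/app.js'] }
      }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.registerTask('build-dev', ['uglify:dev']);
  grunt.registerTask('build-prod', ['uglify:prod']);
};

With a setup along these lines you would run the dev build locally and the prod build when publishing, so the map never reaches the server in the first place.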
Ideally, you should have a local clone of your production site in which you can bugfix, enhance etc, rather than debugging a live site.
Yes, it's possible someone could still take your JavaScript and beautify it again, but I think that without the source map they would need the (uncompressed) source files, which you wouldn't store on your website in the first place; all they'd then be left with is beautified JS with one-character variable and function names, practically useless to anyone :-)

Related

Section doesn't change after modifying script

This is a long shot, but I've hit a wall and I don't have any idea what to do about it.
There is a site with a section containing a Google Maps map with custom pins. The locations of the pins and the configuration of the map are defined in the wp-content/themes/mytheme/js/map.js file. I have to add some pins. According to the person who created the site a couple of years ago, new pins can be added by modifying the map.js file.
The problem is that nothing changes when I modify this file. Even better: I can remove this file (and all the other scripts in the "js" folder) and nothing changes. This is the only instance of the script that I found.
There isn't any caching plugin enabled.
There isn't caching on server.
This is not browser cache.
Basically it seems like, instead of loading this script, the site is taking it from a different location, but I don't have a clue where that could be. Is there anything I can do to find the source location of a script?
EDIT: I deleted the css and js folders entirely via FTP and the site still shows up in browsers. I opened Chrome DevTools and used the Network tab to see the initiator for the script, and it shows as
xxxxxxxxxxx.xx/wp-content/themes/xxxxxxxxxx/js/map.js?ver=5.2.10
But this file DOES NOT EXIST. I deleted it via FTP. Same thing happens with CSS files. It isn't browser caching because it happens on different browsers, different computers...

CSS files not updating like other files

I am currently hosting my site on JustHost (just as a test server). When I save my work on my local computer through Aptana, the files are automatically uploaded to the hosting server and they appear fine. However, this only works for my actual files like .php and .html.
It does not work for my .css files: if I save and upload them, the changes do not take effect until the next day, or until I turn my computer off and on and leave it a couple of hours. I am not sure why they are not taking effect immediately like the rest of the files.
I have tried clearing my cache and adding ?ver=1.0 to the end of the file name, but still no luck.
Also, I checked the hosting directly and the CSS file has been updated to the correct version, but it just does not show in the browser.
Any ideas on what could be wrong? It would make life much easier if I could get them updating like the other files.
Thanks
I can't be sure what is causing this, but if I'm correct the files do upload; it's not a case of them not uploading. It's one of these things:
The cache is holding it (you've already cleared it, though?)
The file is going through some odd cross-server transfer; depending on what sort of hosting you're on, the file may be getting held up somewhere.
Try clearing the DNS cache:
Start > type CMD > in the dialog type:
ipconfig /flushdns
That may force the computer to reload the file.
As for an ongoing solution to prevent it in the future, I'm out of ideas...
I know it has been a while, but as others may find this question the way I did: the solution for me was to enable Cloudflare Developer Mode. Cloudflare was keeping the CSS files in its cache, and it drove me crazy until I found the solution in another forum. I hope your case is the same as mine so that you can solve it as well.

Project hosting on Google Code. Files are cached?

I do not really understand how Google Code handles file versioning.
I am building a jQuery plugin that anyone can access. Like so:
<script type="text/javascript" src="http://jquery-old-browser-warning.googlecode.com/files/jquery.browser-warning.js"></script>
This script accesses other files on the same project (via ajax).
The problem is that when I upload a new file, it seems like there aren't any changes to it. Google recommends that new files should have new names.
But then I would have to change the filenames that the script loads.
But then I would have to change the script file as well, and that would break everybody's implementation (with the script tag above).
Is there a way to force a file to change when uploading with the same filename?
PS: If I go directly to the project page's file list, I do get the file with the updated content. But as I said, not when getting it through Ajax.
The cheapest trick in the book to prevent caching is adding some random content to a GET parameter:
www.example.com/resources/resource.js?random=1234567
You can for example use the current timestamp for this.
This, however, causes any and every access to re-fetch the content, and invalidates any client-side caching mechanism as well. I would use this only as a last resort. If Google are that stringent about caching, I'd rather develop a workflow that allows for easy renaming of files.
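As a rough sketch of that timestamp trick (the loader function below is made up; the script URL is the one from the question), it can be applied when injecting the script tag:

// Hypothetical loader that appends the current timestamp so the browser
// never reuses a cached copy of the script.
function loadFresh(src) {
  var script = document.createElement('script');
  var separator = src.indexOf('?') === -1 ? '?' : '&';
  script.src = src + separator + 'nocache=' + new Date().getTime();
  document.getElementsByTagName('head')[0].appendChild(script);
}
loadFresh('http://jquery-old-browser-warning.googlecode.com/files/jquery.browser-warning.js');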
I don't know your workflow, but maybe you can work with versioned directories?
Like so:
www.example.com/50/resources/resource.js
www.example.com/51/resources/resource.js
That would keep whatever caching the client employs intact, but whenever there's a change from your end, the browser would reload the content.
I think it's just a cache in the browsers, so when you request the file via Ajax, just add a random parameter or a version number.
For example, Stack Overflow adds a version parameter to static content, like
http://sstatic.net/so/all.css?v=6638
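Since the plugin in question is jQuery-based, a minimal sketch using jQuery's built-in cache option could look like the following (the data file name is made up); cache: false makes jQuery append a _=<timestamp> parameter to the GET request so it bypasses the cache:

$.ajax({
  // Hypothetical data file that the plugin fetches via Ajax.
  url: 'http://jquery-old-browser-warning.googlecode.com/files/plugin-data.json',
  cache: false,   // jQuery adds "_=<timestamp>" to the URL, defeating the cache
  dataType: 'json',
  success: function (data) {
    console.log(data);
  }
});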
Are you talking about uploading files to the "Downloads" area? Those should have distinct filenames; for example, they should be versioned. If you're uploading the script code, that should be committed through the version control system you're using, and should most definitely keep the same name across revisions.
Edit: your code snippet didn't show up on my page, so I misunderstood what you're trying to do. I don't imagine Google would be happy with you referencing the SVN repository every time some client page is loaded :)

Should I embed CSS/JavaScript files in a web application?

I've recently started embedding JavaScript and CSS files into our common library DLLs to make deployment and versioning a lot simpler. I was just wondering if there is any reason one might want to do the same thing with a web application, or if it's always best to just leave them as regular files in the web application, and only use embedded resources for shared components?
Would there be any advantage to embedding them?
I had to make this same decision once. The reason I chose to embed my JavaScript/CSS resources into my DLL was to prevent tampering with these files (by curious end users who've purchased my web application) once the application is deployed.
I was doubting and questioning the validity of Easement's comment about how browsers download JavaScript files. I'm pretty sure that the embedded JavaScript/CSS files are recreated temporarily by ASP.NET before the page is sent to the browser in order for the browser to be able to download and use them. I'm curious about this and I'm going to run my own tests. I'll let you know how it goes....
-Frinny
Of course, anyone who knew what they were doing could use an assembly reflector and extract the JS or CSS. But that would be a heck of a lot more work than just using something like Firebug to get at this information. A regular end user is unlikely to have the desire to go to all of this trouble just to mess with the resources. Anyone who's interested in this type of thing is likely to be a malicious user, not the end user. You've probably got a lot of other security problems if a user is able to run a tool like an assembly reflector against your DLL, because by that point your server has already been compromised. Security was not the factor in my decision to embed the resources.
The point was to keep users from doing something silly with these resources, like delete them thinking they aren't needed or otherwise tamper with them.
It's also a lot easier to package the application for deployment purposes because there are fewer files involved.
It's true that the DLL (class library) used by the pages is bigger, but this does not make the pages any bigger. ASP.NET generates the content that needs to be sent down to the client (the browser). There is no more content being sent to the client than what is needed for the page to work. I do not see how the class library helping to serve these pages will have any effect on the size of data being sent between the client and server.
However, Rjlopes has a point, it might be true that the browser is not able to cache embedded JavaScript/CSS resources. I'll have to check it out but I suspect that Rjlopes is correct: the JavaScript/CSS files will have to be downloaded each time a full-page postback is made to the server. If this proves to be true, this performance hit should be a factor in your decision.
I still haven't been able to test the performance differences between using embedded resources, .resx files, and single files because I've been busy with my own endeavors. Hopefully I'll get to it later today, because I am very curious about this and the browser caching point Rjlopes has raised.
Reason for embedding: Browsers don't download JavaScript files in parallel. You have a locking condition until the file is downloaded.
Reason against embedding: You may not need all of the JavaScript code. So you could be increasing the bandwidth/processing unnecessarily.
Regarding the browser cache: as far as I've noticed, the response for WebResource.axd says "304 Not Modified", so I guess they are being served from the cache.
You know that if somebody wants to tamper with your JS or CSS, they just have to open the assembly with Reflector, go to the resources, and edit what they want (it probably takes a lot more work if the assemblies are signed).
If you embed the JS and CSS in the page, you make the page bigger (more KB to download on each request) and the browser can't cache the JS and CSS for subsequent requests. The good news is that you have fewer requests (at least two fewer if you are like me and combine multiple JS and CSS files into one), plus scripts have the problem of being downloaded serially.

Cons of external JavaScript file over inline JavaScript

What are some of the disadvantages of using an external JS file over including the JS as a part of the ASPX page?
I need to make an architectural decision and heard from coworkers that external JS does not play nice sometimes.
The only downside that I am aware of is the extra HTTP request needed. That downside goes away as soon as the Javascript is used by two pages or the page is reloaded by the same user.
One con is that the browser can't cache the JS if it's in the page. If you reference it externally the browser will cache that file and not re-download it every time you hit a page. With it embedded it'll just add to the file-size of every page.
Also maintainability is something to keep in mind. If it's common JS it'll be a bit more of a pain to make a change when you need to update X number of HTML files' script blocks instead of one JS file.
Personally I've never run into an issue with external files vs embedded. The only time I have JS in the HTML itself is when I have something to bind on document load specifically for that page.
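To illustrate that split (the file names and form id below are made up), the shared code sits in one cacheable external file while only the page-specific wiring stays inline:

<script type="text/javascript" src="/js/common.js"></script>
<script type="text/javascript">
  // Page-specific binding only; anything reused across pages stays in common.js.
  document.getElementById('signup-form').onsubmit = function () {
    return confirm('Submit the form?');
  };
</script>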
Caching is both a pro and potentially a con, if you are not handling it properly.
The pro is obvious, as it will improve page loading on every page load past the first one.
The con is that when you release new code, it may still be cached by the user's browser, so they may not get the update. This can easily be solved by changing the name of your JS file. We automatically version our JS with the file's timestamp, and then make sure the web request points to the correct file through configuration on our web server (mod_rewrite on Apache).
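A rough sketch of that timestamp-versioning idea as a small Node build step (the file paths are made up, and this variant uses a query-string version rather than rewriting the filename as the answer describes):

// Derive a cache-busting version from the JS file's last-modified time.
var fs = require('fs');
var mtime = Math.floor(fs.statSync('public/js/site.js').mtimeMs / 1000);
// Emit the versioned script tag for templates to include.
var scriptTag = '<script src="/js/site.js?v=' + mtime + '"></script>';
fs.writeFileSync('public/js/version.txt', String(mtime));
console.log(scriptTag);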
Ask them to define "play nice". Aside from better logical organization, external js files don't have to be transmitted when already cached.
We use YUI compressor to automatically minify and combine external scripts into one when doing production/staging builds.
The only disadvantage I know is that another request must be made to the server in order to retrieve the external JS file. As was said before me you can use tools like YUI compressor to minimize the effects of this.
The advantage however would be that you can keep all of your JS code in a separate more maintainable format.
Another huge advantage of external JavaScript is the ability to check your syntax with JSLint. That, added to the ability to minify, combine and cache external scripts, makes inline JavaScript seem like a poor choice.

Resources