If, instead of using mod_deflate or mod_gzip, I manually gzipped a CSS or JS file, obtaining something like:
base.css.gz
and renamed it to:
base.css
and then loaded it from an HTML page, would it work?
This could be useful in environments with very limited memory and resources, such as wireless access points in mesh networks.
I also wanted to ask whether it would make sense to do this on normal websites to save server CPU. At the moment I use mod_deflate; as I understand it, the content gets gzipped on the fly at every request. Is that so? Isn't that a bit of a waste of resources?
I'm answering my own question since no one else addressed it.
It is possible to serve manually gzipped files, such as CSS, but they need to be served by the web server with the correct Content-Type and Content-Encoding headers; simply loading a gzipped file from a <link> or <script> tag won't work.
Here's an example with PHP:
<?php
// Serve a pre-compressed stylesheet with the headers browsers expect.
header("Content-Encoding: gzip");
header("Content-Type: text/css");
readfile("base.min.css.gz"); // stream the gzipped bytes as-is
exit(0);
// No closing PHP tag, to avoid accidentally emitting stray characters.
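If PHP isn't available, for example on a memory-constrained access point, a similar effect can be had from web server configuration alone. A minimal Apache sketch, assuming mod_headers is enabled and the pre-gzipped files keep a .css.gz suffix:
# Serve pre-gzipped stylesheets with the headers browsers expect
<FilesMatch "\.css\.gz$">
  ForceType text/css
  Header set Content-Encoding gzip
</FilesMatch>
You would then link to base.css.gz directly, or add a mod_rewrite rule that maps base.css to base.css.gz for clients whose Accept-Encoding header advertises gzip support.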
HTTP servers that are set to compress static files cache the compressed copy for you, so don't worry about wasted CPU.
The IIS documentation covers this. I'm not too up on Apache, but mod_deflate and mod_cache work together: https://serverfault.com/a/220418/7869
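For Apache, a minimal sketch of that combination, assuming mod_deflate, mod_cache, and a disk cache backend are loaded:
# mod_deflate compresses text assets on the fly
AddOutputFilterByType DEFLATE text/css application/javascript
# mod_cache stores the compressed response so it is not recompressed
# on every request
CacheEnable disk /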
How can Symfony deliver static files without bootstrapping/executing the framework?
For example: if some requests fail at the web server level (images or JS files are not found, or something like that), then the framework tries to resolve the route, which of course does not exist.
Is there a way to avoid this or blacklist these extensions?
It could be a cache problem.
If it is:
Try clearing the cache with the Symfony console command cache:clear. If that doesn't work, try removing the resources in the general web folder, leaving the originals in your bundle, and running assetic:dump and assets:install.
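For reference, assuming a Symfony 2-style project layout, those commands would look like this:
php app/console cache:clear
php app/console assets:install web
php app/console assetic:dump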
If it isn't:
Regarding the "remove-symfony-routing" thing, I don't know if it's possible, but it should not be done anyway.
What you're asking for is the ability to access, from the client side, any file on the server, which is a major security hole.
This could allow the client to get any file on the server, meaning they could get their hands on your JavaScript or PHP files, which most of the time contain valuable information (such as how your app works or, even worse, global passwords and config values).
What you could do instead is define a route pointing to a controller action that outputs the requested file to the browser, provided it has an extension you're OK with sharing. For example, you could allow any image file but forbid code files such as PHP or JavaScript.
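A minimal sketch of such an action, assuming Symfony's HttpFoundation component (the action name, extension whitelist, and upload path are hypothetical):
<?php
use Symfony\Component\HttpFoundation\BinaryFileResponse;

// Inside your controller class: serve a file only if its extension
// is on the whitelist.
public function serveAction($filename)
{
    $allowed = array('png', 'jpg', 'gif', 'css');
    $ext = strtolower(pathinfo($filename, PATHINFO_EXTENSION));
    if (!in_array($ext, $allowed, true)) {
        throw $this->createNotFoundException('File type not allowed');
    }
    // basename() strips directory components, blocking "../" traversal
    $path = $this->container->getParameter('kernel.root_dir')
          . '/../web/uploads/' . basename($filename);
    return new BinaryFileResponse($path);
}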
EDIT: Or yeah, configure your web server correctly. Two simpler answers appeared while I was typing :D
Consider my file:
Test.mxml
and the output file:
Test.swf
Each time I make changes to Test.mxml, the corresponding SWF file is generated.
But this is causing a problem with the proxy server: I can't see my changed SWF file; I'm served the cached SWF, so the changes are not reflected.
When I change the name of the generated SWF file, it works fine (I can see the new changes because the proxy server loads the newly renamed file), so I have tried versioning.
A few approaches to handle this:
It may be possible to tell your proxy not to cache this file if you have any control over it.
Sometimes people use the "random number" technique to prevent files from being cached: in the HTML page that wraps your SWF, append a random query string to the SWF location, conceptually like myswf.swf?someRandomNumber (see the sketch after this list).
Every time you deploy a new build you could change the filename.
You can also try having your browser send the no-cache headers, which causes the (WebSphere Edge) proxy server to dump its cached copy too. In Firefox, at least, Shift-Reload does this. I think that's true in IE and maybe Chrome too.
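As an illustration of the random-number technique above, a PHP-generated wrapper page might look like this (file and parameter names are hypothetical):
<?php
// The query string changes on every request, so the proxy treats
// each URL as a new, uncached resource.
$swfUrl = "myswf.swf?nocache=" . mt_rand();
?>
<embed src="<?php echo $swfUrl; ?>" type="application/x-shockwave-flash">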
The 3 entries below are from a gtmetrix.com report. How should I handle compression for these files on Amazon S3? I know how to gzip for S3, but the three files below present a more restrictive situation.
I don't have access to Mailchimp's CSS file. Is there some way to get better compression in this case?
I periodically update my Thesis theme, which changes the css.css file shown below. I can't version that file, since I need to keep the name css.css. Is there some technique to handle this scenario?
Compressing http://www.mysite.com/wp-content/thesis/skins/classic/css.css could save 20.5KiB (79% reduction)
Compressing http://cdn-images.mailchimp.com/embedcode/slim-041711.css could save 1.1KiB (60% reduction)
Compressing http://www.mysite.com/wp-includes/js/comment-reply.min.js?ver=3.5.1 could save 374B (48% reduction)
Yeah, this is a pretty common question. If you serve static files from a traditional HTTP daemon like Apache, the content is actually compressed on the fly via mod_deflate, which transparently gzips the file and sets the appropriate Content-Encoding header.
If you want to do this off of S3, you have to manually gzip the files before uploading them (normally named something like cool-stylesheet.gz.css) and then set a custom Content-Encoding property on the S3 object.
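A minimal sketch using the AWS SDK for PHP (the bucket name and region are assumptions):
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Upload a pre-gzipped stylesheet along with the headers S3 should
// send back to browsers.
$s3 = new S3Client(array('region' => 'us-east-1', 'version' => 'latest'));
$s3->putObject(array(
    'Bucket'          => 'my-bucket',
    'Key'             => 'cool-stylesheet.gz.css',
    'SourceFile'      => 'cool-stylesheet.gz.css',
    'ContentEncoding' => 'gzip',
    'ContentType'     => 'text/css',
));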
This can be tedious to do by hand, so we actually do it automatically as part of our continuous integration build process. A post-commit hook in our source control fires, executing several build steps (including this one), and then the resulting files are deployed to the proper environment.
Edit:
It seems that you meant to describe a problem with Cloudfront, not S3. Since Cloudfront is a CDN and caches files at its edge locations, you have to force it to refetch the latest version of a file when it changes. There are two ways to do this: invalidate the cache or use filename versioning.
Invalidating the cache is slow and can get really expensive. After the first 1,000 invalidation requests per month, it costs a nickel for every 10 files invalidated thereafter.
A better option is to version the filenames by appending a unique identifier before they are pulled into Cloudfront. We typically use the Unix epoch of when the file was last updated, so cool-stylesheet.gz.css becomes cool-stylesheet_1363872846.gz.css. In the HTML document you then reference it like normal:
<link rel="stylesheet" type="text/css" href="cool-stylesheet_1363872846.gz.css">
This will cause Cloudfront to refetch the updated file from your origin when a user opens the updated HTML document.
As I mentioned above regarding S3, this too is tedious to do manually: you'd have to rename all of your files and search/replace all references to them in the source HTML documents. It makes more sense to make it part of your CI build process. If you're not using a CI server, you might be able to do this with a commit hook in your source repository.
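A rough sketch of such a build step in PHP (the file names and glob pattern are assumptions):
<?php
// Version a file with its last-modified epoch and update references
// in the HTML documents that point at it.
$src       = 'cool-stylesheet.gz.css';
$epoch     = filemtime($src);
$versioned = "cool-stylesheet_{$epoch}.gz.css";
copy($src, $versioned);

foreach (glob('*.html') as $page) {
    $html = file_get_contents($page);
    file_put_contents($page, str_replace($src, $versioned, $html));
}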
We use HttpContext.RewritePath() to rewrite the path to static files. We do this to be able to virtualize the location of the static files on the server.
When we do so, IIS sometimes behaves strangely: it does not compress the content of the file but still puts gzip in the Content-Encoding response header. When this happens, browsers fail to parse the static file, since they try to decompress cleartext data.
We have looked around and found a few threads discussing this, but we could not find a satisfying answer as to how to avoid the problem, or why rewriting paths to static files should not be allowed.
Shouldn't it be possible to rewrite the path of a static file?
Here are a few pages I found about the issue:
GZip compression in IIS7 not working, but content-encoding header is set to gzip
HttpContext.RewritePath breaks built-in IIS gzip
I want to use GZIP files to reduce page load time.
I have converted all my JS files into GZIP files.
Now I want to know how to set up IIS (I am using IIS 7) so that this will work.
Second: how do I reference the GZIP files in my ASP.NET pages as well?
Thanks in advance.
I have converted all my JS files into GZIP files.
Not necessary. Just enable GZIP compression in IIS settings. This article gives details on how to do so. This SO discussion talks about how to validate that the settings were successfully applied.
Second: how do I reference the GZIP files in my ASP.NET pages as well?
Compression can be enabled for static and/or dynamic files. ASPX pages are considered a dynamic type; enabling dynamic content compression will cause them to be GZIP'd before they are served to a compatible user agent.
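A hedged sketch of the relevant web.config fragment for IIS 7, assuming the static and dynamic compression modules are installed:
<system.webServer>
  <!-- Enable compression for both static files and dynamic pages -->
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>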
To validate, you can use a tool like http://gtmetrix.com/ which will warn about uncompressed resources.