As part of static file caching in my application, I am using the clientCache feature supported by IIS 7.5. But I would like to invalidate the client-side copies of the files when performing new deployments, to ensure that the stale files are removed.
cacheControlMaxAge seems to be absolute. I want to invalidate the files one time (during deployment) and have them cached after that. Is there a recommended way to do this?
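For reference, the relevant configuration looks something like this (the max-age value here is only an example):

    <system.webServer>
      <staticContent>
        <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="7.00:00:00" />
      </staticContent>
    </system.webServer>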
It depends on which files you are trying to invalidate. One solution could be to generate some sort of fingerprint and append it as a query string to your files. For example:
<link rel="stylesheet" href="mystyle.css?fp=hash_generated_from_file" >
When you change the file, you recreate the hash and append the new value. Mads Kristensen wrote an article, Cache busting in ASP.NET, which should answer your question. Rather than copy-pasting his article here, I suggest you look at how he does it.
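The underlying idea is simple. Below is a minimal sketch of a fingerprint helper (this is not Mads' actual code; the class name and the choice of MD5 are just illustrative): hash the file's contents and append the result as a query string.

    using System;
    using System.IO;
    using System.Security.Cryptography;
    using System.Web;

    public static class Fingerprint
    {
        // Returns something like "/mystyle.css?fp=<md5-of-file-contents>".
        public static string Tag(string virtualPath)
        {
            string physicalPath = HttpContext.Current.Server.MapPath(virtualPath);
            using (var md5 = MD5.Create())
            using (var stream = File.OpenRead(physicalPath))
            {
                string hash = BitConverter.ToString(md5.ComputeHash(stream)).Replace("-", "").ToLowerInvariant();
                return VirtualPathUtility.ToAbsolute(virtualPath) + "?fp=" + hash;
            }
        }
    }

In your markup you would then write <link rel="stylesheet" href="<%= Fingerprint.Tag("~/mystyle.css") %>" />. In practice you would cache the computed hash per file rather than re-hashing on every request, which is essentially what the article describes.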
How can Symfony deliver static files without bootstrapping/executing the framework?
For example: if some requests fail at the webserver level (images or JS files are not found, or something like that), then the framework tries to resolve the route. Of course, the route does not exist.
Is there a way to avoid this or blacklist these extensions?
It could be a cache problem.
If it is:
You could try to clear the cache from the Symfony console with cache:clear. If that doesn't work, you could try removing the resources in the general folder (leaving the original ones in your bundle) and running assetic:dump and assets:install.
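For reference, the console commands would look something like this (assuming a Symfony 2-style app/console; adjust the environment and target directory to your setup):

    php app/console cache:clear --env=prod
    php app/console assets:install web
    php app/console assetic:dump --env=prod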
If it isn't:
Regarding the "remove-symfony-routing" thing, I don't know if it's possible, but it should not be done anyway.
What you're asking is to be able to access, from the client side, any file on the server, which constitutes a major security breach.
This could allow the client to get any file on the server, meaning they could get their hands on your JavaScript or PHP files, which most of the time contain valuable information (such as how your app works or, even worse, global passwords and config values).
What you could do to give the client access to resources is a route that points to a controller action that outputs the requested file to the browser, provided it has an extension you're OK with sharing. For example, you could allow any image file but forbid code files such as PHP or JavaScript.
EDIT: Or yes, configure your webserver correctly. Two simpler answers appeared while I was typing :D
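For what it's worth, "configuring the webserver correctly" here mostly means letting the webserver serve files that physically exist and handing everything else to Symfony's front controller. With Apache that is roughly what the stock .htaccess does (the front controller is assumed to be app.php):

    # Only rewrite to the front controller when the requested file does not exist on disk;
    # real files (images, CSS, JS) are then served directly by Apache.
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteRule ^(.*)$ app.php [QSA,L]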
The three entries below are from a gtmetrix.com report. How should I handle compression for these files with Amazon S3? I know how to do gzip for S3, but the three files below present a more restrictive situation.
I don't have access to MailChimp's CSS file. Is there some way to get better compression performance in this case?
I periodically update my Thesis theme, which will change the css.css file shown below. I can't version that file, since I need to keep the name css.css. Is there some technique to handle this scenario?
Compressing http://www.mysite.com/wp-content/thesis/skins/classic/css.css could save 20.5KiB (79% reduction)
Compressing http://cdn-images.mailchimp.com/embedcode/slim-041711.css could save 1.1KiB (60% reduction)
Compressing http://www.mysite.com/wp-includes/js/comment-reply.min.js?ver=3.5.1 could save 374B (48% reduction)
Yeah, this is a pretty common question. If you serve static files from a traditional HTTP daemon like Apache, the content is actually compressed on the fly via mod_deflate: it transparently gzips the file and sets the appropriate Content-Encoding header.
If you want to do this off of S3, you have to manually gzip the files before uploading them (normally named something like cool-stylesheet.gz.css) and then set a custom Content-Encoding property on the S3 object.
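A rough sketch of that flow, using System.IO.Compression for the gzip step and the AWS SDK for .NET for the upload (the bucket and file names are made up, and the exact SDK property names may vary between SDK versions):

    using System.IO;
    using System.IO.Compression;
    using Amazon.S3;
    using Amazon.S3.Model;

    class GzipToS3
    {
        static void Main()
        {
            // 1. Gzip the stylesheet on disk: cool-stylesheet.css -> cool-stylesheet.gz.css
            using (var source = File.OpenRead("cool-stylesheet.css"))
            using (var target = File.Create("cool-stylesheet.gz.css"))
            using (var gzip = new GZipStream(target, CompressionMode.Compress))
            {
                source.CopyTo(gzip);
            }

            // 2. Upload it, telling S3 (and therefore browsers) that the body is gzipped.
            using (var client = new AmazonS3Client())
            {
                var request = new PutObjectRequest
                {
                    BucketName = "my-bucket",
                    Key = "cool-stylesheet.gz.css",
                    FilePath = "cool-stylesheet.gz.css",
                    ContentType = "text/css"
                };
                request.Headers.ContentEncoding = "gzip";
                client.PutObject(request);
            }
        }
    }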
This can be tedious to do by hand, so we actually do it automatically as part of our continuous integration build process. A post-commit hook in our source control fires, executing several build steps (including this one), and then the resulting files are deployed to the proper environment.
Edit:
It seems that you meant to describe a problem with CloudFront, not S3. Since CloudFront is a CDN and it caches files at its edge locations, you have to force it to refetch the latest version of a file when it changes. There are two ways to do this: invalidate the cache or use filename versioning.
Invalidating the cache is slow and can get really expensive. After the first 1,000 invalidation requests per month, it costs a nickel for every 10 files invalidated thereafter.
A better option is to version the filenames by appending a unique identifier before they are pulled into CloudFront. We typically use the Unix epoch of when the file was last updated, so cool-stylesheet.gz.css becomes cool-stylesheet_1363872846.gz.css. In the HTML document you then reference it like normal: <link rel="stylesheet" type="text/css" href="cool-stylesheet_1363872846.gz.css"> This will cause CloudFront to refetch the updated file from your origin when a user opens the updated HTML document.
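A tiny sketch of that renaming step (the file name is illustrative):

    using System;
    using System.IO;

    class Versioner
    {
        static void Main()
        {
            // Derive a versioned name from the file's last-modified time (Unix epoch seconds).
            var file = new FileInfo("cool-stylesheet.gz.css");
            var epochStart = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
            long epoch = (long)(file.LastWriteTimeUtc - epochStart).TotalSeconds;
            string versionedName = "cool-stylesheet_" + epoch + ".gz.css";
            file.CopyTo(versionedName, true); // keep the original; deploy the versioned copy
        }
    }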
As I mentioned above regarding S3, this too is a tedious thing to do manually: you'd have to rename all of your files and search/replace all references to them in the source HTML documents. It makes more sense to make it part of your CI build process. If you're not using a CI server, though, you might be able to do this with a commit hook in your source repository.
I have an offline process that needs to do some analysis on User-Agent strings that are logged from requests on our production machines. The problem is that I need to use the .browser file that has the required filters to parse and recognize the browsers from the user-agent. The only way I know to use that file is to have an ASP.NET website and place the .browser file under the App_Browsers folder. But given the nature of the offline process, I can't host it in a website, and constructing an HttpWebRequest object for each record just to read the HttpBrowserCapabilities feels like overkill.
So is there any other way to consume the .browser file and read the HttpBrowserCapabilities matching a user-agent?
Note: I found a similar question that is about two years old and didn't get much traction, so I thought maybe things have changed since then.
We are creating a Tapestry 5 webapp, with an external designer creating and maintaining the application's CSS files.
We would like him to be able to make changes to the CSS files without needing to change the webapp, preferably at a configurable path in the filesystem.
So what would be the best way to do this with Tapestry 5?
There is a JIRA issue for the ability to use a filesystem asset. Someone has posted patches that should let you do it, but it hasn't made it into a release yet. If you go that route, you could use @IncludeStyleSheet(value={"file:path_to_css_file"}) in your layout template.
An alternative way would be to stream it using a method like this one. The last paragraph suggests that you can include a streamed response in your template, so in this case you could do <link rel="stylesheet" type="text/css" href="${externalStylesheet}"/>. Then create a streamed response that reads the stylesheet from a known path on the server. Or you could store it in a blob in the database and stream it from there; that way you could also create a page to let the designer upload new versions.
I have a web page with an ASP.NET FileUpload control to upload files from the client machine to the server. Now I want to do the upload n number of times. For example: I want to upload 100 files from my local PC to the server. I can read the 100 file names from an Excel file in my program. But is there any way to assign these files to the FileUpload control?
No. As a security feature, FileUpload controls do not allow you to set what to upload (imagine if you signed on to a website and it was set to upload a passwords file or something).
Now, there is probably another control, or a way to code around this, but the FileUpload control will not allow it.
I would recommend using the jQuery Multifile Uploader, which would take care of the UI (if you need one), and handling the actual uploads with Free ASP Upload, which takes care of the file transfer. Though it sounds like you are taking care of things programmatically, so you can skip the multi-file UI and just work with Free ASP Upload.
You'll have to make your own Flash object or something to accomplish this; the basic HTML/ASP.NET controls won't let you do what you're looking for.
This will require creating some kind of active or installable control. To get around the security restriction involved, you're ultimately going to have to be able to execute code on the machine to select and upload the files.
And at that point, you're platform specific, so...
I would strongly suggest that, instead of trying to have a web site automatically upload files for you, you make a WinForms utility to accomplish this task: upload the files wherever you need, communicate with the web site over web services, etc.
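For illustration, a small client-side utility could loop over the file names and push each one to an upload endpoint. This is only a sketch: the endpoint URL (upload.ashx) and the file paths are hypothetical, and in practice you would read the names from your Excel file instead of hard-coding them.

    using System;
    using System.Net;

    class BulkUploader
    {
        static void Main()
        {
            // In practice, read these paths from the Excel sheet.
            string[] files = { @"C:\data\file1.jpg", @"C:\data\file2.jpg" };

            using (var client = new WebClient())
            {
                foreach (string path in files)
                {
                    // POSTs each file to a server-side handler that saves it.
                    client.UploadFile("http://www.example.com/upload.ashx", path);
                    Console.WriteLine("Uploaded " + path);
                }
            }
        }
    }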
This is a security restriction: you can't script the file selection of an upload box, as that would allow hackers to write scripts to steal files off your computer.
You could use this Silverlight upload utility, which is on my list of "things to use when I get the chance".
It has a nice UI and supports uploading many files at once. I originally tracked it down while doing some research for a photography website we were quoting for, but that project fell through.
Anyway the project can be found here:
http://www.michielpost.nl/Silverlight/MultiFileUploader/
It also has full source code included, so even if the control's developers abandon it, you still have the option to edit it yourself.