I have static content (icons, etc.) served via ASP.NET.
Every response gets caching added to it, like this:
Response.Cache.SetExpires(Now.AddMinutes(30))
Response.Cache.SetValidUntilExpires(True)
When I browse from my office, everything is fine.
When one of the users browses from home, the icons are not cached, which makes browsing very slow.
I have a log that shows the incoming requests, and the requests from this user have this header
"Cache-Control":"no-cache, no-store"
I don't know if that's the issue, and if so, how I can solve it. Or could something else be wrong?
Also, after setting the cache expiration, it seems that Response.Headers is not affected; I don't see the caching info in the headers.
This is the header string. Not a word about caching.
{Server=Microsoft-IIS%2f10.0&HitID=9&X-AspNetMvc-Version=5.2}
Why are my Cache settings being ignored?
Please check your IIS cache settings. The static-file settings may not be related to the Response.Cache.SetExpires() method.
You can also set Cache-Control in IIS. For how to set Cache-Control, you can refer to this link:
Cache-Control
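One more hedged note: ASP.NET only emits the caching headers if the response is marked cacheable, and the default cacheability is private, which can look like your settings are being ignored. A minimal sketch, assuming the icons are safe to cache publicly:

Response.Cache.SetCacheability(HttpCacheability.Public) ' emits Cache-Control: public
Response.Cache.SetExpires(Now.AddMinutes(30))           ' emits Expires
Response.Cache.SetMaxAge(TimeSpan.FromMinutes(30))      ' emits max-age=1800
Response.Cache.SetValidUntilExpires(True)

Also worth knowing: the headers produced by Response.Cache are only generated when the response is actually sent, so dumping Response.Headers earlier in the request (as your log seems to do) will not show them.

For static files that IIS serves directly, bypassing ASP.NET entirely, a web.config sketch like this sets Cache-Control for them (the 30-minute value just mirrors your code and is only an example):

<configuration>
  <system.webServer>
    <staticContent>
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="00:30:00" />
    </staticContent>
  </system.webServer>
</configuration>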
I have a web server which returns 200 OK with a bunch of Set-Cookie headers, and an HTML page which loads a bunch of scripts from the same server.
However, the subsequent requests spawned from that HTML page submit a different cookie in their HTTP request headers.
What could be causing that? Surely there's some policy I'm missing out on, but I don't see why it works on some pages and not others?
I'm using Chrome as the browser, but this behavior also happened on iOS, so I'm guessing it's not browser specific.
So after a lot more reading and troubleshooting, it turns out that when you don't set a cookie path, it defaults to the path of the request that the Set-Cookie was sent in response to. And because my resources lived under a different path, the cookie was not sent.
Adding Path=/ fixed my issue. Of course, if you don't want your cookie to be accessible to all pages this is bad, but my web server requires requests to come with cookies because they carry sensitive data (for security reasons).
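In ASP.NET, for example, a minimal sketch of setting the path explicitly looks like this (the cookie name and value are placeholders):

Dim authCookie As New HttpCookie("session", "opaque-token-value")
authCookie.Path = "/"        ' send the cookie on every path, not just the issuing one
authCookie.HttpOnly = True   ' optional: keeps it out of document.cookie
Response.Cookies.Add(authCookie)

The equivalent raw response header is Set-Cookie: session=opaque-token-value; Path=/; HttpOnly.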
I'm writing some HTTP client code to interact with a website, and I need to set some cookies. Simply visiting the website sets 4 cookies (as seen in Chrome Settings).
However, when I look at the HTTP response headers for when those cookies were set (using Live HTTP Headers extension), there is no Set-Cookie header anywhere. How were those cookies set? Is there another way than through Set-Cookie?
Edit: Some of the cookies are HttpOnly.
If you load a site in your browser, it might also load other assets that can set cookies (given that they are on the same domain).
But there is a second way to set cookies: with JavaScript via document.cookie (for example, document.cookie = "theme=dark; path=/"). Note, though, that HttpOnly cookies cannot be created or read from JavaScript, so those must have come from HTTP responses.
As far as I know, if your JavaScript or Python code sets a cookie for that domain, then the response will include the Set-Cookie field. You can view that from at least the inspect console.
So I see that you're using the Live HTTP Headers extension, but it doesn't look like it shows that field in the response.
I tried looking for other extensions that could show it, but I wasn't able to find one. I suppose we can both always fall back to the Chrome inspect console: if you go to the Network tab, you should see the request/response pairs.
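Since you mentioned writing HTTP client code: here is a minimal .NET sketch (the URL is a placeholder) that captures every cookie the server sets, including HttpOnly ones, via a CookieContainer:

Imports System.Net

Module CookieDump
    Sub Main()
        ' The container records cookies from all Set-Cookie headers.
        Dim jar As New CookieContainer()
        Dim request As HttpWebRequest = CType(WebRequest.Create("https://example.com/"), HttpWebRequest)
        request.CookieContainer = jar
        Using response As HttpWebResponse = CType(request.GetResponse(), HttpWebResponse)
            For Each c As Cookie In response.Cookies
                Console.WriteLine("{0}={1}; Path={2}; HttpOnly={3}",
                                  c.Name, c.Value, c.Path, c.HttpOnly)
            Next
        End Using
    End Sub
End Module

Unlike document.cookie, a client like this sees HttpOnly cookies too, which makes it handy for working out where each cookie came from.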
We have an established site that is now being affected by CSP rules. I've added all the scripts we need to the Content-Security-Policy header.
When visiting the site using private browsing or a device that hasn’t been to the site before, I get the new CSP header and everything works.
However, users that have been to the site before get the old headers, and they get CSP warnings.
NB: I cannot use Expires: 0 or similar, as the browsers are not looking for the new headers, so they never know that the headers have expired.
I'm looking for a way to tell the browser "hey, you should check out my cool new headers because they're new".
Turns out I was being foiled by Local Storage, which was overwriting the CSP header. Even clearing the cache doesn't solve the problem, as Local Storage remains.
Hope this helps somebody else!
I just read that files that have Expires headers should not be requested again until they expire.
While testing some caching behaviour, I wondered why files still show a size and consume time under "Content Download" in Chrome DevTools, even though they have max-age set and should be loaded from cache without any request being sent.
Any explanations?
Quoting from this answer:
Chrome appears to be ignoring your Cache-Control settings if you're reloading in the same tab. If you copy the URL to a new tab and load it there, Chrome will respect the cache control tags and reuse the contents from the cache.
I know this question has been asked several times, but I am still not clear about the concept. After reading many blogs and answers on SO, what I got is:
Expires headers are used when you don't even want the client (and proxies/caches) to make a request to the server. With ETag, the client will check with the server for an update, but with expiry headers, the client knows when the file expires and when to check for an update; until then, it (browsers and proxies/caches) won't bother the server to check for updates.
So basically it says that if we use the Expires/max-age header, the client will not even check with the server for an updated file. So I decided to test it locally.
I created a simple HTML file that includes 2 JS files and 1 image file. In IIS, I set the Expires header to 2 days for the image folder. As per my understanding, after getting the image file from the server once, the browser should not send a request to the server to check whether the image file has been modified.
But what I see is that each time I refresh the page, a request is sent to the server and the server returns a 304 Not Modified status. But as per the specs/blogs I read, it should not send a request to the server at all.
Someone please explain.
From what you have described, it is clear that ETag works as expected: the server responds with 304 Not Modified to a request carrying an If-None-Match field with the ETag value, so the browser loads the image from cache instead of downloading a new copy from the server, saving bandwidth and time.
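To illustrate (the URL, ETag value, and max-age are made up), the first visit and a later refresh look roughly like this:

GET /images/logo.png HTTP/1.1
Host: example.com

HTTP/1.1 200 OK
Cache-Control: max-age=172800
ETag: "abc123"

Then, on refresh:

GET /images/logo.png HTTP/1.1
Host: example.com
If-None-Match: "abc123"

HTTP/1.1 304 Not Modified

The 304 has no body, so only headers cross the wire; that is the saving ETag buys you.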
It seems that caching is disabled in your browser. That's why a new request is sent before the cache expiration; otherwise, a request wouldn't have been sent in the first place.
Here is a wonderful article that explains how to detect programmatically whether caching is disabled in the browser.
Here is another wonderful article that explains caching and ETag in depth.
Note:
Generally speaking, if you are using multiple servers behind a load balancer to host your website, then a simple ETag configuration is likely to cost you more bandwidth: each server generates its own ETag for the same resource, so the header can no longer serve its purpose of checking whether the browser cache is valid (it is always going to say invalid).
The important part is what you said: I refresh the page. In this case the browser tries to give you fresh content, so it has no option but to contact the server and revalidate all resources. (There is a Cache-Control extension, immutable, which prevents this behaviour, but it is not widely used and implemented.)
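For reference, that extension is just one more directive on the response header; a sketch (the one-year max-age is only an example):

Cache-Control: max-age=31536000, immutable

With it, a supporting browser will not revalidate the resource even on refresh, as long as the cached copy is within its max-age.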
If you want to see your browser respect the cache, you have to avoid reloading and use a "standard page entry": either follow a link to the page, or open another tab, type the page URL into the address bar, and press Enter.
Caches respect the expiration time, so if the document is not stale it is returned from the cache. If it has expired, the ETag is used to validate the resource (and after validation it is possible that the resource is still not modified: a 304 response).