I was looking for website optimization tips online, and most of them shared a common tip: specify an expiration for cacheable resources. I have not yet specified any expiration. So what default expiration duration do cacheable resources get? Please help me.
I am sorry, but I am not sure how this particular question relates to Google BigQuery. Maybe there is some information you didn't disclose?
Update: after your comment and the change of category to HTTP, I think I can now answer.
If you are not setting any Expires or Cache-Control headers, then a well-behaved browser should issue a new GET request every time. There is no such thing as a default expiration.
Depending on your technology stack, it is possible that your web server or your application server adds default cache headers. To verify that, you can open the page in Firefox/Chrome with the developer tools and inspect the headers. If you can't find any "Expires" or "Cache-Control" headers in the responses your page is sending, then you don't have any default expiration and you are not making use of caching.
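If you prefer a quick script over the developer tools, here is a minimal sketch using Python and the requests library (the URL is a placeholder, not from the original question):

import requests

# Fetch the page and print the caching-related response headers, if any.
response = requests.get("https://example.com/")  # placeholder URL
for header in ("Expires", "Cache-Control", "ETag", "Last-Modified"):
    print(header, "->", response.headers.get(header, "(not set)"))

If every header prints "(not set)", nothing in your stack is adding default cache headers for you.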
I am running WordPress 5.3.2 on Apache 2.4.29 (Ubuntu 18.04) on a DigitalOcean droplet.
My client requested the following:
All cookies transferred over an encrypted session, in particular session cookies, should be marked as 'Secure' and all session information should be transmitted over HTTPS.
The HttpOnly flag should also be set within the cookie
So, I defined the following in the virtual host:
Header edit Set-Cookie ^(.*)$ $1;HttpOnly;Secure;SameSite=Strict
I then tested the header response and could see my Set-Cookie defined.
The problem is, I now can't login to WordPress. WordPress says:
ERROR: cookies are blocked or not supported by your browser. You must
enable cookies to use WordPress.
What am I doing wrong?
Strict is probably more restrictive than you want, as this will prevent cookies from being sent on initial cross-site navigations. For example, if I emailed you a link to a page on your blog, then when you first followed that link, the SameSite=Strict cookies would not be sent and it might appear as if you were not logged in.
SameSite=Lax is a better default here. Then I would explicitly look at setting SameSite=Strict or SameSite=None on individual cookies where you know the level of access required.
The HttpOnly attribute is also a blanket measure here: it prevents all of your server-set cookies from being read by JavaScript. You may well have functionality on your page that requires that access.
Finally, a blanket approach here is probably overkill, as it looks as if you will be appending this snippet to every outgoing cookie header, even the ones that already include those attributes. This is likely to cause some unpredictable behaviour. I would either do this on a specific allow-list basis by checking for explicit cookie names, or alter the regex to add these attributes only when they are missing.
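To illustrate the per-cookie approach outside Apache, here is a minimal sketch in Python/Flask; the route, cookie name, and value are hypothetical and not part of the WordPress setup above:

from flask import Flask, make_response

app = Flask(__name__)

@app.route("/login")
def login():
    resp = make_response("logged in")
    # Attributes chosen per cookie: Lax keeps the session working on
    # top-level navigations from other sites, unlike Strict.
    resp.set_cookie(
        "session_id",      # hypothetical cookie name
        "abc123",          # hypothetical value
        secure=True,       # only sent over HTTPS
        httponly=True,     # not readable from JavaScript
        samesite="Lax",
    )
    return resp

The same idea applies whatever sets the cookie: decide per cookie which attributes it needs, rather than rewriting every Set-Cookie header after the fact.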
A late answer. But if it helps someone:
Put these values in php.ini
session.cookie_httponly = 1
session.cookie_secure = 1
Of course, you should have a valid HTTPS certificate.
I currently use Akamai as a CDN for my app, which is served over multiple subdomains.
I recently realized that Akamai caches responses to CORS requests identically, regardless of the Origin from which they were requested.
This, of course, causes requests from clients whose Origin differs from that of the cached response to fail (since the cached Access-Control-Allow-Origin response header does not match the Origin they sent).
Many suggest supplying the Vary: Origin response header to avoid this issue, but according to Akamai's docs and this Akamai community post, this isn't supported by Akamai.
How can I force Akamai to cache things uniquely by Origin if an Origin header is present in the request?
I did some research, and it appears this can be done by adding a new Rule to your Akamai config: an If block that checks whether the Origin request header is present, plus a Cache ID Modification rule that includes the header's value in the cache key.
Note that if you do this, REMEMBER: this changes your cache key at Akamai, so anything that was cached before is essentially NOT CACHED anymore! Also, as noted in the yellow warning labels, this can make it harder to force-purge your cache using Akamai's URL purging tools. You could also remove the If block and just include the Origin header in a similar Cache ID Modification rule, if you are OK with changing the cache key for all the content this rule applies to.
So in short, try this out on a small section of your site first!
More details can be found in this related post on Stack Overflow
We have hosted an API on Akamai. I had a similar requirement, but we wanted to use the cached response on Akamai for all touchpoints. Without CORS settings, Akamai would cache the response for the first origin and keep it in the cache, and subsequent requests from other touchpoints would fail because of the cached origin header.
We solved the problem by using the API Gateway feature provided by Akamai. You can find it under API Definition. Custom cache parameters can also be defined there. Please see the screenshot for the CORS settings. Now it caches the response from the backend and serves it to the requester according to the allowed origin list.
[Screenshot: CORS settings in the API Definition]
Is there a way to tell the browser not to share a cached resource among websites?
I want to give websites a link to some JavaScript on my server, and I want to make the response different for each domain, using the Referer header as the check.
The response that gets cached should be available only to the domain that requested it; when end users visit another site that uses the script link, another request should be made.
I am not sure whether I understand your question correctly.
Is your scenario like this: stackoverflow.com and yourwebsite.com both use the same script, "https://ajax.googleapis.com/ajax/libs/jquery/1.12.4/jquery.min.js", but you don't want the cached script to be shared with stackoverflow.com?
This is under the control of googleapis.com's web server.
So if the cached resource's origin server (googleapis.com) wants to implement the feature you describe, it can use the Vary response header. The Vary header defines the secondary key of the cache entry.
Maybe "Vary: Origin" but only work for CORS
Maybe "Vary: referer" but referer contains url path
It still doesn't fully solve your problem, but I hope it helps.
See the MDN HTTP caching documentation and RFC 7234, Section 4.1.
We have a use case where we store our images in a CDN. Say we are storing a.jpg in the cache; if the user uploads a newer version of the file, we flush the CDN cache and overwrite a.jpg. The challenge is that the browser might have cached the file as well. Since we cannot flush the cached image in the browser, we are thinking of using one of the two approaches below:
Append a version: a_v1.jpg, a_v2.jpg (the version id is the file's checksum). This eliminates the need to flush the browser and CDN caches. I found a lot of documentation about this on the internet, and many people are using it (a short sketch of what we mean is below).
Use the ETag of the file to eliminate the stale cache in the browser. I found that CDNs support ETags, but I did not find literature saying that ETags are used for images.
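To be concrete, approach 1 would look roughly like the sketch below (the file name and the upload flow are assumptions, not our actual code):

import hashlib
from pathlib import Path

def versioned_name(path: str) -> str:
    # Build a content-addressed filename such as a_9f86d081.jpg,
    # using a short checksum of the file contents as the version id.
    p = Path(path)
    checksum = hashlib.md5(p.read_bytes()).hexdigest()[:8]
    return f"{p.stem}_{checksum}{p.suffix}"

# The HTML would then reference the versioned name, so a changed file gets a
# brand-new URL and stale browser/CDN copies are simply never requested again.
print(versioned_name("a.jpg"))  # e.g. "a_9f86d081.jpg"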
Can you please share your thoughts about using the ETag header for cache busting? Is it a good practice?
Well, I wouldn't suggest ETags. They have their advantages but have setbacks as well. Say you are running two servers: the ETag for the same content served from each of these servers might differ.
The best thing I would suggest is to control what the browser is caching and for how long.
What I mean is: send expiry headers in the response from the CDN to the client browser, say with a 5-minute TTL. That way the browser will respect the expiry header, and once it expires the browser will send a fresh request to the CDN when the page is loaded again.
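For illustration, here is a minimal sketch of a response carrying such a 5-minute TTL, using Python/Flask as a stand-in for whatever actually serves the image (the route and file are hypothetical):

import time
from email.utils import formatdate
from flask import Flask, make_response, send_file

app = Flask(__name__)

@app.route("/images/a.jpg")
def image():
    resp = make_response(send_file("a.jpg"))  # hypothetical local file
    # 5-minute TTL: the browser reuses its copy without contacting the CDN,
    # then fetches a fresh copy once the 300 seconds are up.
    resp.headers["Cache-Control"] = "public, max-age=300"
    resp.headers["Expires"] = formatdate(time.time() + 300, usegmt=True)
    return resp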
I know this question has been asked several times, but I am still not clear about the concept. After reading many blogs and answers on SO, what I got is:
Expiry headers are used when you don't even want the client (and proxies/caches) to make a request to the server. With ETags, the client will check with the server for an update, but with expiry headers, the client knows when the file expires and when to check for an update; until then, it (browsers and proxies/caches) won't bother the server to check for updates.
So basically it says that if we use an Expires/max-age header, the browser will not even check with the server for an updated file. So I thought I'd test it locally.
So I created a simple HTML file including 2 JS files and 1 image file. In IIS, I set the Expires header to 2 days for the image folder. As per my understanding, after getting the image file from the server once, the browser should not send a request to the server on the next load to check whether the image file has been modified.
But what I see is that each time I refresh the page, a request is sent to the server and the server returns a 304 Not Modified status. As per the specs/blogs I read, it should not send a request to the server at all.
Someone please explain.
From what you have described,
it is clear that the ETag works as expected: the server responds with 304 Not Modified to the request carrying the If-None-Match field with the ETag value.
So the browser will load the image from its cache instead of downloading a new image from the server, saving bandwidth and time.
It seems that caching is disabled in your browser. That's why a new request is sent before the cache expires; otherwise, a request wouldn't have been sent in the first place.
Here is a wonderful article that explains how to detect programmatically whether caching is disabled in the browser.
Here is another wonderful article that explains caching and ETags in depth.
Note:
Generally speaking, if you are using multiple servers behind a load balancer to host your website, then a simple ETag configuration is likely to cost more bandwidth: the ETag is still sent in the headers, but it no longer serves its purpose of checking whether the browser cache is valid (it is always going to say invalid, because each server generates a different ETag).
The important part is what you said: "I refresh the page." In this case the browser is trying to provide you with fresh content, so it has no option other than to contact the server and revalidate all resources. (There is a Cache-Control extension, immutable, which prevents this behaviour, but it is not widely used and implemented.)
If you want to see the behaviour of your browser when it respects the cache, you have to use a "standard page entry" rather than a reload: either follow a link to the page, or open another tab, type the page URL into the URL bar, and press Enter.
Caches respect the expiration time, so if the document is not stale, it is returned from the cache. If it has expired, the ETag is used for validation of the resource (and after validation it is possible that the resource is still not modified, resulting in a 304 response).
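To make the validation step concrete, here is a minimal sketch of a conditional request using Python's requests library (the URL is a placeholder):

import requests

url = "https://example.com/images/a.jpg"  # placeholder URL

# First request: the server returns the full body plus an ETag.
first = requests.get(url)
etag = first.headers.get("ETag")

# Revalidation: send the stored ETag back in If-None-Match.
# If the resource is unchanged, the server answers 304 Not Modified with an
# empty body, and the cached copy can keep being used.
headers = {"If-None-Match": etag} if etag else {}
second = requests.get(url, headers=headers)
print(second.status_code)  # 304 if not modified, 200 otherwise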