How do I implement client-side HTTP caching like a browser?

I use a RESTful service as a backend to my frontend. The service sets Expires/ETag/Last-Modified headers on its responses.
What I'm looking for is a client-side (preferably Java) library that can fetch data from the service and cache it in a pluggable caching backend such as Ehcache.
I also want to be able to automatically prime the cache from background worker threads as soon as an entry is invalidated, and it should be smart enough to issue conditional GETs.
I've come across
http://hc.apache.org/httpcomponents-client-ga/tutorial/html/caching.html
Does anyone know of any other libraries? Isn't this a fairly common problem?

Version 4.0+ of the Apache HttpComponents HttpClient library comes with HTTP/1.1 cache support. You can use it with the Spring RestTemplate REST client as follows:
// Cache up to 1000 entries, each no larger than 8192 bytes
CacheConfig cacheConfig = new CacheConfig();
cacheConfig.setMaxCacheEntries(1000);
cacheConfig.setMaxObjectSize(8192);
// Wrap a standard HttpClient with the HTTP/1.1 caching layer
HttpClient cachingClient = new CachingHttpClient(new DefaultHttpClient(), cacheConfig);
// Plug the caching client into Spring's RestTemplate
ClientHttpRequestFactory requestFactory = new HttpComponentsClientHttpRequestFactory(cachingClient);
RestTemplate rest = new RestTemplate(requestFactory);
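With that wired up, repeated GETs through the template are served from the local cache while the cached response is still fresh, and revalidated with a conditional GET once it goes stale. A minimal usage sketch (the URL is a placeholder):
String url = "http://example.com/api/customers/42"; // hypothetical resource
String first = rest.getForObject(url, String.class);  // goes to the network; response is cached
String second = rest.getForObject(url, String.class); // answered from cache, or revalidated via a conditional GET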

The situation with client-side HTTP caches in Java is not particularly good. It is a non-trivial problem that most HTTP client library developers have not tackled.
I think that is changing slowly, but I cannot provide a definite pointer. A good place to start is the various JAX-RS implementations that come with a client-side API, such as Jersey (which has no client-side cache). Restlet or Restfulie might have one; please check.
Here is something I found via Google:
http://xircles.codehaus.org/projects/httpcache4j
You can also try to roll your own, but you have to be careful to understand the caching headers (including Vary) to get it right; a sketch of the conditional-GET part follows below.
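To give a feel for what rolling your own involves, here is a minimal sketch of just the conditional-GET part using plain HttpURLConnection. The in-memory map stands in for a pluggable cache backend, and a complete implementation would also have to honor Expires/Cache-Control, Vary, and invalidation:
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ConditionalGetSketch {
    // naive in-memory cache: URL -> { etag, body }
    private static final Map<String, String[]> CACHE = new ConcurrentHashMap<>();

    public static String fetch(String url) throws Exception {
        String[] cached = CACHE.get(url);
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        if (cached != null) {
            // ask the server to skip the body if our copy is still valid
            conn.setRequestProperty("If-None-Match", cached[0]);
        }
        if (conn.getResponseCode() == HttpURLConnection.HTTP_NOT_MODIFIED && cached != null) {
            return cached[1]; // 304: reuse the cached body
        }
        try (InputStream in = conn.getInputStream()) {
            String body = new String(in.readAllBytes(), StandardCharsets.UTF_8);
            String etag = conn.getHeaderField("ETag");
            if (etag != null) {
                CACHE.put(url, new String[] { etag, body });
            }
            return body;
        }
    }
}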

RESTEasy features a client-side caching mechanism that is trivial to get up and running if you are using its client proxy framework.
// Register RESTEasy's built-in providers
RegisterBuiltin.register(ResteasyProviderFactory.getInstance());
// Create a client proxy for your service interface
YourService proxy = ProxyFactory.create(YourService.class, url);
// Wrap the proxy with a lightweight in-memory cache that honors the response caching headers
LightweightBrowserCache cache = CacheFactory.makeCacheable(proxy);
You first create a client proxy instance, then wrap it with the cache. That's it.
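If the default in-memory size limit is a concern, the returned LightweightBrowserCache can be tuned; per the RESTEasy documentation it exposes a maximum-size setter (the 2 MB value below is only an illustration):
cache.setMaxBytes(2 * 1024 * 1024); // cap the in-memory cache at roughly 2 MB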

Related

How does the browser cache the API response?

I'd like to gain a deep understanding of the caching mechanism for website assets and even API requests. I have read some articles and also searched Stack Overflow. Some examples show that if you set, for example, Cache-Control: max-age=20, the browser will cache the RESTful API response for 20 seconds.
But the important question is: if the browser is able to cache the API response, why do we have to use libraries like react-query, or build a PWA, to implement caching for web applications?
As far as I understand, if we use the browser cache and add max-age, the browser still sends a request to the server, the server returns an empty response (304) and tells the browser to load from its cache. But there is still the issue of that request to the server just to get the empty response.
But if we use something like react-query, it doesn't even send that additional request to check the cache with the server; we can handle caching with zero requests, which would be a great trick to decrease server requests, right?
So, am I right? Or is this scenario wrong and I have misunderstood it?
Thank you
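For reference, the server-side half of the mechanism discussed above amounts to setting freshness and validator headers on the response. A minimal sketch with a hypothetical Java servlet (the endpoint and values are made up, mirroring the 20-second example):
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical API endpoint, used only to illustrate the headers
public class CustomerApiServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Freshness: the browser may reuse this response for 20 seconds without contacting the server
        resp.setHeader("Cache-Control", "max-age=20");
        // Validator: after that, the browser revalidates with If-None-Match and may get a 304 with an empty body
        resp.setHeader("ETag", "\"v1\"");
        resp.setContentType("application/json");
        resp.getWriter().write("{\"name\":\"example\"}");
    }
}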

Using Flurl in web service to handle calls to an outside service

I wanted to put a situation to you to get your thoughts on the use of Flurl. I have developed a RESTful API that supports authentication and sessions from multiple users, and part of its operation is making authenticated calls to another, outside service. If I use the standard Flurl pattern of calling async methods off a string URL, and I need to set different headers depending on the user that authenticated to my service, would this cause unpredictable behaviour because Flurl uses a single HttpClient (since all the calls go to the same host)?
Doing it the way you describe is completely safe. Setting headers fluently off a URL string or Url object will apply them to the request, not the client. Example:
await url.WithHeader(name, value).PostAsync(body);
This call can be made a zillion times from different threads with different header values and a single shared HttpClient instance with no conflicts. This works because under the hood it sets the header on the HttpRequestMessage, not the default headers on the HttpClient.

Intercepting the SOAP envelope in an HttpRequest

I want to look at the XML created in my HttpRequest but can't see how. I've tried looking at the request during runtime but no luck.
I'm working in a .NET 4.0 project (just for context here; it doesn't matter much, since this applies from 2.0 onward).
I'm making a call to a third party API via my project's service reference:
SomeResponseType response = _apiClient.AddUser(userToAdd);
So how do I capture the raw XML that AddUser sends to the host, without going through the pain of creating an intercept filter, which is not the easiest thing to put together?
You should be able to use Fiddler on your machine to capture the underlying HTTP request.
Alternatively, if you're using WCF, you can enable tracing via your config file. To go this route, see Configuring Message Logging. Then you can use the Service Trace Viewer Tool (SvcTraceViewer.exe) to pretty print your logs.
You can use an HTTP debugging proxy such as Fiddler (www.fiddler2.com). Simply fire up Fiddler and then run your app. Fiddler will capture all of the traffic going across the wire, and you can look at the XML being sent to and received from the SOAP service.

allow cross-domain requests to ASP.NET ScriptService

I've got a ASP.NET Webservice up and running using the [ScriptService] Attribute. From what I've read from this article:
http://weblogs.asp.net/scottgu/archive/2007/04/04/json-hijacking-and-how-asp-net-ajax-1-0-mitigates-these-attacks.aspx
ASP.NET by default does not allow JSONP requests (injected into the DOM via script tags) in order to deny cross-domain requests. It does so by taking two measures:
1) it only accepts POST requests (script-tag injection always does a GET)
2) it denies requests that send a Content-Type header other than "application/json" (which browsers will not send).
I am familiar with the cross-domain issues, I know what JSONP is, and I fully understand why ASP.NET is restricted in that way by default.
But now, I have my webservice which is a public one, and should be open to everybody. So I explicitly need to enable cross-domain requests via Javascript to my Webservice, so that external websites can retrieve data via my webservice from jquery and alike.
I've already covered step (1) and allowed requests via GET by modifying the ScriptMethod attribute this way: [ScriptMethod(UseHttpGet=true)]. I've checked with jQuery: GET requests now work (on the same domain). But how do I fix point (2)?
I know about the Access-Control-Allow-Origin (CORS) headers some browsers support, but AFAIK they are not standard yet, and I don't want to force my users/customers to modify their HTTP headers to use my web service.
To sum it up: I need the good practice for enabling cross-domain JSON requests to a ScriptService for a public web service. I mean, there MUST be a way to have a public web service; that is what most web services are about, isn't it?
Using legacy ASMX services for something like this seems like a lost cause. Try WCF, which, thanks to its extensible nature, can very easily be JSONP-enabled. So if you are asking for best practices, WCF is the technology you should be building web services with on the .NET platform.
Or, if you really can't afford to migrate to .NET 3.5 at the moment, you could also write a custom HTTP handler (.ashx) to do the job.
The jQuery ajax() function does have a 'crossDomain' property.
Pasted from jQuery.ajax()
crossDomain (added 1.5)
Default: false for same-domain requests, true for cross-domain requests
If you wish to force a cross-domain request (such as JSONP) on the same domain, set the value of crossDomain to true. This allows, for example, server-side redirection to another domain.

How to show soap request xml for a web service call?

I am using a web method of a company's web service.
This web method requires one parameter when calling it:
CompanyOperations srv = new CompanyOperations();
srv.getCustomerInfo(input);
How can I see my soap request xml when calling this method?
How do you want to see it? If it's from inside the code, I don't know. (Un)fortunately, .NET does a pretty good job of hiding it from the developer.
However, if you just want to debug the calls and nothing else, try Fiddler. It will show you the request/response (including headers and everything else) for the web service calls. This is what we use for debugging web services, but you can use it for anything that communicates over HTTP.
