Can I cache an API call handled by XMLHttpRequest rather than fetch?

I'm trying to cache an API call that is handled by XMLHttpRequest, using Workbox. I successfully cached the API calls in Cache Storage. However, the request for the cached data fails when I switch to offline, even though the data is in Cache Storage. When I replaced the XMLHttpRequest call with fetch, it pulled the cached data and rendered the element while offline.
So, I would like to know whether there is a workaround that lets me keep using XMLHttpRequest rather than fetch.
The reason I'm asking is that I'm planning to cache API calls for many pages, but most of the widgets in my workplace use jQuery Ajax. I'm afraid we would need to replace each Ajax call with fetch one by one.
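For illustration, the kind of Workbox runtime-caching route I mean looks roughly like this (the /api/ path and cache name are placeholders, not my real configuration):

    // Service worker sketch (Workbox v5+ style); the matched path and cache
    // name below are placeholders.
    workbox.routing.registerRoute(
      ({ url }) => url.pathname.startsWith('/api/'),
      new workbox.strategies.NetworkFirst({ cacheName: 'api-cache' })
    );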

Related

Handle external API calls inside an API

I have a simple HTTP server where you can create and manage todos. You can also add plugins in order to, for example, send an email to the people who starred a todo when that todo has been completed. I currently check for all enabled plugins through a query to the database, and then query each API endpoint for the different plugins (Gmail, Notion, Trello, etc.). After this is finished, I send a response back to the user. This is a problem, because it means my response depends on the speed of the external APIs I am calling. If the Notion API is slow, then my endpoint is also slow.
Is there a way to first send a response once, for example, the server marks the todo as completed, and then send a different response after all the plugins have been queried (Gmail, Notion, Trello, etc.)? Would I have to use web sockets? Or is the way I currently handle external API queries the only way to do it?
You are right in thinking that you want to decouple customer-facing requests from the backend processing (reaching out to the other providers), and web sockets are one option for doing that. HTTP/2 streams are another. And, of course, polling is also a way (simple, but not very efficient).
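As a rough sketch of that decoupling (Express, the ws package, the 202 status, and the queryPlugins stub are all illustrative assumptions, not a prescribed design):

    // Sketch: answer the client right away, run the slow plugin calls
    // afterwards, then push a follow-up message over a WebSocket.
    const express = require('express');
    const { WebSocketServer } = require('ws');

    const app = express();
    const wss = new WebSocketServer({ port: 8081 });
    const clients = new Set();
    wss.on('connection', (ws) => clients.add(ws));

    // Placeholder for the real Gmail/Notion/Trello calls.
    async function queryPlugins(todoId) {
      return Promise.allSettled([/* gmail(todoId), notion(todoId), trello(todoId) */]);
    }

    app.post('/todos/:id/complete', (req, res) => {
      // 1. Do the fast, local work and respond immediately.
      res.status(202).json({ id: req.params.id, status: 'completed' });

      // 2. Fire-and-forget: the slow external APIs no longer block the response.
      queryPlugins(req.params.id)
        .then((results) => {
          for (const ws of clients) {
            ws.send(JSON.stringify({ id: req.params.id, plugins: results }));
          }
        })
        .catch(console.error);
    });

    app.listen(8080);

Long polling or HTTP/2 push could stand in for the WebSocket channel; the key point is that the plugin calls run only after the original response has been sent.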

How can I fetch only updated events from the Google Calendar PHP API?

I need to fetch only updated events from Google Calendar. Is it possible?
Use the "incremental synchronization" functionality in the Calendar API, see: https://developers.google.com/google-apps/calendar/v3/sync
After the initial request:
Incremental sync is performed repeatedly and updates the client with all the changes that happened ever since the previous sync. Each time, the client provides the previous sync token it obtained from the server and stores the new sync token from the response.
See the syncToken argument for the events.list() method: https://developers.google.com/google-apps/calendar/v3/reference/events/list
The PHP documentation for the relevant methods (to get and set the sync token) is here: https://developers.google.com/resources/api-libraries/documentation/calendar/v3/php/latest/class-Google_Service_Calendar_Events.html
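For illustration, a single incremental request looks like this (shown as a raw REST call via fetch; the calendar id, access token, and stored token are placeholders - the PHP client accepts the same syncToken parameter on events.list and returns the same nextSyncToken):

    // Illustrative only: one incremental events.list request via the REST API.
    const ACCESS_TOKEN = '<oauth-access-token>';      // placeholder
    const storedSyncToken = '<token-from-last-sync>'; // placeholder

    const url = new URL('https://www.googleapis.com/calendar/v3/calendars/primary/events');
    url.searchParams.set('syncToken', storedSyncToken);

    const res = await fetch(url, { headers: { Authorization: `Bearer ${ACCESS_TOKEN}` } });
    const data = await res.json();
    // data.items holds only the events changed since the last sync;
    // persist data.nextSyncToken for the next request.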

Sync external planning with Google agenda private API

We're developing an agenda on our platform. We implemented a feature to sync with Google Agenda, which works correctly except that it only works with public calendars and not when the calendar is private.
We implemented everything as Google describes and use the OAuth2 protocol.
We are migrating to HTTPS and we hope that it will solve our issue.
Do you have any idea why it's blocked when the agenda is private?
You can implement synchronization by sending HTTP request:
GET https://www.googleapis.com/calendar/v3/calendars/calendarId/events
and adding path parameters and optional query parameters as shown in Events: list.
In addition to that, referring to Synchronize Resources Efficiently, you can keep data for all calendar collections in sync while saving bandwidth by using the "incremental synchronization".
As highlighted in the documentation:
A sync token is a piece of data exchanged between the server and the client, and has a critical role in the synchronization process.
As you may have noticed, the sync token plays a major part in both stages of incremental synchronization. Make sure to store this syncToken for the next sync request. As discussed:
Initial full sync is performed once at the very beginning in order to fully synchronize the client’s state with the server’s state. The client will obtain a sync token that it needs to persist.
Incremental sync is performed repeatedly and updates the client with all the changes that happened ever since the previous sync. Each time, the client provides the previous sync token it obtained from the server and stores the new sync token from the response.
More information and examples on how to synchronize efficiently can be found in the linked documentation.
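As a sketch of the two stages against the endpoint above (the calendar id, token handling, and lack of persistent storage are illustrative simplifications, not part of the documentation):

    // Full sync (no token yet) and incremental sync (token provided) in one loop.
    const BASE = 'https://www.googleapis.com/calendar/v3/calendars/primary/events';
    const ACCESS_TOKEN = '<oauth-access-token>'; // placeholder

    async function listPage(params) {
      const url = new URL(BASE);
      for (const [key, value] of Object.entries(params)) url.searchParams.set(key, value);
      const res = await fetch(url, { headers: { Authorization: `Bearer ${ACCESS_TOKEN}` } });
      if (res.status === 410) return { expired: true }; // sync token invalidated by the server
      return res.json();
    }

    async function sync(storedSyncToken) {
      const params = storedSyncToken ? { syncToken: storedSyncToken } : {}; // full sync if no token yet
      let events = [];
      let page;
      do {
        page = await listPage(params);
        if (page.expired) return sync(null);            // expired token: redo a full sync
        events = events.concat(page.items ?? []);
        if (page.nextPageToken) params.pageToken = page.nextPageToken;
      } while (page.nextPageToken);
      // nextSyncToken only appears on the last page; persist it for the next run.
      return { events, syncToken: page.nextSyncToken };
    }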

Polymer sending file and other data at the same time with core ajax

I'm trying to send an image and some more information to the server from the front-end using core-ajax.
I was wondering how this could be done with one core-ajax element, and how to receive the data in the server-side (ASP.NET) API.
Use a FormData object. The Mozilla Developer Network has a very nice tutorial on FormData.
You can handle this data on the server just as you handle any other POST request.
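A minimal sketch of that approach (the field names, input selector, and upload URL are placeholders; a bare XMLHttpRequest is shown rather than the core-ajax element itself):

    // Placeholders throughout: #file-input, the field names, and /api/upload.
    const form = new FormData();
    form.append('image', document.querySelector('#file-input').files[0]);
    form.append('title', 'Holiday photo');
    form.append('userId', '42');

    const xhr = new XMLHttpRequest();
    xhr.open('POST', '/api/upload');
    xhr.send(form); // the browser sets the multipart/form-data boundary itself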

ASP.net/WCF Service Image handling advice

I would like some advice on how to approach a programming task.
I am creating a website that collects data over time (users enter data) and then it displays spatial maps (upon request) created from that data.
Each time a user enters data, I use a web service to store the data in a SQL database (using EF4).
The design decision I have now is when a client requests a map/image, how should I return the image to them.
Should I:
1. Upon calling a WCF service, construct the image and return that image via the service, or
2. Upon calling a WCF service, construct the image, store it somewhere server side, and return a path or URI/URL to the image?
I was hoping to do (2), but I am unsure of where to save the image and what implications that has when trying to expose it.
The web service and ASP.NET application will most likely run on the same server, if that helps at all.
Any advice on how to proceed would be most appreciated, particularly details like where it is safe to save files from a WCF service, etc.
Thanks.
(I am using a service because there will be clients other than the ASP.NET app that request the data.)
It largely depends on what kind of API your clients can consume. If they need a SOAP API, then you have to use either WCF services or ASMX to provide the images. However, the better alternative is clearly an HTTP API, i.e. a unique URL is provided for accessing a particular image - you may use REST-style URLs or choose to accept parameters via the query string. Essentially, the URL should contain all the information needed to retrieve the image from the database.
Now, you may directly choose REST-based WCF services to serve these images, or put an ASP.NET HTTP handler (as a facade) over the WCF services to serve them. If you are going to use WCF and the image sizes are large, then do consider streaming. I would prefer putting a facade in place, as it helps you move some important concerns such as caching and streaming outside the WCF world. In that scenario, the WCF services use the best endpoint possible to deliver image data from the data store. Your facade HTTP handler would translate the URL into the necessary WCF service call, get the image data, cache it on the file system, and serve it to the client. Subsequent requests would be served from this cached image. Because you are using URLs and GET requests, images would also be cached on the client side (you can control this by emitting explicit headers from your facade handler). Typically, requests to fetch an image will be substantially more frequent than requests to update data - by serving GET requests via a different handler, you can scale them independently of the WCF services.
I would use an HTTP handler to consume your WCF service based on some parameters. I use something similar to retrieve an image stored as a BLOB in a database and set some default sizes on the image.
The HTTP handler would be called from an image tag on the ASP.NET page.
This article has more info on HTTP handlers.
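For illustration, the kind of URL such a handler might expose (the handler name and query parameters are purely hypothetical); the query string carries everything needed to locate or build the image:

    // Hypothetical URL scheme for the facade handler; region, date range, and
    // size are placeholders. Because it is a plain GET, the response is cacheable.
    const img = document.createElement('img');
    img.src = '/maps/MapImage.ashx?region=12&from=2011-01-01&to=2011-03-31&size=800x600';
    document.body.appendChild(img);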

Resources