event.passThroughOnException sends requests to origin, but without POST data - cloudflare-workers

I thought that event.passThroughOnException() should set the fail-open strategy for my Worker, so that if an exception is raised from my code, the original request is sent to my origin server. It seems, however, that the POST data is missing. I suspect that's because the request body is a readable stream and, once read, it cannot be read again, but how do I handle this scenario?
addEventListener('fetch', (event) => {
  event.passThroughOnException();
  event.respondWith(handleRequest(event));
});

async function handleRequest(event: FetchEvent): Promise<Response> {
  const response = await fetch(event.request);
  // do something here that potentially raises an Exception
  // @ts-ignore
  ohnoez(); // deliberate failure
  return response;
}
When I test this, the origin server does not receive the request body (foobar).

Unfortunately, this is a known limitation of passThroughOnException(). The Workers Runtime uses streaming for request and response bodies; it does not buffer the body. As a result, once the body is consumed, it is gone. So if you forward the request, and then throw an exception afterwards, the request body is not available to send again.
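To make the limitation concrete, here is a minimal sketch (assuming the same Worker setup as in the question) showing that the body stream is already spent by the time the exception is thrown:

async function handleRequest(event: FetchEvent): Promise<Response> {
  // Forwarding the request consumes its body stream.
  await fetch(event.request);
  console.log(event.request.bodyUsed); // true: the body cannot be read again
  // If an exception is thrown now, passThroughOnException() retries against
  // the origin, but the body stream is spent, so the origin sees no POST data.
  throw new Error('deliberate failure');
}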

I did a workaround by cloning event.request and adding a try/catch in handleRequest. On catch(err), the request is sent to the origin with fetch, passing the cloned request.
// Pass request to whatever it requested
async function passThrough(request: Request): Promise<Response> {
  try {
    let response = await fetch(request)
    // Make the headers mutable by re-constructing the Response.
    response = new Response(response.body, response)
    return response
  } catch (err) {
    return ErrorResponse.NewError(err).respond()
  }
}

// request handler
async function handleRequest(event: FetchEvent): Promise<Response> {
  const request = event.request
  const requestClone = event.request.clone()
  let resp
  try {
    // handle request
    resp = await handler.api(request)
  } catch (err) {
    // Pass through manually on exception (because event.passThroughOnException
    // does not pass request body, so use that as a last resort)
    resp = await passThrough(requestClone)
  }
  return resp
}

addEventListener('fetch', (event) => {
  // Still added passThroughOnException here
  // in case the `passThrough` function throws exception
  event.passThroughOnException()
  event.respondWith(handleRequest(event))
})
Seems to work OK so far. Would love to know if there are other solutions as well.
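One other approach (just a sketch, not something tested in this thread) is to buffer the body up front with arrayBuffer(), so it can be replayed as many times as needed; handler.api is the same hypothetical handler as in the workaround above:

async function handleRequest(event: FetchEvent): Promise<Response> {
  const { request } = event
  // Buffer the body once; an ArrayBuffer can be re-sent, unlike a stream.
  const body = ['GET', 'HEAD'].includes(request.method)
    ? undefined
    : await request.clone().arrayBuffer()
  try {
    // handle request
    return await handler.api(request)
  } catch (err) {
    // Replay the buffered body against the origin as a last resort.
    return fetch(request.url, {
      method: request.method,
      headers: request.headers,
      body,
    })
  }
}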

Related

ClientException, and I can't print the returned value (the request body)

Alright, I'm losing my mind here. In my Flutter app, I'm using this function to perform POST requests:
Future<Map> postRequest(String serviceName, Map<String, dynamic> data) async {
  var responseBody = json.decode('{"data": "", "status": "NOK"}');
  try {
    http.Response response = await http.post(
      _urlBase + '$_serverApi$serviceName',
      body: jsonEncode(data),
    );
    if (response.statusCode == 200) {
      responseBody = jsonDecode(response.body);
      //
      // If we receive a new token, let's save it
      //
      if (responseBody["status"] == "TOKEN") {
        await _setMobileToken(responseBody["data"]);
        // TODO: rerun the Post request
      }
    }
  } catch (e) {
    // An error was received
    throw new Exception("POST ERROR");
  }
  return responseBody;
}
The problems are:
I get a ClientException (not every time).
In another class, I stored the result of this function in a variable. It's supposed to return a Future<Map<dynamic, dynamic>>, but when I printed it, it showed:
I/flutter ( 9001): Instance of 'Future<Map<dynamic, dynamic>>'
But when I ran the same POST request directly (without using the function), it worked and showed the message I was waiting for.
Note: in both cases (function or not), the server side behaved the same.
This is the function where I used the POST request:
void _confirm() {
  if (_formKey.currentState.saveAndValidate()) {
    print(_formKey.currentState.value);
    var v = auth.postRequest("se_connecter", _formKey.currentState.value);
    print(v);
  } else {
    print(_formKey.currentState.value);
    print("validation failed");
  }
}
Well, for the second problem, I just made these changes:
void _confirm() async {
and
var v = await auth.postRequest('se_connecter', _formKey.currentState.value);
and yes, it was a silly mistake: postRequest returns a Future, so it has to be awaited before the result can be printed.
For the exception, it was the SSL encryption that caused it, so I removed it from my backend.

why can service worker respondWith() return a fetch object instead of a real response?

I was reading the MDN docs and got confused about why event.respondWith() can have a fetch result returned. Isn't the actual request initiator expecting a response instead of a fetch?
addEventListener('fetch', event => {
  // Prevent the default, and handle the request ourselves.
  event.respondWith(async function() {
    // Try to get the response from a cache.
    const cachedResponse = await caches.match(event.request);
    // Return it if we found one.
    if (cachedResponse) return cachedResponse;
    // If we didn't find a match in the cache, use the network.
    return fetch(event.request);
  }());
});
The actual request initiator is not expecting a response. It is expecting a promise that resolves into a response. The MDN docs say exactly that:
The respondWith() method of FetchEvent prevents the browser's default
fetch handling, and allows you to provide a promise for a Response
yourself.
You are not returning a fetch object here when you call fetch(event.request). You are calling the fetch() method, which returns a promise that resolves to a Response.
You can return any promise that resolves to a response here, like so:
return new Promise(function(resolve, reject) {
  setTimeout(function() {
    // e.g. resolve with a hand-built Response after a delay
    resolve(new Response('Hello from the service worker'));
  }, 1500);
});

Service Worker - TypeError: Request failed

I used a service worker to cache resources from another domain. I get the error "TypeError: Request failed" at service-worker.js:12 and I don't know why this error is occurring.
service-worker.js
var cacheNames = ['v1'];
var urlsToPrefetch = ['file from other domain'];

self.addEventListener('install', function (event) {
  event.waitUntil(
    caches.open(cacheNames).then(function (cache) {
      console.log('Service Worker: Caching Files');
      cache.addAll(urlsToPrefetch.map(function (urlToPrefetch) {
        console.log(urlToPrefetch);
        return new Request(urlToPrefetch, {mode: 'no-cors'});
      })).catch(function (error) {
        console.log(error);
      });
    })
  );
});
self.addEventListener('fetch', function (event) {
  console.log('Service Worker: Fetching');
  event.respondWith(
    caches.match(event.request)
      .then(function (response) {
        // Cache hit - return response
        if (response) {
          return response;
        }
        return fetch(event.request);
      })
  );
});
This is a side effect of dealing with opaque responses (those fetched with mode: 'no-cors'). Here's an excerpt from a longer answer on the subject:
One "gotcha" that developers might run into with opaque responses involves using them with the Cache Storage API. Two pieces of background information are relevant:
The status property of an opaque response is always set to 0, regardless of whether the original request succeeded or failed.
The Cache Storage API's add()/addAll() methods will both reject if the responses resulting from any of the requests have a status code that isn't in the 2XX range.
From those two points, it follows that if the request performed as part of the add()/addAll() call results in an opaque response, it will fail to be added to the cache.
You can work around this by explicitly performing a fetch() and then calling the put() method with the opaque response. By doing so, you're effectively opting-in to the risk that the response you're caching might have been an error returned by your server.
const request = new Request('https://third-party-no-cors.com/', {mode: 'no-cors'});
// Assume `cache` is an open instance of the Cache class.
fetch(request).then(response => cache.put(request, response));
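Applied to the install handler from the question, that workaround might look roughly like this (a sketch; urlsToPrefetch and the 'v1' cache name come from the question, everything else is illustrative):

self.addEventListener('install', function (event) {
  event.waitUntil(
    caches.open('v1').then(function (cache) {
      return Promise.all(urlsToPrefetch.map(function (urlToPrefetch) {
        var request = new Request(urlToPrefetch, {mode: 'no-cors'});
        // fetch() + put() accepts the opaque response, unlike addAll().
        return fetch(request).then(function (response) {
          return cache.put(request, response);
        });
      }));
    })
  );
});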

How to handle loss of connection in Angular2 with RXJS HTTP when polling

I have the following code (simplified for this post) - assume an initial call to onStart().
Running this works fine. If I lose the internet connection I get the net::ERR_INTERNET_DISCONNECTED error (as expected), but the polling stops.
Clearly I am not handling any errors here, and that is where I'm getting stuck: I'm not clear on where to handle those errors, or how. Do I need to call startPolling() again?
I need the polling to continue even if there is no internet connection, so that data is refreshed on re-connection. Any advice please?
onStart() {
  this.startPolling().subscribe(data => {
    // do something with the data
  });
}

startPolling(): Observable<any> {
  return Observable
    .interval(10000)
    .flatMap(() => this.getData());
}

getData() {
  var url = `http://someurl.com/api`;
  return this.http.get(url)
    .map(response => {
      return response.json();
    });
}
Thanks in advance.
If you know the error happens because of this.http.get(url), then you can add the catch() operator, which lets you subscribe to another Observable instead of the source Observable that sent an error notification.
getData() {
  var url = `http://someurl.com/api`;
  return this.http.get(url)
    .catch(err => Observable.empty())
    .map(response => {
      return response.json();
    });
}
This will simply ignore the error and won't emit anything.
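If you would rather have the subscriber know that a poll failed (for example to show an offline indicator), a variation is to emit a placeholder value instead of nothing. This is a sketch assuming the same RxJS 5 "patch"-style operators used elsewhere in the code (Observable.of needs import 'rxjs/add/observable/of'):

getData() {
  var url = `http://someurl.com/api`;
  return this.http.get(url)
    .map(response => response.json())
    .catch(err => Observable.of(null)); // emit null so the interval keeps polling
}

onStart() {
  this.startPolling().subscribe(data => {
    if (data === null) {
      // request failed (e.g. no connection); just wait for the next poll
      return;
    }
    // do something with the data
  });
}

Note that catch() is placed after map() here, so a failing response.json() is swallowed as well.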

Chaining RxJS Observables with interval

My first question to the community out here!
I'm working on an app which communicates with the API in the following way:
Step 1: create request options, add the request payload --> POST request to the API
The API responds with a request ID
Step 2: update the request options, send the request ID as the payload --> POST request to the API
Final response: response.json
Now, the final response can take a bit of time, depending on the data requested; anywhere between 4 and 20 seconds on average.
How do I chain these requests using observables? I've tried using switchMap and failed (as below), but I'm not sure how to add an interval.
Is polling every 4 seconds and unsubscribing on response a viable solution? How is this done in the above context?
Edit 1:
End goal: I'm new to Angular and learning observables, and I'm looking to understand the best way forward. Does chaining observables help in this context, i.e. after the initial response, add some sort of interval and use flatMap? Or use polling with an interval to check if the report is ready?
Here's what I have so far:
export class reportDataService {
  constructor(private _http: Http) { }

  headers: Headers;
  requestoptions: RequestOptions;
  payload: any;
  currentMethod: string;

  theCommonBits() {
    // create the post request options
    // headers, username, endpoint
    this.requestoptions = new RequestOptions({
      method: RequestMethod.Post,
      url: url,
      headers: newheaders,
      body: JSON.stringify(this.payload)
    })
    return this.requestoptions;
  }

  // report data service
  reportService(payload: any, method: string): Observable<any> {
    this.payload = payload;
    this.currentMethod = method;
    this.theCommonBits();
    // fetch data
    return this._http.request(new Request(this.requestoptions))
      .map(this.extractData)
      .catch(this.handleError);
  }

  private extractData(res: Response) {
    let body = res.json();
    return body || {};
  }

  private handleError(error: any) {
    let errMsg = (error.message) ? error.message :
      error.status ? `${error.status} - ${error.statusText}` : 'Server error';
    console.error(errMsg); // log to console instead
    return Observable.throw(errMsg);
  }
}
In my component:
fetchData() {
  this._reportService.reportService(this.payload, this.Method)
    .switchMap(reportid => {
      return this._reportService.reportService(reportid, this.newMethod)
    }).subscribe(
      data => {
        this.finalData = data;
        console.info('observable', this.finalData)
      },
      error => {
        //console.error("Error fetching data!");
        return Observable.throw(error);
      }
    );
}
What about using a Promise in your service instead of an Observable, and the .then() method in the component? You can chain as many .then() calls as you want to link actions between them.
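For example, a rough sketch of that idea, reusing reportService from the question and converting each Observable with toPromise() (the conversion and the error handling are illustrative, not part of the original answer):

// requires: import 'rxjs/add/operator/toPromise';
fetchData() {
  this._reportService.reportService(this.payload, this.Method)
    .toPromise()
    .then(reportid => this._reportService.reportService(reportid, this.newMethod).toPromise())
    .then(data => {
      this.finalData = data;
      console.info('promise chain', this.finalData);
    })
    .catch(error => console.error('Error fetching data!', error));
}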
