I'd like to handle AJAX timeouts using redux-observable so that if a timeout occurs (after, say, 10 seconds) the request is retried up to two more times (firing a SAVE_RETRYING action each time so the UI can tell the user it's retrying).
For any other type of error, or once it has already retried twice, it should just fail and fire a SAVE_FAILURE action.
I can make it work if I trigger the SAVE_RETRYING action with store.dispatch, but I'm getting deprecation warnings about that, and I'm stuck on how to do it the proper way (emitting SAVE_RETRYING on the stream returned by the epic).
Here's what I have (simplified):
function saveEpic(action$, store) {
return action$.ofType('SAVE_CLICKED')
.mergeMap(action => (
ajax({
url: '/a-long-request',
})
.timeout(10000)
.map(() => ({ type: 'SAVE_SUCCESS' }))
.retryWhen(errors => (
errors.scan((count, e) => {
if (count >= 2 || e.name !== 'TimeoutError') {
throw e;
} else {
store.dispatch({ type: 'SAVE_RETRYING', count });
return count + 1;
}
}, 0)))
.startWith({ type: 'SAVE_STARTED' })
.catch(() =>
Observable.of({ type: 'SAVE_FAILURE' }))
));
}
How can I get that SAVE_RETRYING action up to the main stream? Thx.
This is not ideal, but you could use catch and its undocumented second argument (the source observable) to resubscribe. The downside I don't like is that you have to count the retries in the mergeMap callback's closure.
function saveEpic(action$, store) {
return action$.ofType('SAVE_CLICKED')
.mergeMap(action => {
let retries = 0;
return ajax({
url: '/a-long-request',
})
.timeout(10000)
.map(() => ({ type: 'SAVE_SUCCESS' }))
.catch((error, source) => {
retries += 1;
if (retries > 2 || error.name !== 'TimeoutError') { // allow two retries before failing
return Observable.of({ type: 'SAVE_FAILURE' });
}
return source.startWith({ type: 'SAVE_RETRYING', count: retries });
})
.startWith({ type: 'SAVE_STARTED' });
});
}
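Alternatively, here is a rough sketch (untested) that keeps everything in the epic's output stream without relying on the undocumented catch argument: a small recursive helper (attempt is a local function, not an RxJS operator) resubscribes to the ajax call and prepends the SAVE_RETRYING action with startWith. It assumes the same ajax and Observable imports as the code above.

function saveEpic(action$) {
  return action$.ofType('SAVE_CLICKED')
    .mergeMap(() => {
      // Retry recursively; retryCount tracks how many retries have been attempted.
      const attempt = retryCount =>
        ajax({ url: '/a-long-request' })
          .timeout(10000)
          .map(() => ({ type: 'SAVE_SUCCESS' }))
          .catch(error => {
            if (retryCount >= 2 || error.name !== 'TimeoutError') {
              return Observable.of({ type: 'SAVE_FAILURE' });
            }
            // Emit SAVE_RETRYING into the epic's stream, then resubscribe.
            return attempt(retryCount + 1)
              .startWith({ type: 'SAVE_RETRYING', count: retryCount + 1 });
          });
      return attempt(0).startWith({ type: 'SAVE_STARTED' });
    });
}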
I'm trying to make a custom async schema validator in Mongoose to check that the "tags" array for a course being created contains at least one item (using setTimeout() to simulate async work). The part of the schema for tags is:
tags: {
type: Array,
validate: {
isAsync: true,
validator: function (v, cb) {
setTimeout(() => {
//do some async work
const result = v && v.length > 0;
cb(result);
}, 3000);
},
message: "A course should have at least one tag!",
},
},
The code for creating a course is:
async function createCourse() {
const course = new Course({
name: "Node.js Course",
author: "Anon",
category: "web",
tags: [],
isPublished: true,
price: 13,
});
try {
const result = await course.save();
cl("createCourse result", result);
} catch (ex) {
cl("createCourse validate error", ex.message);
}
}
createCourse();
I have an empty array for tags and expected the caught error "A course should have at least one tag!". Instead I am getting TypeError: cb is not a function for the cb(result) callback. Even if I have an item in the tags array it still throws the callback error, and in fact it displays the createCourse result BEFORE the schema's async validator completes, then throws the error when it does complete. (If I don't use the async validator but just a plain validator, it works fine.)
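This typically happens because newer Mongoose releases dropped support for callback-style (isAsync: true) validators, so the second cb argument is never passed to the validator function (hence "cb is not a function"). The supported approach is to return a Promise from the validator instead, for example: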
tags: {
type: Array,
validate: {
validator: function(v) {
return new Promise((resolve, reject) => {
setTimeout(() => {
resolve(v && v.length > 0);
}, 3000);
})
},
message: 'A course should have at least one tag.'
}
},
After trial and error, I came up with the solution below. No changes were needed to createCourse(), only to the schema's tags section, plus an added delay function.
tags: {
type: Array,
validate: {
//isAsync: true,
validator: async function (v) {
await delay(3);
const result = v && v.length > 0;
return result;
},
message: "A Document should have at least one tag!",
},
},
The validator calls a delay function, used to simulate a "real" async situation where the data might be saved to a remote server.
const delay = (n) => {
return new Promise(function (resolve) {
setTimeout(resolve, n * 1000);
});
};
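With this in place, saving a course whose tags array is empty should reject after roughly three seconds with the "A Document should have at least one tag!" message, which createCourse() then logs from its catch block; a non-empty tags array resolves normally.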
I need to handle a situation where I have 3 endpoints to call and would like to get the data in the most convenient/efficient way. The first call can be handled independently and returns a single result. The second endpoint returns a collection, but it may need to initiate 0-* subsequent calls where a given key is present.
Ideally I would like to receive the collection (from the 2nd endpoint call) as a mutated/new collection that includes the result of the 3rd endpoint call.
I am currently using forkJoin(observableA$, observableB$) to handle the first 2 calls in parallel, but I cannot work out how to include the sequential calls and have their data included in observableB$.
//Customer observable
const customer$ = this._customerManagementService.getCustomer(
accountNumber
);
return forkJoin({
customer: customer$,
saleCycles: saleCyclesWithVehicle$
}).pipe(finalize(() => this._loaderFactoryService.hide()));
getSalesWithVehicle(accountNumber: string, dealerKey: string) {
return this._salesCycleService
.getCyclesForCustomer({
customerNumber: accountNumber,
dealerKey: dealerKey
})
.pipe(
concatMap((results: ISaleCycle[]) => {
return results.map(cycle => {
return this._purchaseVehicleService.getPurchaseVehicle(
cycle.vehicleKey
);
});
})
);
}
I expect the collection to include the further data as a new property on the original collection.
UPDATE
After a bit more thought, maybe I should be using reduce somewhere in the solution. That way I can control what gets pushed into the array, and it could be dynamic?
getSalesWithVehicle(accountNumber: string, dealerKey: string) {
return this._salesCycleService
.getCyclesForCustomer({
customerNumber: accountNumber,
dealerKey: dealerKey
})
.pipe(
switchMap((results: ISaleCycle[]) => {
return results.map(cycle => {
if (cycle.vehicleKey) {
return this._purchaseVehicleService
.getPurchaseVehicle(cycle.vehicleKey)
.pipe(
reduce((acc, vehicle) => {
return { cycle: cycle, vehicle: vehicle };
}, []),
toArray()
);
}
else {
///No extra data to be had
}
});
}),
concatAll()
);
}
I would use concatMap() to merge the responses of HTTP requests 2 and 3.
import { of } from 'rxjs';
import { map, concatMap } from 'rxjs/operators';
const pretendGetCustomer = of({accountNumber: 123, name:"John Doe"});
const pretendGetVehiculeHttpRequest = (customerNumber) => {
return of([{custNum: 123, vehicleId:"2"}, {custNum: 123, vehicleId:"1"}]);
}
const pretendGetCyclesHttpRequest = (cycleIds) => {
return of([{id:"1", name:"yellow bike", retailPrice:"$10"}, {id:"2", name:"red bike", retailPrice:"$20"}]);
}
const yourFunction = () => {
pretendGetCustomer.subscribe(customer => {
// Assuming you do other things here with cust, reason why we are subscribing to this separately
// isHappy(customer)
// Your second & third calls
pretendGetVehiculeHttpRequest(customer.accountNumber).pipe(
// Need to use concatMap() to subscribe to new stream
// Note: use mergeMap() if you don't need the 1st stream to be completed
// before calling the rest
concatMap(purchases => {
const cyclesIds = purchases.map(p => p.vehicleId);
// concatMap() requires an Observable in return
return pretendGetCyclesHttpRequest(cyclesIds).pipe(
// Use map() here because we just need to use the data,
// don't need to subscribe to another stream
map(cycles=>{
// Return whatever object you need in your subscription
return {
customerNumber: customer.accountNumber,
customerName: customer.name,
purchases: purchases.map(p => cycles.find(c => p.vehicleId === c.id))
}
})
);
})
).subscribe(resultof2and3 => {
// Do something with the new/mutated Object which is a result of
// your HTTP calls #2 and #3
console.log(resultof2and3);
});
});
}
yourFunction();
I made a stackblitz if you want to see the above run (see console): https://stackblitz.com/edit/rxjs-nqi7f1
This is the solution I eventually came up with. I've taken the advice from BoDeX and used concatMap(). In my mind it was clear that I wanted to use forkJoin and be able to reference the results by object key, i.e. customer or saleCycles.
In the scenario where a vehicleKey was present, I needed to return the results in a defined data structure using map(). Likewise, if no vehicle was found, I just needed the outer observable.
const customer$ = this._customerManagementService.getCustomer(accountNumber);
const saleCyclesWithVehicle$ = this.getSalesWithVehicle(accountNumber,dealerKey);
getSalesWithVehicle(accountNumber: string, dealerKey: string) {
return this._salesCycleService
.getCyclesForCustomer({
customerNumber: accountNumber,
dealerKey: dealerKey
})
.pipe(
concatMap(cycles => {
return from(cycles).pipe(
concatMap((cycle: ISaleCycle) => {
if (cycle.vehicleKey) {
return this._purchaseVehicleService
.getPurchaseVehicle(cycle.vehicleKey)
.pipe(
map(vehicle => {
return { cycle: cycle, vehicle: vehicle };
})
);
} else {
return of({ cycle: cycle });
}
}),
toArray()
);
})
);
}
return forkJoin({
customer: customer$,
saleCycles: saleCyclesWithVehicle$
}).pipe(finalize(() => this._loaderFactoryService.hide()));
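For completeness, here is a minimal sketch of how the keyed forkJoin result might be consumed (the loadCustomerData wrapper method is hypothetical, standing in for whatever method contains the forkJoin call above):

// forkJoin({ ... }) emits a single object, keyed the same way as its input,
// once both inner observables have completed.
this.loadCustomerData(accountNumber, dealerKey).subscribe(({ customer, saleCycles }) => {
  // Each entry in saleCycles is either { cycle, vehicle } or { cycle }.
  console.log(customer, saleCycles);
});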
I have the first asynchronous function
fetch("https://api.priceapi.com/v2/jobs", {
body: body,
headers: {
"Content-Type": "application/x-www-form-urlencoded"
},
method: "POST"
}).then((response) => {
return response.json();
}).then((data) => {
return fetchRepeat(data.job_id)
})
And the second recursive asynchronous function.
function fetchRepeat(id){
fetch("https://api.priceapi.com/v2/jobs/"+ id +"/download.json?token=" + priceapisecret.secret)
.then((response) => {
return response.json()
}).then((data) =>{
if(data.status == "finished"){
var bookdata = {
title: data.results[0].content.name,
price: data.results[0].content.price
}
return bookdata;
}
else{
fetchRepeat(id)
}
})
}
I want to be able to access bookdata in the first async function. How do I do that?
For the first function to get a value back, fetchRepeat needs to return its promise chain; it did not, so the call resolved to undefined. The last then also didn't return the value of the recursion, so that branch likewise resolved to undefined.
Here is a working version:
function fetchRepeat(id) {
// return the promise
return fetch(`https://api.priceapi.com/v2/jobs/${id}/download.json?token=${priceapisecret.secret}`)
.then(response => response.json())
.then(({ status, results: [{ content: { name: title, price } }] = [{ content: {} }] }) =>
(status === 'finished' ? { title, price } : fetchRepeat(id))); // return result of recursion
}
Now, I let ESLint handle the formatting, and since I use airbnb it prefers destructuring. The error in the last then was obvious because ESLint complained about consistent-return. I urge you to use a linter and an IDE that enforces a coding style, to reduce bugs in your code and make it easier for others to read.
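With fetchRepeat returning its promise chain, the first function can simply keep chaining to get at bookdata (a sketch reusing the body and priceapisecret variables from the question):

fetch("https://api.priceapi.com/v2/jobs", {
  body: body,
  headers: {
    "Content-Type": "application/x-www-form-urlencoded"
  },
  method: "POST"
}).then((response) => {
  return response.json();
}).then((data) => {
  return fetchRepeat(data.job_id);
}).then((bookdata) => {
  // bookdata ({ title, price }) resolved by fetchRepeat is available here.
  console.log(bookdata);
});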
I'm trying to subscribe to a subject. This works as expected the first time, but it throws the error below the second time, and I can't see where to fix it.
export function uploadSceneFile(action$, store) {
return action$.ofType(CREATE_SCENE_SUCCESS)
.mergeMap(({payload}) =>
UploadSceneWithFile(payload)
.map(res => {
if (res.progress > 0){
return { type: UPLOAD_SCENE_PROGRESS, scene: res }
}
else if(res.progress === -1){
return { type: UPLOAD_SCENE_SUCCESS, scene: res }
}
})
)
}
It's designed to listen for the scene being created, dispatch upload progress notifications, and then dispatch the success message.
The error gets thrown straight away from this line the second time it runs
onProgress: (val)=> subject$.next({...scene,progress:val}),
export function UploadSceneWithFile(scene){
const subject$ = new Subject()
scene.filename = scene.file.name
scene.type = scene.file.type.match('image') ? 0 : 1
FileToScenePreview(scene).then(res => {
scene.thumbName = res.thumbName
})
const uploader = new S3Upload({
getSignedUrl: getSignedUrl,
uploadRequestHeaders: {'x-amz-acl': 'public-read'},
contentType: scene.file.type,
contentDisposition: 'auto',
s3path: 'assets/',
onError:()=>subject$.next('error'),
onProgress: (val)=> subject$.next({...scene,progress:val}),
onFinishS3Put: ()=> {
subject$.next({...scene,progress:-1})
subject$.complete()
},
})
uploader.uploadFile(scene.file)
return subject$.asObservable()
}
ERROR MESSAGE
Subscriber.js:242 Uncaught Error: Actions must be plain objects. Use custom middleware for async actions.
at Object.performAction (<anonymous>:1:40841)
at liftAction (<anonymous>:1:34377)
at dispatch (<anonymous>:1:38408)
at createEpicMiddleware.js:59
at createEpicMiddleware.js:59
at SafeSubscriber.dispatch [as _next] (applyMiddleware.js:35)
at SafeSubscriber../node_modules/rxjs/Subscriber.js.SafeSubscriber.__tryOrUnsub (Subscriber.js:238)
at SafeSubscriber../node_modules/rxjs/Subscriber.js.SafeSubscriber.next (Subscriber.js:185)
at Subscriber../node_modules/rxjs/Subscriber.js.Subscriber._next (Subscriber.js:125)
at Subscriber../node_modules/rxjs/Subscriber.js.Subscriber.next (Subscriber.js:89)
at SwitchMapSubscriber../node_modules/rxjs/operators/switchMap.js.SwitchMapSubscriber.notifyNext (switchMap.js:126)
at InnerSubscriber../node_modules/rxjs/InnerSubscriber.js.InnerSubscriber._next (InnerSubscriber.js:23)
at InnerSubscriber../node_modules/rxjs/Subscriber.js.Subscriber.next (Subscriber.js:89)
at MergeMapSubscriber../node_modules/rxjs/operators/mergeMap.js.MergeMapSubscriber.notifyNext (mergeMap.js:145)
at InnerSubscriber../node_modules/rxjs/InnerSubscriber.js.InnerSubscriber._next (InnerSubscriber.js:23)
at InnerSubscriber../node_modules/rxjs/Subscriber.js.Subscriber.next (Subscriber.js:89)
at MergeMapSubscriber../node_modules/rxjs/operators/mergeMap.js.MergeMapSubscriber.notifyNext (mergeMap.js:145)
at InnerSubscriber../node_modules/rxjs/InnerSubscriber.js.InnerSubscriber._next (InnerSubscriber.js:23)
at InnerSubscriber../node_modules/rxjs/Subscriber.js.Subscriber.next (Subscriber.js:89)
at MapSubscriber../node_modules/rxjs/operators/map.js.MapSubscriber._next (map.js:85)
at MapSubscriber../node_modules/rxjs/Subscriber.js.Subscriber.next (Subscriber.js:89)
at Subject../node_modules/rxjs/Subject.js.Subject.next (Subject.js:55)
at S3Upload.onProgress (uploadSceneFile.js:27)
at S3Upload.<anonymous> (s3upload.js:139)
In the inner map within your uploadSceneFile, you have an if statement followed by an else if statement, so if neither is true, the map will return undefined instead of an action.
.map(res => {
if (res.progress > 0){
return { type: UPLOAD_SCENE_PROGRESS, scene: res }
}
else if(res.progress === -1){
return { type: UPLOAD_SCENE_SUCCESS, scene: res }
}
// An action should be returned here!
})
Note that when an undefined action is passed along, the check Redux performs to determine whether an action is a plain object fails, producing the error you are seeing.
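One possible fix, sketched under the assumption that you only want to emit actions for progress updates and for completion (and that the filter operator is imported alongside the others), is to drop the values that don't map to an action before mapping:

export function uploadSceneFile(action$, store) {
  return action$.ofType(CREATE_SCENE_SUCCESS)
    .mergeMap(({ payload }) =>
      UploadSceneWithFile(payload)
        // Ignore emissions that don't correspond to an action
        // (e.g. the 'error' string or a 0% progress value).
        .filter(res => res.progress > 0 || res.progress === -1)
        .map(res =>
          res.progress === -1
            ? { type: UPLOAD_SCENE_SUCCESS, scene: res }
            : { type: UPLOAD_SCENE_PROGRESS, scene: res }
        )
    );
}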
How can I correctly search for a row in the database and INSERT or UPDATE according to the search result (INSERT if not found, UPDATE if found)?
I'm currently doing this:
bookshelf.transaction(async function (t) {
for (var x = 0; x < 10; x++) {
let row = pmsParser.getRow(x);
if (_.isEmpty(row)) {
break;
}
let data = {
lastUpdate: moment(row.lastUpdate, 'DD/MM/YYYY - HH:mm').toDate(),
mvs: row.version,
color: row.color,
location: row.location,
status: row.status
};
new Vehicle({ chassi: row.chassi })
.fetch({ require: true })
.then(model => {
return new Vehicle(model)
.save(data, { transacting: t, patch: true });
})
.catch(Vehicle.NotFoundError, err => {
new Vehicle(data)
.save('chassi', row.chassi, { transacting: t })
.then(() => {
console.log(`Inserted... ${row.chassi}`);
});
})
.catch(err => {
console.log(err.message);
});
}
})
.catch(function (err) {
console.error(err);
return res.json({ status: false, count: 0, error: err.message });
});
And I receive this error:
Transaction query already complete, run with DEBUG=knex:tx for more info
Unhandled rejection Error: Transaction query already complete, run with DEBUG=knex:tx for more info
at completedError (/home/node/app/node_modules/knex/lib/transaction.js:297:9)
at /home/node/app/node_modules/knex/lib/transaction.js:266:22
at tryCatcher (/home/node/app/node_modules/bluebird/js/release/util.js:16:23)
at Function.Promise.attempt.Promise.try (/home/node/app/node_modules/bluebird/js/release/method.js:39:29)
at Client_SQLite3.trxClient.query (/home/node/app/node_modules/knex/lib/transaction.js:264:34)
at Runner.<anonymous> (/home/node/app/node_modules/knex/lib/runner.js:138:36)
at Runner.tryCatcher (/home/node/app/node_modules/bluebird/js/release/util.js:16:23)
at Runner.query (/home/node/app/node_modules/bluebird/js/release/method.js:15:34)
at /home/node/app/node_modules/knex/lib/runner.js:61:21
at tryCatcher (/home/node/app/node_modules/bluebird/js/release/util.js:16:23)
at /home/node/app/node_modules/bluebird/js/release/using.js:185:26
at tryCatcher (/home/node/app/node_modules/bluebird/js/release/util.js:16:23)
at Promise._settlePromiseFromHandler (/home/node/app/node_modules/bluebird/js/release/promise.js:512:31)
at Promise._settlePromise (/home/node/app/node_modules/bluebird/js/release/promise.js:569:18)
at Promise._settlePromise0 (/home/node/app/node_modules/bluebird/js/release/promise.js:614:10)
at Promise._settlePromises (/home/node/app/node_modules/bluebird/js/release/promise.js:693:18)
Knex debug output
knex:tx trx1: Starting top level transaction +0ms
knex:tx trx1: releasing connection +28ms
knex:tx undefined: Transaction completed: update "vehicles" set "color" = ?, "lastUpdate" = ?, "location" = ?, "mvs" = ?, "status" = ? where "id" = ? +15ms
Transaction query already complete, run with DEBUG=knex:tx for more info
knex:tx undefined: Transaction completed: update "vehicles" set "color" = ?, "lastUpdate" = ?, "location" = ?, "mvs" = ?, "status" = ? where "id" = ? +8ms
Transaction query already complete, run with DEBUG=knex:tx for more info
When running under a transaction, ALL related database accesses must happen within the context of the transaction.
//...
new Vehicle({ chassi: row.chassi })
.fetch({ require: true, transacting: t })
.then(model => {
//...
Your iterations are not being correctly promisified, which lets your queries escape the transaction context and causes the 'Transaction query already complete' error. When creating promises within a loop, it is always advisable to collect them and hand them to something that handles the whole collection, such as Promise.all(). That prevents escaping the transaction context before all the promises are resolved.
Those changes may lead to a code as below (untested):
bookshelf.transaction(async function (t) {
let promises = [];
for (var x = 0; x < 10; x++) {
let row = pmsParser.getRow(x);
if (_.isEmpty(row)) {
break;
}
let data = {
lastUpdate: moment(row.lastUpdate, 'DD/MM/YYYY - HH:mm').toDate(),
mvs: row.version,
color: row.color,
location: row.location,
status: row.status
};
promises.push(
new Vehicle({ chassi: row.chassi })
.fetch({ require: true, transacting: t })
.then(model => {
return model // no need to use 'new Vehicle()' here
.save(data, { transacting: t, patch: true });
})
.catch(Vehicle.NotFoundError, err => {
return new Vehicle(data) // missing 'return'
.save('chassi', row.chassi, { transacting: t })
.then(() => {
console.log(`Inserted... ${row.chassi}`);
});
})
.catch(err => {
console.log(err.message);
// throw err; // should rethrow it!
})
);
}
return Promise.all(promises)
.catch(function (err) {
console.error(err);
return res.json({ status: false, count: 0, error: err.message });
});
});
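One caveat, per the comments above: because the per-row catch logs and swallows the error instead of rethrowing it, Promise.all() will still resolve and the transaction will commit whatever rows did succeed. Rethrow inside that catch if you want any failure to roll back the whole transaction.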