I have written a handler function inside my Next.js pages/api folder:
handler(req, res) {}
I'm using @influxdata/influxdb-client as described in its documentation. My query code looks roughly like this:
from(queryAPI.rows(query)).pipe(...).subscribe({
  next: (value) => { results.push(value); },
  complete: () => { console.log(results); res.status(200).json(results); }
});
I get all of the query values once the observable completes, and this works most of the time.
I push the intermediate results in the subscriber's next handler and try to send the results back to the client in the complete handler. I want the request handler to wait until I have received all the values from the InfluxDB query in the complete handler, so I can send them back to the client.
But the issue is that the handler function will not wait until the observable is completed. The handler returns before the observer completes, and I get the error: "API resolved without sending a response...".
I only have all the values once the observer has completed, and I don't know how to handle this scenario.
How can I make the handler function wait until the observable is completed?
I found a solution for this myself.
I wrapped the observable in a new Promise() and awaited it, resolving the promise in the complete handler of the subscription.
The code looks like the following:
import { from } from 'rxjs';
import { map } from 'rxjs/operators';

export default async function handler(req, res) {
  const results = [];
  try {
    // queryAPI (an InfluxDB QueryApi) and query are created elsewhere
    await new Promise((resolve, reject) => {
      from(queryAPI.rows(query))
        .pipe(map(({ values, tableMeta }) => tableMeta.toObject(values)))
        .subscribe({
          next: (object) => { results.push(object); },
          complete: () => { resolve(results); },
          error: (err) => { reject(err); },
        });
    });
    res.status(200).send(results);
  } catch (err) {
    res.status(500).send(String(err));
  }
}
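As an alternative, the client also exposes a promise-based helper, collectRows, which might avoid the manual promise wrapper entirely. A minimal sketch, assuming queryAPI is the same QueryApi instance used above:
export default async function handler(req, res) {
  try {
    // collectRows awaits the whole result set and maps each row to an object
    const results = await queryAPI.collectRows(query);
    res.status(200).json(results);
  } catch (err) {
    res.status(500).json({ error: String(err) });
  }
}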
I've made a Firebase Cloud Function which adds a claim to a user marking that he or she has paid (sets paid to true for the user):
const admin = require("firebase-admin");
const functions = require("firebase-functions");
exports.addPaidClaim = functions.https.onCall(async (data, context) => {
// add custom claim (paid)
return admin.auth().setCustomUserClaims(data.uid, {
paid: true,
}).then(() => {
return {
message: `Succes! ${data.email} has paid for the course`,
};
}).catch((err) => {
return err;
});
});
However, when I'm running this function I'm receiving the following error: "Unhandled Rejection (RangeError): Maximum call stack size exceeded". I really don't understand why this is happening. Does somebody see what is being called over and over again, causing the function to never end?
Asynchronous operations need to return a promise, as stated in the documentation. Cloud Functions therefore tries to serialize the data contained in the promise you return and send it in JSON format to the client. I believe your setCustomUserClaims call does not resolve with any object that can be treated as the answer to that promise, so the process keeps waiting and ends up throwing the RangeError.
To avoid this error I can think of two different options:
Add a paid parameter so you can send a plain JSON response (and remove the setCustomUserClaims call if there isn't any need to change the user's access control, because custom claims are not designed to store additional data); a sketch for this option follows the code below.
Insert a promise that resolves and sends any needed information to the client. Something like:
// assumes: const request = require("request"); at the top of the file
return new Promise(function(resolve, reject) {
request({
url: URL,
method: "POST",
json: true,
body: queryJSON //A json variable I've built previously
}, function (error, response, body) {
if (error) {
reject(error);
}
else {
resolve(body)
}
});
});
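For the first option, a minimal sketch of what the callable could return instead (assuming the claim should still be set and the client only needs a serializable confirmation back):
exports.addPaidClaim = functions.https.onCall(async (data, context) => {
  // setCustomUserClaims resolves with undefined, so await it and
  // return a plain, JSON-serializable object to the client instead.
  await admin.auth().setCustomUserClaims(data.uid, { paid: true });
  return { paid: true, message: `Success! ${data.email} has paid for the course` };
});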
I'm having a hard time wrapping my brain around this pattern I am trying to implement, so I'm hoping the Stack Overflow community might be able to help me work through a solution.
Currently I use redux-thunk along with superagent to handle calls to my API and sync it all up with Redux.
An example of this might look like:
export const getUser = (id) => {
return (dispatch) => {
const deferred = new Promise((resolve, reject) => {
const call = () => {
API.get(`/users/${id}`)
.then((response) => response.body)
.then((response) => {
if (response.message === 'User found') {
serializeUser(response.data).then((response) => {
resolve(response);
});
} else {
reject('not found');
}
}).catch((err) => {
handleCatch(err, dispatch).then(call).catch(reject)
});
}
call()
});
return deferred;
};
};
In the case where the server comes back with a 200 and some data, I continue on with putting the data into the store and rendering it to the page, or whatever my application does.
In the case where I receive an error, I have attempted to write a function that will intercept those errors and determine whether to show an error on the page or, in the case of a 401 from our API, attempt a token refresh and then retry the original call...
import { refreshToken } from '../actions/authentication';
export default (err, dispatch) => {
const deferred = new Promise((resolve, reject) => {
if (err.status === 401) {
dispatch(refreshToken()).then(resolve).catch(reject)
} else {
reject(err);
}
})
return deferred;
};
This works; however, I have to add it to each call, and it doesn't account for concurrent calls, which should not be attempted while a refresh is already in progress.
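To make that concurrency concern concrete, here is a rough sketch (purely illustrative, with refreshInFlight as a made-up name, and assuming dispatch(refreshToken()) returns a promise as above) of sharing a single in-flight refresh so concurrent 401s all wait on the same refresh:
import { refreshToken } from '../actions/authentication';

let refreshInFlight = null; // single shared refresh promise

export default (err, dispatch) => {
  if (err.status !== 401) {
    return Promise.reject(err);
  }
  // Start a refresh only if one is not already running;
  // concurrent 401s all wait on the same promise.
  if (!refreshInFlight) {
    refreshInFlight = dispatch(refreshToken())
      .finally(() => { refreshInFlight = null; });
  }
  return refreshInFlight;
};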
In my research on this topic I've seen suggestions that redux-saga could work, but I haven't been able to wrap my brain around how I would make it fit.
Basically, I need something like a queue that all my API requests go into, perhaps debounced so concurrent requests are pushed to the end and the calls stack up until a timeout fires. When the first call gets a 401, the queue pauses until the token refresh comes back: if it succeeds, the queue continues; if it fails, all future requests in the queue are cancelled and the user is sent back to the login page.
The thing I'm worried about here is that if the first call in the stack takes a long time, the other calls would also have to wait a long time, which would increase the perceived loading time for the user.
Is there a better way to handle keeping tokens refreshed?
I am following what I've read to be the standard way to use Meteor.call, but it's behaving strangely in this scenario:
Client:
Template.sometemplate.events({
'submit .somebutton'(event){
...
Meteor.call('stuff.someMethod', param1, function (err, res){
console.log(err);
console.log(res);
});
}
})
Server /api/stuff.js:
Meteor.methods({
'stuff.someMethod'(param1){
...
Meteor.call('otherstuff.someOtherMethod', param1, function(err, res){
if(err){ throw new Meteor.Error(400,'wrong things');}
if(res) { return 'ok';}
}
);
}
})
Server /api/otherstuff.js:
Meteor.methods({
'otherstuff.someOtherMethod'(param1){
...
return OtherStuff.findOne(query);
}
})
On the client side I click and immediately see the console.log output for both err and res as undefined. In other parts of the application, when the client calls a server method that is not itself calling another method, the client waits for the answer before executing the async callback.
There must be something wrong in how I use Meteor.call inside a server method that calls another server method. The scenario is that, for instance, I want to insert a document and, while doing so, check some values in order to link it to documents from other collections.
Thank you very much,
T.
Sync call on the server
Using Meteor.call on the server does not require a callback, unless you really want to work async on the server side.
If you do not pass a callback on the server, the method invocation will block until the method is complete. It will eventually return the return value of the method, or it will throw an exception if the method threw an exception. (Possibly mapped to 500 Server Error if the exception happened remotely and it was not a Meteor.Error exception.)
Instead of passing a callback you would either return the result
return Meteor.call(...)
or assign it to a variable that is used for further processing.
const retVal = Meteor.call(...)
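Applied to the question's stuff.someMethod, that would look roughly like this (a sketch that treats a missing result as the error case, with otherstuff.someOtherMethod left unchanged):
Meteor.methods({
  'stuff.someMethod'(param1) {
    // Without a callback, Meteor.call blocks on the server and
    // returns otherstuff.someOtherMethod's return value directly.
    const res = Meteor.call('otherstuff.someOtherMethod', param1);
    if (!res) {
      throw new Meteor.Error(400, 'wrong things');
    }
    return 'ok';
  }
});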
Better way: Externalize shared code
If two meteor methods rely on the same code (e.g. one is calling the other) you should extract this code into a shared function. This makes testing and tracing errors also easier.
server/api/common.js
export const sharedFunction = function(param1) {
// ... do something
return OtherStuff.findOne(query);
}
server/api/stuff.js:
import { sharedFunction } from './common.js';
Meteor.methods({
'stuff.someMethod'(param1){
// ...
const temp = sharedFunction(param1);
// ...
return result; // or temp if this should be returned to client
}
})
server/api/otherstuff.js
import { sharedFunction } from './common.js';
Meteor.methods({
'otherstuff.someOtherMethod'(param1){
return sharedFunction(param1);
}
});
Using the sharedFunction follows the concepts of DRY and Single Point of Failure.
In a React Native project, I wrote this function using a Promise to do a job asynchronously:
function doEncryptionAsync(params) {
return new Promise(
function (resolve, reject) {
// Async code started
console.log('Promise started (Async code started)');
// The job that takes some times to process
var encrypted_value = new EncryptedValue(params);
if (true) {
resolve(encrypted_value);
}
else {
reject("Error while encrypting!");
}
}
)
}
And I call it in my Redux action:
export const encrypt = ( params ) => {
return (dispatch) => {
dispatch({
type: type.ENCRYPT
});
// Sync code started
console.log('Started (Sync code started)');
doEncryptionAsync(params)
.then((response) => {
// Async code terminated
console.log('Promise fulfilled (Async code terminated)');
encryptSuccess(dispatch, response);
})
.catch((error) => {
console.log(error);
encryptFail(dispatch);
});
// Sync code terminated
console.log('Promise made (Sync code terminated)');
}
}
It works, but not asynchronously! My main thread seems to be blocked until doEncryptionAsync() returns. The line console.log('Promise made (Sync code terminated)') runs, but not immediately!
My output for logs is like this;
// OUTPUT Simulation
Started (Sync code started) at time x
Promise started (Async code started) at time x
Promise made (Sync code terminated) at time (x + 2sec)
Promise fulfilled (Async code terminated) at time (x + 2sec)
My question is: what's wrong with my approach to implementing an AsyncTask?
JavaScript's asynchronous behavior only helps with IO-bound operations: instead of waiting for an IO call to finish, the event loop keeps running.
Since JS is single-threaded, CPU-bound computations occupy the thread and cannot be made asynchronous just by wrapping them in a Promise.
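As a small illustration of why your logs come out in that order (the function passed to new Promise runs synchronously on the calling thread; only the .then callback is deferred):
console.log('before');
new Promise((resolve) => {
  // This body runs immediately, on the same tick as the surrounding code.
  console.log('inside executor');
  resolve();
}).then(() => console.log('then callback')); // deferred to a later microtask
console.log('after');
// Output: before, inside executor, after, then callback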
Your only recourse, then, is to create a native module that will do the calculation in a different thread for you, and then call a JS callback when it's done.
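On the JavaScript side, calling such a native module might look roughly like this (EncryptionModule and encrypt are hypothetical names; the actual encryption code would live in Java/Objective-C and run off the JS thread):
import { NativeModules } from 'react-native';

// Hypothetical native module exposing a promise-returning encrypt method.
const { EncryptionModule } = NativeModules;

export function doEncryptionAsync(params) {
  return EncryptionModule.encrypt(params);
}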
I'm creating a Redux middleware that listens for a specific action. If that action type matches what I'm looking for, I want to dispatch another action. The reason is that I have different components with some shared functionality, so I want to map their actions to a common type with different payloads (term), like so:
const updateActions = store => next => action => {
console.log(action);
if (action.type === 'UNIQUE_ACTION_A') {
return next({ type: 'NEW_ACTION', term: 'new_action_a_test' });
} else if (action.type === 'UNIQUE_ACTION_B') {
return next({
type: 'NEW_ACTION',
term: 'new_action_b_test'
});
}
return next(action);
};
const store = createStore(
rootReducer,
composeWithDevTools(applyMiddleware(thunk, updateActions))
);
The problem I'm having is that these actions are not being dispatched. When I console.log all actions, they continue to run as normal rather than being replaced by the new actions. It's as if the call to dispatch these actions is just being ignored. What am I doing incorrectly here? Thanks!
There's a difference between next and dispatch inside middleware. dispatch sends an action to the very start of the dispatch chain, which will cause it to run through all the middleware in the pipeline. next sends it to the next middleware after this one, and eventually to the reducers. By calling next({type : "NEW_ACTION"}), you're sending it to the next middleware, and this middleware will never see "NEW_ACTION".
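If the intent is for "NEW_ACTION" to go back through the whole middleware chain, a minimal sketch of re-dispatching it instead of passing it to next might look like this (the reducers then only ever see "NEW_ACTION" for these two cases):
const updateActions = store => next => action => {
  if (action.type === 'UNIQUE_ACTION_A') {
    // Re-dispatch so the new action starts at the top of the chain.
    // 'NEW_ACTION' is not matched here, so this does not loop.
    return store.dispatch({ type: 'NEW_ACTION', term: 'new_action_a_test' });
  }
  if (action.type === 'UNIQUE_ACTION_B') {
    return store.dispatch({ type: 'NEW_ACTION', term: 'new_action_b_test' });
  }
  return next(action);
};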
Also see the Redux FAQ entry on next vs dispatch in middleware.