I have an API call in one of my Vuex store actions, and I want to refactor it with a setTimeout to give my initial state time to commit (about a second).
Here's the function:
Authenticate: ({ commit, state }) => {
  return new Promise((resolve) => {
    commit('auth_request')
    return Axios.get(util.functions.pathFromScript('authenticate.php'), { params: params() }).then(resp => {
      state.logged_in = true
      state.restricted = false
      const data = resp.data
      const session_id = data.session_id
      console.log(session_id)
      localStorage.setItem('data', data)
      localStorage.setItem('session_id', session_id)
      commit('auth_success', params, data, session_id)
      resolve(resp)
    })
  })
}
Basically, I can't figure out where to add the async and the setTimeout here. Any help would be appreciated a lot! Thank you in advance.
There are already two answers to this question.
This one:
.then(resp => setTimeout(resolve, 1000){...
from here
and:
.then(promiseTimeout(2000)).then
from here
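To tie those suggestions together, here is a minimal sketch of one way to do it, wrapping the Axios call in a setTimeout inside the Promise executor. The 1000 ms delay, the reject handler, the JSON.stringify call and the single payload object passed to commit are assumptions of this sketch, not part of the original code:

Authenticate: ({ commit }) => {
  return new Promise((resolve, reject) => {
    commit('auth_request')
    // Assumed: wait ~1s before firing the request so the initial state has time to commit
    setTimeout(() => {
      Axios.get(util.functions.pathFromScript('authenticate.php'), { params: params() })
        .then(resp => {
          const data = resp.data
          const session_id = data.session_id
          // Assumed: stringify before storing, since localStorage only holds strings
          localStorage.setItem('data', JSON.stringify(data))
          localStorage.setItem('session_id', session_id)
          // Assumed: commit takes a single payload object
          commit('auth_success', { params: params(), data, session_id })
          resolve(resp)
        })
        .catch(reject)
    }, 1000)
  })
}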
Related
This is a topic that's been discussed a lot in GitHub issues, and by now I've noticed two main opinions: it's not possible, or it should not be done at all.
The argument for both sides is that Redux is not meant for it, and that the .replaceReducer function is only meant for hot reloading (even though Redux itself mentions it as a possibility for code splitting).
The goal
Anyway, what I would like to achieve (ideally) is a system that only sends the relevant slices and relevant Redux code for a specific route in Next.js. And (even more ideally), when navigating between pages, the store should just get extended and not re-created.
My initial approach
My first idea was to implement a recipe from the link above, attaching and exposing the injectReducer function onto my store during the store setup:
import { configureStore, combineReducers } from '@reduxjs/toolkit';
import { createWrapper } from 'next-redux-wrapper';
// globals (the globalsSlice reducer) and rtkqApi are imported from their own modules

const store = configureStore({
  reducer: {
    globals,
    [rtkqApi.reducerPath]: rtkqApi.reducer
  },
  middleware: (getDefaultMiddleware) => getDefaultMiddleware().concat(rtkqApi.middleware)
});

// Keep track of reducers added after store creation
store.dynamicReducers = {};

store.injectDynamicReducer = (name, reducer) => {
  if (Object.keys(store.dynamicReducers).includes(name)) {
    return;
  }
  store.dynamicReducers[name] = reducer;
  // Rebuild the root reducer with the static slices plus everything injected so far
  store.replaceReducer(
    combineReducers({
      globals,
      [rtkqApi.reducerPath]: rtkqApi.reducer,
      ...store.dynamicReducers
    })
  );
};

const makeStore = () => store;

export const wrapper = createWrapper(makeStore);
export const injectReducer = (sliceName, reducer) => store.injectDynamicReducer(sliceName, reducer);
So basically every page would have a globalsSlice, containing the user info and some other global data, and a Redux Toolkit Query API slice (which would then be code-split using RTKQ's injectEndpoints functionality).
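For reference, the RTKQ side of that split would look roughly like this; the endpoint name, URL and import path are made-up placeholders for this sketch, not my actual code:

import { rtkqApi } from '../store'; // path assumed

// injectEndpoints adds endpoints to the shared API slice at module load time,
// so only pages that import this file ship this endpoint's code
const somePageApi = rtkqApi.injectEndpoints({
  endpoints: (build) => ({
    getSomePageData: build.query({
      query: () => '/some-page-data'
    })
  }),
  overrideExisting: false
});

export const { useGetSomePageDataQuery } = somePageApi;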
With this setup, each page that wants to inject its own custom slice (reducer) would do something like this:
import { Fragment } from 'react';
import Head from 'next/head';
import { useSelector } from 'react-redux';
// injectReducer and wrapper come from the store setup above;
// somePageReducer and somePageSliceSelectors come from the page's own slice module

const SomePage = () => {
  const someData = useSelector(somePageSliceSelectors.selectSomeData);
  return (
    <Fragment>
      <Head>
        <title>Some Page</title>
      </Head>
    </Fragment>
  );
};

export default SomePage;

// Runs at module load time, so the slice is registered before the page renders
injectReducer('somePageSlice', somePageReducer);

export const getServerSideProps = wrapper.getServerSideProps((store) => async (context) => {
  // Whatever necessary logic we need
});
Initially this seemed to work fine, but then I realized that next-redux-wrapper calls the makeStore factory on every request, while I'm manipulating and mutating a single global store object. There has to be something wrong with this, e.g. a race condition that I simply haven't been able to trigger in testing.
Another problem occurs when using Redux Toolkit Query. For example, if I need to read a cookie from the original request (the one Next.js receives) and re-send it to another API endpoint handled by Redux Toolkit Query, I need to extract the cookie from the request context, to which I don't have access unless I do something like this:
export const makeStore = (ctx) => {
  return configureStore({
    reducer: ...,
    middleware: (getDefaultMiddleware) =>
      getDefaultMiddleware({
        thunk: {
          // The Next.js request context becomes available to every thunk
          extraArgument: ctx,
        },
      }).concat(...),
  });
};
which further implies that I should definitely not be mutating the global store object.
So then I thought: alright, instead of manipulating the global store, I could try doing the injection in getServerSideProps:
export const getServerSideProps = wrapper.getServerSideProps((store) => async (context) => {
  store.injectDynamicReducer('somePageSlice', somePageReducer);
});
But no luck here: the slice does not get loaded and the state does not get constructed. My guess is that the Provider in _app gets rendered before this runs, but I'm not sure.
In conclusion, I'd like to know whether anyone has tried and succeeded in implementing Redux code splitting with RTK, RTKQ and Next.js. I'd also like to ask an additional question: is it even necessary? That is, if I didn't code-split at all and sent all slices on every request, how big would the performance impact be? And since I'm not sure exactly how the Next.js bundler works and how code chunking is done: if a certain page receives a slice it doesn't use at all, does it only receive that slice's initial state, or all of its logic (all the selectors, reducers and actions)? If it's only the initial state, then maybe this isn't so bad, since initial states are just empty objects.
I hope I've presented the problem clearly enough, as it is a very complex problem, but feel free to ask follow up questions if something doesn't make sense.
Thanks in advance.
I've always struggled to get my head around Redux-Thunk, as I really don't understand what great purpose it serves. For example, here's a random Redux-Thunk example I found on a website:
export const addTodo = ({ title, userId }) => {
  return dispatch => {
    dispatch(addTodoStarted());
    axios
      .post(ENDPOINT, {
        title,
        userId,
        completed: false
      })
      .then(res => {
        setTimeout(() => {
          dispatch(addTodoSuccess(res.data));
        }, 2500);
      })
      .catch(err => {
        dispatch(addTodoFailure(err.message));
      });
  };
};
It's seemingly simple: addTodo is a function that takes in the title and userId and returns a function that receives dispatch as a parameter, which it uses once immediately and then again for the response of the HTTP request. Because Redux-Thunk is being used here, you would just call dispatch(addTodo(x, x));
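For instance, from a component using react-redux hooks, that call would look something like this; the component and import path are my own made-up illustration, not part of the original example:

import React from 'react';
import { useDispatch } from 'react-redux';
import { addTodo } from './todoActions'; // path assumed

const AddTodoButton = ({ title, userId }) => {
  const dispatch = useDispatch();
  // redux-thunk intercepts the returned function and calls it with dispatch
  return <button onClick={() => dispatch(addTodo({ title, userId }))}>Add todo</button>;
};

export default AddTodoButton;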
Why would I not just do something like this though?
function addTodo(dispatch, title, userId) {
  dispatch(addTodoStarted());
  axios
    .post(ENDPOINT, {
      title,
      userId,
      completed: false
    })
    .then(res => {
      setTimeout(() => {
        dispatch(addTodoSuccess(res.data));
      }, 2500);
    })
    .catch(err => {
      dispatch(addTodoFailure(err.message));
    });
}
Then from anywhere, I can just call addTodo(dispatch, x, x);
Why would I use the Redux-Thunk example over my own?
Here are a few points that try to explain why you should go with redux-thunk.
It's middleware, so it makes dispatch and the state object available in every action you define without touching your component code (see the sketch after this list).
When you pass the dispatch function down (either from props or from mapDispatchToProps in react-redux), you create a closure. That closure keeps memory in use until the async operation finishes.
Whenever you want to dispatch an action after, or during, an async operation, you have to pass the dispatch function along, and in that case you need to modify two files: your component and your actions.
If something is already available, well tested with a lot of effort, and has community support, why not use it?
Your code will be more readable and modular.
Worst case for either approach: say that after completing the project you need to move away from the thunk approach; you can easily swap the thunk middleware for your own custom middleware and be done with it, whereas in the case of passing the dispatch function around, it means refactoring all your code with search-and-replace and finding a way to manage it.
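To illustrate the first point, the sketch below shows roughly what thunk-style middleware does: if the dispatched "action" is a function, it is called with dispatch and getState instead of being forwarded to the reducers. This is a simplified illustration, not the exact redux-thunk source:

const thunkLike = ({ dispatch, getState }) => next => action => {
  if (typeof action === 'function') {
    // Hand the function access to dispatch and getState, skip the reducers
    return action(dispatch, getState);
  }
  // Plain object actions pass through unchanged
  return next(action);
};

// With this middleware installed, components only ever call
// dispatch(addTodo({ title, userId })) and never have to pass dispatch around themselves.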
I have two nodes that I need to return from a single HTTP call. I am trying to achieve this by using async/await to get the two nodes and then combining them using concat or forEach. But it seems that even though I am awaiting the responses, inside the function they are still promises and not the data itself. This is my basic example:
exports.searchVisit = functions.https.onRequest(async (req, res) => {
  const today = new Date(Date.now());
  let todayVisits = await admin.database().ref('/visits/').once('value');
  let frequentVisits = await admin.database().ref('/frequent_visits/').once('value');
  console.log(todayVisits); // Prints an object (I guess it is a promise)
  res.status(200).send(todayVisits); // Returns correctly the data on visits collection
});
How could I return todayVisits and frequentVisits combined? Thanks in advance.
In your code, todayVisits is a DataSnapshot type object. It is not a promise. Logging that DataSnapshot object is not likely to be useful. If you want to see the raw data inside that snapshot, call val() on it to get a JavaScript object with the entire set of data in that snapshot. This is also what you probably want to send to the client (not the entire contents of the DataSnapshot).
The following code, merging the two JavaScript objects obtained with val(), as explained by Doug, should do the trick:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(); // assuming it isn't already initialized elsewhere

exports.searchVisit = functions.https.onRequest(async (req, res) => {
  const today = new Date(Date.now());
  // Start both reads without awaiting them individually
  let todayVisits = admin.database().ref('/visits/').once('value');
  let frequentVisits = admin.database().ref('/frequent_visits/').once('value');
  // Wait for both DataSnapshots in parallel
  const [todayVisitsSnap, frequentVisitsSnap] = await Promise.all([
    todayVisits,
    frequentVisits
  ]);
  // val() turns each DataSnapshot into a plain object; merge the two and send
  res.status(200).send({ ...todayVisitsSnap.val(), ...frequentVisitsSnap.val() });
});
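If the two nodes hold list-like data and you prefer the concat-style result mentioned in the question (one flat array rather than a merged object), the final lines of the same handler could instead look like this; this assumes a particular data shape and is only a variation on the answer above:

// Flatten both snapshots' children into a single array
const combinedVisits = [
  ...Object.values(todayVisitsSnap.val() || {}),
  ...Object.values(frequentVisitsSnap.val() || {})
];
res.status(200).send(combinedVisits);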
After inserting 29447 entities of a single kind into Google Cloud Datastore, I wait about 30 seconds and then check how many entities there are for that particular kind. The surprising thing is that some of them appear to be missing (getCurrentKeys returns a bit fewer than 29447 entities). When I check after a longer period of time (~1 hour), all of the entities are there (getCurrentKeys returns the expected 29447 entities).
The code used to read the number of entities is the following:
const runQuery = (query) => {
  return new Promise((resolve, reject) => {
    datastore.runQuery(query)
      .then(results => {
        const entities = results[0];
        resolve(entities);
      })
      .catch(e => reject(e));
  });
};

const getCurrentKeys = () => {
  const query = datastore.createQuery(KIND)
    .select('__key__');
  return runQuery(query);
};

async function main() {
  const currentKeys = await getCurrentKeys();
  console.log(`currentKeys: ${currentKeys.length}`);
}

main();
Any ideas about what could be happening?
Thanks in advance
Non-ancestor queries are eventually consistent. It will take a while before all the rows show up.
This article should explain more:
https://cloud.google.com/datastore/docs/articles/balancing-strong-and-eventual-consistency-with-google-cloud-datastore/
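If you need strongly consistent reads right after the writes, one option (not from the answer above, just a sketch) is to write the entities under a common parent key and run an ancestor query; the 'Batch' kind and 'batch-1' name below are made-up placeholders, and the entities would have to be created with this parent in their key path:

// Ancestor queries in Datastore are strongly consistent
const parentKey = datastore.key(['Batch', 'batch-1']);

const getCurrentKeysStrong = () => {
  const query = datastore.createQuery(KIND)
    .hasAncestor(parentKey)
    .select('__key__');
  return runQuery(query);
};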
After a bit more research I think it might be related to the indexes. I believe the indexes aren't getting updated fast enough by the time I run the query. The entities have many properties, so there are many indexes involved.
Hi there, I have an onChange callback in one of my React components that dispatches an action several times through a map call, like this:
onChange: (items, newRatio) => {
  items.map(item => {
    dispatch(itemActions.updateStart({
      ...item,
      adjusted_ratio: _.round(item.adjusted_ratio + newRatio, 1),
    }))
  })
}
and I have a Saga for the "items" like so:
// Updating an Item
function* watchUpdate() {
  while (true) {
    const { record: unsavedItem } = yield take(itemTypes.ITEMS_UPDATE_START);
    const task = yield fork(updateItemDbCrud, unsavedItem);
  }
}

function* updateItemDbCrud(unsavedItem) {
  const savedItem = yield call(api.update, unsavedItem);
  const result = yield put(itemActions.updateSuccess(savedItem));
}

export default [watchUpdate]
In other words, I expected that whenever the ITEMS_UPDATE_START action is dispatched, the watcher forks a new updateItemDbCrud and proceeds to do the API work, but I notice that only the first of the sequence of dispatches goes through. Am I using fork wrong?
Thank you!
It's a known issue (https://github.com/yelouafi/redux-saga/issues/50), and it has to do with the nature of promises and their use in redux-saga's core.
It's been resolved in version 0.6.
If you want to learn more about what caused the issue, I recommend reading the above GitHub issue and also Jake Archibald's article on tasks, microtasks, queues and schedules:
https://jakearchibald.com/2015/tasks-microtasks-queues-and-schedules/
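As a side note, once you're on a version where this is fixed, the watcher in the question can also be written with redux-saga's takeEvery helper, which forks the worker for every matching action. This is only a sketch reusing the question's own action types and api module; in recent redux-saga versions takeEvery is imported from 'redux-saga/effects':

import { takeEvery, call, put } from 'redux-saga/effects';
// itemTypes, itemActions and api come from the question's own modules

function* updateItemDbCrud(action) {
  const savedItem = yield call(api.update, action.record);
  yield put(itemActions.updateSuccess(savedItem));
}

function* watchUpdate() {
  // Forks updateItemDbCrud for every ITEMS_UPDATE_START, so no dispatch is dropped
  yield takeEvery(itemTypes.ITEMS_UPDATE_START, updateItemDbCrud);
}

export default [watchUpdate];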