RTK Query and Mutation sequentially - redux

I have a situation using RTK Query where I need to run a mutation and a query sequentially.
In my use case, on a route like /status/:id/:version, I need to create a job based on id and version and then monitor the progress of its creation (which takes around 30 seconds). I have a query and a mutation in this route:
const {id, version} = useParams()
const pollRef = useRef(1000)
const [createJob, {data: postData, error, isSuccess}] = useCreateJobMutation()
const {data, ...} = useGetJobIdQuery(postData[0].id, { pollingInterval: pollRef.current })
if (data.progress === 100) {
pollRef.current = 0 // stop polling GET route
use data ...
}
useEffect(()=> {
createJob(newJob) // newJob is created based on id, version
}, [])
I need to wait for postData to be valid (not undefined); the issue is how to pass the result of the mutation to the query without violating the rules of hooks. (I get "Error: Rendered more hooks than during the previous render.")
if (isSuccess) {
useGetJobIdQuery(...) // violate hook rules
}
useCreateJobMutation() and useGetJobIdQuery() each work fine standalone, but not together.

You can skip a query until its argument is available by passing skipToken:
import { skipToken } from '@reduxjs/toolkit/query/react'
useGetJobIdQuery(isSuccess ? postData[0].id : skipToken)
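Putting it together, a minimal sketch of the whole component under that approach (imports omitted; the polling setup and the 100%-progress check are carried over from the question, and newJob is assumed to be built from id and version):
const { id, version } = useParams()
const [pollingInterval, setPollingInterval] = useState(1000)
const [createJob, { data: postData, isSuccess }] = useCreateJobMutation()

// The query is skipped until the mutation has returned a job id
const { data } = useGetJobIdQuery(isSuccess ? postData[0].id : skipToken, {
  pollingInterval,
})

useEffect(() => {
  createJob(newJob) // newJob is built from id and version
}, [])

useEffect(() => {
  if (data?.progress === 100) {
    setPollingInterval(0) // stop polling the GET route once the job is done
  }
}, [data])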


Next.js 13 - Idiomatic access of route params on edge API functions

I'm trying to understand more about dynamic APIs with Next.
Specifically, I'm working on Edge functions with native web APIs.
The documentation for Next handlers suggests the idiomatic way to get the dynamic component from a route is to use req.query.
query doesn't exist on the Request object, however.
// pages/api/user/[id].ts
import { jsonResponse } from "src/utils/jsonResponse";
export const config = {
runtime: 'edge',
}
export default async function handler(
req: Request,
) {
const {id} = req.query // doesn't work because query isn't on the request
}
This makes sense to me since the id isn't actually a query parameter, but part of the route.
So now this seems to be a standard web-platform question, but in the context of Next, and I'm curious whether there's an idiomatic way to do this.
Right now, my solution is:
// pages/api/user/[id].ts
import { jsonResponse } from "src/utils/jsonResponse";
export const config = {
runtime: 'edge',
}
export default async function handler(
req: Request,
) {
const { searchParams, pathname, } = new URL(req.url)
const parts = pathname.split('/')
const id = parts.pop();
}

How to correctly return array in redux state, if the array did not have to be updated in the reducer?

I am using the aurelia-store state management library. This question is not specific to Aurelia Store, though; it is really about Redux best practices in general, since Aurelia Store works in much the same way.
I have an action that fetches unit updates from an API like so:
export const fetchNewUnits = async (state: State): Promise<State> => {
const fetchedUnits = await apiClient.getUnitsMarkers();
// no new updates so don't trigger change in units
// IS THIS ACCEPTABLE?
if (fetchedUnits.length === 0) {
return {
...state,
highwaterMark: new Date()
};
}
const units: UnitMarker[] = state.units.slice();
_.forEach(fetchedUnits, (newUnit) => {
// look for matching unit in store
const idx = _.findIndex(units, {
imei: newUnit.imei
});
// unit was found in store, do update
if (idx !== -1) {
// replace the unit in the store
const replacement = new UnitMarker({...newUnit});
units.splice(idx, 1, replacement);
}
});
// OR SHOULD I ALWAYS DEEP COPY THE ARRAY REFERENCE AND ITS OBJECTS
return {
...state,
highwaterMark: new Date(),
units: [...units]
};
};
If I do not have any unit changes (i.e. my store is up to date) can I simply return the state with the spread operator as shown in the first return statement? Is this fine since I did not modify the objects?
Or do I always have to do deep replacements such as:
return {
...state,
highwaterMark: new Date(),
units: [...state.units]
};
even if the objects in the array did not change?
The reason you're supposed to create a new object is that React components check props for reference changes in order to know when to re-render.
If you simply modify an object and pass it in as a prop again, React won't know that something changed and will fail to re-render.
So in your case, the question is: do you want to re-render or not? If you don't, returning the same object is fine, and a simple 'return state' will let React know that no re-renders are necessary.
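Applied to the question's reducer, the no-update branch can keep the existing units array reference; only highwaterMark needs a new value. A minimal sketch reusing the question's names and types (mergeFetchedUnits is a hypothetical helper standing in for the question's merge loop):
export const fetchNewUnits = async (state: State): Promise<State> => {
  const fetchedUnits = await apiClient.getUnitsMarkers();

  if (fetchedUnits.length === 0) {
    // Nothing changed: keep the existing units reference so subscribers to
    // state.units see no reference change and skip re-rendering
    return {
      ...state,
      highwaterMark: new Date(),
      units: state.units // same reference (already implied by ...state)
    };
  }

  // Something changed: new array, with new objects for the replaced units,
  // exactly as in the question's second branch
  return {
    ...state,
    highwaterMark: new Date(),
    units: mergeFetchedUnits(state.units, fetchedUnits) // hypothetical helper
  };
};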
See: Why is the requirement to always return new object with new internal references

Firebase Functions onUpdate circular problem

I have a situation where a function keeps triggering itself, and I'm having trouble finding a solution.
I have a collection with a flag that tells whether the data has changed. I also want to log the changes.
export async function landWrite(change, context) {
const newDocument = change.after.exists ? change.after.data() : null
const oldDocument = change.before.data()
const log = {
time: FieldValue.serverTimestamp(),
oldDocument: oldDocument,
newDocument: newDocument
}
const landid = change.after.id
const batch = db.batch()
const updated = newDocument && newDocument.updated === oldDocument.updated
if (!updated) {
const landRef = db.collection('land').doc(landid)
batch.update(landRef, {'updated': true })
}
const logRef = db.collection('land').doc(landid).collection('logs').doc()
batch.set(logRef, log)
return batch.commit()
.then(success => {
return true
})
.catch(error => {
return error
})
}
The problem is that this writes the log twice when the updated flag is false.
But I also cannot put the log write in an else branch, because the flag can already be true when a new document update is made, and a new log still has to be written.
Trigger:
import * as landFunctions from './lands/index'
export const landWrite = functions.firestore
.document('land/{land}')
.onWrite((change, context) => {
return landFunctions.landWrite(change, context)
})
If I understand correctly, the problem here is that the updated flag does not specify which event the update is in response to (as you can't really do this with a boolean). In other words - you may have multiple simultaneous "first-stage" writes to lands, and need a way to disambiguate them.
Here are a few possible options that I would try - from (IMHO) worst to best. To summarize the trade-offs: the first option is not very elegant to implement; the first and second options both result in your function being called twice; the third option means that your function is only called once, however you must maintain a separate parallel document/collection alongside lands.
Option 1
Save some sort of unique identifier in the updated field (e.g. a hash of the stringified JSON event - e.g. hash(JSON.stringify(oldDocument)), or a custom event ID [if you have one]).
Option 2
Try checking the updateMask property of the incoming event, and discard any write events that only affect that property.
Option 3
Store your update status in a different document path/collection (e.g. a landUpdates collection at the same level as your lands collection), and configure your Cloud Function to not trigger on that path. (If you need to, you can always create a second Cloud Function that does trigger on the landUpdates path and add either the same logic or different logic to it.)
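A rough sketch of what Option 3 might look like (the landUpdates collection name is just an example; because the trigger below is scoped to land/{land}, the status write no longer re-fires the function):
export async function landWrite(change, context) {
  const newDocument = change.after.exists ? change.after.data() : null
  const oldDocument = change.before.data()
  const landid = change.after.id

  const batch = db.batch()

  // The flag lives in a separate top-level collection, so this write
  // does not trigger the land/{land} function again
  const statusRef = db.collection('landUpdates').doc(landid)
  batch.set(statusRef, { updated: true }, { merge: true })

  const logRef = db.collection('land').doc(landid).collection('logs').doc()
  batch.set(logRef, {
    time: FieldValue.serverTimestamp(),
    oldDocument: oldDocument,
    newDocument: newDocument
  })

  return batch.commit()
}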
Hope this helps!
The main problem here is the inability to differentiate between changes made by this server function and changes made by a client. Whenever you are in this situation, you should try to differentiate between them explicitly. You could even consider adding an extra field like fromServer: true that accompanies the server's updates and helps the server ignore the resulting trigger. Having said that, I think I have identified the issue and provided a clear solution below.
This line is misleading:
const updated = newDocument && newDocument.updated === oldDocument.updated
It should be named:
const updateStatusDidNotChange = newDocument && newDocument.updated === oldDocument.updated
I understand that you want the updated flag to be managed by this function, not the client. Let me know if this is not the case.
Therefore, the updated field is only changed in this function. Since you want to log only changes made outside of this function, you want to log only when updated did not change.
Here's my attempt at fixing your code in this light:
export async function landWrite(change, context) {
const newDocument = change.after.exists ? change.after.data() : null
const oldDocument = change.before.data()
const updateStatusDidNotChange = newDocument && newDocument.updated === oldDocument.updated
if (!updateStatusDidNotChange) return true; //this was a change made by me, ignore
const landid = change.after.id
const batch = db.batch()
if (!oldDocument.updated) {
const landRef = db.collection('land').doc(landid)
batch.update(landRef, {'updated': true })
}
const log = {
time: FieldValue.serverTimestamp(),
oldDocument: oldDocument,
newDocument: newDocument
}
const logRef = db.collection('land').doc(landid).collection('logs').doc()
batch.set(logRef, log)
return batch.commit()
.then(success => {
return true
})
.catch(error => {
return error
})
}
Edit
I had the exact same problem and had to differentiate between changes made by the server and changes made by the client, ignoring the ones that came from the server. I hope you give my suggestion a try.

firebase firestore adding new document inside a transaction - transaction.add is not a function

I was assuming that it was possible to do something like:
transaction.add(collectionRef,{
uid: userId,
name: name,
fsTimestamp: firebase.firestore.Timestamp.now(),
});
But apparently it is not:
transaction.add is not a function
The above message is displayed inside the chrome console.
I see that we can use the set method of the transaction to add a new document transactionally; see: https://firebase.google.com/docs/firestore/manage-data/transactions
The thing is, if I use set instead of add (which is not supported anyway), the document id has to be created by me manually; Firestore won't create it.
see: https://firebase.google.com/docs/firestore/manage-data/add-data
Do you see any downside to not having an add method that generates the id for you automatically?
For example, is it possible that the id generated by the firestore itself is somehow optimized considering various concerns including performance?
Which library/method do you use to create your document IDs in react-native while using transaction.set?
Thanks
If you want to generate a unique ID for later use in creating a document in a transaction, all you have to do is use CollectionReference.doc() with no parameters to generate a DocumentReference which you can set() later in a transaction.
(What you're proposing in your answer is way more work for the same effect.)
// Create a reference to a document that doesn't exist yet, it has a random id
const newDocRef = db.collection('coll').doc();
// Then, later in a transaction:
transaction.set(newDocRef, { ... });
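For context, a minimal sketch of how this fits into a transaction with the namespaced (v8-style) API, assuming db is an initialized Firestore instance and userId/name come from the question:
// The reference gets its auto-generated id immediately; nothing is written yet
const newDocRef = db.collection('coll').doc();

await db.runTransaction(async (transaction) => {
  // Any transaction reads must happen before writes
  transaction.set(newDocRef, { uid: userId, name: name });
});

console.log('Created document', newDocRef.id);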
After some more digging, I found the class/method below for id generation in the source code of Firestore itself:
export class AutoId {
static newId(): string {
// Alphanumeric characters
const chars =
'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
let autoId = '';
for (let i = 0; i < 20; i++) {
autoId += chars.charAt(Math.floor(Math.random() * chars.length));
}
assert(autoId.length === 20, 'Invalid auto ID: ' + autoId);
return autoId;
}
}
see: https://github.com/firebase/firebase-js-sdk/blob/73a586c92afe3f39a844b2be86086fddb6877bb7/packages/firestore/src/util/misc.ts#L36
I extracted the method (except the assert statement) and put it inside a method in my code. Then I used the set method of the transaction as below:
generateFirestoreId(){
const chars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';
let autoId = '';
for (let i = 0; i < 20; i++) {
autoId += chars.charAt(Math.floor(Math.random() * chars.length));
}
//assert(autoId.length === 20, 'Invalid auto ID: ' + autoId);
return autoId;
}
then,
newDocRef = db.collection("PARENTCOLL").doc(PARENTDOCID).collection('SUBCOLL').doc(this.generateFirestoreId());
transaction.set(newDocRef,{
uid: userId,
name: name,
fsTimestamp: firebase.firestore.Timestamp.now(),
});
Since I am using the same id-generation algorithm as Firestore itself, I feel better about it.
Hope this helps/guides someone.
Cheers.
Based on the answer from Doug Stevenson, this is how I got it to work with @angular/fire:
// Create a reference to a document and provide it a random id, e.g. by using uuidv4
const newDocRef = this.db.collection('coll').doc(uuidv4()).ref;
// In the transaction:
transaction.set(newDocRef, { ... });
To complete Stefan's answer: for those using AngularFire prior to version 5.2, calling CollectionReference.doc() with no arguments results in the error "CollectionReference.doc() requires its first argument to be of type non-empty string".
This workaround worked for me:
const id = this.afs.createId();
const ref = this.afs.collection(this.collectionRef).doc(id);
transaction.set(ref, { ... });
Credit: https://github.com/angular/angularfire/issues/1974#issuecomment-448449448
I'd like to add an answer addressing the id problem. There's no need to generate your own ids: the DocumentReference already carries an auto-generated id as soon as you create it with doc(), so you can access the Firestore-generated id like this:
const docRef = collectionRef.doc(); // the auto id is generated here
transaction.set(docRef, input);
const id = docRef.id; // available immediately, even before the commit
First of all, the Firestore transaction object has four methods (get, set, update, delete) and does not have an "add" method. However, the "set" method can be used instead.
import { collection, doc, runTransaction, Timestamp } from "firebase/firestore";
A DocumentReference must be created for the "set" method.
Steps:
1) The collection method creates a CollectionReference object.
const collectionRef = collection(FirebaseDb,"[colpath]");
2) The doc method creates a DocumentReference object with a unique random id for the specified CollectionReference.
const documentRef = doc(collectionRef);
3) The add operation can then be performed with the transaction's set method:
try {
await runTransaction(FirebaseDb,async (transaction) => {
await transaction.set(documentRef, {
uid: userId,
name: name,
fsTimestamp: Timestamp.now(),
});
})
} catch (e) {
console.error("Error : ", e);
}
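As in the earlier answers, the auto-generated id is available on the reference itself as soon as doc(collectionRef) returns, so you can read it before or after the transaction commits:
console.log(documentRef.id); // a 20-character auto id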

React-redux project - chained dependent async calls not working with redux-promise middleware?

I'm new to using Redux, and I'm trying to set up redux-promise as middleware. I have a case I can't seem to get to work (things work for me when I'm just doing one async call without chaining).
Say I have two API calls:
1) getItem(someId) -> {attr1: something, attr2: something, tagIds: [...]}
2) getTags() -> [{someTagObject1}, {someTagObject2}]
I need to call the first one, and get an item, then get all the tags, and then return an object that contains both the item and the tags relating to that item.
Right now, my action creator is like this:
export function fetchTagsForItem(id = null, params = new Map()) {
return {
type: FETCH_ITEM_INFO,
payload: getItem(...) // some axios call
.then(item => getTags() // gets all tags
.then(tags => toItemDetails(tags.data, item.data)))
}
}
I have a console.log in toItemDetails, and I can see that when the calls are completed, we eventually get into toItemDetails and end up with the right information. However, it looks like we're getting to the reducer before the calls are completed, and I'm just getting an undefined payload from the reducer (and it doesn't try again). The reducer is just returning action.payload for this case.
I know the chained calls aren't great, but I'd at least like to see it working. Is this something that can be done with just redux-promise? If not, any examples of how to get this functioning would be greatly appreciated!
I filled in your missing code with placeholder functions and it worked for me - my payload ended up containing a promise which resolved to the return value of toItemDetails. So maybe it's something in the code you haven't included here.
function getItem(id) {
return Promise.resolve({
attr1: 'hello',
data: 'data inside item',
tagIds: [1, 3, 5]
});
}
function getTags(tagIds) {
return Promise.resolve({ data: 'abc' });
}
function toItemDetails(tagData, itemData) {
return { itemDetails: { tagData, itemData } };
}
function fetchTagsForItem(id = null) {
let itemFromAxios;
return {
type: 'FETCH_ITEM_INFO',
payload: getItem(id)
.then(item => {
itemFromAxios = item;
return getTags(item.tagIds);
})
.then(tags => toItemDetails(tags.data, itemFromAxios.data))
};
}
const action = fetchTagsForItem(1);
action.payload.then(result => {
console.log(`result: ${JSON.stringify(result)}`);
});
Output:
result: {"itemDetails":{"tagData":"abc","itemData":"data inside item"}}
In order to access item in the second step, you'll need to store it in a variable that is declared in the function scope of fetchTagsForItem, because the two .thens are essentially siblings: both can access the enclosing scope, but the second call to .then won't have access to vars declared in the first one.
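For illustration only (not part of the original answer): the same action creator written with an async IIFE for the payload avoids the sibling-.then scoping issue entirely, since everything shares one function scope, and redux-promise only needs payload to be a promise.
function fetchTagsForItem(id = null) {
  return {
    type: 'FETCH_ITEM_INFO',
    // An async IIFE also produces a promise, and `item` stays in scope
    payload: (async () => {
      const item = await getItem(id);
      const tags = await getTags(item.tagIds);
      return toItemDetails(tags.data, item.data);
    })()
  };
}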
Separation of concerns
The code that creates the action you send to Redux is also making multiple Axios calls and massaging the returned data. This makes it more complicated to read and understand, and will make it harder to do things like handle errors in your Axios calls. I suggest splitting things up. One option:
Put any code that calls Axios in its own function
Set payload to the return value of that function.
Move that function, and all other funcs that call Axios, into a separate file (or set of files). That file becomes your API client.
This would look something like:
// api-client.js
const BASE_URL = 'https://yourapiserver.com/';
const makeUrl = (relativeUrl) => BASE_URL + relativeUrl;
function getItemById(id) {
return axios.get(makeUrl(GET_ITEM_URL) + id);
}
function fetchTagsForItemWithId(id) {
...
}
// Other client calls and helper funcs here
export default {
fetchTagsForItemWithId
};
Your actions file:
// items-actions.js
import ApiClient from './api-client';
function fetchItemTags(id) {
const itemInfoPromise = ApiClient.fetchTagsForItemWithId(id);
return {
type: 'FETCH_ITEM_INFO',
payload: itemInfoPromise
};
}
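For reference, with redux-promise the action only reaches the reducer once the payload promise settles, so a plain reducer receives the resolved value. A rough sketch (the FETCH_ITEM_INFO type is taken from the code above; the state shape is made up for illustration):
// items-reducer.js
const initialState = { itemDetails: null };

export default function itemsReducer(state = initialState, action) {
  switch (action.type) {
    case 'FETCH_ITEM_INFO':
      // redux-promise has already replaced the promise with its resolved
      // value here; action.error would be true if the promise had rejected
      return { ...state, itemDetails: action.payload };
    default:
      return state;
  }
}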
