I guess my question could also be summed up as something like:
Is there an idiomatic ES6 way to have:
array.map(identity) === array ?
array.filter(i => true) === array ?
{obj..., attr: obj.attr} === obj ?
I know it has not been implemented like that in ES6, but is there some syntax I'm missing, or some simple helper functions, that would make these properties hold without resorting to an immutable lib?
I use Babel and new JS features, and I keep my JS objects immutable.
I would like to know how to make my reducers more efficient and generate fewer unnecessary object copies.
I'm not interested in a lib (Mori/ImmutableJS) solution.
I have a reducer that manages a paginated list.
The pages attribute is actually an Array[Array[item]]
Here is my reducer:
const initialState = {
  isLoading: false,
  pages: [],
  allStamplesLoaded: false
};

function reducer(state = initialState, event) {
  switch (event.name) {
    case Names.STAMPLE_DELETED:
      return {
        ...state,
        pages: removeStampleFromPages(state.pages, event.data.stampleId)
      };
    case Names.STAMPLE_UPDATED:
      return {
        ...state,
        pages: updateStampleInPages(state.pages, event.data.apiStample)
      };
    case Names.STAMPLE_PAGES_CLEANED:
      return {
        ...initialState,
      };
    case Names.STAMPLE_PAGE_REQUESTED:
      return {
        ...state,
        isLoading: true
      };
    case Names.STAMPLE_PAGE_LOADED:
      const { stamplePage, isLastPage } = event.data;
      return {
        ...state,
        isLoading: false,
        pages: [...state.pages, stamplePage],
        isLastPage: isLastPage
      };
    case Names.STAMPLE_PAGE_ERROR:
      return {
        ...state,
        isLoading: false
      };
    default:
      return state;
  }
}
I also have these helper functions:
function removeStampleFromPages(pages, deletedStampleId) {
  return pages.map(page => {
    return page.filter(apiStample => apiStample.id !== deletedStampleId)
  })
}

function updateStampleInPages(pages, newApiStample) {
  return pages.map(page => {
    return updateStampleInPage(page, newApiStample);
  })
}

function updateStampleInPage(page, newApiStample) {
  return page.map(apiStample => {
    if (apiStample.id === newApiStample.id) {
      return newApiStample;
    } else {
      return apiStample;
    }
  })
}
As you can see, every time an event such as STAMPLE_UPDATED is fired, my reducer always returns a new state with a new array of page arrays, even if none of the items in them was actually updated. This creates unnecessary object copies and GC work.
I don't want to optimize this prematurely, nor introduce an immutable library in my app, but I'd like to know: are there any idiomatic ES6 ways to solve this problem?
Immutable data structures such as Immutable.js and Mori use a clever trick to avoid recreating the whole structure all the time.
The strategy is fairly simple: when you update a property, drill down to it, change it, and recreate every object on the path from that node back up to the root.
Let's assume you want to change the property c to 4 in the following state:
const state1 = {
  a: {
    b: {
      c: 1
    },
    d: [2, 3, 4],
    e: 'Hello'
  }
}
The first step is to update c to 4. After that you need to create
a new object for b (because c changed)
a new object for a (because b changed)
and a new object for the state (because a changed).
Your new state will look like this (a * next to an object means the object has been recreated):
const state2 = *{
  a: *{
    b: *{
      c: 4
    },
    d: [2, 3, 4],
    e: 'Hello'
  }
}
Notice how d and e have not been touched.
You can now verify things are properly working:
state1 === state2 // false
state1.a === state2.a // false
state1.a.b === state2.a.b //false
state1.a.d === state2.a.d // true
state1.a.e === state2.a.e // true
You may notice that d and e are shared between state1 and state2.
You could use a similar strategy to share information in your state without recreating a whole new state all the time.
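For the pages array, one way to get this kind of sharing without a library is to write small helpers that fall back to the original array when nothing actually changed. This is only a sketch of the idea (mapShared and filterShared are names I made up, not a standard API):

// Like Array.prototype.map, but returns the original array when every
// mapped element is reference-equal to the element it replaces.
function mapShared(array, fn) {
  let changed = false;
  const result = array.map((item, index) => {
    const next = fn(item, index);
    if (next !== item) changed = true;
    return next;
  });
  return changed ? result : array;
}

// Like Array.prototype.filter, but returns the original array
// when no element was removed.
function filterShared(array, predicate) {
  const result = array.filter(predicate);
  return result.length === array.length ? array : result;
}

Applied to the helpers from the question, updateStampleInPages and removeStampleFromPages would then return pages itself whenever nothing inside it changed:

function updateStampleInPages(pages, newApiStample) {
  return mapShared(pages, page =>
    mapShared(page, apiStample =>
      apiStample.id === newApiStample.id ? newApiStample : apiStample
    )
  );
}

function removeStampleFromPages(pages, deletedStampleId) {
  return mapShared(pages, page =>
    filterShared(page, apiStample => apiStample.id !== deletedStampleId)
  );
}

In the reducer you can then go one step further and return the previous state untouched when nothing changed, e.g. const pages = updateStampleInPages(state.pages, event.data.apiStample); return pages === state.pages ? state : { ...state, pages };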
As for your initial question:
array.map(identity) !== array
array.filter(i => true) !== array
{obj..., attr: obj.attr} !== obj
the answer is very simple.
When an array or an object is created, the JavaScript VM internally assigns an identity to it, so no two separately created arrays/objects are ever the same value.
When you perform a strict equality check on arrays or objects, only that internal identity is compared.
a = [] // internal identifier 1
[] // internal identifier 2
b = [] // internal identifier 3
a === b // id 1 === id 3 is FALSE!
a === a // id 1 === id 1 is TRUE!
Related
I have an object in my pinia store like
import { defineStore } from "pinia";

export const useSearchStore = defineStore("store", {
  state: () => {
    return {
      myobj: {
        foo: 0,
        bar: 2000,
        too: 1000,
      },
    };
  },
  getters: {
    changed() {
      // doesn't work
      return Object.entries(this.myobj).filter(
        ([key, value]) => value != initialvalue
      );
    },
  },
});
How do I get the initial value to test whether the object changed? Or how can I return a filtered object with only those entries that differ from the initial state?
My current workaround:
In a created hook I make a hard copy of the store object that I can then compare against. I guess there is a more elegant way...
This is what I did (although I do not know if there is a better way to avoid cloning without duplicating your initial state).
Define your initial state outside the store and assign it to a variable as follows:
const initialState = {
  foo: 0,
  bar: 2000,
  too: 1000
}
Then you can use cloning to retain the original state:
export const useSearchStore = defineStore("store", {
  state: () => {
    return {
      myobj: structuredClone(initialState),
    };
  },
  getters: {
    changed: (state) => !deepEquals(initialState, state.myobj),
  },
});
where deepEquals is a function that deep-compares the two objects (which you would have to implement). I would use lodash for this (npm i lodash, plus npm i @types/lodash --save-dev if you're using TypeScript).
Full code (with lodash):
import { defineStore } from "pinia";
import { cloneDeep, isEqual } from "lodash";

const initialState = {
  foo: 0,
  bar: 2000,
  too: 1000
}

export const useSearchStore = defineStore("store", {
  state: () => ({
    myobj: cloneDeep(initialState)
  }),
  getters: {
    changed(state) {
      return !isEqual(initialState, state.myobj);
    },
  },
});
If you also want the differences between the two objects, you can use the following function (the _ is lodash: import _ from "lodash"):
function difference(object, base) {
  function changes(object, base) {
    return _.transform(object, function (result: object, value, key) {
      if (!_.isEqual(value, base[key])) {
        result[key] =
          _.isObject(value) && _.isObject(base[key])
            ? changes(value, base[key])
            : value;
      }
    });
  }
  return changes(object, base);
}
courtesy of https://gist.github.com/Yimiprod/7ee176597fef230d1451
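As a sketch of how that could answer the second part of the question, the difference helper can back a getter directly (changedEntries and hasChanged are names I made up, and this assumes the initialState constant and imports from the full code above):

export const useSearchStore = defineStore("store", {
  state: () => ({
    myobj: cloneDeep(initialState),
  }),
  getters: {
    // an object containing only the entries that differ from the initial values
    changedEntries: (state) => difference(state.myobj, initialState),
    // true as soon as anything differs
    hasChanged: (state) => !isEqual(initialState, state.myobj),
  },
});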
EDIT:
The other way to do this is to use a watcher (or a subscription) to flag changes. The disadvantage is that you either have to be OK with the state staying marked as "changed" even if you change the data back to its initial values, or you have to implement a system (perhaps using a stack data structure) to maintain a list of changes, so that two changes which cancel each other out mark the state as "unchanged" again. Either way you would keep another boolean in the state that records whether it has been changed, but this is more complicated to implement and (depending on your use case) may not be worth it.
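A rough sketch of the simpler variant, using Pinia's $subscribe and a plain changed flag (the flag name is mine, it reuses initialState and the imports from above, and, as noted, it stays true even if you revert to the initial values):

export const useSearchStore = defineStore("store", {
  state: () => ({
    myobj: cloneDeep(initialState),
    changed: false,
  }),
});

// wherever the store is first used, e.g. in a component's setup()
const store = useSearchStore();
store.$subscribe(() => {
  // any mutation marks the state as changed; the guard avoids
  // re-writing the flag (and re-triggering the subscription) once it is set
  if (!store.changed) store.changed = true;
});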
For some reason, when watching for Pinia state changes, the newVal and oldVal in the watch function are always identical (except for the very first time).
This is what my code looks like:
const { searchFilters } = storeToRefs(searchFiltersStore)
watch(searchFilters.value.categories, (newVal, oldVal) => {
if (
newVal.length !== oldVal.length ||
newVal.every((value, index) => value !== oldVal[index])
) {
reset(undefined, true)
}
}, { deep: true })
searchFilters.value.categories is of type string[] (an array of strings) and is initially an empty array.
When the first change is triggered, oldVal is an empty array and newVal is whatever it is supposed to be.
But after that, every new trigger has oldVal identical to newVal, so my if statement is never true.
What am I missing?
I found that in order to watch an array, you have to pass a copy of the array and not the existing array like so:
watch(() => [...searchFilters.value.categories], (newVal, oldVal) => {
...
});
Also, the deep parameter in my case was not necessary.
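For completeness, a sketch of how the whole working watcher could look (reset and searchFilters are the ones from the question; note that .some(...), rather than .every(...), is what detects a single differing entry when the lengths are equal):

watch(
  () => [...searchFilters.value.categories], // copy, so oldVal is a distinct snapshot
  (newVal, oldVal) => {
    const changed =
      newVal.length !== oldVal.length ||
      newVal.some((value, index) => value !== oldVal[index]);
    if (changed) {
      reset(undefined, true);
    }
  }
);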
If I am interacting with an API that returns null for some objects that may or may not have value, how can I reconcile that with the reducers?
example: app state
{
  foo: {
    foo1: null,
    foo2: {
      bar: null,
      bar2: null
    }
  }
}
but the server, when things are null, returns this:
{
  foo: null
}
but it can return the full state when things have value:
{
  foo: {
    foo1: "somefoo",
    foo2: {
      bar: "barvalue",
      bar2: 27
    }
  }
}
The problem I am having is that my reducers are trying to load the state from the server's return value, and then my components are trying to read from a null object and failing.
EDIT: the reducer and the component would look like this. The component is trying to read some nested JSON, which may come back unreadable because the parent object is null. In this case I know I could hack up a solution that checks whether the object is null and inserts my predefined initial state (see the sketch after the component below)...
BUT... my actual example is a bigger JSON object than this, and I know it will change in the future, so I need a solution that is not as fragile and cumbersome as adding a ton of logic to check that every object down the nesting is not null.
var updateSettings = (settings = jsonShape, action) => {
  switch (action.type) {
    case UPDATE_SETTINGS:
      return Object.assign({}, settings, {
        foo2: {
          ...settings.foo2,
          bar: action.newBar
        }
      });
  }
}

const Component = ({ settings }) => {
  return (
    <div>{ settings.foo2.bar }</div>
  )
}
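For illustration, here is a minimal sketch of that null-check hack (initialFoo and normalizeFoo are hypothetical names; this is exactly the fragile approach I don't want to scale up):

// hypothetical default shape, used whenever the server nulls out a branch
const initialFoo = {
  foo1: null,
  foo2: { bar: null, bar2: null }
};

function normalizeFoo(serverFoo) {
  if (serverFoo == null) return initialFoo;
  return {
    ...initialFoo,
    ...serverFoo,
    foo2: serverFoo.foo2 == null ? initialFoo.foo2 : serverFoo.foo2
  };
}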
I'm new to Flow and trying to cover some of my functions, but I often have snippets where I extract fields from an object based on some condition, and I'm struggling to cover them with Flow.
const _join = function (that: Array<Object>, by: string, index: number) {
  that.forEach((thatOBJ: {[string]: any}, i: number) => {
    let obj: {[string]: any} = {};
    for (let field: string in thatOBJ) {
      if (field !== by) {
        obj[`${index.toString()}_${field}`] = thatOBJ[field]; // NOT COVERED
      } else {
        obj[field] = thatOBJ[field]; // NOT COVERED
      }
      that[i] = obj;
    }
  });
}
The that array in this code holds data, so it can really be any shape of MongoDB data.
Any ideas on what to add so that the two uncovered lines are covered by Flow?
Thanks.
A few notes...
This function has a "side effect" since you're mutating that rather than using a transformation and returning a new object.
Array<Object> is an Array of any, bounded by {}. There are no other guarantees.
If you care about modeling this functionality and statically typing them, you need to use unions (or |) to enumerate all the value possibilities.
It's not currently possible to model computed map keys in flow.
This is how I'd re-write your join function:
// @flow
function createIndexObject<T>(obj: { [string]: T }, by: string, index: number): { [string]: T } {
  return Object.keys(obj).reduce((newObj, key) => {
    if (key !== by) {
      newObj[`${index}_${key}`] = obj[key]
    } else {
      newObj[key] = obj[key]
    }
    return newObj
  }, {})
}
// NO ERROR
const test1: { [string]: string | number } = createIndexObject({ foo: '', bar: 3 }, 'foo', 1)
// ERROR
const test2: { [string]: string | boolean } = createIndexObject({ foo: '', bar: 3 }, 'foo', 1)
Let's say I have this state:
state: {
  field1: value1,
  field2: {a: 5, b: 7}
}
If a reducer wants to update only field1, can the reducer return a new object containing a new field1 and the existing object state.field2 as field2 property of the new returned state? Or does the reducer have to clone field2?
Use the spread operator:
return {
  ...state,
  field1: newVal
}
Here is the link detailing immutable update patterns:
http://redux.js.org/docs/recipes/reducers/ImmutableUpdatePatterns.html
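A quick way to check that the untouched branch really is shared rather than cloned:

const next = { ...state, field1: newVal };
console.log(next.field2 === state.field2); // true: field2 keeps its original reference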
Yes. Not only is recycling state permissible, it's recommended.
Say your initial state is:
{
  object1: { /* key-pair values */ },
  object2: { /* key-pair values */ },
}
If you update your state like this:
// bad
return {
  object1: action.object1,
  object2: Object.assign({}, state.object2),
}
Then your app thinks object2 has changed even when it hasn't. This may cause unnecessary calculations and re-renders in your React components.
It's much better to only update the parts of your state that have actually changed.
// good
return Object.assign({}, state, {
  object1: action.object1,
});
Object.assign() is what you are looking for, and nowadays you can also use the spread operator (...). Be aware that Object.assign is an ES6 feature (and object spread arrived even later), so something like Internet Explorer (even 11) will crash on both and you will need a polyfill. Check this link for more info and a polyfill for older browsers:
https://developer.mozilla.org/fr/docs/Web/JavaScript/Reference/Objets_globaux/Object/assign
var o1 = { a: 1 };
var o2 = { b: 2 };
var o3 = { b: 3 , c:2 };
var obj = Object.assign(o1, o2, o3);
console.log(obj); // { a: 1, b: 3 , c: 2 }
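Note that in this snippet Object.assign(o1, o2, o3) also mutates o1 (obj and o1 end up being the same object); to keep the sources untouched, as you would in a reducer, pass an empty target: Object.assign({}, o1, o2, o3).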
hope it helps