I have this helper function in my reducer, which has the following state:
type CustomerCollection = { [number]: Customer };
type CustomerState = {
  +customers: ?CustomerCollection,
  +newItem: ?(Customer | Review),
  +searchResults: ?(Customer[]),
  +error: ?string,
  +isLoading: boolean
};

function customerWithReview(review: Review): Customer {
  const id: number = review.customerId;
  const oldCustomer: Customer = state.customers[id];
  const newReviews: Review[] = [review, ...oldCustomer.reviews];
  return Object.assign(oldCustomer, { reviews: newReviews });
}
I get a Flow error on the id of const oldCustomer: Customer = state.customers[id]; saying Cannot get state.customers[id] because an index signature declaring the expected key/value type is missing in null or undefined.
This is happening because of the nullable/optional ?CustomerCollection type of state.customers.
I can silence the error by making sure customers isn't null:
if (state.customers) {
  const oldCustomer: Customer = state.customers[id];
  const newReviews: Review[] = [review, ...oldCustomer.reviews];
  return Object.assign(oldCustomer, { reviews: newReviews });
}
But then the problem just goes up the chain because I don't have anything to return from the function.
I can certainly expand it to:
function customerWithReview(review: Review): Customer {
  if (!state.customers) {
    return new Customer();
  } else {
    const id: number = review.customerId;
    const oldCustomer: Customer = state.customers[id];
    const newReviews: Review[] = [review, ...oldCustomer.reviews];
    return Object.assign(oldCustomer, { reviews: newReviews });
  }
}
But in actual practice, the action that gets us to this branch of the reducer will never be called if state.customers is null, so we'd never actually return new Customer(), and we'd have no use for it if we did. state.customers is nullable in order to tell the difference between "we haven't fetched the customers yet" (state.customers == null) and "we've fetched the customers but there are none" (state.customers == {}).
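To make that distinction concrete, a minimal sketch (just restating the two cases above):
const notYetFetched: ?CustomerCollection = null; // customers haven't been fetched yet
const fetchedButEmpty: ?CustomerCollection = {}; // fetched, but there are no customers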
It would be a lot easier if I could just assert that state.customers would always exist in these cases, which in Swift I would do with force-unwrapping:
const oldCustomer: Customer = state.customers![id];
Can I do anything like this with Flow?
Or, given that only my GET_CUSTOMERS_FAILURE action would ever deal with state.customers == null, is there some other way to restructure my reducer so that this is a little easier? An entirely separate fetchReducer that has a nullable customer collection, while the rest of the actions fall under a different reducer?
You can use an invariant function (check that it works here):
type Customer = { id: number, reviews: Array<Review> };
type Review = { customerId: number };

type CustomerCollection = { [number]: Customer };
type CustomerState = {
  +customers: ?CustomerCollection,
  +newItem: ?(Customer | Review),
  +searchResults: ?(Customer[]),
  +error: ?string,
  +isLoading: boolean
};

declare var state: CustomerState;
declare function invariant(condition: mixed, message?: string): void;

function customerWithReview(review: Review): Customer {
  const id: number = review.customerId;
  invariant(state.customers, 'No customers and I don\'t know why');
  const oldCustomer: Customer = state.customers[id];
  const newReviews: Review[] = [review, ...oldCustomer.reviews];
  return Object.assign(oldCustomer, { reviews: newReviews });
}
You can implement it somewhere in your project and import it when necessary, for example like this:
export function invariant<T>(value: ?T, falsyErrorMessage: string, errorParams?: Object): void {
  if (!value) {
    // log and INVARIANT_ERROR_MESSAGE come from elsewhere in your project
    log.error(falsyErrorMessage, errorParams || {});
    throw new Error(INVARIANT_ERROR_MESSAGE);
  }
}
Unfortunately, the name of the function is hard-coded in Flow.
An alternative is just to add an if and throw an error in your customerWithReview function directly.
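For example, a minimal sketch of that throw-based approach (the error message is just a placeholder; same state shape as above):
function customerWithReview(review: Review): Customer {
  const id: number = review.customerId;
  if (!state.customers) {
    // Flow narrows state.customers after this check, so the lookup below type-checks.
    throw new Error('customerWithReview called before customers were fetched');
  }
  const oldCustomer: Customer = state.customers[id];
  const newReviews: Review[] = [review, ...oldCustomer.reviews];
  return Object.assign(oldCustomer, { reviews: newReviews });
}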
I created an item in DynamoDB using Node.js. The item has multiple attributes such as brand, category, discount, validity, etc. I am using uuid to generate ids for each item. Now let's say I want to update the validity attribute of the item; currently I am sending the entire JSON object with the value of validity modified to the new value.
This is definitely not optimal; please help me find an optimal solution.
const params = {
  TableName: process.env.PRODUCT_TABLE,
  Key: {
    id: event.pathParameters.id,
  },
  ExpressionAttributeNames: {
    '#discount': 'discount',
  },
  ExpressionAttributeValues: {
    ':brand': data.brand,
    ':category': data.category,
    ':discount': data.discount,
    ':denominations': data.denominations,
    ':validity': data.validity,
    ':redemption': data.redemption
  },
  UpdateExpression: 'SET #discount = :discount, denominations = :denominations, brand = :brand, category = :category, validity = :validity, redemption = :redemption',
  ReturnValues: 'ALL_NEW',
};
I want to send just the attribute I want to update along with its new value. If I want to change the validity from 6 months to 8 months, I should be able to send just something like:
{
  "validity": "8 months"
}
And it should update the validity attribute of the item.
The same should apply to any other attribute of the item.
'use strict';

const AWS = require('aws-sdk');

const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports.update = (event, context, callback) => {
  const data = JSON.parse(event.body);
  const attr = {};
  const nameobj = {};
  const arr = Object.keys(data);

  // build the placeholder maps from whatever attributes were sent
  arr.forEach((key) => {
    attr[`:${key}`] = data[key]; // value placeholder -> new value
    nameobj[`#${key}`] = key;    // name placeholder -> attribute name
  });

  // e.g. "SET #validity = :validity" for { "validity": "8 months" }
  const exp = 'SET ' + arr.map((key) => `#${key} = :${key}`).join(', ');

  const params = {
    TableName: process.env.PRODUCT_TABLE,
    Key: {
      id: event.pathParameters.id,
    },
    ExpressionAttributeNames: nameobj,
    ExpressionAttributeValues: attr,
    UpdateExpression: exp,
    ReturnValues: 'ALL_NEW',
  };

  // update the item in the database
  dynamoDb.update(params, (error, result) => {
    // handle potential errors
    if (error) {
      console.error(error);
      callback(null, {
        statusCode: error.statusCode || 501,
        headers: { 'Content-Type': 'text/plain' },
        body: 'Couldn\'t update the card',
      });
      return;
    }

    // create a response
    const response = {
      statusCode: 200,
      body: JSON.stringify(result.Attributes),
    };
    callback(null, response);
  });
};
Contrary to other comments, this is very possible; use the UpdateItem action.
Language agnostic API docs
JavaScript specific API docs
If you want to dynamically create the query, try something like this:
const generateUpdateQuery = (fields) => {
  let exp = {
    UpdateExpression: 'set',
    ExpressionAttributeNames: {},
    ExpressionAttributeValues: {}
  }
  Object.entries(fields).forEach(([key, item]) => {
    exp.UpdateExpression += ` #${key} = :${key},`;
    exp.ExpressionAttributeNames[`#${key}`] = key;
    exp.ExpressionAttributeValues[`:${key}`] = item
  })
  exp.UpdateExpression = exp.UpdateExpression.slice(0, -1);
  return exp
}
let data = {
  'field': { 'subfield': 123 },
  'other': '456'
}

let expression = generateUpdateQuery(data)

let params = {
  // Key, Table, etc..
  ...expression
}

console.log(params)
Output:
{
  UpdateExpression: 'set #field = :field, #other = :other',
  ExpressionAttributeNames: {
    '#field': 'field',
    '#other': 'other'
  },
  ExpressionAttributeValues: {
    ':field': {
      'subfield': 123
    },
    ':other': '456'
  }
}
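To actually run the update, you would merge that into the full params and call the DocumentClient, roughly like this (the TableName and Key values here are just placeholders):
const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient();

const fullParams = {
  TableName: process.env.PRODUCT_TABLE, // your table name
  Key: { id: 'some-item-id' },          // your item's key
  ReturnValues: 'ALL_NEW',
  ...generateUpdateQuery(data),
};

dynamoDb.update(fullParams)
  .promise()
  .then((result) => console.log(result.Attributes))
  .catch((err) => console.error(err));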
Using the JavaScript SDK v3:
Import from the right package:
import { DynamoDBClient, UpdateItemCommandInput, UpdateItemCommand } from '@aws-sdk/client-dynamodb';
Function to dynamically do partial updates to the item:
(the code below is TypeScript; it can easily be converted to JavaScript, just remove the types!)
async function updateItem(id: string, item: any) {
  const dbClient = new DynamoDBClient({ region: 'your-region-here' });

  let exp = 'set ';
  let attNames: any = {};
  let attVal: any = {};

  for (const attribute in item) {
    const valKey = `:${attribute}`;
    attNames[`#${attribute}`] = attribute;
    exp += `#${attribute} = ${valKey}, `;
    const val = item[attribute];
    // getDynamoType is a helper (not shown) that returns the DynamoDB type
    // descriptor for the value, e.g. 'S' for strings or 'N' for numbers.
    attVal[valKey] = { [getDynamoType(val)]: val };
  }
  exp = exp.substring(0, exp.length - 2); // drop the trailing ", "

  const params: UpdateItemCommandInput = {
    TableName: 'your-table-name-here',
    Key: { id: { S: id } },
    UpdateExpression: exp,
    ExpressionAttributeValues: attVal,
    ExpressionAttributeNames: attNames,
    ReturnValues: 'ALL_NEW',
  };

  try {
    console.debug('writing to db: ', params);
    const command = new UpdateItemCommand(params);
    const res = await dbClient.send(command);
    console.debug('db res: ', res);
    return true;
  } catch (err) {
    console.error('error writing to dynamoDB: ', err);
    return false;
  }
}
And to use it (we can do partial updates as well):
updateItem('some-unique-id', { name: 'some-attributes' });
What I did is create a helper class.
Here is a simple function: pass in all the attributes and values that go into the update; if a value is null or undefined, it won't be included in the expression.
I recommend creating a helper class with TypeScript and adding more functions and other pieces, like generators for ExpressionAttributeValues, ExpressionAttributeNames, and so on. Hope this helps.
function updateExpression(attributes, values) {
  const expression = attributes.reduce((res, attribute, index) => {
    // only include attributes whose value is not null/undefined
    if (values[index] !== null && values[index] !== undefined) {
      res += ` #${attribute}=:${attribute},`;
    }
    return res;
  }, "SET ");
  return expression.slice(0, expression.length - 1);
}

console.log(
  updateExpression(["id", "age", "power"], ["e8a8da9a-fab0-55ba-bae3-6392e1ebf624", 28, undefined])
);
You can use code to generate the params object based on the object you provide. It's just a JavaScript object: walk through its keys so that the update expression only contains the fields you actually provided.
This is not really a DynamoDB question so much as a general JS coding question.
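A minimal sketch of that idea (the buildUpdateParams name and the table/key values are hypothetical placeholders):
// Hypothetical helper: builds DocumentClient update params from only the fields provided.
const buildUpdateParams = (tableName, id, fields) => {
  const names = {};
  const values = {};
  const sets = [];
  Object.keys(fields).forEach((key) => {
    names[`#${key}`] = key;
    values[`:${key}`] = fields[key];
    sets.push(`#${key} = :${key}`);
  });
  return {
    TableName: tableName,
    Key: { id },
    UpdateExpression: `SET ${sets.join(', ')}`,
    ExpressionAttributeNames: names,
    ExpressionAttributeValues: values,
    ReturnValues: 'ALL_NEW',
  };
};

// e.g. buildUpdateParams(process.env.PRODUCT_TABLE, 'some-id', { validity: '8 months' })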
You can use UpdateItem; to familiarize yourself with DynamoDB queries I would suggest NoSQL Workbench for DynamoDB:
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/workbench.settingup.html
It can generate snippets for you based on your queries.
(screenshot: an example query in NoSQL Workbench for DynamoDB)
How can I write a generic function which takes an array of objects (of any type, possibly even including null and undefined) and filters it to return just the valid items of the array? If I write it like this, I lose the genericity :/
// @flow

// Types
type Person = {
  id: string,
  name: string,
};

type Car = {
  id: string,
  color: string,
};

// Function definition
const isNotUndefinedOrNull = item => !(item === null || item === undefined);

export const trimList = (list: Array<any> | $ReadOnlyArray<any>): Array<any> => {
  return list.filter(isNotUndefinedOrNull);
};

// Constants
const persons = [{ id: 'p1', name: 'Johny' }, null, undefined];
const cars = [{ id: 'c1', color: 'red' }, null, undefined];

// Calls
const trimmedPersons = trimList(persons);
const trimmedCars = trimList(cars);
The PROBLEM is that I now have trimmed cars and persons, but Flow doesn't know there are Cars in the trimmedCars list, nor that there are Persons in the trimmedPersons list. Flow just sees Array<any>, and I don't know how to write this so as not to lose that info.
Try Flow
As Flow has a bug with "Refine array types using filter", we use an explicit type cast: ((res: any): T[]).
function filterNullable<T>(items: (?T)[]): T[] {
  const res = items.filter(item => !(item === null || item === undefined));
  return ((res: any): T[]);
}
// Example
const a: number[] = filterNullable([1, 2, null, undefined]);
I found it :)
// R here is assumed to be Ramda (R.filter); isNotUndefinedOrNull is the predicate from the question
export function trimList<V>(list: Array<?V> | $ReadOnlyArray<?V>): Array<V> {
  return R.filter(isNotUndefinedOrNull, list);
}
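With that signature, the calls from the question should keep their element types, for example:
// Flow should now accept these element-typed annotations.
const trimmedPersons: Array<Person> = trimList(persons);
const trimmedCars: Array<Car> = trimList(cars);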
Can someone explain why I get the Flow error
object type (This type is incompatible with object type Indexable signature is incompatible:)
for the assignment in the last line of
const plain: { [key: string]: string } = { prop: '' };
type TestType = { [key: string]: string | number };
const testVar: TestType = plain;
I have no error if I remove the type specification for plain...
Many thanks !
Vladimir Kurchatkin answered this question in the issue I opened on GitHub (https://github.com/facebook/flow/issues/5458):
Objects are invariant by default. You can make it covariant like this:
const plain: { [key: string]: string } = { prop: '' };
type TestType = { +[key: string]: string | number };
const testVar: TestType = plain;
The Flow documentation states about typeof that "This type test is particularly useful in conjunction with union types." The following, however, does not pass Flow's scythe:
var EventEmitter = require('events').EventEmitter;

var fnify = function(key: string | (x: number, y: any) => string) {
  var fnkey = typeof(key) === 'function' ? key : (t) => key;
  new EventEmitter().emit(fnkey(0), 0);
}
Flow complains that it does not know the return value of fnkey, which is odd, as it is guaranteed to be a string from the signature of the function. What does go through is:
var EventEmitter = require('events').EventEmitter;

var fnify = function(key: string | (x: number, y: any) => string) {
  var fnkey = typeof(key) === 'function'
    ? key
    : (t) => typeof(key) === 'string' ? key : null;
  var kludge = fnkey(0);
  if (kludge) {
    new EventEmitter().emit(kludge, 0);
  }
}
But the latter seems unnecessarily verbose. Is this a feature? Bug? Is there something wrong in the first snippet that makes flow irate?
The problem is that key can change in the function body. Either use a const binding:
var EventEmitter = require('events').EventEmitter;

var fnify = function(key: string | (x: number, y: any) => string) {
  const k = key;
  var fnkey = typeof(k) === 'function' ? k : (t) => k;
  new EventEmitter().emit(fnkey(0), 0);
}
or set experimental.const_params=true in the [options] section of your .flowconfig.
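That is, your .flowconfig would contain something like:
[options]
experimental.const_params=true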