DynamoDB documentClient.update or delete on a StringSet throws ValidationException - amazon-dynamodb

I can successfully update and delete an item from a StringSet in a DynamoDB table when calling from my test app running on localhost.
I then uploaded the app to LightSail, but now when I call the same function to update or delete an item it throws a ValidationException:
{
    "message": "Invalid UpdateExpression: Incorrect operand type for operator or function; operator: DELETE, operand type: MAP",
    "code": "ValidationException",
    "time": "2018-01-03T13:20:14.919Z",
    "requestId": "9HCQMH5RAUBRK1K7BNESNBUD5BVV4KQNSO5AEMVJF66Q9ASUAAJG",
    "statusCode": 400,
    "retryable": false,
    "retryDelay": 10.381373865940402
}
Why does this happen when I have not made any changes to my code, and how can I solve it?
Here's the relevant code:
var documentClient = getDocumentClient();

var paramsSET = {
    ExpressionAttributeNames: {
        "#StringSet": "Packages"
    },
    ExpressionAttributeValues: {
        ":value": documentClient.createSet(['filler as SET cannot be empty',
            app.packageName
        ])
    },
    Key: {
        "EmailAddress": app.emailAddress
    },
    ReturnValues: "ALL_NEW",
    TableName: "Developers",
    UpdateExpression: "ADD #StringSet :value"
    // UpdateExpression: "DELETE #StringSet :value" ------ to delete value
};

// adds packageName to the Packages SET in the Developers table - creates the set if it does not exist
documentClient.update(paramsSET, function (err, data) {});

I could not get this to work using the DocumentClient API.
I finally fell back to the old API and got it working with dynamodb.updateItem, see docs here.
I still have no idea why it works on localhost (accessing the same DynamoDB tables) and not when live on LightSail!
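For reference, a minimal sketch of what that low-level updateItem call can look like, using the table and attribute names from the question and assuming the AWS SDK for JavaScript v2; with the low-level API the string-set type is declared explicitly as "SS", so it cannot end up serialized as a MAP:

var AWS = require('aws-sdk');
var dynamodb = new AWS.DynamoDB();

var params = {
    TableName: "Developers",
    Key: {
        "EmailAddress": { S: app.emailAddress }
    },
    // use "ADD #StringSet :value" to add to the set instead
    UpdateExpression: "DELETE #StringSet :value",
    ExpressionAttributeNames: {
        "#StringSet": "Packages"
    },
    ExpressionAttributeValues: {
        ":value": { SS: [app.packageName] }
    },
    ReturnValues: "ALL_NEW"
};

dynamodb.updateItem(params, function (err, data) {
    if (err) console.error(err);
    else console.log(data);
});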

Related

How to delete a scheduled event using API?

Disclaimer: I am new to Hasura. I think I am missing some key understanding of how Hasura works.
Here are the steps I have taken so far:
Initialized a new Hasura project using a Heroku PostgreSQL database.
Using /v1/query and the following post body, I managed to create a scheduled event (I can see it in the Hasura Web Console):
{
    type: "create_scheduled_event",
    args: {
        webhook: "some API endpoint",
        schedule_at: "somedate",
        headers: [
            { name: "method", value: "POST" },
            { name: "Content-Type", value: "application/json" },
        ],
        payload: "somepayload",
        comment: "I SUPPLY A UNIQUE ID TO USE IN THE FOLLOWING DELETE QUERY",
        retry_conf: {
            num_retries: 3,
            timeout_seconds: 120,
            tolerance_seconds: 21675,
            retry_interval_seconds: 12,
        }
    }
}
Now, I am trying to delete this event using this post body:
{
    type: "delete",
    args: {
        table: {
            schema: "hdb_catalog",
            name: "hdb_scheduled_events",
        },
        where: {
            comment: {
                $eq: `HERE I PROVIDE THE UNIQUE ID I SET ON THE EVENT CREATION ABOVE`,
            }
        }
    }
}
and I get back this response:
data: {
    path: '$.args',
    error: 'table "hdb_catalog.hdb_scheduled_events" does not exist',
    code: 'not-exists'
}
As I understand it, hdb_catalog is the schema I should be working against, but it does not appear anywhere in my Heroku database. I actually managed to create a scheduled event even without any database connected to the project, so it seems that Hasura uses something else to store my scheduled events, but what? How do I access that database/table?
You should use the delete_scheduled_event API instead of trying to delete the row itself from hdb_catalog.
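As a rough sketch, assuming a Hasura version that exposes the delete_scheduled_event metadata API (on recent versions it is served from the /v1/metadata endpoint), the request body would look something like this, where event_id is the id returned when the event was created:

{
    "type": "delete_scheduled_event",
    "args": {
        "type": "one_off",
        "event_id": "<the event_id returned by create_scheduled_event>"
    }
}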

Is there a script to show the added fields in the response in Postman when using "additionalProperties": false?

I have added "additionalProperties": false to the schema scripts in Postman. The "additionalProperties": false works fine, and I get an error message when new fields are added to the JSON response that are not covered by the schema. But is there a script to show which fields were added in the JSON?
No, I don't think you can.
I have tried using tv4 directly instead of using it through pm, like below. The best I have obtained is the number of fields that exceed the schema, but not their names.
const schema = {
    "type": "object",
    "properties": {
        "code": { "type": "string" }
    },
    "additionalProperties": false
};

console.log(tv4.validate(body, schema));
console.log(tv4.validateResult(body, schema));
console.log(tv4.validateMultiple(body, schema));
pm.test("Explicitly ", function () {
let Ajv = require('ajv');
let ajv = Ajv()
let result = ajv.validate(schema, jsonData);
pm.expect(result, JSON.stringify(ajv.errors)).to.be.true;
});
This will show the additional property. There is an open issue for this already:
https://github.com/postmanlabs/postman-app-support/issues/9276
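If you want just the names of the unexpected fields, the Ajv error objects expose them under params.additionalProperty; a small sketch along the same lines as above (assuming the Ajv bundled in the Postman sandbox and the same schema):

pm.test("Report unexpected fields", function () {
    let Ajv = require('ajv');
    let ajv = new Ajv({ allErrors: true }); // collect every violation, not just the first
    let jsonData = pm.response.json();
    let valid = ajv.validate(schema, jsonData);

    // each "additionalProperties" violation names the offending field in params.additionalProperty
    let extraFields = (ajv.errors || [])
        .filter(e => e.keyword === 'additionalProperties')
        .map(e => e.params.additionalProperty);

    pm.expect(valid, 'Unexpected fields: ' + extraFields.join(', ')).to.be.true;
});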

Facing an issue while trying to run updateItem in DynamoDB

I am able to fetch the records from DynamoDB and view the response successfully. I need to modify the fetched 'ACCOUNTNAME' attribute in the 'Items' array, update the JSON, and also update it in DynamoDB. When I try to update the fetched records I end up with an 'Invalid attribute value type' exception.
I tried updating using a key with an array of strings (as in the code snippet below), and also tried updating inside a for loop using the individual strings, but both failed with the same exception:
"statusCode": 400,
"body": {
"message": "Invalid attribute value type",
"error": {
"errorMessage": "ValidationException"
}
}
I tried to create the params and make the update call inside the for loop, setting the key as below,
Key: {
    "UUID": {
        "S": usersOfAccountFromDB.body.Items[key].UUID
    },
    "TYPE": {
        "S": user
    }
}
but also failed with the same exception.
Fetched JSON from DynamoDB:
[
    {
        "DEFINITION": "914ba44a-8c26-4b60-af0f-96b6aa37efe6",
        "UUID": "830a49cb-4ed3-41ae-b111-56714a71ab98",
        "TYPE": "USER",
        "RELATION": "01efd131-6a5d-4068-889e-9dba44262da5",
        "ACCOUNTNAME": "Wolff LLC"
    },
    {
        "DEFINITION": "1f60fded-323d-40e1-a7f8-e2d053b0bed0",
        "UUID": "47db3bbe-53ac-4e58-a378-f42331141997",
        "TYPE": "USER",
        "RELATION": "01efd131-6a5d-4068-889e-9dba44262da5",
        "ACCOUNTNAME": "Wolff LLC"
    },
    {
        "DEFINITION": "05ddccba-2b6d-46bd-9db4-7b897ebe16ca",
        "UUID": "e7290457-db77-48fc-bd1a-7056bfce8fab",
        "TYPE": "USER",
        "RELATION": "01efd131-6a5d-4068-889e-9dba44262da5",
        "ACCOUNTNAME": "Wolff LLC"
    },
    ...
]
Then I tried to iterate over the JSON and set up UUID (which is the key) as a string array, as below:
var userUUIDArray: string[] = [];
for (let key in usersOfAccountFromDB.body.Items) {
    userUUIDArray.push(usersOfAccountFromDB.body.Items[key].UUID);
}

for (var uuid of userUUIDArray) {
    console.log("UUID : " + uuid); // prints all the UUIDs
}

// Creating the params for the DynamoDB update
var params = {
    TableName: <tableName>,
    Key: {
        "UUID": {
            "SS": userUUIDArray
        },
        "TYPE": {
            "S": user
        }
    },
    UpdateExpression: 'SET #ACCOUNTNAME = :val1',
    ExpressionAttributeNames: {
        '#ACCOUNTNAME': 'ACCOUNTNAME' // COLUMN NAME
    },
    ExpressionAttributeValues: {
        ':val1': newAccountName
    },
    ReturnValues: 'UPDATED_NEW',
};

// call the DynamoDB update
const result = await this.getDocClient().update(params).promise();
I get the error below:
"body": {
"message": "Invalid attribute value type",
"error": {
"errorMessage": "ValidationException"
}
}
All of these approaches failed with the same exception.
The update operation your code currently uses only allows a single item to be updated.
IIUC, you want to update multiple items with one API call. For this you need to use the batchWrite operation. Keep in mind that you cannot write more than 25 items per invocation.
The origin of the error you are getting:
Your code fails due to the use of "SS" in the UUID field. This field is of type string, so you must use "S". Note, however, that since you are using the DocumentClient API you do not need to pass values using this typed notation. See this answer for further details.
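For reference, a rough sketch of that batchWrite approach with the DocumentClient, assuming the full items have already been fetched (as in the question) so they can be written back whole with the modified ACCOUNTNAME. Note that batchWrite issues PutRequests that replace entire items rather than applying an UpdateExpression, and it is limited to 25 requests per call:

// items fetched earlier in the question's code
const items = usersOfAccountFromDB.body.Items;

// build one PutRequest per item, overwriting ACCOUNTNAME on the full item
const putRequests = items.map(item => ({
    PutRequest: {
        Item: { ...item, ACCOUNTNAME: newAccountName }
    }
}));

const batchParams = {
    RequestItems: {
        // table name taken from the question's code
        [process.env.AWS_DYNAMO_TABLE]: putRequests.slice(0, 25)
    }
};

const batchResult = await this.getDocClient().batchWrite(batchParams).promise();
// any items reported in batchResult.UnprocessedItems should be retried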
I have resolved the issue now by running the update statements one by one in a loop:
for (let key in usersOfAccountFromDB.body.Items) {
    var updateParam = {
        TableName: process.env.AWS_DYNAMO_TABLE,
        Key: {
            UUID: usersOfAccountFromDB.body.Items[key].UUID,
            TYPE: user
        },
        UpdateExpression: "SET #ACCOUNTNAME = :val1",
        ExpressionAttributeNames: {
            '#ACCOUNTNAME': 'ACCOUNTNAME'
        },
        ExpressionAttributeValues: {
            ":val1": newAccountName
        },
        ReturnValues: "UPDATED_NEW",
    };
    const result = await this.getDocClient().update(updateParam).promise();
}
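If the number of items is modest, the same per-item updates can also be issued concurrently instead of awaited one by one; a small sketch using the same parameters as above:

// build one update promise per item and run them concurrently
const updates = usersOfAccountFromDB.body.Items.map(item => {
    const updateParam = {
        TableName: process.env.AWS_DYNAMO_TABLE,
        Key: {
            UUID: item.UUID,
            TYPE: user
        },
        UpdateExpression: "SET #ACCOUNTNAME = :val1",
        ExpressionAttributeNames: { '#ACCOUNTNAME': 'ACCOUNTNAME' },
        ExpressionAttributeValues: { ":val1": newAccountName },
        ReturnValues: "UPDATED_NEW",
    };
    return this.getDocClient().update(updateParam).promise();
});

const results = await Promise.all(updates);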

AWS AppSync GraphQL query a record by a field value

I have a user table, which consists of email, phone, etc., and I would like to query a record based on its email or phone value (instead of its id). Not having adequate knowledge of how to do this, I wrote a schema like this:
type Query {
    ...
    getUser(id: ID!): User
    getUserByEmail(input: GetUserByEmailInput!): User
    ...
}

input GetUserByEmailInput {
    email: String!
}
In the resolver for getUserByEmail(..) I tried to experiment, but nothing has worked so far, so it remains in its default state.
So when I run a query like this in the Queries console:
query GetUserByEmail {
    getUserByEmail(input: {email: "email#email.com"}) {
        id
        name
        email
        image
    }
}
it returns an error like this:
{
    "data": {
        "getUserByEmail": null
    },
    "errors": [
        {
            "path": [
                "getUserByEmail"
            ],
            "data": null,
            "errorType": "DynamoDB:AmazonDynamoDBException",
            "errorInfo": null,
            "locations": [
                {
                    "line": 41,
                    "column": 5,
                    "sourceName": null
                }
            ],
            "message": "The provided key element does not match the schema (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: xxx)"
        }
    ]
}
How can I query a record by non-Id field value?
If you use the Create Resources flow in the console, it will create a listUsers query whose resolver looks like the example below. Note that the DynamoDB operation will be a Scan with a DynamoDB filter expression, so you can use any field to filter the results. See below for the mapping template.
{
    "version": "2017-02-28",
    "operation": "Scan",
    "filter": #if($context.args.filter) $util.transform.toDynamoDBFilterExpression($ctx.args.filter) #else null #end,
    "limit": $util.defaultIfNull($ctx.args.limit, 20),
    "nextToken": $util.toJson($util.defaultIfNullOrEmpty($ctx.args.nextToken, null)),
}
You can find more details about Scans and filter expressions in the AWS AppSync documentation:
https://docs.aws.amazon.com/appsync/latest/devguide/tutorial-dynamodb-resolvers.html
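For illustration, assuming the console-generated schema exposes a filter argument on listUsers (the exact filter input type names depend on the generated schema), a lookup by email would then look roughly like this:

query ListUsersByEmail {
    listUsers(filter: { email: { eq: "email#email.com" } }) {
        items {
            id
            name
            email
            image
        }
        nextToken
    }
}

Keep in mind that a Scan reads the whole table and only filters the results afterwards, so for frequent lookups by email a global secondary index on that field with a Query resolver is usually the better fit.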

SimpleSchema unable to set specific error message for given validation

I have this very basic model with just one field, name, that I want to validate against a regex:
const Projects = new ProjectsCollection('projects');

Projects.schema = new SimpleSchema({
    _id: {type: String, regEx: SimpleSchema.RegEx.Id},
    name: {
        type: String,
        regEx: /^[a-zA-Z0-9]+((\s[a-zA-Z0-9]+)|(_[a-zA-Z0-9]+)|(-[a-zA-Z0-9]+)|(\.[a-zA-Z0-9]+))?$/
    }
});

Projects.attachSchema(Projects.schema);
Projects.attachSchema(Projects.schema);
When the validation fails I get back a validation error saying that the regex failed for Name, which is undesirable because it's ambiguous and the user has no idea what exactly they need to enter.
I tried adding the following with no success:
Projects.schema.messages({
    "regEx name": [{
        msg: "test error message"
    }]
});
This one, however, works, but the problem is that any other model with a name field will spit out the same error message (and I plan on having another model with a name field):
SimpleSchema.messages({
    "regEx name": [{
        msg: "test error message"
    }]
});
I also tried the following, with no success:
SimpleSchema.messages({
    "regEx projects.name": [{
        msg: "test error message"
    }]
});
I insert via methods and here's my insert code:
export const insert = new ValidatedMethod({
    name: 'projects.insert',
    mixins: [simpleSchemaMixin],
    schema: Projects.simpleSchema().pick([
        'name'
    ]),
    schemaValidatorOptions: {
        clean: true,
        filter: false
    },
    run({name}) {
        return Projects.insert({
            name
        }, null);
    },
});
Any ideas on how I am supposed to configure my validation messages so that I can target them at specific fields?
See the documentation for the special case of regEx messages. In your case you should try:
Projects.schema.messages({
    "regEx name": [
        {
            exp: /^[a-zA-Z0-9]+((\s[a-zA-Z0-9]+)|(_[a-zA-Z0-9]+)|(-[a-zA-Z0-9]+)|(\.[a-zA-Z0-9]+))?$/,
            msg: "test error message"
        }
    ]
});
This answer and question apply to v1 of meteor-simpl-schema.
I had the same problem and this worked for me. I think that custom messages are instance based, i.e. bound to only one SimpleSchema instance. In your validated method, pick() creates a new SimpleSchema instance without your custom messages.
I needed to manually add the custom messages from the "parent" schema.
In your case, like this:
const insertProjectSchema = Projects.schema.pick('name');
insertProjectSchema.messages(Projects.schema._messages);
export const insert = new ValidatedMethod({
    name: 'projects.insert',
    mixins: [simpleSchemaMixin],
    schema: insertProjectSchema,
    schemaValidatorOptions: {
        clean: true,
        filter: false
    },
    run({name}) {
        return Projects.insert({
            name
        }, null);
    },
});
