I'm trying to read an item with an ID of X from DynamoDB (using AppSync GraphQL), and I want it to create a default item if there is none.
This seems like it should be a normal use case. But the solutions I've tried have all been pretty bad:
I tried to create a pipeline resolver that would first get the item and then, in a second function, create the item if the first function returned nothing. This had problems returning the read item.
I tried making a PutItem operation with a condition that an item with this ID does not already exist. This does what I need it to, but I can't change the response from being an error, no matter what I do to the response mapping template.
So how does one efficiently create a "read, or create if it does not exist" resolver for DynamoDB?
It turns out that I was close to the solution.
According to this documentation: https://docs.aws.amazon.com/appsync/latest/devguide/resolver-mapping-template-reference-dynamodb.html#aws-appsync-resolver-mapping-template-reference-dynamodb-condition-handling
Create a PutItem resolver with a condition that checks whether an item with the same unique identifier already exists (in DynamoDB that's usually the partition key, or a partition key and sort key combination).
If the condition fails but the existing item is not considered different from the item we intended to write (ignoring the fields listed in equalsIgnore), no error is returned. So we can simply exclude ALL the changing fields from the comparison.
Example:
{
    "version" : "2017-02-28",
    "operation" : "PutItem",
    "key" : {
        "id" : { "S" : "${ctx.args.id}" }
    },
    "condition" : {
        "expression" : "attribute_not_exists(id)",
        "equalsIgnore": [ "__typename", "_version", "_lastChangedAt", "_createdAt", "name", "owner" ]
    },
    "attributeValues": {
        "name": { "S" : "User Username" }
    }
}
I'm using the DynamoDB resource in Retool, which works fine for Gets/Scans/Puts/Queries, but I can't seem to get an UpdateItem statement to work.
I'm trying to update an item so that a key holding a list of maps is created if it doesn't exist, and an item is appended to that list if the key already exists.
Configuration
Update Expression
SET images = list_append(:val, if_not_exists(images, :emptylist))
ExpressionAttributeValues
In Retool, my ExpressionAttributeValues are
":val": [{"location": "{{s3Uploader1.s3FolderName}}/{{s3Uploader1.s3FileName}}"}], ":emptylist":[], which pulls the s3 folder and file names from an s3Uploader and renders to ":val": [{"location": "redactedpath/redacted/redactedfilename"}], ":emptylist":[]
I originally tried the format of calling out the data types, e.g. "M", "L", etc, but I got exactly the same error.
":val":
{
"L":
[
{
"M":
{
"location":
{
"S": "{{s3Uploader1.s3FolderName}}/{{s3Uploader1.s3FileName}}"
}
}
}
]
},
":emptylist":
{
"L":[]
}
Result/Error
When I run the query, I get the following error:
statusCode:422
error:"Unprocessable Entity"
message:"ExpressionAttributeValues contains invalid key: Syntax error; key: "44""
data:null
estimatedResponseSizeBytes:147
resourceTimeTakenMs:363
isPreview:false
resourceType:"dynamodb"
lastReceivedFromResourceAt:1644774304601
source:"resource"
From my understanding, that error message usually names the actual key that caused the problem, but as far as I can tell my ExpressionAttributeValues does not contain the string "44". I'm wondering if this is something coming from Retool, or if "44" is perhaps a position rather than the actual key.
I've dug through what feels like the depths of StackOverflow to try different things, but now I feel like I'm stuck.
Additional Information
My original ExpressionAttributeValues was based on Is it possible to combine if_not_exists and list_append in update_item
Similar question, but no answer and different key: ValidationException: ExpressionAttributeValues contains invalid key
Is there anything in the ExpressionAttributeValues that looks like it could cause that error?
I have a document with the following structure:
{
    "email" : "a#gmail.com",
    "value" : 100,
    "children" : [
        {
            "email" : "b#gmail.com",
            "value" : 100
        },
        {
            "email" : "b#gmail.com",
            "value" : 200
        }
    ]
}
I want to remove all elements with the email b#gmail.com from the children array. I am able to remove one item if I pass the whole object to be removed like this:
FieldValue.arrayRemove(childObject)
But I want both of the objects with the email b#gmail.com to be removed. Is there any way to achieve this using FieldValue.arrayRemove()?
The arrayRemove operation removes the exact item that you specify from the array. There is no way to pass a partial object and remove all array items that match the partial information. You will have to pass in each complete item that you want to remove.
If you don't know yet exactly what those items are, you will typically have to first read the document, filter the matching items out of the array in your own code, and write the modified array back to the document.
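For example, a minimal sketch of that read-filter-write cycle using the Node.js Firebase Admin SDK (the function name and document path are placeholders, not part of the original question):

const admin = require('firebase-admin');

admin.initializeApp();
const db = admin.firestore();

// Remove every entry of the "children" array whose email matches,
// inside a transaction so concurrent writes are not lost.
async function removeChildrenByEmail(docPath, email) {
  const ref = db.doc(docPath);
  await db.runTransaction(async (tx) => {
    const snap = await tx.get(ref);
    const children = snap.get('children') || [];
    const filtered = children.filter((child) => child.email !== email);
    tx.update(ref, { children: filtered });
  });
}

// Usage: removeChildrenByEmail('users/someDocId', 'b#gmail.com');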
As an update, it is still the case that you must match the object exactly to remove it from an array. And in the example above, the match is on a field value, which means you first have to read the data to see which items match.
However, depending on your logic, you can use a Map instead of a List. For instance, the case above, adjusted:
"children" :
"b#gmail.com_100":
{
"email" : "b#gmail.com",
"value" : 100
},
"b#gmail.com_200":
{
"email" : "b#gmail.com",
"value" : 200
}
You can simply use:
'children.b#gmail.com_200': FieldValue.delete(),
As of late, I've gravitated away from Lists to Maps for this reason.
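For completeness, a small usage sketch with the Node.js Admin SDK (db is the Firestore instance from the earlier sketch, and the path is a placeholder). Since these example keys contain literal dots, a FieldPath keeps them from being read as nested field separators:

const { FieldValue, FieldPath } = require('firebase-admin').firestore;

// Delete one entry from the "children" map; FieldPath is used because the
// key "b#gmail.com_200" itself contains dots.
db.doc('users/someDocId').update(
  new FieldPath('children', 'b#gmail.com_200'),
  FieldValue.delete()
);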
Edit for clarity: There are no error messages; it simply returns an empty list when the input string comes from context.arguments, which suggests that the input variable just isn't being read in the query tester (setting it up incorrectly brings up that famous typing error, of course). I've also made this into a pipeline resolver, with exactly the same result. Looking around, people suggest making an intermediate object, but surely I'm just getting my input variables out wrong somehow.
I'm working on a project in AWS Appsync using DynamoDB and I've run into a problem with the context.arguments input.
Basically the code all works if I hardcode the string for the book id into the query (full context to follow), but if I use the context.arguments, it simply refuses to work properly, returning an empty array for the "spines".
I have the following types in my schema:
type Book {
    id: ID!
    title: String
    spines: [Spine]
}

type Spine {
    id: ID!
    name: String
    bookId: ID!
}
My Query type, and the query I use:
type Query {
    getBook(id: ID!): Book
}

query getBook($bookId: ID!) {
    getBook(id: $bookId) {
        title
        id
        spines {
            name
            bookId
        }
    }
}
With the following input (assume this is a relevant guid):
{
"bookId": "aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa"
}
And this resolver for the spines object:
{
    "version" : "2017-02-28",
    "operation" : "Query",
    "index" : "bookId-index",
    "query" : {
        "expression": "#bookId = :bookId",
        "expressionNames" : {
            "#bookId" : "bookId"
        },
        "expressionValues" : {
            ":bookId" : { "S" : "${context.arguments.id}" }
        }
    }
}
I made sure my data set contained false positives too (spines for other books) so that I know when my query brings back the correct data.
This works if I hardcode a GUID as a string instead of using context.arguments, and it returns exactly what I'm looking for, for each book GUID.
For example, replacing the expression values with this works perfectly:
"expressionValues" : {
":bookId" : { "S" : "aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa" }
}
Why does "${context.arguments.id}" not get the input variable here the same way as it seems to in other queries?
Thanks to @IonutTrestian for pointing me in the right direction.
$ctx.args was empty, but I decided to go up the chain to see what was in the entire context, so I dumped it with $util.error($util.toJson($ctx)).
The JSON object I found included a little object called "source", which contained the query result for the Book object.
Long story short, $ctx.source.id when applied to my query worked a charm.
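Applied to the spines resolver above, only the expressionValues block changes, roughly like this:

"expressionValues" : {
    ":bookId" : { "S" : "${ctx.source.id}" }
}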
I also know a bit more about debugging DynamoDB resolvers in case I encounter problems like this in future. Thank you so much!
I am trying to write a pact consumer test to match the following response.
[
    {
        "accountId" : 1,
        "permissions" : [
            {
                "schedule" : {
                    "01/01/2018" : false,
                    "01/01/1900" : true
                },
                "permissionId" : 3
            }
        ]
    }
]
Each schedule object is composed of an unknown number of keys which match a simple regular expression. But I don't see a way to match a key using a regular expression while having the value map to a simple boolean.
For instance, I see the following method in the API.
public LambdaDslObject eachKeyLike(
String exampleKey,
Consumer<LambdaDslObject> nestedObject)
But that is going to expect a new object as the value, instead of a primitive type.
"schedule" : {
"01/01/2018" : { ... }, // not what I want to match
"01/01/1900" : false // what I want to match
}
Is there a way to specify an imprecise key mapped to a primitive value in pact-jvm?
Sorry, this feature doesn't exist yet, but it's been discussed for the next version of the pact specification. You can add your thoughts on this issue: https://github.com/pact-foundation/pact-specification/issues/47
My question is: how do I append a value given by the user to an entity? The user-provided value is dynamic.
The Watson response overwrites the toppings variable with the value given by the user, as you can see in the attached image.
{
    "output": {
        "text": "I got an order to add one or more toppings. Adding <?context.toppings.append('toppings')?>. Toppings to provide: <?entities['toppings']?.toString()?>"
    },
    "context": {
        "toppings": "<? entities['toppings']?.toString()?>"
    }
}
You can append to an array with the .append() function.
In your example, the expression "toppings": "<? entities['toppings']?.toString()?>" will overwrite the toppings variable with the actual recognized entities #toppings every time this node is processed. First, the $toppings variable needs to be defined as an array, e.g.:
"context" : {
"toppings" : []
}
Then in context part of a dialog node you can write:
"context" : {
"toppings" : "<?$toppings.append(entities['toppings'].toJsonArray())?>"
}
More info in our doc: Watson Conversation Doc
EDIT: Thinking about this, it is probably not a good idea to have the same name for the entity and for the variable you store it in. :-)