How to convert string to list in JSONPath notation? - jsonpath

The JSON on line 5 of the Parameters section in the screenshot uses JSONPath notation in AWS Step Functions. The key is "Values.$" and the value is the JSONPath "$", which selects a string. However, I need to pass in a LIST, not a STRING; the value that $ selects is a string.
If I put brackets around it, Step Functions no longer recognizes that I'm using JSONPath notation and simply passes in the literal dollar-sign character instead of getting the value from the input.
How can I use JSONPath notation and pass a string in as a list?

You should be able to use the States.Array intrinsic function, since Parameters is a payload template:
{
  "Filters": [{
    "Name": "replication-task-arn",
    "Values.$": "States.Array($)"
  }],
  "MaxRecords": 20,
  "WithoutSettings": true
}
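States.Array returns a JSON array containing its arguments, so wrapping "$" this way turns the single string into a one-element list. For example, with a hypothetical input ARN: if the state input that $ selects is the string "arn:aws:dms:us-east-1:123456789012:task/example", then Values.$ resolves to ["arn:aws:dms:us-east-1:123456789012:task/example"] when the template is processed.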

Related

How to extract value using a JSON Path Expression from key that contains a quote?

I have the following JSON:
{ "\"a\"" : 456 }
I want to use the jsonpath npm package with the value method to extract the value 456.
What is the jsonpath expression to extract this value?
I tried jsonpath.value({ "\"a\"" : 456 }, '$["\\"a\\""]'), but it returns undefined.
In general, my question is how to write a jsonpath expression to extract an object whose key contains a quote.
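There is no answer attached here, but one general workaround (illustrated below with Python's jsonpath-ng rather than the jsonpath npm package the question is about, so treat it only as a sketch of the approach) is to build the path programmatically instead of parsing an expression string, so the quote inside the key never has to be escaped at all:

from jsonpath_ng import Fields

doc = {'"a"': 456}
# Construct the path object directly; the field name is the literal string "a"
# including the surrounding quotes, so no expression-string escaping is involved.
matches = Fields('"a"').find(doc)
print(matches[0].value)  # 456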

DynamoDB: check that an SS attribute is contained in a given SS

Let's say I have this schema:
source_id -> String, HashKey
created_at -> String, RangeKey
required_capabilities -> StringSet
required_capabilities is a Set of Strings that we need to provide in the query in order to be able to retrieve a particular element.
For example:
If I have these three elements:
{
  "source_id": "1",
  "created_at": "2021-01-18T10:53:25Z",
  "required_capabilities": ["Cap1", "Cap2", "Cap3"]
},
{
  "source_id": "1",
  "created_at": "2021-01-18T10:59:31Z",
  "required_capabilities": ["Cap1", "Cap3"]
},
{
  "source_id": "1",
  "created_at": "2021-01-18T11:05:15Z"
}
I want to create a query, filtering for example on source_id = "1" and providing a FilterExpression with required_capabilities = ["Cap1", "Cap3", "Cap4"].
And I would expect as a result:
{
  "source_id": "1",
  "created_at": "2021-01-18T10:59:31Z",
  "required_capabilities": ["Cap1", "Cap3"] // Since I've provided "Cap1", "Cap3" and "Cap4"
},
{
  "source_id": "1",
  "created_at": "2021-01-18T11:05:15Z" // Since it doesn't require any capability.
}
I've tried the IN operator as follows, since the stored StringSet should be IN (or Contained by) the given SS, but it didn't work.
aws dynamodb query --table-name TableName --key-condition-expression "source_id = :id" --filter-expression "required_capabilities IN (:rq)" --expression-attribute-values '{":id": {"S": "1"}, ":rq": { "SS": ["Cap1", "Cap3", "Cap4"] }}'
It works only when I provide the exact same StringSet, but if I provide a set that contains the saved one and also has more values, it doesn't return anything.
It seems your issue is around the use of the IN keyword, which does not work with sets. From the docs on conditions:
IN : Checks for matching elements in a list.
AttributeValueList can contain one or more AttributeValue elements of type String, Number, or Binary. These attributes are compared against an existing attribute of an item. If any elements of the input are equal to the item attribute, the expression evaluates to true.
I believe you want the CONTAINS keyword:
CONTAINS : Checks for a subsequence, or value in a set.
AttributeValueList can contain only one AttributeValue element of type String, Number, or Binary (not a set type). If the target attribute of the comparison is of type String, then the operator checks for a substring match. If the target attribute of the comparison is of type Binary, then the operator looks for a subsequence of the target that matches the input. If the target attribute of the comparison is a set ("SS", "NS", or "BS"), then the operator evaluates to true if it finds an exact match with any member of the set. CONTAINS is supported for lists: When evaluating "a CONTAINS b", "a" can be a list; however, "b" cannot be a set, a map, or a list.
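For example, with boto3 (a Python sketch; the table and attribute names are taken from the question), a single-value contains check against the stored set could look like this:

import boto3
from boto3.dynamodb.conditions import Key, Attr

table = boto3.resource("dynamodb").Table("TableName")

# contains() checks one value at a time against the stored string set,
# so this matches items whose required_capabilities set includes "Cap1".
resp = table.query(
    KeyConditionExpression=Key("source_id").eq("1"),
    FilterExpression=Attr("required_capabilities").contains("Cap1"),
)
items = resp["Items"]

Note that this only tests membership of one provided value; it does not express "the stored set is a subset of the provided list", which is what the question actually asks for.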
Actually, I found out that DynamoDB doesn't support the use case I needed, so I found a workaround.
Basically, instead of modelling the required capabilities as a StringSet, I've created a field called required_capability containing a single required capability (which is OK so far for me), and I use the IN operator to check it.
If in the future I need to check for more than one capability, I just need to add new fields required_capability_2 and required_capability_3.
It's clearly not ideal, but I guess it's good enough, considering I won't have a lot of required capabilities in a single record; it's usually one, maybe two.
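A rough boto3 (Python) sketch of that workaround might look like the following; the attribute name required_capability and the handling of items with no capability at all are assumptions based on the description above:

import boto3
from boto3.dynamodb.conditions import Key, Attr

table = boto3.resource("dynamodb").Table("TableName")
provided = ["Cap1", "Cap3", "Cap4"]

# Return items that either require no capability at all,
# or whose single required_capability is one of the provided values.
resp = table.query(
    KeyConditionExpression=Key("source_id").eq("1"),
    FilterExpression=(
        Attr("required_capability").not_exists()
        | Attr("required_capability").is_in(provided)
    ),
)
items = resp["Items"]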

JSONPath - Filter expression to print a field if an array contains a string

I have the following JSON and am trying to write a JSONPath expression which will return the ISBN number when I have an id of either '123456789' or '987654321'. I tried the following, but it did not work. Can anybody tell me what I am doing wrong, please? Thanks in advance.
JSON Path Expression
$.books[?(@.ids == '123456789')].isbnNumber
JSON
{
  "books": [{
    "title": "10",
    "isbnNumber": "621197725636",
    "ids": [
      "123456789",
      "987654321"
    ]
  }]
}
The (more traditional) JSONPath implementations that stick close(r) to Goessner's reference specification do not offer handy operators like in, which are available in extended implementations such as Jayway's JSONPath.
Using Gatling's JSONPath, one thing we could do, if the positions of the ids in question are fixed, is to access their respective indices directly and make the comparison:
$.books[?(@.ids[0] == "123456789" || @.ids[1] == "987654321")].isbnNumber
This will give you the desired result for your example; however, if some books have only one of the two ids, or the id to compare against shows up at a different position, it won't work.
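For completeness, an extended implementation such as Jayway's JSONPath has an anyof filter operator (true when the left side has an intersection with the right side), so something along these lines should cover both ids regardless of their position (untested against this exact document, so treat it as a sketch):
$.books[?(@.ids anyof ['123456789', '987654321'])].isbnNumber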

Look for Value in Multiple Keys with JSONPath

With JSONPath, how can you extract a single value from a list of known keys?
For example, I want to write one JSON path expression that can extract Sean from all three of these JSON documents:
{ "firstName": "Sean" }
{ "first_name": "Sean" }
{ "first_name": "Sean", "firstName": "Sean" }
This example is a little contrived, but I have an actual use case that requires this behavior.
The best I can come up with is the expression $.firstName,first_name which will work for #1 and #2 but returns an array for #3 — and I just want the first value that matches.
Basically, I’m looking for a JSONPath extract expression that simulates this JavaScript code:
json.firstName || json.first_name
I believe you want something like this :)
You can get the JSON path value using an index. When I'm using REST Assured, I always use something similar to the code below to extract values from my JSON response.
Response response = given().contentType(ContentType.JSON).get("http://localhost:3000/posts");
JsonPath jsonPathEvaluator = response.jsonPath();
String fn1 = jsonPathEvaluator.get("firstName[0]");
String fn_1 = jsonPathEvaluator.get("first_name[0]");
String fn2 = jsonPathEvaluator.get("firstName[1]");
You can put all the pairs into a dict and then extract your values, or if you need only the values, you can use a set structure to store the keys and a separate list for the values.
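If an expression-only solution proves elusive, the fallback the question itself hints at (json.firstName || json.first_name) is easy to do in the host language after parsing; here is a minimal Python sketch of that idea:

def first_matching(doc, keys=("firstName", "first_name")):
    # Return the value of the first key that is present,
    # mirroring json.firstName || json.first_name.
    for key in keys:
        if key in doc:
            return doc[key]
    return None

first_matching({"firstName": "Sean"})                        # "Sean"
first_matching({"first_name": "Sean"})                       # "Sean"
first_matching({"first_name": "Sean", "firstName": "Sean"})  # "Sean" (first match only)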

Convert large XML values into double-type JSON?

I'm forming an XML document, a snippet of which is:
<cache-properties>
  <list-cache-hit-rate>
    <units>hits/sec</units>
    <value>1.5308452E6</value>
  </list-cache-hit-rate>
  <list-cache-miss-rate>
    <units>misses/sec</units>
    <value>25422.167</value>
  </list-cache-miss-rate>
  <compressed-tree-cache-hit-rate>
    <units>hits/sec</units>
    <value>970.2339
Notice that the value 1.5308452E6 is big enough that it gets stored in exponent notation when fn:sum() runs behind the scenes.
Later, I'm converting the XML to JSON with the following function:
let $arr := json:to-array(local:tojson($data))
return (($data))
and the converted value looks like this:
cache-properties": {
"list-cache-hit-rate": {
"units": "hits/sec",
"value": 1.5308452E6
},
"list-cache-miss-rate": {
"units": "misses/sec",
"value": "25422.167"
},
"compressed-tree-cache-hit-rate": {
"units": "hits/sec",
"value": "970.2339"
},
Notice that the values are enclosed in quotes except for 1.5308452E6; this value is not in quotes. What correction is needed here? Or is this correct? I'd rather have all values in quotes. This is my custom transform function:
declare function local:tojson($func) {
  let $custom :=
    let $config := json:config("custom")
    let $_ := map:put($config, "whitespace", "ignore")
    let $_ := map:put($config, "array-element-names", "Video")
    return $config
  return json:transform-to-json($func, $custom)
};
Take a look at the XML schema. Your snippets appear to be similar or identical to the MarkLogic system status XML schema; however, you mention fn:sum() "behind the scenes", so I'm guessing you have applied a transformation which has changed the XSD type.
The JSON transformation code uses the XSD type, if one is in scope, to determine the typed output in JSON (for XML numeric types). Also, if the number is "too large", it can convert it to a string to avoid JavaScript issues
(it basically uses fn:data(value) to convert).
If needed, you can either force a string type onto your XML, or you can specialize the transformation by overriding one of the json-custom: primitives in json/custom.xqy, supplying the appropriate mapping in the config. Look into the source for the full list of overridable functions. They are not fully documented, as they were not written with full generality in mind, and it may not be obvious, easy, or even possible to change the behaviour in every conceivable way.
The strategies are to either
Use an XML document with a schema in scope that types the atomic values explicitly (in your case as xs:string),
Override one of the low-level functions in custom.xqy,
Convert the JSON by post-processing and "stringify" the desired elements (see the sketch after this list),
Roll your own (not too difficult with the samples shown),
or all of the above.
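As an illustration of the post-processing option only (Python here purely for brevity; this is a sketch, not MarkLogic code, and the sample document below just mirrors the output shown above), you could walk the parsed JSON and stringify every numeric value field:

import json

def stringify_values(node):
    # Recursively convert any numeric "value" field into a string; everything else is left untouched.
    if isinstance(node, dict):
        return {
            k: str(v) if k == "value" and isinstance(v, (int, float)) else stringify_values(v)
            for k, v in node.items()
        }
    if isinstance(node, list):
        return [stringify_values(item) for item in node]
    return node

doc = {
    "cache-properties": {
        "list-cache-hit-rate": {"units": "hits/sec", "value": 1.5308452E6},
        "list-cache-miss-rate": {"units": "misses/sec", "value": "25422.167"},
    }
}
print(json.dumps(stringify_values(doc), indent=2))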
