Could somebody please tell me what a valid key condition expression would be? I am trying to run a query on a simple table called MyKeyTable. It has two "columns," namely Id and AnotherNumberThatICareAbout, which is of type Long.
I would like to see all the values I put in. So I tried:
aws dynamodb query --select ALL_ATTRIBUTES --table-name MyKeyTable
--endpoint http://localhost:8000
--key-condition-expression "WHAT DO I PUT IN HERE?"
What hash do I need to put in? The docs are a bit lame on this imho. Any help appreciated, even if it's just a link to a good doc.
Here's a command-line-only approach you can use with no intermediate files.
First, use value placeholders to construct your key condition expression, e.g.,
--key-condition-expression "Id = :idValue"
(Don't forget the colon prefix for placeholders!)
Next, construct an expression-attribute-values argument. Note that it expects JSON format. The tricky bit (which I always forget) is that you can't just plug in 42 for a number or "foo" for a string; you have to tell DynamoDB the type as well as the value. See the AWS docs for the complete breakdown of how to format the value specification, which can get quite complex if you need it to be.
For Windows you can escape quotation marks in it by doubling them, e.g.,
--expression-attribute-values "{"":idValue"":{""N"":""42""}}"
For macOS/Linux, single quotes are required around the JSON:
--expression-attribute-values '{":idValue":{"N":"42"}}'
Create a file containing your key conditions, e.g. test.json (the range key condition is optional):
{
    "yourHashKeyName": {"AttributeValueList": [{"S": "abc"}], "ComparisonOperator": "EQ"},
    "YourRangeKey": {"AttributeValueList": [{"S": "xyz"}], "ComparisonOperator": "EQ"}
}
Run
aws dynamodb query --table-name "your table name" --key-conditions file://test.json
refer: http://docs.aws.amazon.com/cli/latest/reference/dynamodb/query.html
For scanning the table:
aws dynamodb scan --table-name "your table name"
No need to pass any keys, as we scan the whole table. (Note: a single scan call returns at most 1 MB of data.)
refer: http://docs.aws.amazon.com/cli/latest/reference/dynamodb/scan.html
I am looking for a way to improve the following query. I need to query based on 3 keys:
Primary Partition Key
GSI Partition Key
GSI Sort Key
DynamoDB only allows 2 conditions in the key-condition-expression, so I have to use a filter-expression, which scans too many records.
As an alternative, I am also considering combining 2 keys into the sort key of a GSI. Is this the right way to do it?
aws dynamodb query \
--table-name bundles \
--index-name GSI-RegulationSidBundleStatus \
--key-condition-expression "RegulationSid = :regulationSid AND BundleStatus = :bundleStatus" \
--filter-expression "AccountSid = :accountSid" \
--expression-attribute-values '{
":accountSid": {
"S": "XXXXXXXXXXXXXXXXXXXXXXXXXXX"
},
":regulationSid": {
"S": "YYYYYYYYYYYYYYYYYYYYYYYYYYY"
},
":bundleStatus": {
"S": "APPROVED"
}
}'
Yes, adding multiple keys into the partition key or sort key is a common pattern. To help identify the keys, it is common to prefix each key value with the key type (or an abbreviation of it) followed by a hash, and to put a hash between each key.
For your case, a sort key would look similar to:
r#${RegulationSid}#b#${BundleStatus}.
This is also common for partition keys, even when you have just one key. In your case: a#${AccountSid}.
When deciding on whether to overload the partition key or sort key, and the order of the keys, look at your access patterns. If you know you have a pattern to get all regulations, you want to put this key first. Or, if you know you need to get all regulations for a given bundle status, put the bundle status first. Then you can use a begins_with query to get these lists of items, reusing the same GSI.
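As a rough sketch, assuming a hypothetical GSI whose partition key attribute is gsi1pk (holding a#${AccountSid}) and whose sort key attribute is gsi1sk (holding the overloaded value above), all three of your keys fit into a single key condition; the index and attribute names here are placeholders for whatever you call them:
aws dynamodb query \
    --table-name bundles \
    --index-name GSI-AccountRegulationBundle \
    --key-condition-expression "gsi1pk = :pk AND begins_with(gsi1sk, :sk)" \
    --expression-attribute-values '{
        ":pk": {"S": "a#XXXXXXXXXXXXXXXXXXXXXXXXXXX"},
        ":sk": {"S": "r#YYYYYYYYYYYYYYYYYYYYYYYYYYY#b#APPROVED"}
    }'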
In your code documentation, you can list the used abbreviations to make sure you don't use the same abbreviation for multiple key types.
I suggest always keeping the customer account Id as the first key in the partition key. If you later decide to use Leading Key row-level authorisation this will come in handy.
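For reference, a Leading Keys condition restricts access by partition key value; a minimal sketch of such a policy statement (the account number, table name, and key value are placeholders) would look roughly like:
{
    "Effect": "Allow",
    "Action": ["dynamodb:Query", "dynamodb:GetItem"],
    "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/bundles",
    "Condition": {
        "ForAllValues:StringEquals": {
            "dynamodb:LeadingKeys": ["a#XXXXXXXXXXXXXXXXXXXXXXXXXXX"]
        }
    }
}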
I went through some posts and learned that case-insensitive search is not possible in DynamoDB, so I am trying to update an existing DynamoDB table's column values to lowercase.
I searched for the syntax but haven't found any satisfactory result. In MySQL we would achieve the same thing with:
SET name = LOWER(name)
Please help me write the same thing for DynamoDB.
I wrote this query:
aws dynamodb update-item --profile test --table-name test-event-tickets --key '{"university_id": {"S": "112"}}' --update-expression 'SET #nameAttribute = :inputname' --expression-attribute-names '{"#nameAttribute":"name"}' --expression-attribute-values '{":inputname":{"S":"george philips"}}'
but here I have hardcoded :inputname to "george philips". Instead of this, I want to read the existing column value and convert it to lowercase.
Unfortunately, there is no such syntax in DynamoDB. Although DynamoDB is capable of doing some transformations to data in place, such as incrementing a counter, the syntax for this is very limited, and lowercasing a value is NOT one of the things you can do.
So you'll have to scan the entire table, reading the old value of the attribute, calculating the lowercase version in your application, and writing the value back. If your application is doing regular writes in parallel to this transformation, you'll need to be very careful not to overwrite data that is being modified in parallel. You can do this with a condition expression, but I think it will be easier if the new lowercase attribute gets a different name from the old not-always-lowercase attribute, so that your transformation process only writes the new attribute (using ConditionExpression) if it is not yet set.
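A rough command-line sketch of that scan-and-rewrite approach, using the table and key from your command and writing to a hypothetical new attribute name_lc (requires jq, handles only the first page of scan results, and assumes the values contain no double quotes):
#!/bin/bash
# Scan the table, lowercase "name" locally, and write it to a new attribute
# "name_lc" only if that attribute has not already been set.
aws dynamodb scan \
    --table-name test-event-tickets \
    --projection-expression "university_id, #n" \
    --expression-attribute-names '{"#n":"name"}' \
    --query "Items" --output json |
jq -c '.[]' |
while read -r item; do
    key=$(echo "$item" | jq -c '{university_id: .university_id}')
    lower=$(echo "$item" | jq -r '.name.S // empty' | tr '[:upper:]' '[:lower:]')
    [ -n "$lower" ] || continue   # skip items without a "name" attribute
    aws dynamodb update-item \
        --table-name test-event-tickets \
        --key "$key" \
        --update-expression "SET name_lc = :v" \
        --condition-expression "attribute_not_exists(name_lc)" \
        --expression-attribute-values "{\":v\":{\"S\":\"$lower\"}}"
done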
I am writing a get-item command for AWS DynamoDB.
Here is my command:
aws dynamodb get-item --table-name foo --key '{"bar":{"S":"aaaa"},"timestamp":{"N":"1603610188890"}}'
As you can see, the table foo has composite primary key:
partition key "bar" and sort key "timestamp".
What is the proper syntax to use comparison for the sort key "timestamp"?
How can I change my command to get the items whose timestamp is between 1603010188890 and 1603610188890?
Thanks.
The get-item operation can only retrieve a single item, with a specific key. To retrieve items - possibly more than one - in a certain sort-key range you need to use a different request - query.
The "query" request has a key-conditions or key-condition-expression (these are the older and newer, respectively, syntax, for the same thing). With that parameter you can say that you want items where the partition key is equal something, and the sort key is between two values.
I am looking to add some items into DynamoDB via the console. (Please see the screenshot below.) When I click the "Save" button, nothing happens. No new items are getting created. I also checked the DynamoDB JSON checkbox to convert the JSON into DynamoDB-compatible JSON and clicked the save button again, but nothing happens. Can someone please advise what I am doing wrong? There are no error messages either.
You haven't provided your table definition, so it's difficult to say exactly what a valid item for your table would look like, but I can tell you for sure that:
1) You shouldn't be creating an array of JSON objects: each item you create must be an individual valid JSON object. Like so:
{
"sub": 1234,
"EventID": ["B213", "B314"]
}
2) Each item you create must include attributes matching the item schema for your table. This means that if your table has just a partition key defined, then each item must include one attribute whose name matches the name of the partition key. If the table has both a partition and a sort key, then each item you create must include at least two attributes, one matching the partition key, the other matching the sort key. And finally, the partition and sort key attributes must be of type string, number, or binary.
Assuming your table has a partition key called sub and no sort key, then the item example above would work.
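For example, if the table also had a (hypothetical) sort key called EventDate, each item would have to carry both key attributes:
{
    "sub": 1234,
    "EventDate": "2020-10-25",
    "EventID": ["B213", "B314"]
}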
Update
Based on the comment it sounds like the OP was looking for a way to insert multiple items in a single operation. This is not possible with the console, and actually it goes deeper than that: DynamoDB fundamentally operates on a single item at a time for write operations. It is of course possible to batch up to 25 item writes using the API, but that is just a convenience.
If you need to add multiple items to your table, consider writing a small script using the AWS CLI or the API. It's relatively easy to do!
The scripting solution looks something like this:
aws-dynamodb-upload-json.sh
#!/bin/bash
set -e
# parse
if [[ $# -eq 0 ]]; then set -- "--help"; fi
if [[ "$1" = "--help" ]]; then
echo "Usage:"
echo " aws-dynamodb-upload-json {table-name} {file.json}"
exit 1
fi
# config
AWS_DYNAMODB_UPLOAD_TABLE_NAME=$1
AWS_DYNAMODB_UPLOAD_FILE_INPUT=$2
echo "Configuration"
echo "AWS_DYNAMODB_UPLOAD_TABLE_NAME=$AWS_DYNAMODB_UPLOAD_TABLE_NAME"
echo "AWS_DYNAMODB_UPLOAD_FILE_INPUT=$AWS_DYNAMODB_UPLOAD_FILE_INPUT"
# main
jq -c '.[]' < "$AWS_DYNAMODB_UPLOAD_FILE_INPUT" |
while read -r row
do
echo "Entry: $row"
echo ""
aws dynamodb put-item \
--region us-east-1 \
--table-name "$AWS_DYNAMODB_UPLOAD_TABLE_NAME" \
--item \
"$row"
done
This does rely on the aws CLI and the jq CLI being installed and on your $PATH.
Hopefully AWS adds an easier way to do this via the web interface someday.
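Usage would look something like this, with a hypothetical items.json holding an array of items in DynamoDB JSON format (the typed format that put-item expects) and a placeholder table name:
$ cat items.json
[
    {"sub": {"N": "1234"}, "EventID": {"L": [{"S": "B213"}, {"S": "B314"}]}},
    {"sub": {"N": "5678"}, "EventID": {"L": [{"S": "C111"}]}}
]
$ ./aws-dynamodb-upload-json.sh my-table items.json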
I am trying to query a dataset in DynamoDB where the primary key is a timestamp.
First I wanted to get all data for a specific sensorId.
I tried with a scan (scan.json):
{
"sensorId": {
"AttributeValueList": [{
"S": "1234"
}],
"ComparisonOperator": "EQ"
}
}
This JSON was used via the CLI command:
aws dynamodb scan --table-name sensorData --scan-filter file://scan.json
That was successful and gave me all data for the specified sensorId.
Now I want to have only the timestamp and sensorId in the result, so I read about projection-expression and tried a query (query.json):
{
":Id":{"S":"1234"}
}
AWS CLI command:
aws dynamodb query --table-name sensorData --key-condition-expression "sensorId = :Id" --expression-attribute-values file://query.json --projection-expression "timestamp"
Gave me :
An error occurred (ValidationException) when calling the Query
operation: Invalid ProjectionExpression: Attribute name is a reserved
keyword; reserved keyword: timestamp
But replacing "timestamp" with "sensorId" for testing purposes gave me:
An error occurred (ValidationException) when calling the Query
operation: Query condition missed key schema element: timestamp
And I understand that sensorId is not valid for the key condition expression:
KeyConditionExpression accepts only key attributes, i.e. the hash key and range key.
But how do I get the result?
I want only the timestamps for a sensorId.
Is my primary key wrong? Would it be better to use sensorId as the partition key together with timestamp as the range key?
Based on your scenario, it's better to change your keys. Use SensorID as the partition key and Timestamp as the sort key.
This way you can query (without scanning all the items) the items for a given SensorID. It is also possible to sort them in order of the Timestamp.
If you have a larger dataset (more than several kilobytes), it would be more efficient to create an LSI or GSI to project only the required attributes for a given query.
Note: timestamp is a reserved word in DynamoDB. You can use expression attribute names to avoid errors when using reserved words in query expressions.
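With that schema (sensorId as partition key, timestamp as sort key), the query from the question would look roughly like this:
aws dynamodb query \
    --table-name sensorData \
    --key-condition-expression "sensorId = :Id" \
    --expression-attribute-values '{":Id":{"S":"1234"}}' \
    --expression-attribute-names '{"#ts":"timestamp"}' \
    --projection-expression "sensorId, #ts"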