Using CloudFormation to create DynamoDB with composite primary key - amazon-dynamodb

From the command line or the online API, I have no trouble creating a "composite primary key", but when I try to use CloudFormation to do the job for me, I don't see any JSON/YAML that will let me set something called a "composite primary key". The language is completely different, so I was hoping someone could guide me on how to create such a key using CloudFormation.
My best guess is something like the following where I want the composite key to consist of both userId and noteId:
Resources:
  usersTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: notes_serverless
      AttributeDefinitions:
        - AttributeName: userId
          AttributeType: S
        - AttributeName: noteId
          AttributeType: S
      KeySchema:
        - AttributeName: userId
          KeyType: HASH
        - AttributeName: noteId
          KeyType: RANGE
      ProvisionedThroughput:
        ReadCapacityUnits: 1
        WriteCapacityUnits: 1

Here is the YAML syntax for DynamoDB table creation with partition and sort keys.
The syntax in the OP's question is almost correct. I have just formatted it with proper quotes and rearranged the order of the properties.
AWSTemplateFormatVersion: "2010-09-09"
Resources:
  usersTable:
    Type: "AWS::DynamoDB::Table"
    Properties:
      AttributeDefinitions:
        - AttributeName: "userId"
          AttributeType: "S"
        - AttributeName: "noteId"
          AttributeType: "S"
      KeySchema:
        - AttributeName: "userId"
          KeyType: "HASH"
        - AttributeName: "noteId"
          KeyType: "RANGE"
      ProvisionedThroughput:
        ReadCapacityUnits: "5"
        WriteCapacityUnits: "5"
      TableName: "notes_serverless"

Related

How to create/add an encryption/key to a dynamo table via cloudformation?

See the sample DynamoDB table CloudFormation template below. When I create the table below, what encryption does AWS put in place to protect my data, if any? If none, how can I specify in the template below that I want to encrypt my data with a key provided by AWS itself, if possible? If that is not possible, I assume I will need to add a key resource to this as well.
AWSTemplateFormatVersion: "2010-09-09"
Resources:
  myDynamoDBTable:
    Type: AWS::DynamoDB::Table
    Properties:
      AttributeDefinitions:
        - AttributeName: "product"
          AttributeType: "S"
        - AttributeName: "model"
          AttributeType: "S"
      KeySchema:
        - AttributeName: "product"
          KeyType: "HASH"
        - AttributeName: "model"
          KeyType: "RANGE"
      ProvisionedThroughput:
        ReadCapacityUnits: "5"
        WriteCapacityUnits: "5"
      TableName: "InfoTable"
As mentioned here, add an SSESpecification to your table. So:
AWSTemplateFormatVersion: "2010-09-09"
Resources:
  myDynamoDBTable:
    Type: AWS::DynamoDB::Table
    Properties:
      AttributeDefinitions:
        - AttributeName: "product"
          AttributeType: "S"
        - AttributeName: "model"
          AttributeType: "S"
      KeySchema:
        - AttributeName: "product"
          KeyType: "HASH"
        - AttributeName: "model"
          KeyType: "RANGE"
      ProvisionedThroughput:
        ReadCapacityUnits: "5"
        WriteCapacityUnits: "5"
      TableName: "InfoTable"
      SSESpecification:
        SSEEnabled: true
This encrypts the table using the AWS managed encryption key.
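If you want a customer managed key instead of the AWS managed one, the SSESpecification block also accepts SSEType and KMSMasterKeyId. A sketch of that variant (the key alias below is illustrative, not from the original question):

```yaml
SSESpecification:
  SSEEnabled: true
  SSEType: KMS
  KMSMasterKeyId: alias/my-table-key   # hypothetical alias; an ARN or key ID also works
```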

How to create a DynamoDB Global Secondary Index with a hash of multiple fields?

To me, the word "hash" conveys that it IS possible to have a hash consisting of multiple fields within DynamoDB. However, every article I find shows the "hash" consisting of only a single value, which doesn't make any sense to me.
My table consists of the following fields:
uid (PK)
provider
identifier
from
to
date_received
date_processed
The goal is to have multiple indexes based on how my app will retrieve data (other than by the PK, of course). The combinations are:
By the provider's message identifier:
Desired hash: provider + identifier
By the conversation message identifier:
Desired hash: from + to
By the date received and whether it is processed:
Desired hash: _ac
By the date received and whether it is processed:
Desired hash: account
Here's one example of what I've tried that was not successful:
MessagesTable:
  Type: AWS::DynamoDB::Table
  Properties:
    TableName: messages
    BillingMode: PAY_PER_REQUEST
    AttributeDefinitions:
      - AttributeName: uid
        AttributeType: S
      - AttributeName: account
        AttributeType: S
      - AttributeName: provider
        AttributeType: S
      - AttributeName: identifier
        AttributeType: S
      - AttributeName: from
        AttributeType: N
      - AttributeName: to
        AttributeType: N
      - AttributeName: _ac
        AttributeType: N
      - AttributeName: _ap
        AttributeType: N
    KeySchema:
      - AttributeName: uid
        KeyType: HASH
    GlobalSecondaryIndexes:
      - IndexName: idxConversation
        KeySchema:
          - AttributeName: from:to
            KeyType: HASH
          - AttributeName: _ac
            KeyType: RANGE
        Projection:
          ProjectionType: KEYS_ONLY
      - IndexName: idxProviderMessage
        KeySchema:
          - AttributeName: provider:identifier
            KeyType: HASH
          - AttributeName: _ac
            KeyType: RANGE
        Projection:
          ProjectionType: KEYS_ONLY
That's not the way DDB works...
With
from: "sender@myco.com"
to: "receiver@otherco.com"
you'd want to have another attribute in the record:
gsiHash: "sender@myco.com#receiver@otherco.com"
That's the attribute that you'd specify as the GSI hash key.
Note that in order to access the data via this GSI, you'd need to know both from and to.
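A minimal Python sketch of composing that attribute at write time (the item values and the `gsiHash` name are illustrative, matching the example above):

```python
def make_gsi_hash(from_addr: str, to_addr: str, sep: str = "#") -> str:
    # Concatenate both source attributes into the single string that the
    # GSI will use as its hash (partition) key.
    return f"{from_addr}{sep}{to_addr}"

# Written alongside the rest of the item at put time:
item = {
    "uid": "msg-001",  # hypothetical id
    "from": "sender@myco.com",
    "to": "receiver@otherco.com",
}
item["gsiHash"] = make_gsi_hash(item["from"], item["to"])
# item["gsiHash"] is now "sender@myco.com#receiver@otherco.com"
```

Because the key is an opaque concatenation, querying this GSI always requires knowing every component that went into it.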
In your case, you may want to take a cue from the Overloading Global Secondary Indexes page of the DDB docs.
Instead of writing a single record, you'd write multiple records to the table:
id (S), key type: hash
data (S), key type: sort
gsi-sk (S)
The records would look like:
id:"<uid>", data:"PRIMARY", gsi-sk:"<?>" // "primary" record
id:"<uid>", data:"FROM", gsi-sk:"sender@myco.com"
id:"<uid>", data:"TO", gsi-sk:"receiver@otherco.com"
id:"<uid>", data:"FROMTO", gsi-sk:"sender@myco.com#receiver@otherco.com"
id:"<uid>", data:"PROVIDER", gsi-sk:"whateverid"
<etc>
Now you create a GSI with data as the hash key, and gsi-sk as the sort key.
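In CloudFormation terms, the overloaded table and its single GSI would look roughly like this (resource and index names are illustrative):

```yaml
MessagesTable:
  Type: AWS::DynamoDB::Table
  Properties:
    TableName: messages
    BillingMode: PAY_PER_REQUEST
    AttributeDefinitions:
      - AttributeName: id
        AttributeType: S
      - AttributeName: data
        AttributeType: S
      - AttributeName: gsi-sk
        AttributeType: S
    KeySchema:
      - AttributeName: id
        KeyType: HASH
      - AttributeName: data
        KeyType: RANGE
    GlobalSecondaryIndexes:
      - IndexName: idxOverloaded
        KeySchema:
          - AttributeName: data
            KeyType: HASH
          - AttributeName: gsi-sk
            KeyType: RANGE
        Projection:
          ProjectionType: KEYS_ONLY
```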
Expanding on my comment:
Alternatively, you might expand what you put into "data":
id:"<uid>", data:"PRIMARY", gsi-sk:"<?>" // "primary" record
id:"<uid>", data:"FROM#sender@myco.com", gsi-sk:"TO#receiver@otherco.com"
id:"<uid>", data:"TO#receiver@otherco.com", gsi-sk:"FROM#sender@myco.com"
id:"<uid>", data:"PROVIDER#<whateverid>", gsi-sk:"IDENTIFIER#<someid>"
<etc>
How much of the data you leave in the primary record depends on your access requirements: do you want to be able to get everything with a GetItem(hk=<uid>, sk="PRIMARY"), or is a Query(hk=<uid>) acceptable?
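A Python sketch of how an application might fan one logical message out into that item collection (the field values and placeholder for the primary record's sort key are hypothetical):

```python
def message_records(uid: str, from_addr: str, to_addr: str,
                    provider: str, identifier: str) -> list:
    # One logical message becomes several physical items sharing the same
    # partition key ("id"); a single GSI (hash=data, range=gsi-sk) then
    # serves the different lookup patterns.
    return [
        {"id": uid, "data": "PRIMARY", "gsi-sk": "-"},  # full payload lives here
        {"id": uid, "data": f"FROM#{from_addr}", "gsi-sk": f"TO#{to_addr}"},
        {"id": uid, "data": f"TO#{to_addr}", "gsi-sk": f"FROM#{from_addr}"},
        {"id": uid, "data": f"PROVIDER#{provider}", "gsi-sk": f"IDENTIFIER#{identifier}"},
    ]

records = message_records("msg-001", "sender@myco.com",
                          "receiver@otherco.com", "twilio", "SM123")
```

Each dict would be written as its own item (e.g. in a batch write), and the GSI lets you query by sender, receiver, or provider without extra indexes.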

DeletionPolicy retain causing problems

I am trying to import an existing DynamoDB table into my CloudFormation YAML with the following code:
UserPreferencesDDBTable:
  Type: AWS::DynamoDB::Table
  DeletionPolicy: Retain
  Properties:
    AttributeDefinitions:
      - AttributeName: id
        AttributeType: S
      - AttributeName: subid
        AttributeType: S
    KeySchema:
      - AttributeName: id
        KeyType: HASH
      - AttributeName: subid
        KeyType: RANGE
    SSESpecification:
      SSEEnabled: true
    TableName: 'test-user-preferences'
In the CloudFormation stack events I see that it's trying to create the table, and it fails because a resource with this name already exists.
What am I doing wrong here?

Unsupported property 'AttributeType' in CloudFormation dynamo db deployment

I've been trying to deploy a DynamoDB table using a CloudFormation template, and I keep getting an error that the property AttributeType does not exist.
The YAML definition looks like:
MyDynoDB:
  Type: AWS::DynamoDB::Table
  Properties:
    TableName: 'MyDynamoDb'
    AttributeDefinitions:
      - AttributeName: 'Id'
        AttributeType: 'S'
      - AttributeName: 'Name'
        AttributeType: 'S'
    KeySchema:
      - AttributeName: 'Id'
        KeyType: HASH
      - AttributeName: 'Name'
        KeyType: 'S'
    ProvisionedThroughput:
      ReadCapacityUnits: 5
      WriteCapacityUnits: 5
    StreamSpecification:
      StreamViewType: NEW_AND_OLD_IMAGES
The stack will deploy but goes into rollback mode with the following error:
CREATE_FAILED AWS::DynamoDB::Table MyDynoDB Encountered unsupported property AttributeType
Why am I seeing this error?
UPDATE
Based on the comments, I've updated the Attribute and KeySchema definitions to:
AttributeDefinitions:
  - AttributeName: Id
    AttributeType: S
  - AttributeName: Name
    AttributeType: S
KeySchema:
  - AttributeName: Id
    KeyType: HASH
  - AttributeName: Name
    KeyType: RANGE
Unfortunately, I am still seeing the same error.
Found the issue. When making changes to the template, such as changing the KeyType from 'S' to RANGE or other property edits, the compiled YAML file was not being updated. It was as if the CLI was not detecting the change and therefore would not update/overwrite the file with the new changes when packaging.
To correct this, I deleted the build folder where the packages were being saved and re-deployed; the deployment was successful.

Creating a DynamoDB table in Cloudformation fails

I'm getting the following error:
Property AttributeDefinitions is inconsistent with the KeySchema of the table and the secondary indexes
But I'm not sure what's wrong here.
FeedbackTable:
  Type: "AWS::DynamoDB::Table"
  Properties:
    AttributeDefinitions:
      - AttributeName: "uuid"
        AttributeType: "S"
      - AttributeName: "timestamp"
        AttributeType: "N"
      - AttributeName: "pros"
        AttributeType: "S"
      - AttributeName: "cons"
        AttributeType: "S"
      - AttributeName: "comments"
        AttributeType: "S"
      - AttributeName: "options"
        AttributeType: "S"
      - AttributeName: "luaA"
        AttributeType: "S"
      - AttributeName: "luaB"
        AttributeType: "S"
      - AttributeName: "luaC"
        AttributeType: "S"
    KeySchema:
      - AttributeName: "uuid"
        KeyType: "HASH"
      - AttributeName: "timestamp"
        KeyType: "RANGE"
    ProvisionedThroughput:
      ReadCapacityUnits: "1"
      WriteCapacityUnits: "1"
    TableName: "BD_Feedback"
You do not need to specify all attributes of the DynamoDB table here. What CloudFormation requires are definitions for key and index attributes only.
So if you reduce your AttributeDefinitions to uuid and timestamp, it should be fine (as long as you have no secondary indexes).
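Concretely, the trimmed key-related sections of the template above would be just:

```yaml
AttributeDefinitions:
  - AttributeName: "uuid"
    AttributeType: "S"
  - AttributeName: "timestamp"
    AttributeType: "N"
KeySchema:
  - AttributeName: "uuid"
    KeyType: "HASH"
  - AttributeName: "timestamp"
    KeyType: "RANGE"
```

The remaining attributes (pros, cons, comments, etc.) can still be written on items; DynamoDB is schemaless for non-key attributes.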
Here the section from the CloudFormation docs on this topic:
A list of attributes that describe the key schema for the table and
indexes. Duplicates are allowed.
CloudFormation docs
