I'm trying to search a table based on two attributes. However, I'm getting the following error:
ValidationException: Query condition missed key schema element
What I'm doing:
let value = 'dsadsada';
Cache.query({'registration': { 'eq': value}, 'derivativeId': { 'eq': null}}).exec();
Expected result:
Return a list of documents where the registration matches the value but the derivativeId is null.
I'm using dynamoose, so first of all this is the model:
import {model} from "dynamoose";
import {Document} from 'dynamoose/dist/Document';
import {Schema} from 'dynamoose/dist/Schema';
class Cache extends Document {
id: string;
createdAt: Date;
updatedAt: Date;
}
const schema = {
types: {
id: {
type: String,
required: true
},
registration: {
type: String,
index: {
name: 'VehicleLookupIndex',
global: true
}
},
derivativeId: {
type: String,
index: {
name: 'VehicleLookupIndex'
}
}
},
options: {
timestamps: true,
saveUnknown: ['**'],
expires: 10
},
};
model.defaults.set({
create: false,
prefix: process.env.TABLE_PREFIX
});
export default model<Cache>('cache', new Schema(schema.types, schema.options));
This is my serverless file where I define the table:
cache:
Type: AWS::DynamoDB::Table
Properties:
TableName: ${self:provider.stage}-${self:service}-cache
AttributeDefinitions:
- AttributeName: id
AttributeType: S
KeySchema:
- AttributeName: id
KeyType: HASH
BillingMode: PAY_PER_REQUEST
GlobalSecondaryIndexes:
- IndexName: 'VehicleLookupIndex'
KeySchema:
- AttributeName: 'id'
KeyType: 'HASH'
Projection:
NonKeyAttributes:
- 'derivativeId'
- 'registration'
ProjectionType: 'INCLUDE'
Any ideas what I'm doing wrong?
I'm new to DynamoDB and the Serverless Framework. I am trying to create a simple todo app, with todoId as the primary key and userId as a secondary index. This is my definition of the table in serverless.yaml, but when I try to get the user's todo list, I get the above error.
resources:
Resources:
GroupsDynamoDBTable:
Type: AWS::DynamoDB::Table
Properties:
AttributeDefinitions:
- AttributeName: todoId
AttributeType: S
- AttributeName: userId
AttributeType: S
KeySchema:
- AttributeName: todoId
KeyType: HASH
BillingMode: PAY_PER_REQUEST
TableName: ${self:provider.environment.TODOLIST_TABLE}
GlobalSecondaryIndexes:
- IndexName: ${self:provider.environment.USER_ID_INDEX}
KeySchema:
- AttributeName: userId
KeyType: HASH
Projection:
ProjectionType: ALL
The query:
const result = await docClient
.query({
TableName: toDoListTable,
KeyConditionExpression: 'userId = :userId',
ExpressionAttributeValues: {
':userId': 5
},
ScanIndexForward: false
})
.promise()
Since you are querying a global secondary index, you must specify the name of the index you want to use via IndexName. You can also use ProjectionExpression to limit the attributes returned in the results.
The result should be this:
const result = await docClient
.query({
TableName: toDoListTable,
IndexName: "UserIdIndex", // the name specified here: self:provider.environment.USER_ID_INDEX
KeyConditionExpression: "userId = :userId",
ExpressionAttributeValues: {
":userId": "5" // userId is defined with AttributeType S, so the value must be a string
},
ProjectionExpression: "todoId, userId",
ScanIndexForward: false
})
.promise()
I am using a microservice folder structure:
Microservice/
    resolvers/
        Media/Image/get-images/request.vtl
        Media/Image/get-images/response.vtl
    templates/
        services.yaml
Request mapping:
#set($imageIds=$ctx.source.imageIds)
#set($keys=[])
#foreach($imageId in $imageIds)
#set($key={})
$util.qr($key.put("id", $util.dynamodb.toString($imageId)))
$util.qr($keys.add($key))
#end
{
"version": "2018-05-29",
"operation": "BatchGetItem",
"tables" : {
"MediaImages": {
"keys": $util.toJson($keys)
}
}
}
Response mapping:
#set($result=$ctx.result.data.MediaImages)
$util.toJson($result)
services.yaml:
Resources:
MediaImagesTable:
Type: AWS::DynamoDB::Table
Properties:
TableName: MediaImages
AttributeDefinitions:
- AttributeName: id
AttributeType: S
- AttributeName: userId
AttributeType: S
KeySchema:
- AttributeName: id
KeyType: HASH
ProvisionedThroughput:
ReadCapacityUnits: 5
WriteCapacityUnits: 5
GlobalSecondaryIndexes:
- IndexName: UserImages
KeySchema:
- AttributeName: userId
KeyType: HASH
- AttributeName: id
KeyType: RANGE
Projection:
ProjectionType: ALL
ProvisionedThroughput:
ReadCapacityUnits: 5
WriteCapacityUnits: 2
ImageDetailsDataSource:
Type: AWS::AppSync::DataSource
Properties:
Name: ImageDetailsDataSource
Type: AMAZON_DYNAMODB
ServiceRoleArn:
Fn::ImportValue: !Sub "DynamoDB-Role"
ApiId:
Fn::ImportValue: !Sub "API-Id"
DynamoDBConfig:
TableName: !Ref MediaImagesTable
AwsRegion: !Ref AWS::Region
UseCallerCredentials: false
GetImagesPipelineFunction:
Type: AWS::AppSync::FunctionConfiguration
Properties:
ApiId:
Fn::ImportValue: !Sub "API-Id"
Name: GetImagesPipelineFunction
FunctionVersion: "2018-05-29"
Description: Function to get the images from dynamo db
DataSourceName: !GetAtt ImageDetailsDataSource.Name
RequestMappingTemplateS3Location: ../resolvers/get-images/request.vtl
ResponseMappingTemplateS3Location: ../resolvers/get-images/response.vtl
I have tried:
#set($tableName=$util.dynamodb.getDataSourceTableName())
#set($imageIds=$ctx.source.imageIds)
#set($keys=[])
#foreach($imageId in $imageIds)
#set($key={})
$util.qr($key.put("id", $util.dynamodb.toString($imageId)))
$util.qr($keys.add($key))
#end
{
"version": "2018-05-29",
"operation": "BatchGetItem",
"tables" : {
"$tableName": {
"keys": $util.toJson($keys)
}
}
}
but it fails with:
"error": {
"message": "1 validation error detected: Value '{$tableName= .
[com.amazonaws.dynamodb.v20120810.WriteRequest#1528275d]}' at
'requestItems' failed to satisfy constraint: Map keys must satisfy
constraint: [Member must have length less than or equal to 255, Member
must have length greater than or equal to 3, Member must satisfy regular
expression pattern: [a-zA-Z0-9_.-]+] (Service: AmazonDynamoDBv2; Status
Code: 400; Error Code: ValidationException; Request ID:
464H3LIEPOSA2S8OI34RJ31QLNVV4KQNSO5AEMVJF66Q9ASUAAJG)",
"type": "DynamoDB:AmazonDynamoDBException"
},
In my AppSync request mapping template, I am doing a BatchGetItem and hardcoding the table name. I want to resolve the table name dynamically in my request and response mapping templates. I tried $util.dynamodb.getDataSourceTableName($dataSourceName) from the Mapping Template Utility Reference, but it didn't work.
I do it this way:
RequestMappingTemplate: !Sub
- |
#set($imageIds=$ctx.source.imageIds)
#set($keys=[])
#foreach($imageId in $imageIds)
#set($key={})
$util.qr($key.put("id", $util.dynamodb.toString($imageId)))
$util.qr($keys.add($key))
#end
{
"version": "2018-05-29",
"operation": "BatchGetItem",
"tables" : {
"${TableName}": {
"keys": $util.toJson($keys)
}
}
}
- { TableName: INSERT YOUR TABLE NAME OR SOME REF HERE }
ResponseMappingTemplate: !Sub
- |
#set($result=$ctx.result.data["${TableName}"])
$util.toJson($result)
- { TableName: INSERT YOUR TABLE NAME OR SOME REF HERE }
I use !FindInMap [Environments, !Ref Environment, ProjectsTableName], but you could also use !Ref YourDynamoDBTable in place of INSERT YOUR TABLE NAME OR SOME REF HERE.
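For instance, with the MediaImagesTable resource defined in the question's services.yaml, the response template substitution might look like this (a sketch; only the logical resource ID is assumed from above):

```yaml
# !Ref on an AWS::DynamoDB::Table returns the table name, so the VTL
# template never hardcodes it.
ResponseMappingTemplate: !Sub
  - |
    #set($result=$ctx.result.data["${TableName}"])
    $util.toJson($result)
  - { TableName: !Ref MediaImagesTable }
```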
I'm using DocumentClient for the query, with the Serverless Framework and DynamoDB.
I'm trying to query with begins_with without providing any primary key.
Here is what my data looks like:
[
{
id: 1,
some_string: "77281829121"
},
{
id: 2,
some_string: "7712162hgvh"
},
{
id: 3,
some_string: "7212121"
}
]
Here is my serverless.yml (i.e. the table config):
Resources:
IPRecord:
Type: 'AWS::DynamoDB::Table'
Properties:
TableName: ${file(./serverless.js):Tables.IPRecord.name}
BillingMode: PAY_PER_REQUEST
AttributeDefinitions:
- AttributeName: 'id'
AttributeType: 'S'
- AttributeName: 'some_string'
AttributeType: 'S'
KeySchema:
- AttributeName: 'id'
KeyType: 'HASH'
GlobalSecondaryIndexes:
- IndexName: ${file(./serverless.js):Tables.IPRecord.index.ID}
KeySchema:
# ...some more index goes here
- AttributeName: 'some_string'
KeyType: 'RANGE'
Projection:
ProjectionType: 'ALL'
Q:
Using DocumentClient, I want to query by the first few characters of some_string and get back all matching documents. In this case, I want to query {some_string: "77"} and have it return
[{
id: 1,
some_string: "77281829121"
},
{
id: 2,
some_string: "7712162hgvh"
}]
Currently my query looks like this (it gives the error; I'm running it in the local DynamoDB JS shell):
var params = {
TableName: '<TABLE_NAME>',
IndexName: '<INDEX_NAME>',
KeyConditionExpression: 'begins_with(some_string,:value)',
ExpressionAttributeValues: {
':value': '77'
}
};
docClient.query(params, function(err, data) {
if (err) ppJson(err);
else ppJson(data);
});
It seems the query above needs a partition key, which in my case is id; but if I pass that, it will match only a single document.
Here is what I have achieved so far:
var params = {
TableName: '<TABLE_NAME>',
FilterExpression: 'begins_with(some_string,:value)',
ExpressionAttributeValues: {
':value': '77'
},
Select:'COUNT' //as i only required COUNT
};
docClient.scan(params, function(err, data) {
if (err) ppJson(err);
else ppJson(data);
});
The scan above does what I want, but any better approach or solution is always welcome.
If the number of characters in your begins_with query is always going to be random, I don't see a way to solve this with DynamoDB.
But let's say there will always be at least 3 characters; then you can do the following.
Update your DynamoDB schema to:
IPRecord:
Type: 'AWS::DynamoDB::Table'
Properties:
TableName: ${file(./serverless.js):Tables.IPRecord.name}
BillingMode: PAY_PER_REQUEST
AttributeDefinitions:
- AttributeName: 'id'
AttributeType: 'S'
- AttributeName: 'some_string'
AttributeType: 'S'
KeySchema:
- AttributeName: 'id'
KeyType: 'HASH'
- AttributeName: 'some_string'
KeyType: 'RANGE'
And instead of storing
[
{
id: 1,
some_string: "77281829121"
},
{
id: 2,
some_string: "7712162hgvh"
},
{
id: 3,
some_string: "7212121"
}
]
store as
[
{
id: 772,
uniqueid:1,
some_string: "77281829121"
},
{
id: 771,
uniqueid:2,
some_string: "7712162hgvh"
},
{
id: 721,
uniqueid:3,
some_string: "7212121"
}
]
Where id is always the first 3 characters of the original some_string.
Now let's say you have to query all items that start with abcx; you can do
select * where id=abc and some_string begins_with abcx
But you should try to have more characters in id so that the load is distributed evenly: with only 2 alphanumeric characters, 36*36 ids are possible; with 3 characters, 36*36*36.
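The scheme above can be sketched with the DocumentClient query parameters. This is a hypothetical helper (the table name IPRecord and attribute some_string come from the question; the 3-character shard length is the assumption stated above):

```javascript
// Derive the partition key from the first 3 characters of the search
// prefix, then narrow further with begins_with on the sort key. This turns
// the full-table Scan into a Query against a single partition.
function buildPrefixQuery(prefix) {
  if (prefix.length < 3) {
    throw new Error("need at least 3 characters to derive the partition key");
  }
  return {
    TableName: "IPRecord",
    KeyConditionExpression: "id = :shard AND begins_with(some_string, :prefix)",
    ExpressionAttributeValues: {
      ":shard": prefix.slice(0, 3), // e.g. "772" for "7728..."
      ":prefix": prefix
    }
  };
}

const params = buildPrefixQuery("7728");
// docClient.query(params, callback) would now be a real Query, not a Scan.
console.log(params.ExpressionAttributeValues[":shard"]); // "772"
```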
I have created a table in the DynamoDB JS shell with stu_id as the hash key attribute.
var params = {
TableName: 'student_test',
KeySchema: [
{ // Required HASH type attribute
AttributeName: 'stu_id',
KeyType: 'HASH',
},
],
AttributeDefinitions: [
{
AttributeName: 'stu_id',
AttributeType: 'N', },
],
ProvisionedThroughput: { // required provisioned throughput for the table
ReadCapacityUnits: 1,
WriteCapacityUnits: 1,
},
};
dynamodb.createTable(params, function(err, data) {
if (err) ppJson(err); // an error occurred
else ppJson(data); // successful response
});
Now I want to add stu_name and school columns and insert values. I tried defining the column names in GlobalSecondaryIndexes, but it didn't work.
I am using DynamoDB Local.
Here is the code to update the table with the new GSI.
In the code below, I have defined a GSI named student_test_name_gsi with stu_name as its hash key and stu_id as its range key:
var params = {
TableName: 'student_test',
AttributeDefinitions: [
{
AttributeName: 'stu_id',
AttributeType: 'N', },
{
AttributeName: 'stu_name',
AttributeType: 'S', },
],
GlobalSecondaryIndexUpdates: [
{
Create: {
IndexName: 'student_test_name_gsi', /* required */
KeySchema: [ /* required */
{
AttributeName: 'stu_name', /* required */
KeyType: 'HASH' /* required */
},
{ // Optional RANGE key type for HASH + RANGE secondary indexes
AttributeName: 'stu_id',
KeyType: 'RANGE',
}
/* more items */
],
Projection: { /* required */
ProjectionType: 'ALL'
},
ProvisionedThroughput: { /* required */
ReadCapacityUnits: 20, /* required */
WriteCapacityUnits: 20 /* required */
}
},
},
/* more items */
],
};
dynamodb.updateTable(params, function(err, data) {
if (err) {
console.error("Unable to update table. Error JSON:", JSON.stringify(err, null, 2));
} else {
console.log("Updated table. Table description JSON:", JSON.stringify(data, null, 2));
}
});
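Note that inserting the new values needs no table change at all, because DynamoDB is schemaless outside the key schema. A minimal sketch with the DocumentClient (the item values here are made up):

```javascript
// stu_id is the table's hash key; stu_name is the new GSI's hash key;
// school is a plain attribute that never has to be declared anywhere.
const putParams = {
  TableName: "student_test",
  Item: {
    stu_id: 101,            // table hash key (AttributeType N, so a number)
    stu_name: "Alice",      // GSI hash key after the update above
    school: "Springfield High"
  }
};

// docClient.put(putParams, callback) against DynamoDB Local stores the item,
// and it becomes queryable by stu_name through student_test_name_gsi.
console.log(putParams.Item.school);
```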
I’m trying to create a simple table using DynamoDB JavaScript shell and I’m getting this exception:
{
"message": "The number of attributes in key schema must match the number of attributes defined in attribute definitions.",
"code": "ValidationException",
"time": "2015-06-16T10:24:23.319Z",
"statusCode": 400,
"retryable": false
}
Below is the table I’m trying to create:
var params = {
TableName: 'table_name',
KeySchema: [
{
AttributeName: 'hash_key_attribute_name',
KeyType: 'HASH'
}
],
AttributeDefinitions: [
{
AttributeName: 'hash_key_attribute_name',
AttributeType: 'S'
},
{
AttributeName: 'attribute_name_1',
AttributeType: 'S'
}
],
ProvisionedThroughput: {
ReadCapacityUnits: 1,
WriteCapacityUnits: 1
}
};
dynamodb.createTable(params, function(err, data) {
if (err) print(err);
else print(data);
});
However, if I add the second attribute to the KeySchema, it works fine. Below is the working table:
var params = {
TableName: 'table_name',
KeySchema: [
{
AttributeName: 'hash_key_attribute_name',
KeyType: 'HASH'
},
{
AttributeName: 'attribute_name_1',
KeyType: 'RANGE'
}
],
AttributeDefinitions: [
{
AttributeName: 'hash_key_attribute_name',
AttributeType: 'S'
},
{
AttributeName: 'attribute_name_1',
AttributeType: 'S'
}
],
ProvisionedThroughput: {
ReadCapacityUnits: 1,
WriteCapacityUnits: 1
}
};
dynamodb.createTable(params, function(err, data) {
if (err) print(err);
else print(data);
});
I don’t want to add the range key to the key schema. Any idea how to fix it?
TL;DR: Don't include any non-key attribute definitions in AttributeDefinitions.
DynamoDB is schemaless (except for the key schema)
That is to say, you do need to specify the key schema (attribute names and types) when you create the table, but you don't need to specify any non-key attributes. You can put an item with any attributes later (it must include the keys, of course).
From the documentation page, AttributeDefinitions is defined as:
An array of attributes that describe the key schema for the table and indexes.
When you create a table, the AttributeDefinitions field is used for the hash and/or range keys only. In your first case there is a hash key only (one attribute), while you provide two AttributeDefinitions. This is the root cause of the exception.
If you list a non-key attribute in AttributeDefinitions, you must use it as a key in an index; otherwise DynamoDB rejects the table definition.
So there is no need to put an attribute in AttributeDefinitions unless it is part of the primary key or an index key.
var params = {
TableName: 'table_name',
KeySchema: [ // The type of schema. Must start with a HASH type, with an optional second RANGE.
{ // Required HASH type attribute
AttributeName: 'UserId',
KeyType: 'HASH',
},
{ // Optional RANGE key type for HASH + RANGE tables
AttributeName: 'RemindTime',
KeyType: 'RANGE',
}
],
AttributeDefinitions: [ // The names and types of all primary and index key attributes only
{
AttributeName: 'UserId',
AttributeType: 'S', // (S | N | B) for string, number, binary
},
{
AttributeName: 'RemindTime',
AttributeType: 'S', // (S | N | B) for string, number, binary
},
{
AttributeName: 'AlarmId',
AttributeType: 'S', // (S | N | B) for string, number, binary
},
// ... more attributes ...
],
ProvisionedThroughput: { // required provisioned throughput for the table
ReadCapacityUnits: 1,
WriteCapacityUnits: 1,
},
LocalSecondaryIndexes: [ // optional (list of LocalSecondaryIndex)
{
IndexName: 'index_UserId_AlarmId',
KeySchema: [
{ // Required HASH type attribute - must match the table's HASH key attribute name
AttributeName: 'UserId',
KeyType: 'HASH',
},
{ // alternate RANGE key attribute for the secondary index
AttributeName: 'AlarmId',
KeyType: 'RANGE',
}
],
Projection: { // required
ProjectionType: 'ALL', // (ALL | KEYS_ONLY | INCLUDE)
},
},
// ... more local secondary indexes ...
],
};
dynamodb.createTable(params, function(err, data) {
if (err) ppJson(err); // an error occurred
else ppJson(data); // successful response
});
Declare attributes in AttributeDefinitions only if you are going to use them in KeySchema,
OR
when those attributes are going to be used in GlobalSecondaryIndexes or LocalSecondaryIndexes.
For anybody using YAML files:
Example 1:
Let's say you have 3 attributes: id, status, and createdAt. Here id is the only key attribute.
AuctionsTable:
Type: AWS::DynamoDB::Table
Properties:
TableName: AuctionsTable
BillingMode: PAY_PER_REQUEST
AttributeDefinitions:
- AttributeName: id
AttributeType: S
KeySchema:
- AttributeName: id
KeyType: HASH
Example 2:
For a table whose attributes include id, status, and endingAt, if you also have GlobalSecondaryIndexes or LocalSecondaryIndexes, then your YAML file looks like:
AuctionsTable:
Type: AWS::DynamoDB::Table
Properties:
TableName: AuctionsTable-${self:provider.stage}
BillingMode: PAY_PER_REQUEST
AttributeDefinitions:
- AttributeName: id
AttributeType: S
- AttributeName: status
AttributeType: S
- AttributeName: endingAt
AttributeType: S
KeySchema:
- AttributeName: id
KeyType: HASH
GlobalSecondaryIndexes:
- IndexName: statusAndEndDate
KeySchema:
- AttributeName: status
KeyType: HASH
- AttributeName: endingAt
KeyType: RANGE
Projection:
ProjectionType: ALL
We have included status and endingAt in AttributeDefinitions only because we have a GlobalSecondaryIndex that uses those attributes.
Reason: DynamoDB only cares about attributes that are part of the primary key, a GlobalSecondaryIndex, or a LocalSecondaryIndex. You don't need to declare any attribute outside that trio, and DynamoDB doesn't care what other attributes an item has.
I also had this problem and I'll post here what went wrong for me in case it helps someone else.
In my CreateTableRequest, I had an empty array for the GlobalSecondaryIndexes.
CreateTableRequest createTableRequest = new CreateTableRequest
{
TableName = TableName,
ProvisionedThroughput = new ProvisionedThroughput { ReadCapacityUnits = 2, WriteCapacityUnits = 2 },
KeySchema = new List<KeySchemaElement>
{
new KeySchemaElement
{
AttributeName = "Field1",
KeyType = KeyType.HASH
},
new KeySchemaElement
{
AttributeName = "Field2",
KeyType = KeyType.RANGE
}
},
AttributeDefinitions = new List<AttributeDefinition>()
{
new AttributeDefinition
{
AttributeName = "Field1",
AttributeType = ScalarAttributeType.S
},
new AttributeDefinition
{
AttributeName = "Field2",
AttributeType = ScalarAttributeType.S
}
},
//GlobalSecondaryIndexes = new List<GlobalSecondaryIndex>
//{
//}
};
Commenting out these lines in the table creation solved my problem, so I guess the list has to be null, not empty.
Do not include all the attributes in the --attribute-definitions and --key-schema; only include the HASH and RANGE keys when creating the table.
When you insert an item into DynamoDB, it will accept other attributes too that were not defined in the schema.
For example, creating the table:
aws dynamodb create-table \
--table-name Orders \
--attribute-definitions \
AttributeName=id,AttributeType=S \
AttributeName=sid,AttributeType=S \
--key-schema \
AttributeName=id,KeyType=HASH \
AttributeName=sid,KeyType=RANGE \
--provisioned-throughput \
ReadCapacityUnits=5,WriteCapacityUnits=5 \
--endpoint-url=http://localhost:4566
Now you can insert an item containing other attributes too; only id and sid have to be present in the item.
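As a sketch with the DocumentClient (the item values are made up), only the two declared keys are mandatory and everything else rides along:

```javascript
// id (HASH) and sid (RANGE) are the only attributes declared at table
// creation; status and total are accepted on write without any schema change.
const params = {
  TableName: "Orders",
  Item: {
    id: "order-123",   // required: partition key
    sid: "store-9",    // required: sort key
    status: "shipped", // undeclared attribute
    total: 42.5        // undeclared attribute
  }
};

// new AWS.DynamoDB.DocumentClient({ endpoint: "http://localhost:4566" })
//   .put(params).promise() would persist it against the local endpoint above.
console.log(Object.keys(params.Item).join(",")); // id,sid,status,total
```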