Get State By LinearId in Corda version 3.1

I am trying to get a state by linearId but I am getting an empty list.
My code:
val queryCriteria = QueryCriteria.LinearStateQueryCriteria(uuid = listOf(linearId.id))
val customerAuditor = serviceHub.vaultService.queryBy<CustomerAuditor>(queryCriteria).states.single()
Error:
java.util.NoSuchElementException: List is empty.
at kotlin.collections.CollectionsKt___CollectionsKt.single(_Collections.kt:472) ~[kotlin-stdlib-1.1.60.jar:1.1.60-release-55 (1.1.60)]
at com.hsbc.auditorFlow.updateCustomerInit$Initiator.call(UpdateCustomerAuditorInit.kt:59) ~[cordapp-customer-0.1.jar:?]
at com.hsbc.auditorFlow.updateCustomerInit$Initiator.call(UpdateCustomerAuditorInit.kt:31) ~[cordapp-customer-0.1.jar:?]
at net.corda.node.services.statemachine.FlowStateMachineImpl.run(FlowStateMachineImpl.kt:96) [corda-node-3.1-corda.jar:?]
at net.corda.node.services.statemachine.FlowStateMachineImpl.run(FlowStateMachineImpl.kt:44) [corda-node-3.1-corda.jar:?]
at co.paralleluniverse.fibers.Fiber.run1(Fiber.java:1092) [quasar-core-0.7.9-jdk8.jar:0.7.9]
at co.paralleluniverse.fibers.Fiber.exec(Fiber.java:788) [quasar-core-0.7.9-jdk8.jar:0.7.9]
at co.paralleluniverse.fibers.RunnableFiberTask.doExec(RunnableFiberTask.java:100) [quasar-core-0.7.9-jdk8.jar:0.7.9]
at co.paralleluniverse.fibers.RunnableFiberTask.run(RunnableFiberTask.java:91) [quasar-core-0.7.9-jdk8.jar:0.7.9]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_144]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_144]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_144]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_144]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_144]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_144]
at net.corda.node.utilities.AffinityExecutor$ServiceAffinityExecutor$1$thread$1.run(AffinityExecutor.kt:62) [corda-node-3.1-corda.jar:?]
When I query for all states, the state is present:
val generalCriteria = QueryCriteria.VaultQueryCriteria(Vault.StateStatus.UNCONSUMED)
var pageNumber = DEFAULT_PAGE_NUM
val states = mutableListOf<StateAndRef<ContractState>>()
do {
val pageSpec = PageSpecification(pageSize = 200, pageNumber = pageNumber)
val results = services.vaultQueryByWithPagingSpec(CustomerAuditor::class.java,generalCriteria, pageSpec)
states.addAll(results.states);
pageNumber++
} while ((pageSpec.pageSize * pageSpec.pageNumber) <= results.totalStatesAvailable)
return states;
Output of this:
[
{
"state": {
"data": {
"linearId": {
"externalId": null,
"id": "d7b8331d-6a1a-408a-aff7-5c823e91c6e3"
}
},
"contract": "com.hsbc.contract.PSContract",
"encumbrance": null,
"constraint": {}
},
"ref": {
"txhash": {
"bytes": "FbMJYu2K1lrqHLK4rkhOogn5r/u7iAa26AobmrtWDRY=",
"size": 32,
"offset": 0
},
"index": 0
}
}
]
(I have removed the participants and notary fields as they were very large; I had 3 participants.)

I always use this:
inline fun <reified T : LinearState> getLinearStateById(
linearId: UniqueIdentifier,
services: ServiceHub
): StateAndRef<T>? {
val query = QueryCriteria.LinearStateQueryCriteria(linearId = listOf(linearId))
return services.vaultService.queryBy<T>(query).states.singleOrNull()
}

I faced this too. Check whether the node running the query is present in the participants list of the unconsumed state you are trying to access. If it is not a participant, the state is not recorded in that node's vault, so the query will always return an empty list. Hope this helps :)

In Java I did it this way:
@nithesh / @jayant, let me know if it works.
UniqueIdentifier linearId = UniqueIdentifier.Companion.fromString(linearIdAsString);
QueryCriteria queryCriteria = new QueryCriteria.LinearStateQueryCriteria(
null, // List of Parties for the state
ImmutableList.of(linearId),
Vault.StateStatus.UNCONSUMED,
null // Set of State types
);
List<StateAndRef<YourState>> stateAndRefs = getServiceHub().getVaultService().queryBy(YourState.class, queryCriteria).getStates();
YourState yourState = stateAndRefs.get(0).getState().getData();

Related

How to serialize JSON with Newtonsoft from a recurring DataTable value in ASP.NET Core

I've been trying to serialize a DataSet to JSON so that the grouping value I get from my dataset becomes a "key" in my JSON array, with all the related values nested under that key.
I'm new to this and I haven't found any other post that helps with this, so hopefully one of you knows the answer.
The code that I'm using to try to achieve this is as follows:
string query = #"
select Country.Name, City.Name from world.country
inner join world.city on world.country.Code = world.city.CountryCode
where CountryCode = #CountryCode";
DataSet dataSet = new DataSet("dataSet");
dataSet.Namespace = "NetFrameWork";
DataTable table = new DataTable();
dataSet.Tables.Add(table);
string sqlDataSource = _configuration.GetConnectionString("WorldAppConnection");
MySqlDataReader myReader;
using (MySqlConnection mySqlConnection = new MySqlConnection(sqlDataSource))
{
mySqlConnection.Open();
using (MySqlCommand mySqlCommand = new MySqlCommand(query, mySqlConnection))
{
mySqlCommand.Parameters.AddWithValue("#CountryCode", CountryCode);
myReader = mySqlCommand.ExecuteReader();
table.Load(myReader);
myReader.Close();
mySqlConnection.Close();
}
}
dataSet.AcceptChanges();
string json = JsonConvert.SerializeObject(dataSet, Formatting.Indented);
return json;
}
Now, if I run this query directly in SQL, a small snippet of the result is:
France | Paris
France | Marseille
France | Lyon
France | Toulouse
...
What this code produces as JSON is:
{
"Table1": [
{
"Name": "France",
"Name1": "Paris"
},
{
"Name": "France",
"Name1": "Marseille"
},
{
"Name": "France",
"Name1": "Lyon"
},
{
"Name": "France",
"Name1": "Toulouse"
},
{
"Name": "France",
"Name1": "Nice"
},
{
"Name": "France",
"Name1": "Nantes"
},
{
"Name": "France",
"Name1": "Strasbourg"
}
]
}
As you can see, it keeps repeating "France" in the Name field. However, the result I would like to have is something like:
{
"Table1": [
{
"France":[
{
"Name1": "Paris"
},
{
"Name1": "Marseille"
},
{
"Name1": "Lyon"
},
{
"Name1": "Toulouse"
},
{
"Name1": "Nice"
},
{
"Name1": "Nantes"
},
{
"Name1": "Strasbourg"
}
]
}
]
}
Is there any way to achieve this?
Thanks in advance.
Before serializing the object, you can use LINQ to DataSet to group the result by the Name property; refer to the following sample code:
Create a CountryViewModel class:
public class CountryViewModel
{
public string Name1 { get; set; }
}
Then use the following code (my application's namespace is "WebApplication1"; change it to yours):
//group the country by the name,
var linqresult = dataSet.Tables[0].AsEnumerable()
.GroupBy(c => c.Field<string>("Name"))
.Select(c => new Dictionary<string, List<WebApplication1.Models.CountryViewModel>> {
{
(string)c.Key,
c.Select(i => new WebApplication1.Models.CountryViewModel() { Name1 = (string)i["Name1"] }).ToList()
}}).ToList();
var newdic = new Dictionary<string, List<Dictionary<string, List<WebApplication1.Models.CountryViewModel>>>>();
newdic.Add("Table1", linqresult);
var linqjson = JsonConvert.SerializeObject(newdic, Formatting.Indented);
The result is like the desired output shown above.
Try this:
dataSet.AcceptChanges();
//or this
string json = JsonConvert.SerializeObject(dataSet, Formatting.Indented);
var jsonParsed = JObject.Parse(json);
//or better
var jsonParsed = JObject.FromObject(dataSet);
var result = new
{
Table1 = jsonParsed["Table1"].GroupBy(x => x["Name"])
.Select(x => new Dictionary<string, string[]> { { (string)x.Key,
x.Select(i => (string)i["Name1"]).ToArray() }})
};
Result:
{
"Table1": [
{
"France": [
"Paris",
"Marseille",
"Lyon",
"Toulouse",
"Nice",
"Nantes",
"Strasbourg"
]
}
]
}

InvalidParameterType: Expected params.ExpressionAttributeValues[':et1'].N to be a string

Here is the code I'm using to make queries:
var scanParams = {
TableName : 'xxxx',
FilterExpression : '( (event = :e0) AND (event = :e1 AND eventTime > :et1 AND eventTime < :et2) )',
ExpressionAttributeValues: {
':e0': { S: 'ME 21' },
':e1': { S: 'ME 21' },
':et1': { N: 1509267218 },
':et2': { N: 1509353618 }
},
ProjectionExpression: "event, customer_id, visitor",
};
In the configuration of the respective DynamoDB table, it seems I've set Number as the type for the eventTime column.
Here is the error:
error happened { MultipleValidationErrors: There were 2 validation errors:
* InvalidParameterType: Expected params.ExpressionAttributeValues[':et1'].N to be a string
* InvalidParameterType: Expected params.ExpressionAttributeValues[':et2'].N to be a string
at ParamValidator.validate (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/param_validator.js:40:28)
at Request.VALIDATE_PARAMETERS (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/event_listeners.js:125:42)
at Request.callListeners (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/sequential_executor.js:105:20)
at callNextListener (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/sequential_executor.js:95:12)
at /home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/event_listeners.js:85:9
at finish (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/config.js:315:7)
at /home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/config.js:333:9
at SharedIniFileCredentials.get (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/credentials.js:126:7)
at getAsyncCredentials (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/config.js:327:24)
at Config.getCredentials (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/config.js:347:9)
at Request.VALIDATE_CREDENTIALS (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/event_listeners.js:80:26)
at Request.callListeners (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/sequential_executor.js:101:18)
at Request.emit (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/sequential_executor.js:77:10)
at Request.emit (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/request.js:683:14)
at Request.transition (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/request.js:22:10)
at AcceptorStateMachine.runTo (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/state_machine.js:14:12)
at Request.runTo (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/request.js:403:15)
at /home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/request.js:791:12
at Request.promise (/home/jahidul/workspace/backstage-opticon/node_modules/aws-sdk/lib/request.js:777:12)
at DynamoDBService.scanItem (/home/jahidul/workspace/backstage-opticon/shared/services/dynamodb/dynamodb.service.ts:52:39)
at /home/jahidul/workspace/backstage-opticon/job-scripts/dyno-test.js:57:12
at sailsReady (/home/jahidul/workspace/backstage-opticon/node_modules/sails/lib/app/lift.js:49:12)
at /home/jahidul/workspace/backstage-opticon/node_modules/async/lib/async.js:251:17
at /home/jahidul/workspace/backstage-opticon/node_modules/async/lib/async.js:154:25
at /home/jahidul/workspace/backstage-opticon/node_modules/async/lib/async.js:248:21
at /home/jahidul/workspace/backstage-opticon/node_modules/async/lib/async.js:612:34
Any idea? Thanks in advance.
Use the parameters below:
var scanParams = {
TableName : 'xxxx',
FilterExpression : '( (event = :e0) AND (event = :e1 AND eventTime > :et1 AND eventTime < :et2) )',
ExpressionAttributeValues: {
':e0': { "S": "ME 21" },
':e1': { "S": "ME 21" },
':et1': { "N": "1509267218" },
':et2': { "N": "1509353618" }
},
ProjectionExpression: "event, customer_id, visitor",
};
In DynamoDB, the Number type is represented by "N", and the value must be supplied in string format, e.g. "1509353618". Hope this resolves your problem.
Actually, if you use the DocumentClient you don't need to include the data type for the et1 and et2 values; it automatically interprets native JavaScript numbers as NUMBER. The same applies to the e0 and e1 string values.
Please try the code below.
var scanParams = {
TableName : 'xxxx',
FilterExpression : '( (event = :e0) AND (event = :e1 AND eventTime > :et1 AND eventTime < :et2) )',
ExpressionAttributeValues: {
':e0': 'ME 21',
':e1': 'ME 21',
':et1': 1509267218,
':et2': 1509353618
},
ProjectionExpression: "event, customer_id, visitor",
};
Full tested code:
You may need to change the table name and key attribute names in the below code.
var AWS = require("aws-sdk");
var creds = new AWS.Credentials('akid', 'secret', 'session');
AWS.config.update({
region: "us-west-2",
endpoint: "http://localhost:8000",
credentials: creds
});
var docClient = new AWS.DynamoDB.DocumentClient();
var params = {
TableName: "table4",
FilterExpression: "userid = :user_id1 AND (userid = :user_id2 AND ts > :et1 AND ts < :et2)",
ExpressionAttributeValues: { ":user_id1": 'ME 21',
":user_id2": 'ME 21',
":et1" : '1509267216',
":et2" : 1509353618,
}
};
docClient.scan(params, onScan);
var count = 0;
function onScan(err, data) {
if (err) {
console.error("Unable to scan the table. Error JSON:", JSON.stringify(err, null, 2));
} else {
console.log("Scan succeeded.");
data.Items.forEach(function (itemdata) {
console.log("Item :", ++count, JSON.stringify(itemdata));
});
// continue scanning if we have more items
if (typeof data.LastEvaluatedKey != "undefined") {
console.log("Scanning for more...");
params.ExclusiveStartKey = data.LastEvaluatedKey;
docClient.scan(params, onScan);
}
}
}

How to exceed the 1 MB scan data limit in DynamoDB

I am using DynamoDB with Node.js. I have 3000 records, and I am writing 60+ segments in the code, with each segment scanning up to 1 MB of data and returning its own results. Please tell me how to scan all 3000 records in a single step, that is, in one segment. I am stuck in the middle of my project, so any help is appreciated. Below is my code.
var AWS = require("aws-sdk");
var async = require("async");
AWS.config.update({
region: "---",
endpoint: "-----------",
accessKeyId: "-----------------",
secretAccessKey:"----------"
});
var db = new AWS.DynamoDB.DocumentClient()
var table = "rets_property_all";
var pstart =new Date () .getTime ();
async.parallel({
0 : function(callback){
db.scan ({TableName: table,
ProjectionExpression: "#cityname,ListingKey ",
FilterExpression: "#cityname = :v_id",
ExpressionAttributeNames: {
"#cityname": "CityName",
},
ExpressionAttributeValues: {":v_id" : 'BALTIMORE'},
TotalSegments: 63,
Segment: 0//by the worker who has been called
},function (err , res) {
callback (null , res.Items);
});
},
1 : function(callback){
db.scan ({TableName: table,
ProjectionExpression: "#cityname,ListingKey ",
FilterExpression: "#cityname = :v_id",
ExpressionAttributeNames: {
"#cityname": "CityName",
},
ExpressionAttributeValues: {":v_id" : 'BALTIMORE'},
TotalSegments: 63,
Segment: 1//by the worker who has been called
}, function (err , res) {
callback (null , res.Items);
});
},
2 : function(callback){
db.scan ({TableName: table,
ProjectionExpression: "#cityname,ListingKey ",
FilterExpression: "#cityname = :v_id",
ExpressionAttributeNames: {
"#cityname": "CityName",
},
ExpressionAttributeValues: {":v_id" : 'BALTIMORE'},
TotalSegments: 63,
Segment: 2//by the worker who has been called
}, function (err , res) {
callback (null , res.Items);
});
},
--------
---------
------
62 : function(callback){
db.scan ({TableName: table,
ProjectionExpression: "#cityname,ListingKey ",
FilterExpression: "#cityname = :v_id",
ExpressionAttributeNames: {
"#cityname": "CityName",
},
ExpressionAttributeValues: {":v_id" : 'BALTIMORE'},
TotalSegments: 63,
Segment: 62//by the worker who has been called
}, function (err , res) {
callback (null , res.Items);
});
},
},function(err,results){
if (err) {throw err; }
var pend = new Date () .getTime ();
console.log (results);
})
Actually, there is no way to override the 1 MB limit of Scan. This is a DynamoDB design restriction and can't be overridden by any API; it is a limitation of the service's architecture.
You can use LastEvaluatedKey on the subsequent scans to start from where the previous scan ended.
The result set from a Scan is limited to 1 MB per call. You can use
the LastEvaluatedKey from the scan response to retrieve more results.
It is unclear why you want to get all 3000 records in one scan; even if you have a specific use case, it simply can't be achieved with a single DynamoDB Scan.
Even in a relational database, you get a cursor and iterate over it to read all the records sequentially. Similarly, in DynamoDB you have to call Scan recursively until LastEvaluatedKey is null.
Sample code that scans recursively until LastEvaluatedKey is null:
var docClient = new AWS.DynamoDB.DocumentClient();
var params = {
    TableName: table,
    ProjectionExpression: "#cityname, ListingKey",
    FilterExpression: "#cityname = :v_id",
    ExpressionAttributeNames: {
        "#cityname": "CityName",
    },
    ExpressionAttributeValues: { ":v_id": 'BALTIMORE' }
};
docClient.scan(params, onScan);
function onScan(err, data) {
    if (err) {
        console.error("Unable to scan the table. Error JSON:", JSON.stringify(err, null, 2));
    } else {
        // print all the items in this page of results
        console.log("Scan succeeded.");
        data.Items.forEach(function (item) {
            console.log("Item :", JSON.stringify(item));
        });
        // continue scanning if we have more items
        if (typeof data.LastEvaluatedKey != "undefined") {
            console.log("Scanning for more...");
            params.ExclusiveStartKey = data.LastEvaluatedKey;
            docClient.scan(params, onScan);
        }
    }
}
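If the goal is to end up with all 3000 records in one place, the same recursive pattern can accumulate the pages into a single array instead of printing them. Here is a minimal sketch, assuming the same table, filter, and DocumentClient setup as above; the collectAll helper name is just illustrative:
var AWS = require("aws-sdk");
var docClient = new AWS.DynamoDB.DocumentClient();

// Scan page by page, following LastEvaluatedKey, and hand the combined
// item list to the callback once there are no more pages.
function collectAll(params, callback) {
    var allItems = [];
    function onPage(err, data) {
        if (err) {
            return callback(err);
        }
        allItems = allItems.concat(data.Items);
        if (typeof data.LastEvaluatedKey != "undefined") {
            params.ExclusiveStartKey = data.LastEvaluatedKey;
            docClient.scan(params, onPage);   // fetch the next 1 MB page
        } else {
            callback(null, allItems);         // all pages have been read
        }
    }
    docClient.scan(params, onPage);
}

collectAll({
    TableName: "rets_property_all",
    ProjectionExpression: "#cityname, ListingKey",
    FilterExpression: "#cityname = :v_id",
    ExpressionAttributeNames: { "#cityname": "CityName" },
    ExpressionAttributeValues: { ":v_id": "BALTIMORE" }
}, function (err, items) {
    if (err) {
        console.error("Scan failed:", JSON.stringify(err, null, 2));
    } else {
        console.log("Total items:", items.length);
    }
});
This still issues one request per 1 MB page under the hood; it just hides the paging behind a single call site instead of 60+ hand-written segment functions.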

DynamoDB Query confusion

I have the following table creation code for DynamoDB (C#):
client.CreateTable(new CreateTableRequest
{
TableName = tableName,
ProvisionedThroughput = new ProvisionedThroughput { ReadCapacityUnits = 20, WriteCapacityUnits = 10 },
KeySchema = new List<KeySchemaElement>
{
new KeySchemaElement
{
AttributeName = "RID",
KeyType = KeyType.HASH
}
}
,
AttributeDefinitions = new List<AttributeDefinition>
{
new AttributeDefinition {
AttributeName = "RID",
AttributeType = ScalarAttributeType.N
}
}
});
The data that gets populated into this table comes from this JSON:
[
{"RID": 208649, "CLI_RID": 935476, "PRT_DT": "VAL_AA", "DISTR": "INTERNAL"},
{"RID": 217427, "CLI_RID": 1009561, "PRT_DT": "VAL_BB", "DISTR": "INTERNAL", "STATE": "VAL_BD"},
{"RID": 223331, "CLI_RID": 1325477, "PRT_DT": "VAL_CB", "DISTR": "", "STATE": "VAL_CD", "FNAME": "VAL_CE", "START": "VAL_CF"},
{"RID": 227717, "CLI_RID": 1023478, "PRT_DT": "VAL_DB", "DISTR": "", "STATE": "VAL_DD"}
{"RID": 217462, "CLI_RID": 1009561, "PRT_DT": "VAL_BB", "DISTR": "", "STATE": "VAL_BD"},
{"RID": 218679, "CLI_RID": 1009561, "PRT_DT": "VAL_AA", "DISTR": "INTERNAL"},
{"RID": 222376, "CLI_RID": 1263978, "PRT_DT": "VAL_DB", "DISTR": "", "STATE": "VAL_DD"}
]
How would I Query or Filter for all records where column "CLI_RID" contains 1009561 and column "DISTR" <> "INTERNAL"?
There will be about 15 mil records in this DynamoDB table.
Is my table defined correctly for this query/filter?
Updated table creation:
// CLI_RIDIndex
var cli_ridIndex = new GlobalSecondaryIndex
{
IndexName = "cli_ridIndex",
ProvisionedThroughput = new ProvisionedThroughput
{
ReadCapacityUnits = 20,
WriteCapacityUnits = 10
},
KeySchema = {
new KeySchemaElement
{
AttributeName = "CLI_RID", KeyType = "HASH"
}
},
Projection = new Projection { ProjectionType = "ALL" }
};
client.CreateTable(new CreateTableRequest
{
TableName = tableName,
ProvisionedThroughput = new ProvisionedThroughput { ReadCapacityUnits = 20, WriteCapacityUnits = 10 },
KeySchema = new List<KeySchemaElement>
{
new KeySchemaElement
{
AttributeName = "RID",
KeyType = KeyType.HASH // Partiton Key (Unique)
},
new KeySchemaElement
{
AttributeName = "CLI_RID",
KeyType = KeyType.RANGE // Sort Key
}
}
,
AttributeDefinitions = new List<AttributeDefinition>
{
new AttributeDefinition {
AttributeName = "RID",
AttributeType = ScalarAttributeType.N
},
new AttributeDefinition {
AttributeName = "CLI_RID",
AttributeType = ScalarAttributeType.N
}
},
GlobalSecondaryIndexes = { cli_ridIndex }
});
But when attempting to query it,
var request = new QueryRequest
{
TableName = "TNAArchive",
KeyConditionExpression = "CLI_RID = :v_cli_rid",
ExpressionAttributeValues = new Dictionary<string, AttributeValue> {
{":v_cli_rid", new AttributeValue { S = "905466" }}}
};
var response = client.Query(request);
I get this error:
Query condition missed key schema element: RID
I guess I'm not really understanding how to do this.
According to your table structure, you won't be able to perform a Query on the table; you would have to Scan it, which we want to avoid.
To perform a Query you need to change a couple of things:
1) Add a Global Secondary Index (GSI) with the field CLI_RID as the hash key.
2) Query the GSI by passing CLI_RID, and add a query filter with the <> condition on your value.
Edit: Your main table structure can stay the same; there is no need to change it. You just need to add one more GSI with CLI_RID as the hash key and project the required table attributes.
Then you query the GSI instead of the table, using CLI_RID as the hash key; you don't need to pass RID here.
If CLI_RID is not present on an item in the main table, that item will not be reflected in the GSI, so there is no need to worry.
Edit 2: Just set the IndexName attribute (IndexName = "NameOfYourIndex") while querying and everything should work.
var request = new QueryRequest
{
    TableName = "TNAArchive",
    IndexName = "NameOfYourIndex",
    KeyConditionExpression = "CLI_RID = :v_cli_rid",
    // query filter from step 2 above: exclude INTERNAL distributions
    FilterExpression = "DISTR <> :v_distr",
    ExpressionAttributeValues = new Dictionary<string, AttributeValue> {
        {":v_cli_rid", new AttributeValue { N = "905466" }}, // CLI_RID is defined as a Number, so use N
        {":v_distr", new AttributeValue { S = "INTERNAL" }}
    }
};
Hope that helps

Error InvalidParameterType: Expected params.Item['pid'] to be a structure in DynamoDB

Note: all of this is happening on a local instance of DynamoDB.
This is the code that I've used to create a table from the DynamoDB Shell:
var params = {
TableName: "TABLE-NAME",
KeySchema: [
{ AttributeName: "pid",
KeyType: "HASH"
}
],
AttributeDefinitions: [
{ AttributeName: "pid",
AttributeType: "S"
}
],
ProvisionedThroughput: {
ReadCapacityUnits: 1,
WriteCapacityUnits: 1
}
};
dynamodb.createTable(params, function(err, data) {
if (err)
console.log(JSON.stringify(err, null, 2));
else
console.log(JSON.stringify(data, null, 2));
});
This is the function that is being called to add elements into the DB (in node.js):
function(request, response) {
params = {
TableName: 'TABLE-NAME',
Item: {
pid: 'abc123'
}
};
console.log(params);
dynamodb.putItem(params, function(err, data) {
if (err)
console.log(JSON.stringify(err, null, 2));
else
console.log(JSON.stringify(data, null, 2));
});
}
The output that I get is:
{ TableName: 'TABLE-NAME',
Item: { pid: 'abc123' } } // THIS IS PARAMS
{
"message": "There were 7 validation errors:\n* InvalidParameterType: Expected params.Item['pid'] to be a structure\n* UnexpectedParameter: Unexpected key '0' found in params.Item['pid']\n* UnexpectedParameter: Unexpected key '1' found in params.Item['pid']\n* UnexpectedParameter: Unexpected key '2' found in params.Item['pid']\n* UnexpectedParameter: Unexpected key '3' found in params.Item['pid']\n* UnexpectedParameter: Unexpected key '4' found in params.Item['pid']\n* UnexpectedParameter: Unexpected key '5' found in params.Item['pid']",
"code": "MultipleValidationErrors",
"errors": [
{
"message": "Expected params.Item['pid'] to be a structure",
"code": "InvalidParameterType",
"time": "2015-11-26T15:51:33.932Z"
},
{
"message": "Unexpected key '0' found in params.Item['pid']",
"code": "UnexpectedParameter",
"time": "2015-11-26T15:51:33.933Z"
},
{
"message": "Unexpected key '1' found in params.Item['pid']",
"code": "UnexpectedParameter",
"time": "2015-11-26T15:51:33.933Z"
},
{
"message": "Unexpected key '2' found in params.Item['pid']",
"code": "UnexpectedParameter",
"time": "2015-11-26T15:51:33.933Z"
},
{
"message": "Unexpected key '3' found in params.Item['pid']",
"code": "UnexpectedParameter",
"time": "2015-11-26T15:51:33.933Z"
},
{
"message": "Unexpected key '4' found in params.Item['pid']",
"code": "UnexpectedParameter",
"time": "2015-11-26T15:51:33.934Z"
},
{
"message": "Unexpected key '5' found in params.Item['pid']",
"code": "UnexpectedParameter",
"time": "2015-11-26T15:51:33.934Z"
}
],
"time": "2015-11-26T15:51:33.944Z"
}
I don't understand why or how it is getting keys 0, 1, 2, 3, 4 and 5 when they aren't present in the params printed on the previous line.
Also, how do I fix the error Expected params.Item['pid'] to be a structure? I have declared it as a string and am trying to store a string!
Other notes:
The same code that I've used in the function works just fine when I run it on the shell. I have also included the aws-sdk and have configured it as required:
var AWS = require('aws-sdk');
AWS.config.region = 'us-east-1';
AWS.config.endpoint = 'http://localhost:8000/'
var dynamodb = new AWS.DynamoDB();
The putItem() method on the AWS.DynamoDB class expects the params.Item object to be formatted as an AttributeValue map. (That is also where the unexpected keys 0 to 5 come from: the validator, expecting a structure, enumerates the six characters of the plain string 'abc123' as if they were keys.) That means you would have to change this:
params = {
TableName: 'TABLE-NAME',
Item: {
pid: 'abc123'
}
};
Into this:
params = {
TableName: 'TABLE-NAME',
Item: {
pid: {
S: 'abc123'
}
}
};
If you want to use native JavaScript objects, you should use the AWS.DynamoDB.DocumentClient class, which automatically marshals JavaScript types onto DynamoDB AttributeValues like this:
String -> S
Number -> N
Boolean -> BOOL
null -> NULL
Array -> L
Object -> M
Buffer, File, Blob, ArrayBuffer, DataView, and JavaScript typed arrays -> B
It provides a put() method that delegates to AWS.DynamoDB.putItem().
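For illustration, a minimal sketch of the same write using the DocumentClient, reusing the local endpoint configuration from the question (the docClient variable name is just an example):
var AWS = require('aws-sdk');
AWS.config.region = 'us-east-1';
AWS.config.endpoint = 'http://localhost:8000/';

// The DocumentClient accepts plain JavaScript values and marshals them
// into DynamoDB AttributeValues (here 'abc123' becomes { S: 'abc123' }).
var docClient = new AWS.DynamoDB.DocumentClient();

docClient.put({
    TableName: 'TABLE-NAME',
    Item: { pid: 'abc123' }
}, function(err, data) {
    if (err)
        console.log(JSON.stringify(err, null, 2));
    else
        console.log(JSON.stringify(data, null, 2));
});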
Note: This answer may no longer be valid as mentioned in multiple comments below.
The function that must be used to add records to the database from Node.js is put, not putItem (which is used in the DynamoDB shell). Changing the above function to the following fixed it.
function(request, response) {
params = {
TableName: 'TABLE-NAME',
Item: {
pid: 'abc123'
}
};
console.log(params);
dynamodb.put(params, function(err, data) {
if (err)
console.log(JSON.stringify(err, null, 2));
else
console.log(JSON.stringify(data, null, 2));
});
}
