Difficulty adding a Global Secondary Index to an existing table in DynamoDB - amazon-dynamodb

Table: EmailMessages
EntityId - HashKey : String
Id - RangeKey : String
Name - String
Status - String
I would like to make a GlobalSecondaryIndex for Status
var request = new UpdateTableRequest
{
    TableName = "EmailMessages",
    GlobalSecondaryIndexUpdates = new List<GlobalSecondaryIndexUpdate>
    {
        new GlobalSecondaryIndexUpdate
        {
            Create = new CreateGlobalSecondaryIndexAction
            {
                IndexName = "GSI_EmailMessages_Status",
                KeySchema = new List<KeySchemaElement>
                {
                    new KeySchemaElement("Status", KeyType.HASH)
                }
            }
        }
    }
};
var client = DynamoDBManager.DBFactory.GetClient();
client.UpdateTable(request);
However, the call fails with a 500 error and no response text, so I'm unsure what I need to correct to make this work. I've dug through the documentation and I can't seem to find much help on creating a GSI for an existing table. Any help would be much appreciated.

I managed to solve my issue. It turns out I needed a lot more code. I'm going to post my solution in case someone else runs into a similar issue, since there don't seem to be many resources out there for DynamoDB. Please note that my read and write capacities are very low because this is for a dev environment; you might want to raise yours depending on your needs.
var request = new UpdateTableRequest
{
    TableName = "EmailMessage",
    AttributeDefinitions = new List<AttributeDefinition>
    {
        new AttributeDefinition
        {
            AttributeName = "Status",
            AttributeType = ScalarAttributeType.S
        }
    },
    GlobalSecondaryIndexUpdates = new List<GlobalSecondaryIndexUpdate>
    {
        new GlobalSecondaryIndexUpdate
        {
            Create = new CreateGlobalSecondaryIndexAction
            {
                IndexName = "GSI_EmailMessage_Status",
                KeySchema = new List<KeySchemaElement>
                {
                    new KeySchemaElement("Status", KeyType.HASH)
                },
                Projection = new Projection
                {
                    ProjectionType = ProjectionType.ALL
                },
                ProvisionedThroughput = new ProvisionedThroughput
                {
                    ReadCapacityUnits = 4,
                    WriteCapacityUnits = 1,
                }
            }
        }
    }
};
var client = DynamoDBManager.DBFactory.GetClient();
client.UpdateTable(request);
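A follow-up note (not part of the original answer): after UpdateTable returns, DynamoDB backfills the new index in the background, and it only becomes queryable once its status is ACTIVE. A minimal polling sketch, assuming the same client and the index name used above:

// Sketch: poll DescribeTable until the new GSI finishes backfilling.
// Requires System.Linq and System.Threading; "client" is the client created above.
var describeRequest = new DescribeTableRequest { TableName = "EmailMessage" };
IndexStatus indexStatus;
do
{
    Thread.Sleep(5000); // crude fixed delay between polls
    var table = client.DescribeTable(describeRequest).Table;
    indexStatus = table.GlobalSecondaryIndexes
        .First(i => i.IndexName == "GSI_EmailMessage_Status")
        .IndexStatus;
} while (indexStatus != IndexStatus.ACTIVE);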

Related

Get multiple records from a DynamoDB table using a Global Secondary Index

I have a DynamoDB table CustomerOrders with the following fields:
Primary partition key: CustomerId (Number)
Primary sort key: DepartmentId (Number)
Order (Serialized Json String)
I would like to query multiple customers in one request without using the sort key (DepartmentId). So I created a Global Secondary Index on CustomerId and would like to use that to query using just the CustomerId. I see documentation only related to BatchGetItemAsync for running batch queries, and I don't see a way to set the IndexName on a BatchGetItemRequest. How can that be done?
Below is my code segment so far:
public async Task<List<CustomerOrder>> GetOrdersAsync(List<int> customerIds)
{
    var orders = new List<CustomerOrder>();
    var tableKeys = new List<Dictionary<string, AttributeValue>>();
    foreach (var x in customerIds)
    {
        tableKeys.Add(new Dictionary<string, AttributeValue> { { "CustomerId", new AttributeValue { N = x.ToString() } } });
    }
    var dynamoTable = $"CustomerOrders";
    var keysAndAttributes = new KeysAndAttributes
    {
        AttributesToGet = new List<string> { "CustomerId", "DepartmentId", "Order" },
        Keys = tableKeys
    };
    var request = new BatchGetItemRequest
    {
        ReturnConsumedCapacity = ReturnConsumedCapacity.INDEXES, // Not sure what this does
        RequestItems = new Dictionary<string, KeysAndAttributes> { { dynamoTable, keysAndAttributes } }
    };
    BatchGetItemResponse result;
    do
    {
        result = await dynamoDbClient.BatchGetItemAsync(request); // Exception gets thrown from here
        var responses = result.Responses;
        foreach (var tableName in responses.Keys)
        {
            var tableItems = responses[tableName];
            foreach (var item in tableItems)
            {
                orders.Add(new CustomerOrder
                {
                    CustomerId = int.Parse(item["CustomerId"].N),
                    DepartmentId = int.Parse(item["DepartmentId"].N),
                    Order = JsonConvert.DeserializeObject<Order>(item["Order"].S)
                });
            }
        }
        // Set RequestItems to the result's UnprocessedKeys and reissue the request
        request.RequestItems = result.UnprocessedKeys;
    } while (result.UnprocessedKeys.Count > 0);
    return orders;
}
I am getting a "The provided key element does not match the schema" error with the above code. Please help!
You can't "set the IndexName on a BatchGetItemRequest"
In fact, you can't GetItem() on a GSI/LSI either. GetItem() only works on the table.
And GetItem() always requires the full primary key.
With a partial key, you'd need to perform multiple Query(), one for each hash key.
The GSI isn't doing anything for you. Department as a sort key really isn't doing anything for you either since I assume customerId is unique.
A better structure might have been to have the table defined with only hash key for the primary key;
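A rough sketch of that per-customer Query approach against the GSI (an illustration only; it assumes the index is named "CustomerIdIndex" and reuses the low-level client and parsing code from the question):

// One Query per customer against the GSI, paging through each result set.
public async Task<List<CustomerOrder>> GetOrdersAsync(List<int> customerIds)
{
    var orders = new List<CustomerOrder>();
    foreach (var customerId in customerIds)
    {
        var request = new QueryRequest
        {
            TableName = "CustomerOrders",
            IndexName = "CustomerIdIndex", // assumed GSI name; use your actual index name
            KeyConditionExpression = "CustomerId = :cid",
            ExpressionAttributeValues = new Dictionary<string, AttributeValue>
            {
                { ":cid", new AttributeValue { N = customerId.ToString() } }
            }
        };
        QueryResponse response;
        do
        {
            response = await dynamoDbClient.QueryAsync(request);
            foreach (var item in response.Items)
            {
                orders.Add(new CustomerOrder
                {
                    CustomerId = int.Parse(item["CustomerId"].N),
                    DepartmentId = int.Parse(item["DepartmentId"].N),
                    Order = JsonConvert.DeserializeObject<Order>(item["Order"].S)
                });
            }
            // Page through in case one customer has many orders.
            request.ExclusiveStartKey = response.LastEvaluatedKey;
        } while (response.LastEvaluatedKey != null && response.LastEvaluatedKey.Count > 0);
    }
    return orders;
}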

Getting an error when a method makes a POST request

When I make a POST request it gives an internal server error. Is my Flurl implementation fine, or am I doing something wrong?
try
{
    Models.PaymentPost paymentPost = new Models.PaymentPost();
    paymentPost.Parts = new Models.Parts();
    paymentPost.Parts.Specification = new Models.Specification();
    paymentPost.Parts.Specification.CharacteristicsValue = new List<Models.CharacteristicsValue>();
    paymentPost.Parts.Specification.CharacteristicsValue.Add(new Models.CharacteristicsValue { CharacteristicName = "Amount", Value = amount });
    paymentPost.Parts.Specification.CharacteristicsValue.Add(new Models.CharacteristicsValue { CharacteristicName = "AccountReference", Value = accountId });
    foreach (var item in extraParameters)
    {
        paymentPost.Parts.Specification.CharacteristicsValue.Add(new Models.CharacteristicsValue
        {
            CharacteristicName = item.Key,
            Value = item.Value
        });
    }
    var paymentInJson = JsonConvert.SerializeObject(paymentPost);
    var selfCareUrl = "http://svdt5kubmas01.safari/auth/processPaymentAPI/v1/processPayment";
    var fUrl = new Flurl.Url(selfCareUrl);
    fUrl.WithBasicAuth("***", "********");
    fUrl.WithHeader("X-Source-System", "POS");
    fUrl.WithHeader("X-Route-ID", "STKPush");
    fUrl.WithHeader("Content-Type", "application/json");
    fUrl.WithHeader("X-Correlation-ConversationID", "87646eaa-2605-405e-967c-56e8002b5");
    fUrl.WithHeader("X-Route-Timestamp", "150935");
    fUrl.WithHeader("X-Source-Operator", " ");
    var response = await clientFactory.Get(fUrl).Request().PostJsonAsync(paymentInJson).ReceiveJson<IEnumerable<IF.Models.PaymentPost>>();
    return response;
}
catch (FlurlHttpException ex)
{
    dynamic d = ex.GetResponseJsonAsync();
    //string s = ex.GetResponseStringAsync();
    return d;
}
You don't need to do this:
var paymentInJson = JsonConvert.SerializeObject(paymentPost);
PostJsonAsync just takes a regular object and serializes it to JSON for you. Here you're effectively double-serializing it and the server is probably confused by that format.
You're also doing a lot of other things that Flurl can do for you, such as creating those Url and client objects explicitly. Although that's not causing errors, this is how Flurl is typically used:
var response = await selfCareUrl
    .WithBasicAuth(...)
    .WithHeader(...)
    ...
    .PostJsonAsync(paymentPost)
    .ReceiveJson<List<IF.Models.PaymentPost>>();
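For completeness, a filled-in version of that pattern using the headers from the question might look roughly like this (a sketch, not a verified fix; the credentials and header values are the question's own placeholders, and Content-Type is omitted because PostJsonAsync sets it):

// Fluent Flurl call built directly on the URL string; the object is passed
// to PostJsonAsync un-serialized so Flurl handles the JSON for you.
var response = await selfCareUrl
    .WithBasicAuth("***", "********")
    .WithHeader("X-Source-System", "POS")
    .WithHeader("X-Route-ID", "STKPush")
    .WithHeader("X-Correlation-ConversationID", "87646eaa-2605-405e-967c-56e8002b5")
    .WithHeader("X-Route-Timestamp", "150935")
    .WithHeader("X-Source-Operator", " ")
    .PostJsonAsync(paymentPost)
    .ReceiveJson<List<IF.Models.PaymentPost>>();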

Cosmos DB stored procedure gets fewer documents than actually exist

I am writing a stored procedure on Cosmos DB in JavaScript; however, it gets fewer documents than the real number of documents in the collection.
The sproc is called from C#, which passes a parameter "transmitterMMSI" that is also the partition key of this collection.
First, the following query is executed in sproc:
var query = 'SELECT COUNT(1) AS Num FROM AISData a WHERE a.TransmitterMMSI="' + transmitterMMSI + '"';
The result is output in the response, and the value is 5761, which is the same as the real number of documents in the collection.
However, when I change the query to the following:
var query = 'SELECT * FROM AISData a WHERE a.TransmitterMMSI="' + transmitterMMSI + '"';
The documents.length is output as 5574, which is smaller than the real number.
I have already changed the pageSize: -1, which should mean unlimited.
I did some searching on Google and Stack Overflow, and it seems that continuation could help. However, I tried some examples and they don't work.
Can anyone familiar with this help?
The following lists the scripts.
The sproc JS script is below; it is also the file "DownSampling.js" used in the C# code:
function DownSampling(transmitterMMSI, interval) {
    var context = getContext();
    var collection = context.getCollection();
    var response = context.getResponse();
    var receiverTime;
    var tempTime;
    var groupKey;
    var aggGroup = new Object();
    var query = 'SELECT * FROM AISData a WHERE a.TransmitterMMSI="' + transmitterMMSI + '"';
    var accept = collection.queryDocuments(collection.getSelfLink(), query, { pageSize: -1 },
        function (err, documents, responseOptions) {
            if (err) throw new Error("Error" + err.message);
            // Find the smallest deviation compared to IntervalTime in each group
            for (i = 0; i < documents.length; i++) {
                receiverTime = Date.parse(documents[i].ReceiverTime);
                tempTime = receiverTime / 1000 + interval / 2;
                documents[i].IntervalTime = (tempTime - tempTime % interval) * 1000;
                documents[i].Deviation = Math.abs(receiverTime - documents[i].IntervalTime);
                // Generate a group key for each group, a combination of TransmitterMMSI and IntervalTime
                groupKey = documents[i].IntervalTime.toString();
                if (typeof aggGroup[groupKey] === 'undefined' || aggGroup[groupKey] > documents[i].Deviation) {
                    aggGroup[groupKey] = documents[i].Deviation;
                }
            }
            // Tag the downsampling
            for (i = 0; i < documents.length; i++) {
                groupKey = documents[i].IntervalTime;
                if (aggGroup[groupKey] == documents[i].Deviation) {
                    documents[i].DownSamplingTag = 1;
                } else {
                    documents[i].DownSamplingTag = 0;
                }
                // Remove the items that are not used
                delete documents[i].IntervalTime;
                delete documents[i].Deviation;
                // Replace the document
                var acceptDoc = collection.replaceDocument(documents[i]._self, documents[i], {},
                    function (errDoc, docReplaced) {
                        if (errDoc) {
                            throw new Error("Update documents error:" + errDoc.message);
                        }
                    });
                if (!acceptDoc) {
                    throw "Update documents not accepted, abort ";
                }
            }
            response.setBody(documents.length);
        });
    if (!accept) {
        throw new Error("The stored procedure timed out.");
    }
}
And the C# code is here:
private async Task DownSampling()
{
    Database database = this.client.CreateDatabaseQuery().Where(db => db.Id == DatabaseId).ToArray().FirstOrDefault();
    DocumentCollection collection = this.client.CreateDocumentCollectionQuery(database.SelfLink).Where(c => c.Id == AISTestCollectionId).ToArray().FirstOrDefault();
    string scriptFileName = @"..\..\StoredProcedures\DownSampling.js";
    string scriptId = Path.GetFileNameWithoutExtension(scriptFileName);
    var sproc = new StoredProcedure
    {
        Id = scriptId,
        Body = File.ReadAllText(scriptFileName)
    };
    await TryDeleteStoredProcedure(collection.SelfLink, sproc.Id);
    sproc = await this.client.CreateStoredProcedureAsync(collection.SelfLink, sproc);
    IQueryable<dynamic> query = this.client.CreateDocumentQuery(
        UriFactory.CreateDocumentCollectionUri(DatabaseId, AISTestCollectionId),
        new SqlQuerySpec()
        {
            //QueryText = "SELECT a.TransmitterMMSI FROM " + AISTestCollectionId + " a",
            QueryText = "SELECT a.TransmitterMMSI FROM " + AISTestCollectionId + " a WHERE a.TransmitterMMSI=\"219633000\"",
        }, new FeedOptions { MaxItemCount = -1, EnableCrossPartitionQuery = true, MaxDegreeOfParallelism = -1, MaxBufferedItemCount = -1 });
    List<dynamic> transmitterMMSIList = query.ToList(); //TODO: Remove duplicates
    Console.WriteLine("TransmitterMMSI count: {0}", transmitterMMSIList.Count());
    HashSet<string> exist = new HashSet<string>();
    foreach (var item in transmitterMMSIList)
    {
        //int transmitterMMSI = Int32.Parse(item.TransmitterMMSI.ToString());
        string transmitterMMSI = item.TransmitterMMSI.ToString();
        if (exist.Contains(transmitterMMSI))
        {
            continue;
        }
        exist.Add(transmitterMMSI);
        Console.WriteLine("TransmitterMMSI: {0} is being processed.", transmitterMMSI);
        var response = await this.client.ExecuteStoredProcedureAsync<string>(sproc.SelfLink,
            new RequestOptions { PartitionKey = new PartitionKey(transmitterMMSI) }, transmitterMMSI, 30);
        string s = response.Response;
        Console.WriteLine("TransmitterMMSI: {0} is processed completely.", transmitterMMSI);
    }
}
private async Task TryDeleteStoredProcedure(string collectionSelfLink, string sprocId)
{
    StoredProcedure sproc = this.client.CreateStoredProcedureQuery(collectionSelfLink).Where(s => s.Id == sprocId).AsEnumerable().FirstOrDefault();
    if (sproc != null)
    {
        await client.DeleteStoredProcedureAsync(sproc.SelfLink);
    }
}
I tried commenting out the two loops in the JS code so that only documents.length is output, but the returned number is still lower. However, when I changed the query to SELECT a.id, documents.length was correct. It looks like it is a continuation issue.
The sproc is probably timing out. To use a continuation token in these circumstances, you will need to return it to your C# calling code and then make another call to the sproc, passing in your token. If you show us your sproc code we can help more.
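A rough sketch of that calling pattern (purely an assumption about how one might restructure things, not the posted sproc: it supposes DownSampling is changed to accept a continuation token as a third parameter and to setBody() a JSON string containing a continuation property):

// Hypothetical calling loop; assumes the sproc signature becomes
// DownSampling(transmitterMMSI, interval, continuationToken) and that it returns
// a body like {"processed": 123, "continuation": "<token or null>"}.
string continuation = null;
do
{
    var response = await this.client.ExecuteStoredProcedureAsync<string>(
        sproc.SelfLink,
        new RequestOptions { PartitionKey = new PartitionKey(transmitterMMSI) },
        transmitterMMSI, 30, continuation);

    // Parse the body the sproc returned via response.setBody(...).
    dynamic body = JsonConvert.DeserializeObject<dynamic>(response.Response);
    continuation = (string)body.continuation;
} while (!string.IsNullOrEmpty(continuation));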
You can use a continuation token to make repeated calls to queryDocuments() from within the sproc without additional round trips to the client. Keep in mind that if you do this too many times your sproc will eventually time out, though. In your case, it sounds like you're already very close to getting all of the documents you're seeking, so maybe you will be OK.
Here is an example of using a continuation token within a sproc to query multiple pages of data:
function getManyThings() {
    var collection = getContext().getCollection();
    var query = {
        query: 'SELECT r.id, r.FieldOne, r.FieldTwo FROM root r WHERE r.FieldThree="sought"'
    };
    var continuationToken;
    getThings(continuationToken);

    function getThings(continuationToken) {
        var requestOptions = {
            continuation: continuationToken,
            pageSize: 1000 // Adjust this to suit your needs
        };
        var isAccepted = collection.queryDocuments(collection.getSelfLink(), query, requestOptions, function (err, feed, responseOptions) {
            if (err) {
                throw err;
            }
            for (var i = 0, len = feed.length; i < len; i++) {
                var thing = feed[i];
                // Do your logic on thing...
            }
            if (responseOptions.continuation) {
                getThings(responseOptions.continuation);
            }
            else {
                var response = getContext().getResponse();
                response.setBody("RESULTS OF YOUR LOGIC");
            }
        });
        if (!isAccepted) {
            var response = getContext().getResponse();
            response.setBody("Server rejected query - please narrow search criteria");
        }
    }
}

.net Querying a Global Secondary Index in DynamoDB via DynamoDBContext

I have a DynamoDB table with a schema as follows:
var request = new CreateTableRequest
{
    TableName = tableName,
    KeySchema = new List<KeySchemaElement>
    {
        new KeySchemaElement("CompanyId", KeyType.HASH),
        new KeySchemaElement("Timestamp", KeyType.RANGE)
    },
    AttributeDefinitions = new List<AttributeDefinition>
    {
        new AttributeDefinition("CompanyId", ScalarAttributeType.S),
        new AttributeDefinition("Timestamp", ScalarAttributeType.N),
        new AttributeDefinition("UserId", ScalarAttributeType.S)
    },
    GlobalSecondaryIndexes = new List<GlobalSecondaryIndex>
    {
        new GlobalSecondaryIndex
        {
            IndexName = "UserIndex",
            KeySchema = new List<KeySchemaElement>
            {
                new KeySchemaElement("UserId", KeyType.HASH),
                new KeySchemaElement("Timestamp", KeyType.RANGE)
            },
            Projection = new Projection { ProjectionType = "ALL" },
            ProvisionedThroughput = new ProvisionedThroughput(5, 6)
        }
    },
    ProvisionedThroughput = new ProvisionedThroughput(5, 6)
};
I can query the primary key successfully as follows:
var client = new AmazonDynamoDBClient();
using (var context = new DynamoDBContext(client))
{
    var sortKeyValues = new List<object> { minTimestamp };
    result = await context.QueryAsync<AuditLogEntry>(companyId, QueryOperator.GreaterThanOrEqual, sortKeyValues,
        new DynamoDBOperationConfig { OverrideTableName = TableName }).GetRemainingAsync();
}
And I can query the global secondary index without any constraint on the range key as follows:
var client = new AmazonDynamoDBClient();
using (var context = new DynamoDBContext(client))
{
    result = await context.QueryAsync<AuditLogEntry>(userId,
        new DynamoDBOperationConfig { OverrideTableName = TableName, IndexName = indexName }).GetRemainingAsync();
}
But when I try to query the index with a range key constraint:
var client = new AmazonDynamoDBClient();
using (var context = new DynamoDBContext(client))
{
    var sortKeyValues = new List<object> { minTimestamp };
    result = await context.QueryAsync<AuditLogEntry>(userId, QueryOperator.GreaterThan, sortKeyValues,
        new DynamoDBOperationConfig { OverrideTableName = TableName, IndexName = indexName }).GetRemainingAsync();
}
I get the following error:
Exception thrown: 'System.InvalidOperationException' in AWSSDK.DynamoDBv2.dll
Additional information: Local Secondary Index range key conditions are used but no index could be inferred from model. Specified index name = UserIndex
Googling this error hasn't thrown any light on the issue. The reference to Local Secondary Index has me confused because I'm using a Global index, but I just can't see what's wrong with my code.
I've been able to get the query working by querying directly on the AmazonDynamoDBClient rather than using DynamoDBContext, but I'd really like to understand what I'm doing wrong and be able to use DynamoDBContext.
Any ideas would be appreciated.
In your model definition for AuditLogEntry you need to decorate the properties that are part of the global secondary index with the [DynamoDBGlobalSecondaryIndexHashKey] and/or [DynamoDBGlobalSecondaryIndexRangeKey] attributes. Example below.
public class AuditLogEntry
{
    // other properties ...

    [DynamoDBProperty("UserId")]
    [DynamoDBGlobalSecondaryIndexHashKey("UserIndex")]
    public string UserId { get; set; }
}
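Since the failing query puts a range-key condition on Timestamp, that property likely needs the range-key attribute for the index as well. A fuller sketch of what the model might look like (property names and types are inferred from the CreateTableRequest above; the table name is a placeholder, so treat this as an assumption rather than the asker's actual model):

// Hypothetical model sketch; attribute names mirror the table definition above.
[DynamoDBTable("AuditLog")] // placeholder name; the asker overrides it at query time
public class AuditLogEntry
{
    [DynamoDBHashKey("CompanyId")]
    public string CompanyId { get; set; }

    [DynamoDBRangeKey("Timestamp")]
    [DynamoDBGlobalSecondaryIndexRangeKey("UserIndex")] // lets the context infer the GSI range key
    public long Timestamp { get; set; }

    [DynamoDBProperty("UserId")]
    [DynamoDBGlobalSecondaryIndexHashKey("UserIndex")]
    public string UserId { get; set; }
}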

Retrieving CRM 4 entities with custom fields in custom workflow activity C#

I'm trying to retrieve all phone calls related to an opportunity whose statecode isn't equal to 1. I've tried QueryByAttribute, QueryExpression and RetrieveMultipleRequest, but still have no solution.
Here is some code I wrote.
IContextService contextService = (IContextService)executionContext.GetService(typeof(IContextService));
IWorkflowContext context = contextService.Context;
ICrmService crmService = context.CreateCrmService(true);
if (crmService != null)
{
    QueryByAttribute query = new Microsoft.Crm.Sdk.Query.QueryByAttribute();
    query.ColumnSet = new Microsoft.Crm.Sdk.Query.AllColumns();
    query.EntityName = EntityName.phonecall.ToString();
    query.Attributes = new string[] { "regardingobjectid" };
    query.Values = new string[] { context.PrimaryEntityId.ToString() };
    RetrieveMultipleRequest retrieve = new RetrieveMultipleRequest();
    retrieve.Query = query;
    retrieve.ReturnDynamicEntities = true;
    RetrieveMultipleResponse retrieved = (RetrieveMultipleResponse)crmService.Execute(retrieve);
}
return ActivityExecutionStatus.Closed;
}
And almost the same for QueryExpression:
QueryExpression phCallsQuery = new QueryExpression();
ColumnSet cols = new ColumnSet(new string[] { "activityid", "regardingobjectid" });
phCallsQuery.EntityName = EntityName.phonecall.ToString();
phCallsQuery.ColumnSet = cols;
phCallsQuery.Criteria = new FilterExpression();
phCallsQuery.Criteria.FilterOperator = LogicalOperator.And;
phCallsQuery.Criteria.AddCondition("statuscode", ConditionOperator.NotEqual, "1");
phCallsQuery.Criteria.AddCondition("regardingobjectid", ConditionOperator.Equal, context.PrimaryEntityId.ToString());
I always get something like a SoapException or "Server was unable to process the request" when debugging.
To get the exception details, try using the following code:
RetrieveMultipleResponse retrieved = null;
try
{
    retrieved = (RetrieveMultipleResponse)crmService.Execute(retrieve);
}
catch (SoapException se)
{
    throw new Exception(se.Detail.InnerXml);
}
