I have an error when I'm trying to use BulkExecutor to update one of the properties in CosmosDb. The error message is "Index was out of range. Must be non-negative and less than the size of the collection.
Parameter name: index"
Important point: I don't have a partition key defined on my collection.
Here is my code:
SetUpdateOperation<string> player1NameUpdateOperation = new SetUpdateOperation<string>("Player1Name", name);
var updateOperations = new List<UpdateOperation>();
updateOperations.Add(player1NameUpdateOperation);
var updateItems = new List<UpdateItem>();
foreach (var match in list)
{
    string id = match.id;
    updateItems.Add(new UpdateItem(id, null, updateOperations));
}
var executor = new Microsoft.Azure.CosmosDB.BulkExecutor.BulkExecutor(_client, _collection);
await executor.InitializeAsync();
var executeResult = await executor.BulkUpdateAsync(updateItems);
var count = executeResult.NumberOfDocumentsUpdated;
What am I missing?
If I run the bulk executor on a collection without a partition key, I get the same error. If I run it against a collection that does have one and I specify it, the bulk executor works fine.
Pretty sure the BulkExecutor API just doesn't support it right now; use the normal Cosmos DB API to update the documents as a workaround for now.
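For example, a rough sketch of that workaround using the regular SDK v2 DocumentClient (the database and collection names are placeholders; it reuses _client, list, and name from the question):
foreach (var match in list)
{
    // Database and collection ids are assumed; substitute your actual names.
    var docUri = UriFactory.CreateDocumentUri("myDatabase", "myCollection", match.id);
    Document doc = await _client.ReadDocumentAsync(docUri);
    doc.SetPropertyValue("Player1Name", name);
    await _client.ReplaceDocumentAsync(docUri, doc);
}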
I am running below dynamo DB scan query in AWS
const dynamoDb = require('aws-sdk/clients/dynamodb.js');
const dynmoDBClient = new dynamoDb.DocumentClient({ region: "REGION"});
let params = {
    "TableName": "Users",
    "ScanFilter": {
        "name": {
            "AttributeValueList": [
                {
                    "S": ""
                }
            ],
            "ComparisonOperator": "GT"
        }
    },
    "Select": "ALL_ATTRIBUTES"
}
let result = null;
result = await dynmoDBClient.scan(params).promise();
When I run the query, I get the below error:
ERROR occurred while querying data from DB :
{"message":"One or more parameter values were invalid:
ComparisonOperator GT is not valid for M AttributeValue type"}
As per the table definition, the name attribute is of type S (string) and not M (map), but I am still getting this error.
Can anybody please help here, as I am not getting what the issue is?
AWS maintains sample code of core DynamoDB operations in various languages. Here's their sample for Scan in Node.js:
https://github.com/aws-samples/aws-dynamodb-examples/blob/master/DynamoDB-SDK-Examples/node.js/WorkingWithScans/scan-parallel-segments.js
ScanFilter is legacy so don't go there.
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/LegacyConditionalParameters.ScanFilter.html
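For this case, a rough sketch of the same scan rewritten with FilterExpression (table name and region are the placeholders from the question). The DocumentClient marshals plain JavaScript values, so the {"S": ""} type descriptor, which the client was serializing as a map and which triggered the "M AttributeValue" error, is not needed:
const DynamoDB = require('aws-sdk/clients/dynamodb.js');
const documentClient = new DynamoDB.DocumentClient({ region: 'REGION' });

async function scanUsers() {
    const params = {
        TableName: 'Users',
        // FilterExpression is the non-legacy replacement for ScanFilter.
        FilterExpression: '#n > :empty',
        ExpressionAttributeNames: { '#n': 'name' },  // "name" is a DynamoDB reserved word
        ExpressionAttributeValues: { ':empty': '' }, // plain value, no { S: '' } wrapper
    };
    const result = await documentClient.scan(params).promise();
    return result.Items;
}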
I have a project that uses CloudTableClient and queries Cosmos DB in this way:
var table = cloudTableClient.GetTableReference(tableName);
var cosmosResult = await table.ExecuteQuerySegmentedAsync(GetTableQuery<DynamicTableEntity>(queryOptions), tableContinuationToken, GetTableRequestOptions(requestOptions), operationContext);
If I use CosmosClient, I can set the ConnectionMode:
CosmosClient client = new CosmosClient(connectionString,
    new CosmosClientOptions
    {
        ConnectionMode = ConnectionMode.Gateway // ConnectionMode.Direct is the default
    });
However, with CloudTableClient, it seems I can't find an option to set this. Is it possible to use Direct mode with CloudTableClient, or do I actually need to move everything to CosmosClient in order to do it?
There are some great examples from MS for bulk imports and bulk deletes, and I have been able to use Python to get both of them to work. For example:
dbclient.ExecuteStoredProcedure(parameterscolllink + '/sprocs/bulkImport', dumps(adddat), { 'partitionKey' : 0})
and then in my SPROC I deserialize the string into an array: if (typeof items === "string") items = JSON.parse(items)
But one of the examples from that MS page is an SPROC to swap fantasy football players, and the SPROC takes in 2 different variables:
function tradePlayers(playerId1, playerId2)
How would Python execute an SPROC and pass 2 variables?
You could pass multiple parameters as an array to the Cosmos DB stored procedure.
import azure.cosmos.cosmos_client as cosmos_client

endpoint = "https://***.documents.azure.com:443/"
primaryKey = "***"
client = cosmos_client.CosmosClient(url_connection=endpoint, auth={'masterKey': primaryKey})

sproc_link = "dbs/db/colls/jay/sprocs/test"
params = ["a", "b"]
result = client.ExecuteStoredProcedure(sproc_link, params)
print(result)
Moreover, you could refer to this example: https://gist.github.com/sjwaight/3c5cf9503f588b190b5ff02bb79f07f0
Update Answer:
Sorry for the late reply. You could use .NET code (please see this case: Unable to Execute procedure with multiple parameters) to call that type, like this:
var email = "xxxxx";
var password = "xxxx";
var response = await client.ExecuteStoredProcedureAsync<string>(storedProcedurelink, new RequestOptions { PartitionKey = new PartitionKey(partitionKey) }, email, password);
I checked the signature of ExecuteStoredProcedureAsync; it accepts a dynamic params array.
As for the Python code, I didn't find such an invocation style. You still need to follow the above sample code and pass the params in as a list.
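For completeness, a hedged Python sketch of calling the tradePlayers sproc from the question with the same v3-style client as above (the sproc link and the two player ids are placeholders):
# The two positional parameters (playerId1, playerId2) are passed as one list.
sproc_link = "dbs/db/colls/players/sprocs/tradePlayers"  # placeholder link
params = ["12345", "67890"]                              # playerId1, playerId2
options = {'partitionKey': 0}                            # same options shape as the bulkImport example
result = client.ExecuteStoredProcedure(sproc_link, params, options)
print(result)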
I've got records that were the result of bad data where the Partition Key is null and I need to clean them up, but I've been unsuccessful so far.
Here's what I've tried:
var scriptResult = await _dbClient.ExecuteStoredProcedureAsync<dynamic>(
    GetStoredProcLink("BulkDelete"),
    new RequestOptions() { PartitionKey = new PartitionKey(""), EnableScriptLogging = true },
    "select * from c where c.documentDbType = 'SomeValue'");
I've also tried using Undefined.Value as the parameter to new PartitionKey().
I ripped the stored proc from here and haven't changed anything yet.
Note: This is a partitioned collection if it was not obvious (by /companyId)
I just hit this issue when migrating to the new database-level throughput provisioning. This is the syntax that got me up and running again when my models did not contain the specified partition key property:
new RequestOptions()
{
    PartitionKey = new PartitionKey(Undefined.Value)
}
Ref:
https://www.lytzen.name/2016/12/06/find-docs-with-no-partitionkey-in-azure.html
Null, Undefined, and empty string are all different values in Cosmos DB. You need something like: new RequestOptions { PartitionKey = new PartitionKey(null) }
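Plugged into the call from the question, a minimal sketch (SDK v2 DocumentClient, reusing _dbClient and GetStoredProcLink from above) would look like:
var scriptResult = await _dbClient.ExecuteStoredProcedureAsync<dynamic>(
    GetStoredProcLink("BulkDelete"),
    new RequestOptions() { PartitionKey = new PartitionKey(null), EnableScriptLogging = true }, // JSON null, not Undefined.Value or ""
    "select * from c where c.documentDbType = 'SomeValue'");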
I am new to AWS DynamoDB, so pardon any silly mistake. I was trying to fetch two columns from my Activity table. I also wanted to fetch only those items where the partition key starts with a specific string. The partition key has the format activity_EnrolledStudentName (e.g. Dance_studentName), so I wanted to fetch all items from the table where the activity is Dance. I was trying to use the following query:
public List<StudentDomain> getAllStudents(String activity) {
    List<StudentDomain> scanResult = null;
    DynamoDBUtil dynamoDBUtil = new DynamoDBUtil();
    AmazonDynamoDB dynamoDBClient = dynamoDBUtil.getDynamoDBClient();
    DynamoDBMapper mapper = new DynamoDBMapper(dynamoDBClient);

    DynamoDBScanExpression scanExpression = new DynamoDBScanExpression();
    scanExpression.withProjectionExpression("studentId, ActivitySkills")
            .addFilterCondition(STUDENT_PRIMARY_KEY,
                    new Condition().withComparisonOperator(ComparisonOperator.BEGINS_WITH)
                            .withAttributeValueList(new AttributeValue().withS(activity)));

    scanResult = mapper.scan(StudentDomain.class, scanExpression);
    return scanResult;
}
However, I am getting the following error when I execute the above query:
com.amazonaws.services.dynamodbv2.model.AmazonDynamoDBException: Can not use both expression and non-expression parameters in the same request: Non-expression parameters: {ScanFilter} Expression parameters: {ProjectionExpression} (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: TMS27PABBC2BS3UU7LID731G0FVV4KQNSO5AEMVJF66Q9ASUAAJG)
Can anyone please suggest where I am mistaken, and which other query I should use otherwise?
It's not completely clear what you are trying to achieve, but if I understood correctly, you don't need a scan operation for that; a scan actually reads the whole table and only filters the result afterwards.
DynamoDB dynamoDB = new DynamoDB(client);
Table table = dynamoDB.getTable("TableName");

// Query by the full partition key value; the placeholder is bound via the value map.
QuerySpec spec = new QuerySpec()
        .withKeyConditionExpression("studentId = :studentId")
        .withValueMap(new ValueMap().withString(":studentId", "Dance_studentName"));

ItemCollection<QueryOutcome> items = table.query(spec);
Iterator<Item> iterator = items.iterator();
while (iterator.hasNext()) {
    Item item = iterator.next();
    System.out.println(item.toJSONPretty());
}
The filter condition you are using is intended to filter on secondary attributes, not on the range or partition keys. At least this is my interpretation of the documentation.
Please read the query documentation http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/QueryingJavaDocumentAPI.html
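If a scan really is needed (for example because the activity prefix is only part of the partition key value), here is a rough sketch with DynamoDBMapper that keeps both the projection and the filter in expression syntax, which avoids the "Can not use both expression and non-expression parameters" error (attribute names are taken from the question and may need adjusting):
Map<String, AttributeValue> values = new HashMap<>();
values.put(":activity", new AttributeValue().withS(activity));

DynamoDBScanExpression scanExpression = new DynamoDBScanExpression()
        .withProjectionExpression("studentId, ActivitySkills")
        // begins_with in a FilterExpression replaces addFilterCondition here.
        .withFilterExpression("begins_with(studentId, :activity)")
        .withExpressionAttributeValues(values);

List<StudentDomain> scanResult = mapper.scan(StudentDomain.class, scanExpression);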