how to pass multiple inputs to a cosmosdb stored procedure - azure-cosmosdb

There are some great examples on MS for bulk imports and bulk deletes, and I have been able to use Python to get both of them to work. For example:
dbclient.ExecuteStoredProcedure(parameterscolllink + '/sprocs/bulkImport', dumps(adddat), {'partitionKey': 0})
and then in my SPROC I deserialize the string into an array: if (typeof items === "string") items = JSON.parse(items)
But one of the examples from that MS page is an SPROC to swap fantasy football players, and the SPROC takes two separate parameters:
function tradePlayers(playerId1, playerId2)
How would Python execute an SPROC and pass two variables?

You could pass multiple parameters as an array to a Cosmos DB stored procedure.
import azure.cosmos.cosmos_client as cosmos_client

endpoint = "https://***.documents.azure.com:443/"
primaryKey = "***"
client = cosmos_client.CosmosClient(url_connection=endpoint, auth={'masterKey': primaryKey})

sproc_link = "dbs/db/colls/jay/sprocs/test"
params = ["a", "b"]  # each element becomes one argument of the stored procedure
result = client.ExecuteStoredProcedure(sproc_link, params)
print(result)
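Cosmos DB spreads that array into the stored procedure's positional arguments, so for the tradePlayers example in the question each element lands in one parameter. A minimal sketch of the call, assuming a sproc link that follows the same naming as the sample above:
# each array element maps to one positional argument of the sproc:
# "p1-id" arrives as playerId1, "p2-id" as playerId2
trade_link = "dbs/db/colls/jay/sprocs/tradePlayers"
result = client.ExecuteStoredProcedure(trade_link, ["p1-id", "p2-id"])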
Moreover, you could refer to this example: https://gist.github.com/sjwaight/3c5cf9503f588b190b5ff02bb79f07f0
Update Answer:
Sorry for the late reply. You could use .NET code (see the case: Unable to Execute procedure with multiple parameters) to call it like this:
var email = "xxxxx";
var password = "xxxx";
var response = await client.ExecuteStoredProcedureAsync<string>(storedProcedurelink, new RequestOptions { PartitionKey = new PartitionKey(partitionKey) }, email, password);
I checked the signature of ExecuteStoredProcedureAsync; it accepts a dynamic params array. As for the Python SDK, I didn't find an equivalent overload, so you still need to follow the sample code above and pass the parameters in a list.

Related

Invalid index exception when using BulkExecutor in CosmosDb

I have an error when I'm trying to use BulkExecutor to update one of the properties in CosmosDb. The error message is "Index was out of range. Must be non-negative and less than the size of the collection. Parameter name: index".
Important point: I don't have a partition key defined on my collection.
Here is my code:
SetUpdateOperation<string> player1NameUpdateOperation = new SetUpdateOperation<string>("Player1Name", name);
var updateOperations = new List<UpdateOperation>();
updateOperations.Add(player1NameUpdateOperation);

var updateItems = new List<UpdateItem>();
foreach (var match in list)
{
    string id = match.id;
    // partition key is null because the collection has none defined
    updateItems.Add(new UpdateItem(id, null, updateOperations));
}

var executor = new Microsoft.Azure.CosmosDB.BulkExecutor.BulkExecutor(_client, _collection);
await executor.InitializeAsync();
var executeResult = await executor.BulkUpdateAsync(updateItems);
var count = executeResult.NumberOfDocumentsUpdated;
What am I missing?
If I run the bulk executor on a collection without a partition key, I get the same error. If I run it against a collection that does have one and I specify it, the bulk executor works fine.
Pretty sure they just don't support it right now through the BulkExecutor API; use the normal Cosmos API to update the documents as a workaround for now.
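For illustration, a minimal read-modify-replace sketch of that workaround using the Python 3.x SDK shown earlier on this page (the collection link, match_id, and field name are placeholders mirroring the question):
# read the document, set the property, and write it back with the plain SDK,
# bypassing BulkExecutor; 'dbs/db/colls/coll' is a placeholder link
doc_link = 'dbs/db/colls/coll/docs/' + match_id
doc = client.ReadItem(doc_link)
doc['Player1Name'] = name
client.ReplaceItem(doc_link, doc)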

Dynamo db: Not able to fetch two columns with filterExpression

I am new to AWS DynamoDB, so pardon any silly mistake. I was trying to fetch two columns from my Activity table, and I wanted to fetch only those items where the partition key starts with a specific string. The partition key has the format activity_EnrolledStudentName (e.g. Dance_studentName), so I wanted to fetch all items from the table where the activity is Dance. I was trying to use the following query:
public List<StudentDomain> getAllStudents(String activity) {
    List<StudentDomain> scanResult = null;
    DynamoDBUtil dynamoDBUtil = new DynamoDBUtil();
    AmazonDynamoDB dynamoDBClient = dynamoDBUtil.getDynamoDBClient();
    DynamoDBMapper mapper = new DynamoDBMapper(dynamoDBClient);

    DynamoDBScanExpression scanExpression = new DynamoDBScanExpression();
    scanExpression.withProjectionExpression("studentId, ActivitySkills")
        .addFilterCondition(STUDENT_PRIMARY_KEY,
            new Condition().withComparisonOperator(ComparisonOperator.BEGINS_WITH)
                .withAttributeValueList(new AttributeValue().withS(activity)));

    scanResult = mapper.scan(StudentDomain.class, scanExpression);
    return scanResult;
}
However, I am getting the following error when I execute the above query:
com.amazonaws.services.dynamodbv2.model.AmazonDynamoDBException: Can not use both expression and non-expression parameters in the same request: Non-expression parameters: {ScanFilter} Expression parameters: {ProjectionExpression} (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: TMS27PABBC2BS3UU7LID731G0FVV4KQNSO5AEMVJF66Q9ASUAAJG)
Can anyone please suggest where I am mistaken and which query I should use instead?
It's not completely clear what you are trying to achieve, but if I understood correctly you don't need a scan operation for that: a scan reads the whole table and only afterwards filters the result.
DynamoDB dynamoDB = new DynamoDB(client);
Table table = dynamoDB.getTable("TableName");

// :v_id is a named placeholder bound through the value map
QuerySpec spec = new QuerySpec()
    .withKeyConditionExpression("studentId = :v_id")
    .withValueMap(new ValueMap().withString(":v_id", activity));

ItemCollection<QueryOutcome> items = table.query(spec);
Iterator<Item> iterator = items.iterator();
Item item = null;
while (iterator.hasNext()) {
    item = iterator.next();
    System.out.println(item.toJSONPretty());
}
The filter condition you are using is intended to filter on secondary attributes, not the range or partition keys. At least this is my interpretation of the documentation.
Please read the query documentation http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/QueryingJavaDocumentAPI.html
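For completeness: the ValidationException in the question comes from mixing the legacy ScanFilter style (addFilterCondition) with an expression parameter (ProjectionExpression) in a single request. A scan that sticks to expression-style parameters throughout avoids the error; here is a sketch in Python with boto3 for illustration (the Activity table name and attribute names are taken from the question):
import boto3
from boto3.dynamodb.conditions import Attr

# use expression-style parameters only; never mix them with legacy ScanFilter
table = boto3.resource('dynamodb').Table('Activity')
response = table.scan(
    ProjectionExpression='studentId, ActivitySkills',
    FilterExpression=Attr('studentId').begins_with('Dance')
)
items = response['Items']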

Deleting data from Drive Tables and automatically re-importing new data

I need help understanding how to delete all data from a table and then automatically import a new sheet of data into the newly cleared table.
I'm currently trying the unload() method client-side, but that doesn't seem to clear down my tables:
function ClearDown(){
    app.datasources.P11d.unload(function(){});
    console.log('Finish Delete');
}
I've also tried to create a server-side function, which also doesn't appear to work:
function ClearTable(){
    var records = app.models.P11d.newQuery();
    // records.run();
    console.log('Server Function Ran');
    app.deleteRecords(records.run());
}
This is run from a client-side function:
function Delete(){
    google.script.run.withSuccessHandler(function(result){
    }).ClearTable();
    console.log('Function Ran');
}
Again, this is all to no avail.
On the import side I've tried the below:
Client Side:
function ImportData(){
    console.log('Begin');
    var ss = SpreadsheetApp.openById('SHEET ID');
    var values = ss.getSheetByName('P11d').getDataRange().getValues();
    var ssData = [];
    // app.datasources.P11d.unload(function(){});
    for (var i = 0; i < values.length; i++){
        var newRecord = app.models.P11d.newRecord();
        // add all fields to the new record
        newRecord.Reg_Number = values[i][0];
        newRecord.Reg_Date = values[i][1];
        newRecord.Model_Description = values[i][2];
        newRecord.P11d_Value = values[i][3];
        newRecord.EngineSize = values[i][4];
        newRecord.Fuel = values[i][5];
        newRecord.CO2 = values[i][6];
        newRecord.SIPP = values[i][7];
        newRecord.GTA_Code = values[i][8];
        newRecord.Type = values[i][9];
        ssData.push(newRecord);
        // console.log(newRecord.MODEL_FIELD);
    }
    console.log('Finished');
    // return the array of the model.newRecord objects that would be consumed by the Model query.
}
Please can someone help with this? At the moment, the way the data is sent over to me, adding new records into the Drive Table creates many duplicates.
Thanks in advance,
You can delete all records, import, and read from a spreadsheet using the AMU Library.
Copy and paste the server and client scripts into your app.
I'm sure that will make it much easier!
To delete all the data in a model, use this:
Button onClick:
google.script.run.AMU.deleteAllData('ModelName');
The correct way to delete records on the server is:
app.models.MODEL_NAME.deleteRecords(key_array);
datasource.unload() simply unloads the widget on the client. It does not affect the database records.
A better way to write your records query on the server is:
var query = app.models.MODEL_NAME.newQuery();
query.filters.your_filter_here;
var records = query.run();
Note that you cannot return a single record or an array of records from anything but a calculated model function without using a function posted here. (You can return a single field of a record by using stringify for any JSON data.)
I am currently working on a solution to create datasource-independent tables needed in App Maker.
For the delete function on the server, try changing your code just a little bit. This function at least used to work for me, though I have not needed it in some time:
function ClearTable(){
    var records = app.models.P11d.newQuery().run();
    console.log('Server Function Ran');
    app.deleteRecords(records);
}

Pass list of strings as a Parameter in Parameterized Query in DocumentDB

Is there a way I can pass a list of strings in the SqlParameter? Let's say I have 10 authors and I want to find books published by them. I know I can make 10 parameters (new SqlParameter) separately, but is there a way to just pass a list and get the results?
IQueryable<Book> queryable = client.CreateDocumentQuery<Book>(collectionSelfLink,
    new SqlQuerySpec
    {
        QueryText = "SELECT * FROM books b WHERE (b.Author.Name = @name)",
        Parameters = new SqlParameterCollection()
        {
            new SqlParameter("@name", "Herman Melville")
        }
    });
I think what you are looking for is the SQL IN keyword; see this link for more information.
Usage example:
SELECT *
FROM books
WHERE books.Author.Name IN ('Helena Petrovna Blavatsky',
'Hermes Trismegistus', 'Heinrich Cornelius Agrippa')
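If you want to keep the author list parameterized rather than inlining the literals, one option is to generate one parameter per author and join their names into the IN clause. A sketch in Python, matching the 3.x SDK from the first question on this page (collection_link and client are assumed to be set up as shown there):
authors = ["Helena Petrovna Blavatsky", "Hermes Trismegistus", "Heinrich Cornelius Agrippa"]
# build one @authorN parameter per list element, then join the names into IN (...)
parameters = [{"name": "@author{}".format(i), "value": a} for i, a in enumerate(authors)]
query = {
    "query": "SELECT * FROM books b WHERE b.Author.Name IN ({})".format(
        ", ".join(p["name"] for p in parameters)),
    "parameters": parameters
}
# enableCrossPartitionQuery is only needed if the collection is partitioned
results = list(client.QueryItems(collection_link, query, {'enableCrossPartitionQuery': True}))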

What is the equivalent of firebaseRef.push() with angularfire?

In the following non-AngularFire pseudo-code, I would expect Firebase to generate a key for the new data being pushed in:
var ref = Firebase(...);
var newref = ref.push({"blah":"blah"});
var autoKey = newref.name();
I tried to do the same thing through AngularFire with a bound model, but it just gives me errors about the object not having a push() method, similar to this question. He got it working when the data type was an array.
How do I get the nice behaviour I've seen in regular Firebase (non-AngularFire) with automatic keys for objects?
If you want to use an Object and have auto-generated keys, use the add method on an angularFireCollection. For example:
function ExampleController($scope, angularFireCollection) {
    var url = 'https://angularFireExample.firebaseio-demo.com/';
    $scope.examples = angularFireCollection(url);
    $scope.addExample = function(ex) {
        $scope.examples.add(ex);
    };
}
