This is the context:
I have a GCP Function that must go to Datastore to get some data and return an array to the client.
The Problem:
My GCP Function returns no data when I use Datetime filters in my code; however, when I run the equivalent query in the GCP Datastore Query console, it returns plenty of rows.
Technical data:
Datastore GQL:
select * from KIND where recordDate >= DATETIME ("2018-10-10T10:10:00.000000+03:00") and recordDate <= DATETIME ("2018-10-11T10:10:00.999999+03:00")
(It works on GCP Datastore console)
GCP Functions Code:
query = datastore.createQuery(kind)
    .filter('recordDate', '>=', dateFrom)
    .filter('recordDate', '<=', dateTo);
console.log(query);
datastore.runQuery(query, (err, entities) => {
    console.log(err);
    console.log(entities);
});
(runQuery() always returns null for the err variable and an empty array for the entities variable)
The help I need:
Can anybody show me an example of a query that successfully returns entities using Datetime filters?
Formats I tried for the dateFrom and dateTo variables:
DATETIME ("2018-10-10T10:10:00.000000+03:00")
DATETIME ("2018-10-10 10:10:00")
"2018-10-10T10:10:00.000000+03:00"
'2018-10-10T10:10:00.000000+03:00'
DATETIME ("2018-10-10")
"2018-10-10"
DATE ("2018-10-10")
DATE ('2018-10-10')
DATETIME (2018-10-10T10:10:00.000000+03:00)
And none of them works :(
UPDATE (2018-11-19):
I printed the query before calling runQuery and I got this:
(I replaced sensitive data with dots)
{
"textPayload": "Query {\n scope: \n Datastore {\n clients_: Map {},\n datastore: [Circular],\n namespace: undefined,\n projectId: '................',\n defaultBaseUrl_: 'datastore.googleapis.com',\n baseUrl_: 'datastore.googleapis.com',\n options: \n { libName: 'gccl',\n libVersion: '2.0.0',\n scopes: [Array],\n servicePath: 'datastore.googleapis.com',\n port: 443,\n projectId: 'c..........' },\n auth: \n GoogleAuth {\n checkIsGCE: undefined,\n jsonContent: null,\n cachedCredential: null,\n _cachedProjectId: 'c..........',\n keyFilename: undefined,\n scopes: [Array] } },\n namespace: null,\n kinds: [ '....KIND......' ],\n filters: \n [ { name: 'recordDate', op: '>', val: 2018-10-10T00:00:00.000Z },\n { name: 'recordDate', op: '<', val: 2018-10-12T23:59:59.000Z } ],\n orders: [],\n groupByVal: [],\n selectVal: [],\n startVal: null,\n endVal: null,\n limitVal: 20,\n offsetVal: -1 }",
"insertId": "............................098...",
"resource": {
"type": "cloud_function",
"labels": {
"region": "us-central1",
"function_name": "...................-get-search",
"project_id": "............."
}
},
"timestamp": "2018-11-19T21:19:46.737Z",
"severity": "INFO",
"labels": {
"execution_id": "792s.....lp"
},
"logName": "projects/......./logs/cloudfunctions.googleapis.com%2Fcloud-functions",
"trace": "projects/........../traces/4a457.......",
"receiveTimestamp": "2018-11-19T21:19:52.852569373Z"
}
And the Functions Code is:
query = datastore.createQuery(kind)
    .filter('recordDate', '>', new Date(dateFrom))
    .filter('recordDate', '<', new Date(dateTo))
    .limit(20);
console.log(query);
var test = datastore.runQuery(query, (err, entities) => {
    console.log(err);
    console.log(entities);
    entities.forEach(entity => {
        console.log(entity);
    });
    return {
        entities: entities,
        err: err
    };
});
console.log(test);
When using the client libraries to filter or sort query results on datetime properties, you should use the native datetime representation of the respective language, not strings or the GQL constructs.
In particular, for Node.js, which you apparently use, you should use Date objects. Here's an example from Restrictions on queries:
const query = datastore
.createQuery('Task')
.filter('created', '>', new Date('1990-01-01T00:00:00z'))
.filter('created', '<', new Date('2000-12-31T23:59:59z'));
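For completeness, here is a minimal sketch of how such a handler could look end to end, assuming the @google-cloud/datastore client and reusing the kind and property names from the question (the exported function name and the fixed dates are illustrative only):
// Minimal sketch; in @google-cloud/datastore 3.x+ the class is destructured,
// while 2.x (the version shown in your log) exports it directly:
// const Datastore = require('@google-cloud/datastore');
const { Datastore } = require('@google-cloud/datastore');
const datastore = new Datastore();

exports.searchByDate = async (req, res) => {
    // Date parses ISO 8601 strings with offsets, so the original "+03:00"
    // timestamps can be passed straight through.
    const dateFrom = new Date('2018-10-10T10:10:00.000+03:00');
    const dateTo = new Date('2018-10-11T10:10:00.999+03:00');

    const query = datastore
        .createQuery('KIND')
        .filter('recordDate', '>=', dateFrom)
        .filter('recordDate', '<=', dateTo)
        .limit(20);

    // Promise form: resolves with [entities, queryInfo].
    const [entities] = await datastore.runQuery(query);
    res.json(entities);
};
Note that in the update above, the value returned inside the runQuery callback is discarded, so test does not hold the query result; the promise form avoids that confusion.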
I am trying to make a cloud function.
Whenever I try to hit the endpoint I get a 500 Internal Server Error.
Postman Response Image here
I've checked the logs for the Firebase function and I don't see any useful information there either.
It just says "Function Crashed" without any further information.
I've also checked for typos and mismatches against the Firestore database structure, but it all looks fine to me.
This is the code for the Firebase function which I deployed to my Firebase project.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const { error } = require('firebase-functions/lib/logger');
admin.initializeApp(functions.config().firebase);
exports.addEvent = functions.region('asia-east2').https.onRequest(async (req, res) => {
    if (req.method === 'POST') {
        var db = admin.firestore();
        var write = db.collection("Colleges")
            .doc(req.body.college)
            .collection("events")
            .doc(req.body.event.id)
            .set({
                id: req.body.event.id,
                admin: req.body.event.admin,
                event_state: req.body.event.event_state,
                name: req.body.event.name,
                poster_url: req.body.event.poster_url,
                start_date_time: req.body.event.start,
                end_date_time: req.body.event.end,
                location: req.body.event.location,
                short_desc: req.body.event.shortDesc,
                long_desc: req.body.event.longDesc,
                contacts: req.body.event.contacts,
                links: req.body.event.links,
            });
        return res.send(write);
    }
    else
        return res.sendStatus(403);
});
This is the body of the POST request which I sent from Postman:
{
"college": "college_name",
"event": {
"id": 1234,
"admin": "admin",
"event_state": 2,
"name": "Event Name",
"poster_url": "test",
"start": "Date Time",
"end": "Date Time",
"location": "auditorium",
"shortDesc": "lorem ipsum short",
"longDesc": "lorem ipsum long",
"contatcs": [
{
"tag": "Name Tag",
"contact": 12345678
}
],
"links": [
{
"tag": "Link Tag",
"link": 123456784
}
]
}
}
The Firestore structure is something like:
-Colleges (Collection)
|
|
-Document
|
-events(Collection)
|
-Event Documents (Document which i want to write to ,from the firebase function)
The problem is that the event id in your payload is a number, and Firestore document IDs must be strings. So either use .doc(req.body.event.id.toString()) or send your event id as a string in your payload: id: "1234".
Also, consider refactoring your code following Firebase guidelines to handle the POST method.
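For illustration, a minimal sketch of such a refactor (same request body shape as in the question; the try/catch and the no-argument admin.initializeApp() are additions of mine, not part of the original code):
const functions = require('firebase-functions');
const admin = require('firebase-admin');

admin.initializeApp();

exports.addEvent = functions.region('asia-east2').https.onRequest(async (req, res) => {
    if (req.method !== 'POST') {
        return res.sendStatus(403);
    }
    try {
        const event = req.body.event;
        await admin.firestore()
            .collection('Colleges')
            .doc(req.body.college)
            .collection('events')
            // Document IDs must be strings, so convert the numeric id.
            .doc(event.id.toString())
            .set({
                id: event.id,
                admin: event.admin,
                event_state: event.event_state,
                name: event.name,
                poster_url: event.poster_url,
                start_date_time: event.start,
                end_date_time: event.end,
                location: event.location,
                short_desc: event.shortDesc,
                long_desc: event.longDesc,
                contacts: event.contacts,
                links: event.links,
            });
        return res.status(200).send({ id: event.id });
    } catch (err) {
        console.error(err);
        return res.status(500).send({ error: err.message });
    }
});
Awaiting the write before responding also ensures the function does not terminate before the Firestore call completes.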
So, I am trying to filter objects inside an array using DynamoDB.
This is my sample object
client: {
    "name": "etc",
    "subscriptions": [
        {
            "status": "canceled",
            ... other fields
        },
        {
            "status": "active",
            ... other fields
        }
    ]
}
I am using filter expressions and the Dynamoose scan method. What I want to achieve is for the scan to bring back all items that have a subscription with the "canceled" status. Is this possible with DynamoDB and this kind of object?
var filter = {
    FilterExpression: "#subscriptions.#status = :statusValue",
    ExpressionAttributeNames: {
        "#subscriptions": "subscriptions",
        "#status": "status"
    },
    ExpressionAttributeValues: {
        ":statusValue": "canceled"
    }
};
dynamooseEntity.scan(filter).exec();
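For reference, a minimal sketch of how such a filter could be passed to the plain AWS SDK v2 DocumentClient (the table name is hypothetical). Note that a DynamoDB filter expression can only address a list element by a fixed index, e.g. subscriptions[0].#status; there is no wildcard path that matches "any element" of a list, so this sketch only matches items whose first subscription is canceled:
// Sketch only: the table name is hypothetical; #status is needed because
// "status" is a DynamoDB reserved word.
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

const params = {
    TableName: 'clients',
    FilterExpression: 'subscriptions[0].#status = :statusValue',
    ExpressionAttributeNames: { '#status': 'status' },
    ExpressionAttributeValues: { ':statusValue': 'canceled' }
};

docClient.scan(params).promise()
    .then(data => console.log(data.Items));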
I am learning to write CosmosDB stored procedures following this documentation:
Stored procedure docs
What I am trying to do is loop through a number of documents returned by a query and find the one that matches most exactly.
The flow is as follows:
1. Check StartDate and EndDate to make sure it is a valid document
1.5. Check that the input VariantNo is included in the Variants array of the document
2. Check if the user is included in the Users array of the document OR if "ALL" is specified as a string in the array
3. Check if the store is included in the Stores array OR if "ALL" is specified as a string in the Stores array
The document looks as follows
{
"id": "12345",
"brand": "XXX",
"PromotionName": "Test Promo 1",
"PromotionType": "Deal",
"PromotionSticker": "Sticker 1",
"StartDate": "2020-05-14T00:00:00.1212122Z",
"EndDate": "2020-05-30T00:00:00.1212122Z",
"Variants": [
"0628462008001",
"0628462008002",
"0644324003002"
],
"Stores": [
"SE0623"
],
"Users": [
"ALL"
],
"DiscountInPercent": "30",
"RedPriceStores": null,
"CreatedDate": "20200515",
"CreatedBy": "SLAPI Promotions API ClientId: 123",
"UpdatedDate": null,
"UpdatedBy": null,
"Consumer": "YYYYY_V2",
"_rid": "HwVmAIFaOoEBAAAAAAAAAA==",
"_self": "dbs/HwVmAA==/colls/HwVmAIFaOoE=/docs/HwVmAIFaOoEBAAAAAAAAAA==/",
"_etag": "\"11005859-0000-0c00-0000-5ebe0f7e0000\"",
"_attachments": "attachments/",
"_ts": 1589514110
}
The beginning of my stored procedure looks like this, based on the template in CosmosDB:
// SAMPLE STORED PROCEDURE
function getFinalPromotionPrice(item, store, user) {
    var collection = getContext().getCollection();
    // Query documents and take 1st item.
    var isAccepted = collection.queryDocuments(
        collection.getSelfLink(),
        'SELECT * FROM c WHERE c.StartDate <= (SELECT VALUE GetCurrentDateTime()) AND c.EndDate >= (SELECT VALUE GetCurrentDateTime())',
        function (err, feed, options) {
            if (err) throw err;
            // Check the feed and if empty, set the body to 'no docs found',
            // else take 1st element from feed
            if (!feed || !feed.length) {
                var response = getContext().getResponse();
                response.setBody('no docs found');
            }
            else {
                var response = getContext().getResponse();
                var body = { prefix: prefix, feed: feed[0] };
                response.setBody(JSON.stringify(body));
            }
        });
    if (!isAccepted) throw new Error('The query was not accepted by the server.');
}
but I am getting this error when executing the stored procedure:
{"code":400,"body":{"code":"BadRequest","message":"Message: {\"Errors\":[\"Encountered exception while executing function. Exception = ReferenceError: 'prefix' is not defined\r\nStack trace: ReferenceError: 'prefix' is not defined\n at Anonymous function (script.js:20:13)\n at A
As you can see from the error, it is complaining that "prefix" is not defined:
Exception = ReferenceError: 'prefix' is not defined
In the line below you are setting prefix as a property of the response body, but you have not declared prefix anywhere in the code.
var body = { prefix: prefix, feed: feed[0] };
Change the above line to this if you don't require prefix in your SP body:
var body = { feed: feed[0] };
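For clarity, a minimal sketch of the stored procedure with that change applied (the query and control flow are unchanged from the question):
function getFinalPromotionPrice(item, store, user) {
    var collection = getContext().getCollection();
    var isAccepted = collection.queryDocuments(
        collection.getSelfLink(),
        'SELECT * FROM c WHERE c.StartDate <= (SELECT VALUE GetCurrentDateTime()) AND c.EndDate >= (SELECT VALUE GetCurrentDateTime())',
        function (err, feed, options) {
            if (err) throw err;
            var response = getContext().getResponse();
            if (!feed || !feed.length) {
                response.setBody('no docs found');
            } else {
                // prefix removed; return only the first matching document
                response.setBody(JSON.stringify({ feed: feed[0] }));
            }
        });
    if (!isAccepted) throw new Error('The query was not accepted by the server.');
}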
I am able to fetch the records from DynamoDB and view the response successfully. I need to modify the fetched ACCOUNTNAME attribute in the Items array, update the JSON, and also update the records in DynamoDB. When I try to update the fetched records I end up with the "Invalid attribute value type" exception.
I tried to update using a key with an array of strings, as shown in the code snippet, and also tried to update inside a for loop using the individual strings, but both attempts failed with the same exception:
"statusCode": 400,
"body": {
"message": "Invalid attribute value type",
"error": {
"errorMessage": "ValidationException"
}
}
I tried to create the params and make the update call inside the for loop, setting the key as below,
Key: {
"UUID": {
"S": usersOfAccountFromDB.body.Items[key].UUID
}
,
"TYPE": {
"S": user
}
}
but also failed with the same exception.
Fetched JSON from DynamoDB:
[
{
"DEFINITION": "914ba44a-8c26-4b60-af0f-96b6aa37efe6",
"UUID": "830a49cb-4ed3-41ae-b111-56714a71ab98",
"TYPE": "USER",
"RELATION": "01efd131-6a5d-4068-889e-9dba44262da5",
"ACCOUNTNAME": "Wolff LLC"
},
{
"DEFINITION": "1f60fded-323d-40e1-a7f8-e2d053b0bed0",
"UUID": "47db3bbe-53ac-4e58-a378-f42331141997",
"TYPE": "USER",
"RELATION": "01efd131-6a5d-4068-889e-9dba44262da5",
"ACCOUNTNAME": "Wolff LLC"
},
{
"DEFINITION": "05ddccba-2b6d-46bd-9db4-7b897ebe16ca",
"UUID": "e7290457-db77-48fc-bd1a-7056bfce8fab",
"TYPE": "USER",
"RELATION": "01efd131-6a5d-4068-889e-9dba44262da5",
"ACCOUNTNAME": "Wolff LLC"
},
.
.
.
.]
Then I tried to iterate over the JSON and set up UUID (which is the key) as a string array as below:
var userUUIDArray: string[] = [];
for (let key in usersOfAccountFromDB.body.Items) {
    userUUIDArray.push(usersOfAccountFromDB.body.Items[key].UUID);
}
for (var uuid of userUUIDArray) {
    console.log("UUID : " + uuid); // prints all the UUIDs
}
// Creating the params for the DynamoDB update
var params = {
    TableName: <tableName>,
    Key: {
        "UUID": {
            "SS": userUUIDArray
        },
        "TYPE": {
            "S": user
        }
    },
    UpdateExpression: 'SET #ACCOUNTNAME = :val1',
    ExpressionAttributeNames: {
        '#ACCOUNTNAME': 'ACCOUNTNAME' // COLUMN NAME
    },
    ExpressionAttributeValues: {
        ':val1': newAccountName
    },
    ReturnValues: 'UPDATED_NEW',
};
// call the DynamoDB update
const result = await this.getDocClient().update(params).promise();
I get the error as below,
"body": {
"message": "Invalid attribute value type",
"error": {
"errorMessage": "ValidationException"
}
}
All the approaches failed with the same exception as above.
The update operation which your code currently uses only allows a single item to be updated.
IIUC, you want to update multiple items with one API call. For this you need to use the batchWrite operation. Keep in mind that you cannot update more than 25 items per invocation.
The origin of the error you are getting:
Your code fails due to the use of "SS" for the UUID field. This field is of type string, so you must use "S". Note however that since you're using the document client API, you do not need to pass values using this notation. See this answer for further details.
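A minimal sketch of the batchWrite route, assuming the AWS SDK v2 DocumentClient: batchWrite only accepts put and delete requests, so "updating" this way means writing back the whole item with the modified ACCOUNTNAME.
// Sketch only: fetchedItems stands for usersOfAccountFromDB.body.Items from
// the question; tableName and docClient are assumed to exist already.
const putRequests = fetchedItems.map(item => ({
    PutRequest: {
        // Write back the full item with the new account name.
        Item: { ...item, ACCOUNTNAME: newAccountName }
    }
}));

// At most 25 requests per call, so chunk larger sets.
await docClient.batchWrite({
    RequestItems: {
        [tableName]: putRequests.slice(0, 25)
    }
}).promise();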
I have resolved the issue by running the update statements one by one in a loop:
for (let key in usersOfAccountFromDB.body.Items) {
    var updateParam = {
        TableName: process.env.AWS_DYNAMO_TABLE,
        Key: {
            UUID: usersOfAccountFromDB.body.Items[key].UUID,
            TYPE: user
        },
        UpdateExpression: "SET #ACCOUNTNAME = :val1",
        ExpressionAttributeNames: {
            '#ACCOUNTNAME': 'ACCOUNTNAME'
        },
        ExpressionAttributeValues: {
            ":val1": newAccountName
        },
        ReturnValues: "UPDATED_NEW",
    };
    const result = await this.getDocClient().update(updateParam).promise();
}
I have a CosmosDB setup using the Mongo API. I have a collection with a hashed shard key on one of the fields of the document. When I run commands like db.collection.remove or db.collection.deleteMany I get the following error:
Command deleteMany failed: query in command must target a single shard key.: {"message":"Command deleteMany failed: query in command must target a single shard key."}
I'm not sure how I can specify a shard key as part of the query, considering I want the query to run across all the shards.
You need to provide the shard key when you want to run commands like db.collection.remove or db.collection.deleteMany.
For example:
My data source is as below:
[
{
"id" : "2",
"name" : "b"
},
{
"id" : "1",
"name" : "a"
}
]
And my shard key is "/name". Use db.coll.deleteMany({"name":"a"}) to delete from a specific shard.
Hope it helps you.
It should be the shard key which you chose when you created the CosmosDB collection.
FilterDefinition<Product> filter = Builders<Product>.Filter.Eq("id", 2);
// 2 is my shard key
await this._dbContext.GetProducts.DeleteOneAsync(filter);
return RedirectToAction("Index");
Kindly refer to the image below to see how this looks in CosmosDB.
The shard key (partition key) has to be provided when specifying the schema model in the code. Once it's provided, we can perform regular operations like save, update and delete as usual.
Example:
const mySchema = new Schema({
requestId: { type: String, required: true },
data: String,
documents: [{ docId: String, name: String, attachedBy: String }],
updatedBy: {
type: {
name: { type: String, required: true },
email: { type: String, required: true },
}, required: true
},
createdDate: { type: Date, required: true },
updatedDate: { type: Date },
}, { shardKey: { requestId: 1 } }
);
In the above code we specified requestId as the shard key; now we can perform any Mongo operations.
Example:
let request:any = await myModel.findById(requestId);
request.data ="New Data";
await request.save();
Hope that helps.
This works with all Mongo operations.
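Tying this back to the original deleteMany question: because requestId is the shard key in the schema above, including it in the filter satisfies the "must target a single shard key" requirement. A minimal sketch using the same hypothetical model:
// Sketch only: myModel is the Mongoose model defined above with
// { shardKey: { requestId: 1 } }; the requestId value is illustrative.
await myModel.deleteMany({ requestId: "some-request-id" });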