I need to find all document IDs with name: "Nick", so I can update the score.
I have the following Firestore Database structure:
{
  "c4ca4238a0b923820dcc509a6f75849b": {
    title: "Tiger",
    user: {
      name: "Chris",
      score: 1
    }
  },
  "c81e728d9d4c2f636f067f89cc14862c": {
    title: "Lion",
    user: {
      name: "Nick",
      score: 2
    }
  }
}
From the documentation I see that I can find documents using where like this:
const dbQuery = query(collection(dbFirestore, "posts"), where("title", "==", "Tiger"));
const dbSnapshot = await getDocs(dbQuery);
dbSnapshot.forEach((result) => {
  console.log(result.id, " => ", result.data());
});
But I don't see anything about how to search by a nested field such as user.name.
Firestore is a NoSQL database, and querying it works much like querying other NoSQL document databases. To specify a query condition on fields in an embedded/nested document, use dot notation ("field.nestedField"). When querying with dot notation, the field path must be inside quotation marks.
You can get all the matching documents with a query like this:
const dbQuery = query(collection(db, "posts"), where("user.name", "==", "Nick"));
const dbSnapshot = await getDocs(dbQuery);
dbSnapshot.forEach((result) => {
  console.log(result.id, " => ", result.data());
});
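The same dot notation also works as an update field path, so the matching documents' scores can be changed once they are found. A minimal sketch (the +1 increment is an assumption about the intended update):

```javascript
// An update object keyed by a dot path changes only that nested field:
// { "user.score": value } updates user.score but leaves user.name intact,
// whereas { user: { score: value } } would overwrite the whole user map.
function scoreUpdate(newScore) {
  return { "user.score": newScore };
}

// Sketch with the modular SDK (not run here):
// import { doc, updateDoc } from "firebase/firestore";
// for (const result of dbSnapshot.docs) {
//   await updateDoc(doc(dbFirestore, "posts", result.id),
//     scoreUpdate(result.data().user.score + 1));
// }
```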
I'd like to perform a Firestore query to return an ordered list of blogs where the blog has an array that contains all elements of a query array.
An example is blog data such as this:
{
  "cities": ["Tokyo", "Kyoto", "Osaka", "Nara"],
  "modified": 1645457445
}
{
  "cities": ["Prague", "Bratislava", "Budapest"],
  "modified": 1645450245
}
{
  "cities": ["Hiroshima", "Kyoto", "Tokyo"],
  "modified": 1645453845
}
I want to get a page of 20 blogs that contains both "Tokyo" and "Kyoto", ordered by the most recently modified (descending).
I have seen the following questions:
Firestore query - array contains all
How to perform compound queries with logical AND on array in Cloud Firestore?
Applying the above suggestions, the blog data looks like this:
{
  "cities": {
    "Hiroshima": true,
    "Kyoto": true,
    "Tokyo": true
  },
  "modified": 1645453845
}
And using the query
var db = admin.firestore();
var query = db.collection("blogs");
data.cities.forEach(tag => {
  query = query.where("cities." + tag, "==", true);
});
query = query.orderBy("modified", "desc");
if (typeof data.modified !== "undefined") {
  query = query.startAfter(data.modified);
}
return query.limit(20).get();
Firestore raises an error and asks me to create an index such as:
cities.Tokyo Ascending modified Descending
The suggested index is not suitable, as blogs can contain any city in the world, so it isn't feasible to create an index for every city.
How can I achieve this query in Firestore?
Current Code
// items is an array:
// Array [
//   Object {
//     "id": "KQJfb2RkT",
//     "name": "first",
//   },
//   Object {
//     "id": "1mvshyh9H",
//     "name": "second",
//   },
// ]
storeSale = async ({ items }) => {
  this.salesCollection.add({
    status: 1,
    created_at: new Date(),
    updated_at: new Date(),
  });
};
When adding a document to salesCollection, I want to add the items as a subcollection of that document.
I would appreciate any advice you could give me.
I would like to save it like this:
(screenshot of the desired Firestore structure omitted)
You can use a batched write, as follows:
// Get a new write batch
let batch = db.batch();

// Set the value of the parent doc
const parentDocRef = db.collection("parentColl").doc();
batch.set(parentDocRef, {
  status: 1,
  created_at: new Date(),
  updated_at: new Date(),
});

// Set the value of a sub-collection doc
const parentDocId = parentDocRef.id;
const subCollectionDocRef = db.collection("parentColl").doc(parentDocId).collection("subColl").doc();
batch.set(subCollectionDocRef, {
  ...
});

// Commit the batch
await batch.commit();
One key point to note: technically, a parent collection and the sub-collections of its documents are not related to each other at all.
Let's take an example: Imagine a doc1 document under the col1 collection
col1/doc1/
and another one subDoc1 under the subCol1 (sub-)collection
col1/doc1/subCol1/subDoc1
These two documents (and their immediate parent collections, i.e. col1 and subCol1) merely share part of their path, nothing else.
One side effect of this is that if you delete a document, its sub-collection(s) still exist.
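Applied to the items array from the earlier question, the pattern could look like the sketch below. The data construction is plain JavaScript (the collection names "sales" and "items" are assumptions, not given in the question); only the commented lines touch Firestore:

```javascript
// Build the parent sale document plus one subcollection payload per item
// (pure data construction, mirroring the fields in the question).
function buildSaleWrites(items, now) {
  const sale = { status: 1, created_at: now, updated_at: now };
  const itemDocs = items.map((item) => ({ id: item.id, name: item.name }));
  return { sale, itemDocs };
}

// With a batched write (sketch):
// const batch = db.batch();
// const saleRef = db.collection("sales").doc();
// const { sale, itemDocs } = buildSaleWrites(items, new Date());
// batch.set(saleRef, sale);
// itemDocs.forEach((data) => batch.set(saleRef.collection("items").doc(), data));
// await batch.commit();
```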
I'm trying to set up a collection of versioned documents in which I insert a new document with the same id and a timestamp whenever there's an edit operation. I use a unique compound index for this on the id and timestamp fields. CosmosDB is giving me MongoError: E11000 duplicate key error whenever I try to insert a document with a different id but an identical timestamp to another document. The MongoDB documentation says that I should be able to do this:
https://docs.mongodb.com/v3.4/core/index-unique/#unique-compound-index
You can also enforce a unique constraint on compound indexes. If you use the unique constraint on a compound index, then MongoDB will enforce uniqueness on the combination of the index key values.
I tried using a non-unique index but the Resource Manager template failed, saying that non-unique compound indexes are not supported. I'm using the node.js native driver v3.2.4. I also tried to use Azure Portal to insert documents but received the same error. This makes me believe it's not a problem between CosmosDB and the node.js driver.
Here's a small example to demonstrate the problem. I'm running it with Node v10.15.3.
const { MongoClient } = require('mongodb');

const mongoUrl = process.env.COSMOSDB_CONNECTION_STRING;
const collectionName = 'indextest';

const client = new MongoClient(mongoUrl, { useNewUrlParser: true });
let connection;

const testIndex = async () => {
  const now = Date.now();
  connection = await client.connect();
  const db = connection.db('master');
  await db.collection(collectionName).drop();
  const collection = await db.createCollection(collectionName);
  await collection.createIndex({ id: 1, ts: -1 }, { unique: true });
  await collection.insertOne({ id: 1, ts: now, title: 'My first document' });
  await collection.insertOne({ id: 2, ts: now, title: 'My other document' });
};

(async () => {
  try {
    await testIndex();
    console.log('It works');
  } catch (err) {
    console.error(err);
  } finally {
    await connection.close();
  }
})();
I would expect the two insert operations to work and for the program to exit with It works. What I get instead is an Error:
{ MongoError: E11000 duplicate key error collection: master.indextest Failed _id or unique key constraint
at Function.create (/home/node/node_modules/mongodb-core/lib/error.js:43:12)
at toError (/home/node/node_modules/mongodb/lib/utils.js:149:22)
at coll.s.topology.insert (/home/node/node_modules/mongodb/lib/operations/collection_ops.js:859:39)
at handler (/home/node/node_modules/mongodb-core/lib/topologies/replset.js:1155:22)
at /home/node/node_modules/mongodb-core/lib/connection/pool.js:397:18
at process._tickCallback (internal/process/next_tick.js:61:11)
driver: true,
name: 'MongoError',
index: 0,
code: 11000,
errmsg:
'E11000 duplicate key error collection: master.indextest Failed _id or unique key constraint',
[Symbol(mongoErrorContextSymbol)]: {} }
Is this expected behavior or a bug in CosmosDB's MongoDB API?
I have a Firestore document representing a day with a subcollection containing reservations for this day in a Firestore database.
Here is JSON example of my data structure:
{
  "Day": {
    "ReservationsCount": 2,
    "Reservations": [
      { "Order": 1 },
      { "Order": 2 }
    ]
  }
}
I need to add a set of documents, set their ordinal numbers in the collection, and update the ReservationsCount, all in one transaction.
I tried to use Firestore transactions and batched writes, but as far as I understand, they do not support adding a document to a collection within a transaction (according to the documentation, only a combination of set(), update(), and delete() operations).
I tried updating the values using Cloud Functions, but they are in beta and have known issues with performance and reliability, so I sometimes got wrong results.
Is there any way to update existing document and add documents to its subcollection within one transaction?
The following should do the trick. You have to pass to the updateRes() function the ref of the 'Day' doc, the ref of the sub-collection, and an array containing an object for each document to add to the sub-collection.
Just open the HTML file in a browser.
<!DOCTYPE html>
<html lang="en">
<head>
  <script src="https://www.gstatic.com/firebasejs/5.0.4/firebase-app.js"></script>
  <script src="https://www.gstatic.com/firebasejs/5.0.4/firebase-firestore.js"></script>
</head>
<body>
  <script>
    var config = {
      apiKey: "...",
      authDomain: "...",
      databaseURL: "...",
      ....
    };
    firebase.initializeApp(config);
    var firestoredb = firebase.firestore();

    function updateRes(dayDocRef, orderCollectionRef, refAndDataArray) {
      return firestoredb.runTransaction(function (transaction) {
        return transaction.get(dayDocRef)
          .then(function (dayDoc) {
            if (!dayDoc.exists) {
              throw "Document Day does not exist!";
            }
            var newResCount = dayDoc.data().ReservationsCount + refAndDataArray.length;
            return transaction.update(dayDocRef, { ReservationsCount: newResCount });
          })
          .then(function () {
            var t = transaction;
            refAndDataArray.forEach(function (element) {
              t = t.set(orderCollectionRef.doc(element.ref), element.data);
            });
            return t;
          });
      }).then(function () {
        console.log("Transaction successfully committed!");
      }).catch(function (error) {
        console.log("Transaction failed: ", error);
      });
    }

    var dayDocRef = firestoredb.collection("Days").doc("Day");
    var orderCollectionRef = dayDocRef.collection("Reservations"); // The sub-collection is called "Reservations"
    var refAndDataArray = [
      { ref: "3", data: { Order: 3, otherData: "foo" } },
      { ref: "4", data: { Order: 4, otherData: "bar" } }
    ];
    updateRes(dayDocRef, orderCollectionRef, refAndDataArray);
  </script>
</body>
</html>
I am trying to filter a list of maps from a DynamoDB table which is of the following format.
{
  id: "Number",
  users: [
    { userEmail: "abc@gmail.com", age: "23" },
    { userEmail: "de@gmail.com", age: "41" }
  ]
}
I need to get the data of the user with userEmail "abc@gmail.com". Currently I am doing it with the following DynamoDB query. Is there any other, more efficient way to solve this?
var params = {
  TableName: 'users',
  Key: {
    'id': id
  }
};

var docClient = new AWS.DynamoDB.DocumentClient();
docClient.get(params, function (err, data) {
  if (!err) {
    const users = data.Item.users;
    const user = users.filter(function (user) {
      return user.userEmail == userEmail;
    });
    // user has the required user in it
  }
});
The only way to get a single item from DynamoDB directly by id is through a table's partition key. So you would need a table that looks like:
Email (string) - partition key
Id (some-type) - user id
...other relevant user data
Unfortunately, since a nested field cannot be a partition key, you will have to maintain a separate table here and won't be able to use an index in DynamoDB (neither an LSI nor a GSI).
It's a common pattern in NoSQL to duplicate data, so there is nothing unusual about it. If you were using Java, you could use the transactions library to ensure that both tables stay in sync.
If you are not going to use Java, you could read the DynamoDB stream of the original table (where emails are nested fields) and update the new table (where emails are partition keys) whenever the original table is updated.
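As a sketch of that duplicate-table pattern (the table name "usersByEmail" and its key "userEmail" are assumed names, not part of the question), the lookup then becomes a single GetItem instead of a get plus a client-side filter:

```javascript
// Build GetItem parameters for a lookup table whose partition key
// is the user's email address ("usersByEmail"/"userEmail" are assumptions).
function buildEmailLookupParams(userEmail) {
  return {
    TableName: "usersByEmail",
    Key: { userEmail: userEmail },
  };
}

// Usage with the DocumentClient (sketch):
// var docClient = new AWS.DynamoDB.DocumentClient();
// docClient.get(buildEmailLookupParams("abc@gmail.com"), function (err, data) {
//   // data.Item is the single matching user; no client-side filtering needed
// });
```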