Here are multiple offer documents, and each offer contains a bidderId that references a user document in the users collection.
I want to fetch the offer list together with the referenced user data.
I am using AngularFire, and here is my code:
this.liveOffers = this.db.collection("offers", ref => ref.where('offerExpired', '==', 0).where('isStart', '==', 1)).snapshotChanges().pipe(
  map(actions => actions.map(a => {
    const data = a.payload.doc.data() as offer;
    const id = a.payload.doc.id;
    var bidder = this.db.doc(data.bidderId).snapshotChanges().subscribe(key => {
      console.log(key.payload.data());
    });
    return { id, ...data, bidder };
  }))
);
Here console.log(key.payload.data()); logs the user data, but it does not get bound to the bidder variable, so I cannot use the user object on the front end.
Please let me know how I can fetch each offer record along with its user details.
You need to use a combination of switchMap and combineLatest to get it done.
This is a pseudo-code approach
const temp = [];
this.offers$ = this.db.collection('offers').snapshotChanges().pipe(
  map(actions => {
    // we reset temp, save all offers in it, and return just the bidderIds
    temp.length = 0;
    return actions.map(a => {
      const data = a.payload.doc.data() as offer;
      const id = a.payload.doc.id;
      temp.push({ id, ...data });
      return data.bidderId;
    });
  }),
  switchMap(bidderIds => {
    // here you'll have all bidderIds and you need to return an array of
    // observables to query them from firebase
    return combineLatest(bidderIds.map(bidderId => this.db.doc(bidderId).valueChanges()));
  }),
  map(bidders => {
    // here you'll get all bidders; set the bidder on each temp obj
    // you saved previously
    return temp.map((offer, i) => ({ ...offer, bidder: bidders[i] }));
  })
);
Make sure you import { combineLatest } from 'rxjs' (the creation function), not the deprecated operator from 'rxjs/operators', since it is called here as a standalone function rather than inside a pipe.
I found a way that works, but it is a bit long and there may be a way to optimize it. Also, this is for the Node.js server API, not for the web (JS), though there may be a similar solution for the web.
Once you get the data from the snapshot object, the reference field in the returned data has a _path key, which in turn has a segments array containing the collection name and the document ID.
// Fetch the document whose field `r` is a reference to another document
const gg = await firestore.collection('scrape').doc('x6F4nctCD').get();
console.log(JSON.stringify(gg.data(), null, 4));
// The reference's _path.segments array holds [collectionName, documentId]
console.log(gg.data().r._path.segments[0]);
console.log(gg.data().r._path.segments[1]);
// Use those segments to fetch the referenced document
const gg2 = await firestore
  .collection(gg.data().r._path.segments[0])
  .doc(gg.data().r._path.segments[1])
  .get();
console.log(gg2.data());
The logged output:
{
    "fv": {
        "_seconds": 1578489994,
        "_nanoseconds": 497000000
    },
    "tsnowpo": 1578489992,
    "createdAt": {
        "_seconds": 1578489992,
        "_nanoseconds": 328000000
    },
    "r": {
        "_firestore": {
            "_settings": {
                "libName": "gccl",
                "libVersion": "3.1.0",
                "servicePath": "firestore.googleapis.com",
                "port": 443,
                "clientConfig": {},
                "scopes": [
                    "https://www.googleapis.com/auth/cloud-platform",
                    "https://www.googleapis.com/auth/datastore"
                ]
            },
            "_settingsFrozen": true,
            "_serializer": {},
            "_projectId": "sss",
            "_lastSuccessfulRequest": 1578511337407,
            "_preferTransactions": false,
            "_clientPool": {
                "concurrentOperationLimit": 100,
                "activeClients": {},
                "terminated": false
            }
        },
        "_path": {
            "segments": [
                "egpo",
                "TJTHMkxOx1C"
            ]
        }
    }
}
egpo
TJTHMkxOx1C
{ name: 'Homeware' }
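As an aside (an assumption based on the Admin SDK's public surface, not something the original answer relies on): the reference field r is deserialized as a DocumentReference, which exposes public path and id properties and a get() method, so the same lookup can avoid the private _path internals. A minimal sketch:
const ref = gg.data().r;      // a DocumentReference
const gg2 = await ref.get();  // fetch the referenced document directly
// or equivalently: await firestore.doc(ref.path).get();
console.log(gg2.data());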
In my app, users create posts, and I'd like to show trending posts by the number of views, comments, etc. in a specific date range. To do that, I thought I could create a custom event as below:
await FirebaseAnalytics.instance.logEvent(
  name: "trending_contents",
  parameters: {
    "content_type": EnumToString.convertToString(type),
    "content_id": contentModel.externalId,
    "action_type": "post",
    "point": 3,
  },
);
I wonder if it is possible to use the Google Analytics Data API to get trending posts in a specific date range? Or is there a better way to get trending posts than the Google Analytics Data API?
I finally found a solution using the Google Analytics Data API to manage trending content. If anyone is looking for a solution for a similar need, here is what I've done so far:
I send a custom event in specific situations, such as when the user views the content, as below. If you name the parameters according to the predefined dimensions & metrics (see API Dimensions & Metrics), it will be easy to prepare a custom report (at least it was for me...). Later, I use contentType and contentId as dimensions and eventValue as a metric in the custom report.
await FirebaseAnalytics.instance.logEvent(
  name: "trending_contents",
  parameters: {
    "content_type": EnumToString.convertToString(event.type),
    "content_id": contentId,
    "action_type": "view",
    "value": 1,
  },
);
Lastly, I created a scheduled Cloud Function that runs every 6 hours and populates a Firestore collection according to the custom report results. The report returns contentIds in a specific date range, ordered by the sum of the values sent in the custom event.
P.S. You need to create a service account in the Google Cloud Console, generate JSON credentials for it, and add the file to your project (see the credentialsJsonPath variable below). Then you need to add its email address to the Google Analytics 'Property Access Management' section so it can access analytics data. To see Google Analytics Data API samples, you can check their GitHub repo.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const { BetaAnalyticsDataClient } = require('@google-analytics/data');

exports.scheduledTrendingFunction = functions.pubsub.schedule('0 */6 * * *').onRun(async (context) => {
  const propertyId = process.env.GA_PROPERTY_ID;
  const credentialsJsonPath = process.env.GA_CRENDENTIALS_PATH;
  const analyticsDataClient = new BetaAnalyticsDataClient({
    keyFilename: credentialsJsonPath,
  });

  async function runReport(filterType) {
    // [START analyticsdata_json_credentials_run_report]
    const [response] = await analyticsDataClient.runReport({
      property: `properties/${propertyId}`,
      dateRanges: [
        {
          startDate: '3daysAgo',
          endDate: 'today',
        },
      ],
      dimensions: [
        {
          name: 'contentType',
        },
        {
          name: 'contentId',
        },
      ],
      metrics: [
        {
          name: 'eventValue',
        },
      ],
      dimensionFilter: {
        andGroup: {
          expressions: [
            {
              filter: {
                fieldName: 'eventName',
                inListFilter: {
                  values: ['trending_contents'],
                },
              },
            },
            {
              filter: {
                fieldName: 'contentType',
                inListFilter: {
                  values: [filterType],
                },
              },
            },
          ],
        },
      },
      offset: 0,
      limit: 20,
      orderBys: [
        {
          desc: true,
          metric: {
            metricName: 'eventValue',
          },
        },
      ],
    });
    // [END analyticsdata_json_credentials_run_report]

    const batch = admin.firestore().batch();
    // BATCH: delete the old trend document for this content type
    const trendRef = admin.firestore().collection('trends').doc(filterType);
    batch.delete(trendRef);
    const subTrendRef = admin.firestore().collection('trends').doc(filterType).collection('trendContents');
    // console.log(response);
    response.rows.forEach((row, index) => {
      // BATCH: add each contentId to the trend, ranked by report order
      const contentId = row.dimensionValues[1].value;
      batch.set(subTrendRef.doc(contentId), { priority: index + 1 });
    });
    // Commit the batch
    await batch.commit();
  }

  // Await the report so the function is not terminated before it finishes
  await runReport('book');
  return null;
});
I'm looking for a way to get statistics (such as execution time) for the query and then attach them to a JSON object along with the actual retrieved data, so I can send it to the client side.
Sorry if this is a silly question, but I tried searching the documentation and googled around; I guess it's either not possible or I'm using the wrong keywords.
In case it's relevant, here's the code:
var AWS = require('aws-sdk');

AWS.config.update({
  region: 'us-west-2'
});

const docClient = new AWS.DynamoDB.DocumentClient();

let getLoginsByRole = function (a_role, a_site) {
  const params = {
    TableName: 'XXXXXXXXXXX',
    FilterExpression: '#Role = :Role AND #Site = :Site',
    ExpressionAttributeNames: {
      '#Role': 'Role',
      '#Site': 'Site'
    },
    ExpressionAttributeValues: {
      ':Role': a_role,
      ':Site': a_site
    },
  };
  docClient.scan(params, function (err, data) {
    if (err) {
      console.log("Error when attempting table scan, see below:\n\n" + JSON.stringify(err, null, 2));
      return err;
    } else {
      var matchingItems = [];
      data.Items.forEach(element => matchingItems.push(element.alias));
      // Build the response object directly instead of round-tripping through JSON.parse
      var responseObject = {
        role: a_role,
        matchingItems: matchingItems,
        itemCount: data.Count
      };
      console.log(responseObject);
      return responseObject;
    }
  });
};

getLoginsByRole("XXXXX", "XXX");
getLoginsByRole("XXXXX", "XXX");
As you can see, there's a responseObject that looks like this (I added comments next to the stats I'd like to see):
{
  role: 'Admin',
  matchingItems: [
    'billy',
    'jim',
    'pam',
    'ryan',
    'kerry',
    'karen'
  ],
  itemCount: 6,
  queryExecTime: '356ms', // I'd like something like this line...
  resultSetSize: '37kB'   // ...and this line
}
Anyway, thank you for your help. I'm learning DynamoDB, there's not much material out there, and the SDK documentation is very obscure.
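For what it's worth, a minimal sketch of one way to approximate these stats (an assumption, not an official feature: DynamoDB does not report server-side execution time, so this times the round trip client-side and uses the real ReturnConsumedCapacity parameter for capacity information):
// Time the call client-side and ask DynamoDB to report consumed capacity
const started = Date.now();
docClient.scan({ ...params, ReturnConsumedCapacity: 'TOTAL' }, function (err, data) {
  if (err) return console.error(JSON.stringify(err, null, 2));
  const responseObject = {
    role: a_role,
    matchingItems: data.Items.map(element => element.alias),
    itemCount: data.Count,
    queryExecTime: (Date.now() - started) + 'ms', // round-trip latency, not pure server time
    resultSetSize: Math.round(Buffer.byteLength(JSON.stringify(data.Items)) / 1024) + 'kB', // approximate payload size
    consumedCapacity: data.ConsumedCapacity // present because ReturnConsumedCapacity was requested
  };
  console.log(responseObject);
});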
I'm adding a sort field to one of my AppSync tables using GraphQL. The new schema looks like:
type MyTable
  @model
  @auth(rules: [{allow: owner}])
  @key(name: "BySortOrder", fields: ["sortOrder"], queryField: "tableBySortOrder")
{
  id: ID!
  name: String!
  sortOrder: Int
}
However, when retrieving a list using tableBySortOrder, I get an empty list because the new sortOrder field is null.
My question is: how do I backfill this data in the DynamoDB table so that my existing users will not be disrupted by this change? With a traditional database, I would run a SQL update: UPDATE MyTable SET sortOrder = #.
However, I'm new to NoSQL/AWS and couldn't find a way to do this except building a backfill script that runs whenever a user logs into my app. That feels very hacky. What is the best practice for handling this type of scenario?
Have you already created the new field in DDB?
If yes, I think you should backfill it before making the client-side change.
Write a script to iterate through and update the table. Options for this:
Java - Call updateItem to update the table if you have any integ tests running.
Bash - Use the AWS CLI: aws dynamodb scan --table-name item_attributes --projection-expression "whatever" > /tmp/item_attributes_table.txt and then aws dynamodb update-item --table-name item_attributes --key. This is a dirty way.
Python - Same logic as above.
Ended up using something similar to what Sunny suggested, with a Node.js script:
const AWS = require('aws-sdk')
AWS.config.update({
  region: 'us-east-1'
})

// To confirm credentials are set
AWS.config.getCredentials(function (err) {
  if (err) console.log(err.stack)
  // credentials not loaded
  else {
    console.log('Access key:', AWS.config.credentials.accessKeyId)
    console.log('Secret access key:', AWS.config.credentials.secretAccessKey)
  }
})

const docClient = new AWS.DynamoDB.DocumentClient()
const table = 'your-table-dev'

const params = {
  TableName: table
}

const itemMap = new Map()

// Using scan to retrieve all rows
// (Note: this assumes all rows fit in a single scan page; a large table
// would need to follow LastEvaluatedKey to paginate.)
docClient.scan(params, function (err, data) {
  if (err) {
    console.error('Unable to query. Error:', JSON.stringify(err, null, 2))
  } else {
    console.log('Query succeeded.')
    // Group the items by owner
    data.Items.forEach(item => {
      if (itemMap.has(item.owner)) {
        itemMap.set(item.owner, [...itemMap.get(item.owner), item])
      } else {
        itemMap.set(item.owner, [item])
      }
    })
    // Give each owner's items a sequential sortOrder and persist it
    itemMap.forEach(ownerConnections => {
      ownerConnections.forEach((connection, index) => {
        connection.sortOrder = index
        update(connection)
      })
    })
  }
})

function update(connection) {
  const params = {
    TableName: table,
    Key: {
      'id': connection.id
    },
    UpdateExpression: 'set sortOrder = :s',
    ExpressionAttributeValues: {
      ':s': connection.sortOrder,
    },
    ReturnValues: 'UPDATED_NEW'
  };
  console.log('Updating the item...');
  docClient.update(params, function (err, data) {
    if (err) {
      console.error('Unable to update item. Error JSON:', JSON.stringify(err, null, 2));
    } else {
      console.log('UpdateItem succeeded:', JSON.stringify(data, null, 2));
    }
  });
}
https://github.com/kristinyim/ClassroomChat
I want to add an upvoting feature for the messages in this chatroom, similar to what you have on GroupMe, but I'm new to React and built this off a tutorial, so I don't know where to even begin. I'm comfortable with web dev but am just getting started with the basics of React.js and Firebase. Thanks!
NB: There are many ways to achieve this, so the following is just a suggestion.
First you must think of how you want to store your data in the database. If you have users, messages and message-likes, you could structure it like this:
"root": {
"users": {
"$userId": {
...
"messages": {
"$messageId1": true,
"$messageId2": true,
...
}
}
},
"messages": {
"$messageId": {
"author": $userId,
"timestamp": ServerValue.TIMESTAMP
}
},
"likesToMessages": {
"$messageId": {
"$likeId": {
liker: $userId,
"message": $messageId,
"timestamp": ServerValue.TIMESTAMP
}
}
}
}
Whenever a user clicks "like" on a message, you want to write a like entry to the likesToMessages node:
var messageId = ?; // The id of the message that was liked
var like = {
  liker: currentUserId, // id of the logged-in user
  message: messageId,
  timestamp: firebase.database.ServerValue.TIMESTAMP
};
firebase.database().ref().child('likesToMessages').child(messageId).push(like);
Then you get a new like in the database, matching the proposed structure.
Then, when you want to read and show the count of likes for a message, you can do it like this:
const Message = React.createClass({
  propTypes: {
    message: React.PropTypes.object,
    messageId: React.PropTypes.string // you need to add this prop
  },
  getInitialState() {
    // start with an empty map so render works before the first snapshot arrives
    return { likes: {} };
  },
  componentWillMount() {
    firebase.database().ref().child('likesToMessages').child(this.props.messageId).on('value', this.onLikesUpdated)
  },
  onLikesUpdated(dataSnapshot) {
    var likes = dataSnapshot.val();
    this.setState({
      likes
    });
  },
  render() {
    const {name, message} = this.props.message;
    const emojifiedString = emoji.emojify(message);
    // likes is an object keyed by likeId, so count its keys
    const likeCount = Object.keys(this.state.likes || {}).length;
    return (
      <p>
        {name}: {emojifiedString} [{likeCount}♥]
      </p>
    );
  }
});
Also, in your database security rules, you'd want to index messages and likes by timestamp so you can quickly query the newest ones.
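A minimal sketch of what that could look like in the Realtime Database rules, assuming the node names from the structure above (.indexOn is the standard rule for declaring indexes):
{
  "rules": {
    "messages": {
      ".indexOn": ["timestamp"]
    },
    "likesToMessages": {
      "$messageId": {
        ".indexOn": ["timestamp"]
      }
    }
  }
}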
Also, feel free to check out a similar app I made; the code is on GitHub and there is a demo at wooperate.firebaseapp.com.
How can I do this using transacting t? I want to make sure the row is successfully removed before saving the new records:
var Roles = bookshelf.Collection.extend({
  model: Role
});

Role.where('name', '=', 'Staff').destroy();

var roles = Roles.forge([{name: 'Staff'}, {name: 'Guest'}]);
Promise.all(roles.invoke('save')).then(function(role) {
  resolve(role);
}).catch(function (err) {
  reject({"status": "error", "data": err});
});
You may just use Bookshelf's transaction() method.
But first, your save() MUST run in the context of the destroy() promise to ensure the proper sequence; otherwise you risk having your saved data deleted by the destroy.
So it may look like:
var Roles = bookshelf.Collection.extend({
  model: Role
});

bookshelf.transaction(function(t) {
  return Role
    .where('name', '=', 'Staff')
    .destroy({transacting: t})
    .then(function() {
      var roles = Roles.forge([{name: 'Staff'}, {name: 'Guest'}]);
      return roles
        .invokeThen('save', null, {transacting: t});
    });
});