How to discard initial data in a Firebase DB

I'm making a simple app that informs a client that other clients clicked a button. I'm storing the clicks in a Firebase (db) using:
db.push({msg:data});
All clients get notified of other users' clicks with an on listener, such as
db.on('child_added', function(snapshot) {
  var msg = snapshot.val().msg;
});
However, when the page first loads I want to discard any existing data in the list. My strategy is to call db.once() before I define the db.on('child_added', ...) listener, in order to get the initial number of children, and then use that count to discard that many child_added calls.
Unfortunately, all of the db.on('child_added', ...) callbacks fire before I'm able to get the initial count, so it fails.
How can I effectively and simply discard the initial data?

For larger data sets, Firebase now offers (as of 2.0) some query methods that can make this simpler.
If we add a timestamp field on each record, we can construct a query that only looks at new values. Consider this contrived data:
{
  "messages": {
    "$messageid": {
      "sender": "kato",
      "message": "hello world",
      "created": 123456 // Firebase.ServerValue.TIMESTAMP
    }
  }
}
We could find messages only after "now" using something like this:
var ref = new Firebase('https://<your instance>.firebaseio.com/messages');
var queryRef = ref.orderByChild('created').startAt(Firebase.ServerValue.TIMESTAMP);
queryRef.on('child_added', function(snap) {
  console.log(snap.val());
});
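For this query to catch anything, each new record has to be written with that created field set to the server time. A minimal sketch of the write side, assuming the same messages path as above:
ref.push({
  sender: 'kato',
  message: 'hello world',
  created: Firebase.ServerValue.TIMESTAMP // resolved to the server's time when the write lands
});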

If I understand your question correctly, it sounds like you only want data that has been added since the user visited the page. In Firebase, the behavior you describe is by design, as the data is always changing and there isn't a notion of "old" data vs "new" data.
However, if you only want to display data added after the page has loaded, try ignoring all child_added events until the complete set of children has loaded at least once. For example:
var ignoreItems = true;
var ref = new Firebase('https://<your-Firebase>.firebaseio.com');
ref.on('child_added', function(snapshot) {
  if (!ignoreItems) {
    var msg = snapshot.val().msg;
    // do something here
  }
});
ref.once('value', function(snapshot) {
  ignoreItems = false;
});
The alternative to this approach would be to write your new items with a priority as well, where the priority is Firebase.ServerValue.TIMESTAMP (the current server time), and then use a .startAt(...) query using the current timestamp. However, this is more complex than the approach described above.
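A minimal sketch of that priority-based alternative, assuming the priority API (setWithPriority plus a priority-based startAt); note that the client-side Date.now() is only an approximation of the server clock:
var ref = new Firebase('https://<your-Firebase>.firebaseio.com');
// write each item with the server time as its priority
ref.push().setWithPriority({msg: data}, Firebase.ServerValue.TIMESTAMP);
// only react to items whose priority is at or after the client's current time
ref.startAt(Date.now()).on('child_added', function(snapshot) {
  var msg = snapshot.val().msg;
  // handle only "new" items here
});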

Related

How do I query a Firebase database for a nested property?

Hi, I have a NoSQL DB in Firebase.
I want to get the object where userId is 288.
I've tried many combinations but I can't figure out how it's done.
This is my code so far:
var refTest = database.ref('conversation');
var query = refTest.orderByChild('messages');
query.on('value', function(data) {
  var a = data.val();
  console.log(a.messages.userId);
  console.log(data.val());
});
This is an image of my "schema".
I'm obviously a noob when it comes to NoSQL; I do understand SQL.
All help is appreciated.
You can order/filter on a nested value like this:
var refTest = database.ref('conversation');
var query = refTest.orderByChild('messages/userId').equalTo("288");
query.on('value', function(snapshot) {
  snapshot.forEach(function(child) {
    console.log(child.key);
    console.log(child.val());
  });
});
The forEach is needed, since there may be multiple child nodes with messages/userId equal to 288.
The key named "messages" doesn't make sense in your schema: if you wanted to have another message under that conversation, you couldn't add it with the same key name, and adding it under "messages" would overwrite the existing one. My suggestion is to use the push() method for adding a new message; that way you uniquely identify each message.
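A minimal sketch of that push() suggestion; the conversationId variable and the text field are hypothetical placeholders, since the question doesn't show the full message shape:
var messagesRef = database.ref('conversation/' + conversationId + '/messages');
messagesRef.push({
  userId: "288",
  text: "hello"
});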
Regarding your question, an easy-to-understand way of parsing your schema is to loop through each message of each conversation and pick out the messages with that userId:
refTest.on('value', function(data) {
  var conversations = data.val();
  for (var conversationKey in conversations) {
    var conversation = conversations[conversationKey];
    for (var messageKey in conversation.messages) {
      var message = conversation.messages[messageKey];
      if (message.userId == 288) {
        // do whatever you need
        // and eventually return something to break the loops
      }
    }
  }
});
Of course, you can adapt it based on your needs.

Facebook-like "load new posts" in Meteor

I'm in the process of learning Meteor. I followed the tutorial to create Microscope. If someone submits a post, Meteor re-renders the template for all users. This can be very annoying when there are hundreds of posts: the user is thrown back to the top of the page and loses track of where they were. I want to implement something similar to what Facebook has: when a new post is submitted the template isn't re-rendered; instead a button or link appears, and clicking it re-renders the template and shows the new posts.
I was thinking of using observeChanges on the collection to detect any changes. It does stop the page from showing new posts, but the only way to show them then is to reload the page.
Meteor.publish('posts', function(options) {
  var self = this, postHandle = null;
  var initializing = true;
  postHandle = Posts.find({}, options).observeChanges({
    added: function(id, post) {
      if (initializing) {
        self.added('posts', id, post);
      }
    },
    changed: function(id, fields) {
      self.changed('posts', id, fields);
    }
  });
  self.ready();
  initializing = false;
  self.onStop(function() { postHandle.stop(); });
});
Is this the right path to take? If yes, how do I alert the user of new posts? Else, what would be a better way to implement this?
Thank you
This is a tricky question but also valuable as it pertains to a design pattern that is applicable in many instances. One of the key aspects is wanting to know that there is new data but not wanting to show it (yet) to the user. We can also assume that when the user does want to see the data, they probably don't want to wait for it to be loaded into the client (just like Facebook). This means that the client still needs to cache the data as it arrives, just not display it immediately.
Therefore, you probably don't want to restrict the data displayed in the publication - because this won't send the data to the client. Rather, you want to send all the (relevant) data to the client and cache it there until it is ready.
The easiest way involves having a timestamp in your data to work from. You can then couple this with a Reactive Variable to only add new documents to your displayed set when that Reactive Variable changes. Something like this (code will probably be in different files):
// Within the template where you want to show your data
Template.myTemplate.onCreated(function() {
  var self = this;
  var options = null; // Define non-time options
  // Subscribe to the data so everything is loaded into the client
  // Include relevant options to limit data but exclude timestamps
  self.subscribe("posts", options);
  // Create and initialise a reactive variable with the current date
  self.loadedTime = new ReactiveVar(new Date());
  // Create a reactive variable to see when new data is available
  // Create an autorun for whenever the subscription changes ready() state
  // Ignore the first run as ready() should be false
  // Subsequent false values indicate new data is arriving
  self.newData = new ReactiveVar(false);
  self.autorun(function(computation) {
    if (!computation.firstRun) {
      if (!self.subscriptionsReady()) {
        self.newData.set(true);
      }
    }
  });
});
// Fetch the relevant data from that subscribed (cached) within the client
// These go within the template helpers
// Use the value (get()) of the Reactive Variable
Template.myTemplate.helpers({
  displayedPosts: function() {
    return Posts.find({timestamp: {$lt: Template.instance().loadedTime.get()}});
  },
  // Second helper to determine whether or not new data is available
  // Can be used in the template to notify the user
  newData: function() {
    return Template.instance().newData.get();
  }
});
// Update the Reactive Variable to the current time
// This takes place within the template events map
// Assume you have a button (or similar) with a "reload" class
Template.myTemplate.events({
  'click .reload': function(event, template) {
    template.loadedTime.set(new Date());
  }
});
I think this is the simplest pattern to cover all of the points you raise. It gets more complicated if you don't have a timestamp or if you have multiple subscriptions (then you need to use the subscription handles), etc. Hope this helps!
As Duncan said in his answer, ReactiveVar is the way to go. I've actually implemented a simple Facebook feed page with Meteor where I display the public posts from a certain page. I use infinite scroll to keep adding posts to the bottom of the page and store them in a ReactiveVar. Check the sources on GitHub here and the live demo here. Hope it helps!

Meteor: Publish a subset of another publication

I have a custom publication on my server (which in some way joins 2 collections).
The resulting set of this publication is exactly what I need, but for performance reasons I would like to avoid sending it entirely to the client.
If I did not care about performance, I would simply subscribe to the publication and do something like
theCollection.find({"my": "filter"})
I am therefore trying to find a way to publish a subset of the custom publication, so that the filter is applied to the custom publication on the server side.
Is there a way to chain or filter publications (server side)?
For the question we can assume the custom publication to look like this and cannot be modified:
Meteor.publish('customPublication', function() {
  var sub = this;
  var aCursor = Resources.find({type: 'someFilter'});
  Mongo.Collection._publishCursor(aCursor, sub, 'customPublication');
  sub.ready();
});
If I understand the question right, you are looking for https://atmospherejs.com/reywood/publish-composite
It lets you "publish a set of related documents from various collections using a reactive join. This makes it easy to publish a whole tree of documents at once. The published collections are reactive and will update when additions/changes/deletions are made."
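A minimal sketch of what that could look like here; RelatedCollection and its resourceId field are hypothetical, since the question doesn't show what the second collection is:
Meteor.publishComposite('customPublication', {
  find: function() {
    // top-level cursor: the filtered resources
    return Resources.find({type: 'someFilter'});
  },
  children: [{
    find: function(resource) {
      // for each resource, publish its related documents (a reactive join)
      return RelatedCollection.find({resourceId: resource._id});
    }
  }]
});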
OK, I came up with the following workaround. Instead of working on the publication, I simply added a new collection that I keep up to date from the other collections. To do so I am using the Meteor collection-hooks package:
function transformDocument(doc)
{
  doc.aField = "aValue"; // do what you want here
  return doc;
}
ACollection.after.insert(function(userId, doc)
{
  var transformedDocument = transformDocument(doc);
  AnotherCollection.insert(transformedDocument);
});
ACollection.after.update(function(userId, doc, fieldNames, modifier, options)
{
  var transformedDocument = transformDocument(doc);
  delete transformedDocument._id;
  AnotherCollection.update(doc._id, {$set: transformedDocument});
});
ACollection.after.remove(function(userId, doc)
{
  AnotherCollection.remove(doc._id);
});
Then I have the new collection, and I can publish subsets of it the regular way.
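A minimal sketch of that regular publish; the selector is just the example filter from above:
Meteor.publish('anotherCollectionSubset', function() {
  // any ordinary Mongo selector works here
  return AnotherCollection.find({"my": "filter"});
});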
Benefits:
You can filter on whatever you want in this collection, no need to worry whether the field is virtual or real
Only one operation every time a collection changes; this avoids having several publications merging the same data
Caveats:
This requires one more collection = more space
The two collections might not always be synchronised; there are a few reasons for this:
The client manually changed the data of "AnotherCollection"
You had documents in "ACollection" before you added "AnotherCollection"
The transform function or source collection schema changed at some point
To fix the first of these (client-side writes):
AnotherCollection.allow({
  insert: function () {
    return Meteor.isServer;
  },
  update: function () {
    return Meteor.isServer;
  },
  remove: function () {
    return Meteor.isServer;
  }
});
And to synchronise at Meteor startup (i.e. rebuild the collection from scratch); do this only once, for maintenance or after adding the new collection:
Meteor.startup(function()
{
  AnotherCollection.remove({});
  var documents = ACollection.find({}).fetch();
  _.each(documents, function(doc)
  {
    var transformedDocument = transformDocument(doc);
    AnotherCollection.insert(transformedDocument);
  });
});

How to remove data that was pushed in a list in Firebase?

Given
var messageListRef = new Firebase('https://SampleChat.firebaseIO-demo.com/message_list');
messageListRef.push({ 'user_id': 'fred', 'text': 'Yabba Dabba Doo!' });
How do I remove that added data { 'user_id': 'fred', 'text': 'Yabba Dabba Doo!' } from Firebase later? Is there a clean and simple way to do that?
I would like to be able to find that data again later and then remove it. Since I don't know the unique id that was generated, I can't do new Firebase('https://SampleChat.firebaseIO-demo.com/message_list/'+uniqueId).remove() (and I don't know whether that would be good practice anyway). My idea is to first query the data, but I don't know how to do that with a list of data. For example, I would like to be able to remove that data onDisconnect.
On the page https://www.firebase.com/docs/web/api/firebase/push.html, it seems the "See Lists of Data" section is not yet written. Is adding such a remove for lists of data on the roadmap?
When you call push() it returns a reference to the new node. So you could keep a list of the keys of the messages the user added in memory:
var myMessageKeys = []; // put this somewhere "globally"
And then whenever you add a message:
var newMessageRef = messageListRef.push({ 'user_id': 'fred', 'text': 'Yabba Dabba Doo!' });
myMessageKeys.push(newMessageRef.key());
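With those keys stored, removing a particular message later might look like this (a minimal sketch; which key to pick is up to the application):
var keyToRemove = myMessageKeys[0]; // e.g. the first message this user added
messageListRef.child(keyToRemove).remove();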
Personally this feels hacky to me. I would prefer to use a query, so that for example if fred disconnects you'd do something like:
var myMessages = messageListRef.orderByChild('user_id').equalTo('fred');
myMessages.on('value', function(messagesSnapshot) {
  messagesSnapshot.forEach(function(messageSnapshot) {
    messageSnapshot.ref().remove();
  });
});
So figuring out which messages to remove is the trick. Suppose you want to delete by user id; perhaps when fred disconnects, you want to remove all of his messages. You could find and delete them once like this:
var query = messageListRef.orderByChild('user_id').equalTo('fred');
query.once('value', function(snapshot) {
  snapshot.forEach(function(msg) {
    msg.ref().remove();
  });
});
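Since the question also mentions onDisconnect: if the goal is specifically to clean up a message when the client that wrote it goes away, the reference returned by push() can register that removal up front (a minimal sketch):
var newMessageRef = messageListRef.push({ 'user_id': 'fred', 'text': 'Yabba Dabba Doo!' });
// ask the server to delete this message automatically if this client disconnects
newMessageRef.onDisconnect().remove();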

angularFire startAt querying and binding deletes new data

The application shows work shifts for a given time period. firebaseConn.getShifts is the API function that gets the shift data for the given time period.
versions:
firebase: 2.0.6
angularFire: 0.9.0 (confirmed with 0.8.2 also)
This is my firebase schema:
And this is the code:
.factory('watchers', function(bunch-of-dependencies) {
  var unbindShifts = function() {};
  var inited = false;
  var shifts = {};
  ... some irrelevant code in between ...
  function initShifts() {
    unbindShifts();
    shifts.object = firebaseConn.getShifts( false, from, to, $scope );
    $scope.shifts = shifts.object;
    shifts.object.$bindTo($scope, "shifts").then(function(unbind) {
      unbindShifts = unbind;
    });
  }
The Firebase queries (these worked fine before adding the unbind/bind; possibly the time-based querying causes issues too):
firebaseConn.getShifts = function(asArray, from, to, scope) {
  return cacheRequest(FBURL + "shifts", asArray, [from, to]);
};
function cacheRequest(url, asArray, limits) {
  var type = asArray ? "array" : "object";
  var startAt = limits ? limits[0] : undefined;
  var endAt = limits ? limits[1] : undefined;
  var retObj, FBRef;
  cached[url] = cached[url] || {};
  /* If there are limits-parameters we don't cache at all atm. Since those queries should be checked differently than static urls */
  if (!limits && cached[url][type]) {
    FBRef = cached[url][type];
  } else {
    FBRef = cached[url][type] = createFBRef(url, startAt, endAt);
  }
  if (asArray) {
    retObj = FBRef.$asArray();
  } else {
    retObj = FBRef.$asObject();
  }
  return retObj;
}
function createFBRef(resourceURL, startAt, endAt) {
  var modifiedObject = $firebase( createRef( resourceURL ).orderByKey().startAt(startAt).endAt(endAt) );
  return modifiedObject;
}
function createRef(resourceURL) {
  return new Firebase( resourceURL );
}
Now I have located the problem to be with the query limiting. If the from and to dates are undefined, this works without problems. But I need to be able to limit the amount of data, since loading many years of work-shift data to show a week's worth won't be good :).
The actual problem is not displaying and fetching the data; everything works fine there. It's related to the times and re-binding.
If I make any change to e.g. the "20150115" table, for example adding another "groups" child there, then when I unbind and rebind, the whole "20150115" table gets deleted. This holds true only for the latest change: if I add children to multiple dates, e.g. "20150113", "20150114", "20150115", and the latest change is in "20150115", and I then unbind and re-bind another time from Firebase, all the other root paths stay as they are, but the latest change in "20150115" makes that whole tree get deleted.
I hope I'm making myself clear; for safety I'll explain it again in a simpler way:
- Changes to 1. "20150113", 2. "20150114", 3. "20150115" through the app.
- Changing timeline from UI causes: unbind + re-bind
- As a side-effect the whole "20150114" tree gets deleted.
The problem is somehow related to advanced querying with orderByKey().startAt(startAt).endAt(endAt) and binding.
Also, for additional info: the data added through the UI does get added to the Firebase database, but when the re-binding happens, the data is deleted from the database. Specifically it happens on rebind; unbinding causes no issues if I delay rebinding with a timeout.
EDIT:
I have found the source of the actual issue. After the new binding is in place and everything seems to be in order, there is an Angular watch event that kicks in. The event tries to save the last change the user made before re-binding.
So if I have an active timeline for December (20141201 - 20141230) and I change the "20141225" data, then change the timeline to 20150101 - 20150130 (causing unbind and rebind, or manually fetching new data), there will be an event, after the binding has been done and everything seems to be in order, that tries to save the 20141225 data to either the new timeline (20150101 - 20150130) or the old one, I'm not sure which. This causes Firebase to actually delete the whole 20141225 tree instead of saving the data.
The new data makes it into your Firebase fine, which you can see by either checking your Firebase dashboard or by running a quick snippet like this in your browser's dev console:
new Firebase("https://firebaseurl").once('value', function(s) { console.log(s.val()); })
The data even makes it back into your application. The only problem is that Angular doesn't know that new data has arrived, so it doesn't update the view with the new data.
Normally AngularFire's $asObject and $asArray methods take care of notifying AngularJS when new data arrives from Firebase. But since you are constantly creating new queries, you'll have to take care of that yourself.
There are a few ways to signal the new data to AngularJS and I'm definitely not an expert on which one is best. But if you add $scope.$apply(); to your setDays function it works:
function setDays(ref) {
  var FBRange = setFBRange(ref, from, to);
  var days;
  unbindDays();
  days = $firebase(FBRange).$asObject();
  $scope.days = days;
  days.$bindTo($scope, "days").then(function(unbind) {
    unbindDays = unbind;
    // As a result of the new binding entry gets mysteriously deleted from firebase
  });
  $scope.$apply(); // Tell AngularJS about the new data, so that it updates the view
  function setFBRange(ref, from, to) {
    return ref.orderByKey().startAt("" + from).endAt(from + to + "");
  }
}
Updated Plunkr with this change (and some others to help in debugging): http://plnkr.co/edit/YZtkzUNtjQUCcw4xb2mj?p=preview
