Firebase denormalizing many-to-many

I have a pretty basic data structure
events
topics
I would like to be able to easily show (query)
what topics are owned by an event
what events cover a topic
what are the most popular topics this month
I am pretty comfortable with my events structure like
/events/880088/topics.json *
["Firebase", "Cloud"]
but I struggle with how to structure the /topics nodes. I partially get the idea of going with something like
/topics/Firebase
{"12345":true,"88088":true}
and then when I update an event's topic collection I would have to iterate over all the /topics/ nodes and update /topics/{{topic}}/{{eventid}} to {true | null}. Which seems rather ham-fisted.
ALSO, I am still at a loss as to how to query, say, which topics were covered by events this month.
Example JSBin from comments below http://jsbin.com/dumumu/edit?js,output
* I know, I know, arrays are evil, https://www.firebase.com/blog/2014-04-28-best-practices-arrays-in-firebase.html, but I think they fit in this scenario

Here's one way to add an event:
function addEvent(title, topics) {
  var event = ref.child('events').push({ title: title });
  topics.forEach(function(topic) {
    event.child('topics').child(topic).set(true);
    ref.child('topic').child(topic).child(event.key()).set(true);
  });
  return event; // return the new ref so callers can hold on to it
}
Seems pretty simple to me. For an interesting twist, you can use the new multi-location updates we launched yesterday (September 2015):
function addEvent(title, topics) {
  var updates = {};
  var eventId = ref.push().key();
  updates['events/'+eventId+'/title'] = title;
  topics.forEach(function(topic) {
    updates['events/'+eventId+'/topics/'+topic] = true;
    updates['topic/'+topic+'/'+eventId] = true;
  });
  ref.update(updates);
}
The latter is a bit more code, but it's a single write operation to Firebase, so there's no chance of the user closing the app between writes.
You invoke both the same way, of course:
addEvent('Learn all about Firebase', ['Firebase']);
addEvent('Cloudspin', ['Firebase', 'Google', 'Cloud']);
And the data structure becomes:
{
  "events": {
    "-K-4HCzj_ziHkZq3Fpat": {
      "title": "Learn all about Firebase",
      "topics": {
        "Firebase": true
      }
    },
    "-K-4HCzlBFDIwaA8Ajb7": {
      "title": "Cloudspin",
      "topics": {
        "Cloud": true,
        "Firebase": true,
        "Google": true
      }
    }
  },
  "topic": {
    "Cloud": {
      "-K-4HCzlBFDIwaA8Ajb7": true
    },
    "Firebase": {
      "-K-4HCzj_ziHkZq3Fpat": true,
      "-K-4HCzlBFDIwaA8Ajb7": true
    },
    "Google": {
      "-K-4HCzlBFDIwaA8Ajb7": true
    }
  }
}
Querying/reporting
With Firebase (and most NoSQL databases), you typically have to adapt your data structure for the reporting you want to do on it.
Abe wrote a great answer on this recently, so go read that for sure: Firebase Data Structure Advice Required
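For the two simpler lookups from the question (which topics an event owns, which events cover a topic), the index nodes shown above can be read directly. Here is a minimal sketch using the same ref and the 2015-era Firebase SDK as the snippets above; eventId stands in for one of the push ids shown in the structure:
// Topics owned by one event: read the event's own topics node.
ref.child('events').child(eventId).child('topics').once('value', function(snapshot) {
  console.log('topics', Object.keys(snapshot.val() || {})); // e.g. ["Cloud", "Firebase", "Google"]
});

// Events that cover one topic: read the inverted index.
ref.child('topic').child('Firebase').once('value', function(snapshot) {
  console.log('event ids', Object.keys(snapshot.val() || {}));
});
The "most popular topics this month" report is the part that really needs an adapted structure (for example a timestamp on each event, or a per-month counter under each topic), which is the kind of adaptation the linked answer goes into.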
Update: change the topics for an event
If you want to change the topics for an existing event, this function is one way to accomplish that:
function updateEventTopics(event, newTopics) {
  newTopics.sort();
  var eventId = event.key();
  var updates = {};
  event.once('value', function(snapshot) {
    var oldTopics = Object.keys(snapshot.val().topics).sort();
    var added = newTopics.filter(function(t) { return oldTopics.indexOf(t) < 0; }),
        removed = oldTopics.filter(function(t) { return newTopics.indexOf(t) < 0; });
    added.forEach(function(topic) {
      updates['events/'+eventId+'/topics/'+topic] = true;
      updates['topic/'+topic+'/'+eventId] = true;
    });
    removed.forEach(function(topic) {
      updates['events/'+eventId+'/topics/'+topic] = null;
      updates['topic/'+topic+'/'+eventId] = null;
    });
    ref.update(updates);
  });
}
The code is indeed a bit long, but that's mostly to determine the delta between the current topics and the new topics.
In case you're curious, if we run these API calls now:
var event = addEvent('Cloudspin', ['Firebase', 'Google', 'Cloud']);
updateEventTopics(event, ['Firebase', 'Google', 'GCP']);
The updateEventTopics() call will result in this update():
{
  "events/-K-93CxuCrFDxM6k0B14/topics/Cloud": null,
  "events/-K-93CxuCrFDxM6k0B14/topics/GCP": true,
  "topic/Cloud/-K-93CxuCrFDxM6k0B14": null,
  "topic/GCP/-K-93CxuCrFDxM6k0B14": true
}

Related

Mapbox GL and .net core webApi

I am moving from Leaflet to Mapbox GL and have some data issues. My webApi is proven, but I cannot get the two to integrate smoothly.
The approach I gave up on, based on their examples and my own research, looks like:
map = new mapboxgl.Map({
  container: 'mapdiv',
  style: 'mapbox://styles/mapbox/streets-v10',
  center: start,
  zoom: $scope.zoom,
  transformRequest: (url, resourceType) => {
    if (resourceType === 'Source' && url.startsWith(CONFIG.API_URL)) {
      return {
        headers: {
          'Authorization': 'Bearer ' + localStorageService.get("authorizationData"),
          'Access-Control-Allow-Origin': CONFIG.APP_URL,
          'Access-Control-Allow-Credentials': 'true'
        }
      };
    }
  }
});
This is passing my OAuth2 token (or at least I think it should be) along with the CORS headers.
Accompanying the above with:
map.addSource(layerName, { type: 'geojson', url: getLayerURL($scope.remLayers[i]) });
map.getSource(layerName).setData(getLayerURL($scope.remLayers[i]));
I have also tried, to no avail:
map.addSource(layerName, { "type": 'geojson', "data": { "type": "FeatureCollection", "features": [] }});
map.getSource(layerName).setData(getLayerURL($scope.remLayers[i]));
Although there are no errors, Fiddler does not show any requests being made to my layer webApi. All the others show, but Mapbox does not appear to be raising them.
The URL looks like:
http://localhost:49198/api/layer/?bbox=36.686654090881355,34.72821077223763,36.74072742462159,34.73664000652042&dtype=l&id=cf0e1df7-9510-4d03-9319-d4a1a7d6646d&sessionId=9a7d7daf-76fc-4dd8-af4f-b55d341e60e4
Because this was not working, I attempted a more manual approach using my existing $http calls, which partially works.
map = new mapboxgl.Map({
  container: 'mapdiv',
  style: 'mapbox://styles/mapbox/streets-v10',
  center: start,
  zoom: $scope.zoom,
  transformRequest: (url, resourceType) => {
    if (resourceType === 'Source' && url.startsWith(CONFIG.API_URL)) {
      return {
        headers: {
          'Authorization': 'Bearer ' + localStorageService.get("authorizationData")
        }
      };
    }
  }
});
map.addSource(layerName, {
  "type": 'geojson',
  "data": { "type": "FeatureCollection", "features": [] }
});
The tricky part is knowing when to run the data retrieval call. The only place I could find was the map's data event, which now looks like:
map.on('data', function (e) {
  if (e.dataType === 'source' && e.isSourceLoaded === false && e.tile === undefined) {
    // See if the datasource is known
    for (var i = 0; i < $scope.remLayers.length; i++) {
      if (e.sourceId === $scope.remLayers[i].name) {
        askForData(i);
      }
    }
  }
});
function askForData(i) {
  var data = getBBoxString(map);
  var mapZoomLevel = map.getZoom();
  if (checkZoom(mapZoomLevel, $scope.remLayers[i].minZoom, $scope.remLayers[i].maxZoom)) {
    mapWebSvr.getData({
        bbox: data, dtype: 0, id: $scope.remLayers[i].id, buffer: $scope.remLayers[i].isBuffer, sessionId
      },
      function (data, indexValue, indexType) {
        showNewData(data, indexValue, indexType);
      },
      function () {
        // Not done yet.
      },
      i,
      0
    );
  }
}

function showNewData(ajxresponse, index, indexType) {
  map.getSource($scope.remLayers[index].name).setData(ajxresponse);
  map.getSource($scope.remLayers[index].name).isSourceLoaded = true;
}
This is all working, with one exception: it keeps firing time and time again. Some of these calls return a lot of data for a web call, so it's not a workable solution at the moment.
It's like it's never satisfied with the data, even though it's showing it on the map!
There is a parameter on the data event, isSourceLoaded, but it never gets set to true.
I have searched for an example and have tried setting isSourceLoaded in a number of places (as with the code above), but to no avail.
Does anyone have a method for accomplishing this basic data retrieval successfully, or can point out the error(s) in my code? Or even point me to a working example...
I have spent too long on this now and could do with some help.
After a bit of a runaround I have a solution.
An email from Mapbox pointed to populating the data in the load event, which I am now doing.
This was not, however, the whole solution I was looking for, as the data needs refreshing when the map moves, zooms, etc., so further lookups are required.
After a bit more examination, a solution was found.
Using the code below on the render event will request the information whenever the bounding box changes.
var renderStaticBounds = getBoundsString(map.getBounds());

map.on('render', function (e) {
  if (renderStaticBounds != getBoundsString(map.getBounds())) {
    renderStaticBounds = getBoundsString(map.getBounds());
    for (var i = 0; i < $scope.remLayers.length; i++) {
      askForData(i);
    }
  }
});

function getBoundsString(mapBounds) {
  var left = mapBounds._sw.lng;
  var bottom = mapBounds._sw.lat;
  var right = mapBounds._ne.lng;
  var top = mapBounds._ne.lat;
  return left + ',' + bottom + ',' + right + ',' + top;
}
This hopefully will save someone some development time.
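As a possible refinement (not part of the original answer, just a sketch): Mapbox GL JS also fires a moveend event once a pan or zoom gesture finishes, so the same guarded refresh could hang off that instead of render, which fires on every drawn frame:
// Possible alternative (untested here): 'moveend' fires once per completed pan/zoom,
// so the guarded refresh runs far less often than on 'render'.
map.on('moveend', function () {
  if (renderStaticBounds != getBoundsString(map.getBounds())) {
    renderStaticBounds = getBoundsString(map.getBounds());
    for (var i = 0; i < $scope.remLayers.length; i++) {
      askForData(i);
    }
  }
});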

Google Embed API format data before calling .execute()

I need to format the response I get from Analytics before showing it inside a Google Chart. I tried editing the response when the on('success', ...) handler fires, but I found that it gets called after .execute().
Is there any way to edit the response after receiving it and before it populates the chart?
This is my function:
var dataChart5 = new gapi.analytics.googleCharts.DataChart({
  reportType: 'ga',
  query: {
    'ids': 'ga:***', // My ID
    'start-date': '31daysAgo',
    'end-date': 'yesterday',
    'metrics': 'ga:users,ga:percentNewSessions,ga:sessions,ga:bounceRate,ga:avgSessionDuration,ga:pageviews,ga:pageviewsPerSession',
    'prettyPrint': 'true'
  },
  chart: {
    'container': 'chart-5-container',
    'type': 'TABLE',
    'options': {
      'width': '100%',
      'title': 'test'
    }
  }
});
dataChart5.on('success', function(response) {
  response.data.cols[0].label = "test1"; // here I edit the response
  console.log(response);
});
dataChart5.execute();
Using console.log(response); I can see that the record label gets modified, but the chart gets populated before the edit.
I think I have a workaround. It has problems, but it might be useful. While handling the success event, call a function that recursively walks through the child elements of $('#chart-5-container') and applies your formatting there.
One problem with that approach is that the positions of the elements won't be recalculated, so with different string sizes you might get overlapping strings. Moreover, it does not seem to affect the tooltip.
I'm using this approach to translate to Portuguese.
function recursiveTranslate(e) {
  var key = e.html(),
      dict = {};
  dict['Date'] = 'Data';
  dict['Users'] = 'Visitantes';
  dict['Sessions'] = 'Visitas';
  dict['Pageviews'] = 'Visualizações';
  if (key in dict) {
    e.html(dict[key]);
  }
  for (var i = 0; i < e.children().length; i++) {
    recursiveTranslate($(e.children()[i]));
  }
}
Then I call recursiveTranslate inside the success event:
dataChart5.on('success', function h(obj) {
  recursiveTranslate($('#chart-5-container'));
});
It is not elegant and has a lot of issues. I would really like to get my hands on the proper solution.
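One heavier-handed alternative (just a sketch, not something verified against the Embed API): since response.data in the success handler is a Google Charts DataTable literal, you could edit it and then redraw the container yourself with the Google Charts library, assuming google.visualization and the table package are already loaded by the Embed API:
dataChart5.on('success', function (response) {
  // Edit the DataTable literal first.
  response.data.cols[0].label = 'test1';

  // Redraw the same container from the edited data (replaces the chart the Embed API drew).
  var table = new google.visualization.Table(document.getElementById('chart-5-container'));
  table.draw(new google.visualization.DataTable(response.data), { width: '100%', title: 'test' });
});
The obvious downside is that the table is drawn twice, so you may see a brief flicker.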

How to query two types of records in CouchDB

I'm having issues getting two dependent types of data from a PouchDB database.
I have a list of cars that I get like so:
localDB.query(function(doc) {
  if (doc.type === 'list') {
    emit(doc);
  }
}, {include_docs: true}).then(function(response) {
  console.log("cars", response);
  // Save Cars List to app
  for (var i = 0; i < response.rows.length; i++) {
    addToCarsList(response.rows[i].id, response.rows[i].carNumber);
  }
  console.log("Cars List: " + carsListToString());
  return response;
}).then(function(listRecord) {
  listRecord.rows.forEach(function(element, index) {
    console.log(index + ' -> ', element);
    localDB.query(function(doc) {
      console.log("filtering with carNb = " + element.carNb);
      if (doc.type === 'defect' && doc.listId == getCurrentListId() && doc.carNb == element.carNb) {
        emit(doc);
      }
    }, {include_docs: false}).then(function(result) {
      console.log("defects", result);
    }).catch(function(err) {
      console.log("an error has occurred", err);
    });
  });
}).catch(function(err) {
  console.log('error', err);
});
Here's what happens: after getting the list of cars, for each car I would like to query the defects and store them in some arrays. Then, when all that querying is done, I want to build the UI with the saved data.
But what's happening is that the forEach gets processed quickly and does not wait for the inner async'd localDB.query.
How can I query some documents based on an attribute from a parent query? I looked into promises in the PouchDB docs but I can't understand how to do it.
(Please forgive possible lint errors; this code was anonymized by hand and ultra-simplified.)
The method you are looking for is Promise.all() (execute all promises and return when done).
However, your query is already pretty inefficient. It would be better to create a persistent index, otherwise it has to do a full database scan for every query() (!). You can read up on the PouchDB query guide for details.
I would recommend installing the pouchdb-upsert plugin and then doing:
// helper method
function createDesignDoc(name, mapFunction) {
  var ddoc = {
    _id: '_design/' + name,
    views: {}
  };
  ddoc.views[name] = { map: mapFunction.toString() };
  return ddoc;
}

localDB.putIfNotExists(createDesignDoc('my_index', function (doc) {
  emit([doc.type, doc.listId, doc.carNb]);
})).then(function () {
  // find all docs with type 'list'
  return localDB.query('my_index', {
    startkey: ['list'],
    endkey: ['list', {}],
    include_docs: true
  });
}).then(function (response) {
  console.log("cars", response);
  // Save Cars List to app
  for (var i = 0; i < response.rows.length; i++) {
    addToCarsList(response.rows[i].id, response.rows[i].carNumber);
  }
  console.log("Cars List: " + carsListToString());
  return response;
}).then(function (listRecord) {
  return PouchDB.utils.Promise.all(listRecord.rows.map(function (row) {
    // find all docs with the given type, listId, carNb
    return localDB.query('my_index', {
      key: ['defect', getCurrentListId(), row.doc.carNb],
      include_docs: true
    });
  }));
}).then(function (finalResults) {
  console.log(finalResults);
}).catch(function(err) {
  console.log("an error has occurred", err);
});
I'm using a few tricks here:
we emit [doc.type, doc.listId, doc.carNb], which allows us to query by type or by type+listId+carNb.
when querying for just the type, we can do {startkey: ['list'], endkey: ['list', {}]}, which matches just those with the type "list", because {} is "higher" than strings in CouchDB collation order.
PouchDB.utils.Promise is a "hidden" API, but it's pretty safe to use if you ask me. It's unlikely we'll change it.
Edit Another option is to use the new pouchdb-find plugin, which offers a simplified query API designed to replace the existing map/reduce query() API.
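For a rough idea of what that could look like (a sketch only; the field names follow the documents above and the selector value for carNb is a placeholder, not tested against the poster's data):
// With pouchdb-find: declare a Mango index once, then query it with a selector.
localDB.createIndex({
  index: { fields: ['type', 'listId', 'carNb'] }
}).then(function () {
  return localDB.find({
    selector: { type: 'defect', listId: getCurrentListId(), carNb: 'some-car-number' }
  });
}).then(function (result) {
  console.log('defects', result.docs); // the matching documents
});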
Another approach would be to pull both the list docs and the defect docs down at the same time, then merge them together using a reduce-like method that converts them into an array of objects:
{
  _id: 1,
  type: 'list',
  ...
  defects: [{
    type: 'defect',
    listId: 1,
    ...
  }]
}
By pulling the lists and the defects down in one call you save several calls to the PouchDB query engine, but you do have to iterate through every result to build your collection of list objects with an embedded array of defects.
// This is untested code so it may not work, but you should get the idea
var _ = require('underscore');

// order document results by list then defect
var view = function (doc) {
  if (doc.type === 'list') {
    emit([doc._id, doc.carNumber, 1]);
  } else if (doc.type === 'defect') {
    emit([doc.listId, doc.carNb, 2]);
  }
};

localDB.query(view, { include_docs: true })
  .then(function(response) {
    return _(response.rows)
      .reduce(function(m, r) {
        if (r.key[2] === 1) {
          // initialize
          r.doc.defects = [];
          m.push(r.doc);
          return m;
        }
        if (r.key[2] === 2) {
          var list = _(m).last();
          if (list._id === r.key[0] && list.carNumber === r.key[1]) {
            list.defects.push(r.doc);
          }
          return m;
        }
      }, []);
  })
  .then(function(lists) {
    // bind to UI
  });
With Couch, we found reducing calls to the Couch engine to be more performant. I don't know whether this approach is better for PouchDB, but it should work as a solution, especially if you want to embed several collections into one list document.

How do I make a collection reactive based on the contents of another?

In short, I want to do:
Meteor.publish('items', function() {
  return Items.find({categoryId: Categories.find({active: true})});
});
The flag 'active' as part of 'Categories' changes regularly.
I also tried unsubscribing/resubscribing to the Items collection by leveraging reactivity on the Categories collection, and it works; unfortunately it re-triggers on ANY modification to the Categories collection, regardless of whether it affected the 'active' flag or not.
What are my options?
Nothing solved the issue of the items not being 'deleted' locally when the category is flagged as inactive on the server. The solution (ish) is:
Client:
Categories.find({active: true}).observeChanges({
  added: function() {
    itemsHandle && itemsHandle.stop();
    itemsHandle = Meteor.subscribe("items");
  }
});
Server:
Meteor.publish('items', function() {
  var category = Categories.findOne({active: true});
  return category && Items.find({categoryId: category._id});
});
I realize this isn't perfect (it still uses client-side code), but it works and it's the cleanest I could think of. I hope it helps someone!
A possible solution is to create a dependency object, watch for all Categories changes, and trigger the dependency change only when the active flag is toggled. Something along these lines:
var activeCount = Categories.find({active: true}).count();
var activeDep = new Deps.Dependency();

Deps.autorun(function() {
  var activeCountNow = Categories.find({active: true}).count();
  if (activeCountNow !== activeCount) {
    activeCount = activeCountNow;
    activeDep.changed();
  }
});

Meteor.publish('items', function() {
  activeDep.depend();
  var activeIds = Categories.find({active: true}).map(function(c) { return c._id; });
  return Items.find({categoryId: {$in: activeIds}});
});
Note: I'm only verifying whether the number of active categories has changed, so that I don't have to keep the active list in memory. This may or may not be appropriate, depending on how your app works.
Edit: Two-sided flavor mentioned in the comments:
Client:
var activeCount = Categories.find({active: true}).count();
var activeDep = new Deps.Dependency();

Deps.autorun(function() {
  var activeCountNow = Categories.find({active: true}).count();
  if (activeCountNow !== activeCount) {
    activeCount = activeCountNow;
    activeDep.changed();
  }
});

Deps.autorun(function() {
  activeDep.depend();
  Meteor.subscribe('items', new Date().getTime());
});
Server:
Meteor.publish('items', function(timestamp) {
  var t = timestamp; // unused, but a new value forces a fresh subscription
  var activeIds = Categories.find({active: true}).map(function(c) { return c._id; });
  return Items.find({categoryId: {$in: activeIds}});
});
Meteor.startup(function() {
  Categories.find().observe({
    addedAt: function(doc) {
      trigger();
    },
    changedAt: function(doc, oldDoc) {
      if (doc.active != oldDoc.active) {
        trigger();
      }
    },
    removedAt: function(oldDoc) {
      trigger();
    }
  });
});
Now, the trigger function should cause the publish to rerun. This is easy when it's on the client (change the subscription param); I'm not sure how to do this on the server, perhaps by running the publish again.
I use the following publish function to solve a similar issue. I think it is only the one-line nesting of queries that limits the reactivity; breaking one query out inside the publish function seems to avoid the issue.
//on server
Meteor.publish("articles", function() {
  var self = this;
  var subscriptions = [];
  var observer = Feeds.find({ subscribers: self.userId }, {_id: 1}).observeChanges({
    added: function(id) {
      subscriptions.push(id);
    },
    removed: function(id) {
      subscriptions.splice(subscriptions.indexOf(id), 1);
    }
  });
  self.onStop(function() {
    observer.stop();
  });
  var visibleFields = {_id: 1, title: 1, source: 1, date: 1, summary: 1, link: 1};
  return Articles.find({ feed_id: {$in: subscriptions} }, { sort: {date: -1}, limit: articlePubLimit, fields: visibleFields });
});
//on client anywhere
Meteor.subscribe( "articles" );
Here is another SO example which gets the search criteria from the client through subscribe if you decide that is acceptable.
Update: Since the OP struggled to get this going I made a gist and launched a working version on meteor.com. If you just need the publish function it is as above.
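For completeness, here is a rough sketch of that criteria-through-subscribe idea; the publication name and filter shape are made up for illustration, and the category lookup is still non-reactive inside the publish function, as discussed above:
// Client: pass the filter as a subscription argument.
Meteor.subscribe('itemsForCategories', { active: true });

// Server: validate the argument and use it to scope the published cursor.
Meteor.publish('itemsForCategories', function (filter) {
  check(filter, { active: Boolean });
  var categoryIds = Categories.find({ active: filter.active }).map(function (c) {
    return c._id;
  });
  return Items.find({ categoryId: { $in: categoryIds } });
});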

Persisting json data in jstree through postback via asp:hiddenfield

I've been poring over this for hours and I've yet to make much headway, so I was hoping one of the wonderful denizens of SO could help me out. Here's the problem...
I'm implementing a tree via the jstree plugin for jQuery. I'm pulling the data with which I populate the tree programmatically from our webapp via json dumped into an asp:HiddenField, basically like this:
JavaScriptSerializer serializer = new JavaScriptSerializer();
string json = serializer.Serialize(Items);
json = json.ToLower();
data.Value = json;
Then, the tree pulls the json from the hidden field to build itself. This works perfectly fine up until I try to persist data for which nodes are selected/opened. To simplify my problem I've hardcoded some json data into the tree and attempted to use the cookie plugin to persist the tree state data. This does not work, for whatever reason. I've seen other issues where people need to load the plugins in a specific order, etc., but this did not solve my issue. I tried the same setup with html_data and it works perfectly. With this working persistence I converted the cookie plugin to persist the data in a different asp:HiddenField (we can't use cookies for this type of thing in our application).
Essentially the cookie operations are identical; it just saves the array of nodes as the value of a hidden field. This works with html_data, but still not with the json, and I have yet to put my finger on where it's failing.
This is the jQuery.cookie.js replacement:
jQuery.persist = function(name, value) {
  if (typeof value != 'undefined') { // name and value given, set persist
    if (value === null) {
      value = '';
    }
    jQuery('#' + name).attr('value', value);
  } else { // only name given, get value
    var persistValue = null;
    persistValue = jQuery('#' + name).attr('value');
    return persistValue;
  }
};
The jstree.cookie.js code is identical save for a few variable name changes.
And this is my tree:
$(function() {
  $("#demo1").jstree({
    "json_data": {
      "data": [
        {
          "data": "A node",
          "children": [ "Child 1", "Child 2" ]
        },
        {
          "attr": { "id": "li.node.id" },
          "data": {
            "title": "li.node.id",
            "attr": { "href": "#" }
          },
          "children": ["Child 1", "Child 2"]
        }
      ]
    },
    "persistence": {
      "save_opened": "<%= open.ClientID %>",
      "save_selected": "<%= select.ClientID %>",
      "auto_save": true
    },
    "plugins": ["themes", "ui", "persistence", "json_data"]
  });
});
The data -is- being stored appropriately in the hidden fields; the problem occurs on a postback, when the nodes are not reopened. Any help would be greatly appreciated.
After looking through this some more, it appears to me that the issue is that the tree has not yet been built from the json_data when the persistence operations are attempted. Is there any way to postpone these actions until after the tree is fully loaded?
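(One rough sketch of that idea, assuming jsTree 1.x fires its loaded.jstree event once the tree has been built; restoreTreeState is a made-up helper standing in for whatever reads the hidden fields and reopens/reselects nodes:)
$("#demo1")
  .bind("loaded.jstree", function () {
    // The tree is built at this point, so reopening/reselecting saved nodes is safe here.
    restoreTreeState(); // hypothetical helper that reads the hidden fields
  })
  .jstree({ /* same configuration as above */ });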
If anyone is still attempting to perform the same type of operation on jsTree version 3.0+, there is an easier way to accomplish the same functionality, without editing any of jsTree's core JavaScript and without relying on the "state" plugin (known as "persistence" in version 1.0):
var jsTreeControl = $("#jsTreeControl");
// Can be an "asp:HiddenField"
var stateJSONControl = $("#stateJSONControl");
var url = "exampleURL";

jsTreeControl.jstree({
  'core': {
    "data": function (node, cb) {
      var thisVar = this;
      // On the initial load, if the "state" already exists in the hidden value
      // then simply use that rather than make an AJAX call
      if (stateJSONControl.val() !== "" && node.id === "#") {
        cb.call(thisVar, { d: JSON.parse(stateJSONControl.val()) });
      }
      else {
        $.ajax({
          type: "POST",
          url: url,
          async: true,
          success: function (json) {
            cb.call(thisVar, json);
          },
          contentType: "application/json; charset=utf-8",
          dataType: "json"
        }).responseText;
      }
    }
  }
});

// If the user changes the jsTree, save the full JSON of the jsTree into the hidden value;
// this will then be restored on postback by the "data" function in the jsTree declaration
jsTreeControl.on("changed.jstree", function (e, data) {
  if (typeof (data.node) != 'undefined') {
    stateJSONControl.val(JSON.stringify(jsTreeControl.jstree(true).get_json()));
  }
});
This code will create a jsTree and save its "state" into a hidden value; upon postback, when the jsTree is recreated, it will use its old "state" restored from the HiddenField rather than make a new AJAX call and lose the expansions/selections that the user has made.
Got it working properly with JSON data. I had to edit the "reopen" and "reselect" functions inside jstree itself.
Here's the new functioning reopen function for anyone who needs it.
reopen: function(is_callback) {
  var _this = this,
      done = true,
      current = [],
      remaining = [];
  if (!is_callback) { this.data.core.reopen = false; this.data.core.refreshing = true; }
  if (this.data.core.to_open.length) {
    $.each(this.data.core.to_open, function(i, val) {
      val = val.replace(/^#/, "");
      if (val == "#") { return true; }
      if ($(("li[id=" + val + "]")).length && $(("li[id=" + val + "]")).is(".jstree-closed")) { current.push($(("li[id=" + val + "]"))); }
      else { remaining.push(val); }
    });
    if (current.length) {
      this.data.core.to_open = remaining;
      $.each(current, function(i, val) {
        _this.open_node(val, function() { _this.reopen(true); }, true);
      });
      done = false;
    }
  }
  if (done) {
    // TODO: find a more elegant approach to synchronizing returning requests
    if (this.data.core.reopen) { clearTimeout(this.data.core.reopen); }
    this.data.core.reopen = setTimeout(function() { _this.__callback({}, _this); }, 50);
    this.data.core.refreshing = false;
  }
},
The problem was that it was trying to find the element by a custom attribute: it was pushing plain id strings into the array to search when it was expecting node objects. Using this line
if ($(("li[id=" + val + "]")).length && $(("li[id=" + val + "]")).is(".jstree-closed")) { current.push($(("li[id=" + val + "]"))); }
instead of
if ($(val).length && $(val).is(".jstree-closed")) { current.push(val); }
was all it took. Using a similar process I was able to persist the selected nodes this way as well.
Hope this is of help to someone.
