Auto sync Google Sheets to Firebase without a button

I used a tutorial to sync my sheet to Firebase with a SYNC button that activates the script; the button currently sits in the middle of the spreadsheet. I want the data to sync to Firebase automatically whenever changes are made.
function getFirebaseUrl(jsonPath) {
  // `secret` is the database secret, defined elsewhere in the tutorial's script
  return (
    'https://no-excusas.firebaseio.com/' +
    jsonPath +
    '.json?auth=' +
    secret
  )
}
function syncMasterSheet(sheetHeaders, sheetData) {
  /*
  We make a PUT (update) request and send a JSON payload.
  More info on the REST API here: https://firebase.google.com/docs/database/rest/start
  */
  const outputData = [];
  for (var i = 0; i < sheetData.length; i++) {
    var row = sheetData[i];
    var newRow = {};
    for (var j = 0; j < row.length; j++) {
      newRow[sheetHeaders[j]] = row[j];
    }
    outputData.push(newRow);
  }
  var options = {
    method: 'put',
    contentType: 'application/json',
    payload: JSON.stringify(outputData)
  }
  var fireBaseUrl = getFirebaseUrl("UsersSheets")
  UrlFetchApp.fetch(fireBaseUrl, options)
}
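For instance, with headers ["Name", "Email"] and one data row ["Ann", "ann@example.com"] (made-up sample values), the PUT payload would be:

[
  { "Name": "Ann", "Email": "ann@example.com" }
]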
function startSync() {
  // Get the currently active sheet
  var sheet = SpreadsheetApp.getActiveSheet()
  // Get the number of rows and columns which contain some content
  var [rows, columns] = [sheet.getLastRow(), sheet.getLastColumn()]
  // Get the data contained in those rows and columns as a 2-dimensional array,
  // and get the headers in a separate array
  var headers = sheet.getRange(1, 1, 1, columns).getValues()[0]; // [0] to unwrap the outer array
  var data = sheet.getRange(2, 1, rows - 1, columns).getValues(); // skipping the header row means we need to reduce rows by 1
  // Use the syncMasterSheet function defined before to push this data to the
  // "UsersSheets" key in the firebase database
  syncMasterSheet(headers, data)
}

Normally, it would be ok to just define an onEdit function in your code, like this:
function onEdit(event) {
  startSync();
}
However, because you are making external requests via UrlFetchApp.fetch(), this will fail with an error about not having the https://www.googleapis.com/auth/script.external_request permission (gobs more detail about trigger authorization here).
Instead, you need to manually create an installable trigger.
This is reasonably straightforward: in the script editor, open your project's triggers, then select "Add Trigger" and create the on-edit trigger. Alternatively, you can create it from code, as sketched below.
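A minimal sketch of creating the same trigger programmatically (run it once from the editor; createOnEditTrigger is a name chosen for illustration, not part of the tutorial):

function createOnEditTrigger() {
  // Installs an installable on-edit trigger that runs startSync()
  // with full authorization, including external requests
  ScriptApp.newTrigger('startSync')
    .forSpreadsheet(SpreadsheetApp.getActive())
    .onEdit()
    .create();
}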
You should think about whether you really want this running on every edit, however, as the requests could be quite large (the entire sheet is synced) and frequent (once per edit).

When you make a change to a spreadsheet, its onEdit event fires. So that's where you'd trigger that save with something like this:
function onEdit(event) {
  startSync();
}
But since onEdit fires for each edit, this may end up saving a lot more than really necessary. So you may want to debounce and only save after some inactivity.
Note that Apps Script has no setTimeout, and globals don't survive between executions, so the debounce state has to live outside the script run, for example in CacheService. Something like this (it still needs to run from an installable trigger, for the reason above):
function onEdit(event) {
  var cache = CacheService.getScriptCache();
  var now = Date.now();
  // Remember the time of the latest edit (expires after 60s)
  cache.put('lastEdit', String(now), 60);
  // Wait out the quiet period, then sync only if no newer edit arrived
  Utilities.sleep(2000);
  if (Number(cache.get('lastEdit')) === now) {
    startSync();
  }
}

Related

Update collection with an array in firebase

I need to update a collection with values like this:
{
  "email" : "x#gmail.com",
  "fullName" : "Mehr",
  "locations" : ["sss","dsds","adsdsd"]
}
locations needs to be an array. How can I do that in Firebase, and also check for duplicates?
I did it like this:
const locations=[]
locations.push(id)
firebase.database().ref(`/users/ + ${userId}`).push({ locations })
Since you need to check for duplicates, you'll need to first read the value of the array and then update it. In the Firebase Realtime Database that combination is done through a transaction. You can run the transaction on the locations node itself:
var locationsRef = firebase.database().ref(`/users/${userId}/locations`);
var newLocation = "xyz";
locationsRef.transaction(function(locations) {
  // locations is null on the first run, when the node doesn't exist yet
  if (locations === null) locations = [];
  if (locations.indexOf(newLocation) === -1) {
    locations.push(newLocation);
  }
  return locations;
});
As you can see, this loads the locations, ensures the new location is present once, and then writes it back to the database.
Note that Firebase recommends against using arrays for set-like data structures such as this. Consider instead the more direct mapping of a mathematical set to JavaScript:
"locations" : {
"sss": true,
"dsds": true,
"adsdsd": true
}
One advantage of this structure is that adding a new value is an idempotent operation. Say that we have a location "sss". We add it to the locations node with:
locations["sss"] = true;
Now there are two options:
- "sss" was not yet in the node, in which case this operation adds it.
- "sss" was already in the node, in which case this operation does nothing.
For more on this, see best practices for arrays in Firebase.
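In this structure, adding a location is then a single write (a sketch, assuming the same /users/${userId} path as above and a location key free of characters that are forbidden in Firebase paths):

// set() on the child key is idempotent: writing true twice is a no-op
firebase.database().ref(`/users/${userId}/locations/${newLocation}`).set(true);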
You can simply push the items in a loop:
if (locations.length > 0) {
  var ref = firebase.database().ref(`/users/${userId}`).child('locations');
  for (var i = 0; i < locations.length; i++) {
    ref.push(locations[i]);
  }
}
This also creates unique keys for the items, instead of a numerical index (which tends to change).
You can use the update method rather than push; it will be much easier for you. Try it like below:
var locationsObj = {};
if (locations.length > 0) {
  for (var i = 0; i < locations.length; i++) {
    var key = firebase.database().ref(`/users/${userId}`).child('locations').push().key;
    locationsObj[`/users/${userId}/locations/` + key] = locations[i];
  }
  // update() returns a promise
  firebase.database().ref().update(locationsObj).then(function() {
    console.log("successfully updated");
  })
}
Note: the update method writes to multiple paths at the same time, which is helpful in this case. If you use push in a loop you have to wait for each push to return its promise, whereas update takes care of all of them and settles once, with either success or an error.

Google App Maker how to create Data Source from Google Contacts

Using Google App Maker, how do you create a data source from Google Contacts? There is an employee HR example app, but I want to similarly manage contacts (add, modify, delete) and use select criteria.
At this time this task is not trivial in App Maker, and it is pretty much generic: the question could be reworded as "CRUD operations with 3rd party datasources". Let's break it into smaller parts and address them separately.
Read/list contacts
This task is relatively easy. You need to use a Calculated Model to proxy the Apps Script Contacts API (ContactsApp) response. Once you create a model with a subset of fields from the Contact response, you can create a datasource for the model and bind it to a List or Table widget. You can also try to find some inspiration in the Calculated Model Sample.
// Server side script
function getContacts_() {
  var contacts = ContactsApp.getContacts();
  var records = contacts.map(function(contact) {
    var record = app.models.Contact.newRecord();
    record.FirstName = contact.getGivenName();
    record.LastName = contact.getFamilyName();
    var companies = contact.getCompanies();
    if (companies.length > 0) {
      var company = companies[0];
      record.Organization = company.getCompanyName();
      record.Title = company.getJobTitle();
    }
    var emails = contact.getEmails();
    if (emails.length > 0) {
      record.Email = emails[0].getAddress();
    }
    var phones = contact.getPhones();
    if (phones.length > 0) {
      record.Phone = phones[0].getPhoneNumber();
    }
    return record;
  });
  return records;
}
Create/Update/Delete
Since Calculated Models have some limitations, we need to turn on our imagination to create, update and delete records from their datasources. The basic strategy is to call server-side scripts for the CUD operations in response to user actions on the client side. To get the user's input from the UI we will need to utilize the page's Custom Properties, one property for each Contact field.
Here are some snippets that should explain the idea.
Create
// Client script
function onSubmitContactClick(submitButton) {
  var props = submitButton.root.properties;
  var contact = {
    FirstName: props.FirstName,
    LastName: props.LastName,
    Organization: props.Organization,
    ...
  };
  google.script.run
    .withSuccessHandler(function() {
      // Most likely we'll need to navigate the user back to the
      // page with the contacts list and reload its datasource
      // to reflect recent changes, because our `CUD` operations
      // are fully detached from the list datasource
      app.showPage(app.pages.Contacts);
      app.datasources.Contacts.load();
    })
    .withFailureHandler(function() {
      // TODO: Handle error
    })
    .createContact(contact);
}
// Server script
function createContact(contactDraft) {
  var contact = ContactsApp.createContact(contactDraft.FirstName,
    contactDraft.LastName,
    contactDraft.Email);
  contact.addCompany(contactDraft.Organization, contactDraft.Title);
  contact.addPhone(ContactsApp.Field.WORK_PHONE, contactDraft.Phone);
}
Update
Updating contact records is very similar to the new contact creation flow, so I'll skip it for now; a possible sketch follows.
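For completeness, a possible server-side sketch of that update flow (my illustration, not from the original answer; it assumes the contact's email is unchanged and is used to look the record up):

// Server script (hypothetical sketch)
function updateContact(contactDraft) {
  var contact = ContactsApp.getContact(contactDraft.Email);
  contact.setGivenName(contactDraft.FirstName);
  contact.setFamilyName(contactDraft.LastName);
}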
Delete
Assuming that the delete button is located inside a contacts table row:
// Client script
function onDeleteContactClick(deleteButton) {
  var email = deleteButton.datasource.item.Email;
  google.script.run
    .withSuccessHandler(function() {
      // To update the contacts list we can either reload the entire
      // datasource or explicitly remove the deleted item on the client.
      // The second option works way faster.
      var contactIndex = deleteButton.parent.childIndex;
      app.datasources.Contacts.items.splice(contactIndex, 1);
    })
    .withFailureHandler(function() {
      // TODO: Handle error
    })
    .deleteContact(email);
}
// Server script
function deleteContact(email) {
  var contact = ContactsApp.getContact(email);
  ContactsApp.deleteContact(contact);
}

Meteor-angular autocomplete from huge data

I have an angular-meteor app that needs a Material md-autocomplete fed from a collection of 53,296 documents, paginated with angularUtils.directives.dirPagination, but this amount of data makes my browser hang.
I'm publishing the collection with:
Meteor.publish('city', function (options, searchString) {
  var where = {
    'city_name': {
      '$regex': '.*' + (searchString || '') + '.*',
      '$options': 'i'
    }
  };
  return City.find(where, options);
});
I subscribe with:
subscriptions: function () {
  Meteor.subscribe('city');
  this.register('city', Meteor.subscribe('city'));
}
and have pagination in the controller:
$scope.currentPage = 1;
$scope.pageSize = 100;
$scope.sort = {city_name_sort: 1};
$scope.orderProperty = '1';
$scope.helpers({
  city: function() {
    return City.find({});
  }
});
but it takes a long time to load and makes Chrome stop responding.
You already have most of the server-side searching done because your search runs inside a publication. You should make sure that the city_name field is indexed in Mongo! You should also return only that field, to minimize data transfer, and you can simplify your regex:
Meteor.publish('city', function (searchString) {
  const re = new RegExp(searchString, 'i');
  const where = { city_name: { $regex: re }};
  return City.find(where, {sort: {city_name: 1}, fields: {city_name: 1}});
});
What I've found helps with server-side auto-complete is:
Don't start searching until the user has typed 3 or 4 characters. This drastically narrows down the search results.
Throttle the search to only run every 500ms so that you're not sending every character to the server, forcing it to keep re-executing the search. If the person is typing fast, the search might then only run every 2 or 3 characters (see the sketch after this list).
Run the same .find() on the client that you're running on the server (instead of just querying for {}). That's just good practice: since the client-side collection is the union of all subscriptions on that collection, there might be documents there that you don't want to list.
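A minimal client-side sketch of the first two tips (assuming underscore/lodash's throttle is available; updateSearch and searchText are placeholder names):

// Call this from the autocomplete's change handler
var updateSearch = _.throttle(function (searchText) {
  if (searchText.length < 3) return; // don't search until 3+ characters
  Meteor.subscribe('city', searchText); // re-run the narrowed publication
}, 500); // at most one search per 500ms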
Lastly, I don't know why you're subscribing twice here:
subscriptions: function () {
  Meteor.subscribe('city');
  this.register('city', Meteor.subscribe('city'));
}
Only one of those Meteor.subscribe('city') calls is necessary.
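i.e. keep just the registered subscription:

subscriptions: function () {
  this.register('city', Meteor.subscribe('city'));
}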

angularFire startAt querying and binding deletes new data

The application shows work shifts for a certain time period. firebaseConn.getShifts is the API function that gets the shift data for the given period.
versions:
firebase: 2.0.6
angularFire: 0.9.0 (confirmed with 0.8.2 also)
This is my firebase schema:
And this is the code:
.factory('watchers', function(bunch-of-dependencies) {
  var unbindShifts = function() {};
  var inited = false;
  var shifts = {};
  ... some irrelevant code in between ...
  function initShifts() {
    unbindShifts();
    shifts.object = firebaseConn.getShifts(false, from, to, $scope);
    $scope.shifts = shifts.object;
    shifts.object.$bindTo($scope, "shifts").then(function(unbind) {
      unbindShifts = unbind;
    });
  }
The Firebase queries (these worked fine before adding the unbind/bind; the time-based querying might be causing issues too):
firebaseConn.getShifts = function(asArray, from, to, scope) {
  return cacheRequest(FBURL + "shifts", asArray, [from, to]);
};

function cacheRequest(url, asArray, limits) {
  var type = asArray ? "array" : "object";
  var startAt = limits ? limits[0] : undefined;
  var endAt = limits ? limits[1] : undefined;
  var retObj, FBRef;
  cached[url] = cached[url] || {};
  /* If there are limits parameters we don't cache at all atm,
     since those queries should be checked differently than static urls */
  if (!limits && cached[url][type]) {
    FBRef = cached[url][type];
  } else {
    FBRef = cached[url][type] = createFBRef(url, startAt, endAt);
  }
  if (asArray) {
    retObj = FBRef.$asArray();
  } else {
    retObj = FBRef.$asObject();
  }
  return retObj;
}

function createFBRef(resourceURL, startAt, endAt) {
  var modifiedObject = $firebase( createRef(resourceURL).orderByKey().startAt(startAt).endAt(endAt) );
  return modifiedObject;
}

function createRef(resourceURL) {
  return new Firebase(resourceURL);
}
Now I have located the problem: it is the query limiting. If the from and to dates are undefined, this works without problems. But I need to be able to limit the amount of data, since loading many years of work-shift data to show a week's time won't be good :).
The actual problem is not displaying and fetching the data; everything works fine. It's related to the times and re-binding.
If I make any changes to e.g. the "20150115" tree, for example adding another "groups" child there, then when I unbind and rebind, the whole "20150115" tree gets deleted. This holds true only for the latest change: if I add children to different dates, e.g. "20150113", "20150114" and "20150115", with the latest change in "20150115", and then unbind and re-bind another time, all the other root paths stay as they are, but the latest change will make the whole "20150115" tree get deleted.
I hope I make myself clear; for safety, I'll explain it again in a simpler way:
- Changes are made to 1. "20150113", 2. "20150114", 3. "20150115" through the app.
- Changing the timeline from the UI causes an unbind + re-bind.
- As a side-effect the whole "20150115" tree gets deleted.
The problem is somehow related to advanced querying with orderByKey().startAt(startAt).endAt(endAt) and binding.
Also, for additional info: the data added through the UI gets added to the Firebase database, but when the re-binding happens, the data is deleted from the database. Specifically it happens on rebind; unbinding causes no issues if I delay rebinding with a timeout.
EDIT:
I have found the source of the actual issue. After the new binding is in place and everything seems to be in order, an Angular watch event kicks in. The event tries to save the last change the user made before re-binding.
So if I have an active timeline for December (20141201 - 20141230) and I change the "20141225" data, then change the timeline to 20150101 - 20150130 (causing an unbind and rebind, or manually fetching new data), there will be an event after the binding has been done, when everything seems to be in order, trying to save the 20141225 data to either the new timeline (20150101 - 20150130) or the old one, I'm not sure which. This causes Firebase to actually delete the whole 20141225 tree, instead of saving the data.
The new data makes it into your Firebase fine, which you can see by either checking your Firebase dashboard or by running a quick snippet like this in your browser's dev console:
new Firebase("https://firebaseurl").once('value', function(s) { console.log(s.val()); })
The data even makes it back into your application. The only problem is that Angular doesn't know that new data has arrived, so it doesn't update the view with the new data.
Normally AngularFire's $asObject and $asArray methods take care of notifying AngularJS when new data arrives from Firebase. But since you are constantly creating new queries, you'll have to take care of that yourself.
There are a few ways to signal the new data to AngularJS and I'm definitely not an expert on which one is best. But if you add $scope.$apply(); to your setDays function it works:
function setDays(ref) {
  var FBRange = setFBRange(ref, from, to);
  var days;
  unbindDays();
  days = $firebase(FBRange).$asObject();
  $scope.days = days;
  days.$bindTo($scope, "days").then(function(unbind) {
    unbindDays = unbind;
    // As a result of the new binding an entry was getting
    // mysteriously deleted from firebase
  });
  $scope.$apply(); // Tell AngularJS about the new data, so that it updates the view
}

function setFBRange(ref, from, to) {
  return ref.orderByKey().startAt("" + from).endAt("" + to);
}
Updated Plunkr with this change (and some others to help in debugging): http://plnkr.co/edit/YZtkzUNtjQUCcw4xb2mj?p=preview

Trouble reading sqlite3 database columns of type blob with sql.js

So I am using the sql.js library, i.e. the port of SQLite to JavaScript, which can be found here: https://github.com/kripken/sql.js.
This is my code to open and read a database that comes from a flat file stored locally.
First, a local file is selected via this HTML:
<input type="file" id="input" onchange="handleFiles(this.files)">
The JS code behind the scenes is as follows:
function handleFiles(files) {
  var file = files[0];
  var reader = new FileReader();
  reader.readAsBinaryString(file);
  openDbOnFileLoad(reader);

  function openDbOnFileLoad(reader) {
    setTimeout(function () {
      if (reader.readyState == reader.DONE) {
        //console.log(reader.result);
        db = SQL.open(bin2Array(reader.result));
        execute("SELECT * FROM table");
      } else {
        //console.log("Waiting for loading...");
        openDbOnFileLoad(reader);
      }
    }, 500);
  }
}
function execute(commands) {
  commands = commands.replace(/\n/g, '; ');
  try {
    var data = db.exec(commands);
    console.log(data);
  } catch (e) {
    console.log(e);
  }
}

function bin2Array(bin) {
  'use strict';
  var i, size = bin.length, ary = [];
  for (i = 0; i < size; i++) {
    ary.push(bin.charCodeAt(i) & 0xFF);
  }
  return ary;
}
Now this works and I can access all the columns and values in the database; however, there is one column of type blob and it just shows up as empty. Any ideas on how I can access the contents of this blob?
The correct answer!
So what I was trying to ask in this question is simply how to read the contents of a column of type blob using sql.js. The correct answer is to specify the column names in the query and, for the column that contains blob data, get its contents using the hex function, i.e. select column1, hex(column2) from table. It was by no means a question about the most efficient way of doing this. I have also written a blog post about this.
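For example, using the execute() helper above (the table and column names here are placeholders), and converting the returned hex string back into bytes on the JS side:

execute("SELECT id, hex(data) FROM files");

// Convert the hex string returned by hex() back into bytes
function hexToBytes(hex) {
  var bytes = new Uint8Array(hex.length / 2);
  for (var i = 0; i < hex.length; i += 2) {
    bytes[i / 2] = parseInt(hex.substr(i, 2), 16);
  }
  return bytes;
}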
Here is a slightly modified copy of the function responsible for initializing my sqlite database:
sqlite.prototype._initQueryDb = function(file, callback) {
  var self = this;
  var reader = new FileReader();
  // Fires when the file blob is done loading to memory
  reader.onload = function(event) {
    var arrayBuffer = event.target.result,
        eightBitArray = new Uint8Array(arrayBuffer),
        database = SQL.open(eightBitArray);
    self._queryDb = database;
    // Trigger the callback to the calling function
    callback();
  }
  // Start reading the file blob
  reader.readAsArrayBuffer(file);
}
In this case, file is a local sqlite database handle that I get from an HTML input element. I specify a function to call when a change event happens to that input and get the blob from the resulting event.target.files[0] object.
For the sake of brevity on my part I left some things out, but I can throw together a smaller, simplified example if you are still struggling.
The answer is: with kripken's sql.js, which you mentioned above, you can't. At least as of today (May 2014). The original author doesn't maintain sql.js anymore.
However, I'm the author of a fork of sql.js, that is available here: https://github.com/lovasoa/sql.js .
This fork brings several improvements, including support for prepared statements, in which, contrarily to the original version, values are handled in their natural javascript type, and not only as strings.
With this version, you can handle BLOBs (both for reading and writing), they appear as Uint8Arrays (that you can for instance convert to object URL to display contents to your users).
Here is an example of how to read blob data from a database:
var db = new SQL.Database(eightBitArray); // eightBitArray can be an Uint8Array
var stmt = db.prepare("SELECT blob_column FROM your_table");
while (stmt.step()) { // Executed once for every row of result
  var my_blob = stmt.get()[0]; // Get the first column of result
  // my_blob is now an Uint8Array, do whatever you want with it
}
db.close(); // Free the memory used by the database
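For instance, to display such a blob to the user as an object URL, as mentioned above (a sketch; the PNG MIME type and the <img id="preview"> element are assumptions):

var blob = new Blob([my_blob], {type: 'image/png'}); // MIME type is an assumption
var url = URL.createObjectURL(blob);
document.getElementById('preview').src = url; // assumes an <img id="preview"> element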
You can see the full documentation here: http://lovasoa.github.io/sql.js/documentation/
