I have a survey in a Google Form whose responses get stored in a Google Sheet. From the Google Sheet the data gets synchronized with Firebase.
My trigger "when changes occur" is set up on the Google Sheet, since the answers are automatically stored there.
The problem is that the trigger does not get called when a user submits their answers.
But if I write directly into the Google Sheet, my script gets called and the data is stored in Firebase.
And when I run my script manually, the data also gets stored in Firebase.
So it basically seems that the Google Sheet trigger does not fire when the data is passed in by the form itself.
Do I have to write a script for the form as well?
This is my script for the sheet:
function writeDataToFirebase() {
  var ss = SpreadsheetApp.openById("SpreadsheetID");
  var sheet = ss.getSheets()[0];
  var data = sheet.getDataRange().getValues();
  var dataToImport = {};
  for (var i = 1; i < data.length; i++) {
    var timeStamp = data[i][0];
    var uuid = data[i][62];
    dataToImport[timeStamp] = {
      timeStamp: timeStamp,
      uuid: uuid,
      a: data[i][1],
      b: data[i][2],
      c: data[i][3],
      d: data[i][4],
      e: data[i][5],
      f: data[i][6],
      g: data[i][7],
      var1: data[i][8],
      var2: data[i][9],
      var3: data[i][10],
      var4: data[i][11],
      var5: data[i][12],
      var6: data[i][13],
      var7: data[i][14],
      var8: data[i][15],
      var9: data[i][16]
    };
  }
  var firebaseUrl = "URL";
  var secret = "Secret";
  var base = FirebaseApp.getDatabaseByUrl(firebaseUrl, secret);
  base.setData("", dataToImport);
}
Maybe someone can help me figure out how I can fully automate this procedure.
When using Apps Script triggers it is important to keep the following in mind:
Script executions and API requests do not cause triggers to run. For example, calling Range.setValue() to edit a cell does not cause the spreadsheet's onEdit trigger to run.
The same scenario applies to your situation: the form writes its answers into the sheet programmatically, so the "when changes occur" trigger you chose never runs.
Since you want this function to run when you receive an answer in your form, the best approach in this situation is to use an onFormSubmit trigger.
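For illustration, here is a minimal sketch of installing such a trigger programmatically; it assumes the writeDataToFirebase function and the "SpreadsheetID" placeholder from your question, and only needs to be run once:

function createFormSubmitTrigger() {
  // Assumes the same spreadsheet ID placeholder used in the question.
  var ss = SpreadsheetApp.openById("SpreadsheetID");
  // Run writeDataToFirebase whenever a linked form response lands in this sheet.
  ScriptApp.newTrigger("writeDataToFirebase")
    .forSpreadsheet(ss)
    .onFormSubmit()
    .create();
}

You can achieve the same thing without code via Edit > Current project's triggers, choosing the "From spreadsheet" event source and the "On form submit" event type.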
Reference
Apps Script Triggers.
I've looked extensively and tried to modify multiple code samples found in different posts on Stack Overflow as well as template documents in Google App Maker, but cannot for the life of me get an export and an email function to work.
UserRecords table:
This is the area where the data is collected and reviewed, the populated table:
These are the data fields I am working with:
This is what the exported Sheet looks like when I go through the motions and do an export through the Deployment tab:
Lastly, this is the email page that I've built based on tutorials and examples I've seen:
What I've learned so far (based on the circles I'm going round in):
Emails seem mostly straightforward, but I don't need to send a message, just an attachment with a subject, similar to using this code:
function sendEmail_(to, subject, body) {
  var emailObj = {
    to: to,
    subject: subject,
    htmlBody: body,
    noReply: true
  };
  MailApp.sendEmail(emailObj);
}
I'm not sure how to change the "body" to the exported document.
To straight-up export and view the Sheet from a button click, the closest I've found to a solution is in Document Sample, but the references in that code speak to components on the page only. I'm not sure how to modify it to use the table, nor what to change to get a sheet instead of a doc.
This may seem trivial to some but I'm a beginner and am struggling to wrap my head around what I'm doing wrong. I've been looking at this for nearly a week. Any help will be greatly appreciated.
In its simplest form, you can do a Google Sheet export with the following server script (this is based on a model called Employees):
function exportEmployeeTable() {
  // If only certain roles or individuals can perform this action, include proper validation here.
  var query = app.models.Employees.newQuery();
  var results = query.run();
  var fields = app.metadata.models.Employees.fields;
  var data = [];
  var header = [];
  for (var i in fields) {
    header.push(fields[i].displayName);
  }
  data.push(header);
  for (var j in results) {
    var rows = [];
    for (var k in fields) {
      rows.push(results[j][fields[k].name]);
    }
    data.push(rows);
  }
  if (data.length > 1) {
    var ss = SpreadsheetApp.create('Employee Export');
    var sheet = ss.getActiveSheet();
    sheet.getRange(1, 1, data.length, header.length).setValues(data);
    // Here you could return the URL for your spreadsheet back to your client
    // by setting up a success handler and a failure handler.
    return ss.getUrl();
  } else {
    throw new app.ManagedError('No Data to export!');
  }
}
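To cover the emailing half of the question: a hedged sketch of sending the export as an attachment follows. The Sheets export endpoint pattern is standard, but the emailEmployeeExport name and the .xlsx choice are my assumptions, not App Maker specifics.

function emailEmployeeExport(to) {
  // Reuses exportEmployeeTable() above, which returns the new spreadsheet's URL.
  var ss = SpreadsheetApp.openByUrl(exportEmployeeTable());
  // Google Sheets exposes an export endpoint; fetch the file as an .xlsx blob.
  var exportUrl = 'https://docs.google.com/spreadsheets/d/' + ss.getId() + '/export?format=xlsx';
  var blob = UrlFetchApp.fetch(exportUrl, {
    headers: { Authorization: 'Bearer ' + ScriptApp.getOAuthToken() }
  }).getBlob().setName('Employee Export.xlsx');
  // Send just a subject and the attachment, as the question asks.
  MailApp.sendEmail({
    to: to,
    subject: 'Employee Export',
    body: 'Export attached.',
    attachments: [blob]
  });
}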
I'm trying to have my Google Sheet synced with my Firebase database. I'm not very experienced with JavaScript, so is it possible using the method below? The idea is that it would automatically sync every time a row gets created/updated/deleted. I know that I need the script files, but I'm not sure how to import them in the .gs file, so that's why they're in the html.
Many thanks!
translate.gs
function saveToFirebase() {
  var config = {
    apiKey: "MY_API_KEY",
    authDomain: "MY_DOMAIN.firebaseapp.com",
    databaseURL: "MY_DOMAIN.firebaseio.com",
    projectId: "MY_DOMAIN",
    storageBucket: "MY_DOMAIN.appspot.com",
    messagingSenderId: "MESSAGE_ID"
  };
  firebase.initializeApp(config);
  var database = firebase.database();
  database.ref('food/' + MY_USER_UID).set({
    name: "pizza funghi",
  });
}
sidebar.html
<!DOCTYPE html>
<html>
  <head>
    <script src="https://www.gstatic.com/firebasejs/4.12.0/firebase-app.js"></script>
    <script src="https://www.gstatic.com/firebasejs/4.12.0/firebase-auth.js"></script>
    <script src="https://www.gstatic.com/firebasejs/4.12.0/firebase-database.js"></script>
  </head>
  <body>
  </body>
</html>
There is a third-party library which integrates with Firebase's REST API. If you're comfortable using it, this becomes pretty straightforward.
First we'll need to create a tab to track changes. We need the identity of those who make changes, so we have to break this into two parts - a simple onEdit trigger which runs as the modifying user, and an installable trigger which I'll call uploadChanges. The latter is what talks to Firebase.
Create a tab called changes
Add a frozen row with the following headers:
Uploaded
User
Value
Install the third party Firebase library
Begin by clicking Resources > Libraries in the script editor, then pasting MYeP8ZEEt1ylVDxS7uyg9plDOcoke7-2l in the "Find a Library" box. Hit Save.
Opt for stability by choosing the latest public release, or choose the latest release (I chose latest while writing this).
Click OK
Now would be a good time to peruse the reference docs so you know what I'm up to in the below instructions :-)
Set up security (I'm assuming you want this script to run as you)
Make your Google account (which runs the script) be at least an Editor for your Firebase project.
Set the appropriate authorization scopes for your App Script project:
Go to File > Project Properties > Scopes in the App Script editor
Select View > Show manifest file (the manifest file is usually hidden by default)
Add https://www.googleapis.com/auth/userinfo.email and https://www.googleapis.com/auth/firebase.database to the oauthScopes array (adding the array if it's not already there)
Save the manifest file (a minimal sketch follows below). Next time you run the script you'll get a pop-up asking about permissions.
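For reference, a minimal sketch of what the relevant part of appsscript.json might then look like. The spreadsheet scope is my assumption: once you list scopes explicitly, the manifest must cover everything the script does, and the script below also reads the spreadsheet.

{
  "oauthScopes": [
    "https://www.googleapis.com/auth/userinfo.email",
    "https://www.googleapis.com/auth/firebase.database",
    "https://www.googleapis.com/auth/spreadsheets"
  ]
}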
The equivalent of your translate.gs above, which always just sets your food to `pizza funghi`, would look like this:
function saveToFirebase() {
  var dbUrl = "MY_DOMAIN.firebaseio.com"; // Set appropriately (the database URL, not the firebaseapp.com domain)
  var token = ScriptApp.getOAuthToken(); // Depends on security setup above
  var firebase = FirebaseApp.getDatabaseByUrl(dbUrl, token);
  var newData = {
    name: "pizza funghi",
  };
  firebase.setData('food/' + MY_USER_UID, newData);
}
But you said you wanted to update Firebase on every save. To do this you really just want to rip off one of the various onEdit tutorials floating around the net. The resulting onEdit should look something like this:
function onEdit(e) {
  // First get stuff about the edit.
  // This approach only gets the top left cell of a multi-cell edit.
  var editRange = e.range; // The edited range
  var newValue = editRange.getValue();
  // Next, who is the editor? Remove the `split` for the full email.
  var username = Session.getActiveUser().getEmail().split('@')[0];
  if (username == '') {
    username = SOME_REASONABLE_DEFAULT; // Or give up if you wish
  }
  // Finally save the change
  SpreadsheetApp.getActiveSpreadsheet()
    .getSheetByName('changes')
    .appendRow([false, username, newValue]);
}
function uploadChanges() {
  // Attach to Firebase
  var dbUrl = "MY_DOMAIN.firebaseio.com"; // Set appropriately
  var token = ScriptApp.getOAuthToken(); // Depends on security setup above
  var firebase = FirebaseApp.getDatabaseByUrl(dbUrl, token);
  // Get content of changes tab
  var changeSheet = SpreadsheetApp.getActiveSpreadsheet()
    .getSheetByName('changes');
  var changeData = changeSheet.getDataRange()
    .getValues();
  // Upload all new-to-us changes
  for (var i = 1; i < changeData.length; i++) {
    if (changeData[i][0]) {
      continue; // We already uploaded this one
    }
    changeData[i][0] = true; // Optimistically assume we'll succeed
    var newData = {
      name: changeData[i][2]
    };
    var username = changeData[i][1];
    firebase.setData('food/' + username, newData);
  }
  // Blanket update of change-data sheet to update upload status
  changeSheet.getRange(1, 1, changeData.length, changeData[0].length)
    .setValues(changeData);
}
Lastly, set up some triggers (the time-driven one can also be created from code, as sketched after this list).
Choose Edit > Current Project's Triggers in the script editor
Add a new trigger for onEdit
Choose onEdit from the leftmost Run dropdown
Choose From spreadsheet in the Events dropdown
Then choose On edit in the rightmost dropdown
Add a new trigger for uploadChanges
Choose uploadChanges from the leftmost Run dropdown
Choose Time-driven from the Events dropdown
Set up a schedule that's appropriate to your needs
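If you prefer, here is a minimal sketch of creating that time-driven trigger from code instead of through the dialog (the five-minute interval is just an example):

function createUploadTrigger() {
  // Run uploadChanges every 5 minutes; adjust the schedule to your needs.
  ScriptApp.newTrigger('uploadChanges')
    .timeBased()
    .everyMinutes(5)
    .create();
}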
EDIT: My original script had you doing everything in onEdit, which tehhowch correctly points out won't work, since we're talking to another service. I've updated it to stage changes to a "changes" tab, which I include in the setup. My new approach maintains a perpetual record of old uploads; for performance you might instead choose to just clear the changes sheet once you've done the upload.
I need help understanding how to delete all the data from a table and then automatically import a new sheet of data into the freshly cleared-down table.
I'm currently trying the unload() method client-side, but that doesn't seem to clear down my tables:
function ClearDown() {
  app.datasources.P11d.unload(function() {});
  console.log('Finish Delete');
}
I've also tried to create a server-side function, which also doesn't appear to work:
function ClearTable() {
  var records = app.models.P11d.newQuery();
  // records.run();
  console.log('Server Function Ran');
  app.deleteRecords(records.run());
}
This is run from a client-side function:
function Delete() {
  google.script.run.withSuccessHandler(function(result) {
  }).ClearTable();
  console.log('Function Ran');
}
Again, this is all to no avail.
On the import side, I've tried the below:
Client Side:
function ImportData() {
  console.log('Begin');
  var ss = SpreadsheetApp.openById('SHEET ID');
  var values = ss.getSheetByName('P11d').getDataRange().getValues();
  var ssData = [];
  // app.datasources.P11d.unload(function() {});
  for (var i = 0; i < values.length; i++) {
    var newRecord = app.models.P11d.newRecord();
    // add all fields to the new record
    newRecord.Reg_Number = values[i][0];
    newRecord.Reg_Date = values[i][1];
    newRecord.Model_Description = values[i][2];
    newRecord.P11d_Value = values[i][3];
    newRecord.EngineSize = values[i][4];
    newRecord.Fuel = values[i][5];
    newRecord.CO2 = values[i][6];
    newRecord.SIPP = values[i][7];
    newRecord.GTA_Code = values[i][8];
    newRecord.Type = values[i][9];
    ssData.push(newRecord);
    // console.log(newRecord.MODEL_FIELD);
  }
  console.log('Finished');
  // return the array of the model.newRecord objects that would be consumed by the Model query.
}
Please can someone help with this? At the moment, the way the data is sent over to me means that adding the new stuff into the Drive Table creates many duplicates.
Thanks in advance,
You can delete all records, import, and read from a spreadsheet using the AMU Library.
Copy and paste the server and client scripts into your app.
I'm sure that will make it much easier!
To delete all the data in a model using this library:
Button onClick:
google.script.run.AMU.deleteAllData('ModelName');
The correct way to delete records on the server is:
app.models.MODEL_NAME.deleteRecords(key_array);
datasource.unload() simply unloads the widget on the client. It does not affect the database records.
A better way to write your records query on the server is:
var query = app.models.MODEL_NAME.newQuery();
query.filters.your_filter_here;
var records = query.run();
Note that you cannot return a single record or an array of records from anything but a calculated model function without using a function posted here. (You can return a single field of a record using stringify for any JSON data.)
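Putting the query and delete pieces together, a clear-down of the asker's P11d model might look like the sketch below. Treat it as illustrative: the use of record._key as the key array is my assumption about App Maker's record key field.

function clearP11dTable() {
  // Query every record in the model.
  var records = app.models.P11d.newQuery().run();
  // Collect each record's key (assuming _key holds it).
  var keys = records.map(function(record) {
    return record._key;
  });
  // Delete them all in one server-side call.
  app.models.P11d.deleteRecords(keys);
}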
I am currently working on a solution to create datasource independent tables needed in App Maker.
For the delete function on the server, try changing your code just a little bit. This function at least used to work for me, although I have not needed to use it in some time:
function ClearTable() {
  var records = app.models.P11d.newQuery().run();
  console.log('Server Function Ran');
  app.deleteRecords(records);
}
I am having a problem with a function in IndexedDB where I need to change the status of some meetings. The search feature grabs the ID of each checked meeting; right after that, in a for() loop, I iterate over the array that contains the IDs, and on each database access I pass a different ID. The following code is an example:
var val = [];
var checkbox = $('input:checkbox[class^=checkReunioes]:checked');
if (checkbox.length > 0) {
  checkbox.each(function() {
    val.push($(this).val());
  });
}

for (var i = 0; i < val.length; i++) {
  var transaction = db.transaction(["tbl_REUNIOES"], "readwrite").objectStore("tbl_REUNIOES");
  var request = transaction.get(val[i]);
  request.onerror = function(event) {
    alert("BAD");
  };
  request.onsuccess = function(event) {
    var data = request.result;
    data.FLG_STATU_REUNI = 'I';
    var codigo_igreja = localStorage.getItem("igreja");
    var dataJSON = JSON.stringify(data);
    enviarFilaSincronismo("tbl_REUNIOES", "U", dataJSON, " WHERE COD_IDENT_REUNI = '" + val[i] + "' and COD_IDENT_IGREJ = '" + codigo_igreja + "'");
    var requestUpdate = transaction.put(data);
    requestUpdate.onerror = function(event) {
      alert("OK");
    };
    requestUpdate.onsuccess = function(event) {
      $("#listReunioes").html("");
      serchAll(w_key_celula);
    };
  };
}
In my view the problem occurs because IndexedDB is asynchronous: it moves on to the next lookup even before the first one has finished.
But how can I check this?
And what is good practice for a case like this?
If you are inexperienced with writing asynchronous code, a good general rule to consider is to never define functions inside loops. Do not set request.onsuccess to a function from within the for loop.
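To illustrate, here is a sketch of the question's loop with the handler hoisted into a helper function; the names follow the question's code, and the result check anticipates the point about get made further below:

function updateMeetingStatus(db, id) {
  var store = db.transaction(["tbl_REUNIOES"], "readwrite")
    .objectStore("tbl_REUNIOES");
  var request = store.get(id);
  request.onsuccess = function(event) {
    var data = event.target.result;
    if (data === undefined) {
      return; // No object matched this id (see the note on get below).
    }
    data.FLG_STATU_REUNI = 'I';
    store.put(data);
  };
}

// Each id gets its own call, so nothing is shared across loop iterations.
val.forEach(function(id) {
  updateMeetingStatus(db, id);
});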
You can perform multiple get and put requests on the same transaction when you do not expect individual requests to fail for data-related reasons, such as the violation of a uniqueness constraint of an index, and when you are not performing so many thousands of requests on one transaction that you reach processing limits.
You might find that using IDBObjectStore.prototype.openCursor together with IDBCursor.prototype.update is more convenient than using IDBObjectStore.prototype.get and IDBObjectStore.prototype.put.
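For example, a cursor-based version of the same update might look like this sketch; it assumes the store's primary key is the meeting id collected into val:

var store = db.transaction(["tbl_REUNIOES"], "readwrite")
  .objectStore("tbl_REUNIOES");
store.openCursor().onsuccess = function(event) {
  var cursor = event.target.result;
  if (!cursor) {
    return; // The whole store has been walked; done.
  }
  // Assumes cursor.key holds the meeting id (e.g. COD_IDENT_REUNI).
  if (val.indexOf(String(cursor.key)) !== -1) {
    var data = cursor.value;
    data.FLG_STATU_REUNI = 'I';
    cursor.update(data);
  }
  cursor.continue();
};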
Your example code assumes that a successful get request means data was retrieved, but that is not what actually happens. A successful get request just means the request completed without an error (e.g. against an object store that exists, against a database that is not blocked by other requests, against a database connection that is still valid). It does not mean an object matched your query. You should check whether the request's result is defined, and use that check, not the mere success of the request, to determine whether an object matched your get query.
You might want to spend more time organizing your code into smaller functions that use clearer names. Your example code is difficult to read.
It looks like you are using some type of global db variable. If you are not well experienced with writing asynchronous code, avoid using a global db variable. There is no guarantee the db variable will be defined and open when you decide to access it, which could lead to an unexpected error.
The application shows work shifts for a certain time period. firebaseConn.getShifts is the API function that gets the shift data for the given time period.
versions:
firebase: 2.0.6
angularFire: 0.9.0 (confirmed with 0.8.2 also)
This is my firebase schema:
And this is the code:
.factory('watchers', function(bunch-of-dependencies) {

  var unbindShifts = function() {};
  var inited = false;
  var shifts = {};

  ... some irrelevant code in between ...

  function initShifts() {
    unbindShifts();
    shifts.object = firebaseConn.getShifts(false, from, to, $scope);
    $scope.shifts = shifts.object;
    shifts.object.$bindTo($scope, "shifts").then(function(unbind) {
      unbindShifts = unbind;
    });
  }
The firebase queries (these worked fine before adding the unbind / bind; the time-based querying might cause issues too):
firebaseConn.getShifts = function(asArray, from, to, scope) {
  return cacheRequest(FBURL + "shifts", asArray, [from, to]);
};

function cacheRequest(url, asArray, limits) {
  var type = asArray ? "array" : "object";
  var startAt = limits ? limits[0] : undefined;
  var endAt = limits ? limits[1] : undefined;
  var retObj, FBRef;
  cached[url] = cached[url] || {};
  /* If there are limits parameters we don't cache at all atm, since those queries should be checked differently than static urls */
  if (!limits && cached[url][type]) {
    FBRef = cached[url][type];
  } else {
    FBRef = cached[url][type] = createFBRef(url, startAt, endAt);
  }
  if (asArray) {
    retObj = FBRef.$asArray();
  } else {
    retObj = FBRef.$asObject();
  }
  return retObj;
}

function createFBRef(resourceURL, startAt, endAt) {
  var modifiedObject = $firebase( createRef( resourceURL ).orderByKey().startAt(startAt).endAt(endAt) );
  return modifiedObject;
}

function createRef(resourceURL) {
  return new Firebase( resourceURL );
}
Now I have located the problem: it's the query limiting. If the from and to dates are undefined, this works without problems. But I need to be able to limit the amount of data, since loading many years of work-shift data just to show a week's worth won't be good :).
The actual problem is not displaying and fetching the data; everything there works fine. It's related to the time limits and re-binding.
If I make any change to e.g. the "20150115" tree, for example adding another "groups" child there, then when I unbind and rebind, the whole "20150115" tree gets deleted. And this holds true only for the latest change: if I add children to multiple dates, e.g. "20150113", "20150114" and "20150115", with the latest change in "20150115", and then unbind + re-bind from firebase another time, all the other root paths stay as they are, but the latest change in "20150115" causes that whole tree to be deleted.
I hope I'm making myself clear; for safety, I'll explain it again in a simpler way:
- Changes to 1. "20150113", 2. "20150114", 3. "20150115" through the app.
- Changing timeline from UI causes: unbind + re-bind
- As a side-effect the whole "20150114" tree gets deleted.
The problem is somehow related to advanced querying with orderByKey().startAt(startAt).endAt(endAt) and binding.
Also, for additional info: the data which is added through the UI does get added to the firebase database, but when the re-binding happens, the data is deleted from the database. It's specifically the rebind; unbinding causes no issues if I delay rebinding with a timeout.
EDIT:
I have found the source of the actual issue. After the new binding is in place and everything seems to be in order, there is an angular watch event that kicks in. The event tries to save the last change the user made before re-binding.
So if I have an active timeline for December (20141201 - 20141230) and I change the "20141225" data, then change the timeline to 20150101 - 20150130, causing unbind and rebind (or manually fetching new data), there will be an event, after the binding has been done and everything seems to be in order, trying to save the 20141225 data to either the new timeline (20150101 - 20150130) or the old one, I'm not sure which. This causes firebase to actually delete the whole 20141225 tree instead of saving the data.
The new data makes it into your Firebase fine, which you can see by either checking your Firebase dashboard or by running a quick snippet like this in your browser's dev console:
new Firebase("https://firebaseurl").once('value', function(s) { console.log(s.val()); })
The data even makes it back into your application. The only problem is that Angular doesn't know that new data has arrived, so it doesn't update the view with the new data.
Normally AngularFire's $asObject and $asArray methods take care of notifying AngularJS when new data arrives from Firebase. But since you are constantly creating new queries, you'll have to take care of that yourself.
There are a few ways to signal the new data to AngularJS and I'm definitely not an expert on which one is best. But if you add $scope.$apply(); to your setDays function it works:
function setDays(ref) {
  var FBRange = setFBRange(ref, from, to);
  var days;

  unbindDays();
  days = $firebase(FBRange).$asObject();
  $scope.days = days;
  days.$bindTo($scope, "days").then(function(unbind) {
    unbindDays = unbind;
    // As a result of the new binding entry gets mysteriously deleted from firebase
  });
  $scope.$apply(); // Tell AngularJS about the new data, so that it updates the view

  function setFBRange(ref, from, to) {
    // Coerce both bounds to strings, since orderByKey() compares string keys.
    return ref.orderByKey().startAt("" + from).endAt("" + to);
  }
}
Updated Plunkr with this change (and some others to help in debugging): http://plnkr.co/edit/YZtkzUNtjQUCcw4xb2mj?p=preview