Using the Firebase Admin SDK to retrieve data from a collection in a Cloud Function fails for large result sets. The sample code I am using to query the collection from the Cloud Function is as follows:
admin.database().ref('transactions').orderByChild('mmyyyy').equalTo(month).once('value');
This call fails when I try to retrieve 10,600 items (I'm still trying to figure out why). In the Google console there is this log, but nothing else that can point me in the right direction:
textPayload: "Function execution took 18547 ms, finished with status: 'response error'"
After many failed attempts, I decided to try executing this call on the client using the Firebase SDK, as follows:
result = await firebase.database().ref(`transactions`).orderByChild('mmyyyy').equalTo(month).once('value');
This works perfectly on the client, without error, returning all 17,000 of my items (the size of this JSON is 26 MB).
Why is this the case? Is there a limitation that is not documented?
Note:
I increased my Cloud Function's memory to 1 GB and its timeout to 5 minutes; it didn't help.
Here is the full sample code:
const admin = require('firebase-admin');
var functions = require('firebase-functions');
admin.initializeApp(functions.config().firebase);
const cors = require('cors')({
"origin": "*",
"methods": "POST,GET",
"allowedHeaders": "Content-Type,uid,agence,month,course,raceType,raceNumber,status",
"preflightContinue": false,
"optionsSuccessStatus": 204
});
function _findTransactions(agence, month, course, raceType, raceNumber, status) {
return new Promise((resolve, reject) => {
try {
let db = admin.database();
let findPromise = db.ref(`transactions`).orderByChild('mmyyyy').equalTo(month).once('value');
findPromise.then((result) => {
let transactions = result.val();
//removed business logic
resolve(transactions);
}).catch((err) => {
console.log(err);
reject(err);
});
} catch (error) {
console.log(error);
reject(error);
}
});
}
exports.findTransactions = functions.https.onRequest((req, res) => {
let uid;
try {
cors(req, res, () => {
uid = req.headers.uid;
let agence = req.headers.agence;
let month = req.headers.month;
let course = req.headers.course;
let raceType = req.headers.raceType;
let raceNumber = req.headers.raceNumber;
let status = req.headers.status;
if (req.method !== 'GET') {
return handleResponse(req, res, 403);
}
if (!uid || uid == null || uid == undefined) {
return handleResponse(req, res, 401);
}
_validateUserId(uid, ['central_cashier', 'admin'])
.then(() => {
_findTransactions(agence, month, course, raceType, raceNumber, status)
.then((result) => {
return handleResponse(req, res, 200, result);
}).catch((error) => {
return handleResponse(req, res, 500);
});
}).catch((error) => {
return handleResponse(req, res, 401);
});
});
} catch (error) {
return handleError(res, uid, error);
}
});
Your payload is too large and exceeds the quota for Google Cloud Functions, as you suspected: HTTP functions are capped at a 10 MB response size, and your 26 MB JSON is well over that.
Two options come to mind:
Compress the payload. Gzip the response before sending it to the client; this is easy with Node.js's built-in zlib module (see the sketch after this list), or
Set up a virtual machine. Virtual machines are not bound to these restrictions.
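As a rough illustration of the first option, here is a minimal, untested sketch; getTransactions() is a stand-in for whatever query produces the large result:
const functions = require('firebase-functions');
const zlib = require('zlib');

exports.findTransactions = functions.https.onRequest(async (req, res) => {
  // Assumption: getTransactions() is your own query/business logic.
  const transactions = await getTransactions(req.headers.month);
  const json = Buffer.from(JSON.stringify(transactions));
  // Gzip the JSON and tell the client it is compressed.
  zlib.gzip(json, (err, compressed) => {
    if (err) {
      return res.status(500).send(err.message);
    }
    res.set('Content-Type', 'application/json');
    res.set('Content-Encoding', 'gzip');
    res.status(200).send(compressed);
  });
});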
I did some testing and concluded that Google Cloud Functions (GCF) enforces some kind of timeout or "abort" action when a query returns a large number of results (i.e. many Datastore entities). See my comments attached to this question for some background.
tl;dr I created my own Express.js webserver and ran my GCF code on it.
This is how I tested it: I created an Ubuntu instance with HTTP/HTTPS and the Datastore API enabled. On my instance I installed Node and Express and got a basic HTTPS server working (a self-signed certificate was fine since this is just testing an API backend service). Then I copy-pasted my GCF code (the function that was failing in GCF) into my minimal Express web server. I pointed my React app at my instance, which triggered a query that returned over 32,000 Datastore entities. My GCF function runs its query with datastore.runQuery(), which is a common pattern.
It took about a minute, but eventually all 32,000 entities were served by Express and loaded in the React app (browser) with no errors.
A basic Express route calls my GCF function:
app.post('/foo', (req, res) => {
myCloudFunction(req, res);
})
const myCloudFunction = (req, res) => {
// Inspects req, queries Datastore, and returns the results.
};
For this test, my React app just points to https://mydomain.example.com:3000/foo
(because my Express server listens on port 3000).
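For completeness, here is a minimal sketch of the standalone server described above (the certificate file names and the port are assumptions; myCloudFunction is the handler copied over from GCF):
const fs = require('fs');
const https = require('https');
const express = require('express');

const app = express();
app.use(express.json());

// The same handler that was deployed to GCF, mounted on a plain Express route.
app.post('/foo', (req, res) => {
  myCloudFunction(req, res);
});

// Self-signed certificate; fine for testing an API backend.
const options = {
  key: fs.readFileSync('key.pem'),
  cert: fs.readFileSync('cert.pem'),
};

https.createServer(options, app).listen(3000);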
So it seems that GCF is not good enough for my application, unless I add pagination to the app (which is on the roadmap).
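If I do add pagination, something along these lines with Datastore query cursors should work (a sketch only; the kind name 'Transaction' and the page size are placeholders):
const { Datastore } = require('@google-cloud/datastore');
const datastore = new Datastore();

// Fetch one page of entities, starting from an optional cursor.
async function getPage(pageCursor) {
  let query = datastore.createQuery('Transaction').limit(1000);
  if (pageCursor) {
    query = query.start(pageCursor);
  }
  const [entities, info] = await datastore.runQuery(query);
  return {
    entities,
    // The client sends this cursor back to request the next page.
    nextCursor: info.moreResults !== Datastore.NO_MORE_RESULTS ? info.endCursor : null,
  };
}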
Related
I want to make a Google Cloud Function call an external API for me. After some research on Google I found a way using Axios. The call actually works when I run it in my own Node.js environment, but when I deploy the function to Google Cloud Functions I always get an error (Function cannot be initialized. Error: function terminated.).
I'm on the Blaze plan.
const functions = require("firebase-functions");
const axios = require("axios");
exports.getData = functions.https.onRequest((req, res) => {
return axios.get("http://api.marketstack.com/v1/eod?access_key='myAccessKey'&symbols=AAPL")
.then((response) => {
const apiResponse = response.data;
if (Array.isArray(apiResponse["data"])) {
apiResponse["data"].forEach((stockData) => {
console.log(stockData["symbol"]);
});
}
}).catch((error) => {
console.log(error);
});
});
Could someone please help me?
EDIT: I finally fixed it. The mistake was that I ended up with two package.json files (one in the directory where it should be, and one which I actually didn't need). When I installed the dependencies with npm install, axios was added to the wrong package.json file. Unfortunately, the other package.json file made it up to the server, so I ended up with a package.json without the necessary dependencies on the server, and that is what caused the error.
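In other words, the package.json that sits next to index.js in the functions directory (the one that actually gets deployed) has to list axios under dependencies; something like this (the version numbers are only illustrative):
{
  "name": "functions",
  "engines": { "node": "14" },
  "dependencies": {
    "firebase-functions": "^3.11.0",
    "axios": "^0.21.1"
  }
}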
I didn’t test your code but you should return "something" (a value, null, a Promise, etc.) in the then() block to indicate to the Cloud Function platform that the asynchronous work is complete. See here in the doc for more details.
exports.getData = functions.https.onRequest((req, res) => {
return axios.get("http://api.marketstack.com/v1/eod?access_key='myAccessKey'&symbols=AAPL")
.then((response) => {
const apiResponse = response.data;
if (Array.isArray(apiResponse["data"])) {
apiResponse["data"].forEach((stockData) => {
console.log(stockData["symbol"]);
});
}
return null;
}).catch((error) => {
console.log(error);
});
});
You probably want to do more than just logging values in the then(), e.g. call an asynchronous Firebase method to write to a database (Firestore or the RTDB): in this case, take care to return the Promise returned by this method.
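For example, here is a hedged sketch of that pattern (the collection name "stocks" and the response handling are assumptions, not part of the original code):
const functions = require("firebase-functions");
const admin = require("firebase-admin");
const axios = require("axios");
admin.initializeApp();

exports.getData = functions.https.onRequest((req, res) => {
  return axios.get("http://api.marketstack.com/v1/eod?access_key='myAccessKey'&symbols=AAPL")
    .then((response) => {
      // Return the Promise of the Firestore write so the platform
      // waits for the write to finish before terminating the function.
      return admin.firestore().collection("stocks").add(response.data);
    })
    .then(() => res.status(200).send("OK"))
    .catch((error) => {
      console.log(error);
      res.status(500).send(error.message);
    });
});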
I have two Firebase accounts, one used for development (D) and the other for production (P). My development (D) Firestore and functions run in us-central1. In production (P), the Firestore location is asia-south1 and the functions run in us-central1.
My firebase functions run properly in development (D) but are giving me the following error in production. Further, when I check the logs on the firebase functions console, there does not seem to be any activity. It appears as if the function has not been called.
The error returned by the Firebase function is:
Function call error Fri Apr 09 2021 09:25:32 GMT+0530 (India Standard Time)with{"code":"internal"}
Further, the client is also displaying this message:
Access to fetch at 'https://us-central1-xxx.cloudfunctions.net/gpublish' from origin 'https://setmytest.com' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled. zone-evergreen.js:1052 POST https://us-central1-xxx.cloudfunctions.net/gpublish net::ERR_FAILED
Here is the code from my Angular app calling the function:
const process = this.fns.httpsCallable("gpublish");
process(data).subscribe(
(result) => {
console.log("function responded with result: " + JSON.stringify(result));
},
(err) => {
const date1 = new Date();
console.log("Function call error " + date1.toString() + "with" + JSON.stringify(err));
});
Here are the functions:
index.ts
import { gpublish } from "./gpublish/gpublish";
import { sendEmail } from "./sendEmail";
export {gpublish,sendEmail };
gpublish.ts
import * as functions from "firebase-functions";
const fs = require("fs");
const { google } = require("googleapis");
const script = google.script("v1");
const scriptId = "SCRIPT_ID";
const googleAuth = require("google-auth-library");
import { admin } from "../admin";
const db = admin.firestore();
export const gpublish = functions.https.onCall(async (data: any, res: any) => {
try {
const googleTest = data.test;
console.log("Publishing to google test of name " + googleTest.testName);
// read the credentials and construct the oauth client
const content = await fs.readFileSync("gapi_credentials.json");
const credentials = JSON.parse(content); // load the credentials
const { client_secret, client_id, redirect_uris } = credentials.web;
const functionsOauth2Client = new googleAuth.OAuth2Client(client_id, client_secret, redirect_uris); // Construct an auth client
functionsOauth2Client.setCredentials({refresh_token: credentials.refresh_token}); // Authorize a client with credentials
// run the script
return runScript(functionsOauth2Client,scriptId,JSON.stringify(googleTest)
).then((scriptData: any) => {
console.log("Script data is" + JSON.stringify(scriptData));
sendEmail(googleTest, scriptData);
return JSON.stringify(scriptData);
});
} catch (err) {
return JSON.stringify(err);
}
});
function runScript(auth: any, scriptid: string, test: any) {
return new Promise(function (resolve, reject) {
script.scripts
.run({auth: auth,scriptId: scriptid, resource: {function: "doGet", devMode: true,parameters: test }
})
.then((respons: any) => { resolve(respons.data);})
.catch((error: any) => {reject(error);});
});
}
I have changed the service account key and Google credentials correctly when deploying the functions in development and in production.
I have tried many things including the following:
Enabling CORS in Cloud Functions for Firebase
Google Cloud Functions enable CORS?
The function runs perfectly in the development Firebase project but not in the production Firebase project. Please help!
You need to check that your function has been deployed correctly.
A function that doesn't exist (404 Not Found) or a function that can't be accessed (403 Forbidden) will both give that error as the Firebase Function is never executed, which means the correct CORS headers are never sent back to the client.
I have created a callable Cloud Function to read data from Firebase and send back the results to the client, however, only "null" is being returned to the client.
exports.user_get = functions.https.onCall((data, context) => {
if (context.auth && data) {
return admin.firestore().doc("users/" + context.auth.uid).get()
.then(function (doc) {
return doc.data();
})
.catch(function (error) {
console.log(error);
return error;
})
}
return null;
});
I just reproduced your case, connecting from a Cloud Function to a Firestore database and retrieving data. As far as I can see, you are trying to access the field the wrong way when you use "users/" + context.auth.uid; the method can't find the field, so it's returning a null value.
I followed this Quickstart using a server client library documentation to populate a Firestore database and perform a get from it with Node.js.
After that I followed this Deploying from the GCP Console documentation to deploy an HTTP-triggered Cloud Function with the following code:
// Firestore client from the server client library Quickstart.
const { Firestore } = require('@google-cloud/firestore');
const firestore = new Firestore();

exports.helloWorld = (req, res) => {
  firestore.collection('users').get()
    .then((snapshot) => {
      const answers = [];
      snapshot.forEach((doc) => {
        console.log(doc.id, '=>', doc.data().born);
        answers.push({ date: doc.data().born });
      });
      // Send a single response once every document has been read.
      res.status(200).send(answers);
    })
    .catch((err) => {
      console.error(err);
      res.status(500).send(err.message);
    });
};
And this is returning the desired field.
You can take a look at my entire example code here.
This is because you are making a query to a Firestore database. However, the Cloud support team has made it so, in order to protect your applications from data leakage, that in a callable function, as the name suggests, you can only return data you passed to that same callable function through the data parameter and nothing else. If you want to access a database, I suggest you use an onRequest function and use its endpoint to get your data. That way you not only protect your database but also avoid data and memory leakage.
Here is an example of what you can return from a callable function:
exports.sayHello = functions.https.onCall((data, context) => {
const name = data.name;
console.log(`hello ${name}`);
return `It was really fun working with you ${name}`;
});
First create a function in your index.js file and accept data through the data parameter; as I said, you can only return data you passed through the data parameter.
Now call the function.
This is the frontend code (attach an event listener to a button or something and trigger it):
/* just say hello from firebase */
callButton.addEventListener('click', () => {
  const sayHello = firebase.functions().httpsCallable('sayHello');
  sayHello({ name: 'Jane' }).then((results) => {
    console.log("greeting >>> ", results.data);
  });
});
You can get your data using an onRequest function like so:
/* get users */
exports.getAllUsers = functions.https.onRequest((request, response) => {
cors(request, response, () => {
const data = admin.firestore().collection("users");
const users = [];
data.get().then((snapshot) => {
snapshot.docs.forEach((doc) => {
users.push(doc.data());
});
return response.status(200).send(users);
});
});
});
Use fetch() in your frontend code to get the response of the new onRequest function; you can find the function's endpoint in your Firebase console dashboard.
Note that to hit the endpoint from your frontend code, you need to add CORS to your Firebase Cloud Functions to allow access to the endpoint.
You can do that by adding this line to the top of the index.js file in your Firebase functions directory:
const cors = require("cors")({origin: true});
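A quick sketch of that frontend call (the project ID in the URL is a placeholder; copy the real endpoint from the Firebase console):
fetch('https://us-central1-<your-project-id>.cloudfunctions.net/getAllUsers')
  .then((response) => response.json())
  .then((users) => {
    console.log('users >>> ', users);
  })
  .catch((error) => console.error(error));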
I detected some recursion in one of the nodes of my Realtime Database and I want to delete (or set to null) that specific node. This is my Firebase function so far:
exports.cleanForms = functions.https.onRequest((req, res) => {
const parentRef = admin.database().ref("forms");
return parentRef.once('value').then(snapshot => {
snapshot.forEach(function(child) {
admin.database().ref('forms/'+child.key+'/user/forms').set(null);
});
});
});
Basically, it should iterate over all the records inside the forms node and delete each one's user/forms property.
But calling that function by going to this URL: https://.cloudfunctions.net/cleanForms gives me this error:
Error: could not handle the request
And this is what I see on the logs:
10:47:57.818 PM cleanForms Function execution took 13602 ms, finished
with status: 'connection error'
The forms node has fewer than 3,000 records, but as I mentioned before, it has some recursion in it. I don't know if it is failing due to its size or something related to that.
You are using an HTTPS Cloud Function: therefore you must "send a response to the client at the end" (watch this official video by Doug Stevenson for more detail: https://youtu.be/7IkUgCLr5oA).
In your case, the "end" of the function is when ALL of your asynchronous set() operations are done. Since the set() method returns a Promise, you have to use Promise.all() (again, watch this official video: https://youtu.be/d9GrysWH1Lc).
So the following should work (not tested however):
exports.cleanForms = functions.https.onRequest((req, res) => {
  const parentRef = admin.database().ref("forms");
  parentRef.once('value')
    .then(snapshot => {
      const promises = [];
      snapshot.forEach(child => {
        promises.push(admin.database().ref('forms/' + child.key + '/user/forms').set(null));
      });
      // Wait for every set(null) to complete before sending the response.
      return Promise.all(promises)
        .then(results => {
          res.send({ result: results.length + ' node(s) deleted' });
        })
        .catch(error => {
          res.status(500).send(error);
        });
    });
});
I am trying to write a document to one of the subcollections in Firestore. When served locally, the code writes to Firestore, but when I deploy it, it doesn't write anything.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
var db = admin.firestore();
exports.update = functions.https.onRequest((request, response) => {
db.collection('emails').doc(request.query.trackingid).get()
.then( doc => {
if (!doc.exists) {
console.log('No such document!');
} else {
var viewRef = db.collection('emails').doc(request.query.trackingid).collection('views');
var view = {
when: (new Date()).toUTCString()
};
viewRef.add(view)
.then(ref => {
console.log("Document added");
return;
}).catch(err => {
console.log("Document creation failed", err);
});
}
return;
}).catch((err) => {
console.log('Tracking ID not found', err);
return;
});
response.sendStatus(200);
});
You're sending a response before the work can complete. For HTTP-type functions, you are obliged to send a response only after all the work is complete. Cloud Functions will forcibly terminate the function after the response is sent.
Note that get() and all of the promises derived from it are asynchronous, meaning that they return immediately, with the callbacks only being called when the work is complete. And you have no guarantee about when that will be.
What your code is doing now is kicking off a get(), then immediately following up with the next line of code, which sends the response before the work is complete. When this response is sent, Cloud Functions terminates the function, and your async work may not complete.
You need to only send the response after you are sure everything is done. This involves understanding the structure of your promises in your code.
You may want to watch my video series on using promises in Cloud Functions to better understand how this works.
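As a rough sketch (not necessarily the author's exact fix), the handler above could be restructured so the response is only sent once the whole promise chain has resolved; it reuses the admin/db setup from the question:
exports.update = functions.https.onRequest((request, response) => {
  const docRef = db.collection('emails').doc(request.query.trackingid);
  docRef.get()
    .then((doc) => {
      if (!doc.exists) {
        console.log('No such document!');
        return null;
      }
      // Return the Promise of the write so the chain waits for it.
      return docRef.collection('views').add({ when: new Date().toUTCString() });
    })
    .then(() => {
      // Only respond once the read (and any write) has finished.
      response.sendStatus(200);
    })
    .catch((err) => {
      console.log('Tracking ID lookup or write failed', err);
      response.sendStatus(500);
    });
});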