Is it necessary to free memory in a Firebase Cloud Function? - firebase

I was doing a POC on Firebase Cloud Functions and wrote a function with the snippet below (this is a working code snippet).
app.post('/create-pdf', (req, res) => {
  pdfPromise.toFile(`${os.tmpdir()}/template.pdf`, (err, data) => {
    if (err) {
      console.log('Error Saving File', err);
      return res.status(500).send('Error saving file');
    }
    res.sendStatus(200);
  });
})
app.get('/get-pdf', (req, res) => {
  res.sendFile(`${os.tmpdir()}/template.pdf`);
})
And the call to the above API looks like this.
axios.post(url + '/create-pdf', { data: poBody })
  .then(() => axios.get(url + '/get-pdf', { responseType: 'blob' }))
  .then(res => {
    const pdfBlob = new Blob([res.data], { type: 'application/pdf' });
    saveAs(pdfBlob, 'payout.pdf');
  });
This code works fine.
I just want to know: since the server-side code runs on GCP as a Cloud Function, do we need to clear the memory consumed by os.tmpdir(), or will it be cleared automatically?

Yes, you do need to delete the temporary files created in the temporary directory (which is an in-memory filesystem), because "files that you write consume memory available to your function, and sometimes persist between invocations".
There is a specific doc section and a video about that: https://firebase.google.com/docs/functions/tips#always_delete_temporary_files and https://www.youtube.com/watch?v=2mjfI0FYP7Y&feature=youtu.be

Related

Re-write async function with async/await and store data in Firestore on CKEditor 5

I am using CKEditor 5 and trying to add an "upload image" function. I created an adapter from the following example provided by Piyush (source: https://noteyard.piyushdev.xyz/blogpost/62596290751435313d7f85b5). However, I use Firestore as my backend storage and write all async functions with async/await. Is there any way I can rewrite this code with async/await and store the image in a Firestore collection called "Images"? The document name can be a random ID, and the data key can be "image". I tried to write it myself but failed. Besides, I only get an empty HTML page saying "Cannot GET /api/blogs/uploadImg" when checking the example API URL, so I can't see what the API structure looks like.
Full code:
function uploadAdapter(loader) {
  return {
    upload: () => {
      return new Promise((resolve, reject) => {
        const body = new FormData();
        loader.file.then((file) => {
          body.append("uploadImg", file);
          fetch(`${API_URl}/${UPLOAD_ENDPOINT}`, {
            method: "post",
            body: body
          })
            .then((res) => res.json())
            .then((res) => {
              resolve({ default: `${API_URl}/${res.url}` });
            })
            .catch((err) => {
              reject(err);
            });
        });
      });
    }
  };
}
function uploadPlugin(editor) {
  editor.plugins.get("FileRepository").createUploadAdapter = (loader) => {
    return uploadAdapter(loader);
  };
}
API URL with "Cannot GET /api/blogs/uploadImg":
const API_URl = "https://noteyard-backend.herokuapp.com"
const UPLOAD_ENDPOINT = "api/blogs/uploadImg";
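As a hedged sketch (not a verified answer), the Promise-chain adapter above can be rewritten with async/await like this. API_URl and UPLOAD_ENDPOINT are the question's own constants; the res.ok check is an addition.

```javascript
const API_URl = "https://noteyard-backend.herokuapp.com";
const UPLOAD_ENDPOINT = "api/blogs/uploadImg";

function uploadAdapter(loader) {
  return {
    upload: async () => {
      const file = await loader.file; // loader.file resolves to the File
      const body = new FormData();
      body.append("uploadImg", file);
      const res = await fetch(`${API_URl}/${UPLOAD_ENDPOINT}`, {
        method: "post",
        body
      });
      if (!res.ok) throw new Error(`Upload failed with status ${res.status}`);
      const data = await res.json();
      return { default: `${API_URl}/${data.url}` }; // URL CKEditor embeds
    }
  };
}
```

Two caveats: the "Cannot GET /api/blogs/uploadImg" page is expected when opening the endpoint in a browser, since the route presumably only handles POST and Express answers unmatched GETs with that message. And storing the image in a Firestore collection named "Images" would happen on the server side of that endpoint (e.g. with the Admin SDK's admin.firestore().collection("Images").add({ image: url })), not in the CKEditor adapter itself.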

How can I know when SQLite database loading is done in an Expo app

The SQLite database is broken when a query executes soon after the app loads.
This is an Expo (React Native) app, and the database is loaded from local storage.
If I wait a few seconds before using the app (executing queries), the issue does not happen.
Maybe this issue happens when a query is executed before all the data is loaded from local storage.
In App.js:
async function loadResourcesAsync() {
  await Promise.all([
    FileSystem.makeDirectoryAsync(`${FileSystem.documentDirectory}SQLite`, {
      intermediates: true
    }).then(
      FileSystem.downloadAsync(
        Asset.fromModule(require("./assets/db/mydb.db")).uri,
        `${FileSystem.documentDirectory}SQLite/mydb.db`
      )
    )
  ]);
}
And in a Screen.js like this:
const db = SQLite.openDatabase("mydb.db");
db.transaction(
  tx => {
    tx.executeSql(sql, [queryChar], (_, { rows: { _array } }) => {
      // console.log(JSON.stringify(_array));
      setSearchResult(_array);
    });
  },
  e => {
    console.log(e);
  },
  null
);
I expected a callback from the openDatabase function, but it does not have one.
How can I know when database loading is done?
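One way to know loading is done, sketched under assumptions: keep the promise from the loading step and await it before opening the database. Here `loadResources` stands in for the FileSystem.makeDirectoryAsync / FileSystem.downloadAsync chain from App.js, and `openDatabase` for SQLite.openDatabase; the `makeDbGate` helper is illustrative.

```javascript
// Gate every query on the loading promise instead of guessing when it is done.
function makeDbGate(loadResources, openDatabase) {
  // Loading starts exactly once; every caller shares the same promise, so
  // no query can run before the database file has finished downloading.
  const ready = (async () => {
    await loadResources();
    return openDatabase("mydb.db");
  })();
  return () => ready;
}

// Assumed wiring -- in App.js:
//   const getDb = makeDbGate(loadResourcesAsync, SQLite.openDatabase);
// In Screen.js, instead of calling openDatabase directly:
//   const db = await getDb();
//   db.transaction(tx => { /* ... */ });
```

Separately, note that `.then(FileSystem.downloadAsync(...))` in App.js passes an already-started promise rather than a callback, so the download begins before the directory is created and the outer promise does not wait for it; awaiting the two calls in sequence avoids both problems.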

Uploading multiple images to firebase in React Native

So I am very new to the whole coding scene and am trying to learn how to code using React Native. Right now, I'm trying to figure out how to upload images to Firebase using Cloud Functions and Google Cloud Storage.
Below is the backend code that lets me upload one image per submission.
I was wondering: is it possible to modify this code so that it can upload multiple images per submission? If so, how would I go about doing it?
exports.storeImage = functions.https.onRequest((request, response) => {
  return cors(request, response, () => {
    const body = JSON.parse(request.body);
    // Note: fs.writeFileSync takes no callback; write synchronously and
    // catch errors instead.
    try {
      fs.writeFileSync("/tmp/uploaded-image.jpg", body.image, "base64");
    } catch (err) {
      console.log(err);
      return response.status(500).json({ error: err });
    }
    const bucket = gcs.bucket("myapp.appspot.com");
    const uuid = UUID();
    return bucket.upload(
      "/tmp/uploaded-image.jpg",
      {
        uploadType: "media",
        destination: "/places/" + uuid + ".jpg",
        metadata: {
          metadata: {
            contentType: "image/jpeg",
            firebaseStorageDownloadTokens: uuid
          }
        }
      },
      (err, file) => {
        if (!err) {
          return response.status(201).json({
            imageUrl:
              "https://firebasestorage.googleapis.com/v0/b/" +
              bucket.name +
              "/o/" +
              encodeURIComponent(file.name) +
              "?alt=media&token=" +
              uuid,
            imagePath: "/places/" + uuid + ".jpg"
          });
        } else {
          console.log(err);
          return response.status(500).json({ error: err });
        }
      }
    );
  }).catch(error => {
    console.log("Token is invalid!");
    response.status(403).json({ error: "Unauthorized" });
  });
});
I don't have a React Native environment easily available, but I believe you can do it from the client with code like this:
await firebase.storage().ref('test/test.jpg').putFile('/path/to/test.jpg');
let downloadUrl = await firebase.storage().ref('test/test.jpg').getDownloadURL();
console.log('downloadUrl :', downloadUrl); // do whatever you need with it
To upload another image you just run the same code again; you can even do it concurrently if you want.
When you use Firebase you should do most of the operations directly from the client, so you only need backend code (including Cloud Functions) when you have to do heavy processing, use the Admin SDK, integrate with third-party apps, or things like that. For simple database or storage operations the client will suit you much better.
Also, you don't need to compose the download URL yourself; getDownloadURL() does that for you. And if you access Storage from the client, it automatically integrates with Firebase Auth, so you can protect your data.
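The "do it concurrently" idea above can be sketched like this. `storage` is assumed to behave like the firebase.storage() object in the answer (ref().putFile() / getDownloadURL()), and the places/image-N.jpg naming is illustrative; a UUID per file would avoid collisions.

```javascript
// Upload several local files concurrently and collect their download URLs.
async function uploadImages(storage, localPaths) {
  return Promise.all(
    localPaths.map(async (localPath, i) => {
      const ref = storage.ref(`places/image-${i}.jpg`);
      await ref.putFile(localPath); // all uploads start immediately
      return ref.getDownloadURL();  // resolved URLs come back in input order
    })
  );
}
```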

What is the suitable query for the following case?

This is my database structure:
I am trying to list all users with "locale" equal to "Cairo, Egypt" so I made the following query:
exports.calculateMatches = functions.https.onRequest((request, response) => {
  // Access users' profiles that are located in the locale of the requesting user
  databaseRef.child("users").orderByChild("locale").equalTo(request.query.locale).once("value")
    .then(snap => {
      snap.forEach(profile => {
        console.log(profile);
      });
    });
});
Note this function is deployed to Firebase Cloud Functions, and this is what I get in the logs:
HTTPS-type functions require that you send a response to the client in order to terminate the function. Without that, they will always time out, and the client will wait the whole time.
For example:
const databaseRef = admin.database().ref('')
exports.calculateMatches = functions.https.onRequest((request, response) => {
  databaseRef.child("users").orderByChild("locale").equalTo(request.query.locale).once("value")
    .then(snap => {
      const profiles = []
      snap.forEach(profile => {
        profiles.push(profile.val())
      });
      response.send(profiles)
    })
    .catch(error => {
      response.status(500).send(error)
    });
});

firebase admin failing to query large items

Using the Firebase Admin SDK to retrieve data from a collection in a Cloud Function fails for large result sets. The sample code I am using to query the selection from the Cloud Function is as follows:
admin.database().ref('transactions').orderByChild('mmyyyy').equalTo(month).once('value');
This call fails when I try to retrieve 10,600 items (trying to figure out why). In the Google console there is this log, but nothing else that can point me in the right direction:
textPayload: "Function execution took 18547 ms, finished with status: 'response error'"
After many failed attempts, I decided to try to execute this call on the client using the Firebase SDK, as follows:
result = await firebase.database().ref(`transactions`).orderByChild('mmyyyy').equalTo(month).once('value');
This works perfectly on the client without error, returning all 17,000 of my items (the size of this JSON is 26 MB).
Why is this the case? Is there any limitation that is not documented?
Note:
I increased my Cloud Function's memory to 1 GB and its timeout to 5 minutes; it didn't help.
Here is the full sample code:
const admin = require('firebase-admin');
var functions = require('firebase-functions');
admin.initializeApp(functions.config().firebase);
const cors = require('cors')({
  "origin": "*",
  "methods": "POST,GET",
  "allowedHeaders": "Content-Type,uid,agence,month,course,raceType,raceNumber,status",
  "preflightContinue": false,
  "optionsSuccessStatus": 204
});
function _findTransactions(agence, month, course, raceType, raceNumber, status) {
  return new Promise((resolve, reject) => {
    try {
      let db = admin.database();
      let findPromise = db.ref(`transactions`).orderByChild('mmyyyy').equalTo(month).once('value');
      findPromise.then((result) => {
        let transactions = result.val();
        // removed business logic
        resolve(transactions);
      }).catch((err) => {
        console.log(err);
        reject(err);
      });
    } catch (error) {
      console.log(error);
      reject(error);
    }
  });
}
exports.findTransactions = functions.https.onRequest((req, res) => {
  let uid;
  try {
    cors(req, res, () => {
      uid = req.headers.uid;
      let agence = req.headers.agence;
      let month = req.headers.month;
      let course = req.headers.course;
      let raceType = req.headers.raceType;
      let raceNumber = req.headers.raceNumber;
      let status = req.headers.status;
      if (req.method !== 'GET') {
        return handleResponse(req, res, 403);
      }
      if (!uid || uid == null || uid == undefined) {
        return handleResponse(req, res, 401);
      }
      _validateUserId(uid, ['central_cashier', 'admin'])
        .then(() => {
          _findTransactions(agence, month, course, raceType, raceNumber, status)
            .then((result) => {
              return handleResponse(req, res, 200, result);
            }).catch((error) => {
              return handleResponse(req, res, 500);
            });
        }).catch((error) => {
          return handleResponse(req, res, 401);
        });
    });
  } catch (error) {
    return handleError(res, uid, error);
  }
});
Your payload is too large and is exceeding the quota for Google Cloud Functions, as you stated.
Two options come to mind:
Compress the payload. Gzip the file before sending it to the client. This is easy with Node's built-in zlib module; or
Set up a virtual machine. Virtual machines are not bound by these restrictions.
I did some testing and concluded that Google Cloud Functions (GCF) enforces some kind of timeout or "abort" action when a query results in a large number of results (i.e. many Datastore entities). See my comments attached to this question for some background.
tl;dr I created my own Express.js webserver and ran my GCF code on it.
This is how I tested it: I created an Ubuntu instance with HTTP/HTTPS and the Datastore API enabled. On the instance, I installed Node and Express and got a basic HTTPS server working (a self-signed certificate was fine, since this is just testing an API backend service). Then I copy-pasted my GCF code (the function that was failing in GCF) into my minimal Express webserver. I pointed my React app at my instance, which triggered a query that returned over 32,000 Datastore entities. My GCF function sends a query with datastore.runQuery(), which is common.
It took about a minute, but eventually all 32,000 entities were served by Express and loaded in the React app (browser) with no errors.
A basic Express route calls my GCF function:
app.post('/foo', (req, res) => {
  myCloudFunction(req, res);
})
const myCloudFunction = (req, res) => {
  // Inspects req, queries Datastore, and returns the results.
};
For this test, my React app just points to https://mydomain.example.com:3000/foo (because my Express server listens on port 3000).
So it seems that GCF is not good enough for my application, unless I add pagination to the app (which is on the roadmap).
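The pagination mentioned above, sketched for the original Realtime Database query. This assumes the startAt(value, key) / endAt(value) overloads the SDK provides for paging over duplicate child values; the page size and cursor handling are illustrative, not tested at scale.

```javascript
// Fetch one page of transactions for a given month. `startKey` is the
// cursor returned by the previous call (null for the first page).
async function fetchTransactionsPage(ref, month, pageSize, startKey) {
  let query = ref.orderByChild("mmyyyy");
  // First page: a plain equalTo. Later pages: resume at the cursor key
  // (the key argument to startAt is inclusive).
  query = startKey
    ? query.startAt(month, startKey).endAt(month)
    : query.equalTo(month);
  const snap = await query.limitToFirst(pageSize + 1).once("value");
  const items = [];
  snap.forEach(child => {
    items.push({ key: child.key, ...child.val() });
  });
  // Return one page; the extra item's key (not yet returned to the caller)
  // becomes the cursor for the next call.
  const page = items.slice(0, pageSize);
  const nextKey = items.length > pageSize ? items[pageSize].key : null;
  return { page, nextKey };
}
```

The client (or the HTTPS function) would loop, passing `nextKey` back until it is null, keeping each response well under the payload limit.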
