Know the deployment target in index.js (Firebase Cloud Functions)

I have two deployment targets for my Cloud Functions. I use the command line to determine which project I deploy to: firebase use myTestApp or firebase use myLiveApp
Can I tell which target I am using in my index.js code?
I am hoping for something like this
// change baseURLs and other keys
let baseURL;
let stripekey;
if (target === 'live') {
  baseURL = 'myLiveApp';
  stripekey = 'secretLivekey';
} else {
  baseURL = 'myTestApp';
  stripekey = 'secretTestkey';
}
Currently I get around this by commenting out either the test or the live keys, which is very annoying and makes it easy to make a mistake.

You can set the necessary variables in each project's Functions config.
CLI (note that Functions config keys must be lowercase and have at least two parts, e.g. app.baseurl):
firebase use <live_project>
firebase functions:config:set stripe.key="secretLivekey"
firebase functions:config:set app.baseurl="myLiveApp"
firebase use <non_live_project>
firebase functions:config:set stripe.key="secretTestkey"
firebase functions:config:set app.baseurl="myTestApp"
In code:
import * as functions from 'firebase-functions'
const baseURL = functions.config().app.baseurl
const stripekey = functions.config().stripe.key
EDIT:
Since your question technically was "Can you know your deployment target?", the answer is yes.
const config = JSON.parse(process.env.FIREBASE_CONFIG)
const projectId = config.projectId
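For example, a minimal sketch of how you could then pick the right keys (the project ID and key names below are placeholders for your own values):
const isLive = projectId === 'my-live-project-id' // replace with your live project's ID
const baseURL = isLive ? 'myLiveApp' : 'myTestApp'
const stripekey = isLive ? 'secretLivekey' : 'secretTestkey'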

Related

Firebase multiple environments with Flamelink

I'm using firebase functions with Node.js and I'm trying to create multiple environments for that. As far as I read I just need to create separate projects for that in Firebase, which I did.
I'm using Flamelink as well and I want to achieve the same. I actually have a Bonfire plan for Flamelink that allows multiple environments.
My concern is that the different environments in Flamelink write into the same database in Firebase, separated only by an environment flag, so whenever I want to query something from the db I also have to specify my environment.
Is there a way to have different databases for different Flamelink environments with my setup, so I only specify the environment in my config and not in my queries?
Currently it is not possible to have a database per environment using Flamelink.
The only way to achieve this is to add both projects to Flamelink.
The Flamelink JS SDK can however be used within a cloud function and would alleviate some of the complexity working with multiple environments.
The Flamelink JS SDK takes in an environment parameter (along with some others, like locale and database type) when it is initialised, contextualising the use of the SDK methods with the environment.
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
import * as flamelink from 'flamelink/app';
import 'flamelink/content';
admin.initializeApp();
const firebaseApp = admin.app();
const flApp = flamelink({
  firebaseApp,
  dbType: 'cf',
  env: 'staging',
  locale: 'en-US',
});
export const testFunction = functions.https.onRequest(async (request, response) => {
  if (request.query.env) {
    flApp.settings.setEnvironment(request.query.env) // example 'production'
  }
  try {
    const posts = await flApp.content.get({ schemaKey: 'blogPosts' })
    response.status(200).json({ posts })
  } catch (e) {
    // handle error
  }
});
Depending on your connected front-end framework/language, you can pass in the environment using environment variables.
JS client example
const env = (process.env.FLAMELINK_DATA_ENV || 'staging').toLowerCase()
await fetch(`https://yourhost.cloudfunctions.net/testFunction?env=${env}`)

Trying to figure out why SendGrid's email API integration with Firebase Functions is returning "API key does not start with SG."

I'm trying to integrate Sendgrid's Email API with my Firebase webapp. Here is what I've done:
1 - Installed Sendgrid's Mail package
npm install @sendgrid/mail
2 - Created an API Key on my Sendgrid Account
3 - Assigned the API Key to an environment variable using Firebase's environment configuration:
firebase functions:config:set sendgrid.key=SG.xxxxxxxxxxxxxxxxxxxxxx.xxxxxxxxxxxxxxxxxxxx_xxxxxx_xxxxxxxxxxxxxxx
4 - checked to see if the environment variable was correctly assigned:
firebase functions:config:get
result: (screenshot of the config output omitted)
5 - imported sendgrid mail and set API key:
import * as functions from 'firebase-functions';
import * as sgMail from '@sendgrid/mail';
const API_KEY = functions.config().sendgrid.key
const TEMPLATE_ID = functions.config().sendgrid.template
sgMail.setApiKey(API_KEY);
6 - created a new user trigger to send a test email
export const welcomeEmail = functions.auth.user().onCreate(user => {
  const msg = {
    to: user.email,
    from: 'contato@mycompanydomain.com.br',
    templateId: TEMPLATE_ID,
    dynamic_template_data: {
      subject: 'test subject!',
      name: user.displayName,
    },
  };
  return sgMail.send(msg);
})
7 - Deployed firebase functions:
firebase deploy --only functions
After doing this I'd expect that at least the API key would be set correctly, but I keep getting the following error in the Firebase Functions log: API key does not start with "SG.".
I can't figure out what is wrong. I've tried a few things:
1 - creating a new API key and starting the process all over.
2 - pasting the API key directly into the sgMail.setApiKey() method, like:
sgMail.setApiKey("SG.xxxxxxxxxxxxxxxxxxxxxx.xxxxxxxxxxxxxxxxxxxx_xxxxxx_xxxxxxxxxxxxxxx")
All of which gave the same "API key does not start with SG" error.
Can you guys help me figure out what's wrong?
Versions
"#sendgrid/mail": "^7.2.1",
"firebase-admin": "^8.10.0",
"firebase-functions": "^3.8.0"
Thank you so much
For those who had the same problem, I solved it by using the new firebase package and importing config from firebase-functions.
So my code looks like this:
import functions, { config } from "firebase-functions";
import sendgrid from "#sendgrid/mail";
const MY_SENDGRID_API_KEY = config().sendgrid.key;
sendgrid.setApiKey(MY_SENDGRID_API_KEY);
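If the error persists after that, it is worth checking what value actually reaches setApiKey at deploy time, since an undefined or quoted config value produces exactly this message. A hedged variation of the last two lines above (the log text is mine, not SendGrid's):
const MY_SENDGRID_API_KEY = config().sendgrid ? config().sendgrid.key : undefined;
if (MY_SENDGRID_API_KEY && MY_SENDGRID_API_KEY.startsWith("SG.")) {
  sendgrid.setApiKey(MY_SENDGRID_API_KEY);
} else {
  // likely causes: config was set on a different project, or the value kept its surrounding quotes
  console.error("SendGrid key missing or malformed:", MY_SENDGRID_API_KEY);
}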

How to set up a Firebase Firestore and Cloud Functions test suite with the Firebase Emulator for JS development

According to the following Google I/O (2019) post from the Firebase team, the new emulator allows us to combine Firestore/Database plus Cloud Functions to fully simulate our Firebase server code. That should also mean we can write tests for it.
we’re releasing a brand new Cloud Functions emulator that can also
communicate with the Cloud Firestore emulator. So if you want to build
a function that triggers upon a Firestore document update and writes
data back to the database you can code and test that entire flow
locally on your laptop (Source: Firebase Blog Entry)
I could find multiple resources describing each individual piece, but none covering all of them together:
Unit Testing Cloud Function
Emulate Database writes
Emulate Firestore writes
To set up a test environment for Cloud Functions that allows you to simulate reads/writes and set up test data, you have to do the following. Keep in mind, this really triggers Cloud Functions, so after you write into Firestore you need to wait a bit until the Cloud Function is done writing/processing before you can read and assert the data.
An example repo with the code below can be found here: https://github.com/BrandiATMuhkuh/jaipuna-42-firebase-emulator .
Preconditions
I assume at this point you have a firebase project set up, with a functions folder and index.js in it. The tests will later be inside the functions/test folder. If you don't have a project set up, use firebase init to set one up.
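If you ran firebase init, your firebase.json should already contain an emulators section roughly like this (a minimal sketch with the default ports; adjust to your setup):
{
  "emulators": {
    "firestore": { "port": 8080 },
    "functions": { "port": 5001 },
    "ui": { "enabled": true }
  }
}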
Install Dependencies
First add/install the following dependencies into functions/package.json, NOT the root folder: mocha, @firebase/rules-unit-testing, firebase-functions-test, firebase-functions, firebase-admin, firebase-tools (a sketch of the resulting package.json scripts follows the commands below).
cd "YOUR-LOCAL-EMULATOR"/functions (for example cd C:\Users\User\Documents\FirebaseLocal\functions)
npm install --save-dev mocha
npm install --save-dev firebase-functions-test
npm install --save-dev @firebase/rules-unit-testing
npm install firebase-admin
npm install firebase-tools
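So that the yarn test / npm test commands used later actually run mocha, the functions/package.json also needs a test script; a minimal sketch (version numbers omitted on purpose, keep whatever npm installed):
{
  "scripts": {
    "test": "mocha --reporter spec"
  }
}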
Replace all jaipuna-42-firebase-emulator names
It's very important that you use your own project-id. It must be the project-id of your own project and it must exist; fake IDs won't work. So search for all jaipuna-42-firebase-emulator in the code below and replace it with your project-id.
index.js for an example cloud function
// functions/index.js
const functions = require("firebase-functions");
const admin = require("firebase-admin");
// init the database
admin.initializeApp(functions.config().firebase);
let fsDB = admin.firestore();
const heartOfGoldRef = admin
  .firestore()
  .collection("spaceShip")
  .doc("Heart-of-Gold");
exports.addCrewMember = functions.firestore.document("characters/{characterId}").onCreate(async (snap, context) => {
  console.log("characters", snap.id);
  // before doing anything we need to make sure no other cloud function has worked on the assignment already
  // don't forget, cloud functions promise an "at least once" approach, so multiple
  // cloud function invocations could work on it (FYI: handling this safely is called being "idempotent")
  return fsDB.runTransaction(async t => {
    // Let's load the current character and the ship
    const [characterSnap, shipSnap] = await t.getAll(snap.ref, heartOfGoldRef);
    // Let's get the data
    const character = characterSnap.data();
    const ship = shipSnap.data();
    // set the crew members and count
    ship.crew = [...ship.crew, context.params.characterId];
    ship.crewCount = ship.crewCount + 1;
    // update character space status
    character.inSpace = true;
    // let's save to the DB
    await Promise.all([t.set(snap.ref, character), t.set(heartOfGoldRef, ship)]);
  });
});
mocha test file index.test.js
// functions/test/index.test.js
// START with: yarn firebase emulators:exec "yarn test --exit"
// important, project ID must be the same as we currently test
// At the top of test/index.test.js
require("firebase-functions-test")();
const assert = require("assert");
const firebase = require("@firebase/rules-unit-testing");
// must be the same as the project ID of the current firebase project.
// I believe this is mostly because the AUTH system still has to connect to firebase (Google's servers)
const projectId = "jaipuna-42-firebase-emulator";
const admin = firebase.initializeAdminApp({ projectId });
beforeEach(async function() {
  this.timeout(0);
  await firebase.clearFirestoreData({ projectId });
});
async function snooz(time = 3000) {
  return new Promise(resolve => {
    setTimeout(() => {
      resolve();
    }, time);
  });
}
it("Add Crew Members", async function() {
this.timeout(0);
const heartOfGold = admin
.firestore()
.collection("spaceShip")
.doc("Heart-of-Gold");
const trillianRef = admin
.firestore()
.collection("characters")
.doc("Trillian");
// init crew members of the Heart of Gold
await heartOfGold.set({
crew: [],
crewCount: 0,
});
// save the character Trillian to the DB
const trillianData = { name: "Trillian", inSpace: false };
await trillianRef.set(trillianData);
// wait until the CF is done.
await snooz();
// check if the crew size has change
const heart = await heartOfGold.get();
const trillian = await trillianRef.get();
console.log("heart", heart.data());
console.log("trillian", trillian.data());
// at this point the Heart of Gold has one crew member and trillian is in space
assert.deepStrictEqual(heart.data().crewCount, 1, "Crew Members");
assert.deepStrictEqual(trillian.data().inSpace, true, "In Space");
});
run the test
To run the tests and emulator in one go, we navigate into the functions folder and write yarn firebase emulators:exec "yarn test --exit". This command can also be used in your CI pipeline; if you use npm rather than yarn, run npm test inside emulators:exec instead.
If it all worked, you should see the following output
√ Add Crew Members (5413ms)
1 passing (8s)
For anyone struggling with testing firestore triggers, I've made an example repository that will hopefully help other people.
https://github.com/benwinding/example-jest-firestore-triggers
It uses jest and the local firebase emulator.

Firebase functions - Deploy Completed but doesn't exist in Firebase

I followed the guide for using Cloud Functions with Firebase.
Set up the environment
Set up the project in Firebase
Created the function
The command prompt says the function deployed, but the Firebase console is empty.
I am new to deploying functions, so I am sure it is a simple mistake in my setup, but I checked different guides three times and everything looks done right. Does anyone know what the problem could be?
I used this guide and followed everything up to initializing the project:
https://firebase.google.com/docs/functions/get-started
After that, in index.js, I wrote a function:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(function.config().firebase);
export.sendNotification = functions.database
  .ref('/notifications/{user_id}/{notification_id}')
  .onWrite(event => {
    conts user_id = event.params.user_id;
    const notification = event.params.notification_id;
    if(!event.data.val()){
      return console.log('A notification has been deleted ', notification_id);
    }
    const payload = {
      notification: {
        title: "Friend Request",
        body: "Received new Friend Request",
        icon: "default"
      }
    };
    return admin.messaging().sendToDevice(/*Token*/, payload).then(response =>{
      console.log('');
    });
  });
And with the command
firebase deploy
I tried to deploy the function.
But in the Firebase console the "Functions" category is still empty.
Error in CMD (screenshot omitted)
There is a syntax error. Please change the line below in your code:
admin.initializeApp(function.config().firebase);
to
admin.initializeApp(functions.config().firebase);
I'm quite late here, but it might help somebody else. You're right, it is a typo. The correct keyword is exports, not export.
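Putting both corrections together, the start of the function would read like this (a sketch showing only the fixed lines; the rest of the body stays as posted):
admin.initializeApp(functions.config().firebase);
exports.sendNotification = functions.database
  .ref('/notifications/{user_id}/{notification_id}')
  .onWrite(event => {
    // ... rest of the original handler ...
  });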

Export json from Firestore

As we can download a JSON file from the Firebase RTDB console, is there any way to export Firestore collection/document data as a JSON file?
One of my main objectives is to compare data before/after updating a document.
I just wrote a backup and restore tool for Firestore. You can try it on my GitHub:
https://github.com/dalenguyen/firestore-backup-restore
Thanks,
There is not, you'd need to come up with your own process such as querying a collection and looping over everything.
Update
As of August 7th, 2018, we do have a managed export system that allows you to dump your data into a GCS bucket. While this isn't JSON, it is a format that is the same as Cloud Datastore uses, so BigQuery understands it. This means you can then import it into BigQuery.
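For example, a managed export is kicked off with the gcloud CLI like this (the bucket name is a placeholder for your own GCS bucket):
gcloud firestore export gs://[BUCKET_NAME]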
Google made it harder than it needed to be, so the community found a workaround. If you have npm installed, you can do this:
Export
npx -p node-firestore-import-export firestore-export -a credentials.json -b backup.json
Import
npx -p node-firestore-import-export firestore-import -a credentials.json -b backup.json
I've written a tool that traverses the collections/documents of the database and exports everything into a single json file. Plus, it will import the same structure as well (helpful for cloning/moving Firestore databases). Since I've had a few colleagues use the code, I figured I would publish it as an NPM package. Feel free to try it and give some feedback.
https://www.npmjs.com/package/node-firestore-import-export
If someone wants a solution using Python 2 or 3:
Edit: note that this does not back up the rules.
Fork it on https://github.com/RobinManoli/python-firebase-admin-firestore-backup
First install and setup Firebase Admin Python SDK: https://firebase.google.com/docs/admin/setup
Then install it in your python environment:
pip install firebase-admin
Install the Firestore module:
pip install google-cloud-core
pip install google-cloud-firestore
(from ImportError: Failed to import the Cloud Firestore library for Python)
Python Code
# -*- coding: UTF-8 -*-
import firebase_admin
from firebase_admin import credentials, firestore
import json

cred = credentials.Certificate('xxxxx-adminsdk-xxxxx-xxxxxxx.json') # from firebase project settings
default_app = firebase_admin.initialize_app(cred, {
    'databaseURL': 'https://xxxxx.firebaseio.com'
})
db = firebase_admin.firestore.client()

# add your collections manually
collection_names = ['myFirstCollection', 'mySecondCollection']

collections = dict()
dict4json = dict()
n_documents = 0

for collection in collection_names:
    collections[collection] = db.collection(collection).get()
    dict4json[collection] = {}
    for document in collections[collection]:
        docdict = document.to_dict()
        dict4json[collection][document.id] = docdict
        n_documents += 1

jsonfromdict = json.dumps(dict4json)

path_filename = "/mypath/databases/firestore.json"
print("Downloaded %d collections, %d documents and now writing %d json characters to %s" % (len(collection_names), n_documents, len(jsonfromdict), path_filename))
with open(path_filename, 'w') as the_file:
    the_file.write(jsonfromdict)
There is an npm package for Firestore export/import.
Project to export
Go to Project settings -> Service accounts -> Generate new private key -> save it as exportedDB.json
Project to import
Go to Project settings -> Service accounts -> Generate new private key -> save it as importedDB.json
Run these 2 commands from the folder where you saved the files:
Export:
npx -p node-firestore-import-export firestore-export -a exportedDB.json -b backup.json
Import:
npx -p node-firestore-import-export firestore-import -a importedDB.json -b backup.json
Firestore is still early in its development so please check the docs on backups for any information pertaining to Firestore.
I found this npm package, node-firestore-backup, to be easy and useful.
Note that the --accountCredentials path/to/credentials/file.json is referring to a service account key json file that you can get by following instructions from https://developers.google.com/identity/protocols/application-default-credentials.
Go to the API Console Credentials page.
From the project drop-down, select your project.
On the Credentials page, select the Create credentials drop-down, then select Service account key.
From the Service account drop-down, select an existing service account or create a new one.
For Key type, select the JSON key option, then select Create. The file automatically downloads to your computer.
Put the *.json file you just downloaded in a directory of your choosing. This directory must be private (you can't let anyone get access to this), but accessible to your web server code.
It works for me.
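For reference, a typical invocation looks roughly like this (hedged: check the package README for the exact binary name and flags; the backup path is a placeholder):
firestore-backup --accountCredentials path/to/credentials/file.json --backupPath /backups/myDatabase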
I used Cloud Functions to export all data in Firestore to JSON format. The function that I used:
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();
const db = admin.firestore();

exports.exportFirestore2Json = functions.https.onRequest((request, response) => {
  db.collection("data").get().then(function(querySnapshot) {
    const orders = [];
    var order = null;
    querySnapshot.forEach(doc => {
      order = doc.data();
      orders.push(order);
    });
    response.send(JSON.stringify(orders));
    return true;
  })
  .catch(function(error) {
    console.error("Error adding document: ", error);
    return false;
  });
})
Then, go to https://your-project-id.cloudfunctions.net/exportFirestore2Json and you will see the exported documents as a plain JSON array.
Yes you can, and you do not need to enable billing in your Firebase console. There is a great npm package, https://www.npmjs.com/package/firestore-export-import, with which you can export and import Firestore collections and documents easily. Just follow these steps:
- Get your service account key: open Firebase console > Project settings > Service accounts > Generate new private key
- Rename the downloaded file to serviceAccountKey.json
- Now create a new folder and an index.js file
- Paste your serviceAccountKey.json into this folder
- Now install this package:
npm install firestore-export-import
OR
yarn add firestore-export-import
Exporting data from firebase
const { initializeApp} = require('firestore-export-import')
const serviceAccount = require('./serviceAccountKey.json')
const appName = '[DEFAULT]'
initializeApp(serviceAccount, appName)
const fs = require('fs');
const { backup } = require('firestore-export-import')
//backup('collection name')
backup('users').then((data) => {
  const json = JSON.stringify(data);
  // where collection.json is your output file name.
  fs.writeFile('collection.json', json, 'utf8', () => {
    console.log('done');
  });
});
Execute node index.js and you should see a new collection.json file with your collection and documents in it. If it looks a little messy, pretty-format it online with
https://codebeautify.org/jsonviewer
This index.js was just a very basic configuration which exports the whole collection with everything in it; read their documentation, as you can do queries and much more!
Importing data to firebase
const { initializeApp,restore } = require('firestore-export-import')
const serviceAccount = require('./serviceAccountKey.json')
const appName = '[DEFAULT]'
initializeApp(serviceAccount, appName)
restore('collection.json', {
  // where refs is an array of key items
  refs: ['users'],
  // autoParseDates to parse dates if documents have timestamps
  autoParseDates: true,
}, () => {
  console.log('done');
})
After execution you should see your Firestore populated with the users collection!
For uploading JSON from your local machine to Firestore:
npx -p node-firestore-import-export firestore-import -a credentials.json -b backup.json
For downloading data from Firestore to your local machine:
npx -p node-firestore-import-export firestore-export -a credentials.json -b backup.json
To generate credentials.json, go to Project settings -> Service accounts -> Generate new private key.
Create a blank folder (call it firebaseImportExport ) and run npm init
Go to the source Firebase project -> Settings -> Service Accounts
Click on the Generate new private key button and rename the file as source.json and put it in the firebaseImportExport folder
Do the same (step 2 & 3) for the destination project and rename the file as destination.json
Install the firebase-admin npm package: npm i firebase-admin
Write the following code in index.js:
const firebase = require('firebase-admin');
var serviceAccountSource = require("./source.json");
var serviceAccountDestination = require("./destination.json");
const sourceAdmin = firebase.initializeApp({
  credential: firebase.credential.cert(serviceAccountSource),
  databaseURL: "https://**********.firebaseio.com" // replace with source
});
const destinationAdmin = firebase.initializeApp({
  credential: firebase.credential.cert(serviceAccountDestination),
  databaseURL: "https://$$$$$.firebaseio.com"
}, "destination");
const collections = [ "books", "authors", ...]; // replace with your collections
var source = sourceAdmin.firestore();
var destination = destinationAdmin.firestore();
collections.forEach(colName => {
  source.collection(colName).get().then(function(querySnapshot) {
    querySnapshot.forEach(function(doc) {
      destination.collection(colName).doc(doc.id).set({ ...doc.data() });
    });
  });
});
Open any of your client-side Firebase apps (React, Angular, etc.). Use this code anywhere (inside an async function, since it uses await) to log the JSON to the console and copy it:
const products = await db
  .collection("collectionName")
  .where("time", ">", new Date("2020-09-01"))
  .get()
const json = JSON.stringify(products.docs.map((doc) => ({ ...doc.data() })))
console.log(json)
Documents can also be downloaded as JSON via the REST API.
This is an example using curl in conjunction with the Cloud SDK to obtain an access token:
curl -H "Authorization: Bearer "$(gcloud auth print-access-token) \
"https://firestore.googleapis.com/v1/projects/$PROJECT/databases/(default)/documents/$COLLECTION/$DOCUMENT"
I found an easier solution. There is a tool called Firefoo. It lists all the collection documents along with created users across multiple providers (email & password, phone number, Google, Facebook, etc.). You can export data as JSON & CSV, and you can also view the data in simplified formats like Table, Tree, and JSON.
Note: you don't have to go through the whole process of importing or exporting data from your Firebase console.
