How to configure a Google service key JSON file using a .env configuration - Firebase

I am creating a webservice in Node.js using a Google service API key. During development I kept the key file locally and tested like this; everything was OK.
Now I have to deploy to Firebase, and I need to keep the configuration file out of the repository.
Do you have experience doing this with a .env file?
The configuration file looks like this:
{
  "type": "service_account",
  "project_id": "xxxx-eeee",
  "private_key_id": "xxxw342234",
  "private_key": "-----BEGIN PRIVATE KEY-----...",
  "client_email": "xxxx@xxxx-eeee.iam.gserviceaccount.com",
  "client_id": "xxxxxxxxxxxx",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/xxxxxxxxxxxxxxxx.iam.gserviceaccount.com"
}
Right now I access the file with this code:
const translate = new Translate({
  projectId: 'my-project-0o0o0o0o',
  keyFilename: './my-project.json'
});
Can anyone help me with the steps to access this file without publishing the credentials to GitHub and then to Firebase?
UPDATED 13.02.2020
I created a .env file to store my credentials, like this:
SEPA_TRANSLATE_PROJECT_ID="4354-4545"
SEPA_TRANSLATE_GOOGLE_API_KEY_TYPE="service_account"
SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_PRIVATE_KEY_ID="43434"
SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_PRIVATE_KEY="-----BEGIN PRIVATE KEY-----\nMM=\n-----END PRIVATE KEY-----\n"
SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_CLIENT_EMAIL="googletranslateaaaaapi@aaaaa-aaaa.iam.gserviceaccount.com"
SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_CLIENT_ID="34234"
SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_AUTH_URI="https://accounts.google.com/o/oauth2/auth"
SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_TOKEN_URI="https://oauth2.googleapis.com/token"
SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_AUTH_PROVIDER_X509_CERT_URL="https://www.googleapis.com/oauth2/v1/certs"
SEPA_TRANSLATE_GOOGLE_SERVICE_CLIENT_X509_CERT_URL="https://www.googleapis.com/robot/v1/metadata/x509/googletranslatewerwer%erwerwre-werwr.iam.gserviceaccount.com"
Now the my-project.json file looks like this:
{
  "type": process.env.SEPA_TRANSLATE_GOOGLE_API_KEY_TYPE,
  "project_id": process.env.SEPA_TRANSLATE_PROJECT_ID,
  "private_key_id": process.env.SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_PRIVATE_KEY_ID,
  "private_key": process.env.SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_PRIVATE_KEY,
  "client_email": process.env.SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_CLIENT_EMAIL,
  "client_id": process.env.SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_CLIENT_ID,
  "auth_uri": process.env.SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_AUTH_URI,
  "token_uri": process.env.SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_TOKEN_URI,
  "auth_provider_x509_cert_url": process.env.SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_AUTH_PROVIDER_X509_CERT_URL,
  "client_x509_cert_url": process.env.SEPA_TRANSLATE_GOOGLE_SERVICE_CLIENT_X509_CERT_URL
}
This JSON file is then used like this:
const translate = new Translate({
  projectId: 'my-project-0o0o0o0o',
  keyFilename: './my-project.json'
});
When I test my API now, I get this error message: error translate text: SyntaxError: Unexpected token p in JSON at position 13

You shouldn't use a .env file with Firebase Cloud Functions, and you can't use environment references like process.env.xxxx inside a JSON file.
You should use environment configuration for Firebase Cloud Functions instead.
To store environment data, use the firebase functions:config:set command.
To read environment data, use the functions.config() function.
The local settings file is .runtimeconfig.json; after storing environment data with functions:config:set, generate it by running firebase functions:config:get > .runtimeconfig.json.
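To expand on why the JSON approach fails (an editorial sketch, not part of the original answer): require() and JSON.parse treat a .json file as literal data, so the text process.env.xxxx is what the parser chokes on. If the values must come from the environment, the object has to be assembled in JavaScript; the variable names below mirror the question's .env file:

```javascript
// Editorial sketch: a .json file cannot contain expressions, so build the
// credentials object in JavaScript instead of in JSON.
function buildCredentials(env) {
  return {
    type: env.SEPA_TRANSLATE_GOOGLE_API_KEY_TYPE,
    project_id: env.SEPA_TRANSLATE_PROJECT_ID,
    private_key: env.SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_PRIVATE_KEY,
    client_email: env.SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_CLIENT_EMAIL,
  };
}

// Example with dummy values (real code would pass process.env):
const creds = buildCredentials({
  SEPA_TRANSLATE_GOOGLE_API_KEY_TYPE: 'service_account',
  SEPA_TRANSLATE_PROJECT_ID: 'demo-project',
  SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_PRIVATE_KEY: '-----BEGIN PRIVATE KEY-----',
  SEPA_TRANSLATE_GOOGLE_SERVICE_KEY_CLIENT_EMAIL: 'demo@example.iam.gserviceaccount.com',
});
console.log(creds.type); // → service_account
```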
See:
https://firebase.google.com/docs/functions/config-env
https://firebase.google.com/docs/functions/local-shell#install_and_configure_the_cloud_functions_shell
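As a sketch of the recommended flow (the config namespace "sepa" here is hypothetical): store values once with firebase functions:config:set, then read them back with functions.config(). One practical detail: values come back as plain strings, so escaped "\n" sequences in a private key need to be turned back into real newlines before handing the key to a Google client:

```javascript
// Editorial sketch. Hypothetical config namespace "sepa", set once with:
//   firebase functions:config:set sepa.private_key="-----BEGIN PRIVATE KEY-----\nMM=\n-----END PRIVATE KEY-----\n"
//   firebase functions:config:get > .runtimeconfig.json   (for local emulation)
// Restore literal "\n" sequences to real newlines:
function normalizePrivateKey(raw) {
  return raw.replace(/\\n/g, '\n');
}

// Inside a deployed function this would be used roughly as:
//   const functions = require('firebase-functions');
//   const key = normalizePrivateKey(functions.config().sepa.private_key);
console.log(normalizePrivateKey('line1\\nline2').split('\n').length); // → 2
```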
If you only want to use the Translate API, then keyFilename is not necessary and you don't need to create a .env file yourself.
You should set up Translate API authentication with a Firebase service account.
And to run locally, set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of the JSON file that contains your service account key:
export GOOGLE_APPLICATION_CREDENTIALS="[PATH]"
See:
https://github.com/googleapis/nodejs-translate#readme.
https://cloud.google.com/translate/docs/basic/setup-basic
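A minimal sketch of what the client setup then looks like, assuming GOOGLE_APPLICATION_CREDENTIALS is exported as shown above (the sanity-check helper below is an illustration, not part of the library):

```javascript
// Editorial sketch. The client library reads GOOGLE_APPLICATION_CREDENTIALS
// by itself, so no keyFilename option is needed:
//   const {Translate} = require('@google-cloud/translate').v2;
//   const translate = new Translate();
// A small sanity check (hypothetical helper) before calling the API:
function credentialsConfigured(env) {
  const path = env.GOOGLE_APPLICATION_CREDENTIALS;
  return typeof path === 'string' && path.length > 0;
}

console.log(credentialsConfigured({ GOOGLE_APPLICATION_CREDENTIALS: '/home/user/key.json' })); // → true
console.log(credentialsConfigured({})); // → false
```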
Could you also try the translate samples?
https://github.com/googleapis/nodejs-translate/tree/master/samples
https://github.com/googleapis/nodejs-translate/tree/master/samples#samples
https://github.com/googleapis/nodejs-translate/blob/master/samples/translate.js

Related

Trying to figure out why Sendgrid's email API integration with Firebase Functions is returning "API key does not start with SG."

I'm trying to integrate Sendgrid's Email API with my Firebase webapp. Here is what I've done:
1 - Installed Sendgrid's Mail package
npm install @sendgrid/mail
2 - Created an API Key on my Sendgrid Account
3 - Assigned the API Key to an environment variable using Firebase's environment configuration:
firebase functions:config:set sendgrid.key=SG.xxxxxxxxxxxxxxxxxxxxxx.xxxxxxxxxxxxxxxxxxxx_xxxxxx_xxxxxxxxxxxxxxx
4 - checked to see if the environment variable was correctly assigned:
firebase functions:config:get
result:
5 - imported sendgrid mail and set API key:
import * as sgMail from '@sendgrid/mail';
const API_KEY = functions.config().sendgrid.key
const TEMPLATE_ID = functions.config().sendgrid.template
sgMail.setApiKey(API_KEY);
6 - created a new user trigger to send a test email
export const welcomeEmail = functions.auth.user().onCreate(user => {
  const msg = {
    to: user.email,
    from: 'contato@mycompanydomain.com.br',
    templateId: TEMPLATE_ID,
    dynamic_template_data: {
      subject: 'test subject!',
      name: user.displayName,
    },
  };
  return sgMail.send(msg);
});
7 - Deployed firebase functions:
firebase deploy --only functions
After doing this I'd expect that at least the API key would be set correctly, but I keep getting the following error from firebase functions log:
I can't figure out what is wrong. I've tried a few things:
1- creating a new api key and starting the process all over.
2- pasting the API key directly into the sgMail.setApiKey() method, like:
sgMail.setApiKey("SG.xxxxxxxxxxxxxxxxxxxxxx.xxxxxxxxxxxxxxxxxxxx_xxxxxx_xxxxxxxxxxxxxxx")
All of which gave the same "API key does not start with SG" error.
Can you guys help me figure out what's wrong?
Versions
"#sendgrid/mail": "^7.2.1",
"firebase-admin": "^8.10.0",
"firebase-functions": "^3.8.0"
Thank you so much
For those who had the same problem: I solved it by using the new firebase package and importing config from firebase-functions.
So my code looks like this:
import functions, { config } from "firebase-functions";
import sendgrid from "@sendgrid/mail";
const MY_SENDGRID_API_KEY = config().sendgrid.key;
sendgrid.setApiKey(MY_SENDGRID_API_KEY);
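As a defensive aid (an editorial sketch, not from the answer above): SendGrid keys always start with "SG.", so validating the key's shape before calling setApiKey surfaces this class of error with a clearer message:

```javascript
// Editorial sketch: fail fast with a descriptive error if the key that came
// out of the environment config is missing or malformed.
function assertSendgridKey(key) {
  if (typeof key !== 'string' || !key.startsWith('SG.')) {
    throw new Error('SendGrid API key missing or malformed (must start with "SG.")');
  }
  return key;
}

// Usage, e.g.: sendgrid.setApiKey(assertSendgridKey(config().sendgrid.key));
console.log(assertSendgridKey('SG.xxxx.yyyy')); // → SG.xxxx.yyyy
```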

Correct way to access firestore database from cloud functions inline editor

I am using the inline editor for Cloud Functions in Google Cloud Platform. The runtime is Node.js 8. I want to access the Firestore database. I found some code in another forum; it's a Cloud Firestore on-create trigger.
const admin = require('firebase-admin');
admin.initializeApp({
  credential: admin.credential.applicationDefault()
});
const db = admin.firestore();

exports.myFunction = (data, context) => {
  // access db here using something like db.collection('users');
};
I also read somewhere that just calling admin.initializeApp() without parameters works too, and that I can move the three admin/db statements from outside the exported function to inside it. So can anybody tell me the right way to write these statements outside the exported function? Also, is there a way to access the Firestore db directly, without the admin initialization? I could not find much info about the inline editor, so I am asking here. Please help.
My package.json
{
  "name": "sample-firestore",
  "version": "0.0.1",
  "dependencies": {
    "semver": "^5.5.1",
    "@google-cloud/firestore": "^1.3.0",
    "firebase-admin": "^7.1.1"
  }
}

AWS Amplify: Auth has not yet been added to this project

I'm not sure what I did (removed node_modules and reinstalled?) but I used to have auth configured for my AWS Amplify (v0.1.42) project and now it's gone.
My .amplifyrc is configured correctly:
{
  "providers": {
    "awscloudformation": {
      "AuthRoleName": "xxx",
      "UnauthRoleArn": "arn:aws:iam::xxx:role/xxx",
      "AuthRoleArn": "arn:aws:iam::xxx:role/xxx",
      "Region": "us-east-1",
      "DeploymentBucketName": "xxx",
      "UnauthRoleName": "xxx",
      "StackName": "xxx",
      "StackId": "arn:aws:cloudformation:us-east-1:xxx:stack/xxx/xxx"
    }
  }
}
I have an existing amplify/ directory with auth as part of the backend and #current-cloud-backend. I also have a generated src/aws-exports.js:
// WARNING: DO NOT EDIT. This file is automatically generated by AWS Amplify. It will be overwritten.
const awsmobile = {
  "aws_project_region": "us-east-1",
  "aws_cognito_identity_pool_id": "us-east-1:xxx",
  "aws_cognito_region": "us-east-1",
  "aws_user_pools_id": "us-east-1_xxx",
  "aws_user_pools_web_client_id": "xxx"
};
export default awsmobile;
All of the information is correct; however, when I try to execute amplify CLI commands it acts like I'm a brand-new user. How can I get the amplify CLI to use the existing configuration in my project?
This is incredibly frustrating because I can't really afford to recreate everything from scratch.

Best setup/workflow for testing and deploying an application using Google Firebase Hosting and Functions

I built a small web application (www.suntax.de) with vuejs and hosted on Google Firebase. I use the Firebase Hosting, Database and Functions. I use Functions to do some calculation on the backend.
Everything is running now, but I had some trouble getting there and some things I would like to improve regarding my workflow.
Thus, I would like to explain you what I did and where I struggle now.
First I set up the vuejs application and deployed it to firebase. I added my custom domain and everything was working fine.
Later I implemented a cloud function and wanted to call it from my vuejs app.
After firebase deploy, I can call this function in the browser and it just works fine:
https://us-central1-suntax-6d0ea.cloudfunctions.net/calcEEGcompensation?year=2013&month=1&size=10
Now I thought that I could just call the URL of the function from my vuejs app. But then I got the following error message:
[Error] Origin * is not allowed by Access-Control-Allow-Origin.
I then read that I had to add a rewrites section in firebase.json:
Now my firebase.json looks like this:
{
  "database": {
    "rules": "database.rules.json"
  },
  "hosting": {
    "public": "public",
    "ignore": [
      "firebase.json",
      "**/.*",
      "**/node_modules/**"
    ],
    // Add the following rewrites section *within* "hosting"
    "rewrites": [ {
      "source": "/calcEEGcompensation", "function": "calcEEGcompensation"
    } ]
  }
}
Now I was able to call my firebase function with the following URL:
https://www.suntax.de/calcEEGcompensation?year=2013&month=1&size=10
After integrating the above URL in my vuejs application, the application is running fine after deployment to firebase server.
As I want to keep improving the application, I would like to test everything locally before deploying.
I know that I can run firebase hosting and functions locally by:
firebase serve --only functions,hosting
However, now my application has the hard coded call to my function https://www.suntax.de/calcEEGcompensation?year=2013&month=1&size=10 and this again leads to the error [Error] Origin * is not allowed by Access-Control-Allow-Origin.
But also changing the URL to the local function
http://localhost:5001/suntax-6d0ea/us-central1/calcEEGcompensation?year=2013&month=1&size=10
leads to the error message
[Error] Origin * is not allowed by Access-Control-Allow-Origin.
Some further reading brought me to the solution with cors. So I changed my function to:
const functions = require('firebase-functions');
const cors = require('cors')({origin: true});

exports.calcEEGcompensation = functions.https.onRequest((req, res) => {
  cors(req, res, () => {
    const yearOfCommissioning = req.query.year;
    const monthOfCommissioning = req.query.month;
    const pvsystemsize = req.query.size;
    ...
    res.send("hello");
  });
});
This helped and everything works now:
- The deployed application is still running fine.
- I can run the application locally while calling the local function as well as the deployed function. I just have to change the URL of the function.
But this is now my question:
Can I solve this issue in a better way? If I test the vuejs application and the function locally, I have to change the function URL before deployment, and then change it back when testing locally again.
I was not able to test my application and function locally without cors.
My ideal solution would be to have a setup, that can be fully tested locally and which can be easily deployed with firebase deploy without any changes of URLs. Is this possible?
Thanks and best regards,
Christoph
I found the solution which is pretty simple and does exactly what I want. I have no idea why I did not figure it out before.
Here is what I did:
I just call the relative URL from my firebase hosting:
calcEEGcompensation?year=2013&month=1&size=10
and everything works fine if the rewrites are properly set in firebase.json:
{
  "database": {
    "rules": "database.rules.json"
  },
  "hosting": {
    "public": "public",
    "ignore": [
      "firebase.json",
      "**/.*",
      "**/node_modules/**"
    ],
    // Add the following rewrites section *within* "hosting"
    "rewrites": [ {
      "source": "/calcEEGcompensation", "function": "calcEEGcompensation"
    } ]
  }
}
After setting up everything like this, I can just execute firebase serve --only functions,hosting and I can test everything locally.
After executing firebase deploy, everything runs smoothly on the server.
I do not need cors.
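For illustration (the helper name is hypothetical), the relative call from the hosted app can be built like this; the same path works both under firebase serve and on the deployed site because the rewrite handles both:

```javascript
// Editorial sketch: build the relative URL that the "rewrites" entry in
// firebase.json routes to the calcEEGcompensation function.
function buildCalcUrl(year, month, size) {
  const params = new URLSearchParams({ year, month, size });
  return `/calcEEGcompensation?${params}`;
}

console.log(buildCalcUrl(2013, 1, 10)); // → /calcEEGcompensation?year=2013&month=1&size=10
```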
Thanks @wonsuc for your answers.
Update (for Firebase Hosting):
Currently there is no way to do this with the Firebase Hosting SDK alone, but there is an alternative way to achieve it.
Try the code below in your hosting source:
if (location.hostname === 'localhost' || location.hostname === '127.0.0.1') {
  console.log("It's a local server!");
}
In my opinion, this is currently the best way to check for a dev environment.
Therefore you should use location.hostname in Firebase Hosting, and server.address() in Cloud Functions.
Also, define your Functions endpoints with constant variables:
const DEBUG = location.hostname === 'localhost' || location.hostname === '127.0.0.1';
const FUNCTIONS_ENDPOINT_DEV = 'http://localhost:5001/';
const FUNCTIONS_ENDPOINT_PRD = 'https://us-central1-something.cloudfunctions.net/';
const FUNCTIONS_URL_CALC = 'calcEEGcompensation?year=2013&month=1&size=10';

var endpoint;
if (DEBUG) {
  endpoint = FUNCTIONS_ENDPOINT_DEV + FUNCTIONS_URL_CALC;
} else {
  endpoint = FUNCTIONS_ENDPOINT_PRD + FUNCTIONS_URL_CALC;
}
Original answer (for Cloud Functions for Firebase):
Have you tried the node.js net module's server.address() function?
This method will tell you whether your functions code is running on localhost or on the real deployed server.
For example, you can use it like this:
const server = app.listen(function() {
  let host = server.address().address;
  let port = server.address().port;
  if (!host || host === '::') {
    host = 'localhost:';
  }
  console.log('Server is running on %s%s', host, port);
});

Export json from Firestore

As we can download a JSON file from the Firebase RTDB console, is there any way to export Firestore collection/document data as a JSON file?
One of my main objectives is to compare data before/after updating document.
I just wrote a backup and restore for Firestore. You can have a try on my GitHub.
https://github.com/dalenguyen/firestore-backup-restore
Thanks,
There is not; you'd need to come up with your own process, such as querying a collection and looping over everything.
Update
As of August 7th, 2018, we do have a managed export system that allows you to dump your data into a GCS bucket. While this isn't JSON, it is a format that is the same as Cloud Datastore uses, so BigQuery understands it. This means you can then import it into BigQuery.
Google made it harder than it needed to be, so the community found a workaround. If you have npm installed, you can do this:
Export
npx -p node-firestore-import-export firestore-export -a credentials.json -b backup.json
Import
npx -p node-firestore-import-export firestore-import -a credentials.json -b backup.json
Source
I've written a tool that traverses the collections/documents of the database and exports everything into a single json file. Plus, it will import the same structure as well (helpful for cloning/moving Firestore databases). Since I've had a few colleagues use the code, I figured I would publish it as an NPM package. Feel free to try it and give some feedback.
https://www.npmjs.com/package/node-firestore-import-export
If someone wants a solution using Python 2 or 3:
Edit: note that this does not back up the rules.
Fork it on https://github.com/RobinManoli/python-firebase-admin-firestore-backup
First install and setup Firebase Admin Python SDK: https://firebase.google.com/docs/admin/setup
Then install it in your python environment:
pip install firebase-admin
Install the Firestore module:
pip install google-cloud-core
pip install google-cloud-firestore
(from ImportError: Failed to import the Cloud Firestore library for Python)
Python Code
# -*- coding: UTF-8 -*-
import firebase_admin
from firebase_admin import credentials, firestore
import json

cred = credentials.Certificate('xxxxx-adminsdk-xxxxx-xxxxxxx.json')  # from firebase project settings
default_app = firebase_admin.initialize_app(cred, {
    'databaseURL': 'https://xxxxx.firebaseio.com'
})
db = firebase_admin.firestore.client()

# add your collections manually
collection_names = ['myFirstCollection', 'mySecondCollection']
collections = dict()
dict4json = dict()
n_documents = 0

for collection in collection_names:
    collections[collection] = db.collection(collection).get()
    dict4json[collection] = {}
    for document in collections[collection]:
        docdict = document.to_dict()
        dict4json[collection][document.id] = docdict
        n_documents += 1

jsonfromdict = json.dumps(dict4json)
path_filename = "/mypath/databases/firestore.json"
print("Downloaded %d collections, %d documents and now writing %d json characters to %s" % (len(collection_names), n_documents, len(jsonfromdict), path_filename))
with open(path_filename, 'w') as the_file:
    the_file.write(jsonfromdict)
There is an npm package for Firestore export/import.
Project to export:
Go to Project settings -> Service accounts -> Generate new private key -> save it as exportedDB.json.
Project to import:
Go to Project settings -> Service accounts -> Generate new private key -> save it as importedDB.json.
Run these two commands from the folder where you saved the files:
Export:
npx -p node-firestore-import-export firestore-export -a exportedDB.json -b backup.json
Import:
npx -p node-firestore-import-export firestore-import -a importedDB.json -b backup.json
Firestore is still early in its development so please check the docs on backups for any information pertaining to Firestore.
I found this npm package, node-firestore-backup, to be easy and useful.
Note that the --accountCredentials path/to/credentials/file.json is referring to a service account key json file that you can get by following instructions from https://developers.google.com/identity/protocols/application-default-credentials.
Go to the API Console Credentials page.
From the project drop-down, select your project.
On the Credentials page, select the Create credentials drop-down, then select Service account key.
From the Service account drop-down, select an existing service account or create a new one.
For Key type, select the JSON key option, then select Create. The file automatically downloads to your computer.
Put the *.json file you just downloaded in a directory of your choosing. This directory must be private (you can't let anyone get access to this), but accessible to your web server code.
It works for me.
I used Cloud Functions to export all data in Firestore to JSON format. The function that I was used:
exports.exportFirestore2Json = functions.https.onRequest((request, response) => {
  db.collection("data").get().then(function(querySnapshot) {
    const orders = [];
    var order = null;
    querySnapshot.forEach(doc => {
      order = doc.data();
      orders.push(order);
    });
    response.send(JSON.stringify(orders));
    return true;
  })
  .catch(function(error) {
    console.error("Error adding document: ", error);
    return false;
  });
});
Then, go to https://your-project-id.cloudfunctions.net/exportFirestore2Json you will see something like this
Yes you can, and you do not need to enable billing in your Firebase console. There is a great npm package, https://www.npmjs.com/package/firestore-export-import, with which you can export and import Firestore collections and documents easily. Just follow these steps:
- Get your service account key:
Open Firebase console > Project settings > Service accounts > Generate new private key.
Rename the downloaded file to serviceAccountKey.json.
- Now create a new folder and an index.js file.
- Paste your serviceAccountKey.json into this folder.
- Now install the package:
npm install firestore-export-import
OR
yarn add firestore-export-import
Exporting data from firebase
const { initializeApp } = require('firestore-export-import')
const serviceAccount = require('./serviceAccountKey.json')
const appName = '[DEFAULT]'
initializeApp(serviceAccount, appName)

const fs = require('fs');
const { backup } = require('firestore-export-import')

// backup('collection name')
backup('users').then((data) => {
  const json = JSON.stringify(data);
  // where collection.json is your output file name
  fs.writeFile('collection.json', json, 'utf8', () => {
    console.log('done');
  })
});
Execute node index.js and you should see a new collection.json file with your collection and documents in it. If it looks a little messy, pretty-format it online with
https://codebeautify.org/jsonviewer
This index.js was just a very basic configuration which exports the whole collection with everything in it; read their documentation, you can do queries and much more!
Importing data to firebase
const { initializeApp, restore } = require('firestore-export-import')
const serviceAccount = require('./serviceAccountKey.json')
const appName = '[DEFAULT]'
initializeApp(serviceAccount, appName)

restore('collection.json', {
  // where refs is an array of key items
  refs: ['users'],
  // autoParseDates to parse dates if documents have timestamps
  autoParseDates: true,
}, () => {
  console.log('done')
})
After execution you should see your Firestore populated with the users collection!
To upload JSON from your local machine to Firestore:
npx -p node-firestore-import-export firestore-import -a credentials.json -b backup.json
To download data from Firestore to your local machine:
npx -p node-firestore-import-export firestore-export -a credentials.json -b backup.json
To generate credentials.json, go to Project settings -> Service accounts -> generate a private key.
Create a blank folder (call it firebaseImportExport) and run npm init.
Go to the source Firebase project -> Settings -> Service Accounts.
Click on the Generate new private key button, rename the file to source.json, and put it in the firebaseImportExport folder.
Do the same (steps 2 & 3) for the destination project and rename the file to destination.json.
Install the firebase-admin package (npm i firebase-admin).
Write the following code in index.js:
const firebase = require('firebase-admin');

var serviceAccountSource = require("./source.json");
var serviceAccountDestination = require("./destination.json");

const sourceAdmin = firebase.initializeApp({
  credential: firebase.credential.cert(serviceAccountSource),
  databaseURL: "https://**********.firebaseio.com" // replace with source
});

const destinationAdmin = firebase.initializeApp({
  credential: firebase.credential.cert(serviceAccountDestination),
  databaseURL: "https://$$$$$.firebaseio.com"
}, "destination");

const collections = [ "books", "authors", ...]; // replace with your collections

var source = sourceAdmin.firestore();
var destination = destinationAdmin.firestore();

collections.forEach(colName => {
  source.collection(colName).get().then(function(querySnapshot) {
    querySnapshot.forEach(function(doc) {
      destination.collection(colName).doc(doc.id).set({ ...doc.data() });
    });
  });
});
Open any of your client-side Firebase apps (React, Angular, etc.). Use this code anywhere (inside an async function) to log the data to the console, and copy it from there:
const products = await db
  .collection("collectionName")
  .where("time", ">", new Date("2020-09-01"))
  .get()

const json = JSON.stringify(products.docs.map((doc) => ({ ...doc.data() })))
console.log(json)
Documents can also be downloaded as JSON via the REST API.
This is an example using curl in conjunction with the Cloud SDK to obtain an access token:
curl -H "Authorization: Bearer "$(gcloud auth print-access-token) \
"https://firestore.googleapis.com/v1/projects/$PROJECT/databases/(default)/documents/$COLLECTION/$DOCUMENT"
I found an easier solution. There is a tool called Firefoo. It lists all the collection documents, along with created users from multiple providers (email & password, phone number, Google, Facebook, etc.). You can export data as JSON & CSV, and you can also view the data in simplified formats such as table, tree, and JSON.
Note: you don't have to go through all of the above processes for importing or exporting data from your Firebase console.
