I'm using Firebase Functions with Node.js and I'm trying to create multiple environments. As far as I've read, I just need to create separate projects in Firebase for that, which I did.
I'm using Flamelink as well and I want to achieve the same. I actually have a Bonfire plan for Flamelink that allows multiple environments.
My concern is that the different Flamelink environments write into the same Firebase database, separated only by an environment flag, so whenever I want to query something from the db I also have to specify my environment.
Is there a way to have different databases for different Flamelink environments with my setup, so I only specify the environment in my config and not in my queries?
Currently it is not possible to have a database per environment using Flamelink.
The only way to achieve this is to add both projects to Flamelink.
The Flamelink JS SDK can however be used within a cloud function and would alleviate some of the complexity of working with multiple environments.
The Flamelink JS SDK takes in an environment parameter (along with some others, like locale and database type) when it is initialised, contextualising the use of the SDK methods with the environment.
import * as functions from 'firebase-functions';
import * as admin from 'firebase-admin';
import flamelink from 'flamelink/app';
import 'flamelink/content';
admin.initializeApp();
const firebaseApp = admin.app();
const flApp = flamelink({
  firebaseApp,
  dbType: 'cf',
  env: 'staging',
  locale: 'en-US',
});
export const testFunction = functions.https.onRequest(async (request, response) => {
  if (request.query.env) {
    flApp.settings.setEnvironment(request.query.env) // example 'production'
  }

  try {
    const posts = await flApp.content.get({ schemaKey: 'blogPosts' })
    response.status(200).json({ posts })
  } catch (e) {
    // handle error
  }
});
Depending on your connected front-end framework/language, you can pass in the environment using environment variables.
JS client example:
const env = (process.env.FLAMELINK_DATA_ENV || 'staging').toLowerCase()
await fetch(`https://yourhost.cloudfunctions.net/testFunction?env=${env}`)
Related
My function file has some code in the global context.
import * as functions from 'firebase-functions';
import { initBot } from './bot';
const bot = initBot(process.env.BOT_TOKEN ?? '');
functions.logger.debug('Setting webhook on cold start');
const adminConfig = JSON.parse(process.env.FIREBASE_CONFIG ?? '');
// eslint-disable-next-line @typescript-eslint/no-floating-promises
bot.telegram.setWebhook(
  `https://us-central1-${adminConfig.projectId}.cloudfunctions.net/${process.env.K_SERVICE}`
);
// handle all telegram updates with HTTPs trigger
exports.bot = functions
  .runWith({ secrets: ['BOT_TOKEN'], memory: '128MB' })
  .https.onRequest(async (request, response) => {
    ...
As documentation states,
When a cold start occurs, the global context of the function is evaluated.
I want to use this feature to run some code on cold start. This approach is taken from the telegraf.js example.
However, when I try to deploy the function, the Firebase CLI evaluates the global context, and the code throws an error because the environment variable is missing.
Questions:
Why does the CLI evaluate the global context, which obviously leads to errors due to the lack of a proper environment?
Is there any way to avoid this?
Is it a proper understanding of the recommendations on the above-mentioned documentation page that developers are encouraged to do some cold-start computation in the global context of the function module? Are there any documented restrictions?
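One way to sidestep the deploy-time error is to guard the env-dependent cold-start work so that evaluating the module without the variables is harmless. This is only a minimal sketch, assuming initBot returns a Telegraf-style bot as in the snippet above:

import * as functions from 'firebase-functions';
import { initBot } from './bot';

// Only run the env-dependent cold-start code when the variables are actually
// present, so that evaluating the module without BOT_TOKEN / FIREBASE_CONFIG
// (e.g. during deployment analysis) does not throw.
const bot = process.env.BOT_TOKEN ? initBot(process.env.BOT_TOKEN) : undefined;

if (bot && process.env.FIREBASE_CONFIG) {
  functions.logger.debug('Setting webhook on cold start');
  const adminConfig = JSON.parse(process.env.FIREBASE_CONFIG);
  // eslint-disable-next-line @typescript-eslint/no-floating-promises
  bot.telegram.setWebhook(
    `https://us-central1-${adminConfig.projectId}.cloudfunctions.net/${process.env.K_SERVICE}`
  );
}

exports.bot = functions
  .runWith({ secrets: ['BOT_TOKEN'], memory: '128MB' })
  .https.onRequest(async (request, response) => {
    // ... handle the update with `bot` here, as in the original function body
  });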
I'm trying to set up my Next.js app to use runtime configuration. Basically, I have an endpoint URL that needs to be available through Docker env vars.
I configured it following these docs, but it isn't working. My app is still using the default values from the .env file. Could anyone help me understand what I missed or did wrong?
Thanks!
docs:
https://nextjs.org/docs/api-reference/next.config.js/runtime-configuration
https://nextjs.org/docs/advanced-features/custom-app
steps:
1- added to my next.config.js
publicRuntimeConfig: {
  NEXT_PUBLIC_BACKEND_HOST: process.env.NEXT_PUBLIC_BACKEND_HOST,
},
2- retrieved config in my pages
import getConfig from 'next/config'

const { publicRuntimeConfig } = getConfig()
const baseURL = publicRuntimeConfig.NEXT_PUBLIC_BACKEND_HOST
3- created a custom app to setup getInitialProps
Runtime configuration won't be available to any page (or component in a page) without getInitialProps.
import App from 'next/app'

function MyApp({ Component, pageProps }) {
  return <Component {...pageProps} />
}

MyApp.getInitialProps = async (appContext) => {
  const appProps = await App.getInitialProps(appContext);
  return { ...appProps }
}

export default MyApp
Everything seems fine in your code; I tested it in a fresh project and everything worked correctly. Therefore I think the issue is that you don't actually have the NEXT_PUBLIC_BACKEND_HOST env var set when you're running next start. By the way, you don't need the NEXT_PUBLIC_ prefix for this kind of usage. If you want build-time values instead, you can use the NEXT_PUBLIC_ prefix to make the variable available both client- and server-side by just using process.env.NEXT_PUBLIC_ anywhere. Note that in that case the value is inlined at build time, so the env var needs to be present during the build.
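To make the runtime path concrete, here is a minimal sketch of the moving parts; the BACKEND_HOST name, the fallback URL, and the docker run invocation are illustrative assumptions, not taken from the question:

// next.config.js -- read on the server at runtime (requires getInitialProps on the page)
module.exports = {
  publicRuntimeConfig: {
    // BACKEND_HOST is an illustrative variable name; no NEXT_PUBLIC_ prefix needed here
    BACKEND_HOST: process.env.BACKEND_HOST || 'http://localhost:4000',
  },
}

The variable then has to be present in the environment of the container that runs next start, for example:

docker run -e BACKEND_HOST=https://api.example.com -p 3000:3000 my-next-image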
I tried this link and created my first store in Quasar using Pinia. I also needed to change .quasar/app.js manually to add the Pinia store and to make Pinia functional.
import { Quasar } from 'quasar'
import { markRaw } from 'vue'
import RootComponent from 'app/src/App.vue'
import createStore from 'app/src/stores/index'
import createRouter from 'app/src/router/index'

export default async function (createAppFn, quasarUserOptions) {
  // Create the app instance.
  // Here we inject into it the Quasar UI, the router & possibly the store.
  const app = createAppFn(RootComponent)
  app.config.devtools = true

  app.use(Quasar, quasarUserOptions)

  const store = typeof createStore === 'function'
    ? await createStore({})
    : createStore
  app.use(store)

  const router = markRaw(
    typeof createRouter === 'function'
      ? await createRouter({ store })
      : createRouter
  )

  // make router instance available in store
  store.use(({ store }) => { store.router = router })

  // Expose the app, the router and the store.
  // Note that we are not mounting the app here, since bootstrapping will be
  // different depending on whether we are in a browser or on the server.
  return {
    app,
    store,
    router
  }
}
But the problem is that .quasar/app.js is rewritten with its default contents as soon as quasar dev is executed, and then I don't have access to the Pinia stores anymore.
As I said, this application was formerly based on Vuex.
Make sure you have the index file for pinia.
In "src/stores/index.js"
import { store } from 'quasar/wrappers'
import { createPinia } from 'pinia'
/*
* If not building with SSR mode, you can
* directly export the Store instantiation;
*
* The function below can be async too; either use
* async/await or return a Promise which resolves
* with the Store instance.
*/
export default store((/* { ssrContext } */) => {
  const pinia = createPinia()

  // You can add Pinia plugins here
  // pinia.use(SomePiniaPlugin)

  return pinia
})
Try checking quasar info
quasar info
Notice @quasar/app-webpack and vuex.
If you are using @quasar/app, try to move to @quasar/app-webpack by upgrading Quasar.
quasar upgrade -i
If you have vuex installed in your quasar info output, try to remove it.
npm uninstall vuex
In your package-lock.json, look for "node_modules/vuex" and delete the key and value.
Then delete your "node_modules" folder and run npm i
After that, run quasar clean.
You may try creating a Pinia store via quasar command to validate it.
quasar new store <store_name>
It should generate a Pinia store instead of a Vuex store.
The problem is an older version of the @quasar/app-webpack package. It has supported Pinia since v3.4.0; check the release notes here. So basically, upgrade this package.
Run quasar upgrade -i and then quasar new store <store_name> [--format ts]
It will create a stores/ directory with pinia.
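For reference, the generated file is a plain Pinia defineStore module, roughly along these lines (the counter example is illustrative, not the exact generated content):

// src/stores/example-store.js (illustrative sketch)
import { defineStore } from 'pinia'

export const useCounterStore = defineStore('counter', {
  state: () => ({
    counter: 0,
  }),
  getters: {
    doubleCount: (state) => state.counter * 2,
  },
  actions: {
    increment() {
      this.counter++
    },
  },
})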
In my case I didn't need to edit any special files, just replace the index.js in the stores folder. To get the Quasar CLI to then use Pinia when running quasar new store, I had to run quasar clean, and just like that I had fully transitioned.
My solution was to remove and reinstall node_modules.
According to the following Google I/O (2019) post from the Firebase team, the new emulator allows us to combine Firestore/database and Cloud Functions to fully simulate our Firebase server code. That should also mean we should be able to write tests for it.
we’re releasing a brand new Cloud Functions emulator that can also
communicate with the Cloud Firestore emulator. So if you want to build
a function that triggers upon a Firestore document update and writes
data back to the database you can code and test that entire flow
locally on your laptop (Source: Firebase Blog Entry)
I could find multiple resources looking at/describing each individual simulation, but none covering all of them together:
Unit Testing Cloud Function
Emulate Database writes
Emulate Firestore writes
To set up a test environment for cloud functions that allows you to simulate read/write and set up test data, you have to do the following. Keep in mind, this really simulates/triggers cloud functions, so after you write into Firestore, you need to wait a bit until the cloud function is done writing/processing before you can read and assert the data.
An example repo with the code below can be found here: https://github.com/BrandiATMuhkuh/jaipuna-42-firebase-emulator .
Preconditions
I assume at this point you have a Firebase project set up, with a functions folder and an index.js in it. The tests will later live in the functions/test folder. If you don't have a project set up, use firebase init to set one up.
Install Dependencies
First add/install the following dependencies: mocha, @firebase/rules-unit-testing, firebase-functions-test, firebase-functions, firebase-admin, firebase-tools into functions/package.json, NOT the root folder.
cd "YOUR-LOCAL-EMULATOR"/functions (for example cd C:\Users\User\Documents\FirebaseLocal\functions)
npm install --save-dev mocha
npm install --save-dev firebase-functions-test
npm install --save-dev @firebase/rules-unit-testing
npm install firebase-admin
npm install firebase-tools
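After installing, the relevant part of functions/package.json might look roughly like this (the version numbers and the test script are assumptions; use whatever npm actually installed):

{
  "scripts": {
    "test": "mocha --reporter spec"
  },
  "dependencies": {
    "firebase-admin": "^9.2.0",
    "firebase-functions": "^3.11.0",
    "firebase-tools": "^8.14.0"
  },
  "devDependencies": {
    "@firebase/rules-unit-testing": "^1.0.0",
    "firebase-functions-test": "^0.2.0",
    "mocha": "^8.1.0"
  }
}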
Replace all jaipuna-42-firebase-emulator names
It's very important that you use your own project-id. It must be the project-id of your own project and it must exist; fake IDs won't work. So search for all occurrences of jaipuna-42-firebase-emulator in the code below and replace them with your project-id.
index.js for an example cloud function
// functions/index.js
const functions = require("firebase-functions");
const admin = require("firebase-admin");
// init the database
admin.initializeApp(functions.config().firebase);
let fsDB = admin.firestore();
const heartOfGoldRef = admin
  .firestore()
  .collection("spaceShip")
  .doc("Heart-of-Gold");

exports.addCrewMemeber = functions.firestore.document("characters/{characterId}").onCreate(async (snap, context) => {
  console.log("characters", snap.id);

  // before doing anything we need to make sure no other cloud function worked on the assignment already
  // don't forget, cloud functions promise an "at least once" approach, so multiple
  // cloud functions could work on it. (FYI: this is called "idempotent")
  return fsDB.runTransaction(async t => {
    // Let's load the current character and the ship
    const [characterSnap, shipSnap] = await t.getAll(snap.ref, heartOfGoldRef);

    // Let's get the data
    const character = characterSnap.data();
    const ship = shipSnap.data();

    // set the crew members and count
    ship.crew = [...ship.crew, context.params.characterId];
    ship.crewCount = ship.crewCount + 1;

    // update character space status
    character.inSpace = true;

    // let's save to the DB
    await Promise.all([t.set(snap.ref, character), t.set(heartOfGoldRef, ship)]);
  });
});
mocha test file index.test.js
// functions/test/index.test.js
// START with: yarn firebase emulators:exec "yarn test --exit"

// important, project ID must be the same as we currently test
// At the top of test/index.test.js
require("firebase-functions-test")();
const assert = require("assert");
const firebase = require("@firebase/rules-unit-testing");

// must be the same as the project ID of the current firebase project.
// I believe this is mostly because the AUTH system still has to connect to firebase (googles servers)
const projectId = "jaipuna-42-firebase-emulator";
const admin = firebase.initializeAdminApp({ projectId });

beforeEach(async function() {
  this.timeout(0);
  await firebase.clearFirestoreData({ projectId });
});

async function snooz(time = 3000) {
  return new Promise(resolve => {
    setTimeout(e => {
      resolve();
    }, time);
  });
}
it("Add Crew Members", async function() {
this.timeout(0);
const heartOfGold = admin
.firestore()
.collection("spaceShip")
.doc("Heart-of-Gold");
const trillianRef = admin
.firestore()
.collection("characters")
.doc("Trillian");
// init crew members of the Heart of Gold
await heartOfGold.set({
crew: [],
crewCount: 0,
});
// save the character Trillian to the DB
const trillianData = { name: "Trillian", inSpace: false };
await trillianRef.set(trillianData);
// wait until the CF is done.
await snooz();
// check if the crew size has change
const heart = await heartOfGold.get();
const trillian = await trillianRef.get();
console.log("heart", heart.data());
console.log("trillian", trillian.data());
// at this point the Heart of Gold has one crew member and trillian is in space
assert.deepStrictEqual(heart.data().crewCount, 1, "Crew Members");
assert.deepStrictEqual(trillian.data().inSpace, true, "In Space");
});
run the test
To run the tests and the emulator in one go, navigate into the functions folder and run yarn firebase emulators:exec "yarn test --exit". This command can also be used in your CI pipeline, and you can use npm instead of yarn if you prefer.
If it all worked, you should see the following output
√ Add Crew Members (5413ms)
1 passing (8s)
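If you want to pin the emulator ports, the firebase.json in the project root can include an emulators section, roughly like this (the ports shown are the defaults):

{
  "emulators": {
    "functions": {
      "port": 5001
    },
    "firestore": {
      "port": 8080
    }
  }
}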
For anyone struggling with testing firestore triggers, I've made an example repository that will hopefully help other people.
https://github.com/benwinding/example-jest-firestore-triggers
It uses jest and the local firebase emulator.
I have two deployment targets in my Cloud Functions. I use the command line to determine which project I deploy to: firebase use myTestApp or firebase use myLiveApp.
Can I tell which target I am using in my index.js code?
I am hoping for something like this
// change baseURLs and other keys
if (Target == live) {
  const baseURL = 'myLiveApp';
  const stripekey = 'secreteLivekey';
} else {
  const baseURL = 'myTestApp';
  const stripekey = 'secreteTestkey';
};
Currently I get around this by commenting out the test or live keys, which is very annoying and makes it easy to make a mistake.
You can set the necessary variables in each project's Functions config.
CLI:
firebase use <live_project>
firebase functions:config:set stripe.key="secreteLivekey"
firebase functions:config:set app.baseurl="myLiveApp"
firebase use <non_live_project>
firebase functions:config:set stripe.key="secreteTestkey"
firebase functions:config:set app.baseurl="myTestApp"
In code:
import * as functions from 'firebase-functions'
const baseURL = functions.config().app.baseurl
const stripekey = functions.config().stripe.key
EDIT:
Since your question technically was "Can you know your deployment target?", the answer is yes.
const config = JSON.parse(process.env.FIREBASE_CONFIG)
const projectId = config.projectId
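A minimal sketch of how the project ID could then drive the branching from the question; 'myLiveApp' is assumed to be the live project's ID, and the URLs are placeholders:

const config = JSON.parse(process.env.FIREBASE_CONFIG)
const isLive = config.projectId === 'myLiveApp' // assumption: the live project's ID

// placeholder values; real secrets are better kept in functions config, as shown above
const baseURL = isLive ? 'https://live.example.com' : 'https://test.example.com'
const stripekey = functions.config().stripe.key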