I have code in my index.js with numerous functions, each with different, really expensive imports. From my understanding, these functions share all global imports, so I have two options:
1) Do all imports globally, which leads to slow cold starts but faster function calls when warm idle instances are available.
2) Do lazy imports inside the functions, which makes for fast cold starts but slower function calls.
I was wondering if a third option exists where I can split index.js such that the global imports are separated. Is there such an option, or an alternative that I am missing?
Doing the expensive require from inside the function body does not necessarily lead to slower function calls, as requires are cached in-memory. Warm invocations (those that don't require a cold start) will run the require line but won't actually need to re-load the code.
// runs at cold start time, use for shared dependencies
const commonImport = require("common-import");
exports.myFunc = functions.https.onRequest((req, res) => {
  // runs only at invocation time but cached, use for unshared deps
  const expensiveImport = require("expensive-import");
});
For what it's worth, this particular type of problem is also something the Firebase team is actively investigating how to improve. You might consider signing up for the Firebase Alpha Program to receive word of early testing for such features.
There are some arcane solutions others go through:
const common = require("common-import");

let expensive1;
if (process.env.FUNCTION_NAME === "fn1") {
  expensive1 = require("expensive-import-1");
}

let expensive2;
if (process.env.FUNCTION_NAME === "fn2") {
  expensive2 = require("expensive-import-2");
}

exports.fn1 = functions.https.onRequest((req, res) => {
  res.send(expensive1.expensiveResult());
});

exports.fn2 = functions.https.onRequest((req, res) => {
  res.send(expensive2.expensiveResult());
});
This will load only the files you want on cold start. Those environment variables won't be present when we run your code locally to discover what functions to deploy, however, so you must always export all functions you want to deploy.
As Michael said earlier though, we have someone actively working on a much better solution that will feel like a breath of fresh air, so don't spend too much time optimizing this if you can wait a bit.
Related
I have a supplier app and a customer app both sharing a single Firestore.
The supplier functions require multiple imports:
import x from "X";
import y from "Y";
import z from "Z";
The customer functions also require multiple imports, but they're not the same as the supplier imports:
import a from "A";
import b from "B";
import c from "C";
So, is it possible to have two index.js files so that when a customer makes a request, import x from "X"; doesn't get called?
As mentioned in the thread:
Doing the expensive require from inside the function body does not
necessarily lead to slower function calls, as requires are cached
in-memory. Warm invocations (those that don't require a cold start)
will run the require line but won't actually need to re-load the code.
// runs at cold start time, use for shared dependencies
const commonImport = require("common-import");
exports.myFunc = functions.https.onRequest((req, res) => {
  // runs only at invocation time but cached, use for unshared deps
  const expensiveImport = require("expensive-import");
});
For what it's worth, this particular type of problem is also something
the Firebase team is actively investigating how to improve. You might
consider signing up for the Firebase Alpha Program to receive
word of early testing for such features.
A workaround is also provided there:
const common = require("common-import");

let expensive1;
if (process.env.FUNCTION_NAME === "fn1") {
  expensive1 = require("expensive-import-1");
}

let expensive2;
if (process.env.FUNCTION_NAME === "fn2") {
  expensive2 = require("expensive-import-2");
}

exports.fn1 = functions.https.onRequest((req, res) => {
  res.send(expensive1.expensiveResult());
});

exports.fn2 = functions.https.onRequest((req, res) => {
  res.send(expensive2.expensiveResult());
});
This will load only the files you want on cold-start. Those
environment variables won't be present when we run your code locally
to discover what functions to deploy, however, so you must always
export all functions you want to deploy.
For more information about how to manage multiple functions, you can refer to the documentation:
As you integrate Cloud Functions into your project, your code could
expand to contain many independent functions. You may have too many
functions to reasonably fit in a single file, or different teams may
deploy different groups of functions, creating a risk of one team
overwriting or accidentally deleting another team's functions. Cloud
Functions offers different ways to organize your code to make it
easier to navigate and maintain your functions.
My index.ts has:
exports.foo = functions.https.onCall(async (data, context) => {
console.log('Hello World');
return null;
});
To deploy, I run:
firebase deploy --only functions:foo
To test, I do:
final callable = FirebaseFunctions.instance.httpsCallable('foo');
await callable.call();
The first time, the function execution starts and my function body runs, but the second time (I don't know how it gets invoked), my function body doesn't run. Is this standard behavior, and am I also charged for the automatic second invocation?
NOTE: I've read several posts like this, this, this, and this before asking this question, but none of them seemed to work for me.
I've seen and more-or-less logged this too; for example, recently some "minimum instances = 1" functions seem to start up and run a few times a day, even though the function itself isn't invoked. I also see this at deploy time (I use some custom code that deploys multiple functions at a time).
The way "cold starts" work is that the runtime has to run the function files once to FIND and ASSIGN the functions within. This part used to run silently. It would be nifty if Google either went back to not logging this, or differentiated it in the logs.
I don't know Flutter, but you run it "as if you were in a browser". In addition, pressing a button usually submits something (I mean, it's not a GET request most of the time).
So, the combination of both, plus your issue, leads me to think of a preflight request. Check the HTTP verb before performing the processing.
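A minimal sketch of that verb check, with hand-rolled req/res stand-ins (no Express or Firebase involved) just to show the control flow:

```javascript
// Answer CORS preflight (OPTIONS) early; only do real work on POST.
function handler(req, res) {
  if (req.method === "OPTIONS") {
    res.status(204).send(""); // preflight: no body, no processing
    return;
  }
  if (req.method !== "POST") {
    res.status(405).send("Method Not Allowed");
    return;
  }
  res.status(200).send("processed");
}

// Tiny res stand-in for demonstration purposes.
function makeRes() {
  return {
    code: null,
    body: null,
    status(c) { this.code = c; return this; },
    send(b) { this.body = b; return this; },
  };
}

const preflight = makeRes();
handler({ method: "OPTIONS" }, preflight);
console.log(preflight.code); // 204: preflight answered without running the handler body

const post = makeRes();
handler({ method: "POST" }, post);
console.log(post.code); // 200: real request processed
```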
I have several NPM codebases that use Firestore. One is client-side, one is server-side, and I am trying to refactor some code to a common dependency codebase. The server codebase uses firebase-admin as its dependency, but if I try to set objects with sentinel types (e.g. firebase.firestore.Timestamp), I incur this error:
Please ensure that the Firestore types you are using are from the same NPM package
I can avoid mixing firestore implementations by injecting the instance into my library codebase, e.g.:
import * as admin from "firebase-admin";
const libraryCode = new MyFirestoreLibrary(admin.firestore())
But, are there ways to access these sentinel types in library code?
An example of what I'm hoping to make work is here: https://github.com/okhobb/firestore-dependency-tester
I agree this is extremely silly from the Firebase team. I'm in your exact situation, and have shared code where I define types and objects used by both client and server; sounds reasonable, right?
Here's my workaround:
import * as firebase from 'firebase-admin';
import Timestamp = firebase.firestore.Timestamp;
export const now = () => {
  const newTimestamp = Timestamp.fromMillis(Date.now()) as any;
  const ret = JSON.parse(JSON.stringify(newTimestamp));
  for (const prop of Object.getOwnPropertyNames(Timestamp.prototype).filter(prop => prop !== "constructor")) {
    ret[prop] = newTimestamp[prop];
  }
  return ret as Timestamp;
};
This just creates a new object and copies the properties so that the toDate, toMillis, etc. functions are all still there; it's just not a direct instance of Timestamp, so any instanceof checks will return false.
After doing that awful hack, it seems like the check they have in place to make sure "you are importing from the same npm package" no longer works.
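The trick generalizes to any class. A self-contained illustration with a plain Point class standing in for Timestamp: the clone keeps the prototype's methods but fails instanceof checks:

```javascript
class Point {
  constructor(x, y) { this.x = x; this.y = y; }
  norm() { return Math.sqrt(this.x * this.x + this.y * this.y); }
}

function detach(obj, proto) {
  // Plain-object copy of the data...
  const ret = JSON.parse(JSON.stringify(obj));
  // ...plus the prototype's methods copied over as own properties.
  for (const prop of Object.getOwnPropertyNames(proto).filter((p) => p !== "constructor")) {
    ret[prop] = proto[prop];
  }
  return ret;
}

const p = detach(new Point(3, 4), Point.prototype);
console.log(p.norm()); // 5: the method still works on the copied data
console.log(p instanceof Point); // false: no longer a direct instance
```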
Discussions on the asynchronous features of a program generally move toward things like Futures, Promises, etc., which in turn involve a multi-threaded environment.
Is it possible to write an asynchronous program without resorting to multiple threads?
You can't have async without multiple workers.
Even when you don't control them directly (e.g. in NodeJS), they still exist in the background. So in those languages, you can use them without explicitly working with threads/forks.
Eg:
var fs = require("fs");

fs.readFile("example.txt", "utf8", function (err, data) {
  if (!err) {
    console.log(data);
  }
}); // 'fs.readFile' is async

console.log("something else"); // This runs right after the line above; most likely the file hasn't been read yet.
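That scheduling is observable directly: synchronous code runs to completion first, then microtasks (promise callbacks), then timers, all on one thread. A small sketch:

```javascript
const order = [];
order.push("start");

// Macrotask: runs on a later event-loop turn, even with delay 0.
setTimeout(() => {
  order.push("timeout");
  console.log(order.join(" -> ")); // start -> end -> promise -> timeout
}, 0);

// Microtask: runs after the current synchronous code, before timers.
Promise.resolve().then(() => order.push("promise"));

order.push("end");
```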
I have recently begun coding in Redux.
Before Redux, with AngularJS, it was easy to map models to state using $localStorage. I just can't figure out the best way to do that with Redux.
Should I be dispatching an action and asking reducers to read local storage in my code?
Or should I let local storage be managed through a global object?
There are a few ways.
Just note that syncing to localStorage requires calling JSON.stringify, which is quite expensive, so don't do it often or with large data structures, as it might hurt the app's performance.
1) Sync the whole Redux store to localStorage. You can use an existing solution for that, e.g. https://github.com/elgerlambert/redux-localstorage
I would not recommend syncing the whole store, as you might also sync state that should not be persisted after a refresh, and you might make the application slower. For better performance, you can use the paths argument in the above library, or use one of the other options.
To see how you can build such functionality manually, there is a great explanation video from Dan: https://egghead.io/lessons/javascript-redux-persisting-the-state-to-the-local-storage
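A minimal, self-contained sketch of that manual approach, with a hand-rolled store and an in-memory stand-in for localStorage so it runs anywhere (in a browser you would use window.localStorage, and likely throttle saveState):

```javascript
// In-memory stand-in for localStorage.
const storage = {
  data: {},
  getItem(k) { return k in this.data ? this.data[k] : null; },
  setItem(k, v) { this.data[k] = String(v); },
};

const loadState = () => {
  const serialized = storage.getItem("state");
  return serialized === null ? undefined : JSON.parse(serialized);
};

const saveState = (state) => {
  // JSON.stringify is the expensive part; throttle this in real apps.
  storage.setItem("state", JSON.stringify(state));
};

// Minimal Redux-like store, just enough for the demo.
function createStore(reducer, preloaded) {
  let state = preloaded === undefined ? reducer(undefined, { type: "@@INIT" }) : preloaded;
  const listeners = [];
  return {
    getState: () => state,
    dispatch(action) { state = reducer(state, action); listeners.forEach((l) => l()); },
    subscribe(l) { listeners.push(l); },
  };
}

const counter = (state = 0, action) => (action.type === "INCREMENT" ? state + 1 : state);

const store = createStore(counter, loadState()); // rehydrate on startup
store.subscribe(() => saveState(store.getState())); // persist on every change

store.dispatch({ type: "INCREMENT" });
console.log(storage.getItem("state")); // "1"
```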
2) Manually build a simple cache middleware, like the one below, which can catch specific actions you would like to sync with localStorage:
const cacheMiddleware = store => next => action => {
  if (action.type !== 'GET_SOMETHING') {
    return next(action);
  }
  const cached = getFromLocalstorage();
  if (!cached) {
    // Fetch and put into localStorage for later use
    const data = fetchFromServer();
    saveToLocalstorage(data);
    return next({ type: 'SERVER_RESULT', data });
  }
  return next({ type: 'CACHED_RESULT', data: cached });
};
3) If you are using Redux Thunk, you can perform the caching there, as you are allowed to have side effects in actions.
You can find more info about Redux middleware here: https://redux.js.org/advanced/middleware
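For completeness, a sketch of the thunk-style caching from option 3, with stand-ins for the cache and the server call (a thunk here is just a function that receives dispatch; no Redux is involved in the demo):

```javascript
const cache = new Map(); // stand-in for localStorage

// Fake synchronous server call, purely for the demo.
const fetchFromServer = (id) => ({ id, name: "item-" + id });

// Thunk: a function that redux-thunk would call with dispatch.
const getItem = (id) => (dispatch) => {
  if (cache.has(id)) {
    dispatch({ type: "CACHED_RESULT", data: cache.get(id) });
    return;
  }
  const data = fetchFromServer(id);
  cache.set(id, data); // side effect: store for later invocations
  dispatch({ type: "SERVER_RESULT", data });
};

// Minimal dispatch stand-in that just records actions.
const dispatched = [];
const dispatch = (action) => dispatched.push(action);

getItem(7)(dispatch); // first call hits the "server"
getItem(7)(dispatch); // second call hits the cache
console.log(dispatched.map((a) => a.type)); // [ 'SERVER_RESULT', 'CACHED_RESULT' ]
```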