GCP Storage usage from Firebase Cloud Functions

Since the Node.js 10 changes to Cloud Functions, one thing that we've come to accept is a default storage bucket in our GCP project that accrues a cost of ~$0.03 a month. It is believed to hold Docker cache files for the Cloud Functions, their node modules, and Hosting revisions.
But on an almost empty project, how can you have 535MB in storage when the source is only 83MB total?
no hosting
2 cloud functions
What are the contents of these files? Is it overhead for the revision history? Is there any reasonable way to minimize it? I get asked these questions many times and I don't have an answer that I feel comfortable with.

It's the entire container image, so I imagine it contains a lot of things that you don't deploy yourself, like the operating system the code runs on and the runtime of the language you use (Node.js if you're deploying with Firebase).
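If you want to see what is actually taking up the space, a minimal sketch like the one below lists the objects in the artifacts bucket and their sizes. The bucket name pattern and project ID here are assumptions (the bucket may also be prefixed with a region such as `us.`), so check the actual bucket name in your Cloud Console first.

```typescript
// List the objects in the Cloud Functions artifacts bucket to see what is using the space.
// Assumes the bucket is named "artifacts.<project-id>.appspot.com"; depending on your
// project it may be region-prefixed, e.g. "us.artifacts.<project-id>.appspot.com".
import { Storage } from "@google-cloud/storage";

async function listArtifacts(projectId: string): Promise<void> {
  const storage = new Storage({ projectId });
  const bucket = storage.bucket(`artifacts.${projectId}.appspot.com`);

  const [files] = await bucket.getFiles();
  let totalBytes = 0;

  for (const file of files) {
    const size = Number(file.metadata.size ?? 0);
    totalBytes += size;
    console.log(`${(size / 1024 / 1024).toFixed(2).padStart(8)} MB  ${file.name}`);
  }

  console.log(`Total: ${(totalBytes / 1024 / 1024).toFixed(2)} MB across ${files.length} objects`);
}

listArtifacts("my-project-id").catch(console.error); // "my-project-id" is a placeholder
```

In practice most of those bytes are container image layers rather than your own source, which is why the total is much larger than the deployed code.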

Related

Google Cloud Composer: Save on costs

I am trying to figure out how to save on costs with Google Cloud Composer. Is there any way to spin down the server when none of your DAGs are running, and then spin it up again when a DAG needs to run?
It's costing way too much, since even when my DAGs are not running the server stays up and we keep getting charged.
Thanks,
For now, there is no way to enable/disable a Composer environment. Saving money on a server that is not in use would require a feature similar to autoscaling, for which a request has already been filed.
On Medium you can find a lot of useful information about saving costs.
One way to control your costs in Cloud Composer is autoscaling: the number of nodes in the underlying GKE cluster can be set to autoscale (follow this guide). Keeping the Composer environment small and the running times short is best practice.
Cloud Composer charges for the compute resources allocated to an environment, and its components keep running even when no DAGs are deployed. There isn't much you can reduce or turn off, so you may want to consider other services, such as Dataflow, which is serverless.
I hope you find the above pieces of information useful.
You can now take a snapshot from the GUI or the v1beta API and then delete the environment. When you want to work on it again, simply create a new environment and load the snapshot from GCS via the GUI or the API. Creation and snapshot operations may take 20-30 minutes.
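As a rough sketch of that snapshot-then-delete flow, assuming the Node.js `googleapis` client exposes the v1beta1 `saveSnapshot` method with a `snapshotLocation` field (check the API reference, since the exact surface may differ), and with placeholder project, location, environment, and bucket names:

```typescript
// Sketch: kick off a Cloud Composer environment snapshot so the environment can
// later be deleted and recreated from the snapshot. All names are placeholders and
// the method/field names are assumptions based on the v1beta1 REST API.
import { google } from "googleapis";

async function snapshotEnvironment(): Promise<void> {
  const auth = new google.auth.GoogleAuth({
    scopes: ["https://www.googleapis.com/auth/cloud-platform"],
  });
  const composer = google.composer({ version: "v1beta1", auth });

  const environment =
    "projects/my-project/locations/us-central1/environments/my-env"; // placeholder

  // Starts a long-running operation (the answer above notes 20-30 minutes).
  const saveOp = await composer.projects.locations.environments.saveSnapshot({
    environment,
    requestBody: { snapshotLocation: "gs://my-snapshots-bucket/composer" }, // placeholder
  });
  console.log("Snapshot operation started:", saveOp.data.name);

  // Only after that operation completes would you delete the environment, e.g.:
  // await composer.projects.locations.environments.delete({ name: environment });
}

snapshotEnvironment().catch(console.error);
```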

How to use Google App Engine in conjunction with Firebase

I've been using Firebase for a long time and it's great, however I need more power for certain things.
At the moment, I have a function in Cloud Functions for Firebase to do some video processing and I need more power. I have heard Google's App Engine is better for this kind of solution and I've been experimenting with App Engine with my Google Cloud project.
When I went to deploy my first Node.js app to App Engine, it seemed as if it was going to overwrite my existing functions. I have lots of functions and Firebase code that let my app and my website talk to one another, so I'd like to leave my Firebase stuff alone.
I'm just wondering how I'd use App Engine (and maybe some other higher-end Google Cloud products) alongside Firebase in the same project, without one interfering with or changing the other.
App Engine is an entirely different product than Cloud Functions. Anything you deploy to App Engine will not affect what you've already deployed to Cloud Functions, and the same applies in reverse. You can use both products in tandem with no conflicts.
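As a concrete illustration (route, collection, and field names below are placeholders, not taken from the question): a minimal App Engine service can live in the same project and do the heavy video processing, deployed with `gcloud app deploy`, while `firebase deploy --only functions` continues to manage the existing functions separately.

```typescript
// Minimal App Engine (Node.js standard environment) service living in the same
// GCP/Firebase project as your Cloud Functions. Deployed with `gcloud app deploy`,
// it does not touch anything deployed with `firebase deploy --only functions`.
import express from "express";
import * as admin from "firebase-admin";

admin.initializeApp(); // uses the project's default credentials on App Engine

const app = express();
app.use(express.json());

app.post("/process-video", async (req, res) => {
  const { videoId } = req.body; // placeholder payload

  // ... do the CPU-heavy processing here ...

  // The Admin SDK talks to the same Firebase project, so results are visible
  // to your existing functions and clients (Firestore used here as an example).
  await admin.firestore().collection("videos").doc(videoId).update({ processed: true });
  res.json({ ok: true });
});

const port = Number(process.env.PORT) || 8080;
app.listen(port, () => console.log(`Listening on ${port}`));
```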
Here you can find more details on the serverless environment options you have. It's worth reading about each of them and choosing whichever fits your needs best.

Firebase Functions: is it OK to divide functions to multiple projects

This is in relation to the question here: Google Cloud / Firebase Functions, handling dependencies per function
So, to manage dependencies better, is it allowed to divide the functions across as many projects as we see fit?
This would create one "master" project that contains the data in the database and storage, plus projects that are otherwise empty and contain only certain functions.
Think of projects like the following: My Awesome App, My Awesome App Stats Api, My Awesome App Admin Api, etc.
It depends on what kinds of functions you're writing.
If you want to write database triggers, they have to be in the same project as the database that's receiving the writes. You can't have a second project respond to writes from the database in the first project.
If you want to write HTTP triggers, you can init the admin SDK to point to different projects for querying and such.
I don't particularly see any need to "shard" your functions like this in a production environment. Cloud Functions will scale your functions as needed to handle the load, and having different functions in different projects shouldn't make a difference in that respect.
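To illustrate the HTTP trigger case above, here is a hedged sketch of a function deployed in a "satellite" project that initializes the Admin SDK against the master project's database; the database URL and paths are placeholders, and the satellite project's service account would need to be granted access to the master project.

```typescript
// HTTP function deployed in a "satellite" project that reads the master
// project's Realtime Database via the Admin SDK. URL and path are placeholders.
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

const masterApp = admin.initializeApp(
  {
    credential: admin.credential.applicationDefault(),
    databaseURL: "https://my-awesome-app.firebaseio.com", // master project's DB (placeholder)
  },
  "master" // named app, so it doesn't clash with the default app
);

export const statsApi = functions.https.onRequest(async (req, res) => {
  const snapshot = await masterApp.database().ref("/stats/daily").once("value");
  res.json(snapshot.val());
});
```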

Deploying WordPress as AWS Lambda functions?

I am wondering if it is feasible to deploy WordPress as a series of Lambda functions behind AWS API Gateway. Any pointers on the feasibility/gotchas would be greatly appreciated!
Thanks in advance,
PKK
You'll have a lot of things to consider with persistence, and even before that, Lambda doesn't natively support PHP. I'd probably look at Microsoft Azure Functions instead, which do support PHP and do have persistent storage.
While other languages (such as Go, Rust, Swift etc.) can be "wrapped" to run in AWS Lambda with relative ease, compiling PHP targeting the same platform and running it is a bit different (and certainly more painstaking). Think about all the various PHP modules you'd need for starters. Moreover, I can't imagine performance will be as good as something like a Go binary.
If you can do something clever with the Phalcon framework and come up with an easy build and deploy process, then maayyyybee.
Though, you'd probably need to really overhaul something like WordPress, which was not designed for this at all. It still uses some pretty old conventions due to the age of the project, and while that is all well and good on your typical PHP server, it's a different ball game for this kind of "portable" PHP installation.
Keep in mind that PHP sessions are relied upon as well and so you're going to need to move those elsewhere due to the lack of persistence with AWS Lambda. You can probably find some sort of plugin for WordPress that works with Redis?? I have to imagine something like that has been built by now... But there will be many complications.
I would seriously consider using Azure Functions to begin with OR using Docker and forgoing the pricing model that cloud functions offers. You can still find some pretty cheap and scalable hosting out there.
What I've done previously was use AWS ECS (Docker) with EFS (network storage) for persistence and RDS for the database. While this doesn't carry the same pricing model as Lambda, it is still cost efficient. You can set up your ECS Service to autoscale up and down. So that way you're running the bare minimum until you need more.
I've written a more in depth article about it here: https://serifandsemaphore.io/how-to-host-wordpress-like-a-boss-b5993fcfbd8e#.n6fbnf8ii ... but it's basically just the idea of running WordPress in Docker and using EFS to offload the persistent storage issues. You can swap many of the pieces of the puzzle out if you like. Use a database hosted in some other Docker service or Compose or where ever. That part need not be RDS for example. Even your storage could be handled in a different way, though EFS worked pretty well! The only major thing to note about EFS is the write speed. Most WordPress sites are read heavy though. Your mileage will vary depending on your needs.
Is it possible? Yes, anything is possible with enough time and effort. Is it worth it? That is a question best to ask yourself.
PHP can be run on Lambda as per the documentation located here: https://aws.amazon.com/blogs/compute/scripting-languages-for-aws-lambda-running-php-ruby-and-go/.
The bigger initial problem as stated in other comments is a persistent file system. S3 for media storage is doable via Wordpress plugin (again from the comments) but any other persistent storage for the request / script execution is the initial biggest hurdle. Tackle one problem at a time till you get to the end!
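As a rough illustration of that "wrapper" approach, a Node.js Lambda handler (sketched in TypeScript below) can shell out to a PHP binary bundled with the deployment package. The `./php` binary and `handler.php` script are assumptions, and you would have to compile PHP for Amazon Linux yourself; newer options such as custom runtimes or the Bref layers make this less painful.

```typescript
// Sketch of the "wrapper" technique: a Node.js Lambda handler that invokes a
// PHP CLI binary bundled in the deployment package. "./php" and "handler.php"
// are placeholders; the PHP binary must be compiled for the Lambda environment.
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const execFileAsync = promisify(execFile);

export const handler = async (event: unknown) => {
  // Pass the API Gateway event to the PHP script on argv and capture stdout.
  const { stdout } = await execFileAsync("./php", [
    "handler.php",
    JSON.stringify(event),
  ]);

  return {
    statusCode: 200,
    headers: { "Content-Type": "text/html" },
    body: stdout,
  };
};
```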

How do I design the file storage issue?

I am working on an application that creates video files and stores them in a folder on the C:\ drive. I expect there will be a large number of these files in the future, and at some point we would run out of disk space on our VPS. When the time comes to upgrade, we plan either to store the files with one of the cloud providers or to have our existing provider add another disk (say, a D:\ drive).
Either way, I want to design the app now so that moving to a different location in the future is not an issue and is transparent to the end user.
The code that creates these files supports two output modes:
myObj.SetOutputToDisk(<path to store>); or
myObj.SetOutputToMemoryStream(ms);
If we go with the Cloud architecture, I assume we might have the following combination:
Cloud Files + Existing VPS or
Cloud Files + Cloud Windows Server
Given the unknowns at this time, how would I go about designing this?
Serve the files up from a subdomain. Say: media.yourdomain.com.
That way, you can trivially repoint DNS records to the new storage provider at some point in the future.
Also, I'd recommend storing the media files on a physical disk separate from the OS disk. So have a D:\ drive and store the media there.
You might want to look at the Managed Extensibility Framework as a way of adding extensions to your app for new storage methods without the need to rebuild the whole thing.
You need some way to record the storage location and method used; I'd expect some kind of database store that you could migrate to the cloud later if required.
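To make that concrete, the usual shape is a small storage-provider interface plus a per-file record of which provider and key each file was written with. This is only a sketch, written in TypeScript for brevity (the same shape works in .NET, for example behind MEF), and every name in it is hypothetical.

```typescript
// Hypothetical storage abstraction: the app writes through IVideoStore and records
// the provider + key per file, so moving from local disk to a cloud bucket later
// only means adding a new implementation and updating the records.
import { promises as fs } from "node:fs";
import * as path from "node:path";

interface IVideoStore {
  readonly provider: string;                      // e.g. "local-d-drive", "cloud-files"
  save(key: string, data: Buffer): Promise<void>;
  getUrl(key: string): string;                    // what media.yourdomain.com would serve
}

class LocalDiskStore implements IVideoStore {
  readonly provider = "local-disk";
  constructor(private rootDir: string, private baseUrl: string) {}

  async save(key: string, data: Buffer): Promise<void> {
    const target = path.join(this.rootDir, key);
    await fs.mkdir(path.dirname(target), { recursive: true });
    await fs.writeFile(target, data);
  }

  getUrl(key: string): string {
    return `${this.baseUrl}/${key}`;
  }
}

// On save, persist { videoId, provider: store.provider, key } in your database;
// on read, look up the record and ask the matching store for the URL.
```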
Your question is very vague and you haven't put much work in yourself, so you are unlikely to get the level of detail you are hoping for in the answers. At least try to implement the system, and then ask specific questions about the issues you run into.
