Regional webhook fulfillment for Google Assistant apps via Dialogflow

Now that localization is available for Actions on Google, I'm wondering how to minimize latency for user responses. In a stack where an app uses Dialogflow for NLP and a Google Cloud Function behind that for webhook fulfillment, is it possible to host region-specific cloud functions for each localized experience established in Actions on Google? In other words, can we have English-US responses served by a cloud function hosted in us-central1, and English-GB responses served via europe-west1 Google datacenters?
With Alexa and ASK you can set up a language with 'geographical region endpoints' in North America, Europe and India, and have Lambda functions fulfill intents from users in those regions.
Dialogflow itself may not be available in multiple GCP regions, but if it is, it would be nice to keep fulfillment local, with Google Cloud Functions close to where the user is, if that actually minimizes conversation delays. Maybe someone has data to suggest it doesn't matter and is fast enough either way.
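For illustration, a minimal sketch of the Cloud Functions side, assuming each localized agent (or a proxy in front of it) can be pointed at a different webhook URL; the function names and regions are illustrative assumptions, not a confirmed Actions on Google feature:

```typescript
import * as functions from "firebase-functions";
import type { Response } from "express";

// Shared fulfillment logic; a real handler would parse the Dialogflow
// webhook request instead of returning a canned response.
const handleFulfillment = (
  req: functions.https.Request,
  res: Response
): void => {
  res.json({ fulfillmentText: "Hello from this region!" });
};

// en-US agents would point their webhook URL at this deployment...
export const fulfillmentUs = functions
  .region("us-central1")
  .https.onRequest(handleFulfillment);

// ...and en-GB agents at this one.
export const fulfillmentEu = functions
  .region("europe-west1")
  .https.onRequest(handleFulfillment);
```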

Related

Cloud Functions - DDoS protection with max instances cap + node express rate limiter?

I've been using Cloud Functions for a while and it's been great so far, though there seems to be no built-in way to set limits on how often a function is invoked.
I've set the max number of instances to a reasonable value, but Firebase doesn't really provide a way to cap the number of invocations. Would using a Node package that limits or slows down requests, combined with the capped max instances, be sufficient to slow down attacks if they happen?
I also know Cloud Endpoints exists. I'm pretty new to OpenAPI, and it seems like something that should just be integrated with Functions at an additional cost, but I'm wondering if that would be a good solution too.
Pretty new to all this, so I appreciate any help!
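As a sketch of the combination being described (instance cap plus an in-process limiter), here's roughly what it could look like with the express-rate-limit package; the specific limits are assumptions, and since each instance keeps its own counters, this slows attacks down rather than enforcing a hard global cap:

```typescript
import * as functions from "firebase-functions";
import express from "express";
import rateLimit from "express-rate-limit";

const app = express();

// Cloud Functions runs behind Google's proxies; trust the first hop so
// the limiter keys on the real client IP rather than the proxy's.
app.set("trust proxy", 1);

// Counters live in each instance's memory, so the effective global limit
// is roughly (max x number of running instances).
app.use(
  rateLimit({
    windowMs: 60 * 1000, // one-minute window
    max: 30, // requests per IP per window, per instance
  })
);

app.get("/", (req, res) => {
  res.send("OK");
});

// maxInstances caps horizontal scale as a second, coarser layer.
export const api = functions
  .runWith({ maxInstances: 5 })
  .https.onRequest(app);
```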
If you use only Google Cloud services (I don't know what other cloud providers offer to solve your issue, or whether an existing framework addresses it), you can limit the unwanted access at several layers.
Firstly, Google Front End (GFE) protects all Google resources (Gmail, Maps, Cloud, your cloud functions, ...), especially against common layer 3 and layer 4 DDoS attacks. This layer is also in charge of establishing TLS connections and will discard bad connections.
Secondly, activate the "private mode". This mode forbids unauthenticated requests. With this feature, Google Front End checks that:
- an id_token is present in the request header,
- the token is valid (correct signature, not expired),
- the identity in the token is authorized to access the resource.
Only valid requests reach your service, and you pay only for those. All the bad traffic is processed, and "paid for", by Google.
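Callers of a private function then have to present a valid ID token. For a trusted server-side caller, a minimal sketch with google-auth-library could look like this (the URL is a placeholder):

```typescript
import { GoogleAuth } from "google-auth-library";

// Calls a private (authenticated-only) Cloud Function. getIdTokenClient
// mints an ID token with the function URL as the audience and attaches
// it to every request it makes.
async function callPrivateFunction(url: string): Promise<unknown> {
  const auth = new GoogleAuth();
  const client = await auth.getIdTokenClient(url);
  const res = await client.request({ url });
  return res.data;
}

// Placeholder endpoint; substitute your own function's URL.
callPrivateFunction("https://us-central1-my-project.cloudfunctions.net/api")
  .then(console.log)
  .catch(console.error);
```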
Thirdly, use a load balancer with Cloud Armor activated. You can also customize your WAF policies if you need them. Put it in front of your Cloud Functions thanks to the serverless NEG feature.
Finally, if you use API keys, you can use Cloud Endpoints (or API Gateway, a managed version of Cloud Endpoints), where you can enforce rate limits per API key. I wrote an article on this (Cloud Endpoints + ESPv2).

Firebase "Blaze" projects limit, multiple env and Spark plan outbound requests

I'm working with Firebase and quite enjoying it so far.
I'm working with DEV, PREPROD and PROD environments for each of my projects. For each environment I've had to create a distinct Firebase project.
Since my app uses Algolia and the Cloud Vision API, I apparently have to be on the Blaze plan, because the Spark plan doesn't allow outbound requests or Cloud Vision API calls (if I'm correct).
The thing is, we're limited in the number of Blaze projects we can have at the same time. Above a certain amount (6 or 7, I think) we have to request a "billing quota increase" and explain why we need more (sounds odd, but OK).
So I did, but now Firebase is asking for a $50 transaction to increase the number of Blaze projects I can have.
So I have several questions:
- Am I right to think that on the Spark plan I can't call the Algolia API or the Cloud Vision API from my cloud functions?
- Is this $50 a payment to unlock new project slots, or just credit that will be available if needed?
- If I need even more projects in the future, will I have to pay for even more credit?
- How am I supposed to handle separate environments on Firebase without creating a different project each time?
Thanks a lot
On the Spark plan, with Cloud Functions, you can only make outgoing connections to services that Google fully controls. Algolia will not work.
Please read the FAQ regarding the number of projects you may have and the payment requested when creating new projects:
Why am I being asked to make a payment for more projects?
You may be asked to make a payment if your request for more projects indicates that you need projects that will use paid cloud services. The payment can be applied to any charges you incur in the future and will be visible as a credit in your account.
This payment is required to ensure paid services will be available for the projects you requested in the quota increase request form. This is a common requirement, because Google Cloud Platform services are paid (e.g., Compute Engine, Cloud SQL, and BigQuery).
The payment required varies depending on your billing history, the use cases described in your request form, the number of projects you request, and other factors.
So, the $50 you are being asked to pay will be applied as credit to your project billing.
You should definitely create a new project for each environment.
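To keep a single codebase working across those per-environment projects, one option is to derive the environment from the project ID at runtime. A sketch, assuming a my-app / my-app-dev / my-app-preprod naming convention (the suffixes are illustrative):

```typescript
// The Cloud Functions runtime exposes the current project ID via the
// GCLOUD_PROJECT environment variable.
const projectId = process.env.GCLOUD_PROJECT ?? "";

type Env = "dev" | "preprod" | "prod";

// Assumes projects are named like my-app-dev / my-app-preprod / my-app.
function currentEnv(): Env {
  if (projectId.endsWith("-dev")) return "dev";
  if (projectId.endsWith("-preprod")) return "preprod";
  return "prod";
}

// Example: pick a per-environment Algolia index name.
const algoliaIndex = `products_${currentEnv()}`;
```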

Testing outbound requests from Firebase cloud functions on free tier

Each dev on our team is setting up a Firebase project to work with 'locally'. Due to the outbound-request restriction on the free tier, the implication is that non-Google services called from our functions cannot actually be reached to validate that the functions work as expected.
Right now the best I can come up with is determining which environment a cloud function is running in (e.g. local, master, prod, etc.) and, if not on a paid tier, faking the outbound service's response.
Is there a better way to do this? Ideally we would like to be able to have a fully functional cloud function for each dev.
As you said, faking the third-party service is one solution.
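A rough sketch of that approach, gated on an environment flag (the FAKE_OUTBOUND variable and the canned payload are illustrative assumptions; the global fetch assumes a Node 18+ runtime):

```typescript
import * as functions from "firebase-functions";

// On Spark-plan projects, calls to non-Google services fail, so return
// a canned response instead of hitting the real API.
async function fetchFromThirdParty(query: string): Promise<unknown> {
  if (process.env.FAKE_OUTBOUND === "true") {
    return { hits: [{ id: "fake-1", note: `stubbed result for "${query}"` }] };
  }
  // Real outbound call; only works on the Blaze plan.
  const res = await fetch(
    `https://example.com/search?q=${encodeURIComponent(query)}`
  );
  return res.json();
}

export const search = functions.https.onRequest(async (req, res) => {
  res.json(await fetchFromThirdParty(String(req.query.q ?? "")));
});
```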
Another could be using Google's 12-month, $300 credit at https://cloud.google.com/free/
That makes it possible to have a single Google account with $300 in credits; you can then change your Firebase projects from the free plan to Blaze.
If needed, you can also set up budget alerts, which notify you when you spend a certain amount of credit.

Where is the Firebase data center located?

We are defining our privacy policy, and customers (especially in the EU) are concerned about where their data is located.
Is this still the right answer or did anything change since 2014?
http://grokbase.com/t/gg/firebase-talk/14axy4z42p/firebase-where-is-my-data-stored
At the Google DevFest in Amsterdam, I heard from Frank van Puffelen, who presented Firebase to us, that at that time (10/10/2015) they had a data center in the US and were planning to open one in Europe.
I'd like to give you more details, but that's all I know.
I'm also hoping they will open a data center in the EU, since there are new laws about where data collected from EU citizens can be stored, and the Safe Harbour framework will become irrelevant, from what I understand (correct me if I'm wrong).
According to the official FAQ, certain services are US-only. For example:
- Firebase Realtime Database
- Cloud Firestore for Firebase
The rest of the services are claimed to be Global and can be hosted in any Google data center, or even a data center run by a selected external party, e.g.:
- Cloud Storage for Firebase
- Cloud Functions for Firebase
The FAQ also mentions that Cloud Firestore will soon become available globally:
Note: Though currently US-only, Cloud Firestore will soon be available at all Google Cloud Platform locations.
According to the Google guide Set a project location, you can choose either a multi-regional location or a regional location. Multi-regional locations replicate data across multiple regions for higher availability, but are only available in the US (as of writing). Regional locations keep your data within the region and offer lower write latency, but if that region experiences e.g. a power outage, your service becomes unavailable. However, for regional locations, a data center in Frankfurt, Germany is available (again, as of writing).
Side-Note: Cloud Firestore and Cloud Storage will be within the specified region, but Cloud Functions will always be in us-central1 (as of writing).

Google Cloud Endpoints

Is it possible to expose and sell a Google Cloud Endpoints API? I have created a simple but useful cloud endpoint, and I want paying customers to access it directly as an API. How would I create a client ID or API key dynamically for such clients? For example, Google also sells its search service as an API, where any user can generate their own API key and secret and start using the search service.
Right now, no, or at least not without a lot of work.
The current product was designed with the "same party" use case as the primary goal (the API producer and consumer are the same). There are a number of things that would need to be added to the product to enable the kind of use you're describing. First and foremost on that list would be some kind of API consumer dashboard (like the one Google offers developers for consumers of its APIs).
Endpoints is built on the same API infrastructure as the rest of Google's APIs, and Google does offer this feature on some of its APIs. That may give you a sense of where the product is headed in future iterations.
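Until then, a do-it-yourself stopgap (purely illustrative, not an Endpoints feature) is to issue and check API keys yourself, for example against a Firestore collection:

```typescript
import * as functions from "firebase-functions";
import * as admin from "firebase-admin";

admin.initializeApp();

// Illustrative scheme: each document ID in this collection is an issued key.
const keys = admin.firestore().collection("apiKeys");

export const paidApi = functions.https.onRequest(async (req, res) => {
  const apiKey = req.get("x-api-key");
  if (!apiKey) {
    res.status(401).send("Missing API key");
    return;
  }
  const doc = await keys.doc(apiKey).get();
  if (!doc.exists || doc.get("revoked") === true) {
    res.status(403).send("Invalid API key");
    return;
  }
  // Authorized: serve the paid functionality here.
  res.json({ ok: true });
});
```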
