Google Cloud Functions do not work with IFTTT - firebase

I created an IFTTT applet using a YouTube "liked video" trigger with a webhook (HTTPS Cloud Function) as the action. I get an error on IFTTT saying there was a problem with the Webhooks service. I tested it with a https://requestb.in URL and IFTTT works with it. Is there something I'm missing to get a webhook working with Cloud Functions? I have already enabled billing for the function's project.

I should have checked the function's logs to begin with; I assumed it never ran. There was an error in my code: I was using JSON.parse() in a way that threw an exception. Now it is functional.
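The original code isn't shown, but a common cause of this exact failure is a minimal sketch like the following (function name and payload handling are illustrative, not from the question): in Node.js HTTPS Cloud Functions a JSON request body is already parsed into an object, so calling JSON.parse(req.body) on it throws and the function returns a 500 to IFTTT.

const functions = require('firebase-functions');

exports.iftttWebhook = functions.https.onRequest((req, res) => {
  // req.body is already an object when Content-Type is application/json;
  // only parse manually if the payload arrives as a raw string.
  const payload = typeof req.body === 'string' ? JSON.parse(req.body) : req.body;
  console.log('IFTTT payload:', payload);
  res.status(200).send('OK');
});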

Related

Execute a Cloud Function from Cloud Data Fusion

I'm trying to trigger a Cloud Function (trigger type: HTTP) from a Cloud Data Fusion pipeline using the HTTP sink plugin version 1.2.2. However, I receive the SSL error
java.io.EOFException: SSL peer shut down incorrectly
How do I fix this?
Any help is appreciated, Thanks
To my understanding, it is currently not possible to execute a Cloud Function from Cloud Data Fusion using the HTTP sink plugin. This is because you need an OIDC token, which must be generated dynamically at runtime since tokens have an expiration date. This is what is explained in this post. As explained there, the token should then be added to the header of the request. To generate this token, you need to run the gcloud auth print-identity-token command, which you can't do from Data Fusion.
The only workaround I see is to publish a Pub/Sub message at the end of the pipeline to trigger the Cloud Function (but don't take this as the definitive solution, because I would need more context on the precise use case). A sketch of minting the token programmatically from an intermediate service is shown below.
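For reference, an intermediate Node.js service (not the Data Fusion plugin itself, which cannot run code like this) could generate the OIDC identity token with google-auth-library instead of the gcloud CLI. The function URL here is a placeholder:

const {GoogleAuth} = require('google-auth-library');

async function callCloudFunction() {
  const url = 'https://REGION-PROJECT.cloudfunctions.net/myFunction'; // placeholder
  const auth = new GoogleAuth();
  // getIdTokenClient mints an ID token for the given audience and attaches it
  // to outgoing requests, refreshing it before it expires.
  const client = await auth.getIdTokenClient(url);
  const res = await client.request({url});
  console.log(res.data);
}

callCloudFunction().catch(console.error);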

Firebase Emulator Cloud Functions + PubSub subscribe to production topics

There is a service that is publishing messages to my Pub/Sub topic. Via the CLI, I know it is receiving the messages properly.
I want to react to those messages accordingly. However, I want to develop my subscription, via Cloud Functions, in the development environment (Firebase Emulator), so I won't have to wait five minutes between each deploy. But when using functions.pubsub.topic('topicName').onPublish(...), it won't subscribe to the real prod messages; it looks like it only subscribes to the dev-environment ones.
I want my emulated Firebase Pub/Sub function to subscribe to prod messages. Is it possible to do this? How?
Still haven't found an "official" way.
What I am doing for now is using ngrok: get the local function URL and enter it in a Pub/Sub subscription in push mode. It's a longer way and requires updating the ngrok URL for each session (its URL changes on the free tier). To get the data I use JSON.parse(Buffer.from(req.body.message.data, 'base64').toString('utf-8')), and I still haven't found a way to verify the JWT auth on the request.
But I can now get the prod messages in my Firebase Emulator, as I wanted. You may comment here to ask for further info about this workaround of mine; a rough sketch is below.
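A rough sketch of that workaround, assuming an HTTPS function exposed through ngrok and set as the push endpoint of the Pub/Sub subscription (names are illustrative; the JWT verification mentioned above is still missing here):

const functions = require('firebase-functions');

exports.pubsubPushHandler = functions.https.onRequest((req, res) => {
  // Pub/Sub push delivery wraps the payload in req.body.message.data as base64.
  const data = JSON.parse(
    Buffer.from(req.body.message.data, 'base64').toString('utf-8')
  );
  console.log('Prod Pub/Sub message:', data);
  // Acknowledge the push so Pub/Sub does not retry delivery.
  res.status(204).send();
});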

Fulfillment URL is not valid in Dialogflow | Using Cloud Functions

I was using Dialogflow API V1 until now and everything was working fine. It seems like Dialogflow API V2 is now the default. I am not sure if this is the reason, but I am not able to deploy Cloud Functions anymore.
In the fulfillment tab, it is not letting me enable the Cloud Functions toggle. Every time I try to enable it and click deploy, it gives me the following error:
Fulfillment URL is not valid
It's weird because I am not using a fulfillment URL, and I am still getting this error.
One workaround worked for me. I wanted to create a backup of my existing LIVE bot to use for testing, and I was getting the same issue once I restored my zip and then tried to edit the stock fulfillment.
What worked was editing the fulfillment first (both index.js and package.json), and only after that restoring the bot from the zip folder, which added the intents and entities to the agent. After that I was able to deploy Cloud Functions without getting the error.
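For context, a minimal sketch of what an edited inline-editor fulfillment (index.js) might look like before restoring the agent from the zip; the intent name and reply text are examples, not taken from the original bot:

const functions = require('firebase-functions');
const {WebhookClient} = require('dialogflow-fulfillment');

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
  const agent = new WebhookClient({request, response});

  function welcome(agent) {
    agent.add('Hello from the backup bot!');
  }

  // Map Dialogflow intents to handler functions.
  const intentMap = new Map();
  intentMap.set('Default Welcome Intent', welcome);
  agent.handleRequest(intentMap);
});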

Google Translate API authentication error

I am trying to call the Google Translate API and am using the following to authenticate from my local machine: gcloud auth application-default login.
The command works successfully and I am authenticated, but when I try to call the API I get the following error message, which indicates that the request is being treated as an anonymous API call:
google.cloud.exceptions.Forbidden: 403 Daily Limit Exceeded
I ran into this issue too this week.
I thought I was properly authenticated, but when I ran my code (C#, using the Google Translate API v2 package) it gave me the same 403 "Daily Limit Exceeded" error.
I fiddled around with the CLI, made several accounts, service accounts, API keys and so on, and it never worked.
https://cloud.google.com/dotnet/docs/getting-started/hello-world
That page (the .NET guide part) says you should use the Google Cloud Platform plugin they release for Visual Studio and log in via that; I used it and it worked.
If you look at the bottom left of that page there are guides for any other language you might be using (consider adding yours as info, it helps me help you).
I would love it if it only worked via the CLI, but as long as it works I guess it's fine...
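A different route from the Visual Studio plugin described above (and a guess at the underlying cause, namely that no project or credentials are attached to the call): point the client library at an explicit service-account key so the request is no longer treated as anonymous. A Node.js sketch with placeholder project ID and key path:

const {Translate} = require('@google-cloud/translate').v2;

const translate = new Translate({
  projectId: 'my-project-id',                    // placeholder
  keyFilename: '/path/to/service-account.json',  // placeholder
});

async function run() {
  // Translate a sample string into French using the authenticated client.
  const [translation] = await translate.translate('Hello world', 'fr');
  console.log(translation);
}

run().catch(console.error);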

Implement push notifications in Firebase with Batch

I am trying to make Firebase and Batch work together so that I can fire a push notification when something in the Firebase DB changes. I followed this tutorial, and now I want to make them work automatically. For that I contacted Batch's support, and they replied:
write your own server-side code to 1. detect Firebase db changes 2. call the Batch API.
For "detect Firebase db changes" I have to create a server-side code which will track changes and call batch's api accordingly. Then follow this documentation for attaching the changed content and send it to appropriate user.
I don't know how to create the server-side code. Can anyone give me quick instructions, or point me to a tutorial related to this?
To write server-side Firebase code you can use the Node.js SDK. See https://www.firebase.com/docs/web/quickstart.html (search the page for node.js).
The API is exactly the same as the regular web API, except that you'll be writing it as a Node.js application that can run on the server. You can then integrate this with Batch's API.
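A rough sketch of that server-side piece, assuming the legacy Firebase Node.js SDK the answer links to; the database URL is a placeholder, and the Batch call is left as a hypothetical helper since the exact Batch endpoint and payload depend on their API documentation:

var Firebase = require('firebase');

// Placeholder database URL and path to watch.
var ref = new Firebase('https://your-app.firebaseio.com/items');

// Fires whenever an existing child under /items changes.
ref.on('child_changed', function (snapshot) {
  var changed = snapshot.val();
  console.log('Changed record:', changed);
  sendBatchPush(changed); // hypothetical helper wrapping the Batch API call
});

function sendBatchPush(data) {
  // TODO: POST the changed content to Batch's API with your API key,
  // following Batch's transactional/push documentation.
}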
