How to attach socket.io to google firebase app functions? - firebase

I have the following code in the index.js file located in the functions folder of my Google Firebase project:
net=require('express')()
net.get('/',function(req,res){res.sendFile(__dirname+'/slave.htm')})
exports.run=require('firebase-functions').https.onRequest(net)
require('socket.io').listen(net).on("connection",function(socket){})
But when I execute firebase deploy in the command prompt, it gives me this error:
You are trying to attach socket.io to an express request handler function. Please, pass a http.Server instance.
Yes, and I pass an HTTP server instance in the following code:
net=require('firebase-functions').https.onRequest((req,res)=>{res.send("socket.io working!")})
exports.run=net
require('socket.io').listen(net).on("connection",function(socket){})
It gives me the same error again:
You are trying to attach socket.io to an express request handler function. Please, pass a http.Server instance.
And I tried attaching socket.io to Firebase Functions with this code:
net=https.onRequest((req,res)=>{res.send("socket.io working!")})
exports.run=require('firebase-functions').net
require('socket.io').listen(require('firebase-functions').net).on("connection",function(socket){})
And that gives this error:
https is not defined
When I run this code in localhost:
app=require('express')()
app.get('/',function(req,res){res.sendFile(__dirname+'/slave.htm')})
net=require('http').createServer(app);net.listen(8888,function(){console.log("Server listening.")})
require('socket.io').listen(net).on("connection",function(socket){})
The console emits Server listening., and when I go to the URL http://127.0.0.1:8888, it works, sending an HTML file to the browser as I expected:
<script>
document.write("File system working!")
document.body.style.backgroundColor="black"
document.body.style.color="white"
</script>
But the problem happens when I try to convert net=require('http').createServer(app);net.listen(8888,function(){console.log("Server listening.")}) into net=exports.run=require('firebase-functions').https.onRequest((req,res)=>{res.send("Firebase working!")}); it seems to be impossible.

You can't run code to listen on some port with Cloud Functions. This is because you aren't guaranteed to have a single machine or instance running your code. It could be distributed among many instances all running concurrently. You shouldn't know or care if that happens - Cloud Functions will just scale to meet the needs placed on your functions.
If you deploy an HTTP type function, it will automatically listen on the https port for the dedicated host for your project, and you can send web requests to that.
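For illustration, a minimal HTTP function sketch; note there is no listen() call anywhere, because the platform runs the server for you:
const functions = require('firebase-functions')
// The platform handles the server and the port; you only supply the handler.
exports.app = functions.https.onRequest((req, res) => {
  res.send('Hello from Cloud Functions')
})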
If you want to perform transactions over a persistently held socket, use the Realtime Database instead: have the client write values into the database, then respond to those writes with a database trigger function that you write. That function can send data back to the client by writing something to a database location the client is listening to.
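As a sketch of that request/response pattern (the /requests and /responses paths and the payload shape are assumptions for illustration):
const functions = require('firebase-functions')
const admin = require('firebase-admin')
admin.initializeApp()

// Fires whenever a client pushes a value under /requests.
exports.onRequestWrite = functions.database
  .ref('/requests/{pushId}')
  .onCreate((snapshot, context) => {
    const payload = snapshot.val()
    // Do the work here, then answer by writing where the client listens.
    return admin.database()
      .ref('/responses/' + context.params.pushId)
      .set({ echo: payload })
  })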

Related

NextJS and deploying app - What's the use of the /api folder when wanting to make API calls in production (deployed)?

I just went through the steps of creating a CRUD app with NextJS. Everything works fine when I run the app in my development environment with npm run dev.
Then I tried to deploy it to Vercel.
The build fails, and the error that comes up is:
AxiosError: connect ECONNREFUSED 127.0.0.1:3000
...
Build error occurred
Error: Failed to collect page data for /beers/[id]
at /vercel/path0/node_modules/next/dist/build/utils.js:963:15
at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
type: 'Error'
I get it: when I make my API requests, the app is using http://localhost:3000, so once deployed, it won't be reachable.
Then comes my question: locally, I run requests like this, for example:
import axios from 'axios';
axios.defaults.baseURL = "http://localhost:3000";
export const getAllBeers = () => axios.get<BeerData[]>('/api/beers');
and everything works.
I tried to troubleshoot my error. I figured I needed to adjust my baseURL to the deployment server's address, but it still wouldn't work. And then in the few posts I read, and even in the docs, it says:
Write server-side code directly
As getStaticProps runs only on the server-side, it will never run on
the client-side. It won’t even be included in the JS bundle for the
browser, so you can write direct database queries without them being
sent to browsers.
This means that instead of fetching an API route from getStaticProps
(that itself fetches data from an external source), you can write the
server-side code directly in getStaticProps.
doc source
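To make the quoted docs concrete, here is a minimal sketch of what they suggest; getBeersFromDb() and its import path are hypothetical stand-ins for a helper that queries the database directly instead of going through /api/beers:
// pages/beers/index.js (sketch)
import { getBeersFromDb } from '../../lib/db' // hypothetical direct-DB helper

export async function getStaticProps() {
  // Runs only on the server, so it can query the database directly
  // instead of fetching the /api/beers route over HTTP.
  const beers = await getBeersFromDb()
  return { props: { beers } }
}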
So after following their tutorial, I'm now confused about the purpose of this /api folder: in which specific cases is it useful, for example when we want to use getStaticProps?
If anybody could explain with an example? That'd be fantastic. Thank you!

Cloud Functions times out when calling an external HTTP with a POST request

I have a Google Cloud Function, and within it I call two external APIs (or URLs) using the Python requests library: one requests.get and one requests.post. Please note that these APIs are tested and working using Postman.
The requests.get downloads an MP3 file, and it works in Cloud Functions, as I was able to see the downloaded file in Cloud Storage:
download_url = "https://some.url.com/music.mp3"
resp = requests.get(download_url)
[get the resp.content, put to storage bucket]
Aside from storing it in Cloud Storage, I also send this MP3 file over to the Google Cloud Speech-to-Text API and get the transcribed text, which I want to post to another URL.
[transcribe the audio using Speech-to-Text API, get the transcribed text]
data = { "download_url": download_url, "transcription": transcribed_text }
upload_api = "https://another.url.com/api"
resp = requests.post(upload_api, data = data)
Likewise, the transcription works, because I print/log the text and it shows in the Cloud Functions View Logs console. With the requests.post, however, I am getting a timeout, also according to the logs.
requests.post(upload_api, data = data, timeout=120)
I even lengthened the timeout setting, but it still happens. What could be the explanation for this? Is there a Google Cloud configuration I missed setting somewhere that might be causing this?
This is because GCP Cloud Functions has a default timeout of 1 minute. To fix it, you need to declare a longer timeout limit when you create your function.
If you are using gcloud CLI commands to deploy your function, you need to add the --timeout flag,
for example
gcloud functions deploy FUNCTION_NAME --timeout=TIMEOUT FLAGS
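For instance, assuming a hypothetical function named transcribe_audio and the gen-1 maximum of 540 seconds:
gcloud functions deploy transcribe_audio --runtime=python39 --trigger-http --timeout=540s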
If you are using the GCP console (web UI), you need to follow the steps at this link.

ERROR: [_parse_http_data] invalid HTTP method in shiny app

When I load my dockerized Shiny app's domain name in the browser, the app crashes (greys out) and I get this error: "ERROR: [_parse_http_data] invalid HTTP method".
I have developed a web application that consists of a Shiny app (with a login feature connected to an RMySQL database), a website, and a MariaDB database. I put them together in a docker-compose file and tested it on my local computer, where it works fine. I then deployed them to a Kubernetes cluster in GCE, and that was also successful. I used Cloudflare to install an SSL certificate for the Shiny app domain (i.e. trnddaapp.com). Now when I load the Shiny app domain in the browser, it appends the https and loads the app successfully, but after about a minute it crashes (greys out). When I load the Shiny app's external IP with plain http, it doesn't crash.
The closest solution I have found is https://github.com/rstudio/shiny-server/issues/392, but there doesn't seem to be any other solution to my problem. I would be grateful if anyone could help me resolve this.
This is the error message I get when I check with kubectl logs [app pod name]:
ERROR: [_parse_http_data] invalid HTTP method
ERROR: [_parse_http_data] invalid HTTP method
ERROR: [_parse_http_data] invalid HTTP method
I expect the app not to crash when the Shiny app domain (trnddaapp.com) is accessed over https.
Let's start with an analysis of the error message. It says:
[_parse_http_data]
So we know that your app is receiving something, but it doesn't understand what it is (it may be malformed HTTP/1.0 or HTTP/1.1, or even binary data). Then we have:
invalid HTTP method
Now we are sure it is not an HTTP/1.x call but a stream of unrecognized data.
We also know it is not the instance, since it "deploys" and "delivers" the service; something inside it is breaking.
There are a few things that may be happening. Since the app runs on your local machine (where I assume it has access to more resources, especially memory), it may be an issue of resource allocation: once run in a container, it may exhaust its allocated resources and break (perhaps a library called at runtime that uses a chunk of memory?). We won't be sure unless we can debug it inside a container. Could you add a debug library that records your requests, so we can see whether it parses all of them and, if it stops at some point, why? I know someone from RStudio created a fork of httpuv that logs every request; it can be installed like this:
devtools::install_github('rstudio/httpuv@wch-print-req')
After that, maybe share the output so we can see why the application is behaving like that and killing its own service.
I really thank you in advance; hopefully with those logs we may be able to shed more light on this matter.
Thanks once again!
-JP

Google Cloud Functions with Trace Agent connection

I need to connect monitoring and tracing tools to our application. Our main code is on Express 4 running on Google Cloud Functions. All requests come in through a front nginx proxy server that handles the domain and pretty route names. Unfortunately, the trace agent traces these requests as they arrive from the nginx front proxy without any additional information, and this is not enough to collect useful information about the app. I found the Stackdriver custom API, which, as I understand it, might help collect the appropriate data at runtime, but I don't understand how to connect it to a Google Cloud Functions app. All the other examples say that we must extend our startup script, but Google Cloud Functions is a fully automated environment; there is no such possibility here.
Found the solution. I had included require("@google-cloud/trace-agent"), but not at the top of index.js. It must be required before all other modules. After I moved it, it started to work.
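A minimal sketch of that load order (start() is the agent's documented initialization call; the express app itself is just for illustration):
// index.js — the trace agent must be the very first require.
require('@google-cloud/trace-agent').start()

// Every other module, including express, comes after the agent is initialized.
const express = require('express')
const app = express()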
Placing require("@google-cloud/trace-agent") as the very first import didn't work for me. I still kept getting:
ERROR:@google-cloud/trace-agent: express tracing might not work as /var/tmp/worker/node_modules/express/index.js was loaded before the trace agent was initialized.
However I managed to work around it by manually patching express:
var traceApi = require('@google-cloud/trace-agent').get();
require('@google-cloud/trace-agent/src/plugins/plugin-express')[0].patch(
  require(Object.keys(require('module')._cache).find(_ => _.indexOf('express') !== -1)),
  traceApi
);

Meteor: execute calls from client console "everywhere"

Meteor is said to automagically (in most cases) figure out what code to run on the client and what code to run on the server, so you could theoretically just write all your code in one .js file.
I would like to be able to write code in my browser console and have it executed pretty much as if I had put the code in a file on my server.
For example, in my browser console:
[20:08:19.397] Pages = new Meteor.Collection("pages");
[20:08:30.612] Pages.insert({name:"bro"});
[20:08:30.614] "sGmRrQfezZMXuPfW8"
[20:08:30.618] insert failed: Method not found
Meteor says "method not found" because I need to do new Meteor.Collection("pages"); on the server.
But is there a workaround for this, whether using the above-mentioned automagic or by explicitly saying in my browser console "run the following line of code on the server!"?
Well, it doesn't "automagically" figure it out; you have to very explicitly do one of two things:
Separate the code into client and server directories.
Wrap the code in an isClient or an isServer section.
Otherwise, any code you write will execute in both environments. However, any code input by the user on the client will only be executed on the client. Meteor has been specifically designed to protect this boundary.
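As a minimal sketch of option 2, wrapping code in environment checks:
if (Meteor.isServer) {
  // Runs only on the server.
  Meteor.startup(function () {
    console.log('server started');
  });
}
if (Meteor.isClient) {
  // Runs only in the browser.
  console.log('running on the client');
}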
You can call a method on the server from the client, but again the server cannot be tricked into executing client-defined functions.
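For example, a sketch of that method boundary, assuming a hypothetical insertPage method and a Pages collection defined on the server:
// On the server: define the method (the client cannot redefine it).
Meteor.methods({
  insertPage: function (name) {
    return Pages.insert({ name: name });
  }
});

// On the client (even from the browser console):
Meteor.call('insertPage', 'bro', function (error, result) {
  console.log(error, result);
});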
In your specific example, you can always define the collection only on the client like so:
Pages = new Meteor.Collection(null);
That will allow you to freely manipulate the collection data on the client, but it will not involve the server (nothing will be stored in the db).
