Firebase function throwing timeout error when using http event - firebase

Function code below
prepay.post('/', (req, res) => {
  let strdat = '';
  req.on('data', function (chunk) {
    strdat += chunk;
    console.log(strdat);
  }).on('end', function () {
    var data = JSON.parse(strdat);
    var cryp = crypto.createHash('sha512');
    var text = ''; // some data
    cryp.update(text);
    var hash = cryp.digest('hex');
    res.setHeader("Content-Type", "text/json");
    res.setHeader("Access-Control-Allow-Origin", "*");
    res.end(JSON.stringify(hash));
  });
  req.on('error', function (err) {
    console.log(err.message);
  });
});

exports.prepay = functions.https.onRequest(prepay);
This is tried on the emulator. In the logs I'm getting:

! functions: Your function timed out after ~60s. To configure this timeout, see https://firebase.google.com/docs/functions/manage-functions#set_timeout_and_memory_allocation.
\AppData\Roaming\npm\node_modules\firebase-tools\lib\emulator\functionsEmulatorRuntime.js:660 throw new Error("Function timed out.");

It works fine when run locally with Node.js using node server.js.
I'm not sure whether req.on is supported by Firebase; it would be helpful if I could get some reference on req.on in Firebase Functions.

Your event-based sample of code won't work because of the preprocessing of the request done by the Firebase Functions SDK. Simply put, all of the 'data' and 'end' events have already occurred by the time your code is executed.
Inside functions.https.onRequest, the request is consumed and parsed automatically according to its content type, as documented here. If your request body is one of the recognised types below, it will be parsed and made available as request.body. If you wish to work with the raw buffer of data, it is exposed as a Buffer via request.rawBody.
| Content Type | Request Body | Behavior |
| --- | --- | --- |
| application/json | '{"name":"John"}' | request.body.name equals 'John' |
| application/octet-stream | 'my text' | request.body equals '6d792074657874' (the raw bytes of the request; see the Node.js Buffer documentation) |
| text/plain | 'my text' | request.body equals 'my text' |
| application/x-www-form-urlencoded | 'name=John' | request.body.name equals 'John' |
This preprocessing allows you to get to the actual function of your code faster.
prepay.post('/', (req, res) => {
  const data = req.body;
  const cryp = crypto.createHash('sha512');
  const text = ''; // some data
  cryp.update(text);
  const hash = cryp.digest('hex');
  res.setHeader("Access-Control-Allow-Origin", "*");
  res.json(hash);
});

exports.prepay = functions.https.onRequest(prepay);
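If you do need the exact bytes the client sent (for example, to hash the payload verbatim), a minimal sketch using request.rawBody could look like this. Hashing the raw payload is my assumption here; the original code hashed some other data:

prepay.post('/', (req, res) => {
  // rawBody is the unparsed request payload, provided as a Node.js Buffer
  const cryp = crypto.createHash('sha512');
  cryp.update(req.rawBody);
  const hash = cryp.digest('hex');
  res.setHeader("Access-Control-Allow-Origin", "*");
  res.json(hash);
});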

Related

Firebase cloud function error: Maximum call stack size exceeded

I've made a Firebase cloud function which adds a claim to a user that he or she has paid (sets paid to true for the user):
const functions = require("firebase-functions");
const admin = require("firebase-admin");

exports.addPaidClaim = functions.https.onCall(async (data, context) => {
  // add custom claim (paid)
  return admin.auth().setCustomUserClaims(data.uid, {
    paid: true,
  }).then(() => {
    return {
      message: `Success! ${data.email} has paid for the course`,
    };
  }).catch((err) => {
    return err;
  });
});
However, when I'm running this function, I'm receiving the following error: "Unhandled Rejection (RangeError): Maximum call stack size exceeded". I really don't understand why this is happening. Does somebody see what could be getting called recursively, which in turn causes the function to never end?
Asynchronous operations need to return a promise, as stated in the documentation. Cloud Functions then tries to serialize the data contained in the promise you return and send it in JSON format to the client. I believe your setCustomUserClaims call does not resolve with any object that can be treated as the answer to that promise, so the process keeps waiting and ends up throwing the RangeError.
To avoid this error I can think of two different options:
Add a paid parameter to be able to send a JSON response (and remove the setCustomUserClaims call if there isn't any need to change the user's access control, because custom claims are not designed to store additional data); a minimal sketch of this option follows the snippet below.
Insert a promise that resolves and sends any needed information to the client. Something like:
return new Promise(function (resolve, reject) {
  request({
    url: URL,
    method: "POST",
    json: true,
    body: queryJSON // A JSON variable I've built previously
  }, function (error, response, body) {
    if (error) {
      reject(error);
    } else {
      resolve(body);
    }
  });
});
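And for the first option, a hedged sketch, assuming the claim update itself is still wanted; the HttpsError wrapping is my addition, not part of the original answer:

exports.addPaidClaim = functions.https.onCall(async (data, context) => {
  try {
    await admin.auth().setCustomUserClaims(data.uid, { paid: true });
    // Return a plain, JSON-serializable object so Cloud Functions can answer the client
    return { message: `Success! ${data.email} has paid for the course` };
  } catch (err) {
    // Surface the failure as an HttpsError rather than returning the raw Error object
    throw new functions.https.HttpsError('internal', err.message);
  }
});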

Deno - How to fetch data from a remote API or URL?

I'm wondering how I can get data from other servers and APIs with Deno. Everything in the documentation teaches me about making HTTP servers and reading files from a local source, but I can't find anything useful about reading something over the network.
How can I read JSON data from the Stripe API? Or what if I want to read an HTML file with text inside?
Thank you for your time!
I am just giving you an example of a GET request fetching repositories from GitHub.
You can change the URL and request configuration as per your need.
In the code given below, I am calling another GitHub API using the fetch() method.
fetch() takes the URL as its first parameter; the next parameter is a RequestInit, which takes the request method, headers, body, etc. At the end it returns the JSON response of that API call.
const githubResponse = async (): Promise<any> => {
  const response = await fetch("https://api.github.com/search/repositories?q=android", {
    method: "GET",
    headers: {
      "Content-Type": "application/json",
    },
  });
  return response.json(); // For JSON Response
  // return response.text(); // For HTML or Text Response
}

console.log(await githubResponse());
I have written the above code in a TypeScript file named Testing.ts, so you can run it with the command given below:
deno run --allow-net Testing.ts
Next, I am giving you a sample POST request code:
const githubResponse = async (): Promise<any> => {
  const body: URLSearchParams = new URLSearchParams({
    q: "AvijitKarmakar",
  });
  const response = await fetch("https://api.github.com/search/repositories", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
    },
    body: body
  });
  return response.json();
  // return response.text(); // For HTML or Text Response
}

console.log(await githubResponse());
You can see that I have created a body object and passed it in the RequestInit through the body parameter and also changed the request method type to POST.
You'll need to make an HTTP request; for that, in Deno you use fetch, the same Web API the browsers use.
To read a JSON response:
const res = await fetch('https://api.stripe.com');
const data = await res.json();
If you want HTML:
const res = await fetch('https://example.com');
const html = await res.text();
// Now you can use some HTML parsing lib
fetch requires the --allow-net flag.
Deno strives to be as close to the existing browser APIs as possible.
That means you can use fetch. Example:
// fetch-kitten.ts
fetch("https://placekitten.com/200/300").then(async (d) =>
Deno.writeFile("kitten.jpg", new Uint8Array(await d.arrayBuffer()))
);
CLI:
deno run --allow-net --allow-write fetch-kitten.ts
Reference

How to make a website use Firebase Cloud Functions (Express) + Hosting + Firestore

I'm making a website on Firebase. My website is working now, but I don't know how to read Firestore data and respond with that data.
router.get('/', function (req, res, next) {
  admin.firestore().collection('post').get()
    .then(snap => {
      const data = snap.size;
      console.log("size: " + data);
      return res.status(200).send(data);
    }).catch(err => {
      console.log(err);
      res.status(500).send(err);
    });
});

module.exports = router;
The logging works, but the response doesn't:
RangeError: Invalid status code: 24
    at ServerResponse.writeHead (_http_server.js:192:11)
    at ServerResponse.writeHead (/user_code/node_modules/express-session/node_modules/on-headers/index.js:55:19)
    at ServerResponse._implicitHeader (_http_server.js:157:8)
    at ServerResponse.OutgoingMessage.end (_http_outgoing.js:573:10)
    at ServerResponse.end (/user_code/node_modules/express-session/index.js:354:19)
    at ServerResponse.send (/user_code/node_modules/express/lib/response.js:221:10)
    at admin.firestore.collection.get.then.snap (/user_code/routes/main.js:14:26)
    at process._tickDomainCallback (internal/process/next_tick.js:135:7)
const data = snap.size gives you a number-typed variable, which is the size of the collection you just queried. When you pass a number to send(), it looks like that tells Express you want to send that number as an HTTP status code to the client (and apparently it overrides what you set with status()). The API docs for send() don't even say that you can pass a number.
If you want to send a number as the body of the response, try converting it to a string instead:
res.status(200).send('' + data);
Or bundle it up into a JSON object for the client to parse, which is probably better.
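A minimal sketch of the JSON variant (the count key is just an illustrative name):

// Express serializes the object and sets an application/json Content-Type
res.status(200).json({ count: data });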

fetching mp3 file from MeteorJS and trying to convert it into a Blob so that I can play it

I am playing around with downloading and serving mp3 files in Meteor.
I am trying to download an MP3 file (https://www.sample-videos.com/audio/mp3/crowd-cheering.mp3) on my MeteorJS server side (to circumvent CORS issues) and then pass it back to the client to play it in an AUDIO tag.
In Meteor you use the Meteor.call function to call a server method. There is not much to configure; it's just a method call and a callback.
When I run the method I receive this:
content:
"ID3���#K `�)�<H� e0�)������1������J}��e����2L����������fȹ\�CO��ȹ'�����}$A�Lݓ����3D/����fijw��+�LF�$?��`R�l�YA:A��#�0��pq����4�.W"�P���2.Iƭ5��_I�d7d����L��p0��0A��cA�xc��ٲR�BL8䝠4���T��..etc..", data:null,
headers: {
accept-ranges:"bytes",
connection:"close",
content-length:"443926",
content-type:"audio/mpeg",
date:"Mon, 20 Aug 2018 13:36:11 GMT",
last-modified:"Fri, 17 Jun 2016 18:16:53 GMT",
server:"Apache",
statusCode:200
which is the working Mp3 file (the content-length is exactly the same as the file I write to disk on the MeteorJS Server side, and it is playable).
However, my following code doesn't let me convert the response into a BLOB:
MeteorObservable.call('episode.download', episode.url.url).subscribe((result: any) => {
  console.log('response', result);
  let URL = window.URL;
  let blob = new Blob([result.content], { type: 'audio/mpeg' });
  console.log('blob', blob);
  let audioUrl = URL.createObjectURL(blob);
  let audioElement: any = document.getElementsByTagName('audio')[0];
  audioElement.setAttribute("src", audioUrl);
  audioElement.play();
})
When I run the code, the Blob has the wrong size and is not playable:
Blob(769806) {size: 769806, type: "audio/mpeg"}
size:769806
type:"audio/mpeg"
__proto__:Blob
Uncaught (in promise) DOMException: Failed to load because no supported source was found.
On the backend I just run return HTTP.get( url ); in the method, using import { HTTP } from 'meteor/http'.
I have been trying to use btoa or atob, but that doesn't work, and as far as I know it is already a base64-encoded file, right?
I am not sure why the Blob constructor creates a larger file than the source returned from the backend, and I am not sure why it is not playing.
Can anyone point me in the right direction?
Finally found a solution that uses request instead of Meteor's HTTP:
First you need to install request and request-promise-native in order to make it easy to return your result to clients.
$ meteor npm install --save request request-promise-native
Now you just return the promise of the request in a Meteor method:
server/request.js
import { Meteor } from 'meteor/meteor'
import request from 'request-promise-native'

Meteor.methods({
  getAudio (url) {
    return request.get({ url, encoding: null })
  }
})
Notice the encoding: null flag, which causes the result to be binary. I found this in a comment on an answer related to downloading binary data via Node. It makes the request resolve with a binary representation of the data instead of a string (I don't know exactly how, but it is presumably backed by a Node Buffer).
Now it gets interesting. On your client you won't receive a complex result anymore, but either an Error or a Uint8Array. This makes sense because Meteor uses EJSON to send data over the wire with DDP, and the representation of binary data in EJSON is a Uint8Array, as described in the documentation.
Because you can just pass a Uint8Array into a Blob, you can now easily create the blob like so:
const blob = new Blob([uint8Array], { type: 'audio/mpeg' })
Summarizing all this into a small template, it could look like this:
client/fetch.html
<template name="fetch">
  <button id="fetchbutton">Fetch Mp3</button>
  {{#if source}}
    <audio id="player" src={{source}} preload="none" content="audio/mpeg" controls></audio>
  {{/if}}
</template>
client/fetch.js
import { Meteor } from 'meteor/meteor'
import { Template } from 'meteor/templating'
import { ReactiveVar } from 'meteor/reactive-var'
import './fetch.html'

Template.fetch.onCreated(function helloOnCreated () {
  // the audio source starts out empty
  this.source = new ReactiveVar(null)
})

Template.fetch.helpers({
  source () {
    return Template.instance().source.get()
  },
})

Template.fetch.events({
  'click #fetchbutton' (event, instance) {
    Meteor.call('getAudio', 'https://www.sample-videos.com/audio/mp3/crowd-cheering.mp3', (err, uint8Array) => {
      const blob = new Blob([uint8Array], { type: 'audio/mpeg' })
      instance.source.set(window.URL.createObjectURL(blob))
    })
  },
})
An alternative solution is adding a REST endpoint (using Express) to your Meteor backend.
Instead of HTTP we use request and request-progress to send the data chunked, in case of large files.
On the frontend I catch the chunks using https://angular.io/guide/http#listening-to-progress-events to show a loader and deal with the response.
I could listen to the download via
this.http.get('the URL to a mp3', { responseType: 'arraybuffer' }).subscribe((res: any) => {
  var blob = new Blob([res], { type: 'audio/mpeg' });
  var url = window.URL.createObjectURL(blob);
  window.open(url);
});
The above example doesn't show progress, by the way; you need to implement the progress events as explained in the Angular article (see the sketch below). Happy to update the example to my final code when finished.
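For reference, a hedged sketch of what listening to those progress events might look like with Angular's HttpClient, per the linked guide; the URL and the injected http client are carried over from the snippet above, and this is not the answerer's final code:

import { HttpClient, HttpEvent, HttpEventType } from '@angular/common/http';

// Inside a component or service where `this.http` is an injected HttpClient:
this.http.get('the URL to a mp3', {
  responseType: 'arraybuffer',
  reportProgress: true,
  observe: 'events'
}).subscribe((event: HttpEvent<ArrayBuffer>) => {
  if (event.type === HttpEventType.DownloadProgress) {
    // `total` is only present when the server sends a Content-Length header
    console.log(`downloaded ${event.loaded} of ${event.total || '?'} bytes`);
  } else if (event.type === HttpEventType.Response && event.body) {
    const blob = new Blob([event.body], { type: 'audio/mpeg' });
    window.open(window.URL.createObjectURL(blob));
  }
});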
The Express setup on the Meteor Server:
/*
  Source: http://www.mhurwi.com/meteor-with-express/
  ## api.class.ts
*/
import { WebApp } from 'meteor/webapp';
const express = require('express');
const trackRoute = express.Router();
const request = require('request');
const progress = require('request-progress');

export function api() {
  const app = express();

  app.use(function (req, res, next) {
    res.header("Access-Control-Allow-Origin", "*");
    res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
    next();
  });

  app.use('/episodes', trackRoute);

  trackRoute.get('/:url', (req, res) => {
    res.set('content-type', 'audio/mp3');
    res.set('accept-ranges', 'bytes');
    // The options argument is optional so you can omit it
    progress(request(req.params.url), {
      // throttle: 2000,                   // Throttle the progress event to 2000ms, defaults to 1000ms
      // delay: 1000,                      // Only start to emit after 1000ms delay, defaults to 0ms
      // lengthHeader: 'x-transfer-length' // Length header to use, defaults to content-length
    })
      .on('progress', function (state) {
        // The state is an object that looks like this:
        // {
        //   percent: 0.5,           // Overall percent (between 0 to 1)
        //   speed: 554732,          // The download speed in bytes/sec
        //   size: {
        //     total: 90044871,      // The total payload size in bytes
        //     transferred: 27610959 // The transferred payload size in bytes
        //   },
        //   time: {
        //     elapsed: 36.235,      // The total elapsed seconds since the start (3 decimals)
        //     remaining: 81.403     // The remaining seconds to finish (3 decimals)
        //   }
        // }
        console.log('progress', state);
      })
      .on('error', function (err) {
        // Do something with err
      })
      .on('end', function () {
        console.log('DONE');
        // Do something after request finishes
      })
      .pipe(res);
  });

  WebApp.connectHandlers.use(app);
}
and then add this to your meteor startup:
import { Meteor } from 'meteor/meteor';
import { api } from './imports/lib/api.class';

Meteor.startup(() => {
  api();
});

Get the whole response body when the response is chunked?

I'm making an HTTP request and listening for "data":
response.on("data", function (data) { ... })
The problem is that the response is chunked so the "data" is just a piece of the body sent back.
How do I get the whole body sent back?
request.on('response', function (response) {
  var body = '';
  response.on('data', function (chunk) {
    body += chunk;
  });
  response.on('end', function () {
    console.log('BODY: ' + body);
  });
});
request.end();
Over at https://groups.google.com/forum/?fromgroups=#!topic/nodejs/75gfvfg6xuc, Tane Piper provides a good solution very similar to scriptfromscratch's, but for the case of a JSON response:
request.on('response', function (response) {
  var data = [];
  response.on('data', function (chunk) {
    data.push(chunk);
  });
  response.on('end', function () {
    var result = JSON.parse(data.join(''));
    return result;
  });
});
This addresses the issue that the OP brought up in the comments section of scriptfromscratch's answer.
I have never worked with the HTTP client library, but since it works just like the server API, try something like this:
var data = '';
response.on('data', function (chunk) {
  // append chunk to your data
  data += chunk;
});
response.on('end', function () {
  // work with your data var
});
See the node.js docs for reference.
In order to support the full spectrum of possible HTTP applications, Node.js's HTTP API is very low-level, so data is received chunk by chunk, not as a whole.
There are two approaches you can take to this problem:
1) Collect data across multiple "data" events and append the results together prior to printing the output. Use the "end" event to determine when the stream is finished and you can write the output.
var http = require('http');

http.get('some/url', function (resp) {
  var respContent = '';
  resp.on('data', function (data) {
    respContent += data.toString(); // data is a Buffer instance
  });
  resp.on('end', function () {
    console.log(respContent);
  });
}).on('error', console.error);
2) Use a third-party package to abstract the difficulties involved in collecting an entire stream of data. Two different packages provide a useful API for solving this problem (there are likely more!): bl (Buffer List) and concat-stream; take your pick!
var http = require('http');
var bl = require('bl');

http.get('some/url', function (response) {
  response.pipe(bl(function (err, data) {
    if (err) {
      return console.error(err);
    }
    data = data.toString();
    console.log(data);
  }));
}).on('error', console.error);
The reason it's messed up is that you need to call JSON.parse(data.toString()). data is a Buffer, so you can't just parse it directly.
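In other words, a minimal sketch of that fix (variable names are illustrative):

const chunks = [];
response.on('data', (chunk) => chunks.push(chunk));
response.on('end', () => {
  // Concatenate the Buffers first, then decode and parse the body once
  const result = JSON.parse(Buffer.concat(chunks).toString());
  console.log(result);
});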
If you don't mind using the request library
var request = require('request');
request('http://www.google.com', function (error, response, body) {
  if (!error && response.statusCode == 200) {
    console.log(body); // Print the google web page.
  }
});
If you are dealing with non-ASCII content (especially Chinese/Japanese/Korean characters, whatever their encoding), you'd better not treat the chunk data passed to the response.on('data') event as a string directly.
Concatenate the chunks as byte buffers and decode them only in response.on('end') to get the correct result.
// Snippet in TypeScript syntax:
//
// Assuming that the server-side will accept the "test_string" you post, and
// respond with a string that concatenates the content of "test_string" many
// times, so that it triggers the on("data") event multiple times.
//
const data2Post = '{"test_string": "swamps/沼泽/沼澤/沼地/늪"}';
const postOptions = {
  hostname: "localhost",
  port: 5000,
  path: "/testService",
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Content-Length': Buffer.byteLength(data2Post) // Do not use data2Post.length on a CJK string; it will return an improper value for 'Content-Length'
  },
  timeout: 5000
};

let body: string = '';
let body_chunks: Array<Buffer> = [];
let body_chunks_bytelength: number = 0; // Used to terminate the connection if the POST response is too large, should you need to.

let postReq = http.request(postOptions, (res) => {
  console.log(`statusCode: ${res.statusCode}`);
  res.on('data', (chunk: Buffer) => {
    body_chunks.push(chunk);
    body_chunks_bytelength += chunk.byteLength;
    // Debug print. Note that a chunk may contain incomplete characters, so the
    // decoding here may be incorrect; this only demonstrates the difference from
    // the final result in the res.on("end") event.
    console.log("Partial body: " + chunk.toString("utf8"));
    // Terminate the connection in case the POST response is too large. (10*1024*1024 = 10MB)
    if (body_chunks_bytelength > 10 * 1024 * 1024) {
      postReq.connection.destroy();
      console.error("Too large POST response. Connection terminated.");
    }
  });
  res.on('end', () => {
    // Decode the correctly concatenated response data
    let mergedBodyChunkBuffer: Buffer = Buffer.concat(body_chunks);
    body = mergedBodyChunkBuffer.toString("utf8");
    console.log("Body using chunk: " + body);
    console.log(`body_chunks_bytelength=${body_chunks_bytelength}`);
  });
});
How about an HTTPS chunked response? I've been trying to read a response from an API that responds over HTTPS with the header Transfer-Encoding: chunked. Each chunk is a Buffer, but when I concat them all together and try converting to a string with UTF-8, I get weird characters.
