We have a small Google Apps Script web app that handles Slack slash commands. It does some convenient things like adding, updating, and querying records in our sheet, straight from Slack. Everything works just fine most of the time. However, the Slack API expects an answer to the request in less than 3 seconds, or it will time out. Our Google Apps Script is not always able to respond in that timeframe, and this will only get worse as our sheet grows or our queries get more complicated.
The Slack API allows for asynchronous calls using a delayed response, but that means the Google Apps Script needs to respond immediately (within 3 seconds) and do the work in the background.
Now this is the problem:
I can't figure out how to make an asynchronous call work in Google Apps Script.
I know Workers are not supported in Google Apps Script, and my solution below hits a wall because of ReferenceError: 'google' is not defined. (Just ignore the Payload class; it formats a Slack response.)
function doPost(request) {
  var responseUrl = request.parameter.response_url;

  // This is how I try to circumvent the lack of threads in Google Apps Script
  google.script.run
    // Send an asynchronous Slack response with the result
    .withSuccessHandler(function(payload) {
      UrlFetchApp.fetch(responseUrl, {
        'method': 'post',
        'contentType': 'application/json',
        'payload': payload.toString()
      });
    })
    // Send an asynchronous Slack response with the error message
    .withFailureHandler(function(payload) {
      UrlFetchApp.fetch(responseUrl, {
        'method': 'post',
        'contentType': 'application/json',
        'payload': payload.toString()
      });
    })
    // Do the work in the background
    .sleep(5);

  return new Payload("Let me think about this...").asResponse();
}

function sleep(seconds) {
  Utilities.sleep(1000 * seconds);
  return new Payload("I waited for " + seconds + " seconds");
}
Does anyone have any idea how to make this work? Are there any alternative solutions to handle an asynchronous request in Google Apps Script?
I'm not aware of any threading in Apps Script either, and as you noticed, google.script.run only works in the Apps Script frontend.
As a workaround, you could use a Google Form as your "task queue". I've put together a simple G-Form with one question and inspected its live version to get the appropriate parameter names and URL. Then I set an installable on-form-submit trigger to run my script. Here's my POC code:
function doPost(e) {
  var form = 'https://docs.google.com/forms/d/e/1FAIpQLScWBM<my-form-id>CRxA/formResponse';
  UrlFetchApp.fetch(form, {method: 'POST', payload: 'entry.505669405=' + e.parameter.example});
  return ContentService.createTextOutput('OK');
}
function onForm(e) {
  // Triggered asynchronously from doPost via the Google Form submission
  SpreadsheetApp.getActive().getSheetByName('Sheet1').appendRow(e.values);
}
It worked fine in my tests and should suffice for your use case.
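To close the loop with Slack, the same queue can carry the response_url through the form so the triggered function can post the delayed response. A minimal sketch building on the POC above; the entry ID and the Payload helper are taken from the question, and the e.values column index is an assumption that depends on your form layout:

function doPost(e) {
  // Hand the slow work to the form "queue", passing Slack's response_url along
  var form = 'https://docs.google.com/forms/d/e/<my-form-id>/formResponse';
  UrlFetchApp.fetch(form, {
    method: 'POST',
    payload: 'entry.505669405=' + encodeURIComponent(e.parameter.response_url)
  });
  // Respond to Slack immediately, well within the 3-second limit
  return new Payload("Let me think about this...").asResponse();
}

function onForm(e) {
  // Runs asynchronously via the installable on-form-submit trigger
  var responseUrl = e.values[1]; // assumption: first answer column after the timestamp
  Utilities.sleep(5000);         // stand-in for the real, slow work
  UrlFetchApp.fetch(responseUrl, {
    method: 'post',
    contentType: 'application/json',
    payload: JSON.stringify({ text: 'I waited for 5 seconds' })
  });
}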
I have a simple Next app where I'm making an external API call to fetch some data. This worked perfectly fine until a couple of days ago: when the app makes an API request, I can see in the network tab that the URL it's trying to call has the Next app's address (localhost:3000) prepended to the actual URL, e.g. instead of http://{serverAddress}/api/articles it calls http://localhost:3000/{serverAddress}/api/articles, and this request resolves to 404 Not Found.
To make the API call, I'm using fetch. Before making the request, I logged the URL that was passed into fetch, and it was the correct URL. I also confirmed my API works as expected by making the request to the expected URL using Postman.
I haven't tried another library like axios to make this request, simply because it doesn't make sense given that my app was working perfectly fine using fetch alone; I want to understand why this is happening, for my future experience.
I haven't made any code changes since my app was working. However, I was Dockerizing my services, so I installed Docker and WSL2 with Ubuntu. I was deploying those containers on another machine; now both the API I'm calling and the Next app are running directly on my development machine when this issue happens.
I saw this post and confirmed I don't have any whitespace in the URL. As one comment mentions, I did install WSL2, but I am not running the app via the WSL terminal. I've also tried executing wsl --shutdown to see if that helps; unfortunately the issue still persists. If this is the cause of the issue, how can I fix it? Uninstall WSL2? If not, what might be another possible cause?
Thanks in advance.
EDIT:
The code I'm using to call fetch:
fetcher.js
export const fetcher = (path, options) =>
fetch(`${process.env.NEXT_PUBLIC_API_URL}${path}`, options)
.then(res => res.json());
useArticles.js
import { useSWRInfinite } from 'swr';
import { fetcher } from '../../utils/fetcher';
const getKey = (pageIndex, previousPageData, pageSize) => {
if (previousPageData && !previousPageData.length) return null;
return `/api/articles?page=${pageIndex}&limit=${pageSize}`;
};
export default function useArticles(pageSize) {
const { data, error, isValidating, size, setSize } = useSWRInfinite(
(pageIndex, previousPageData) =>
getKey(pageIndex, previousPageData, pageSize),
fetcher
);
return {
data,
error,
isValidating,
size,
setSize
};
}
You might be missing the protocol (http/https) in your API URL. Without a protocol, fetch treats the string as a relative path and resolves it against the host serving the page.
Either put it into the env variable:
NEXT_PUBLIC_API_URL=http://server_address
Or prefix your fetch call with the protocol name:
fetch(`http://${process.env.NEXT_PUBLIC_API_URL}${path}`, options)
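You can see the resolution behavior directly with the URL constructor (the host names below are made up):

// Without a protocol, the string is resolved as a path relative to the base origin:
new URL('myserver.com/api/articles', 'http://localhost:3000').href;
// -> 'http://localhost:3000/myserver.com/api/articles'

// With a protocol, it is treated as an absolute URL and the base is ignored:
new URL('http://myserver.com/api/articles', 'http://localhost:3000').href;
// -> 'http://myserver.com/api/articles'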
I am using pdfkit to generate a PDF at runtime and returning it in the HTTP response for download. I am able to download the file at the browser end, but the download dialog does not open immediately; instead it waits until doc.end() is called. I guess pdfkit is unable to push the stream efficiently. Has anybody else faced this? If yes, please guide me.
Here is the sample code I am trying:
const functions = require('firebase-functions');
const PDFDocument = require('pdfkit');

exports.testPdfKit = functions.https.onRequest((request, response) => {
  // Create the pdf document and set the headers before piping into the response
  const doc = new PDFDocument();
  response.set('Content-Disposition', 'attachment;filename=testpdfstream.pdf');
  response.writeHead(200, { 'Content-Type': 'application/pdf' });
  doc.pipe(response);

  const bigText = "some big text";
  for (var i = 0; i < 1000; i++) {
    console.log('inside iteration -', i);
    doc.text(bigText);
    doc.addPage();
  }
  doc.end();
});
I am implementing this functionality on Firebase Functions, which uses Express.js internally to process HTTP requests. To generate bigger files on my end, streaming is a must for me.
HTTP functions cannot stream the input or output of the function. The entire request is delivered in one chunk of memory, and the response is collected and sent back to the client in one chunk. The maximum size of both is 10MB. There are no workarounds for this limitation of Cloud Functions (but it does help your system scale better).
If you need streaming or websockets, you'll need to use a different product, such as App Engine or Compute Engine.
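Given that constraint, a workable pattern is to buffer the whole PDF in memory and send it in one shot once pdfkit finishes. A sketch, which only works while the finished PDF stays under the 10MB response limit:

const functions = require('firebase-functions');
const PDFDocument = require('pdfkit');

exports.testPdfKit = functions.https.onRequest((request, response) => {
  const doc = new PDFDocument();
  const chunks = [];

  // Collect the generated PDF in memory instead of streaming it to the client
  doc.on('data', (chunk) => chunks.push(chunk));
  doc.on('end', () => {
    const pdf = Buffer.concat(chunks);
    response.set('Content-Disposition', 'attachment;filename=testpdfstream.pdf');
    response.set('Content-Type', 'application/pdf');
    response.send(pdf); // the whole body goes out as one chunk
  });

  doc.text('some big text');
  doc.end();
});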
I have been trying to get push notifications working using Firebase. So far I have got as far as successfully sending an empty message "tickle". The problem is that adding the message payload seems to have no effect on what the client receives; the service worker just sees it as another empty message.
I started by going through Google's guide here: https://developers.google.com/web/ilt/pwa/introduction-to-push-notifications
After explaining how to send an empty message, it says the message payload must be encrypted and suggests using an existing library to do it. To quote: "As with anything related to encryption, it's usually easier to use an actively maintained library than to write your own code."
I tried to use web-push-php, which is one of the libraries recommended by Google's guide. After having trouble with that, I discovered web-push-php doesn't actually support Firebase.
Looking on here, I find examples that look really simple and don't even encrypt the message payload; it is simply sent as plain JSON. Doing this has no effect, and the receiving end still thinks it's an empty message. See my code below.
I am at a complete loss with this, and I'm confused why Google's guide says the message data must be encrypted when there are countless examples on SO where it is just sent as plain JSON text.
This is what I am posting from my server to the endpoint:
POST https://fcm.googleapis.com/fcm/send
Authorization: key=[my server key]
Content-Type: application/json

{"priority":10,"to":"[subscriber id]","notification":{"body":"test body","title":"test title"}}
Here is my event listener in my service-worker.js
self.addEventListener('push', function(e) {
  var body;
  if (e.data) {
    body = e.data.text();
  } else {
    body = "No message " + JSON.stringify(e);
  }
  var options = {
    body: body
  };
  e.waitUntil(
    self.registration.showNotification('Launtel Residential', options)
  );
});
When I run the POST request above, the push notification occurs and triggers the service worker's 'push' event as expected, but no message data is present: e.data returns null, and the 'e' object always just contains a flag set to true (e.isTrusted == true).
I'm writing a mobile app with Firebase as my backend, also using ES to power my search. I'm completely new to ES.
Suppose each user can publish articles, each of which contains a number of tags denoting what the article is about, kind of like questions asked here. Users can search for articles by tag, and articles containing that tag will be displayed. I managed to do that with a Cloud Function, which basically looks like this:
// Assumes the request-promise library, since request() is called with .then()
const functions = require('firebase-functions');
const request = require('request-promise');

exports.articleSearch = functions.https.onRequest((req, res) => {
  const { tag } = req.query;
  const ElasticSearchConfig = {
    uri: '..<my elastic cloud url>/articles/article/_search...',
    method: 'GET',
    body: ...,
    json: true,
    auth: {
      username: '...<my elastic cloud username>...',
      password: '...<my elastic cloud password>...'
    }
  };
  // If it succeeds, send the results back to the user; if not, send the error back
  request(ElasticSearchConfig).then((results) => ...)
    .catch((error) => ...);
});
This works; however, it's a bit slow, because I'm not running Elasticsearch on users' devices but through a Cloud Function. But if I did run the above code on users' devices, note the auth property of the ElasticSearchConfig object: I'd basically be giving everybody permission to access and manipulate my ES server. How can I run the above code on users' devices and, at the same time, prevent them from reading or writing anything without proper permission?
There's no secure way to do what you're asking. Even if it were possible, you don't want that kind of processing client-side draining the battery, especially on mobile. Your slow response from Cloud Functions may be caused by a cold start: if the function hasn't been invoked in a while, the first call has to spin up a new instance.
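If the worry is per-user permissions rather than latency, a common pattern is to keep the ES credentials inside the function and gate it with Firebase Auth. A sketch using the firebase-admin SDK; the Bearer-token header convention is my assumption, not something from the question:

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

exports.articleSearch = functions.https.onRequest(async (req, res) => {
  // Expect the client to send its Firebase ID token; reject anonymous callers
  const idToken = (req.get('Authorization') || '').replace('Bearer ', '');
  try {
    await admin.auth().verifyIdToken(idToken);
  } catch (err) {
    return res.status(401).send('Unauthorized');
  }
  // ...query Elasticsearch with the server-side credentials, as in the question...
});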
How do I wait for a page to load, say for 5 seconds?
In my program, the sites run browser checks for 5 seconds before showing the content, hence I want my http.get(url) call to wait for at least 5 seconds. Without the wait, it doesn't return any content.
Thanks.
Based upon your comments, I now understand what you are trying to do and have modified this answer.
Unfortunately, there is no way to configure Meteor HTTP (which really uses request to make the service call) to "wait" after the request has been initiated before obtaining a response. The best thing for you to do is to check out PhantomJS. It is a headless browser that you can use to load and render the page and then access the dynamically generated content via JavaScript.
Check out this answer for a brief PhantomJS example; you can use the gadicohen:phantomjs package to install it for Meteor.
On a side note, the function below can still be useful for pausing execution in Meteor, though of course it does not solve what you are trying to do here.
Meteor._sleepForMs(5000); // pause execution for 5 seconds
The http package is built upon npm's request package, so you can do it with the request package directly:
var request = require('request');

var reqOptions = {
  url: path,            // the URL you want to fetch
  method: 'GET',
  encoding: 'utf8',
  jar: false,
  timeout: null,
  body: null,
  followRedirect: null,
  followAllRedirects: null,
  headers: {},
  time: true            // record timing info, exposed as response.elapsedTime
};

request(reqOptions, function(error, response, body) {
  console.log('time: ' + response.elapsedTime);
  console.log(body.toString()); // the content that http.get would return as res.content
});

require('sleep').sleep(5); // synchronously pause execution for 5 seconds
You might also be able to do the same by setting {time: true} in the npmOptions parameter of the http call and then synchronously sleeping, as shown above.
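For completeness, a sketch of that variant; note that it pauses after the response has already arrived, so like the sleep above it won't make the remote site finish its browser checks:

// Server-side Meteor sketch; HTTP comes from the meteor http package.
// npmRequestOptions is the passthrough name in recent versions of that
// package (the text above calls it npmOptions); adjust for your version.
var result = HTTP.get(url, {
  npmRequestOptions: { time: true } // have the underlying request lib record timing
});

Meteor._sleepForMs(5000);    // synchronously pause this fiber for 5 seconds
console.log(result.content); // the sleep happens after the response arrives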