I'm building an app for Shopify and need to add the GDPR webhooks. My back end is handled using Next.js, and I'm writing a webhook handler to verify them. The docs haven't been very helpful because they don't show how to do it with Node. This is my verification function:
import crypto from 'crypto';
import getRawBody from 'raw-body';
import { NextApiHandler, NextApiRequest, NextApiResponse } from 'next';

export function verifiedShopifyWebhookHandler(
  next: (req: NextApiRequest, res: NextApiResponse, body: Buffer) => Promise<void>
): NextApiHandler {
  return async (req, res) => {
    const hmacHeader = req.headers['x-shopify-hmac-sha256'];
    // Read the raw, unparsed request body; Shopify signs these exact bytes.
    const rawBody = await getRawBody(req);
    const digest = crypto
      .createHmac('sha256', process.env.SHOPIFY_API_SECRET!)
      .update(rawBody)
      .digest('base64');
    if (digest === hmacHeader) {
      return next(req, res, rawBody);
    }
    // Signature mismatch: reject the webhook.
    return res.status(401).end();
  };
}
But I get this error: InternalServerError: stream is not readable
I think it has to do with how Next.js parses incoming requests before they are sent to my API. Any ideas?
I discovered the answer. Next.js was pre-parsing the body, which consumed the request stream and meant the raw body parser had nothing left to read. By setting this:
export const config = {
  api: {
    bodyParser: false
  }
};
above the API function in the API route file, Next.js no longer parses the body, which fixes the issue. I found the answer because people had the same issue integrating Stripe and using the bodyParser.
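For anyone else piecing this together, here's a minimal sketch of the full API route with the config applied (the import path and route filename are just illustrative):

// pages/api/webhooks/customers-data-request.js
import { verifiedShopifyWebhookHandler } from '../../../lib/shopify';

// Disable Next.js body parsing so the raw request stream is still readable.
export const config = {
  api: {
    bodyParser: false
  }
};

// The handler only runs once the HMAC check has passed.
export default verifiedShopifyWebhookHandler(async (req, res, body) => {
  const payload = JSON.parse(body.toString('utf8'));
  // ...handle the GDPR payload here...
  res.status(200).end();
});

As an aside, comparing the digests with crypto.timingSafeEqual instead of === avoids leaking timing information, though the plain string comparison does work.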
I've been trying to implement just a simple Firebase fetch since November. At this point, I wish I'd just created a new Rails API; it would have been faster.
But everyone insists Firebase is Oh So Simple.
In app.js,
import firebase from 'nativescript-plugin-firebase';
That part seems OK.
Instructions are all over the place after that.
The plugin's ReadMe suggests an initialization:
firebase.init({
  // Optionally pass in properties for database, authentication and cloud messaging,
  // see their respective docs.
}).then(
  function () {
    console.log("firebase.init done");
  },
  function (error) {
    console.log("firebase.init error: " + error);
  }
);
Several others have insisted that the init code is unnecessary. It does run without errors, but the code given after it produces nothing. Also,
const db = firebase.firestore;
const UserStatusCollection = db.collection("UserStatus");
UserStatusCollection.get();
produces an empty object {}.
Here's my Firebase collection: [screenshot of the UserStatus collection in the Firebase console]
If I wrap the Firebase call in async/await (and no one shows it being this complicated),
async function getFireStoreData() {
  try {
    let result = await this.UserStatusCollection.get();
    console.log(result);
    return result;
  } catch (error) {
    console.error("UserStatusCollection.get(): " + error);
  }
}
And call that:
let temp2 = getFireStoreData();
console.log("temp2:" + temp2);
All I ever get is [object Promise].
As I said, I wish I had just built up a new Rails API and had a far simpler life since November.
Your getFireStoreData method is asynchronous and you're not awaiting it. That is probably why you're getting a promise back. Try await getFireStoreData() and see if that works.
Since it also returns a promise, you can use .then:
getFireStoreData().then(data => {
  console.log(data);
});
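To expand on that: await only works inside an async function. A small sketch (assuming the plugin's query snapshot mirrors the web Firestore API, with forEach, doc.id and doc.data()):

async function logUserStatus() {
  const snapshot = await getFireStoreData();
  // Iterate the documents rather than logging the snapshot object itself.
  snapshot.forEach(doc => {
    console.log(doc.id, "=>", JSON.stringify(doc.data()));
  });
}

logUserStatus();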
I use the simple setup from the dynamoose page.
const dynamoose = require('dynamoose');
const dynalite = require('dynalite');

const startUpAndReturnDynamo = async () => {
  const dynaliteServer = dynalite();
  await dynaliteServer.listen(8000);
  return dynaliteServer;
};

const createDynamooseInstance = () => {
  dynamoose.AWS.config.update({
    accessKeyId: 'AKID',
    secretAccessKey: 'SECRET',
    region: 'us-east-1'
  });
  dynamoose.local(); // This defaults to "http://localhost:8000"
};

const bootStrap = async () => {
  await startUpAndReturnDynamo();
  createDynamooseInstance();
};

bootStrap();
I can save data and get it back with Model.get(hashKey), but it only seems to persist for less than a minute; after that, the query returns undefined.
There is a separate TTL (time to live) setting, but since I didn't use it, my data should stay in DynamoDB permanently, right?
I found the problem.
Because I was using the remote DynamoDB, not the local one, dynamoose.local() should be changed to dynamoose.ddb().
dynamoose.local(): configures Dynamoose to use a local DynamoDB.
dynamoose.ddb(): configures and returns the AWS.DynamoDB object.
The dynamoosejs documentation is very detailed but somehow not easy for me to follow.
I'm posting the answer in case other newbies with dynamoose face the same problem.
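To make that concrete, here's a minimal sketch of the corrected setup (dynamoose v1 API; the credential env vars are placeholders for your real ones):

const dynamoose = require('dynamoose');

// Point dynamoose at the real AWS endpoint instead of a local dynalite server.
dynamoose.AWS.config.update({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  region: 'us-east-1'
});

// No dynamoose.local() call here: without it, dynamoose talks to the remote
// DynamoDB, and dynamoose.ddb() returns the underlying AWS.DynamoDB object
// if you need direct access to it.
const ddb = dynamoose.ddb();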
I would like to use PhantomJS's screen capture capability within a Firebase function to return a PDF screenshot of a URL. I'm looking for examples of someone having done this.
I've used PhantomJS in Cloud Functions without major issues. The only concern was that I had to increase the memory for the containers, since the in-memory rendering turns out to be a memory hog.
I ended up using node-webshot, which is a wrapper around PhantomJS:
"node-webshot": "^1.0.2",
Capturing the actual screenshot was simple enough:
const functions = require('firebase-functions');
const webshot = require('node-webshot');

exports.screenshotTweet = functions.https.onRequest((req, res) => {
  var stream = webshot(req.query.url);
  res.writeHead(200, {'Content-Type': 'image/jpeg'});
  stream.on('data', function(data) {
    res.write(data.toString('binary'), 'binary');
  });
  stream.on('end', function() {
    res.end();
  });
});
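One note on the memory concern mentioned above: with newer firebase-functions versions you can request a bigger container in code via runWith, so the PhantomJS rendering has room to work. A sketch of the same function with more memory:

const functions = require('firebase-functions');
const webshot = require('node-webshot');

// Give the container extra memory and time, since PhantomJS renders in memory.
exports.screenshotTweet = functions
  .runWith({ memory: '1GB', timeoutSeconds: 120 })
  .https.onRequest((req, res) => {
    const stream = webshot(req.query.url);
    res.writeHead(200, { 'Content-Type': 'image/jpeg' });
    stream.on('data', (data) => res.write(data.toString('binary'), 'binary'));
    stream.on('end', () => res.end());
  });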
TL;DR:
Does anyone know if it's possible to use console.log in a Firebase/Google Cloud Function to log entries to Stackdriver using the jsonPayload property, so my logs are searchable? (Currently anything I pass to console.log gets stringified into textPayload.)
I have a multi-module project with some code running on Firebase Cloud Functions, and some running in other environments like Google Compute Engine. Simplifying things a little, I essentially have a 'core' module, and then I deploy the 'cloud-functions' module to Cloud Functions, 'backend-service' to GCE, which all depend on 'core' etc.
I'm using bunyan for logging throughout my 'core' module, and when deployed to GCE the logger is configured using '@google-cloud/logging-bunyan' so my logs go to Stackdriver.
Aside: Using this configuration in Google Cloud Functions is causing issues with Error: Endpoint read failed which I think is due to functions not going cold and trying to reuse dead connections, but I'm not 100% sure what the real cause is.
So now I'm trying to log using console.log(arg) where arg is an object, not a string. I want this object to appear in Stack Driver under the jsonPayload but it's being stringified and put into the textPayload field.
It took me a while, but I finally came across this example in the firebase/functions-samples repository. In the end I settled on something a bit like this:
const { Logging } = require('@google-cloud/logging');

const logging = new Logging();
const log = logging.log('my-func-logger');
const logMetadata = {
  resource: {
    type: 'cloud_function',
    labels: {
      function_name: process.env.FUNCTION_NAME,
      project: process.env.GCLOUD_PROJECT,
      region: process.env.FUNCTION_REGION
    },
  },
};
const logData = { id: 1, score: 100 };
const entry = log.entry(logMetadata, logData);
log.write(entry);
You can add a string severity property to logMetadata (e.g. "INFO" or "ERROR"). Here is the list of possible values.
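For example, a metadata object with severity set might look like this:

const logMetadata = {
  severity: 'ERROR', // one of DEFAULT, DEBUG, INFO, NOTICE, WARNING, ERROR, CRITICAL, ALERT, EMERGENCY
  resource: {
    type: 'cloud_function',
    labels: { function_name: process.env.FUNCTION_NAME }
  }
};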
Update for the available Node 10 env vars. These seem to do the trick:
labels: {
  function_name: process.env.FUNCTION_TARGET,
  project: process.env.GCP_PROJECT,
  region: JSON.parse(process.env.FIREBASE_CONFIG).locationId
}
UPDATE: Looks like for Node 10 runtimes they want you to set env values explicitly during deploy. I guess there has been a grace period in place because my deployed functions are still working.
I ran into the same problem, and as stated in the comments on @wtk's answer, I would like to add a snippet that replicates all of the default Cloud Function logging behavior I could find, including execution_id.
At least when using Cloud Functions with the HTTP trigger option, the following produced correct logs for me. I have not tested it with Firebase Cloud Functions.
// global
const { Logging } = require("@google-cloud/logging");
const logging = new Logging();
const Log = logging.log("cloudfunctions.googleapis.com%2Fcloud-functions");
const LogMetadata = {
  severity: "INFO",
  resource: {
    type: "cloud_function",
    labels: {
      function_name: process.env.FUNCTION_NAME,
      project: process.env.GCLOUD_PROJECT,
      region: process.env.FUNCTION_REGION
    }
  }
};

// per request
const data = { foo: "bar" };
const traceId = req.get("x-cloud-trace-context").split("/")[0];
const metadata = {
  ...LogMetadata,
  trace: `projects/${process.env.GCLOUD_PROJECT}/traces/${traceId}`,
  labels: {
    execution_id: req.get("function-execution-id")
  }
};
Log.write(Log.entry(metadata, data));
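For context, here's a sketch of how the per-request part might sit inside an HTTP function (the function name is mine; Log and LogMetadata are the globals from above):

const functions = require("firebase-functions");

exports.logDemo = functions.https.onRequest((req, res) => {
  const traceId = req.get("x-cloud-trace-context").split("/")[0];
  const metadata = {
    ...LogMetadata,
    trace: `projects/${process.env.GCLOUD_PROJECT}/traces/${traceId}`,
    labels: { execution_id: req.get("function-execution-id") }
  };
  // Writes a structured entry that lands in jsonPayload, correlated by trace id.
  Log.write(Log.entry(metadata, { route: req.path, score: 100 }));
  res.status(200).send("logged");
});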
The GitHub link in @wtk's answer should be updated to:
https://github.com/firebase/functions-samples/blob/2f678fb933e416fed9be93e290ae79f5ea463a2b/stripe/functions/index.js#L103
As it refers to the repository as of when the question was answered, and has the following function in it:
// To keep on top of errors, we should raise a verbose error report with Stackdriver rather
// than simply relying on console.error. This will calculate users affected + send you email
// alerts, if you've opted into receiving them.
// [START reporterror]
function reportError(err, context = {}) {
  // `logging` is initialized earlier in the sample:
  // const logging = require('@google-cloud/logging')();

  // This is the name of the StackDriver log stream that will receive the log
  // entry. This name can be any valid log stream name, but must contain "err"
  // in order for the error to be picked up by StackDriver Error Reporting.
  const logName = 'errors';
  const log = logging.log(logName);

  // https://cloud.google.com/logging/docs/api/ref_v2beta1/rest/v2beta1/MonitoredResource
  const metadata = {
    resource: {
      type: 'cloud_function',
      labels: {function_name: process.env.FUNCTION_NAME},
    },
  };

  // https://cloud.google.com/error-reporting/reference/rest/v1beta1/ErrorEvent
  const errorEvent = {
    message: err.stack,
    serviceContext: {
      service: process.env.FUNCTION_NAME,
      resourceType: 'cloud_function',
    },
    context: context,
  };

  // Write the error log entry
  return new Promise((resolve, reject) => {
    log.write(log.entry(metadata, errorEvent), (error) => {
      if (error) {
        return reject(error);
      }
      resolve();
    });
  });
}
// [END reporterror]
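For completeness, the sample calls reportError from catch blocks; a usage sketch (the function name and context values here are illustrative):

exports.exampleFunction = functions.https.onRequest(async (req, res) => {
  try {
    // ...do the actual work here...
    res.status(200).send('ok');
  } catch (error) {
    // Report to Stackdriver Error Reporting, then fail the request.
    await reportError(error, { httpRequest: { url: req.url } });
    res.status(500).send('error');
  }
});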