Using the WP Crontrol plugin, I schedule a process that sends reminder emails to users. It works well, but every time I need to test something with real data, I am afraid the system will send out reminders that should not be sent, or that the live system has already sent.
After restoring a backup from the production server, I immediately go to the SMTP plugin I use and select the option that discards outgoing emails. That does the job, but there is still a risk that something gets sent before I manage to do that.
So I am considering my options. One is to wrap the reminder function in a check for whether the site is the production server, and only run the function when it is.
I could check using home_url(), and I know it will work because I use this approach for something else.
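For reference, a home_url() check along those lines could be as simple as this sketch (the production URL below is a placeholder, not my real domain):

// Placeholder production URL: replace with the real one.
if ( home_url() === 'https://www.example.com' ) {
    // This is the production site, so it is safe to send reminders.
}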
But I feel there is a better and more correct way, so I would kindly ask for advice.
I usually use this approach in my projects to separate code that should only run in a given environment. First I create a constant named WP_ENVIRONMENT in wp-config.php and assign it the value development, and then I detect the execution environment with two helper functions:
function prefix_is_development() {
    return defined("WP_ENVIRONMENT") && "development" === strtolower(WP_ENVIRONMENT);
}

function prefix_is_production() {
    return !defined("WP_ENVIRONMENT") || "production" === strtolower(WP_ENVIRONMENT);
}
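Applied to your reminder cron job, a minimal sketch could look like the following (the hook and callback names are placeholders, not your actual code):

// Placeholder hook and callback names: substitute the ones WP Crontrol runs.
add_action( 'prefix_send_reminders', 'prefix_send_reminder_emails' );

function prefix_send_reminder_emails() {
    // Bail out on any environment that is not explicitly production,
    // so a restored backup can never send live reminders.
    if ( ! prefix_is_production() ) {
        return;
    }

    // ... build and send the reminder emails here ...
}

Note that with these helpers, a site that forgets to define WP_ENVIRONMENT counts as production, so remember to set the constant as soon as the backup is restored.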
Today Firebase released its brand new product, Cloud Functions for Firebase, and I just created a hello-world function and deployed it to my existing Firebase project.
It looks like it bundles all dependencies and uploads them to Firebase, much like an AWS Lambda function. But this takes too long even for minor code changes, and it needs a good internet connection. If you are offline for some reason, you are left in the dark about the code you are writing until you have a way to execute and test the functions offline, on your local machine.
Is there any way to test Cloud Functions for Firebase locally?
firebaser here
Deployment of your Functions indeed takes more time than I'm normally willing to wait. We're working hard to improve that and (as Brendan said) are working on a local emulator.
But for the moment, I mostly write my actual business logic into a separate Node script first. That way I can test it from a local command prompt with node speech.js. Once I'm satisfied that the function works, I either copy/paste it into my actual Functions file or (better) import the speech module into my functions file and invoke it from there.
One abbreviated example that I quickly dug up is when I was wiring up text extraction using the Cloud Vision API. I have a file called ocr.js that contains:
var fetch = require('node-fetch');

function extract_text(url, gcloud_authorization) {
  console.log('extract_text from image ' + url + ' with authorization ' + gcloud_authorization);

  return fetch(url).then(function(res) {
    return res.buffer();
  }).then(function(buffer) {
    return fetch('https://vision.googleapis.com/v1/images:annotate?key=' + gcloud_authorization, {
      method: "POST",
      headers: {
        "Content-Type": "application/json"
      },
      body: JSON.stringify({
        "requests": [
          {
            "image": {
              "content": buffer.toString('base64')
            },
            "features": [
              {
                "type": "TEXT_DETECTION",
                "maxResults": 1
              }
            ]
          }
        ]
      })
    });
  }).then(function(res) {
    var json = res.json();
    if (res.status >= 200 && res.status < 300) {
      return json;
    } else {
      return json.then(Promise.reject.bind(Promise));
    }
  }).then(function(json) {
    if (json.responses && json.responses.length && json.responses[0].error) {
      return Promise.reject(json.responses[0].error);
    }
    return json.responses[0].textAnnotations[0].description;
  });
}

if (process.argv.length > 2) {
  // by passing the image URL and gcloud access token, you can test this module
  process.argv.forEach(a => console.log(a));
  extract_text(
    process.argv[2], // image URL
    process.argv[3]  // gcloud access token or API key
  ).then(function(description) {
    console.log(description);
  }).catch(function(error) {
    console.error(error);
  });
}

exports.extract_text = extract_text;
And then in my Functions index.js, I have:
var functions = require('firebase-functions');
var fetch = require('node-fetch');
var ocr = require('./ocr.js');
exports.ocr = functions.database().path('/messages/{room}/{id}').onWrite(function(event) {
  console.log('OCR triggered for /messages/' + event.params.room + '/' + event.params.id);

  if (!event.data || !event.data.exists()) return;
  if (event.data.ocr) return;
  if (event.data.val().text.indexOf("https://firebasestorage.googleapis.com/") !== 0) return; // only OCR images

  console.log(JSON.stringify(functions.env));

  return ocr.extract_text(event.data.val().text, functions.env.googlecloud.apikey).then(function(text) {
    return event.data.adminRef.update({ ocr: text });
  });
});
So as you can see this last file is really just about wiring up the "worker method" ocr.extract_text to the database location.
Note this is a project from a while ago, so some of the syntax (mostly the functions.env part) might have changed a bit.
firebaser here
To debug your Cloud Functions for Firebase locally, there is an emulator. See the documentation for more info.
run and debug/inspect functions locally
prerequisites (Google Cloud Functions and Firebase-specific):
npm install -g @google-cloud/functions-emulator
npm install --save firebase-functions
npm install -g firebase-tools
To run and inspect/debug: first run the functions locally, then inspect each function, and finally call the specific function you want to debug and inspect. You can use functions start as an alternative to firebase serve, and note that the documentation for each tool is available (and useful).
To run and debug the specific function myFn as expected (e.g. in Node.js via chrome://inspect; note this works with Node.js v10, even though v10 is not officially supported):
firebase serve --only functions
functions inspect myFn
functions call myFn # or call from browser
additional documentation:
https://firebase.google.com/docs/functions/local-emulator
https://cloud.google.com/functions/docs/emulator#debug-emulator
https://github.com/GoogleCloudPlatform/cloud-functions-emulator/wiki
>> Is there any way to test Cloud Functions for Firebase locally?
You can use the following command to start a firebase shell (execute in your functions directory):
npm run build && firebase functions:shell
You can invoke your functions in the shell like so:
helloWorld()
Refer to this link for more information.
Answered here: https://github.com/firebase/firebase-functions/issues/4#issuecomment-286515989
Google Cloud Functions also open-sourced a local emulator, and we are working to build a tighter integration with Cloud Functions for Firebase. In the meantime, you can check it out here:
https://github.com/GoogleCloudPlatform/cloud-functions-emulator/
The emulator does allow you to run functions locally. Here's the
documentation that explains how to use it:
https://cloud.google.com/functions/docs/emulator
I couldn't get single-stepping working at first. My process was the same as documented in many answers here.
Also, these pages contain nearly all the documentation I required:
https://firebase.google.com/docs/functions/local-emulator
https://cloud.google.com/functions/docs/emulator#debugging_with_the_emulator
I had gotten the functions running using firebase serve --only functions, but hadn't gotten the debugger up and running. Then I came across the other way of using the emulator directly and managed to hit a breakpoint like this:
# start the emulator
functions start
# allow inspection
functions inspect helloWorld
# call the function from the cli
functions call helloWorld
This worked, and I could hit a breakpoint.
However, when hitting the endpoint for the function in Postman or the browser, I got no response at all.
The step I was missing was:
# deploy the function to the emulator
functions deploy helloWorld --trigger-http
# you need to toggle inspection after the deploy
functions inspect helloWorld
Now I can hit the endpoint for the function from Postman or the browser, and the breakpoint is hit.
I recommend the brilliant NiM Chrome extension for debugging, and I hope this answer helps someone, even if this is an old question.
First, I suggest you install the following dependencies:
npm install --save firebase-functions
npm install -g firebase-tools
If they are already installed, you can update them to the latest versions. Generally, the functions emulator comes with the above dependencies, but I would still recommend updating it:
npm install -g @google-cloud/functions-emulator
Once it has been updated, go to the functions folder of your application and run the following command:
firebase serve --only functions
I hope it helps!
For VS Code users debugging HTTP functions (webhooks, etc.)...
The Google Cloud emulator (firebase serve --only functions) launches a separate process to run your functions. You can attach to this process with VS Code, but since the emulator only creates this process after the first function is called, it's not straightforward.
Create a dummy HTTP endpoint in your functions that returns the process ID:
app.get("/processid", function(request, response) {
response.send(`${process.pid}`);
});
Start the emulator with firebase serve --only functions.
Call the http://<localhost_url>/processid endpoint. This will create the process and return the process ID.
Use VS Code to attach to that process, as sketched below. You can now set breakpoints, step, etc. on any of the other functions (they all run in the same process).
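A minimal .vscode/launch.json attach configuration for this (the name is arbitrary, and ${command:PickProcess} lets you pick or type the process ID returned by the endpoint) might look like:

{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "node",
      "request": "attach",
      "name": "Attach to functions emulator",
      "processId": "${command:PickProcess}"
    }
  ]
}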
There's probably a nicer way to glue all this together.
There is now a Cloud Functions emulator that lets you call functions locally.
Once I have completed my PoC, I will update this answer to include the code and steps I used.
I am trying to run certain integration test cases using Nightwatch.js and Sauce Labs. Currently, a new browser window opens for each test case, and because of this the test cases take a long time.
I need to run the test cases in the same browser window and still have the result for each test case displayed on Sauce Labs.
Below is code similar to what I need to run.
module.exports = {
  beforeEach: (browser, done) => {
    // login
  },

  'Test-1': browser => {
    browser
      .page
      .testPage()
      .navigate()
      .end();
  },

  'Test-2': browser => {
    browser
      .page
      .testPage()
      .navigate()
      .end();
  },

  afterEach: (browser, done) => {
    // logout
  },
};
If I remove .end() from Test-1, Sauce Labs runs the tests in one browser, but only shows a test result with the name Test-2.
Nightwatch handles the starting and stopping of webdriver automatically by default, but you can disable this and manage it yourself by setting the webdriver start_process configuration option to false in your nightwatch.json file:
"webdriver" : {
"start_process": false
}
Nightwatch reference documentation: https://nightwatchjs.org/gettingstarted#configuration
However, you probably shouldn't do that.
First of all, it means more work for you: you will have to manage starting and stopping sessions yourself in each test spec, including passing in the configuration.
Secondly, sessions in Sauce Labs are meant to be started and stopped for each test. Sauce will only recognize one completed test, passed or failed, per session.
Finally, it's not good practice to lump tests together in a single session: your browser may be in an unexpected state, with different cookies and cache, and your app may have lingering settings from a previous test. By creating a new session in Sauce Labs, you ensure that everything is "clean" and that you can reproduce any scenario exactly.
I understand the desire to try to save time because it takes longer to start each session individually, as well as to go through whatever preparatory steps are needed to get to a certain state within your application. But you'd be better off being able to configure that state by calling an API, setting a cookie, or whatever it takes rather than having your browser in an unknown state.
This is why Nightwatch restarts the browser between tests automatically.
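If it helps, here is a rough sketch of seeding state in a beforeEach hook instead of clicking through a login flow in every test (the URL and cookie values are placeholders, and whether a cookie is enough depends on how your app stores its session):

module.exports = {
  beforeEach: browser => {
    // Navigate to the app first so the cookie can be set for its domain,
    // then seed whatever session/state the app expects.
    browser
      .url('https://example.com')
      .setCookie({ name: 'session_token', value: 'test-fixture-token' });
  },
};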
As a result of several hours of unfruitful searches, I am posting this question.
I suppose it is a duplicate of this one:
How do you run RServe on AWS Lambda with NodeJS?
But since it seems that the author of that question did not accomplish his/her goal successfully, I am going to try again.
What I currently have:
A Node.js server that invokes an R script through Rserve and passes it data to evaluate via node-rio.
The function responsible for that looks like this:
const R = (arg1, arg2) => {
  return new Promise((resolve, reject) => {
    const args = {
      arg1, arg2
    };

    // send data to Rserve to evaluate
    rio.$e({
      filename: path.resolve('./r-scripts/street.R'),
      entrypoint: 'run',
      data: args,
    })
      .then((data) => {
        resolve(JSON.parse(data));
      })
      .catch((err) => {
        reject(`err: ${err}`);
      });
  });
};
And this works just fine. I am sending data over to my R instance and getting results back into my server.
What I am ultimately trying to achieve:
Every request seems to spawn its own R workspace, which has a considerable memory overhead. Thus, serving even hundreds of concurrent requests with this approach is impossible, as my AWS EC2 instance runs out of memory pretty quickly.
So, I am looking for a way to offload all the memory-intensive parts to AWS Lambda and thus get rid of the memory overhead.
I guess the specific question in my case is whether there is a way to package R and Rserve together with a Node.js Lambda function, or whether I should accept that this approach won't work on Lambda and look for an alternative.
Note: I cannot use anything other than R, since these are external R scripts that I have to invoke from my server.
Thanks in advance!
I have a PHP script that does what the accepted answer described here does.
It doesn't work unless I add the following before fclose($fp):
while (!feof($fp)) {
    $httpResponse .= fgets($fp, 128);
}
Even an empty loop would do the job instead of the above!
But what's the point? I wanted async calls :(
To add to my pain, the same code runs fine without the above snippet in an Apache-driven environment.
Does anybody know whether Nginx or PHP-FPM has a problem with such requests?
What you're looking for can only be done on Linux-flavor systems with a PHP build that includes the Process Control functions (the PCNTL extension).
You'll find its documentation here:
http://php.net/manual/en/book.pcntl.php
Specifically what you want to do is "fork" a process. This creates an identical copy of the current PHP script's process including all memory references and then allows both scripts to continue executing simultaneously.
The "parent" script is aware that it is still the primary script. And the "child" script (or scripts, you can do this as many times as you want) is aware that is is a child. This allows you to choose a different action for the parent and the child once the child is spun off and turned into a daemon.
To do this, you'd use something along these lines:
$pid = pcntl_fork(); // store the process ID of the child when the script forks

if ($pid == -1) {
    die('could not fork'); // -1 return value means the process could not fork properly
} else if ($pid) {
    // a process ID will only be set in the parent script. this is the main script that can output to the user's browser
} else {
    // this is the child script executing. Any output from this script will NOT reach the user's browser
}
That will enable a script to spin off a child process that can continue executing alongside (or long after) the parent script outputs its content and exits.
You should keep in mind that these functions must be compiled into your PHP build, and that the vast majority of hosting companies will not allow access to them on their servers. To use these functions, you generally need a Virtual Private Server (VPS) or a dedicated server. Not even cloud hosting setups will usually offer these functions because, if used incorrectly (or maliciously), they can easily bring a server to its knees.