Is there a way to run an npm script on every Meteor server restart?
I tried the postinstall hook but it runs only at the first local application start.
I assume that there must be a way, because the restart triggers several build processes and some of them must be "hookable".
I was first thinking of using build plugins, but it seems that they would move a lot of config away from my package.json.
Does anyone know something about this?
You can run your npm script within your Meteor.startup() code on the server side. The following example, which should be located under the /server folder, might help.
import { Meteor } from 'meteor/meteor';
import { exec } from 'child_process';

Meteor.startup(() => {
  // Wrap exec in a promise so the command can be awaited
  async function sh(cmd) {
    return new Promise(function (resolve, reject) {
      exec(cmd, (err, stdout, stderr) => {
        if (err) {
          reject(err);
        } else {
          resolve({ stdout, stderr });
        }
      });
    });
  }

  async function excScript() {
    let { stdout } = await sh('npm ls'); // runs "npm ls" as an example
    for (let line of stdout.split('\n')) {
      console.log(`npm ls: ${line}`);
    }
  }

  excScript();
});
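To run one of your own package.json scripts instead of the npm ls example, change the command passed to sh; build-assets below is just a placeholder script name:

let { stdout } = await sh('npm run build-assets'); // hypothetical script defined in package.json

Since Meteor restarts the server process on every rebuild, the Meteor.startup() callback, and with it the script, runs again on each restart.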
I am currently trying to create a temp file from /api/sendEmail.js with fs.mkdirSync
fs.mkdirSync(path.join(__dirname, "../../public"));
but on Vercel (where my app is running) all folders are read-only and I can't create any temp files.
Error:
Error: EROFS: read-only file system, mkdir '/var/task/.next/server/public'
As far as I can see, there are some questions about this but no clear answer. Have any of you managed to do this?
Vercel allows creating files in the /tmp directory. However, there are limitations with this: https://github.com/vercel/vercel/discussions/5320
An example of an /api function that writes and reads files is:
import fs from 'fs';

export default async function handler(req, res) {
  const {
    method,
    body,
  } = req
  try {
    switch (method) {
      case 'GET': {
        // read
        // This line opens the file as a readable stream
        const readStream = fs.createReadStream('/tmp/text.txt')
        // This will wait until we know the readable stream is actually valid before piping
        readStream.on('open', function () {
          // This just pipes the read stream to the response object (which goes to the client)
          readStream.pipe(res)
        })
        // This catches any errors that happen while creating the readable stream (usually invalid names)
        readStream.on('error', function (err) {
          res.end(err.message)
        })
        return
      }
      case 'POST':
        // write (note: /tmp is the only writable location)
        fs.writeFileSync('/tmp/text.txt', JSON.stringify(body))
        break
      default:
        res.setHeader('Allow', ['GET', 'POST'])
        res.status(405).end(`Method ${method} Not Allowed`)
    }
    // send result
    return res.status(200).json({ message: 'Success' })
  } catch (error) {
    return res.status(500).json(error)
  }
}
Also see: https://vercel.com/support/articles/how-can-i-use-files-in-serverless-functions
I am hoping to update the Firebase functions config (env variables) programmatically, rather than manually typing firebase functions:config:set. I want to automate this process depending on the returned value from a POST call inside the function, and possibly invoke Cloud Run to achieve this.
I've tried the below, but it only worked in my local environment.
const functions = require("firebase-functions");
const { exec } = require("child_process");

exports = module.exports = functions.https.onRequest(async (req, res) => {
  try {
    exec(`firebase functions:config:set hello.world="hhh"`, (error, stdout, stderr) => {
      if (error) {
        console.log(`error: ${error.message}`);
        return;
      }
      if (stderr) {
        console.log(`stderr: ${stderr}`);
        return;
      }
      console.log(`stdout: ${stdout}`);
    });
    res.status(200).send({ message: 'success' });
  }
  catch (e) {
    console.log('e :>> ', e);
    res.status(400).send({ status: res.statusCode, message: 'aborted' });
  }
})
If there's a way to achieve this, I'd like to know.
Also wondering if there's a way to achieve this with Cloud Run.
Thank you.
There's no way to update the environment while a function is running. It requires a full redeploy from the CLI.
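For reference, a minimal sketch of the CLI workflow that implies, run outside the function (locally or in CI); hello.world is just the example key from the question:

firebase functions:config:set hello.world="hhh"
firebase deploy --only functions

The new value is only visible to functions.config() after the redeploy completes.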
As of June 2022, this is actually possible, using the firebase-functions-test package.
For more information: https://firebase.google.com/docs/functions/unit-testing#mocking_config_values
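A minimal sketch of what that package offers: per the linked docs, it mocks functions.config() values for unit tests (the key names below are just placeholders):

// firebase-functions-test can stub functions.config() values in tests
const test = require('firebase-functions-test')();

test.mockConfig({ hello: { world: 'hhh' } });

// code under test that reads functions.config().hello.world
// will now see the mocked value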
After creating an index.ts and writing some simple code that listens on port 3000 and prints hello world in the body, I'm also not able to run it or get any output using the drun module.
import { Application, Router } from "https://deno.land/x/denotrain@v0.5.0/mod.ts";

const app = new Application();
const router = new Router();

// Middleware
app.use((ctx) => {
  ctx.cookies["user.session"] = "qwertz";
  ctx.cookies["a"] = "123";
  ctx.cookies["b"] = "456";
  delete ctx.cookies["user.session"];
  return;
});

router.get("/", (ctx) => {
  return new Promise((resolve) => resolve("This is the admin interface!"));
});

router.get("/edit", async (ctx) => {
  return "This is an edit mode!";
});

app.get("/", (ctx) => {
  return {"hello": "world"};
});

app.use("/admin", router);

app.get("/:id", (ctx) => {
  // Use url parameters
  return "Hello World with ID: " + ctx.req.params.id
});

return ctx.req.body;
});

await app.run()
Development Environment: Windows 10
The problem seems to be that the address 0.0.0.0 is specific to Mac only. Windows doesn't use the 0.0.0.0 address.
After going to localhost:3000 / 127.0.0.1:3000, I was able to get the output. I think maybe Windows redirects 0.0.0.0 to localhost. Anyway, it solved my problem!
I am on Windows. I faced the same problem. Then,
const app = new Application({hostname:"127.0.0.1"});
I created the app in TypeScript, giving the hostname parameter as above.
And ran deno like this:
deno run --allow-net=127.0.0.1 index.ts
It worked.
Run your server with the following command:
drun watch --entryPoint=./server.ts --runtimeOptions=--allow-net
In any case, most Deno tools for watching changes are still buggy; I recommend using nodemon with the --exec flag:
nodemon --exec deno run --allow-net server.ts
For convenience you can use nodemon.json with the following content:
{
  "execMap": {
    "js": "deno run --allow-net",
    "ts": "deno run --allow-net"
  },
  "ext": "js,json,ts"
}
And now just use: nodemon server.ts
It seems that you have an error in your code snippet, with the last
return ctx.req.body;
});
If you fix that and use the latest deno (v1.0.1) and drun (v1.1.0) versions, it should work with the following command:
drun --entryPoint=index.ts --runtimeOptions=--allow-net
I want to run a command after a task finishes in Grunt.
uglify: {
  compile: {
    options: {...},
    files: {...}
  },
  ?onFinish?: {
    cmd: 'echo done!',
    // or even just a console.log
    run: function(){
      console.log('done!');
    }
  }
},
Either run a command in shell, or even just be able to console.log. Is this possible?
Grunt does not support before and after callbacks, but a future version could implement events that would work in the same way, as discussed in issue #542.
For now, you should go the task composition way, that is, create tasks for those before and after actions and group them under a new name:
grunt.registerTask('newuglify', ['before:uglify', 'uglify', 'after:uglify']);
Then remember to run newuglify instead of uglify.
Another option is not to group them but remember to add the before and after tasks individually to a queue containing uglify:
grunt.registerTask('default', ['randomtask1', 'before:uglify', 'uglify', 'after:uglify', 'randomtask2']);
For running commands you can use plugins like grunt-exec or grunt-shell.
If you only want to print something, try grunt.log.
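One way to read the before:uglify and after:uglify entries above is as a single before (and after) task that takes the wrapped task name as an argument. A minimal sketch under that assumption, using grunt.log for output (the messages are placeholders):

grunt.registerTask('before', function (target) {
  grunt.log.writeln('about to run ' + target + '...');
});

grunt.registerTask('after', function (target) {
  grunt.log.writeln(target + ' finished!');
  // to run a real command here instead, configure grunt-shell or grunt-exec
  // and queue it, e.g. grunt.task.run(['shell:done']) with a hypothetical shell:done target
});

grunt.registerTask('newuglify', ['before:uglify', 'uglify', 'after:uglify']);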
Grunt has some of the most horrible code I've ever seen. I don't know why it is popular. I would never use it, even as a joke. This is not related to a "legacy code" problem; it is defective by design from the beginning.
var old_runTaskFn = grunt.task.runTaskFn;
grunt.task.runTaskFn = function(context, fn, done, asyncDone) {
  var callback;
  var promise = new Promise(function (resolve, reject) {
    callback = function (err, success) {
      if (success) {
        resolve();
      } else {
        reject(err);
      }
      return done.apply(this, arguments);
    };
  });
  // "something" is a placeholder for whatever event emitter you use to observe tasks
  something.trigger("new task", context.name, context.nameArgs, promise);
  return old_runTaskFn.call(this, context, fn, callback, asyncDone);
}
You can use a plain callback function instead of the promise + trigger combination; the wrapper then invokes your function for each new task.
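A minimal sketch of that callback-based variant, assuming the same grunt.task.runTaskFn internals as in the snippet above (onTaskDone is a hypothetical hook you would supply):

var old_runTaskFn = grunt.task.runTaskFn;
grunt.task.runTaskFn = function (context, fn, done, asyncDone) {
  var wrappedDone = function (err, success) {
    // hypothetical hook invoked once the task has finished
    onTaskDone(context.name, err, success);
    return done.apply(this, arguments);
  };
  return old_runTaskFn.call(this, context, fn, wrappedDone, asyncDone);
};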
I am either blanking out, or it is more complex than it should be.
I am trying to run grunt-init from a Grunt task, something like this:
grunt.registerTask('init', 'Scaffold various artifacts', function(param) {
  // analyze `param` and pass execution to `grunt-init`
  // run `grunt-init path/to/some/template/based/on/param/value`
});
The analysis of the param is, of course, not the issue; it's running grunt-init that is.
Running grunt-init directly in the same folder as the attempts below works fine.
I've tried the following methods (the path to the template is inlined for brevity), all to no avail:
grunt-shell
shell: {
  init: {
    options: {
      stdout: true,
      callback: function(err, stdout, stderr, cb) {
        ...
      }
    },
    command: 'grunt-init path/to/some/template/based/on/param/value'
  }
}
and then:
grunt.registerTask('init', 'Scaffold various artifacts', function(param) {
  grunt.task.run(['shell:init']);
});
and in command line:
grunt init
or from command line directly:
grunt shell:init
grunt-exec
exec: {
  init: {
    cmd: 'grunt-init path/to/some/template/based/on/param/value',
    callback: function() {
      ...
    }
  }
}
and then:
grunt.registerTask('init', 'Scaffold various artifacts', function(param) {
  grunt.task.run(['exec:init']);
});
and in command line:
grunt init
or from command line directly:
grunt exec:init
Node's exec
grunt.registerTask('init', 'Scaffold various artifacts', function(param) {
  var exec = require('child_process').exec;
  exec('grunt-init path/to/some/template/based/on/param/value', function(err, stdout, stderr) {
    ...
  });
});
and in command line:
grunt init
Nothing.
There were various attempts, the best of which would print the first line of the grunt-init prompt:
Running "init" task
And that's it.
What am I missing? Should I have connected the stdout somehow?
Create a child process with grunt.util.spawn. You can make it asynchronous and set stdio to 'inherit' so that any template prompts can be answered. Also, you should set a cwd or else it will try to overwrite your existing Gruntfile.js!
grunt.registerTask('init', 'Scaffold various artifacts', function(grunt_init_template) {
  var done = this.async();
  grunt.util.spawn({
    cmd: 'grunt-init',
    args: [grunt_init_template],
    opts: {
      stdio: 'inherit',
      cwd: 'new_project_dir',
    }
  }, function (err, result, code) {
    done();
  });
});
I think I found a way, but it feels hackish to me. I am going to answer this, but please share yours too.
It can be done using grunt-parallel, where the task is defined using:
parallel: {
  init: {
    options: {
      stream: true
    },
    tasks: [
      {cmd: 'grunt-init'}
    ]
  }
}
and the init task is:
grunt.registerTask('init', 'Scaffold various artifacts', function(param) {
  // calculate path based on `param`
  ...
  grunt.config.set('parallel.init.tasks.0.args', ['path/to/some/template/based/on/param/value']);
  grunt.task.run(['parallel:init']);
});
then, running the following on the command line:
grunt init:<some param indicating template type or similar>
properly runs grunt-init.