I'm running an example file server with a simple index.html file. I want the script to re-run when changes are made within the directory. How can I do that?
deno run --allow-net --allow-read --watch https://deno.land/std@0.157.0/http/file_server.ts ./
You can provide one or more path values for the watch argument when using deno run in order to watch additional files outside the module graph. For example, use
deno run --watch=. module.ts
to watch all files recursively in the current working directory.
You can use the deno help command to get information about the command you want to use (in this case run). This is how I answered your question:
% deno --version
deno 1.26.2 (release, x86_64-apple-darwin)
v8 10.7.193.16
typescript 4.8.3
% deno help run
---snip---
USAGE:
    deno run [OPTIONS] <SCRIPT_ARG>...

ARGS:
    <SCRIPT_ARG>...
            Script arg

OPTIONS:
---snip---
        --watch[=<FILES>...]
            Watch for file changes and restart process automatically.
            Local files from entry point module graph are watched by default.
            Additional paths might be watched by passing them as arguments to
            this flag.
However in the case of the static file server module that you asked about, there's no real benefit to reloading the server process as it just serves static files: any time you request a static file, you always get the latest version.
Perhaps you're looking for "hot/live reload" behavior in the browser client. This is a different pattern: a coordinated effort between the JavaScript in the page and the server — and that’s not something that’s supported by the module you asked about.
Related
My original motivation was to run a Deno script from Crontab on Ubuntu.
At first, I did not know that paths are relative to the working directory, not the executing module.
My script was reading and writing files to a disk, so I got errors like
error: Uncaught NotFound: No such file or directory (os error 2)
It was pointed out to me that this problem can be solved with import.meta.url.
I modified the path to resolve it from import.meta.url and this solution worked fine with read/write operations.
But I encountered another problem with the .env file.
It was a surprise to me that even the dotenv module uses paths relative to the working directory.
The dotenv module has the option to specify the path with config({path:___}), but overriding the default location seems like too much.
Eventually, changing the working directory to the script's root directory before running the script in crontab was a simpler solution.
* * * * * cd ____; deno run ___
But I still have doubts about whether this is the best approach.
Is there something better than changing the directory in such cases?
It would be nice to have a mode when running deno that would make paths relative to the executing module, excluding modules imported via URLs.
I think you're looking for Deno.mainModule, which is a reference to the file URL of the entrypoint module you passed to deno. You can use it with the deno.land/std/path module to get the directory of the entrypoint for your program, and then use Deno.chdir() to change your current working directory so that all relative paths (which are implicitly relative to Deno.cwd()) are then relative to that directory.
/repo/relative-path.ts:
import * as path from 'https://deno.land/std@0.102.0/path/mod.ts';
export {path};
export const mainModuleDir = path.dirname(path.fromFileUrl(Deno.mainModule));
/repo/main.ts:
import {mainModuleDir, path} from './relative-path.ts';
Deno.chdir(mainModuleDir);
const entrypointRelativePath = path.resolve('hello', 'world.json');
console.log(entrypointRelativePath);
Then, run your script:
$ cd /different/unrelated/path
$ deno run --allow-read /repo/main.ts
/repo/hello/world.json
You can use mainModuleDir as a base for any entrypoint-relative paths you need.
I am curious if you can control the output "src" folder in AWS CodeBuild.
Specifically, I see this when debugging the build in CodeBuild.
/codebuild/output/src473482839/src/github.....
I would love to be able to set/change/remove the src473482839 part of that path. I have a feeling it is causing sbt to recompile my Scala source files: although I am using CodeBuild's new local cache to cache my target folders between builds, the compiled classes' canonical paths change between builds, which is what I suspect is causing the problem.
After some more debugging I have managed to get my 6-minute builds down to 1:30.
Although you are not able to set or override CODEBUILD_SRC_DIR, I have found a workaround in my buildspec.
This is what my buildspec looks like now, with local caching enabled in CodeBuild.
version: 0.2
phases:
  pre_build:
    commands:
      - mkdir -p /my/build/folder/
      - cp -a ${CODEBUILD_SRC_DIR}/. /my/build/folder
  build:
    commands:
      - cd /my/build/folder
      - sbt compile test
cache:
  paths:
    - '/root/.ivy2/cache/**/*'
    - '/root/.cache/**/*'
    - 'target/**/*'
    - 'any other target folders you may need'
The key change I had to make was to copy over the source (including the cached target directories) in the pre_build phase, then change directory and compile from the new, static directory.
I hope this helps someone else down the road until CodeBuild allows a person to set or override the CODEBUILD_SRC_DIR folder.
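The cp -a ${CODEBUILD_SRC_DIR}/. /my/build/folder form is the detail worth noting: the trailing /. copies the directory's contents (including dotfiles) into the target rather than nesting a subdirectory, and -a preserves permissions and timestamps. A minimal sketch, with hypothetical temp directories standing in for CODEBUILD_SRC_DIR and the fixed build folder:

```shell
# Hypothetical stand-ins for CODEBUILD_SRC_DIR and the fixed build folder.
src=$(mktemp -d)
dest=$(mktemp -d)

mkdir -p "$src/project"
echo 'object Main' > "$src/project/Main.scala"
touch "$src/.sbtopts"          # dotfiles are copied too

# -a preserves permissions and timestamps; "src/." copies the contents,
# not the directory itself, so no extra nesting appears under dest.
cp -a "$src/." "$dest"

ls -A "$dest"                  # lists both .sbtopts and project
```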
In Question 2918898, users discussed how to avoid caching because
modules were changing, and solutions focused on reloading. My question is
somewhat different; I want to avoid caching in the first place.
My application runs on Un*x and lives in /usr/local. It imports a
module with some shared code used by this application and another.
It's normally run as an ordinary user, and Python doesn't cache the
module in that case, because it doesn't have write permission for that
system directory. All good so far.
However, I sometimes need to run the application as superuser, and then
it does have write permission and it does cache it, leaving unsightly
footprints in a system directory. Do not want.
So ... any way to tell CPython 3.2 (or later, I'm willing to upgrade)
not to cache the module? Or some other way to solve the problem?
Changing the directory permissions doesn't work; root can still write,
root is all-powerful.
I looked through PEP 3147 but didn't see a way to prevent caching.
I don't recall any way to import code other than import. I suppose I
could read a simple text file and exec it, but that seems inelegant
and bug-prone.
The run-as-root is accomplished by calling the program with sudo in a
shell script, and I can have the shell script delete the cache after the
run, but I'm hoping for something more elegant that doesn't change the
directory's last-modified timestamp.
Implemented solution, based on Wander Nauta's answer:
Since I run the executable as a plain filename, not as python executablename, I went with the environment variable. First, the
sudoers file needs to be changed to allow setting environment
variables:
tom ALL=(ALL) SETENV: NOPASSWD: /usr/local/bkup/bin/mkbkup
Then, the invocation needs to include the variable:
/usr/bin/sudo PYTHONDONTWRITEBYTECODE=true /usr/local/bkup/bin/mkbkup "$@"
You can start python with the -B command-line flag to prevent it from writing cached bytecode.
$ ls
bar.py foo.py
$ cat foo.py
import bar
$ python -B foo.py; ls
bar.py foo.py
$ python foo.py; ls
bar.py foo.py __pycache__
Setting the PYTHONDONTWRITEBYTECODE environment variable to a non-empty string, or setting sys.dont_write_bytecode to True, will have the same effect.
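The environment-variable form can be checked the same way as the -B transcript above; a small sketch in a throwaway directory:

```shell
# Reproduce the two-module demo in a temp directory.
demo=$(mktemp -d)
cd "$demo"
printf 'VALUE = 42\n' > bar.py
printf 'import bar\n' > foo.py

# With the variable set, CPython skips writing the __pycache__ directory.
PYTHONDONTWRITEBYTECODE=1 python3 foo.py
ls -A                          # bar.py and foo.py only, no __pycache__
```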
Of course, I'd say that the benefits in this case (faster loading times for your app, for free) vastly outweigh the perceived unsightliness you were talking about - but if you really want to disable caching, here's how.
Source: man python
I am completely new to all this, 'Bower' and 'Gulp' and Laravel 'Elixir'. I purchased a template that uses them (unfortunately) and now I need some help on how to go about implementing them. I have already installed NPM and Bower. All my packages have been downloaded into:
resources > assets > vendor
Now my question is how do I include all those packages I downloaded in my view? From my understanding, I can't run Less files directly in the browser (it only runs once due to 'browser caching' or something like that), and the JS scripts are just too many to include in my page one by one.
I want a way where I can work on my files and have them automatically compiled with the compiled files being referenced in my app.php file.
This is a link to the GulpJS file included in my template: http://pastebin.com/3PSN6NZY
You do not need to compile every time someone visits. The Less/JS should be compiled during development and then the output files referenced.
If you have Gulp installed on the project, you should see a gulpfile.js file in the root of your project. If not, visit here for instructions:
Gulp/Elixer installation and setup
In your gulpfile.js file:
var elixir = require('laravel-elixir');

elixir(function(mix) {
    mix.less([
        'app.less',
        'normalize.less',
        'some-other-less.less',
        'and-another.less'
    ]);

    mix.scripts(['app.js', 'some-other-js.js'], 'public/js/output-file.js');
});
While in development you can run gulp watch from the command line to listen for changes and run compile tasks when it hears a change. Then you simply reference the output files in the public directory as you normally would.
If you don't want to listen, you can just run the gulp command for a single once-off task run.
The docs are pretty straight forward and can be found here:
Gulp/Elixer docs
I'm trying to install PhantomJS in a Meteor app.
I have done these steps:
Add the npm package
meteor add meteorhacks:npm
Run meteor to let the npm package pre-initialise
meteor
A file packages.json has been created at the root. Edit it to:
{
"phantomjs": "1.9.13"
}
At this point everything seems to work. But I tried to test with this example that I've found here:
https://github.com/gadicc/meteor-phantomjs
But I don't understand where to put my phantomDriver.js.
Why is phantomDriver.js in assets/app/phantomDriver.js, when they say to create the file in ./private/phantomDriver.js?
Thanks for a clear explanation :)
In development mode you create the file in /private/phantomDriver.js. When you build a Meteor app, it repackages everything into an application bundle which can be run.
After meteor builds your app it stores stuff from private into assets. For phantomjs to execute this file it needs to look in this directory. You don't have to create it. This is how meteor works internally.
If you look in your .meteor/local/build/programs/server directory the assets directory is there with anything you placed in private.
From the context of where your Meteor code runs (the server directory above), the assets directory is resolved relative to that directory while your project is running.
Keep in mind when you deploy your app it loses its entire project structure and becomes something else. Gadi's phantomjs project is designed to work in production environments too.
TLDR; Don't worry about the assets directory, keep your file in /private/phantomDriver.js. Meteor should take care of the rest.