Deno import maps and lock file

As far as I know, Deno lock files can only be created from a TypeScript (or JavaScript) file that contains all the imports, usually a deps.ts file.
I would like to use (the still-unstable) import maps and also generate the lock file from them.
Is it possible to generate that lock file from an import_map.json file? If not, is there any other way, for instance with a deps.ts file, to map the dependencies so they can be imported without writing (the infamous) ./.. everywhere?
Moreover, it looks like the paths feature in a tsconfig.json file wouldn't do either, since I have no idea how to refer to a module through it.
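For reference, the deps.ts convention mentioned above is just a module that re-exports every dependency from one place; a minimal sketch (the colors module and file names are illustrative, not from the original post):
// deps.ts: single place that re-exports third-party modules
export { green } from "https://deno.land/std@0.88.0/fmt/colors.ts";
// some_module.ts: import from deps.ts instead of long URLs or ../../ paths
import { green } from "./deps.ts";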

You cannot generate a lock file directly from an import map yet, but you can pass your program's entry file along with the import map to generate one.
Here's an example.
log.ts:
import { green } from "colors";
console.log(`Status: ${green("OK")}`);
deps.json (import map):
{
  "imports": {
    "colors": "https://deno.land/std@0.88.0/fmt/colors.ts"
  }
}
Now run the following command to generate a lock file.
deno cache --import-map=deps.json --unstable --lock=lock.json --lock-write log.ts
The content of lock.json will look something like this:
{
  "https://deno.land/std@0.88.0/fmt/colors.ts": "db22b314a2ae9430ae7460ce005e0a7130e23ae1c999157e3bb77cf55800f7e4"
}
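Once the lock file exists, you can (as far as I know) verify dependencies against it on later runs by passing --lock without --lock-write; the command fails if a fetched module no longer matches the recorded hash:
deno cache --reload --import-map=deps.json --unstable --lock=lock.json log.ts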

Another solution that works just as well, or better, since it actually scans through all the dependencies the project uses, is to run: deno test --no-run --import-map import-map.json --lock lock.json --lock-write.

Related

How to import a module inside the Deno REPL?

Attempting to import a module in the Deno REPL results in the following error:
Uncaught SyntaxError: Cannot use import statement outside a module
at evaluate (rt/40_repl.js:60:36)
at replLoop (rt/40_repl.js:160:15)
I use the Node REPL to quickly test out code, almost on a daily basis. The ability to import external code without writing a script or dealing with temporary files is a huge convenience.
Why can't Deno use import statements outside of a module? Is it even possible to use external code in the Deno REPL?
Starting with v1.4.3, you can use top-level await in the REPL to dynamically import modules:
> const path = await import("https://deno.land/std@0.73.0/path/mod.ts")
> path.basename("/my/path/name")
"name"
If you try import a from "a" in the Node REPL, it throws the same error; only require can be used directly to load modules there.
Deno has no built-in CommonJS loader, so it does not even provide require for loading modules synchronously.
The technical reason static import cannot be used in a REPL is that the REPL is really a script-evaluation tool: instead of compiling what you type into an ES module, it is treated as a plain script and fed directly to the engine, much like a <script> tag in the browser without type="module". (ES modules with static imports have the semantics of asynchronously loading dependencies and determining the "shape" of a module without actually running it.)
To import modules in the Deno REPL, you can use dynamic import(). Personally, I sometimes do the following (loading is usually fast enough that mod is set before you get around to using it in the REPL):
$ deno
> let mod; import("./mod.ts").then(m => mod = m)
Promise { <pending> }
Check file:///[blah]/mod.ts
> mod
Module { a: 1, Symbol(Symbol.toStringTag): "Module" }

Building library with imports from another library using NX Monorepo

Here is the case: I am using an Nrwl NX monorepo. I have two publishable libraries, lib-a and lib-b, both created via NX. Now I create MyClass.ts in lib-a. Naturally, NX creates an alias to lib-a under paths in workspace/tsconfig.json ("@workspace/lib-a": ["libs/lib-a/src/index.ts"]). So far so good.
Now we can use this class anywhere within the workspace/monorepo by importing it: import { MyClass } from '@workspace/lib-a';
Unfortunately, we cannot build lib-b, which imports MyClass. When we try, we get the error below. So the question is: how can we build lib-b?
PS: It seems strange that an NX monorepo doesn't support such a common scenario as linking two publishable libs.
error TS6059: File 'd:/workspace/libs/lib-a/src/index.ts' is not under 'rootDir' 'd:\workspace\libs\lib-b\src'. 'rootDir' is expected to contain all source files.
Try adding
"paths": { "#workspace/*": ["dist/libs/*"] }
into your tsconfig.lib.json files. This should resolve the problem.
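For illustration, a sketch of how that could sit inside libs/lib-b/tsconfig.lib.json (the surrounding options are assumptions; only the paths entry comes from this answer):
{
  "extends": "./tsconfig.json",
  "compilerOptions": {
    "outDir": "../../dist/out-tsc",
    "declaration": true,
    "paths": { "@workspace/*": ["dist/libs/*"] }
  },
  "include": ["**/*.ts"]
}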
Try this solution. Not sure it's official, but in my case it's working well.
Three problems need to be resolved:
TypeScript paths
Compiled JS paths
Working directory
First: TypeScript paths are resolved by adding "paths" to workspace/tsconfig.lib.json. NX does this automatically during library generation. See the answer from Radovan Skendzic.
Second: the problem with compiled JS paths is described well here: TypeScript paths not working in an Express project. You need to install tsconfig-paths in your workspace:
yarn add -D tsconfig-paths
Third: since nx run [project]:[target] runs in the workspace/ directory, you should set the cwd to the libs/lib-b directory so the correct tsconfig.json is found.
So, finally, you have the following executor (add this to your lib-b/project.json) that should work:
"targets": {
"start-dev": {
"executor": "#nrwl/workspace:run-commands",
"options": {
"commands": [
"nodemon -e ts,js --exec ts-node -r tsconfig-paths/register src/index.ts"
],
"cwd": "libs/lib-b"
}
},
...
}
Command to run:
nx run lib-b:start-dev
Don't override "baseUrl" and "paths" in any child tsconfig!
Put all of your "paths" in tsconfig.base.json!
Try adding lib-a as an implicit dependency of lib-b: add the line below to the libs/lib-b/project.json file and see what happens:
"implicitDependencies": ["lib-a"]
Running nx graph should show a dependency graph with lib-b depending on lib-a (ignore the library names).
After that you should be able to build both libraries; I hope it works for you as well.

FileReader can't find R Script

I'm trying to run my R script within JavaFX. I use Renjin for this purpose, and it seems to work properly with statements I run inline, but I want to run an external R script. The project is set up with Maven, so the path should be easy, as the R script is in the resources folder. The same kind of path works when I load FXML files, so I'm pretty confused about why it can't find my script.
Here's a short example:
package survey;

import javax.script.*;
import org.renjin.script.*;
import java.io.FileReader;

public class calcFunction {
    public static void main(String[] args) throws Exception {
        // create a script engine manager:
        RenjinScriptEngineFactory factory = new RenjinScriptEngineFactory();
        // create a Renjin engine:
        ScriptEngine engine = factory.getScriptEngine();
        engine.put("x", 4);
        engine.put("y", 5);
        engine.eval(new FileReader("/test.R"));
    }
}
Is something missing? Thanks in advance!
EDIT1:
With my FXML files it works with the "/" path like this:
root = FXMLLoader.load(getClass().getResource("/moduleDa.fxml"));
EDIT2:
Someone who deleted his comment proposed this:
engine.eval(new FileReader(new File(".").getAbsolutePath()+"/test.R"));
It works if the script is in the root directory, where the pom.xml file is located. @James_D made it work so the R script can be located in the resources folder - thanks a lot!
If your R script is bundled as part of the application, it can't be treated as a file - you need to treat it as a resource. Typically, you will deploy your application as a Jar file, and the resources will be elements within that jar file (they won't be files in their own right).
So just treat the R script as a resource and load it as such. I don't know the renjin framework, but I assume ScriptEngine here is a javax.script.ScriptEngine, in which case ScriptEngine.eval(...) takes a Reader as a parameter, and so (if your R script is located in the root of the class path) you can do
engine.eval(new InputStreamReader(getClass().getResourceAsStream("/test.R")));

How do I compile LESS files every time I save a document?

I've installed Less via npm like this
$ npm install -g less
Now every time that I want to compile source files to .css, I run
$ lessc styles.less styles.css
Is there any way, via the command line, to make it watch the file and compile automatically whenever I save it?
The best solution I've found is the one recommended on the official LESS website: https://github.com/jgonera/autoless. It is dead simple to use, and it also listens for changes in imported files and recompiles.
Have a look at this article:
http://www.hongkiat.com/blog/less-auto-compile/
It offers GUI solutions (SimpLESS, WinLESS, LESS.app, and Crunch) and a Node solution (deadsimple-less-watch-compiler).
Are you using LESS alone or with Node.js? If you are using it with Node, there are easy ways to solve this problem. The first two I can think of are (both of these solutions go in your app.js):
using a middleware, as described in this Stack Overflow discussion
var lessMiddleware = require('less-middleware');
...
app.configure(function(){
  // other configuration here...
  app.use(lessMiddleware({
    src: __dirname + "/public",
    compress: true
  }));
  app.use(express.static(__dirname + '/public'));
});
another method consists of making a system call as soon as you start your Node.js instance (the exact method name may differ based on your Node.js version)
// before all the treatment is done
var execSync = require("child_process").execSync;
execSync("lessc /public/stylesheets/styles.less /public/stylesheets/styles.css");
var app = express();
app.use(...);
In both cases, Node will convert the LESS files into CSS files. Note that with the second option, Node has to be relaunched for the conversion to happen, whereas the first option answers your need better by always checking for a newer version in the given directory.
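If you want a purely command-line approach without extra tooling, a minimal sketch (not from the answers above; the file names are assumptions) is to watch the file with Node's built-in fs.watch and shell out to lessc on every save:
// watch.js - recompile styles.less whenever it changes (assumes lessc is on the PATH)
var watch = require("fs").watch;
var execSync = require("child_process").execSync;
watch("styles.less", function () {
  try {
    execSync("lessc styles.less styles.css");
    console.log("Recompiled styles.css");
  } catch (err) {
    console.error("LESS compile failed");
  }
});
Run it with node watch.js and leave it running while you edit.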

Is it possible to skip code generation for included thrift files in Scrooge?

The Scrooge SBT plugin has the option to include Thrift IDL files from library dependencies (jar files). Often these jar files already contain the generated sources. If I include a Thrift IDL, I don't want to generate these sources again. Otherwise they will be duplicated.
shared.thrift
namespace java me.shared
struct Foo {
  1: string id
}
shared.jar
  me/
    shared/
      Foo.scala
  shared.thrift
So when my project depends on shared.jar and I include shared.thrift in another Thrift IDL file, I don't want Scrooge to generate Foo.scala again. What's the most straightforward way to achieve this?
It was actually straightforward:
// Filter shared.thrift out of the sources Scrooge generates code for,
// since the generated Foo.scala already ships in shared.jar
scroogeThriftSources in Compile ~= { sources: Seq[File] =>
  sources filter { file =>
    !file.getName.contains("shared.thrift")
  }
}
