sbt and scala.js (with Node.js) can't run with a local .js dependency due to "TypeError: undefined is not a function"

I need help with an error I get when running a Scala.js project with sbt, a local bit of JavaScript code, and Node.js.
[info] Running net.walend.graph.results.PlotTime
Hello from scala
[error] /Users/dwalend/projects/ScalaGraphMinimizer/toGhPages/target/scala-2.11/toghpages-fastopt.js:1854
[error] $g["hello"]();
[error] ^
[error] TypeError: undefined is not a function
[error] at $c_Lnet_walend_graph_results_PlotTime$.main__V (/Users/dwalend/projects/ScalaGraphMinimizer/toGhPages/target/scala-2.11/toghpages-fastopt.js:1854:14)
[error] at $c_Lnet_walend_graph_results_PlotTime$.$$js$exported$meth$main__O (/Users/dwalend/projects/ScalaGraphMinimizer/toGhPages/target/scala-2.11/toghpages-fastopt.js:1861:8)
[error] at $c_Lnet_walend_graph_results_PlotTime$.main (/Users/dwalend/projects/ScalaGraphMinimizer/toGhPages/target/scala-2.11/toghpages-fastopt.js:1864:15)
[error] at Object.<anonymous> (/Users/dwalend/projects/ScalaGraphMinimizer/toGhPages/target/scala-2.11/toghpages-launcher.js:2:107)
[error] at Module._compile (module.js:460:26)
[error] at Object.Module._extensions..js (module.js:478:10)
[error] at Module.load (module.js:355:32)
[error] at Function.Module._load (module.js:310:12)
[error] at Module.require (module.js:365:17)
[error] at require (module.js:384:17)
org.scalajs.jsenv.ExternalJSEnv$NonZeroExitException: node.js exited with code 1
at org.scalajs.jsenv.ExternalJSEnv$AbstractExtRunner.waitForVM(ExternalJSEnv.scala:96)
at org.scalajs.jsenv.ExternalJSEnv$ExtRunner.run(ExternalJSEnv.scala:143)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$.org$scalajs$sbtplugin$ScalaJSPluginInternal$$jsRun(ScalaJSPluginInternal.scala:479)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$45$$anonfun$apply$27$$anonfun$apply$28.apply(ScalaJSPluginInternal.scala:539)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$45$$anonfun$apply$27$$anonfun$apply$28.apply(ScalaJSPluginInternal.scala:533)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
I'm most suspicious of my build.sbt. (It's in a subproject that has no Scala.js in it.) I think I have something out of joint, but I don't know what other settings to try.
scalaVersion := "2.11.7"
scalacOptions ++= Seq("-unchecked", "-deprecation","-feature")
libraryDependencies ++= Seq(
  "org.scala-js" %%% "scalajs-dom" % "0.8.1"
)
// don't need phantomjs
//jsDependencies += RuntimeDOM
jsDependencies += "org.webjars" % "d3js" % "3.5.5-1" / "d3.min.js"
jsDependencies += ProvidedJS / "algorithmTime.js"
scalaJSStage in Global := FastOptStage
persistLauncher := true
I'm not able to even get a "hello" out of algorithmTime.js with Node.js.
function hello() {
  console.log("hello from js")
}
The main() in Scala is pretty trim:
import scala.scalajs.js
import scala.scalajs.js.Dynamic.global

object PlotTime extends js.JSApp {
  def main(): Unit = {
    println("Hello from scala")
    global.hello()

    val png = global.dataToPng("benchmark/results/v0.1.2/dijkstra.csv")
    println(png)
  }
}
Before trying Node.js I got a bit further using PhantomJS and Rhino. sbt run gets into my local JavaScript code and stalls inside d3 with:
[info] Running net.walend.graph.results.PlotTime
Hello from scala
hello from js
org.mozilla.javascript.EcmaError: TypeError: Cannot call method "querySelector" of undefined (/Users/dwalend/.ivy2/cache/org.webjars/d3js/jars/d3js-3.5.5-1.jar#META-INF/resources/webjars/d3js/3.5.5/d3.min.js#3)
at org.mozilla.javascript.ScriptRuntime.constructError(ScriptRuntime.java:3701)
at org.mozilla.javascript.ScriptRuntime.constructError(ScriptRuntime.java:3679)
at org.mozilla.javascript.ScriptRuntime.typeError(ScriptRuntime.java:3707)
at org.mozilla.javascript.ScriptRuntime.typeError2(ScriptRuntime.java:3726)
at org.mozilla.javascript.ScriptRuntime.undefCallError(ScriptRuntime.java:3743)
at org.mozilla.javascript.ScriptRuntime.getPropFunctionAndThisHelper(ScriptRuntime.java:2269)
at org.mozilla.javascript.ScriptRuntime.getPropFunctionAndThis(ScriptRuntime.java:2262)
at org.mozilla.javascript.Interpreter.interpretLoop(Interpreter.java:1317)
at org.mozilla.javascript.Interpreter.interpret(Interpreter.java:815)
at org.mozilla.javascript.InterpretedFunction.call(InterpretedFunction.java:109)
at org.mozilla.javascript.ContextFactory.doTopCall(ContextFactory.java:394)
at org.mozilla.javascript.ScriptRuntime.doTopCall(ScriptRuntime.java:3102)
at org.mozilla.javascript.InterpretedFunction.exec(InterpretedFunction.java:120)
at org.mozilla.javascript.Context.evaluateString(Context.java:1078)
at org.scalajs.jsenv.rhino.package$ContextOps$.evaluateFile$extension(package.scala:21)
at org.scalajs.jsenv.rhino.RhinoJSEnv.org$scalajs$jsenv$rhino$RhinoJSEnv$$internalRunJS(RhinoJSEnv.scala:157)
at org.scalajs.jsenv.rhino.RhinoJSEnv$Runner.run(RhinoJSEnv.scala:62)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$.org$scalajs$sbtplugin$ScalaJSPluginInternal$$jsRun(ScalaJSPluginInternal.scala:479)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$45$$anonfun$apply$27$$anonfun$apply$28.apply(ScalaJSPluginInternal.scala:539)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$45$$anonfun$apply$27$$anonfun$apply$28.apply(ScalaJSPluginInternal.scala:533)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
[trace] Stack trace suppressed: run last toGhPages/compile:run for the full output.
java.lang.RuntimeException: Exception while running JS code: TypeError: Cannot call method "querySelector" of undefined (/Users/dwalend/.ivy2/cache/org.webjars/d3js/jars/d3js-3.5.5-1.jar#META-INF/resources/webjars/d3js/3.5.5/d3.min.js#3)
at scala.sys.package$.error(package.scala:27)
at org.scalajs.jsenv.rhino.RhinoJSEnv.org$scalajs$jsenv$rhino$RhinoJSEnv$$internalRunJS(RhinoJSEnv.scala:173)
at org.scalajs.jsenv.rhino.RhinoJSEnv$Runner.run(RhinoJSEnv.scala:62)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$.org$scalajs$sbtplugin$ScalaJSPluginInternal$$jsRun(ScalaJSPluginInternal.scala:479)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$45$$anonfun$apply$27$$anonfun$apply$28.apply(ScalaJSPluginInternal.scala:539)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$45$$anonfun$apply$27$$anonfun$apply$28.apply(ScalaJSPluginInternal.scala:533)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
This error suggests my code is doing what it is supposed to. However, internet wisdom says that chasing "querySelector" problems in Rhino is a dead end and that Node.js is a better choice.
I suspect I'm missing some sbt switch, but I don't know what else to look for.
I also don't see how it is supposed to work. I'm new to JavaScript, and I don't see how any of these JavaScript files declares a dependency on any other in any of the produced files. (The examples in the Scala.js tutorial link everything together using script tags in an index.html page.)
> tree toGhPages/target/scala-2.11/
toGhPages/target/scala-2.11/
├── classes
│   ├── JS_DEPENDENCIES
│   ├── algorithmTime.js
│   └── net
│       └── walend
│           └── graph
│               └── results
│                   ├── PlotTime$.class
│                   ├── PlotTime$.sjsir
│                   └── PlotTime.class
├── toghpages-fastopt.js
├── toghpages-fastopt.js.map
└── toghpages-jsdeps.js
The big picture: I'm attempting to use sbt, Scala.js, and d3 to create performance charts for a Scala graph algorithm library. The first cut of charts looks promising, but GitHub doesn't support JavaScript on README.md pages; for that I'll need a simple image. Wanting to learn more about both Scala.js and d3 is what attracted me to this approach.

Quickfix
In order to work in Node.js, deliberately do not properly declare the members you want to be visible (i.e. no var and no named function):
hello = function() {
  console.log("hello from js")
};
This is a terrible hack, but will solve the inclusion problems for algorithmTime.js. "Proper" solution at the end.
Background
Composing different JavaScript files is hard in general, since there is no standardized way of doing so. Traditional HTML script tags simply have the semantics of concatenating all the code. These are the semantics we try to emulate in the Scala.js runners.
However, Node.js uses the CommonJS module system. In that system, a library explicitly exports members and the using site puts them into a namespace. This avoids naming collisions.
Example:
// Library (foo.js)
exports.foo = function() { return 1; };
// Using code
var lib = require("foo.js");
lib.foo() // returns 1
This allows the library to declare local values without leaking them into the caller. (As an aside: although we have a function called require here, this is not RequireJS.)
However, in the Scala.js runners, where we are expected to "just include" foo.js, this poses a challenge. What name should we use for the result of the require call? This is what commonJSName is for (see below for an example).
If commonJSName for a given dependency is not set, in the Node.js runner, we will just emit
require(<name.js>);
without assigning the result to anything. (Why not just dump the file's contents inline, you ask? Because then you could say goodbye to reasonable stack traces.)
This has a very interesting effect in Node.js. Consider the following file (bar.js):
var a = 1;
b = 2;
Now we do:
require("bar.js")
console.log(a); // undefined
console.log(b); // 2
It seems that the b leaks into the global context whereas a does not. This is why the quickfix works.
Solutions
For a better solution, you have two choices:
Commit to Node.js and write your library specifically for its module system
Autodetect the environment you are included in and adapt dynamically (many JS libraries do this)
Solution 1
module.exports = function() {
  console.log("hello from js")
};
Add commonJSName to your dependency:
jsDependencies += ProvidedJS / "algorithmTime.js" commonJSName "hello"
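With commonJSName set, the Node.js runner effectively emits var hello = require(...) for this dependency, so the exported function becomes reachable under that name. A minimal sketch of the Scala.js call site under this setup (assuming the two snippets above):
import scala.scalajs.js
import scala.scalajs.js.Dynamic.global

object PlotTime extends js.JSApp {
  def main(): Unit = {
    // module.exports was the function itself, so it is callable directly
    global.hello()
  }
}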
This will fail miserably in anything but Node.js for two reasons:
The JS VM might not support CommonJS style includes
Overriding the full exports namespace like that is not standard CommonJS but specific to Node.js (IIRC).
Solution 2
Autodetect:
var hello = {};
// Scope to prevent leakage
(function(exp) {
  // assign to the export target: `exports` under CommonJS, the global
  // `hello` object otherwise (the typeof guard avoids a ReferenceError
  // in environments where `exports` is not declared)
  exp.hello = function() {
    console.log("hello from js");
  };
})(typeof exports !== "undefined" ? exports : hello);
You will also need to set commonJSName in this case.
Further, as you might already suspect from the code, this requires an additional indirection, since CommonJS requires the top-level export to be an object (IIRC). Therefore you need to adapt your Scala.js code:
global.hello.hello();
However, if your library exports multiple symbols, this is probably a good idea anyway. Further, this is likely to work in most JS environments (and should work in the three environments we provide with Scala.js).
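If you would rather avoid js.Dynamic, a typed facade over the exported object also works. A sketch, assuming Scala.js 0.6.x; the facade name HelloLib is hypothetical, and the object is assumed to be reachable as the global hello (via commonJSName or the var above):
import scala.scalajs.js
import scala.scalajs.js.annotation.JSName

// hypothetical typed facade for the object exported above
@js.native
@JSName("hello")
object HelloLib extends js.Object {
  def hello(): Unit = js.native
}
The call site then becomes HelloLib.hello() instead of global.hello.hello().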
Epilogue
We (the Scala.js team) are very unhappy about this situation since we believe that including JS libraries should be just as easy as depending on other Scala and/or Java libraries in JVM land. However, we have not found a better solution to this short of supporting every inclusion style, which is a huge design, engineering and certainly maintenance effort (what if a system changes or a new system comes up?).
Related discussions: #457 and #706.

Related

How to create deb with custom layout

I have a Play server application.
Currently, I have a 20-line bash script that creates this deb:
/srv
  /foo
    /conf
      <unmanaged resources>
    /staged
      <jars>
I'd like to use sbt native packager to generate this.
Currently, sbt debian:package-bin gives me
etc/
  default/
    foo
  foo
  init/
    foo.conf
usr/
  bin/
    foo
  share/
    foo/
      bin/
        foo
      conf/
        <unmanaged resources>
      lib/
        <jars>
      share/
        doc/
          api/
            <docs>
      logs
      README
var/
  log/
    foo/
How do I get my desired layout? Do I need to implement an archetype?
I'm using SBT 0.13.7 and SBT native packager 1.0.0-M1.
If your layout is close to the one already generated, you could use settings like defaultLinuxInstallLocation and defaultLinuxConfigLocation.
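For example, a minimal sketch of relocating the install tree (the /srv value is illustrative):
// build.sbt — move the package's install location from the default to /srv
defaultLinuxInstallLocation := "/srv"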
Or modify linuxPackageSymlinks and linuxPackageMappings directly, something like
// needs `import java.util.regex.Matcher` at the top of the build definition
linuxPackageSymlinks := Seq(),
linuxPackageMappings := {
  val libPath = "/srv/foo/staged"
  val libs = scriptClasspathOrdering.value.map { case (file, classpath) =>
    file -> classpath.replaceFirst("^lib", Matcher.quoteReplacement(libPath))
  }
  Seq(LinuxPackageMapping(libs))
  // plus configuration
},
If you have lots of binaries to archive (i.e. you have lots of dependencies), debian:packageBin is pretty slow. For debugging, consider using debianExplodedPackage.
Also, know that whatever is in the directory debianExplodedPackage will get included in the archive, so if there's extra stuff in the .deb at the end, you may need to delete that directory.

sbt cross configuration dependencies

What is the reason SBT won't allow me to have dependencies between different configurations of different projects in a multi-project build?
consider the following setup in the main build.sbt file:
lazy val domain: Project = project in file("domain") dependsOn(testUtils % "test->test")
lazy val testUtils: Project = project in file("testUtils") dependsOn(domain % "compile->test")
...
I want to write all my test helpers in testUtils, and have each of the other projects' test code be clean test logic, without the (sometimes duplicated among different projects) boilerplate of helper methods.
SBT forces me to add the explicit : Project type, since it complains that the value is "recursive". And upon reloading, I get:
...
at $281429c805669a7befa4$.domain(build.sbt:142)
at $281429c805669a7befa4$.testUtils$lzycompute(build.sbt:144)
at $281429c805669a7befa4$.testUtils(build.sbt:144)
at $281429c805669a7befa4$.domain$lzycompute(build.sbt:142)
at $281429c805669a7befa4$.domain(build.sbt:142)
[error] java.lang.StackOverflowError
[error] Use 'last' for the full log.
Is there a way around this? Or should I write test-related logic in each module's tests, even at the cost of less organized code, many "test->test" dependencies, etc.?

Defining plugin dependency between subprojects in SBT?

EDIT:
Since I put up the bounty, I thought I should restate the question
How can a SBT project P, with two sub-projects A and B, set up B to have a plugin dependency on A, which is a SBT plugin?
Giving P a plugin dependency on A does not work, since A depends on other things in P, which results in a circular dependency graph.
It has to be a plugin dependency, for A is a plugin needed to run B's test suite.
dependsOn doesn't work because, well, it has to be a plugin dependency.
I'd like to know either of
How to do this, or
Why this is impossible, and what the next best alternatives are.
EDIT: clarified that it's a plugin-dependency, since build-dependency is ambiguous
When you have a multi-project build configuration with "project P and two sub-projects A and B" it boils down to the following configuration:
build.sbt
lazy val A, B = project
As per design, "If a project is not defined for the root directory in the build, sbt creates a default one that aggregates all other projects in the build." It means that you will have an implicit root project, say P (but the name is arbitrary):
[plugin-project-and-another]> projects
[info] In file:/Users/jacek/sandbox/so/plugin-project-and-another/
[info] A
[info] B
[info] * plugin-project-and-another
That gives us the expected project structure. On to defining a plugin dependency between B and A.
The only way to define a plugin in a SBT project is to use the project directory, which is the plugins project's build definition - "A plugin definition is a project in <main-project>/project/." This means that the only way to define a plugin dependency on the project A is the following:
project/plugins.sbt
addSbtPlugin("org.example" % "example-plugin" % "1.0")
lazy val plugins = project in file(".") dependsOn(file("../A"))
In this build configuration, the plugins project depends on another SBT project that happens to be our A that's in turn a plugin project.
A/build.sbt
// http://www.scala-sbt.org/release/docs/Extending/Plugins.html#example-plugin
sbtPlugin := true
name := "example-plugin"
organization := "org.example"
version := "1.0"
A/MyPlugin.scala
import sbt._
object MyPlugin extends Plugin {
  // configuration points, like the built-in `version`, `libraryDependencies`, or `compile`
  // by implementing Plugin, these are automatically imported in a user's `build.sbt`
  val newTask = taskKey[Unit]("A new task.")
  val newSetting = settingKey[String]("A new setting.")

  // a group of settings ready to be added to a Project
  // to automatically add them, do
  val newSettings = Seq(
    newSetting := "Hello from plugin",
    newTask := println(newSetting.value)
  )

  // alternatively, by overriding `settings`, they could be automatically added to a Project
  // override val settings = Seq(...)
}
The two files - build.sbt and MyPlugin.scala in the directory A - make up the plugin project.
The only missing piece is to define the plugin A's settings for the project B.
B/build.sbt
MyPlugin.newSettings
That's pretty much all you can do in SBT. If you want to have a multi-project build configuration with a plugin dependency between (sub)projects, you don't have much choice other than what is described above.
With that said, let's see if the plugin from the project A is accessible.
[plugin-project-and-another]> newTask
Hello from plugin
[success] Total time: 0 s, completed Feb 13, 2014 2:29:31 AM
[plugin-project-and-another]> B/newTask
Hello from plugin
[success] Total time: 0 s, completed Feb 13, 2014 2:29:36 AM
[plugin-project-and-another]> A/newTask
[error] No such setting/task
[error] A/newTask
[error] ^
As you may have noticed, newTask (that comes from the plugin from the project A) is available in the (default) root project and the project B, but not in A.
As Jacek said, it cannot be done as I would like, since a subproject cannot have a SBT plugin that the root project does not have. On the other hand, this discussion on the mailing list contains several alternatives, and would no doubt be useful to anyone who comes across this question in the future.
EDIT: Well, in the end the alternatives mentioned (sbt scripted, etc.) were hard and clunky to use. My final solution was to just have a separate project (not a subproject) inside the repo that depends on the original project via its ivy coordinates, and to use bash to publishLocal the first project, go into the second project, and run its tests:
sbt publishLocal; cd test; sbt test; cd ..
I always thought the point of something like SBT was to avoid doing this kind of bash gymnastics, but desperate times call for desperate measures...
This answer may contain the solution: https://stackoverflow.com/a/12754868/3189923
From that link, in short: set exportJars := true, and use exportedProducts in Compile to obtain the jar file paths for a (sub)project.
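A minimal sketch of that approach (the subproject name A and the printJars task are hypothetical):
// build.sbt
exportJars := true

val printJars = taskKey[Unit]("Prints the exported jars of subproject A.")
printJars := {
  // with exportJars := true, exportedProducts yields the packaged jar(s)
  val jars = (exportedProducts in (A, Compile)).value.map(_.data)
  println(jars.mkString("\n"))
}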
Leaving the facts about plugins aside, you have a parent project P with sub-projects A and B. You then state that A depends on P. But P is an aggregate of A and B, and hence depends on A. So you already have a circular dependency between A and P. This can never work.
You have to split P into two parts: the part that A depends on (call it A') and the rest (call it P_rest). Then you throw away P and make a new project P_rest consisting of A', A, and B, where A depends on A'.
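A sketch of that restructuring in the root build.sbt (all names hypothetical):
lazy val aCore = project                  // A': the part of the old P that A needs
lazy val A = project.dependsOn(aCore)     // the plugin project
lazy val B = project
lazy val pRest = (project in file("."))   // P_rest: the new root
  .aggregate(aCore, A, B)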

sbt key that corresponds to command that I type in

I want to make my tests run every time I type universal:package-zip-tarball. I know that to do this, I have to put something like
someKey <<= someKey dependsOn (test in Test)
in my project/Build.scala, where someKey is the key that provides the task I want to depend on the test run, in this case, universal:package-zip-tarball.
But my generic question is: how do I find out what someKey should be?
Note that this is a Play framework project, and I don't even know if universal:package-zip-tarball is provided by Play, or by some other sbt plugin.
Is there any way sbt can just tell me, without me having to go searching for the source code repository containing the relevant code?
Use the inspect command:
> inspect universal:package-zip-tarball
[...]
[info] Defined at:
[info] (com.typesafe.sbt.packager.universal.UniversalPlugin)
UniversalPlugin.scala:73
This is actually the location of the definition of the task's code, but it is close enough to help, because it lets us find the key (the key will be in the same sbt plugin).
From this we can find out that the key is:
com.typesafe.sbt.packager.universal.Keys.packageZipTarball
Unfortunately, just substituting this in doesn't work - it says:
[error] Reference to undefined setting:
[error]
[error] my-project/*:packageZipTarball from my-project/*:packageZipTarball
[error] Did you mean my-project/universal-docs:packageZipTarball ?
[error]
[error] Use 'last' for the full log.
So to fix this, the only thing remaining is to translate the universal: prefix. It is in fact this:
packageZipTarball in Universal <<= packageZipTarball in Universal dependsOn (test in Test)
but it just needs an extra import to make it compile:
import com.typesafe.sbt.SbtNativePackager._
(In this case, SbtNativePackager is the main plugin object, I think. Other plugins might require importing something else, to translate such a prefix.)

How do you develop an SBT project, itself?

Background: I've got a Play 2.0 project, and I am trying to add something that does aspectj weaving, using aspects in a jar, on some of my (Java) classes. (sbt-aspectj doesn't seem to do it, or I can't see how.) So I need to add a custom task and have it depend on compile. I've sort of figured out the dependency part. However, because I don't know exactly what I'm doing yet, I want to develop this using the IDE (I'm using Scala-IDE). Since sbt projects (and therefore Play projects) are recursively defined, I assumed I could:
Add the eclipse plugin to the myplay/project/project/plugins.sbt
Add the sbt main jar (and aspectj jar) to myplay/project/project/build.sbt:
libraryDependencies ++= Seq(
  "org.scala-sbt" % "main" % "0.12.2",
  "aspectj" % "aspectj-tools" % "1.0.6"
)
Drop into the myplay/project
Run sbt, run the eclipse task, then import the project into eclipse as a separate project.
I can do this, though the Build.scala (and other Scala files) aren't initially considered source, and I have to fiddle with the build path a bit. However, even though I've got the sbt main jar defined for the project, both the Eclipse IDE and the compile task give errors:
> compile
[error] .../myplay/project/Build.scala:2: not found: object keys
[error] import keys.Keys._
[error] ^
[error] .../myplay/project/SbtAspectJ.scala:2: object Configurations is not a member of package sbt
[error] import sbt.Configurations.Compile
[error] ^
[error] .../myplay/project/SbtAspectJ.scala:3: object Keys is not a member of package sbt
[error] import sbt.Keys._
[error] ^
[error] three errors found
The Eclipse project shows neither main nor aspectj-tools in its referenced libraries. However, if I give it a bogus version (e.g. 0.12.4), reload fails, so it appears to be using the dependency.
So,...
First: Is this the proper way to do this?
Second: If so, why aren't the libs getting added.
(Third: please don't let this be something dumb that I missed.)
If you are getting the object Keys is not a member of package sbt error, then you should check that you are running sbt from the base directory, and not the /project directory.
sbt-aspectj
sbt-aspectj doesn't seem to do it, or I can't see how.
I think this is the real issue. There's a plugin already that does the work, so try making it work instead of fiddling with the build. Using plugins from build.scala is a bit tricky.
Luckily there are sample projects on github:
import sbt._
import sbt.Keys._
import com.typesafe.sbt.SbtAspectj.{ Aspectj, aspectjSettings, compiledClasses }
import com.typesafe.sbt.SbtAspectj.AspectjKeys.{ binaries, compileOnly, inputs, lintProperties }

object SampleBuild extends Build {
  ....

  // precompiled aspects
  lazy val tracer = Project(
    "tracer",
    file("tracer"),
    settings = buildSettings ++ aspectjSettings ++ Seq(
      // stop after compiling the aspects (no weaving)
      compileOnly in Aspectj := true,
      // ignore warnings (we don't have the sample classes)
      lintProperties in Aspectj += "invalidAbsoluteTypeName = ignore",
      // replace regular products with compiled aspects
      products in Compile <<= products in Aspectj
    )
  )
}
How do you develop an SBT project, itself?
If you're still interested in hacking on the build, the first place to go is the Getting Started guide. Specifically, your question should be answered in the .scala Build Definition page.
I think you want your build to utilize "aspectj" % "aspectj-tools" % "1.0.6". If so, it should be included in myplay/project/plugins.sbt, and your code should go into myplay/project/common.scala or something similar. If you want to use an IDE, you may have better luck building it as an sbt plugin; that way your code would go into src/main/scala. Check out sbt/sbt-aspectj or sbt/sbt-assembly for examples of sbt plugin structure.
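A sketch of that layout (the paths come from the answer; treating aspectj-tools as a plain build-level library dependency is an assumption, not verified against this Play version):
// myplay/project/plugins.sbt — build-level dependencies live here
libraryDependencies += "aspectj" % "aspectj-tools" % "1.0.6"

// custom task code then goes in myplay/project/common.scala,
// where it is compiled as part of the build definition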
