Using DL4J in Scala and no backend found - sbt

I'm writing a simple Scala project for DL4J. I need to switch between CUDA (for training) and the native backend (for production). I seem to have a problem using the native backend in an assembled jar. I get the error below:
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.datavec.api.util.ndarray.RecordConverter.toMinibatchArray(RecordConverter.java:197)
at org.deeplearning4j.datasets.datavec.RecordReaderMultiDataSetIterator.next(RecordReaderMultiDataSetIterator.java:159)
at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.next(RecordReaderDataSetIterator.java:364)
at org.deeplearning4j.datasets.datavec.RecordReaderDataSetIterator.next(RecordReaderDataSetIterator.java:439)
at simplecuda.main$.delayedEndpoint$simplecuda$main$1(main.scala:37)
at simplecuda.main$delayedInit$body.apply(main.scala:27)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at simplecuda.main$.main(main.scala:27)
at simplecuda.main.main(main.scala)
Caused by: java.lang.RuntimeException: org.nd4j.linalg.factory.Nd4jBackend$NoAvailableBackendException: Please ensure that you have an nd4j backend on your classpath. Please see: http://nd4j.org/getstarted.html
at org.nd4j.linalg.factory.Nd4j.initContext(Nd4j.java:5449)
at org.nd4j.linalg.factory.Nd4j.<clinit>(Nd4j.java:213)
... 15 more
Caused by: org.nd4j.linalg.factory.Nd4jBackend$NoAvailableBackendException: Please ensure that you have an nd4j backend on your classpath. Please see: http://nd4j.org/getstarted.html
at org.nd4j.linalg.factory.Nd4jBackend.load(Nd4jBackend.java:213)
at org.nd4j.linalg.factory.Nd4j.initContext(Nd4j.java:5446)
... 16 more
My build file is:
name := "simplecuda"
version := "1.0"
scalaVersion := "2.11.8"
// looks like you need to remove ~/.ivy2/cache and ~/.javacpp/cache whenever you switch between platforms
classpathTypes += "maven-plugin"
libraryDependencies ++= Seq(
"org.scalactic" %% "scalactic" % "3.0.5",
"org.scalatest" %% "scalatest" % "3.0.5" % "test",
// "org.nd4j" % "nd4j-cuda-9.2-platform" % "1.0.0-beta2",
// "org.deeplearning4j" % "deeplearning4j-cuda-9.2" % "1.0.0-beta2"
"org.nd4j" % "nd4j-native-platform" % "1.0.0-beta2",
"org.deeplearning4j" % "deeplearning4j-core" % "1.0.0-beta2"
)
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs # _*) => MergeStrategy.discard
case x => MergeStrategy.first
}
When I visit http://nd4j.org/getstarted.html to learn about the NoAvailableBackendException, I see that build.sbt should have the line below:
classpathTypes += "maven-plugin"
I've included this in the above build.sbt without any luck. After looking at the Gradle instructions I tried adding the "org.bytedeco.javacpp-presets" % "openblas" % "0.2.20-1.3" classifier "linux-x86_64" dependency, and this also did not help.
I've tried removing ~/.javacpp/cache and ~/.ivy2/cache multiple times without any luck. The repo with this example is at https://github.com/tomlue/dl4j_scala_troubleshoot
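As an aside, since the build file above switches between the CUDA and native dependency sets by commenting lines in and out, here is a minimal sketch of how that toggle could be expressed instead (the use.cuda property name is purely illustrative; the coordinates are the ones already shown above):
// sketch: pick the backend at build time, e.g. sbt -Duse.cuda=true assembly
val useCuda = sys.props.get("use.cuda").contains("true")
libraryDependencies ++= {
  if (useCuda) Seq(
    "org.nd4j" % "nd4j-cuda-9.2-platform" % "1.0.0-beta2",
    "org.deeplearning4j" % "deeplearning4j-cuda-9.2" % "1.0.0-beta2")
  else Seq(
    "org.nd4j" % "nd4j-native-platform" % "1.0.0-beta2",
    "org.deeplearning4j" % "deeplearning4j-core" % "1.0.0-beta2")
}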

Related

Where to put "enablePlugins" in SBT?

Following this tutorial, I am asked to add enablePlugins(WindowsPlugin) to my SBT configuration.
I did this by adding exactly this line to my build.sbt, but all I get is "Cannot resolve symbol". Do I need to add the dependency somewhere?
Is this an auto plugin and can anyone explain to me what an auto plugin actually is and how I use it?
UPDATE: My build.sbt looks like this:
name := "ApplicationName"
version := "0.3-SNAPSHOT"
scalaVersion := "2.13.1"
enablePlugins(WindowsPlugin)
mainClass in assembly := Some("application.ConfigEditorApplication")
assemblyJarName in assembly := s"application-$version.jar"
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs#_*) => MergeStrategy.discard
case PathList("reference.conf") => MergeStrategy.concat
case x => MergeStrategy.first
}
libraryDependencies += "org.apache.commons" % "commons-lang3" % "3.9"
libraryDependencies += "commons-io" % "commons-io" % "2.6"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.2.3"
libraryDependencies += "com.typesafe.scala-logging" % "scala-logging_2.13" % "3.9.2"
libraryDependencies += "com.typesafe.akka" %% "akka-actor-typed" % "2.6.3"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.1.1" % "test"
libraryDependencies += "org.scalamock" %% "scalamock" % "4.4.0" % Test
libraryDependencies += "org.mockito" % "mockito-scala_2.13" % "1.11.3"
libraryDependencies += "org.mockito" % "mockito-scala-scalatest_2.13" % "1.11.3"
I found the solution to my problem: from the beginning, I suspected that the plugin needs to be added before it can be enabled. Unfortunately, nothing of that sort was mentioned in the tutorial I was following.
The plugin which has to be added is the native-packager plugin: addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.0").
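To spell it out, the two files involved look roughly like this (the version is the one mentioned above; project/plugins.sbt is the standard location for plugin declarations):
// project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.0")
// build.sbt
enablePlugins(WindowsPlugin)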
You should enable your auto plugin in your build.sbt. The build.sbt file must be at the root of your project, at the same level as the src directory.
You have information about it here and here.
On the page you mentioned, they say you should set this in your build.sbt. Try this:
// general package information (can be scoped to Windows)
maintainer := "Josh Suereth <joshua.suereth#typesafe.com>"
packageSummary := "test-windows"
packageDescription := """Test Windows MSI."""
// wix build information
wixProductId := "ce07be71-510d-414a-92d4-dff47631848a"
wixProductUpgradeId := "4552fb0e-e257-4dbd-9ecb-dba9dbacf424"
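With the plugin added and those settings in place, the MSI should come out of the Windows packaging task; if I remember the task name correctly for that sbt-native-packager version, it is run as:
sbt> windows:packageBin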
UPDATE
Also, I found this question, which is related to yours. It is an old one, but it might give you some hints. Some answers suggest running an update; others suggest deleting and then reimporting the project.

ScalaFX standalone executable jar file

Good day! Help me, please. I run this example with
sbt> run
It all works fine. Then I run
sbt> package
This builds a jar file, but when I double-click it I get the message:
Error: A JNI error has occurred, please check your installation and try again.
Scala version: 2.12.4. JVM: 1.8.0_152. ScalaFX: 8.0.102-R11.
hello.scala:
package hello
import scalafx.Includes._
import scalafx.application.JFXApp
import scalafx.application.JFXApp.PrimaryStage
import scalafx.scene.Scene
import scalafx.scene.paint.Color._
import scalafx.scene.shape.Rectangle
object HelloStage extends JFXApp {
stage = new JFXApp.PrimaryStage {
title.value = "Hello Stage"
width = 600
height = 450
scene = new Scene {
fill = LightGreen
content = new Rectangle {
x = 25
y = 40
width = 100
height = 100
fill <== when(hover) choose Green otherwise Red
}
}
}
}
build.sbt:
name := "Scala"
organization := "scalafx.org"
version := "1.0.5"
scalaVersion := "2.12.4"
scalacOptions ++= Seq("-unchecked", "-deprecation", "-Xcheckinit", "-encoding", "utf8")
resourceDirectory in Compile := (scalaSource in Compile).value
libraryDependencies ++= Seq(
"org.scalafx" %% "scalafx" % "8.0.102-R11",)
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)
fork := true
This is a Java classpath issue. When you try to execute the resulting JAR file, it cannot find the jar files that it needs to run.
Try the following:
Firstly, copy & paste the following to project/plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
This loads the sbt-assembly plugin, which will create a fat JAR file, containing all of the dependencies.
Secondly, change your build.sbt file to the following:
name := "Scala"
organization := "scalafx.org"
version := "1.0.5"
scalaVersion := "2.12.4"
scalacOptions ++= Seq("-unchecked", "-deprecation", "-Xcheckinit", "-encoding", "utf8")
libraryDependencies += "org.scalafx" %% "scalafx" % "8.0.102-R11"
fork := true
mainClass in assembly := Some("hello.HelloStage")
This simplifies what you originally had. The macro paradise compiler plugin is not required, and I also removed the slightly odd resourceDirectory setting.
To create the fat JAR, run the command:
sbt
sbt> assembly
The JAR file you're looking for is most likely located at target/scala-2.12/Scala-assembly-1.0.5.jar. You should now be good to go...
Alternatively, you can add all the necessary JAR files to your classpath. Another plugin that can help with that (you probably shouldn't use it together with sbt-assembly) is sbt-native-packager, which creates installers for you. You can then install your app and run it like a regular application.
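A rough sketch of that alternative, using the JavaAppPackaging archetype (the plugin version here is borrowed from another answer above and is just an assumption for this setup):
// project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.0")
// build.sbt (in addition to the settings above)
enablePlugins(JavaAppPackaging)
mainClass in Compile := Some("hello.HelloStage")
Running sbt stage then writes start scripts and all dependency jars under target/universal/stage, so the app can be launched via the generated script instead of a fat JAR.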

Excluding a dependency in creating fat jar using SBT

I am writing an Akka application. While creating a fat jar of the application, I don't want the Scala libraries to be packaged with the jar. My build.sbt looks as follows:
lazy val root = (project in file(".")).
settings(
name :="akka-app",
version :="1.0",
scalaVersion :="2.10.4",
mainClass in Compile := Some("sample.hello.HelloWorld")
)
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % "2.3.4" % "provided",
"com.typesafe" % "config" % "1.2.1"
)
// META-INF discarding
mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
{
case PathList("META-INF", xs # _*) => MergeStrategy.discard
case x => MergeStrategy.first
}
}
But sbt still packages Scala with the jar. I want only the com.typesafe.config library to be present in the jar. Any solution for how to achieve this?
You can exclude Scala by modifying the option in the assemblyOption setting:
assemblyOption in assembly :=
(assemblyOption in assembly).value.copy(includeScala = false)
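Note that the akka-actor dependency marked % "provided" in the question is already left out of the fat jar by sbt-assembly; includeScala = false additionally drops the Scala library itself. On newer sbt-assembly versions (1.x), the copy method is replaced by a generated setter; a sketch, with the exact method name to be verified against the version you use:
// sbt-assembly 1.x style (verify withIncludeScala exists in your version)
assembly / assemblyOption ~= { _.withIncludeScala(false) }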

How can I add process parameters using sbt-native-packager?

How can I add process parameters using the sbt-native-packager configuration? I want to add options to redirect the process's stderr to a file, so the result looks like this:
sudo -u app bash -c "app >>/var/log/app/stderr.log 2>&1"
I use sbt-native-packager 1.2.0-M5 to build a deb package with JavaServerAppPackaging, JDebPackaging, SystemdPlugin, and UpstartPlugin. The exception only shows up on stderr, not in the logs. Also, I must delete the app's pid file manually after a crash; if it still exists, I get an error on stderr.
My plugins.sbt:
resolvers += Resolver.bintrayRepo("sbt", "sbt-plugin-releases")
// The Play plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.8-netty-4.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.2.0-M5")
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")
addSbtPlugin("com.lightbend.sbt" % "sbt-javaagent" % "0.1.1")
libraryDependencies += "org.vafer" % "jdeb" % "1.3" artifacts (Artifact("jdeb", "jar", "jar"))
My build.sbt:
...
debianPackageDependencies in Debian ++= Seq("postgresql-9.5 (>= 9.5.1)")
lazy val root = (project in file(".")).enablePlugins(PlayScala, JavaAgent)
scalaVersion := "2.11.8"
val akkaVersion = "2.4.10"
libraryDependencies ++= Seq(
"org.postgresql" % "postgresql" % "9.4.1208",
"org.scalikejdbc" %% "scalikejdbc" % "2.4.0",
"org.scalikejdbc" %% "scalikejdbc-config" % "2.4.0",
"org.scalikejdbc" %% "scalikejdbc-play-initializer" % "2.5.1",
"org.flywaydb" %% "flyway-play" % "3.0.1",
"com.typesafe.akka" %% "akka-contrib" % akkaVersion,
"com.typesafe.akka" %% "akka-slf4j" % akkaVersion,
"io.dropwizard.metrics" % "metrics-core" % "3.1.2",
"io.dropwizard.metrics" % "metrics-jvm" % "3.1.2",
"org.coursera" % "dropwizard-metrics-datadog" % "1.1.4",
"com.typesafe.akka" %% "akka-testkit" % akkaVersion % Test,
"com.relayrides" % "pushy" % "0.8",
"com.relayrides" % "pushy-dropwizard-metrics-listener" % "0.8",
"org.eclipse.jetty.alpn" % "alpn-api" % "1.1.3.v20160715" % "runtime",
ws,
specs2 % Test
)
resolvers += "Typesafe Releases" at "http://repo.typesafe.com/typesafe/maven-releases/"
resolvers += Resolver.mavenLocal
routesGenerator := InjectedRoutesGenerator
javaOptions in Test ++= Seq("-Dlogger.resource=logback-test.xml")
scalacOptions in Universal ++= Seq("-unchecked", "-deprecation", "-notailcalls")
javaOptions in Universal ++= Seq(
"-J-server",
...
)
...
import com.typesafe.sbt.packager.archetypes.systemloader._
// UpstartPlugin for ubuntu 14.04, SystemdPlugin for ubuntu 16.04
enablePlugins(JavaServerAppPackaging, JDebPackaging, SystemdPlugin, UpstartPlugin)
requiredStartFacilities := Some("datadog-agent.service, systemd-journald.service, postgresql.service")
javaAgents += "org.mortbay.jetty.alpn" % "jetty-alpn-agent" % "2.0.4" % "dist"
P.S. I found a workaround: on Ubuntu 16.04 I can use journald to collect all the logs in the system.
Thanks for updating the question with all relevant information. There are a couple of things here.
Only one Systemloader plugin
You enable both SystemdPlugin and UpstartPlugin. If it works, it only works by accident. No version of native-packager was designed to support multiple systemloaders for a single package type in a single build module.
The solution is to create sub modules with the relevant systemloader enabled.
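A rough sketch of that layout in build.sbt (module and directory names are purely illustrative, not taken from your build):
import com.typesafe.sbt.packager.archetypes.systemloader._

lazy val app = (project in file(".")).enablePlugins(PlayScala, JavaAgent)

// one packaging module per systemloader: systemd for Ubuntu 16.04, upstart for 14.04
lazy val debSystemd = (project in file("packaging/systemd"))
  .enablePlugins(JavaServerAppPackaging, JDebPackaging, SystemdPlugin)
  .dependsOn(app)

lazy val debUpstart = (project in file("packaging/upstart"))
  .enablePlugins(JavaServerAppPackaging, JDebPackaging, UpstartPlugin)
  .dependsOn(app)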
Logging to stderr
You are right regarding systemd. It provides facilities to capture the log output of your process. If you like, you can add your findings to the native-packager documentation (there is a systemd plugin section).
The upstart support in native-packager is rather simple. There weren't a lot of requests, as Ubuntu is switching to systemd and you can always fall back to SystemV. Which brings me to the solution to your problem.
You can use the SystemVPlugin, which supports a daemon_log_file. The systemv documentation provides you with the necessary details.
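A sketch of what that could look like; I am writing the setting name from memory, so please double-check it against the systemv section of the docs:
// build.sbt (sketch)
enablePlugins(JavaServerAppPackaging, JDebPackaging, SystemVPlugin)
// assumption: redirects the daemon's stdout/stderr into this file under the app's log directory
daemonStdoutLogFile := Some("app.log")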
cheers,
Muki

Trouble building a simple SparkSQL application

This is a pretty noob question.
I'm trying to learn about SparkSQL. I've been following the example described here:
http://spark.apache.org/docs/1.0.0/sql-programming-guide.html
Everything works fine in the Spark-shell, but when I try to use sbt to build a batch version, I get the following error message:
object sql is not a member of package org.apache.spark
Unfortunately, I'm rather new to sbt, so I don't know how to correct this problem. I suspect that I need to include additional dependencies, but I can't figure out how.
Here is the code I'm trying to compile:
/* TestApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
case class Record(k: Int, v: String)
object TestApp {
def main(args: Array[String]) {
val conf = new SparkConf().setAppName("Simple Application")
val sc = new SparkContext(conf)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext._
val data = sc.parallelize(1 to 100000)
val records = data.map(i => new Record(i, "value = "+i))
val table = createSchemaRDD(records, Record)
println(">>> " + table.count)
}
}
The error is flagged on the line where I try to create a SQLContext.
Here is the content of the sbt file:
name := "Test Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
Thanks for the help.
As is often the case, the act of asking the question helped me figure out the answer. The answer is to add the following line to the sbt file:
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.0.0"
I also realized there is an additional problem in the little program above. There are too many arguments in the call to createSchemaRDD. That line should read as follows:
val table = createSchemaRDD(records)
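Putting both fixes together, the sbt file from the question ends up as:
name := "Test Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.0.0"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"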
Thanks! I ran into a similar problem while building a Scala app in Maven. Based on what you did with SBT, I added the corresponding Maven dependencies as follows and now I am able to compile and generate the jar file.
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>1.2.1</version>
</dependency>
I got a similar issue. In my case, I just copy-pasted the sbt setup below from online with scalaVersion := "2.10.4", but in my environment I actually have Scala version 2.11.8.
So I updated it and executed sbt package again, and the issue was fixed. The setup I had copied was:
name := "Test Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
